Harding, Keith; Benson, Erica E
2015-01-01
Standard operating procedures are a systematic way of making sure that biopreservation processes, tasks, protocols, and operations are correctly and consistently performed. They are the basic documents of biorepository quality management systems and are used in quality assurance, control, and improvement. Methodologies for constructing workflows and writing standard operating procedures and work instructions are described using a plant cryopreservation protocol as an example. This chapter is pertinent to other biopreservation sectors because how methods are written, interpreted, and implemented can affect the quality of storage outcomes.
High efficiency endocrine operation protocol: From design to implementation.
Mascarella, Marco A; Lahrichi, Nadia; Cloutier, Fabienne; Kleiman, Simcha; Payne, Richard J; Rosenberg, Lawrence
2016-10-01
We developed a high-efficiency endocrine operative protocol based on a mathematical programming approach, process reengineering, and value-stream mapping to increase the number of operations completed per day without increasing operating room time at a tertiary-care, academic center. Using this protocol, in a case-control study, 72 patients undergoing endocrine operations during high-efficiency days were age-, sex-, and procedure-matched to 72 patients undergoing operations during standard days. The demographic profile, operative times, and perioperative complications were noted. The average number of cases per 8-hour workday was 7 in the high-efficiency operating rooms and 5 in the standard operating rooms. Mean procedure times in both groups were similar. The turnaround time (mean ± standard deviation) in the high-efficiency group was 8.5 (±2.7) minutes, compared with 15.4 (±4.9) minutes in the standard group (P < .001). Transient postoperative hypocalcemia occurred in 6.9% (5/72) and 8.3% (6/72) of the high-efficiency and standard groups, respectively (P = .99). In this study, patients undergoing high-efficiency endocrine operations had procedure times and perioperative complication rates similar to those of the standard group. The proposed high-efficiency protocol appears to make better use of operative time and to reduce the backlog of patients waiting for endocrine operations in a country with a universal national health care program.
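The throughput claim in the abstract above can be checked with simple arithmetic on the reported figures; the sketch below assumes one turnaround between each consecutive pair of cases (cases minus one turnarounds per room per day), which is an assumption, not a figure from the study.

```python
# Turnaround-time arithmetic for the high-efficiency vs. standard operating rooms.
# Case counts and mean turnaround times are taken from the abstract; the
# turnaround count (cases - 1 per day) is an assumption for illustration.

def turnaround_minutes(cases_per_day, mean_turnaround_min):
    """Total minutes spent on turnarounds in one 8-hour day."""
    return (cases_per_day - 1) * mean_turnaround_min

standard = turnaround_minutes(5, 15.4)        # 4 turnarounds x 15.4 min
high_efficiency = turnaround_minutes(7, 8.5)  # 6 turnarounds x 8.5 min

print(standard)         # 61.6 min/day on turnarounds in the standard room
print(high_efficiency)  # 51.0 min/day despite two additional cases
```

Even with two extra cases per day, the high-efficiency room spends roughly ten fewer minutes on turnarounds, which is consistent with the abstract's finding that procedure times were unchanged while throughput rose.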
Park, Hyoung-Chul; Kim, Min Jeong; Lee, Bong Hwa
Although it is accepted that complicated appendicitis requires antibiotic therapy to prevent post-operative surgical infections, consensus protocols on the duration and regimens of treatment are not well established. This study aimed to compare post-operative infectious complications in patients receiving an old, non-standardized antibiotic protocol (10 days of treatment) versus a new standardized protocol (5 days of treatment). We enrolled 1,343 patients who underwent laparoscopic surgery for complicated appendicitis between January 2009 and December 2014. Relative to the introduction of the new protocol, the patients were divided into two groups: 10 days of various antibiotic regimens (January 2009 to June 2012; the non-standardized protocol; n = 730) and 5 days of a cefuroxime and metronidazole regimen (July 2012 to December 2014; the standardized protocol; n = 613). We compared clinical outcomes, including surgical site infection (SSI) (superficial and deep organ/space infections), between the two groups. The standardized protocol group had a slightly shorter operative time (67 vs. 69 min), a shorter hospital stay (5 vs. 5.4 d), and a lower medical cost (US$1,564 vs. US$1,654); otherwise, there was no difference between the groups. No differences were found between the non-standardized and standardized protocol groups in the rate of superficial infection (10.3% vs. 12.7%; p = 0.488) or deep organ/space infection (2.3% vs. 2.1%; p = 0.797). In patients undergoing laparoscopic surgery for complicated appendicitis, 5 days of cefuroxime and metronidazole did not lead to more SSIs and decreased medical costs compared with non-standardized antibiotic regimens.
Bourier, Felix; Reents, Tilko; Ammar-Busch, Sonia; Buiatti, Alessandra; Kottmaier, Marc; Semmler, Verena; Telishevska, Marta; Brkic, Amir; Grebmer, Christian; Lennerz, Carsten; Kolb, Christof; Hessling, Gabriele; Deisenhofer, Isabel
2016-01-01
Aims: This study presents and evaluates the impact of a new lowest-dose fluoroscopy protocol (Siemens AG), designed especially for electrophysiology (EP) procedures, on X-ray dose levels. Methods and results: From October 2014 to March 2015, 140 patients underwent an EP study on an Artis zee angiography system. The standard low-dose protocol was operated at 23 nGy (fluoroscopy) and 120 nGy (cine-loop); the new lowest-dose protocol was operated at 8 nGy (fluoroscopy) and 36 nGy (cine-loop). Procedural data, X-ray times, and doses were analysed in 100 complex left atrial and 40 standard EP procedures. The resulting dose-area products were 877.9 ± 624.7 µGy·m² (n = 50 complex procedures, standard low dose), 199 ± 159.6 µGy·m² (n = 50 complex procedures, lowest dose), 387.7 ± 36.0 µGy·m² (n = 20 standard procedures, standard low dose), and 90.7 ± 62.3 µGy·m² (n = 20 standard procedures, lowest dose), P < 0.01. In the low-dose and lowest-dose groups, procedure times were 132.6 ± 35.7 vs. 126.7 ± 34.7 min (P = 0.40, complex procedures) and 72.3 ± 20.9 vs. 85.2 ± 44.1 min (P = 0.24, standard procedures); radiofrequency (RF) times were 53.8 ± 26.1 vs. 50.4 ± 29.4 min (P = 0.54, complex procedures) and 10.1 ± 9.9 vs. 12.2 ± 14.7 min (P = 0.60, standard procedures). One complication occurred in each of the standard low-dose and lowest-dose groups (P = 1.0). Conclusion: The new lowest-dose imaging protocol reduces X-ray dose levels by 77% compared with the currently available standard low-dose protocol. From an operator standpoint, the lowest X-ray dose levels produce a different, reduced image quality. The reduced image quality did not significantly affect procedure or RF times and did not result in higher complication rates. With regard to radiological protection, operating at lowest-dose settings should become standard in EP procedures. PMID:26589627
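The reported ~77% dose reduction can be reproduced directly from the dose-area products quoted in the abstract above; a quick check (all values from the abstract):

```python
# Verify the ~77% dose reduction from the mean dose-area products (µGy·m²)
# reported for the standard low-dose and lowest-dose fluoroscopy protocols.

def reduction_pct(standard_dose, lowest_dose):
    """Percentage reduction of the lowest-dose protocol vs. the standard one."""
    return 100.0 * (1.0 - lowest_dose / standard_dose)

complex_red = reduction_pct(877.9, 199.0)   # complex left-atrial procedures
standard_red = reduction_pct(387.7, 90.7)   # standard EP procedures

print(round(complex_red, 1))   # 77.3
print(round(standard_red, 1))  # 76.6
```

Both procedure classes show roughly the same relative reduction, consistent with the abstract's headline figure of 77%.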
Enhanced International Space Station Ku-Band Telemetry Service
NASA Technical Reports Server (NTRS)
Cecil, Andrew; Pitts, Lee; Welch, Steven; Bryan, Jason
2014-01-01
(1) The ISS is diligently working to increase utilization of the resources this unique laboratory provides; (2) Recent upgrades enabled the use of Internet Protocol communication using the CCSDS IP Encapsulation protocol; and (3) The Huntsville Operations Support Center has extended the onboard LAN to payload teams enabling the use of standard IP protocols for payload operations.
An operational open-end file transfer protocol for mobile satellite communications
NASA Technical Reports Server (NTRS)
Wang, Charles; Cheng, Unjeng; Yan, Tsun-Yee
1988-01-01
This paper describes an operational open-end file transfer protocol which includes the connecting procedure, data transfer, and relinquishment procedure for mobile satellite communications. The protocol makes use of the frame level and packet level formats of the X.25 standard for the data link layer and network layer, respectively. The structure of a testbed for experimental simulation of this protocol over a mobile fading channel is also introduced.
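For readers unfamiliar with the X.25 packet level mentioned above, the following is a minimal sketch of the standard modulo-8 data-packet header (GFI, logical channel number, P(S)/P(R) sequence numbers, More-data bit). The helper names are illustrative, not from the paper.

```python
# Minimal sketch of an X.25 packet-level (modulo-8) data-packet header.
# Field layout follows the standard X.25 format: a 4-bit General Format
# Identifier, a 12-bit logical channel number, and a third octet carrying
# P(R), the M (more-data) bit, P(S), and a 0 bit marking a data packet.

def encode_data_header(lcn, ps, pr, more=False):
    """Return the 3-octet header of an X.25 modulo-8 data packet."""
    assert 0 <= lcn < 4096 and 0 <= ps < 8 and 0 <= pr < 8
    gfi = 0b0001                                 # modulo-8 sequence numbering
    octet1 = (gfi << 4) | ((lcn >> 8) & 0x0F)    # GFI + logical channel group
    octet2 = lcn & 0xFF                          # logical channel number (low 8 bits)
    octet3 = (pr << 5) | (int(more) << 4) | (ps << 1)  # bit 0 = 0 => data packet
    return bytes([octet1, octet2, octet3])

def decode_data_header(hdr):
    lcn = ((hdr[0] & 0x0F) << 8) | hdr[1]
    pr, more, ps = hdr[2] >> 5, bool(hdr[2] & 0x10), (hdr[2] >> 1) & 0x07
    return lcn, ps, pr, more

hdr = encode_data_header(lcn=1, ps=3, pr=5, more=False)
print(hdr.hex())                # '1001a6'
print(decode_data_header(hdr))  # (1, 3, 5, False)
```

The open-end file transfer protocol described in the paper rides on top of such packets, with the frame level (LAPB) providing link-layer retransmission underneath.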
40 CFR 792.35 - Quality assurance unit.
Code of Federal Regulations, 2010 CFR
2010-07-01
... from approved protocols or standard operating procedures were made without proper authorization and... standard operating procedures, and that the reported results accurately reflect the raw data of the study... ACT (CONTINUED) GOOD LABORATORY PRACTICE STANDARDS Organization and Personnel § 792.35 Quality...
40 CFR 792.35 - Quality assurance unit.
Code of Federal Regulations, 2011 CFR
2011-07-01
... from approved protocols or standard operating procedures were made without proper authorization and... standard operating procedures, and that the reported results accurately reflect the raw data of the study... ACT (CONTINUED) GOOD LABORATORY PRACTICE STANDARDS Organization and Personnel § 792.35 Quality...
40 CFR 792.35 - Quality assurance unit.
Code of Federal Regulations, 2012 CFR
2012-07-01
... from approved protocols or standard operating procedures were made without proper authorization and... standard operating procedures, and that the reported results accurately reflect the raw data of the study... ACT (CONTINUED) GOOD LABORATORY PRACTICE STANDARDS Organization and Personnel § 792.35 Quality...
40 CFR 792.35 - Quality assurance unit.
Code of Federal Regulations, 2014 CFR
2014-07-01
... from approved protocols or standard operating procedures were made without proper authorization and... standard operating procedures, and that the reported results accurately reflect the raw data of the study... ACT (CONTINUED) GOOD LABORATORY PRACTICE STANDARDS Organization and Personnel § 792.35 Quality...
40 CFR 792.35 - Quality assurance unit.
Code of Federal Regulations, 2013 CFR
2013-07-01
... from approved protocols or standard operating procedures were made without proper authorization and... standard operating procedures, and that the reported results accurately reflect the raw data of the study... ACT (CONTINUED) GOOD LABORATORY PRACTICE STANDARDS Organization and Personnel § 792.35 Quality...
NHEXAS PHASE I ARIZONA STUDY--LIST OF STANDARD OPERATING PROCEDURES
This document lists available protocols and SOPs for the NHEXAS Phase I Arizona study. It identifies protocols and SOPs for the following study components: (1) Sample collection and field operations, (2) Sample analysis, (3) General laboratory procedures, (4) Quality Assurance, (...
Data Sharing to Improve Close Approach Monitoring and Safety of Flight
NASA Astrophysics Data System (ADS)
Chan, Joseph; DalBello, Richard; Hope, Dean; Wauthier, Pascal; Douglas, Tim; Inghram, Travis
2009-03-01
Individual satellite operators have done a good job of developing the internal protocols and procedures needed to ensure the safe operation of their fleets. However, data sharing among operators for close approach monitoring is conducted in an ad hoc manner during relocations, and there is currently no standardized agreement among operators on the content, format, and distribution protocol for data sharing. Crowding in geostationary orbit, participation by new commercial actors, government interest in satellite constellations, and highly maneuverable spacecraft all suggest that satellite operators will need to begin a dialogue on standard communication protocols and procedures to improve situational awareness. We will give an overview of the current best practices among different operators for close approach monitoring and discuss the concept of an active data center to improve data sharing, conjunction monitoring, and avoidance among satellite operators. We will also report on the progress and lessons learned from a Data Center prototype run by several operators over a one-year period.
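The screening step behind close approach monitoring reduces, in its simplest form, to a closest-approach calculation under the assumption of straight-line relative motion over a short window. The sketch below uses illustrative numbers, not data from any operator.

```python
# Basic close-approach screening: given the relative position (km) and relative
# velocity (km/s) of two satellites, and assuming linear relative motion over a
# short screening window, find the time of closest approach and miss distance.

def closest_approach(rel_pos, rel_vel):
    """Return (time of closest approach in s, miss distance in km)."""
    dot_rv = sum(r * v for r, v in zip(rel_pos, rel_vel))
    dot_vv = sum(v * v for v in rel_vel)
    t = -dot_rv / dot_vv if dot_vv > 0 else 0.0   # minimize |r + v t|
    miss = [r + v * t for r, v in zip(rel_pos, rel_vel)]
    return t, sum(m * m for m in miss) ** 0.5

# Illustrative relative state: 10 km ahead, 2 km cross-track, closing at 1 km/s.
t, d = closest_approach(rel_pos=(10.0, 2.0, 0.0), rel_vel=(-1.0, 0.0, 0.0))
print(t)  # 10.0 s until closest approach
print(d)  # 2.0 km miss distance
```

Operational conjunction assessment adds covariance and probability-of-collision computations on top of this geometry, which is precisely where the shared data formats discussed in the paper matter.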
U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--LIST OF STANDARD OPERATING PROCEDURES
This document lists available protocols and SOPs for the U.S.-Mexico Border Program study. It identifies protocols and SOPs for the following study components: (1) Sample collection and field operations, (2) Sample analysis, (3) General laboratory procedures, (4) Quality Assuranc...
Petersen, James C.; Justus, B.G.; Dodd, H.R.; Bowles, D.E.; Morrison, L.W.; Williams, M.H.; Rowell, G.A.
2008-01-01
Buffalo National River, located in north-central Arkansas, and Ozark National Scenic Riverways, located in southeastern Missouri, are the two largest units of the National Park Service in the Ozark Plateaus physiographic province. The purpose of this report is to provide a protocol that the National Park Service will use to sample fish communities and collect related water-quality, habitat, and stream discharge data from Buffalo National River and Ozark National Scenic Riverways to meet inventory and long-term monitoring objectives. The protocol includes (1) a protocol narrative, (2) several standard operating procedures, and (3) supplemental information helpful for implementation of the protocol. The protocol narrative provides background information such as the rationale for why a particular resource or resource issue was selected for monitoring, information concerning the resource or resource issue of interest, a description of how monitoring results will inform management decisions, and a discussion of the linkages between this and other monitoring projects. The standard operating procedures cover preparation, training, reach selection, water-quality sampling, fish community sampling, physical habitat collection, measurement of stream discharge, equipment maintenance and storage, data management and analysis, reporting, and protocol revision procedures. Much of the information in the standard operating procedures was gathered from existing protocols of the U.S. Geological Survey National Water Quality Assessment program or other sources. Supplemental information helpful for implementing the protocol is also included, covering fish species known or suspected to occur in the parks, sample sites, sample design, fish species traits, index of biotic integrity metrics, sampling equipment, and field forms.
21 CFR 58.35 - Quality assurance unit.
Code of Federal Regulations, 2013 CFR
2013-04-01
... the corrective actions taken. (5) Determine that no deviations from approved protocols or standard operating procedures were made without proper authorization and documentation. (6) Review the final study report to assure that such report accurately describes the methods and standard operating procedures, and...
21 CFR 58.35 - Quality assurance unit.
Code of Federal Regulations, 2010 CFR
2010-04-01
... the corrective actions taken. (5) Determine that no deviations from approved protocols or standard operating procedures were made without proper authorization and documentation. (6) Review the final study report to assure that such report accurately describes the methods and standard operating procedures, and...
21 CFR 58.35 - Quality assurance unit.
Code of Federal Regulations, 2012 CFR
2012-04-01
... the corrective actions taken. (5) Determine that no deviations from approved protocols or standard operating procedures were made without proper authorization and documentation. (6) Review the final study report to assure that such report accurately describes the methods and standard operating procedures, and...
21 CFR 58.35 - Quality assurance unit.
Code of Federal Regulations, 2011 CFR
2011-04-01
... the corrective actions taken. (5) Determine that no deviations from approved protocols or standard operating procedures were made without proper authorization and documentation. (6) Review the final study report to assure that such report accurately describes the methods and standard operating procedures, and...
21 CFR 58.35 - Quality assurance unit.
Code of Federal Regulations, 2014 CFR
2014-04-01
... the corrective actions taken. (5) Determine that no deviations from approved protocols or standard operating procedures were made without proper authorization and documentation. (6) Review the final study report to assure that such report accurately describes the methods and standard operating procedures, and...
16 CFR 1210.4 - Test protocol.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 16 Commercial Practices 2 2012-01-01 2012-01-01 false Test protocol. 1210.4 Section 1210.4... STANDARD FOR CIGARETTE LIGHTERS Requirements for Child Resistance § 1210.4 Test protocol. (a) Child test panel. (1) The test to determine if a lighter is resistant to successful operation by children uses a...
16 CFR 1212.4 - Test protocol.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 16 Commercial Practices 2 2012-01-01 2012-01-01 false Test protocol. 1212.4 Section 1212.4... STANDARD FOR MULTI-PURPOSE LIGHTERS Requirements for Child-Resistance § 1212.4 Test protocol. (a) Child test panel. (1) The test to determine if a multi-purpose lighter is resistant to successful operation...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abeykoon, A. M. Milinda; Hu, Hefei; Wu, Lijun
2015-01-30
Different protocols for calibrating electron pair distribution function (ePDF) measurements are explored and described for quantitative studies on nanomaterials. It is found that the most accurate approach to determine the camera length is to use a standard calibration sample of Au nanoparticles from the National Institute of Standards and Technology. Different protocols for data collection are also explored, as are possible operational errors, to find the best approaches for accurate data collection for quantitative ePDF studies.
Ozarda, Yesim; Ichihara, Kiyoshi; Barth, Julian H; Klee, George
2013-05-01
The reference intervals (RIs) given in laboratory reports have an important role in aiding clinicians in interpreting test results in reference to values of healthy populations. In this report, we present a proposed protocol and standard operating procedures (SOPs) for common use in conducting multicenter RI studies on a national or international scale. The protocols and consensus on their contents were refined through discussions in recent C-RIDL meetings. The protocol describes in detail (1) the scheme and organization of the study, (2) the target population, inclusion/exclusion criteria, ethnicity, and sample size, (3) health status questionnaire, (4) target analytes, (5) blood collection, (6) sample processing and storage, (7) assays, (8) cross-check testing, (9) ethics, (10) data analyses, and (11) reporting of results. In addition, the protocol proposes the common measurement of a panel of sera when no standard materials exist for harmonization of test results. It also describes the requirements of the central laboratory, including the method of cross-check testing between the central laboratory of each country and local laboratories. This protocol and the SOPs remain largely exploratory and may require a reevaluation from the practical point of view after their implementation in the ongoing worldwide study. The paper is mainly intended to be a basis for discussion in the scientific community.
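Once a multicenter study of this kind has collected results from reference individuals, the conventional nonparametric reference interval is the central 95% of the distribution (2.5th to 97.5th percentiles). The sketch below uses synthetic data; the helper is illustrative and is not part of the C-RIDL protocol.

```python
# Nonparametric reference-interval calculation: the central 95% of results
# from reference individuals, estimated by linear interpolation between
# order statistics. Data below are synthetic, for illustration only.

def reference_interval(values, lower_pct=2.5, upper_pct=97.5):
    """Return (lower limit, upper limit) of the nonparametric RI."""
    xs = sorted(values)

    def percentile(p):
        k = (len(xs) - 1) * p / 100.0
        i, frac = int(k), k - int(k)
        return xs[i] if frac == 0 else xs[i] + frac * (xs[i + 1] - xs[i])

    return percentile(lower_pct), percentile(upper_pct)

# Synthetic "healthy population" results for some analyte (arbitrary units):
data = list(range(100, 201))        # 101 evenly spaced results, 100..200
low, high = reference_interval(data)
print((low, high))                  # (102.5, 197.5)
```

In practice, guideline-based RI estimation also addresses outlier exclusion, partitioning by sex and age, and minimum sample sizes, which is what the protocol's sections on inclusion criteria and sample size are designed to support.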
Integrating medical devices in the operating room using service-oriented architectures.
Ibach, Bastian; Benzko, Julia; Schlichting, Stefan; Zimolong, Andreas; Radermacher, Klaus
2012-08-01
With the increasing documentation requirements and communication capabilities of medical devices in the operating room, the integration and modular networking of these devices have become increasingly important. Commercial integrated operating room systems are mainly proprietary developments, usually built on proprietary communication standards and interfaces, which limits the possibility of integrating devices from different vendors. To overcome these limitations, there is a need for an open, standardized architecture based on standard protocols and interfaces that enables the integration of devices from different vendors built on heterogeneous software and hardware components. Starting with an analysis of the requirements for device integration in the operating room and the techniques used for integrating devices in other industrial domains, a new concept for an integration architecture for the operating room, based on the paradigm of a service-oriented architecture, is developed. Standardized communication protocols and interface descriptions are used. As risk management is an important factor in medical engineering, a risk analysis of the developed concept has been carried out and the first prototypes have been implemented.
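The service-oriented idea can be illustrated with a toy registry in which each device publishes named services and clients discover and invoke them without vendor-specific coupling. The device and service names below are hypothetical, not from the paper.

```python
# Toy service-oriented architecture: a central registry where operating-room
# devices register named services, and clients discover and invoke them
# without knowing vendor-specific interfaces. Names here are hypothetical.

class ServiceRegistry:
    """Central discovery and invocation point of a (toy) SOA."""

    def __init__(self):
        self._services = {}          # (device, service) -> handler

    def register(self, device, service, handler):
        self._services[(device, service)] = handler

    def discover(self, service):
        """List devices offering a given service."""
        return [dev for (dev, svc) in self._services if svc == service]

    def invoke(self, device, service, **kwargs):
        return self._services[(device, service)](**kwargs)

registry = ServiceRegistry()
# A hypothetical OR-table vendor registers a standardized "set_height" service:
registry.register("or_table", "set_height", lambda cm: f"table at {cm} cm")

print(registry.discover("set_height"))               # ['or_table']
print(registry.invoke("or_table", "set_height", cm=85))  # 'table at 85 cm'
```

A real implementation would layer this pattern over standardized transport and description protocols (e.g., web-service stacks) and add the safety and risk-management machinery the paper analyzes.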
Tiszler, John; Rodriguez, Dirk; Lombardo, Keith; Sagar, Tarja; Aguilar, Luis; Lee, Lena; Handley, Timothy; McEachern, A. Kathryn; Harrod Starcevich, Leigh Ann; Witter, Marti; Philippi, Tom; Ostermann-Kelm, Stacey
2016-01-01
These Standard Operating Procedures are one part of a two-part protocol for monitoring terrestrial vegetation in the Mediterranean Coast Network. The second part of the protocol is the narrative: Tiszler, J., D. Rodriguez, K. Lombardo, T. Sagar, L. Aguilar, L. Lee, T. Handley, K. McEachern, L. Starcevich, M. Witter, T. Philippi, and S. Ostermann-Kelm. 2016. Terrestrial vegetation monitoring protocol for the Mediterranean Coast Network—Cabrillo National Monument, Channel Islands National Park, and Santa Monica Mountains National Recreation Area: Narrative, version 1.0. Natural Resource Report NPS/MEDN/NRR—2016/1296. National Park Service, Fort Collins, Colorado. National parks in the Mediterranean Inventory and Monitoring Network: Cabrillo National Monument (CABR), Channel Islands National Park (CHIS), Santa Monica Mountains National Recreation Area (SAMO).
Protocols | Office of Cancer Clinical Proteomics Research
Each reagent on the Antibody Portal has been characterized by a combination of methods specific for that antibody. To view the customized antibody methods and protocols (Standard Operating Procedures) used to generate and characterize each reagent, select an antibody of interest and open the protocols associated with their respective characterization methods along with characterization data.
Test Protocols for Advanced Inverter Interoperability Functions – Main Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Gonzalez, Sigifredo; Ralph, Mark E.
2013-11-01
Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed at large scale, are capable of significantly influencing the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by the US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not currently required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction.
Interoperability issues are already apparent as some of these inverter capabilities are being incorporated in large demonstration and commercial projects. The test protocols are intended to be used to verify acceptable performance of inverters within the standard framework described in IEC TR 61850-90-7. These test protocols, as they are refined and validated over time, can become precursors for future certification test procedures for DER advanced grid support functions.
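One example of the grid-support functions such test protocols exercise is a volt-var curve: a piecewise-linear mapping from measured voltage to reactive-power output. The curve points below are illustrative only, not values from IEC TR 61850-90-7.

```python
# Sketch of a volt-var grid-support function of the style standardized in
# IEC TR 61850-90-7: a piecewise-linear curve mapping voltage (per unit) to
# reactive power (fraction of rating). Curve points below are illustrative.

def volt_var(v, curve):
    """Piecewise-linear interpolation of a volt-var curve; flat beyond ends."""
    pts = sorted(curve)
    if v <= pts[0][0]:
        return pts[0][1]
    if v >= pts[-1][0]:
        return pts[-1][1]
    for (v0, q0), (v1, q1) in zip(pts, pts[1:]):
        if v0 <= v <= v1:
            return q0 + (q1 - q0) * (v - v0) / (v1 - v0)

# Illustrative curve: inject vars below 0.95 pu, absorb above 1.05 pu,
# with a deadband around nominal voltage.
curve = [(0.95, 0.44), (0.99, 0.0), (1.01, 0.0), (1.05, -0.44)]
print(volt_var(0.90, curve))            # 0.44 (full injection at low voltage)
print(volt_var(1.00, curve))            # 0.0  (deadband)
print(round(volt_var(1.03, curve), 3))  # -0.22 (partial absorption)
```

A test protocol of the kind described above would sweep the inverter's terminal voltage and check that the measured reactive power tracks the commanded curve within tolerance.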
Test Protocols for Advanced Inverter Interoperability Functions - Appendices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Gonzalez, Sigifredo; Ralph, Mark E.
2013-11-01
Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed at large scale, are capable of significantly influencing the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by the US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not now required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction.
Interoperability issues are already apparent as some of these inverter capabilities are being incorporated in large demonstration and commercial projects. The test protocols are intended to be used to verify acceptable performance of inverters within the standard framework described in IEC TR 61850-90-7. These test protocols, as they are refined and validated over time, can become precursors for future certification test procedures for DER advanced grid support functions.
Internet Data Delivery for Future Space Missions
NASA Technical Reports Server (NTRS)
Rash, James; Casasanta, Ralph; Hogie, Keith; Hennessy, Joseph F. (Technical Monitor)
2002-01-01
Ongoing work at National Aeronautics and Space Administration Goddard Space Flight Center (NASA/GSFC), seeks to apply standard Internet applications and protocols to meet the technology challenge of future satellite missions. Internet protocols and technologies are under study as a future means to provide seamless dynamic communication among heterogeneous instruments, spacecraft, ground stations, constellations of spacecraft, and science investigators. The primary objective is to design and demonstrate in the laboratory the automated end-to-end transport of files in a simulated dynamic space environment using off-the-shelf, low-cost, commodity-level standard applications and protocols. The demonstrated functions and capabilities will become increasingly significant in the years to come as both earth and space science missions fly more sensors and as the need increases for more network-oriented mission operations. Another element of increasing significance will be the increased cost effectiveness of designing, building, integrating, and operating instruments and spacecraft that will come to the fore as more missions take up the approach of using commodity-level standard communications technologies. This paper describes how an IP (Internet Protocol)-based communication architecture can support all existing operations concepts and how it will enable some new and complex communication and science concepts. The authors identify specific end-to-end data flows from the instruments to the control centers and scientists, and then describe how each data flow can be supported using standard Internet protocols and applications. The scenarios include normal data downlink and command uplink as well as recovery scenarios for both onboard and ground failures. The scenarios are based on an Earth orbiting spacecraft with downlink data rates from 300 Kbps to 4 Mbps. 
Included examples are based on designs currently being investigated for potential use by the Global Precipitation Measurement (GPM) mission.
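The commodity-protocol idea described above can be illustrated with a toy file transfer: a "file" moved as numbered UDP datagrams over an IP link (here, loopback) and reassembled by sequence number. This is a sketch only, standing in for the mission data-downlink flows; it is not the protocol flown.

```python
# Toy illustration of commodity IP transport for mission data: a payload is
# chunked into numbered UDP datagrams, sent over loopback, and reassembled by
# sequence number (tolerating reordering). Sketch only; not a flight protocol.

import socket

def send_file(sock, addr, payload, chunk=8):
    for seq, off in enumerate(range(0, len(payload), chunk)):
        # 2-byte big-endian sequence number prefix, then the chunk of data.
        sock.sendto(seq.to_bytes(2, "big") + payload[off:off + chunk], addr)
    sock.sendto(b"\xff\xff", addr)              # end-of-file marker

def recv_file(sock):
    chunks = {}
    while True:
        pkt, _ = sock.recvfrom(1024)
        seq = int.from_bytes(pkt[:2], "big")
        if seq == 0xFFFF:                       # end-of-file marker
            break
        chunks[seq] = pkt[2:]                   # store by sequence number
    return b"".join(chunks[i] for i in sorted(chunks))

rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))                       # OS-assigned loopback port
rx.settimeout(2.0)
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

send_file(tx, rx.getsockname(), b"telemetry frame 0001: all nominal")
result = recv_file(rx)
print(result)  # b'telemetry frame 0001: all nominal'
```

Real mission designs of the kind the paper studies add reliability (retransmission or a reliable transport), security, and CCSDS-compatible framing on top of this basic datagram model.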
A Novel Process Audit for Standardized Perioperative Handoff Protocols.
Pallekonda, Vinay; Scholl, Adam T; McKelvey, George M; Amhaz, Hassan; Essa, Deanna; Narreddy, Spurthy; Tan, Jens; Templonuevo, Mark; Ramirez, Sasha; Petrovic, Michelle A
2017-11-01
A perioperative handoff protocol provides a standardized delivery of communication during a handoff that occurs from the operating room to the postanesthesia care unit or ICU. The protocol's success is dependent, in part, on its continued proper use over time. A novel process audit was developed to help ensure that a perioperative handoff protocol is used accurately and appropriately over time. The Audit Observation Form is used for the Audit Phase of the process audit, while the Audit Averages Form is used for the Data Analysis Phase. Employing minimal resources and using quantitative methods, the process audit provides the necessary means to evaluate the proper execution of any perioperative handoff protocol. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.
The purpose of this SOP is to establish a uniform format for the preparation of SOPs. Use of these protocols ensures consistent implementation of project tasks, documents the preparation and implementation of the procedures used, describes quality control measures and the limits...
45 CFR 156.715 - Compliance reviews of QHP issuers in Federally-facilitated Exchanges.
Code of Federal Regulations, 2014 CFR
2014-10-01
...'s enrollees; (2) The QHP issuer's policies and procedures, protocols, standard operating procedures... REQUIREMENTS RELATING TO HEALTH CARE ACCESS HEALTH INSURANCE ISSUER STANDARDS UNDER THE AFFORDABLE CARE ACT, INCLUDING STANDARDS RELATED TO EXCHANGES Oversight and Financial Integrity Standards for Issuers of...
Fussell, Holly E; Kunkel, Lynn E; McCarty, Dennis; Lewy, Colleen S
2011-09-01
Training research staff to implement clinical trials occurring in community-based addiction treatment programs presents unique challenges. Standardized patient walkthroughs of study procedures may enhance training and protocol implementation. The objective was to examine and discuss cross-site and cross-study challenges of participant screening and data collection procedures identified during standardized patient walkthroughs of multi-site clinical trials. Actors portrayed clients and "walked through" study procedures with protocol research staff. The study completed 57 walkthroughs during implementation of 4 clinical trials. Observers and walkthrough participants identified three areas of concern (consent procedures, screening and assessment processes, and protocol implementation) and made suggestions for resolving the concerns. Standardized patient walkthroughs capture issues with study procedures that go unidentified in didactic training or unscripted rehearsals. Clinical trials within the National Drug Abuse Treatment Clinical Trials Network are conducted in addiction treatment centers that vary on multiple dimensions. Based on walkthrough observations, the national protocol team and local site leadership modify standard operating procedures and resolve cross-site problems prior to recruiting study participants. The standardized patient walkthrough improves consistency across study sites and reduces potential site variation in study outcomes.
The effect of personalized versus standard patient protocols for radiostereometric analysis (RSA).
Muharemovic, O; Troelsen, A; Thomsen, M G; Kallemose, T; Gosvig, K K
2018-05-01
Increasing pressure in the clinic requires a more standardized approach to radiostereometric analysis (RSA) imaging. The aim of this study was to investigate whether implementation of personalized RSA patient protocols could increase image quality and decrease examination time and the number of exposure repetitions. Forty patients undergoing primary total hip arthroplasty were equally randomized to either a case or a control group. Radiographers in the case group were assisted by personalized patient protocols containing information about each patient's post-operative RSA imaging. Radiographers in the control group used a standard RSA protocol. At three months, radiographers in the case group significantly reduced (p < 0.001) the number of exposures by 1.6, examination time by 19.2 min, and the distance between the centrum of the prosthesis and the centrum of the calibration field by 34.1 mm when compared with post-operative (baseline) results. At twelve months, the case group significantly reduced (p < 0.001) the number of exposures by two, examination time by 22.5 min, and the prosthesis-to-calibration-field centrum distance by 43.1 mm when compared with baseline results. No significant improvements were found in the control group at any time point. There is strong evidence that personalized RSA patient protocols have a positive effect on image quality and radiation dose savings. Implementation of personalized patient protocols as an RSA standard will contribute to the reduction of examination time, thus ensuring a cost benefit for the department and improving patient safety. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
Vidotto, Laís Silva; Bigliassi, Marcelo; Alencar, Tatiane Romanini Rodrigues; Silva, Thaísa Maria Santos; Probst, Vanessa Suziane
2015-07-01
To compare the acute effects of a standardized physiotherapy protocol versus a typical non-standardized physiotherapy protocol on pain and performance of patients undergoing alveolar bone graft (ABG). Sixteen patients (9 males; 12 [11-13] years) with cleft lip and palate undergoing ABG were allocated into two groups: (1) an experimental group (EG; standardized physiotherapy protocol) and (2) a control group (CG; typical, non-standardized physiotherapy treatment). Range of motion, muscle strength, gait speed, and pain level were assessed prior to surgical intervention (PRE), as well as on the first, second, and third post-operative days (1st, 2nd, and 3rd PO, respectively). Recovery of range of motion of hip flexion was more pronounced in the EG (64.6 ± 11.0°) than in the CG (48.5 ± 17.7° on the 3rd PO; p < 0.05). In addition, less pain was observed in the EG (0 [0-0.2] versus 2 [0.7-3] in the CG on the 3rd PO; p < 0.05). A standardized physiotherapy protocol appears to be better than a non-standardized physiotherapy protocol for acute improvement of range of motion of hip flexion and for reducing pain in patients undergoing ABG.
Evaluation of Protocol Uniformity Concerning Laparoscopic Cholecystectomy in The Netherlands
Goossens, Richard H. M.; van Eijk, Daan J.; Lange, Johan F.
2008-01-01
Background Iatrogenic bile duct injury remains a current complication of laparoscopic cholecystectomy. One uniform and standardized protocol, based on the “critical view of safety” concept of Strasberg, should reduce the incidence of this complication. Furthermore, owing to the rapid development of minimally invasive surgery, technicians are becoming more frequently involved. To improve communication between the operating team and technicians, standardized actions should also be defined. The aim of this study was to compare existing protocols for laparoscopic cholecystectomy from various Dutch hospitals. Methods Fifteen Dutch hospitals were contacted for evaluation of their protocols for laparoscopic cholecystectomy. All evaluated protocols were divided into six steps and were compared accordingly. Results In total, 13 hospitals responded—5 academic hospitals, 5 teaching hospitals, 3 community hospitals—of which 10 protocols were usable for comparison. Concerning the trocar positions, only minor differences were found. The concept of “critical view of safety” was represented in just one protocol. Furthermore, the order of clipping and cutting the cystic artery and duct differed. Descriptions of instruments and apparatus were also inconsistent. Conclusions Present protocols differ too much to define a universal procedure among surgeons in The Netherlands. The authors propose one (inter)national standardized protocol, including standardized actions. This uniform standardized protocol has to be officially released and recommended by national scientific associations (e.g., the Dutch Society of Surgery) or international societies (e.g., European Association for Endoscopic Surgery and Society of American Gastrointestinal and Endoscopic Surgeons). The aim is to improve patient safety and professional communication, which are necessary for new developments. PMID:18224485
Um, Keehong
2016-05-01
We designed a protocol analyzer to be used in wireless power systems and analyzed the operation of wireless chargers defined by the Qi standard of the Wireless Power Consortium (WPC) and by the Power Matters Alliance (PMA) protocol. The integrated circuits (ICs, or microchips) developed so far for wireless power transmission are not easily adapted by chargers for specific purposes, so a device is required to measure the performance of currently available test equipment and to transform and expand the supported protocol types. Since a protocol analyzer with these functions is required, we developed a device that can analyze the WPC and PMA protocols at the same time. As a result of our research, we present a dual-mode system that can analyze both protocols.
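The decoding side of such an analyzer can be illustrated for the Qi case. A minimal Python sketch, assuming the XOR checksum (header XOR all message bytes) carried by Qi communication packets; PMA framing differs and is not modeled here:

```python
from functools import reduce
from operator import xor

def qi_checksum(header: int, message: bytes) -> int:
    """Checksum assumed to be XOR of the header byte and all message bytes."""
    return reduce(xor, message, header)

def decode_packet(raw: bytes):
    """Split a raw packet into (header, message) after verifying its checksum."""
    header, message, checksum = raw[0], raw[1:-1], raw[-1]
    if qi_checksum(header, message) != checksum:
        raise ValueError("checksum mismatch")
    return header, message

# Example packet: header 0x51, two message bytes, trailing checksum byte.
hdr, msg = decode_packet(bytes([0x51, 0x01, 0x02, 0x52]))
```

A dual-mode analyzer would dispatch raw captures to a decoder like this or to a PMA-specific one depending on the detected framing.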
Cooperative Energy Harvesting-Adaptive MAC Protocol for WBANs
Esteves, Volker; Antonopoulos, Angelos; Kartsakli, Elli; Puig-Vidal, Manel; Miribel-Català, Pere; Verikoukis, Christos
2015-01-01
In this paper, we introduce a cooperative medium access control (MAC) protocol, named cooperative energy harvesting (CEH)-MAC, that adapts its operation to the energy harvesting (EH) conditions in wireless body area networks (WBANs). In particular, the proposed protocol exploits the EH information in order to set an idle time that allows the relay nodes to charge their batteries and complete the cooperation phase successfully. Extensive simulations have shown that CEH-MAC significantly improves the network performance in terms of throughput, delay and energy efficiency compared to the cooperative operation of the baseline IEEE 802.15.6 standard. PMID:26029950
Space flight operations communications phraseology and techniques
NASA Technical Reports Server (NTRS)
Noneman, S. R.
1986-01-01
Communications are a critical link in space flight operations. Specific communications phraseology and techniques have been developed to allow rapid and clear transfer of information. Communications will be clear and brief through the use of procedural words and phrases. Communications protocols standardize the required information transferred. The voicing of letters and numbers is discussed. The protocols used in air-to-ground communications are given. A glossary of communications terminology is presented in the appendix.
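The voicing of letters described above can be sketched with the ICAO spelling alphabet; this is the standard alphabet, though the handbook's appendix may use variant spellings (e.g., "Alpha", "Juliet"):

```python
# ICAO/NATO phonetic alphabet, used to voice letters unambiguously over voice loops.
PHONETIC = {
    "A": "Alfa", "B": "Bravo", "C": "Charlie", "D": "Delta", "E": "Echo",
    "F": "Foxtrot", "G": "Golf", "H": "Hotel", "I": "India", "J": "Juliett",
    "K": "Kilo", "L": "Lima", "M": "Mike", "N": "November", "O": "Oscar",
    "P": "Papa", "Q": "Quebec", "R": "Romeo", "S": "Sierra", "T": "Tango",
    "U": "Uniform", "V": "Victor", "W": "Whiskey", "X": "X-ray",
    "Y": "Yankee", "Z": "Zulu",
}

def voice(callsign: str) -> str:
    """Spell out a callsign: letters via the phonetic alphabet, digits as-is."""
    return " ".join(PHONETIC.get(ch.upper(), ch) for ch in callsign if not ch.isspace())

spoken = voice("STS9")
```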
Development of wide area environment accelerator operation and diagnostics method
NASA Astrophysics Data System (ADS)
Uchiyama, Akito; Furukawa, Kazuro
2015-08-01
Remote operation and diagnostic systems for particle accelerators have been developed for beam operation and maintenance in various situations. Even though fully remote experiments are not necessary, remote diagnosis and maintenance of the accelerator are required. For remote-operation operator interfaces (OPIs), the use of standard protocols such as the hypertext transfer protocol (HTTP) is advantageous, because system-dependent protocols become unnecessary between the remote client and the on-site server. Here, we have developed a client system based on WebSocket, a protocol standardized by the Internet Engineering Task Force for Web-based systems, as a next-generation Web-based OPI using the Experimental Physics and Industrial Control System (EPICS) Channel Access protocol. As a result of this implementation, WebSocket-based client systems have become available for remote operation. In practical application, the remote operation of an accelerator via a wide area network (WAN) faces a number of challenges; for example, the accelerator has the characteristics of both an experimental device and a radiation generator, and any error in remote control system operation could result in an immediate breakdown. Therefore, we propose the implementation of an operator intervention system for remote accelerator diagnostics and support that can obviate any differences between the local control room and remote locations. Here, remote-operation Web-based OPIs that address security issues are developed.
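The WebSocket protocol such OPIs build on opens with an HTTP upgrade handshake. A minimal sketch of the server-side accept-key computation defined in RFC 6455 (independent of EPICS, shown only to illustrate the standard protocol layer):

```python
import base64
import hashlib

# Fixed GUID defined by RFC 6455 for the opening handshake.
WS_GUID = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"

def accept_key(sec_websocket_key: str) -> str:
    """Compute Sec-WebSocket-Accept: base64(SHA-1(client key + GUID))."""
    digest = hashlib.sha1((sec_websocket_key + WS_GUID).encode("ascii")).digest()
    return base64.b64encode(digest).decode("ascii")

# Worked example from RFC 6455 section 1.3.
accept = accept_key("dGhlIHNhbXBsZSBub25jZQ==")
```

Because this handshake rides on plain HTTP, the same gateways, proxies, and TLS infrastructure used for Web traffic carry the OPI traffic, which is exactly the advantage the paper attributes to standard protocols.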
Two deterministic models (US EPA’s Office of Pesticide Programs Residential Standard Operating Procedures (OPP Residential SOPs) and Draft Protocol for Measuring Children’s Non-Occupational Exposure to Pesticides by all Relevant Pathways (Draft Protocol)) and four probabilistic mo...
Use of a Microprocessor to Implement an ADCCP Protocol (Federal Standard 1003).
1980-07-01
... results of other studies, to evaluate the operational and economic impact of incorporating various options in Federal Standard 1003. ... the LSI interface and the microprocessor; the LSI chip deposits bytes in its buffer as the producer, and the MPU reads this data as the consumer ... on the interface between the MPU and the LSI protocol chip. This requires two main processes to be running at the same time: transmit and receive. ...
Communications among elements of a space construction ensemble
NASA Technical Reports Server (NTRS)
Davis, Randal L.; Grasso, Christopher A.
1989-01-01
Space construction projects will require careful coordination between managers, designers, manufacturers, operators, astronauts, and robots with large volumes of information of varying resolution, timeliness, and accuracy flowing between the distributed participants over computer communications networks. Within the CSC Operations Branch, we are researching the requirements and options for such communications. Based on our work to date, we feel that communications standards being developed by the International Standards Organization, the CCITT, and other groups can be applied to space construction. We are currently studying in depth how such standards can be used to communicate with robots and automated construction equipment used in a space project. Specifically, we are looking at how the Manufacturing Automation Protocol (MAP) and the Manufacturing Message Specification (MMS), which tie together computers and machines in automated factories, might be applied to space construction projects. Together with our CSC industrial partner Computer Technology Associates, we are developing a MAP/MMS companion standard for space construction and we will produce software to allow the MAP/MMS protocol to be used in our CSC operations testbed.
Nitrosi, Andrea; Bertolini, Marco; Borasi, Giovanni; Botti, Andrea; Barani, Adriana; Rivetti, Stefano; Pierotti, Luisa
2009-12-01
Ideally, medical x-ray imaging systems should be designed to deliver maximum image quality at an acceptable radiation risk to the patient. Quality assurance procedures are employed to ensure that these standards are maintained. A quality control protocol for direct digital radiography (DDR) systems is described and discussed. Software to automatically process and analyze the required images was developed. In this paper, the initial results obtained on equipment from different DDR manufacturers are reported. The protocol was developed to highlight even small discrepancies in standard operating performance.
Huser, Vojtech; Sastry, Chandan; Breymaier, Matthew; Idriss, Asma; Cimino, James J
2015-10-01
Efficient communication of a clinical study protocol and case report forms during all stages of a human clinical study is important for many stakeholders. An electronic and structured study representation format that can be used throughout the whole study life-span can improve such communication and potentially lower total study costs. The most relevant standard for representing clinical study data, applicable to unregulated as well as regulated studies, is the Operational Data Model (ODM), in development since 1999 by the Clinical Data Interchange Standards Consortium (CDISC). ODM's initial objective was the exchange of case report form data, but it is increasingly utilized in other contexts. An ODM extension called the Study Design Model, introduced in 2011, provides additional protocol representation elements. Using a case study approach, we evaluated ODM's ability to capture all necessary protocol elements during a complete clinical study lifecycle in the Intramural Research Program of the National Institutes of Health. ODM offers the advantage of a single format for institutions that deal with hundreds or thousands of concurrent clinical studies and maintain a data warehouse for these studies. For each study stage, we present a list of gaps in the ODM standard and identify necessary vendor or institutional extensions that can compensate for such gaps. The current version of ODM (1.3.2) has only partial support for study protocol and study registration data, mainly because these lie outside its original development goal. ODM provides comprehensive support for representation of case report forms (both in the design stage and with patient-level data). Inclusion of the requirements of observational, non-regulated, or investigator-initiated studies (outside Food and Drug Administration (FDA) regulation) can further improve future revisions of the standard. Published by Elsevier Inc.
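ODM's form-centric nesting (Study, MetaDataVersion, FormDef) can be sketched as a skeletal XML fragment built with Python's ElementTree. The OIDs are hypothetical and required ODM attributes and namespaces are omitted for brevity; consult the CDISC specification for the full schema:

```python
import xml.etree.ElementTree as ET

# Skeletal ODM-style metadata: a study holding one form definition.
odm = ET.Element("ODM", FileOID="Study001-Export")
study = ET.SubElement(odm, "Study", OID="ST.001")
mdv = ET.SubElement(study, "MetaDataVersion", OID="MDV.1", Name="Protocol v1")
form = ET.SubElement(mdv, "FormDef", OID="F.VITALS", Name="Vital Signs",
                     Repeating="No")
ET.SubElement(form, "ItemGroupRef", ItemGroupOID="IG.VS", Mandatory="Yes")

xml_text = ET.tostring(odm, encoding="unicode")
```

The same tree carries both the design stage (form and item definitions) and, in full ODM, patient-level ClinicalData, which is why one format can serve the whole study lifecycle.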
Higdon, Lauren E; Lee, Karim; Tang, Qizhi; Maltzman, Jonathan S
2016-09-01
Research on human immune responses frequently involves the use of peripheral blood mononuclear cells (PBMC) immediately, or at significantly delayed timepoints, after collection. This requires PBMC isolation from whole blood and cryopreservation for some applications. It is important to standardize protocols for blood collection, PBMC isolation, cryopreservation, and thawing that maximize survival and functionality of PBMC at the time of analysis. This resource includes detailed protocols describing blood collection tubes, isolation of PBMC using a density gradient, cryopreservation of PBMC, and thawing of cells as well as preparation for functional assays. For each protocol, we include important considerations, such as timing, storage temperatures, and freezing rate. In addition, we provide alternatives so that researchers can make informed decisions in determining the optimal protocol for their application.
Experimental control in software reliability certification
NASA Technical Reports Server (NTRS)
Trammell, Carmen J.; Poore, Jesse H.
1994-01-01
There is growing interest in software 'certification', i.e., confirmation that software has performed satisfactorily under a defined certification protocol. Regulatory agencies, customers, and prospective reusers all want assurance that a defined product standard has been met. In other industries, products are typically certified under protocols in which random samples of the product are drawn, tests characteristic of operational use are applied, analytical or statistical inferences are made, and products meeting a standard are 'certified' as fit for use. A warranty statement is often issued upon satisfactory completion of a certification protocol. This paper outlines specific engineering practices that must be used to preserve the validity of the statistical certification testing protocol. The assumptions associated with a statistical experiment are given, and their implications for statistical testing of software are described.
OR2020: The Operating Room of the Future
2004-05-01
3.3 Technical Requirements: Standards and Tools for Improved Operating Room Process Integration ... Image processing and visualization tools must be made available to the operating room. 5. Communications issues must be addressed and aim toward ... protocols for effectively performing advanced surgeries and using telecommunications-ready tools as needed. The following recommendations were made ...
LINCS: Livermore's network architecture. [Octopus computing network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fletcher, J.G.
1982-01-01
Octopus, a local computing network that has been evolving at the Lawrence Livermore National Laboratory for over fifteen years, is currently undergoing a major revision. The primary purpose of the revision is to consolidate and redefine the variety of conventions and formats, which have grown up over the years, into a single standard family of protocols, the Livermore Interactive Network Communication Standard (LINCS). This standard treats the entire network as a single distributed operating system such that access to a computing resource is obtained in a single way, whether that resource is local (on the same computer as the accessing process) or remote (on another computer). LINCS encompasses not only communication but also such issues as the relationship of customer to server processes and the structure, naming, and protection of resources. The discussion includes: an overview of the Livermore user community and computing hardware, the functions and structure of each of the seven layers of LINCS protocol, the reasons why we have designed our own protocols, and why we are dissatisfied with the directions that current protocol standards are taking.
Standard Operational Protocols in professional nursing practice: use, weaknesses and potentialities.
Sales, Camila Balsero; Bernardes, Andrea; Gabriel, Carmen Silvia; Brito, Maria de Fátima Paiva; Moura, André Almeida de; Zanetti, Ariane Cristina Barboza
2018-01-01
To evaluate the use of Standard Operational Protocols (SOPs) in the professional practice of the nursing team based on the theoretical framework of Donabedian, and to identify the weaknesses and potentialities arising from their implementation. Evaluative research with a quantitative approach, performed with nursing professionals working in the health units of a city in the state of São Paulo, composed of two stages: document analysis and subsequent application of a questionnaire to nursing professionals. A total of 247 nursing professionals participated and reported changes in the way the interventions were performed. The main weaknesses were the small number of professionals, inadequate physical structure, and lack of materials. Among the potentialities were the standardization of materials and the concern of managers and professionals with patient safety. Reassessment of the SOPs is necessary, along with the adoption of a strategy of permanent education of professionals aimed at improving the quality of care provided.
Common MD-IS infrastructure for wireless data technologies
NASA Astrophysics Data System (ADS)
White, Malcolm E.
1995-12-01
The expansion of global networks, caused by growth and acquisition within the commercial sector, is forcing users to move away from proprietary systems in favor of standards-based, open systems architectures. The same is true in the wireless data communications arena, where operators of proprietary wireless data networks have endeavored to convince users that their particular implementation provides the best service. However, most of the vendors touting these solutions have failed to gain the critical mass that might have led to their technologies' adoption as a de facto standard, and have been held back by a lack of applications and the high cost of mobile devices. The advent of the cellular digital packet data (CDPD) specification and its support by much of the public cellular service industry has set the stage for ubiquitous coverage of wireless packet data services across the United States. Although CDPD was developed for operation over the advanced mobile phone system (AMPS) cellular network, many of the defined protocols are industry standards that can be applied to the construction of a common infrastructure supporting multiple airlink standards. This approach offers overall cost savings and operational efficiency for service providers, hardware and software developers, and end-users alike, and could be equally advantageous for those service operators using proprietary end system protocols, should they wish to migrate toward an open standard.
This protocol describes how quality control samples should be handled in the field, and was designed as a quick reference source for the field staff. The protocol describes quality control samples for air-VOCs, air-particles, water samples, house dust, soil, urine, blood, hair, a...
The purpose of this protocol is to provide guidelines for the analysis of hair samples for total mercury by cold vapor atomic fluorescence (CVAFS) spectrometry. This protocol describes the methodology and all other analytical aspects involved in the analysis. Keywords: hair; s...
Chapter A10. Lakes and reservoirs: Guidelines for study design and sampling
Green, William R.; Robertson, Dale M.; Wilde, Franceska D.
2015-09-29
Within this chapter are references to other chapters of the NFM that provide more detailed guidelines related to specific topics and more detailed protocols for the quality assurance and assessment of the lake and reservoir data. Protocols and procedures to address and document the quality of lake and reservoir investigations are adapted from, or referenced to, the protocols and standard operating procedures contained in related chapters of this National Field Manual.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-29
... regarding how and when data collection forms and standard operating procedures should be included with the... topic. It does not create or confer any rights for or on any person and does not operate to bind FDA or...
Role of memory errors in quantum repeaters
NASA Astrophysics Data System (ADS)
Hartmann, L.; Kraus, B.; Briegel, H.-J.; Dür, W.
2007-03-01
We investigate the influence of memory errors in the quantum repeater scheme for long-range quantum communication. We show that the communication distance is limited in standard operation mode due to memory errors resulting from unavoidable waiting times for classical signals. We show how to overcome these limitations by (i) improving local memory and (ii) introducing two operational modes of the quantum repeater. In both operational modes, the repeater is run blindly, i.e., without waiting for classical signals to arrive. In the first scheme, entanglement purification protocols based on one-way classical communication are used, allowing communication over arbitrary distances. However, the error thresholds for noise in local control operations are very stringent. The second scheme makes use of entanglement purification protocols with two-way classical communication and inherits the favorable error thresholds of the repeater run in standard mode. One can increase the possible communication distance by an order of magnitude with reasonable overhead in physical resources. We outline the architecture of a quantum repeater that can possibly ensure intercontinental quantum communication.
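The waiting-time limitation can be made concrete with a standard white-noise memory model (a common textbook parametrization, not necessarily the exact noise model of the paper): a stored Bell pair of initial fidelity $F_0$ decays toward the maximally mixed state while the repeater waits at least $t = L/c$ for a classical signal over a segment of length $L$,

```latex
F(t) = \tfrac{1}{4} + \left(F_0 - \tfrac{1}{4}\right) e^{-t/\tau},
\qquad t \ge \frac{L}{c},
```

where $\tau$ is the memory coherence time. Purification succeeds only while $F(t)$ remains above the protocol's minimal fidelity, which is what bounds the communication distance in standard operation mode and motivates running the repeater blindly.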
AIRNET: A real-time communications network for aircraft
NASA Technical Reports Server (NTRS)
Weaver, Alfred C.; Cain, Brendan G.; Colvin, M. Alexander; Simoncic, Robert
1990-01-01
A real-time local area network was developed for use on aircraft and space vehicles. It uses token ring technology to provide high throughput, low latency, and high reliability. The system was implemented on PCs and PC/ATs operating on PCbus, and on Intel 8086/186/286/386s operating on Multibus. A standard IEEE 802.2 logical link control interface was provided to (optional) upper layer software; this permits the controls designer to utilize standard communications protocols (e.g., ISO, TCP/IP) if time permits, or to utilize a very fast link level protocol directly if speed is critical. Both unacknowledged datagram and reliable virtual circuit services are supported. A station operating an 8 MHz Intel 286 as a host can generate a sustained load of 1.8 megabits per second per station, and a 100-byte message can be delivered from the transmitter's user memory to the receiver's user memory, including all operating system and network overhead, in under 4 milliseconds.
Assessing operating characteristics of CAD algorithms in the absence of a gold standard
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roy Choudhury, Kingshuk; Paik, David S.; Yi, Chin A.
2010-04-15
Purpose: The authors examine potential bias when using a reference reader panel as a "gold standard" for estimating operating characteristics of CAD algorithms for detecting lesions. As an alternative, the authors propose latent class analysis (LCA), which does not require an external gold standard to evaluate diagnostic accuracy. Methods: A binomial model for multiple reader detections using different diagnostic protocols was constructed, assuming conditional independence of readings given true lesion status. Operating characteristics of all protocols were estimated by maximum likelihood LCA. Reader panel and LCA based estimates were compared using data simulated from the binomial model for a range of operating characteristics. LCA was applied to 36 thin-section thoracic computed tomography data sets from the Lung Image Database Consortium (LIDC): Free search markings of four radiologists were compared to markings from four different CAD-assisted radiologists. For real data, bootstrap-based resampling methods, which accommodate dependence in reader detections, are proposed to test hypotheses of differences between detection protocols. Results: In simulation studies, reader panel based sensitivity estimates had an average relative bias (ARB) of -23% to -27%, significantly higher (p-value <0.0001) than LCA (ARB -2% to -6%). Specificity was well estimated by both reader panel (ARB -0.6% to -0.5%) and LCA (ARB 1.4%-0.5%). Among 1145 lesion candidates LIDC considered, the LCA-estimated sensitivity of reference readers (55%) was significantly lower (p-value 0.006) than that of CAD-assisted readers (68%). Average false positives per patient for reference readers (0.95) was not significantly lower (p-value 0.28) than for CAD-assisted readers (1.27). Conclusions: Whereas a gold standard based on a consensus of readers may substantially bias sensitivity estimates, LCA may be a significantly more accurate and consistent means of evaluating diagnostic accuracy.
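The core LCA idea (estimating accuracy without a gold standard, under conditional independence) can be sketched with a small two-class EM on simulated reader detections. The parameter values, initialization, and iteration count below are illustrative, not the authors' implementation:

```python
import random

def simulate(n, prev, sens, spec, seed=7):
    """Simulate binary detections by len(sens) readers, conditionally independent
    given the (hidden) true lesion status."""
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        truth = rng.random() < prev
        data.append([int(rng.random() < (s if truth else 1 - p))
                     for s, p in zip(sens, spec)])
    return data

def lca_em(data, n_iter=150):
    """Two-class latent class EM: estimates prevalence, per-reader sensitivity
    and specificity, treating true status as a latent variable."""
    m = len(data[0])
    pi, sens, spec = 0.5, [0.8] * m, [0.8] * m
    for _ in range(n_iter):
        # E-step: posterior probability that each item is truly positive.
        post = []
        for row in data:
            lp, ln = pi, 1.0 - pi
            for j, x in enumerate(row):
                lp *= sens[j] if x else 1.0 - sens[j]
                ln *= 1.0 - spec[j] if x else spec[j]
            post.append(lp / (lp + ln))
        # M-step: re-estimate parameters from the posteriors.
        tot = sum(post)
        pi = tot / len(data)
        for j in range(m):
            pos = sum(p for p, row in zip(post, data) if row[j])
            neg = sum(1 - p for p, row in zip(post, data) if not row[j])
            sens[j] = pos / tot
            spec[j] = neg / (len(data) - tot)
    return pi, sens, spec

data = simulate(2000, prev=0.3, sens=[0.85] * 4, spec=[0.9] * 4)
pi_hat, sens_hat, spec_hat = lca_em(data)
```

With well-separated readers the EM recovers prevalence, sensitivity, and specificity closely, which is the sense in which no external gold standard is needed.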
The purpose of this SOP is to describe the standard approach used for cleaning glassware and plasticware during the Arizona NHEXAS project and the "Border" study. Keywords: lab; equipment; cleaning.
The National Human Exposure Assessment Survey (NHEXAS) is a federal interagency...
NASA Technical Reports Server (NTRS)
Ferrara, Jeffrey; Calk, William; Atwell, William; Tsui, Tina
2013-01-01
MPISS is an automatic file transfer system that implements a combination of standard and mission-unique transfer protocols required by the Global Precipitation Measurement Mission (GPM) Precipitation Processing System (PPS) to control the flow of data between the MOC and the PPS. The primary features of MPISS are file transfers (both with and without PPS-specific protocols), logging of file transfer and system events to local files and a standard messaging bus, short-term storage of data files to facilitate retransmissions, and generation of file transfer accounting reports. The system includes a graphical user interface (GUI) to control the system, allow manual operations, and display events in real time. The PPS-specific protocols are an enhanced version of those that were developed for the Tropical Rainfall Measuring Mission (TRMM). All file transfers between the MOC and the PPS use the SSH File Transfer Protocol (SFTP). For reports and data files generated within the MOC, no additional protocols are used when transferring files to the PPS. For observatory data files, an additional handshaking protocol of data notices and data receipts is used. MPISS generates and sends to the PPS data notices containing data start and stop times along with a checksum for each observatory data file transmitted. MPISS retrieves the PPS-generated data receipts that indicate the success or failure of the PPS to ingest the data file and/or notice. MPISS retransmits the appropriate files as indicated in the receipt when required. MPISS also automatically retrieves files from the PPS. The unique feature of this software is the use of both standard and PPS-specific protocols in parallel. The advantage of this capability is that it supports users that require the PPS protocol as well as those that do not. The system is highly configurable to accommodate the needs of future users.
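The data notice/receipt handshake can be sketched as follows. The field names and JSON encoding are assumptions for illustration (the abstract does not specify the notice format), with MD5 standing in for whatever checksum MPISS actually uses:

```python
import hashlib
import json

def make_data_notice(filename: str, payload: bytes,
                     start: str, stop: str) -> str:
    """Sender side: a notice carrying data start/stop times and a checksum,
    mirroring the handshaking done for observatory data files."""
    notice = {
        "file": filename,
        "data_start": start,
        "data_stop": stop,
        "checksum": hashlib.md5(payload).hexdigest(),
    }
    return json.dumps(notice)

def verify_receipt(notice_json: str, received: bytes) -> bool:
    """Receiver side: ingest succeeds only if the recomputed checksum matches."""
    notice = json.loads(notice_json)
    return hashlib.md5(received).hexdigest() == notice["checksum"]

notice = make_data_notice("obs001.dat", b"telemetry",
                          "2014-03-01T00:00:00Z", "2014-03-01T01:00:00Z")
```

A failed verification would produce a failure receipt, which in MPISS triggers retransmission of the affected file.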
Protocol for monitoring metals in Ozark National Scenic Riverways, Missouri: Version 1.0
Schmitt, Christopher J.; Brumbaugh, William G.; Besser, John M.; Hinck, Jo Ellen; Bowles, David E.; Morrison, Lloyd W.; Williams, Michael H.
2008-01-01
The National Park Service is developing a monitoring plan for the Ozark National Scenic Riverways in southeastern Missouri. Because of concerns about the release of lead, zinc, and other metals from lead-zinc mining to streams, the monitoring plan will include mining-related metals. After considering a variety of alternatives, the plan will consist of measuring the concentrations of cadmium, cobalt, lead, nickel, and zinc in composite samples of crayfish (Orconectes luteus or alternate species) and Asian clam (Corbicula fluminea) collected periodically from selected sites. This document, which comprises a protocol narrative and supporting standard operating procedures, describes the methods to be employed prior to, during, and after collection of the organisms, along with procedures for their chemical analysis and quality assurance; statistical analysis, interpretation, and reporting of the data; and for modifying the protocol narrative and supporting standard operating procedures. A list of supplies and equipment, data forms, and sample labels are also included. An example based on data from a pilot study is presented.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-16
... consumer protection, the Agency issued GLP regulations. The regulations specify minimum standards for the proper conduct of safety testing and contain sections on facilities, personnel, equipment, standard operating procedures (SOPs), test and control articles, quality assurance, protocol and conduct of a safety...
Serial Interface through Stream Protocol on EPICS Platform for Distributed Control and Monitoring
NASA Astrophysics Data System (ADS)
Das Gupta, Arnab; Srivastava, Amit K.; Sunil, S.; Khan, Ziauddin
2017-04-01
Remote operation of equipment and devices is implemented in distributed systems to control and properly monitor process values. For such remote operations, the Experimental Physics and Industrial Control System (EPICS) is an important software tool for control and monitoring of a wide range of scientific parameters. A hardware interface was developed for implementation of the EPICS software so that different equipment such as data converters, power supplies, and pump controllers could be remotely operated through a stream protocol. The EPICS base was set up on Windows as well as Linux operating systems for control and monitoring, while EPICS modules such as asyn and StreamDevice were used to interface the equipment with the standard RS-232/RS-485 protocol. The StreamDevice protocol communicates with the serial line through an interface to asyn drivers. A graphical user interface and alarm handling were implemented with the Motif Editor and Display Manager (MEDM) and Alarm Handler (ALH) command-line channel access utility tools. This paper describes the developed application, which was tested with different equipment and devices serially interfaced to PCs on a distributed network.
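The stream-style interaction with such RS-232/RS-485 instruments is a simple ASCII query/reply exchange. The sketch below mimics that pattern in plain Python; the `:VOLT?` command syntax and the `send` callable are hypothetical stand-ins for a real serial wrapper, not the StreamDevice protocol-file syntax itself.

```python
def stream_query(send, command, terminator="\r\n"):
    """Send an ASCII command over the serial line and return the
    instrument's reply with the terminator stripped. `send` is any
    callable that writes a command string and returns the raw reply
    (e.g. a thin pyserial wrapper)."""
    reply = send(command + terminator)
    return reply.strip()

def parse_reading(reply):
    """Convert a 'VOLT 12.5' style reply into a (name, value) pair."""
    name, value = reply.split()
    return name, float(value)
```

In EPICS itself this query/reply pairing lives in a StreamDevice protocol file, with asyn handling the serial port.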
OSI in the NASA science internet: An analysis
NASA Technical Reports Server (NTRS)
Nitzan, Rebecca
1990-01-01
The Open Systems Interconnection (OSI) protocol suite is the result of a worldwide effort to develop international standards for networking. OSI is formalized through the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). The goal of OSI is to provide interoperability between network products without relying on one particular vendor, and to do so on a multinational basis. The National Institute of Standards and Technology (NIST) has developed a Government OSI Profile (GOSIP) that specifies a subset of the OSI protocols as a Federal Information Processing Standard (FIPS 146). GOSIP compatibility has been adopted as the direction for all U.S. government networks. OSI is extremely diverse, and therefore adherence to a profile will facilitate interoperability within OSI networks. All major computer vendors have indicated current or future support of GOSIP-compliant OSI protocols in their products. The NASA Science Internet (NSI) is an operational network, serving user requirements under NASA's Office of Space Science and Applications. NSI consists of the Space Physics Analysis Network (SPAN), which uses the DECnet protocols, and the NASA Science Network (NSN), which uses TCP/IP protocols. The NSI Project Office is currently working on an OSI integration analysis and strategy. A long-term goal is to integrate SPAN and NSN into one unified network service, using a full OSI protocol suite, which will support the OSSA user community.
NASA Astrophysics Data System (ADS)
Martinez, Ralph; Nam, Jiseung
1992-07-01
Picture Archiving and Communication Systems (PACS) integrate digital image formation in a hospital, encompassing various imaging equipment, image viewing workstations, image databases, and a high-speed network. The integration requires a standardization of communication protocols to connect devices from different vendors. The American College of Radiology and National Electrical Manufacturers Association (ACR-NEMA) standard Version 2.0 provides a point-to-point hardware interface, a set of software commands, and a consistent set of data formats for PACS. It is inadequate for PACS networking environments, however, because of its point-to-point nature and its inflexibility to accommodate other services and protocols in the future. Based on previous experience with PACS development at The University of Arizona, a new communication protocol for PACS networks and an implementation approach were proposed to ACR-NEMA Working Group VI. The defined PACS protocol is intended to facilitate the development of PACSs capable of interfacing with other hospital information systems. It is also intended to allow the creation of diagnostic information databases that can be interrogated by a variety of distributed devices. A particularly important goal is to support communications in a multivendor environment. The new protocol specifications are defined primarily as a combination of the International Organization for Standardization/Open Systems Interconnection (ISO/OSI) protocols, TCP/IP protocols, and the data format portion of the ACR-NEMA standard. This paper addresses the specification and implementation of the ISO-based protocol in a PACS prototype. The protocol specification, which covers the Presentation, Session, Transport, and Network layers, is summarized briefly. The protocol implementation is discussed based on our implementation efforts in the UNIX operating system environment.
Results of a performance comparison between the ISO and TCP/IP implementations are also presented to demonstrate the implementation of the defined protocol. The performance analysis was done by prototyping PACS on available platforms: MicroVAX II, DECstation, and SUN workstations.
Performance Analysis of TCP Enhancements in Satellite Data Networks
NASA Technical Reports Server (NTRS)
Broyles, Ren H.
1999-01-01
This research examines two proposed enhancements to the well-known Transmission Control Protocol (TCP) in the presence of noisy communication links. The Multiple Pipes protocol is an application-level adaptation of standard TCP in which several TCP links cooperate to transfer data. The Space Communication Protocol Standard - Transport Protocol (SCPS-TP) modifies TCP to optimize performance in a satellite environment. While SCPS-TP has inherent advantages that allow it to deliver data more rapidly than Multiple Pipes, the protocol, when optimized for operation in a high-error environment, is not compatible with legacy TCP systems and requires changes to the TCP specification. This investigation determines the level of improvement offered by SCPS-TP's Corruption Mode, which will help determine whether migration to the protocol is appropriate in different environments. As the percentage of corrupted packets approaches 5%, Multiple Pipes can take over five times longer than SCPS-TP to deliver data. At high error rates, SCPS-TP's advantage stems primarily from Multiple Pipes' use of congestion control algorithms. The lack of congestion control, however, limits the systems in which SCPS-TP can be effectively used.
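The core effect described above is that a standard TCP sender misreads corruption losses as congestion and halves its window, while a corruption-aware sender keeps transmitting. The toy model below illustrates that mechanism only; it is a deliberately simplified sketch, not SCPS-TP or Multiple Pipes, and the fixed loss schedule and window parameters are invented.

```python
def rtts_to_deliver(total_segments, corrupt_every, corruption_aware,
                    start_window=1, max_window=64):
    """Count round trips needed to deliver `total_segments` segments.

    Every `corrupt_every`-th RTT a packet is corrupted. A standard
    sender halves its window (misreading corruption as congestion);
    a corruption-aware sender keeps growing. Growth is one segment
    per RTT, mimicking congestion avoidance.
    """
    delivered = 0
    window = start_window
    rtts = 0
    while delivered < total_segments:
        rtts += 1
        delivered += window
        if rtts % corrupt_every == 0 and not corruption_aware:
            window = max(1, window // 2)          # loss misread as congestion
        else:
            window = min(max_window, window + 1)  # congestion avoidance growth
    return rtts
```

Running the model with frequent corruption shows the corruption-aware sender finishing in far fewer round trips, which is the qualitative gap the study quantifies.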
A Constrained and Versioned Data Model for TEAM Data
NASA Astrophysics Data System (ADS)
Andelman, S.; Baru, C.; Chandra, S.; Fegraus, E.; Lin, K.
2009-04-01
The objective of the Tropical Ecology Assessment and Monitoring Network (www.teamnetwork.org) is "To generate real time data for monitoring long-term trends in tropical biodiversity through a global network of TEAM sites (i.e. field stations in tropical forests), providing an early warning system on the status of biodiversity to effectively guide conservation action". To achieve this, the TEAM Network operates by collecting data via standardized protocols at TEAM Sites. The standardized TEAM protocols include the Climate, Vegetation and Terrestrial Vertebrate Protocols. Some sites also implement additional protocols. There are currently 7 TEAM Sites, with plans to grow the network to 15 by June 30, 2009 and 50 TEAM Sites by the end of 2010. At each TEAM Site, data is gathered as defined by the protocols and according to a predefined sampling schedule. The TEAM data is organized and stored in a database based on the TEAM spatio-temporal data model. This data model is at the core of the TEAM Information System: it executes the spatio-temporal queries and analytical functions that are performed on TEAM data, and defines the object data types, relationships and operations that maintain database integrity. The TEAM data model contains object types including types for observation objects (e.g. bird, butterfly and trees), sampling unit, person, role, protocol, site and the relationships of these object types. Each observation data record is a set of attribute values of an observation object and is always associated with a sampling unit, an observation timestamp or time interval, a versioned protocol and data collectors. The operations on the TEAM data model can be classified as read operations, insert operations and update operations.
Following are some typical operations: The operation get(site, protocol, [sampling unit block, sampling unit,] start time, end time) returns all data records using the specified protocol and collected at the specified site, block, sampling unit and time range. The operation insertSamplingUnit(sampling unit, site, protocol) saves a new sampling unit into the data model and links it with the site and protocol. The operation updateSamplingUnit(sampling_unit_id, attribute, value) changes the attribute (e.g. latitude or longitude) of the sampling unit to the specified value. The operation insertData(observation record, site, protocol, sampling unit, timestamps, data collectors) saves a new observation record into the database and associates it with the specified objects. The operation updateData(protocol, data_id, attribute, value) modifies the attribute of an existing observation record to the specified value. All the insert or update operations require: 1) authorization to ensure the user has the necessary privileges to perform the operation; 2) timestamp validation to ensure the observation timestamps are in the designated time range specified in the sampling schedule; and 3) data validation to check that the data records use correct taxonomy terms and data values. No authorization is performed for get operations, but under some specific conditions a username may be required for the purpose of authentication. Along with the validations above, the TEAM data model also supports human-based validation of observed data through the Data Review subsystem to ensure data quality. The data review is implemented by adding two attributes, review_tag and review_comment, to each observation data record. The attribute review_tag is used by a reviewer to specify the quality of the data, and the attribute review_comment is for reviewers to give more information when a problem is identified.
The review_tag attribute can be populated either by the system conducting QA/QC tests or by pre-specified scientific experts. The following is the review operation, which is actually a special case of the operation updateData: The operation updateReview(protocol, data_id, judgment, comment) sets the attributes review_tag and review_comment to the specified values. By systematically tracking every step, the TEAM data model can roll back to any previous state. This is achieved by introducing a historical data container for each editable object type. When the operation updateData is applied to an object to modify its attribute, the object is tagged with the current timestamp and the name of the user who conducts the operation, the tagged object is then moved into the historical data container, and finally a new object is created with the new value for the specified attribute. The diagram illustrates the architecture of the TEAM data management system. A data collector can use the Data Ingestion subsystem to load new data records into the TEAM data model. The system establishes a first level of review (i.e. meets minimum data standards via QA/QC tests). Further review is done by experts, who can verify and provide their comments on data records through the Data Review subsystem. The data editor can then address data records based on the reviewers' comments. Users can use the Data Query and Download application to find data by sites, protocols and time ranges. The Data Query and Download system packages selected data with the data license and important metadata information into a single package and delivers it to the user.
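The roll-back mechanism described above (tag the old object, move it into a historical container, then create the new version) can be sketched in a few lines. The class and attribute names below are illustrative assumptions, not the TEAM Information System's actual schema.

```python
from datetime import datetime, timezone

class VersionedStore:
    """Minimal sketch of attribute-level versioning: before a record
    changes, a tagged snapshot of its old state is appended to a
    historical container, so any previous state can be recovered."""

    def __init__(self):
        self.objects = {}   # data_id -> current record (a dict)
        self.history = []   # tagged snapshots of superseded records

    def insert_data(self, data_id, record):
        self.objects[data_id] = dict(record)

    def update_data(self, data_id, attribute, value, user):
        old = self.objects[data_id]
        # Tag the outgoing version with timestamp and user, then archive it.
        snapshot = dict(old)
        snapshot["_superseded_at"] = datetime.now(timezone.utc).isoformat()
        snapshot["_superseded_by"] = user
        self.history.append(snapshot)
        # Create the new current version with the modified attribute.
        new = dict(old)
        new[attribute] = value
        self.objects[data_id] = new
```

The updateReview operation would be the same mechanism applied to the review_tag and review_comment attributes.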
Nam, Kyoung Won; Lee, Jung Joo; Hwang, Chang Mo; Choi, Seong Wook; Son, Ho Sung; Sun, Kyung
2007-11-01
Currently, personal mobile communication devices have become quite common, and their applications have expanded quickly. Remote communication systems can be employed for the telemonitoring of patients or of the operating status of their medical devices. In this article, we describe the development of a mobile-based artificial heart telemanagement system for use with a wearable extracorporeal pneumatic biventricular assist device, capable of telemonitoring and telecontrolling the operating status of the ventricular assist device from any site. The system developed herein utilized small mobile phones as the client device and adopted the standard Transmission Control Protocol/Internet Protocol (TCP/IP) for telecommunication. The results of in vitro and animal experiments showed that the telemanagement system operated in accordance with the desired parameters.
78 FR 14654 - Standards for Business Practices and Communication Protocols for Public Utilities
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-07
... also require each regional transmission organization (RTO) and independent system operator (ISO) to... 20426, (202) 502-6404. Mindi Sauter (legal issues), Office of the General Counsel, Federal Energy... require each regional transmission organization (RTO) and independent system operator (ISO) to address in...
EDRN Standard Operating Procedures (SOP) — EDRN Public Portal
The NCI’s Early Detection Research Network is developing a number of standard operating procedures for assays, methods, and protocols for the collection and processing of biological samples, and other reference materials, to assist investigators in conducting experiments in a consistent, reliable manner. These SOPs are established by the investigators of the Early Detection Research Network to maintain consistency throughout the Network. These SOPs neither represent a consensus nor constitute recommendations of the NCI.
Developing a Standard Method for Link-Layer Security of CCSDS Space Communications
NASA Technical Reports Server (NTRS)
Biggerstaff, Craig
2009-01-01
Communications security for space systems has been a specialized field generally far removed from considerations of mission interoperability and cross-support; in fact, these considerations often have been viewed as intrinsically opposed to security objectives. The space communications protocols defined by the Consultative Committee for Space Data Systems (CCSDS) have a twenty-five-year history of successful use in over 400 missions. While the CCSDS Telemetry, Telecommand, and Advanced Orbiting Systems protocols for use at OSI Layer 2 are operationally mature, there has been no direct support within these protocols for communications security techniques. Link-layer communications security has been successfully implemented in the past using mission-unique methods, but never before with an objective of facilitating cross-support and interoperability. This paper discusses the design of a standard method for cryptographic authentication, encryption, and replay protection at the data link layer that can be integrated into existing CCSDS protocols without disruption to legacy communications services. Integrating cryptographic operations into existing data structures and processing sequences requires a careful assessment of the potential impediments within spacecraft, ground stations, and operations centers. The objective of this work is to provide a sound method for cryptographic encapsulation of frame data that also facilitates Layer 2 virtual channel switching, such that a mission may procure data transport services as needed without involving third parties in the cryptographic processing, or split independent data streams for separate cryptographic processing.
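The key design point, authenticating frame data while leaving the frame header in the clear so intermediaries can still do virtual channel switching without the key, can be illustrated with a generic keyed MAC. This is a sketch of the general technique only; the field layout and the use of HMAC-SHA-256 are assumptions for illustration, not the CCSDS security protocol's actual format.

```python
import hmac
import hashlib

MAC_LEN = 32  # SHA-256 HMAC length in bytes (an illustrative choice)

def protect_frame(header: bytes, payload: bytes, key: bytes) -> bytes:
    """Append a MAC computed over header + payload. The header stays
    in the clear, so ground equipment can switch virtual channels by
    inspecting it without access to the key."""
    mac = hmac.new(key, header + payload, hashlib.sha256).digest()
    return header + payload + mac

def verify_frame(frame: bytes, header_len: int, key: bytes) -> bytes:
    """Check the MAC in constant time and return the payload, or raise
    if the frame was tampered with."""
    body, mac = frame[:-MAC_LEN], frame[-MAC_LEN:]
    expected = hmac.new(key, body, hashlib.sha256).digest()
    if not hmac.compare_digest(mac, expected):
        raise ValueError("authentication failed")
    return body[header_len:]
```

Replay protection, encryption of the payload, and the real CCSDS field structure would sit on top of this basic encapsulation.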
2013-03-01
series of checkpoints in a complex route network,” while observing standard traffic etiquette and regulations [17]. The rules for the 2012 RoboCup...structure or protocols above the PHY. To support AVEP operation, we developed a packet structure based on the transmission control protocol (TCP...Control Protocol .” 1981. [37] F. Ge, Q. Chen, Y. Wang, C. W. Bostian, T. W. Rondeau, and B. Le, “Cognitive radio: from spectrum sharing to adaptive
Building America House Simulation Protocols
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendron, Robert; Engebrecht, Cheryn
2010-09-01
The House Simulation Protocol document was developed to track and manage progress toward Building America's multi-year, average whole-building energy reduction research goals for new construction and existing homes, using a consistent analytical reference point. This report summarizes the guidelines for developing and reporting these analytical results in a consistent and meaningful manner for all home energy end uses, using standard operating conditions.
Mayer, Dieter; Rancic, Zoran; Pfammatter, Thomas; Hechelhammer, Lukas; Veith, Frank J; Donas, Konstantin; Lachat, Mario
2010-01-01
The value of emergency endovascular aneurysm repair (EVAR) in the setting of ruptured abdominal aortic aneurysm remains controversial owing to differing results. However, interpretation of published results remains difficult as there is a lack of generally accepted protocols or standard operating procedures. Furthermore, such protocols and standard operating procedures often are reported incompletely or not at all, thereby making interpretation of results difficult. We herein report our integrated logistic system for the endovascular treatment of ruptured abdominal aortic aneurysms. Important components of this system are prehospital logistics, in-hospital treatment logistics, and aftercare. Further studies should include details about all of these components, and a description of these logistic components must be included in all future studies of emergency EVAR for ruptured abdominal aortic aneurysms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rice, Mark J.; Bonebrake, Christopher A.; Dayley, Greg K.
Inter-Control Center Communications Protocol (ICCP), defined by the IEC 60870-6 TASE.2 standard, was developed to enable data exchange over wide area networks between electric system entities, including utility control centers, Independent System Operators (ISOs), Regional Transmission Operators (RTOs), and Independent Power Producers (IPPs), also known as Non-Utility Generators (NUGs). ICCP is an unprotected protocol, and as a result is vulnerable to such actions as integrity violation, interception or alteration, spoofing, and eavesdropping. Because of these vulnerabilities with unprotected ICCP communication, security enhancements, referred to as Secure ICCP, have been added and are included in the ICCP products that utilities have received since 2003, when the standard was defined. This has resulted in an ICCP product whose communication can be encrypted and authenticated to address these vulnerabilities.
The purpose of this SOP is to describe the standard approach used for cleaning glassware and plasticware during the Arizona NHEXAS project and the Border study. Keywords: lab; equipment; cleaning.
The U.S.-Mexico Border Program is sponsored by the Environmental Health Workgroup...
The Virtual Insect Brain protocol: creating and comparing standardized neuroanatomy
Jenett, Arnim; Schindelin, Johannes E; Heisenberg, Martin
2006-01-01
Background In the fly Drosophila melanogaster, new genetic, physiological, molecular and behavioral techniques for the functional analysis of the brain are rapidly accumulating. These diverse investigations of the function of the insect brain use gene expression patterns that can be visualized and provide the means for manipulating groups of neurons as a common ground. To take advantage of these patterns one needs to know their typical anatomy. Results This paper describes the Virtual Insect Brain (VIB) protocol, a script suite for the quantitative assessment, comparison, and presentation of neuroanatomical data. It is based on the 3D-reconstruction and visualization software Amira, version 3.x (Mercury Inc.) [1]. Besides its backbone, a standardization procedure that aligns individual 3D images (series of virtual sections obtained by confocal microscopy) to a common coordinate system and computes average intensities for each voxel (volume pixel), the VIB protocol provides an elaborate data management system for data administration. The VIB protocol facilitates direct comparison of gene expression patterns and describes their interindividual variability. It provides volumetry of brain regions and helps to characterize the phenotypes of brain structure mutants. Using the VIB protocol does not require any programming skills, since all operations are carried out at an intuitively usable graphical user interface. Although the VIB protocol has been developed for the standardization of Drosophila neuroanatomy, the program structure can be used for the standardization of other 3D structures as well. Conclusion Standardizing brains and gene expression patterns is a new approach to biological shape and its variability. The VIB protocol provides a first set of tools supporting this endeavor in Drosophila. The script suite is freely available at [2]. PMID:17196102
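The averaging step at the heart of the standardization (voxel-by-voxel mean intensity across already-aligned 3D images) can be sketched in plain Python. This is a conceptual illustration only, not the VIB scripts or the Amira API; real image stacks would use array libraries rather than nested lists.

```python
def average_aligned_stacks(stacks):
    """Average intensities voxel-by-voxel across individual 3D images
    that have already been aligned to a common coordinate system.
    Each stack is a nested list indexed [z][y][x]; all stacks must
    share the same dimensions."""
    n = len(stacks)
    depth = len(stacks[0])
    height = len(stacks[0][0])
    width = len(stacks[0][0][0])
    return [[[sum(s[z][y][x] for s in stacks) / n
              for x in range(width)]
             for y in range(height)]
            for z in range(depth)]
```

The alignment itself (registering each confocal series to the common coordinate system) is the hard part and is handled by the VIB protocol before any averaging takes place.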
Tsang, Lap Fung; Cheng, Hang Cheong; Ho, Hon Shuen; Hsu, Yung Chak; Chow, Chiu Man; Law, Heung Wah; Fong, Lup Chau; Leung, Lok Ming; Kong, Ivy Ching Yan; Chan, Chi Wai; Sham, Alice So Yuen
2016-05-01
Although various drains have long been used in total joint replacement, evidence suggests that inconsistent practice exists in the use of drainage systems, including intermittent application of suction versus free drainage, and variation in the optimal timing of wound drain removal. A comprehensive systematic review of available evidence up to 2013 was conducted in a previous study, and a protocol was adapted for clinical application according to the summary of the retrieved information (Tsang, 2015). The aims were to determine whether the protocol could reduce postoperative blood loss and blood transfusion, and to develop a record form to enhance communication about drainage amongst surgeons and nurses. A quasi-experimental time-series design was undertaken. In the conventional group, surgeons ordered free drainage if the drain output was more than 300 ml, and the time of removal of the drain was based on their professional judgement. In the protocol group, the method of drainage was dependent on the drainage output, as was the timing of removal of the drain. A standardized record form was developed to guide operating room and orthopaedic ward nurses in managing the drainage system. The drain was removed significantly earlier in the protocol group. The blood loss rate in the first hour after operation was extremely low in the protocol group owing to the clamping effect. Blood loss in volume during the first three hours in the protocol group was significantly lower than that in the conventional group. Clamping was necessary in only 11.1% and 4% of cases at the three- and four-hour postoperative points, respectively, and no clamping was required at the two- and eight-hour postoperative points. There was no significant difference between the two groups in blood loss during removal of the drain or in the blood transfusions required upon its removal. This is the first clinical study to develop an evidence-based protocol to manage wound drains effectively in Hong Kong.
Total blood loss and blood transfusions were not significantly different between the conventional and protocol groups. A standardized record form is beneficial for enhancing communication between doctors and nurses as well as for monitoring and observing drainage effectively. Copyright © 2016 Elsevier Ltd. All rights reserved.
Landbird Monitoring Protocol for National Parks in the North Coast and Cascades Network
Siegel, Rodney B.; Wilkerson, Robert L.; Jenkins, Kurt J.; Kuntz, Robert C.; Boetsch, John R.; Schaberl, James P.; Happe, Patricia J.
2007-01-01
This protocol narrative outlines the rationale, sampling design and methods for monitoring landbirds in the North Coast and Cascades Network (NCCN) during the breeding season. The NCCN, one of 32 networks of parks in the National Park System, comprises seven national park units in the Pacific Northwest, including three large, mountainous, natural area parks (Mount Rainier [MORA] and Olympic [OLYM] National Parks, North Cascades National Park Service Complex [NOCA]), and four small historic cultural parks (Ebey's Landing National Historical Reserve [EBLA], Lewis and Clark National Historical Park [LEWI], Fort Vancouver National Historical Park [FOVA], and San Juan Island National Historical Park [SAJH]). The protocol reflects decisions made by the NCCN avian monitoring group, which includes NPS representatives from each of the large parks in the Network as well as personnel from the U.S. Geological Survey Forest and Rangeland Ecosystem Science Center (USGS-FRESC) Olympic Field Station, and The Institute for Bird Populations, at meetings held between 2000 (Siegel and Kuntz, 2000) and 2005. The protocol narrative describes the monitoring program in relatively broad terms, and its structure and content adhere to the outline and recommendations developed by Oakley and others (2003) and adopted by NPS. Finer details of the methodology are addressed in a set of standard operating procedures (SOPs) that accompany the protocol narrative. We also provide appendixes containing additional supporting materials that do not clearly belong in either the protocol narrative or the standard operating procedures.
Compact Modbus TCP/IP protocol for data acquisition systems based on limited hardware resources
NASA Astrophysics Data System (ADS)
Bai, Q.; Jin, B.; Wang, D.; Wang, Y.; Liu, X.
2018-04-01
Modbus TCP/IP has become a standard industrial communication protocol and is widely utilized for establishing sensor-cloud platforms on the Internet. However, numerous existing data acquisition systems built on traditional single-chip microcontrollers without sufficient resources cannot support it, because the complete Modbus TCP/IP protocol depends on a full operating system, which occupies abundant hardware resources. Hence, a compact Modbus TCP/IP protocol is proposed in this work to run efficiently and stably even on a resource-limited hardware platform. First, the Modbus TCP/IP protocol stack is analyzed and a refined protocol suite is rebuilt by streamlining the typical TCP/IP suite. Then, the specific implementation of every hierarchical layer is presented in detail according to the protocol structure. The compact protocol is implemented on a traditional microprocessor to validate the feasibility of the scheme, and its performance is assessed. The experimental results demonstrate that message packets match the frame format of the Modbus TCP/IP protocol and that the average bandwidth reaches 1.15 Mbps. The compact protocol operates stably even on a traditional microcontroller with only 4 kB of RAM and a 12-MHz system clock, and no communication congestion or frequent packet loss occurs.
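The frame format the compact implementation must reproduce is the published Modbus TCP layout: a 7-byte MBAP header (transaction id, protocol id 0, length, unit id) followed by the PDU (function code plus data). A minimal request builder for the standard Read Holding Registers function (0x03) looks like this; it shows the wire format only, not the paper's microcontroller implementation.

```python
import struct

def read_holding_registers_request(transaction_id, unit_id, start_addr, count):
    """Build a Modbus TCP 'Read Holding Registers' (function 0x03) request.

    The PDU is function code, starting address, and register count,
    all big-endian. The MBAP length field counts the bytes that
    follow it: the unit id plus the PDU.
    """
    pdu = struct.pack(">BHH", 0x03, start_addr, count)
    mbap = struct.pack(">HHHB", transaction_id, 0x0000, len(pdu) + 1, unit_id)
    return mbap + pdu
```

A resource-limited firmware would build exactly these 12 bytes with direct buffer writes instead of a struct library, which is what makes the compact stack feasible on a 4-kB-RAM part.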
Lessons Learned from the Afghan Mission Network: Developing a Coalition Contingency Network
2014-01-01
SIPRNet Secret Internet Protocol Router Network SOP Standard Operating Procedure SVTC Secure Video Teleconference (or –Conferencing) TTP Tactics...Voice over internet protocol (VOIP) telephone connectivity • Email • Web browsing • Secure video teleconferencing (SVTC...10, 2012. As of January 15, 2013: http://www.guardian.co.uk/world/2012/oct/10/us-troops-jordan-syria-crisis Baldor, Lolita C., and Pauline Jelinek
Documentation of operational protocol for the use of MAMA software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwartz, Daniel S.
2016-01-21
Image analysis of Scanning Electron Microscope (SEM) micrographs is a complex process that can vary significantly between analysts. The factors causing the variation are numerous, and the purpose of Task 2b is to develop and test a set of protocols designed to minimize variation in image analysis between different analysts and laboratories, specifically using the MAMA software package, Version 2.1. The protocols were designed to be “minimally invasive”, so that expert SEM operators will not be overly constrained in the way they analyze particle samples. The protocols will be tested using a round-robin approach in which results from expert SEM users at Los Alamos National Laboratory, Lawrence Livermore National Laboratory, Pacific Northwest National Laboratory, Savannah River National Laboratory, and the National Institute of Standards and Technology will be compared. The variation of the results will be used to quantify uncertainty in the particle image analysis process. The round-robin exercise will proceed with 3 levels of rigor, each with its own set of protocols, as described below in Tasks 2b.1, 2b.2, and 2b.3. The uncertainty will be developed using NIST standard reference material SRM 1984 “Thermal Spray Powder – Particle Size Distribution, Tungsten Carbide/Cobalt (Acicular)” [Reference 1]. Full details are available in the Certificate of Analysis, posted on the NIST website (http://www.nist.gov/srm/).
Increasing the Automation and Autonomy for Spacecraft Operations with Criteria Action Table
NASA Technical Reports Server (NTRS)
Li, Zhen-Ping; Savki, Cetin
2005-01-01
The Criteria Action Table (CAT) is an automation tool developed for monitoring real-time system messages for specific events and processes in order to take user-defined actions based on a set of user-defined rules. CAT was developed by Lockheed Martin Space Operations as part of a larger NASA effort at the Goddard Space Flight Center (GSFC) to create a component-based, middleware-based, and standards-based general-purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. CAT has been integrated into the upgraded ground systems for the Tropical Rainfall Measuring Mission (TRMM) and Small Explorer (SMEX) satellites, and it plays the central role in their automation effort to reduce cost and increase reliability for spacecraft operations. The GMSEC architecture provides a standard communication interface and protocol for components to publish and subscribe to messages on an information bus. It also provides a standard message definition so components can send and receive messages through the bus interface rather than to each other, thus reducing component-to-component coupling, interfaces, protocols, and link (socket) management. With the GMSEC architecture, components can publish standard event messages to the bus for all nominal, significant, and surprising events in regard to satellite, celestial, ground system, or any other activity. In addition to sending standard event messages, each GMSEC-compliant component is required to accept and process GMSEC directive request messages.
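The decoupling described above (components address subjects on a bus, never each other) can be illustrated with a toy in-process publish/subscribe bus. This is a conceptual sketch only; the subject names are invented for the example and are not actual GMSEC subjects, and the real GMSEC bus is a networked middleware, not a Python dict.

```python
class MessageBus:
    """Toy publish/subscribe bus: subscribers register interest in a
    subject prefix, and publishers never need to know who is listening.
    A CAT-like component would subscribe to event subjects and fire
    rule-driven actions from its callback."""

    def __init__(self):
        self.subscribers = {}  # subject prefix -> list of callbacks

    def subscribe(self, prefix, callback):
        self.subscribers.setdefault(prefix, []).append(callback)

    def publish(self, subject, message):
        # Deliver to every subscriber whose prefix matches the subject.
        for prefix, callbacks in self.subscribers.items():
            if subject.startswith(prefix):
                for cb in callbacks:
                    cb(subject, message)
```

Because publishers and subscribers share only the bus interface and the standard message definition, components can be added or replaced without changing any other component's code.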
Recommended Practices for the Safe Design and Operation of Flywheels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bender, Donald Arthur
2015-12-01
Flywheel energy storage systems are in use globally in increasing numbers. No codes pertaining specifically to flywheel energy storage exist, and a number of industrial incidents have occurred. This protocol recommends a technical basis for safe flywheel design and operation for consideration by flywheel developers, users of flywheel systems, and standards-setting organizations.
Kaddis, John S.; Hanson, Matthew S.; Cravens, James; Qian, Dajun; Olack, Barbara; Antler, Martha; Papas, Klearchos K.; Iglesias, Itzia; Barbaro, Barbara; Fernandez, Luis; Powers, Alvin C.; Niland, Joyce C.
2013-01-01
Preservation of cell quality during shipment of human pancreatic islets for use in laboratory research is a crucial, but neglected, topic. Mammalian cells, including islets, have been shown to be adversely affected by temperature changes in vitro and in vivo, yet protocols that control for thermal fluctuations during cell transport are lacking. To evaluate an optimal method of shipping human islets, an initial assessment of transportation conditions was conducted using standardized materials and operating procedures in 48 shipments sent to a central location by 8 pancreas-processing laboratories using a single commercial airline transporter. Optimization of preliminary conditions was conducted, and human islet quality was then evaluated in 2,338 shipments pre- and post-implementation of a finalized transportation container and standard operating procedures. The initial assessment revealed that the outside temperature ranged from a mean of −4.6±10.3°C to 20.9±4.8°C. Within-container temperature drops to or below 15°C occurred in 16 shipments (36%), while the temperature was stabilized between 15–29°C in 29 shipments (64%). Implementation of an optimized transportation container and operating procedure reduced the number of within-container temperature drops (≤15°C) to 13% (n=37 of 289 winter shipments), improved the number maintained within the desired 15–29°C range to 86% (n=250), but also increased the number reaching or exceeding 29°C to 1% (n=2; overall p<0.0001). Additionally, post-receipt quality ratings of excellent to good improved pre- vs. post-implementation of the standardized protocol, adjusting for pre-shipment purity/viability levels (p<0.0001). Our results show that extreme temperature fluctuations during transport of human islets, occurring when using a commercial airline transporter for long distance shipping, can be controlled using standardized containers, materials, and operating procedures.
This cost-effective and pragmatic standardized protocol for the transportation of human islets can potentially be adapted for use with other mammalian cell systems, and is available online at: http://iidp.coh.org/sops.aspx. PMID:22889479
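The shipment quality metric above amounts to binning each shipment's temperature trace against the 15–29 °C target window. A minimal sketch; the traces are invented, and the tie-breaking order for a shipment that both dips to 15 °C and reaches 29 °C is an assumption, not stated in the study:

```python
def classify_shipments(temperature_logs, low=15.0, high=29.0):
    """Bin each shipment's within-container temperature trace against a
    target window. Returns counts of (below, within, above) shipments.
    A shipment counts as 'below' if it ever reaches <= low; otherwise
    'above' if it ever reaches >= high; otherwise 'within'.
    """
    below = within = above = 0
    for trace in temperature_logs:
        if min(trace) <= low:
            below += 1
        elif max(trace) >= high:
            above += 1
        else:
            within += 1
    return below, within, above

# Hypothetical traces (degC) from three shipments:
logs = [[22, 18, 14.5], [21, 24, 23], [25, 29.5, 27]]
print(classify_shipments(logs))  # (1, 1, 1)
```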
Protocol for Communication Networking for Formation Flying
NASA Technical Reports Server (NTRS)
Jennings, Esther; Okino, Clayton; Gao, Jay; Clare, Loren
2009-01-01
An application-layer protocol and a network architecture have been proposed for data communications among multiple autonomous spacecraft that are required to fly in a precise formation in order to perform scientific observations. The protocol could also be applied to other autonomous vehicles operating in formation, including robotic aircraft, robotic land vehicles, and robotic underwater vehicles. A group of spacecraft or other vehicles to which the protocol applies could be characterized as a precision-formation- flying (PFF) network, and each vehicle could be characterized as a node in the PFF network. In order to support precise formation flying, it would be necessary to establish a corresponding communication network, through which the vehicles could exchange position and orientation data and formation-control commands. The communication network must enable communication during early phases of a mission, when little positional knowledge is available. Particularly during early mission phases, the distances among vehicles may be so large that communication could be achieved only by relaying across multiple links. The large distances and need for omnidirectional coverage would limit communication links to operation at low bandwidth during these mission phases. Once the vehicles were in formation and distances were shorter, the communication network would be required to provide high-bandwidth, low-jitter service to support tight formation-control loops. The proposed protocol and architecture, intended to satisfy the aforementioned and other requirements, are based on a standard layered-reference-model concept. The proposed application protocol would be used in conjunction with conventional network, data-link, and physical-layer protocols. The proposed protocol includes the ubiquitous Institute of Electrical and Electronics Engineers (IEEE) 802.11 medium access control (MAC) protocol to be used in the datalink layer. 
In addition to its widespread and proven use in diverse local-area networks, this protocol offers both (1) a random- access mode needed for the early PFF deployment phase and (2) a time-bounded-services mode needed during PFF-maintenance operations. Switching between these two modes could be controlled by upper-layer entities using standard link-management mechanisms. Because the early deployment phase of a PFF mission can be expected to involve multihop relaying to achieve network connectivity (see figure), the proposed protocol includes the open shortest path first (OSPF) network protocol that is commonly used in the Internet. Each spacecraft in a PFF network would be in one of seven distinct states as the mission evolved from initial deployment, through coarse formation, and into precise formation. Reconfiguration of the formation to perform different scientific observations would also cause state changes among the network nodes. The application protocol provides for recognition and tracking of the seven states for each node and for protocol changes under specified conditions to adapt the network and satisfy communication requirements associated with the current PFF mission phase. Except during early deployment, when peer-to-peer random access discovery methods would be used, the application protocol provides for operation in a centralized manner.
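The seven per-node states are not named in the source; the sketch below uses hypothetical state names and transitions to illustrate the kind of per-node state tracking the application protocol performs, with illegal transitions rejected:

```python
# Hypothetical seven-state model for a PFF node. State names and the
# allowed-transition table are assumptions for illustration only.
ALLOWED = {
    "DEPLOYED":          {"ACQUIRING"},
    "ACQUIRING":         {"COARSE_FORMATION"},
    "COARSE_FORMATION":  {"PRECISE_FORMATION", "ACQUIRING"},
    "PRECISE_FORMATION": {"OBSERVING", "RECONFIGURING"},
    "OBSERVING":         {"RECONFIGURING"},
    "RECONFIGURING":     {"COARSE_FORMATION", "PRECISE_FORMATION"},
    "SAFE":              set(),  # terminal until ground intervention
}

class PFFNode:
    def __init__(self, node_id):
        self.node_id = node_id
        self.state = "DEPLOYED"

    def transition(self, new_state):
        """Apply a state change; SAFE is reachable from anywhere,
        everything else must follow the allowed-transition table."""
        if new_state == "SAFE" or new_state in ALLOWED[self.state]:
            self.state = new_state
            return True
        return False  # reject illegal transition

node = PFFNode("sc-1")
print(node.transition("ACQUIRING"))          # True
print(node.transition("PRECISE_FORMATION"))  # False: coarse formation first
```

In the real protocol, a transition would also trigger the corresponding network reconfiguration (e.g. switching the 802.11 MAC between random-access and time-bounded modes).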
A protocol for lifetime energy and environmental impact assessment of building insulation materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shrestha, Som S., E-mail: shresthass@ornl.gov; Biswas, Kaushik; Desjarlais, Andre O.
This article describes a proposed protocol that is intended to provide a comprehensive list of factors to be considered in evaluating the direct and indirect environmental impacts of building insulation materials, as well as detailed descriptions of standardized calculation methodologies to determine those impacts. The energy and environmental impacts of insulation materials can generally be divided into two categories: (1) direct impact due to the embodied energy of the insulation materials and other factors and (2) indirect or environmental impacts avoided as a result of reduced building energy use due to addition of insulation. Standards and product category rules exist, which provide guidelines about the life cycle assessment (LCA) of materials, including building insulation products. However, critical reviews have suggested that these standards fail to provide complete guidance to LCA studies and suffer from ambiguities regarding the determination of the environmental impacts of building insulation and other products. The focus of the assessment protocol described here is to identify all factors that contribute to the total energy and environmental impacts of different building insulation products and, more importantly, provide standardized determination methods that will allow comparison of different insulation material types. Further, the intent is not to replace current LCA standards but to provide a well-defined, easy-to-use comparison method for insulation materials using existing LCA guidelines. - Highlights: • We proposed a protocol to evaluate the environmental impacts of insulation materials. • The protocol considers all life cycle stages of an insulation material. • Both the direct environmental impacts and the indirect impacts are defined. • Standardized calculation methods for the ‘avoided operational energy’ are defined. • Standardized calculation methods for the ‘avoided environmental impact’ are defined.
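The protocol's two impact categories combine into a single net figure: direct embodied impact minus the avoided operational impact over the service life. A toy calculation under assumed figures (none taken from any LCA database):

```python
def net_lifetime_impact(embodied_kgco2e, annual_energy_saved_kwh,
                        grid_factor_kgco2e_per_kwh, service_life_yr):
    """Net impact = direct (embodied) minus indirect (avoided).
    A negative result means the insulation avoids more impact over its
    service life than it embodies. All inputs below are illustrative."""
    avoided = (annual_energy_saved_kwh * grid_factor_kgco2e_per_kwh
               * service_life_yr)
    return embodied_kgco2e - avoided

# e.g. 120 kgCO2e embodied, saving 300 kWh/yr at 0.4 kgCO2e/kWh over 30 yr:
print(net_lifetime_impact(120, 300, 0.4, 30))  # -3480.0
```

A standardized method matters precisely because each of these four inputs (embodied impact, avoided energy, grid factor, service life) can otherwise be chosen inconsistently between competing products.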
Fu, Xi; Qiao, Jia; Girod, Sabine; Niu, Feng; Liu, Jian Feng; Lee, Gordon K; Gui, Lai
2017-09-01
Mandible contour surgery, including reduction gonioplasty and genioplasty, has become increasingly popular in East Asia. However, it is technically challenging: it has a long learning curve and high complication rates, and often needs secondary revision. The increasing use of 3-dimensional (3-D) technology makes accurate single-stage mandible contour surgery with minimal complication rates possible with a virtual surgical plan (VSP) and 3-D surgical templates. The aim of this study was to establish a standardized protocol for VSP and 3-D surgical template-assisted mandible contour surgery and to evaluate the accuracy of the protocol. We enrolled 20 patients for mandible contour surgery. In our protocol, a VSP is generated from 3-D computed tomography data, and surgical templates are then designed and 3-D printed from the preoperative VSP. The accuracy of the method was analyzed by 3-D comparison of the VSP and postoperative results using detailed computer analysis. All patients had symmetric, natural osteotomy lines and satisfactory facial ratios in a single-stage operation. The average relative error between VSP and postoperative result on the entire skull was 0.41 ± 0.13 mm. The average new left gonial error was 0.43 ± 0.77 mm. The average new right gonial error was 0.45 ± 0.69 mm. The average pogonion error was 0.79 ± 1.21 mm. Patients were very satisfied with the aesthetic results. Surgeons were very satisfied with the performance of the surgical templates in facilitating the operation. Our standardized protocol of VSP and 3-D printed surgical template-assisted single-stage mandible contour surgery results in accurate, safe, and predictable outcomes in a single stage.
NASA Technical Reports Server (NTRS)
Iannicca, Dennis; Hylton, Alan; Ishac, Joseph
2012-01-01
Delay-Tolerant Networking (DTN) is an active area of research in the space communications community. DTN uses a standard layered approach with the Bundle Protocol operating on top of transport layer protocols known as convergence layers that actually transmit the data between nodes. Several different common transport layer protocols have been implemented as convergence layers in DTN implementations including User Datagram Protocol (UDP), Transmission Control Protocol (TCP), and Licklider Transmission Protocol (LTP). The purpose of this paper is to evaluate several stand-alone implementations of negative-acknowledgment based transport layer protocols to determine how they perform in a variety of different link conditions. The transport protocols chosen for this evaluation include Consultative Committee for Space Data Systems (CCSDS) File Delivery Protocol (CFDP), Licklider Transmission Protocol (LTP), NACK-Oriented Reliable Multicast (NORM), and Saratoga. The test parameters that the protocols were subjected to are characteristic of common communications links ranging from terrestrial to cis-lunar and apply different levels of delay, line rate, and error.
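For NACK-based reliable transports like those listed, a useful first-order model of the link-condition sweep is that independent packet loss p costs 1/(1-p) transmissions per packet, capping sustained goodput near line_rate * (1 - p). A sketch of this model, which ignores NACK traffic, coding, and timer effects:

```python
def expected_goodput(line_rate_bps, loss_prob):
    """First-order goodput model for a NACK-based reliable transfer:
    with independent packet loss p and perfect retransmission
    bookkeeping, each packet needs on average 1/(1-p) transmissions,
    so sustained goodput is roughly line_rate * (1 - p)."""
    if not 0 <= loss_prob < 1:
        raise ValueError("loss probability must be in [0, 1)")
    return line_rate_bps * (1 - loss_prob)

for p in (0.0, 0.01, 0.1):
    print(p, expected_goodput(1_000_000, p))
```

Delay, by contrast, mostly affects completion latency rather than sustained rate for these protocols, which is why the paper varies delay, line rate, and error independently.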
Fast-track surgery for uncomplicated appendicitis in children: a matched case-control study.
Cundy, Thomas P; Sierakowski, Kyra; Manna, Alexandra; Cooper, Celia M; Burgoyne, Laura L; Khurana, Sanjeev
2017-04-01
Standardized post-operative protocols reduce variation and enhance efficiency in patient care. Patients may benefit from these initiatives through improved quality of care. This matched case-control study investigates the effect of a multidisciplinary criteria-led discharge protocol for uncomplicated appendicitis in children. Key protocol components included limiting post-operative antibiotics to two intravenous doses, avoidance of intravenous opioid analgesia, prompt resumption of diet, active encouragement of early ambulation and nursing staff autonomy to discharge patients that met assigned criteria. The study period was from August 2015 to February 2016. Outcomes were compared with a historical control group matched for operative approach. Outcomes for 83 patients enrolled to our protocol were compared with those of 83 controls. There was a 29.2% reduction in median post-operative length of stay in our protocol-based care group (19.6 versus 27.7 h; P < 0.001). The rate of discharges within 24 h improved from 12 to 42%. There was no significant difference in complication rate (4.8 versus 7.2%; P = 0.51). Mean oral morphine dose equivalent per kilogram requirement was less than half (46%) that of control group patients (P < 0.001). The mean number of ondansetron doses was also significantly lower. Projected annual direct cost savings following protocol implementation were AUD$77 057. Implementation of a criteria-led discharge protocol at our hospital decreased length of stay, reduced variation in care, preserved existing low morbidity, incurred substantial cost savings, and safely rationalized opioid and antiemetic medication. These protocols are inexpensive and offer tangible benefits that are accessible to all health care settings. © 2016 Royal Australasian College of Surgeons.
From Fob to Noc: A Pathway to a Cyber Career for Combat Veterans
2014-06-01
...Assurance Certifications; GS, general schedule; HSAC, Homeland Security Advisory Council; IDS, intrusion detection system; IP, internet protocol; IPS, ...; NIPRNET, non-secure internet protocol router network; NIST, National Institute of Standards and Technology; NOC, network operations center; NSA, National... "...twice a day on an irregular schedule or during contact with the enemy to keep any observing enemy wary of the force protection condition at any..."
EMPReSS: European mouse phenotyping resource for standardized screens.
Green, Eain C J; Gkoutos, Georgios V; Lad, Heena V; Blake, Andrew; Weekes, Joseph; Hancock, John M
2005-06-15
Standardized phenotyping protocols are essential for the characterization of phenotypes so that results are comparable between different laboratories and phenotypic data can be related to ontological descriptions in an automated manner. We describe a web-based resource for the visualization, searching and downloading of standard operating procedures and other documents, the European Mouse Phenotyping Resource for Standardized Screens-EMPReSS. Direct access: http://www.empress.har.mrc.ac.uk. Contact: e.green@har.mrc.ac.uk.
CCSDS File Delivery Protocol (CFDP): Why it's Useful and How it Works
NASA Technical Reports Server (NTRS)
Ray, Tim
2003-01-01
Reliable delivery of data products is often required across space links. For example, a NASA mission will require reliable delivery of images produced by an on-board detector. Many missions have their own (unique) way of accomplishing this, requiring custom software. Many missions also require manual operations (e.g. the telemetry receiver software keeps track of what data is missing, and a person manually inputs the appropriate commands to request retransmissions). The Consultative Committee for Space Data Systems (CCSDS) developed the CCSDS File Delivery Protocol (CFDP) specifically for this situation. CFDP is an international standard communication protocol that provides reliable delivery of data products. It is designed for use across space links. It will work well if run over the widely used CCSDS Telemetry and Telecommand protocols. However, it can be run over any protocol, and will work well as long as the underlying protocol delivers a reasonable portion of the data. The CFDP receiver will autonomously determine what data is missing, and request retransmissions as needed. The CFDP sender will autonomously perform the requested transmissions. When the entire data product is delivered, the CFDP receiver will let the CFDP sender know that the transaction has completed successfully. The result is that custom software becomes standard, and manual operations become autonomous. This paper will consider various ways of achieving reliable file delivery, explain why CFDP is the optimal choice for use over space links, explain how the core protocol works, and give some guidance on how to best utilize CFDP within various mission scenarios. It will also touch on additional features of CFDP, as well as other uses for CFDP (e.g. the loading of on-board memory and tables).
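The receiver-side behavior described, tracking received data, computing the missing gaps, and requesting retransmission until the product is complete, can be sketched as follows. The segment framing is simplified and is not the actual CFDP PDU format:

```python
# Sketch of receiver-side gap bookkeeping for reliable file delivery.
class FileReceiver:
    def __init__(self, file_size):
        self.file_size = file_size
        self.received = []  # sorted, merged (start, end) half-open ranges

    def on_segment(self, start, end):
        """Record an arriving data segment, merging adjacent ranges."""
        self.received.append((start, end))
        self.received.sort()
        merged = []
        for s, e in self.received:
            if merged and s <= merged[-1][1]:
                merged[-1] = (merged[-1][0], max(merged[-1][1], e))
            else:
                merged.append((s, e))
        self.received = merged

    def missing(self):
        """Gaps to request for retransmission (the NAK list)."""
        gaps, cursor = [], 0
        for s, e in self.received:
            if s > cursor:
                gaps.append((cursor, s))
            cursor = max(cursor, e)
        if cursor < self.file_size:
            gaps.append((cursor, self.file_size))
        return gaps

    def complete(self):
        return not self.missing()

rx = FileReceiver(1000)
rx.on_segment(0, 400)
rx.on_segment(600, 1000)
print(rx.missing())   # [(400, 600)]
rx.on_segment(400, 600)
print(rx.complete())  # True
```

In CFDP the `missing()` result would drive NAK PDUs back to the sender, and `complete()` would trigger the acknowledgment that closes the transaction; this is the autonomy that replaces the manual retransmission requests the abstract mentions.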
Scarani, Valerio; Renner, Renato
2008-05-23
We derive a bound for the security of quantum key distribution with finite resources under one-way postprocessing, based on a definition of security that is composable and has an operational meaning. While our proof relies on the assumption of collective attacks, unconditional security follows immediately for standard protocols such as Bennett-Brassard 1984 and the six-state protocol. For single-qubit implementations of such protocols, we find that the secret key rate becomes positive when at least N ≈ 10^5 signals are exchanged and processed. For any other discrete-variable protocol, unconditional security can be obtained using the exponential de Finetti theorem, but the additional overhead leads to very pessimistic estimates.
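In the asymptotic limit, the one-way BB84 secret-key fraction reduces to the well-known r = 1 - 2h(Q), where h is the binary entropy and Q the quantum bit error rate; the finite-N rates studied in the paper are strictly smaller and approach this value as N grows. A sketch of the asymptotic formula only:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bb84_asymptotic_rate(qber):
    """Asymptotic one-way BB84 secret-key fraction r = 1 - 2 h(Q).
    Positive only below the familiar ~11% QBER threshold."""
    return 1 - 2 * h2(qber)

print(bb84_asymptotic_rate(0.0))        # 1.0
print(bb84_asymptotic_rate(0.12) > 0)   # False: above the threshold
```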
Digital Motion Imagery, Interoperability Challenges for Space Operations
NASA Technical Reports Server (NTRS)
Grubbs, Rodney
2012-01-01
With advances in available bandwidth from spacecraft and between terrestrial control centers, digital motion imagery and video are becoming more practical as data gathering tools for science and engineering, as well as for sharing missions with the public. The digital motion imagery and video industry has done a good job of creating standards for compression, distribution, and physical interfaces. Compressed data streams can easily be transmitted or distributed over radio frequency, internet protocol, and other data networks. All of these standards, however, can make sharing video between spacecraft and terrestrial control centers a frustrating and complicated task when different standards and protocols are used by different agencies. This paper will explore the challenges presented by the abundance of motion imagery and video standards, interfaces and protocols, with suggestions for common formats that could simplify interoperability between spacecraft and ground support systems. Real-world examples from the International Space Station will be examined. The paper will also discuss recent trends in the development of new video compression algorithms, as well as likely expanded use of Delay (or Disruption) Tolerant Networking nodes.
RFID applications in transportation operation and intelligent transportation systems (ITS).
DOT National Transportation Integrated Search
2009-06-01
Radio frequency identification (RFID) transmits the identity of an object or a person wirelessly. It is grouped under the broad category of automatic identification technologies with corresponding standards and established protocols. RFID is suit...
Operations Brigade S3 Replaced by Operations Battalion
2012-12-06
Prisoner abuse exposed at Abu Ghraib prison in Iraq between October and December 2003 highlighted the need for modifications in detainee protocols and ... operations in an effort to further the standardization process initially set in motion after the Abu Ghraib investigations.
NASA Technical Reports Server (NTRS)
Wagner, Raymond S.; Barton, Richard J.
2011-01-01
Standards-based wireless sensor network (WSN) protocols are promising candidates for spacecraft avionics systems, offering unprecedented instrumentation flexibility and expandability. Ensuring reliable data transport is key, however, when migrating from wired to wireless data gathering systems. In this paper, we conduct a rigorous laboratory analysis of the relative performances of the ZigBee Pro and ISA100.11a protocols in a representative crewed aerospace environment. Since both operate in the 2.4 GHz radio frequency (RF) band shared by systems such as Wi-Fi, they are subject at times to potentially debilitating RF interference. We compare goodput (application-level throughput) achievable by both under varying levels of 802.11g Wi-Fi traffic. We conclude that while the simpler, more inexpensive ZigBee Pro protocol performs well under moderate levels of interference, the more complex and costly ISA100.11a protocol is needed to ensure reliable data delivery under heavier interference. This paper represents the first published, rigorous analysis of WSN protocols in an aerospace environment that we are aware of and the first published head-to-head comparison of ZigBee Pro and ISA100.11a.
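Goodput as used in the comparison is simply application payload delivered per unit time, excluding headers and retransmissions. A sketch with invented trial numbers (not the paper's measurements):

```python
def goodput_bps(app_bytes_delivered, elapsed_s):
    """Application-level throughput ('goodput'): payload bytes actually
    delivered to the application per unit time, in bits per second.
    Excludes protocol headers, acknowledgments, and retransmissions."""
    return 8 * app_bytes_delivered / elapsed_s

# Hypothetical trial: 90 kB of sensor payload delivered in 60 s
print(goodput_bps(90_000, 60))  # 12000.0 bits/s
```

Measuring at the application layer is what makes the ZigBee Pro vs. ISA100.11a comparison fair: both protocols' differing overheads and retry behavior under Wi-Fi interference are folded into a single delivered-payload number.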
Ferreri, Matthew; Slagley, Jeremy; Felker, Daniel
2015-01-01
This study compared four treatment protocols to reduce airborne composite fiber particulates during simulated aircraft crash recovery operations. Four different treatments were applied to determine effectiveness in reducing airborne composite fiber particulates as compared to a "no treatment" protocol. Both "gold standard" gravimetric methods and real-time instruments were used to describe mass per volume concentration, particle size distribution, and surface area. The treatment protocols were applying water, wetted water, wax, or aqueous film-forming foam (AFFF) to both burnt and intact tickets of aircraft composite skin panels. The tickets were then cut using a small high-speed rotary tool to simulate crash recovery operations. Aerosol test chamber. None. Airborne particulate control treatments. Measures included concentration units of milligrams per cubic meter of air, particle size distribution as described by both count median diameter and mass median diameter and geometric standard deviation of particles in micrometers, and surface area concentration in units of square micrometers per cubic centimeter. Finally, a Monte Carlo simulation was run on the particle size distribution results. Comparison was made via one-way analysis of variance. A significant difference (p < 0.0001) in idealized particle size distribution was found between the water and wetted water treatments as compared to the other treatments for burnt tickets. Emergency crash recovery operations should include a treatment of the debris with water or wetted water. The resulting increase in particle size will make respiratory protection more effective in protecting the response crews.
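The count median diameter (CMD), mass median diameter (MMD), and geometric standard deviation (GSD) reported above are linked, for a lognormal aerosol, by the Hatch-Choate relation MMD = CMD * exp(3 ln²(GSD)). A sketch of that conversion; the example values are illustrative, not measurements from the study:

```python
import math

def mmd_from_cmd(cmd_um, gsd):
    """Hatch-Choate conversion for a lognormal aerosol size distribution:
    MMD = CMD * exp(3 * (ln GSD)**2). A GSD of 1 (monodisperse) leaves
    the median unchanged."""
    return cmd_um * math.exp(3 * math.log(gsd) ** 2)

# Illustrative: a 1.0 um CMD with GSD 2.0 implies an MMD of ~4.2 um.
print(round(mmd_from_cmd(1.0, 2.0), 2))
```

This is why shifting the count distribution toward larger particles, as the water treatments did, disproportionately shifts the mass distribution that respirator filtration efficiency curves are assessed against.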
Research into alternative network approaches for space operations
NASA Technical Reports Server (NTRS)
Kusmanoff, Antone L.; Barton, Timothy J.
1990-01-01
The main goal is to resolve the interoperability problem of applications employing the DOD TCP/IP (Department of Defense Transmission Control Protocol/Internet Protocol) family of protocols on a CCITT/ISO-based network. The objective is to allow them to communicate over the CCITT/ISO-protocol GPLAN (General Purpose Local Area Network) without modification to the user's application programs. There were two primary assumptions associated with the solution that was actually realized. The first is that the solution had to allow for future movement to the exclusive use of the CCITT/ISO standards. The second is that the solution had to be software transparent to the currently installed TCP/IP and CCITT/ISO user application programs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hacke, Peter; Lokanath, Sumanth; Williams, Paul
2017-10-10
Data indicate that the inverter is the element of the photovoltaic plant that has the highest number of service calls and the greatest operation and maintenance cost burden. This paper describes the projects and relevant background needed in developing design qualification standards that would serve to establish a minimum level of reliability, along with a review of photovoltaic inverter quality and safety standards, most of which are in their infancy. We compare stresses and levels for accelerated testing of inverters proposed in the standard drafts, and those proposed by manufacturers and purchasers of inverters. We also review bases for the methods, stress types, and stress levels for durability testing of key inverter components. Many of the test protocols appear to need more comprehensive inclusion of stress factors existing in the natural environment such as wind driven rain, dust, and grid disturbances. Further understanding of how temperature, humidity ingress, and voltage bias affect the inverters and their components is also required. We provide data indicating inconsistent quality of the inverters and the durability of components leading to greater cost for the photovoltaic plant operator. Accordingly, the recommendation for data collection within quality standards for obtaining cost of ownership metrics is made. Design validation testing using realistic operation, environmental, and connection conditions, including under end-use field conditions with feedback for continuous improvement, is recommended for inclusion within a quality standard.
Recommendations for the use of mist nets for inventory and monitoring of bird populations
Ralph, C. John; Dunn, Erica H.; Peach, Will J.; Handel, Colleen M.; Ralph, C. John; Dunn, Erica H.
2004-01-01
We provide recommendations on best practices for mist netting for the purpose of monitoring population parameters such as abundance and demography. Studies should be carefully thought out before nets are set up, to ensure that the sampling design and estimated sample size will allow study objectives to be met. Station location, number of nets, type of nets, net placement, and schedule of operation should be determined by the goals of the particular project, and we provide guidelines for typical mist-net studies. In the absence of study-specific requirements for novel protocols, commonly used protocols should be adopted to enable comparison of results among studies. Regardless of the equipment, net layout, or netting schedule selected, it is important for all studies that operations be strictly standardized, and a well-written operation protocol will help in attaining this goal. We provide recommendations for data to be collected on captured birds, and emphasize the need for good training of project personnel.
Piao, Wenhua; Kim, Changwon; Cho, Sunja; Kim, Hyosoo; Kim, Minsoo; Kim, Yejin
2016-12-01
In wastewater treatment plants (WWTPs), the portion of operating costs related to electric power consumption is increasing. Reducing electric power consumption, however, can make it difficult to comply with effluent water quality requirements. This study proposes a protocol to minimize environmental impacts and optimize electric power consumption while still meeting effluent water quality standards. The protocol comprises six phases and was tested using operating data from S-WWTP to prove its applicability. The 11 major operating variables were categorized into three groups using principal component analysis and K-means cluster analysis. Life cycle assessment (LCA) was conducted for each group to determine the optimal operating conditions for each operating state. Then, employing mathematical modeling, six improvement plans to reduce electric power consumption were derived. The electric power consumption of each suggested plan was estimated using an artificial neural network, followed by a second round of LCA on the plans. As a result, a set of optimized improvement plans was derived for each group, able to optimize electric power consumption and life cycle environmental impact simultaneously. Based on these test results, the proposed WWTP operating management protocol can suggest optimal operating conditions under which power consumption is minimized with minimal life cycle environmental impact, while allowing the plant to meet water quality requirements.
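The grouping step, clustering operating records by their major variables, can be illustrated with a minimal k-means in plain Python. The two variables and data points below are invented, and the study additionally applies principal component analysis before clustering:

```python
# Minimal k-means: cluster operating records by two standardized
# variables (e.g. inflow and organic load; illustrative only).

def kmeans(points, centroids, iters=20):
    clusters = [[] for _ in centroids]
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in centroids]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[d.index(min(d))].append(p)
        # Update step: move each centroid to its cluster mean.
        centroids = [
            tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else c
            for cl, c in zip(clusters, centroids)
        ]
    return centroids, clusters

pts = [(0.1, 0.2), (0.0, 0.1), (0.9, 1.0), (1.0, 0.8), (0.2, 0.0)]
cents, groups = kmeans(pts, centroids=[(0.0, 0.0), (1.0, 1.0)])
print([len(g) for g in groups])  # [3, 2]
```

Each resulting group then gets its own LCA and its own recommended operating set-points, which is what lets the protocol tailor power-saving plans to distinct operating states rather than to a single plant-wide average.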
[Interface interconnection and data integration in implementing of digital operating room].
Feng, Jingyi; Chen, Hua; Liu, Jiquan
2011-10-01
The digital operating room, with highly integrated clinical information, is very important for saving patients' lives and improving the quality of operations. Since equipment in domestic operating rooms has diversified interfaces and nonstandard communication protocols, designing and implementing an integrated data-sharing program for different kinds of diagnostic, monitoring, and treatment equipment is a key point in the construction of a digital operating room. This paper addresses interface interconnection and data integration for commonly used clinical equipment in terms of hardware interfaces, interface connections, and communication protocols, and offers a solution for the interconnection and integration of clinical equipment in a heterogeneous environment. Based on this solution, a case of an optimized digital operating room is presented. Compared with international solutions for the digital operating room, the solution proposed in this paper is more economical and effective. Finally, this paper provides a proposal for the platform construction of digital operating rooms as well as a viewpoint on the standardization of domestic clinical equipment.
EVA safety: Space suit system interoperability
NASA Technical Reports Server (NTRS)
Skoog, A. I.; McBarron, J. W.; Abramov, L. P.; Zvezda, A. O.
1995-01-01
The results and recommendations of the International Academy of Astronautics extravehicular activities (IAA EVA) Committee work are presented. IAA EVA protocols and operations were analyzed with a view to harmonizing procedures and standardizing safety-critical and operationally important interfaces. The key role of EVA was considered, along with how to remedy the space suit system interoperability deficiencies identified.
Zimmerman, Janice L; Sprung, Charles L
2010-04-01
To provide recommendations and standard operating procedures for intensive care unit and hospital preparations for an influenza pandemic or mass disaster with a specific focus on ensuring that adequate resources are available and appropriate protocols are developed to safely perform procedures in patients with and without influenza illness. Based on a literature review and expert opinion, a Delphi process was used to define the essential topics including performing medical procedures. Key recommendations include: (1) specify high-risk procedures (aerosol generating-procedures); (2) determine if certain procedures will not be performed during a pandemic; (3) develop protocols for safe performance of high-risk procedures that include appropriateness, qualifications of personnel, site, personal protection equipment, safe technique and equipment needs; (4) ensure adequate training of personnel in high-risk procedures; (5) procedures should be performed at the bedside whenever possible; (6) ensure safe respiratory therapy practices to avoid aerosols; (7) provide safe respiratory equipment; and (8) determine criteria for cancelling and/or altering elective procedures. Judicious planning and adoption of protocols for safe performance of medical procedures are necessary to optimize outcomes during a pandemic.
Standardisation of neonatal clinical practice.
Bhutta, Z A; Giuliani, F; Haroon, A; Knight, H E; Albernaz, E; Batra, M; Bhat, B; Bertino, E; McCormick, K; Ochieng, R; Rajan, V; Ruyan, P; Cheikh Ismail, L; Paul, V
2013-09-01
The International Fetal and Newborn Growth Consortium for the 21st Century (INTERGROWTH-21st) is a large-scale, population-based, multicentre project involving health institutions from eight geographically diverse countries, which aims to assess fetal, newborn and preterm growth under optimal conditions. Given the multicentre nature of the project and the expected number of preterm births, it is vital that all centres follow the same standardised clinical care protocols to assess and manage preterm infants, so as to ensure maximum validity of the resulting standards as indicators of growth and nutrition with minimal confounding. Moreover, it is well known that evidence-based clinical practice guidelines can reduce the delivery of inappropriate care and support the introduction of new knowledge into clinical practice. The INTERGROWTH-21st Neonatal Group produced an operations manual, which reflects the consensus reached by members of the group regarding standardised definitions of neonatal morbidities and the minimum standards of care to be provided by all centres taking part in the project. The operational definitions and summary management protocols were developed by consensus through a Delphi process based on systematic reviews of relevant guidelines and management protocols by authoritative bodies. This paper describes the process of developing the Basic Neonatal Care Manual, as well as the morbidity definitions and standardised neonatal care protocols applied across all the INTERGROWTH-21st participating centres. Finally, thoughts about implementation strategies are presented. © 2013 Royal College of Obstetricians and Gynaecologists.
Mukhopadhyay, Dhriti; Wiggins-Dohlvik, Katie C; MrDutt, Mary M; Hamaker, Jeffrey S; Machen, Graham L; Davis, Matthew L; Regner, Justin L; Smith, Randall W; Ciceri, David P; Shake, Jay G
2018-01-01
The transfer of critically ill patients from the operating room (OR) to the surgical intensive care unit (SICU) involves handoffs between multiple providers. Incomplete handoffs lead to poor communication, a major contributor to sentinel events. Our aim was to determine whether handoff standardization led to improvements in caregiver involvement and communication. A prospective intervention study was designed to observe thirty-one patient handoffs from OR to SICU for 49 critical parameters including caregiver presence, perioperative details, and time required to complete key steps. Following a six-month implementation period, thirty-one handoffs were observed to determine improvement. A significant improvement in the presence of physician providers, including intensivists and surgeons, was observed (p = 0.0004 and p < 0.0001, respectively). Critical details were communicated more consistently, including procedure performed (p = 0.0048), complications (p < 0.0001), difficult airways (p < 0.0001), ventilator settings (p < 0.0001) and pressor requirements (p = 0.0134). Conversely, handoff duration did not increase significantly (p = 0.22). Implementation of a standardized protocol for handoffs between OR and SICU significantly improved caregiver involvement and reduced information omission without affecting provider time commitment. Copyright © 2017 Elsevier Inc. All rights reserved.
Nacul, Luis; O'Donovan, Dominic G; Lacerda, Eliana M; Gveric, Djordje; Goldring, Kirstin; Hall, Alison; Bowman, Erinna; Pheby, Derek
2014-06-18
Having previously investigated, through a qualitative study involving extensive discussions with experts and patients, the issues involved in establishing and maintaining a disease-specific brain and tissue bank for myalgic encephalomyelitis/chronic fatigue syndrome (ME/CFS), our aim was to develop a protocol for a UK ME/CFS repository of high-quality human tissue from well-characterised subjects with ME/CFS and controls, suitable for a broad range of research applications. This would involve a specific donor program coupled with rapid tissue collection and processing, supplemented by comprehensive prospectively collected clinical, laboratory and self-assessment data from cases and controls. We reviewed the operations of existing tissue banks from the published literature and from their internal protocols and standard operating procedures (SOPs). On this basis, we developed the protocol presented here, which was designed to meet high technical and ethical standards and legal requirements and was based on recommendations of the MRC UK Brain Banks Network. The facility would be most efficient and cost-effective if incorporated into an existing tissue bank. Tissue collection would be rapid and follow robust protocols to ensure preservation sufficient for a wide range of research uses. A central tissue bank would have resources both for wide-scale donor recruitment and for rapid response to donor death, allowing prompt harvesting and processing of tissue. An ME/CFS brain and tissue bank could be established using this protocol. Success would depend on careful consideration of logistic, technical, legal and ethical issues, continuous consultation with patients and the donor population, and a sustainable model of funding, ideally involving research councils, health services, and patient charities. This initiative could revolutionise the understanding of this still poorly understood disease and enhance the development of diagnostic biomarkers and treatments.
van Putten, Maaike; Aartsma-Rus, Annemieke; Grounds, Miranda D; Kornegay, Joe N; Mayhew, Anna; Gillingwater, Thomas H; Takeda, Shin'ichi; Rüegg, Markus A; De Luca, Annamaria; Nagaraju, Kanneboyina; Willmann, Raffaella
A workshop took place in 2015 to follow up TREAT-NMD activities dedicated to improving quality in the preclinical phase of drug development for neuromuscular diseases. In particular, this workshop addressed necessary future steps regarding common standard experimental protocols and the issue of improving the translatability of preclinical efficacy studies.
He, Longjun; Xu, Lang; Ming, Xing; Liu, Qian
2015-02-01
Three-dimensional post-processing operations on the volume data generated by a series of CT or MR images are of great significance for image reading and diagnosis. As part of the DICOM standard, the WADO service defines how to access DICOM objects on the Web, but it does not cover three-dimensional post-processing operations on series images. This paper analyzes the technical features of three-dimensional post-processing operations on volume data, then presents the design and implementation of a web service system for three-dimensional post-processing of medical images based on the WADO protocol. To improve the scalability of the proposed system, business tasks and calculation operations were separated into two modules. Tests showed that the proposed system can provide three-dimensional post-processing services for multiple clients simultaneously, meeting the demand for accessing such operations on volume data over the web.
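The WADO-URI access that such a system builds on retrieves a single DICOM object through standard HTTP query parameters (defined in DICOM PS3.18). A minimal sketch of building such a request; the host name and UIDs below are placeholders, not real identifiers:

```python
from urllib.parse import urlencode

def wado_uri(base, study_uid, series_uid, object_uid,
             content_type="application/dicom"):
    """Build a WADO-URI request URL (DICOM PS3.18) for one object."""
    params = {
        "requestType": "WADO",        # fixed value required by WADO-URI
        "studyUID": study_uid,
        "seriesUID": series_uid,
        "objectUID": object_uid,
        "contentType": content_type,  # e.g. application/dicom, image/jpeg
    }
    return base + "?" + urlencode(params)

url = wado_uri("http://pacs.example.org/wado",
               "1.2.840.0.1", "1.2.840.0.1.1", "1.2.840.0.1.1.1")
```

The paper's contribution layers 3D post-processing services on top of this object-level access, which WADO itself does not provide.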
Surgical Models of Roux-en-Y Gastric Bypass Surgery and Sleeve Gastrectomy in Rats and Mice
Bruinsma, Bote G.; Uygun, Korkut; Yarmush, Martin L.; Saeidi, Nima
2015-01-01
Bariatric surgery is the only definitive solution currently available for the present obesity pandemic. These operations typically involve reconfiguration of gastrointestinal tract anatomy and confer profound metabolic and physiological benefits, such as substantially reducing body weight and ameliorating type II diabetes. Therefore, animal models of these surgeries offer unique and exciting opportunities to delineate the underlying mechanisms that contribute to the resolution of obesity and diabetes. Here we describe a standardized procedure for mouse and rat models of Roux-en-Y gastric bypass (80–90 minutes operative time) and sleeve gastrectomy (30–45 minutes operative time), which closely resemble the corresponding operations in humans. We also provide detailed protocols for both pre- and post-operative techniques that ensure a high success rate in the operations. These protocols provide the opportunity to mechanistically investigate the systemic effects of the surgical interventions, such as regulation of body weight, glucose homeostasis, and the gut microbiome. PMID:25719268
An International Survey of Brain Banking Operation and Characterization Practices
Palmer-Aronsten, Beatrix; McCrossin, Toni; Kril, Jillian
2016-01-01
Brain banks continue to make a major contribution to the study of neurological and psychiatric disorders. The current complexity and scope of research heighten the need for well-characterized cases and the demand for larger cohorts, and necessitate strategies such as the establishment of bank networks based in regional areas. While individual brain banks have developed protocols that meet researchers' needs within the confines of resources and funding, to further promote collaboration, standardization, and scientific validity, an understanding of the current protocols of participating banks is required. A survey was sent to brain banks, identified by an Internet search, to investigate operational protocols, case characterization, cohort management, data collection, standardization, and degree of collaboration between banks. The majority of the 24 banks that returned the survey have been established for more than 20 years, and most are affiliated with a regional network. While prospective donor programs were the primary source of donation, the data collected on donors varied. Longitudinal information assists case characterization and enhances the analysis capabilities of research; however, acquiring this information depended on the availability of qualified staff. Respondents indicated a high level of importance for standardization, but only 8 of 24 considered that this occurred between banks. Standard diagnostic criteria were not achieved in the classification of controls, and some banks relied on the researcher to indicate the criteria for classification of controls. Although the capacity to collaborate with other banks was indicated by 16 of 24 banks, this occurred infrequently. Engagement of all brain banks to participate toward a consensus of diagnostic tools, especially for controls, will strengthen collaboration. PMID:27399803
Evaluating the Process of Generating a Clinical Trial Protocol
Franciosi, Lui G.; Butterfield, Noam N.; MacLeod, Bernard A.
2002-01-01
The research protocol is the principal document in the conduct of a clinical trial. Its generation requires knowledge about the research problem, the potential experimental confounders, and the relevant Good Clinical Practices for conducting the trial. However, such information is not always available to authors during the writing process. A checklist of over 80 items has been developed to better understand the considerations made by authors in generating a protocol. It is based on the most cited requirements for designing and implementing the randomised controlled trial. Items are categorised according to the trial's research question, experimental design, statistics, ethics, and standard operating procedures. This quality assessment tool evaluates the extent to which a generated protocol deviates from the best-planned clinical trial.
Braune, S; Sperling, C; Maitz, M F; Steinseifer, U; Clauser, J; Hiebl, B; Krajewski, S; Wendel, H P; Jung, F
2017-10-01
The regulatory agencies provide recommendations rather than protocols or standard operating procedures for the hemocompatibility evaluation of novel materials, e.g., for cardiovascular applications. Thus, there is a lack of specifications with regard to test setups and procedures. As a consequence, laboratories worldwide perform in vitro assays under substantially different test conditions, so that inter-laboratory and inter-study comparisons are impossible. Here, we report on a prospective, randomized and double-blind multicenter trial which demonstrates that standardization of in vitro test protocols allows a reproducible assessment of platelet adhesion and activation from fresh human platelet-rich plasma as possible indicators of the thrombogenicity of cardiovascular implants. Standardization of the reported static in vitro setup resulted in a laboratory-independent scoring of the following materials: poly(dimethyl siloxane) (PDMS), poly(ethylene terephthalate) (PET) and poly(tetrafluoro ethylene) (PTFE). The results of this in vitro study provide evidence that inter-laboratory and inter-study comparisons can be achieved for the evaluation of the adhesion and activation of platelets on blood-contacting biomaterials by stringent standardization of test protocols. Copyright © 2017 Elsevier B.V. All rights reserved.
Standardizing Quality Assessment of Fused Remotely Sensed Images
NASA Astrophysics Data System (ADS)
Pohl, C.; Moellmann, J.; Fries, K.
2017-09-01
The multitude of available operational remote sensing satellites led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective, while a quantitative assessment depends on the chosen criteria; depending on the criteria and indices, the result varies. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective evaluation of image fusion quality. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process for objectively comparing fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared across various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.
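Quantitative assessment protocols such as QNR are built up from scalar similarity indices computed between image bands. As an illustration of the kind of building block involved (not the full QNR or Khan protocol, whose formulations go beyond this summary), a minimal sketch of the Wang-Bovik universal image quality index, which combines correlation, luminance and contrast terms into one score:

```python
import numpy as np

def uiqi(x, y):
    """Wang-Bovik universal image quality index between two bands.
    Returns 1.0 for identical inputs; lower values indicate loss of
    correlation, luminance shift, or contrast distortion."""
    x, y = x.astype(float).ravel(), y.astype(float).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    # 4 * covariance * mean_x * mean_y / ((var_x + var_y)(mean_x^2 + mean_y^2))
    return 4 * cov * mx * my / ((vx + vy) * (mx**2 + my**2))

# Toy band: the index of a band with itself is exactly 1
a = np.arange(64, dtype=float).reshape(8, 8) + 1.0
q_same = uiqi(a, a)
q_shifted = uiqi(a, a + 10.0)  # a pure luminance shift lowers the score
```

In practice the index is computed over sliding windows and averaged; the global version above is the simplest form.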
A Multi-Center Space Data System Prototype Based on CCSDS Standards
NASA Technical Reports Server (NTRS)
Rich, Thomas M.
2016-01-01
Deep space missions beyond Earth orbit will require new methods of data communications in order to compensate for increasing Radio Frequency (RF) propagation delay. The Consultative Committee for Space Data Systems (CCSDS) standard protocols Spacecraft Monitor & Control (SM&C), Asynchronous Message Service (AMS), and Delay/Disruption Tolerant Networking (DTN) provide such a method. However, the maturity level of this protocol stack is insufficient for mission inclusion at this time. This Space Data System prototype is intended to provide experience which will raise the Technology Readiness Level (TRL) of this protocol set. In order to reduce costs, future missions can take advantage of these standard protocols, which will result in increased interoperability between control centers. This prototype demonstrates these capabilities by implementing a realistic space data system in which telemetry is published to control center applications at the Jet Propulsion Laboratory (JPL), the Marshall Space Flight Center (MSFC), and the Johnson Space Center (JSC). Reverse publishing paths for commanding from each control center are also implemented. The target vehicle consists of realistic flight computer hardware running Core Flight Software (CFS) in the integrated Power, Avionics, and Software (iPAS) Pathfinder Lab at JSC. This prototype demonstrates a potential upgrade path for future Deep Space Network (DSN) modification, in which the automatic error recovery and communication-gap compensation capabilities of DTN would be exploited. In addition, SM&C provides architectural flexibility by allowing new service providers and consumers to be added efficiently anywhere in the network using the common interface provided by SM&C's Message Abstraction Layer (MAL). In FY 2015, this space data system was enhanced by adding a telerobotic operations capability provided by the Robot API Delegate (RAPID) family of protocols developed at NASA.
RAPID is one of several candidates for consideration and inclusion in a new international standard being developed by the CCSDS Telerobotic Operations Working Group. Software gateways for interfacing RAPID messages with the existing SM&C-based infrastructure were developed. Telerobotic monitor, control, and bridge applications were written in the RAPID framework and then tailored to the NAO telerobotic test article hardware, a product of Aldebaran Robotics.
Improving language mapping in clinical fMRI through assessment of grammar.
Połczyńska, Monika; Japardi, Kevin; Curtiss, Susan; Moody, Teena; Benjamin, Christopher; Cho, Andrew; Vigil, Celia; Kuhn, Taylor; Jones, Michael; Bookheimer, Susan
2017-01-01
Brain surgery in the language-dominant hemisphere remains challenging due to unintended post-surgical language deficits, despite the use of pre-surgical functional magnetic resonance imaging (fMRI) and intraoperative cortical stimulation. Moreover, patients are often recommended not to undergo surgery if the accompanying risk to language appears to be too high. While standard fMRI language mapping protocols may have relatively good predictive value at the group level, they remain sub-optimal on an individual level. The standard tests used typically assess lexico-semantic aspects of language, and they do not accurately reflect the complexity of language, either in comprehension or production, at the sentence level. Among patients with left-hemisphere language dominance, we assessed which tests are best at activating language areas in the brain, comparing grammar tests with standard tests using pre-operative fMRI. Twenty-five surgical candidates (13 females) participated in this study. Sixteen patients presented with a brain tumor, and nine with epilepsy. All participants underwent two pre-operative fMRI protocols: one including the CYCLE-N grammar tests (items testing word order in actives and passives, wh-subject and object questions, relativized subject and object clauses, and past tense marking); and a second one with standard fMRI tests (object naming, auditory and visual responsive naming). fMRI activations during performance in both protocols were compared at the group level, as well as in individual candidates.
The grammar tests generated more volume of activation in the left hemisphere (left/right angular gyrus, right anterior/posterior superior temporal gyrus) and identified additional language regions not shown by the standard tests (e.g., left anterior/posterior supramarginal gyrus). The standard tests produced more activation in left BA 47. Ten participants had more robust activations in the left hemisphere in the grammar tests and two in the standard tests. The grammar tests also elicited substantial activations in the right hemisphere and thus turned out to be superior at identifying both right and left hemisphere contribution to language processing. The grammar tests may be an important addition to the standard pre-operative fMRI testing.
Automated Planning Enables Complex Protocols on Liquid-Handling Robots.
Whitehead, Ellis; Rudolf, Fabian; Kaltenbach, Hans-Michael; Stelling, Jörg
2018-03-16
Robotic automation in synthetic biology is especially relevant for liquid handling to facilitate complex experiments. However, research tasks that are not highly standardized are still rarely automated in practice. Two main reasons for this are the substantial investments required to translate molecular biological protocols into robot programs, and the fact that the resulting programs are often too specific to be easily reused and shared. Recent developments of standardized protocols and dedicated programming languages for liquid-handling operations addressed some aspects of ease-of-use and portability of protocols. However, either they focus on simplicity, at the expense of enabling complex protocols, or they entail detailed programming, with corresponding skills and efforts required from the users. To reconcile these trade-offs, we developed Roboliq, a software system that uses artificial intelligence (AI) methods to integrate (i) generic formal, yet intuitive, protocol descriptions, (ii) complete, but usually hidden, programming capabilities, and (iii) user-system interactions to automatically generate executable, optimized robot programs. Roboliq also enables high-level specifications of complex tasks with conditional execution. To demonstrate the system's benefits for experiments that are difficult to perform manually because of their complexity, duration, or time-critical nature, we present three proof-of-principle applications for the reproducible, quantitative characterization of GFP variants.
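The core idea of systems like the one described, compiling a generic high-level protocol description into executable device-level operations, can be sketched as follows. The step schema and operation names here are invented for illustration and do not reflect Roboliq's actual formats or its AI planning machinery:

```python
# Expand one high-level "distribute" step into low-level liquid-handling
# operations. A real system would additionally optimize tip usage, choose
# labware positions, and insert device-specific commands.
def compile_step(step):
    ops = []
    for well in step["destinations"]:
        ops.append({"op": "aspirate", "source": step["source"],
                    "volume_ul": step["volume_ul"]})
        ops.append({"op": "dispense", "target": well,
                    "volume_ul": step["volume_ul"]})
    return ops

plan = compile_step({"source": "trough1",
                     "destinations": ["A1", "A2", "A3"],
                     "volume_ul": 50})
```

The value of the automated-planning approach is precisely that users write only the high-level step, while the system fills in the hidden programming detail.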
Duan, Dongsheng; Rafael-Fortney, Jill A; Blain, Alison; Kass, David A; McNally, Elizabeth M; Metzger, Joseph M; Spurney, Christopher F; Kinnett, Kathi
2016-02-01
A recent working group meeting focused on contemporary cardiac issues in Duchenne muscular dystrophy (DMD) was hosted by the National Heart, Lung, and Blood Institute in collaboration with the Parent Project Muscular Dystrophy. An outcome of this meeting was to provide freely available detailed protocols for preclinical animal studies. The goal of these protocols is to improve the quality and reproducibility of cardiac preclinical studies aimed at developing new therapeutics for the prevention and treatment of DMD cardiomyopathy.
2015-09-17
network intrusion detection systems; NIST National Institute of Standards and Technology; p-tree protocol tree; PI protocol informatics; PLC programmable logic... electrical, water, oil, natural gas, manufacturing, and pharmaceutical industries, to name a few. The differences between SCADA and DCS systems are often... Oil Company, also known as Saudi Aramco, suffered huge data loss that resulted in the disruption of daily operations for nearly two weeks [BTR13]. As it
In-Office Endoscopic Laryngeal Laser Procedures: A Patient Safety Initiative.
Anderson, Jennifer; Bensoussan, Yael; Townsley, Richard; Kell, Erika
2018-05-01
Objective To review complications of in-office endoscopic laryngeal laser procedures after implementation of a standardized safety protocol. Methods A retrospective review was conducted of the first 2 years of in-office laser procedures at St. Michael's Hospital after the introduction of a standardized safety protocol. The protocol included patient screening and a procedure checklist with standardized reporting of processes, medications, and complications. Primary outcomes measured were complication rates of in-office laryngeal laser procedures. Secondary outcomes included hemodynamic changes, local anesthetic dose, laser settings, total laser/procedure time, and incidence of sedation. Results A total of 145 in-office KTP procedures performed on 65 patients were reviewed. In 98% of cases, the safety protocol was fully implemented. The overall complication rate was 4.8%. No major complications were encountered. Minor complications included vasovagal episodes and patient intolerance. The rate of patient intolerance resulting in early termination of the anticipated procedure was 13.1%. The total local anesthetic dose averaged 172.9 mg of lidocaine per procedure. The mean amount of laser energy dispersed was 261.2 J, with a mean total procedure time of 48.3 minutes. Sixteen percent of patients had preprocedure sedation. Vital signs were found to vary modestly; systolic blood pressure was lower postprocedure in 13.8% of patients and symptomatic in 4.1%. Discussion The review of our standardized safety protocol revealed that in-office laser treatment for laryngeal pathology has extremely low complication rates with safe patient outcomes. Implications for Practice The trend of shifting procedures out of the operating room into the office/clinic setting requires new processes designed to promote patient safety.
Data transmission protocol for Pi-of-the-Sky cameras
NASA Astrophysics Data System (ADS)
Uzycki, J.; Kasprowicz, G.; Mankiewicz, M.; Nawrocki, K.; Sitek, P.; Sokolowski, M.; Sulej, R.; Tlaczala, W.
2006-10-01
The large amount of data collected by automatic astronomical cameras has to be transferred to fast computers in a reliable way. The method chosen should support data streaming in both directions, though in a nonsymmetrical way. The Ethernet interface is a very good choice because of its popularity and proven performance. However, full compliance with existing networks and operating systems requires a TCP/IP stack implementation in devices such as cameras. This paper describes the NUDP protocol, which was designed as a supplement to the standard UDP protocol and can be used as a simple network protocol. NUDP does not require a TCP implementation and makes it possible to run an Ethernet network with simple devices based on microcontrollers and/or FPGA chips. The data transmission scheme was created especially for the "Pi of the Sky" project.
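The abstract does not specify NUDP's actual frame layout, but the general pattern it describes, a small sequence-numbered header carried in plain UDP datagrams so that simple FPGA/microcontroller devices can skip a full TCP stack, can be sketched as follows. The header fields here are illustrative assumptions, not the NUDP wire format:

```python
import socket
import struct

HDR = struct.Struct("!IH")  # 32-bit sequence number, 16-bit payload length

def frame(seq, payload):
    """Prepend a sequence-numbered header to a payload."""
    return HDR.pack(seq, len(payload)) + payload

def unframe(datagram):
    """Split a datagram back into (sequence number, payload)."""
    seq, n = HDR.unpack_from(datagram)
    return seq, datagram[HDR.size:HDR.size + n]

# Loopback demonstration: one socket sends a framed datagram to another
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))          # let the OS pick a free port
rx.settimeout(2.0)
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(frame(7, b"camera frame chunk"), rx.getsockname())
seq, data = unframe(rx.recv(2048))
tx.close()
rx.close()
```

Sequence numbers let the receiver detect gaps and request retransmission explicitly, which is how a supplemented-UDP scheme recovers reliability without TCP's device-side complexity.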
QNAP 1263U Network Attached Storage (NAS)/ Storage Area Network (SAN) Device Users Guide
2016-11-01
standard Ethernet network. Operating either a NAS or SAN is vital for the integrity of the data stored on the drives found in the device. Redundant... speed of the network itself. Many standards are in place for transferring data, including more standard ones such as File Transfer Protocol and Server... following are the procedures for connecting to the NAS administrative web page: 1) Open a web browser and browse to 192.168.40.8:8080. 2) Enter the
How to Stop Disagreeing and Start Cooperating in the Presence of Asymmetric Packet Loss.
Morales-Ponce, Oscar; Schiller, Elad M; Falcone, Paolo
2018-04-22
We consider the design of a disagreement correction protocol in multi-vehicle systems. Vehicles broadcast vital information in real time, such as position, direction, speed, acceleration, and intention. This information is then used to identify risks and adapt trajectories so as to maintain the highest performance without compromising safety. To minimize the risk of using inconsistent information, all cooperating vehicles must agree whether to use the exchanged information and operate in a cooperative mode, or to use only local information and operate in an autonomous mode. However, since wireless communications are prone to failures, it is impossible to deterministically reach an agreement; therefore, any protocol will exhibit necessary disagreement periods. In this paper, we investigate whether vehicles can still cooperate despite communication failures, even in the scenario where communication suddenly becomes unavailable. We present a deterministic protocol that allows all participants either to operate in a cooperative mode, when vehicles can exchange all information in a timely manner, or to operate in an autonomous mode, when messages are lost. We show formally that the disagreement time is bounded by the time the communication channel requires to deliver messages, and we validate our protocol using NS-3 simulations. We explain how the proposed solution can be used in vehicular platooning to attain high performance while still guaranteeing high safety standards despite communication failures.
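The cooperative/autonomous switching rule can be sketched as a freshness check on the last message received from each peer: cooperate only while every peer's information is recent, otherwise fall back to autonomous operation. The class shape and window value below are illustrative assumptions, not the paper's protocol; they only mirror the idea that the disagreement period is bounded by message-delivery time:

```python
class ModeController:
    """Timeout-based mode switch for a cooperating vehicle."""

    COOPERATIVE, AUTONOMOUS = "cooperative", "autonomous"

    def __init__(self, peers, window):
        self.window = window  # maximum allowed age of a peer's message (s)
        self.last_seen = {p: float("-inf") for p in peers}

    def on_message(self, peer, now):
        # record the receive time of the latest message from this peer
        self.last_seen[peer] = now

    def mode(self, now):
        # cooperate only if every peer's last message is within the window
        fresh = all(now - t <= self.window for t in self.last_seen.values())
        return self.COOPERATIVE if fresh else self.AUTONOMOUS

ctl = ModeController(peers=["v1", "v2"], window=0.5)
ctl.on_message("v1", now=0.0)
ctl.on_message("v2", now=0.1)
```

Because the switch depends only on locally observed receive times, each vehicle can apply it deterministically even when the channel delivers nothing at all.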
PMID:29690572
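The core mode-switching rule such a protocol relies on can be sketched in a few lines: a vehicle operates cooperatively only while every peer's broadcast is fresh, and otherwise falls back to autonomous mode. This is an illustrative sketch, not the authors' implementation; the names (`ModeController`, `max_age`) are hypothetical, with `max_age` standing in for the channel's message-delivery time bound.

```python
COOPERATIVE, AUTONOMOUS = "cooperative", "autonomous"

class ModeController:
    """Hypothetical sketch: switch modes based on peer-message freshness."""

    def __init__(self, peers, max_age):
        self.max_age = max_age                          # seconds a message stays "fresh"
        self.last_seen = {p: float("-inf") for p in peers}

    def on_message(self, peer, timestamp):
        self.last_seen[peer] = timestamp                # record most recent broadcast

    def mode(self, now):
        # Cooperative only if *all* peers reported within the freshness bound;
        # any stale or missing peer forces the safe autonomous fallback.
        if all(now - t <= self.max_age for t in self.last_seen.values()):
            return COOPERATIVE
        return AUTONOMOUS
```

Once messages resume, the disagreement period ends after at most one freshness window, which mirrors the bounded disagreement time shown formally in the paper.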
Security Implications of OPC, OLE, DCOM, and RPC in Control Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2006-01-01
OPC is a collection of software programming standards and interfaces used in the process control industry. It is intended to provide open connectivity and vendor equipment interoperability. The use of OPC technology simplifies the development of control systems that integrate components from multiple vendors and support multiple control protocols. OPC-compliant products are available from most control system vendors, and are widely used in the process control industry. OPC was originally known as OLE for Process Control; the first standards for OPC were based on underlying services in the Microsoft Windows computing environment. These underlying services (OLE [Object Linking and Embedding], DCOM [Distributed Component Object Model], and RPC [Remote Procedure Call]) have been the source of many severe security vulnerabilities. It is not feasible to automatically apply vendor patches and service packs to mitigate these vulnerabilities in a control systems environment. Control systems using the original OPC data access technology can thus inherit the vulnerabilities associated with these services. Current OPC standardization efforts are moving away from the original focus on Microsoft protocols, with a distinct trend toward web-based protocols that are independent of any particular operating system. However, the installed base of OPC equipment consists mainly of legacy implementations of the OLE for Process Control protocols.
Code of Federal Regulations, 2010 CFR
2010-07-01
... handling and transfer of VOHAP-containing materials to and from containers, tanks, vats, drums, and piping systems is conducted in a manner that minimizes spills. (2) All containers, tanks, vats, drums, and piping... monitoring protocol that includes operating parameter values to be monitored for compliance and an...
Code of Federal Regulations, 2011 CFR
2011-07-01
... handling and transfer of VOHAP-containing materials to and from containers, tanks, vats, drums, and piping systems is conducted in a manner that minimizes spills. (2) All containers, tanks, vats, drums, and piping... monitoring protocol that includes operating parameter values to be monitored for compliance and an...
An Analysis for an Internet Grid to Support Space Based Operations
NASA Technical Reports Server (NTRS)
Bradford, Robert; McNair, Ann R. (Technical Monitor)
2002-01-01
Currently, and in the past, dedicated communication circuits and "network services" with very stringent performance requirements have been used to support manned and unmanned mission critical ground operations at GSFC, JSC, MSFC, KSC and other NASA facilities. Because of the evolution of network technology, it is time to investigate other approaches to providing mission services for space ground and flight operations. In various scientific disciplines, effort is under way to develop network/computing grids. These grids, consisting of networks and computing equipment, are enabling lower cost science. Specifically, earthquake research is headed in this direction. With a standard for network and computing interfaces using a grid, a researcher would not be required to develop and engineer NASA/DoD-specific interfaces with the attendant increased cost. Use of the Internet Protocol (IP), the CCSDS packet specification, Reed-Solomon coding for satellite error correction, and similar standards can be adopted to provide these interfaces. Generally, most interfaces are developed at least to some degree end to end. This study would investigate the feasibility of using existing standards and protocols to implement a SpaceOps Grid. New interface definitions, or the adoption/modification of existing ones, would be required for the various space operational services: voice (both space based and ground), video, telemetry, commanding, and planning. Security will be a separate focus in the study, since security is such a large issue in using public networks. This SpaceOps Grid would be transparent to users. It would be analogous to the Ethernet protocol's ease of use, in that a researcher would plug in their experiment or instrument at one end and would be connected to the appropriate host or server without further intervention. Free flyers would be in this category as well.
They would be launched and would transmit without any further intervention by the researcher or ground ops personnel. The payback in developing these new approaches in support of manned and unmanned operations is lower cost, and it will enable direct participation by more people in organizations and educational institutions in space based science. By lowering the high cost of space based operations and networking, more resources will be available to the science community for science. With a specific grid in place, experiment development and operations would be much less costly by using standardized network interfaces. Because of the extensive connectivity on a global basis, significant numbers of people would participate in science who otherwise would not be able to participate.
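The CCSDS packet specification mentioned above defines a fixed 6-byte primary header (3-bit version, 1-bit type, 1-bit secondary-header flag, 11-bit APID, 2-bit sequence flags, 14-bit sequence count, and a 16-bit field holding the data length minus one). A minimal sketch of packing that header, following the field layout in CCSDS 133.0-B:

```python
import struct

def ccsds_primary_header(apid, seq_count, data_length, version=0,
                         pkt_type=0, sec_hdr=0, seq_flags=0b11):
    """Pack the 6-byte CCSDS Space Packet primary header.

    The length field carries (data length in bytes) - 1, per the standard.
    seq_flags=0b11 marks an unsegmented packet.
    """
    word1 = (version << 13) | (pkt_type << 12) | (sec_hdr << 11) | (apid & 0x7FF)
    word2 = (seq_flags << 14) | (seq_count & 0x3FFF)
    word3 = data_length - 1
    return struct.pack(">HHH", word1, word2, word3)   # big-endian, as on the wire
```

Adopting a framing standard like this at the interface boundary is exactly what would let a grid route instrument data without mission-specific engineering.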
Upgrade to the control system of the reflectometry diagnostic of ASDEX upgrade
NASA Astrophysics Data System (ADS)
Graça, S.; Santos, J.; Manso, M. E.
2004-10-01
The broadband frequency modulation-continuous wave microwave/millimeter wave reflectometer of ASDEX upgrade tokamak (Institut für Plasma Physik (IPP), Garching, Germany) developed by Centro de Fusão Nuclear (Lisboa, Portugal) with the collaboration of IPP, is a complex system with 13 channels (O and X modes) and two types of operation modes (swept and fixed frequency). The control system that ensures remote operation of the diagnostic incorporates VME and CAMAC bus based acquisition/timing systems. Microprocessor input/output boards are used to control and monitor the microwave circuitry and associated electronic devices. The implementation of the control system is based on an object-oriented client/server model: a centralized server manages the hardware and receives input from remote clients. Communication is handled through transmission control protocol/internet protocol sockets. Here we describe recent upgrades of the control system aiming to: (i) accommodate new channels; (ii) adapt to the heterogeneity of computing platforms and operating systems; and (iii) overcome remote access restrictions. Platform and operating system independence was achieved by redesigning the graphical user interface in JAVA. As secure shell is the standard remote access protocol adopted in major fusion laboratories, secure shell tunneling was implemented to allow remote operation of the diagnostic through the existing firewalls.
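The centralized client/server model described above (a server owning the hardware, remote clients sending commands over TCP/IP sockets) can be sketched as follows. This is an illustrative sketch only; the command name `SWEEP_START` and the reply format are hypothetical placeholders, not the diagnostic's actual command set.

```python
import socket
import threading

def serve_once(host="127.0.0.1", port=0):
    """Start a one-shot command server; returns the port it listens on."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)
    chosen_port = srv.getsockname()[1]

    def handler():
        conn, _ = srv.accept()
        with conn:
            cmd = conn.recv(1024).decode().strip()
            # A real server would drive the microwave circuitry here;
            # this sketch only acknowledges a recognized command.
            reply = "ACK " + cmd if cmd == "SWEEP_START" else "ERR unknown"
            conn.sendall(reply.encode())
        srv.close()

    threading.Thread(target=handler, daemon=True).start()
    return chosen_port

def send_command(port, cmd):
    """Client side: open a TCP connection, send one command, read the reply."""
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall(cmd.encode())
        return c.recv(1024).decode()
```

Because the transport is plain TCP, the same pattern tunnels cleanly over SSH, which is how the upgrade works through site firewalls.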
System approach to distributed sensor management
NASA Astrophysics Data System (ADS)
Mayott, Gregory; Miller, Gordon; Harrell, John; Hepp, Jared; Self, Mid
2010-04-01
Since 2003, the US Army's RDECOM CERDEC Night Vision Electronic Sensor Directorate (NVESD) has been developing a distributed Sensor Management System (SMS) that utilizes a framework which demonstrates application layer, net-centric sensor management. The core principles of the design support distributed and dynamic discovery of sensing devices and processes through a multi-layered implementation. This results in a sensor management layer that acts as a System with defined interfaces for which the characteristics, parameters, and behaviors can be described. Within the framework, the definition of a protocol is required to establish the rules for how distributed sensors should operate. The protocol defines the behaviors, capabilities, and message structures needed to operate within the functional design boundaries. The protocol definition addresses the requirements for a device (sensors or processes) to dynamically join or leave a sensor network, dynamically describe device control and data capabilities, and allow dynamic addressing of publish and subscribe functionality. The message structure is a multi-tiered definition that identifies standard, extended, and payload representations that are specifically designed to accommodate the need for standard representations of common functions, while supporting the need for feature-based functions that are typically vendor specific. The dynamic qualities of the protocol enable a User GUI application the flexibility of mapping widget-level controls to each device based on reported capabilities in real-time. The SMS approach is designed to accommodate scalability and flexibility within a defined architecture. The distributed sensor management framework and its application to a tactical sensor network will be described in this paper.
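The multi-tiered message structure and dynamic join/leave behavior described above can be sketched as follows. The field names are illustrative, not the actual SMS schema; the point is the separation of standard, extended (vendor-specific), and payload tiers, plus a registry from which a GUI could map controls to reported capabilities at runtime.

```python
def make_message(device_id, capabilities, payload=None, extensions=None):
    """Build a tiered message: standard / extended / payload."""
    return {
        "standard": {"device_id": device_id, "type": "announce"},  # common fields
        "extended": extensions or {},   # feature-based, vendor-specific fields
        "payload": payload,             # opaque sensor data
        "capabilities": capabilities,   # reported so a GUI can map its widgets
    }

class SensorRegistry:
    """Devices dynamically join and leave; capabilities are discoverable."""

    def __init__(self):
        self.devices = {}

    def join(self, msg):
        self.devices[msg["standard"]["device_id"]] = msg["capabilities"]

    def leave(self, device_id):
        self.devices.pop(device_id, None)

    def capabilities(self, device_id):
        return self.devices.get(device_id, [])
```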
Operation of the HP2250 with the HP9000 series 200 using PASCAL 3.0
NASA Technical Reports Server (NTRS)
Perry, John; Stroud, C. W.
1986-01-01
A computer program has been written to provide an interface between the HP Series 200 desktop computers, operating under HP Standard Pascal 3.0, and the HP2250 Data Acquisition and Control System. Pascal 3.0 for the HP9000 desktop computer gives a number of procedures for handling bus communication at various levels. It is necessary, however, to reach the lowest possible level in Pascal to handle the bus protocols required by the HP2250. This makes programming extremely complex since these protocols are not documented. The program described solves those problems and allows the user to immediately program, simply and efficiently, any measurement and control language (MCL/50) application with a few procedure calls. The complete set of procedures is available on a 5 1/4 inch diskette from Cosmic. Included in this group of procedures is an Exerciser which allows the user to exercise his HP2250 interactively. The exerciser operates in a fashion similar to the Series 200 operating system programs, but is adapted to the requirements of the HP2250. The programs on the diskette and the user's manual assume the user is acquainted with both the MCL/50 programming language and HP Standard Pascal 3.0 for the HP series 200 desktop computers.
Tuning of automatic exposure control strength in lumbar spine CT.
D'Hondt, A; Cornil, A; Bohy, P; De Maertelaer, V; Gevenois, P A; Tack, D
2014-05-01
To investigate the impact of tuning the automatic exposure control (AEC) strength curve (specific to Care Dose 4D®; Siemens Healthcare, Forchheim, Germany) from "average" to "strong" on image quality, radiation dose and operator dependency during lumbar spine CT examinations. Two hospitals (H1, H2), both using the same scanners, were considered for two time periods (P1 and P2). During P1, the AEC curve was "average" and radiographers had to select one of two protocols according to the body mass index (BMI): "standard" if BMI <30.0 kg/m² (120 kV-330 mAs) or "large" if BMI >30.0 kg/m² (140 kV-280 mAs). During P2, the AEC curve was changed to "strong", and all acquisitions were obtained with one protocol (120 kV and 270 mAs). Image quality was scored and patients' diameters calculated for both periods. 497 examinations were analysed. There was no significant difference in mean diameters according to hospitals and periods (p > 0.801) and in quality scores between periods (p > 0.172). There was a significant difference between hospitals regarding how often the "large" protocol was assigned [13 (10%)/132 patients in H1 vs 37 (28%)/133 in H2] (p < 0.001). During P1, volume CT dose index (CTDIvol) was higher in H2 (+13%; p = 0.050). In both hospitals, CTDIvol was reduced between periods (-19.2% in H1 and -29.4% in H2; p < 0.001). An operator dependency in protocol selection was observed that could not be explained by patient diameters or detected in image quality scores. Tuning the AEC curve from average to strong enables suppression of the operator dependency in protocol selection and related dose increase, while preserving image quality. CT acquisition protocols based on weight are responsible for biases in protocol selection. Using an appropriate AEC strength curve reduces the number of protocols to one. Operator dependency of protocol selection is thereby eliminated.
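The two policies compared in the study amount to a branching rule versus a constant. A sketch of both, using the kV/mAs values quoted in the abstract (function names are illustrative):

```python
def protocol_p1(bmi):
    """Period 1: the radiographer selects by BMI -> operator-dependent choice."""
    if bmi > 30.0:
        return {"kV": 140, "mAs": 280}   # "large" protocol
    return {"kV": 120, "mAs": 330}       # "standard" protocol

def protocol_p2(_bmi=None):
    """Period 2: a single protocol; the stronger AEC curve absorbs
    patient-size variation, so no human branch point remains."""
    return {"kV": 120, "mAs": 270}
```

Removing the human branch point is what eliminated the between-hospital difference in protocol assignment.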
The operations manual: a mechanism for improving the research process.
Bowman, Ann; Wyman, Jean F; Peters, Jennifer
2002-01-01
The development and use of an operations manual has the potential to improve the capacity of nurse scientists to address the complex, multifaceted issues associated with conducting research in today's healthcare environment. An operations manual facilitates communication, standardizes training and evaluation, and enhances the development and standard implementation of clear policies, processes, and protocols. A 10-year review of methodology articles in relevant nursing journals revealed no attention to this topic. This article will discuss how an operations manual can improve the conduct of research methods and outcomes for both small-scale and large-scale research studies. It also describes the purpose and components of a prototype operations manual for use in quantitative research. The operations manual increases reliability and reproducibility of the research while improving the management of study processes. It can prevent costly and untimely delays or errors in the conduct of research.
Wilson, Marcia H.; Rowe, Barbara L.; Gitzen, Robert A.; Wilson, Stephen K.; Paintner-Green, Kara J.
2014-01-01
As recommended by Oakley et al. (2003), this protocol provides a narrative and the rationale for selection of streams and rivers within the NGPN that will be measured for water quality, including dissolved oxygen, pH, specific conductivity, and temperature. Standard operating procedures (SOPs) that detail the steps to collect, manage, and disseminate the NGPN water quality data are in an accompanying document. The sampling design documented in this protocol may be updated as monitoring information is collected and interpreted, and as refinement of methodologies develop through time. In addition, evaluation of data and refinement of the program may necessitate potential changes of program objectives. Changes to the NGPN water quality protocols and SOPs will be carefully documented in a revision history log.
Improving operating room safety
2009-01-01
Despite the introduction of the Universal Protocol, patient safety in surgery remains a daily challenge in the operating room. This present study describes one community health system's efforts to improve operating room safety through human factors training and ultimately the development of a surgical checklist. Using a combination of formal training, local studies documenting operating room safety issues and peer to peer mentoring we were able to substantially change the culture of our operating room. Our efforts have prepared us for successfully implementing a standardized checklist to improve operating room safety throughout our entire system. Based on these findings we recommend a multimodal approach to improving operating room safety. PMID:19930577
Demonstrating a Realistic IP Mission Prototype
NASA Technical Reports Server (NTRS)
Rash, James; Ferrer, Arturo B.; Goodman, Nancy; Ghazi-Tehrani, Samira; Polk, Joe; Johnson, Lorin; Menke, Greg; Miller, Bill; Criscuolo, Ed; Hogie, Keith
2003-01-01
Flight software and hardware and realistic space communications environments were elements of recent demonstrations of the Internet Protocol (IP) mission concept in the lab. The Operating Missions as Nodes on the Internet (OMNI) Project and the Flight Software Branch at NASA/GSFC collaborated to build the prototype of a representative space mission that employed unmodified off-the-shelf Internet protocols and technologies for end-to-end communications between the spacecraft/instruments and the ground system/users. The realistic elements used in the prototype included an RF communications link simulator and components of the TRIANA mission flight software and ground support system. A web-enabled camera connected to the spacecraft computer via an Ethernet LAN represented an on-board instrument creating image data. In addition to the protocols at the link layer (HDLC), transport layer (UDP, TCP), and network (IP) layer, a reliable file delivery protocol (MDP) at the application layer enabled reliable data delivery both to and from the spacecraft. The standard Network Time Protocol (NTP) performed on-board clock synchronization with a ground time standard. The demonstrations of the prototype mission illustrated some of the advantages of using Internet standards and technologies for space missions, but also helped identify issues that must be addressed. These issues include applicability to embedded real-time systems on flight-qualified hardware, range of applicability of TCP, and liability for and maintenance of commercial off-the-shelf (COTS) products. The NASA Earth Science Technology Office (ESTO) funded the collaboration to build and demonstrate the prototype IP mission.
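The protocol stack used in the prototype (HDLC at the link layer, IP at the network layer, UDP/TCP at the transport layer) works by successive encapsulation. A simplified sketch of that layering; the header layouts here are deliberately minimal placeholders, not the real UDP, IP, or HDLC wire formats:

```python
import struct

def wrap_udp(payload, src_port, dst_port):
    # Simplified UDP-like header: source port, destination port, length.
    return struct.pack(">HHH", src_port, dst_port, len(payload)) + payload

def wrap_ip(segment, src_addr, dst_addr):
    # Simplified IP-like header: 32-bit source and destination addresses.
    return struct.pack(">II", src_addr, dst_addr) + segment

def wrap_hdlc(packet):
    # HDLC frames are delimited by the 0x7E flag byte (bit/byte stuffing omitted).
    FLAG = b"\x7e"
    return FLAG + packet + FLAG
```

Each layer treats the one above it as opaque bytes, which is exactly what lets unmodified off-the-shelf applications (MDP, NTP) run end to end over a space link.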
Recommendations for a service framework to access astronomical archives
NASA Technical Reports Server (NTRS)
Travisano, J. J.; Pollizzi, J.
1992-01-01
There are a large number of astronomical archives and catalogs on-line for network access, with many different user interfaces and features. Some systems are moving towards distributed access, supplying users with client software for their home sites which connects to servers at the archive site. Many of the issues involved in defining a standard framework of services that archive/catalog suppliers can use to achieve a basic level of interoperability are described. Such a framework would simplify the development of client and server programs to access the wide variety of astronomical archive systems. The primary services that are supplied by current systems include: catalog browsing, dataset retrieval, name resolution, and data analysis. The following issues (and probably more) need to be considered in establishing a standard set of client/server interfaces and protocols: Archive Access - dataset retrieval, delivery, file formats, data browsing, analysis, etc.; Catalog Access - database management systems, query languages, data formats, synchronous/asynchronous mode of operation, etc.; Interoperability - transaction/message protocols, distributed processing mechanisms (DCE, ONC/SunRPC, etc), networking protocols, etc.; Security - user registration, authorization/authentication mechanisms, etc.; Service Directory - service registration, lookup, port/task mapping, parameters, etc.; Software - public vs proprietary, client/server software, standard interfaces to client/server functions, software distribution, operating system portability, data portability, etc. Several archive/catalog groups, notably the Astrophysics Data System (ADS), are already working in many of these areas. In the process of developing StarView, which is the user interface to the Space Telescope Data Archive and Distribution Service (ST-DADS), these issues and the work of others were analyzed. 
A framework of standard interfaces for accessing services on any archive system which would benefit archive user and supplier alike is proposed.
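The proposed framework boils down to one interface that every archive implements, so a single client can talk to many archives. A minimal sketch under that assumption; the method names (`browse_catalog`, `retrieve_dataset`, `resolve_name`) are hypothetical stand-ins for the primary services listed above:

```python
from abc import ABC, abstractmethod

class ArchiveService(ABC):
    """Hypothetical standard interface shared by all archive suppliers."""

    @abstractmethod
    def browse_catalog(self, query): ...

    @abstractmethod
    def retrieve_dataset(self, dataset_id): ...

    @abstractmethod
    def resolve_name(self, object_name): ...

class DemoArchive(ArchiveService):
    """A toy in-memory archive implementing the common interface."""

    def __init__(self, catalog):
        self.catalog = catalog

    def browse_catalog(self, query):
        return [key for key in self.catalog if query in key]

    def retrieve_dataset(self, dataset_id):
        return self.catalog[dataset_id]

    def resolve_name(self, object_name):
        return object_name.upper()   # stand-in for a real name resolver
```

A client like StarView would then be written once against `ArchiveService` rather than once per archive system.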
NASA Technical Reports Server (NTRS)
Norcross, Jason; Jarvis, Sarah; Bekdash, Omar; Cupples, Scott; Abercromby, Andrew
2017-01-01
The primary objective of this study is to develop a protocol to reliably characterize human health and performance metrics for individuals working inside various EVA suits under realistic spaceflight conditions. Expected results and methodologies developed during this study will provide the baseline benchmarking data and protocols with which future EVA suits and suit configurations (e.g., varied pressure, mass, center of gravity [CG]) and different test subject populations (e.g., deconditioned crewmembers) may be reliably assessed and compared. Results may also be used, in conjunction with subsequent testing, to inform fitness-for-duty standards, as well as design requirements and operations concepts for future EVA suits and other exploration systems.
NASA Technical Reports Server (NTRS)
Rash, James; Parise, Ron; Hogie, Keith; Criscuolo, Ed; Langston, Jim; Jackson, Chris; Price, Harold; Powers, Edward I. (Technical Monitor)
2000-01-01
The Operating Missions as Nodes on the Internet (OMNI) project at NASA's Goddard Space Flight Center (GSFC) is demonstrating the use of standard Internet protocols for spacecraft communication systems. This year, demonstrations of Internet access to a flying spacecraft have been performed with the UoSAT-12 spacecraft owned and operated by Surrey Satellite Technology Ltd. (SSTL). Previously, demonstrations were performed using a ground satellite simulator and NASA's Tracking and Data Relay Satellite System (TDRSS). These activities are part of NASA's Space Operations Management Office (SOMO) Technology Program. The work is focused on defining the communication architecture for future NASA missions, both to support NASA's "faster, better, cheaper" concept and to enable new types of collaborative science. The use of standard Internet communication technology for spacecraft simplifies design, supports initial integration and test across an IP-based network, and enables direct communication between scientists and instruments as well as between different spacecraft. The most recent demonstrations consisted of uploading an Internet Protocol (IP) software stack to the UoSAT-12 spacecraft, simple modifications to the SSTL ground station, and a series of tests to measure performance of various Internet applications. The spacecraft was reconfigured on orbit at very low cost. The total period between concept and the first tests was only 3 months. The tests included basic network connectivity (PING), automated clock synchronization (NTP), and reliable file transfers (FTP). Future tests are planned to include additional protocols such as Mobile IP, e-mail, and virtual private networks (VPN) to enable automated, operational spacecraft communication networks. The work performed and the results of the initial phase of tests are summarized in this paper. This work is funded and directed by NASA/GSFC with technical leadership by CSC, in arrangement with SSTL and Vytek Wireless.
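The NTP clock-synchronization test mentioned above rests on a small calculation: from the four standard timestamps (t0 client transmit, t1 server receive, t2 server transmit, t3 client receive), NTP estimates the clock offset and round-trip delay. A sketch of that computation, following the standard NTP formulas:

```python
def ntp_offset_delay(t0, t1, t2, t3):
    """Standard NTP estimates from the four exchange timestamps.

    offset: how far the client clock lags the server clock;
    delay:  the round-trip time excluding server processing.
    """
    offset = ((t1 - t0) + (t2 - t3)) / 2.0
    delay = (t3 - t0) - (t2 - t1)
    return offset, delay
```

The symmetry assumption (equal uplink and downlink delay) is notable for space links, where pass geometry can make the two legs unequal.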
Asynchronous Message Service Reference Implementation
NASA Technical Reports Server (NTRS)
Burleigh, Scott C.
2011-01-01
This software provides a library of middleware functions with a simple application programming interface, enabling implementation of distributed applications in conformance with the CCSDS AMS (Consultative Committee for Space Data Systems Asynchronous Message Service) specification. The AMS service, and its protocols, implement an architectural concept under which the modules of mission systems may be designed as if they were to operate in isolation, each one producing and consuming mission information without explicit awareness of which other modules are currently operating. Communication relationships among such modules are self-configuring; this tends to minimize complexity in the development and operations of modular data systems. A system built on this model is a society of generally autonomous, inter-operating modules that may fluctuate freely over time in response to changing mission objectives, module functional upgrades, and recovery from individual module failure. The purpose of AMS, then, is to reduce mission cost and risk by providing standard, reusable infrastructure for the exchange of information among data system modules in a manner that is simple to use, highly automated, flexible, robust, scalable, and efficient. The implementation is designed to spawn multiple threads of AMS functionality under the control of an AMS application program. These threads enable all members of an AMS-based, distributed application to discover one another in real time, subscribe to messages on specific topics, and to publish messages on specific topics. The query/reply (client/server) communication model is also supported. Message exchange is optionally subject to encryption (to support confidentiality) and authorization. Fault tolerance measures in the discovery protocol minimize the likelihood of overall application failure due to any single operational error anywhere in the system.
The multi-threaded design simplifies processing while enabling application nodes to operate at high speeds; linked lists protected by mutex semaphores and condition variables are used for efficient, inter-thread communication. Applications may use a variety of transport protocols underlying AMS itself, including TCP (Transmission Control Protocol), UDP (User Datagram Protocol), and message queues.
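The inter-thread pattern described above (a queue guarded by a mutex and a condition variable) can be sketched as follows. This is a generic illustration of the technique, not AMS's actual C data structures:

```python
import threading
from collections import deque

class TopicQueue:
    """A message queue protected by a mutex + condition variable,
    as used for efficient inter-thread communication."""

    def __init__(self):
        self.items = deque()
        self.lock = threading.Lock()
        self.ready = threading.Condition(self.lock)

    def publish(self, msg):
        with self.ready:                  # acquire the mutex
            self.items.append(msg)
            self.ready.notify()           # wake one waiting subscriber thread

    def next_message(self, timeout=None):
        with self.ready:
            while not self.items:         # guard against spurious wakeups
                if not self.ready.wait(timeout):
                    return None           # timed out with no message
            return self.items.popleft()
```

Blocking on the condition variable rather than polling is what lets subscriber threads stay idle at zero CPU cost until a message actually arrives.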
2014-01-01
Background Our aim, having previously investigated through a qualitative study involving extensive discussions with experts and patients the issues involved in establishing and maintaining a disease specific brain and tissue bank for myalgic encephalomyelitis/chronic fatigue syndrome (ME/CFS), was to develop a protocol for a UK ME/CFS repository of high quality human tissue from well characterised subjects with ME/CFS and controls suitable for a broad range of research applications. This would involve a specific donor program coupled with rapid tissue collection and processing, supplemented by comprehensive prospectively collected clinical, laboratory and self-assessment data from cases and controls. Findings We reviewed the operations of existing tissue banks from published literature and from their internal protocols and standard operating procedures (SOPs). On this basis, we developed the protocol presented here, which was designed to meet high technical and ethical standards and legal requirements and was based on recommendations of the MRC UK Brain Banks Network. The facility would be most efficient and cost-effective if incorporated into an existing tissue bank. Tissue collection would be rapid and follow robust protocols to ensure preservation sufficient for a wide range of research uses. A central tissue bank would have resources both for wide-scale donor recruitment and rapid response to donor death for prompt harvesting and processing of tissue. Conclusion An ME/CFS brain and tissue bank could be established using this protocol. Success would depend on careful consideration of logistic, technical, legal and ethical issues, continuous consultation with patients and the donor population, and a sustainable model of funding ideally involving research councils, health services, and patient charities. This initiative could revolutionise the understanding of this still poorly-understood disease and enhance development of diagnostic biomarkers and treatments. 
PMID:24938650
Internet Data Delivery for Future Space Missions
NASA Technical Reports Server (NTRS)
Rash, James; Hogie, Keith; Casasanta, Ralph; Hennessy, Joseph F. (Technical Monitor)
2002-01-01
This paper presents work being done at NASA/GSFC (Goddard Space Flight Center) on applying standard Internet applications and protocols to meet the technology challenge of future satellite missions. Internet protocols (IP) can provide seamless dynamic communication among heterogeneous instruments, spacecraft, ground stations, and constellations of spacecraft. A primary component of this work is to design and demonstrate automated end-to-end transport of files in a dynamic space environment using off-the-shelf, low-cost, commodity-level standard applications and protocols. These functions and capabilities will become increasingly significant in the years to come as both Earth and space science missions fly more sensors and the present labor-intensive, mission-specific techniques for processing and routing data become prohibitively expensive. This paper describes how an IP-based communication architecture can support existing operations concepts and how it will enable some new and complex communication and science concepts. The authors identify specific end-to-end file transfers all the way from instruments to control centers and scientists, and then describe how each data flow can be supported using standard Internet protocols and applications. The scenarios include normal data downlink and command uplink as well as recovery scenarios for both onboard and ground failures. The scenarios are based on an Earth orbiting spacecraft with data rates and downlink capabilities from 300 Kbps to 4 Mbps. Many examples are based on designs currently being investigated for the Global Precipitation Measurement (GPM) mission.
Brensel, Robert; Brensel, Scott; Ng, Amy
2013-01-01
Since the New England Compounding Center disaster in 2012, the importance of following correct procedures during every phase of customized pharmacy has been a focus of governmental interest and action as well as public scrutiny. Many pharmacies rely on the rote review of standard operating procedures to ensure that staff members understand and follow protocols that ensure the safety and potency of all compounds prepared, but that approach to continuing education can be cumbersome and needlessly time-consuming. In addition, documenting and retrieving evidence of employee competence can be difficult. In this article, we describe our use of online technology to improve our methods of educating staff about the full range of standard operating procedures that must be followed in our pharmacy. The system we devised and implemented has proven to be effective, easy to update and maintain, very inexpensive, and user friendly. Its use has reduced the time previously required for a read-over review of standard operating procedures from 30 or 40 minutes to 5 or 10 minutes in weekly staff meetings, and we can now easily document and access proof of employees' comprehension of that content. It is our hope that other small compounding pharmacies will also find this system of online standard operating procedure review helpful.
Study on Network Error Analysis and Locating based on Integrated Information Decision System
NASA Astrophysics Data System (ADS)
Yang, F.; Dong, Z. H.
2017-10-01
Integrated information decision system (IIDS) integrates multiple sub-systems developed by many facilities, comprising almost a hundred kinds of software, which provide various services such as email, short messages, and drawing and sharing. Because the underlying protocols differ and user standards are not unified, many errors occur during the setup, configuration, and operation stages, which seriously affects usage. Because the errors are varied and may happen in different operation phases, stages, TCP/IP communication protocol layers, and sub-system software, it is necessary to design a network error analysis and locating tool for IIDS to solve the above problems. This paper studies network error analysis and locating based on IIDS, which provides strong theoretical and technical support for the running and communication of IIDS.
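The locating idea, mapping an observed symptom to the TCP/IP layer where diagnosis should start, can be sketched as a simple classification table. The symptom names and the mapping here are hypothetical illustrations, not taken from the IIDS tool:

```python
# Hypothetical symptom -> starting layer for diagnosis.
SYMPTOM_LAYER = {
    "no_link":          "physical/link",
    "no_route":         "network (IP)",
    "connection_reset": "transport (TCP)",
    "bad_response":     "application",
}

def locate(symptom):
    """Return the TCP/IP layer at which to begin error analysis."""
    return SYMPTOM_LAYER.get(symptom, "unknown")
```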
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendron, R.; Engebrecht, C.
The House Simulation Protocol document was developed to track and manage progress toward Building America's multi-year, average whole-building energy reduction research goals for new construction and existing homes, using a consistent analytical reference point. This report summarizes the guidelines for developing and reporting these analytical results in a consistent and meaningful manner for all home energy uses using standard operating conditions.
This procedure summarizes the sample shipping procedures that have been described in the individual NHEXAS sample collection protocols. This procedure serves as a quick reference tool for the field staff when samples are prepared for shipment at the field lab/staging area. For ea...
40 CFR 63.642 - General standards.
Code of Federal Regulations, 2010 CFR
2010-07-01
... reduction. (4) Data shall be reduced in accordance with the EPA-approved methods specified in the applicable section or, if other test methods are used, the data and methods shall be validated according to the protocol in Method 301 of appendix A of this part. (e) Each owner or operator of a source subject to this...
This technical report provides a description of the field project design, quality control, the sampling protocols and analysis methodology used, and standard operating procedures for the South Fork Broad River Watershed (SFBR) Total Maximum Daily Load (TMDL) project. This watersh...
Network Monitor and Control of Disruption-Tolerant Networks
NASA Technical Reports Server (NTRS)
Torgerson, J. Leigh
2014-01-01
For nearly a decade, NASA and many researchers in the international community have been developing Internet-like protocols that allow for automated network operations in networks where the individual links between nodes are only sporadically connected. A family of Disruption-Tolerant Networking (DTN) protocols has been developed, and many are reaching CCSDS Blue Book status. A NASA version of DTN known as the Interplanetary Overlay Network (ION) has been flight-tested on the EPOXI spacecraft and ION is currently being tested on the International Space Station. Experience has shown that in order for a DTN service-provider to set up a large scale multi-node network, a number of network monitor and control technologies need to be fielded as well as the basic DTN protocols. The NASA DTN program is developing a standardized means of querying a DTN node to ascertain its operational status, known as the DTN Management Protocol (DTNMP), and the program has developed some prototypes of DTNMP software. While DTNMP is a necessary component, it is not sufficient to accomplish Network Monitor and Control of a DTN network. JPL is developing a suite of tools that provide for network visualization, performance monitoring and ION node control software. This suite of network monitor and control tools complements the GSFC and APL-developed DTN MP software, and the combined package can form the basis for flight operations using DTN.
Hardware platform for multiple mobile robots
NASA Astrophysics Data System (ADS)
Parzhuber, Otto; Dolinsky, D.
2004-12-01
This work is concerned with software and communications architectures that might facilitate the operation of several mobile robots. The vehicles should be remotely piloted or tele-operated via a wireless link between the operator and the vehicles. The wireless link will carry control commands from the operator to the vehicle, telemetry data from the vehicle back to the operator, and frequently also a real-time video stream from an onboard camera. For autonomous driving, the link will carry commands and data between the vehicles. For this purpose we have developed a hardware platform which consists of a powerful microprocessor, different sensors, a stereo camera, and a Wireless Local Area Network (WLAN) interface for communication. The adoption of the IEEE 802.11 standard for the physical and access-layer protocols allows straightforward integration with the TCP/IP Internet protocols. For inspection of the environment, the robots are equipped with a wide variety of sensors such as ultrasonic and infrared proximity sensors and a small inertial measurement unit. Stereo cameras enable the detection of obstacles, the measurement of distance, and the creation of a map of the room.
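The control loop described above (commands down, telemetry back over a TCP/IP link) can be sketched with ordinary sockets. The JSON packet layout and field names below are invented for illustration, and localhost stands in for the robot's WLAN address; the platform's actual message formats are not given in the abstract:

```python
import json
import socket
import threading

def telemetry_server(ready, result):
    # Robot side: accept one connection, record the packet, acknowledge.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("127.0.0.1", 0))              # OS-assigned port
        result["port"] = srv.getsockname()[1]
        srv.listen(1)
        ready.set()
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(4096)
            result["packet"] = json.loads(data.decode())
            conn.sendall(b"ACK")

ready, result = threading.Event(), {}
t = threading.Thread(target=telemetry_server, args=(ready, result))
t.start()
ready.wait()

# Operator side: send one command packet over the TCP link.
packet = {"cmd": "drive", "speed_mps": 0.4, "heading_deg": 90}
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", result["port"]))
    cli.sendall(json.dumps(packet).encode())
    ack = cli.recv(16)
t.join()
print(result["packet"], ack)  # the packet round-trips intact; ACK received
```

Running both ends over loopback keeps the sketch self-contained; on the real platform the same code would target the vehicle's WLAN address, with the video stream carried separately.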
Şükür, Yavuz Emre; Koyuncu, Kazibe; Seval, Mehmet Murat; Çetinkaya, Esra; Dökmeci, Fulya
2017-12-01
To evaluate the performances of five different βhCG follow-up protocols after single-dose methotrexate therapy for tubal ectopic pregnancy (EP). Data of patients who received single-dose methotrexate therapy for tubal EP at a university hospital between January 2011 and July 2016 were reviewed. 'Successful methotrexate treatment' was defined as resolution of the EP without need for surgery. The performances of the different protocols were tested by comparison with the currently used '15% βhCG decrease between days 4 and 7' protocol. The tested follow-up protocols were '20%, 25%, and any βhCG decrease between days 0/1 and 7' and '20% and any βhCG decrease between days 0/1 and 4'. Among the 96 patients evaluated, 12 (12.5%) required a second dose. In total, 91 (94.8%) patients were treated successfully with no need for surgery. Four patients were operated on within 4 days following the second dose. One patient who did not need a second dose according to the standard follow-up protocol underwent surgery on the 10th day due to rupture (specificity = 80%). Two protocols, namely '20% βhCG decrease between days 0/1 and 7' and 'any βhCG decrease between days 0/1 and 7', did not show statistically significant differences from the index protocol regarding the number of patients who should be assigned to a second dose. The 'any βhCG decrease between days 0/1 and 7' protocol may substitute for the currently used one when deciding on second-dose methotrexate in tubal EP management. Omitting the 4th-day measurement seems to be more convenient and cost effective.
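Each of the follow-up rules compared above reduces to checking a fractional βhCG decline between two measurement days against a threshold. A small sketch with invented titre values (illustrative only, not clinical guidance):

```python
def bhcg_decline(earlier, later):
    """Fractional decline from the earlier to the later βhCG measurement."""
    return (earlier - later) / earlier

def needs_second_dose(earlier, later, threshold):
    # A second dose is indicated when the decline misses the protocol threshold.
    return bhcg_decline(earlier, later) < threshold

# Index protocol: >=15% decrease between days 4 and 7.
print(needs_second_dose(1200, 950, 0.15))  # False: ~20.8% decline, adequate
# 'Any decrease between days 0/1 and 7' protocol: threshold of zero.
print(needs_second_dose(1500, 1480, 0.0))  # False: any decline suffices
print(needs_second_dose(1500, 1600, 0.0))  # True: the titre is rising
```

The study's candidate protocols differ only in which pair of days is compared and in the threshold value, which is what makes this head-to-head comparison straightforward.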
Koudryavtcev, Sergey A; Lazarev, Vyacheslav M
2011-01-01
Automatic blood pressure (BP) measuring devices are increasingly used in BP self-checks and in 24-hour BP monitoring. Nowadays, 24-hour BP monitoring is a necessary procedure in arterial hypertension treatment. The aim of this study was to validate the BPLab(®) ambulatory blood pressure monitor according to the European standard BS EN 1060-4:2004 and the British Hypertension Society (BHS) protocol, as well as to work out recommendations regarding the suitability of this device for clinical practice. A group of 85 patients of both sexes and different ages, who voluntarily agreed to take part in the tests and were given detailed instructions on the measurement technique, was recruited for this study. The results of the BP measurements obtained by a qualified operator using the BPLab(®) device were compared with the BP values measured using the Korotkoff auscultatory method. Data were obtained simultaneously by two experts, each with over 10 years of experience, who had completed a noninvasive BP measurement standardization training course. Discrepancies in the systolic and diastolic BP measurements (N = 510; 255 for each expert) were analyzed according to the criteria specified in the BHS-93 protocol. The device passed the requirements of the European standard BS EN 1060-4:2004 and was graded 'A' according to the criteria of the BHS protocol for both systolic and diastolic BP. The BPLab(®) 24-hour ambulatory blood pressure monitoring device may be recommended for extensive clinical use.
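The BHS-93 grade reported above is computed from the cumulative percentage of device-observer differences falling within 5, 10, and 15 mmHg. A sketch using the commonly cited grade cutoffs (treat the exact thresholds as an assumption to be verified against the protocol text before use):

```python
# Minimum cumulative % of absolute differences within 5/10/15 mmHg for each
# BHS grade (commonly cited BHS-93 cutoffs; verify against the protocol).
GRADES = [
    ("A", (60, 85, 95)),
    ("B", (50, 75, 90)),
    ("C", (40, 65, 85)),
]

def bhs_grade(differences_mmhg):
    n = len(differences_mmhg)
    within = [100 * sum(abs(d) <= lim for d in differences_mmhg) / n
              for lim in (5, 10, 15)]
    for grade, mins in GRADES:
        if all(w >= m for w, m in zip(within, mins)):
            return grade
    return "D"

# Illustrative data: 70% within 5 mmHg, 90% within 10, 100% within 15.
diffs = [2] * 70 + [8] * 20 + [12] * 10
print(bhs_grade(diffs))  # A
```

In the study each grade would be computed separately for systolic and diastolic readings over the 510 paired measurements.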
Software Implements a Space-Mission File-Transfer Protocol
NASA Technical Reports Server (NTRS)
Rundstrom, Kathleen; Ho, Son Q.; Levesque, Michael; Sanders, Felicia; Burleigh, Scott; Veregge, John
2004-01-01
CFDP is a computer program that implements the CCSDS (Consultative Committee for Space Data Systems) File Delivery Protocol, an international standard for automatic, reliable transfers of files of data between locations on Earth and in outer space. CFDP administers concurrent file transfers in both directions, delivery of data out of transmission order, reliable and unreliable transmission modes, and automatic retransmission of lost or corrupted data by use of one or more of several lost-segment-detection modes. The program also implements several data-integrity measures, including file checksums and optional cyclic redundancy checks for each protocol data unit. The metadata accompanying each file can include messages to users' application programs and commands for operating on remote file systems.
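The file checksum mentioned above is, in the CCSDS File Delivery Protocol specification, a 32-bit modular sum over the file data taken as 4-byte words. A sketch in that spirit, written from the spec's general description rather than the flight code (in the real protocol, word alignment is against absolute file offsets, which matters when segments arrive out of order; here the file is treated as one contiguous buffer):

```python
def cfdp_style_checksum(data: bytes) -> int:
    """32-bit modular sum over big-endian 4-byte words; the final partial
    word is zero-padded on the right, and carries beyond 32 bits are dropped."""
    total = 0
    for i in range(0, len(data), 4):
        word = data[i:i + 4].ljust(4, b"\x00")
        total = (total + int.from_bytes(word, "big")) & 0xFFFFFFFF
    return total

print(hex(cfdp_style_checksum(b"\x00\x00\x00\x01\x00\x00\x00\x02")))  # 0x3
print(hex(cfdp_style_checksum(b"\xff\xff\xff\xff\x00\x00\x00\x01")))  # 0x0
```

A modular sum like this is cheap enough to compute incrementally as segments arrive, which is why the protocol pairs it with optional per-PDU cyclic redundancy checks for stronger error detection.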
Anzalone, Nicoletta; Castellano, Antonella; Cadioli, Marcello; Conte, Gian Marco; Cuccarini, Valeria; Bizzi, Alberto; Grimaldi, Marco; Costa, Antonella; Grillea, Giovanni; Vitali, Paolo; Aquino, Domenico; Terreni, Maria Rosa; Torri, Valter; Erickson, Bradley J; Caulo, Massimo
2018-06-01
Purpose To evaluate the feasibility of a standardized protocol for acquisition and analysis of dynamic contrast material-enhanced (DCE) and dynamic susceptibility contrast (DSC) magnetic resonance (MR) imaging in a multicenter clinical setting and to verify its accuracy in predicting glioma grade according to the new World Health Organization 2016 classification. Materials and Methods The local research ethics committees of all centers approved the study, and informed consent was obtained from patients. One hundred patients with glioma were prospectively examined at 3.0 T in seven centers that performed the same preoperative MR imaging protocol, including DCE and DSC sequences. Two independent readers identified the perfusion hotspots on maps of volume transfer constant (Ktrans), plasma (vp) and extravascular-extracellular space (ve) volumes, initial area under the concentration curve, and relative cerebral blood volume (rCBV). Differences in parameters between grades and molecular subtypes were assessed by using Kruskal-Wallis and Mann-Whitney U tests. Diagnostic accuracy was evaluated by using receiver operating characteristic curve analysis. Results The whole protocol was tolerated in all patients. Perfusion maps were successfully obtained in 94 patients. An excellent interreader reproducibility of DSC- and DCE-derived measures was found. Among DCE-derived parameters, vp and ve had the highest accuracy (area under the receiver operating characteristic curve [Az] = 0.847 and 0.853) for glioma grading. DSC-derived rCBV had the highest accuracy overall (Az = 0.894), but the difference was not statistically significant (P > .05). Among lower-grade gliomas, a moderate increase in both vp and rCBV was evident in isocitrate dehydrogenase wild-type tumors, although this was not significant (P > .05). Conclusion A standardized multicenter acquisition and analysis protocol of DCE and DSC MR imaging is feasible and highly reproducible. 
Both techniques showed a comparable, high diagnostic accuracy for grading gliomas. © RSNA, 2018 Online supplemental material is available for this article.
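The Az figures quoted above are areas under receiver operating characteristic curves. Az equals the probability that a randomly chosen higher-grade case shows a larger parameter value than a randomly chosen lower-grade case, so it can be computed directly from pairwise rank comparisons. A small stdlib sketch with invented rCBV hotspot values (not the study's data):

```python
def auc(neg, pos):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation:
    the fraction of (pos, neg) pairs ranked correctly, ties counting half."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical rCBV hotspot values for lower- vs higher-grade gliomas.
lower_grade = [1.1, 1.6, 2.0, 2.4]
higher_grade = [1.9, 3.0, 3.8, 5.2]
print(auc(lower_grade, higher_grade))  # 0.875
```

An Az of 0.5 corresponds to chance-level discrimination and 1.0 to perfect separation, which frames the study's reported values of 0.847-0.894.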
Colloidal Fouling of Nanofiltration Membranes: Development of a Standard Operating Procedure
Al Mamun, Md Abdullaha; Bhattacharjee, Subir; Pernitsky, David; Sadrzadeh, Mohtada
2017-01-01
Fouling of nanofiltration (NF) membranes is the most significant obstacle to the development of a sustainable and energy-efficient NF process. Colloidal fouling and performance decline in NF processes is complex due to the combination of cake formation and salt concentration polarization effects, which are influenced by the properties of the colloids and the membrane, the operating conditions of the test, and the solution chemistry. Although numerous studies have been conducted to investigate the influence of these parameters on the performance of the NF process, the importance of membrane preconditioning (e.g., compaction and equilibrating with salt water), as well as the determination of key parameters (e.g., critical flux and trans-membrane osmotic pressure) before the fouling experiment have not been reported in detail. The aim of this paper is to present a standard experimental and data analysis protocol for NF colloidal fouling experiments. The developed methodology covers preparation and characterization of water samples and colloidal particles, pre-test membrane compaction and critical flux determination, measurement of experimental data during the fouling test, and the analysis of that data to determine the relative importance of various fouling mechanisms. The standard protocol is illustrated with data from a series of flat sheet, bench-scale experiments. PMID:28106775
NASA'S Standard Measures During Bed Rest: Adaptations in the Cardiovascular System
NASA Technical Reports Server (NTRS)
Lee, Stuart M. C.; Feiveson, Alan H.; Martin, David S.; Cromwell, Roni L.; Platts, Steven H.; Stenger, Michael B.
2016-01-01
Bed rest is a well-accepted analog of space flight that has been used extensively to investigate physiological adaptations in a larger number of subjects in a shorter amount of time than can be studied with space flight and without the confounding effects associated with normal mission operations. However, comparisons across studies of different bed rest durations, between sexes, and between various countermeasure protocols have been hampered by dissimilarities in bed rest conditions, measurement protocols, and testing schedules. To address these concerns, NASA instituted standard bed rest conditions and standard measures for all physiological disciplines participating in studies conducted at the Flight Analogs Research Unit (FARU) at the University of Texas Medical Branch. Investigators for individual studies employed their own targeted study protocols to address specific hypothesis-driven questions, but standard measures tests were conducted within these studies on a non-interference basis to maximize data availability while reducing the need to implement multiple bed rest studies to understand the effects of a specific countermeasure. When possible, bed rest standard measures protocols were similar to tests nominally used for medically-required measures or research protocols conducted before and after Space Shuttle and International Space Station missions. Specifically, bed rest standard measures for the cardiovascular system implemented before, during, and after bed rest at the FARU included plasma volume (carbon monoxide rebreathing), cardiac mass and function (2D, 3D and Doppler echocardiography), and orthostatic tolerance testing (15- or 30-minutes of 80 degree head-up tilt). 
Results to-date indicate that when countermeasures are not employed, plasma volume decreases and the incidence of presyncope during head-up tilt is more frequent even after short-duration bed rest while reductions in cardiac function and mass are progressive as bed rest duration increases. Additionally, while plasma volume loss can be corrected and cardiac mass can be prevented with properly applied countermeasures, orthostatic tolerance is more difficult to protect when supine exercise is the only countermeasure. Similar results have been observed after space flight. Plasma volume, cardiac chamber volume, and orthostatic tolerance recover relatively quickly with resumption of ambulation and normal activity levels after bed rest but restoration of cardiac mass is prolonged.
Lopez-Rendon, Xochitl; Zhang, Guozhi; Coudyzer, Walter; Develter, Wim; Bosmans, Hilde; Zanca, Federica
2017-11-01
To compare the lung and breast dose associated with three chest CT protocols: standard, organ-based tube current modulation (OBTCM), and fast-speed scanning; and to estimate the error associated with organ dose when modelling the longitudinal (z-) TCM versus the 3D-TCM in Monte Carlo (MC) simulations for these three protocols. Five adult and three paediatric cadavers with different BMI were scanned. The CTDIvol of the OBTCM and fast-speed protocols was matched to the patient-specific CTDIvol of the standard protocol. Lung and breast doses were estimated using MC simulations with both the z- and 3D-TCM, and compared between protocols. The fast-speed scanning protocol delivered the highest doses. A slight reduction in breast dose (up to 5.1%) was observed for two of the three female cadavers with the OBTCM in comparison to the standard protocol. For both adult and paediatric cadavers, implementation of the z-TCM data only for organ dose estimation resulted in 10.0% accuracy for the standard and fast-speed protocols, while relative dose differences were up to 15.3% for the OBTCM protocol. At identical CTDIvol values, the standard protocol delivered the lowest overall doses. Only for the OBTCM protocol is the 3D-TCM needed if accurate (<10.0%) organ dosimetry is desired. • The z-TCM information is sufficient for accurate dosimetry for standard protocols. • The z-TCM information is sufficient for accurate dosimetry for fast-speed scanning protocols. • For organ-based TCM schemes, the 3D-TCM information is necessary for accurate dosimetry. • At identical CTDIvol, the fast-speed scanning protocol delivered the highest doses. • Lung dose was higher with XCare than with the standard protocol at identical CTDIvol.
Chung, Jinyong; Yoo, Kwangsun; Lee, Peter; Kim, Chan Mi; Roh, Jee Hoon; Park, Ji Eun; Kim, Sang Joon; Seo, Sang Won; Shin, Jeong-Hyeon; Seong, Joon-Kyung; Jeong, Yong
2017-10-01
The use of different 3D T1-weighted magnetic resonance (T1 MR) imaging protocols induces image incompatibility across multicenter studies, negating the many advantages of multicenter studies. A few methods have been developed to address this problem, but significant image incompatibility still remains. Thus, we developed a novel and convenient method to improve image compatibility. W-score standardization creates quality reference values by using a healthy group to obtain normalized disease values. We developed a protocol-specific w-score standardization to control the protocol effect, which is applied to each protocol separately. We used three data sets. In dataset 1, brain T1 MR images of normal controls (NC) and patients with Alzheimer's disease (AD) from two centers, acquired with different T1 MR protocols, were used (Protocol 1 and 2, n = 45/group). In dataset 2, data from six subjects, who underwent MRI with two different protocols (Protocol 1 and 2), were used with different repetition times, echo times, and slice thicknesses. In dataset 3, T1 MR images from a large number of healthy normal controls (Protocol 1: n = 148, Protocol 2: n = 343) were collected for w-score standardization. The protocol effect and disease effect on subjects' cortical thickness were analyzed before and after the application of protocol-specific w-score standardization. As expected, different protocols resulted in differing cortical thickness measurements in both NC and AD subjects. Different measurements were obtained for the same subject when imaged with different protocols. Multivariate pattern difference between measurements was observed between the protocols. Classification accuracy between two protocols was nearly 90%. After applying protocol-specific w-score standardization, the differences between the protocols substantially decreased. 
Most importantly, protocol-specific w-score standardization reduced both univariate and multivariate differences in the images while maintaining the AD disease effect. Compared to conventional regression methods, our method showed the best performance in terms of controlling the protocol effect while preserving disease information. Protocol-specific w-score standardization effectively resolved the concerns of conventional regression methods. It showed the best performance for improving the compatibility of a T1 MR post-processed feature, cortical thickness. Copyright © 2017 Elsevier Inc. All rights reserved.
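Protocol-specific w-score standardization, as described, normalizes each subject's measurement against a healthy-control reference built separately for each protocol. In its simplest form w = (x − mean_HC) / sd_HC per protocol; the study's model may additionally regress out covariates such as age, so treat this as a simplified sketch with invented thickness values:

```python
from statistics import mean, stdev

def build_reference(healthy_by_protocol):
    # One (mean, sd) reference per protocol, from healthy controls only.
    return {p: (mean(v), stdev(v)) for p, v in healthy_by_protocol.items()}

def w_score(value, protocol, reference):
    mu, sd = reference[protocol]
    return (value - mu) / sd

# Hypothetical cortical-thickness values (mm) from two acquisition protocols;
# protocol2 shows a systematic measurement offset, as in the study.
healthy = {
    "protocol1": [2.50, 2.60, 2.55, 2.45, 2.65],
    "protocol2": [2.70, 2.80, 2.75, 2.65, 2.85],
}
ref = build_reference(healthy)

# The same degree of thinning, measured under either protocol, maps to a
# comparable w-score once each protocol is standardized against its own
# healthy-control set.
print(round(w_score(2.35, "protocol1", ref), 2))  # -2.53
print(round(w_score(2.55, "protocol2", ref), 2))  # -2.53
```

Because the reference is built per protocol, the systematic offset between scanners cancels out while a genuine disease effect (thinning relative to healthy controls) is preserved, which is the core claim of the method.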
IPHE Regulations Codes and Standards Working Group - Type IV COPV Round Robin Testing
NASA Technical Reports Server (NTRS)
Maes, M.; Starritt, L.; Zheng, J. Y.; Ou, K.; Keller, J.
2017-01-01
This manuscript presents the results of a multi-lateral international activity intended to understand how to execute a cycle stress test as specified in a chosen standard (GTR, SAE, ISO, EIHP...). The purpose of this work was to establish a harmonized test method protocol to ensure that the same results would be achieved regardless of the testing facility. It was found that accurate temperature measurement of the working fluid is necessary to ensure the test conditions remain within the tolerances specified. Continuous operation is possible with adequate cooling of the working fluid but this becomes more demanding if the cycle frequency increases. Recommendations for future test system design and operation are presented.
Visibility-Based Hypothesis Testing Using Higher-Order Optical Interference
NASA Astrophysics Data System (ADS)
Jachura, Michał; Jarzyna, Marcin; Lipka, Michał; Wasilewski, Wojciech; Banaszek, Konrad
2018-03-01
Many quantum information protocols rely on optical interference to compare data sets with efficiency or security unattainable by classical means. Standard implementations exploit first-order coherence between signals whose preparation requires a shared phase reference. Here, we analyze and experimentally demonstrate the binary discrimination of visibility hypotheses based on higher-order interference for optical signals with a random relative phase. This provides a robust protocol implementation primitive when a phase lock is unavailable or impractical. With the primitive cost quantified by the total detected optical energy, optimal operation is typically reached in the few-photon regime.
National protocol framework for the inventory and monitoring of bees
Droege, Sam; Engler, Joseph D.; Sellers, Elizabeth A.; Lee O'Brien,
2016-01-01
This national protocol framework is a standardized tool for the inventory and monitoring of the approximately 4,200 species of native and non-native bee species that may be found within the National Wildlife Refuge System (NWRS) administered by the U.S. Fish and Wildlife Service (USFWS). However, this protocol framework may also be used by other organizations and individuals to monitor bees in any given habitat or location. Our goal is to provide USFWS stations within the NWRS (NWRS stations are land units managed by the USFWS such as national wildlife refuges, national fish hatcheries, wetland management districts, conservation areas, leased lands, etc.) with techniques for developing an initial baseline inventory of what bee species are present on their lands and to provide an inexpensive, simple technique for monitoring bees continuously and for monitoring and evaluating long-term population trends and management impacts. The latter long-term monitoring technique requires a minimal time burden for the individual station, yet can provide a good statistical sample of changing populations that can be investigated at the station, regional, and national levels within the USFWS’ jurisdiction, and compared to other sites within the United States and Canada. This protocol framework was developed in cooperation with the United States Geological Survey (USGS), the USFWS, and a worldwide network of bee researchers who have investigated the techniques and methods for capturing bees and tracking population changes. The protocol framework evolved from field and lab-based investigations at the USGS Bee Inventory and Monitoring Laboratory at the Patuxent Wildlife Research Center in Beltsville, Maryland starting in 2002 and was refined by a large number of USFWS, academic, and state groups. It includes a Protocol Introduction and a set of 8 Standard Operating Procedures or SOPs and adheres to national standards of protocol content and organization. 
The Protocol Narrative describes the history and need for the protocol framework and summarizes the basic elements of objectives, sampling design, field methods, training, data management, analysis, and reporting. The SOPs provide more detail and specific instructions for implementing the protocol framework. A central database, for managing all the resulting data is under development. We welcome use of this protocol framework by our partners, as appropriate for their bee inventory and monitoring objectives.
Secure and Lightweight Cloud-Assisted Video Reporting Protocol over 5G-Enabled Vehicular Networks.
Nkenyereye, Lewis; Kwon, Joonho; Choi, Yoon-Ho
2017-09-23
In the vehicular networks, the real-time video reporting service is used to send the recorded videos in the vehicle to the cloud. However, when facilitating the real-time video reporting service in the vehicular networks, the usage of the fourth generation (4G) long term evolution (LTE) was proved to suffer from latency while the IEEE 802.11p standard does not offer sufficient scalability for such a congested environment. To overcome those drawbacks, the fifth-generation (5G)-enabled vehicular network is considered as a promising technology for empowering the real-time video reporting service. In this paper, we note that security and privacy related issues should also be carefully addressed to boost the early adoption of 5G-enabled vehicular networks. There exist a few research works for secure video reporting service in 5G-enabled vehicular networks. However, their usage is limited because of public key certificates and expensive pairing operations. Thus, we propose a secure and lightweight protocol for cloud-assisted video reporting service in 5G-enabled vehicular networks. Compared to the conventional public key certificates, the proposed protocol achieves entities' authorization through anonymous credential. Also, by using lightweight security primitives instead of expensive bilinear pairing operations, the proposed protocol minimizes the computational overhead. From the evaluation results, we show that the proposed protocol requires less computation and communication time for the cryptographic primitives than the well-known Eiza-Ni-Shi protocol.
Network Design and Performance of the System Integration Test, Linked Simulators Phase.
1998-01-01
community has primarily used UNIX systems. UNIX is not a real-time operating system and thus very accurate time stamping, i.e., millisecond accuracy, is... time operating system works against us. The clock time on the UNIX workstations drifts from the UTC standard over time and this drift varies from... loggers at each site use the Network Time Protocol to synchronize to the master clock on a workstation in the TCAC. Again, the fact that UNIX is not a real
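The Network Time Protocol mentioned in the excerpt corrects exactly this kind of clock drift: each client-server exchange carries four timestamps, and the client's clock offset is estimated as θ = ((t1 − t0) + (t2 − t3)) / 2. A sketch of the standard calculation (the timestamps are invented for illustration):

```python
def ntp_offset_and_delay(t0, t1, t2, t3):
    """t0: client transmit, t1: server receive, t2: server transmit,
    t3: client receive. Returns the standard NTP clock-offset and
    round-trip-delay estimates."""
    offset = ((t1 - t0) + (t2 - t3)) / 2
    delay = (t3 - t0) - (t2 - t1)
    return offset, delay

# Invented scenario: client clock 250 ms behind the server, 20 ms one-way
# network delay each direction, 1 ms of server processing time.
t0, t1, t2, t3 = 100.000, 100.270, 100.271, 100.041
off, rtt = ntp_offset_and_delay(t0, t1, t2, t3)
print(round(off, 3), round(rtt, 3))  # 0.25 0.04
```

The offset estimate is exact only when the path delay is symmetric; asymmetric paths bias it, which is one reason the loggers synchronize against a nearby master clock rather than a distant server.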
Rasmussen, Stinne Eika; Nebsbjerg, Mette Amalie; Krogh, Lise Qvirin; Bjørnshave, Katrine; Krogh, Kristian; Povlsen, Jonas Agerlund; Riddervold, Ingunn Skogstad; Grøfte, Thorbjørn; Kirkegaard, Hans; Løfgren, Bo
2017-01-01
Emergency dispatchers use protocols to instruct bystanders in cardiopulmonary resuscitation (CPR). Studies changing one element in the dispatcher's protocol report improved CPR quality. Whether several changes interact is unknown, and the effect of combining multiple changes previously reported to improve CPR quality into one protocol remains to be investigated. We hypothesize that a novel dispatch protocol, combining multiple beneficial elements, improves CPR quality compared with a standard protocol. A novel dispatch protocol was designed including wording on chest compressions, use of a metronome, regular encouragement, and a 10-s rest each minute. In a simulated cardiac arrest scenario, laypersons were randomized to perform single-rescuer CPR guided by either the novel or the standard protocol. The primary outcome was a composite endpoint of time to first compression, hand position, compression depth and rate, and hands-off time (maximum score: 22 points). Afterwards, participants answered a questionnaire evaluating the dispatcher assistance. The novel protocol (n=61) improved the CPR quality score compared with the standard protocol (n=64) (mean (SD): 18.6 (1.4) points vs. 17.5 (1.7) points, p<0.001). The novel protocol resulted in deeper chest compressions (mean (SD): 58 (12) mm vs. 52 (13) mm, p=0.02) and an improved rate of correct hand position (61% vs. 36%, p=0.01) compared with the standard protocol. In both protocols hands-off time was short. The novel protocol improved motivation among rescuers compared with the standard protocol (p=0.002). Participants guided by a standard dispatch protocol performed high-quality CPR. A novel bundle-of-care protocol improved the CPR quality score and motivation among rescuers. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
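The novel protocol's metronome plus timed rest can be expressed as a simple schedule: at a compression rate of R per minute the beat interval is 60/R seconds, with a 10-s pause after each work minute. The 110/min rate below is an assumption for illustration; the abstract does not state the metronome rate:

```python
def metronome_schedule(rate_per_min=110, work_s=60, rest_s=10, cycles=2):
    """Yield (time_offset_s, event) pairs: metronome beats for each work
    minute, followed by a 'rest' marker for the timed pause."""
    interval = 60.0 / rate_per_min
    beats_per_cycle = round(work_s * rate_per_min / 60)
    t = 0.0
    for _ in range(cycles):
        for _ in range(beats_per_cycle):
            yield round(t, 3), "beat"
            t += interval
        yield round(t, 3), "rest"
        t += rest_s

events = list(metronome_schedule(cycles=1))
beats = [e for e in events if e[1] == "beat"]
print(len(beats), events[-1][1])  # 110 beats in the first minute, then a rest
```

Driving a phone speaker or dispatcher prompt from such a schedule is one plausible way to combine the metronome and the 10-s rest into a single guidance loop.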
Internet Technology for Future Space Missions
NASA Technical Reports Server (NTRS)
Hennessy, Joseph F. (Technical Monitor); Rash, James; Casasanta, Ralph; Hogie, Keith
2002-01-01
Ongoing work at National Aeronautics and Space Administration Goddard Space Flight Center (NASA/GSFC) seeks to apply standard Internet applications and protocols to meet the technology challenge of future satellite missions. Internet protocols and technologies are under study as a future means to provide seamless dynamic communication among heterogeneous instruments, spacecraft, ground stations, constellations of spacecraft, and science investigators. The primary objective is to design and demonstrate in the laboratory the automated end-to-end transport of files in a simulated dynamic space environment using off-the-shelf, low-cost, commodity-level standard applications and protocols. The demonstrated functions and capabilities will become increasingly significant in the years to come as both Earth and space science missions fly more sensors and the present labor-intensive, mission-specific techniques for processing and routing data become prohibitively expensive. This paper describes how an IP-based communication architecture can support all existing operations concepts and how it will enable some new and complex communication and science concepts. The authors identify specific end-to-end data flows from the instruments to the control centers and scientists, and then describe how each data flow can be supported using standard Internet protocols and applications. The scenarios include normal data downlink and command uplink as well as recovery scenarios for both onboard and ground failures. The scenarios are based on an Earth-orbiting spacecraft with downlink data rates from 300 Kbps to 4 Mbps. Included examples are based on designs currently being investigated for potential use by the Global Precipitation Measurement (GPM) mission.
RICIS Symposium 1992: Mission and Safety Critical Systems Research and Applications
NASA Technical Reports Server (NTRS)
1992-01-01
This conference deals with computer systems that control systems whose failure to operate correctly could produce loss of life and/or property, that is, mission- and safety-critical systems. Topics covered are: the work of standards groups, computer systems design and architecture, software reliability, process control systems, knowledge-based expert systems, and computer and telecommunication protocols.
The purpose of this SOP is to describe the procedures for collecting surface wipe samples inside a home for analysis of either metals or pesticides. This procedure covers the preparation of the surface wipe material and field activities. This protocol was followed to ensure con...
This protocol describes the procedures for weighing, handling, and archiving aerosol filters and for managing the associated analytical and quality assurance data. Filter samples were weighed for aerosol mass at RTI laboratory, with only the automated field sampling data transfer...
EuroFlow standardization of flow cytometer instrument settings and immunophenotyping protocols
Kalina, T; Flores-Montero, J; van der Velden, V H J; Martin-Ayuso, M; Böttcher, S; Ritgen, M; Almeida, J; Lhermitte, L; Asnafi, V; Mendonça, A; de Tute, R; Cullen, M; Sedek, L; Vidriales, M B; Pérez, J J; te Marvelde, J G; Mejstrikova, E; Hrusak, O; Szczepański, T; van Dongen, J J M; Orfao, A
2012-01-01
The EU-supported EuroFlow Consortium aimed at innovation and standardization of immunophenotyping for diagnosis and classification of hematological malignancies by introducing 8-color flow cytometry with fully standardized laboratory procedures and antibody panels in order to achieve maximally comparable results among different laboratories. This required the selection of optimal combinations of compatible fluorochromes and the design and evaluation of adequate standard operating procedures (SOPs) for instrument setup, fluorescence compensation and sample preparation. Additionally, we developed software tools for the evaluation of individual antibody reagents and antibody panels. Each section describes what has been evaluated experimentally versus adopted based on existing data and experience. Multicentric evaluation demonstrated high levels of reproducibility based on strict implementation of the EuroFlow SOPs and antibody panels. Overall, the 6 years of extensive collaborative experiments and the analysis of hundreds of cell samples of patients and healthy controls in the EuroFlow centers have provided for the first time laboratory protocols and software tools for fully standardized 8-color flow cytometric immunophenotyping of normal and malignant leukocytes in bone marrow and blood; this has yielded highly comparable data sets, which can be integrated in a single database. PMID:22948490
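Fluorescence compensation of the kind standardized here is, at its core, the inversion of a spillover matrix. A minimal two-color sketch, with illustrative coefficients rather than EuroFlow-calibrated values:

```python
# Sketch of fluorescence spillover compensation for two detectors. The spillover
# coefficients are illustrative, not EuroFlow's calibrated instrument values.

def compensate_2color(observed, spill):
    """Invert a 2x2 spillover matrix: observed = spill @ true  ->  true."""
    (a, b), (c, d) = spill
    det = a * d - b * c
    inv = ((d / det, -b / det),
           (-c / det, a / det))
    x, y = observed
    return (inv[0][0] * x + inv[0][1] * y,
            inv[1][0] * x + inv[1][1] * y)

# Assume 15% of dye-1 signal spills into detector 2, and 5% of dye-2 into
# detector 1 (rows: detectors, columns: dyes).
spill = ((1.00, 0.05),
         (0.15, 1.00))
true = compensate_2color((1005.0, 250.0), spill)  # recovers (1000.0, 100.0)
```

Standardizing instrument settings and SOPs is what makes a single spillover matrix valid across laboratories; without that, each site would need its own compensation.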
IEEE 1588 Time Synchronization Board in MTCA.4 Form Factor
NASA Astrophysics Data System (ADS)
Jabłoński, G.; Makowski, D.; Mielczarek, A.; Orlikowski, M.; Perek, P.; Napieralski, A.; Makijarvi, P.; Simrock, S.
2015-06-01
Distributed data acquisition and control systems in large-scale scientific experiments, like e.g. ITER, require time synchronization with nanosecond precision. A protocol commonly used for that purpose is the Precision Time Protocol (PTP), also known as the IEEE 1588 standard. It uses standard Ethernet signalling and protocols and allows timing accuracy on the order of tens of nanoseconds to be obtained. The MTCA.4 is gradually becoming the platform of choice for building such systems. Currently, there is no commercially available implementation of a PTP receiver on that platform. In this paper, we present a module in the MTCA.4 form factor supporting this standard. The module may be used as a timing receiver providing reference clocks in an MTCA.4 chassis, generating a Pulse Per Second (PPS) signal and allowing generation of triggers and timestamping of events on 8 configurable backplane lines and two front panel connectors. The module is based on the Xilinx Spartan 6 FPGA and a thermally stabilized voltage-controlled oscillator steered by a digital-to-analog converter. The board supports standalone operation, without support from the host operating system, as the entire control algorithm runs on a MicroBlaze CPU implemented in the FPGA. The software support for the card includes a low-level API in the form of a Linux driver and user-mode library, and a high-level API: ITER Nominal Device Support and an EPICS IOC. The device has been tested in the ITER time communication network (TCN) with three cascaded PTP-enabled Hirschmann switches and a GPS reference clock source. An RMS synchronization accuracy, measured by direct comparison of the PPS signals, better than 20 ns has been obtained.
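The offset estimate that an IEEE 1588 slave derives from the standard's Sync/Delay_Req timestamp exchange can be sketched as follows; the formula is the standard one, while the timestamp values are illustrative:

```python
# The IEEE 1588 delay-request/response exchange yields four timestamps, from
# which a PTP slave estimates its clock offset and the one-way path delay.

def ptp_offset_delay(t1, t2, t3, t4):
    """t1: master sends Sync; t2: slave receives it;
    t3: slave sends Delay_Req; t4: master receives it (all in ns)."""
    offset = ((t2 - t1) - (t4 - t3)) / 2   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2    # assumes a symmetric path
    return offset, delay

# Slave clock running 40 ns ahead of the master, 100 ns symmetric path delay:
offset, delay = ptp_offset_delay(t1=0, t2=140, t3=1000, t4=1060)
```

The symmetric-path assumption in the delay term is why cascaded PTP-aware (boundary/transparent) switches, as in the test setup above, matter: they keep the forward and reverse delays matched.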
Adopting Internet Standards for Orbital Use
NASA Technical Reports Server (NTRS)
Wood, Lloyd; Ivancic, William; da Silva Curiel, Alex; Jackson, Chris; Stewart, Dave; Shell, Dave; Hodgson, Dave
2005-01-01
After a year of testing and demonstrating a Cisco mobile access router intended for terrestrial use onboard the low-Earth-orbiting UK-DMC satellite as part of a larger merged ground/space IP-based internetwork, we reflect on and discuss the benefits and drawbacks of integration and standards reuse for small satellite missions. Benefits include ease of operation and the ability to leverage existing systems and infrastructure designed for general use, as well as reuse of existing, known, and well-understood security and operational models. Drawbacks include cases where integration work was needed to bridge the gaps in assumptions between different systems, and where performance considerations outweighed the benefits of reuse of pre-existing file transfer protocols. We find similarities with the terrestrial IP networks whose technologies we have adopted and also some significant differences in operational models and assumptions that must be considered.
NASA's Earth Science Data Systems Standards Process Experiences
NASA Technical Reports Server (NTRS)
Ullman, Richard E.; Enloe, Yonsook
2007-01-01
NASA has impaneled several internal working groups to provide recommendations to NASA management on ways to evolve and improve Earth Science Data Systems. One of these working groups is the Standards Process Group (SPG). The SPG is drawn from NASA-funded Earth Science Data Systems stakeholders, and it directs a process of community review and evaluation of proposed NASA standards. The working group's goal is to promote interoperability and interuse of NASA Earth Science data through broader use of standards that have proven implementation and operational benefit to NASA Earth science, by facilitating NASA management endorsement of proposed standards. The SPG now has two years of experience with this approach to the identification of standards. We will discuss real examples of the different types of candidate standards that have been proposed to NASA's Standards Process Group, such as OPeNDAP's Data Access Protocol, the Hierarchical Data Format, and the Open Geospatial Consortium's Web Map Server. Each of the three types of proposals requires a different sort of criteria for understanding the broad concepts of "proven implementation" and "operational benefit" in the context of NASA Earth Science data systems. We will discuss how our Standards Process has evolved with our experiences with the three candidate standards.
Burggraaff, Marloes C; van Nispen, Ruth M A; Melis-Dankers, Bart J M; van Rens, Ger H M B
2010-03-10
Reading problems are frequently reported by visually impaired persons. A closed-circuit television (CCTV) can be helpful to maintain reading ability; however, it is difficult to learn how to use this device. In the Netherlands, an evidence-based rehabilitation program in the use of CCTVs was lacking. Therefore, a standard training protocol needed to be developed and tested in a randomized controlled trial (RCT) to provide an evidence-based training program in the use of this device. To develop a standard training program, information was collected by studying the literature, observing training in the use of CCTVs, discussing the content of the training program with professionals, and organizing focus and discussion groups. The effectiveness of the program was evaluated in an RCT, to obtain an evidence-based training program. Dutch patients (n = 122) were randomized into a treatment group, which received normal instructions from the supplier combined with training in the use of CCTVs, or into a control group, which received instructions from the supplier only. The effect of the training program was evaluated in terms of change in reading ability (reading speed and reading comprehension), patients' skills to operate the CCTV, perceived (vision-related) quality of life, and tasks performed in daily living. The development of the CCTV training protocol and the design of the RCT in the present study may serve as an example to obtain an evidence-based training program. The training program was adjusted to the needs and learning abilities of individual patients; however, for scientific reasons it might have been preferable to standardize the protocol further, in order to gain more comparable results. http://www.trialregister.nl, identifier: NTR1031.
Saito, Taiichi; Sugiyama, Kazuhiko; Ikawa, Fusao; Yamasaki, Fumiyuki; Ishifuro, Minoru; Takayasu, Takeshi; Nosaka, Ryo; Nishibuchi, Ikuno; Muragaki, Yoshihiro; Kawamata, Takakazu; Kurisu, Kaoru
2017-01-01
The current standard treatment protocol for patients with newly diagnosed glioblastoma (GBM) includes surgery, radiotherapy, and concomitant and adjuvant temozolomide (TMZ). We hypothesized that the permeability surface area product (PS) from a perfusion computed tomography (PCT) study is associated with sensitivity to TMZ. The aim of this study was to determine whether PS values were correlated with prognosis in GBM patients who received the standard treatment protocol. This study included 36 patients with GBM newly diagnosed between October 2005 and September 2014 who underwent a preoperative PCT study and the standard treatment protocol. We measured the maximum value of relative cerebral blood volume (rCBVmax) and the maximum PS value (PSmax). We statistically examined the relationship between PSmax and prognosis using survival analysis, including other clinicopathologic factors (age, Karnofsky performance status [KPS], extent of resection, O6-methylguanine-DNA methyltransferase [MGMT] status, second-line use of bevacizumab, and rCBVmax). Log-rank tests revealed that age, KPS, MGMT status, and PSmax were significantly correlated with overall survival. Multivariate analysis using the Cox regression model showed that PSmax was the most significant prognostic factor. Receiver operating characteristic curve analysis showed that PSmax had the highest accuracy in differentiating long-time survivors (LTSs; surviving more than 2 years) from non-LTSs. At a cutoff point of 8.26 mL/100 g/min, sensitivity and specificity were 90% and 70%, respectively. PSmax from a PCT study can help predict survival time in patients with GBM receiving the standard treatment protocol. Survival may be related to sensitivity to TMZ. Copyright © 2016 Elsevier Inc. All rights reserved.
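The cutoff evaluation reported above can be sketched as a simple sensitivity/specificity computation. The patient values below are invented for illustration, and the direction (higher PSmax flagging long-time survivors) is an assumption of the example, not a claim from the study:

```python
# Illustrative sketch of evaluating a PSmax cutoff for separating long-time
# survivors (LTS) from non-LTS. Values are invented, not the study's data.

def sens_spec(values_pos, values_neg, cutoff):
    """Sensitivity/specificity when 'value >= cutoff' predicts the positive class."""
    tp = sum(v >= cutoff for v in values_pos)   # true positives
    tn = sum(v < cutoff for v in values_neg)    # true negatives
    return tp / len(values_pos), tn / len(values_neg)

lts_psmax = [9.1, 10.4, 8.5, 7.0, 12.3]       # hypothetical LTS patients
non_lts_psmax = [5.2, 6.8, 9.0, 7.7, 8.1]     # hypothetical non-LTS patients
sensitivity, specificity = sens_spec(lts_psmax, non_lts_psmax, cutoff=8.26)
```

Sweeping the cutoff and plotting these two quantities against each other is exactly what the ROC analysis in the study does.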
“Early Trigger” Intravenous Vitamin K
Diament, Marina; MacLeod, Kirsty; O’Hare, Jonathan; Tate, Anne
2015-01-01
Best practice tariff (BPT) was introduced as a financial incentive model to improve compliance with evidence-based care, such as operation for hip fracture within 36 hours of admission. We previously evaluated the impact of warfarin on patients with hip fracture, revealing significant delay to operation and subsequent loss of revenue. As a result of this, an “early trigger” intravenous vitamin K (IVK) pathway was introduced and the service reaudited a year later. The first cycle was a retrospective audit of all cases with hip fracture against BPT standards over a 32-month period. Subsequent protocol change resulted in all warfarinised cases being given 2 mg IVK in the emergency department prior to blood testing. This protocol was reaudited against the same BPT standards 12 months later. An intention-to-treat approach was used, despite breaches of protocol and other reasons for patients not progressing to theater. The data were analyzed with parametric tools to establish true clinical and statistical impact of the introduction of the protocol. In the first cycle, 80 patients were admitted on warfarin with a mean time to theater of 53.71 hours. Of these patients, 79% breached BPT due to anticoagulation. Twelve months following protocol introduction, 42 patients had a mean time to theater of 37.61 hours. Of these patients, 34% breached BPT due to anticoagulation. These data are both clinically and statistically significant (P < .001). No adverse events occurred. We have shown for the first time that “early-trigger” IVK can reduce delay to theater and maximize tariff payments in warfarinised patients with hip fracture. This is in addition to other established benefits associated with early surgery such as decreasing risk of pressure lesions and pneumonia. It affords high-quality patient-centered care while ensuring trauma units achieve maximal financial reimbursement through pay for improved performance and supports a culture of change behavior. PMID:26623160
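The before/after breach rates can be compared with a two-proportion z-test, sketched below; the normal approximation is an assumption standing in for whatever analysis the audit actually used, and the breach counts are derived from the reported percentages:

```python
# Sketch of the before/after comparison as a two-proportion z-test on BPT
# breach rates; the normal approximation is an assumption of this example.
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic and two-sided p-value for H0: p1 == p2 (pooled estimate)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_two_sided = math.erfc(abs(z) / math.sqrt(2))  # 2 * (1 - Phi(|z|))
    return z, p_two_sided

# Roughly 79% of 80 warfarinised patients breached BPT before the pathway,
# vs. 34% of 42 after (63 and 14 breaches respectively).
z, p = two_proportion_z(63, 80, 14, 42)
```

With counts this size the approximation comfortably reproduces the reported P < .001.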
Ferrández-Pastor, Francisco Javier; García-Chamizo, Juan Manuel; Nieto-Hidalgo, Mario; Mora-Pascual, Jerónimo; Mora-Martínez, José
2016-07-22
The application of Information Technologies to Precision Agriculture methods has clear benefits. Precision Agriculture optimises production efficiency, increases quality, minimises environmental impact and reduces the use of resources (energy, water); however, different barriers have delayed its wide development. Among the main barriers are expensive equipment, the difficulty of operation and maintenance, and standards for sensor networks that are still under development. Nowadays, new technological developments in embedded devices (hardware and communication protocols), the evolution of Internet technologies (Internet of Things) and ubiquitous computing (Ubiquitous Sensor Networks) allow the development of less expensive systems that are easier to control, install and maintain, using standard protocols with low-power consumption. This work develops and tests a low-cost sensor/actuator network platform, based on the Internet of Things, integrating machine-to-machine and human-machine-interface protocols. Edge computing uses this multi-protocol approach to develop control processes for Precision Agriculture scenarios. A greenhouse with hydroponic crop production was developed and tested using Ubiquitous Sensor Network monitoring and edge control on the Internet of Things paradigm. The experimental results showed that Internet technologies and Smart Object Communication Patterns can be combined to encourage the development of Precision Agriculture. They demonstrated added benefits (cost, energy, smart development, acceptance by agricultural specialists) when a project is launched.
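An edge-control rule of the kind described can be sketched in a few lines; the JSON message format, field names, and threshold are assumptions invented for the example, not the platform's actual protocol:

```python
# Hedged sketch of an edge-control rule for a hydroponic greenhouse: a sensor
# reading arrives as a small JSON message (format assumed here) and a local
# rule decides whether to switch the irrigation pump, with no cloud round-trip.
import json

SOIL_MOISTURE_MIN = 30.0  # percent; illustrative threshold

def edge_decide(message: str) -> dict:
    """Turn one sensor message into one actuator command, locally."""
    reading = json.loads(message)
    pump_on = reading["soil_moisture"] < SOIL_MOISTURE_MIN
    return {"node": reading["node"], "actuator": "pump", "on": pump_on}

cmd = edge_decide('{"node": "bed-3", "soil_moisture": 22.5}')
```

Keeping the decision at the edge is what makes the platform tolerant of cheap hardware and intermittent uplinks: the greenhouse keeps running even when the Internet connection does not.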
Pannu, Gurpal S; Shah, Mitesh P; Herman, Marty J
Cervical spine clearance in the pediatric trauma patient represents a particularly challenging task. Unfortunately, standardized clearance protocols for pediatric cervical clearance are poorly reported in the literature and imaging recommendations demonstrate considerable variability. With the use of a web-based survey, this study aims to define the methods utilized by pediatric trauma centers throughout North America. Specific attention was given to the identification of personnel responsible for cervical spine care, diagnostic imaging modalities used, and the presence or absence of a written pediatric cervical spine clearance protocol. A 10-question electronic survey was given to members of the newly formed Pediatric Cervical Spine Study Group, all of whom are active POSNA members. The survey was submitted via the online service SurveyMonkey (https://www.surveymonkey.com/r/7NVVQZR). The survey assessed the respondent's institution demographics, such as trauma level and services primarily responsible for consultation and operative management of cervical spine injuries. In addition, respondents were asked to identify the protocols and primary imaging modality used for cervical spine clearance. Finally, respondents were asked if their institution had a documented cervical spine clearance protocol. Of the 25 separate institutions evaluated, 21 were designated as level 1 trauma centers. Considerable variation was reported with regards to the primary service responsible for cervical spine clearance. General Surgery/Trauma (44%) is most commonly the primary service, followed by a rotating schedule (33%), Neurosugery (11%), and Orthopaedic Surgery (8%). Spine consults tend to be seen most commonly by a rotating schedule of Orthopaedic Surgery and Neurosurgery. The majority of responding institutions utilize computed tomographic imaging (46%) as the primary imaging modality, whereas 42% of hospitals used x-ray primarily. 
The remaining institutions reported using a combination of x-ray and computed tomographic imaging. Only 46% of institutions utilize a written, standardized pediatric cervical spine clearance protocol. This study demonstrates striking variability in the personnel, imaging modalities and, most importantly, standardized protocols used in the evaluation of the pediatric trauma patient with a potential cervical spine injury. Cervical spine clearance protocols have been shown to decrease the incidence of missed injuries, minimize excessive radiation exposure, decrease the time to collar removal, and lower overall associated costs. It is our opinion that development of a task force or multicenter research protocol that incorporates existing evidence-based literature is the next best step in improving the care of children with cervical spine injuries. Level 4: economic and decision analyses.
Sachs, Peter B; Hunt, Kelly; Mansoubi, Fabien; Borgstede, James
2017-02-01
Building and maintaining a comprehensive yet simple set of standardized protocols for cross-sectional imaging can be a daunting task. A single department may have difficulty preventing "protocol creep," which almost inevitably occurs when an organized "playbook" of protocols does not exist and individual radiologists and technologists alter protocols at will and on a case-by-case basis. When multiple departments or groups function in a large health system, the lack of uniformity of protocols can increase exponentially. In 2012, the University of Colorado Hospital formed a large health system (UCHealth) and became a 5-hospital provider network. CT and MR imaging studies are conducted at multiple locations by different radiology groups. To facilitate consistency in ordering, acquisition, and appearance of a given study, regardless of location, we minimized the number of protocols across all scanners and sites of practice with a clinical indication-driven protocol selection and standardization process. Here we review the steps utilized to perform this process improvement task and ensure its stability over time. Actions included creation of a standardized protocol template, which allowed for changes in electronic storage and management of protocols, design of a change request form, and formation of a governance structure. We utilized rapid improvement events (1 day for CT, 2 days for MR) and reduced 248 CT protocols to 97 standardized protocols and 168 MR protocols to 66. Additional steps are underway to further standardize the output and reporting of imaging interpretation. This will result in an improved, consistent radiologist, patient, and provider experience across the system.
A formal protocol test procedure for the Survivable Adaptable Fiber Optic Embedded Network (SAFENET)
NASA Astrophysics Data System (ADS)
High, Wayne
1993-03-01
This thesis focuses upon a new method for verifying the correct operation of a complex, high-speed fiber optic communication network. These networks are of growing importance to the military because of their increased connectivity, survivability, and reconfigurability. With the introduction of and increased dependence on sophisticated software and protocols, it is essential that their operation be correct. Because of the speed and complexity of fiber optic networks being designed today, they are becoming increasingly difficult to test. Previously, testing was accomplished by application of conformance test methods which had little connection with an implementation's specification. The major goal of conformance testing is to ensure that the implementation of a profile is consistent with its specification. Formal specification is needed to ensure that the implementation performs its intended operations while exhibiting desirable behaviors. The new conformance test method presented is based upon the System of Communicating Machines model, which uses a formal protocol specification to generate a test sequence. The major contribution of this thesis is the application of the System of Communicating Machines model to formal profile specifications of the Survivable Adaptable Fiber Optic Embedded Network (SAFENET) standard, which results in the derivation of test sequences for a SAFENET profile. The results of applying this new method to SAFENET's OSI and Lightweight profiles are presented.
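Deriving conformance tests from a formal state-machine specification can be sketched as follows; the toy connection protocol and the reset-based transition-coverage strategy are illustrative simplifications, not the System of Communicating Machines model or the SAFENET profiles themselves:

```python
# Sketch of transition-coverage test generation from a finite-state protocol
# specification. The toy machine and reset-between-tests strategy are
# illustrative assumptions, not the thesis's actual derivation method.
from collections import deque

def transition_tests(fsm, start):
    """One test per specified transition: shortest setup path from the start
    state (assuming a reset between tests), then the transition under test."""
    def shortest_path(target):
        queue, seen = deque([(start, [])]), {start}
        while queue:
            state, path = queue.popleft()
            if state == target:
                return path
            for event, nxt in sorted(fsm.get(state, {}).items()):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, path + [event]))
        return None  # target state unreachable from start

    tests = []
    for state, transitions in sorted(fsm.items()):
        for event in sorted(transitions):
            tests.append(shortest_path(state) + [event])
    return tests

# Toy connection protocol: state -> {event: next_state}.
fsm = {
    "closed": {"connect": "open"},
    "open": {"send": "open", "disconnect": "closed"},
}
tests = transition_tests(fsm, "closed")
```

Each generated test is an executable event sequence from the initial state, so every specified transition is exercised at least once against the implementation.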
Multi-User Space Link Extension (SLE) System
NASA Technical Reports Server (NTRS)
Perkins, Toby
2013-01-01
The Multi-User Space Link Extension (MUS) system, a software and data system, provides Space Link Extension (SLE) users with three space data transfer services in timely, complete, and offline modes as applicable, according to standards defined by the Consultative Committee for Space Data Systems (CCSDS). MUS radically reduces the schedule, cost, and risk of implementing a new SLE user system, minimizes operating costs with a lights-out approach to SLE, and is designed to require no sustaining engineering expense during its lifetime unless changes in the CCSDS SLE standards, combined with new provider implementations, force changes. No software modification to MUS needs to be made to support a new mission. Any systems engineer with Linux experience can begin testing SLE user service instances with MUS starting from a personal computer (PC) within five days. For flight operators, MUS provides a familiar-looking Web page for entering SLE configuration data received from the SLE provider. Operators can also use the Web page to back up a space mission's entire set of up to approximately 500 SLE service instances in less than five seconds, or to restore or transfer from another system the same amount of data from a MUS backup file in about the same amount of time. Missions operate each MUS SLE service instance independently by sending it MUS directives, which are legible, plain ASCII strings. MUS directives are usually (but not necessarily) sent through a TCP/IP (Transmission Control Protocol/Internet Protocol) socket from a MOC (Mission Operations Center) or POCC (Payload Operations Control Center) system, under scripted control, during "lights-out" spacecraft operation. MUS permits the flight operations team to configure independently each of its data interfaces: not only commands and telemetry, but also MUS status messages to the MOC.
Interfaces can use single- or multiple-client TCP/IP server sockets, TCP/IP client sockets, temporary disk files, the system log, or standard input, standard output, or standard error as applicable. By defining MUS templates in ASCII, the flight operations team can include any MUS system variable in telemetry or command headers or footers, and/or in status messages. Data fields can be arranged within messages in different sequences, according to the mission's needs. The only constraints imposed are on the format of MUS directive strings, and some bare minimum logical requirements that must be met in order for MUS to read the mission control center's spacecraft command inputs. The MUS system imposes no limits or constraints on the numbers and combinations of missions and SLE service instances that it will support simultaneously. At any time, flight operators may add, change, delete, bind, connect, or disconnect service instances.
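A directive exchange of this style can be sketched as follows; the "NAME key=value" grammar, the directive name, and the parameter names are hypothetical, since the abstract says only that MUS directives are legible ASCII strings:

```python
# Hypothetical sketch of parsing an ASCII directive line such as a MOC script
# might send over a TCP socket. The "NAME key=value ..." grammar is invented
# for illustration; the real MUS directive format is not given in the abstract.

def parse_directive(line: str) -> tuple:
    """Split one ASCII directive into its name and key=value parameters."""
    name, *pairs = line.strip().split()
    params = dict(pair.split("=", 1) for pair in pairs)
    return name, params

# A hypothetical directive naming a service instance and a transfer mode:
name, params = parse_directive("START-SERVICE instance=fwd-cltu-01 mode=timely\n")
```

Plain, human-readable directives like this are what allow the same interface to be driven both by scripts during lights-out operation and by an operator typing at a terminal.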
Standard formatted data units-control authority procedures
NASA Technical Reports Server (NTRS)
1991-01-01
The purpose of this document is to establish a set of minimum and optional requirements for the implementation of Control Authority (CA) organizations within and among the Agencies participating in the Consultative Committee for Space Data Systems (CCSDS). By satisfying these requirements, the resultant cooperating set of CA organizations will produce a global CA service supporting information transfer with digital data under the Standard Formatted Data Unit (SFDU) concept. This service is primarily accomplished through the registration, permanent archiving, and dissemination of metadata in the form of Metadata Objects (MDO) that assist in the interpretation of data objects received in SFDU form. This Recommendation addresses the responsibilities, services, and interface protocols for a hierarchy of CA organizations. The top level, consisting of the CCSDS Secretariat and its operational agent, is unique and primarily provides a global coordination function. The lower levels are Agency CA organizations that have primary responsibility for the registration, archiving, and dissemination of MDOs. As experience is gained and technology evolves, the CA Procedures will be extended to include enhanced services and their supporting protocols. In particular, it is anticipated that eventually CA organizations will be linked via networks on a global basis, and will provide requestors with online automated access to CA services. While this Recommendation does not preclude such operations, it also does not recommend the specific protocols to be used to ensure global compatibility of these services. These recommendations will be generated as experience is gained.
cadcVOFS: A FUSE Based File System Layer for VOSpace
NASA Astrophysics Data System (ADS)
Kavelaars, J.; Dowler, P.; Jenkins, D.; Hill, N.; Damian, A.
2012-09-01
The CADC is now making extensive use of the VOSpace protocol for user-managed storage. The VOSpace standard allows a diverse set of rich data services to be delivered to users via a simple protocol. We have recently developed cadcVOFS, a FUSE-based file-system layer for VOSpace. cadcVOFS provides a filesystem layer on top of VOSpace so that standard Unix tools (such as ‘find’, ‘emacs’, ‘awk’, etc.) can be used directly on the data objects stored in VOSpace. Once mounted, the VOSpace appears as a network storage volume inside the operating system. Within the CADC Cloud Computing project (CANFAR) we have used VOSpace as the method for retrieving and storing processing inputs and products. The abstraction of storage is an important component of Cloud Computing, and the high use level of our VOSpace service reflects this.
Osborne, Nikola K P; Taylor, Michael C
2018-05-01
This article describes a New Zealand forensic agency's contextual information management protocol for bloodstain pattern evidence examined in the laboratory. In an effort to create a protocol that would have minimal impact on current workflow, while still effectively removing task-irrelevant contextual information, the protocol was designed following an in-depth consultation with management and forensic staff. The resulting design was for a protocol of independent checking (i.e. blind peer review) where the checker's interpretation of the evidence is conducted in the absence of case information and the original examiner's notes or interpretation(s). At the conclusion of a ten-case trial period, there was widespread agreement that the protocol had minimal impact on the number of people required, the cost, or the time to complete an item examination. The agency is now looking to adopt the protocol into standard operating procedures, and in some cases the protocol has been extended to cover other laboratory-based examinations (e.g. fabric damage, shoeprint examination, and physical fits). The protocol developed during this trial provides a useful example for agencies seeking to adopt contextual information management into their workflow. Copyright © 2018 The Chartered Society of Forensic Sciences. Published by Elsevier B.V. All rights reserved.
This protocol describes the procedures for the collection, storage, and shipping of human scalp hair samples for trace metals and arsenic or potential adduct analysis. Scalp hair samples were collected from each participant who agreed to provide the sample. Thinning shears were ...
The purpose of this SOP is to describe the procedures for collecting surface wipe samples inside a home for analysis of either metals or pesticides. This procedure covers the preparation of the surface wipe material and field activities. This protocol was followed to ensure con...
Vollmer, R; Villagaray, R; Egusquiza, V; Espirilla, J; García, M; Torres, A; Rojas, E; Panta, A; Barkley, N A; Ellis, D
Cryobanks are a secure, efficient, and low-cost method for the long-term conservation of plant genetic resources, theoretically for centuries or millennia with minimal maintenance. The present manuscript describes CIP's modified protocol for potato cryopreservation, its large-scale application, and the establishment of quality and operational standards, which included a viability reassessment of material entering the cryobank. In 2013, CIP established stricter quality and operational standards under which 1,028 potato accessions were cryopreserved with an improved PVS2-droplet protocol. In 2014, the viability of 114 accessions cryopreserved in 2013 was reassessed. The average recovery rate (full plant recovery after LN exposure) of the 1,028 cryopreserved accessions ranged from 34 to 59% across Solanum species, and 70% of the processed accessions showed a recovery rate of ≥20% and were considered successfully cryopreserved. CIP has established a new high-quality management system for cryobanking. Periodic viability reassessment, strict and clear recovery criteria, and monitoring of both the percentage of accessions meeting the criteria and contamination rates are metrics that need to be considered in cryobanks.
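The success criterion described above (an accession counts as successfully cryopreserved when its full-plant recovery rate after LN exposure is at least 20%) can be sketched as a simple screening computation. The 20% threshold is from the text; the accession data below are invented for illustration:

```python
# Sketch of the cryobank success metric described above: an accession
# counts as successfully cryopreserved when its full-plant recovery
# rate after liquid-nitrogen exposure meets a minimum threshold.
# The 20% threshold comes from the text; the sample rates are invented.

RECOVERY_THRESHOLD = 0.20  # minimum recovery rate per the quality standard

def successful_fraction(recovery_rates, threshold=RECOVERY_THRESHOLD):
    """Return the fraction of accessions meeting the recovery threshold."""
    if not recovery_rates:
        return 0.0
    passed = sum(1 for r in recovery_rates if r >= threshold)
    return passed / len(recovery_rates)

# Hypothetical recovery rates for five accessions
rates = [0.55, 0.34, 0.12, 0.41, 0.19]
print(successful_fraction(rates))  # -> 0.6 (3 of 5 accessions pass)
```

Monitoring this fraction over time, alongside contamination rates, is exactly the kind of periodic metric the abstract recommends for cryobanks.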
A Protocol for Diagnosis and Management of Aortic Atherosclerosis in Cardiac Surgery Patients
Brandon Bravo Bruinsma, George J.; Van 't Hof, Arnoud W. J.; Grandjean, Jan G.; Nierich, Arno P.
2017-01-01
In patients undergoing cardiac surgery, use of perioperative screening for aortic atherosclerosis with modified TEE (A-View method) was associated with lower postoperative mortality, but not stroke, as compared to patients operated on without such screening. At the time of clinical implementation and validation, we had not yet standardized the indications for modified TEE or the changes in patient management in the presence of aortic atherosclerosis. Therefore, we designed a protocol that combines the diagnosis of atherosclerosis of the thoracic aorta with subsequent considerations for intraoperative management, providing a systematic approach to reduce the risk of cerebral complications. PMID:28852575
Carter, Yvette; Chen, Herbert; Sippel, Rebecca S.
2013-01-01
Background Symptomatic hypocalcemia after thyroidectomy is a barrier to same-day surgery and a cause of ER visits. A standard protocol of calcium and vitamin D supplementation, dependent on intact parathyroid hormone (iPTH) levels, can address this issue. How effective is it? When does it fail? Methods We performed a retrospective review of the prospective Thyroid Database from January 2006 to December 2010. 620 patients underwent completion (CT) or total thyroidectomy (TT) and followed our post-operative protocol: calcium carbonate for iPTH levels ≥10 pg/ml, and calcium carbonate plus 0.25 μg calcitriol BID for iPTH <10 pg/ml. Calcium and iPTH values, pathology, and medication were compared to evaluate protocol efficacy. A p value <0.05 was considered statistically significant. Results Using the protocol, sixty-one (10.2%) patients were chemically hypocalcemic but never developed symptoms, and twenty-four (3.9%) patients developed breakthrough symptomatic hypocalcemia. The symptomatic (SX) and asymptomatic (ASX) groups were similar with regard to gender, cancer diagnosis, and pre-operative calcium and iPTH. The symptomatic group was significantly younger (39.6 ± 2.8 vs. 49 ± 0.6 years, p=0.01), with lower post-operative iPTH levels: 33% (n=8) of SX patients had an iPTH ≤5 pg/ml vs. only 6% (n=37) of ASX patients. While the majority of patients with an iPTH ≤5 pg/ml were asymptomatic, 62.5% (n=5) of SX patients with iPTH levels ≤5 pg/ml required an increase in calcitriol dose to achieve both biochemical correction and symptom relief. Conclusion Prophylactic calcium and vitamin D supplementation based on post-operative iPTH levels can minimize symptomatic hypocalcemia after thyroidectomy. An iPTH ≤5 pg/ml may warrant higher initial doses of calcitriol in order to prevent symptoms. PMID:24144426
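The supplementation rules in the Methods section amount to a simple decision function on the post-operative iPTH value. The cut-off and drug names follow the abstract; the encoding below is illustrative only and is not clinical guidance:

```python
def postop_supplementation(ipth_pg_ml):
    """Sketch of the post-thyroidectomy protocol described above:
    calcium carbonate alone for iPTH >= 10 pg/ml; calcium carbonate
    plus 0.25 ug calcitriol twice daily (BID) for iPTH < 10 pg/ml.
    Illustrative encoding only -- not clinical guidance."""
    if ipth_pg_ml >= 10:
        return ["calcium carbonate"]
    return ["calcium carbonate", "calcitriol 0.25 ug BID"]

print(postop_supplementation(12))  # -> ['calcium carbonate']
print(postop_supplementation(4))   # -> ['calcium carbonate', 'calcitriol 0.25 ug BID']
```

The study's finding that patients with iPTH ≤5 pg/ml often needed a calcitriol dose increase suggests such a function could take a third branch at the lower cut-off, but the abstract does not specify the higher dose.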
Vignion-Dewalle, Anne-Sophie; Baert, Gregory; Devos, Laura; Thecua, Elise; Vicentini, Claire; Mortier, Laurent; Mordon, Serge
2017-09-01
Photodynamic therapy (PDT) is an emerging treatment modality for various diseases, especially dermatological conditions. Although the standard PDT protocol for the treatment of actinic keratoses in Europe has been shown to be effective, treatment-associated pain is often observed in patients. Different modifications to this protocol, intended to decrease pain, have been investigated; a decrease in fluence rate seems to be a promising solution. Moreover, it has been suggested that light fractionation significantly increases the efficacy of PDT. Based on a flexible light-emitting textile, the FLEXITHERALIGHT device specifically provides fractionated illumination at a fluence rate more than six times lower than that of the standard protocol. In a recently completed clinical trial of PDT for the treatment of actinic keratosis, the non-inferiority of a protocol involving illumination with the FLEXITHERALIGHT device after a short incubation time, referred to as the FLEXITHERALIGHT protocol, was assessed against the standard protocol. In this paper, we compare these two 635 nm red light protocols, each delivering 37 J/cm², in the PDT treatment of actinic keratosis through mathematical modeling. This modeling, which differs slightly from the one we have already published, enables the local damage induced by the therapy to be estimated. The comparison, performed in terms of the local damage induced by the therapy, demonstrates that the FLEXITHERALIGHT protocol, with its lower fluence rate, light fractionation, and shorter incubation time, is somewhat less efficient than the standard protocol. Nevertheless, from the clinical trial results, the FLEXITHERALIGHT protocol yields non-inferior response rates compared with the standard protocol.
This finding raises the question of whether the PDT local damage achieved by the FLEXITHERALIGHT protocol (respectively, the standard protocol) is sufficient (respectively, excessive) to destroy actinic keratosis cells. Lasers Surg. Med. 49:686-697, 2017. © 2017 Wiley Periodicals, Inc.
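The trade-off the two protocols explore follows from the basic dosimetry relation: illumination time = fluence / fluence rate, so a lower fluence rate means a proportionally longer illumination for the same 37 J/cm² dose. The dose is from the text; the fluence-rate values below are illustrative assumptions, not the trial's actual parameters:

```python
def illumination_time_s(fluence_j_cm2, fluence_rate_mw_cm2):
    """Seconds needed to deliver a light dose (J/cm^2) at a given
    fluence rate (mW/cm^2): time = fluence / rate, with mW -> W."""
    return fluence_j_cm2 / (fluence_rate_mw_cm2 / 1000.0)

DOSE = 37.0  # J/cm^2, the dose used by both protocols discussed above

# Illustrative fluence rates only: a conventional-lamp value, and one
# roughly six times lower, as for the light-emitting textile.
print(illumination_time_s(DOSE, 75.0))   # ~493 s at 75 mW/cm^2
print(illumination_time_s(DOSE, 12.5))   # ~2960 s at 12.5 mW/cm^2
```

The sixfold-lower rate stretches a few minutes of illumination to nearly an hour, which is why fractionation and a textile-based wearable emitter become practical considerations.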
Benton, Katie; Thomson, Iain; Isenring, Elisabeth; Mark Smithers, B; Agarwal, Ekta
2018-06-01
Enhanced Recovery After Surgery (ERAS) protocols have been effectively expanded to various surgical specialities including oesophagectomy. Despite nutrition being a key component, actual nutrition outcomes and specific guidelines are lacking. This cohort comparison study aims to compare nutritional status and adherence during implementation of a standardised post-operative nutritional support protocol, as part of ERAS, with those of patients who received usual care. Two groups of patients undergoing resection of oesophageal cancer were studied. Group 1 (n = 17) underwent oesophagectomy between Oct 2014 and Nov 2016 during implementation of an ERAS protocol. Patients in group 2 (n = 16) underwent oesophagectomy between Jan 2011 and Dec 2012 prior to the implementation of ERAS. Demographic, nutritional status, dietary intake and adherence data were collected. Ordinal data were analysed using independent t tests, and categorical data using chi-square tests. There was no significant difference in nutrition status, dietary intake or length of stay following implementation of an ERAS protocol. Malnutrition remained prevalent in both groups at day 42 post surgery (n = 10, 83% usual care; and n = 9, 60% ERAS). A significant difference was demonstrated in adherence with earlier initiation of oral free fluids (p < 0.008), transition to soft diet (p < 0.004) and continuation of jejunostomy feeds on discharge (p < 0.001) for the ERAS group. A standardised post-operative nutrition protocol, within an ERAS framework, results in earlier transition to oral intake; however, malnutrition remains prevalent post surgery. Further large-scale studies are warranted to examine individualised decision-making regarding nutrition support within an ERAS protocol.
NASA Astrophysics Data System (ADS)
Phister, P. W., Jr.
1983-12-01
Development of the Air Force Institute of Technology's Digital Engineering Laboratory Network (DELNET) was continued with the development of an initial draft of a protocol standard for all seven layers as specified by the International Standards Organization's (ISO) Reference Model for Open Systems Interconnections. This effort centered on the restructuring of the Network Layer to perform Datagram routing and to conform to the developed protocol standards and actual software module development of the upper four protocol layers residing within the DELNET Monitor (Zilog MCZ 1/25 Computer System). Within the guidelines of the ISO Reference Model the Transport Layer was developed utilizing the Internet Header Format (IHF) combined with the Transport Control Protocol (TCP) to create a 128-byte Datagram. Also a limited Application Layer was created to pass the Gettysburg Address through the DELNET. This study formulated a first draft for the DELNET Protocol Standard and designed, implemented, and tested the Network, Transport, and Application Layers to conform to these protocol standards.
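The 128-byte Datagram described above combined an Internet Header Format with TCP fields, but the abstract does not give the exact layout. The following sketch therefore uses a hypothetical header (version, source, destination, payload length) packed in network byte order, only to illustrate how a fixed-size datagram is assembled:

```python
import struct

# Sketch of building a fixed 128-byte datagram: a small binary header
# in network byte order, followed by the payload padded with zeros.
# The field layout (version, src, dst, length) is hypothetical; the
# thesis's actual IHF/TCP-based header is not specified in the abstract.

DATAGRAM_SIZE = 128
HEADER = struct.Struct("!BBHH")  # version, src addr, dst addr, payload length

def make_datagram(src, dst, payload: bytes) -> bytes:
    max_payload = DATAGRAM_SIZE - HEADER.size
    if len(payload) > max_payload:
        raise ValueError("payload too large for one datagram")
    header = HEADER.pack(1, src, dst, len(payload))
    return header + payload.ljust(max_payload, b"\x00")

dgram = make_datagram(2, 7, b"Four score and seven years ago...")
print(len(dgram))  # -> 128
```

Passing a longer text (such as the Gettysburg Address used as the test payload in the thesis) would require fragmenting it across several such datagrams.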
A Long-Distance RF-Powered Sensor Node with Adaptive Power Management for IoT Applications.
Pizzotti, Matteo; Perilli, Luca; Del Prete, Massimo; Fabbri, Davide; Canegallo, Roberto; Dini, Michele; Masotti, Diego; Costanzo, Alessandra; Franchi Scarselli, Eleonora; Romani, Aldo
2017-07-28
We present a self-sustained battery-less multi-sensor platform with RF harvesting capability down to -17 dBm and implementing a standard DASH7 wireless communication interface. The node operates at distances up to 17 m from a 2 W UHF carrier. RF power transfer allows operation when common energy scavenging sources (e.g., sun, heat, etc.) are not available, while the DASH7 communication protocol makes it fully compatible with a standard IoT infrastructure. An optimized energy-harvesting module has been designed, including a rectifying antenna (rectenna) and an integrated nano-power DC/DC converter performing maximum-power-point-tracking (MPPT). A nonlinear/electromagnetic co-design procedure is adopted to design the rectenna, which is optimized to operate at ultra-low power levels. An ultra-low power microcontroller controls on-board sensors and wireless protocol, to adapt the power consumption to the available detected power by changing wake-up policies. As a result, adaptive behavior can be observed in the designed platform, to the extent that the transmission data rate is dynamically determined by RF power. Among the novel features of the system, we highlight the use of nano-power energy harvesting, the implementation of specific hardware/software wake-up policies, optimized algorithms for best sampling rate implementation, and adaptive behavior by the node based on the power received.
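The adaptive behavior described above, where the transmission data rate is dynamically determined by the received RF power, can be sketched as a wake-up policy that maps detected power to a transmission interval. The -17 dBm floor is from the text; the other thresholds and intervals are illustrative assumptions, not the platform's actual policy:

```python
# Minimal sketch of power-adaptive duty cycling: the node chooses its
# wake-up/transmission interval from the RF power it detects. Only the
# -17 dBm harvesting floor comes from the text; the remaining thresholds
# and intervals are illustrative assumptions.

def transmission_interval_s(rf_power_dbm):
    """More available RF power -> more frequent transmissions."""
    if rf_power_dbm < -17:      # below the harvesting sensitivity floor
        return None             # cannot sustain operation; stay asleep
    if rf_power_dbm < -10:
        return 300              # scarce power: transmit every 5 minutes
    if rf_power_dbm < 0:
        return 60               # moderate power: every minute
    return 10                   # abundant power: every 10 seconds

print(transmission_interval_s(-20))  # -> None
print(transmission_interval_s(-5))   # -> 60
```

In the real platform this decision would also account for on-board sensor sampling cost and the stored energy in the harvesting buffer, not just instantaneous power.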
Bonalumi, Sabrina; Barbonaglia, Patrizia; Bertocchi, Carmen
2006-01-01
In 2001 the General Health Direction of Region Lombardia approved (decree n. 22303) a guideline for the prevention of latex allergic reactions in patients and health care workers. This document provides general recommendations intended to standardize behaviors in regional health care facilities, prompted by a rise in the incidence of reactions to latex products over the last 20 years. Nowadays the prevalence is higher in certain risk groups (those subject to frequent and repeated exposures) than in the general population. The aim of the project was to organize a latex-safe operating theatre in the Ospedale Maggiore Policlinico, Mangiagalli e Regina Elena of Milan (Fondazione) and to standardize behaviors in order to prevent adverse effects in latex-allergic patients. Following a literature review and the creation of a multidisciplinary team, we produced a protocol. We also requested from manufacturers certification of the latex content of their products. Results and conclusion: when latex-allergic patients need to undergo surgery in our hospital, a latex-safe operating theatre is organized by personnel following a multidisciplinary protocol. No allergic reactions have been experienced during surgical procedures since the creation of an environment as free as possible from latex contamination. The project will be extended to an emergency room, one or more rooms of a ward, and the outpatient department.
Channel Simulation in Quantum Metrology
NASA Astrophysics Data System (ADS)
Laurenza, Riccardo; Lupo, Cosmo; Spedalieri, Gaetana; Braunstein, Samuel L.; Pirandola, Stefano
2018-04-01
In this review we discuss how channel simulation can be used to simplify the most general protocols of quantum parameter estimation, where unlimited entanglement and adaptive joint operations may be employed. Whenever the unknown parameter encoded in a quantum channel is completely transferred into an environmental program state simulating the channel, the optimal adaptive estimation cannot beat the standard quantum limit. In this setting, we elucidate the crucial role of quantum teleportation as a primitive operation which allows one to completely reduce adaptive protocols over suitable teleportation-covariant channels and derive matching upper and lower bounds for parameter estimation. For these channels, we may express the quantum Cramér-Rao bound directly in terms of their Choi matrices. Our review considers both discrete- and continuous-variable systems, also presenting some new results for bosonic Gaussian channels using an alternative sub-optimal simulation. It is an open problem to design simulations for quantum channels that achieve the Heisenberg limit.
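For reference, the benchmark these simulation arguments tighten is the quantum Cramér-Rao bound; in the teleportation-covariant case described above, the bound is governed by the quantum Fisher information of the Choi matrix acting as the program state. This is a hedged paraphrase of the abstract in standard notation, not a derivation:

```latex
% Standard quantum Cramér-Rao bound for n probings of a channel
% encoding the parameter \theta, with F_Q the quantum Fisher information:
\mathrm{Var}(\tilde{\theta}) \ge \frac{1}{n\,F_Q(\rho_\theta)} .
% For a teleportation-covariant channel, the abstract states that the
% bound can be written directly in terms of the channel's Choi matrix
% \rho_E^{\theta}, which serves as the environmental program state:
\mathrm{Var}(\tilde{\theta}) \ge \frac{1}{n\,F_Q(\rho_E^{\theta})} .
```

The 1/n scaling is the standard quantum limit; the open problem mentioned at the end concerns channels whose simulations would instead permit 1/n² (Heisenberg) scaling.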
Re-engineering Nascom's network management architecture
NASA Technical Reports Server (NTRS)
Drake, Brian C.; Messent, David
1994-01-01
The development of Nascom systems for ground communications began in 1958 with Project Vanguard. The low-speed systems (rates less than 9.6 kbps) were developed following existing standards, but there were no comparable standards for high-speed systems. As a result, these systems were developed using custom protocols and custom hardware. Technology has made enormous strides since the ground support systems were implemented. Standards for computer equipment, software, and high-speed communications exist, and the performance of current workstations exceeds that of the mainframes used in the development of the ground systems. Nascom is in the process of upgrading its ground support systems and providing additional services. The Message Switching System (MSS), Communications Address Processor (CAP), and Multiplexer/Demultiplexer (MDM) Automated Control System (MACS) are all examples of Nascom systems developed using standards such as X Windows, Motif, and the Simple Network Management Protocol (SNMP). Also, the Earth Observing System (EOS) Communications (Ecom) project is stressing standards as an integral part of its network. The move towards standards has produced a reduction in development, maintenance, and interoperability costs, while providing operational quality improvement. The Facility and Resource Manager (FARM) project has been established to integrate the Nascom networks and systems into a common network management architecture. The maximization of standards and implementation of computer automation in the architecture will lead to continued cost reductions and increased operational efficiency. The first step has been to derive overall Nascom requirements and identify the functionality common to all the current management systems. The identification of these common functions will enable the reuse of processes in the management architecture and promote increased use of automation throughout the Nascom network.
The MSS, CAP, MACS, and Ecom projects have indicated the potential value of commercial-off-the-shelf (COTS) and standards through reduced cost and high quality. The FARM will allow the application of the lessons learned from these projects to all future Nascom systems.
Hallas, Gary; Monis, Paul
2015-01-01
The enumeration of bacteria using plate-based counts is a core technique used by food and water microbiology testing laboratories. However, manual counting of bacterial colonies is both time- and labour-intensive, can vary between operators, and requires manual entry of results into laboratory information management systems (LIMS), which can be a source of data entry error. An alternative is to use automated digital colony counters, but there is a lack of peer-reviewed validation data to allow incorporation into standards. We compared the performance of digital counting technology (ProtoCOL3) against manual counting using criteria defined in internationally recognized standard methods. Digital colony counting provided a robust, standardized system suitable for adoption in a commercial testing environment. The digital technology has several advantages:
• Improved measurement of uncertainty by using a standard and consistent counting methodology with less operator error.
• Efficiency in labour and time (reduced cost).
• Elimination of manual entry of data onto LIMS.
• Faster result reporting to customers.
A Lightweight Protocol for Secure Video Streaming
Morkevicius, Nerijus; Bagdonas, Kazimieras
2018-01-01
The Internet of Things (IoT) introduces many new challenges which cannot be solved using traditional cloud and host computing models. A new architecture known as fog computing is emerging to address these technological and security gaps. Traditional security paradigms focused on providing perimeter-based protections and client/server point to point protocols (e.g., Transport Layer Security (TLS)) are no longer the best choices for addressing new security challenges in fog computing end devices, where energy and computational resources are limited. In this paper, we present a lightweight secure streaming protocol for the fog computing “Fog Node-End Device” layer. This protocol is lightweight, connectionless, supports broadcast and multicast operations, and is able to provide data source authentication, data integrity, and confidentiality. The protocol is based on simple and energy efficient cryptographic methods, such as Hash Message Authentication Codes (HMAC) and symmetrical ciphers, and uses modified User Datagram Protocol (UDP) packets to embed authentication data into streaming data. Data redundancy could be added to improve reliability in lossy networks. The experimental results summarized in this paper confirm that the proposed method efficiently uses energy and computational resources and at the same time provides security properties on par with the Datagram TLS (DTLS) standard. PMID:29757988
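The core mechanism described above, embedding HMAC authentication data alongside the payload of each UDP-style packet, can be sketched with standard-library primitives. The packet layout below (sequence number, truncated HMAC-SHA256 tag, payload) and the pre-shared key are illustrative assumptions, not the paper's actual format:

```python
import hmac, hashlib, struct

# Sketch of HMAC-authenticated streaming packets: each datagram carries
# a sequence number, a truncated HMAC tag, and the payload. Layout and
# key handling are illustrative, not the paper's actual protocol.

KEY = b"pre-shared-secret"   # assumed pre-provisioned symmetric key
TAG_LEN = 16                 # truncated HMAC-SHA256 tag, per-packet

def seal(seq: int, payload: bytes) -> bytes:
    """Build an authenticated packet: seq (4 bytes) + tag + payload."""
    msg = struct.pack("!I", seq) + payload
    tag = hmac.new(KEY, msg, hashlib.sha256).digest()[:TAG_LEN]
    return struct.pack("!I", seq) + tag + payload

def open_packet(packet: bytes):
    """Verify and unpack; return (seq, payload) or None if forged."""
    seq = struct.unpack("!I", packet[:4])[0]
    tag, payload = packet[4:4 + TAG_LEN], packet[4 + TAG_LEN:]
    expected = hmac.new(KEY, packet[:4] + payload, hashlib.sha256).digest()[:TAG_LEN]
    if not hmac.compare_digest(tag, expected):
        return None  # reject tampered or corrupted packets
    return seq, payload

pkt = seal(42, b"frame-data")
print(open_packet(pkt))                 # -> (42, b'frame-data')
print(open_packet(pkt[:-1] + b"\x00"))  # -> None (tampered payload)
```

Confidentiality would additionally require a symmetric cipher over the payload, and the sequence number doubles as replay protection if the receiver tracks the highest value seen.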
Cryopreservation: a cold look at technology for fertility preservation.
Gosden, Roger
2011-08-01
To outline the history of cryopreservation technology and its contributions to reproductive medicine, including fertility preservation. A search of the relevant literature using Medline and other online tools. Research and laboratory protocol development. The biology of preserving cells at low temperatures is complex and still being unraveled. Principles were first established more than half a century ago, with progress being driven empirically and often by trial and error. The protocols vary widely, and practice is still heavily dependent on operator skill, accounting for wide differences in the success rates between centers. No single protocol fits all specimen types, and differential vulnerability to cryoinjury remains a major obstacle. Nevertheless, semen cryopreservation has long been established, embryo banking is now highly effective, and vitrification appears to overcome problems with oocytes. Protocols in the future, although specific to the cell type and tissue, are likely to evolve toward generally acknowledged standards. But heterogeneity between patients and even within samples implies that each cell may have its own peculiar optimum for minimizing cryoinjury; because protocols are therefore compromises, "perfect" preservation may be unattainable. Cryopreservation has become a mainstay in the assisted reproduction laboratory and underpins fertility preservation for patients with cancer and other conditions. The practice is currently evolving from slow freezing methods toward more vitrification, and future technology is likely to reduce dependence on operator skill, which should raise success rates to higher, more uniform levels. Copyright © 2011 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Weikuan; Vetter, Jeffrey S
Parallel NFS (pNFS) is touted as an emergent standard protocol for parallel I/O access in various storage environments. Several pNFS prototypes have been implemented for initial validation and protocol examination. Previous efforts have focused on realizing the pNFS protocol to expose the best bandwidth potential from underlying file and storage systems. In this presentation, we provide an initial characterization of two pNFS prototype implementations: lpNFS (a Lustre-based parallel NFS implementation) and spNFS (a reference implementation from Network Appliance, Inc.). We show that both lpNFS and spNFS can faithfully achieve the primary goal of pNFS, i.e., aggregating I/O bandwidth from many storage servers. However, they both face the challenge of scalable metadata management. In particular, the throughput of spNFS metadata operations degrades significantly with an increasing number of data servers. Even for the better-performing lpNFS, we discuss its architecture and propose a direct I/O request flow protocol to improve its performance.
Improving Collaboration by Standardization Efforts in Systems Biology
Dräger, Andreas; Palsson, Bernhard Ø.
2014-01-01
Collaborative genome-scale reconstruction endeavors of metabolic networks would not be possible without a common, standardized formal representation of these systems. The ability to precisely define biological building blocks together with their dynamic behavior has even been considered a prerequisite for upcoming synthetic biology approaches. Driven by the requirements of such ambitious research goals, standardization itself has become an active field of research on nearly all levels of granularity in biology. In addition to the originally envisaged exchange of computational models and tool interoperability, new standards have been suggested for an unambiguous graphical display of biological phenomena, to annotate, archive, as well as to rank models, and to describe execution and the outcomes of simulation experiments. The spectrum now even covers the interaction of entire neurons in the brain, three-dimensional motions, and the description of pharmacometric studies. Thereby, the mathematical description of systems and approaches for their (repeated) simulation are clearly separated from each other and also from their graphical representation. Minimum information definitions constitute guidelines and common operation protocols in order to ensure reproducibility of findings and a unified knowledge representation. Central database infrastructures have been established that provide the scientific community with persistent links from model annotations to online resources. A rich variety of open-source software tools thrives for all data formats, often supporting a multitude of programming languages. Regular meetings and workshops of developers and users lead to continuous improvement and ongoing development of these standardization efforts. This article gives a brief overview about the current state of the growing number of operation protocols, mark-up languages, graphical descriptions, and fundamental software support with relevance to systems biology. PMID:25538939
Evaluation of a Modified Pamidronate Protocol for the Treatment of Osteogenesis Imperfecta.
Palomo, Telma; Andrade, Maria C; Peters, Barbara S E; Reis, Fernanda A; Carvalhaes, João Tomás A; Glorieux, Francis H; Rauch, Frank; Lazaretti-Castro, Marise
2016-01-01
Intravenous pamidronate is widely used to treat children with osteogenesis imperfecta (OI). In a well-studied protocol (the 'standard protocol'), pamidronate is given at a daily dose of 1 mg per kg body weight over 4 h on 3 successive days; infusion cycles are repeated every 4 months. Here, we evaluated the renal safety of a simpler protocol for intravenous pamidronate infusions (2 mg per kg body weight given in a single infusion over 2 h, repeated every 4 months; 'modified protocol'). Results of 18 patients with OI types I, III, or IV treated with the modified protocol for 12 months were compared to those of 18 historic controls treated with the standard protocol. In the modified protocol, mild transient post-infusion increases in serum creatinine were found during each infusion, but after 12 months serum creatinine remained similar from baseline [0.40 mg/dl (SD: 0.13)] to the end of the study [0.41 mg/dl (SD: 0.11)] (P = 0.79). The two protocols led to similar changes in serum creatinine during the first pamidronate infusion [modified protocol: +2% (SD: 21%); standard protocol: -3% (SD: 8%); P = 0.32]. Areal lumbar spine bone mineral density Z-scores increased from -2.7 (SD: 1.5) to -1.8 (SD: 1.4) with the modified protocol, and from -4.1 (SD: 1.4) to -3.1 (SD: 1.1) with the standard protocol (P = 0.68 for group differences in bone density Z-score changes). The modified pamidronate protocol is safe and may have effects on bone density similar to those of the standard protocol. More studies with longer follow-up are needed to prove anti-fracture efficacy.
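The arithmetic behind the two regimens is worth making explicit: per 4-month cycle the standard protocol delivers 1 mg/kg on each of 3 days (3 mg/kg per cycle), while the modified protocol delivers a single 2 mg/kg infusion, i.e. two-thirds of the standard cyclic dose. A sketch, with the dosing figures taken from the text; the helper itself is illustrative arithmetic, not clinical guidance:

```python
def cycle_dose_mg(weight_kg, mg_per_kg_per_infusion, infusions_per_cycle):
    """Total pamidronate (mg) per 4-month cycle for a given body weight.
    Illustrative arithmetic only -- not clinical guidance."""
    return weight_kg * mg_per_kg_per_infusion * infusions_per_cycle

weight = 30  # kg, hypothetical patient
standard = cycle_dose_mg(weight, 1, 3)  # 1 mg/kg on 3 successive days
modified = cycle_dose_mg(weight, 2, 1)  # single 2 mg/kg infusion
print(standard, modified)  # -> 90 60
```

The lower cumulative dose of the modified protocol is one plausible reason its bone-density gains could at best match, not exceed, those of the standard regimen.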
Web-based monitoring and management system for integrated enterprise-wide imaging networks
NASA Astrophysics Data System (ADS)
Ma, Keith; Slik, David; Lam, Alvin; Ng, Won
2003-05-01
Mass proliferation of IP networks and the maturity of standards has enabled the creation of sophisticated image distribution networks that operate over Intranets, Extranets, Communities of Interest (CoI) and even the public Internet. Unified monitoring, provisioning and management of such systems at the application and protocol levels represent a challenge. This paper presents a web based monitoring and management tool that employs established telecom standards for the creation of an open system that enables proactive management, provisioning and monitoring of image management systems at the enterprise level and across multi-site geographically distributed deployments. Utilizing established standards including ITU-T M.3100, and web technologies such as XML/XSLT, JSP/JSTL, and J2SE, the system allows for seamless device and protocol adaptation between multiple disparate devices. The goal has been to develop a unified interface that provides network topology views, multi-level customizable alerts, real-time fault detection as well as real-time and historical reporting of all monitored resources, including network connectivity, system load, DICOM transactions and storage capacities.
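The multi-level customizable alerts mentioned above can be sketched as a threshold ladder applied to a monitored resource such as storage utilization. The level names and thresholds are illustrative assumptions, not the system's actual configuration:

```python
# Minimal sketch of multi-level threshold alerting for a monitored
# resource (here, storage utilization on a 0..1 scale). Levels and
# thresholds are illustrative, not the described system's config.

ALERT_LEVELS = [        # (minimum utilization, alert level), high to low
    (0.95, "critical"),
    (0.85, "major"),
    (0.70, "minor"),
]

def storage_alert(utilization):
    """Map a utilization reading to an alert level, or None if nominal."""
    for threshold, level in ALERT_LEVELS:
        if utilization >= threshold:
            return level
    return None

print(storage_alert(0.97))  # -> critical
print(storage_alert(0.50))  # -> None
```

In the described system, comparable ladders would exist per resource class (network connectivity, system load, DICOM transaction rates, storage capacity), with the thresholds customizable per site.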
Meeting the Challenge of Distributed Real-Time & Embedded (DRE) Systems
2012-05-10
[Slide extract: a layered DRE architecture in which DRE applications run over middleware services and middleware, atop operating systems and protocols (including IP and an RTOS) and hardware and networks; COTS and standards-based middleware, language, OS, network, and hardware platforms, including Real-time CORBA (TAO) middleware and the ADAPTIVE Communication Environment (ACE); and software product lines (SPLs) spanning F-15, AV-8B, F/A-18, and UCAV product variants over common hardware (CPU, memory, I/O) and OS.]
Branched Nerve Allografts to Improve Outcomes in Facial Composite Tissue Transplantation
2017-12-01
Ethicon, Inc., Somerville, N.J.). Postoperatively, animals were recovered per standard protocol in the animal care facility. ... a human xenograft with or without oral Tacrolimus. Electrophysiologic assessments were performed pre-operatively and at the study endpoint (24 weeks
DDN (Defense Data Network) Protocol Handbook. Volume 1. DoD Military Standard Protocols
1985-12-01
Official Military Standard communication protocols in use on the DDN are included, as are several ARPANET (Advanced Research Projects Agency Network) research protocols which are currently in use, and some protocols currently undergoing review. Tutorial information and auxiliary documents are also ... compatible with DoD needs, by researchers wishing to improve the protocols, and by implementors of local area networks (LANs) wishing their ...
Gallistel, C. R.; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam
2014-01-01
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer. PMID:24637442
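The analysis of large time-stamped behavioral event records described above can be illustrated with a minimal sketch, written here in Python rather than the authors' MATLAB-based language. The event names, timestamps, and summary statistics are invented for illustration:

```python
# Minimal sketch (not the authors' code) of summarizing a time-stamped
# behavioral event record: count head entries per hopper and compute a
# typical inter-entry interval. Data are invented for illustration.
from statistics import median
from collections import Counter

# (timestamp_seconds, event) pairs; hopper names are illustrative
events = [(0.0, "hopper1"), (2.5, "hopper2"), (3.1, "hopper1"),
          (7.4, "hopper1"), (9.0, "hopper3")]

counts = Counter(e for _, e in events)
times = [t for t, _ in events]
intervals = [b - a for a, b in zip(times, times[1:])]

print(counts["hopper1"])                 # prints 3
print(round(median(intervals), 2))       # prints 2.05
```

In the actual system, routines like this run several times a day over harvested data, preserving a full data trail from raw events through intermediate analyses to published graphs.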
James Webb Space Telescope - L2 Communications for Science Data Processing
NASA Technical Reports Server (NTRS)
Johns, Alan; Seaton, Bonita; Gal-Edd, Jonathan; Jones, Ronald; Fatig, Curtis; Wasiak, Francis
2008-01-01
JWST is the first NASA mission at the second Lagrange point (L2) to identify the need for data rates higher than 10 megabits per second (Mbps). JWST will produce approximately 235 gigabits of science data every day that will be downlinked to the Deep Space Network (DSN). Achieving the desired data rates required moving away from X-band frequencies to Ka-band frequencies, and the DSN is upgrading its infrastructure to accomplish this transition. This new frequency range is becoming the standard for high-data-rate science missions at L2. In the context of the new frequency range, this paper discusses alternative antenna deployments, off-nominal scenarios, NASA's implementation of the 26 GHz Ka-band, and navigation requirements. JWST is also using the Consultative Committee for Space Data Systems (CCSDS) standard process for reliable file transfer, the CCSDS File Delivery Protocol (CFDP). For JWST, the use of the CFDP protocol provides level zero processing at the DSN site. This paper also addresses NASA's implementation of ground stations in support of the 26 GHz Ka-band and lessons learned from implementing a file-based (CFDP) operational protocol system.
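The core idea behind reliable file delivery protocols such as CFDP can be sketched simply: a file is split into numbered segments, and the receiver reports gaps (NAKs) so that only missing segments are retransmitted. The sketch below is a drastic simplification for illustration, not the CFDP protocol itself; the function names are invented.

```python
# Illustrative sketch of gap-driven retransmission, the idea underlying
# reliable file delivery protocols such as CCSDS CFDP. This is a
# simplification, not CFDP; names are hypothetical.
def segment(data: bytes, size: int):
    """Split a file into numbered fixed-size segments (PDU payloads)."""
    return {i: data[o:o + size] for i, o in enumerate(range(0, len(data), size))}

def missing_segments(received: dict, total: int):
    """Receiver-side gap report: the segment numbers to NAK."""
    return [i for i in range(total) if i not in received]

pdus = segment(b"science data from L2", 6)                # 4 segments
received = {i: p for i, p in pdus.items() if i != 1}      # segment 1 lost
print(missing_segments(received, len(pdus)))              # prints [1]
```

Only the NAKed segments need to cross the long-haul link again, which is what makes this class of protocol attractive for high-latency space links.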
Reilly, K A; Beard, D J; Barker, K L; Dodd, C A F; Price, A J; Murray, D W
2005-10-01
Unicompartmental knee arthroplasty (UKA) is appropriate for one in four patients with osteoarthritic knees. This study was performed to compare the safety, effectiveness, and economic viability of a new accelerated protocol with current standard care in a state healthcare system. A single-blind RCT design was used. Eligible patients were screened for NSAID tolerance, social circumstances, and geographical location before allocation to an accelerated recovery group (A) or standard care group (S). The primary outcome was the Oxford Knee Assessment at 6 months post-operation, compared using independent Mann-Whitney U-tests. A simple difference in costs incurred was calculated. The study power was sufficient to avoid type 2 errors. Forty-one patients were included. The average stay for Group A was 1.5 days; Group S averaged 4.3 days. No significant difference in outcomes was found between groups. The new protocol achieved cost savings of 27% and significantly reduced hospital bed occupancy. In addition, patient satisfaction was greater with the accelerated discharge than with the routine discharge time. The strict inclusion criteria meant that 75% of eligible patients were excluded; however, a large percentage of these exclusions were due to the distances patients lived from the hospital.
Performance evaluation of reactive and proactive routing protocol in IEEE 802.11 ad hoc network
NASA Astrophysics Data System (ADS)
Hamma, Salima; Cizeron, Eddy; Issaka, Hafiz; Guédon, Jean-Pierre
2006-10-01
Wireless technology based on the IEEE 802.11 standard is widely deployed. This technology is used to support multiple types of communication services (data, voice, image) with different QoS requirements. A MANET (Mobile Ad hoc NETwork) does not require a fixed infrastructure. Mobile nodes communicate through multihop paths. The wireless communication medium has variable and unpredictable characteristics. Furthermore, node mobility creates a continuously changing communication topology in which paths break and new ones form dynamically. The routing table of each router in an ad hoc network must be kept up-to-date. MANETs use Distance Vector or Link State algorithms which ensure that the route to every host is always known. However, this approach must take into account the specific characteristics of ad hoc networks: dynamic topologies, limited bandwidth, energy constraints, limited physical security, ... Two main categories of routing protocols are studied in this paper: proactive protocols (e.g. Optimised Link State Routing - OLSR) and reactive protocols (e.g. Ad hoc On Demand Distance Vector - AODV, Dynamic Source Routing - DSR). The proactive protocols are based on periodic exchanges that update the routing tables to all possible destinations, even if no traffic goes through. The reactive protocols are based on on-demand route discoveries that update routing tables only for the destinations that have traffic going through. The present paper focuses on the study and performance evaluation of these categories using NS2 simulations. We have considered qualitative and quantitative criteria. The first concerns distributed operation, loop-freedom, security, and sleep-period operation. The second are used to assess the performance of the different routing protocols presented in this paper: end-to-end data delay, jitter, packet delivery ratio, routing load, and activity distribution.
A comparative study is presented, taking a number of networking contexts into consideration, and the results show the appropriate routing protocol for two kinds of communication services (data and voice).
The Interplanetary Internet: A Communications Infrastructure for Mars Exploration
NASA Astrophysics Data System (ADS)
Burleigh, S.; Cerf, V.; Durst, R.; Fall, K.; Hooke, A.; Scott, K.; Weiss, H.
2002-01-01
A successful program of Mars Exploration will depend heavily on a robust and dependable space communications infrastructure that is well integrated with the terrestrial Internet. In the same way that the underpinnings of the Internet are the standardized "TCP/IP" suite of protocols, an "Interplanetary Internet" will need a similar set of capabilities that can support reliable communications across vast distances and highly stressed communications environments. For the past twenty years, the Consultative Committee for Space Data Systems (CCSDS) has been developing standardized long-haul space link communications techniques that are now in use by over two hundred missions within the international space community. New CCSDS developments, shortly to be infused into Mars missions, include a proximity link standard and a store-and-forward file transfer protocol. As part of its 'Next Generation Internet' initiative, the U.S. Defense Advanced Research Projects Agency (DARPA) recently supported an architectural study of a future "InterPlaNetary Internet" (IPN). The IPN architecture assumes that in short-delay environments - such as on and around Mars - standard Internet technologies will be adapted to the locally harsh environment and deployed within surface vehicles and orbiting relays. A long-haul interplanetary backbone network that includes Deep Space Network (DSN) gateways into the terrestrial Internet will interconnect these distributed internets that are scattered across the Solar System. Just as TCP/IP unites the Earth's "network of networks" to become the Internet, a new suite of protocols known as "Bundling" will enable the IPN to become a "network of internets" to support true interplanetary dialog. An InterPlaNetary Internet Research Group has been established within the Internet community to coordinate this research, and NASA has begun to support the further development of the IPN architecture and the Bundling protocols.
A strategy is being developed whereby the current set of standard CCSDS data communications protocols can be incrementally evolved so that true InterPlaNetary Internet operations are feasible by the end of the decade. The strategy - which is already in progress via the deployment of Mars relay links - requires individual missions each to contribute increments of capability so that a standard communications infrastructure can rapidly accrete. This paper will describe the IPN architectural concepts, discuss the current set of standard data communications capabilities that exist to support Mars exploration, and review the proposed new developments. We will also postulate that the concept is scalable and can grow to support future scenarios where human intelligence is widely distributed across the Solar System and day-to-day communications dialog among planets is routine.
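The store-and-forward behavior at the heart of "Bundling" can be sketched conceptually: a relay node takes custody of bundles while the long-haul link is unavailable and forwards them when a contact opens. The class and method names below are invented for illustration; real bundle protocols add persistent storage, custody signaling, and expiry.

```python
# Conceptual sketch of store-and-forward "Bundling": hold bundles in
# custody while the link is down, forward when a contact opens.
# Names are illustrative only.
from collections import deque

class BundleNode:
    def __init__(self):
        self.custody = deque()        # bundles awaiting a contact

    def accept(self, bundle: bytes):
        self.custody.append(bundle)   # take custody: store, don't drop

    def contact_open(self, transmit):
        """Forward every stored bundle over the now-available link."""
        while self.custody:
            transmit(self.custody.popleft())

relay = BundleNode()
relay.accept(b"telemetry-1")
relay.accept(b"telemetry-2")
delivered = []
relay.contact_open(delivered.append)
print(delivered)   # prints [b'telemetry-1', b'telemetry-2']
```

This contrasts with TCP/IP's end-to-end assumption of a continuously connected path, which breaks down over interplanetary distances.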
Kaufman, David R; Sheehan, Barbara; Stetson, Peter; Bhatt, Ashish R; Field, Adele I; Patel, Chirag; Maisel, James Mark
2016-10-28
The process of documentation in electronic health records (EHRs) is known to be time-consuming, inefficient, and cumbersome. The use of dictation coupled with manual transcription has become an increasingly common practice. In recent years, natural language processing (NLP)-enabled data capture has become a viable alternative for data entry. It enables the clinician to maintain control of the process and potentially reduce the documentation burden. The question remains how this NLP-enabled workflow will impact EHR usability and whether it can meet the structured data and other EHR requirements while enhancing the user's experience. The objective of this study is to evaluate the comparative effectiveness of an NLP-enabled data capture method using dictation and data extraction from transcribed documents (NLP Entry) in terms of documentation time, documentation quality, and usability versus standard EHR keyboard-and-mouse data entry. This formative study investigated the results of using 4 combinations of NLP Entry and Standard Entry methods ("protocols") of EHR data capture. We compared a novel dictation-based protocol using MediSapien NLP (NLP-NLP) for structured data capture against a standard structured data capture protocol (Standard-Standard) as well as 2 novel hybrid protocols (NLP-Standard and Standard-NLP). The 31 participants included neurologists, cardiologists, and nephrologists. Participants generated 4 consultation or admission notes using the 4 documentation protocols. We recorded the time on task, documentation quality (using the Physician Documentation Quality Instrument, PDQI-9), and usability of the documentation processes. A total of 118 notes were documented across the 3 subject areas.
The NLP-NLP protocol required a median of 5.2 minutes per cardiology note, 7.3 minutes per nephrology note, and 8.5 minutes per neurology note compared with 16.9, 20.7, and 21.2 minutes, respectively, using the Standard-Standard protocol and 13.8, 21.3, and 18.7 minutes using the Standard-NLP protocol (1 of 2 hybrid methods). Using 8 out of 9 characteristics measured by the PDQI-9 instrument, the NLP-NLP protocol received a median quality score sum of 24.5; the Standard-Standard protocol received a median sum of 29; and the Standard-NLP protocol received a median sum of 29.5. The mean total score of the usability measure was 36.7 when the participants used the NLP-NLP protocol compared with 30.3 when they used the Standard-Standard protocol. In this study, the feasibility of an approach to EHR data capture involving the application of NLP to transcribed dictation was demonstrated. This novel dictation-based approach has the potential to reduce the time required for documentation and improve usability while maintaining documentation quality. Future research will evaluate the NLP-based EHR data capture approach in a clinical setting. It is reasonable to assert that EHRs will increasingly use NLP-enabled data entry tools such as MediSapien NLP because they hold promise for enhancing the documentation process and end-user experience. ©David R. Kaufman, Barbara Sheehan, Peter Stetson, Ashish R. Bhatt, Adele I. Field, Chirag Patel, James Mark Maisel. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 28.10.2016.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nielsen, Yousef W., E-mail: yujwni01@heh.regionh.d; Eiberg, Jonas P., E-mail: Eiberg@dadlnet.d; Logager, Vibeke B., E-mail: viloe@heh.regionh.d
The purpose of this study was to determine the diagnostic performance of 3T whole-body magnetic resonance angiography (WB-MRA) using a hybrid protocol in comparison with a standard protocol in patients with peripheral arterial disease (PAD). In 26 consecutive patients with PAD two different protocols were used for WB-MRA: a standard sequential protocol (n = 13) and a hybrid protocol (n = 13). WB-MRA was performed using a gradient echo sequence, body coil for signal reception, and gadoterate meglumine as contrast agent (0.3 mmol/kg body weight). Two blinded observers evaluated all WB-MRA examinations with regard to presence of stenoses, as well as diagnostic quality and degree of venous contamination in each of the four stations used in WB-MRA. Digital subtraction angiography served as the method of reference. Sensitivity for detecting significant arterial disease (luminal narrowing ≥ 50%) using standard-protocol WB-MRA for the two observers was 0.63 (95% CI: 0.51-0.73) and 0.66 (0.58-0.78). Specificities were 0.94 (0.91-0.97) and 0.96 (0.92-0.98), respectively. In the hybrid-protocol WB-MRA sensitivities were 0.75 (0.64-0.84) and 0.70 (0.58-0.80), respectively. Specificities were 0.93 (0.88-0.96) and 0.95 (0.91-0.97). Interobserver agreement was good using both the standard and the hybrid protocol, with κ = 0.62 (0.44-0.67) and κ = 0.70 (0.59-0.79), respectively. WB-MRA quality scores were significantly higher in the lower leg using the hybrid protocol compared to the standard protocol (p = 0.003 and p = 0.03, observers 1 and 2). Distal venous contamination scores were significantly lower with the hybrid protocol (p = 0.02 and p = 0.01, observers 1 and 2). In conclusion, hybrid-protocol WB-MRA shows a better diagnostic performance than standard-protocol WB-MRA at 3 T in patients with PAD.
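The sensitivity and specificity figures above are derived from counts against the reference standard (DSA). The sketch below shows the arithmetic; the counts are invented for illustration and are not the study's data, although they reproduce two of the reported point estimates.

```python
# Sketch of how sensitivity and specificity are computed from counts
# against a reference standard. The counts below are invented, chosen
# only so the results match two of the reported point estimates.
def sensitivity(tp: int, fn: int) -> float:
    """True positives / all diseased."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True negatives / all healthy."""
    return tn / (tn + fp)

tp, fn, tn, fp = 30, 10, 57, 3      # hypothetical segment-level counts
print(sensitivity(tp, fn))           # prints 0.75
print(specificity(tn, fp))           # prints 0.95
```

The confidence intervals reported in the abstract would then come from an interval estimator (e.g. a binomial proportion interval) applied to these proportions.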
Quantum Bit Commitment and the Reality of the Quantum State
NASA Astrophysics Data System (ADS)
Srikanth, R.
2018-01-01
Quantum bit commitment is insecure in the standard non-relativistic quantum cryptographic framework, essentially because Alice can exploit quantum steering to defer making her commitment. Two assumptions in this framework are that: (a) Alice knows the ensembles of evidence E corresponding to either commitment; and (b) system E is quantum rather than classical. Here, we show how relaxing assumption (a) or (b) can render her malicious steering operation indeterminable or nonexistent, respectively. Finally, we present a secure protocol that relaxes both assumptions in a quantum teleportation setting. Without appeal to an ontological framework, we argue that the protocol's security entails the reality of the quantum state, provided retrocausality is excluded.
Tips on hybridizing, washing, and scanning affymetrix microarrays.
Ares, Manuel
2014-02-01
Starting in the late 1990s, Affymetrix, Inc. produced a commercial system for hybridizing, washing, and scanning microarrays that was designed to be easy to operate and reproducible. The system used arrays packaged in a plastic cassette or chamber in which the prefabricated array was mounted and could be filled with fluid through resealable membrane ports either by hand or by an automated "fluidics station" specially designed to handle the arrays. A special rotating hybridization oven and a specially designed scanner were also required. Primarily because of automation and standardization the Affymetrix system was and still remains popular. Here, we provide a skeleton protocol with the potential pitfalls identified. It is designed to augment the protocols provided by Affymetrix.
Laugisch, Oliver; Ramseier, Christoph A; Salvi, Giovanni E; Hägi, Tobias T; Bürgin, Walter; Eick, Sigrun; Sculean, Anton
2016-11-01
The aim of this study was to compare early wound healing, tooth staining, and patient acceptance with two different post-surgical maintenance protocols. Forty patients scheduled for flap surgery to treat periodontal pockets or accommodate dental implants were randomly assigned to one of the following two post-surgical maintenance protocols: (a) 2 weeks of rinsing with a 0.05% chlorhexidine digluconate (CHX)/herbal extract combination (test) or (b) a 0.1% CHX solution (control). Early wound healing was evaluated clinically and immunologically. Tooth staining and patient acceptance were assessed by means of a visual analogue scale (VAS). Both groups presented comparable wound healing profiles. No statistically significant differences were observed between the two protocols regarding early wound healing and plaque index (p > 0.05). However, in the control group, statistically significantly more patients felt discomfort due to tooth staining (p = 0.0467). Compared with patients from the test group, patients in the control group reported statistically significantly more irritation of taste at week 1 (p = 0.0359) and at week 2 (p = 0.0042). The present findings indicate that the two CHX protocols resulted in comparable healing and inhibition of plaque formation. Tooth staining and subjective discomfort related to irritation of taste were more frequent in the control group. A post-operative protocol including 0.05% CHX/herbal extract may have the potential to improve patient compliance during post-operative maintenance.
Tan, Swee Jin; Phan, Huan; Gerry, Benjamin Michael; Kuhn, Alexandre; Hong, Lewis Zuocheng; Min Ong, Yao; Poon, Polly Suk Yean; Unger, Marc Alexander; Jones, Robert C.; Quake, Stephen R.; Burkholder, William F.
2013-01-01
Library preparation for next-generation DNA sequencing (NGS) remains a key bottleneck in the sequencing process which can be relieved through improved automation and miniaturization. We describe a microfluidic device for automating laboratory protocols that require one or more column chromatography steps and demonstrate its utility for preparing Next Generation sequencing libraries for the Illumina and Ion Torrent platforms. Sixteen different libraries can be generated simultaneously with significantly reduced reagent cost and hands-on time compared to manual library preparation. Using an appropriate column matrix and buffers, size selection can be performed on-chip following end-repair, dA tailing, and linker ligation, so that the libraries eluted from the chip are ready for sequencing. The core architecture of the device ensures uniform, reproducible column packing without user supervision and accommodates multiple routine protocol steps in any sequence, such as reagent mixing and incubation; column packing, loading, washing, elution, and regeneration; capture of eluted material for use as a substrate in a later step of the protocol; and removal of one column matrix so that two or more column matrices with different functional properties can be used in the same protocol. The microfluidic device is mounted on a plastic carrier so that reagents and products can be aliquoted and recovered using standard pipettors and liquid handling robots. The carrier-mounted device is operated using a benchtop controller that seals and operates the device with programmable temperature control, eliminating any requirement for the user to manually attach tubing or connectors. In addition to NGS library preparation, the device and controller are suitable for automating other time-consuming and error-prone laboratory protocols requiring column chromatography steps, such as chromatin immunoprecipitation. PMID:23894273
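The device's ability to run "multiple routine protocol steps in any sequence" amounts to scripting an ordered list of operations. The sketch below illustrates the idea only; the step names, parameters, and runner are hypothetical and are not the controller's actual API.

```python
# Illustrative sketch of scripting a sequence of protocol steps of the
# kind such a controller automates (mix, incubate, pack/load/wash/elute
# a column). Step names and parameters are hypothetical.
def run_protocol(steps):
    """Execute steps in order; here we just record what would be done."""
    log = []
    for name, params in steps:
        settings = ", ".join(f"{k}={v}" for k, v in params.items())
        log.append(f"{name}({settings})")
    return log

library_prep = [
    ("mix",         {"reagent": "end-repair", "seconds": 30}),
    ("incubate",    {"celsius": 20, "minutes": 30}),
    ("pack_column", {"matrix": "size-selection"}),
    ("load",        {"sample": "ligated DNA"}),
    ("elute",       {"volume_ul": 25}),
]
for line in run_protocol(library_prep):
    print(line)
```

Because the steps are data rather than fixed firmware, the same machinery can drive other column-chromatography protocols, such as chromatin immunoprecipitation, by swapping in a different step list.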
Pireau, Nathalie; Cordemans, Virginie; Banse, Xavier; Irda, Nadia; Lichtherte, Sébastien; Kaminski, Ludovic
2017-11-01
Spine surgery still remains a challenge for every spine surgeon, aware of the potentially serious outcomes of misplaced instrumentation. Though many studies have highlighted that using intraoperative cone beam CT imaging and navigation systems provides higher accuracy than conventional freehand methods for placement of pedicle screws in spine surgery, few studies address how to reduce radiation exposure for patients with the use of such technology. One of the main focuses of this study is the ALARA principle (as low as reasonably achievable). A prospective randomized trial was conducted in the hybrid operating room between December 2015 and December 2016, including 50 patients operated on for posterior instrumented thoracic and/or lumbar spinal fusion. Patients were randomized to an intraoperative 3D-acquisition high-dose (standard-dose) or low-dose protocol, and a total of 216 pedicle screws were analyzed in terms of screw position. Two different methods were used to measure ionizing radiation: the total skin dose (derived from the dose-area product) and the radiation dose evaluated by thermoluminescent dosimeters on the surgical field. According to the Gertzbein and Heary classifications, the low-dose protocol provided significantly higher accuracy of pedicle screw placement than the high-dose protocol (96.1% versus 92%, respectively). Seven screws (3.2%), all implanted with the high-dose protocol, needed to be revised intraoperatively. The use of low-dose acquisition protocols reduced patient exposure by a factor of five. This study emphasizes the paramount importance of using low-dose protocols for intraoperative cone beam CT imaging coupled with a navigation system: it does not compromise the accuracy of pedicle screw placement and delivers drastically less radiation.
CoAP-Based Mobility Management for the Internet of Things
Chun, Seung-Man; Kim, Hyun-Su; Park, Jong-Tae
2015-01-01
Most of the current mobility management protocols such as Mobile IP and its variants standardized by the IETF may not be suitable to support mobility management for Web-based applications in an Internet of Things (IoT) environment. This is because the sensor nodes have limited power capacity, usually operating in sleep/wakeup mode in a constrained wireless network. In addition, sometimes the sensor nodes may act as the server using the CoAP protocol in an IoT environment. This makes it difficult for Web clients to properly retrieve the sensing data from the mobile sensor nodes in an IoT environment. In this article, we propose a mobility management protocol, named CoMP, which can effectively retrieve the sensing data of sensor nodes while they are moving. The salient feature of CoMP is that it makes use of the IETF CoAP protocol for mobility management, instead of using Mobile IP. Thus CoMP can eliminates the additional signaling overhead of Mobile IP, provides reliable mobility management, and prevents the packet loss. CoMP employs a separate location management server to keep track of the location of the mobile sensor nodes. In order to prevent the loss of important sensing data during movement, a holding mode of operation has been introduced. All the signaling procedures including discovery, registration, binding and holding have been designed by extending the IETF CoAP protocol. The numerical analysis and simulation have been done for performance evaluation in terms of the handover latency and packet loss. The results show that the proposed CoMP is superior to previous mobility management protocols, i.e., Mobile IPv4/v6 (MIPv4/v6), Hierarchical Mobile IPv4/v6 (HMIPv4/v6), in terms of the handover latency and packet loss. PMID:26151214
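The location-registration bookkeeping at the center of such a scheme can be sketched as follows. This models only the binding table maintained by a location management server, not the CoAP messages themselves; the class, method names, and addresses are hypothetical, not CoMP's actual design.

```python
# Highly simplified sketch of location management for mobile sensor
# nodes: a node re-registers its address after moving, and clients
# resolve the current address before contacting it. Names and addresses
# are illustrative; this is not the CoMP protocol itself.
class LocationServer:
    def __init__(self):
        self.bindings = {}                 # node id -> current address

    def register(self, node_id: str, address: str):
        self.bindings[node_id] = address   # binding update after movement

    def resolve(self, node_id: str):
        return self.bindings.get(node_id)  # None if the node is unknown

lms = LocationServer()
lms.register("sensor-7", "2001:db8::1")
lms.register("sensor-7", "2001:db8::2")    # node moved; rebind
print(lms.resolve("sensor-7"))             # prints 2001:db8::2
```

A holding mode, as described in the abstract, would additionally buffer sensing data at the node or server while a binding update is in flight, so that movement does not lose data.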
[Stapled transanal rectal resection (STARR) in the treatment of rectocele: personal experience].
Guarnieri, Alfredo; Cesaretti, Manuela; Tirone, Andrea; Vuolo, Giuseppe; Verre, Luigi; Savelli, Vinno; Piccolomini, Alessandro; Di Cosmo, Leonardo; Carli, Anton Ferdinando; Burroni, Mariagrazia; Pitzalis, Marcella
2008-01-01
Rectocele is an organic cause of chronic constipation, with a prevalence ranging from 8.95% to 12% in Europe and the United States. Rectocele repair necessarily requires a surgical operation. Stapled transanal rectal resection (STARR) is safe and effective in the treatment of obstructed defecation syndrome. The authors' experience suggests that the operation needs to be combined with rehabilitation exercises, before and after the surgical treatment, in order to strengthen the muscles of the pelvic floor. From January 2005 to January 2007, 20 patients with outlet obstruction underwent STARR. Patients were selected for operation based on a strict diagnostic protocol: anamnesis, clinical examination, coloproctological and urogynaecological examinations, defecography, anorectal manometry, transrectal ultrasonography, and perineal electromyography. The therapeutic protocol consists of 3 parts: phase I, rehabilitation of the pelvic floor; phase II, surgical operation; phase III, post-surgical rehabilitation of the pelvic floor. The clinical result was classified as excellent (6 patients), when all constipation symptoms disappeared; good (11 patients), when the patient had 1 or 2 obstructed defecation episodes treated with a laxative; fairly good (2 patients), more than 2 episodes; and poor (1 patient), when the operation did not improve any of the symptoms. Our results, confirmed by the literature, suggest that Longo's technique should be considered the gold standard for rectocele treatment.
Saade, Charbel; El-Merhi, Fadi; El-Achkar, Bassam; Kerek, Racha; Vogl, Thomas J; Maroun, Gilbert Georges; Jamjoom, Lamia; Al-Mohiy, Hussain; Naffaa, Lena
Caudocranial scan direction and contrast injection timing based on measured patient vessel dynamics can significantly improve arterial and aneurysmal opacification and reduce both contrast and radiation dose in the assessment of thoracic aortic aneurysms (TAA) using helical thoracic computed tomography angiography (CTA). To investigate opacification of the thoracic aorta and TAA using a caudocranial scan direction and a patient-specific contrast protocol, thoracic aortic CTA was performed in 160 consecutive patients with suspected TAA using a 256-slice computed tomography scanner and a dual-barrel contrast injector. Patients were assigned in equal numbers to one of two contrast protocols. Patient age and sex were equally distributed across both groups. Protocol A, the department's standard protocol, consisted of a craniocaudal scan direction with 100 mL of contrast, intravenously injected at a flow rate of 4.5 mL/s. Protocol B involved a caudocranial scan direction and a novel contrast formula based on patient cardiovascular dynamics, followed by 100 mL of saline at 4.5 mL/s. Each scan acquisition comprised 120 kVp, 200 mA with modulation, temporal resolution of 0.27 seconds, and pitch of 0.889:1. The dose-length product was measured for each protocol, and the data generated were compared using Mann-Whitney U nonparametric statistics. Receiver operating characteristic, visual grading characteristic (VGC), and κ analyses were performed. Mean opacification in the thoracic aorta and aneurysm was 24% and 55%, respectively. The mean contrast volume was significantly lower in protocol B (73 ± 10 mL) compared with A (100 ± 1 mL) (P < .001). The contrast-to-noise ratio demonstrated significant differences between the protocols (protocol A, 18.2 ± 12.9; protocol B, 29.7 ± 0.61; P < .003). Mean effective dose in protocol B (2.6 ± 0.4 mSv) was reduced by 19% compared with A (3.2 ± 0.8 mSv) (P < .004).
Aneurysmal detectability increased significantly with protocol B compared with A on both receiver operating characteristic and visual grading characteristic analyses (P < 0.02), and reader agreement improved from poor to excellent. A significant increase in the visualization of TAAs with a caudocranial scan direction during helical thoracic CTA can be achieved using a low contrast volume based on a patient-specific contrast formula.
Space-Based Telemetry and Range Safety (STARS) Study
NASA Technical Reports Server (NTRS)
Hogie, Keith; Criscuolo, Ed; Parise, Ron
2004-01-01
This presentation will describe the design, development, and testing of a system to collect telemetry, format it into UDP/IP packets, and deliver it to a ground test range using standard IP technologies over a TDRSS link. It will discuss the goal of the STARS IP Formatter along with the overall design and present performance results of the current version of the IP formatter. Finally, it will discuss key issues for supporting constant-rate telemetry data delivery when using standard components such as PCI/104 processors, the Linux operating system, Internet Protocols, and synchronous serial interfaces.
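The core operation described here, wrapping telemetry frames in UDP/IP packets, can be sketched as follows. The frame layout (a sync word plus a 32-bit sequence counter ahead of the payload) and the port number are hypothetical illustrations, not the STARS format:

```python
import socket
import struct

SYNC = 0x1ACFFC1D  # hypothetical sync marker, not the actual STARS value

def build_packet(seq, payload):
    """Prepend a minimal header (32-bit sync word + 32-bit sequence counter)."""
    return struct.pack(">II", SYNC, seq) + payload

def send_telemetry(frames, addr=("127.0.0.1", 5005)):
    """Format each telemetry frame as one UDP datagram and send it."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for seq, frame in enumerate(frames):
            sock.sendto(build_packet(seq, frame), addr)
    finally:
        sock.close()

# The 8-byte header plus a 3-byte payload yields an 11-byte datagram body.
pkt = build_packet(7, b"\x00\x01\x02")
```

A real formatter would also have to pace datagrams to hold a constant telemetry rate, which is one of the issues the presentation flags for PCI/104-class hardware.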
The Interlibrary Loan Protocol: An OSI Solution to ILL Messaging.
ERIC Educational Resources Information Center
Turner, Fay
1990-01-01
Discusses the interlibrary loan (ILL) protocol, a standard based on the principles of the Open Systems Interconnection (OSI) Reference Model. Benefits derived from protocol use are described, the status of the protocol as an international standard is reviewed, and steps taken by the National Library of Canada to facilitate migration to an ILL…
An engineering database management system for spacecraft operations
NASA Technical Reports Server (NTRS)
Cipollone, Gregorio; Mckay, Michael H.; Paris, Joseph
1993-01-01
Studies at ESOC have demonstrated the feasibility of a flexible and powerful Engineering Database Management System (EDMS) in support of spacecraft operations documentation. The objectives set out were three-fold: first, an analysis of the problems encountered by the Operations team in obtaining and managing operations documents; secondly, the definition of a concept for operations documentation and the implementation of a prototype to prove the feasibility of the concept; and thirdly, the definition of standards and protocols required for the exchange of data between the top-level partners in a satellite project. The EDMS prototype was populated with ERS-1 satellite design data and has been used by the operations team at ESOC to gather operational experience. An operational EDMS would be implemented at the satellite prime contractor's site as a common database for all technical information surrounding a project and would be accessible by the co-contractors' and ESA teams.
Ko, Dae-Hyun; Ji, Misuk; Kim, Sollip; Cho, Eun-Jung; Lee, Woochang; Yun, Yeo-Min; Chun, Sail; Min, Won-Ki
2016-01-01
The results of urine sediment analysis have been reported semiquantitatively. However, as recent guidelines recommend quantitative reporting of urine sediment, and with the development of automated urine sediment analyzers, there is an increasing need for quantitative analysis of urine sediment. Here, we developed a protocol for urine sediment analysis and quantified the results. Based on questionnaires, various reports, guidelines, and experimental results, we developed a protocol for urine sediment analysis. The results of this new protocol were compared with those obtained with a standardized chamber and an automated sediment analyzer. Reference intervals were also estimated using the new protocol. We developed a protocol with centrifugation at 400 g for 5 min, with an average concentration factor of 30. The correlations between the quantitative results of urine sediment analysis, the standardized chamber, and the automated sediment analyzer were generally good. The conversion factor derived from the new protocol showed a better fit with the results of manual counting than the default conversion factor in the automated sediment analyzer. We developed a protocol for manual urine sediment analysis to report the results quantitatively. This protocol may provide a means for the standardization of urine sediment analysis.
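The quantitative step in a protocol like this reduces to dividing the chamber count by the concentration factor. A minimal sketch of that arithmetic, using the abstract's factor of 30 but with illustrative counts and chamber volume (not values from the study):

```python
def cells_per_microliter(cells_counted, chamber_volume_ul, concentration_factor=30):
    """Convert a sediment chamber count back to cells per uL of the original urine.

    cells_counted        -- cells seen in the examined chamber volume
    chamber_volume_ul    -- volume of sediment examined, in microliters
    concentration_factor -- fold concentration from centrifugation (abstract: ~30)
    """
    sediment_conc = cells_counted / chamber_volume_ul   # cells per uL of sediment
    return sediment_conc / concentration_factor         # cells per uL of urine

# e.g. 90 cells counted in 1 uL of 30x-concentrated sediment -> 3 cells/uL of urine
result = cells_per_microliter(90, 1)
```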
MDP: Reliable File Transfer for Space Missions
NASA Technical Reports Server (NTRS)
Rash, James; Criscuolo, Ed; Hogie, Keith; Parise, Ron; Hennessy, Joseph F. (Technical Monitor)
2002-01-01
This paper presents work being done at NASA/GSFC by the Operating Missions as Nodes on the Internet (OMNI) project to demonstrate the application of the Multicast Dissemination Protocol (MDP) to space missions to reliably transfer files. This work builds on previous work by the OMNI project to apply Internet communication technologies to space communication. The goal of this effort is to provide an inexpensive, reliable, standard, and interoperable mechanism for transferring files in the space communication environment. Limited bandwidth, noise, delay, intermittent connectivity, link asymmetry, and one-way links are all possible issues for space missions. Although these are link-layer issues, they can have a profound effect on the performance of transport and application level protocols. MDP, a UDP-based reliable file transfer protocol, was designed for multicast environments which have to address these same issues, and it has done so successfully. Developed by the Naval Research Lab in the mid-1990s, MDP is now in daily use by both the US Post Office and the DoD. This paper describes the use of MDP to provide automated end-to-end data flow for space missions. It examines the results of a parametric study of MDP in a simulated space link environment and discusses the results in terms of their implications for space missions. Lessons learned are addressed, which suggest minor enhancements to the MDP user interface to add specific features for space mission requirements, such as dynamic control of data rate and a checkpoint/resume capability. These are features that are provided for in the protocol, but are not implemented in the sample MDP application that was provided. A brief look is also taken at the status of standardization. A version of MDP known as NORM (NACK-Oriented Reliable Multicast) is in the process of becoming an IETF standard.
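The repair mechanism that makes MDP-style protocols attractive over lossy, asymmetric links is negative acknowledgment: receivers stay silent unless they detect a gap, then request only the missing chunks. A toy in-memory simulation of that idea (a sketch of the concept, not the MDP wire protocol):

```python
def split(data, size):
    """Split data into numbered chunks, as a sender would."""
    n_chunks = (len(data) + size - 1) // size
    return {i: data[i * size:(i + 1) * size] for i in range(n_chunks)}

def transfer(data, size, lost_first_pass):
    """Simulate one multicast pass plus NACK-driven repair retransmissions."""
    chunks = split(data, size)
    # First pass: some chunks are lost on the link.
    received = {i: c for i, c in chunks.items() if i not in lost_first_pass}
    # Receiver detects gaps and sends one NACK listing missing sequence numbers.
    nack = sorted(set(chunks) - set(received))
    # Sender retransmits only the NACKed chunks -- no per-chunk ACK traffic.
    for i in nack:
        received[i] = chunks[i]
    return b"".join(received[i] for i in sorted(received))

# Losing chunks 1 and 3 on the first pass still yields the full file after repair.
out = transfer(b"abcdefghijklmnop", 4, {1, 3})
```

This silence-unless-broken design is what keeps return-channel traffic low on asymmetric or mostly one-way space links.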
Software Design Document MCC CSCI (1). Volume 1 Sections 1.0-2.18
1991-06-01
AssociationUserProtocol (/simnet/common/include/protocol/p_assoc.h); Primitive: long (standard C type)...Information. 2.2.1.4.2 ProcessMessage. ProcessMessage processes a message from another process; type describes the message as either one-way, synchronous, or...Macintosh Consoles. This is sometimes necessary due to normal clock skew, so that operations among the MCC components will remain synchronized.
RTML: remote telescope markup language and you
NASA Astrophysics Data System (ADS)
Hessman, F. V.
2001-12-01
In order to coordinate the use of robotic and remotely operated telescopes in networks -- like Göttingen's MOnitoring NEtwork of Telescopes (MONET) -- a standard format for the exchange of observing requests and reports is needed. I describe the benefits of Remote Telescope Markup Language (RTML), an XML-based protocol originally developed by the Hands-On Universe Project, which is being used and further developed by several robotic telescope projects and firms.
CNES-NASA Disruption-Tolerant Networking (DTN) Interoperability
NASA Technical Reports Server (NTRS)
Mortensen, Dale; Eddy, Wesley M.; Reinhart, Richard C.; Lassere, Francois
2014-01-01
Future missions requiring robust internetworking services may use Delay/Disruption-Tolerant Networking (DTN) technology. CNES, NASA, and other international space agencies are committed to using CCSDS standards in their space and ground mission communications systems. The experiment described in this presentation will evaluate operations concepts, assess system performance, and advance technology readiness for the use of DTN protocols in conjunction with CCSDS ground systems, CCSDS data links, and CCSDS file transfer applications.
Validity of Assessments of Youth Access to Tobacco: The Familiarity Effect
Landrine, Hope; Klonoff, Elizabeth A.
2003-01-01
Objectives. We examined the standard compliance protocol and its validity as a measure of youth access to tobacco. Methods. In Study 1, youth smokers reported buying cigarettes in stores where they were regular customers. In Study 2, youths attempted to purchase cigarettes by using the Standard Protocol, in which they appeared at stores once for cigarettes, and by using the Familiarity Protocol, in which they were rendered regular customers by purchasing nontobacco items 4 times and then requested cigarettes during their fifth visit. Results. Sales to youths aged 17 years in the Familiarity Protocol were significantly higher than sales to the same age group in the Standard Protocol (62.5% vs. 6%, respectively). Conclusions. The Standard Protocol does not match how youths obtain cigarettes. Access is low for stranger youths within compliance studies, but access is high for familiar youths outside of compliance studies. PMID:14600057
A contaminant-free assessment of Endogenous Retroviral RNA in human plasma
Karamitros, Timokratis; Paraskevis, Dimitrios; Hatzakis, Angelos; Psichogiou, Mina; Elefsiniotis, Ioannis; Hurst, Tara; Geretti, Anna-Maria; Beloukas, Apostolos; Frater, John; Klenerman, Paul; Katzourakis, Aris; Magiorkinis, Gkikas
2016-01-01
Endogenous retroviruses (ERVs) comprise 6–8% of the human genome. HERVs are silenced in most normal tissues but up-regulated in stem cells and in placenta, as well as in cancer and HIV-1 infection. Crucially, there are conflicting reports on detecting HERV RNA in non-cellular clinical samples such as plasma, which suggest that the study of HERV RNA can be daunting. Indeed, we find that the use of real-time PCR in a quality-assured clinical laboratory setting can be sensitive to low-level proviral contamination. We developed a mathematical model for low-level contamination that allowed us to design a laboratory protocol and standard operating procedures for robust measurement of HERV RNA. We focus on one family, HERV-K HML-2 (HK2), which has been most recently active even though these viruses invaded our ancestral genomes almost 30 million years ago. We extensively validated our experimental design on a model cell culture system, showing high sensitivity and specificity and eliminating the proviral contamination entirely. We then tested 236 plasma samples from patients infected with HIV-1, HCV or HBV and found them to be negative. The study of HERV RNA for human translational studies should be performed with extensively validated protocols and standard operating procedures to control the widespread low-level human DNA contamination. PMID:27640347
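The abstract does not specify the authors' contamination model. A standard first-pass way to reason about low-level contamination (an assumption here, not the paper's model) is to treat contaminating proviral copies per PCR reaction as Poisson-distributed, so the chance that a reaction turns false-positive is 1 − e^(−λ):

```python
import math

def p_false_positive(mean_copies_per_reaction):
    """P(reaction contains >= 1 contaminating copy) under a Poisson assumption.

    With copies ~ Poisson(lam), P(copies >= 1) = 1 - exp(-lam).
    """
    return 1.0 - math.exp(-mean_copies_per_reaction)

# At lam = 0.1 copies/reaction, roughly 9.5% of reactions would be contaminated,
# which illustrates why replicate testing and negative controls matter at low levels.
p = p_false_positive(0.1)
```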
From “awake” to “monitored anesthesia care” thoracic surgery: A 15 year evolution
Mineo, Tommaso C; Tacconi, Federico
2014-01-01
Although general anesthesia still represents the standard when performing thoracic surgery, interest in alternative methods is increasing. These have evolved from the use of local or regional analgesia techniques alone in completely alert patients (awake thoracic surgery) to more complex protocols entailing conscious sedation and spontaneous ventilation. The main rationale of these methods is to prevent serious complications related to general anesthesia and selective ventilation, such as tracheobronchial injury, acute lung injury, and cardiovascular events. Trends toward shorter hospitalization and reduced overall costs have also been indicated in preliminary reports. Monitored anesthesia care in thoracic surgery can be successfully employed to manage diverse oncologic conditions, such as malignant pleural effusion, peripheral lung nodules, and mediastinal tumors. Main non-oncologic indications include pneumothorax, emphysema, pleural infections, and interstitial lung disease. Furthermore, as familiarity with this surgical practice has increased, major operations are now being performed this way. Despite the absence of randomized controlled trials, there is preliminary evidence that monitored anesthesia care protocols in thoracic surgery may be beneficial in high-risk patients, with non-inferior efficacy when compared with standard operations under general anesthesia. Monitored anesthesia care in thoracic surgery should enter the armamentarium of modern thoracic surgeons, and adequate training should be scheduled in accredited residency programs. PMID:26766966
The Geostationary Operational Satellite R Series SpaceWire Based Data System
NASA Technical Reports Server (NTRS)
Anderson, William; Birmingham, Michael; Krimchansky, Alexander; Lombardi, Matthew
2016-01-01
The Geostationary Operational Environmental Satellite R-Series Program (GOES-R, S, T, and U) mission is a joint program between the National Oceanic & Atmospheric Administration (NOAA) and the National Aeronautics & Space Administration (NASA) Goddard Space Flight Center (GSFC). SpaceWire was selected as the science data bus, as well as the command and telemetry bus, for the GOES instruments. The GOES-R, S, T, and U spacecraft have a mission data loss requirement for all data transfers between the instruments and spacecraft, requiring error detection and correction at the packet level. The GOES-R Reliable Data Delivery Protocol (GRDDP) [1] was developed in house to provide a means of reliably delivering data among various on-board sources and sinks. The GRDDP was presented to and accepted by the European Cooperation for Space Standardization (ECSS) and is part of the ECSS Protocol Identification Standard [2]. GOES-R development and integration is complete, and the observatory is scheduled for launch in November 2016. Now that instrument-to-spacecraft integration is complete, the GOES-R Project reviewed lessons learned to determine how the GRDDP could be revised to improve the integration process. Based on knowledge gained during the instrument-to-spacecraft integration process, the following is presented to help potential GRDDP users improve their system designs and implementations.
Ammirati, Mario; Lamki, Tariq Theeb; Shaw, Andrew Brian; Forde, Braxton; Nakano, Ichiro; Mani, Matharbootham
2013-01-01
The semi-sitting position has lost favor among neurosurgeons, partly due to unproven assumptions of increased complications. Many complications have been associated with this position, the most feared being venous air embolism and paradoxical air embolism. We report on this retrospective study of the outcome over 4 years of 48 neurosurgical patients operated on consecutively using a standardized protocol: 41 (85%) in the semi-sitting position and seven (15%) in the prone position. Procedures included tumor resection (34), posterior fossa decompression (12), cyst resection (1), and resection of an arteriovenous malformation (1). Pre-operative workup was standardized. Vigilant intra-operative observation was performed by an experienced neuroanesthetist. Pertinent data were extracted from surgical records. Of the 48 patients, 10 (20.8%) were found to have a patent foramen ovale (PFO) on trans-esophageal echocardiography. Of these, four (40%) patients underwent procedures in the semi-sitting position while six (60%) did not. A clinically significant venous air embolism (VAE) was detected during 2 of the 41 semi-sitting procedures (4.9%). Neither patient suffered any obvious sequelae. No other morbidity associated with surgical position was encountered. Our study suggests that a model similar to ours is effective in preventing major complications associated with the semi-sitting position. The semi-sitting position is a safe, practical position that should be considered in appropriate cases. The fear of dreadful complications seems unwarranted. Copyright © 2012 Elsevier Ltd. All rights reserved.
Ge, Meili; Shao, Yingqi; Huang, Jinbo; Huang, Zhendong; Zhang, Jing; Nie, Neng; Zheng, Yizhou
2013-01-01
Background Previous reports showed that the outcome of rabbit antithymocyte globulin (rATG) was not satisfactory as first-line therapy for severe aplastic anemia (SAA). We explored a modified schedule of administration of rATG. Design and Methods Outcomes of a cohort of 175 SAA patients, including 51 patients administered the standard protocol (3.55 mg/kg/d for 5 days) and 124 given the optimized protocol (1.97 mg/kg/d for 9 days) of rATG plus cyclosporine (CSA), were analyzed retrospectively. Results Of all 175 patients, response rates at 3 and 6 months were 36.6% and 56.0%, respectively. The 51 cases receiving the standard protocol had poor responses at 3 (25.5%) and 6 months (41.2%), whereas the 124 patients receiving the optimized protocol had better responses at 3 (41.1%, P = 0.14) and 6 months (62.1%, P = 0.01). Higher incidences of infection (57.1% versus 37.9%, P = 0.02) and early mortality (17.9% versus 0.8%, P < 0.001) occurred in patients receiving the standard protocol compared with the optimized protocol. The 5-year overall survival favored the optimized over the standard rATG protocol (76.0% versus 50.3%, P < 0.001). By multivariate analysis, the optimized protocol (RR = 2.21, P = 0.04), response at 3 months (RR = 10.31, P = 0.03), and a shorter interval (<23 days) between diagnosis and the initial dose of rATG (RR = 5.35, P = 0.002) were independent favorable predictors of overall survival. Conclusions The optimized rather than the standard rATG protocol in combination with CSA remained efficacious as a first-line immunosuppressive regimen for SAA. PMID:23554855
Internet over the VDL-2 Subnetwork: the VDL-2/IP Aviation Datalink System
NASA Technical Reports Server (NTRS)
Grappel, R. D.
2000-01-01
This report describes the design for operating the standard Internet communications protocols (IP) over the VHF aviation Data Link Mode 2 (VDL-2) subnetwork. The VDL-2/IP system specified in this report can operate transparently with the current aviation users of VDL-2 (Airline Communications and Reporting System, ACARS, and Aeronautical Telecommunications Network, ATN) and proposed users (Flight Information Service via Broadcast, FIS-B). The VDL-2/IP system provides a straightforward mechanism for utilizing inexpensive, commercial off-the-shelf (COTS) communications packages developed for the Internet as part of the aviation datalink system.
Sheehan, Barbara; Stetson, Peter; Bhatt, Ashish R; Field, Adele I; Patel, Chirag; Maisel, James Mark
2016-01-01
Background The process of documentation in electronic health records (EHRs) is known to be time-consuming, inefficient, and cumbersome. The use of dictation coupled with manual transcription has become an increasingly common practice. In recent years, natural language processing (NLP)–enabled data capture has become a viable alternative for data entry. It enables the clinician to maintain control of the process and potentially reduce the documentation burden. The question remains how this NLP-enabled workflow will impact EHR usability and whether it can meet the structured data and other EHR requirements while enhancing the user’s experience. Objective The objective of this study is to evaluate the comparative effectiveness of an NLP-enabled data capture method using dictation and data extraction from transcribed documents (NLP Entry) in terms of documentation time, documentation quality, and usability versus standard EHR keyboard-and-mouse data entry. Methods This formative study investigated the results of using 4 combinations of NLP Entry and Standard Entry methods (“protocols”) of EHR data capture. We compared a novel dictation-based protocol using MediSapien NLP (NLP-NLP) for structured data capture against a standard structured data capture protocol (Standard-Standard) as well as 2 novel hybrid protocols (NLP-Standard and Standard-NLP). The 31 participants included neurologists, cardiologists, and nephrologists. Participants generated 4 consultation or admission notes using 4 documentation protocols. We recorded the time on task, documentation quality (using the Physician Documentation Quality Instrument, PDQI-9), and usability of the documentation processes. Results A total of 118 notes were documented across the 3 subject areas.
The NLP-NLP protocol required a median of 5.2 minutes per cardiology note, 7.3 minutes per nephrology note, and 8.5 minutes per neurology note compared with 16.9, 20.7, and 21.2 minutes, respectively, using the Standard-Standard protocol and 13.8, 21.3, and 18.7 minutes using the Standard-NLP protocol (1 of 2 hybrid methods). Using 8 out of 9 characteristics measured by the PDQI-9 instrument, the NLP-NLP protocol received a median quality score sum of 24.5; the Standard-Standard protocol received a median sum of 29; and the Standard-NLP protocol received a median sum of 29.5. The mean total score of the usability measure was 36.7 when the participants used the NLP-NLP protocol compared with 30.3 when they used the Standard-Standard protocol. Conclusions In this study, the feasibility of an approach to EHR data capture involving the application of NLP to transcribed dictation was demonstrated. This novel dictation-based approach has the potential to reduce the time required for documentation and improve usability while maintaining documentation quality. Future research will evaluate the NLP-based EHR data capture approach in a clinical setting. It is reasonable to assert that EHRs will increasingly use NLP-enabled data entry tools such as MediSapien NLP because they hold promise for enhancing the documentation process and end-user experience. PMID:27793791
Continuous-variable measurement-device-independent quantum key distribution with photon subtraction
NASA Astrophysics Data System (ADS)
Ma, Hong-Xin; Huang, Peng; Bai, Dong-Yun; Wang, Shi-Yu; Bao, Wan-Su; Zeng, Gui-Hua
2018-04-01
It has been found that non-Gaussian operations can be applied to increase and distill entanglement between Gaussian entangled states. We show the successful use of a non-Gaussian operation, in particular the photon subtraction operation, in the continuous-variable measurement-device-independent quantum key distribution (CV-MDI-QKD) protocol. The proposed method can be implemented with existing technologies. Security analysis shows that the photon subtraction operation can remarkably increase the maximal transmission distance of the CV-MDI-QKD protocol, which precisely makes up for the shortcoming of the original CV-MDI-QKD protocol, and that one-photon subtraction has the best performance. Moreover, the proposed protocol provides a feasible method for the experimental implementation of the CV-MDI-QKD protocol.
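As a schematic illustration of why photon subtraction helps (standard textbook form, not equations taken from this paper): subtracting a single photon from one arm of a two-mode squeezed vacuum reweights the photon-number distribution toward higher n, which increases the entanglement available to the protocol.

```latex
% Two-mode squeezed vacuum with squeezing parameter \lambda = \tanh r
\[
|\mathrm{TMSV}\rangle \;=\; \sqrt{1-\lambda^{2}} \sum_{n=0}^{\infty} \lambda^{n}\,|n,n\rangle
\]
% Single-photon subtraction on mode B: apply \hat{b}, then renormalize
\[
\hat{b}\,|\mathrm{TMSV}\rangle \;\propto\; \sum_{n=1}^{\infty} \lambda^{n}\sqrt{n}\,|n,\,n-1\rangle
\]
```

The factor of √n boosts the weight of large-n terms relative to the original geometric distribution, which is the mechanism behind the distance gain reported in the abstract.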
Cool Apps: Building Cryospheric Data Applications with Standards-Based Service Oriented Architecture
NASA Astrophysics Data System (ADS)
Oldenburg, J.; Truslove, I.; Collins, J. A.; Liu, M.; Lewis, S.; Brodzik, M.
2012-12-01
The National Snow and Ice Data Center (NSIDC) holds a large collection of cryospheric data, and is involved in a number of informatics research and development projects aimed at improving the discoverability and accessibility of these data. To develop high-quality software in a timely manner, we have adopted a Service-Oriented Architecture (SOA) approach for our core technical infrastructure development. Data services at NSIDC are internally exposed to other tools and applications through standards-based service interfaces. These standards include OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), various OGC (Open Geospatial Consortium) standards including WMS (Web Map Service) and WFS (Web Feature Service), ESIP (Federation of Earth Sciences Information Partners) OpenSearch, and NSIDC-defined service endpoints which follow a RESTful architecture. By taking a standards-based approach, we are able to use off-the-shelf tools and libraries to consume, translate, and broker these data services, and thus develop applications faster. Additionally, by exposing public interfaces to these services we provide valuable data services to technical collaborators; for example, NASA Reverb (http://reverb.echo.nasa.gov) uses NSIDC's WMS services. Our latest generation of web applications consume these data services directly. The most complete example of this is the Operation IceBridge Data Portal (http://nsidc.org/icebridge/portal), which depends on many of the aforementioned services, retrieving data in several ways. The maps it displays are obtained through the use of WMS and WFS protocols from a MapServer instance hosted at NSIDC. Links to the scientific data collected on Operation IceBridge campaigns are obtained through ESIP OpenSearch requests to service providers that encapsulate our metadata databases. These standards-based web services are also developed at NSIDC and are designed to be used independently of the Portal.
This poster provides a visual representation of the relationships described above, with additional details and examples, and more generally outlines the benefits and challenges of this SOA approach.
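The WMS calls mentioned above follow the OGC GetMap key-value-pair convention. A minimal request can be assembled as below; the endpoint URL and layer name are placeholders, not NSIDC's actual service values:

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, width=512, height=512):
    """Build an OGC WMS 1.1.1 GetMap request URL (key-value-pair encoding)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",                          # lon/lat in WMS 1.1.1
        "BBOX": ",".join(str(v) for v in bbox),      # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

# Hypothetical layer over a Greenland-sized bounding box.
url = wms_getmap_url("https://example.org/wms", "flight_lines", (-75, 59, -10, 84))
```

Because every parameter is a plain query-string key, off-the-shelf clients and map libraries can consume such a service with no custom glue, which is the interoperability argument the abstract makes.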
Interoperability in the Planetary Science Archive (PSA)
NASA Astrophysics Data System (ADS)
Rios Diaz, C.
2017-09-01
The protocols and standards currently supported by the recently released new version of the Planetary Science Archive are the Planetary Data Access Protocol (PDAP), the EuroPlanet Table Access Protocol (EPN-TAP), and Open Geospatial Consortium (OGC) standards. We explore these protocols in more detail, providing scientifically useful examples of their usage within the PSA.
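An EPN-TAP service is queried through the standard TAP synchronous endpoint with an ADQL string. The sketch below only builds the request URL; the service endpoint is illustrative, not the PSA's actual address, though `epn_core` is the table name the EPN-TAP convention prescribes:

```python
from urllib.parse import urlencode

def tap_sync_url(service, adql):
    """Build a TAP /sync query URL carrying an ADQL SELECT statement."""
    params = {
        "REQUEST": "doQuery",
        "LANG": "ADQL",
        "QUERY": adql,
    }
    return service.rstrip("/") + "/sync?" + urlencode(params)

# Ask a hypothetical EPN-TAP service for ten Mars granules.
url = tap_sync_url(
    "https://example.org/tap",
    "SELECT TOP 10 granule_uid, target_name FROM epn_core WHERE target_name = 'Mars'",
)
```

Any generic TAP client can issue the same query, which is precisely the interoperability benefit the abstract highlights.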
Auer, Jorg A; Goodship, Allen; Arnoczky, Steven; Pearce, Simon; Price, Jill; Claes, Lutz; von Rechenberg, Brigitte; Hofmann-Amtenbrinck, Margarethe; Schneider, Erich; Müller-Terpitz, R; Thiele, F; Rippe, Klaus-Peter; Grainger, David W
2007-08-01
In an attempt to establish some consensus on the proper use and design of experimental animal models in musculoskeletal research, AOVET (the veterinary specialty group of the AO Foundation), in concert with the AO Research Institute (ARI) and the European Academy for the Study of Scientific and Technological Advance, convened a group of musculoskeletal researchers, veterinarians, legal experts, and ethicists to discuss, in a frank and open forum, the use of animals in musculoskeletal research. The group narrowed the field to fracture research. The consensus opinion resulting from this workshop can be summarized as follows: Anaesthesia and pain management protocols for research animals should follow standard protocols applied in clinical work for the species involved. This will improve morbidity and mortality outcomes. A database should be established to facilitate selection of anaesthesia and pain management protocols for specific experimental surgical procedures and adopted as an International Standard (IS) according to the animal species selected. A list of 10 golden rules and requirements for the conduct of animal experiments in musculoskeletal research was drawn up, comprising: 1) Intelligent study designs to receive appropriate answers; 2) Minimal complication rates (5% to a maximum of 10%); 3) Defined end-points for both welfare and scientific outputs, analogous to quality assessment (QA) audit of protocols in GLP studies; 4) Sufficient details for materials and methods applied; 5) Potentially confounding variables (genetic background, seasonal, hormonal, size, histological, and biomechanical differences); 6) Post-operative management with emphasis on analgesia and follow-up examinations; 7) Study protocols to satisfy criteria established for a "justified animal study"; 8) Surgical expertise to conduct surgery on animals; 9) Pilot studies as a critical part of model validation and powering of the definitive study design; 10) Criteria for funding agencies to include requirements related to animal experiments as part of the overall scientific proposal review protocols. Such agencies are also encouraged to seriously consider and adopt the recommendations described here when awarding funds for specific projects. Specific new requirements and mandates related both to improving the welfare and scientific rigour of animal-based research models are urgently needed as part of international harmonization of standards.
A randomized trial of protocol-based care for early septic shock.
Yealy, Donald M; Kellum, John A; Huang, David T; Barnato, Amber E; Weissfeld, Lisa A; Pike, Francis; Terndrup, Thomas; Wang, Henry E; Hou, Peter C; LoVecchio, Frank; Filbin, Michael R; Shapiro, Nathan I; Angus, Derek C
2014-05-01
In a single-center study published more than a decade ago involving patients presenting to the emergency department with severe sepsis and septic shock, mortality was markedly lower among those who were treated according to a 6-hour protocol of early goal-directed therapy (EGDT), in which intravenous fluids, vasopressors, inotropes, and blood transfusions were adjusted to reach central hemodynamic targets, than among those receiving usual care. We conducted a trial to determine whether these findings were generalizable and whether all aspects of the protocol were necessary. In 31 emergency departments in the United States, we randomly assigned patients with septic shock to one of three groups for 6 hours of resuscitation: protocol-based EGDT; protocol-based standard therapy that did not require the placement of a central venous catheter, administration of inotropes, or blood transfusions; or usual care. The primary end point was 60-day in-hospital mortality. We tested sequentially whether protocol-based care (EGDT and standard-therapy groups combined) was superior to usual care and whether protocol-based EGDT was superior to protocol-based standard therapy. Secondary outcomes included longer-term mortality and the need for organ support. We enrolled 1341 patients, of whom 439 were randomly assigned to protocol-based EGDT, 446 to protocol-based standard therapy, and 456 to usual care. Resuscitation strategies differed significantly with respect to the monitoring of central venous pressure and oxygen and the use of intravenous fluids, vasopressors, inotropes, and blood transfusions. By 60 days, there were 92 deaths in the protocol-based EGDT group (21.0%), 81 in the protocol-based standard-therapy group (18.2%), and 86 in the usual-care group (18.9%) (relative risk with protocol-based therapy vs. usual care, 1.04; 95% confidence interval [CI], 0.82 to 1.31; P=0.83; relative risk with protocol-based EGDT vs. 
protocol-based standard therapy, 1.15; 95% CI, 0.88 to 1.51; P=0.31). There were no significant differences in 90-day mortality, 1-year mortality, or the need for organ support. In a multicenter trial conducted in the tertiary care setting, protocol-based resuscitation of patients in whom septic shock was diagnosed in the emergency department did not improve outcomes. (Funded by the National Institute of General Medical Sciences; ProCESS ClinicalTrials.gov number, NCT00510835.).
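The relative risks reported in this abstract can be reproduced directly from the stated death counts and group sizes; a minimal check:

```python
def relative_risk(events_a, n_a, events_b, n_b):
    """Risk of the event in group A divided by the risk in group B."""
    return (events_a / n_a) / (events_b / n_b)

# Protocol-based care (EGDT + standard-therapy groups combined) vs. usual care:
# 92 + 81 deaths out of 439 + 446 patients, vs. 86 deaths out of 456 patients.
rr_protocol = relative_risk(92 + 81, 439 + 446, 86, 456)  # ~1.04, as reported

# Protocol-based EGDT vs. protocol-based standard therapy:
rr_egdt = relative_risk(92, 439, 81, 446)                 # ~1.15, as reported
```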
50 CFR 600.757 - Operational protocols.
Code of Federal Regulations, 2010 CFR
2010-10-01
Title 50, Wildlife and Fisheries, Section 600.757, Fishery Conservation and Management, National Oceanic and Atmospheric... Operational protocols. (a) Services of conveners and facilitators. A Council or NMFS may employ or enter into...
One Approach for Transitioning the iNET Standards into the IRIG 106 Telemetry Standards
2015-05-26
Protocol Suite. Figure 1 illustrates the Open Systems Interconnection (OSI) Model, the corresponding TCP/IP Model, and the major components of the TCP/IP Protocol Suite. Figure 2 represents the iNET-specific protocols layered onto the TCP/IP Model. [Figure 1 caption: OSI and TCP/IP Models with the major components of the TCP/IP Protocol Suite (IPv4, IPv6).]
STANDARD MEASUREMENT PROTOCOLS - FLORIDA RADON RESEARCH PROGRAM
The manual, in support of the Florida Radon Research Program, contains standard protocols for key measurements where data quality is vital to the program. It contains two sections. The first section, soil measurements, contains field sampling protocols for soil gas permeability and...
NASA Technical Reports Server (NTRS)
Mcdonald, K. D.; Miller, C. M.; Scales, W. C.; Dement, D. K.
1990-01-01
The projected application and requirements in the near term (to 1995) and far term (to 2010) for aeronautical mobile services supporting air traffic control operations are addressed. The implications of these requirements on spectrum needs, and the resulting effects on the satellite design and operation are discussed. The U.S. is working with international standards and regulatory organizations to develop the necessary aviation standards, signalling protocols, and implementation methods. In the provision of aeronautical safety services, a number of critical issues were identified, including system reliability and availability, access time, channel restoration time, interoperability, pre-emption techniques, and the system network interfaces. Means for accomplishing these critical services in the aeronautical mobile satellite service (AMSS), and the various activities relating to the future provision of aeronautical safety services are addressed.
Standardizing the information architecture for spacecraft operations
NASA Technical Reports Server (NTRS)
Easton, C. R.
1994-01-01
This paper presents an information architecture developed for the Space Station Freedom as a model from which to derive an information architecture standard for advanced spacecraft. The information architecture provides a way of making information available across a program, and among programs, assuming that the information will be in a variety of local formats, structures and representations. It provides a format that can be expanded to define all of the physical and logical elements that make up a program, add definitions as required, and import definitions from prior programs to a new program. It allows a spacecraft and its control center to work in different representations and formats, with the potential for supporting existing spacecraft from new control centers. It supports a common view of data and control of all spacecraft, regardless of their own internal view of their data and control characteristics, and of their communications standards, protocols and formats. This information architecture is central to standardizing spacecraft operations, in that it provides a basis for information transfer and translation, such that diverse spacecraft can be monitored and controlled in a common way.
Standardized training in nurse model travel clinics.
Sofarelli, Theresa A; Ricks, Jane H; Anand, Rahul; Hale, Devon C
2011-01-01
International travel plays a significant role in the emergence and redistribution of major human diseases. The importance of travel medicine clinics for preventing morbidity and mortality has been increasingly appreciated, although few studies have thus far examined the management and staff training strategies that result in successful travel-clinic operations. Here, we describe an example of travel-clinic operation and management coordinated through the University of Utah School of Medicine, Division of Infectious Diseases. This program, which involves eight separate clinics distributed statewide, functions to provide both patient consultation and care services and medical provider training and continuing medical education (CME). Initial training, the use of standardized forms and protocols, routine chart reviews, and monthly continuing education meetings are the distinguishing attributes of this program. An Infectious Disease team consisting of one medical doctor (MD) and a physician assistant (PA) acts as consultant to travel nurses, who comprise the majority of clinic staff. Eight clinics distributed throughout the state of Utah serve approximately 6,000 travelers a year. Pre-travel medical services are provided by 11 nurses, including 10 registered nurses (RNs) and 1 licensed practical nurse (LPN). This trained nursing staff receives continuing travel medicine education and participates in the training of new providers. All nurses have completed a full training program, and 7 of the 11 (64%) clinic nursing staff serve more than 10 patients a week. Quality assurance measures show that approximately 0.5% of charts reviewed contain a vaccine or prescription error that requires patient notification for correction.
Using an initial training program, standardized patient intake forms, vaccine and prescription protocols, preprinted prescriptions, and regular CME, highly trained nurses at travel clinics are able to provide standardized pre-travel care to international travelers originating from Utah. © 2010 International Society of Travel Medicine.
DSMS investment in support of satellite constellations and formation flying
NASA Technical Reports Server (NTRS)
Statman, J. I.
2003-01-01
Over the years, NASA has supported unmanned space missions beyond earth orbit through a Deep Space Mission System (DSMS) that is developed and operated by the Jet Propulsion Laboratory (JPL) and subcontractors. The DSMS capabilities have been incrementally upgraded since its establishment in the late '50s and are delivered primarily through three Deep Space Communications Complexes (DSCCs) near Goldstone, California; Madrid, Spain; and Canberra, Australia, and from facilities at JPL. Traditionally, mission support (tracking, command, telemetry, etc.) is assigned on an individual-mission basis, between each mission and a ground-based asset, independent of other missions. As NASA and its international partners move toward flying full constellations and precision formations, the DSMS is developing plans and technologies to provide the requisite support. The key activities under way are: (1) an integrated communications architecture for Mars exploration, including relays on science orbiters and dedicated relay satellites to provide continuous coverage for orbiters, landers, and rovers; JPL is developing an architecture, as well as the protocols and equipment, required for the cost-effective operation of such an infrastructure. (2) Internet-type protocols that will allow for efficient operations across deep-space distances, accounting for and accommodating the long round-trip light time; JPL is working with the CCSDS to convert these protocols to an international standard and will deploy one such protocol, the CCSDS File Delivery Protocol (CFDP), on the Mars Reconnaissance Orbiter (MRO) and Deep Impact (DI) missions. (3) Techniques to perform cross-navigation between spacecraft that fly in a loose formation; typical cases are cross-navigation between missions that approach Mars and missions that are at Mars, or the determination of a baseline for missions that fly in an earth lead-lag configuration.
(4) Techniques and devices that allow the precise metrology and controllability of tight formations for precision constellation missions. In this paper we discuss the four classes of constellation/formation support with emphasis on DSMS current status (technology and implementation) and plans in the first three areas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean
2013-11-01
Sandia National Laboratories has created a test protocol for IEC TR 61850-90-7 advanced distributed energy resource (DER) functions, titled "Test Protocols for Advanced Inverter Interoperability Functions," often referred to as the Sandia Test Protocols. This document is currently in draft form, but has been shared with stakeholders around the world with the ultimate goal of collaborating to create a consensus set of test protocols which can then be incorporated into an International Electrotechnical Commission (IEC) and/or Underwriters Laboratories (UL) certification standard. The protocols are designed to ensure functional interoperability of DER (primarily photovoltaic (PV) inverters and energy storage systems) as specified by the IEC technical report through communication and electrical tests. In this report, Sandia exercises the electrical characterization portion of the test protocols for four functions: constant power factor (INV3), volt-var (VV11), frequency-watt (FW21), and Low and High Voltage Ride Through (L/HVRT). The goal of the tests reported here was not to characterize the performance of the equipment under test (EUT), but rather to (a) exercise the draft Sandia Test Protocols in order to identify any revisions needed in test procedures, conditions, or equipment and (b) gain experience with state-of-the-art DER equipment to determine if the tests put unrealistic or overly aggressive requirements on EUT operation. In performing the work according to the current versions of the protocols, Sandia was able to identify weaknesses in the current versions and suggest improvements to the test protocols.
Performance management of multiple access communication networks
NASA Astrophysics Data System (ADS)
Lee, Suk; Ray, Asok
1993-12-01
This paper focuses on the conceptual design, development, and implementation of a performance management tool for computer communication networks serving large-scale integrated systems. The objective is to improve network performance in handling various types of messages by on-line adjustment of protocol parameters. The techniques of perturbation analysis of Discrete Event Dynamic Systems (DEDS), stochastic approximation (SA), and learning automata have been used in formulating the performance management algorithm. The efficacy of the performance management tool has been demonstrated on a network testbed. The conceptual design presented in this paper offers a step toward bridging the gap between management standards and users' demands for efficient network operations, since most standards, such as those of the ISO (International Standards Organization) and IEEE, address only the architecture, services, and interfaces for network management. The proposed concept of performance management can also be used as a general framework to assist the design, operation, and management of various DEDS such as computer-integrated manufacturing and battlefield C³ (Command, Control, and Communications) systems.
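The stochastic-approximation ingredient of such on-line parameter tuning can be illustrated with a toy Robbins-Monro iteration; the delay model and gain schedule below are illustrative assumptions, not the paper's algorithm:

```python
import random

def noisy_delay(timeout):
    # Hypothetical stand-in for a measured network metric: decreasing in
    # the protocol parameter, observed with additive noise.
    return 10.0 / timeout + random.gauss(0.0, 0.05)

def robbins_monro(target, theta=1.0, steps=2000):
    """Drive the noisy observable toward `target` by adjusting theta with
    decaying gains a_k = 0.5 / k**0.6 (sum a_k diverges, sum a_k^2 is
    finite, so the Robbins-Monro conditions hold)."""
    for k in range(1, steps + 1):
        error = noisy_delay(theta) - target
        theta += (0.5 / k ** 0.6) * error  # raise parameter if metric too high
        theta = max(theta, 0.1)            # keep parameter in a valid range
    return theta

random.seed(0)
theta = robbins_monro(target=2.0)  # settles near 10/2 = 5
```

Despite each observation being noisy, the decaying gains average the noise out, and the iterate settles near the root of the mean response.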
Evaluation of a protocol for the non-operative management of perforated peptic ulcer.
Marshall, C; Ramaswamy, P; Bergin, F G; Rosenberg, I L; Leaper, D J
1999-01-01
The non-operative management of perforated peptic ulcer has previously been shown to be both safe and effective although it remains controversial. A protocol for non-operative management was set up in this hospital in 1989. Adherence to the guidelines in the protocol has been audited over a 6-year period with a review of outcome. The case-notes of patients with a diagnosis of perforated peptic ulcer were reviewed. Twelve guidelines from the protocol were selected for evaluation of compliance to the protocol. Forty-nine patients underwent non-operative treatment initially. Eight patients failed to respond and underwent operation. Complications included abscess formation (seven patients), renal failure (one), gastric ileus (one), chest infection (two), and cardiac failure and stroke (one). Four deaths occurred in this group. Adherence to certain protocol guidelines was poor, notably those concerning prevention of thromboembolism, use of antibiotics, use of contrast examination to confirm the diagnosis and referral for follow-up endoscopy. Two gastric cancers were detected on subsequent endoscopy. This experience demonstrates that non-operative treatment can be used successfully in a general hospital. Adherence to protocol guidelines was found to be variable and the protocol has therefore been simplified. This study highlights the need for an accurate diagnosis and the importance of follow-up endoscopy.
[Phenylephrine dosing error in Intensive Care Unit. Case of the trimester].
2013-01-01
A real clinical case reported to SENSAR is presented. A patient admitted to the surgical intensive care unit following a lung resection suffered arterial hypotension. The nurse was asked to give the patient 1 mL of phenylephrine. A few seconds afterwards, the patient experienced a hypertensive crisis, which resolved spontaneously without harm. Thereafter, the nurse was interviewed and a dosing error was identified: she had mistakenly given the patient 1 mg of phenylephrine (1 mL) instead of 100 mcg (1 mL of the standard dilution, 1 mg in 10 mL). The incident analysis revealed latent factors (event triggers): the lack of protocols and standard operating procedures, communication errors among team members (physician-nurse), suboptimal training, and an underdeveloped safety culture. To prevent similar incidents in the future, the following actions were implemented in the surgical intensive care unit: a protocol for bolus doses and short infusions (<30 min) was developed, and communication techniques were adopted to close the communication gap. The protocol was designed by physicians and nurses to standardize the administration of drugs with a high potential for error. To close the communication gap, repeated verification of what was said and understood ("closed loop" communication) was proposed. Labeling syringes with the drug dilution was also recommended. Copyright © 2013 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Published by Elsevier España. All rights reserved.
Prevention of Tracheostomy-Related Hospital-Acquired Pressure Ulcers.
O'Toole, Thomas R; Jacobs, Natalie; Hondorp, Brian; Crawford, Laura; Boudreau, Lisa R; Jeffe, Jill; Stein, Brian; LoSavio, Phillip
2017-04-01
Objective To determine if standardization of perioperative tracheostomy care procedures decreased the incidence of hospital-acquired tracheostomy-related pressure ulcers. Methods All patients at least 18 years old who underwent placement of a tracheostomy tube in the operating room from July 1, 2014, through June 30, 2015, were cared for postoperatively through an institutionally adopted quality improvement protocol. This included 4 elements: (1) placement of a hydrocolloid dressing underneath the tracheostomy flange in the postoperative period, (2) removal of plate sutures within 7 days of the tracheostomy procedure, (3) placement of a polyurethane foam dressing after suture removal, and (4) neutral positioning of the head. One year after the bundle was initiated, a retrospective analysis was performed to compare the percentage of tracheostomy patients who developed pressure ulcers versus the preintervention period. Results The incidence of tracheostomy-related pressure ulcers decreased from 20 of 183 tracheostomies (10.93%) prior to use of the standardized protocol to 2 of 155 tracheostomies (1.29%). Chi-square analysis showed a significant difference between the groups, with a P value of .0003. Discussion Adoption of this care bundle at our institution resulted in a significant reduction in the incidence of hospital-acquired tracheostomy-related pressure ulcers. The impact of any single intervention within our protocol was not assessed and could be an area of further investigation. Implications for Practice Adoption of a standardized posttracheostomy care bundle at the institution level may result in the improved care of patients with tracheostomies and specifically may reduce the incidence of pressure ulcers.
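The reported chi-square comparison can be reproduced from the counts given (20 of 183 tracheostomies before the bundle vs. 2 of 155 after); a small stdlib-only sketch of the Pearson test on the 2x2 table:

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square (df = 1, no continuity correction) for the 2x2
    table [[a, b], [c, d]], returning the statistic and p-value."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2.0))  # chi-square survival function, df = 1
    return chi2, p

# Ulcer vs. no-ulcer counts before (20/183) and after (2/155) the care bundle
chi2, p = chi2_2x2(20, 183 - 20, 2, 155 - 2)
```

The resulting p-value (about .0003 without continuity correction) matches the value reported in the abstract.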
NASA Technical Reports Server (NTRS)
Rakow, Glenn; McGuirk, Patrick; Kimmery, Clifford; Jaffe, Paul
2006-01-01
The ability to rapidly deploy inexpensive satellites to meet tactical goals has become an important goal for military space systems. In fact, Operationally Responsive Space (ORS) has been in the spotlight at the highest levels. The Office of the Secretary of Defense (OSD) has identified that the critical next step is developing the bus standards and modular interfaces. Historically, satellite components have been constructed based on bus standards and standardized interfaces; however, this has not been done to a degree that would allow the rapid deployment of a satellite. Advancements in plug-and-play (PnP) technologies for terrestrial applications can serve as a baseline model for a PnP approach for satellite applications. Since SpaceWire (SpW) has become a de facto standard for satellite high-speed (greater than 200 Mbps) on-board communications, it has become important for SpW to adapt to this PnP environment. Because SpW is simply a bulk transport protocol and lacks built-in PnP features, several changes are required to facilitate PnP with SpW. The first is for Host(s) to determine what the network looks like, i.e., how the pieces of the network (routers and nodes) are connected together (network mapping), and to receive notice of changes to the network. The second is for the components connected to the network to be understood so that they can communicate. The first element, network topology mapping and change-of-status indication, is being defined and is the topic of this paper. The second element, describing how components are to communicate, has been defined by AFRL with the electronic data sheets known as XTEDS. Network mapping reflects recent activities performed by the Air Force Research Laboratory (AFRL), the Naval Research Laboratory (NRL), NASA, and US industry (Honeywell, Clearwater, FL, and others).
This work has resulted in the development of a protocol that performs the lower-level functions of network mapping and change-of-status (COS) indication required by Plug 'n' Play over SpaceWire. It will be presented to the SpaceWire working group for standardization under the European Cooperation for Space Standardization (ECSS) and to obtain a permanent Protocol ID (see "SpaceWire Protocol ID: What Does it Mean to You," IEEE Aerospace Conference 2006). The portion of the Plug 'n' Play protocol described in this paper is how the Host(s) of a SpaceWire network map the network and detect additions and deletions of devices on the network.
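Host-driven network mapping of this kind amounts to a breadth-first traversal that queries each discovered device for what is reachable through its ports. The sketch below is a generic illustration over a hypothetical adjacency table, not the actual SpaceWire PnP packet format:

```python
from collections import deque

# Hypothetical port-level view of a SpaceWire-like network: each device
# reports, per port, the peer reachable through that port (None = unused).
PORTS = {
    "host":    {1: "router1"},
    "router1": {1: "host", 2: "node_a", 3: "router2"},
    "router2": {1: "router1", 2: "node_b", 3: None},
    "node_a":  {1: "router1"},
    "node_b":  {1: "router2"},
}

def map_network(start="host"):
    """Breadth-first discovery from the host: visit each device once and
    record every port-level link, as a host mapping the topology would."""
    seen, links = {start}, []
    queue = deque([start])
    while queue:
        device = queue.popleft()
        for port, peer in PORTS[device].items():
            if peer is None:
                continue
            links.append((device, port, peer))
            if peer not in seen:
                seen.add(peer)
                queue.append(peer)
    return seen, links

devices, links = map_network()
```

Change-of-status detection then reduces to re-running the traversal (or probing a changed router) and diffing the resulting link set against the previous one.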
JAXA-NASA Interoperability Demonstration for Application of DTN Under Simulated Rain Attenuation
NASA Technical Reports Server (NTRS)
Suzuki, Kiyoshisa; Inagawa, Shinichi; Lippincott, Jeff; Cecil, Andrew J.
2014-01-01
As is well known, K-band or higher-band communications in the space link segment often experience intermittent disruptions caused by heavy rainfall. With a view to preserving data integrity and establishing autonomous operations under such conditions, it is important to consider introducing a tolerance mechanism such as Delay/Disruption Tolerant Networking (DTN). The Consultative Committee for Space Data Systems (CCSDS) is studying DTN as part of the standardization activities for space data systems. As a contribution to CCSDS and a feasibility study for future utilization of DTN, the Japan Aerospace Exploration Agency (JAXA) and the National Aeronautics and Space Administration (NASA) conducted an interoperability demonstration to confirm its tolerance mechanism and capability for automatic operation using the Data Relay Test Satellite (DRTS) space link and its ground terminals. Both parties used the Interplanetary Overlay Network (ION) open source software, including the Bundle Protocol, the Licklider Transmission Protocol, and Contact Graph Routing. This paper introduces the contents of the interoperability demonstration and its results.
PaR-PaR Laboratory Automation Platform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linshiz, G; Stawski, N; Poust, S
2013-05-01
Labor-intensive multistep biological tasks, such as the construction and cloning of DNA molecules, are prime candidates for laboratory automation. Flexible and biology-friendly operation of robotic equipment is key to its successful integration in biological laboratories, and the efforts required to operate a robot must be much smaller than the alternative manual lab work. To achieve these goals, a simple high-level biology-friendly robot programming language is needed. We have developed and experimentally validated such a language: Programming a Robot (PaR-PaR). The syntax and compiler for the language are based on computer science principles and a deep understanding of biological workflows. PaR-PaR allows researchers to use liquid-handling robots effectively, enabling experiments that would not have been considered previously. After minimal training, a biologist can independently write complicated protocols for a robot within an hour. Adoption of PaR-PaR as a standard cross-platform language would enable hand-written or software-generated robotic protocols to be shared across laboratories.
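The idea of a high-level, biology-friendly language compiling down to primitive robot steps can be sketched as a tiny interpreter; the command syntax here is hypothetical for illustration, not actual PaR-PaR syntax:

```python
# Hypothetical command grammar (not real PaR-PaR):
#   transfer <volume> <source_well> <dest_well>
#   mix <well> <cycles>
def compile_protocol(lines):
    """Translate high-level commands into primitive liquid-handler steps."""
    steps = []
    for line in lines:
        op, *args = line.split()
        if op == "transfer":
            vol, src, dst = args
            steps += [f"aspirate {vol} from {src}",
                      f"dispense {vol} into {dst}",
                      "wash tips"]
        elif op == "mix":
            well, cycles = args
            steps += [f"mix {well} x{cycles}"]
        else:
            raise ValueError(f"unknown op: {op}")
    return steps

steps = compile_protocol(["transfer 50uL plateA:A1 plateB:B1",
                          "mix plateB:B1 3"])
```

A real compiler of this kind would additionally resolve labware positions and emit vendor-specific robot instructions, but the translation structure is the same.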
Arthroscopic management of the painful total elbow arthroplasty.
Phadnis, Joideep; Bain, Gregory I
2016-01-01
Failure of total elbow arthroplasty is more common than after other major joint arthroplasties and is often a result of aseptic loosening, peri-prosthetic infection, fracture and instability. Infection can be a devastating complication, yet there are no established guidelines for the pre-operative diagnosis of total elbow peri-prosthetic infection. This is because pre-operative clinical, radiographic and biochemical tests are often unreliable. Using three case examples, a standardized protocol for the clinical and arthroscopic assessment of the painful total elbow arthroplasty is described. This is used to provide a mechanical and microbiological diagnosis of the patient's pain. There have been no complications resulting from the use of this technique in the three patients described, nor in any other patient to date. The staged protocol described in the present study, utilizing arthroscopic assessment, has refined the approach to the painful total elbow arthroplasty because it directly influences the definitive surgical management of the patient. It is recommended that other surgeons follow the principles outlined in the present study when faced with this challenging problem.
ACTS 118x: High Speed TCP Interoperability Testing
NASA Technical Reports Server (NTRS)
Brooks, David E.; Buffinton, Craig; Beering, Dave R.; Welch, Arun; Ivancic, William D.; Zernic, Mike; Hoder, Douglas J.
1999-01-01
With the recent explosion of the Internet and the enormous business opportunities available to communication system providers, great interest has developed in improving the efficiency of data transfer over satellite links using the Transmission Control Protocol (TCP) of the Internet Protocol (IP) suite. NASA's ACTS experiments program initiated a series of TCP experiments to demonstrate the scalability of TCP/IP and determine to what extent the protocol can be optimized over a 622 Mbps satellite link. Through partnerships with government technology-oriented labs and the computer, telecommunication, and satellite industries, NASA Glenn was able to: (1) promote the development of interoperable, high-performance TCP/IP implementations across multiple computing/operating platforms; (2) work with the satellite industry to answer outstanding questions regarding the use of standard protocols (TCP/IP and ATM) for the delivery of advanced data services, and for use in spacecraft architectures; and (3) conduct a series of TCP/IP interoperability tests over OC12 ATM over a satellite network in a multi-vendor environment using ACTS. The experiments' various network configurations and the results are presented.
Hard real-time closed-loop electrophysiology with the Real-Time eXperiment Interface (RTXI)
George, Ansel; Dorval, Alan D.; Christini, David J.
2017-01-01
The ability to experimentally perturb biological systems has traditionally been limited to static pre-programmed or operator-controlled protocols. In contrast, real-time control allows dynamic probing of biological systems with perturbations that are computed on-the-fly during experimentation. Real-time control applications for biological research are available; however, these systems are costly and often restrict the flexibility and customization of experimental protocols. The Real-Time eXperiment Interface (RTXI) is an open source software platform for achieving hard real-time data acquisition and closed-loop control in biological experiments while retaining the flexibility needed for experimental settings. RTXI has enabled users to implement complex custom closed-loop protocols in single cell, cell network, animal, and human electrophysiology studies. RTXI is also used as a free and open source, customizable electrophysiology platform in open-loop studies requiring online data acquisition, processing, and visualization. RTXI is easy to install, can be used with an extensive range of external experimentation and data acquisition hardware, and includes standard modules for implementing common electrophysiology protocols. PMID:28557998
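The essence of closed-loop experimentation, computing each perturbation on-the-fly from the latest measurement, can be sketched with a toy proportional controller acting on a first-order membrane model. This is a stand-in for illustration only (actual RTXI modules are compiled plugins running against its real-time API); the plant constants are arbitrary assumptions:

```python
def control_step(v_measured, v_target, gain=0.5):
    """One closed-loop iteration: compute an injected-current command
    proportional to the membrane-voltage error (a simple P controller)."""
    return gain * (v_target - v_measured)

def run_loop(v0, v_target, steps=50, dt=0.1, tau=1.0):
    # Toy first-order membrane model standing in for the real preparation:
    # voltage relaxes toward its resting value v0 and responds to current.
    v = v0
    for _ in range(steps):
        i_cmd = control_step(v, v_target)          # computed on-the-fly
        v += dt * (-(v - v0) + i_cmd * 10) / tau   # plant responds to command
    return v

v_final = run_loop(v0=-70.0, v_target=-55.0)
```

With these constants the loop settles at -57.5 mV rather than the -55 mV target, illustrating the steady-state offset of pure proportional control; a real protocol would add integral action or a model-based controller.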
Telemetry Standards, RCC Standard 106-17. Chapter 21. Telemetry Network Standard Introduction
2017-07-01
Acronyms: RF, radio frequency; RFC, Request for Comment; SNMP, Simple Network Management Protocol; TA, test article; TCP, Transmission Control Protocol. ... chapters might be of most interest for a particular reader. In order to guide the reader toward the chapters of further interest, the applicable ... Simple Network Management Protocol (SNMP) to pass management information through the system. The SNMP management information bases (MIBs) provide ...
Reid, Christopher M; Ryan, Philip; Miles, Helen; Willson, Kristyn; Beilin, Laurence J; Brown, Mark A; Jennings, Garry L; Johnston, Colin I; Macdonald, Graham J; Marley, John E; McNeil, John J; Morgan, Trefor O; West, Malcolm J; Wing, Lindon M H
2005-01-01
The characterization of blood pressure in treatment trials assessing the benefits of blood pressure lowering regimens is a critical factor for the appropriate interpretation of study results. With numerous operators involved in the measurement of blood pressure in many thousands of patients being screened for entry into clinical trials, it is essential that operators follow pre-defined measurement protocols involving multiple measurements and standardized techniques. Blood pressure measurement protocols have been developed by international societies and emphasize the importance of appropriate choice of cuff size, identification of Korotkoff sounds, and avoidance of digit preference. Training of operators and auditing of blood pressure measurement may assist in reducing operator-related errors in measurement. This paper describes the quality control activities adopted for the screening stage of the 2nd Australian National Blood Pressure Study (ANBP2). ANBP2 is a cardiovascular outcome trial of the treatment of hypertension in the elderly that was conducted entirely in general practices in Australia. A total of 54,288 subjects were screened; 3,688 previously untreated subjects were identified as having blood pressure >140/90 mmHg at the initial screening visit, and 898 (24%) were not eligible for study entry after two further visits because the elevated reading was not sustained. For both systolic and diastolic blood pressure recording, observed digit preference fell within 7 percentage points of the expected frequency. Protocol adherence, in terms of the required minimum blood pressure difference between the last two successive recordings, was 99.8%. These data suggest that adherence to blood pressure recording protocols and elimination of digit preference can be achieved through appropriate training programs and quality control activities in large multi-centre community-based trials in general practice.
Repeated blood pressure measurement prior to initial diagnosis and study entry is essential to appropriately characterize hypertension in these elderly patients.
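Digit-preference auditing of the kind described reduces to comparing terminal-digit frequencies against the uniform frequency expected when observers have no preference; a minimal sketch with synthetic readings (illustrative data, not the study's):

```python
from collections import Counter

def digit_preference(readings):
    """Deviation of each terminal digit's observed fraction from the
    uniform frequency expected with no digit preference, in proportions
    (multiply by 100 for percentage points)."""
    digits = [r % 10 for r in readings]
    counts = Counter(digits)
    expected = 1.0 / len(counts)
    return {d: counts[d] / len(digits) - expected for d in sorted(counts)}

# Synthetic systolic readings (mmHg) with a deliberate excess of zeros
readings = [140, 150, 142, 138, 160, 144, 130, 156, 148, 150]
deviation = digit_preference(readings)
```

Here the terminal digit 0 is over-represented by 30 percentage points; an audit criterion like the study's would flag any digit deviating by more than 7 percentage points.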
The Effects of a Dynamic Spectrum Access Overlay in LTE-Advanced Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Juan D. Deaton; Ryan E. lrwin; Luiz A. DaSilva
As early as 2014, wireless network operators' spectral capacity will be overwhelmed by a data tsunami brought on by new devices and applications. To augment spectral capacity, operators could deploy a Dynamic Spectrum Access (DSA) overlay. In light of the many planned Long Term Evolution (LTE) network deployments, the effects of a DSA overlay have not been fully considered in the existing LTE standards. Coalescing many different aspects of DSA, this paper develops the Spectrum Accountability (SA) framework. The SA framework defines specific network element functionality, protocol interfaces, and signaling flow diagrams for LTE to support service requests and enforce the rights and responsibilities of primary and secondary users, respectively. We also include a network simulation to quantify the benefits of using DSA channels to augment capacity. Based on our simulation, we show that network operators can achieve up to a 40% increase in operating capacity when sharing DSA bands to augment spectral capacity. With our framework, this paper could serve as a guide in developing future LTE network standards that include DSA.
Bioreactor expansion of human mesenchymal stem cells according to GMP requirements.
Elseberg, Christiane L; Salzig, Denise; Czermak, Peter
2015-01-01
In cell therapy, the use of autologous and allogeneic human mesenchymal stem cells (hMSCs) is rising. Accordingly, a supply of cells of the highest quality is required for clinical applications. As hMSCs are considered advanced therapy medicinal products (ATMPs), they are subject to the GMP and PAT requirements of the authorities (FDA and EMA). The production process of these cells must therefore be documented according to GMP, which is usually done via a GMP protocol based on standard operating procedures. This chapter provides an example of such a GMP protocol for hMSCs, here a genetically modified allogeneic cell line, based on a production process in a microcarrier-based stirred-tank reactor, including process monitoring according to PAT and final product quality assurance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drzymala, R; Alvarez, P; Bednarz, G
2015-06-15
Purpose: The purpose of this multi-institutional study was to compare two new gamma stereotactic radiosurgery (GSRS) dosimetry protocols to existing calibration methods. The ultimate goal was to guide AAPM Task Group 178 in recommending a standard GSRS dosimetry protocol. Methods: Nine centers (ten GSRS units) participated in the study. Each institution made eight sets of dose rate measurements: six with two different ionization chambers in three different 160 mm-diameter spherical phantoms (ABS plastic, Solid Water, and liquid water), and two using the same ionization chambers with a custom in-air positioning jig. Absolute dose rates were calculated using a newly proposed formalism by the IAEA working group for small and non-standard radiation fields and with a new air-kerma based protocol. The new IAEA protocol requires an in-water ionization chamber calibration and uses previously reported Monte-Carlo generated factors to account for the material composition of the phantom, the type of ionization chamber, and the unique GSRS beam configuration. Results obtained with the new dose calibration protocols were compared to dose rates determined by the AAPM TG-21 and TG-51 protocols, with TG-21 considered as the standard. Results: Averaged over all institutions, ionization chambers, and phantoms, the mean dose rate determined with the new IAEA protocol relative to that determined with TG-21 in the ABS phantom was 1.000 with a standard deviation of 0.008. For TG-51, the average ratio was 0.991 with a standard deviation of 0.013, and for the new in-air formalism it was 1.008 with a standard deviation of 0.012. Conclusion: Average results with both of the new protocols agreed with TG-21 to within one standard deviation. TG-51, which does not take into account the unique GSRS beam configuration or phantom material, was not expected to perform as well as the new protocols. The new IAEA protocol showed remarkably good agreement with TG-21.
Conflict of Interests: Paula Petti, Josef Novotny, Gennady Neyman and Steve Goetsch are consultants for Elekta Instrument A/B; Elekta Instrument AB, PTW Freiburg GmbH, Standard Imaging, Inc., and The Phantom Laboratory, Inc. loaned equipment for use in these experiments; The University of Wisconsin Accredited Dosimetry Calibration Laboratory provided calibration services.
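The cross-protocol comparison above reduces to computing, per protocol, the mean and standard deviation of dose-rate ratios relative to TG-21 and checking that the mean lies within one standard deviation of unity. A minimal sketch with illustrative numbers (not the study's data):

```python
import statistics

# Illustrative per-measurement dose-rate ratios (new protocol / TG-21);
# NOT the study's data. The comparison statistic is the mean and SD of
# such ratios, pooled over institutions, chambers, and phantoms.
iaea_over_tg21 = [0.998, 1.004, 0.992, 1.010, 0.996, 1.002, 0.999, 1.001]

mean_ratio = statistics.mean(iaea_over_tg21)
sd_ratio = statistics.stdev(iaea_over_tg21)

# Agreement criterion used above: the mean ratio lies within one SD of 1.000.
agrees_with_tg21 = abs(mean_ratio - 1.0) <= sd_ratio
print(round(mean_ratio, 5), agrees_with_tg21)  # → 1.00025 True
```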
A Standard Mutual Authentication Protocol for Cloud Computing Based Health Care System.
Mohit, Prerna; Amin, Ruhul; Karati, Arijit; Biswas, G P; Khan, Muhammad Khurram
2017-04-01
Telecare Medical Information System (TMIS) provides a standard platform for patients to obtain necessary medical treatment from doctors via Internet communication. Security protection is important for patients' medical records (data), which contain highly sensitive information. Patient anonymity is another important property that must be protected. Most recently, Chiou et al. proposed an authentication protocol for TMIS utilizing a cloud environment, and claimed that their protocol preserves patient anonymity and is well protected. We reviewed their protocol and found that it completely fails to preserve patient anonymity. Further, the same protocol is not protected against stolen mobile device attacks. To improve the security level and complexity, we design a lightweight authentication protocol for the same environment. Our security analysis shows resilience against all the relevant attacks. The performance of our protocol is comparable to that of related previous work.
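The paper's own construction is not specified here. As a generic illustration of the kind of lightweight mutual authentication such schemes build on, the sketch below shows a nonce-based HMAC challenge-response between a patient device and a cloud server; the shared secret and two-nonce flow are assumptions for illustration, not the authors' design.

```python
import hashlib
import hmac
import os

# Generic nonce-based mutual authentication sketch; the shared secret and
# two-nonce flow are assumptions for illustration, not the paper's design.
def prove(secret: bytes, nonce: bytes) -> bytes:
    """Keyed response to a fresh challenge; HMAC prevents forgery."""
    return hmac.new(secret, nonce, hashlib.sha256).digest()

secret = os.urandom(32)            # established out of band at registration

server_nonce = os.urandom(16)      # server -> device: fresh challenge
device_proof = prove(secret, server_nonce)

device_nonce = os.urandom(16)      # device -> server: its own challenge,
server_proof = prove(secret, device_nonce)  # making authentication mutual

# Each side recomputes and compares in constant time; replay is blocked
# because every session uses fresh nonces.
assert hmac.compare_digest(device_proof, prove(secret, server_nonce))
assert hmac.compare_digest(server_proof, prove(secret, device_nonce))
print("mutual authentication OK")
```

HMAC keeps the per-message cost to one hash computation, which is what makes challenge-response attractive for resource-limited mobile devices.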
Bergo, Maria do Carmo Noronha Cominato
2006-01-01
Thermal washer-disinfectors represent a technology that brought great advantages, such as the establishment of protocols and standard operating procedures and a reduction in occupational risks of a biological and environmental nature. The efficacy of the cleaning and disinfection obtained by automatic washer-disinfector machines running programs with different times and temperatures, as determined by the different official agencies, was validated according to recommendations from ISO Standard 15883-1 (1999) and HTM 2030 (NHS Estates, 1997) for determining the Minimum Lethality and DAL, both theoretically and through the use of thermocouples. To determine cleaning efficacy, the Soil Test, Biotrace Pro-tect, and the Protein Test Kit were used. The count of viable microorganisms (CFU) was performed before and after thermal disinfection. This article shows that the results comply with the ISO and HTM standards. The validation steps confirmed the high efficacy of the medical washer-disinfectors. This protocol enabled evaluation of the procedure based on evidence supported by scientific research, aiming to supply the Supply Center multi-professional personnel with information and the possibility of developing further research.
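ISO 15883-1 quantifies moist-heat disinfection lethality via the A0 value: each second spent at temperature T contributes 10^((T-80)/z) equivalent seconds at 80 °C, with z = 10 °C. A sketch of the theoretical calculation (the temperature profile below is illustrative, not measured validation data):

```python
# A0 lethality per the ISO 15883-1 definition: a second spent at moist-heat
# temperature T is worth 10**((T - 80) / z) equivalent seconds at 80 C,
# with z = 10 C. The temperature profile below is illustrative, not
# measured validation data.
Z = 10.0

def a0_value(profile_celsius, dt=1.0):
    """profile_celsius: temperatures sampled every dt seconds."""
    return sum(10 ** ((t - 80.0) / Z) * dt for t in profile_celsius)

# 60 s held at 90 C: each second counts tenfold, so A0 = 600
print(a0_value([90.0] * 60))  # → 600.0
```

In validation practice, the profile would come from the thermocouple log, and the computed A0 is compared against the threshold applicable to the load.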
Karumbayaram, Saravanan; Lee, Peiyee; Azghadi, Soheila F; Cooper, Aaron R; Patterson, Michaela; Kohn, Donald B; Pyle, April; Clark, Amander; Byrne, James; Zack, Jerome A; Plath, Kathrin; Lowry, William E
2012-01-01
The clinical application of human-induced pluripotent stem cells (hiPSCs) requires not only the production of Good Manufacturing Practice-grade (GMP-grade) hiPSCs but also the derivation of specified cell types for transplantation under GMP conditions. Previous reports have suggested that hiPSCs can be produced in the absence of animal-derived reagents (xenobiotics) to ease the transition to production under GMP standards. However, to facilitate the use of hiPSCs in cell-based therapeutics, their progeny should be produced not only in the absence of xenobiotics but also under GMP conditions requiring extensive standardization of protocols, documentation, and reproducibility of methods and product. Here, we present a successful framework to produce GMP-grade derivatives of hiPSCs that are free of xenobiotic exposure from the collection of patient fibroblasts, through reprogramming, maintenance of hiPSCs, identification of reprogramming vector integration sites (nrLAM-PCR), and finally specification and terminal differentiation of clinically relevant cells. Furthermore, we developed a primary set of Standard Operating Procedures for the GMP-grade derivation and differentiation of these cells as a resource to facilitate widespread adoption of these practices.
Rodríguez-Molina, Jesús; Martínez, Belén; Bilbao, Sonia; Martín-Wanton, Tamara
2017-06-08
The utilization of autonomous maritime vehicles is becoming widespread in operations that are deemed too hazardous for humans to be directly involved in. One of the ways to increase the productivity of the tools used during missions is the deployment of several vehicles with the same objective regarding data collection and transfer, both for the benefit of human staff and policy makers. However, the interchange of data in such an environment poses major challenges, such as low bandwidth and the unreliability of the environment where transmissions take place. Furthermore, the relevant information that must be sent, as well as the exact size that will allow understanding it, is usually not clearly established, as standardization work is scarce in this domain. Under these conditions, establishing a way to interchange information at the data level among autonomous maritime vehicles becomes critically important, since the needed information, along with the size of the transferred data, will have to be defined. This manuscript puts forward the Maritime Data Transfer Protocol (MDTP), a way to interchange standardized pieces of information at the data level among autonomous maritime vehicles, as well as the procedures required for information interchange.
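As a hedged illustration of the kind of compact, standardized record such a protocol needs on low-bandwidth links, the sketch below packs a hypothetical position/status message into a fixed 16-byte layout. The field set is an assumption for illustration, not MDTP's actual wire format.

```python
import struct

# Hypothetical compact position/status record for a low-bandwidth maritime
# link; the field layout is an assumption, not MDTP's actual wire format.
#   > : network byte order   B: vehicle id   B: message type
#   I : unix timestamp       f f: latitude / longitude   H: battery (mV/10)
FMT = ">BBIffH"

def encode(vid, mtype, ts, lat, lon, batt):
    return struct.pack(FMT, vid, mtype, ts, lat, lon, batt)

def decode(frame):
    return struct.unpack(FMT, frame)

frame = encode(7, 1, 1700000000, 43.36, -8.40, 1210)
print(len(frame))  # → 16: a fixed, known transfer size per record
```

Fixing the layout up front answers exactly the question the abstract raises: both sides know in advance what is sent and how many bytes it occupies.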
Narrow band imaging combined with water immersion technique in the diagnosis of celiac disease.
Valitutti, Francesco; Oliva, Salvatore; Iorfida, Donatella; Aloi, Marina; Gatti, Silvia; Trovato, Chiara Maria; Montuori, Monica; Tiberti, Antonio; Cucchiara, Salvatore; Di Nardo, Giovanni
2014-12-01
The "multiple-biopsy" approach in both the duodenum and the bulb is the best strategy to confirm the diagnosis of celiac disease; however, it increases the invasiveness of the procedure and is time-consuming. To evaluate the diagnostic yield of a single biopsy guided by narrow-band imaging combined with the water immersion technique in paediatric patients. Prospective assessment of the diagnostic accuracy of a narrow-band imaging/water immersion technique-driven biopsy approach versus the standard protocol in suspected celiac disease. The experimental approach correctly diagnosed 35/40 children with celiac disease, with an overall diagnostic sensitivity of 87.5% (95% CI: 77.3-97.7). An altered pattern on narrow-band imaging/water immersion technique endoscopic visualization was significantly associated with villous atrophy at guided biopsy (Spearman rho 0.637, p<0.001). Concordance of narrow-band imaging/water immersion technique endoscopic assessments between two operators was high (K: 0.884). The experimental protocol was highly time-saving compared to the standard protocol. An altered narrow-band imaging/water immersion technique pattern coupled with high anti-transglutaminase antibodies could allow a single guided biopsy to diagnose celiac disease. When no altered mucosal pattern is visible even with the narrow-band imaging/water immersion technique, multiple bulbar and duodenal biopsies should be obtained. Copyright © 2014. Published by Elsevier Ltd.
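The reported confidence interval can be reproduced with the normal approximation for a binomial proportion (35 of 40 correct diagnoses):

```python
import math

# Normal-approximation 95% CI for the reported sensitivity: 35 of 40
# children correctly diagnosed.
tp, n = 35, 40
p = tp / n
half_width = 1.96 * math.sqrt(p * (1 - p) / n)

print(round(p * 100, 1))                  # → 87.5
print(round((p - half_width) * 100, 1))   # → 77.3
print(round((p + half_width) * 100, 1))   # → 97.7
```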
EPA Traceability Protocol for Assay and Certification of Gaseous Calibration Standards
In 1997, the U.S. Environmental Protection Agency (EPA) in Research Triangle Park, North Carolina, revised its 1993 version of its traceability protocol for the assay and certification of compressed gas and permeation-device calibration standards. The protocol allows producers o...
Timpa, Joseph G; O'Meara, L Carlisle; Goldberg, Kellen G; Phillips, Jay P; Crawford, Jack H; Jackson, Kimberly W; Alten, Jeffrey A
2016-03-01
Perioperative transfusion of blood products is associated with increased morbidity and mortality after pediatric cardiac surgery. We report the results of a quality improvement project aimed at decreasing perioperative blood product administration and bleeding after pediatric cardiopulmonary bypass (CPB) surgery. A multidisciplinary team evaluated baseline data from 99 consecutive CPB patients, focusing on the variability in transfusion management and bleeding outcomes, to create a standardized bleeding and transfusion management protocol. A total of 62 subsequent patients were evaluated after implementation of the protocol: 17 with single pass hemoconcentrated (SPHC) blood transfusion and 45 with modified ultrafiltration (MUF). Implementation of the protocol with SPHC blood led to a significant decrease in transfusion of every blood product in the cardiovascular operating room and the first 6 hours in the cardiovascular intensive care unit (CVICU; p < .05). Addition of MUF to the protocol led to further decrease in transfusion of all blood products compared to preprotocol. Patients <2 months old had a 49% decrease in total blood product administration: 155 mL/kg preprotocol, 117 mL/kg protocol plus SPHC, and 79 mL/kg protocol plus MUF (p < .01). There were significant decreases in postoperative bleeding in the first hour after CVICU admission: 6 mL/kg preprotocol, 3.8 mL/kg protocol plus SPHC, and 2 mL/kg protocol plus MUF (p = .02). There was also significantly decreased incidence of severe postoperative bleeding (>10 mL/kg) in the first CVICU hour for protocol plus MUF patients (p < .01). Implementation of a multidisciplinary bleeding and transfusion protocol significantly decreases perioperative blood product transfusion and improves some bleeding outcomes.
PANATIKI: A Network Access Control Implementation Based on PANA for IoT Devices
Sanchez, Pedro Moreno; Lopez, Rafa Marin; Gomez Skarmeta, Antonio F.
2013-01-01
Internet of Things (IoT) networks are the pillar of recent novel scenarios, such as smart cities or e-healthcare applications. Among other challenges, these networks cover the deployment and interaction of small devices with constrained capabilities and Internet protocol (IP)-based networking connectivity. These constrained devices usually require connection to the Internet to exchange information (e.g., management or sensing data) or access network services. However, only authenticated and authorized devices can, in general, establish this connection. The so-called authentication, authorization and accounting (AAA) services are in charge of performing these tasks on the Internet. Thus, it is necessary to deploy protocols that allow constrained devices to verify their credentials against AAA infrastructures. The Protocol for Carrying Authentication for Network Access (PANA) has been standardized by the Internet engineering task force (IETF) to carry the Extensible Authentication Protocol (EAP), which provides flexible authentication upon the presence of AAA. To the best of our knowledge, this paper is the first deep study of the feasibility of EAP/PANA for network access control in constrained devices. We provide light-weight versions and implementations of these protocols to fit them into constrained devices. These versions have been designed to reduce the impact in standard specifications. The goal of this work is two-fold: (1) to demonstrate the feasibility of EAP/PANA in IoT devices; (2) to provide the scientific community with the first light-weight interoperable implementation of EAP/PANA for constrained devices in the Contiki operating system (Contiki OS), called PANATIKI. The paper also shows a testbed, simulations and experimental results obtained from real and simulated constrained devices.
Internetting tactical security sensor systems
NASA Astrophysics Data System (ADS)
Gage, Douglas W.; Bryan, W. D.; Nguyen, Hoa G.
1998-08-01
The Multipurpose Surveillance and Security Mission Platform (MSSMP) is a distributed network of remote sensing packages and control stations, designed to provide a rapidly deployable, extended-range surveillance capability for a wide variety of military security operations and other tactical missions. The baseline MSSMP sensor suite consists of a pan/tilt unit with video and FLIR cameras and laser rangefinder. With an additional radio transceiver, MSSMP can also function as a gateway between existing security/surveillance sensor systems such as TASS, TRSS, and IREMBASS, and IP-based networks, to support the timely distribution of both threat detection and threat assessment information. The MSSMP system makes maximum use of Commercial Off The Shelf (COTS) components for sensing, processing, and communications, and of both established and emerging standard communications networking protocols and system integration techniques. Its use of IP-based protocols allows it to freely interoperate with the Internet -- providing geographic transparency, facilitating development, and allowing fully distributed demonstration capability -- and prepares it for integration with the IP-based tactical radio networks that will evolve in the next decade. Unfortunately, the Internet's standard Transport layer protocol, TCP, is poorly matched to the requirements of security sensors and other quasi-autonomous systems, being oriented to conveying a continuous data stream rather than discrete messages. Also, its canonical 'socket' interface both conceals short losses of communications connectivity and simply gives up and forces the Application layer software to deal with longer losses. For MSSMP, a software applique is being developed that will run on top of User Datagram Protocol (UDP) to provide a reliable message-based Transport service.
In addition, a Session layer protocol is being developed to support the effective transfer of control of multiple platforms among multiple control stations.
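The message-based reliability applique described above can be sketched as a stop-and-wait scheme: each discrete message carries a sequence number and is retransmitted until its ACK arrives, instead of relying on TCP's stream semantics. The loss model and retry budget below are illustrative assumptions.

```python
import random

# Stop-and-wait sketch of a message-based reliability layer over an
# unreliable (UDP-like) hop: every discrete message carries a sequence
# number and is retransmitted until the matching ACK arrives. The loss
# rate and retry budget are illustrative assumptions.
random.seed(2)

def lossy_send(packet, loss_rate=0.3):
    """Simulated UDP hop: delivers the packet or drops it."""
    return None if random.random() < loss_rate else packet

def reliable_send(seq, payload, max_tries=50):
    for attempt in range(1, max_tries + 1):
        if lossy_send((seq, payload)) is None:
            continue                      # data lost in transit: retransmit
        if lossy_send(seq) == seq:        # receiver's ACK must also survive
            return attempt                # delivered and acknowledged
    raise TimeoutError("retry budget exhausted; report loss to application")

attempts = [reliable_send(seq, f"msg-{seq}") for seq in range(5)]
print(attempts)  # one entry per message: how many tries it needed
```

Unlike a TCP socket, this design surfaces a prolonged outage to the application explicitly (the exception) rather than concealing it in a stalled stream.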
Schenk, Michael; Huppertz, Berthold; Obermayer-Pietsch, Barbara; Kastelic, Darja; Hörmann-Kröpfl, Martina; Weiss, Gregor
2017-02-01
The aim of the present study was to develop a standard operating procedure (SOP) for the collection, transport, and storage of human cumulus cells, follicular fluid, blood serum, seminal plasma, embryo culture supernatant, and embryo culture supernatant control obtained within the IVF process under approved protocols and written informed consent from participating patients. The SOP was developed at the Kinderwunsch Institut Schenk, Dobl, Austria, together with Biobank Graz of the Medical University of Graz, Austria. The SOP provides comprehensive details of laboratory procedures and sampling of the different fluids within the IVF process. Furthermore, information on sample coding, references of involved laboratory techniques (e.g., oocyte retrieval with a Steiner-TAN needle), ethical approvals, and biobanking procedures are presented. The result of the present study is a standard operating procedure. The SOP ensures a professional way for collection and scientific use of IVF samples by the Kinderwunsch Institut Schenk, Dobl, Austria, and Biobank Graz of the Medical University of Graz, Austria. It can be used as a template for other institutions to unify specimen collection procedures in the field of reproductive health research.
DOT National Transportation Integrated Search
2004-01-01
This document defines the protocol standards for the Internet Protocol Suite (IPS), which is commonly referred to as Transmission Control Protocol/Internet Protocol (TCP/IP) protocols used for data communications within the National Airspace System (...
Envisioning Transformation in VA Mental Health Services Through Collaborative Site Visits.
Kearney, Lisa K; Schaefer, Jeanne A; Dollar, Katherine M; Iwamasa, Gayle Y; Katz, Ira; Schmitz, Theresa; Schohn, Mary; Resnick, Sandra G
2018-04-16
This column reviews the unique contributions of multiple partners in establishing a standardized site visit process to promote quality improvement in mental health care at the Veterans Health Administration. Working as a team, leaders in policy and operations, staff of research centers, and regional- and facility-level mental health leaders developed a standardized protocol for evaluating mental health services at each site and using the data to help implement policy goals. The authors discuss the challenges experienced and lessons learned in this systemwide process and how this information can be part of a framework for improving mental health services on a national level.
Abbreviated Combined MR Protocol: A New Faster Strategy for Characterizing Breast Lesions.
Moschetta, Marco; Telegrafo, Michele; Rella, Leonarda; Stabile Ianora, Amato Antonio; Angelelli, Giuseppe
2016-06-01
The use of an abbreviated magnetic resonance (MR) protocol has recently been proposed for cancer screening. The aim of our study is to evaluate the diagnostic accuracy of an abbreviated MR protocol combining short TI inversion recovery (STIR), turbo-spin-echo (TSE)-T2 sequences, a pre-contrast T1, and a single intermediate (3 minutes after contrast injection) post-contrast T1 sequence for characterizing breast lesions. A total of 470 patients underwent breast MR examination for screening, problem solving, or preoperative staging. Two experienced radiologists evaluated both standard and abbreviated protocols in consensus. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and diagnostic accuracy for both protocols were calculated (with the histological findings and 6-month ultrasound follow-up as the reference standard) and compared with the McNemar test. The post-processing and interpretation times for the MR images were compared with the paired t test. In 177 of 470 (38%) patients, the MR sequences detected 185 breast lesions. Standard and abbreviated protocols obtained sensitivity, specificity, diagnostic accuracy, PPV, and NPV values respectively of 92%, 92%, 92%, 68%, and 98% and of 89%, 91%, 91%, 64%, and 98%, with no statistically significant difference (P < .0001). The mean post-processing and interpretation times were, respectively, 7 ± 1 minutes and 6 ± 3.2 minutes for the standard protocol and 1 ± 1.2 minutes and 2 ± 1.2 minutes for the abbreviated protocol, with a statistically significant difference (P < .01). An abbreviated combined MR protocol represents a time-saving tool for radiologists and patients, with the same diagnostic potential as the standard protocol in patients undergoing breast MRI for screening, problem solving, or preoperative staging. Copyright © 2016 Elsevier Inc. All rights reserved.
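All four reported accuracy figures derive from a single confusion matrix per protocol. A sketch with hypothetical counts chosen to mirror the standard-protocol figures (not the study's raw data):

```python
# How the four reported accuracy figures derive from one confusion matrix.
# Counts below are hypothetical, chosen to mirror the standard-protocol
# results (92% sensitivity/specificity, 68% PPV, 98% NPV), not raw data.
def diagnostic_metrics(tp, fp, tn, fn):
    return {
        "sensitivity": tp / (tp + fn),  # malignant lesions detected
        "specificity": tn / (tn + fp),  # benign findings correctly cleared
        "ppv": tp / (tp + fp),          # positive calls that are true
        "npv": tn / (tn + fn),          # negative calls that are true
    }

m = diagnostic_metrics(tp=46, fp=22, tn=253, fn=4)
print({k: round(v, 2) for k, v in m.items()})
# → {'sensitivity': 0.92, 'specificity': 0.92, 'ppv': 0.68, 'npv': 0.98}
```

The gap between high NPV and modest PPV is typical when disease prevalence in the examined cohort is low: negative calls are nearly always right, while positive calls need confirmation.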
Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things.
Jazayeri, Mohammad Ali; Liang, Steve H L; Huang, Chih-Yuan
2015-09-22
Recently, researchers are focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their proprietary protocols based on their targeted applications. Consequently, IoT becomes heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step to communicate with various IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standard-based solutions to better understand the feasibility of applying existing standards to the IoT vision.
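A minimal version of the benchmarking described above can be sketched as follows, timing one request/response round trip and recording message sizes against a local UDP echo stand-in for an IoT device. The request body and transport are assumptions, not any of the four implemented standards.

```python
import socket
import threading
import time

# Minimal latency / message-size benchmark harness in the spirit of the
# evaluation above, run against a local UDP echo stand-in for an IoT
# device; the request body and transport are assumptions.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))          # OS picks a free port

def echo_once(sock):
    data, addr = sock.recvfrom(1024)   # the "device" answers one request
    sock.sendto(data, addr)

threading.Thread(target=echo_once, args=(server,), daemon=True).start()

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.settimeout(5.0)
request = b'{"resource": "/temperature"}'   # hypothetical request body

t0 = time.perf_counter()
client.sendto(request, server.getsockname())
response, _ = client.recvfrom(1024)
latency_ms = (time.perf_counter() - t0) * 1000.0

# Three of the benchmarked quantities: request size, response size, latency.
print(len(request), len(response), latency_ms > 0.0)  # → 28 28 True
```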
Space Link Extension Protocol Emulation for High-Throughput, High-Latency Network Connections
NASA Technical Reports Server (NTRS)
Tchorowski, Nicole; Murawski, Robert
2014-01-01
New space missions require higher data rates and new protocols to meet these requirements. These high data rate space communication links push the limitations not only of the space communication links, but of the ground communication networks and protocols which forward user data to remote ground stations (GS) for transmission. The Consultative Committee for Space Data Systems (CCSDS) Space Link Extension (SLE) standard is one protocol that has been proposed for use by the NASA Space Network (SN) Ground Segment Sustainment (SGSS) program. New protocol implementations must be carefully tested to ensure that they provide the required functionality, especially because of the remote nature of spacecraft. The SLE protocol standard has been tested in the NASA Glenn Research Center's SCENIC Emulation Lab in order to observe its operation under realistic network delay conditions. More specifically, the delay between the NASA Integrated Services Network (NISN) and the spacecraft has been emulated. The round trip time (RTT) delay for the continental NISN network has been shown to be up to 120 ms; as such, the SLE protocol was tested with network delays ranging from 0 ms to 200 ms. Both a base network condition and an SLE connection were tested with these RTT delays, and the reaction of both network tests to the delay conditions was recorded. Throughput for both of these links was set at 1.2 Gbps. The results show that, in the presence of realistic network delay, the SLE link throughput is significantly reduced, while the base network throughput remained at the 1.2 Gbps specification. The decrease in SLE throughput has been attributed to the implementation's use of blocking calls. The decrease in throughput is not acceptable for high data rate links, as the link requires a constant data flow in order for spacecraft and ground radios to stay synchronized, unless significant data is queued at the ground station.
In cases where queuing the data is not an option, such as during real time transmissions, the SLE implementation cannot support high data rate communication.
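The throughput ceiling imposed by blocking calls follows from simple arithmetic: if each buffer must be acknowledged before the next is sent, the link moves at most one buffer per round trip, regardless of line rate. With an assumed 1 Mb transfer unit:

```python
# Why blocking calls collapse throughput: with one outstanding buffer, the
# link moves at most one buffer per round trip, regardless of line rate.
# The 1 Mb transfer unit below is an assumption for illustration.
buffer_bits = 1_000_000

for rtt_ms in (0.1, 60, 120, 200):
    ceiling_mbps = buffer_bits / (rtt_ms / 1000.0) / 1e6
    print(f"RTT {rtt_ms:>5} ms -> at most {ceiling_mbps:8.1f} Mb/s")
```

Even at the 120 ms continental RTT cited above, a one-buffer-per-RTT scheme with this buffer size caps out near 8 Mb/s, far below the 1.2 Gbps link rate, which is consistent with the observed SLE slowdown; pipelining multiple outstanding transfers removes the ceiling.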
Christian, Michael D; Joynt, Gavin M; Hick, John L; Colvin, John; Danis, Marion; Sprung, Charles L
2010-04-01
To provide recommendations and standard operating procedures for intensive care unit (ICU) and hospital preparations for an influenza pandemic or mass disaster with a specific focus on critical care triage. Based on a literature review and expert opinion, a Delphi process was used to define the essential topics including critical care triage. Key recommendations include: (1) establish an Incident Management System with Emergency Executive Control Groups at facility, local, regional/state or national levels to exercise authority and direction over resources; (2) developing fair and equitable policies may require restricting ICU services to patients most likely to benefit; (3) usual treatments and standards of practice may be impossible to deliver; (4) ICU care and treatments may have to be withheld from patients likely to die even with ICU care and withdrawn after a trial in patients who do not improve or deteriorate; (5) triage criteria should be objective, ethical, transparent, applied equitably and be publicly disclosed; (6) trigger triage protocols for pandemic influenza only when critical care resources across a broad geographic area are or will be overwhelmed despite all reasonable efforts to extend resources or obtain additional resources; (7) triage of patients for ICU should be based on those who are likely to benefit most or a 'first come, first served' basis; (8) a triage officer should apply inclusion and exclusion criteria to determine patient qualification for ICU admission. Judicious planning and adoption of protocols for critical care triage are necessary to optimize outcomes during a pandemic.
Luglio, Gaetano; De Palma, Giovanni Domenico; Tarquini, Rachele; Giglio, Mariano Cesare; Sollazzo, Viviana; Esposito, Emanuela; Spadarella, Emanuela; Peltrini, Roberto; Liccardo, Filomena; Bucci, Luigi
2015-01-01
Background: Despite its proven benefits, laparoscopic colorectal surgery is still underutilized among surgeons. A steep learning curve is one of the causes of its limited adoption. The aim of the study is to determine the feasibility and morbidity rate of laparoscopic colorectal surgery in a single-institution "learning curve" experience, implementing a well-standardized operative technique and recovery protocol. Methods: The first 50 patients treated laparoscopically were included. All the procedures were performed by a trainee surgeon, supervised by a consultant surgeon, according to the principle of complete mesocolic excision with central vascular ligation or TME. Patients underwent a fast-track recovery programme. Recovery parameters, short-term outcomes, morbidity, and mortality were assessed. Results: Types of resection: 20 left-side resections, 8 right-side resections, 14 low anterior resection/TME, 5 total colectomy and IRA, 3 total panproctocolectomy and pouch. Mean operative time: 227 min; mean number of lymph nodes: 18.7. Conversion rate: 8%. Mean time to flatus: 1.3 days; mean time to solid stool: 2.3 days. Mean length of hospital stay: 7.2 days. Overall morbidity: 24%; major morbidity (Dindo-Clavien III): 4%. No anastomotic leak, no mortality, no 30-day readmission. Conclusion: Properly performed laparoscopic colorectal surgery is safe and leads to excellent results in terms of recovery and short-term outcomes, even in a learning curve setting. Key factors for better outcomes and a shorter learning curve seem to be the adoption of a standardized technique and training model, along with strict supervision by an expert colorectal surgeon.
A data transmission method for particle physics experiments based on Ethernet physical layer
NASA Astrophysics Data System (ADS)
Huang, Xi-Ru; Cao, Ping; Zheng, Jia-Jun
2015-11-01
Due to its advantages of universality, flexibility and high performance, fast Ethernet is widely used in readout system design for modern particle physics experiments. However, Ethernet is usually used together with the TCP/IP protocol stack, which makes readout systems difficult to implement because designers have to use an operating system to process this protocol. Furthermore, TCP/IP degrades transmission efficiency and real-time performance. To maximize the performance of Ethernet in physics experiment applications, a data readout method based on the physical layer (PHY) is proposed. In this method, TCP/IP is replaced with a customized, simple protocol, which makes implementation easier. On each readout module, data from the front-end electronics is first fed into an FPGA for protocol processing and then sent to a PHY chip controlled by this FPGA for transmission. This data path is implemented fully in hardware. On the data acquisition system (DAQ) side, however, the absence of a standard protocol causes problems for network-related applications. To solve this problem, in operating system kernel space, data received by the network interface card is redirected from the traditional flow to a specified memory space by a customized program. This memory space can easily be accessed by applications in user space. For verification, a prototype system has been designed and implemented. Preliminary test results show that this method can meet the requirements of data transmission from the readout module to the DAQ in an efficient and simple manner. Supported by National Natural Science Foundation of China (11005107) and Independent Projects of State Key Laboratory of Particle Detection and Electronics (201301)
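The abstract above describes replacing TCP/IP with a custom protocol carried directly in Ethernet frames. As a minimal sketch (not the authors' implementation), the code below assembles a raw Ethernet II frame with a hypothetical experimental EtherType; the field layout and minimum-size padding follow the standard frame format.

```python
import struct

def build_frame(dst_mac: bytes, src_mac: bytes, ethertype: int, payload: bytes) -> bytes:
    """Assemble a raw Ethernet II frame: 6-byte destination MAC,
    6-byte source MAC, 2-byte EtherType, then the payload."""
    if len(dst_mac) != 6 or len(src_mac) != 6:
        raise ValueError("MAC addresses must be 6 bytes")
    header = struct.pack("!6s6sH", dst_mac, src_mac, ethertype)
    frame = header + payload
    # Pad to the 60-byte minimum (before the 4-byte FCS, which the NIC
    # hardware appends on transmission).
    if len(frame) < 60:
        frame += b"\x00" * (60 - len(frame))
    return frame

# 0x88B5 is one of the EtherTypes IEEE 802 reserves for local experimental
# use; the broadcast destination and payload here are illustrative only.
frame = build_frame(b"\xff" * 6, b"\x02\x00\x00\x00\x00\x01", 0x88B5, b"event-data")
```

On Linux, such a frame could be handed to the NIC via a `socket(AF_PACKET, SOCK_RAW)` bound to the interface (requires raw-socket privileges), which mirrors the paper's idea of bypassing the TCP/IP stack entirely.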
Cerfolio, Robert James; Steenwyk, Brad L; Watson, Caroline; Sparrow, James; Belopolsky, Victoria; Townsley, Matthew; Lyerly, Ralph; Downing, Michelle; Bryant, Ayesha; Gurley, William Quinton; Henling, Colleen; Crawford, Jack; Gayeski, Thomas E
2016-03-01
Our objective was to evaluate our results after the implementation of lean (the elimination of wasteful parts of a process). After meetings with our anesthesiologists, we standardized our "in the operating room-to-skin incision protocols" before pulmonary lobectomy. Patients were divided into consecutive cohorts of 300 lobectomy patients. Several protocols were slowly adopted and outcomes were evaluated. One surgeon performed 2,206 pulmonary lobectomies, of which 84% were for cancer. Protocols for lateral decubitus positioning changed over time. We eliminated axillary rolls, arm boards, and beanbags. Monitoring devices were slowly eliminated. Central catheters decreased from 75% to 0% of patients, epidurals from 84% to 3%, arterial catheters from 93% to 4%, and finally, Foley catheters were reduced from 99% to 11% (p ≤ 0.001 for all). A protocol for the insertion of double-lumen endotracheal tubes was established and times decreased (mean, 14 minutes to 1 minute; p = 0.001). After all changes were made, the time between operating room entry and incision decreased from a mean of 64 minutes to 37 minutes (p < 0.001). Outcomes improved, mortality decreased from 3.2% to 0.26% (p = 0.015), and major morbidity decreased from 15.2% to 5.3% (p = 0.042). Lean and value stream mapping can be safely applied to the clinical algorithms of high-risk patient care. We demonstrate that elimination of non-value-added steps can safely decrease preincision time without increasing patient risk in patients who undergo pulmonary lobectomy. Selected centers may be able to adopt some of these lean-driven protocols. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Post-operative monitoring of free muscle transfers by Laser Doppler Imaging: A prospective study.
Tschumi, Christian; Seyed Jafari, S Morteza; Rothenberger, Jens; Van de Ville, Dimitri; Keel, Marius; Krause, Fabian; Shafighi, Maziar
2015-10-01
Despite different existing methods, monitoring of free muscle transfer is still challenging. In the current study, we evaluated our clinical monitoring of such tissues using a recent microcirculation-imaging camera (EasyLDI) as an additional tool for detecting perfusion incompetency. This study was performed on seven patients with soft tissue defects who underwent reconstruction with free gracilis muscle. Besides the standard monitoring protocol (clinical assessment, temperature strips, and surface Doppler), hourly EasyLDI monitoring was performed for 48 hours. A baseline value (raised flap but still connected to its vascular bundle) and an ischaemia perfusion value (completely resected flap) were measured at the same point. The mean age of the patients, mean baseline value, and mean ischaemia perfusion value were 48.00 ± 13.42 years, 49.31 ± 17.33 arbitrary perfusion units (APU), and 9.87 ± 4.22 APU, respectively. The LDI values measured in six free muscle transfers were compatible with the hourly standard monitoring protocol, and normalized LDI values significantly increased over time (P < 0.001, r = 0.412). One of the flaps required a return to theatre 17 hours after the operation, where an unsalvageable flap loss was detected. All normalized LDI values of this flap were under the ischaemia perfusion level and the trend was significantly descending over time (P < 0.001, r = -0.870). Given its capability for early detection of perfusion incompetency, LDI may be recommended as an additional post-operative monitoring device for free muscle flaps, for early detection of suspected failing flaps and for validation of other methods. © 2015 Wiley Periodicals, Inc.
Miniature EVA Software Defined Radio
NASA Technical Reports Server (NTRS)
Pozhidaev, Aleksey
2012-01-01
As NASA embarks upon developing the Next-Generation Extra Vehicular Activity (EVA) Radio for deep space exploration, the demands on EVA battery life will substantially increase. The number of modes and frequency bands required will continue to grow in order to enable efficient and complex multi-mode operations including communications, navigation, and tracking applications. Whether supporting astronaut excursions, soldiers in the field, or first responders at emergency hazards, NASA has developed an innovative, affordable, miniaturized, power-efficient software defined radio that offers unprecedented flexibility. This lightweight, programmable, S-band, multi-service, frequency-agile EVA software defined radio (SDR) supports data, telemetry, voice, and both standard and high-definition video. Features include a modular design and an easily scalable architecture, and the EVA SDR allows both stationary and mobile battery-powered handheld operation. Currently, the radio is equipped with an S-band RF section. However, its scalable architecture can accommodate multiple RF sections simultaneously to cover multiple frequency bands. The EVA SDR also supports multiple network protocols. It currently implements a hybrid mesh network based on the open-standard 802.11s protocol. The radio targets RF channel data rates up to 20 Mbps and can be equipped with a real-time operating system (RTOS) that can be switched off for power-aware applications. The EVA SDR's modular design permits implementation of the same-hardware-at-all-network-nodes concept. This approach assures the portability of the same software into any radio in the system. It also brings several benefits to the entire system, including reduced system maintenance, system complexity, and development cost.
NASA Astrophysics Data System (ADS)
Domanski, Konrad; Alharbi, Essa A.; Hagfeldt, Anders; Grätzel, Michael; Tress, Wolfgang
2018-01-01
Perovskite solar cells have achieved power-conversion efficiency values approaching those of established photovoltaic technologies, making the reliable assessment of their operational stability the next essential step towards commercialization. Although studies increasingly often involve a form of stability characterization, they are conducted in non-standardized ways, which yields data that are effectively incomparable. Furthermore, stability assessment of a novel material system with its own peculiarities might require an adjustment of common standards. Here, we investigate the effects of different environmental factors and electrical load on the ageing behaviour of perovskite solar cells. On this basis, we comment on our perceived relevance of the different ways these are currently aged. We also demonstrate how the results of the experiments can be distorted and how to avoid the common pitfalls. We hope this work will initiate discussion on how to age perovskite solar cells and facilitate the development of consensus stability measurement protocols.
Developing standard operating procedures for gene drive research in disease vector mosquitoes.
Adelman, Zach N; Pledger, David; Myles, Kevin M
2017-12-01
Numerous arthropod species represent potential targets for gene-drive-based population suppression or replacement, including those that transmit diseases, damage crops, or act as deleterious invasive species. Containment measures for gene drive research in arthropods have been discussed in the literature, but the importance of developing safe and effective standard operating procedures (SOPs) for these types of experiments has not been adequately addressed. Concisely written SOPs link safe work practices, containment measures, institutional training, and research-specific protocols. Here we discuss information to be considered by principal investigators, biosafety officers, and institutional biosafety committees as they work together to develop SOPs for experiments involving gene drive in arthropods, and describe various courses of action that can be used to maintain the effectiveness of SOPs through evaluation and revision. The information provided herein will be especially useful to investigators and regulatory personnel who may lack extensive experience working with arthropods under containment conditions.
1980-05-01
[Opening text garbled in source.] ...in accordance with Federal Standard 1003, "Telecommunications: Synchronous Bit Oriented Data Link Control Procedures," and the higher level user. The solution to the producer/consumer problem involves the use of PASS and SIGNAL primitives and event variables or semaphores. The event variables have been defined for the LS-microprocessor interface as part of the internal registers that are included in the F6856.
Results of a Prospective Echocardiography Trial in International Space Station Crew
NASA Technical Reports Server (NTRS)
Hamilton, Douglas R.; Sargsyan, Ashot E.; Martin, David; Garcia, Kathleen M.; Melton, Shannon; Feiverson, Alan; Dulchavsky, Scott A.
2009-01-01
In the framework of an operationally oriented investigation, we conducted a prospective trial of a standard clinical echocardiography protocol in a cohort of long-duration crewmembers. The resulting primary and processed data appear to have no precedents. Our tele-echocardiography paradigm, including just-in-time e-training methods, was also assessed. A critical review of the imaging technique, equipment and setting limitations, and quality assurance is provided, as well as the analysis of "space normal" data.
Entanglement Measures in Ion-Trap Quantum Simulators without Full Tomography
2014-07-21
...This will allow us to efficiently compute correlations between ψ and ψ∗ in terms of standard expectation values in the enlarged space as follows... measure correlations of the form appearing in Eq. (2), with Θ a linear combination of tensorial products of Pauli matrices and identity operators... matrices will produce the desired correlation. Note that this protocol always results in a correlation of an odd number of Pauli matrices. In order to...
Protocol and pilot data for establishing the Australian Stroke Clinical Registry.
Cadilhac, Dominique A; Lannin, Natasha A; Anderson, Craig S; Levi, Christopher R; Faux, Steven; Price, Chris; Middleton, Sandy; Lim, Joyce; Thrift, Amanda G; Donnan, Geoffrey A
2010-06-01
Disease registries assist with clinical practice improvement. The Australian Stroke Clinical Registry aims to provide national, prospective, systematic data on processes and outcomes for stroke. We describe the methods of establishment and initial experience of operation. The Australian Stroke Clinical Registry conforms to new national operating principles and technical standards for clinical quality registers. Features include: online data capture from acute public and private hospital sites; opt-out consent; an expert-consensus-agreed core minimum dataset with standard definitions; outcomes assessed at 3 months poststroke; formal governance oversight; and formative evaluations for improvements. Qualitative feedback from sites indicates that the web-tool is simple to use and the user manuals, data dictionary, and training are appropriate. However, sites desire automated data-entry methods for routine demographic variables, and the opt-out consent protocol has sometimes been problematic. Data from 204 patients (median age 71 years, 54% males, 60% Australian) were collected from four pilot hospitals from June to October 2009 (mean, 50 cases per month), including ischaemic stroke (in 72%), intracerebral haemorrhage (16%), transient ischaemic attack (9%), and undetermined (3%), with only one case opting out. The Australian Stroke Clinical Registry has been well established, but further refinements and broad roll-out are required before realising its potential to improve patient care through clinician feedback and local, national, and international comparative data.
A proactive approach for managing indoor air quality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, R.E.; Casey, J.M.; Williams, P.L.
1997-11-01
Ventilation and maintenance, followed by psychosocial issues, are the factors most often implicated in indoor air quality (IAQ) investigations. The absence of accepted exposure standards and the presence of a wide variety of building designs, ages, ventilation systems, and usages often make IAQ complaint investigations ineffective. Thus, the best approach to achieving IAQ is to prevent problems from occurring. This paper presents the framework for a proactive approach to managing the causes most often implicated in IAQ investigations. It is the aim of this proactive protocol to provide a cost-effective guide for preventing IAQ problems in nonindustrial settings and in buildings for which there are no current IAQ complaints. The proposed protocol focuses on heating, ventilation, and air-conditioning (HVAC) system maintenance and operation; psychosocial factors; and the handling and investigation of complaints. An IAQ manager is designated to implement and manage the protocol. The HVAC system portion of the protocol focuses on proper maintenance of the components often identified as sources of problems in IAQ studies, documentation of the maintenance procedures, and training of individuals responsible for building maintenance. The protocol addresses the psychosocial factors with an environmental survey that rates the occupants' perceptions of the indoor air to identify potential IAQ problems. The psychosocial portion of the protocol also incorporates occupant education and awareness. Finally, a three-step initial investigation procedure for addressing IAQ problems is presented.
ISS and STS Commercial Off-the-Shelf Router Testing
NASA Technical Reports Server (NTRS)
Ivancic, William D.; Bell, Terry L.; Shell, Dan
2002-01-01
This report documents the results of testing performed with commercial off-the-shelf (COTS) routers and Internet Protocols (IPs) to determine if COTS equipment and IP could be utilized to upgrade NASA's current Space Transportation System (STS), the Shuttle, and the International Space Station communication infrastructure. Testing was performed by NASA Glenn Research Center (GRC) personnel within the Electronic Systems Test Laboratory (ESTL) with cooperation from the Mission Operations Directorate (MOD) Qualification and Utilization of Electronic System Technology (QUEST) personnel. The ESTL testing occurred between November 1 and 9, 2000. Additional testing was performed at NASA Glenn Research Center in a laboratory environment with equipment configured to emulate the STS. This report documents those tests and includes detailed test procedures, equipment interface requirements, test configurations and test results. The tests showed that a COTS router and standard Transmission Control Protocols and Internet Protocols (TCP/IP) could be used for both the Shuttle and the Space Station if near-error-free radio links are provided.
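The report's conclusion that near-error-free radio links are needed for standard TCP/IP can be illustrated with the well-known steady-state TCP throughput bound of Mathis et al. (BW ≤ (MSS/RTT)·C/√p). The sketch below is a rough analytic model, not the report's test procedure, and the link parameters are hypothetical.

```python
import math

def tcp_throughput_bps(mss_bytes: float, rtt_s: float, loss_prob: float,
                       c: float = math.sqrt(3 / 2)) -> float:
    """Steady-state TCP throughput upper bound (Mathis et al., 1997):
    BW <= (MSS / RTT) * C / sqrt(p), with C ~ sqrt(3/2) for periodic loss.
    Returns bits per second."""
    if loss_prob <= 0:
        raise ValueError("model is only defined for a nonzero loss probability")
    return (mss_bytes * 8 / rtt_s) * c / math.sqrt(loss_prob)

# Hypothetical long-delay space link: 1460-byte MSS, 600 ms RTT.
for p in (1e-6, 1e-4, 1e-2):
    print(f"p={p:g}: {tcp_throughput_bps(1460, 0.6, p) / 1e6:.2f} Mbit/s")
```

Because achievable throughput scales as 1/√p, each hundredfold increase in loss probability cuts throughput tenfold, which is one way to see why TCP over a lossy radio link demands near-error-free channel conditioning.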
Physical layer simulation study for the coexistence of WLAN standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Howlader, M. K.; Keiger, C.; Ewing, P. D.
This paper presents the results of a study on the performance of wireless local area network (WLAN) devices in the presence of interference from other wireless devices. To understand the coexistence of these wireless protocols, simplified physical-layer-system models were developed for the Bluetooth, Wireless Fidelity (WiFi), and Zigbee devices, all of which operate within the 2.4-GHz frequency band. The performances of these protocols were evaluated using Monte-Carlo simulations under various interference and channel conditions. The channel models considered were basic additive white Gaussian noise (AWGN), Rayleigh fading, and site-specific fading. The study also incorporated the basic modulation schemes, multiple access techniques, and channel allocations of the three protocols. This research is helping the U.S. Nuclear Regulatory Commission (NRC) understand the coexistence issues associated with deploying wireless devices and could prove useful in the development of a technical basis for guidance to address safety-related issues with the implementation of wireless systems in nuclear facilities. (authors)
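A physical-layer Monte-Carlo simulation of the kind described can be sketched for the simplest case, BPSK over an AWGN channel; this is an illustrative toy, not the study's WiFi/Bluetooth/Zigbee models, and fading or interference would enter as extra terms on the received sample.

```python
import math
import random

def simulate_bpsk_ber(ebno_db: float, n_bits: int = 200_000, seed: int = 1) -> float:
    """Monte-Carlo bit-error-rate estimate for BPSK over an AWGN channel.
    Unit-energy antipodal symbols; noise std dev follows from Eb/N0."""
    rng = random.Random(seed)
    sigma = math.sqrt(1.0 / (2.0 * 10 ** (ebno_db / 10)))
    errors = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        symbol = 1.0 if bit else -1.0           # map 1 -> +1, 0 -> -1
        received = symbol + rng.gauss(0.0, sigma)
        if (received > 0) != (bit == 1):        # hard sign decision
            errors += 1
    return errors / n_bits

def bpsk_ber_theory(ebno_db: float) -> float:
    """Closed-form BPSK/AWGN BER: 0.5 * erfc(sqrt(Eb/N0))."""
    return 0.5 * math.erfc(math.sqrt(10 ** (ebno_db / 10)))
```

With enough trials the simulated curve converges to the closed form, which is the standard sanity check before adding Rayleigh or site-specific fading and interferer models as the study did.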
de Abreu, Emanuelle Maria Sávio; Machado, Carla Jorge; Pastore Neto, Mario; de Rezende Neto, João Baptista; Sanches, Marcelo Dias
2015-01-01
To investigate the effect of standardized interventions in the management of tube thoracostomy patients and to assess the independent effect of each intervention. A chest tube management protocol was assessed in a retrospective cohort study. The tube thoracostomy protocol (TTP) was implemented in August 2012 and consisted of: antimicrobial prophylaxis, chest tube insertion in the operating room (OR), admission after chest tube thoracostomy (CTT) to a hospital floor separate from the emergency department (ED), and daily respiratory therapy (RT) sessions post-CTT. The inclusion criteria were hemodynamic stability, age between 15 and 59 years, and injury severity score (ISS) < 17. All patients had isolated injuries to the chest wall, lung, and pleura. During the study period, 92 patients were managed according to the standardized protocol. The outcomes of those patients were compared to 99 patients treated before the TTP. Multivariate logistic regression analysis was performed to assess the independent effect of each variable of the protocol on selected outcomes. Demographics, injury severity, and trauma mechanisms were similar among the groups. As expected, protocol compliance increased after the implementation of the TTP. There was a significant reduction (p<0.05) in the incidence of retained hemothoraces, empyemas, pneumonias, surgical site infections, post-procedural complications, hospital length of stay, and number of chest tube days. Respiratory therapy was independently linked to a significant reduction (p<0.05) in the incidence of seven of eight undesired outcomes after CTT. Antimicrobial prophylaxis was linked to a significant decrease (p<0.05) in retained hemothoraces, despite no significant (p<0.10) reductions in empyema and surgical site infections. Conversely, OR chest tube insertion was associated with significant (p<0.05) reduction of both complications, and also significantly decreased the incidence of pneumonias.
Implementation of a TTP effectively reduced complications after CTT in trauma patients.
Chaimani, Anna; Caldwell, Deborah M; Li, Tianjing; Higgins, Julian P T; Salanti, Georgia
2017-03-01
The number of systematic reviews that aim to compare multiple interventions using network meta-analysis is increasing. In this study, we highlight aspects of a standard systematic review protocol that may need modification when multiple interventions are to be compared. We take the protocol format suggested by Cochrane for a standard systematic review as our reference and compare the considerations for a pairwise review with those required for a valid comparison of multiple interventions. We suggest new sections for protocols of systematic reviews including network meta-analyses with a focus on how to evaluate their assumptions. We provide example text from published protocols to exemplify the considerations. Standard systematic review protocols for pairwise meta-analyses need extensions to accommodate the increased complexity of network meta-analysis. Our suggested modifications are widely applicable to both Cochrane and non-Cochrane systematic reviews involving network meta-analyses. Copyright © 2017 Elsevier Inc. All rights reserved.
Reusable single-port access device shortens operative time and reduces operative costs.
Shussman, Noam; Kedar, Asaf; Elazary, Ram; Abu Gazala, Mahmoud; Rivkind, Avraham I; Mintz, Yoav
2014-06-01
In recent years, single-port laparoscopy (SPL) has become an attractive approach for performing surgical procedures. The pitfalls of this approach are technical and financial. Financial concerns stem from the increased cost of dedicated devices and prolonged operating room time. Our aim was to calculate the cost of SPL using a reusable port and instruments in order to evaluate the cost difference between this approach, SPL using the available disposable ports, and standard laparoscopy. We performed 22 laparoscopic procedures via the SPL approach using a reusable single-port access system and reusable laparoscopic instruments. These included 17 cholecystectomies and five other procedures. Operative time, postoperative length of stay (LOS) and complications were prospectively recorded and compared with similar data from our SPL database. Student's t test was used for statistical analysis. SPL was successfully performed in all cases. Mean operative time for cholecystectomy was 72 min (range 40-116). Postoperative LOS was not changed from our standard protocols and was 1.1 days for cholecystectomy. The postoperative course was within normal limits for all patients and perioperative morbidity was recorded. Both operative time and length of hospital stay were shorter for the 17 patients who underwent cholecystectomy using a reusable port than for the matched previous 17 SPL cholecystectomies we performed (p < 0.001). Prices of disposable SPL instruments and multiport access devices as well as extraction bags from different manufacturers were used to calculate the cost difference. Operating with a reusable port yielded an average cost savings of US$388 compared with using disposable ports, and US$240 compared with standard laparoscopy. Single-port laparoscopic surgery is a technically challenging and expensive surgical approach.
Financial concerns among others have been advocated against this approach; however, we demonstrate herein that using a reusable port and instruments reduces operative time and overall operative costs, even beyond the cost of standard laparoscopy.
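The kind of per-case cost comparison the authors report reduces to simple amortization arithmetic: spread the reusable instrument's purchase price over its expected number of uses, then add operating-room time and per-case disposables. The figures below are entirely hypothetical placeholders, not the study's data.

```python
def cost_per_case(instrument_cost: float, uses: int,
                  or_minute_rate: float, minutes: float,
                  disposables: float) -> float:
    """Amortized per-case cost: reusable hardware spread over its expected
    number of uses, plus operating-room time and per-case disposable items."""
    return instrument_cost / uses + or_minute_rate * minutes + disposables

# Hypothetical figures for illustration only (not from the study):
reusable = cost_per_case(instrument_cost=5000, uses=100,
                         or_minute_rate=20, minutes=72, disposables=50)
disposable = cost_per_case(instrument_cost=0, uses=1,
                           or_minute_rate=20, minutes=80, disposables=600)
print(f"saving per case: ${disposable - reusable:.0f}")
```

The structure makes the study's finding intuitive: a reusable port saves on both the disposables term and, when it shortens the procedure, on the operating-room time term.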
Tan, K. E.; Ellis, B. C.; Lee, R.; Stamper, P. D.; Zhang, S. X.
2012-01-01
Matrix-assisted laser desorption ionization–time of flight mass spectrometry (MALDI-TOF MS) has been found to be an accurate, rapid, and inexpensive method for the identification of bacteria and yeasts. Previous evaluations have compared the accuracy, time to identification, and costs of the MALDI-TOF MS method against standard identification systems or commercial panels. In this prospective study, we compared a protocol incorporating MALDI-TOF MS (MALDI protocol) with the current standard identification protocols (standard protocol) to determine the performance in actual practice using a specimen-based, bench-by-bench approach. The potential impact on time to identification (TTI) and costs had MALDI-TOF MS been the first-line identification method was quantitated. The MALDI protocol includes supplementary tests, notably for Streptococcus pneumoniae and Shigella, and indications for repeat MALDI-TOF MS attempts, often not measured in previous studies. A total of 952 isolates (824 bacterial isolates and 128 yeast isolates) recovered from 2,214 specimens were assessed using the MALDI protocol. Compared with standard protocols, the MALDI protocol provided identifications 1.45 days earlier on average (P < 0.001). In our laboratory, we anticipate that the incorporation of the MALDI protocol can reduce reagent and labor costs of identification by $102,424 or 56.9% within 12 months. The model included the fixed annual costs of the MALDI-TOF MS, such as the cost of protein standards and instrument maintenance, and the annual prevalence of organisms encountered in our laboratory. This comprehensive cost analysis model can be generalized to other moderate- to high-volume laboratories. PMID:22855510
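The cost model described above (fixed annual instrument and maintenance costs plus a per-isolate reagent and labor cost scaled by the laboratory's organism volume) can be sketched as follows; the volumes and unit costs are invented placeholders, not the laboratory's figures.

```python
def annual_identification_cost(volume: dict, per_isolate_cost: float,
                               fixed_annual: float) -> float:
    """Annual cost = fixed instrument/maintenance costs plus the
    per-isolate reagent and labor cost scaled by expected volume."""
    total_isolates = sum(volume.values())
    return fixed_annual + total_isolates * per_isolate_cost

# Hypothetical annual volumes and unit costs for illustration only:
volume = {"bacteria": 8000, "yeasts": 1200}
standard = annual_identification_cost(volume, per_isolate_cost=18.0, fixed_annual=0.0)
maldi = annual_identification_cost(volume, per_isolate_cost=2.5, fixed_annual=30_000.0)
print(f"projected annual savings: ${standard - maldi:,.0f}")
```

The comparison shows why the model generalizes to moderate- to high-volume laboratories: the MALDI-TOF option trades a large fixed annual cost for a much smaller marginal cost, so savings grow with isolate volume.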
Wound-healing outcomes using standardized assessment and care in clinical practice.
Bolton, Laura; McNees, Patrick; van Rijswijk, Lia; de Leon, Jean; Lyder, Courtney; Kobza, Laura; Edman, Kelly; Scheurich, Anne; Shannon, Ron; Toth, Michelle
2004-01-01
Wound-healing outcomes applying standardized protocols have typically been measured within controlled clinical trials, not natural settings. Standardized protocols of wound care have been validated for clinical use, creating an opportunity to measure the resulting outcomes. Wound-healing outcomes were explored during clinical use of standardized validated protocols of care based on patient and wound assessments. This was a prospective multicenter study of wound-healing outcomes management in real-world clinical practice. Healing outcomes from March 26 to October 31, 2001, were recorded on patients in 3 long-term care facilities, 1 long-term acute care hospital, and 12 home care agencies for wounds selected by staff to receive care based on computer-generated validated wound care algorithms. After diagnosis, wound dimensions and status were assessed using a tool adapted from the Pressure Sore Status Tool for use on all wounds. Wound, ostomy, and continence nursing professionals accessed consistent protocols of care, via telemedicine in home care or paper forms in long-term care. A physician entered assessments into a desktop computer in the wound clinic. Based on evidence that healing proceeds faster with fewer infections in environments without gauze, the protocols generally avoided gauze dressings. Most of the 767 wounds selected to receive the standardized protocols of care were stage III-IV pressure ulcers (n = 373; mean healing time 62 days) or full-thickness venous ulcers (n = 124; mean healing time 57 days). Partial-thickness wounds healed faster than same-etiology full-thickness wounds. These results provide benchmarks for natural-setting healing outcomes and help to define and address wound care challenges. Outcomes primarily using nongauze protocols of care matched or surpassed best previously published results on similar wounds using gauze-based protocols of care, including protocols applying gauze impregnated with growth factors or other agents.
Cashman, Kevin D; Kiely, Mairead; Kinsella, Michael; Durazo-Arvizu, Ramón A; Tian, Lu; Zhang, Yue; Lucey, Alice; Flynn, Albert; Gibney, Michael J; Vesper, Hubert W; Phinney, Karen W; Coates, Paul M; Picciano, Mary F; Sempos, Christopher T
2013-01-01
Background: The Vitamin D Standardization Program (VDSP) has developed protocols for standardizing procedures of 25-hydroxyvitamin D [25(OH)D] measurement in National Health/Nutrition Surveys to promote 25(OH)D measurements that are accurate and comparable over time, location, and laboratory procedure to improve public health practice. Objective: We applied VDSP protocols to existing ELISA-derived serum 25(OH)D data from the Irish National Adult Nutrition Survey (NANS) as a case-study survey and evaluated their effectiveness by comparison of the protocol-projected estimates with those from a reanalysis of survey serums by using liquid chromatography–tandem mass spectrometry (LC–tandem MS). Design: The VDSP reference system and protocols were applied to ELISA-based serum 25(OH)D data from the representative NANS sample (n = 1118). A reanalysis of 99 stored serums by using standardized LC–tandem MS and resulting regression equations yielded predicted standardized serum 25(OH)D values, which were then compared with LC–tandem MS reanalyzed values for all serums. Results: Year-round prevalence rates for serum 25(OH)D concentrations <30, <40, and <50 nmol/L were 6.5%, 21.9%, and 40.0%, respectively, via original ELISA measurements and 11.4%, 25.3%, and 43.7%, respectively, when VDSP protocols were applied. Differences in estimates at <30- and <40-nmol/L thresholds, but not at the <50-nmol/L threshold, were significant (P < 0.05). A reanalysis of all serums by using LC–tandem MS confirmed prevalence estimates as 11.2%, 27.2%, and 45.0%, respectively. Prevalences of serum 25(OH)D concentrations >125 nmol/L were 1.2%, 0.3%, and 0.6% by means of ELISA, VDSP protocols, and LC–tandem MS, respectively. Conclusion: VDSP protocols hold a major potential for national nutrition and health surveys in terms of the standardization of serum 25(OH)D data. PMID:23615829
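The VDSP approach described above, regressing reanalyzed reference values on the original assay values from a subsample, then predicting standardized concentrations for the whole survey, can be sketched with ordinary least squares. The paired values below are invented toy numbers, not NANS data.

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y ~ a + b*x for paired measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def standardize(assay_values, a, b):
    """Predict standardized (reference-method-equivalent) concentrations."""
    return [a + b * v for v in assay_values]

def prevalence_below(values, threshold):
    return sum(v < threshold for v in values) / len(values)

# Toy paired subsample (hypothetical, nmol/L); the original assay reads high:
elisa_sub = [20, 35, 50, 65, 80, 95]
lcms_sub = [16, 30, 44, 57, 70, 84]
a, b = fit_line(elisa_sub, lcms_sub)

# Apply the calibration to the full (toy) survey and re-estimate prevalence:
survey_elisa = [25, 38, 47, 52, 61, 73, 90, 110]
standardized = standardize(survey_elisa, a, b)
```

With a positively biased assay, standardization pulls values down, so prevalence below a deficiency threshold rises, which mirrors the shift from 6.5% to 11.4% at <30 nmol/L reported in the abstract.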
Plugfest 2009: Global Interoperability in Telerobotics and Telemedicine
King, H. Hawkeye; Hannaford, Blake; Kwok, Ka-Wai; Yang, Guang-Zhong; Griffiths, Paul; Okamura, Allison; Farkhatdinov, Ildar; Ryu, Jee-Hwan; Sankaranarayanan, Ganesh; Arikatla, Venkata; Tadano, Kotaro; Kawashima, Kenji; Peer, Angelika; Schauß, Thomas; Buss, Martin; Miller, Levi; Glozman, Daniel; Rosen, Jacob; Low, Thomas
2014-01-01
Despite the great diversity of teleoperator designs and applications, their underlying control systems have many similarities. These similarities can be exploited to enable interoperability between heterogeneous systems. We have developed a network data specification, the Interoperable Telerobotics Protocol, that can be used for Internet-based control of a wide range of teleoperators. In this work we test interoperable telerobotics on the global Internet, focusing on the telesurgery application domain. Fourteen globally dispersed telerobotic master and slave systems were connected in thirty trials in one twenty-four-hour period. Users performed common manipulation tasks to demonstrate effective master-slave operation. With twenty-eight (93%) successful, unique connections, the results show a high potential for standardizing telerobotic operation. Furthermore, new paradigms for telesurgical operation and training are presented, including a networked surgery trainer and upper-limb exoskeleton control of micro-manipulators. PMID:24748993
Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things
Jazayeri, Mohammad Ali; Liang, Steve H. L.; Huang, Chih-Yuan
2015-01-01
Recently, researchers have focused on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their own proprietary protocols based on their targeted applications. Consequently, the IoT is heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step toward communicating with the wide variety of IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and the OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standards-based solutions to better understand the feasibility of applying existing standards to the IoT vision. PMID:26402683
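One reason CoAP suits constrained devices is its compact binary framing (RFC 7252): a 4-byte fixed header plus optional token and options. The sketch below serializes a minimal confirmable GET request of the kind an "SOS over CoAP" endpoint would receive; the resource name "temperature" is a hypothetical single-segment Uri-Path, and only paths of 12 bytes or fewer are handled (no extended option lengths).

```python
import struct

# Minimal CoAP (RFC 7252) confirmable GET request builder.
COAP_VERSION = 1
TYPE_CON = 0          # confirmable message
CODE_GET = 0x01       # method code 0.01 = GET
OPT_URI_PATH = 11     # Uri-Path option number

def coap_get(message_id, path, token=b""):
    """Serialize a confirmable GET with one Uri-Path segment (<= 12 bytes)."""
    header = struct.pack(
        "!BBH",
        (COAP_VERSION << 6) | (TYPE_CON << 4) | len(token),
        CODE_GET,
        message_id,
    )
    # One option byte: delta = 11 (Uri-Path), low nibble = segment length
    option = bytes([(OPT_URI_PATH << 4) | len(path)]) + path.encode()
    return header + token + option

msg = coap_get(0x1234, "temperature")
# 4-byte header + 1 option byte + 11 path bytes = 16 bytes on the wire
```

A request this small is exactly what the paper's "request message size" benchmark measures; the equivalent HTTP GET would be an order of magnitude larger.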
Bernard, Lise; Roche, Béatrice; Batisse, Marie; Maqdasy, Salwan; Terral, Daniel; Sautou, Valérie; Tauveron, Igor
2016-10-01
In non-critically ill patients, the use of an insulin syringe pump allows the management of temporary situations during which other therapies cannot be used (failure of subcutaneous injections, awaiting advice from the diabetes team, emergency situations, prolonged corticosteroid therapy, initiation of an artificial nutrition, need for a fasting status, etc.). To manage the risks related to this «never event», the use of a standard validated protocol for insulin administration and monitoring is an essential prerequisite. To this end, a multidisciplinary approach is recommended. With the support of our subcommission «Endocrinology-Diabetology», we proceeded with a «step-by-step process» to create such a standardized protocol: (1) review of all existing protocols in our hospital; (2) overview of the literature data concerning insulin infusion protocols developed by multidisciplinary teams in France and abroad; (3) development of a standardized protocol for non-intensive care unit patients, respecting the current recommendations and adapting it to the working habits of health teams; and (4) validation of the protocol Two protocols based on the same structure but adapted to the health status of the patient have been developed. The protocols are divided in to three parts: (1) golden rules to make the use of the protocol appropriate and safe; (2) the algorithm (a double entry table) corresponding to a dynamic adaptation of insulin doses, clearly defining the target and the 'at risk situations'; and (3) practical aspects of the protocol: preparation of the syringe, treatment initiation and traceability. The protocols have been validated by the institution. Our standardized insulin infusion protocol is simple, easy to implement, safe and is likely to be applicable in diverse care units. However, the efficiency, safety and the workability of our protocols have to be clinically evaluated. © 2016 John Wiley & Sons, Ltd.
Proposal for implementation of CCSDS standards for use with spacecraft engineering/housekeeping data
NASA Technical Reports Server (NTRS)
Welch, Dave
1994-01-01
Many of today's low Earth orbiting spacecraft use the Consultative Committee for Space Data Systems (CCSDS) protocol for better optimization of downlink RF bandwidth and onboard storage space. However, most of the associated housekeeping data has continued to be generated and downlinked in a synchronous, Time Division Multiplexed (TDM) fashion. The CCSDS protocol allows many economies that better utilize the available bandwidth and storage space to optimize the housekeeping data for use in operational trending and analysis work. By outputting only what is currently important or of interest, finer resolution of critical items can be obtained. This can be accomplished by better utilizing the normally allocated housekeeping downlink and storage areas rather than taking space reserved for science.
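The framing that makes this asynchronous, by-exception downlink possible is the CCSDS Space Packet (CCSDS 133.0-B): a 6-byte primary header carrying an application process ID (APID), sequence count, and length, so each housekeeping item can be routed and stored independently of any TDM slot. A minimal sketch of the header layout (the APID and payload are illustrative):

```python
import struct

# CCSDS Space Packet primary header (CCSDS 133.0-B) builder.
SEQ_UNSEGMENTED = 0b11   # sequence flags for a standalone packet

def ccsds_packet(apid, seq_count, data, telemetry=True):
    """Build a space packet: 6-byte primary header + packet data field."""
    if not data:
        raise ValueError("packet data field must hold at least one octet")
    # word 1: version (3 bits, 0) | type (1 bit) | sec. hdr flag (1 bit) | APID (11 bits)
    word1 = ((0 if telemetry else 1) << 12) | (apid & 0x7FF)
    # word 2: sequence flags (2 bits) | packet sequence count (14 bits)
    word2 = (SEQ_UNSEGMENTED << 14) | (seq_count & 0x3FFF)
    # word 3: packet data length field is (length of data field) - 1
    word3 = len(data) - 1
    return struct.pack("!HHH", word1, word2, word3) + data

pkt = ccsds_packet(apid=0x123, seq_count=7, data=b"\xde\xad\xbe\xef")
```

Because each packet is self-identifying, the ground system can reconstruct a housekeeping stream in which only the parameters that changed were ever downlinked.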
Satellite Communications Using Commercial Protocols
NASA Technical Reports Server (NTRS)
Ivancic, William D.; Griner, James H.; Dimond, Robert; Frantz, Brian D.; Kachmar, Brian; Shell, Dan
2000-01-01
NASA Glenn Research Center has been working with industry, academia, and other government agencies in assessing commercial communications protocols for satellite and space-based applications. In addition, NASA Glenn has been developing and advocating new satellite-friendly modifications to existing communications protocol standards. This paper summarizes recent research into the applicability of various commercial standard protocols for use over satellite and space-based communications networks, as well as expectations for future protocol development. It serves as a reference point from which the detailed work can be readily accessed. Areas addressed include asynchronous-transfer-mode quality of service; completed and ongoing work of the Internet Engineering Task Force; data-link-layer protocol development for unidirectional link routing; and protocols for aeronautical applications, including mobile Internet protocol routing for wireless/mobile hosts and the aeronautical telecommunications network protocol.
Cheikh Ismail, L; Knight, H E; Bhutta, Z; Chumlea, W C
2013-09-01
The primary aim of the INTERGROWTH-21st Project is to construct new, prescriptive standards describing optimal fetal and preterm postnatal growth. The anthropometric measurements include the head circumference, recumbent length, and weight of the infants, and the stature and weight of the parents. In such a large, international, multicentre project, it is critical that all study sites follow standardised protocols to ensure maximal validity of the growth and nutrition indicators used. This paper describes, in detail, the selection of anthropometric personnel, equipment, and measurement and calibration protocols used to construct the new standards. Implementing these protocols at each study site ensures that the anthropometric data are of the highest quality for constructing the international standards. © 2013 Royal College of Obstetricians and Gynaecologists.
Lippert, Dylan; Hoffman, Matthew R; Dang, Phat; McMurray, J Scott; Heatley, Diane; Kille, Tony
2014-12-01
To analyze the safety of a standardized pediatric tracheostomy care protocol in the immediate postoperative period and its impact on tracheostomy related complications. Retrospective case series. Pediatric patients undergoing tracheotomy from February 2010-February 2014. In 2012, a standardized protocol was established regarding postoperative pediatric tracheostomy care. This protocol included securing newly placed tracheostomy tubes using a foam strap with hook and loop fastener rather than twill ties, placing a fresh drain sponge around the tracheostomy tube daily, and performing the first tracheostomy tube change on postoperative day 3 or 4. Outcome measures included rate of skin breakdown and presence of a mature stoma allowing for a safe first tracheostomy tube change. Two types of tracheotomy were performed based on patient age: standard pediatric tracheotomy and adult-style tracheotomy with a Bjork flap. Patients were analyzed separately based on age and the type of tracheotomy performed. Thirty-seven patients in the pre-protocol group and 35 in the post-protocol group were analyzed. The rate of skin breakdown was significantly lower in the post-protocol group (standard: p=0.0048; Bjork flap: p=0.0003). In the post-protocol group, all tube changes were safely accomplished on postoperative day three or four, and the stomas were deemed to be adequately matured to do so in all cases. A standardized postoperative pediatric tracheostomy care protocol resulted in decreased rates of skin breakdown and demonstrated that pediatric tracheostomy tubes can be safely changed as early as 3 days postoperatively. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
PA.NET International Quality Certification Protocol for blood pressure monitors.
Omboni, Stefano; Costantini, Carlo; Pini, Claudio; Bulegato, Roberto; Manfellotto, Dario; Rizzoni, Damiano; Palatini, Paolo; O'Brien, Eoin; Parati, Gianfranco
2008-10-01
Although standard validation protocols provide assurance of the accuracy of blood pressure monitors (BPMs), there is no guidance for the consumer as to the overall quality of a device. The PA.NET International Quality Certification Protocol, developed by the Association for Research and Development of Biomedical Technologies and for Continuing Medical Education (ARSMED), a nonprofit organization, with the support of the Italian Society of Hypertension-Italian Hypertension League and the dabl Educational Trust, denotes additional criteria of quality for BPMs that have fulfilled basic validation criteria published in full in peer-reviewed medical journals. The certification comprises three phases: (i) determining that the device fulfilled standard validation criteria; (ii) determining the technical and functional characteristics of the device (e.g. operability, display dimensions, accessory functions, memory availability, etc.); and (iii) determining the commercial characteristics (e.g. price-quality ratio, after-sale service, guarantee, etc.). At the end of the certification process, ARSMED assigns a quality index to the device, based on a scale ranging from 1 to 100, and a quality seal with four grades (bronze, silver, gold, and diamond) according to the achieved score. The seal is identified by a unique alphanumeric code and may be used on the packaging of the appliance or in advertising. A quality certification is released to the manufacturer and published on www.pressionearteriosa.net and www.dableducational.org. The PA.NET International Quality Certification Protocol represents the first attempt to provide health care personnel and consumers with an independent and objective assessment of BPMs based on their quality.
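The final scoring step, a 1-100 quality index mapped onto four seal grades, can be sketched as a simple threshold table. The cut-off scores below are hypothetical: the abstract does not state where the bronze/silver/gold/diamond boundaries lie.

```python
# Sketch of mapping a 1-100 quality index to a seal grade.
# The thresholds are assumptions for illustration, not the PA.NET cut-offs.
GRADES = [(90, "diamond"), (75, "gold"), (60, "silver"), (0, "bronze")]

def quality_seal(score):
    """Return the seal grade for a quality index between 1 and 100."""
    if not 1 <= score <= 100:
        raise ValueError("quality index must be between 1 and 100")
    for cutoff, grade in GRADES:
        if score >= cutoff:
            return grade
```

Keeping the table ordered from highest cutoff down makes the first match the correct grade, so adding or retuning a grade is a one-line change.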
Operating systems and network protocols for wireless sensor networks.
Dutta, Prabal; Dunkels, Adam
2012-01-13
Sensor network protocols exist to satisfy the communication needs of diverse applications, including data collection, event detection, target tracking, and control. Network protocols to enable these services are constrained by the extreme resource scarcity of sensor nodes, including energy, computing, communications, and storage, which must be carefully managed and multiplexed by the operating system. These challenges have led to new protocols and operating systems that are efficient in their energy consumption, careful in their computational needs, and miserly in their memory footprints, all while discovering neighbours, forming networks, delivering data, and correcting failures.
Comparison of test protocols for standard room/corner tests
R. H. White; M. A. Dietenberger; H. Tran; O. Grexa; L. Richardson; K. Sumathipala; M. Janssens
1998-01-01
As part of international efforts to evaluate alternative reaction-to-fire tests, several series of room/corner tests have been conducted. This paper reviews the overall results of related projects in which different test protocols for standard room/corner tests were used. Differences in the test protocols involved two options for the ignition burner scenario and whether...
NASA Astrophysics Data System (ADS)
Sklavos, N.; Selimis, G.; Koufopavlou, O.
2005-01-01
The explosive growth of the Internet and consumer demand for mobility have fuelled the exponential growth of wireless communications and networks. Mobile users want access to services and information, from both the Internet and personal devices, from a range of locations without the use of a cable medium. IEEE 802.11 is one of the most widely used wireless standards today. The degree of access and mobility in wireless networks requires a security infrastructure that protects communication within the network. The security of this protocol is based on the wired equivalent privacy (WEP) scheme. Currently, all IEEE 802.11 market products support WEP, but the 802.11i working group has introduced the advanced encryption standard (AES) as the security scheme for future IEEE 802.11 applications. In this paper, the hardware integrations of WEP and AES are studied. A field-programmable gate array (FPGA) device has been used as the hardware implementation platform, for a fair comparison between the two security schemes. Measurements of FPGA implementation cost, operating frequency, power consumption, and performance are given.
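WEP's confidentiality rests on the RC4 stream cipher, whose tiny state (a 256-byte permutation and two indices) is one reason WEP hardware is so much cheaper than AES. The sketch below implements RC4's key scheduling and keystream generation; WEP's practical weakness comes from how it reuses per-packet IVs with this cipher, not from the cipher's size.

```python
# RC4 as used inside WEP: key-scheduling (KSA) then keystream XOR (PRGA).
def rc4(key: bytes, data: bytes) -> bytes:
    # Key-scheduling algorithm: permute S under the key
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm, XORed with the data
    out = bytearray()
    i = j = 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

cipher = rc4(b"Key", b"Plaintext")   # classic published RC4 test vector
```

Because encryption and decryption are the same XOR, applying `rc4` twice with the same key recovers the plaintext, which is also why IV reuse leaks keystream.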
Reliable multicast protocol specifications protocol operations
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd; Whetten, Brian
1995-01-01
This appendix contains the complete state tables for Reliable Multicast Protocol (RMP) Normal Operation, Multi-RPC Extensions, Membership Change Extensions, and Reformation Extensions. First the event types are presented. Afterwards, each RMP operation state, normal and extended, is presented individually and its events shown. Events in the RMP specification are one of several things: (1) arriving packets, (2) expired alarms, (3) user events, (4) exceptional conditions.
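The appendix's per-state event tables lend themselves to a table-driven implementation: a map from (state, event) pairs to next states, with unmatched pairs handled as exceptional conditions. The sketch below uses the four event classes named above; the states and transitions are simplified placeholders, not the actual RMP tables.

```python
from enum import Enum, auto

# The four RMP event classes from the specification.
class Event(Enum):
    PACKET_ARRIVED = auto()   # arriving packets
    ALARM_EXPIRED = auto()    # expired alarms
    USER_EVENT = auto()       # user events
    EXCEPTION = auto()        # exceptional conditions

# Transition table: (state, event) -> next state. Entries are illustrative.
TABLE = {
    ("normal", Event.PACKET_ARRIVED): "normal",
    ("normal", Event.ALARM_EXPIRED): "reformation",
    ("normal", Event.USER_EVENT): "membership_change",
    ("reformation", Event.PACKET_ARRIVED): "normal",
}

def step(state, event):
    """Look up the transition; unknown pairs fall through to an error state."""
    return TABLE.get((state, event), "error")
```

Encoding the state tables as data rather than nested conditionals keeps the implementation auditable against the specification: each table row corresponds to one cell of the published state tables.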
Auer, Jorg A; Goodship, Allen; Arnoczky, Steven; Pearce, Simon; Price, Jill; Claes, Lutz; von Rechenberg, Brigitte; Hofmann-Amtenbrinck, Margarethe; Schneider, Erich; Müller-Terpitz, R; Thiele, F; Rippe, Klaus-Peter; Grainger, David W
2007-01-01
Background: In an attempt to establish some consensus on the proper use and design of experimental animal models in musculoskeletal research, AOVET (the veterinary specialty group of the AO Foundation), in concert with the AO Research Institute (ARI) and the European Academy for the Study of Scientific and Technological Advance, convened a group of musculoskeletal researchers, veterinarians, legal experts, and ethicists to discuss, in a frank and open forum, the use of animals in musculoskeletal research. Methods: The group narrowed the field to fracture research. The consensus opinion resulting from this workshop can be summarized as follows. Results & Conclusion: Anaesthesia and pain management protocols for research animals should follow standard protocols applied in clinical work for the species involved; this will improve morbidity and mortality outcomes. A database should be established to facilitate selection of anaesthesia and pain management protocols for specific experimental surgical procedures and adopted as an International Standard (IS) according to the animal species selected. A list of 10 golden rules and requirements for the conduct of animal experiments in musculoskeletal research was drawn up, comprising: 1) intelligent study designs to receive appropriate answers; 2) minimal complication rates (5% to a maximum of 10%); 3) defined end-points for both welfare and scientific outputs, analogous to quality assessment (QA) audit of protocols in GLP studies; 4) sufficient details for the materials and methods applied; 5) potentially confounding variables (genetic background; seasonal, hormonal, size, histological, and biomechanical differences); 6) post-operative management with emphasis on analgesia and follow-up examinations; 7) study protocols that satisfy criteria established for a "justified animal study"; 8) surgical expertise to conduct surgery on animals; 9) pilot studies as a critical part of model validation and powering of the definitive study design; and 10) criteria for funding agencies to include requirements related to animal experiments as part of the overall scientific proposal review protocols. Such agencies are also encouraged to seriously consider and adopt the recommendations described here when awarding funds for specific projects. Specific new requirements and mandates related both to improving the welfare and the scientific rigour of animal-based research models are urgently needed as part of the international harmonization of standards. PMID:17678534
Methodology for analyzing environmental quality indicators in a dynamic operating room environment.
Gormley, Thomas; Markel, Troy A; Jones, Howard W; Wagner, Jennifer; Greeley, Damon; Clarke, James H; Abkowitz, Mark; Ostojic, John
2017-04-01
Sufficient quantities of quality air and controlled, unidirectional flow are important elements in providing a safe building environment for operating rooms. To make dynamic assessments of an operating room environment, a validated method was needed for testing the multiple factors influencing air quality in health care settings: temperature, humidity, particle load, number of microbial contaminants, pressurization, air velocity, and air distribution. The team coined the term environmental quality indicators (EQIs) to describe the overall air quality based on measurements of these properties taken during mock surgical procedures. These indicators were measured at 3 different hospitals during mock surgical procedures that simulated actual operating room conditions. EQIs included microbial assessments at the operating table and the back instrument table and real-time analysis of particle counts at 9 defined locations in the operating suites. Air velocities were measured at the face of the supply diffusers, at the sterile field, at the back table, and at a return grille. The testing protocol provided consistent and comparable measurements of air quality indicators between institutions. At 20 air changes per hour (ACH) and an average temperature of 66.3°F, the median microbial load for the 3 operating room sites ranged from 3-22 colony-forming units (CFU)/m³ at the sterile field and 5-27 CFU/m³ at the back table. At 20 ACH, the median levels of 0.5-µm particles at the 3 sites were 85,079, 85,325, and 912,232 particles per cubic meter, with a predictable increase in particle load at the operating room site without high-efficiency particulate air filtration. Using a comparison with cleanroom standards, the microbial and particle counts in all 3 operating rooms were equivalent to International Organization for Standardization (ISO) classes 7 and 8 during the mock surgical procedures. The EQI protocol was measurable and repeatable and can therefore be safely used to evaluate air quality within the health care environment and to provide guidance for operational practices and regulatory requirements. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
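The ISO classification step can be reproduced from the ISO 14644-1 class-limit formula: the maximum count of particles of size D (µm) and larger for class N is 10^N × (0.1/D)^2.08 particles/m³. A sketch classifying the reported 0.5-µm medians:

```python
# ISO 14644-1 cleanroom class limits and a classifier for measured counts.
def iso_class_limit(n_class, size_um):
    """Maximum allowed particles/m^3 at the given size for ISO class n."""
    return 10 ** n_class * (0.1 / size_um) ** 2.08

def lowest_iso_class(count_per_m3, size_um=0.5):
    """Smallest ISO class whose limit the measured count satisfies."""
    for n in range(1, 10):
        if count_per_m3 <= iso_class_limit(n, size_um):
            return n
    return None

# Median 0.5-um counts reported for the three operating room sites:
sites = [85_079, 85_325, 912_232]
classes = [lowest_iso_class(c) for c in sites]
```

Running this yields classes 7, 7, and 8 for the three sites, matching the abstract's conclusion: the ISO 7 limit at 0.5 µm is roughly 352,000 particles/m³, which the first two sites meet and the unfiltered site exceeds.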
A Mobility-Aware QoS Signaling Protocol for Ambient Networks
NASA Astrophysics Data System (ADS)
Jeong, Seong-Ho; Lee, Sung-Hyuck; Bang, Jongho
Mobility-aware quality of service (QoS) signaling is crucial to provide seamless multimedia services in ambient environments where mobile nodes may move frequently between different wireless access networks. The mobility of an IP-based node in ambient networks affects routing paths and, as a result, can have a significant impact on the operation and state management of QoS signaling protocols. In this paper, we first analyze the impact of mobility on QoS signaling protocols and how the protocols operate in mobility scenarios. We then propose an efficient mobility-aware QoS signaling protocol that can operate adaptively in ambient networks. The key features of the protocol are the fast discovery of a crossover node, where the old and new paths converge or diverge due to handover, and localized state management for seamless services. Our analytical and simulation/experimental results show that the proposed protocol, as implemented, performs better than existing protocols in the IP-based mobile environment.
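The crossover-node idea can be illustrated over node-ID lists: after a handover, the crossover node is the last router the old and new paths share, so QoS state only needs to be refreshed on the segment beyond it. The paths and node names below are hypothetical.

```python
# Sketch of crossover-node discovery after handover: find the last node the
# old and new routing paths share, walking from the fixed (correspondent) end.
def crossover_node(old_path, new_path):
    """Return the last common node of two paths, or None if they share none."""
    cross = None
    for a, b in zip(old_path, new_path):
        if a != b:
            break
        cross = a
    return cross

old = ["CN", "R1", "R2", "AR_old", "MN"]   # path before handover
new = ["CN", "R1", "R3", "AR_new", "MN"]   # path after handover
cross = crossover_node(old, new)
```

Here the crossover node is "R1": reservation state on "R2" and "AR_old" can be torn down and only the "R3"/"AR_new" segment re-signaled, which is the localized state management the protocol exploits.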
Stephens, John R; Liles, E Allen; Dancel, Ria; Gilchrist, Michael; Kirsch, Jonathan; DeWalt, Darren A
2014-04-01
Clinicians caring for patients seeking alcohol detoxification face many challenges, including a lack of evidence-based guidelines for treatment and high recidivism rates. Our aims were to develop a standardized protocol for determining which alcohol-dependent patients seeking detoxification need inpatient versus outpatient treatment, and to study the protocol's implementation. The methods comprised a review of the best evidence by an ad hoc task force, subsequent creation of a standardized protocol, and a prospective observational evaluation of the initial protocol implementation in patients presenting for alcohol detoxification. Outcome measures were the number of admissions per month with a primary alcohol-related diagnosis-related group (DRG), the 30-day readmission rate, and length of stay, all measured before and after protocol implementation. We identified one randomized clinical trial and three cohort studies to inform the choice of inpatient versus outpatient detoxification, along with one prior protocol in this population, and combined those data with clinical experience to create an institutional protocol. After implementation, the average number of alcohol-related admissions was 15.9 per month, compared with 18.9 per month before implementation (p = 0.037). There was no difference in readmission rate or length of stay. Creation and utilization of a protocol led to standardization of care for patients requesting detoxification from alcohol, and initial evaluation of the protocol's implementation showed a decrease in the number of admissions.
Data dissemination using gossiping in wireless sensor networks
NASA Astrophysics Data System (ADS)
Medidi, Muralidhar; Ding, Jin; Medidi, Sirisha
2005-06-01
Disseminating data among sensors is a fundamental operation in energy-constrained wireless sensor networks. We present a gossip-based adaptive protocol for data dissemination to improve the energy efficiency of this operation. To overcome the data implosion problems associated with dissemination, our protocol uses meta-data to name the data with high-level data descriptors and negotiation to eliminate redundant transmissions of duplicate data in the network. Further, we adapt the gossiping to data aggregation possibilities in sensor networks. We simulated our data dissemination protocol and compared it to the SPIN protocol. We find that our protocol improves energy consumption by about 20% over SPIN, while significantly improving on the data dissemination rate of plain gossiping.
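The combination of meta-data negotiation (ADV/REQ/DATA, as in SPIN) with probabilistic gossip can be sketched in a few lines: a holder advertises only a data descriptor to a random subset of neighbours, and the full payload is sent only to those that request it. The node names, item descriptor, and gossip probability below are illustrative, not the paper's simulation parameters.

```python
import random

# One round of gossip with meta-data negotiation: advertise a descriptor,
# transmit DATA only to neighbours that do not already hold the item.
def gossip_round(nodes, holder, item, p_gossip=0.7, rng=random.Random(42)):
    """`nodes` maps node id -> set of held item descriptors; returns DATA count."""
    transmissions = 0
    for neighbour in nodes:
        if neighbour == holder:
            continue
        if rng.random() > p_gossip:    # gossip: contact only some neighbours
            continue
        # ADV carries only the descriptor; a REQ comes back only if missing
        if item not in nodes[neighbour]:
            nodes[neighbour].add(item)  # DATA sent on request
            transmissions += 1
    return transmissions

nodes = {"A": {"temp#1"}, "B": set(), "C": {"temp#1"}, "D": set()}
sent = gossip_round(nodes, holder="A", item="temp#1")
```

Node "C" already holds `temp#1`, so it never requests the payload: that suppressed transmission is exactly the implosion-avoidance the negotiation buys over blind flooding.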
Ho, Lisa M; Nelson, Rendon C; Delong, David M
2007-05-01
To prospectively evaluate the use of lean body weight (LBW) as the main determinant of the volume and rate of contrast material administration during multi-detector row computed tomography of the liver. This HIPAA-compliant study had institutional review board approval, and all patients gave written informed consent. Four protocols were compared. The standard protocol involved 125 mL of iopamidol injected at 4 mL/sec. The total body weight (TBW) protocol involved 0.7 g of iodine per kilogram of TBW. The calculated LBW and measured LBW protocols involved 0.86 g of iodine per kilogram of calculated or measured LBW for men and 0.92 g for women. The injection rate for the three experimental protocols was set proportionally on the basis of the calculated volume of contrast material. Postcontrast attenuation measurements during the portal venous phase were obtained in the liver, portal vein, and aorta for each group and were summed for each patient. Patient-to-patient enhancement variability within groups was measured with the Levene test, and a two-tailed t test was used to compare the three experimental protocols with the standard protocol. Data analysis was performed in 101 patients (25 or 26 patients per group), including 56 men and 45 women (mean age, 53 years). Average summed attenuation values for the standard, TBW, calculated LBW, and measured LBW protocols were 419 HU ± 50 (standard deviation), 443 HU ± 51, 433 HU ± 50, and 426 HU ± 33, respectively (P = not significant for all). Levene test results for the summed attenuation data were 40 ± 29, 38 ± 33 (P = .83), 35 ± 35 (P = .56), and 26 ± 19 (P = .05), respectively. By excluding highly variable but poorly perfused adipose tissue from the calculation of the contrast medium dose, the measured LBW protocol may lessen patient-to-patient enhancement variability while maintaining satisfactory hepatic and vascular enhancement.
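The dosing arithmetic behind the weight-based protocols can be sketched as below. The iodine doses follow the abstract (0.7 g I/kg TBW; 0.86 or 0.92 g I/kg LBW for men or women); the iodine concentration of 370 mg I/mL and the assumption that the injection duration is held at the standard protocol's 31.25 s (so rate scales with volume) are illustrative assumptions, not stated in the abstract.

```python
# Weight-based contrast dosing sketch. ASSUMPTIONS: 370 mg I/mL iopamidol
# and a fixed injection duration equal to the standard protocol's.
IODINE_MG_PER_ML = 370.0
STANDARD_VOLUME_ML = 125.0
STANDARD_RATE_ML_S = 4.0
DURATION_S = STANDARD_VOLUME_ML / STANDARD_RATE_ML_S   # 31.25 s

def contrast_protocol(weight_kg, grams_iodine_per_kg):
    """Return (volume in mL, injection rate in mL/s) for a weight-based dose."""
    volume_ml = weight_kg * grams_iodine_per_kg * 1000.0 / IODINE_MG_PER_ML
    rate_ml_s = volume_ml / DURATION_S   # rate proportional to volume
    return volume_ml, rate_ml_s

# e.g. a 75-kg patient on the TBW protocol (0.7 g I/kg):
vol, rate = contrast_protocol(75.0, 0.7)
```

Because both volume and rate scale linearly with the dosed weight, iodine delivery rate per second is constant across patients, which is what keeps enhancement comparable while the total dose varies.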
Reliability of the individual components of the Canadian Armed Forces Physical Employment Standard.
Stockbrugger, Barry G; Reilly, Tara J; Blacklock, Rachel E; Gagnon, Patrick J
2018-01-29
This investigation recruited 24 participants from both the Canadian Armed Forces (CAF) and civilian populations to complete 4 separate best-effort trials of each of the 4 components of the CAF Physical Employment Standard, the FORCE Evaluation (Fitness for Operational Requirements of CAF Employment). Analyses were performed to examine the level of variability and reliability within each component. The results demonstrate that candidates should be provided with at least 1 retest if they have recently completed at least 2 previous best-effort attempts as per the protocol. In addition, the minimal detectable difference is given for each of the 4 components in seconds, which identifies the threshold for subsequent action, either retest or remedial training, for those unable to meet the minimum standard. These results will inform the delivery of this employment standard and function as a method of accommodation, in addition to providing direction for physical training programs.
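A minimal detectable difference is conventionally derived from the standard error of measurement (SEM): SEM = SD × √(1 − ICC), and MDD = 1.96 × √2 × SEM at 95% confidence. The sketch below uses those standard formulas; the SD and ICC values are hypothetical, not the FORCE Evaluation results.

```python
import math

# Reliability arithmetic: SEM and minimal detectable difference (MDD).
def sem(sd_between_trials, icc):
    """Standard error of measurement from trial SD and reliability (ICC)."""
    return sd_between_trials * math.sqrt(1.0 - icc)

def minimal_detectable_difference(sd_between_trials, icc, z=1.96):
    """Smallest change (same units as SD) exceeding measurement noise."""
    return z * math.sqrt(2.0) * sem(sd_between_trials, icc)

# e.g. a timed component with between-trial SD = 6.0 s and ICC = 0.90:
mdd = minimal_detectable_difference(6.0, 0.90)   # about 5.3 s
```

A candidate whose retest improves by less than the MDD has not demonstrably changed, which is why the MDD, not the raw time difference, sets the retest-versus-remedial-training threshold.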
Guidelines on Good Clinical Laboratory Practice
Ezzelle, J.; Rodriguez-Chavez, I. R.; Darden, J. M.; Stirewalt, M.; Kunwar, N.; Hitchcock, R.; Walter, T.; D’Souza, M. P.
2008-01-01
A set of Good Clinical Laboratory Practice (GCLP) standards that embraces both the research and clinical aspects of GLP were developed utilizing a variety of collected regulatory and guidance material. We describe eleven core elements that constitute the GCLP standards with the objective of filling a gap for laboratory guidance, based on IND sponsor requirements, for conducting laboratory testing using specimens from human clinical trials. These GCLP standards provide guidance on implementing GLP requirements that are critical for laboratory operations, such as performance of protocol-mandated safety assays, peripheral blood mononuclear cell processing and immunological or endpoint assays from biological interventions on IND-registered clinical trials. The expectation is that compliance with the GCLP standards, monitored annually by external audits, will allow research and development laboratories to maintain data integrity and to provide immunogenicity, safety, and product efficacy data that is repeatable, reliable, auditable and that can be easily reconstructed in a research setting. PMID:18037599
Kwon, Sung Woo; Kim, Young Jin; Shim, Jaemin; Sung, Ji Min; Han, Mi Eun; Kang, Dong Won; Kim, Ji-Ye; Choi, Byoung Wook; Chang, Hyuk-Jae
2011-04-01
To evaluate the prognostic outcome of cardiac computed tomography (CT) for prediction of major adverse cardiac events (MACEs) in low-risk patients suspected of having coronary artery disease (CAD) and to explore the differential prognostic values of coronary artery calcium (CAC) scoring and coronary CT angiography. Institutional review committee approval and informed consent were obtained. In 4338 patients who underwent 64-section CT for evaluation of suspected CAD, both CAC scoring and CT angiography were concurrently performed by using standard scanning protocols. Follow-up clinical outcome data regarding composite MACEs were procured. Multivariable Cox proportional hazards models were developed to predict MACEs. Risk-adjusted models incorporated traditional risk factors for CAC scoring and coronary CT angiography. During the mean follow-up of 828 days ± 380, there were 105 MACEs, for an event rate of 3%. The presence of obstructive CAD at coronary CT angiography had independent prognostic value, which escalated according to the number of stenosed vessels (P < .001). In the receiver operating characteristic curve (ROC) analysis, the superiority of coronary CT angiography to CAC scoring was demonstrated by a significantly greater area under the ROC curve (AUC) (0.892 vs 0.810, P < .001), whereas no significant incremental value for the addition of CAC scoring to coronary CT angiography was established (AUC = 0.892 for coronary CT angiography alone vs 0.902 with addition of CAC scoring, P = .198). Coronary CT angiography is better than CAC scoring in predicting MACEs in low-risk patients suspected of having CAD. Furthermore, the current standard multisection CT protocol (coronary CT angiography combined with CAC scoring) has no incremental prognostic value compared with coronary CT angiography alone. Therefore, in terms of determining prognosis, CAC scoring may no longer need to be incorporated in the cardiac CT protocol in this population. © RSNA, 2011.
NASA Astrophysics Data System (ADS)
Hayes, M.
2014-12-01
The IMBECS Protocol concept employs large cultivation and biorefinery installations within the five Subtropical Convergence Zones (STCZs) to support the production of commodities such as carbon-negative biofuels, seafood, organic fertilizer, polymers, and freshwater, as a flexible and cost-effective means of Global Warming Mitigation (GWM), with the primary objective being the global-scale replacement of fossil fuels (FF). This governance approach is categorically distinct from all other large-scale GWM governance concepts, yet many current marine-related GWM technologies are adaptable to this proposal. The IMBECS technology would be managed by an intergovernmentally sanctioned non-profit foundation with the following functions/mission: synthesize relevant treaty language; perform R&D activities and purchase relevant patents; under intergovernmental commission, act as the primary responsible international actor for environmental standards, production quotas, and operational integrity; license technology to for-profit actors under strict production/environmental standards; enforce production and environmental standards along with production quotas; provide a high level of transparency to all stakeholders; and provide legal defence. The IMBECS Protocol is conceptually related to the work found in the following documents/links (this list is not exhaustive): Climate Change Geoengineering; The Science and Politics of Global Climate Change: A Guide to the Debate; the IPCC Special Report on Renewable Energy and Climate Change Mitigation; the DoE Roadmap for Algae Biofuels; PodEnergy (Ocean Agronomy development leaders and progenitor of this proposal); Artificial Upwelling of Deep Seawater Using the Perpetual Salt Fountain for Cultivation of Ocean Desert; NASA's OMEGA study; Cool Planet (a land-based carbon-negative biofuel concept); Cellana (a leading developer of algae-based bioproducts);
The State of World Fisheries and Aquaculture; Mariculture: A Global Analysis of Production Trends since 1950; BECCS/biochar/olivine; the UNFCCC/IMO/CBD; and The President's Climate Action Plan. The conclusion of this analysis calls for funding of an investigational deployment of the relevant technologies for an open evaluation at the intergovernmental level.
Promoting Robust Design of Diode Lasers for Space: A National Initiative
NASA Technical Reports Server (NTRS)
Tratt, David M.; Amzajerdian, Farzin; Kashem, Nasir B.; Shapiro, Andrew A.; Mense, Allan T.
2007-01-01
The Diode-laser Array Working Group (DAWG) is a national-level consumer/provider forum for discussion of engineering and manufacturing issues that influence the reliability and survivability of high-power broad-area laser diode devices in space, with an emphasis on laser diode arrays (LDAs) for optical pumping of solid-state laser media. The goals of the group are to formulate and validate standardized test and qualification protocols, operational control recommendations, and consensus manufacturing and certification standards. The group is using reliability and lifetime data collected by laser diode manufacturers and the user community to develop a set of standardized guidelines for specifying and qualifying laser diodes for long-duration operation in space, the ultimate goal being to promote an informed U.S. Government investment and procurement strategy for assuring the availability and durability of space-qualified LDAs. The group is also working to establish effective implementation of statistical design techniques at the supplier design, development, and manufacturing levels to help reduce product performance variability and improve product reliability for diodes employed in space applications.
Improving generalized inverted index lock wait times
NASA Astrophysics Data System (ADS)
Borodin, A.; Mirvoda, S.; Porshnev, S.; Ponomareva, O.
2018-01-01
Concurrent operation on tree-like data structures is a cornerstone of any database system. Concurrent operations are intended to improve read/write performance and are usually implemented via some form of locking. Deadlock-free methods of concurrency control are known as tree locking protocols. These protocols provide basic operations (verbs) and an algorithm (the order of operation invocations) for applying them to any tree-like data structure. The algorithms operate on data managed by a storage engine, and storage engines differ widely among RDBMS implementations. In this paper, we discuss a tree locking protocol implementation for the Generalized Inverted Index (GIN) applied to the multiversion concurrency control (MVCC) storage engine inside the PostgreSQL RDBMS. We then introduce improvements to the locking protocol and provide usage statistics from an evaluation of our improvement in a very-high-load environment at one of the world's largest IT companies.
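As a rough illustration of the discipline such tree locking protocols formalize, the sketch below shows "lock crabbing" on a toy tree: a descent acquires the child's lock before releasing the parent's, so locks are always taken in root-to-leaf order and two descents cannot deadlock. This is a generic textbook sketch, not PostgreSQL's actual GIN implementation.

```python
import threading

class Node:
    def __init__(self, key, children=None):
        self.key = key
        self.children = children or []
        self.lock = threading.Lock()

def descend(root, choose_child):
    """Walk root-to-leaf holding at most two locks at a time (parent + child)."""
    root.lock.acquire()
    node = root
    while node.children:
        child = choose_child(node)
        child.lock.acquire()      # lock the child first...
        node.lock.release()       # ...then release the parent ("crabbing")
        node = child
    node.lock.release()
    return node.key

leaf = Node("leaf")
root = Node("root", [Node("mid", [leaf])])
print(descend(root, lambda n: n.children[0]))
```

Because every descent acquires locks strictly top-down, any cycle in the waits-for graph is impossible, which is the essence of deadlock freedom in these protocols.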
Joint Air-to-Surface Standoff Missile (JASSM)
2015-12-01
Design of a prosumer EMS for energy trading
NASA Astrophysics Data System (ADS)
Hwang, T.; Yoo, Y.; Kang, S.; Lee, I.
2018-03-01
We design a DER management system for energy trading based on the OASIS Energy Interoperation (EI) and Energy Market Information Exchange (EMIX) standards. With the spread of distributed energy resources (DERs), there is a growing need for a system that provides integrated management of DERs and customer loads. In this paper, we give a brief overview of a DER energy management system (EMS) for prosumer energy saving and trading. Based on the OASIS standards, we design a functional architecture of a DER EMS for energy trading. After presenting the communication protocols and operation sequences, we summarize our work.
1983-12-01
Initializes the data tables shared by both the Local and Network Operating Systems. 3. Invint: Written in Assembly Language. Initializes the Input/Output...connection with an appropriate type and grade of transport service and appropriate security authentication (Ref 6:38). Data Transfer within a session...V.; Kent, S. Security in Higher Level Protocols: Approaches, Alternatives and Recommendations, Draft Report ICST/HLNP-81-19, Washington, D.C.: Dept
2010-10-01
On shaky ground - A study of security vulnerabilities in control protocols
DOE Office of Scientific and Technical Information (OSTI.GOV)
Byres, E. J.; Huffman, D.; Kube, N.
2006-07-01
The recent introduction of information technologies such as Ethernet into nuclear industry control devices has resulted in significantly less isolation from the outside world. This raises the question of whether these systems could be attacked by malware, network hackers, or professional criminals to disrupt critical operations in a manner similar to the impacts now felt in the business world. To help answer this question, a study was undertaken to test a representative control protocol to determine whether it had vulnerabilities that could be exploited. A framework was created in which a tester could express a large number of test cases in a very compact formal language. This, in turn, allowed for the economical automation of both the generation of selectively malformed protocol traffic and the measurement of the device under test's (DUT) behavior in response to this traffic. Approximately 5000 protocol conformance tests were run against two major brands of industrial controller. More than 60 categories of errors were discovered, the majority of which were incorrect error responses to malformed traffic. Several malformed packets, however, caused the device to respond or communicate in inappropriate ways. These would be relatively simple for an attacker to inject into a system and could result in the plant operator losing complete view or control of the control device. Based on this relatively small set of devices, we believe that the nuclear industry urgently needs to adopt better security robustness testing of control devices as standard practice. (authors)
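The study's test framework is the authors' own, but its core idea, expressing many malformed-traffic cases compactly by mutating one field of an otherwise valid frame, can be sketched generically. The frame layout below (1-byte function code, 2-byte big-endian length, payload) is a hypothetical toy protocol, not the protocol actually tested.

```python
import struct

def build_frame(func, payload):
    """A well-formed frame: function code, claimed payload length, payload."""
    return struct.pack(">BH", func, len(payload)) + payload

def length_mutations(func, payload):
    """One test case per interesting (claimed length vs. actual length) mismatch."""
    cases = []
    for claimed in (0, len(payload) - 1, len(payload) + 1, 0xFFFF):
        frame = struct.pack(">BH", func, claimed & 0xFFFF) + payload
        cases.append((f"claimed_len={claimed}", frame))
    return cases

valid = build_frame(0x03, b"\x00\x01")
tests = length_mutations(0x03, b"\x00\x01")
print(len(tests), "malformed frames generated from one template")
```

A conformance harness in this style then sends each mutated frame to the DUT and records whether the response is the correct error, an incorrect error, or a failure mode such as a crash or hang.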
Activities report of PTT Research
NASA Astrophysics Data System (ADS)
In the field of postal infrastructure research, activities were performed on postcode readers, radiolabels, and techniques of operations research and artificial intelligence. In the field of telecommunication, transportation, and information, research was conducted on multipurpose coding schemes, speech recognition, hypertext, a multimedia information server, security of electronic data interchange, document retrieval, improvement of the quality of user interfaces, domotics (living-support techniques), and standardization of telecommunication protocols. In the field of telecommunication infrastructure and provisions research, activities were performed on universal personal telecommunications, advanced broadband network technologies, coherent techniques, measurement of audio quality, near-field facilities, local beam communication, local area networks, network security, coupling of broadband and narrowband integrated services digital networks, digital mapping, and standardization of protocols.
The Interplanetary Internet: a communications infrastructure for Mars exploration.
Burleigh, Scott; Cerf, Vinton; Durst, Robert; Fall, Kevin; Hooke, Adrian; Scott, Keith; Weiss, Howard
2003-01-01
A strategy is being developed whereby the current set of internationally standardized space data communications protocols can be incrementally evolved so that a first version of an operational "Interplanetary Internet" is feasible by the end of the decade. This paper describes its architectural concepts, discusses the current set of standard space data communications capabilities that exist to support Mars exploration and reviews proposed new developments. We also speculate that these current capabilities can grow to support future scenarios where human intelligence is widely distributed across the Solar System and day-to-day communications dialog between planets is routine. © 2003 American Institute of Aeronautics and Astronautics. Published by Elsevier Science Ltd. All rights reserved.
The Interplanetary Internet: a communications infrastructure for Mars exploration
NASA Technical Reports Server (NTRS)
Burleigh, Scott; Cerf, Vinton; Durst, Robert; Fall, Kevin; Hooke, Adrian; Scott, Keith; Weiss, Howard
2003-01-01
A strategy is being developed whereby the current set of internationally standardized space data communications protocols can be incrementally evolved so that a first version of an operational "Interplanetary Internet" is feasible by the end of the decade. This paper describes its architectural concepts, discusses the current set of standard space data communications capabilities that exist to support Mars exploration and reviews proposed new developments. We also speculate that these current capabilities can grow to support future scenarios where human intelligence is widely distributed across the Solar System and day-to-day communications dialog between planets is routine. © 2003 American Institute of Aeronautics and Astronautics. Published by Elsevier Science Ltd. All rights reserved.
Gao, Chao; Zhang, Rui-Dong; Liu, Shu-Guang; Zhao, Xiao-Xi; Cui, Lei; Yue, Zhi-Xia; Li, Wei-Jing; Chen, Zhen-Ping; Li, Zhi-Gang; Rao, Qing; Wang, Min; Zheng, Hu-Yong; Wang, Jian-Xiang
2017-08-01
CREBBP alterations are associated with many diseases, including leukaemia. However, CREBBP expression and its clinical relevance in paediatric acute lymphoblastic leukaemia have not been elucidated. We studied CREBBP mRNA expression in 349 patients treated with either the BCH-2003 or the CCLG-2008 protocol. Using a receiver operating characteristic curve, patients were divided into low- and high-CREBBP groups. The association among clinicobiological characteristics, outcomes, and CREBBP level was analysed. Low expression of CREBBP (<1.0) at diagnosis was found in 97.7% of patients and increased significantly after complete remission. Low CREBBP expression was associated with unfavourable clinical presentation, poor prednisone response, and high minimal residual disease (>10⁻²) after induction. We found significantly poorer event-free survival (EFS) and overall survival (OS) in the low-CREBBP group whether treated with BCH-2003 or CCLG-2008. Low CREBBP expression was an independent prognostic factor for inferior outcome with BCH-2003; on the CCLG-2008 protocol, low-CREBBP patients had better outcomes on the intermediate-risk regimen than on the standard-risk regimen. Patients stratified to high risk with low CREBBP had the worst EFS and OS. These findings indicate that low CREBBP expression is predictive of unfavourable outcomes; thus, a more intensive treatment protocol is needed for standard-risk patients with insufficient CREBBP, and a specific targeted therapy is needed for high-risk patients. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Disentangling the nature of the nicotine stimulus.
Bevins, Rick A; Barrett, Scott T; Polewan, Robert J; Pittenger, Steven T; Swalve, Natashia; Charntikov, Sergios
2012-05-01
Learning involving interoceptive stimuli likely plays an important role in many diseases and psychopathologies. Within this area, there has been extensive research investigating the interoceptive stimulus effects of abused drugs. In this pursuit, behavioral pharmacologists have taken advantage of what is known about learning processes and adapted the techniques to investigate the behavioral and receptor mechanisms of drug stimuli. Of particular interest is the nicotine stimulus and the use of the two-lever operant drug discrimination task and the Pavlovian drug discriminated goal-tracking task. There is strong concordance between the two methods when using "standard" testing protocols that minimize learning on test days. For example, ABT-418, nornicotine, and varenicline all fully evoked nicotine-appropriate responding. Notably, research from our laboratory with the discriminated goal-tracking task has used an alternative testing protocol. This protocol assesses stimulus substitution based on how well extinction learning using a non-nicotine ligand transfers back to the nicotine stimulus. These findings challenge conclusions based on more "standard" testing procedures (e.g., ABT-418 is not nicotine-like). As a starting point, we propose Thurstone scaling as a quantitative method for more precisely comparing transfer of extinction across doses, experiments, and investigators. We close with a discussion of future research directions and potential implications of the research for understanding interoceptive stimuli. Copyright © 2011 Elsevier B.V. All rights reserved.
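Thurstone scaling, which the authors propose for comparing transfer of extinction across doses and experiments, can be sketched in its simplest Case V form: pairwise "preference" proportions are converted to z-scores and averaged per item. The proportion matrix below is invented for illustration and does not come from the paper.

```python
from statistics import NormalDist

def thurstone_case_v(P):
    """Case V scale values from P[i][j] = proportion of comparisons in which item j beat item i.

    A simple variant: average the z-transformed off-diagonal proportions per column.
    """
    z = NormalDist()
    n = len(P)
    scale = []
    for j in range(n):
        zs = [z.inv_cdf(P[i][j]) for i in range(n) if i != j]
        scale.append(sum(zs) / len(zs))
    return scale

# Hypothetical preference matrix for three stimuli (e.g., three ligand doses).
P = [[0.5, 0.7, 0.9],
     [0.3, 0.5, 0.8],
     [0.1, 0.2, 0.5]]
scores = thurstone_case_v(P)
print([round(s, 2) for s in scores])  # higher = more "nicotine-like" on this scale
```

The resulting scale values put all items on one interval scale, which is what allows transfer-of-extinction results to be compared quantitatively across doses, experiments, and laboratories.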
Web-based multi-channel analyzer
Gritzo, Russ E.
2003-12-23
The present invention provides an improved multi-channel analyzer designed to conveniently gather, process, and distribute spectrographic pulse data. The multi-channel analyzer may operate on a computer system having memory, a processor, and the capability to connect to a network and to receive digitized spectrographic pulses. The multi-channel analyzer may have a software module, integrated with a general-purpose operating system, that may receive digitized spectrographic pulses at rates of at least 10,000 pulses per second. The multi-channel analyzer may further have a user-level software module that may receive user-specified controls dictating the operation of the multi-channel analyzer, making the multi-channel analyzer customizable by the end user. The user-level software may further categorize and conveniently distribute spectrographic pulse data employing non-proprietary, standard communication protocols and formats.
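The core multi-channel analyzer operation described above, sorting digitized pulse heights into channels to accumulate a spectrum, can be sketched as follows. The channel count, ADC range, and simulated photopeak are illustrative choices, not values from the patent.

```python
import random

N_CHANNELS = 1024  # assumed spectrum resolution
ADC_MAX = 4096     # assumed 12-bit digitizer range

def accumulate(spectrum, pulse_heights):
    """Bin each digitized pulse height into its channel and count it."""
    for h in pulse_heights:
        channel = min(max(h, 0) * N_CHANNELS // ADC_MAX, N_CHANNELS - 1)
        spectrum[channel] += 1
    return spectrum

random.seed(1)
# Mock photopeak: 10,000 pulses centred at ADC count 2048.
pulses = [int(random.gauss(2048, 100)) for _ in range(10_000)]
spectrum = accumulate([0] * N_CHANNELS, pulses)
peak_channel = max(range(N_CHANNELS), key=spectrum.__getitem__)
print("peak near channel", peak_channel)
```

In the patented system this accumulation runs in the OS-integrated module at high pulse rates, while the user-level module handles configuration and distributes the resulting spectrum over standard network protocols.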
Phase Transition in Protocols Minimizing Work Fluctuations
NASA Astrophysics Data System (ADS)
Solon, Alexandre P.; Horowitz, Jordan M.
2018-05-01
For two canonical examples of driven mesoscopic systems—a harmonically trapped Brownian particle and a quantum dot—we numerically determine the finite-time protocols that optimize the compromise between the standard deviation and the mean of the dissipated work. In the case of the oscillator, we observe a collection of protocols that smoothly trade off between average work and its fluctuations. However, for the quantum dot, we find that as we shift the weight of our optimization objective from average work to work standard deviation, there is an analog of a first-order phase transition in protocol space: two distinct protocols exchange global optimality with mixed protocols akin to phase coexistence. As a result, the two types of protocols possess qualitatively different properties and remain distinct even in the infinite duration limit: optimal-work-fluctuation protocols never coalesce with the minimal-work protocols, which therefore never become quasistatic.
ASRM standard embryo transfer protocol template: a committee opinion.
Penzias, Alan; Bendikson, Kristin; Butts, Samantha; Coutifaris, Christos; Falcone, Tommaso; Fossum, Gregory; Gitlin, Susan; Gracia, Clarisa; Hansen, Karl; Mersereau, Jennifer; Odem, Randall; Rebar, Robert; Reindollar, Richard; Rosen, Mitchell; Sandlow, Jay; Vernon, Michael
2017-04-01
Standardization improves performance and safety. A template for standardizing the embryo transfer procedure is presented here with 12 basic steps supported by published scientific literature and a survey of common practice of SART programs; it can be used by ART practices to model their own standard protocol. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
O’Rourke, Timothy K.; Erbella, Alexander; Zhang, Yu
2017-01-01
Penile prosthesis implant surgery is an effective management approach for a number of urological conditions, including medication refractory erectile dysfunction (ED). Complications encountered post-operatively include infection, bleeding/hematoma, and device malfunction. Since the 1970s, modifications to these devices have reduced complication rates through improvement in antisepsis and design using antibiotic coatings, kink-resistant tubing, lock-out valves to prevent autoinflation, and modified reservoir shapes. Device survival and complication rates have been investigated predominately by retrospective database-derived studies. This review article focuses on the identification and management of post-operative complications following penile prosthetic and implant surgery. Etiology for ED, surgical technique, and prosthesis type are variable among studies. The most common post-operative complications of infection, bleeding, and device malfunction may be minimized by adherence to consistent technique and standard protocol. Novel antibiotic coatings and standard antibiotic regimen may reduce infection rates. Meticulous hemostasis and intraoperative testing of devices may further reduce need for revision surgery. Additional prospective studies with consistent reporting of outcomes and comparison of surgical approach and prosthesis type in patients with variable ED etiology would be beneficial. PMID:29238663
ACR/NEMA Digital Image Interface Standard (An Illustrated Protocol Overview)
NASA Astrophysics Data System (ADS)
Lawrence, G. Robert
1985-09-01
The American College of Radiology (ACR) and the National Electrical Manufacturers Association (NEMA) have sponsored a joint standards committee mandated to develop a universal interface standard for the transfer of radiology images among a variety of PACS imaging devices. The resulting standard interface conforms to the ISO/OSI standard reference model for network protocol layering. The standard interface specifies the lower layers of the reference model (Physical, Data Link, Transport, and Session) and implies a requirement for the Network layer should a network be required. The message content has been considered, and a flexible message and image format specified. The following imaging equipment modalities are supported by the standard interface: CT (computed tomography), DS (digital subtraction), NM (nuclear medicine), US (ultrasound), MR (magnetic resonance), and DR (digital radiology). The following data types are standardized over the transmission interface media: image data, digitized voice, header data, raw data, text reports, graphics, and others. This paper consists of text supporting the illustrated protocol data flow. Each layer is treated individually. Particular emphasis is given to the Data Link layer (frames) and the Transport layer (packets). The discussion utilizes a finite-state sequential machine model for the protocol layers.
An XML-Based Protocol for Distributed Event Services
NASA Technical Reports Server (NTRS)
Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)
2001-01-01
A recent trend in distributed computing is the construction of high-performance distributed systems called computational grids. One difficulty we have encountered is that there is no standard format for the representation of performance information and no standard protocol for transmitting this information. This limits the types of performance analysis that can be undertaken in complex distributed systems. To address this problem, we present an XML-based protocol for transmitting performance events in distributed systems and evaluate the performance of this protocol.
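A minimal sketch of what an XML-encoded performance event might look like is given below; the element and attribute names are hypothetical, since the paper defines its own schema.

```python
import xml.etree.ElementTree as ET

def make_event(name, host, timestamp, value):
    """Serialize one performance event as a small, self-describing XML document."""
    event = ET.Element("perfEvent", name=name, host=host)
    ET.SubElement(event, "timestamp").text = repr(timestamp)
    ET.SubElement(event, "value").text = repr(value)
    return ET.tostring(event, encoding="unicode")

msg = make_event("cpu.load", "grid-node-7", 1061234567.0, 0.82)
print(msg)
parsed = ET.fromstring(msg)  # any receiver with an XML parser can consume it
print(parsed.get("name"), parsed.find("value").text)
```

The appeal of such a text-based encoding is interoperability across heterogeneous grid components, at the cost of message size and parsing overhead, which is exactly the trade-off the paper's performance evaluation addresses.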
Kish, Mary Z
2014-10-01
The ability of a preterm infant to exclusively oral feed is a necessary standard for discharge readiness from the neonatal intensive care unit (NICU). Many of the interventions related to oral feeding advancement currently employed for preterm infants in the NICU are based on individual nursing observations and judgment. Studies involving standardized feeding protocols for oral feeding advancement have been shown to decrease variability in feeding practices, facilitate shortened transition times from gavage to oral feedings, improve bottle feeding performance, and significantly decrease the length of stay (LOS) in the NICU. This project critically evaluated the implementation of an oral feeding advancement protocol in a 74-bed level III NICU in an attempt to standardize the process of advancing oral feedings in medically stable preterm infants. A comprehensive review of the literature identified key features for successful oral feeding in preterm infants. Strong levels of evidence suggested an association between both nonnutritive sucking (NNS) opportunities and standardized feeding advancement protocols with successful oral feeding in preterm infants. These findings prompted a pilot practice change using a feeding advancement protocol consisting of NNS and standardized oral feeding advancement opportunities. Time to exclusive oral feedings and LOS were compared pre- and post-protocol implementation over an evaluation period of more than 2 months. Infants using NNS and the standardized oral feeding advancement protocol had an observed reduction in time to exclusive oral feedings and LOS, although statistical significance was not achieved.
Fault discovery protocol for passive optical networks
NASA Astrophysics Data System (ADS)
Hajduczenia, Marek; Fonseca, Daniel; da Silva, Henrique J. A.; Monteiro, Paulo P.
2007-06-01
All existing flavors of passive optical networks (PONs) provide an attractive alternative to legacy copper-based access lines deployed between a central office (CO) of the service provider (SP) and a customer site. One of the most challenging tasks for PON network planners is reducing the overall cost of protection schemes for the optical fiber plant while maintaining a reasonable level of survivability and reducing downtime, thus ensuring acceptable quality of service (QoS) for end subscribers. The recently growing volume of Ethernet PON deployment [Kramer, IEEE 802.3, CFI (2006)], combined with the low-cost electronic and optical components used in optical network unit (ONU) modules, means that remote detection of faulty/active subscriber modules becomes indispensable for proper operation of an EPON system. We address the problem of remotely detecting faulty ONUs when the upstream channel is flooded with continuous-wave (CW) transmission from one or more damaged ONUs and standard communication is severed, providing a solution applicable to any type of PON network, regardless of operating protocol, physical structure, and data rate.
Arthroscopic management of the painful total elbow arthroplasty
Bain, Gregory I
2015-01-01
Background Failure of total elbow arthroplasty is more common than after other major joint arthroplasties and is often a result of aseptic loosening, peri-prosthetic infection, fracture and instability. Infection can be a devastating complication, yet there are no established guidelines for the pre-operative diagnosis of total elbow peri-prosthetic infection. This is because pre-operative clinical, radiographic and biochemical tests are often unreliable. Methods Using three case examples, a standardized protocol for the clinical and arthroscopic assessment of the painful total elbow arthroplasty is described. This is used to provide a mechanical and microbiological diagnosis of the patient’s pain. Results There have been no complications resulting from the use of this technique in the three patients described, nor in any other patient to date. Conclusions The staged protocol described in the present study, utilizing arthroscopic assessment, has refined the approach to the painful total elbow arthroplasty because it directly influences the definitive surgical management of the patient. It is recommended that other surgeons follow the principles outlined in the present study when faced with this challenging problem. PMID:27583000
Development of a manualized protocol of massage therapy for clinical trials in osteoarthritis.
Ali, Ather; Kahn, Janet; Rosenberger, Lisa; Perlman, Adam I
2012-10-04
Clinical trial design of manual therapies may be especially challenging as techniques are often individualized and practitioner-dependent. This paper describes our methods in creating a standardized Swedish massage protocol tailored to subjects with osteoarthritis of the knee while respectful of the individualized nature of massage therapy, as well as implementation of this protocol in two randomized clinical trials. The manualization process involved a collaborative process between methodologic and clinical experts, with the explicit goals of creating a reproducible semi-structured protocol for massage therapy, while allowing some latitude for therapists' clinical judgment and maintaining consistency with a prior pilot study. The manualized protocol addressed identical specified body regions with distinct 30- and 60-min protocols, using standard Swedish strokes. Each protocol specifies the time allocated to each body region. The manualized 30- and 60-min protocols were implemented in a dual-site 24-week randomized dose-finding trial in patients with osteoarthritis of the knee, and is currently being implemented in a three-site 52-week efficacy trial of manualized Swedish massage therapy. In the dose-finding study, therapists adhered to the protocols and significant treatment effects were demonstrated. The massage protocol was manualized, using standard techniques, and made flexible for individual practitioner and subject needs. The protocol has been applied in two randomized clinical trials. This manualized Swedish massage protocol has real-world utility and can be readily utilized both in the research and clinical settings. Clinicaltrials.gov NCT00970008 (18 August 2009).
Using manufacturing message specification for monitor and control at Venus
NASA Technical Reports Server (NTRS)
Heuser, W. Randy; Chen, Richard L.; Stockett, Michael H.
1993-01-01
The flexibility and robustness of a monitor and control (M&C) system are a direct result of the underlying interprocessor communications architecture. A new architecture for M&C at the Deep Space Communications Complexes (DSCC's) has been developed based on the Manufacturing Message Specification (MMS) process control standard of the Open System Interconnection (OSI) suite of protocols. This architecture has been tested both in a laboratory environment and under operational conditions at the Deep Space Network (DSN) experimental Venus station (DSS-13). The Venus experience in the application of OSI standards to support M&C has been extremely successful. MMS meets the functional needs of the station and provides a level of flexibility and responsiveness previously unknown in that environment. The architecture is robust enough to meet current operational needs and flexible enough to provide a migration path for new subsystems. This paper will describe the architecture of the Venus M&C system, discuss how MMS was used and the requirements this imposed on other parts of the system, and provide results from systems and operational testing at the Venus site.
A Standardized Protocol for the Prospective Follow-Up of Cleft Lip and Palate Patients.
Salimi, Negar; Jolanta, Aleksejūnienė; Edwin, Yen; Angelina, Loo
2018-01-01
To develop a standardized all-encompassing protocol for the assessment of cleft lip and palate patients with clinical and research implications. Electronic database searches were conducted and 13 major cleft centers worldwide were contacted in order to prepare for the development of the protocol. In preparation, the available evidence was reviewed and potential fistula-related risk determinants from 4 different domains were identified. No standardized protocol for the assessment of cleft patients could be found in any of the electronic database searches that were conducted. Interviews with representatives from several major centers revealed that the majority of centers do not have a standardized comprehensive strategy for the reporting and follow-up of cleft lip and palate patients. The protocol was developed and consisted of the following domains of determinants: (1) the sociodemographic domain, (2) the cleft defect domain, (3) the surgery domain, and (4) the fistula domain. The proposed protocol has the potential to enhance the quality of patient care by ensuring that multiple patient-related aspects are consistently reported. It may also facilitate future multicenter research, which could contribute to the reduction of fistula occurrence in cleft lip and palate patients.
Telescience Resource Kit (TReK)
NASA Technical Reports Server (NTRS)
Lippincott, Jeff
2015-01-01
Telescience Resource Kit (TReK) is one of the Huntsville Operations Support Center (HOSC) remote operations solutions. It can be used to monitor and control International Space Station (ISS) payloads from anywhere in the world. It comprises a suite of software applications and libraries that provide generic data system capabilities and access to HOSC services. The TReK software has been operational since 2000. A new cross-platform version of TReK is under development, released in phases during the 2014-2016 timeframe. The TReK Release 3.x series is the original TReK software that has been operational since 2000. This software runs on Windows. It contains capabilities to support traditional telemetry and commanding using CCSDS (Consultative Committee for Space Data Systems) packets. The TReK Release 4.x series is the new cross-platform software. It runs on Windows and Linux. The new TReK software will support communication using standard IP protocols and traditional telemetry and commanding. All the software listed above is compatible and can be installed and run together on Windows. The new TReK software contains a suite of software that can be used by payload developers on the ground and onboard (TReK Toolkit). TReK Toolkit is a suite of lightweight libraries and utility applications for use onboard and on the ground. TReK Desktop is the full suite of TReK software and is most useful on the ground. When TReK Desktop is released, the TReK installation program will provide the option to choose just the TReK Toolkit portion of the software or the full TReK Desktop suite. The ISS program is providing the TReK Toolkit software as a generic flight software capability offered as a standard service to payloads. TReK software verification was conducted during the April/May 2015 timeframe. Payload teams using the TReK software onboard can reference the TReK software verification.
TReK will be demonstrated on-orbit running on an ISS-provided T61p laptop. Target timeframe: September 2015-2016. The on-orbit demonstration will collect benchmark metrics and will be used in the future to provide live demonstrations during ISS Payload Conferences. Benchmark metrics and demonstrations will address the protocols described in SSP 52050-0047 Ku Forward section 3.3.7. (Associated term: CCSDS File Delivery Protocol (CFDP)).
Klein, Jan; Teber, Dogu; Frede, Tom; Stock, Christian; Hruza, Marcel; Gözen, Ali; Seemann, Othmar; Schulze, Michael; Rassweiler, Jens
2013-03-01
Development and full validation of a laparoscopic training program for stepwise learning of a reproducible application of a standardized laparoscopic anastomosis technique, and its integration into the clinical course. The training of vesicourethral anastomosis (VUA) was divided into six simple standardized steps. To establish the objective criteria, four experienced surgeons performed the stepwise training protocol. Thirty-eight participants with no previous laparoscopic experience were investigated in their training performance. The times needed to manage each training step and the total training time were recorded. The integration into the clinical course was investigated. The training results and the corresponding steps during laparoscopic radical prostatectomy (LRP) were analyzed. Data analysis of corresponding operating room (OR) sections of 793 LRPs was performed. Based on these data, validity criteria were determined. In the laboratory section, a significant reduction of OR time for every step was seen in all participants: coordination, 62%; longitudinal incision, 52%; inverted U-shape incision, 43%; plexus, 47%; anastomosis catheter model, 38%; VUA, 38%. The laboratory section required a total time of 29 hours (minimum: 16 hours; maximum: 42 hours). All participants had shorter execution times in the laboratory than under real conditions. The best match was found within the VUA model. To perform an anastomosis under real conditions, 25% more time was needed. By using the training protocol, the performance of the VUA is comparable to that of a surgeon with experience of about 50 laparoscopic VUAs. Data analysis proved content, construct, and prognostic validity. The use of stepwise training approaches enables a surgeon to learn and reproduce complex reconstructive surgical tasks, e.g., the VUA, in a safe environment. The validity of the designed system is demonstrated at all levels, and the system should be used as a standard in clinical surgical training in laparoscopic reconstructive urology.
Optimizing the high-resolution manometry (HRM) study protocol.
Patel, A; Ding, A; Mirza, F; Gyawali, C P
2015-02-01
Intolerance of the esophageal manometry catheter may prolong high-resolution manometry (HRM) studies and increase patient distress. We assessed the impact of obtaining the landmark phase at the end of the study, when the patient has acclimatized to the HRM catheter. 366 patients (mean age 55.4 ± 0.8 years, 62.0% female) undergoing esophageal HRM over a 1-year period were studied. The standard protocol consisted of the landmark phase, ten 5-mL water swallows 20-30 s apart, and multiple rapid swallows, in which 4-6 swallows of 2 mL each were administered in rapid succession. The modified protocol consisted of the landmark phase at the end of the study, after test swallows. Study duration, technical characteristics, indications, and motor findings were compared between standard and modified protocols. Of the 366 patients, 89.6% underwent the standard protocol (study duration 12.9 ± 0.3 min). In the 10.4% with poor catheter tolerance undergoing the modified protocol, study duration was significantly longer (15.6 ± 1.0 min, p = 0.004) despite similar duration of study maneuvers. Only elevated upper esophageal sphincter basal pressures at the beginning of the study segregated modified protocol patients. The 95th percentile time to landmark phase in the standard protocol patients was 6.1 min; as many as 31.4% of modified protocol patients could not complete their first study maneuver within this period (p = 0.0003). Interpretation was not impacted by shifting the landmark phase to the end of the study. Modification of the HRM study protocol with the landmark phase obtained at the end of the study optimizes study duration without compromising quality. © 2014 John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Hooke, A. J.
1979-01-01
A set of standard telemetry protocols for downlink data flow, facilitating the end-to-end transport of instrument data from the spacecraft to the user in real time, is proposed. The direct switching of data by autonomous message 'packets' that are assembled by the source instrument on the spacecraft is discussed. The data system is thus formatted on a message rather than a word basis, and such packet telemetry would include standardized protocol headers. Standards are being developed within the NASA End-to-End Data System (NEEDS) program for the source packet and transport frame protocols. The source packet protocol identifies both the sequence number of the packet as it is generated by the source and the total length of the packet. The transport frame protocol includes a sequence count defining the serial number of the frame as it is generated by the spacecraft data system, and a field specifying any 'options' selected in the format of the frame itself.
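The source packet described above is the ancestor of the modern CCSDS Space Packet. As a rough sketch of the idea (the exact NEEDS-era field layout is not given here, so the later CCSDS Space Packet convention is assumed), a header carrying the sequence count and packet length can be assembled and parsed like this:

```python
import struct

def build_source_packet(apid: int, seq_count: int, payload: bytes) -> bytes:
    """Assemble a CCSDS-style source packet: a 6-byte primary header
    (version/type/APID, sequence flags + 14-bit sequence count, and a
    length field equal to payload bytes minus one) followed by the payload."""
    word1 = apid & 0x7FF                          # version 0, telemetry, no secondary header
    word2 = (0b11 << 14) | (seq_count & 0x3FFF)   # 0b11 = unsegmented packet
    word3 = len(payload) - 1                      # packet data length field
    return struct.pack(">HHH", word1, word2, word3) + payload

def parse_sequence_count(packet: bytes) -> int:
    """Recover the source-generated sequence number from the header."""
    _, word2, _ = struct.unpack(">HHH", packet[:6])
    return word2 & 0x3FFF

pkt = build_source_packet(apid=0x42, seq_count=7, payload=b"\xde\xad\xbe\xef")
```

Because the length travels in the header, a ground system can deterministically delimit packets in a downlink stream, which is the point of switching on messages rather than words.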
NASA Technical Reports Server (NTRS)
Benbenek, Daniel B.; Walsh, William
2010-01-01
This greenbook captures some of the current, planned and possible future uses of the Internet Protocol (IP) as part of Space Operations. It attempts to describe how the Internet Protocol is used in specific scenarios. Of primary focus is low-earth-orbit space operations, which is referred to here as the design reference mission (DRM). This is because most of the program experience drawn upon derives from this type of mission. Application profiles are provided. This includes parameter settings programs have proposed for sending IP datagrams over CCSDS links, the minimal subsets and features of the IP protocol suite and applications expected for interoperability between projects, and the configuration, operations and maintenance of these IP functions. Of special interest is capturing the lessons learned from the Constellation Program in this area, since that program included a fairly ambitious use of the Internet Protocol.
Ho, David M; Huo, Michael H
2007-07-01
Total knee replacement (TKR) is one of the most effective procedures, both clinically and in terms of cost. Because of the increased volume and cost of this procedure during the past 3 decades, TKRs are often targeted for cost reduction. The purpose of this study was to evaluate the efficacy of two cost-reducing methodologies: establishment of critical clinical pathways and standardization of implant costs. Ninety patients (90 knees) were randomly selected from a population undergoing primary TKR during a 2-year period at a tertiary teaching hospital. Patients were assigned to three groups that corresponded to different strategies implemented during the evolution of the joint-replacement program. Medical records were reviewed for type of anesthesia, operative time, length of stay, and any perioperative complications. Financial information for each patient was compared among the three groups. Data analysis demonstrated that the institution of a critical pathway significantly shortened length of hospital stay and was effective in reducing hospital costs by 18% (p < 0.05). In addition, standardization of surgical techniques under the care of a single surgeon substantially reduced the operative time. Selection of implants from a single vendor did not have any substantial effect in further reducing costs. Standardized postoperative management protocols and critical clinical pathways can reduce costs and operative time. Future efforts must focus on lowering the costs of the prostheses, particularly through competitive bidding or capitation of prosthesis costs. Although a single-vendor approach was not effective in this study, it is possible that a cost reduction could have been realized if more TKRs were performed, because the pricing contract was based on the projected volume of TKRs to be done by the hospital.
Cui, Ling; Shi, Yu; Zhang, G N
2016-12-15
Fast-track surgery (FTS), also known as enhanced recovery after surgery, is a multidisciplinary approach to accelerate recovery, reduce complications, minimise hospital stay without increasing readmission rates, and reduce health care costs, all without compromising patient safety. The advantages of FTS in abdominal surgery most likely extend to gynaecological surgery, but this is an assumption, as FTS in elective gynaecological surgery has not been well studied. No consensus guidelines have been developed for gynaecological oncological surgery, although surgeons have attempted to introduce slightly modified FTS programmes for patients undergoing such surgery. To our knowledge, there are no published randomised controlled trials; however, some studies have shown that FTS in gynaecological oncological surgery leads to early hospital discharge with high levels of patient satisfaction. The primary aim of this study is to determine whether FTS reduces the length of stay in hospital compared to traditional management. The secondary aim is to determine whether FTS is associated with any increase in post-surgical complications compared to traditional management (for both open and laparoscopic surgery). This trial will prospectively compare FTS and traditional management protocols. The primary endpoint is the length of post-operative hospitalisation (days, mean ± standard deviation), defined as the number of days between the date of surgery and the date of discharge. The secondary endpoints are complications in both groups (FTS versus traditional protocol) occurring during the first 3 months post-operatively, including infection (wound infection, lung infection, intraperitoneal infection), post-operative nausea and vomiting, ileus, post-operative haemorrhage, post-operative thrombosis, and the Acute Physiology and Chronic Health Evaluation (APACHE) II score. The advantages of FTS most likely extend to gynaecology, although, to our knowledge, there are no randomised controlled trials.
The aim of this study is to compare the post-operative length of hospitalisation after major gynaecological or gynaecological oncological surgery and to analyse patients' post-operative complications. This trial may reveal whether FTS leads to early hospital discharge with few complications after gynaecological surgery. NCT02687412. Approval number: SCCHEC20160001. Date of registration: 23 February 2016.
CCSDS Spacecraft Monitor and Control Mission Operations Interoperability Prototype
NASA Technical Reports Server (NTRS)
Lucord, Steve; Martinez, Lindolfo
2009-01-01
We are entering a new era in space exploration. Reduced operating budgets require innovative solutions that leverage existing systems to implement the capabilities of future missions. Custom solutions to fulfill mission objectives are no longer viable. Can NASA adopt international standards to reduce costs and increase interoperability with other space agencies? Can legacy systems be leveraged in a service-oriented architecture (SOA) to further reduce operations costs? The Operations Technology Facility (OTF) at the Johnson Space Center (JSC) is collaborating with Deutsches Zentrum für Luft- und Raumfahrt (DLR) to answer these very questions. The Mission Operations and Information Management Services Area (MOIMS) Spacecraft Monitor and Control (SM&C) Working Group within the Consultative Committee for Space Data Systems (CCSDS) is developing the Mission Operations standards to address this problem space. The set of proposed standards presents a service-oriented architecture to increase the level of interoperability among space agencies. The OTF and DLR are developing independent implementations of the standards as part of an interoperability prototype. This prototype will address three key components: validation of the SM&C Mission Operations protocol, exploration of the Object Management Group (OMG) Data Distribution Service (DDS), and the incorporation of legacy systems in a SOA. The OTF will implement the service providers described in the SM&C Mission Operations standards to create a portal for interaction with a spacecraft simulator. DLR will implement the service consumers to perform the monitoring and control of the spacecraft. The specifications insulate the applications from the underlying transport layer. We will gain experience with a DDS transport layer as we delegate responsibility to the middleware and explore transport bridges to connect disparate middleware products. A SOA facilitates the reuse of software components.
The prototype will leverage the capabilities of existing legacy systems. Various custom applications and middleware solutions will be combined into one system providing the illusion of a set of homogeneous services. This paper will document our journey as we implement the interoperability prototype. The team consists of software engineers with experience on the current command, telemetry, and messaging systems that support the International Space Station (ISS) and Space Shuttle programs. Emphasis will be on the objectives, results, and potential cost-saving benefits.
An Adaptive OFDMA-Based MAC Protocol for Underwater Acoustic Wireless Sensor Networks
Khalil, Issa M.; Gadallah, Yasser; Hayajneh, Mohammad; Khreishah, Abdallah
2012-01-01
Underwater acoustic wireless sensor networks (UAWSNs) have many applications across various civilian and military domains. However, they suffer from the limited available bandwidth of acoustic signals and harsh underwater conditions. In this work, we present an Orthogonal Frequency Division Multiple Access (OFDMA)-based Media Access Control (MAC) protocol that is configurable to suit the operating requirements of the underwater sensor network. The protocol has three modes of operation, namely random, equal opportunity and energy-conscious modes of operation. Our MAC design approach exploits the multi-path characteristics of a fading acoustic channel to convert it into parallel independent acoustic sub-channels that undergo flat fading. Communication between node pairs within the network is done using subsets of these sub-channels, depending on the configurations of the active mode of operation. Thus, the available limited bandwidth gets fully utilized while completely avoiding interference. We derive the mathematical model for optimal power loading and subcarrier selection, which is used as basis for all modes of operation of the protocol. We also conduct many simulation experiments to evaluate and compare our protocol with other Code Division Multiple Access (CDMA)-based MAC protocols. PMID:23012517
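The paper's exact power-loading model is not reproduced in the abstract; as an illustration of the general technique behind "optimal power loading and subcarrier selection", a standard water-filling allocation over parallel flat-fading sub-channels, which assigns p_k = max(0, mu - noise/g_k), can be sketched as:

```python
import numpy as np

def waterfill(gains, total_power, noise=1.0):
    """Classic water-filling power allocation over parallel flat-fading
    sub-channels: choose a water level mu so that p_k = max(0, mu - noise/g_k)
    and the allocations sum to total_power. Assumes all gains are positive."""
    inv = noise / np.asarray(gains, dtype=float)   # per-channel noise/gain ratio
    inv_sorted = np.sort(inv)                      # strongest channels first
    k = len(inv)
    while k > 0:
        mu = (total_power + inv_sorted[:k].sum()) / k
        if mu > inv_sorted[k - 1]:                 # all k active channels get power
            break
        k -= 1
    return np.maximum(0.0, mu - inv)

# Three sub-channels; the weakest (gain 0.1) is left unused.
p = waterfill([4.0, 1.0, 0.1], total_power=2.0)
```

Note how the weakest sub-channel receives zero power, which is the mechanism by which a MAC layer can hand unused sub-carriers to other node pairs and so avoid interference.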
The International Planetary Data Alliance (IPDA): Activities in 2010-2012
NASA Astrophysics Data System (ADS)
Crichton, Daniel; Beebe, Reta; Kasaba, Yasumasa; Sarkissian, Alain; Capria, Maria Teresa; Hughes, Steven; Osuna, Pedro
2012-07-01
The IPDA is an international collaboration of space agencies with a mission of providing access to scientific data returned from solar system missions archived at international data centers. In order to improve access to and sharing of scientific data, the IPDA was founded to develop data and software standards. The IPDA has focused on promoting standards that drive common methods for collecting and describing planetary science data. An initial starting point for developing such a standard has been the internationalization of NASA's Planetary Data System (PDS) standard, which has become a de facto standard. The IPDA has also focused on developing software standards that promote interoperability through the use of common software protocols, allowing agencies to link their systems together. The IPDA has made significant progress since its inaugural meeting in 2006, adopting standards and developing collaborations across agencies to ensure data is captured in common formats. It has also grown to approximately eight agencies represented by a number of different groups through the IPDA Steering Committee [1]. The IPDA Steering Committee oversees the execution of projects. Over the past two years, the IPDA Steering Committee has conducted a number of focused projects around the development of these standards to enable interoperability, the construction of compatible archives, and the operation of the IPDA as a whole. These projects have helped to establish the IPDA and to bring together the collaboration. Two key projects have been the development of a common protocol for data exchange, the Planetary Data Access Protocol (PDAP), and collaboration with the NASA Planetary Data System (PDS) on the next generation of PDS standards, PDS4. Both of these are progressing well and have draft standards that are now being tested. More recently, the IPDA has formed a Technical Experts Group (TEG) that is responsible for the technical architecture and implementation of the projects.
As agencies implement archive systems, it is essential that the standards and software support exists and provide guidance to ensure that agencies can develop IPDA compatible archives. This talk will cover the results of the IPDA projects over the 2010-2012 timeframe. It will also discuss the plans for the next two years including the focus on ensuring that the IPDA standards for both the system and data are accessible for use by the international planetary science community. Finally, it will discuss progress on linking planetary archive systems together so scientists can access archived data regardless of the location. [1] http://planetarydata.org/members
Chand, Manish; De'Ath, Henry D; Rasheed, Shahnawaz; Mehta, Chaitanya; Bromilow, James; Qureshi, Tahseen
2016-01-01
Laparoscopic surgery is well established in the modern management of colorectal disease. More recently, enhanced recovery after surgery (ERAS) protocols have been introduced to further promote accelerated discharge and faster recovery. However, not all patients are suitable for early discharge. The purpose of this study was to evaluate the early outcomes of patients undergoing such a regime to determine which peri-operative factors may predict safe accelerated discharge. Data were prospectively collected on consecutive patients undergoing laparoscopic colorectal surgery. All patients followed the institution's ERAS protocol and were discharged once specific criteria were fulfilled. Clinical characteristics and outcomes were compared between patients who were discharged before and after 72 h post-surgery. Thereafter, the peri-operative factors that were associated with delayed discharge were determined using a binary logistic model. Three hundred patients were included in the analysis. The most common operation was laparoscopic anterior resection (n = 123, 41%). Mean length of stay was 4.8 days (standard deviation 5.9), with 185 (62%) patients discharged within 72 h. Ten (3%) patients had a post-operative complication. Three independent predictors of delayed discharge were identified: BMI (OR 1.06, 95% CI 1.01-1.11), operation length (OR 0.99, 95% CI 0.98-0.99), and complications (OR 16.26, 95% CI 4.88-54.08). A combined approach of laparoscopic surgery and ERAS leads to reduced length of stay, enabling more than 60% of patients to be discharged within 72 h. Increased BMI, duration of operation, and post-operative complications independently predict a longer length of stay. Copyright © 2015 IJS Publishing Group Limited. Published by Elsevier Ltd. All rights reserved.
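The binary logistic model used to identify independent predictors can be illustrated on synthetic data (the records below are invented for illustration, not the study's data); exponentiating a fitted coefficient yields the kind of odds ratio reported above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic peri-operative records (invented for illustration):
# columns = BMI, operation length in minutes, complication flag.
n = 300
X = np.column_stack([
    rng.normal(27, 4, n),
    rng.normal(180, 40, n),
    rng.binomial(1, 0.05, n).astype(float),
])
# Delayed discharge is made more likely by high BMI and by complications.
true_logit = 0.08 * (X[:, 0] - 27) + 2.5 * X[:, 2] - 0.4
y = rng.binomial(1, 1 / (1 + np.exp(-true_logit))).astype(float)

def fit_logistic(X, y, lr=0.5, steps=2000):
    """Gradient-ascent fit of a binary logistic model on standardized
    predictors; returns (intercept, coefficients). exp(coefficient)
    gives the odds ratio for a one-standard-deviation change."""
    Z = np.column_stack([np.ones(len(X)), (X - X.mean(0)) / X.std(0)])
    w = np.zeros(Z.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-Z @ w))          # predicted probabilities
        w += lr * Z.T @ (y - p) / len(y)      # log-likelihood gradient step
    return w[0], w[1:]

intercept, coefs = fit_logistic(X, y)
```

A production analysis would use a statistics package with Wald confidence intervals; this sketch only shows the model the abstract names.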
Prediction of anaerobic power values from an abbreviated WAnT protocol.
Stickley, Christopher D; Hetzler, Ronald K; Kimura, Iris F
2008-05-01
The traditional 30-second Wingate anaerobic test (WAnT) is a widely used anaerobic power assessment protocol. An abbreviated protocol has been shown to decrease the mild to severe physical discomfort often associated with the WAnT. Therefore, the purpose of this study was to determine whether a 20-second WAnT protocol could be used to accurately predict power values of a standard 30-second WAnT. In 96 college females, anaerobic power variables were assessed using a standard 30-second WAnT protocol. Maximum power values as well as instantaneous power at 10, 15, and 20 seconds were recorded. Based on these results, stepwise regression analysis was performed to determine the accuracy with which mean power, minimum power, 30-second power, and percentage of fatigue for a standard 30-second WAnT could be predicted from values obtained during the first 20 seconds of testing. Mean power values showed the highest level of predictability (R2 = 0.99) from the 20-second values. Minimum power, 30-second power, and percentage of fatigue also showed high levels of predictability (R2 = 0.91, 0.84, and 0.84, respectively) using only values obtained during the first 20 seconds of the protocol. An abbreviated (20-second) WAnT protocol appears to effectively predict results of a standard 30-second WAnT in college-age females, allowing for comparison of data to published norms. A shortened test may allow for a decrease in unwanted side effects associated with the traditional WAnT protocol.
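The regression-based prediction of 30-second values from the first 20 seconds of data can be illustrated with simulated power curves (the data below are synthetic and the study's actual regression equations are not reproduced in the abstract):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated Wingate power curves in watts (invented for illustration).
n = 96
peak = rng.normal(600, 80, n)              # peak power per athlete
fatigue = rng.uniform(0.25, 0.45, n)       # fractional power drop over 30 s
t = np.array([10.0, 15.0, 20.0, 30.0])
power = peak[:, None] * (1 - fatigue[:, None] * t / 30.0)
power += rng.normal(0, 5, power.shape)     # measurement noise

X = np.column_stack([np.ones(n), power[:, :3]])  # power at 10, 15, 20 s
y = power[:, 3]                                  # 30-s value to predict

coef, *_ = np.linalg.lstsq(X, y, rcond=None)     # ordinary least squares
r2 = 1 - ((y - X @ coef) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

Because a roughly linear decay makes the 30-second value nearly a linear combination of the earlier readings, R² comes out high, mirroring the strong predictability the study reports.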
SpaceWire Protocol ID: What Does It Mean To You?
NASA Technical Reports Server (NTRS)
Rakow, Glenn; Schnurr, Richard; Gilley, Daniel; Parks, Steve
2006-01-01
SpaceWire is becoming a popular solution for satellite high-speed data buses because it is a simple standard that provides great flexibility for a wide range of system requirements. It is simple in packet format and protocol, allowing users to easily tailor their implementation for their specific application. Some of the attractive aspects of SpaceWire that make it easy to implement also make it hard for future reuse. Protocol reuse is difficult because SpaceWire does not have a defined mechanism to communicate with the higher layers of the protocol stack. This has forced users of SpaceWire to define unique packet formats and define how these packets are to be processed. Each mission writes its own Interface Control Document (ICD) and tailors SpaceWire for its specific requirements, making reuse difficult. Part of the reason for this habit may be that engineers typically optimize designs for their own requirements in the absence of a standard. This is an inefficient use of project resources and makes missions more expensive to develop. A new packet format for SpaceWire has been defined as a solution to this problem. This new packet format is a complement to the SpaceWire standard that will support protocol development upon SpaceWire. The new packet definition does not replace the current packet structure, i.e., it does not make the standard obsolete, but merely extends the standard for those who want to develop protocols over SpaceWire. The SpaceWire packet is defined with the first part being the Destination Address, which may be one or more bytes. This is followed by the packet cargo, which is user defined. The cargo is terminated with an End-Of-Packet (EOP) marker. This packet structure offers low packet overhead and allows the user to define how the contents are to be formatted. It also provides for many different addressing schemes, which provide flexibility in the system. This packet flexibility is typically an attractive part of SpaceWire.
The new extended packet format adds one new field to the packet that greatly enhances the capability of SpaceWire. This new field, called the Protocol Identifier (ID), is used to identify the packet contents and the associated processing for the packet. This feature, along with the restriction in the packet format that uses the Protocol ID, allows a deterministic method of decoding packets that was not previously possible. The first part of the packet is still the Destination Address, which still conforms to the original standard but with one restriction: the first byte seen at the destination by the user needs to be a logical address, independent of the addressing scheme used. The second field is defined as the Protocol ID, which is usually one byte in length. The packet cargo (user defined) follows the Protocol ID. After the packet cargo is the EOP, which defines the end of the packet. The value of the Protocol ID is assigned by the SpaceWire working group and the protocol description published for others to use. The development of protocols for SpaceWire is currently the area of greatest activity in the SpaceWire working group. The first protocol definition by the working group has been completed and is now in the process of formal standardization. There are many other protocols in development for missions that have not yet received formal Protocol ID assignment, but even if the protocols are not formally assigned a value, this effort will provide synergism for future developments.
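A minimal sketch of the extended packet layout described above, with the EOP modelled as a data byte for clarity (on a real SpaceWire link the EOP is a link-level control code, not part of the data stream):

```python
EOP = b"\x00"  # stand-in marker; on a real link EOP is a control code, not a byte

def build_packet(logical_address: int, protocol_id: int, cargo: bytes) -> bytes:
    """Extended SpaceWire packet sketch:
    [destination logical address][protocol ID][cargo][EOP]."""
    return bytes([logical_address, protocol_id]) + cargo + EOP

def parse_packet(packet: bytes):
    """Deterministic decode per the extended format: the first byte at the
    destination is the logical address, the second is the Protocol ID."""
    assert packet.endswith(EOP), "missing end-of-packet marker"
    return packet[0], packet[1], packet[2:-1]

# Protocol ID 0x01 is the value assigned to RMAP, the working group's
# first standardized protocol (stated here as background, not from the text).
pkt = build_packet(0x32, 0x01, b"\x10\x20")
```

The Protocol ID byte is what lets a receiver dispatch the cargo to the right handler without a mission-specific ICD.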
Government Open Systems Interconnection Profile (GOSIP) Transition Strategy
1993-09-01
This study explores some of the GOSIP protocol details and discusses the process by which standards organizations developed GOSIP versions 1 and 2. Chapter II describes the process whereby standards are developed and adopted by the ISO.
Construction and Organization of a BSL-3 Cryo-Electron Microscopy Laboratory at UTMB
Sherman, Michael B.; Trujillo, Juan; Leahy, Ian; Razmus, Dennis; DeHate, Robert; Lorcheim, Paul; Czarneski, Mark A.; Zimmerman, Domenica; Newton, Je T’Aime M.; Haddow, Andrew D.; Weaver, Scott C.
2013-01-01
A unique cryo-electron microscopy facility has been designed and constructed at the University of Texas Medical Branch (UTMB) to study the three-dimensional organization of viruses and bacteria classified as select agents at biological safety level (BSL)-3, and their interactions with host cells. A 200 keV high-end cryo-electron microscope was installed inside a BSL-3 containment laboratory and standard operating procedures were developed and implemented to ensure its safe and efficient operation. We also developed a new microscope decontamination protocol based on chlorine dioxide gas with a continuous flow system, which allowed us to expand the facility capabilities to study bacterial agents including spore-forming species. The new unified protocol does not require agent-specific treatment in contrast to the previously used heat decontamination. To optimize the use of the cryo-electron microscope and to improve safety conditions, it can be remotely controlled from a room outside of containment, or through a computer network world-wide. Automated data collection is provided by using JADAS (single particle imaging) and SerialEM (tomography). The facility has successfully operated for more than a year without an incident and was certified as a select agent facility by the Centers for Disease Control. PMID:23274136
What is the role of enhanced recovery after surgery in children? A scoping review.
Pearson, Katherine L; Hall, Nigel J
2017-01-01
Enhanced recovery after surgery (ERAS) pathways are standard practice in adult specialties, resulting in improved outcomes. It is unclear whether ERAS principles are applicable to paediatric surgery. We performed a scoping review to identify the extent to which ERAS has been used in paediatric surgery, the nature of the interventions, and the outcomes. PubMed, the Cochrane library, Google Scholar, and Embase were searched using the terms enhanced recovery, post-operative protocol/pathway, fast track surgery, and paediatric surgery. Studies were excluded if they did not include abdominal/thoracic/urological procedures in children. Nine studies were identified (2003-2014; total 1269 patients): three case-control studies, one retrospective review, and five prospective implementations; no RCTs. Interventional elements identified were post-operative feeding, mobilisation protocols, morphine-sparing analgesia, and reduced use of nasogastric tubes and urinary catheters. Outcomes reported included post-operative length of stay (LOS), time to oral feeding and stooling, complications, and parent satisfaction. Fast-track programmes significantly reduced LOS in 6/7 studies, time to oral feeding in 3/3 studies, and time to stooling in 2/3 studies. The use of ERAS pathways in paediatric surgery appears very limited, but such pathways may have benefits in children. Prospective studies should evaluate interventions used in adult ERAS on appropriate outcomes in the paediatric setting.
Stanzione, Arnaldo; Imbriaco, Massimo; Cocozza, Sirio; Fusco, Ferdinando; Rusconi, Giovanni; Nappi, Carmela; Mirone, Vincenzo; Mangiapia, Francesco; Brunetti, Arturo; Ragozzino, Alfonso; Longo, Nicola
2016-12-01
To prospectively determine the diagnostic accuracy of a biparametric 3T magnetic resonance imaging protocol (BP-MRI) for prostate cancer detection, compared with a multiparametric MRI protocol (MP-MRI), in a biopsy-naïve patient population. Eighty-two untreated patients (mean age 65 ± 7.6 years) with clinical suspicion of prostate cancer and/or altered prostate-specific antigen (PSA) levels underwent MP-MRI, including T2-weighted imaging, diffusion-weighted imaging (with the corresponding apparent diffusion coefficient maps), and a dynamic contrast-enhanced sequence, followed by prostate biopsy. Two radiologists reviewed both the BP-MRI and the MP-MRI protocols to establish a radiological diagnosis. Receiver operating characteristic curves were obtained to determine the diagnostic performance of the two protocols. The mean PSA level was 8.8 ± 8.1 ng/ml. A total of 34 prostatic tumors were identified, with Gleason scores ranging from 3+3 to 5+4. Of these 34 tumors, 29 were located within the peripheral zone and 5 in the transitional zone. BP-MRI and MP-MRI showed similar performance in terms of overall diagnostic accuracy, with areas under the curve of 0.91 and 0.93, respectively (p = n.s.). A BP-MRI prostate protocol is feasible for prostate cancer detection, requiring shorter acquisition and interpretation times and no gadolinium-based contrast agent while achieving diagnostic accuracy comparable to the conventional MP-MRI protocol. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Automated selective disruption of slow wave sleep.
Ooms, Sharon J; Zempel, John M; Holtzman, David M; Ju, Yo-El S
2017-04-01
Slow wave sleep (SWS) plays an important role in neurophysiologic restoration. Experimentally testing the effect of SWS disruption previously required highly time-intensive and subjective methods. Our goal was to develop an automated and objective protocol to reduce SWS without affecting sleep architecture. We developed a custom Matlab™ protocol to calculate electroencephalogram spectral power every 10 s live during a polysomnogram, exclude artifact, and, if measurements met criteria for SWS, deliver increasingly louder tones through earphones. Middle-aged healthy volunteers (n = 10) each underwent two polysomnograms, one with the SWS disruption protocol and one with a sham condition. The SWS disruption protocol reduced SWS compared to the sham condition, as measured by spectral power in the delta (0.5-4 Hz) band, particularly in the 0.5-2 Hz range (mean 20% decrease). A compensatory increase in the proportion of total spectral power in the theta (4-8 Hz) and alpha (8-12 Hz) bands was seen, but otherwise normal sleep features were preserved. N3 sleep decreased from 20 ± 34 to 3 ± 6 min; otherwise there were no significant changes in total sleep time, sleep efficiency, or other macrostructural sleep characteristics. This novel SWS disruption protocol produces specific reductions in delta band power similar to existing methods, but has the advantage of being automated, such that SWS disruption can be performed easily in a highly standardized and operator-independent manner. This automated SWS disruption protocol effectively reduces SWS without impacting overall sleep architecture. Copyright © 2017 Elsevier B.V. All rights reserved.
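The detection step of the protocol above (compute spectral power in short epochs, then check an SWS criterion) can be sketched numerically. This is an illustrative Python reconstruction, not the study's Matlab code: the FFT periodogram and the delta-dominance ratio of 0.5 are assumptions chosen for the sketch.

```python
import numpy as np

def band_power(epoch, fs, lo, hi):
    """Summed periodogram power of a 1-D EEG epoch in the [lo, hi) Hz band."""
    spec = np.abs(np.fft.rfft(epoch)) ** 2
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    return spec[(freqs >= lo) & (freqs < hi)].sum()

def sws_detected(epoch, fs, delta_ratio=0.5):
    """Flag an epoch as SWS when delta (0.5-4 Hz) power dominates the
    0.5-12 Hz range. The 0.5 dominance ratio is an illustrative threshold,
    not the study's actual criterion."""
    delta = band_power(epoch, fs, 0.5, 4.0)
    total = band_power(epoch, fs, 0.5, 12.0)
    return total > 0 and delta / total > delta_ratio
```

In the closed-loop setting described in the abstract, a positive detection would trigger a tone whose volume escalates for as long as the criterion remains met.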
Ventre, Kathleen M; Barry, James S; Davis, Deborah; Baiamonte, Veronica L; Wentworth, Allen C; Pietras, Michele; Coughlin, Liza; Barley, Gwyn
2014-04-01
Relocating obstetric (OB) services to a children's hospital imposes demands on facility operations, which must be met to ensure quality care and a satisfactory patient experience. We used in situ simulations to prospectively and iteratively evaluate operational readiness of a children's hospital-based OB unit before it opened for patient care. This project took place at a 314-bed, university-affiliated children's hospital. We developed 3 full-scale simulation scenarios depicting a concurrent maternal and neonatal emergency. One scenario began with a standardized patient experiencing admission; the mannequin portrayed a mother during delivery. We ran all 3 scenarios on 2 dates scheduled several weeks apart. We ran 2 of the scenarios on a third day to verify the reliability of key processes. During the simulations, content experts completed equipment checklists, and participants identified latent safety hazards. Each simulation involved a unique combination of scheduled participants who were supplemented by providers from responding ancillary services. The simulations involved 133 scheduled participants representing OB, neonatology, and anesthesiology. We exposed and addressed operational deficiencies involving equipment availability, staffing, interprofessional communication, and systems issues such as transfusion protocol failures and electronic order entry challenges. Process changes between simulation days 1 to 3 decreased the elapsed time between transfusion protocol activation and blood arrival to the operating room and labor/delivery/recovery/postpartum setting. In situ simulations identified multiple operational deficiencies on the OB unit, allowing us to take corrective action before its opening. This project may guide other children's hospitals regarding care processes likely to require significant focus and possible modification to accommodate an OB service.
Brehmer, Jess L; Husband, Jeffrey B
2014-10-01
There are relatively few studies in the literature that specifically evaluate accelerated rehabilitation protocols for distal radial fractures treated with open reduction and internal fixation (ORIF). The purpose of this study was to compare the early postoperative outcomes (at zero to twelve weeks postoperatively) of patients enrolled in an accelerated rehabilitation protocol with those of patients enrolled in a standard rehabilitation protocol following ORIF for a distal radial fracture. We hypothesized that patients with accelerated rehabilitation after volar ORIF for a distal radial fracture would have an earlier return to function compared with patients who followed a standard protocol. From November 2007 to November 2010, eighty-one patients with an unstable distal radial fracture were prospectively randomized to follow either an accelerated or a standard rehabilitation protocol after undergoing ORIF with a volar plate for a distal radial fracture. Both groups began with gentle active range of motion at three to five days postoperatively. At two weeks, the accelerated group initiated wrist/forearm passive range of motion and strengthening exercises, whereas the standard group initiated passive range of motion and strengthening at six weeks postoperatively. Patients were assessed at three to five days, two weeks, three weeks, four weeks, six weeks, eight weeks, twelve weeks, and six months postoperatively. Outcomes included Disabilities of the Arm, Shoulder and Hand (DASH) scores (primary outcome) and measurements of wrist flexion/extension, supination, pronation, grip strength, and palmar pinch. The patients in the accelerated group had better mobility, strength, and DASH scores at the early postoperative time points (zero to eight weeks postoperatively) compared with the patients in the standard rehabilitation group. The difference between the groups was both clinically relevant and statistically significant. 
Patients who follow an accelerated rehabilitation protocol that emphasizes motion immediately postoperatively and initiates strengthening at two weeks after volar ORIF of a distal radial fracture have an earlier return to function than patients who follow a more standard rehabilitation protocol. Therapeutic Level I. See Instructions for Authors for a complete description of levels of evidence. Copyright © 2014 by The Journal of Bone and Joint Surgery, Incorporated.
M.D. Bryant; Trent McDonald; R. Aho; B.E. Wright; Michelle Bourassa Stahl
2008-01-01
We describe a protocol to monitor the effectiveness of the Tongass Land Management Plan (TLMP) management standards for maintaining fish habitat. The protocol uses juvenile coho salmon (Oncorhynchus kisutch) in small tributary streams in forested watersheds. We used a 3-year pilot study to develop detailed methods to estimate juvenile salmonid...
Stergiou, George S; Alpert, Bruce; Mieke, Stephan; Asmar, Roland; Atkins, Neil; Eckert, Siegfried; Frick, Gerhard; Friedman, Bruce; Graßl, Thomas; Ichikawa, Tsutomu; Ioannidis, John P; Lacy, Peter; McManus, Richard; Murray, Alan; Myers, Martin; Palatini, Paolo; Parati, Gianfranco; Quinn, David; Sarkis, Josh; Shennan, Andrew; Usuda, Takashi; Wang, Jiguang; Wu, Colin O; O'Brien, Eoin
2018-03-01
In the last 30 years, several organizations, such as the US Association for the Advancement of Medical Instrumentation (AAMI), the British Hypertension Society, the European Society of Hypertension (ESH) Working Group on Blood Pressure (BP) Monitoring, and the International Organization for Standardization (ISO), have developed protocols for clinical validation of BP measuring devices. However, it is recognized that science, as well as patients, consumers, and manufacturers, would be best served if all BP measuring devices were assessed for accuracy according to an agreed single validation protocol that had global acceptance. Therefore, an international initiative was taken by AAMI, ESH, and ISO experts, who agreed to develop a universal standard for device validation. This statement presents the key aspects of a validation procedure, which were agreed by the AAMI, ESH, and ISO representatives as the basis for a single universal validation protocol. As soon as the AAMI/ESH/ISO standard is fully developed, this will be regarded as the single universal standard and will replace all other previous standards/protocols.
National Airspace System (NAS) open system architecture and protocols
DOT National Transportation Integrated Search
2003-08-14
This standard establishes the open systems data communications architecture and authorized protocol standards for the National Airspace System (NAS). The NAS will consist of various types of processors and communications networks procured from a vari...
Rani, Anupama; Sharma, Vivek; Arora, Sumit; Lal, Darshan; Kumar, Anil
2015-04-01
Detection of milk fat adulteration with foreign fats/oils continues to be a challenge for the dairy industry as well as for food testing laboratories, especially given the rampant adulteration carried out by unscrupulous traders exploiting scientific knowledge. In the present investigation, a rapid reversed-phase thin layer chromatographic (RP-TLC) protocol was standardized to ascertain the purity of milk fat. The RP-TLC protocol did not show any false positive results in genuine ghee (clarified butter fat) samples of known origin. Adulteration of ghee with coconut oil up to 7.5 %, with soybean oil, sunflower oil, and groundnut oil up to 1 %, and with designer oil up to the 2 % level could be detected using the standardized RP-TLC protocol. The standardized protocol is rapid and convenient to use.
Motion Imagery and Robotics Application (MIRA)
NASA Technical Reports Server (NTRS)
Martinez, Lindolfo; Rich, Thomas
2011-01-01
Objectives include: I. Prototype a camera service leveraging the CCSDS Integrated protocol stack (MIRA/SM&C/AMS/DTN): a) CCSDS MIRA Service (new). b) Spacecraft Monitor and Control (SM&C). c) Asynchronous Messaging Service (AMS). d) Delay/Disruption Tolerant Networking (DTN). II. Additional MIRA objectives: a) Demonstrate camera control through the ISS using the CCSDS protocol stack (Berlin, May 2011). b) Verify that the CCSDS standards stack can provide end-to-end space camera services across ground and space environments. c) Test interoperability of various CCSDS protocol standards. d) Identify overlaps in the design and implementations of the CCSDS protocol standards. e) Identify software incompatibilities in the CCSDS stack interfaces. f) Provide redlines to the SM&C, AMS, and DTN working groups. g) Enable the CCSDS MIRA service for potential use in ISS Kibo camera commanding. h) Assist in the long-term evolution of this entire group of CCSDS standards to TRL 6 or greater.
Standards Development Activities at White Sands Test Facility
NASA Technical Reports Server (NTRS)
Baker, D. L.; Beeson, H. D.; Saulsberry, R. L.; Julien, H. L.; Woods, S. S.
2003-01-01
The development of standards and standards activities at the JSC White Sands Test Facility (WSTF) has been expanded to include the transfer of technology and standards to voluntary consensus organizations in five technical areas of importance to NASA. This effort is in direct response to the National Technology Transfer Act, designed to accelerate the transfer of technology to industry and promote government-industry partnerships. Technology transfer is especially important for WSTF, whose long-term mission has been to develop and provide vital propellant safety and hazards information to aerospace designers, operations personnel, and safety personnel. This mission is being met through the preparation of consensus guidelines and standards, propellant hazards analysis protocols, and safety courses for the use of the propellants hydrogen, oxygen, and hypergols, as well as for the design and inspection of spacecraft pressure vessels and the use of pyrovalves in spacecraft propulsion systems. The overall WSTF technology transfer program is described and the current status of technology transfer activities is summarized.
Design and Verification of a Distributed Communication Protocol
NASA Technical Reports Server (NTRS)
Munoz, Cesar A.; Goodloe, Alwyn E.
2009-01-01
The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.
Al-Ekrish, Asma'a A; Al-Shawaf, Reema; Schullian, Peter; Al-Sadhan, Ra'ed; Hörmann, Romed; Widmann, Gerlig
2016-10-01
To assess the comparability of linear measurements of dental implant sites recorded from multidetector computed tomography (MDCT) images obtained using standard-dose filtered backprojection (FBP) technique with those from various ultralow doses combined with FBP, adaptive statistical iterative reconstruction (ASIR), and model-based iterative reconstruction (MBIR) techniques. The results of the study may contribute to MDCT dose optimization for dental implant site imaging. MDCT scans of two cadavers were acquired using a standard reference protocol and four ultralow-dose test protocols (TP). The volume CT dose index of the different dose protocols ranged from a maximum of 30.48-36.71 mGy to a minimum of 0.44-0.53 mGy. All scans were reconstructed using FBP, ASIR-50, ASIR-100, and MBIR, and either a bone or standard reconstruction kernel. Linear measurements were recorded from standardized images of the jaws by two examiners. Intra- and inter-examiner reliability of the measurements were analyzed using Cronbach's alpha and inter-item correlation. Agreement between the measurements obtained with the reference-dose/FBP protocol and each of the test protocols was determined with Bland-Altman plots and linear regression. Statistical significance was set at a P-value of 0.05. No systematic variation was found between the linear measurements obtained with the reference protocol and the other imaging protocols. The only exceptions were TP3/ASIR-50 (bone kernel) and TP4/ASIR-100 (bone and standard kernels). The mean measurement differences between these three protocols and the reference protocol were within ±0.1 mm, with the 95 % confidence interval limits being within the range of ±1.15 mm. A nearly 97.5 % reduction in dose did not significantly affect the height and width measurements of edentulous jaws regardless of the reconstruction algorithm used.
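The Bland-Altman agreement analysis used above to compare each test protocol against the reference protocol can be sketched as follows. This is the standard textbook formulation (bias and 1.96-SD limits of agreement), not code from the study.

```python
import numpy as np

def bland_altman_limits(ref, test):
    """Bland-Altman bias and 95% limits of agreement between paired
    measurements from a reference and a test protocol."""
    diff = np.asarray(test, float) - np.asarray(ref, float)
    bias = diff.mean()         # mean measurement difference
    sd = diff.std(ddof=1)      # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

In the study's terms, agreement holds when the bias is near zero and the limits of agreement fall within a clinically acceptable range (here, roughly ±1 mm for implant-site measurements).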
Loftus, Tyler J; Mira, Juan C; Ozrazgat-Baslanti, Tezcan; Ghita, Gabriella L; Wang, Zhongkai; Stortz, Julie A; Brumback, Babette A; Bihorac, Azra; Segal, Mark S; Anton, Stephen D; Leeuwenburgh, Christiaan; Mohr, Alicia M; Efron, Philip A; Moldawer, Lyle L; Moore, Frederick A; Brakenridge, Scott C
2017-01-01
Introduction: Sepsis is a common, costly, and morbid cause of critical illness in trauma and surgical patients. Ongoing advances in sepsis resuscitation and critical care support strategies have led to improved in-hospital mortality. However, these patients now survive to enter a state of chronic critical illness (CCI): persistent low-grade organ dysfunction and poor long-term outcomes driven by the persistent inflammation, immunosuppression and catabolism syndrome (PICS). The Sepsis and Critical Illness Research Center (SCIRC) was created to provide a platform by which the prevalence and pathogenesis of CCI and PICS may be understood at a mechanistic level across multiple medical disciplines, leading to the development of novel management strategies and targeted therapies. Methods: Here, we describe the design, study cohort, and standard operating procedures used in the prospective study of human sepsis at a level 1 trauma centre and tertiary care hospital providing care for over 2600 critically ill patients annually. These procedures include implementation of an automated sepsis surveillance initiative, augmentation of clinical decisions with a computerised sepsis protocol, strategies for direct exportation of quality-filtered data from the electronic medical record to a research database, and robust long-term follow-up. Ethics and dissemination: This study has been registered at ClinicalTrials.gov, approved by the University of Florida Institutional Review Board, and is actively enrolling subjects. Dissemination of results is forthcoming. PMID:28765125
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-12
... Connection testing [using current Nasdaq access protocols] during the normal operating hours of the NTF; No Charge--For Idle Connection testing [using current Nasdaq access protocols]; $333/hour--For Active Connection testing [using current Nasdaq access protocols] at all times other than the normal operating hours...
TCP Performance Enhancement Over Iridium
NASA Technical Reports Server (NTRS)
Torgerson, Leigh; Hutcherson, Joseph; McKelvey, James
2007-01-01
In support of iNET maturation, NASA-JPL has collaborated with NASA-Dryden to develop, test and demonstrate an over-the-horizon vehicle-to-ground networking capability, using Iridium as the vehicle-to-ground communications link for relaying critical vehicle telemetry. To ensure reliability concerns are met, the Space Communications Protocol Standards (SCPS) transport protocol was investigated for its performance characteristics in this environment. In particular, the SCPS-TP software performance was compared to that of the standard Transmission Control Protocol (TCP) over the Internet Protocol (IP). This paper will report on the results of this work.
Multi-party Quantum Key Agreement without Entanglement
NASA Astrophysics Data System (ADS)
Cai, Bin-Bin; Guo, Gong-De; Lin, Song
2017-04-01
A new efficient quantum key agreement protocol without entanglement is proposed. In this protocol, each user encodes his secret key into the traveling particles by performing one of four rotation operations that one cannot perfectly distinguish. In the end, all users can simultaneously obtain the final shared key. The security of the presented protocol against some common attacks is discussed. It is shown that this protocol can effectively protect the privacy of each user and satisfy the requirement of fairness in theory. Moreover, the quantum carriers and the encoding operations used in the protocol can be achieved in realistic physical devices. Therefore, the presented protocol is feasible with current technology.
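The security argument above rests on the fact that rotated states which are not mutually orthogonal cannot be perfectly distinguished. A small numerical sketch illustrates this; the four angles used here are an assumption for illustration and are not necessarily the paper's actual encoding set.

```python
import numpy as np

def rotation(theta):
    """Single-qubit rotation in the real plane (illustrative encoding operation)."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# Four candidate encoding angles (hypothetical; the paper's set is not given here).
ANGLES = [0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]

ket0 = np.array([1.0, 0.0])
states = [rotation(a) @ ket0 for a in ANGLES]

# States with overlap |<psi_i|psi_j>| strictly between 0 and 1 cannot be
# perfectly distinguished by any measurement, which is what hides each
# user's choice of operation from an eavesdropper.
overlap = abs(states[0] @ states[1])
```

Here `overlap` equals cos(π/4) ≈ 0.707, so the first two encodings are non-orthogonal and hence indistinguishable with certainty, while the 0 and π/2 encodings are orthogonal.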
Secure quantum communication using classical correlated channel
NASA Astrophysics Data System (ADS)
Costa, D.; de Almeida, N. G.; Villas-Boas, C. J.
2016-10-01
We propose a secure protocol to send quantum information from one party to another without a quantum channel. In our protocol, which resembles quantum teleportation, a sender (Alice) and a receiver (Bob) share classical correlated states instead of EPR pairs, with Alice performing measurements in two different bases and then communicating her results to Bob through a classical channel. Our secure quantum communication protocol requires the same amount of classical bits as the standard quantum teleportation protocol. In our scheme, as in the usual quantum teleportation protocol, once the classical channel is established in a secure way, a spy (Eve) will never be able to recover the information of the unknown quantum state, even if she is aware of Alice's measurement results. Security, advantages, and limitations of our protocol are discussed and compared with the standard quantum teleportation protocol.
Le Brigand, H; Morille, P; Garnier, B; Bogaty-Yver, J; Samama, M; Spriet, A
A comparative clinical trial was undertaken in 2420 patients undergoing thoracic surgery during a 4-year period (1973-1977); 40% of the patients had bronchial cancer. Random allocation was not considered possible by the surgeons and was replaced by allocation according to the time of operation. There were three protocol groups. Protocol A: first morning operations (1007 patients): subcutaneous calcium heparin, 5000 units (UI), 2 hours and 30 minutes before surgery, then every 12 hours for 15 days. Protocol B: second morning operations (932 patients): same dose and duration of treatment; the first injection took place 24 to 72 hours after the surgical procedure. The doses were increased from the fourth day after surgery in order to obtain a moderately prolonged partial thromboplastin time (patient-control difference: 7 to 14 seconds). Protocol 0: 481 patients received no anticoagulant treatment because of a contraindication or a minor surgical procedure. Preliminary results showed an increase in per-operative bleeding (p less than 0.01) in treated patients; this was very well accepted by the surgeons. Among the heparin-treated patients, 11 of 13 pulmonary emboli were observed in patients with bronchial cancer. Of these 13, 10 were fatal, with 9 verified at autopsy. The pulmonary embolism episodes occurred significantly earlier in protocol B than in protocol A. Fatal pulmonary embolism in patients with bronchial cancer was significantly more frequent in protocol B (7 cases) than in protocol A (1 case); p less than 0.01. These results showed a low frequency of fatal pulmonary emboli in patients without bronchial cancer receiving twice-daily subcutaneous injections of heparin (2 of 1102 operated subjects). The rate was higher in patients with bronchial cancer, and this result supports a recommended thrice-daily dose in such patients. In addition, the pre-operative administration of heparin is useful in preventing early post-operative pulmonary embolism.
Evaluation of Alternative Field Buses for Lighting ControlApplications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koch, Ed; Rubinstein, Francis
2005-03-21
The Subcontract Statement of Work consists of two major tasks. This report is the Final Report in fulfillment of the contract deliverable for Task 1. The purpose of Task 1 was to evaluate existing and emerging protocols and standards for interfacing sensors and controllers for communicating with integrated lighting control systems in commercial buildings. The detailed task description follows: Task 1. Evaluate alternative sensor/field buses. The objective of this task is to evaluate existing and emerging standards for interfacing sensors and controllers for communicating with integrated lighting control systems in commercial buildings. The protocols to be evaluated will include at least: (1) 1-Wire Net, (2) DALI, (3) MODBUS (or an appropriate substitute such as EIB), and (4) ZigBee. The evaluation will include a comparative matrix for comparing the technical performance features of the different alternative systems. The performance features to be considered include: (1) directionality and network speed, (2) error control, (3) latency times, (4) allowable cable voltage drop, (5) topology, and (6) polarization. Specifically, Subcontractor will: (1) Analyze the proposed network architecture and identify potential problems that may require further research and specification. (2) Help identify and specify additional software and hardware components that may be required for the communications network to operate properly. (3) Identify areas of the architecture that can benefit from existing standards and technology and enumerate those standards and technologies. (4) Identify existing companies that may have relevant technology that can be applied to this research. (5) Help determine if new standards or technologies need to be developed.
Lee, Kyung Hee; Lee, Kyung Won; Park, Ji Hoon; Han, Kyunghwa; Kim, Jihang; Lee, Sang Min; Park, Chang Min
2018-01-01
To measure inter-protocol agreement and analyze interchangeability in nodule classification between low-dose unenhanced CT and standard-dose enhanced CT. From nodule libraries containing both low-dose unenhanced and standard-dose enhanced CT, 80 solid and 80 subsolid (40 part-solid, 40 non-solid) nodules of 135 patients were selected. Five thoracic radiologists categorized each nodule as solid, part-solid, or non-solid. Inter-protocol agreement between low-dose unenhanced and standard-dose enhanced images was measured by pooling κ values for classification into two (solid, subsolid) and three (solid, part-solid, non-solid) categories. Interchangeability between low-dose unenhanced and standard-dose enhanced CT for the two-category classification was assessed using a pre-defined equivalence limit of 8%. Inter-protocol agreement was considerably high for the classification into two categories (κ = 0.96; 95% confidence interval [CI], 0.94-0.98) and into three categories (κ = 0.88; 95% CI, 0.85-0.92). The probability of agreement between readers with standard-dose enhanced CT was 95.6% (95% CI, 94.5-96.6%), and that between low-dose unenhanced and standard-dose enhanced CT was 95.4% (95% CI, 94.7-96.0%). The difference between the two proportions was 0.25% (95% CI, -0.85-1.5%), with the upper bound of the CI markedly below 8%. Inter-protocol agreement for nodule classification was considerably high. Low-dose unenhanced CT can be used interchangeably with standard-dose enhanced CT for nodule classification.
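The κ values pooled above are chance-corrected agreement statistics. A minimal sketch of Cohen's kappa for a single pair of raters (or a single reader across two protocols) follows; this is the textbook formula, not the study's pooling procedure.

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa: agreement between two label sequences, corrected for
    the agreement expected by chance from each rater's marginal frequencies."""
    a, b = np.asarray(a), np.asarray(b)
    cats = np.union1d(a, b)
    po = np.mean(a == b)  # observed proportion of agreement
    # expected agreement: sum over categories of the product of marginals
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in cats)
    return (po - pe) / (1 - pe)
```

Kappa is 1 for perfect agreement, 0 when agreement is no better than chance, and negative when it is worse than chance.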
SPIRIT 2013 Statement: defining standard protocol items for clinical trials.
Chan, An-Wen; Tetzlaff, Jennifer M; Altman, Douglas G; Laupacis, Andreas; Gøtzsche, Peter C; Krle A-Jerić, Karmela; Hrobjartsson, Asbjørn; Mann, Howard; Dickersin, Kay; Berlin, Jesse A; Dore, Caroline J; Parulekar, Wendy R; Summerskill, William S M; Groves, Trish; Schulz, Kenneth F; Sox, Harold C; Rockhold, Frank W; Rennie, Drummond; Moher, David
2015-12-01
The protocol of a clinical trial serves as the foundation for study planning, conduct, reporting, and appraisal. However, trial protocols and existing protocol guidelines vary greatly in content and quality. This article describes the systematic development and scope of SPIRIT (Standard Protocol Items: Recommendations for Interventional Trials) 2013, a guideline for the minimum content of a clinical trial protocol. The 33-item SPIRIT checklist applies to protocols for all clinical trials and focuses on content rather than format. The checklist recommends a full description of what is planned; it does not prescribe how to design or conduct a trial. By providing guidance for key content, the SPIRIT recommendations aim to facilitate the drafting of high-quality protocols. Adherence to SPIRIT would also enhance the transparency and completeness of trial protocols for the benefit of investigators, trial participants, patients, sponsors, funders, research ethics committees or institutional review boards, peer reviewers, journals, trial registries, policymakers, regulators, and other key stakeholders.
SPIRIT 2013 statement: defining standard protocol items for clinical trials.
Chan, An-Wen; Tetzlaff, Jennifer M; Altman, Douglas G; Laupacis, Andreas; Gøtzsche, Peter C; Krleža-Jerić, Karmela; Hróbjartsson, Asbjørn; Mann, Howard; Dickersin, Kay; Berlin, Jesse A; Doré, Caroline J; Parulekar, Wendy R; Summerskill, William S M; Groves, Trish; Schulz, Kenneth F; Sox, Harold C; Rockhold, Frank W; Rennie, Drummond; Moher, David
2013-02-05
The protocol of a clinical trial serves as the foundation for study planning, conduct, reporting, and appraisal. However, trial protocols and existing protocol guidelines vary greatly in content and quality. This article describes the systematic development and scope of SPIRIT (Standard Protocol Items: Recommendations for Interventional Trials) 2013, a guideline for the minimum content of a clinical trial protocol. The 33-item SPIRIT checklist applies to protocols for all clinical trials and focuses on content rather than format. The checklist recommends a full description of what is planned; it does not prescribe how to design or conduct a trial. By providing guidance for key content, the SPIRIT recommendations aim to facilitate the drafting of high-quality protocols. Adherence to SPIRIT would also enhance the transparency and completeness of trial protocols for the benefit of investigators, trial participants, patients, sponsors, funders, research ethics committees or institutional review boards, peer reviewers, journals, trial registries, policymakers, regulators, and other key stakeholders.
Imam, Mohamed A.; Abdelkafy, Ashraf; Dinah, Feroz; Adhikari, Ajeya
2015-01-01
Background: The purpose of the current study was to determine whether a systematic five-step protocol for debridement and evacuation of bone debris during anterior cruciate ligament reconstruction (ACLR) reduces the presence of such debris on post-operative radiographs. Methods: A five-step protocol for removal of bone debris during arthroscopic assisted ACLR was designed. It was applied to 60 patients undergoing ACLR (Group 1), and high-quality digital radiographs were taken post-operatively in each case to assess for the presence of intra-articular bone debris. A control group of 60 consecutive patients in whom no specific bone debris protocol was applied (Group 2) and their post-operative radiographs were also checked for the presence of intra-articular bone debris. Results: In Group 1, only 15% of post-operative radiographs showed residual bone debris, compared to 69% in Group 2 (p < 0.001). Conclusion: A five-step systematic protocol for bone debris removal during arthroscopic assisted ACLR resulted in a significant decrease in residual bone debris seen on high-quality post-operative radiographs. PMID:27163060
Bires, Angela Macci; Lawson, Dori; Wasser, Thomas E; Raber-Baer, Donna
2013-12-01
Clinically valid cardiac evaluation via treadmill stress testing requires patients to achieve specific target heart rates and to successfully complete the cardiac examination. A comparison of the standard Bruce protocol and the ramped Bruce protocol was performed using data collected over a 1-y period from a targeted patient population with a body mass index (BMI) equal to or greater than 30 to determine which treadmill protocol provided more successful examination results. The functional capacity, metabolic equivalent units achieved, pressure rate product, and total time on the treadmill measured for the obese patients were clinically valid and comparable to those of normal-weight and overweight patients (P < 0.001). Data gathered from each protocol demonstrated that the ramped Bruce protocol achieved more consistent results across all BMI groups in bringing patients to 80%-85% of their age-predicted maximum heart rate. This study did not adequately establish that the ramped Bruce protocol was superior to the standard Bruce protocol for the examination of patients with a BMI equal to or greater than 30.
NASA Astrophysics Data System (ADS)
Raju, Kota Solomon; Merugu, Naresh Babu; Neetu, Babu, E. Ram
2016-03-01
ZigBee is a well-accepted industrial standard for wireless sensor networks based on the IEEE 802.15.4 standard. Wireless sensor networks are a major focus of communication research today; they investigate the properties of networks of small battery-powered sensors with wireless communication. Communication between any two nodes of a wireless sensor network is carried out through a protocol stack. This stack has been designed by different vendors in various ways: each vendor has its own protocol stack and algorithms, especially at the MAC layer. However, many applications require modifications of the algorithms at various layers, particularly energy-efficient protocols at the MAC layer. Such protocols are simulated in wireless sensor network simulators but are not tested in real-time systems, because vendors do not allow programmability of each layer in their protocol stacks. This problem can be termed vendor interoperability. The solution is to develop a programmable protocol stack in which we can design our own application as required. As a first part of this task, we implemented the physical layer and the transmission of data using it. This paper describes the transmission of the total number of bytes of a frame according to the IEEE 802.15.4 standard using the physical layer.
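As a concrete illustration of the framing this abstract describes, the sketch below assembles an IEEE 802.15.4 PHY protocol data unit (PPDU). The field layout (4-octet preamble of zeros, 0xA7 start-of-frame delimiter, 7-bit frame length, PSDU of at most 127 octets) follows the 2.4 GHz PHY of the standard; the function name and the example payload are illustrative, not taken from the paper.

```python
# Hypothetical sketch of assembling an IEEE 802.15.4 PHY protocol data
# unit (PPDU). Field sizes follow the 2.4 GHz O-QPSK PHY of the standard;
# build_ppdu and the example payload are invented for illustration.

def build_ppdu(psdu: bytes) -> bytes:
    """Prepend the synchronization header and PHY header to a payload."""
    if len(psdu) > 127:                  # aMaxPHYPacketSize is 127 octets
        raise ValueError("PSDU exceeds 127 octets")
    preamble = bytes(4)                  # 4 zero octets
    sfd = bytes([0xA7])                  # start-of-frame delimiter
    phr = bytes([len(psdu) & 0x7F])      # frame length in the low 7 bits
    return preamble + sfd + phr + psdu

frame = build_ppdu(b"\x41\x88\x01")      # example MAC-layer payload
# total bytes on the air = 4 (preamble) + 1 (SFD) + 1 (PHR) + len(PSDU)
```

The "total number of bytes of a frame" mentioned in the abstract is then simply the PSDU length plus the six synchronization and header octets.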
Urushihara, Hisashi; Murakami, Yuka; Matsui, Kenji; Tashiro, Shimon
2018-01-01
Under the Japanese drug regulatory system, post-marketing studies (PMS) must be in compliance with Good Post-marketing Study Practice (GPSP). The GPSP Ordinance lacks standards for the ethical conduct of PMSs; only post-marketing clinical trials are subject to Good Clinical Practice. We conducted a web-based questionnaire survey on the ethical conduct of PMSs in collaboration with the Japanese Society of Hospital Pharmacists and pharmacists belonging to the Society. 1819 hospitals around Japan answered the questionnaire, of which 503 hospitals had conducted company-sponsored PMSs in 2015. 40.2% of the hospitals had obtained informed consent from participating patients in at least one PMS conducted in 2015, the majority in written form. The most frequent reason for seeking informed consent in PMSs was to meet protocol requirements, followed by meeting institutional standard operating procedures and requests of the hospital's ethical review board. Ethical review of PMSs was conducted in 251 hospitals. Despite a lack of standards for informed consent and ethical review in PMSs, a considerable number of study sites employed both for PMSs. While company policies and protocols are likely to be major determinants of the ethical conduct of PMSs, the governmental regulatory agency should also play a significant role in implementing a standardized ethical code for the conduct of PMSs.
Novel methods of imaging and analysis for the thermoregulatory sweat test.
Carroll, Michael Sean; Reed, David W; Kuntz, Nancy L; Weese-Mayer, Debra Ellyn
2018-06-07
The thermoregulatory sweat test (TST) can be central to the identification and management of disorders affecting sudomotor function and small sensory and autonomic nerve fibers, but the cumbersome nature of the standard testing protocol has prevented its widespread adoption. A high resolution, quantitative, clean and simple assay of sweating could significantly improve identification and management of these disorders. Images from 89 clinical TSTs were analyzed retrospectively using two novel techniques. First, using the standard indicator powder, skin surface sweat distributions were determined algorithmically for each patient. Second, a fundamentally novel method using thermal imaging of forced evaporative cooling was evaluated through comparison with the standard technique. Correlation and receiver operating characteristic analyses were used to determine the degree of match between these methods, and the potential limits of thermal imaging were examined through cumulative analysis of all studied patients. Algorithmic encoding of sweating and non-sweating regions produces a more objective analysis for clinical decision making. Additionally, results from the forced cooling method correspond well with those from indicator powder imaging, with a correlation across spatial regions of -0.78 (CI: -0.84 to -0.71). The method works similarly across body regions, and frame-by-frame analysis suggests the ability to identify sweating regions within about 1 second of imaging. While algorithmic encoding can enhance the standard sweat testing protocol, thermal imaging with forced evaporative cooling can dramatically improve the TST by making it less time-consuming and more patient-friendly than the current approach.
A biochemical protocol for the isolation and identification of current species of Vibrio in seafood.
Ottaviani, D; Masini, L; Bacchiocchi, S
2003-01-01
We report a biochemical method for the isolation and identification of the current species of vibrios using a single operative protocol. The method involves an enrichment phase with incubation at 30 degrees C for 8-24 h in alkaline peptone water and an isolation phase on thiosulphate-citrate-bile salt-sucrose (TCBS) agar plates incubated at 30 degrees C for 24 h. Four biochemical tests and Alsina's scheme were performed for genus and species identification, respectively. All biochemical tests were optimized with regard to temperature, incubation time, and media composition. The whole standardized protocol always gave a correct identification when applied to 25 reference strains of Vibrio and 134 field isolates. The data demonstrated that the assay allows efficient recovery, isolation, and identification of current species of Vibrio in seafood, with results obtained within 2-7 days. Because it is based on biochemical tests, the method is applicable even in basic microbiology laboratories and can be used simultaneously to isolate and discriminate all clinically relevant species of Vibrio.
Cool Apps: Building Cryospheric Data Applications With Standards-Based Service Oriented Architecture
NASA Astrophysics Data System (ADS)
Collins, J. A.; Truslove, I.; Billingsley, B. W.; Oldenburg, J.; Brodzik, M.; Lewis, S.; Liu, M.
2012-12-01
The National Snow and Ice Data Center (NSIDC) holds a large collection of cryospheric data, and is involved in a number of informatics research and development projects aimed at improving the discoverability and accessibility of these data. To develop high-quality software in a timely manner, we have adopted a Service-Oriented Architecture (SOA) approach for our core technical infrastructure development. Data services at NSIDC are internally exposed to other tools and applications through standards-based service interfaces. These standards include OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), various OGC (Open Geospatial Consortium) standards including WMS (Web Map Service) and WFS (Web Feature Service), ESIP (Federation of Earth Sciences Information Partners) OpenSearch, and NSIDC-specific RESTful services. By taking a standards-based approach, we are able to use off-the-shelf tools and libraries to consume, translate and broker these data services, and thus develop applications faster. Additionally, by exposing public interfaces to these services we provide valuable data services to technical collaborators; for example, NASA Reverb (http://reverb.echo.nasa.gov) uses NSIDC's WMS services. Our latest generation of web applications consume these data services directly. The most complete example of this is the Operation IceBridge Data Portal (http://nsidc.org/icebridge/portal) which depends on many of the aforementioned services, and clearly exhibits many of the advantages of building applications atop a service-oriented architecture. This presentation outlines the architectural approach and components and open standards and protocols adopted at NSIDC, demonstrates the interactions and uses of public and internal service interfaces currently powering applications including the IceBridge Data Portal, and outlines the benefits and challenges of this approach.
CURRENT STATUS OF THE EPA PROTOCOL GAS PROGRAM
Accurate compressed gas calibration standards are needed to calibrate continuous emission monitors (CEMs) and ambient air quality monitors that are being used for regulatory purposes. EPA has published a protocol to establish the traceability of these standards to national refer...
Evaluating the Validity Indices of the Personality Assessment Inventory-Adolescent Version.
Meyer, Justin K; Hong, Sang-Hwang; Morey, Leslie C
2015-08-01
Past research has established strong psychometric properties of several indicators of response distortion on the Personality Assessment Inventory (PAI). However, to date, it has been unclear whether the response distortion indicators of the adolescent version of the PAI (PAI-A) operate in an equally valid manner. The current study sought to examine several response distortion indicators on the PAI-A to determine their relative efficacy at the detection of distorted responding, including both positive distortion and negative distortion. Protocols of 98 college students asked to either overreport or underreport were compared with 98 age-matched individuals sampled from the clinical standardization sample and the community standardization sample, respectively. Comparisons between groups were accomplished through the examination of effect sizes and receiver operating characteristic curves. All indicators demonstrated the ability to distinguish between actual and feigned responding, including several newly developed indicators. This study provides support for the ability of distortion indicators developed for the PAI to also function appropriately on the PAI-A. © The Author(s) 2014.
Using the ACR/NEMA standard with TCP/IP and Ethernet
NASA Astrophysics Data System (ADS)
Chimiak, William J.; Williams, Rodney C.
1991-07-01
There is a need for a consolidated picture archival and communications system (PACS) in hospitals. At the Bowman Gray School of Medicine of Wake Forest University (BGSM), the authors are enhancing the ACR/NEMA Version 2 protocol using UNIX sockets and TCP/IP to greatly improve connectivity. Initially, nuclear medicine studies using gamma cameras are to be sent to PACS. The ACR/NEMA Version 2 protocol provides the functionality of the upper three layers of the open system interconnection (OSI) model in this implementation. The images, imaging equipment information, and patient information are then sent in ACR/NEMA format to a software socket. From there the data are handed to the TCP/IP protocol, which provides the transport and network service. TCP/IP, in turn, uses the services of IEEE 802.3 (Ethernet) to complete the connectivity. The advantage of this implementation is threefold: (1) Only one I/O port is consumed by numerous nuclear medicine cameras, instead of a physical port for each camera. (2) Standard protocols are used, which maximizes interoperability with ACR/NEMA-compliant PACSs. (3) The use of sockets allows a migration path to the transport and networking services of OSI's TP4 and connectionless network service, as well as to the high-performance protocol being considered by the American National Standards Institute (ANSI) and the International Organization for Standardization (ISO): the Xpress Transfer Protocol (XTP). The use of sockets also gives access to ANSI's Fiber Distributed Data Interface (FDDI) as well as other high-speed network standards.
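The socket-based multiplexing described above can be sketched as follows. This is not the ACR/NEMA Version 2 wire format; it is a minimal length-prefixed framing over TCP sockets, assuming only that each camera's messages arrive on their own connection, which is how a single listening port can serve many senders.

```python
import socket
import struct

# Minimal sketch (not the actual ACR/NEMA message format) of framing
# data over a TCP socket: each message carries a 4-byte big-endian
# length prefix, so a receiver can recover message boundaries from the
# byte stream regardless of how TCP segments the data.

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes, looping because recv() may return fewer."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-message")
        buf += chunk
    return buf

def send_message(sock: socket.socket, payload: bytes) -> None:
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_message(sock: socket.socket) -> bytes:
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)
```

A server would accept() each camera's connection on the same listening socket and call recv_message in a loop, which is the sense in which one I/O port replaces a physical port per camera.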
Brillantino, A; Iacobellis, F; Robustelli, U; Villamaina, E; Maglione, F; Colletti, O; De Palma, M; Paladino, F; Noschese, G
2016-10-01
The advantages of the conservative approach for major spleen injuries are still debated. This study was designed to evaluate the safety and effectiveness of non-operative management (NOM) in the treatment of minor (grade I-II according to the American Association for the Surgery of Trauma; AAST) and severe (AAST grade III-V) blunt splenic trauma, following a standardized treatment protocol. All hemodynamically stable patients with computed tomography (CT) diagnosis of blunt splenic trauma underwent NOM, which included strict clinical and laboratory observation, 48-72 h contrast-enhanced ultrasonography (CEUS) follow-up, and splenic angioembolization, performed both in patients with admission CT evidence of vascular injuries and in patients with falling hematocrit during observation. 87 patients [32 (36.7%) women and 55 (63.2%) men, median age 34 (range 14-68)] were included. Of these, 28 patients (32.1%) had grade I, 22 patients (25.2%) grade II, 20 patients (22.9%) grade III, 11 patients (12.6%) grade IV, and 6 patients (6.8%) grade V injuries. The overall success rate of NOM was 95.4% (82/87). There was no significant difference in the success rate between patients with different splenic injury grades. Of the 24 patients who had undergone angioembolization, 22 (91.6%) had high-grade splenic injuries. The success rate of embolization was 91.6% (22/24). No major complications were observed. The minor complications (2 pleural effusions, 1 pancreatic fistula, and 2 splenic abscesses) were successfully treated by EAUS- or CT-guided drainage. The non-operative management of blunt splenic trauma, according to our protocol, represents a safe and effective treatment for both minor and severe injuries, achieving an overall success rate of 95%. Angiographic study may be indicated both in patients with CT evidence of vascular injuries and in patients with high-grade splenic injuries, regardless of CT findings.
Practicing Surgeons Lead in Quality Care, Safety, and Cost Control
Shively, Eugene H.; Heine, Michael J.; Schell, Robert H.; Sharpe, J Neal; Garrison, R Neal; Vallance, Steven R.; DeSimone, Kenneth J.S.; Polk, Hiram C.
2004-01-01
Objective: To report the experiences of 66 surgical specialists from 15 different hospitals who performed 43 CPT-based procedures more than 16,000 times. Summary Background Data: Surgeons are under increasing pressure to demonstrate patient safety data as quantitated by objective and subjective outcomes that meet or exceed the standards of benchmark institutions or databases. Methods: Data from 66 surgical specialists on 43 CPT-based procedures were accessioned over a 4-year period. The hospitals vary from a small 30-bed hospital to large teaching hospitals. All reported deaths and complications were verified from hospital and office records and compared with benchmarks. Results: Over a 4-year inclusive period (1999–2002), 16,028 elective operations were accessioned. There was a total 1.4% complication rate and 0.05% death rate. A system has been developed for tracking outcomes. A wide range of improvements have been identified. These include the following: 1) improved classification of indications for systemic prophylactic antibiotic use and reduction in the variety of drugs used, 2) shortened length of stay for standard procedures in different surgical specialties, 3) adherence to strict indicators for selected operative procedures, 4) less use of costly diagnostic procedures, 5) decreased use of expensive home health services, 6) decreased use of very expensive drugs, 7) identification of the unnecessary expense of disposable laparoscopic devices, 8) development of a method to compare a one-surgeon hospital with his peers, and 9) development of unique protocols for interaction of anesthesia and surgery. The system also provides a very good basis for confirmation of patient safety and improvement therein. Conclusions: Since 1998, Quality Surgical Solutions, PLLC, has developed simple physician-authored protocols for delivering high-quality and cost-effective surgery that measure up to benchmark institutions. 
We have discovered wide areas for improvements in surgery by adherence to simple protocols, minimizing death and complications and clarifying cost issues. PMID:15166954
Yang, Min; Chen, Pei-Yu; Gong, Si-Tang; Lyman, Beth; Geng, Lan-Lan; Liu, Li-Ying; Liang, Cui-Ping; Xu, Zhao-Hui; Li, Hui-Wen; Fang, Tie-Fu; Li, Ding-You
2014-11-01
A standard nutrition screening and enteral nutrition (EN) protocol was implemented in January 2012 in a tertiary children's center in China. The aims of the present study were to evaluate the cost-effectiveness of a standard EN protocol in hospitalized patients. A retrospective chart review was performed in the gastroenterology inpatient unit. We included all inpatient children requiring EN from January 1, 2010, to December 31, 2013, with common gastrointestinal (GI) diseases. Children from January 1, 2012, to December 31, 2013, served as the standard EN treatment group, and those from January 1, 2010, to December 31, 2011, were the control EN group. Pertinent patient information was collected. We also analyzed the length of hospital stay, cost of care, and in-hospital infection rates. The standard EN treatment group received more nasojejunal tube feedings. There was a tendency for the standard EN treatment group to receive more elemental and hydrolyzed protein formulas. Implementation of a standard EN protocol significantly reduced the time to initiate EN (32.38 ± 24.50 hours vs 18.76 ± 13.53 hours; P = .011) and the time to reach a targeted calorie goal (7.42 ± 3.98 days vs 5.06 ± 3.55 days; P = .023); length of hospital stay was shortened by 3.2 days after implementation of the standard EN protocol but did not reach statistical significance. However, the shortened length of hospital stay contributed to a significant reduction in the total cost of hospital care (13,164.12 ± 6722.95 Chinese yuan [CNY] vs 9814.96 ± 4592.91 CNY; P < .032). Implementation of a standard EN protocol resulted in early initiation of EN, shortened length of stay, and significantly reduced total cost of care in hospitalized children with common GI diseases. © 2014 American Society for Parenteral and Enteral Nutrition.
Dynamic federations: storage aggregation using open tools and protocols
NASA Astrophysics Data System (ADS)
Furano, Fabrizio; Brito da Rocha, Ricardo; Devresse, Adrien; Keeble, Oliver; Álvarez Ayllón, Alejandro; Fuhrmann, Patrick
2012-12-01
A number of storage elements now offer standard protocol interfaces like NFS 4.1/pNFS and WebDAV for access to their data repositories, in line with the standardization effort of the European Middleware Initiative (EMI). The LCG File Catalogue (LFC) can also offer such features. Here we report on work that seeks to exploit the federation potential of these protocols and build a system that offers a unique view of the storage and metadata ensemble, with the possibility of integrating other compatible resources such as those from cloud providers. The challenge, here undertaken by the providers of dCache and DPM, and pragmatically open to other Grid and Cloud storage solutions, is to build such a system while accommodating name translations from existing catalogues (e.g. LFCs), experiment-based metadata catalogues, or stateless algorithmic name translations, also known as “trivial file catalogues”. Such storage federations of standard-protocol-based storage elements give a unique view of their content, thus promoting simplicity in accessing the data they contain and offering new possibilities for resilience and data placement strategies. The goal is to consider HTTP- and NFS 4.1-based storage elements and metadata catalogues and make them able to cooperate through an architecture that properly feeds the redirection mechanisms they are based upon, thus providing the functionality of a “loosely coupled” storage federation. One of the key requirements is to use standard clients (provided by OSes or open-source distributions, e.g. Web browsers) to access an already aggregated system; this approach is quite different from aggregating the repositories at the client side through some wrapper API, such as GFAL, or by developing new custom clients.
Other technical challenges that will determine the success of this initiative include performance, latency and scalability, and the ability to create worldwide storage federations that are able to redirect clients to repositories that they can efficiently access, for instance trying to choose the endpoints that are closer or applying other criteria. We believe that the features of a loosely coupled federation of open-protocols-based storage elements will open many possibilities of evolving the current computing models without disrupting them, and, at the same time, will be able to operate with the existing infrastructures, follow their evolution path and add storage centers that can be acquired as a third-party service.
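The redirection mechanism on which such a federation rests can be illustrated with standard-library tools alone. The endpoint paths and payload below are invented for the example; the point is that an unmodified HTTP client follows the 302 from a federation front-end to the storage element holding the replica, with no wrapper API involved.

```python
import http.server
import threading
import urllib.request

# Toy illustration (not DPM/dCache code) of HTTP-redirect federation:
# the front-end path answers with a 302 Location pointing at the
# "storage element" path, and a plain urllib client follows it.

class Frontend(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/fed/data.txt":
            self.send_response(302)            # redirect to the replica
            self.send_header("Location", "/se1/data.txt")
            self.end_headers()
        elif self.path == "/se1/data.txt":
            body = b"replica content"
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):              # silence request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Frontend)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# A standard client never sees the indirection: it asks the federation
# URL and transparently receives the bytes from the storage element.
data = urllib.request.urlopen(f"http://127.0.0.1:{port}/fed/data.txt").read()
server.shutdown()
```

In the real system the Location header would name a different host entirely, which is how clients are steered toward closer or less-loaded endpoints.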
A Weak Value Based QKD Protocol Robust Against Detector Attacks
NASA Astrophysics Data System (ADS)
Troupe, James
2015-03-01
We propose a variation of the BB84 quantum key distribution protocol that utilizes the properties of weak values to ensure the validity of the quantum bit error rate estimates used to detect an eavesdropper. The protocol is shown theoretically to be secure against recently demonstrated attacks utilizing detector blinding and control and should also be robust against all detector-based hacking. Importantly, the new protocol promises to achieve this additional security without negatively impacting the secure key generation rate as compared to that originally promised by the standard BB84 scheme. Implementation of the weak measurements needed by the protocol should be very feasible using standard quantum optical techniques.
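For orientation, a toy simulation of the sifting stage of standard BB84 (the baseline the proposed variant extends) is sketched below. The weak-value monitoring itself is not modeled, and the function name and parameters are illustrative.

```python
import random

# Toy classical simulation of BB84 basis sifting. With no eavesdropper,
# Bob's measurement outcome equals Alice's bit whenever their randomly
# chosen bases match; mismatched-basis rounds are discarded, so on
# average about half of the transmitted qubits contribute to the key.

def bb84_sift(n_qubits: int, seed: int = 1) -> list:
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n_qubits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_qubits)]  # 0 = Z, 1 = X
    bob_bases = [rng.randint(0, 1) for _ in range(n_qubits)]
    sifted = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
              if a == b]
    return sifted

key = bb84_sift(1000)
```

An eavesdropper measuring in random bases would disturb roughly a quarter of the sifted bits; the abstract's concern is that detector-control attacks corrupt exactly the error-rate estimate meant to reveal this disturbance, which the weak-value variant is designed to protect.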
Simultaneous dense coding affected by fluctuating massless scalar field
NASA Astrophysics Data System (ADS)
Huang, Zhiming; Ye, Yiyong; Luo, Darong
2018-04-01
In this paper, we investigate the simultaneous dense coding (SDC) protocol affected by a fluctuating massless scalar field. The noisy model of the SDC protocol is constructed, and the master equation that governs the SDC evolution is deduced. The success probabilities of the SDC protocol are discussed for different locking operators under the influence of vacuum fluctuations. We find that the joint success probability is independent of the locking operator, but the other success probabilities are not. For the quantum Fourier transform and double controlled-NOT operators, the success probabilities drop with increasing two-atom distance, but this is not the case for the SWAP operator. Unlike with the SWAP operator, the success probabilities of Bob and Charlie differ. For different noise interval values, different locking operators show different robustness to noise.
Yu, Yang; Rajagopal, Ram
2015-02-17
Two dispatch protocols have been adopted by electricity markets to deal with the uncertainty of wind power, but the effects of the choice between these protocols have not been comprehensively analyzed. We establish a framework to compare the impacts of adopting different dispatch protocols on the efficacy of using wind power and of implementing a carbon tax to reduce emissions. We suggest that a market has high potential to achieve greater emission reduction by adopting the stochastic dispatch protocol instead of the static protocol when the wind energy in the market is highly uncertain or the market has enough adjustable generators, such as gas-fired combustion generators. Furthermore, the carbon-tax policy is more cost-efficient for reducing CO2 emissions when the market operates according to the stochastic protocol rather than the static protocol. An empirical study, calibrated with data from the Electric Reliability Council of Texas market, confirms that using wind energy in the Texas market results in a 12% CO2 emission reduction when the market uses the stochastic dispatch protocol, compared with the 8% emission reduction associated with the static protocol. In addition, if a $6/ton carbon tax is implemented in the Texas market operated according to the stochastic protocol, the CO2 emission is similar to the emission level from the same market with a $16/ton carbon tax operated according to the static protocol. Correspondingly, the $16/ton carbon tax associated with the static protocol costs 42.6% more than the $6/ton carbon tax associated with the stochastic protocol.
Low Resolution Picture Transmission (LRPT) Demonstration System. Phase II; 1.0
NASA Technical Reports Server (NTRS)
Fong, Wai; Yeh, Pen-Shu; Duran, Steve; Sank, Victor; Nyugen, Xuan; Xia, Wei; Day, John H. (Technical Monitor)
2002-01-01
Low-Resolution Picture Transmission (LRPT) is a proposed standard for direct broadcast transmission of satellite weather images. This standard is a joint effort by the European Organization for the Exploitation of Meteorological Satellites (EUMETSAT) and NOAA. As a digital transmission scheme, its purpose is to replace the current analog Automatic Picture Transmission (APT) system for use in the Meteorological Operational (METOP) satellites. GSFC has been tasked to build an LRPT Demonstration System (LDS). Its main objective is to develop or demonstrate the feasibility of a low-cost receiver utilizing a PC as the primary processing component and determine the performance of the protocol in the simulated Radio Frequency (RF) environment. The approach would consist of two phases.
Network operability of ground-based microwave radiometers: Calibration and standardization efforts
NASA Astrophysics Data System (ADS)
Pospichal, Bernhard; Löhnert, Ulrich; Küchler, Nils; Czekala, Harald
2017-04-01
Ground-based microwave radiometers (MWR) are already widely used by national weather services and research institutions around the world. Most of the instruments operate continuously, and their data are beginning to be assimilated into atmospheric models. In particular, their potential for continuously observing boundary-layer temperature profiles as well as integrated water vapor and cloud liquid water path makes them valuable for improving short-term weather forecasts. Until now, however, most MWR have been operated as stand-alone instruments. To benefit from a network of these instruments, standardization of calibration, operation, and data format is necessary. Within the framework of TOPROF (COST Action ES1303), several efforts have been undertaken, such as uncertainty and bias assessment and calibration intercomparison campaigns. The goal was to establish protocols for providing quality-controlled (QC) MWR data and their uncertainties. To this end, standardized calibration procedures for MWR have been developed and recommendations for radiometer users compiled. Based on the results of the TOPROF campaigns, a new high-accuracy liquid-nitrogen calibration load has been introduced for MWR manufactured by Radiometer Physics GmbH (RPG). The new load improves the accuracy of the measurements considerably and will lead to even more reliable atmospheric observations. In addition to recommendations for the set-up, calibration, and operation of ground-based MWR within a future network, we will present homogenized methods to determine the accuracy of a running calibration as well as means for automatic data quality control. This sets the stage for the planned microwave calibration center at JOYCE (Jülich Observatory for Cloud Evolution), which will be briefly introduced.
Rethinking the NTCIP Design and Protocols - Analyzing the Issues
DOT National Transportation Integrated Search
1998-03-03
This working paper discusses the issues involved in changing the current draft NTCIP standard from an X.25-based protocol stack to an Internet-based protocol stack. It contains a methodology which could be used to change NTCIP's base protocols. This ...
Comparison of EPA Method 1615 RT-qPCR Assays in Standard and Kit Format
EPA Method 1615 contains protocols for measuring enterovirus and norovirus by reverse transcription quantitative polymerase chain reaction. A commercial kit based upon these protocols was designed and compared to the method's standard approach. Reagent grade, secondary effluent, ...
Kocha, Shyam S.; Shinozaki, Kazuma; Zack, Jason W.; ...
2017-05-02
Thin-film rotating disk electrodes (TF-RDEs) are the half-cell electrochemical system of choice for rapid screening of the oxygen reduction reaction (ORR) activity of novel Pt supported on carbon black (Pt/C) electrocatalysts. It has been shown that the magnitude of the measured ORR activity and its reproducibility are highly dependent on system cleanliness, evaluation protocols, and operating conditions, as well as ink formulation, composition, film drying, and the resultant film thickness and uniformity. Accurate benchmarks of baseline Pt/C catalysts evaluated using standardized protocols and best practices are necessary to expedite the ultra-low-platinum group metal (PGM) catalyst development that is crucial for the imminent commercialization of fuel cell vehicles. We report results of evaluation in three independent laboratories of Pt/C electrocatalysts provided by commercial fuel cell catalyst manufacturers (Johnson Matthey, Umicore, Tanaka Kikinzoku Kogyo - TKK). The studies were conducted using identical evaluation protocols, ink formulations, and film fabrication, albeit employing unique electrochemical cell designs specific to each laboratory. Furthermore, the ORR activities reported in this work provide a baseline and criteria for selection and scale-up of novel high-activity ORR electrocatalysts for implementation in proton exchange membrane fuel cells (PEMFCs).
Toward an interim standard for patient-centered knowledge-access.
Tuttle, M. S.; Sherertz, D. D.; Fagan, L. M.; Carlson, R. W.; Cole, W. G.; Schipma, P. B.; Nelson, S. J.
1993-01-01
Most care-giver "knowledge" needs arise at the point of care and are "patient-centered." Many of these knowledge needs can be met using existing on-line knowledge sources, but the process is currently too time-consuming, even for the computer-proficient. We are developing a set of public domain standards aimed at bringing potentially relevant knowledge to the point of care in a straightforward and timely fashion. The standards will a) make use of selected items from a Computer-based Patient Record (CPR), e.g., a diagnosis and measure of severity, b) anticipate certain care-giver knowledge needs, e.g., "therapy," "protocols," "complications," and c) try to satisfy those needs from available knowledge sources, e.g., knowledge-bases, citation databases, practice guidelines, and on-line textbooks. The standards will use templates, i.e., fill-in-the-blank structures, to anticipate knowledge needs and UMLS Metathesaurus enhancements to represent the content of knowledge sources. Together, the standards will form the specification for a "Knowledge-Server" (KS) designed to be accessed from any CPR system. Plans are in place to test an interim version of this specification in the context of medical oncology. We are accumulating anecdotal evidence that a KS operating in conjunction with a CPR is much more compelling to users than either a CPR or a KS operating alone. PMID:8130537
Lombardo, Marco; Giannini, Daniela; Lombardo, Giuseppe; Serrao, Sebastiano
2017-06-01
To compare clinical outcomes of transepithelial corneal cross-linking using iontophoresis (T-ionto CL) and standard corneal cross-linking (standard CL) for the treatment of progressive keratoconus 12 months after the operation. Prospective randomized controlled clinical trial. Thirty-four eyes of 25 participants with progressive keratoconus were randomized into T-ionto CL (22 eyes) or standard CL (12 eyes). T-ionto CL was performed using an iontophoresis device with dextran-free 0.1% riboflavin-5-phosphate solution with enhancers and by irradiating the cornea with a 10 mW/cm² ultraviolet A device for 9 minutes. Standard CL was performed according to the Dresden protocol. The primary outcome measure was stabilization of keratoconus after 12 months through analysis of maximum simulated keratometry readings (Kmax, diopters). Other outcome measures were corrected distance visual acuity (CDVA, logarithm of the minimum angle of resolution [logMAR]), manifest spherical equivalent refraction (D), central corneal thickness (CCT, micrometers), and endothelial cell density (ECD). Follow-up examinations were arranged at 3 and 7 days and 1, 3, 6, and 12 months. Twelve months after T-ionto CL and standard CL, Kmax on average flattened by -0.52±1.30 D (P = 0.06) and -0.82±1.20 D (P = 0.04), respectively. The mean change in CDVA was -0.10±0.12 logMAR (P = 0.003) and -0.03±0.06 logMAR (P = 0.10) after T-ionto CL and standard CL, respectively. The manifest spherical equivalent refraction changed on average by +0.71±1.44 D (P = 0.03) and +0.21±0.76 D (P = 0.38), respectively. The CCT and ECD measures did not change significantly in either group at 12 months. Significant differences in the outcome measures between treatments were found in the first week postoperatively. No complications occurred in the T-ionto CL group; in the standard CL group, 1 eye (8%) had sterile corneal infiltrates, which did not affect the final visual acuity.
Significant visual and refractive improvements were found 12 months after T-ionto CL, though the average improvement in corneal topography readings was slightly smaller than with the standard Dresden protocol over the same period. Copyright © 2017 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
Motosugi, Utaroh; Hernando, Diego; Wiens, Curtis; Bannas, Peter; Reeder, Scott B.
2017-01-01
Purpose: To determine whether high signal-to-noise ratio (SNR) acquisitions improve the repeatability of liver proton density fat fraction (PDFF) measurements using confounder-corrected chemical shift-encoded magnetic resonance (MR) imaging (CSE-MRI). Materials and Methods: Eleven fat-water phantoms were scanned with 8 different protocols with varying SNR. After repositioning the phantoms, the same scans were repeated to evaluate the test-retest repeatability. Next, an in vivo study was performed with 20 volunteers and 28 patients scheduled for liver magnetic resonance imaging (MRI). Two CSE-MRI protocols with standard- and high-SNR were repeated to assess test-retest repeatability. MR spectroscopy (MRS)-based PDFF was acquired as a standard of reference. The standard deviation (SD) of the difference (Δ) of PDFF measured in the two repeated scans was defined to ascertain repeatability. The correlation between PDFF of CSE-MRI and MRS was calculated to assess accuracy. The SD of Δ and correlation coefficients of the two protocols (standard- and high-SNR) were compared using F-test and t-test, respectively. Two reconstruction algorithms (complex-based and magnitude-based) were used for both the phantom and in vivo experiments. Results: The phantom study demonstrated that higher SNR improved the repeatability for both complex- and magnitude-based reconstruction. Similarly, the in vivo study demonstrated that the repeatability of the high-SNR protocol (SD of Δ = 0.53 for complex- and = 0.85 for magnitude-based fit) was significantly higher than using the standard-SNR protocol (0.77 for complex, P < 0.001; and 0.94 for magnitude-based fit, P = 0.003). No significant difference was observed in the accuracy between standard- and high-SNR protocols. Conclusion: Higher SNR improves the repeatability of fat quantification using confounder-corrected CSE-MRI. PMID:28190853
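The repeatability statistic used in this study, the standard deviation of the difference (Δ) between test and retest measurements, can be sketched as follows; the PDFF values below are illustrative, not taken from the paper:

```python
import statistics

def repeatability_sd(scan1, scan2):
    """Standard deviation of the paired test-retest differences (delta)."""
    deltas = [a - b for a, b in zip(scan1, scan2)]
    # Sample standard deviation of the differences; smaller = more repeatable
    return statistics.stdev(deltas)

# Illustrative PDFF values (%) from two repeated scans of the same subjects
scan1 = [5.2, 12.1, 3.4, 22.8, 8.0]
scan2 = [5.6, 11.5, 3.9, 22.1, 8.4]
print(round(repeatability_sd(scan1, scan2), 2))  # → 0.6
```

Comparing this statistic between two protocols (as the study does with an F-test) then amounts to comparing the variances of the two sets of Δ values.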
Simple algorithm for improved security in the FDDI protocol
NASA Astrophysics Data System (ADS)
Lundy, G. M.; Jones, Benjamin
1993-02-01
We propose a modification to the Fiber Distributed Data Interface (FDDI) protocol based on a simple algorithm that will improve confidential communication capability. This proposed modification provides a simple and reliable system that exploits some of the inherent security properties of a fiber-optic ring network. This method differs from conventional methods in that end-to-end encryption can be facilitated at the media access control sublayer of the data link layer in the OSI network model. Our method is based on a variation of the bit-stream cipher method. The transmitting station takes the intended confidential message and applies a simple modulo-two addition operation against an initialization vector. The encrypted message is virtually unbreakable without the initialization vector. None of the stations on the ring will have access to both the encrypted message and the initialization vector except the transmitting and receiving stations. The generation of the initialization vector is unique for each confidential transmission and thus provides a unique approach to the key distribution problem. The FDDI protocol is of particular interest to the military in terms of LAN/MAN implementations. Both the Army and the Navy are considering the standard as the basis for future network systems. A simple and reliable security mechanism with the potential to support real-time communications is a necessary consideration in the implementation of these systems. The proposed method offers several advantages over traditional methods in terms of speed, reliability, and standardization.
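As a rough illustration of the bit-stream idea described above, the sketch below XORs (modulo-two addition) a message against a keystream expanded from an initialization vector. The hash-based keystream expansion is a stand-in of our own for illustration, not the mechanism proposed in the paper:

```python
import hashlib

def keystream(iv: bytes, length: int) -> bytes:
    """Expand the initialization vector into a keystream (illustrative:
    a real design would use a vetted stream cipher, not plain hashing)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(iv + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(message: bytes, iv: bytes) -> bytes:
    """Modulo-two addition (XOR) of message and keystream; the same
    operation both encrypts and decrypts."""
    ks = keystream(iv, len(message))
    return bytes(m ^ k for m, k in zip(message, ks))

iv = b"fresh-iv-per-transmission"  # unique per confidential transmission
ciphertext = xor_cipher(b"confidential frame payload", iv)
assert xor_cipher(ciphertext, iv) == b"confidential frame payload"
```

The symmetry of XOR is what lets the receiving station recover the plaintext with the same operation, provided it alone shares the initialization vector with the sender.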
Quality indicators for eye bank.
Acharya, Manisha; Biswas, Saurabh; Das, Animesh; Mathur, Umang; Dave, Abhishek; Singh, Ashok; Dubey, Suneeta
2018-03-01
The aim of this study is to identify quality indicators of the eye bank and validate their effectivity. Adverse reaction rate, discard rate, protocol deviation rate, and compliance rate were defined as Quality Indicators of the eye bank. These were identified based on definition of quality that captures two dimensions - "result quality" and "process quality." The indicators were measured and tracked as part of quality assurance (QA) program of the eye bank. Regular audits were performed to validate alignment of standard operating procedures (SOP) with regulatory and surgeon acceptance standards and alignment of activities performed in the eye bank with the SOP. Prospective study of the indicators was performed by comparing their observed values over the period 2011-2016. Adverse reaction rate decreased more than 8-fold (from 0.61% to 0.07%), discard rate decreased and stabilized at 30%, protocol deviation rate decreased from 1.05% to 0.08%, and compliance rate reported by annual quality audits improved from 59% to 96% at the same time. In effect, adverse reaction rate, discard rate, and protocol deviation rate were leading indicators, and compliance rate was the trailing indicator. These indicators fulfill an important gap in available literature on QA in eye banking. There are two ways in which these findings can be meaningful. First, eye banks which are new to quality measurement can adopt these indicators. Second, eye banks which are already deeply engaged in quality improvement can test these indicators in their eye bank, thereby incorporating them widely and improving them over time.
Research and realization of signal simulation on virtual instrument
NASA Astrophysics Data System (ADS)
Zhao, Qi; He, Wenting; Guan, Xiumei
2010-02-01
In engineering projects, an arbitrary waveform generator controlled through a software interface is needed for simulation and test. This article discusses a program that uses the SCPI (Standard Commands for Programmable Instruments) protocol and the VISA (Virtual Instrument Software Architecture) library to control an Agilent signal generator (Agilent N5182A) over the LAN interface. The program can generate several signal types, such as CW (continuous wave), AM (amplitude modulation), FM (frequency modulation), ΦM (phase modulation), and sweep. As a result, the program system has good operability and portability.
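A minimal sketch of the kind of SCPI command sequence such a program would assemble for a CW setup is shown below. The mnemonics follow common SCPI style but should be checked against the N5182A programming guide, and the VISA resource address in the comment is hypothetical:

```python
def cw_setup(freq_hz: float, power_dbm: float) -> list[str]:
    """Build SCPI commands to configure a CW carrier and enable RF output
    (typical SCPI syntax; verify mnemonics against the instrument manual)."""
    return [
        f":FREQuency {freq_hz:.0f}",
        f":POWer {power_dbm:g} dBm",
        ":OUTPut:STATe ON",
    ]

commands = cw_setup(1e9, -10)  # 1 GHz carrier at -10 dBm
print(commands)
# In practice the commands would be sent over LAN via a VISA library, e.g.:
#   import pyvisa
#   rm = pyvisa.ResourceManager()
#   inst = rm.open_resource("TCPIP0::192.168.0.10::inst0::INSTR")  # hypothetical address
#   for cmd in commands:
#       inst.write(cmd)
```

Separating command construction from transmission keeps the sequence easy to unit-test without instrument hardware attached.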
Quantum communication using a multiqubit entangled channel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghose, Shohini, E-mail: sghose@wlu.ca; Institute for Quantum Computing, University of Waterloo, Ontario; Hamel, Angele
We describe a protocol in which two senders each teleport a qubit to a receiver using a multiqubit entangled state. The multiqubit channel used for teleportation is genuinely 4-qubit entangled and is not equivalent to a product of maximally entangled Bell pairs under local unitary operations. We discuss a scenario in which both senders must participate for the qubits to be successfully teleported. Such an all-or-nothing scheme cannot be implemented with standard two-qubit entangled Bell pairs and can be useful for different communication and computing tasks.
Proceedings of a Conference on Telecommunication Technologies, Networkings and Libraries
NASA Astrophysics Data System (ADS)
Knight, N. K.
1981-12-01
Current and developing technologies for digital transmission of image data likely to have an impact on the operations of libraries and information centers or to provide support for information networking are reviewed. Technologies reviewed include slow-scan television, teleconferencing, and videodisc technology; standards development for computer network interconnection through hardware and software, particularly packet-switched networks; computer network protocols for library and information service applications; the structure of a national bibliographic telecommunications network; and the major policy issues involved in the regulation or deregulation of the common communications carrier industry.
Cloud Computing Security Issue: Survey
NASA Astrophysics Data System (ADS)
Kamal, Shailza; Kaur, Rajpreet
2011-12-01
Cloud computing has been a growing field in the IT industry since 2007, when it was proposed by IBM. Other companies, such as Google, Amazon, and Microsoft, provide further cloud computing products. Cloud computing is internet-based computing that shares resources and information on demand. It provides services such as SaaS, IaaS, and PaaS. The services and resources are shared through virtualization, which runs multiple applications on the cloud. This discussion surveys the security challenges in cloud computing and describes some standards and protocols that show how security can be managed.
NASA Operational Environment Team (NOET): NASA's key to environmental technology
NASA Technical Reports Server (NTRS)
Cook, Beth
1993-01-01
NASA has stepped forward to face the environmental challenge to eliminate the use of Ozone-Layer Depleting Substances (OLDS) and to reduce our Hazardous Air Pollutants (HAP) by 50 percent in 1995. These requirements have been issued by the Clean Air Act, the Montreal Protocol, and various other legislative acts. A proactive group, the NASA Operational Environment Team (NOET), received its charter in April 1992 and was tasked with providing a network through which replacement activities and development experiences can be shared. This is a NASA-wide team that supports the research and development community by sharing information both in person and via a computerized network, assisting in specification and standard revisions, developing cleaner propulsion systems, and exploring environmentally compliant alternatives to current processes.
Patil, Nivritti G; Cheng, Stephen W K; Wong, John
2003-08-01
Recent high-profile cases have heightened the need for a formal structure to monitor achievement and maintenance of surgical competence. Logbooks, morbidity and mortality meetings, videos and direct observation of operations using a checklist, motion analysis devices, and virtual reality simulators are effective tools for teaching and evaluating surgical skills. As the operating theater is also a place for training, there must be protocols and guidelines, including mandatory standards for supervision, to ensure that patient care is not compromised. Patients appreciate frank communication and honesty from surgeons regarding their expertise and level of competence. To ensure that surgical competence is maintained and keeps pace with technologic advances, professional registration bodies have been promoting programs for recertification. They evaluate performance in practice, professional standing, and commitment to ongoing education.
Sigma Routing Metric for RPL Protocol.
Sanmartin, Paul; Rojas, Aldo; Fernandez, Luis; Avila, Karen; Jabba, Daladier; Valle, Sebastian
2018-04-21
This paper presents the adaptation of a specific metric for the RPL protocol in the objective function MRHOF. Among the functions standardized by the IETF, we find OF0, which is based on the minimum hop count, as well as MRHOF, which is based on the Expected Transmission Count (ETX). However, when the network becomes denser or the number of nodes increases, both OF0 and MRHOF introduce long hops, which can generate a bottleneck that restricts the network. The adaptation is proposed to optimize both OFs through a new routing metric. To solve the above problem, the metrics of the minimum number of hops and the ETX are combined by designing a new routing metric called SIGMA-ETX, in which the best route is selected using the standard deviation of ETX values between each node, as opposed to working with the ETX average along the route. This method ensures better routing performance in dense sensor networks. The simulations were performed with the Cooja simulator, based on the Contiki operating system. The simulations showed that the proposed optimization outperforms both OF0 and MRHOF by a wide margin in terms of network latency, packet delivery ratio, lifetime, and power consumption.
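A simplified sketch of route selection under a SIGMA-ETX-style metric (standard deviation of per-hop ETX values rather than their sum or average) might look like this; it omits all of the actual RPL objective-function machinery and uses made-up link values:

```python
import statistics

def sigma_etx(route_etx: list[float]) -> float:
    """Population standard deviation of per-hop ETX values along a route;
    lower sigma means more uniform link quality."""
    return statistics.pstdev(route_etx)

def best_route(candidates: dict[str, list[float]]) -> str:
    """Pick the candidate route with the smallest SIGMA-ETX."""
    return min(candidates, key=lambda name: sigma_etx(candidates[name]))

routes = {
    "short-with-one-bad-link": [1.1, 1.2, 4.8],    # one long lossy hop
    "longer-but-uniform": [1.4, 1.5, 1.6, 1.5],    # more hops, steady quality
}
print(best_route(routes))  # → longer-but-uniform
```

Note how the metric penalizes the route containing a single long, lossy hop even though its hop count and average ETX look competitive, which is exactly the bottleneck case the paper targets.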
Wireless radio channel for intramuscular electrode implants in the control of upper limb prostheses.
Stango, Antonietta; Yazdandoost, Kamya Yekeh; Farina, Dario
2015-01-01
In the last few years the use of implanted devices has also been considered in the field of myoelectric hand prostheses. Wireless implanted EMG (electromyogram) sensors can improve the functioning of the prosthesis, providing information without the disadvantage of wires, and improve usability for amputees. The solutions proposed in the literature are based on proprietary communication protocols between the implanted devices and the prosthesis controller, using frequency bands that are already assigned to other purposes. This study proposes the use of a standard communication protocol (IEEE 802.15.6), specific to wireless body area networks (WBANs), which assigns a specific bandwidth to implanted devices. The propagation losses from in-body to on-body were investigated by numerical simulation with a 3D human model and an electromagnetic solver. The channel model resulting from the study represents the first step towards the development of myoelectric prosthetic hands driven by signals acquired from implanted sensors. These results can also provide important information to researchers for further developments, and to manufacturers, who can decrease production costs by giving hand prostheses a common communication standard with assigned frequencies of operation.
The Veterinary Forensic Necropsy: A Review of Procedures and Protocols.
Brownlie, H W Brooks; Munro, R
2016-09-01
Investigation of animal-related crime, and therefore submission of forensic cases to veterinary pathology facilities, is increasing, yet many veterinary pathologists are unfamiliar and often uncomfortable with involvement in the forensic necropsy. This article discusses various aspects of the forensic necropsy without specific attention to any particular species group or crime. General advice is given on procedures, documentation, and recording of the examination, and the article indicates how these features may differ from those used in investigation of natural disease. It also discusses evidence management, including recordkeeping, identification of evidence, labeling of photographs, and use of standard operating procedures and protocols. Various written and visual methods for documentation of the forensic necropsy are covered, and adjunctive topics such as sample collection, assessment, and description of wounds and taphonomy are included. Cause, mechanism, and manner of death are defined, and guidance to the use of these terms is given. The aim of this article is to offer guidance on procedural aspects of the forensic necropsy that will help those developing their forensic services, contribute to standardization of the provision of forensic veterinary pathology, and build the confidence of the "uncomfortable" forensic veterinary pathologist. © The Author(s) 2016.
Challenges in standardization of blood pressure measurement at the population level.
Tolonen, Hanna; Koponen, Päivikki; Naska, Androniki; Männistö, Satu; Broda, Grazyna; Palosaari, Tarja; Kuulasmaa, Kari
2015-04-10
Accurate blood pressure measurements are needed in clinical practice, intervention studies, and health examination surveys. Blood pressure measurements are sensitive: their accuracy can be affected by the measurement environment, the behaviour of the subject, measurement procedures, the devices used for the measurement, and the observer. To minimize errors in blood pressure measurement, a standardized measurement protocol is needed. The European Health Examination Survey (EHES) Pilot project was conducted in 2009-2012. A pilot health examination survey was conducted in 12 countries using a standardized protocol. The measurement protocols used in each survey, the training provided for the measurers, the measurement data, and observations during site visits were collected and evaluated to assess the level of standardization. The EHES measurement protocol for blood pressure was followed accurately in all 12 pilot surveys. Most of the surveys succeeded in organizing a quiet and comfortable measurement environment, and staff instructed survey participants appropriately before examination visits. In all surveys, blood pressure was measured three times, from the right arm in a sitting posture. The biggest variation was in the device used for the blood pressure measurement. It is possible to reach a high level of standardization for blood pressure measurements across countries and over time. A detailed, standardized measurement protocol, adequate training, monitoring during the fieldwork, and centrally organized quality assessment of the data are needed. The recent EU regulation banning the sale of mercury sphygmomanometers in European Union Member States has set new challenges for the standardization of measurement devices, since the validity of oscillometric measurements is device-specific and the performance of aneroid devices depends very much on calibration.
Hepler, Jeff A; Neumann, Cathy
2003-04-01
To enhance environmental compliance, the U.S. Department of Defense (DOD) recently developed and implemented a standardized environmental audit tool called The Environmental Assessment and Management (TEAM) Guide. Utilization of a common audit tool (TEAM Guide) throughout DOD agencies could be an effective agent of positive change. If, however, the audit tool is inappropriate, environmental compliance at DOD facilities could worsen. Furthermore, existing audit systems such as the U.S. Environmental Protection Agency's (U.S. EPA's) Generic Protocol for Conducting Environmental Audits of Federal Facilities and the International Organization for Standardization's (ISO's) Standard 14001, "Environmental Management System Audits," may be abandoned even if they offer significant advantages over the TEAM Guide audit tool. Widespread use of TEAM Guide should not take place until a thorough and independent evaluation has been performed. The purpose of this paper is to compare DOD's TEAM Guide audit tool with U.S. EPA's Generic Protocol for Conducting Environmental Audits of Federal Facilities and ISO 14001, in order to assess which is most appropriate and effective for DOD facilities, in particular those operated by the U.S. Army Corps of Engineers (USACE). USACE was selected as a result of one author's recent experience as a district environmental compliance coordinator responsible for the audit mission at this agency. Specific recommendations for enhancing the quality of environmental audits at all DOD facilities are also given.
SUPPLEMENT TO: STANDARD MEASUREMENT PROTOCOLS - FLORIDA RADON RESEARCH PROGRAM
The report supplements earlier published standard protocols for key measurements where data quality is vital to the Florida Radon Research Program. The report adds measurements of small canister radon flux and soil water potential to the section on soil measurements. It adds indo...
Blunt hepatic and splenic trauma. A single Center experience using a multidisciplinary protocol.
Ruscelli, Paolo; Buccoliero, Francesco; Mazzocato, Susanna; Belfiori, Giulio; Rabuini, Claudio; Sperti, Pierluigi; Rimini, Massimiliano
2017-01-01
The aim of this retrospective study was to describe more than 10 years of experience at a single trauma center with non-operative management of abdominal organ injuries in hemodynamically stable patients. MATERIAL OF STUDY: Between January 2001 and December 2014, 732 consecutive patients were admitted with blunt abdominal trauma involving the liver and/or spleen and/or kidney at the Bufalini Cesena Hospital. Management of patients followed a specific institutionally developed protocol: hemodynamic stability was evaluated in the shock room according to the patients' response to fluid challenge, and the patients were classified into three categories, A, B, and C. Of the 732 trauma patients, 356 (48.6%) underwent a surgical procedure; all the other patients, 376 (51.4%), underwent non-operative management. Overall mortality was 9.8% (72); mortality in the surgery group was 15.4%, whereas in the non-operative group it was 4.5%. The relative risk of mortality, measured by the odds ratio with a 95% confidence interval, was 3.417 (2.023-5.772) for the surgery group; patients over 40 years old had a statistically significantly higher mortality. In our series the overall mortality rate of the non-operative management group was 4.5%, whereas in unstable patients (the surgery group) the mortality was 15.3%; the overall mortality rate after the application of our protocol was 9.8%. Surgery continues to be the standard for hemodynamically unstable patients with blunt hepatic and splenic trauma. In our experience the AAST Organ Injury Scale was useless for the therapeutic decision-making process: after the CT scan, if a source of bleeding was detected, immediate angiography was performed in order to control and resolve it.
The results suggest that the only criterion of choice for therapeutic strategy was hemodynamic stability, and non-operative management can be applied only following strict institutional criteria. KEY WORDS: Hemodynamic stability, Non-operative management, Trauma.
Gorgolewski, Krzysztof J; Auer, Tibor; Calhoun, Vince D; Craddock, R Cameron; Das, Samir; Duff, Eugene P; Flandin, Guillaume; Ghosh, Satrajit S; Glatard, Tristan; Halchenko, Yaroslav O; Handwerker, Daniel A; Hanke, Michael; Keator, David; Li, Xiangrui; Michael, Zachary; Maumet, Camille; Nichols, B Nolan; Nichols, Thomas E; Pellman, John; Poline, Jean-Baptiste; Rokem, Ariel; Schaefer, Gunnar; Sochat, Vanessa; Triplett, William; Turner, Jessica A; Varoquaux, Gaël; Poldrack, Russell A
2016-06-21
The development of magnetic resonance imaging (MRI) techniques has defined modern neuroimaging. Since its inception, tens of thousands of studies using techniques such as functional MRI and diffusion weighted imaging have allowed for the non-invasive study of the brain. Despite the fact that MRI is routinely used to obtain data for neuroscience research, there has been no widely adopted standard for organizing and describing the data collected in an imaging experiment. This renders sharing and reusing data (within or between labs) difficult if not impossible and unnecessarily complicates the application of automatic pipelines and quality assurance protocols. To solve this problem, we have developed the Brain Imaging Data Structure (BIDS), a standard for organizing and describing MRI datasets. The BIDS standard uses file formats compatible with existing software, unifies the majority of practices already common in the field, and captures the metadata necessary for most common data processing operations.
Building energy simulation in real time through an open standard interface
Pang, Xiufeng; Nouidui, Thierry S.; Wetter, Michael; ...
2015-10-20
Building energy models (BEMs) are typically used for design and code compliance for new buildings and in the renovation of existing buildings to predict energy use. The increasing adoption of BEM as standard practice in the building industry presents an opportunity to extend the use of BEMs into construction, commissioning and operation. In 2009, the authors developed a real-time simulation framework to execute an EnergyPlus model in real time to improve building operation. This paper reports an enhancement of that real-time energy simulation framework. The previous version only works with software tools that implement the custom co-simulation interface of the Building Controls Virtual Test Bed (BCVTB), such as EnergyPlus, Dymola and TRNSYS. The new version uses an open standard interface, the Functional Mockup Interface (FMI), to provide a generic interface to any application that supports the FMI protocol. In addition, the new version utilizes the Simple Measurement and Actuation Profile (sMAP) tool as the data acquisition system to acquire, store and present data. Lastly, this paper introduces the updated architecture of the real-time simulation framework using FMI and presents proof-of-concept demonstration results which validate the new framework.
NASA Technical Reports Server (NTRS)
Barry, Matthew R.
2006-01-01
The X-Windows Socket Widget Class ("Class" is used here in the object-oriented-programming sense of the word) was devised to simplify the task of implementing network connections for graphical-user-interface (GUI) computer programs. UNIX Transmission Control Protocol/Internet Protocol (TCP/IP) socket programming libraries require many method calls to configure, operate, and destroy sockets. Most X Windows GUI programs use widget sets or toolkits to facilitate management of complex objects. The widget standards facilitate construction of toolkits and application programs. The X-Windows Socket Widget Class encapsulates UNIX TCP/IP socket-management tasks within the framework of an X Windows widget. Using the widget framework, X Windows GUI programs can treat one or more network socket instances in the same manner as that of other graphical widgets, making it easier to program sockets. Wrapping TCP/IP socket programming libraries inside a widget framework enables a programmer to treat a network interface as though it were a GUI.
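The encapsulation idea can be sketched in a few lines (in Python rather than C/X Windows, with invented names; this is not the NASA widget code): socket configuration, I/O, and teardown hidden behind one object that a program can treat like any other component.

```python
# Hedged sketch of socket encapsulation: the many create/configure/close
# calls of a socket API are wrapped in one object, analogous to how the
# Socket Widget Class wraps them in an X Windows widget.
import socket

class SocketWidget:
    """Bundles connect, send, receive, and destroy into one object."""
    def __init__(self, host: str, port: int):
        self.sock = socket.create_connection((host, port))

    def send(self, data: bytes) -> None:
        self.sock.sendall(data)

    def receive(self, n: int = 4096) -> bytes:
        return self.sock.recv(n)

    def destroy(self) -> None:
        self.sock.close()
```

A caller then opens, uses, and destroys the "widget" without touching the underlying socket calls.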
A communal catalogue reveals Earth's multiscale microbial diversity.
Thompson, Luke R; Sanders, Jon G; McDonald, Daniel; Amir, Amnon; Ladau, Joshua; Locey, Kenneth J; Prill, Robert J; Tripathi, Anupriya; Gibbons, Sean M; Ackermann, Gail; Navas-Molina, Jose A; Janssen, Stefan; Kopylova, Evguenia; Vázquez-Baeza, Yoshiki; González, Antonio; Morton, James T; Mirarab, Siavash; Zech Xu, Zhenjiang; Jiang, Lingjing; Haroon, Mohamed F; Kanbar, Jad; Zhu, Qiyun; Jin Song, Se; Kosciolek, Tomasz; Bokulich, Nicholas A; Lefler, Joshua; Brislawn, Colin J; Humphrey, Gregory; Owens, Sarah M; Hampton-Marcell, Jarrad; Berg-Lyons, Donna; McKenzie, Valerie; Fierer, Noah; Fuhrman, Jed A; Clauset, Aaron; Stevens, Rick L; Shade, Ashley; Pollard, Katherine S; Goodwin, Kelly D; Jansson, Janet K; Gilbert, Jack A; Knight, Rob
2017-11-23
Our growing awareness of the microbial world's importance and diversity contrasts starkly with our limited understanding of its fundamental structure. Despite recent advances in DNA sequencing, a lack of standardized protocols and common analytical frameworks impedes comparisons among studies, hindering the development of global inferences about microbial life on Earth. Here we present a meta-analysis of microbial community samples collected by hundreds of researchers for the Earth Microbiome Project. Coordinated protocols and new analytical methods, particularly the use of exact sequences instead of clustered operational taxonomic units, enable bacterial and archaeal ribosomal RNA gene sequences to be followed across multiple studies and allow us to explore patterns of diversity at an unprecedented scale. The result is both a reference database giving global context to DNA sequence data and a framework for incorporating data from future studies, fostering increasingly complete characterization of Earth's microbial diversity.
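The shift from clustered operational taxonomic units to exact sequences can be sketched simply: identical sequences match across studies by string equality, so no clustering step is needed to compare them. A hedged toy illustration (not Earth Microbiome Project data or code):

```python
# Hedged sketch: tallying exact sequence variants instead of clustered OTUs.
# Because matching is exact, the same sequence can be followed across studies.
from collections import Counter

def exact_sequence_table(reads):
    """Tally exact sequences; identical reads are identical keys."""
    return Counter(reads)

study_a = ["ACGT", "ACGT", "ACGA"]   # toy reads, not real rRNA sequences
study_b = ["ACGT", "TTTT"]
shared = set(exact_sequence_table(study_a)) & set(exact_sequence_table(study_b))
print(sorted(shared))   # sequences observed in both studies
```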
IBM NJE protocol emulator for VAX/VMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engert, D.E.
1981-01-01
Communications software has been written at Argonne National Laboratory to enable a VAX/VMS system to participate as an end-node in a standard IBM network by emulating the Network Job Entry (NJE) protocol. NJE is actually a collection of programs that support job networking for the operating systems used on most large IBM-compatible computers (e.g., VM/370, MVS with JES2 or JES3, SVS, MVT with ASP or HASP). Files received by the VAX can be printed or saved in user-selected disk files. Files sent to the network can be routed to any node in the network for printing, punching, or job submission, as well as to a VM/370 user's virtual reader. Files sent from the VAX are queued and transmitted asynchronously to allow users to perform other work while files are awaiting transmission. No changes are required to the IBM software.
A Simple XML Producer-Consumer Protocol
NASA Technical Reports Server (NTRS)
Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)
2001-01-01
There are many different projects from government, academia, and industry that provide services for delivering events in distributed environments. The problem with these event services is that they are not general enough to support all uses and they speak different protocols so that they cannot interoperate. We require such interoperability when we, for example, wish to analyze the performance of an application in a distributed environment. Such an analysis might require performance information from the application, computer systems, networks, and scientific instruments. In this work we propose and evaluate a standard XML-based protocol for the transmission of events in distributed systems. One recent trend in government and academic research is the development and deployment of computational grids. Computational grids are large-scale distributed systems that typically consist of high-performance compute, storage, and networking resources. Examples of such computational grids are the DOE Science Grid, the NASA Information Power Grid (IPG), and the NSF Partnerships for Advanced Computing Infrastructure (PACIs). The major effort to deploy these grids is in the area of developing the software services to allow users to execute applications on these large and diverse sets of resources. These services include security, execution of remote applications, managing remote data, access to information about resources and services, and so on. There are several toolkits for providing these services such as Globus, Legion, and Condor. As part of these efforts to develop computational grids, the Global Grid Forum is working to standardize the protocols and APIs used by various grid services. This standardization will allow interoperability between the client and server software of the toolkits that are providing the grid services. 
The goal of the Performance Working Group of the Grid Forum is to standardize protocols and representations related to the storage and distribution of performance data. These standard protocols and representations must support tasks such as profiling parallel applications, monitoring the status of computers and networks, and monitoring the performance of services provided by a computational grid. This paper describes a proposed protocol and data representation for the exchange of events in a distributed system. The protocol exchanges messages formatted in XML and it can be layered atop any low-level communication protocol such as TCP or UDP. Further, we describe Java and C++ implementations of this protocol and discuss their performance. The next section will provide some further background information. Section 3 describes the main communication patterns of our protocol. Section 4 describes how we represent events and related information using XML. Section 5 describes our protocol and Section 6 discusses the performance of two implementations of the protocol. Finally, an appendix provides the XML Schema definition of our protocol and event information.
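The event-exchange idea can be sketched as follows. The element and attribute names here are invented for illustration; they are not the XML Schema given in the paper's appendix. An event is serialized as XML, carried as an opaque payload over TCP or UDP, and parsed back on the consumer side:

```python
# Hedged sketch of an XML producer-consumer event exchange.
# Element/attribute names are illustrative, not the Grid Forum schema.
import xml.etree.ElementTree as ET

def encode_event(name, timestamp, value):
    e = ET.Element("event", {"name": name, "timestamp": str(timestamp)})
    ET.SubElement(e, "value").text = str(value)
    return ET.tostring(e)           # bytes, ready for a TCP/UDP payload

def decode_event(payload):
    e = ET.fromstring(payload)
    return e.get("name"), float(e.get("timestamp")), e.find("value").text

msg = decode_event(encode_event("cpu.load", 1234.5, "0.73"))
print(msg)
```

Because both sides agree only on the XML representation, the transport underneath can be swapped freely, which is the interoperability point the paper argues for.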
You, Ilsun; Kwon, Soonhyun; Choudhary, Gaurav; Sharma, Vishal; Seo, Jung Taek
2018-06-08
The Internet of Things (IoT) utilizes algorithms to facilitate intelligent applications across cities in the form of smart-urban projects. As the majority of devices in IoT are battery operated, their applications should be facilitated with a low-power communication setup. Such facility is possible through the Low-Power Wide-Area Network (LPWAN), but at a constrained bit rate. For long-range communication over LPWAN, several approaches and protocols are adopted. One such protocol is the Long-Range Wide Area Network (LoRaWAN), which is a media access layer protocol for long-range communication between the devices and the application servers via LPWAN gateways. However, LoRaWAN comes with fewer security features as a much-secured protocol consumes more battery because of the exorbitant computational overheads. The standard protocol fails to support end-to-end security and perfect forward secrecy while being vulnerable to the replay attack that makes LoRaWAN limited in supporting applications where security (especially end-to-end security) is important. Motivated by this, an enhanced LoRaWAN security protocol is proposed, which not only provides the basic functions of connectivity between the application server and the end device, but additionally averts these listed security issues. The proposed protocol is developed with two options, the Default Option (DO) and the Security-Enhanced Option (SEO). The protocol is validated through Burrows-Abadi-Needham (BAN) logic and the Automated Validation of Internet Security Protocols and Applications (AVISPA) tool. The proposed protocol is also analyzed for overheads through system-based and low-power device-based evaluations. Further, a case study on a smart factory-enabled parking system is considered for its practical application.
The results, in terms of network latency with reliability fitting and signaling overheads, show paramount improvements and better performance for the proposed protocol compared with the two handshake options, Pre-Shared Key (PSK) and Elliptic Curve Cryptography (ECC), of Datagram Transport Layer Security (DTLS).
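One ingredient of replay protection can be sketched as a monotonically increasing frame counter checked alongside a message integrity code (MIC). This is a generic illustration with invented names, not the proposed protocol and not LoRaWAN's exact MIC construction:

```python
# Hedged sketch: a receiver rejects frames whose counter does not advance
# (replay) or whose MIC does not verify (forgery). The MIC here is a
# truncated HMAC-SHA256; keys and field sizes are illustrative.
import hmac, hashlib

class ReceiverState:
    def __init__(self, key: bytes):
        self.key = key
        self.last_counter = -1

    def accept(self, counter: int, payload: bytes, mic: bytes) -> bool:
        expected = hmac.new(self.key,
                            counter.to_bytes(4, "big") + payload,
                            hashlib.sha256).digest()[:4]
        if not hmac.compare_digest(mic, expected):
            return False            # forged or corrupted frame
        if counter <= self.last_counter:
            return False            # replayed frame
        self.last_counter = counter
        return True
```

A sender computes the same HMAC over counter and payload; a captured frame replayed later fails the counter check even though its MIC still verifies.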
NASA Exercise Physiology and Countermeasures Project Overview
NASA Technical Reports Server (NTRS)
Loerch, Linda; Ploutz-Snyder, Lori
2009-01-01
Efficient exercise countermeasures are necessary to offset or minimize spaceflight-induced deconditioning and to maximize crew performance of mission tasks. These countermeasure protocols should use the fewest crew and vehicle resources. NASA's Exercise Physiology and Countermeasures (ExPC) Project works to identify, collect, interpret, and summarize evidence that results in effective exercise countermeasure protocols which protect crew health and performance during International Space Station (ISS) and future exploration-class missions. The ExPC and NASA's Human Research Program are sponsoring multiple studies to evaluate and improve the efficacy of spaceflight exercise countermeasures. First, the Project will measure maximal aerobic capacity (VO2max) during cycle ergometry before, during, and after ISS missions. Second, the Project is sponsoring an evaluation of a new prototype harness that offers improved comfort and increased loading during treadmill operations. Third, the Functional Tasks Test protocol will map performance of anticipated lunar mission tasks with physiologic systems before and after short and long-duration spaceflight, to target system contributions and the tailoring of exercise protocols to maximize performance. In addition to these studies that are actively enrolling crewmember participants, the ExPC is planning new studies that include an evaluation of a higher-intensity/lower-volume exercise countermeasure protocol aboard the ISS using the Advanced Resistive Exercise Device and second-generation treadmill, studies that evaluate bone loading during spaceflight exercise, and ground-based studies that focus on fitness for duty standards required to complete lunar mission tasks and for which exercise protocols need to protect. Summaries of these current and future studies and strategies will be provided to international colleagues for knowledge sharing and possible collaboration.
Automated selective disruption of slow wave sleep
Ooms, Sharon J.; Zempel, John M.; Holtzman, David M.; Ju, Yo-El S.
2017-01-01
Background Slow wave sleep (SWS) plays an important role in neurophysiologic restoration. Experimentally testing the effect of SWS disruption previously required highly time-intensive and subjective methods. Our goal was to develop an automated and objective protocol to reduce SWS without affecting sleep architecture. New Method We developed a custom Matlab™ protocol to calculate electroencephalogram spectral power every 10 seconds live during a polysomnogram, exclude artifact, and, if measurements met criteria for SWS, deliver increasingly louder tones through earphones. Middle-aged healthy volunteers (n=10) each underwent 2 polysomnograms, one with the SWS disruption protocol and one with sham condition. Results The SWS disruption protocol reduced SWS compared to sham condition, as measured by spectral power in the delta (0.5–4 Hz) band, particularly in the 0.5–2 Hz range (mean 20% decrease). A compensatory increase in the proportion of total spectral power in the theta (4–8 Hz) and alpha (8–12 Hz) bands was seen, but otherwise normal sleep features were preserved. N3 sleep decreased from 20±34 to 3±6 minutes, otherwise there were no significant changes in total sleep time, sleep efficiency, or other macrostructural sleep characteristics. Comparison with existing method This novel SWS disruption protocol produces specific reductions in delta band power similar to existing methods, but has the advantage of being automated, such that SWS disruption can be performed easily in a highly standardized and operator-independent manner. Conclusion This automated SWS disruption protocol effectively reduces SWS without impacting overall sleep architecture. PMID:28238859
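The live spectral-power criterion can be sketched as follows (a pure-Python illustration, not the authors' Matlab protocol): power in the 0.5-4 Hz delta band is computed from one 10-second EEG epoch, and a tone would be triggered when that power crosses a SWS threshold. The sampling rate and signals below are illustrative.

```python
# Hedged sketch: delta-band (0.5-4 Hz) spectral power of one 10-s epoch,
# via a direct DFT so the example needs only the standard library.
import cmath, math

def band_power(signal, fs, lo=0.5, hi=4.0):
    """Sum DFT power over bins whose frequency lies in [lo, hi] Hz."""
    n = len(signal)
    total = 0.0
    for k in range(n // 2 + 1):
        f = k * fs / n
        if lo <= f <= hi:
            coeff = sum(x * cmath.exp(-2j * math.pi * k * t / n)
                        for t, x in enumerate(signal))
            total += abs(coeff) ** 2 / n
    return total

fs = 100                                                       # Hz, illustrative
slow = [math.sin(2 * math.pi * 1.5 * t / fs) for t in range(10 * fs)]   # 1.5 Hz wave
alpha = [math.sin(2 * math.pi * 10.0 * t / fs) for t in range(10 * fs)] # 10 Hz wave
```

A slow-wave epoch concentrates its power inside the delta band, while an alpha-dominated epoch does not, which is the distinction the disruption criterion relies on.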
A literature review: polypharmacy protocol for primary care.
Skinner, Mary
2015-01-01
The purpose of this literature review is to critically evaluate published protocols on polypharmacy in adults ages 65 and older that are currently used in primary care settings that may potentially lead to fewer adverse drug events. A review of OVID, CINAHL, EBSCO, Cochrane Library, Medline, and PubMed databases was completed using the following key words: protocol, guideline, geriatrics, elderly, older adult, polypharmacy, and primary care. Inclusion criteria were: articles in medical, nursing, and pharmacology journals with an intervention, protocol, or guideline addressing polypharmacy that leads to fewer adverse drug events. Qualitative and quantitative studies were included. Exclusion criteria were: publications prior to the year 1992. A gap exists in the literature. No standardized protocol for addressing polypharmacy in the primary care setting was found. Mnemonics, algorithms, clinical practice guidelines, and clinical strategies for addressing polypharmacy in a variety of health care settings were found throughout the literature. Several screening instruments for use in primary care to assess potentially inappropriate prescription of medications in the elderly, such as the Beers Criteria and the STOPP screening tool, were identified. However, these screening instruments were not included in a standardized protocol to manage polypharmacy in primary care. Polypharmacy in the elderly is a critical problem that may result in adverse drug events such as falls, hospitalizations, and increased expenditures for both the patient and the health care system. No standardized protocols to address polypharmacy specific to the primary care setting were identified in this review of the literature.
Given the growing population of elderly in this country and the high number of medications they consume, it is critical to focus on the utilization of a standardized protocol to address the potential harm of polypharmacy in the primary care setting and evaluate its effects on patient outcomes. Copyright © 2015 Elsevier Inc. All rights reserved.
Compositional Verification of a Communication Protocol for a Remotely Operated Vehicle
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn E.; Munoz, Cesar A.
2009-01-01
This paper presents the specification and verification in the Prototype Verification System (PVS) of a protocol intended to facilitate communication in an experimental remotely operated vehicle used by NASA researchers. The protocol is defined as a stack-layered com- position of simpler protocols. It can be seen as the vertical composition of protocol layers, where each layer performs input and output message processing, and the horizontal composition of different processes concurrently inhabiting the same layer, where each process satisfies a distinct requirement. It is formally proven that the protocol components satisfy certain delivery guarantees. Compositional techniques are used to prove these guarantees also hold in the composed system. Although the protocol itself is not novel, the methodology employed in its verification extends existing techniques by automating the tedious and usually cumbersome part of the proof, thereby making the iterative design process of protocols feasible.
Sheldon, Elizabeth; Vo, Kim Chi; McIntire, Ramsey A; Aghajanova, Lusine; Zelenko, Zara; Irwin, Juan C; Giudice, Linda C
2011-05-01
To develop a standard operating procedure (SOP) for collection, transport, storage of human endometrial tissue and blood samples, subject and specimen annotation, and establishing sample priorities. The SOP synthesizes sound scientific procedures, the literature on ischemia research, sample collection and gene expression profiling, good laboratory practices, and the authors' experience of workflow and sample quality. The National Institutes of Health, University of California, San Francisco, Human Endometrial Tissue and DNA Bank. Women undergoing endometrial biopsy or hysterectomy for nonmalignant indications. Collecting, processing, storing, distributing endometrial tissue and blood samples under approved institutional review board protocols and written informed consent from participating subjects. Standard operating procedure. The SOP addresses rigorous and consistent subject annotation, specimen processing and characterization, strict regulatory compliance, and a reference for researchers to track collection and storage times that may influence their research. The comprehensive and systematic approach to the procurement of human blood and endometrial tissue in this SOP ensures the high quality, reliability, and scientific usefulness of biospecimens made available to investigators by the National Institutes of Health, University of California, San Francisco, Human Endometrial Tissue and DNA Bank. The detail and perspective in this SOP also provides a blueprint for implementation of similar collection programs at other institutions. Copyright © 2011 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Ziemke, Claas; Kuwahara, Toshinori; Kossev, Ivan
2011-09-01
Even in the field of small satellites, the on-board data handling subsystem has become complex and powerful. With the introduction of powerful CPUs and the availability of considerable amounts of memory on-board a small satellite it has become possible to utilize the flexibility and power of contemporary platform-independent real-time operating systems. Especially the non-commercial sector, such as university institutes and community projects like AMSAT or SSETI, is characterized by the inherent lack of financial as well as manpower resources. The opportunity to utilize such real-time operating systems will contribute significantly to achieve a successful mission. Nevertheless the on-board software of a satellite is much more than just an operating system. It has to fulfill a multitude of functional requirements such as: telecommand interpretation and execution, execution of control loops, generation of telemetry data and frames, failure detection isolation and recovery, the communication with peripherals and so on. Most of the aforementioned tasks are of generic nature and have to be conducted on any satellite with only minor modifications. A general set of functional requirements as well as a protocol for communication is defined in the ECSS-E-70-41A standard "Telemetry and telecommand packet utilization". This standard not only defines the communication protocol of the satellite-ground link but also defines a set of so-called services which have to be available on board every compliant satellite and which are of generic nature. In this paper, a platform-independent and reusable framework is described which is implementing not only the ECSS-E-70-41A standard but also functionalities for interprocess communication, scheduling and a multitude of tasks commonly performed on-board of a satellite.
By making use of the capabilities of the high-level programming language C/C++, the powerful open source library Boost, the real-time operating system RTEMS and finally by providing generic functionalities compliant to the ECSS-E-70-41A standard, the proposed framework can provide a great boost in productivity. Together with open source tools such as the GNU tool-chain, Eclipse SDK, the simulation framework OpenSimKit, and the emulator QEMU, the proposed on-board software framework forms an integrated development framework. It is possible to design, code and build the on-board software together with the operating system and then run it on a simulated satellite for performance analysis and debugging purposes. This makes it possible to rapidly develop and deploy a full-fledged satellite on-board software with minimal cost and in a limited time frame.
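The packet layer that ECSS-E-70-41A builds its services on can be illustrated by packing a CCSDS-style space packet primary header. This is a sketch rather than part of the described framework, and the APID, sequence count, and length values below are illustrative:

```python
# Hedged sketch: packing a 6-byte CCSDS space packet primary header
# (the packet layer underneath the ECSS-E-70-41A PUS services).
# Layout: version(3) | type(1) | sec.hdr flag(1) | APID(11),
#         sequence flags(2) | sequence count(14), packet length(16).
import struct

def ccsds_primary_header(apid, seq_count, data_len, is_tc=True):
    word1 = (0 << 13) | (int(is_tc) << 12) | (1 << 11) | (apid & 0x7FF)
    word2 = (0b11 << 14) | (seq_count & 0x3FFF)   # 0b11 = unsegmented packet
    word3 = data_len - 1                          # length field is (octets - 1)
    return struct.pack(">HHH", word1, word2, word3)

hdr = ccsds_primary_header(apid=0x42, seq_count=7, data_len=12)
print(hdr.hex())
```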
Fernando, Sumadhya D; Ihalamulla, Ratnasiri L; Wickremasinghe, Renu; de Silva, Nipun L; Thilakarathne, Janani H; Wijeyaratne, Pandu; Premaratne, Risintha G
2014-03-15
Individuals with fever are screened for malaria in specially-established malaria diagnostic laboratories set up in rural hospitals in the Northern and Eastern Provinces of Sri Lanka. Large numbers of blood smears negative for malaria parasites are being screened daily. Good quality smears are essential to maintain a high diagnostic competency among the technical staff. The modifications made to the World Health Organization (WHO) standard operating procedures to improve the quality of smears have been studied. A blinded, controlled, interventional study was conducted in 22 intervention and 21 control malaria diagnostic laboratories. Changes were made to the WHO standard operating procedure protocols to prepare, stain and examine blood smears for malaria parasite detection which were implemented in intervention laboratories. These included wipe-cleaning slides, preparing both thick and thin smears on the same slide, reversing the order of collecting blood for thick and thin smears, dry fixing thick smear for 20-25 minutes under table lamp, polishing the edge of spreader slide with sand paper and fixing the thin smear with methanol if not stained within four hours. Parameters with respect to quality of the smear as per WHO criteria were studied using randomly selected slides, and time taken for the report to be issued was recorded in both groups before and after the intervention. There were no significant differences observed in the parameters studied at baseline between the two groups or pre and post intervention in the control group. In the intervention group streak formation in thin smears was reduced from 29.4% to 5.0%. The average fixing time of thick smears was reduced from 2.4 hours to 20 minutes. Inappropriate thickness of thick smears reduced from 18.3% to 1.5%. Overall quality of thick smears and thin smears increased from 76.1% to 98.0% and 81.7% to 87.0%, respectively. The quality of slides bearing both thick and thin smears increased from 60.0% to 87.0%. 
New protocols with amendments to the WHO standard technical procedures ensure that good quality blood smears are prepared rapidly to diagnose malaria and the time required to issue the reports was reduced.
The Network Operations Control Center upgrade task: Lessons learned
NASA Technical Reports Server (NTRS)
Sherif, J. S.; Tran, T.-L.; Lee, S.
1994-01-01
This article synthesizes and describes the lessons learned from the Network Operations Control Center (NOCC) upgrade project, from the requirements phase through development and test and transfer. At the outset, the NOCC upgrade was being performed simultaneously with two other interfacing and dependent upgrades at the Signal Processing Center (SPC) and Ground Communications Facility (GCF), thereby adding a significant measure of complexity to the management and overall coordination of the development and transfer-to-operations (DTO) effort. Like other success stories, this project carried with it the traditional elements of top management support and exceptional dedication of cognizant personnel. Additionally, there were several NOCC-specific reasons for success, such as end-to-end system engineering, adoption of open-system architecture, thorough requirements management, and use of appropriate off-the-shelf technologies. On the other hand, there were several difficulties, such as ill-defined external interfaces, transition issues caused by new communications protocols, ambivalent use of two sets of policies and standards, and mistailoring of the new JPL management standard (due to the lack of practical guidelines). This article highlights the key lessons learned, as a means of constructive suggestions for the benefit of future projects.
Chervenak, Ann L; van Erp, Theo G M; Kesselman, Carl; D'Arcy, Mike; Sobell, Janet; Keator, David; Dahm, Lisa; Murry, Jim; Law, Meng; Hasso, Anton; Ames, Joseph; Macciardi, Fabio; Potkin, Steven G
2012-01-01
Progress in our understanding of brain disorders increasingly relies on the costly collection of large standardized brain magnetic resonance imaging (MRI) data sets. Moreover, the clinical interpretation of brain scans benefits from compare and contrast analyses of scans from patients with similar, and sometimes rare, demographic, diagnostic, and treatment status. A solution to both needs is to acquire standardized, research-ready clinical brain scans and to build the information technology infrastructure to share such scans, along with other pertinent information, across hospitals. This paper describes the design, deployment, and operation of a federated imaging system that captures and shares standardized, de-identified clinical brain images in a federation across multiple institutions. In addition to describing innovative aspects of the system architecture and our initial testing of the deployed infrastructure, we also describe the Standardized Imaging Protocol (SIP) developed for the project and our interactions with the Institutional Review Board (IRB) regarding handling patient data in the federated environment.
Gomes, Inês B; Meireles, Ana; Gonçalves, Ana L; Goeres, Darla M; Sjollema, Jelmer; Simões, Lúcia C; Simões, Manuel
2018-08-01
Biofilms can cause severe problems to human health due to the high tolerance to antimicrobials; consequently, biofilm science and technology constitutes an important research field. Growing a relevant biofilm in the laboratory provides insights into the basic understanding of the biofilm life cycle including responses to antibiotic therapies. Therefore, the selection of an appropriate biofilm reactor is a critical decision, necessary to obtain reproducible and reliable in vitro results. A reactor should be chosen based upon the study goals and a balance between the pros and cons associated with its use and operational conditions that are as similar as possible to the clinical setting. However, standardization in biofilm studies is rare. This review will focus on the four reactors (Calgary biofilm device, Center for Disease Control biofilm reactor, drip flow biofilm reactor, and rotating disk reactor) approved by a standard setting organization (ASTM International) for biofilm experiments and how researchers have modified these standardized reactors and associated protocols to improve the study and understanding of medical biofilms.
Kuttner, Samuel; Bujila, Robert; Kortesniemi, Mika; Andersson, Henrik; Kull, Love; Østerås, Bjørn Helge; Thygesen, Jesper; Tarp, Ivanka Sojat
2013-03-01
Quality assurance (QA) of computed tomography (CT) systems is one of the routine tasks for medical physicists in the Nordic countries. However, standardized QA protocols do not yet exist and the QA methods, as well as the applied tolerance levels, vary in scope and extent at different hospitals. To propose a standardized protocol for acceptance and constancy testing of CT scanners in the Nordic Region. Following a Nordic Association for Clinical Physics (NACP) initiative, a group of medical physicists, with representatives from four Nordic countries, was formed. Based on international literature and practical experience within the group, a comprehensive standardized test protocol was developed. The proposed protocol includes tests related to the mechanical functionality, X-ray tube, detector, and image quality for CT scanners. For each test, recommendations regarding the purpose, equipment needed, an outline of the test method, the measured parameter, tolerance levels, and the testing frequency are stated. In addition, a number of optional tests are briefly discussed that may provide further information about the CT system. Based on international references and medical physicists' practical experiences, a comprehensive QA protocol for CT systems is proposed, including both acceptance and constancy tests. The protocol may serve as a reference for medical physicists in the Nordic countries.
Radiation-Tolerant, SpaceWire-Compatible Switching Fabric
NASA Technical Reports Server (NTRS)
Katzman, Vladimir
2011-01-01
Current and future near-Earth and deep space exploration programs and space defense programs require the development of robust intra-spacecraft serial data transfer electronics that must be reconfigurable, fault-tolerant, and able to operate effectively for long periods of time in harsh environmental conditions. Existing data transfer systems based on state-of-the-art serial data transfer protocols or passive backplanes are slow, power-hungry, and poorly reconfigurable. They provide limited expandability and poor tolerance to radiation effects, and to total ionizing dose (TID) in particular, which presents harmful threats to modern submicron electronics. This novel approach is based on a standard library of differential cells tolerant to TID, and a patented, multi-level serial interface architecture that ensures the reliable operation of serial interconnects without application of data-strobe or other encoding techniques. This proprietary, high-speed differential interface presents a low-power solution fully compatible with the SpaceWire (SW) protocol. It replaces a dual data-strobe link with two identical independent data channels, thus improving the system's tolerance to harsh environments through additional double redundancy. Each channel incorporates automatic line integrity control circuitry that delivers error signals in case of broken or shorted lines.
Kadlub, N; Chapuis Vandenbogaerde, C; Joly, A; Neiva, C; Vazquez, M-P; Picard, A
2018-04-01
Comparing functional outcomes after velar repair appeared to be difficult because of the absence of an international standardized scale. Moreover, most of the studies evaluating speech after cleft surgery present multiple biases. The aim of our study was to assess speech outcomes in a homogeneous group of patients, and to define an equivalence table between different speech scales. Patients with isolated cleft lip and palate (CLP), operated on in our unit by the same senior surgeon, were included. All patients were operated on according to the same protocol (cheilo-rhinoplasty and intravelar veloplasty at 6 months, followed by direct closure of the hard palate at 15 months). Speech evaluation was performed after the age of 3 years and before alveolar cleft repair. The Borel-Maisonny scale and nasometry were used for speech evaluation. Twenty-four patients were included: 17 unilateral CLP and 7 bilateral CLP. According to the Borel-Maisonny classification, 82.5% were rated phonation 1, 1-2 or 2b. Nasometry was normal in almost 60% of cases. This study showed the efficiency of our protocol and of intravelar veloplasty. Moreover, we propose an equivalence table for speech evaluation scales. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Le Vacon, F
2005-06-01
The qualification of equipment is a particularly important stage in the transfusion process. On the one hand, many standards, such as those governing certification or accreditation, require it, as do good transfusion practices; in addition, quality-assurance approaches develop this aspect. Indeed, failure to qualify equipment that has an influence on the finished product can lead to an error in the product. Qualification proceeds through several stages, some of which are essential: drafting the specifications, drafting the operational qualification protocol, and deciding on release into routine use. Finally, for this qualification to take on its full dimension, the measurement methods used must be traceable to the international system of units. Moreover, certain questions must be answered beforehand: which unit should be checked, and only that one; is the equipment a complex assembly; is there a maintenance contract? Once all these elements are taken into account and the questions answered, the operational protocol can be properly constructed, the decisions on release into routine use can be made, and all the stages finalized.
NASA Astrophysics Data System (ADS)
Babik, M.; Chudoba, J.; Dewhurst, A.; Finnern, T.; Froy, T.; Grigoras, C.; Hafeez, K.; Hoeft, B.; Idiculla, T.; Kelsey, D. P.; López Muñoz, F.; Martelli, E.; Nandakumar, R.; Ohrenberg, K.; Prelz, F.; Rand, D.; Sciabà, A.; Tigerstedt, U.; Traynor, D.; Wartel, R.
2017-10-01
IPv4 network addresses are running out and the deployment of IPv6 networking in many places is now well underway. Following the work of the HEPiX IPv6 Working Group, a growing number of sites in the Worldwide Large Hadron Collider Computing Grid (WLCG) are deploying dual-stack IPv6/IPv4 services. The aim of this is to support the use of IPv6-only clients, i.e. worker nodes, virtual machines or containers. The IPv6 networking protocols, while they do contain features aimed at improving security, also bring new challenges for operational IT security. The lack of maturity of IPv6 implementations, together with the increased complexity of some of the protocol standards, raises many new issues for operational security teams. The HEPiX IPv6 Working Group is producing guidance on best practices in this area. This paper considers some of the security concerns for WLCG in an IPv6 world and presents the HEPiX IPv6 working group guidance for the system administrators who manage IT services on the WLCG distributed infrastructure, for their related site security and networking teams and for developers and software engineers working on WLCG applications.
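The dual-stack concern above, that IPv6 traffic must be filtered with the same care as IPv4, can be sketched as a minimal address-family-aware ACL check. The prefixes below are illustrative documentation ranges (RFC 5737 / RFC 3849), not real WLCG address space:

```python
import ipaddress

# Hypothetical site prefixes a security team might maintain; an ACL that
# lists only IPv4 ranges would silently pass all IPv6 traffic.
ALLOWED_NETWORKS = [
    ipaddress.ip_network("192.0.2.0/24"),      # IPv4 documentation range
    ipaddress.ip_network("2001:db8:10::/48"),  # IPv6 documentation range
]

def is_allowed(addr: str) -> bool:
    """Return True if addr falls inside any allowed prefix.

    ipaddress parses both families, so one check covers dual-stack hosts.
    """
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in ALLOWED_NETWORKS)

print(is_allowed("192.0.2.17"))        # True: inside the IPv4 range
print(is_allowed("2001:db8:10::42"))   # True: inside the IPv6 range
print(is_allowed("2001:db8:99::1"))    # False: outside both
```

The point of the sketch is only that every filtering rule needs an IPv6 twin; real WLCG guidance covers much more (monitoring, logging, addressing plans).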
Stine, Kimo C.; Wahl, Elizabeth C.; Liu, Lichu; Skinner, Robert A.; VanderSchilden, Jaclyn; Bunn, Robert C.; Montgomery, Corey O.; Aronson, James; Becton, David L.; Nicholas, Richard W.; Swearingen, Christopher J.; Suva, Larry J.; Lumpkin, Charles K.
2017-01-01
The majority of Osteosarcoma (OS) patients are treated with a combination of chemotherapy, resection, and limb salvage protocols. These protocols include distraction osteogenesis (DO), which is characterized by direct new bone formation. Cisplatin (CDP) is extensively used for OS chemotherapy and recent studies, using a mouse DO model, have demonstrated that CDP has profound negative effects on bone repair. Recent oncological therapeutic strategies are based on the use of standard cytotoxic drugs plus an assortment of biologic agents. Here we demonstrate that the previously reported CDP-associated inhibition of bone repair can be modulated by the administration of a small molecule p53 inducer (nutlin-3). The effects of nutlin-3 on CDP osteotoxicity were studied using both pre- and post-operative treatment models. In both cases the addition of nutlin-3, bracketing CDP exposure, demonstrated robust and significant bone sparing activity (p < 0.01–0.001). In addition the combination of nutlin-3 and CDP induced equivalent OS tumor killing in a xenograft model. Collectively, these results demonstrate that the induction of p53 peri-operatively protects bone healing from the toxic effects of CDP, while maintaining OS toxicity. PMID:26867804
Student Performances in Various Learning Protocols
ERIC Educational Resources Information Center
Gregorius, Roberto
2011-01-01
A comparison was made between students' overall performance, as measured by overall grade, in different teaching and learning protocols: (1) traditional textbook and lecture along with standard examinations; (2) lectures with online augmentation and PowerPoint lecture notes along with standard examinations; (3) similar to "(2)" but with…
Current federal regulations require monitoring for fecal coliforms or Salmonella in biosolids destined for land application. Methods used for analysis of fecal coliforms and Salmonella were reviewed and a standard protocol was developed. The protocols were then evaluated by testi...
Comment on 'Quantum direct communication with authentication'
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Zhan-jun; Key Laboratory of Optoelectronic Information Acquisition and Manipulation of Ministry of Education of China, School of Physics and Material Science, Anhui University, Hefei 230039; Liu, Jun
2007-02-15
Two protocols of quantum direct communication with authentication [Phys. Rev. A 73, 042305 (2006)] were recently proposed by Lee, Lim, and Yang. In this paper we will show that in the two protocols the authenticator Trent should be prevented from knowing the secret message. The first protocol can be eavesdropped on by Trent using the intercept-measure-resend attack, while the second protocol can be eavesdropped on by Trent using a simple single-qubit measurement. To fix these leaks, we revise the original versions of the protocols by using the Pauli Z operation σ_z instead of the original bit-flip operation X. As a consequence, the attacks we present can be prevented and accordingly the security of the protocols is improved.
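The reason a σ_z revision can close such leaks is illustrated by elementary single-qubit algebra: a bit-flip X encoding is visible to a computational-basis measurement, while a phase-flip Z encoding leaves computational-basis statistics unchanged. A minimal sketch (plain 2-vectors, not the full authentication protocol):

```python
# 2x2 matrices as nested tuples, states as amplitude pairs.
def apply(m, v):
    """Apply a 2x2 matrix to a single-qubit state vector."""
    return (m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1])

X = ((0, 1), (1, 0))    # bit-flip (Pauli X)
Z = ((1, 0), (0, -1))   # phase-flip (Pauli Z, sigma_z)

ket0 = (1, 0)
s = 2 ** -0.5
ket_plus = (s, s)       # |+> = (|0> + |1>)/sqrt(2)

def probs(v):
    """Computational-basis measurement probabilities |amplitude|^2."""
    return tuple(abs(a) ** 2 for a in v)

# X flips |0> to |1>: an intercept-measure-resend attacker sees the bit.
assert apply(X, ket0) == (0, 1)

# Z only changes the phase of |+>; the computational-basis statistics
# are identical, so a computational-basis measurement learns nothing.
assert probs(apply(Z, ket_plus)) == probs(ket_plus)
print("sigma_z encoding is invisible to a computational-basis measurement")
```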
Entanglement distillation protocols and number theory
NASA Astrophysics Data System (ADS)
Bombin, H.; Martin-Delgado, M. A.
2005-09-01
We show that the analysis of entanglement distillation protocols for qudits of arbitrary dimension D benefits from applying basic concepts from number theory, since the set Z_D^n associated with Bell diagonal states is a module rather than a vector space. We find that a partition of Z_D^n into divisor classes characterizes the invariant properties of mixed Bell diagonal states under local permutations. We construct a very general class of recursion protocols by means of unitary operations implementing these local permutations. We study these distillation protocols depending on whether we use twirling operations in the intermediate steps or not, and we study them both analytically and numerically with Monte Carlo methods. In the absence of twirling operations, we construct extensions of the quantum privacy algorithms valid for secure communications with qudits of any dimension D. When D is a prime number, we show that distillation protocols are optimal both qualitatively and quantitatively.
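The divisor-class partition can be sketched in one dimension by classifying each element of Z_D by its gcd with D; this assumes nothing beyond the definition (the paper works in Z_D^n componentwise). For prime D only the zero class and the class of units remain, which reflects why prime dimension is the simplest case:

```python
from math import gcd
from collections import defaultdict

def divisor_classes(D: int) -> dict:
    """Partition Z_D = {0, ..., D-1} into classes keyed by gcd with D.

    Note gcd(0, D) = D, so 0 always forms its own class.
    """
    classes = defaultdict(list)
    for x in range(D):
        classes[gcd(x, D)].append(x)
    return dict(classes)

print(divisor_classes(6))  # {6: [0], 1: [1, 5], 2: [2, 4], 3: [3]}
print(divisor_classes(5))  # prime D: only {5: [0], 1: [1, 2, 3, 4]}
```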
Campoy, Irene; Lanau, Lucia; Altadill, Tatiana; Sequeiros, Tamara; Cabrera, Silvia; Cubo-Abert, Montserrat; Pérez-Benavente, Assumpción; Garcia, Angel; Borrós, Salvador; Santamaria, Anna; Ponce, Jordi; Matias-Guiu, Xavier; Reventós, Jaume; Gil-Moreno, Antonio; Rigau, Marina; Colas, Eva
2016-06-18
Uterine aspirates are used in the diagnostic process of endometrial disorders, yet further applications could emerge if their complex milieu were simplified. Exosome-like vesicles isolated from uterine aspirates could become an attractive source of biomarkers, but there is a need to standardize isolation protocols. The objective of the study was to determine whether exosome-like vesicles exist in the fluid fraction of uterine aspirates and to compare protocols for their isolation, characterization, and analysis. We collected uterine aspirates from 39 pre-menopausal women suffering from benign gynecological diseases. The fluid fractions of 27 of those aspirates were pooled and split into equal volumes to evaluate three differential centrifugation-based procedures: (1) a standard protocol, (2) a filtration protocol, and (3) a sucrose cushion protocol. Characterization of isolated vesicles was assessed by electron microscopy, nanoparticle tracking analysis and immunoblot. Specifically for RNA material, we evaluated the effect of sonication and RNase A treatment at different steps of the protocol. We finally confirmed the efficiency of the selected methods in non-pooled samples. All protocols were useful to isolate exosome-like vesicles. However, the Standard procedure was the best performing protocol to isolate exosome-like vesicles from uterine aspirates: nanoparticle tracking analysis revealed a higher concentration of vesicles with a mode of 135 ± 5 nm, and immunoblot showed a higher expression of exosome-related markers (CD9, CD63, and CD81), thus verifying an enrichment in this type of vesicles. RNA contained in exosome-like vesicles was successfully extracted without sonication treatment and with RNase A digestion of exogenous nucleic acids, allowing the analysis of the specific inner cargo by Real-Time qPCR. We confirmed the existence of exosome-like vesicles in the fluid fraction of uterine aspirates.
They were successfully isolated by differential centrifugation, giving sufficient proteomic and transcriptomic material for further analyses. The Standard protocol was the best performing procedure, since the other two protocols tested improved neither the yield nor the purity of exosome-like vesicles. This study contributes to establishing the basis for future comparative studies to foster the field of biomarker research in gynecology.
Detecting Cyber Attacks On Nuclear Power Plants
NASA Astrophysics Data System (ADS)
Rrushi, Julian; Campbell, Roy
This paper proposes an unconventional anomaly detection approach that provides digital instrumentation and control (I&C) systems in a nuclear power plant (NPP) with the capability to probabilistically discern between legitimate protocol frames and attack frames. The stochastic activity network (SAN) formalism is used to model the fusion of protocol activity in each digital I&C system and the operation of physical components of an NPP. SAN models are employed to analyze links between protocol frames as streams of bytes, their semantics in terms of NPP operations, control data as stored in the memory of I&C systems, the operations of I&C systems on NPP components, and NPP processes. Reward rates and impulse rewards are defined in the SAN models based on the activity-marking reward structure to estimate NPP operation profiles. These profiles are then used to probabilistically estimate the legitimacy of the semantics and payloads of protocol frames received by I&C systems.
NASA Technical Reports Server (NTRS)
Ivancic, William D.; Glover, Daniel R.; vonDeak, Thomas C.; Bhasin, Kul B.
1998-01-01
The realization of the full potential of the National Information Infrastructure (NII) and Global Information Infrastructure (GII) requires seamless interoperability of emerging satellite networks with terrestrial networks. This requires a cooperative effort between industry, academia, and government agencies to develop and advocate new, satellite-friendly communication protocols and modifications to existing communication protocol standards. These groups have recently come together to participate actively in a number of standards-making bodies, including the Internet Engineering Task Force (IETF), the Asynchronous Transfer Mode (ATM) Forum, the International Telecommunication Union (ITU) and the Telecommunications Industry Association (TIA), to ensure that issues regarding efficient use of these protocols over satellite links are not overlooked. This paper will summarize the progress made toward standards development to achieve seamless integration and accelerate the deployment of multimedia applications.
ERIC Educational Resources Information Center
Benner, Gregory J.; Nelson, J. Ron; Sanders, Elizabeth A.; Ralston, Nicole C.
2012-01-01
This article examined the efficacy of a primary-level, standard-protocol behavior intervention for students with externalizing behavioral disorders. Elementary schools were randomly assigned to treatment (behavior intervention) or control (business as usual) conditions, and K-3 students were screened for externalizing behavior risk status. The…
DELAMINATION AND XRF ANALYSIS OF NIST LEAD IN PAINT FILM STANDARDS
The objectives of this protocol were to remove the laminate coating from lead paint film standards acquired from NIST by means of surface heating. The average XRF value did not change after removal of the polymer coating suggesting that this protocol is satisfactory for renderin...
NASA STI Program Coordinating Council Twelfth Meeting: Standards
NASA Technical Reports Server (NTRS)
1994-01-01
The theme of this NASA Scientific and Technical Information Program Coordinating Council Meeting was standards and their formation and application. Topics covered included scientific and technical information architecture, the Open Systems Interconnection Transmission Control Protocol/Internet Protocol, Machine-Readable Cataloging (MARC) open system environment procurement, and the Government Information Locator Service.
A standard protocol for describing individual-based and agent-based models
Grimm, Volker; Berger, Uta; Bastiansen, Finn; Eliassen, Sigrunn; Ginot, Vincent; Giske, Jarl; Goss-Custard, John; Grand, Tamara; Heinz, Simone K.; Huse, Geir; Huth, Andreas; Jepsen, Jane U.; Jorgensen, Christian; Mooij, Wolf M.; Muller, Birgit; Pe'er, Guy; Piou, Cyril; Railsback, Steven F.; Robbins, Andrew M.; Robbins, Martha M.; Rossmanith, Eva; Ruger, Nadja; Strand, Espen; Souissi, Sami; Stillman, Richard A.; Vabo, Rune; Visser, Ute; DeAngelis, Donald L.
2006-01-01
Simulation models that describe autonomous individual organisms (individual-based models, IBMs) or agents (agent-based models, ABMs) have become a widely used tool, not only in ecology, but also in many other disciplines dealing with complex systems made up of autonomous entities. However, there is no standard protocol for describing such simulation models, which can make them difficult to understand and to duplicate. This paper presents a proposed standard protocol, ODD, for describing IBMs and ABMs, developed and tested by 28 modellers who cover a wide range of fields within ecology. This protocol consists of three blocks (Overview, Design concepts, and Details), which are subdivided into seven elements: Purpose, State variables and scales, Process overview and scheduling, Design concepts, Initialization, Input, and Submodels. We explain which aspects of a model should be described in each element, and we present an example to illustrate the protocol in use. In addition, 19 examples are available in an Online Appendix. We consider ODD as a first step for establishing a more detailed common format of the description of IBMs and ABMs. Once initiated, the protocol will hopefully evolve as it becomes used by a sufficiently large proportion of modellers.
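The three blocks and seven elements described above can be sketched as a data structure; field and method names below are illustrative conveniences, not part of the ODD protocol itself:

```python
from dataclasses import dataclass

@dataclass
class ODDDescription:
    """Skeleton of ODD's three blocks / seven elements.

    Each field holds the free-text content of one element.
    """
    # Overview block
    purpose: str = ""
    state_variables_and_scales: str = ""
    process_overview_and_scheduling: str = ""
    # Design concepts block
    design_concepts: str = ""
    # Details block
    initialization: str = ""
    input: str = ""
    submodels: str = ""

    def is_complete(self) -> bool:
        """Crude completeness check: every element has some text."""
        return all(bool(v) for v in vars(self).values())

doc = ODDDescription(purpose="Explore emergence of flocking")
print(doc.is_complete())  # False until all seven elements are filled in
```

Such a skeleton could, for instance, let a journal or model repository reject submissions with empty ODD elements; the actual protocol prescribes content, not tooling.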
The Space Communications Protocol Standards Program
NASA Technical Reports Server (NTRS)
Jeffries, Alan; Hooke, Adrian J.
1994-01-01
In the fall of 1992 NASA and the Department of Defense chartered a technical team to explore the possibility of developing a common set of space data communications standards for potential dual-use across the U.S. national space mission support infrastructure. The team focused on the data communications needs of those activities associated with on-line control of civil and military spacecraft. A two-pronged approach was adopted: a top-down survey of representative civil and military space data communications requirements was conducted, and a bottom-up analysis of available standard data communications protocols was performed. A striking intersection of civil and military space mission requirements emerged, and an equally striking consensus on the approach towards joint civil and military space protocol development was reached. The team concluded that wide segments of the U.S. civil and military space communities have common needs for: (1) an efficient file transfer protocol; (2) various flavors of underlying data transport service; (3) an optional data protection mechanism to assure end-to-end security of message exchange; and (4) an efficient internetworking protocol. These recommendations led to initiating a program to develop a suite of protocols based on these findings. This paper describes the current status of this program.
NASA Astrophysics Data System (ADS)
Abercromby, Andrew F. J.; Conkin, Johnny; Gernhardt, Michael L.
2015-04-01
NASA's plans for future human exploration missions utilize a new atmosphere of 56.5 kPa (8.2 psia), 34% O2, 66% N2 to enable rapid extravehicular activity (EVA) capability with minimal gas losses; however, existing EVA prebreathe protocols to mitigate risk of decompression sickness (DCS) are not applicable to the new exploration atmosphere. We provide preliminary analysis of a 15-min prebreathe protocol and examine the potential benefits of intermittent recompression (IR) and an abbreviated N2 purge on crew time and gas consumables usage. A probabilistic model of decompression stress based on an established biophysical model of DCS risk was developed, providing significant (p<0.0001) prediction and goodness-of-fit with 84 cases of DCS in 668 human altitude exposures including a variety of pressure profiles. DCS risk for a 15-min prebreathe protocol was then estimated under different exploration EVA scenarios. Estimated DCS risk for all EVA scenarios modeled using the 15-min prebreathe protocol ranged between 6.1% and 12.1%. Supersaturation in neurological tissues (5- and 10-min half-time compartments) is prevented and tissue tensions in faster half-time compartments (≤40 min), where the majority of whole-body N2 is located, are reduced to about the levels (30.0 vs. 27.6 kPa) achieved during a standard Shuttle prebreathe protocol. IR reduced estimated DCS risk from 9.7% to 7.9% (1.8% reduction) and from 8.4% to 6.1% (2.3% reduction) for the scenarios modeled; the penalty of N2 reuptake during IR may be outweighed by the benefit of decreased bubble size. Savings of 75% of purge gas and time (0.22 kg gas and 6 min of crew time per person per EVA) are achievable by abbreviating the EVA suit purge to 20% N2 vs. 5% N2 at the expense of an increase in estimated DCS risk from 9.7% to 12.1% (2.4% increase). A 15-min prebreathe protocol appears feasible using the new exploration atmosphere. 
IR between EVAs may enable reductions in suit purge and prebreathe requirements, decompression stress, and/or suit operating pressures. Ground trial validation is required before operational implementation.
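The compartment arithmetic in this abstract can be illustrated with standard Haldanean single-exponential washout. This is a deliberate simplification of the paper's probabilistic DCS model, and the 74 kPa starting tension and 0 kPa inspired N2 are illustrative round numbers for sea-level air and a 100% O2 prebreathe:

```python
from math import exp, log

def tissue_tension(p0, p_insp, t_min, half_time_min):
    """Single-exponential N2 washout/uptake (kPa).

    p0: initial tissue N2 tension; p_insp: inspired N2 tension;
    t_min and half_time_min in minutes.
    """
    k = log(2) / half_time_min
    return p_insp + (p0 - p_insp) * exp(-k * t_min)

# Faster compartments off-gas much more during a short prebreathe,
# which is why a 15-min protocol mainly protects the <= 40-min tissues.
for ht in (5, 10, 20, 40):
    p = tissue_tension(74.0, 0.0, 15.0, ht)
    print(f"{ht:>2}-min half-time compartment after 15 min O2: {p:.1f} kPa")
```

For the 5-min compartment, 15 minutes is three half-times, leaving 74/8 = 9.25 kPa; the 40-min compartment retains most of its N2, matching the abstract's emphasis on the faster tissues.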
Loftus, Tyler J; Mira, Juan C; Ozrazgat-Baslanti, Tezcan; Ghita, Gabriella L; Wang, Zhongkai; Stortz, Julie A; Brumback, Babette A; Bihorac, Azra; Segal, Mark S; Anton, Stephen D; Leeuwenburgh, Christiaan; Mohr, Alicia M; Efron, Philip A; Moldawer, Lyle L; Moore, Frederick A; Brakenridge, Scott C
2017-08-01
Sepsis is a common, costly and morbid cause of critical illness in trauma and surgical patients. Ongoing advances in sepsis resuscitation and critical care support strategies have led to improved in-hospital mortality. However, these patients now survive to enter a state of chronic critical illness (CCI): persistent low-grade organ dysfunction and poor long-term outcomes driven by the persistent inflammation, immunosuppression and catabolism syndrome (PICS). The Sepsis and Critical Illness Research Center (SCIRC) was created to provide a platform by which the prevalence and pathogenesis of CCI and PICS may be understood at a mechanistic level across multiple medical disciplines, leading to the development of novel management strategies and targeted therapies. Here, we describe the design, study cohort and standard operating procedures used in the prospective study of human sepsis at a level 1 trauma centre and tertiary care hospital providing care for over 2600 critically ill patients annually. These procedures include implementation of an automated sepsis surveillance initiative, augmentation of clinical decisions with a computerised sepsis protocol, strategies for direct exportation of quality-filtered data from the electronic medical record to a research database and robust long-term follow-up. This study has been registered at ClinicalTrials.gov, approved by the University of Florida Institutional Review Board and is actively enrolling subjects. Dissemination of results is forthcoming. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Pulmonary outcome of esophageal atresia patients and its potential causes in early childhood.
Dittrich, René; Stock, Philippe; Rothe, Karin; Degenhardt, Petra
2017-08-01
The aim of this study was to illustrate the long-term pulmonary outcome of patients with repaired esophageal atresia and to further examine causes and correlations that might have led to this outcome. Twenty-seven of 62 possible patients (43%) aged 5-20 years, with repaired esophageal atresia, were recruited. Body plethysmography and spirometry were performed to evaluate lung function, and the Bruce protocol treadmill exercise test to assess physical fitness. Results were correlated to conditions such as interpouch distance, gastroesophageal reflux, or duration of post-operative mechanical ventilation. Seventeen participants (63%) showed abnormal lung function at rest or after exercise. Restrictive ventilatory defects (solely restrictive or combined) were found in 11 participants (41%), and obstructive ventilatory defects (solely obstructive or combined) in 13 subjects (48%). Twenty-two participants (81%) performed the Bruce protocol treadmill exercise test to standard. The treadmill exercise results, expressed as z-scores, were significantly below the standard population mean (z-score = -1.40). Moreover, significant correlations were described between restrictive ventilatory defects and the interpouch distance, the duration of post-operative ventilation, gastroesophageal reflux disease, and recurrent aspiration pneumonia during infancy. It was shown that esophageal atresia and its associated early complications have a significant impact on long-term pulmonary outcomes such as abnormal lung function and, in particular, restrictive ventilatory defects. Long-running and regular follow-up of patients with congenital esophageal atresia is necessary in order to detect and react to the development and progression of associated complications such as ventilation disorders or gastroesophageal reflux disease. Prognosis study, Level II. Copyright © 2016 Elsevier Inc. All rights reserved.
Louisa Poon, W Y; Covington, Jennifer P; Dempsey, Lauren S; Goetgeluck, Scott L; Marscher, William F; Morelli, Sierra C; Powell, Jana E; Rivers, Elizabeth M; Roth, Ira G
2014-01-01
This article provides an introduction to the use of students' business skills in optimizing teaching opportunities, student learning, and client satisfaction in a primary health care setting at a veterinary teaching hospital. Seven veterinary-student members of the local chapter of the Veterinary Business Management Association (VBMA) evaluated the primary-care service at the University of Georgia (UGA) veterinary teaching hospital and assessed six areas of focus: (1) branding and marketing, (2) client experience, (3) staff and staffing, (4) student experience, (5) time management, and (6) standard operating procedures and protocols. For each area of focus, strengths, weaknesses, opportunities, and threats were identified. Of the six areas, two were identified as areas in need of immediate improvement, the first being the updating of standard operating protocols and the second being time management and the flow of appointments. Recommendations made for these two areas were implemented. Overall, the staff and students provided positive feedback on the recommended changes. Through such a student-centered approach to improving the quality of their education, students are empowered and are held accountable for their learning environment. The fact that the VBMA functions without a parent organization and that the primary-care service at UGA functions primarily as a separate entity from the specialty services at the College of Veterinary Medicine allowed students to have a direct impact on their learning environment. We hope that this model for advancing business education will be studied and promoted to benefit both veterinary education and business practice within academia.
Cho, S H; Lowenstein, J R; Balter, P A; Wells, N H; Hanson, W F
2000-01-01
A new calibration protocol, developed by the AAPM Task Group 51 (TG-51) to replace the TG-21 protocol, is based on an absorbed-dose to water standard and calibration factor (N(D,w)), while the TG-21 protocol is based on an exposure (or air-kerma) standard and calibration factor (N(x)). Because of differences between these standards and the two protocols, the results of clinical reference dosimetry based on TG-51 may be somewhat different from those based on TG-21. The Radiological Physics Center has conducted a systematic comparison between the two protocols, in which photon and electron beam outputs following both protocols were compared under identical conditions. Cylindrical chambers used in this study were selected from the list given in the TG-51 report, covering the majority of current manufacturers. Measured ratios between absorbed-dose and air-kerma calibration factors, derived from the standards traceable to the NIST, were compared with calculated values using the TG-21 protocol. The comparison suggests that there is roughly a 1% discrepancy between measured and calculated ratios. This discrepancy may provide a reasonable measure of possible changes between the absorbed-dose to water determined by TG-51 and that determined by TG-21 for photon beam calibrations. The typical change in a 6 MV photon beam calibration following the implementation of the TG-51 protocol was about 1%, regardless of the chamber used, and the change was somewhat smaller for an 18 MV photon beam. On the other hand, the results for 9 and 16 MeV electron beams show larger changes, up to 2%, perhaps because of the updated electron stopping-power data used for the TG-51 protocol, in addition to the inherent 1% discrepancy present in the calibration factors. The results also indicate that the changes may be dependent on the electron energy.
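The roughly 1% shift can be expressed with the two dose formalisms. TG-51 computes D_w = M · kQ · N(D,w); the sketch below collapses TG-21's chain of factors into a single product for illustration only, and all numbers are invented, chosen solely to reproduce a 1% difference:

```python
def dose_tg51(M, kQ, N_Dw):
    """TG-51 reference dose to water: D_w = M * kQ * N_D,w (Gy)."""
    return M * kQ * N_Dw

def dose_tg21(M, factors_product):
    """TG-21 dose with its factor chain (N_gas, P_ion, P_wall,
    stopping-power ratios, ...) collapsed into one product; the real
    protocol evaluates each factor separately."""
    return M * factors_product

# Invented illustrative values: same electrometer reading M, with the
# TG-21 factor product set 1% below the TG-51 product.
M = 20.0e-9                                   # reading, C (illustrative)
d51 = dose_tg51(M, 0.992, 5.0e7)              # Gy
d21 = dose_tg21(M, 0.992 * 5.0e7 / 1.01)      # Gy
pct = 100.0 * (d51 - d21) / d21
print(f"TG-51 vs TG-21 change: {pct:+.2f}%")
```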
Pinkerton, Steven D.; Pearson, Cynthia R.; Eachus, Susan R.; Berg, Karina M.; Grimes, Richard M.
2008-01-01
Maximizing our economic investment in HIV prevention requires balancing the costs of candidate interventions against their effects and selecting the most cost-effective interventions for implementation. However, many HIV prevention intervention trials do not collect cost information, and those that do use a variety of cost data collection methods and analysis techniques. Standardized cost data collection procedures, instrumentation, and analysis techniques are needed to facilitate the task of assessing intervention costs and to ensure comparability across intervention trials. This article describes the basic elements of a standardized cost data collection and analysis protocol and outlines a computer-based approach to implementing this protocol. Ultimately, the development of such a protocol would require contributions and “buy-in” from a diverse range of stakeholders, including HIV prevention researchers, cost-effectiveness analysts, community collaborators, public health decision makers, and funding agencies. PMID:18301128
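A standardized cost analysis of the kind described ultimately feeds a comparison such as the incremental cost-effectiveness ratio; a minimal sketch with invented numbers:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit
    of effect (e.g. dollars per HIV infection averted)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Invented illustration: an intervention costing $50,000 more than the
# comparator while averting 4 additional infections.
print(icer(150_000, 100_000, 10, 6))  # 12500.0 dollars per infection averted
```

Comparability across trials, the article's concern, matters precisely because both the numerator and denominator of this ratio depend on how costs and effects were collected.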
Clinical Components of Telemedicine Programs for Diabetic Retinopathy.
Horton, Mark B; Silva, Paolo S; Cavallerano, Jerry D; Aiello, Lloyd Paul
2016-12-01
Diabetic retinopathy is a leading cause of new-onset vision loss worldwide. Treatments supported by large clinical trials are effective in preserving vision, but many persons do not receive timely diagnosis and treatment of diabetic retinopathy, which is typically asymptomatic when most treatable. Telemedicine evaluation to identify diabetic retinopathy has the potential to improve access to care, but there are no universal standards regarding camera choice or protocol for ocular telemedicine. We review the literature regarding the impact of imaging device, number and size of retinal images, pupil dilation, type of image grader, and diagnostic accuracy on telemedicine assessment for diabetic retinopathy. Telemedicine assessment of diabetic retinopathy has the potential to preserve vision, but further development of telemedicine specific technology and standardization of operations are needed to better realize its potential.