Sample records for design processing protocols

  1. The Development of a Design and Construction Process Protocol to Support the Home Modification Process Delivered by Occupational Therapists

    PubMed Central

    Russell, Rachel; Ormerod, Marcus; Newton, Rita

    2018-01-01

    Modifying the home environments of older people as they age in place is a well-established health and social care intervention. Using design and construction methods to redress any imbalance caused by the ageing process or disability within the home environment, occupational therapists are seen as the experts in this field of practice. However, the process used by occupational therapists when modifying home environments has been criticised for being disorganised and not founded on theoretical principles and concepts underpinning the profession. To address this issue, research was conducted to develop a design and construction process protocol specifically for home modifications. A three-stage approach was taken for the analysis of qualitative data generated from an online survey, completed by 135 occupational therapists in the UK. Using both the existing occupational therapy intervention process model and the design and construction process protocol as the theoretical frameworks, a 4-phase, 9-subphase design and construction process protocol for home modifications was developed. Overall, the study is innovative in developing the first process protocol for home modifications, potentially providing occupational therapists with a systematic and effective approach to the design and delivery of home modification services for older and disabled people. PMID:29682348

  2. The Development of a Design and Construction Process Protocol to Support the Home Modification Process Delivered by Occupational Therapists.

    PubMed

    Russell, Rachel; Ormerod, Marcus; Newton, Rita

    2018-01-01

    Modifying the home environments of older people as they age in place is a well-established health and social care intervention. Using design and construction methods to redress any imbalance caused by the ageing process or disability within the home environment, occupational therapists are seen as the experts in this field of practice. However, the process used by occupational therapists when modifying home environments has been criticised for being disorganised and not founded on theoretical principles and concepts underpinning the profession. To address this issue, research was conducted to develop a design and construction process protocol specifically for home modifications. A three-stage approach was taken for the analysis of qualitative data generated from an online survey, completed by 135 occupational therapists in the UK. Using both the existing occupational therapy intervention process model and the design and construction process protocol as the theoretical frameworks, a 4-phase, 9-subphase design and construction process protocol for home modifications was developed. Overall, the study is innovative in developing the first process protocol for home modifications, potentially providing occupational therapists with a systematic and effective approach to the design and delivery of home modification services for older and disabled people.

  3. Representing the work of medical protocols for organizational simulation.

    PubMed Central

    Fridsma, D. B.

    1998-01-01

    Developing and implementing patient care protocols within a specific organizational setting requires knowledge of the protocol, the organization, and the way in which the organization does its work. Computer-based simulation tools have been used in many industries to provide managers with prospective insight into problems of work process and organization design mismatch. Many of these simulation tools are designed for well-understood routine work processes in which there are few contingent tasks. In this paper, we describe the theoretical foundations that make it possible to simulate medical protocols using an information-processing theory framework. These simulations will allow medical administrators to test different protocol and organizational designs before actually using them within a particular clinical setting. PMID:9929231
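
    To make the simulation idea concrete, the toy sketch below assigns protocol tasks to the earliest-available staff member and compares organizational designs by varying the staffing level. Task names, durations, and the scheduling rule are hypothetical; this is not the paper's simulation framework.

      # Toy discrete-event sketch: protocol work flowing through an organization.
      # Illustrative only; not the simulation system described in the paper.
      def makespan(tasks, staff_count):
          """tasks: list of (name, duration_hours); returns total elapsed hours."""
          free_at = [0.0] * staff_count          # next free time of each worker
          for name, duration in tasks:           # tasks handled in protocol order
              i = min(range(staff_count), key=lambda k: free_at[k])
              free_at[i] += duration             # assign to least-loaded worker
          return max(free_at)

      protocol = [("intake", 0.5), ("exam", 1.0), ("order labs", 0.25), ("labs", 2.0)]
      for staff in (1, 2, 3):                    # compare organizational designs
          print(staff, "staff ->", makespan(protocol, staff), "hours")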

  4. 40 CFR Appendix A - Protocol for Using an Electrochemical Analyzer to Determine Oxygen and Carbon Monoxide...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., and Process Heaters Using Portable Analyzers”, EMC Conditional Test Protocol 30 (CTM-30), Gas Research... cell design(s) conforming to this protocol will determine the analytical range for each gas component..., selective gas scrubbers, etc.) to meet the design specifications of this protocol. Do not make changes to...

  5. 40 CFR Appendix A - Protocol for Using an Electrochemical Analyzer to Determine Oxygen and Carbon Monoxide...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., and Process Heaters Using Portable Analyzers”, EMC Conditional Test Protocol 30 (CTM-30), Gas Research... cell design(s) conforming to this protocol will determine the analytical range for each gas component..., selective gas scrubbers, etc.) to meet the design specifications of this protocol. Do not make changes to...

  6. j5 DNA assembly design automation.

    PubMed

    Hillson, Nathan J

    2014-01-01

    Modern standardized methodologies, described in detail in the previous chapters of this book, have enabled the software-automated design of optimized DNA construction protocols. This chapter describes how to design (combinatorial) scar-less DNA assembly protocols using the web-based software j5. j5 assists biomedical and biotechnological researchers in constructing DNA by automating the design of optimized protocols for flanking homology sequence as well as type IIS endonuclease-mediated DNA assembly methodologies. Unlike any other software tool available today, j5 designs scar-less combinatorial DNA assembly protocols, performs a cost-benefit analysis to identify which portions of an assembly process would be less expensive to outsource to a DNA synthesis service provider, and designs hierarchical DNA assembly strategies to mitigate anticipated poor assembly junction sequence performance. Software integrated with j5 adds significant value to the j5 design process through graphical user-interface enhancement and downstream liquid-handling robotic laboratory automation.
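
    The cost-benefit step described above can be illustrated with a small sketch: compare the quoted price of outsourcing a fragment to a synthesis provider against the consumables cost of assembling it in-house. All prices and the decision rule here are placeholder assumptions, not j5's actual cost model.

      # Sketch of a synthesize-vs-assemble cost comparison (hypothetical rates).
      def cheaper_to_synthesize(length_bp, n_junctions,
                                synth_cost_per_bp=0.09,  # assumed vendor rate, $/bp
                                oligo_cost=8.0,          # assumed $ per junction's oligos
                                reaction_cost=5.0):      # assumed $ per assembly reaction
          synthesis = length_bp * synth_cost_per_bp
          assembly = n_junctions * oligo_cost + reaction_cost
          return synthesis < assembly

      # Short part with many junctions: outsourcing wins under these assumptions.
      print(cheaper_to_synthesize(length_bp=300, n_junctions=4))   # True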

  7. 34 CFR Appendix to Part 5 - Unknown Title

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the Department. Research protocol, design, processing, and other technical information to the extent... report submitted for comment prior to acceptance. Research protocol, design, processing, and other...-10) Pt. 5, App. Appendix to Part 5 [The following are some examples of specific records (or specific...

  8. A Novel Re-keying Function Protocol (NRFP) For Wireless Sensor Network Security

    PubMed Central

    Abdullah, Maan Younis; Hua, Gui Wei; Alsharabi, Naif

    2008-01-01

    This paper describes a novel re-keying function protocol (NRFP) for wireless sensor network security. A re-keying process management system for sensor networks is designed to support in-network processing. The design of the protocol is motivated by decentralized key management for wireless sensor networks (WSNs), covering key deployment, key refreshment, and key establishment. NRFP supports the establishment of novel administrative functions for sensor nodes that derive/re-derive a session key for each communication session. The protocol proposes direct connection, indirect connection, and hybrid connection modes. NRFP also includes an efficient protocol for local broadcast authentication based on the use of one-way key chains. A salient feature of the authentication protocol is that it supports source authentication without precluding in-network processing. Security and performance analysis shows that NRFP is very efficient in computation, communication, and storage, and that it is also effective in defending against many sophisticated attacks. PMID:27873963
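
    The one-way key chain used for broadcast authentication is a standard construction: the sender commits to the end of a hash chain, then discloses keys in reverse order, and receivers check each disclosed key by hashing it back to the commitment. A minimal sketch of that generic mechanism (not NRFP's exact message formats):

      # One-way key chain broadcast authentication, generic sketch.
      import hashlib, hmac

      def H(x: bytes) -> bytes:
          return hashlib.sha256(x).digest()

      # Sender: chain[j] = H^j(seed); publish chain[n] as the commitment.
      n = 5
      chain = [b"secret-seed"]
      for _ in range(n):
          chain.append(H(chain[-1]))
      commitment = chain[n]                # distributed to receivers in advance

      # Receiver: a key disclosed for interval i must hash back to the commitment.
      def verify_key(disclosed: bytes, i: int) -> bool:
          x = disclosed
          for _ in range(i):
              x = H(x)
          return x == commitment

      key_1 = chain[n - 1]                 # key disclosed for interval 1
      print(verify_key(key_1, 1))          # True
      # Messages sent during interval 1 carry a MAC under key_1:
      tag = hmac.new(key_1, b"sensor reading", hashlib.sha256).hexdigest()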

  9. A Novel Re-keying Function Protocol (NRFP) For Wireless Sensor Network Security.

    PubMed

    Abdullah, Maan Younis; Hua, Gui Wei; Alsharabi, Naif

    2008-12-04

    This paper describes a novel re-keying function protocol (NRFP) for wireless sensor network security. A re-keying process management system for sensor networks is designed to support in-network processing. The design of the protocol is motivated by decentralized key management for wireless sensor networks (WSNs), covering key deployment, key refreshment, and key establishment. NRFP supports the establishment of novel administrative functions for sensor nodes that derive/re-derive a session key for each communication session. The protocol proposes direct connection, indirect connection, and hybrid connection modes. NRFP also includes an efficient protocol for local broadcast authentication based on the use of one-way key chains. A salient feature of the authentication protocol is that it supports source authentication without precluding in-network processing. Security and performance analysis shows that NRFP is very efficient in computation, communication, and storage, and that it is also effective in defending against many sophisticated attacks.

  10. Cure Cycle Design Methodology for Fabricating Reactive Resin Matrix Fiber Reinforced Composites: A Protocol for Producing Void-free Quality Laminates

    NASA Technical Reports Server (NTRS)

    Hou, Tan-Hung

    2014-01-01

    For the fabrication of resin matrix fiber reinforced composite laminates, a workable cure cycle (i.e., temperature and pressure profiles as a function of processing time) is needed and is critical for achieving void-free laminate consolidation. Design of such a cure cycle is not trivial, especially when dealing with reactive matrix resins. An empirical "trial and error" approach has been used as common practice in the composite industry. Such an approach is not only costly, but also ineffective at establishing the optimal processing conditions for a specific resin/fiber composite system. In this report, a rational "processing science" based approach is established, and a universal cure cycle design protocol is proposed. Following this protocol, a workable and optimal cure cycle can be readily and rationally designed for most reactive resin systems in a cost-effective way. This design protocol has been validated through experimental studies of several reactive polyimide composites for a wide spectrum of usage, as documented in previous publications.
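
    A cure cycle of the kind described reduces, in software terms, to staged temperature and pressure set points over processing time. The sketch below uses invented set points purely for illustration; the report's validated cycles are not reproduced here.

      # Hypothetical staged cure cycle: (end_time_min, temperature_C, pressure_kPa),
      # each stage held until its end time. Values are placeholders.
      CURE_CYCLE = [
          (60,  120, 101),    # low-temperature dwell (assumed stage)
          (120, 200, 101),    # devolatilization hold (assumed stage)
          (240, 315, 1380),   # pressure applied for consolidation (assumed stage)
      ]

      def setpoints(t_min):
          """Return (temperature_C, pressure_kPa) commanded at time t_min."""
          for end, temp, pres in CURE_CYCLE:
              if t_min <= end:
                  return temp, pres
          return CURE_CYCLE[-1][1:]        # hold final conditions

      print(setpoints(90))                 # (200, 101)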

  11. Design and Verification of a Distributed Communication Protocol

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.

  12. Compositional Verification of a Communication Protocol for a Remotely Operated Vehicle

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.; Munoz, Cesar A.

    2009-01-01

    This paper presents the specification and verification in the Prototype Verification System (PVS) of a protocol intended to facilitate communication in an experimental remotely operated vehicle used by NASA researchers. The protocol is defined as a stack-layered composition of simpler protocols. It can be seen as the vertical composition of protocol layers, where each layer performs input and output message processing, and the horizontal composition of different processes concurrently inhabiting the same layer, where each process satisfies a distinct requirement. It is formally proven that the protocol components satisfy certain delivery guarantees. Compositional techniques are used to prove these guarantees also hold in the composed system. Although the protocol itself is not novel, the methodology employed in its verification extends existing techniques by automating the tedious and usually cumbersome part of the proof, thereby making the iterative design process of protocols feasible.

  13. 77 FR 47707 - Public Housing Assessment System (PHAS): Physical Condition Scoring Notice and Revised Dictionary...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-09

    ... Standards (UPCS) inspection protocol was designed to be a uniform inspection process and standard for HUD's... frequency of inspections based on the results of the UPCS inspection. UPCS was designed to assess the condition... physical assessment score. HUD Response: The UPCS inspection protocol as designed assesses the physical...

  14. Predicting Silk Fiber Mechanical Properties through Multiscale Simulation and Protein Design.

    PubMed

    Rim, Nae-Gyune; Roberts, Erin G; Ebrahimi, Davoud; Dinjaski, Nina; Jacobsen, Matthew M; Martín-Moldes, Zaira; Buehler, Markus J; Kaplan, David L; Wong, Joyce Y

    2017-08-14

    Silk is a promising material for biomedical applications, and much research is focused on how application-specific, mechanical properties of silk can be designed synthetically through proper amino acid sequences and processing parameters. This protocol describes an iterative process between research disciplines that combines simulation, genetic synthesis, and fiber analysis to better design silk fibers with specific mechanical properties. Computational methods are used to assess the protein polymer structure as it forms an interconnected fiber network through shearing and how this process affects fiber mechanical properties. Model outcomes are validated experimentally with the genetic design of protein polymers that match the simulation structures, fiber fabrication from these polymers, and mechanical testing of these fibers. Through iterative feedback between computation, genetic synthesis, and fiber mechanical testing, this protocol will enable a priori prediction capability of recombinant material mechanical properties via insights from the resulting molecular architecture of the fiber network based entirely on the initial protein monomer composition. This style of protocol may be applied to other fields where a research team seeks to design a biomaterial with biomedical application-specific properties. This protocol highlights when and how the three research groups (simulation, synthesis, and engineering) should be interacting to arrive at the most effective method for predictive design of their material.

  15. DEVELOPMENT OF A RATIONALLY BASED DESIGN PROTOCOL FOR THE ULTRAVIOLET LIGHT DISINFECTION PROCESS

    EPA Science Inventory

    A protocol is demonstrated for the design and evaluation of ultraviolet (UV) disinfection systems based on a mathematical model. The disinfection model incorporates the system's physical dimensions, the residence time distribution of the reactor and dispersion characteristics, th...

  16. Evaluating the Process of Generating a Clinical Trial Protocol

    PubMed Central

    Franciosi, Lui G.; Butterfield, Noam N.; MacLeod, Bernard A.

    2002-01-01

    The research protocol is the principal document in the conduct of a clinical trial. Its generation requires knowledge about the research problem, the potential experimental confounders, and the relevant Good Clinical Practices for conducting the trial. However, such information is not always available to authors during the writing process. A checklist of over 80 items has been developed to better understand the considerations made by authors in generating a protocol. It is based on the most cited requirements for designing and implementing the randomised controlled trial. Items are categorised according to the trial's research question, experimental design, statistics, ethics, and standard operating procedures. This quality assessment tool evaluates the extent to which a generated protocol deviates from the best-planned clinical trial.

  17. How system designers think: a study of design thinking in human factors engineering.

    PubMed

    Papantonopoulos, Sotiris

    2004-11-01

    The paper presents a descriptive study of design thinking in human factors engineering. The objective of the study is to analyse the role of interpretation in design thinking and the role of design practice in guiding interpretation. The study involved 10 system designers undertaking the allocation of cognitive functions in three production planning and control task scenarios. Allocation decisions were recorded, and verbal protocols of the design process were collected to elicit the subjects' thought processes. Verbal protocol analysis showed that, from their initial design deliberations, subjects approached the design of cognitive task allocation as a problem of applying a selected automation technology. This design strategy stands in contrast to the predominant view of system design, which stipulates that user requirements should be thoroughly analysed prior to making any decisions about technology. Theoretical frameworks from design research and ontological design showed that the system design process may be better understood by recognizing the role of design hypotheses in system design, as well as the diverse interactions between interpretation and practice, means and ends, and design practice and the designer's pre-understanding, which together shape the design process. Ways to balance the bias exerted on the design process are discussed.

  18. Protocol Processing for 100 Gbit/s and Beyond - A Soft Real-Time Approach in Hardware and Software

    NASA Astrophysics Data System (ADS)

    Büchner, Steffen; Lopacinski, Lukasz; Kraemer, Rolf; Nolte, Jörg

    2017-09-01

    100 Gbit/s wireless communication protocol processing stresses every part of a communication system to the utmost. The efficient use of upcoming 100 Gbit/s and beyond transmission technology requires rethinking the way protocols are processed by the communication endpoints. This paper summarizes the achievements of the project End2End100. We will present a comprehensive soft real-time stream processing approach that allows the protocol designer to develop, analyze, and plan scalable protocols for ultra-high data rates of 100 Gbit/s and beyond. Furthermore, we will present an ultra-low power, adaptable, and massively parallelized FEC (Forward Error Correction) scheme that detects and corrects bit errors at line rate with an energy consumption between 1 pJ/bit and 13 pJ/bit. The evaluation results discussed in this publication show that our comprehensive approach allows end-to-end communication with very low protocol processing overhead.
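
    The quoted per-bit FEC energies translate directly into power dissipation at line rate (P = R * E_bit), a quick check of what "ultra-low power" means here:

      # Power implied by the quoted FEC energies at the 100 Gbit/s line rate.
      rate = 100e9                         # bits per second
      for e_bit in (1e-12, 13e-12):        # 1 pJ/bit and 13 pJ/bit, from the abstract
          print(rate * e_bit, "W")         # 0.1 W and 1.3 W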

  19. The use of high-fidelity human patient simulation as an evaluative tool in the development of clinical research protocols and procedures.

    PubMed

    Wright, Melanie C; Taekman, Jeffrey M; Barber, Linda; Hobbs, Gene; Newman, Mark F; Stafford-Smith, Mark

    2005-12-01

    Errors in clinical research can be costly, in terms of patient safety, data integrity, and data collection. Data inaccuracy in early subjects of a clinical study may be associated with problems in the design of the protocol, procedures, and data collection tools. High-fidelity patient simulation centers provide an ideal environment to apply human-centered design to clinical trial development. A draft of a complex clinical protocol was designed, evaluated and modified using a high-fidelity human patient simulator in the Duke University Human Simulation and Patient Safety Center. The process included walk-throughs, detailed modifications of the protocol and development of procedural aids. Training of monitors and coordinators provided an opportunity for observation of performance that was used to identify further improvements to the protocol. Evaluative steps were used to design the research protocol and procedures. Iterative modifications were made to the protocol and data collection tools. The success in use of human simulation in the preparation of a complex clinical drug trial suggests the benefits of human patient simulation extend beyond training and medical equipment evaluation. Human patient simulation can provide a context for informal expert evaluation of clinical protocol design and for formal "rehearsal" to evaluate the efficacy of procedures and support tools.

  20. Description and Evaluation of the Research Ethics Review Process in Japan: Proposed Measures for Improvement.

    PubMed

    Suzuki, Mika; Sato, Keiko

    2016-07-01

    Research Ethics Committees (RECs) are designed to protect human subjects in research. It is essential to recognize whether the RECs are achieving this goal. Several studies have reported on RECs; however, detailed data regarding the quality of research protocols and the review process of RECs have not been reported in Japan. We examine research protocols reviewed by RECs and the review processes at three institutions using a novel checklist we developed. The data show that approximately half of all examined protocols lacked a clearly written "Background" section that defines the study rationale and design. These results reiterate suggestions made in previous research regarding educational programs and support departments that could enhance responsible conduct in clinical research to protect human subjects in Japan.

  21. Design of freeze-drying processes for pharmaceuticals: practical advice.

    PubMed

    Tang, Xiaolin; Pikal, Michael J

    2004-02-01

    Design of freeze-drying processes is often approached with a "trial and error" experimental plan or, worse yet, the protocol used in the first laboratory run is adopted without further attempts at optimization. Consequently, commercial freeze-drying processes are often neither robust nor efficient. It is our thesis that design of an "optimized" freeze-drying process is not particularly difficult for most products, as long as some simple rules based on well-accepted scientific principles are followed. It is the purpose of this review to discuss the scientific foundations of the freeze-drying process design and then to consolidate these principles into a set of guidelines for rational process design and optimization. General advice is given concerning common stability issues with proteins, but unusual and difficult stability issues are beyond the scope of this review. Control of ice nucleation and crystallization during the freezing step is discussed, and the impact of freezing on the rest of the process and final product quality is reviewed. Representative freezing protocols are presented. The significance of the collapse temperature and the thermal transition, denoted Tg', are discussed, and procedures for the selection of the "target product temperature" for primary drying are presented. Furthermore, guidelines are given for selection of the optimal shelf temperature and chamber pressure settings required to achieve the target product temperature without thermal and/or mass transfer overload of the freeze dryer. Finally, guidelines and "rules" for optimization of secondary drying and representative secondary drying protocols are presented.
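
    The "target product temperature" selection the review develops can be condensed into a rule of thumb: stay a small safety margin below the product's critical temperature (the collapse temperature, or Tg' for amorphous formulations). A minimal sketch, with the margin value an assumption rather than the review's prescription:

      # Rule-of-thumb target product temperature for primary drying (sketch).
      def target_product_temperature(t_collapse_C, t_g_prime_C, margin_C=2.0):
          critical = min(t_collapse_C, t_g_prime_C)   # most conservative limit
          return critical - margin_C                  # stay margin_C below it

      print(target_product_temperature(-32.0, -35.0)) # -37.0 C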

  22. Evaluation of the Telecommunications Protocol Processing Subsystem Using Reconfigurable Interoperable Gate Array

    NASA Technical Reports Server (NTRS)

    Pang, Jackson; Liddicoat, Albert; Ralston, Jesse; Pingree, Paula

    2006-01-01

    The current implementation of the Telecommunications Protocol Processing Subsystem Using Reconfigurable Interoperable Gate Arrays (TRIGA) is equipped with CFDP protocol and CCSDS Telemetry and Telecommand framing schemes to replace the CPU intensive software counterpart implementation for reliable deep space communication. We present the hardware/software co-design methodology used to accomplish high data rate throughput. The hardware CFDP protocol stack implementation is then compared against the two recent flight implementations. The results from our experiments show that TRIGA offers more than 3 orders of magnitude throughput improvement with less than one-tenth of the power consumption.

  23. Sketching in Design Journals: An Analysis of Visual Representations in the Product Design Process

    ERIC Educational Resources Information Center

    Lau, Kimberly; Oehlberg, Lora; Agogino, Alice

    2009-01-01

    This paper explores the sketching behavior of designers and the role of sketching in the design process. Observations from a descriptive study of sketches provided in design journals, characterized by a protocol measuring sketching activities, are presented. A distinction is made between journals that are entirely tangible and those that contain…

  24. Factors affecting adoption, implementation fidelity, and sustainability of the Redesigned Community Health Fund in Tanzania: a mixed methods protocol for process evaluation in the Dodoma region

    PubMed Central

    Kalolo, Albino; Radermacher, Ralf; Stoermer, Manfred; Meshack, Menoris; De Allegri, Manuela

    2015-01-01

    Background: Despite the implementation of various initiatives to address low enrollment in voluntary micro health insurance (MHI) schemes in sub-Saharan Africa, the problem of low enrollment remains unresolved. The lack of process evaluations of such interventions makes it difficult to ascertain whether their poor results are because of design failures or implementation weaknesses. Objective: In this paper, we describe a process evaluation protocol aimed at opening the ‘black box’ to evaluate the implementation processes of the Redesigned Community Health Fund (CHF) program in the Dodoma region of Tanzania. Design: The study employs a cross-sectional mixed methods design and is being carried out 3 years after the launch of the Redesigned CHF program. The study is grounded in a conceptual framework which rests on the Diffusion of Innovation Theory and the Implementation Fidelity Framework. The study utilizes a mixture of quantitative and qualitative data collection tools (questionnaires, focus group discussions, in-depth interviews, and document review), and aligns the evaluation to the Theory of Intervention developed by our team. Quantitative data will be used to measure program adoption, implementation fidelity, and their moderating factors. Qualitative data will be used to explore the responses of stakeholders to the intervention, contextual factors, and moderators of adoption, implementation fidelity, and sustainability. Discussion: This protocol describes a systematic process evaluation in relation to the implementation of a reformed MHI. We trust that the theoretical approaches and methodologies described in our protocol may be useful to inform the design of future process evaluations focused on the assessment of complex interventions, such as MHI schemes. PMID:26679408

  25. Analytical Models of Cross-Layer Protocol Optimization in Real-Time Wireless Sensor Ad Hoc Networks

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    The real-time interactions among the nodes of a wireless sensor network (WSN) to cooperatively process data from multiple sensors are modeled. Quality-of-service (QoS) metrics are associated with the quality of fused information: throughput, delay, packet error rate, etc. Multivariate point process (MVPP) models of discrete random events in WSNs establish stochastic characteristics of optimal cross-layer protocols. Discrete-event, cross-layer interactions in mobile ad hoc network (MANET) protocols have been modeled using a set of concatenated design parameters and associated resource levels by the MVPPs. Characterization of the "best" cross-layer designs for a MANET is formulated by applying the general theory of martingale representations to controlled MVPPs. Performance is described in terms of concatenated protocol parameters and controlled through conditional rates of the MVPPs. Modeling limitations to determination of closed-form solutions versus explicit iterative solutions for ad hoc WSN controls are examined.

  26. Quantum Tomography Protocols with Positivity are Compressed Sensing Protocols (Open Access)

    DTIC Science & Technology

    Kalev, Amir; Kosut, Robert L; Deutsch, Ivan H

    2015-12-08

    Characterising complex quantum systems is a vital task in quantum information science. Quantum tomography, the standard tool used for this purpose, uses a well-designed measurement record to reconstruct quantum states and processes. It is, however, notoriously inefficient. Recently, the classical signal...

  27. The Growing Awareness Inventory: Building Capacity for Culturally Responsive Science and Mathematics with a Structured Observation Protocol

    ERIC Educational Resources Information Center

    Brown, Julie C.; Crippen, Kent J.

    2016-01-01

    This study represents a first iteration in the design process of the Growing Awareness Inventory (GAIn), a structured observation protocol for building the awareness of preservice teachers (PSTs) for resources in mathematics and science classrooms that can be used for culturally responsive pedagogy (CRP). The GAIn is designed to develop awareness…

  28. Electronic-To-Optical-To-Electronic Packet-Data Conversion

    NASA Technical Reports Server (NTRS)

    Monacos, Steve

    1996-01-01

    Space-time multiplexer (STM) cell-based communication system designed to take advantage of both high throughput attainable in optical transmission links and flexibility and functionality of electronic processing, storage, and switching. Long packets segmented and transmitted optically by wavelength-division multiplexing. Performs optoelectronic and protocol conversion between electronic "store-and-forward" protocols and optical "hot-potato" protocols.

  29. An Approach to Verification and Validation of a Reliable Multicasting Protocol

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.

    1994-01-01

    This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test was different between the model and implementation, then the differences helped identify inconsistencies between the model and implementation. The dialogue between both teams drove the co-evolution of the model and implementation. Testing served as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP.
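
    The model-based testing loop described here has a simple skeleton: drive the formal state model and the implementation with the same event sequence and flag the first divergence. The sketch below uses an invented three-state machine, not RMP's actual protocol states.

      # Model-based testing skeleton: compare model and implementation step by step.
      MODEL = {("idle", "join"): "member",
               ("member", "data"): "member",
               ("member", "leave"): "idle"}

      class Implementation:                    # stand-in for the real protocol code
          def __init__(self):
              self.state = "idle"
          def step(self, event):
              self.state = MODEL.get((self.state, event), self.state)
              return self.state

      def run_test(events):
          model_state, impl = "idle", Implementation()
          for e in events:
              model_state = MODEL.get((model_state, e), model_state)
              got = impl.step(e)
              if got != model_state:           # model/implementation inconsistency
                  return f"divergence on {e!r}: model={model_state}, impl={got}"
          return "consistent"

      print(run_test(["join", "data", "leave"]))   # consistent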

  30. An approach to verification and validation of a reliable multicasting protocol

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.

    1995-01-01

    This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test was different between the model and implementation, then the differences helped identify inconsistencies between the model and implementation. The dialogue between both teams drove the co-evolution of the model and implementation. Testing served as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP.

  31. Advanced medical imaging protocol workflow: a flexible electronic solution to optimize process efficiency, care quality and patient safety in the National VA Enterprise.

    PubMed

    Medverd, Jonathan R; Cross, Nathan M; Font, Frank; Casertano, Andrew

    2013-08-01

    Radiologists routinely make decisions with only limited information when assigning protocol instructions for the performance of advanced medical imaging examinations. Opportunity exists to simultaneously improve the safety, quality and efficiency of this workflow through the application of an electronic solution leveraging health system resources to provide concise, tailored information and decision support in real-time. Such a system has been developed using an open source, open standards design for use within the Veterans Health Administration. The Radiology Protocol Tool Recorder (RAPTOR) project identified key process attributes as well as inherent weaknesses of paper processes and electronic emulators of paper processes to guide the development of its optimized electronic solution. The design provides a kernel that can be expanded to create an integrated radiology environment. RAPTOR has implications relevant to the greater health care community, and serves as a case model for modernization of legacy government health information systems.

  32. Breadth in Design Problem Scoping: Using Insights from Experts to Investigate Student Processes. Research Brief

    ERIC Educational Resources Information Center

    Morozov, Andrew; Kilgore, Deborah; Atman, Cynthia

    2007-01-01

    In this study, the authors used two methods for analyzing expert data: verbal protocol analysis (VPA) and narrative analysis. VPA has been effectively used to describe the design processes employed by engineering students, expert designers, and expert-novice comparative research. VPA involves asking participants to "think aloud" while…

  33. A Guide to Writing a Qualitative Systematic Review Protocol to Enhance Evidence-Based Practice in Nursing and Health Care.

    PubMed

    Butler, Ashleigh; Hall, Helen; Copnell, Beverley

    2016-06-01

    The qualitative systematic review is a rapidly developing area of nursing research. In order to present trustworthy, high-quality recommendations, such reviews should be based on a review protocol to minimize bias and enhance transparency and reproducibility. Although there are a number of resources available to guide researchers in developing a quantitative review protocol, very few resources exist for qualitative reviews. To guide researchers through the process of developing a qualitative systematic review protocol, using an example review question. The key elements required in a systematic review protocol are discussed, with a focus on application to qualitative reviews: Development of a research question; formulation of key search terms and strategies; designing a multistage review process; critical appraisal of qualitative literature; development of data extraction techniques; and data synthesis. The paper highlights important considerations during the protocol development process, and uses a previously developed review question as a working example. This paper will assist novice researchers in developing a qualitative systematic review protocol. By providing a worked example of a protocol, the paper encourages the development of review protocols, enhancing the trustworthiness and value of the completed qualitative systematic review findings. Qualitative systematic reviews should be based on well planned, peer reviewed protocols to enhance the trustworthiness of results and thus their usefulness in clinical practice. Protocols should outline, in detail, the processes which will be used to undertake the review, including key search terms, inclusion and exclusion criteria, and the methods used for critical appraisal, data extraction and data analysis to facilitate transparency of the review process. Additionally, journals should encourage and support the publication of review protocols, and should require reference to a protocol prior to publication of the review results.

  34. Processing Protocol for Soil Samples Potentially ...

    EPA Pesticide Factsheets

    Method Operating Procedures: This protocol describes the processing steps for 45 g and 9 g soil samples potentially contaminated with Bacillus anthracis spores. The protocol is designed to separate and concentrate the spores from bulk soil down to a pellet that can be used for further analysis. Soil extraction solution and mechanical shaking are used to disrupt soil particle aggregates and to aid in the separation of spores from soil particles. Soil samples are washed twice with soil extraction solution to maximize recovery. Differential centrifugation is used to separate spores from the majority of the soil material. The 45 g protocol has been demonstrated by two laboratories using both loamy and sandy soil types. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol would be robust enough to use at multiple laboratories while achieving comparable recoveries. The 45 g protocol has demonstrated a matrix limit of detection at 14 spores/gram of soil for loamy and sandy soils.

  35. Processing protocol for soil samples potentially contaminated with Bacillus anthracis spores [HS7.52.02 - 514

    USGS Publications Warehouse

    Silvestri, Erin E.; Griffin, Dale W.

    2017-01-01

    This protocol describes the processing steps for 45 g and 9 g soil samples potentially contaminated with Bacillus anthracis spores. The protocol is designed to separate and concentrate the spores from bulk soil down to a pellet that can be used for further analysis. Soil extraction solution and mechanical shaking are used to disrupt soil particle aggregates and to aid in the separation of spores from soil particles. Soil samples are washed twice with soil extraction solution to maximize recovery. Differential centrifugation is used to separate spores from the majority of the soil material. The 45 g protocol has been demonstrated by two laboratories using both loamy and sandy soil types. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol would be robust enough to use at multiple laboratories while achieving comparable recoveries. The 45 g protocol has demonstrated a matrix limit of detection at 14 spores/gram of soil for loamy and sandy soils.

  36. Reliable communication in the presence of failures

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Joseph, Thomas A.

    1987-01-01

    The design and correctness of a communication facility for a distributed computer system are reported on. The facility provides support for fault-tolerant process groups in the form of a family of reliable multicast protocols that can be used in both local- and wide-area networks. These protocols attain high levels of concurrency, while respecting application-specific delivery ordering constraints, and have varying cost and performance that depend on the degree of ordering desired. In particular, a protocol that enforces causal delivery orderings is introduced and shown to be a valuable alternative to conventional asynchronous communication protocols. The facility also ensures that the processes belonging to a fault-tolerant process group will observe consistent orderings of events affecting the group as a whole, including process failures, recoveries, migration, and dynamic changes to group properties like member rankings. A review of several uses for the protocols in the ISIS system, which supports fault-tolerant resilient objects and bulletin boards, illustrates the significant simplification of higher-level algorithms made possible by our approach.
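
    Causal delivery orderings of the kind introduced here are classically enforced with vector clocks: a message is delivered only when it is the next expected message from its sender and every message it causally depends on has already been delivered. A minimal sketch of that generic test (not the ISIS implementation):

      # Vector-clock causal delivery test, generic sketch.
      def deliverable(msg_vc, sender, local_vc):
          """Deliver iff msg is next from sender and all its causes are seen."""
          next_from_sender = msg_vc[sender] == local_vc[sender] + 1
          causes_seen = all(msg_vc[k] <= local_vc[k]
                            for k in range(len(msg_vc)) if k != sender)
          return next_from_sender and causes_seen

      local = [1, 0, 0]        # this process has seen one message from process 0
      print(deliverable([1, 1, 0], sender=1, local_vc=local))   # True
      print(deliverable([2, 1, 0], sender=1, local_vc=local))   # False: unseen cause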

  37. Cross-layer protocol design for QoS optimization in real-time wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    2010-04-01

    The metrics of quality of service (QoS) for each sensor type in a wireless sensor network can be associated with metrics for multimedia that describe the quality of fused information, e.g., throughput, delay, jitter, packet error rate, information correlation, etc. These QoS metrics are typically set at the highest, or application, layer of the protocol stack to ensure that performance requirements for each type of sensor data are satisfied. Application-layer metrics, in turn, depend on the support of the lower protocol layers: session, transport, network, data link (MAC), and physical. The dependencies of the QoS metrics on the performance of the higher layers of the Open System Interconnection (OSI) reference model of the WSN protocol, together with that of the lower three layers, are the basis for a comprehensive approach to QoS optimization for multiple sensor types in a general WSN model. The cross-layer design accounts for the distributed power consumption along energy-constrained routes and their constituent nodes. Following the author's previous work, the cross-layer interactions in the WSN protocol are represented by a set of concatenated protocol parameters and enabling resource levels. The "best" cross-layer designs to achieve optimal QoS are established by applying the general theory of martingale representations to the parameterized multivariate point processes (MVPPs) for discrete random events occurring in the WSN. Adaptive control of network behavior through the cross-layer design is realized through the parametric factorization of the stochastic conditional rates of the MVPPs. The cross-layer protocol parameters for optimal QoS are determined in terms of solutions to stochastic dynamic programming conditions derived from models of transient flows for heterogeneous sensor data and aggregate information over a finite time horizon. Markov state processes, embedded within the complex combinatorial history of WSN events, are more computationally tractable and lead to simplifications for any simulated or analytical performance evaluations of the cross-layer designs.

  38. Improving investigational drug service operations through development of an innovative computer system.

    PubMed

    Sweet, Burgunda V; Tamer, Helen R; Siden, Rivka; McCreadie, Scott R; McGregory, Michael E; Benner, Todd; Tankanow, Roberta M

    2008-05-15

    The development of a computerized system for protocol management, dispensing, inventory accountability, and billing by the investigational drug service (IDS) of a university health system is described. After an unsuccessful search for a commercial system that would accommodate the variation among investigational protocols and meet regulatory requirements, the IDS worked with the health-system pharmacy's information technology staff and informatics pharmacists to develop its own system. The informatics pharmacists observed work-flow and information capture in the IDS and identified opportunities for improved efficiency with an automated system. An iterative build-test-design process was used to provide the flexibility needed for individual protocols. The intent was to design a system that would support most IDS processes, using components that would allow automated backup and redundancies. A browser-based system was chosen to allow remote access. Servers, bar-code scanners, and printers were integrated into the final system design. Initial implementation involved 10 investigational protocols chosen on the basis of dispensing volume and complexity of study design. Other protocols were added over a two-year period; all studies whose drugs were dispensed from the IDS were added, followed by those for which the drugs were dispensed from decentralized pharmacy areas. The IDS briefly used temporary staff to free pharmacist and technician time for system implementation. Decentralized pharmacy areas that rarely dispense investigational drugs continue to use manual processes, with subsequent data transcription into the system. Through the university's technology transfer division, the system was licensed by an external company for sale to other IDSs. The WebIDS system has improved daily operations, enhanced safety and efficiency, and helped meet regulatory requirements for investigational drugs.

  39. Residential Interior Design as Complex Composition: A Case Study of a High School Senior's Composing Process

    ERIC Educational Resources Information Center

    Smagorinsky, Peter; Zoss, Michelle; Reed, Patty M.

    2006-01-01

    This research analyzed the composing processes of one high school student as she designed the interiors of homes for a course in interior design. Data included field notes, an interview with the teacher, artifacts from the class, and the focal student's concurrent and retrospective protocols in relation to her design of home interiors. The…

  40. Analytical approach to cross-layer protocol optimization in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    2008-04-01

    In the distributed operations of route discovery and maintenance, strong interaction occurs across mobile ad hoc network (MANET) protocol layers. Quality of service (QoS) requirements of multimedia service classes must be satisfied by the cross-layer protocol, along with minimization of the distributed power consumption at nodes and along routes, subject to battery-limited energy constraints. In previous work by the author, cross-layer interactions in the MANET protocol are modeled in terms of a set of concatenated design parameters and associated resource levels by multivariate point processes (MVPPs). Determination of the "best" cross-layer design is carried out using the optimal control of martingale representations of the MVPPs. In contrast to the competitive interaction among nodes in a MANET for multimedia services using limited resources, the interaction among the nodes of a wireless sensor network (WSN) is distributed and collaborative, based on the processing of data from a variety of sensors at nodes to satisfy common mission objectives. Sensor data originates at the nodes at the periphery of the WSN, is successively transported to other nodes for aggregation based on information-theoretic measures of correlation and ultimately sent as information to one or more destination (decision) nodes. The "multimedia services" in the MANET model are replaced by multiple types of sensors, e.g., audio, seismic, imaging, thermal, etc., at the nodes; the QoS metrics associated with MANETs become those associated with the quality of fused information flow, i.e., throughput, delay, packet error rate, data correlation, etc. Significantly, the essential analytical approach to MANET cross-layer optimization, now based on the MVPPs for discrete random events occurring in the WSN, can be applied to develop the stochastic characteristics and optimality conditions for cross-layer designs of sensor network protocols. Functional dependencies of WSN performance metrics are described in terms of the concatenated protocol parameters. New source-to-destination routes are sought that optimize cross-layer interdependencies to achieve the "best available" performance in the WSN. The protocol design, modified from a known reactive protocol, adapts the achievable performance to the transient network conditions and resource levels. Control of network behavior is realized through the conditional rates of the MVPPs. Optimal cross-layer protocol parameters are determined by stochastic dynamic programming conditions derived from models of transient packetized sensor data flows. Moreover, the defining conditions for WSN configurations, grouping sensor nodes into clusters and establishing data aggregation at processing nodes within those clusters, lead to computationally tractable solutions to the stochastic differential equations that describe network dynamics. Closed-form solution characteristics provide an alternative to the "directed diffusion" methods for resource-efficient WSN protocols published previously by other researchers. Performance verification of the resulting cross-layer designs is found by embedding the optimality conditions for the protocols in actual WSN scenarios replicated in a wireless network simulation environment. Performance tradeoffs among protocol parameters remain for a sequel to the paper.
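
    In symbols, one plausible reading of this setup (notation assumed here, not quoted from the paper): each discrete event type i in the WSN has a conditional intensity \lambda_i(t | F_t; \theta) driven by the concatenated protocol parameters \theta, and the "best available" design minimizes an expected QoS/energy cost over the operating horizon,

      \theta^* = \arg\min_{\theta} \; \mathbb{E}\left[ \int_0^T c(x_t, \theta)\, dt \right],

    where x_t is the node/route resource state and c penalizes QoS shortfalls and power consumption; the stochastic dynamic programming conditions mentioned above characterize this minimizer.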

  41. A Range Finding Protocol to Support Design for Transcriptomics Experimentation: Examples of In-Vitro and In-Vivo Murine UV Exposure

    PubMed Central

    van Oostrom, Conny T.; Jonker, Martijs J.; de Jong, Mark; Dekker, Rob J.; Rauwerda, Han; Ensink, Wim A.; de Vries, Annemieke; Breit, Timo M.

    2014-01-01

    In transcriptomics research, design for experimentation by carefully considering biological, technological, practical and statistical aspects is very important, because the experimental design space is essentially limitless. Usually, the ranges of variable biological parameters of the design space are based on common practices and in turn on phenotypic endpoints. However, specific sub-cellular processes might only be partially reflected by phenotypic endpoints or outside the associated parameter range. Here, we provide a generic protocol for range finding in design for transcriptomics experimentation based on small-scale gene-expression experiments to help in the search for the right location in the design space by analyzing the activity of already known genes of relevant molecular mechanisms. Two examples illustrate the applicability: in-vitro UV-C exposure of mouse embryonic fibroblasts and in-vivo UV-B exposure of mouse skin. Our pragmatic approach is based on: framing a specific biological question and associated gene-set, performing a wide-ranged experiment without replication, eliminating potentially non-relevant genes, and determining the experimental ‘sweet spot’ by gene-set enrichment plus dose-response correlation analysis. Examination of many cellular processes that are related to UV response, such as DNA repair and cell-cycle arrest, revealed that basically each cellular (sub-) process is active at its own specific spot(s) in the experimental design space. Hence, the use of range finding, based on an affordable protocol like this, enables researchers to conveniently identify the ‘sweet spot’ for their cellular process of interest in an experimental design space and might have far-reaching implications for experimental standardization. PMID:24823911
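
    The dose-response correlation step at the heart of this range-finding approach is easy to sketch: for genes in a known response set, correlate expression with exposure dose and keep the dose range where the whole set responds coherently. Toy data and an arbitrary threshold below, not the study's pipeline.

      # Dose-response correlation over a known gene set (toy sketch).
      import numpy as np

      doses = np.array([0.0, 0.5, 1.0, 2.0, 4.0])   # wide-ranged, no replicates
      expr = {                                       # log-expression per gene (toy)
          "repair_gene_a": np.array([1.0, 1.4, 2.1, 2.8, 3.6]),
          "arrest_gene_b": np.array([0.9, 1.1, 1.9, 2.7, 3.2]),
          "unrelated":     np.array([2.0, 2.1, 1.9, 2.0, 2.1]),
      }
      gene_set = ["repair_gene_a", "arrest_gene_b"]  # assumed UV-response set

      def dose_corr(y):
          return float(np.corrcoef(doses, y)[0, 1])  # Pearson r vs dose

      scores = {g: dose_corr(expr[g]) for g in gene_set}
      in_sweet_spot = all(r > 0.9 for r in scores.values())   # arbitrary cutoff
      print(scores, "sweet spot:", in_sweet_spot)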

  42. Validation of Autoclave Protocols for Successful Decontamination of Category A Medical Waste Generated from Care of Patients with Serious Communicable Diseases

    PubMed Central

    Reimers, Mallory; Ernst, Neysa; Bova, Gregory; Nowakowski, Elaine; Bukowski, James; Ellis, Brandon C.; Smith, Chris; Sauer, Lauren; Dionne, Kim; Carroll, Karen C.; Maragakis, Lisa L.; Parrish, Nicole M.

    2016-01-01

    In response to the Ebola outbreak in 2014, many hospitals designated specific areas to care for patients with Ebola and other highly infectious diseases. The safe handling of category A infectious substances is a unique challenge in this environment. One solution is on-site waste treatment with a steam sterilizer or autoclave. The Johns Hopkins Hospital (JHH) installed two pass-through autoclaves in its biocontainment unit (BCU). The JHH BCU and The Johns Hopkins biosafety level 3 (BSL-3) clinical microbiology laboratory designed and validated waste-handling protocols with simulated patient trash to ensure adequate sterilization. The results of the validation process revealed that autoclave factory default settings are potentially ineffective for certain types of medical waste and highlighted the critical role of waste packaging in successful sterilization. The lessons learned from the JHH validation process can inform the design of waste management protocols to ensure effective treatment of highly infectious medical waste. PMID:27927920

  43. Validation of Autoclave Protocols for Successful Decontamination of Category A Medical Waste Generated from Care of Patients with Serious Communicable Diseases.

    PubMed

    Garibaldi, Brian T; Reimers, Mallory; Ernst, Neysa; Bova, Gregory; Nowakowski, Elaine; Bukowski, James; Ellis, Brandon C; Smith, Chris; Sauer, Lauren; Dionne, Kim; Carroll, Karen C; Maragakis, Lisa L; Parrish, Nicole M

    2017-02-01

    In response to the Ebola outbreak in 2014, many hospitals designated specific areas to care for patients with Ebola and other highly infectious diseases. The safe handling of category A infectious substances is a unique challenge in this environment. One solution is on-site waste treatment with a steam sterilizer or autoclave. The Johns Hopkins Hospital (JHH) installed two pass-through autoclaves in its biocontainment unit (BCU). The JHH BCU and The Johns Hopkins biosafety level 3 (BSL-3) clinical microbiology laboratory designed and validated waste-handling protocols with simulated patient trash to ensure adequate sterilization. The results of the validation process revealed that autoclave factory default settings are potentially ineffective for certain types of medical waste and highlighted the critical role of waste packaging in successful sterilization. The lessons learned from the JHH validation process can inform the design of waste management protocols to ensure effective treatment of highly infectious medical waste.

  5. Understanding Photography as Applied Chemistry: Using Talbot's Calotype Process to Introduce Chemistry to Design Students

    ERIC Educational Resources Information Center

    Rösch, Esther S.; Helmerdig, Silke

    2017-01-01

    Early photography processes were predestined to combine chemistry and art. William Henry Fox Talbot is one of the early photography pioneers. In 2-3 day workshops, design students without a major background in chemistry are able to define a reproducible protocol for Talbot's gallic-acid-containing calotype process. With the experimental concept…

  6. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1995-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  7. High Cycle Fatigue (HCF) Science and Technology Program 2002 Annual Report

    DTIC Science & Technology

    2003-08-01

    Table-of-contents excerpts: ...Turbine Engine Airfoils, Phase I; 4.3 Probabilistic Design of Turbine Engine Airfoils, Phase II; 4.4 Probabilistic Blade Design System; 4.5...; XTL17/SE2; 7.4 Conclusion; 8.0 Test and Evaluation; 8.1 Characterization Test Protocol; 8.2 Demonstration Test Protocol; 8.3 Development of Multi-... ...transparent and opaque overlays for processing. The objective of the SBIR Phase I program was to identify and evaluate promising methods for

  8. Fully Integrated Passive UHF RFID Tag for Hash-Based Mutual Authentication Protocol.

    PubMed

    Mikami, Shugo; Watanabe, Dai; Li, Yang; Sakiyama, Kazuo

    2015-01-01

    Passive radio-frequency identification (RFID) tags are used in many applications. While the RFID market is expected to grow, concerns about the security and privacy of RFID tags must be overcome for future use. To address these issues, privacy-preserving authentication protocols based on cryptographic algorithms have been designed. However, to the best of our knowledge, evaluation of a whole tag, including the antenna, analog front end, and digital processing block, running such authentication protocols has not been studied. In this paper, we present an implementation and evaluation of a fully integrated passive UHF RFID tag that runs a privacy-preserving mutual authentication protocol based on a hash function. We design a single chip including the analog front end and the digital processing block. We select a lightweight hash function supporting 80-bit security strength and a standard hash function supporting 128-bit security strength. We show that when the lightweight hash function is used, the tag completes the protocol at a reader-tag distance of 10 cm; when the standard hash function is used, it completes the protocol at a distance of 8.5 cm. We discuss how the peak power consumption imposed by each hash function affects the achievable distance.
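
    A minimal sketch of hash-based mutual authentication of the general kind described, with SHA-256 standing in for the paper's lightweight and standard hash functions; the message flow is illustrative, not the authors' exact protocol:

      import hashlib, os

      def h(*parts: bytes) -> bytes:
          return hashlib.sha256(b"".join(parts)).digest()

      key = os.urandom(16)        # secret shared by reader and tag

      n_reader = os.urandom(8)    # reader -> tag: challenge
      n_tag = os.urandom(8)       # tag -> reader: own nonce plus keyed digest
      tag_resp = h(key, n_reader, n_tag)

      assert tag_resp == h(key, n_reader, n_tag)   # reader verifies the tag
      reader_resp = h(key, n_tag)                  # reader proves key knowledge
      assert reader_resp == h(key, n_tag)          # tag verifies the reader
      print("mutual authentication succeeded")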

  9. Design Sketches For Optical Crossbar Switches Intended For Large-Scale Parallel Processing Applications

    NASA Astrophysics Data System (ADS)

    Hartmann, Alfred; Redfield, Steve

    1989-04-01

    This paper discusses the design of large-scale (1000 × 1000) optical crossbar switching networks for use in parallel-processing supercomputers. Alternative design sketches for an optical crossbar switching network are presented using free-space optical transmission with either a beam spreading/masking model or a beam steering model for internodal communications. The performance of alternative multiple-access channel communications protocols (unslotted and slotted ALOHA and carrier sense multiple access, CSMA) is compared with the performance of the classic arbitrated-bus crossbar of conventional electronic parallel computing. These comparisons indicate an almost inverse relationship between ease of implementation and speed of operation. Practical issues of optical system design are addressed, and an optically addressed, composite spatial light modulator design is presented for fabrication to arbitrarily large scale. The wide range of switch architecture, communications protocol, optical systems design, device fabrication, and system performance problems presented by these design sketches poses a serious challenge to practical exploitation of highly parallel optical interconnects in advanced computer designs.
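
    The ALOHA comparison rests on the textbook throughput curves; a small sketch (G is the offered load in frames per frame time):

      import math

      def pure_aloha(G):    return G * math.exp(-2 * G)  # peaks near 0.184 at G = 0.5
      def slotted_aloha(G): return G * math.exp(-G)      # peaks near 0.368 at G = 1.0

      for G in (0.25, 0.5, 1.0, 2.0):
          print(f"G={G:4}: pure={pure_aloha(G):.3f}  slotted={slotted_aloha(G):.3f}")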

  10. De Novo Enzyme Design Using Rosetta3

    PubMed Central

    Richter, Florian; Leaver-Fay, Andrew; Khare, Sagar D.; Bjelic, Sinisa; Baker, David

    2011-01-01

    The Rosetta de novo enzyme design protocol has been used to design enzyme catalysts for a variety of chemical reactions, and in principle can be applied to any arbitrary chemical reaction of interest, The process has four stages: 1) choice of a catalytic mechanism and corresponding minimal model active site, 2) identification of sites in a set of scaffold proteins where this minimal active site can be realized, 3) optimization of the identities of the surrounding residues for stabilizing interactions with the transition state and primary catalytic residues, and 4) evaluation and ranking the resulting designed sequences. Stages two through four of this process can be carried out with the Rosetta package, while stage one needs to be done externally. Here, we demonstrate how to carry out the Rosetta enzyme design protocol from start to end in detail using for illustration the triosephosphate isomerase reaction. PMID:21603656

  11. Design of Secure ECG-Based Biometric Authentication in Body Area Sensor Networks

    PubMed Central

    Peter, Steffen; Pratap Reddy, Bhanu; Momtaz, Farshad; Givargis, Tony

    2016-01-01

    Body area sensor networks (BANs) utilize wireless communicating sensor nodes attached to a human body for convenience, safety, and health applications. Physiological characteristics of the body, such as the heart rate or Electrocardiogram (ECG) signals, are promising means to simplify the setup process and to improve security of BANs. This paper describes the design and implementation steps required to realize an ECG-based authentication protocol to identify sensor nodes attached to the same human body. Therefore, the first part of the paper addresses the design of a body-area sensor system, including the hardware setup, analogue and digital signal processing, and required ECG feature detection techniques. A model-based design flow is applied, and strengths and limitations of each design step are discussed. Real-world measured data originating from the implemented sensor system are then used to set up and parametrize a novel physiological authentication protocol for BANs. The authentication protocol utilizes statistical properties of expected and detected deviations to limit the number of false positive and false negative authentication attempts. The result of the described holistic design effort is the first practical implementation of biometric authentication in BANs that reflects timing and data uncertainties in the physical and cyber parts of the system. PMID:27110785
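
    A hedged sketch of the statistical accept/reject idea: two nodes on the same body should observe closely matching inter-pulse intervals, and a deviation threshold trades false accepts against false rejects. The feature choice and threshold here are illustrative, not the paper's parameters:

      import statistics

      def authenticate(ipis_a, ipis_b, max_mean_dev_ms=15.0):
          devs = [abs(a - b) for a, b in zip(ipis_a, ipis_b)]
          return statistics.mean(devs) <= max_mean_dev_ms

      node_a   = [812, 798, 805, 820, 801]   # ms, node on the body
      node_b   = [809, 801, 802, 818, 799]   # ms, second node, same body
      attacker = [740, 905, 655, 980, 700]   # ms, node on another body

      print(authenticate(node_a, node_b))    # True
      print(authenticate(node_a, attacker))  # False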

  12. Design of Secure ECG-Based Biometric Authentication in Body Area Sensor Networks.

    PubMed

    Peter, Steffen; Reddy, Bhanu Pratap; Momtaz, Farshad; Givargis, Tony

    2016-04-22

    Body area sensor networks (BANs) utilize wireless communicating sensor nodes attached to a human body for convenience, safety, and health applications. Physiological characteristics of the body, such as the heart rate or Electrocardiogram (ECG) signals, are promising means to simplify the setup process and to improve security of BANs. This paper describes the design and implementation steps required to realize an ECG-based authentication protocol to identify sensor nodes attached to the same human body. Therefore, the first part of the paper addresses the design of a body-area sensor system, including the hardware setup, analogue and digital signal processing, and required ECG feature detection techniques. A model-based design flow is applied, and strengths and limitations of each design step are discussed. Real-world measured data originating from the implemented sensor system are then used to set up and parametrize a novel physiological authentication protocol for BANs. The authentication protocol utilizes statistical properties of expected and detected deviations to limit the number of false positive and false negative authentication attempts. The result of the described holistic design effort is the first practical implementation of biometric authentication in BANs that reflects timing and data uncertainties in the physical and cyber parts of the system.

  13. Study protocol of a mixed-methods evaluation of a cluster randomized trial to improve the safety of NSAID and antiplatelet prescribing: data-driven quality improvement in primary care.

    PubMed

    Grant, Aileen; Dreischulte, Tobias; Treweek, Shaun; Guthrie, Bruce

    2012-08-28

    Trials of complex interventions are criticized for being 'black box', so the UK Medical Research Council recommends carrying out a process evaluation to explain the trial findings. We believe it is good practice to pre-specify and publish process evaluation protocols to set standards and minimize bias. Unlike protocols for trials, little guidance or standards exist for the reporting of process evaluations. This paper presents the mixed-method process evaluation protocol of a cluster randomized trial, drawing on a framework designed by the authors. This mixed-method evaluation is based on four research questions and maps data collection to a logic model of how the data-driven quality improvement in primary care (DQIP) intervention is expected to work. Data collection will be predominantly by qualitative case studies in eight to ten of the trial practices, focus groups with patients affected by the intervention, and quantitative analysis of routine practice data, trial outcome and questionnaire data, and data from the DQIP intervention. We believe that pre-specifying the intentions of a process evaluation can help to minimize bias arising from potentially misleading post-hoc analysis. We recognize it is also important to retain flexibility to examine the unexpected and the unintended. From that perspective, a mixed-methods evaluation allows the combination of exploratory and flexible qualitative work with more pre-specified quantitative analysis, with each method contributing to the design, implementation and interpretation of the other. As well as strengthening the study, the authors hope to stimulate discussion among their academic colleagues about publishing protocols for evaluations of randomized trials of complex interventions. Trial registration: ClinicalTrials.gov NCT01425502.

  14. General A Scheme to Share Information via Employing Discrete Algorithm to Quantum States

    NASA Astrophysics Data System (ADS)

    Kang, Guo-Dong; Fang, Mao-Fa

    2011-02-01

    We propose a protocol for information sharing between two legitimate parties (Bob and Alice) via public-key cryptography. In particular, we specialize the protocol by employing a discrete algorithm under modular arithmetic that maps integers to quantum states via photon rotations. Based on this algorithm, we find that the protocol is secure against various classes of attacks. In particular, owing to the algorithm, the security of the classical privacy contained in the quantum public key and the corresponding ciphertext is guaranteed. The protocol is also robust against impersonation and active wiretapping attacks by virtue of a dedicated checking procedure, and is thus valid.
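
    One plausible reading of the construction, with toy parameters: a discrete exponentiation under a modulus maps an integer secret to a photon rotation angle, and recovering the secret from the angle requires solving a discrete logarithm:

      import math

      def rotation_angle(secret: int, g: int = 5, modulus: int = 23) -> float:
          y = pow(g, secret, modulus)       # discrete exponentiation mod N
          return 2 * math.pi * y / modulus  # angle encoding the public value

      print(rotation_angle(7))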

  15. Energy modelling in sensor networks

    NASA Astrophysics Data System (ADS)

    Schmidt, D.; Krämer, M.; Kuhn, T.; Wehn, N.

    2007-06-01

    Wireless sensor networks are one of the key enabling technologies for the vision of ambient intelligence. Energy resources for sensor nodes are very scarce. A key challenge is the design of energy-efficient communication protocols. Models of the energy consumption are needed to accurately simulate the efficiency of a protocol or application design, and can also be used for automatic energy optimizations in a model-driven design process. We propose a novel methodology to create models for sensor nodes based on a few simple measurements. In a case study the methodology was used to create models for MICAz nodes. The models were integrated in a simulation environment as well as in an SDL runtime framework of a model-driven design process. Measurements on a test application that was created automatically from an SDL specification showed an 80% reduction in energy consumption compared to an implementation without power-saving strategies.
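
    A minimal sketch of a measurement-based energy model of this kind: characterize per-state power draw with a few measurements, then integrate power over the time a protocol spends in each state. The figures are illustrative, not MICAz measurements:

      POWER_MW = {"sleep": 0.03, "idle": 9.6, "rx": 59.1, "tx": 52.2}

      def energy_mj(schedule):
          """schedule: (state, duration_s) pairs, e.g. from a simulation trace."""
          return sum(POWER_MW[state] * dur for state, dur in schedule)

      duty_cycled = [("sleep", 9.0), ("rx", 0.5), ("tx", 0.5)]
      always_on   = [("rx", 9.5), ("tx", 0.5)]
      print(energy_mj(duty_cycled), "mJ vs", energy_mj(always_on), "mJ")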

  16. Protocol Interoperability Between DDN and ISO (Defense Data Network and International Organization for Standardization) Protocols

    DTIC Science & Technology

    1988-08-01

    ...Interconnection (OSI) in years. It has been felt even more urgently in the past few years, with the rapid evolution of communication technologies and the... services and protocols above the transport layer are usually implemented as user-callable utilities on the host computers, it is desirable to offer them... Networks, Prentice-Hall, New Jersey, 1987. [BOND 87] Bond, John, "Parallel-Processing Concepts Finally Come Together in Real Systems", Computer Design

  17. Fully Integrated Passive UHF RFID Tag for Hash-Based Mutual Authentication Protocol

    PubMed Central

    Mikami, Shugo; Watanabe, Dai; Li, Yang; Sakiyama, Kazuo

    2015-01-01

    Passive radio-frequency identification (RFID) tags are used in many applications. While the RFID market is expected to grow, concerns about the security and privacy of RFID tags must be overcome for future use. To address these issues, privacy-preserving authentication protocols based on cryptographic algorithms have been designed. However, to the best of our knowledge, evaluation of a whole tag, including the antenna, analog front end, and digital processing block, running such authentication protocols has not been studied. In this paper, we present an implementation and evaluation of a fully integrated passive UHF RFID tag that runs a privacy-preserving mutual authentication protocol based on a hash function. We design a single chip including the analog front end and the digital processing block. We select a lightweight hash function supporting 80-bit security strength and a standard hash function supporting 128-bit security strength. We show that when the lightweight hash function is used, the tag completes the protocol at a reader-tag distance of 10 cm; when the standard hash function is used, it completes the protocol at a distance of 8.5 cm. We discuss how the peak power consumption imposed by each hash function affects the achievable distance. PMID:26491714

  18. Mercury Assessment and Monitoring Protocol for the Bear Creek Watershed, Colusa County, California

    USGS Publications Warehouse

    Suchanek, Thomas H.; Hothem, Roger L.; Rytuba, James J.; Yee, Julie L.

    2010-01-01

    This report summarizes the known information on the occurrence and distribution of mercury (Hg) in physical/chemical and biological matrices within the Bear Creek watershed. Based on these data, a matrix-specific monitoring protocol for the evaluation of the effectiveness of activities designed to remediate Hg contamination in the Bear Creek watershed is presented. The monitoring protocol documents procedures for collecting and processing water, sediment, and biota for estimation of total Hg (TotHg) and monomethyl mercury (MMeHg) in the Bear Creek watershed. The concurrent sampling of TotHg and MMeHg in biota as well as water and sediment from 10 monitoring sites is designed to assess the relative bioavailability of Hg released from Hg sources in the watershed and identify environments conducive to Hg methylation. These protocols are designed to assist landowners, land managers, water quality regulators, and scientists in determining whether specific restoration/mitigation actions lead to significant progress toward achieving water quality goals to reduce Hg in Bear and Sulphur Creeks.

  19. Reliability of the Nursing Home Survey Process: A Simultaneous Survey Approach

    ERIC Educational Resources Information Center

    Lee, Robert H.; Gajewski, Byron J.; Thompson, Sarah

    2006-01-01

    Purpose: We designed this study to examine the reliability of the nursing home survey process in the state of Kansas using regular and simultaneous survey teams. In particular, the study examined how two survey teams exposed to the same information at the same time differed in their interpretations. Design and Methods: The protocol for…

  20. A PROTOCOL FOR DETERMINING WWF SETTLING VELOCITIES FOR TREATMENT PROCESS DESIGN ENHANCEMENT

    EPA Science Inventory

    Urban wet weather flows (WWF) contain a high proportion of suspended solids (SS) which must be rapidly reduced before release to receiving waters. Site-specific, storm-event data evaluations for designing WWF-treatment facilities differ from dry-weather flow design. WWF-sett...

  1. A design protocol for tailoring ice-templated scaffold structure

    PubMed Central

    Pawelec, K. M.; Husmann, A.; Best, S. M.; Cameron, R. E.

    2014-01-01

    In this paper, we show, for the first time, the key link between scaffold architecture and latent heat evolution during the production of porous biomedical collagen structures using freeze-drying. Collagen scaffolds are used widely in the biomedical industry for the repair and reconstruction of skeletal tissues and organs. Freeze-drying of collagen slurries is a standard industrial process, and, until now, the literature has sought to characterize the influence of set processing parameters including the freezing protocol and weight percentage of collagen. However, we are able to demonstrate, by monitoring the local thermal events within the slurry during solidification, that nucleation, growth and annealing processes can be controlled, and therefore we are able to control the resulting scaffold architecture. Based on our correlation of thermal profile measurements with scaffold architecture, we hypothesize that there is a link between the fundamental freezing of ice and the structure of scaffolds, which suggests that this concept is applicable not only for collagen but also for ceramics and pharmaceuticals. We present a design protocol of strategies for tailoring the ice-templated scaffold structure. PMID:24402916

  2. Design and Implementation of Software Protocol in VAX/VMS Using Ethernet Local Area Network.

    DTIC Science & Technology

    1983-06-01

    firm holding the trademark: INTEL Corporation, Santa Clara, California INTELLEC IDS Multibus DIGITAL Research, Pacific Grove, California CP/M-80... board NS2030 VMS device driver and N11010 diagnostic program DIGITAL Equipment Corporation, Maynard, Massachusetts VAX-11/780 minicomputer VAX/VMS... editors, library, etc.) of one node to an application process in another node. Such protocols may include: 1) allowing a process in one node to

  3. System approach to distributed sensor management

    NASA Astrophysics Data System (ADS)

    Mayott, Gregory; Miller, Gordon; Harrell, John; Hepp, Jared; Self, Mid

    2010-04-01

    Since 2003, the US Army's RDECOM CERDEC Night Vision Electronic Sensor Directorate (NVESD) has been developing a distributed Sensor Management System (SMS) that utilizes a framework which demonstrates application layer, net-centric sensor management. The core principles of the design support distributed and dynamic discovery of sensing devices and processes through a multi-layered implementation. This results in a sensor management layer that acts as a System with defined interfaces for which the characteristics, parameters, and behaviors can be described. Within the framework, the definition of a protocol is required to establish the rules for how distributed sensors should operate. The protocol defines the behaviors, capabilities, and message structures needed to operate within the functional design boundaries. The protocol definition addresses the requirements for a device (sensors or processes) to dynamically join or leave a sensor network, dynamically describe device control and data capabilities, and allow dynamic addressing of publish and subscribe functionality. The message structure is a multi-tiered definition that identifies standard, extended, and payload representations that are specifically designed to accommodate the need for standard representations of common functions, while supporting the need for feature-based functions that are typically vendor specific. The dynamic qualities of the protocol give a user GUI application the flexibility to map widget-level controls to each device, based on reported capabilities, in real time. The SMS approach is designed to accommodate scalability and flexibility within a defined architecture. The distributed sensor management framework and its application to a tactical sensor network are described in this paper.
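
    A hedged sketch of such a multi-tiered message: a standard header every device understands, an optional extended section for reported capabilities, and a vendor-specific payload. All field names are hypothetical:

      from dataclasses import dataclass, field
      from typing import Optional

      @dataclass
      class StandardHeader:
          device_id: str
          msg_type: str                    # e.g. "join", "leave", "capability"

      @dataclass
      class ExtendedSection:
          capabilities: list[str] = field(default_factory=list)

      @dataclass
      class SensorMessage:
          header: StandardHeader
          extended: Optional[ExtendedSection] = None
          payload: bytes = b""             # feature-based, vendor-specific part

      join = SensorMessage(StandardHeader("eo-cam-07", "join"),
                           ExtendedSection(["pan", "tilt", "zoom"]))
      print(join)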

  4. An extended protocol for usability validation of medical devices: Research design and reference model.

    PubMed

    Schmettow, Martin; Schnittker, Raphaela; Schraagen, Jan Maarten

    2017-05-01

    This paper proposes and demonstrates an extended protocol for usability validation testing of medical devices. A review of currently used methods for the usability evaluation of medical devices revealed two main shortcomings: first, a lack of methods to closely trace interaction sequences and derive performance measures; second, a prevailing focus on cross-sectional validation studies, ignoring the issues of learnability and training. The U.S. Food and Drug Administration's recent proposal for a validation testing protocol for medical devices is then extended to address these shortcomings: (1) a novel process measure, 'normative path deviations', is introduced that is useful for both quantitative and qualitative usability studies, and (2) a longitudinal, completely within-subject study design is presented that assesses learnability and training effects and allows analysis of the diversity of users. A reference regression model is introduced to analyze data from this and similar studies, drawing upon generalized linear mixed-effects models and a Bayesian estimation approach. The extended protocol is implemented and demonstrated in a study comparing a novel syringe infusion pump prototype to an existing design with a sample of 25 healthcare professionals. Strong performance differences between designs were observed with a variety of usability measures, as well as varying training-on-the-job effects. We discuss our findings with regard to validation testing guidelines, reflect on the extensions and discuss the perspectives they add to the validation process. Copyright © 2017 Elsevier Inc. All rights reserved.
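
    One plausible way to operationalize the 'normative path deviations' measure is the edit distance between the observed interaction sequence and the normative task path; the step labels below are hypothetical:

      def deviations(observed, normative):
          m, n = len(observed), len(normative)
          d = [[max(i, j) if 0 in (i, j) else 0 for j in range(n + 1)]
               for i in range(m + 1)]
          for i in range(1, m + 1):
              for j in range(1, n + 1):
                  d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1,
                                d[i - 1][j - 1] + (observed[i - 1] != normative[j - 1]))
          return d[m][n]

      normative = ["power_on", "set_rate", "confirm", "start"]
      observed  = ["power_on", "set_vtbi", "set_rate", "confirm", "start"]
      print(deviations(observed, normative))  # 1 step off the normative path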

  5. High-Bandwidth Tactical-Network Data Analysis in a High-Performance-Computing (HPC) Environment: Voice Call Analysis

    DTIC Science & Technology

    2015-09-01

    Table-of-contents excerpts: Gateway; 4. Voice Packet Flow: SIP, Session Description Protocol (SDP), and RTP; 5. Voice Data Analysis; 6. Call Analysis; 7. Call Metrics. ...analysis processing is designed for a general VoIP system architecture based on Session Initiation Protocol (SIP) for negotiating call sessions and... employs Skinny Client Control Protocol for network communication between the phone and the local CallManager (e.g., for each dialed digit), SIP

  6. Fatigue Behavior of Computer-Aided Design/Computer-Assisted Manufacture Ceramic Abutments as a Function of Design and Ceramics Processing.

    PubMed

    Kelly, J Robert; Rungruanganunt, Patchnee

    2016-01-01

    Zirconia is being widely used, at times apparently by simply copying a metal design into ceramic. Structurally, ceramics are sensitive to both design and processing (fabrication) details. The aim of this work was to examine four computer-aided design/computer-assisted manufacture (CAD/CAM) abutments using a modified International Standards Organization (ISO) implant fatigue protocol to determine performance as a function of design and processing. Two full-zirconia and two hybrid (Ti-based) abutments (n = 12 each) were tested wet at 15 Hz at a variety of loads to failure. Failure probability distributions were examined at each load, and when found to be the same, data from all loads were combined for lifetime analysis from accelerated to clinical conditions. Two distinctly different failure modes were found for both full-zirconia and Ti-based abutments. One of these for zirconia has been reported clinically in the literature, and one for the Ti-based abutments has been reported anecdotally. The ISO protocol modification in this study forced failures in the abutments; no implant bodies failed. Extrapolated cycles for 10% failure at 70 N were: full zirconia, Atlantis 2 × 10^7 and Straumann 3 × 10^7; and Ti-based, Glidewell 1 × 10^6 and Nobel 1 × 10^21. Under accelerated conditions (200 N), performance differed significantly: Straumann clearly outperformed Astra (t test, P = .013), and the Glidewell Ti-base abutment also outperformed Atlantis zirconia at 200 N (Nobel ran out; t test, P = .035). The modified ISO protocol in this study produced failures of the kinds seen clinically. Manufacturing matters: differences in design and fabrication that influence performance cannot be discerned clinically.

  7. Development of a Protocol for Automated Glucose Measurement Transmission Used in Clinical Decision Support Systems Based on the Continua Design Guidelines.

    PubMed

    Meyer, Markus; Donsa, Klaus; Truskaller, Thomas; Frohner, Matthias; Pohn, Birgit; Felfernig, Alexander; Sinner, Frank; Pieber, Thomas

    2018-01-01

    A fast and accurate data transmission from glucose meter to clinical decision support systems (CDSSs) is crucial for the management of type 2 diabetes mellitus, since almost all therapeutic interventions are derived from glucose measurements. The aim was to develop a prototype of an automated glucose measurement transmission protocol based on the Continua Design Guidelines and to embed the protocol into a CDSS used by healthcare professionals. Literature and market research were performed to analyze the state of the art and thereupon develop, integrate and validate an automated glucose measurement transmission protocol in an iterative process. Findings from the literature and market research guided the development of a standardized glucose measurement transmission protocol using a middleware. The interface description for communicating with the glucose meter was illustrated and embedded into a CDSS. A prototype of an interoperable transmission of glucose measurements was developed and implemented in a CDSS, presenting a promising way to reduce medication errors and improve user satisfaction.
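
    A minimal sketch of the decoupling described: the CDSS never talks to the meter directly; a middleware normalizes readings into one record format and hands them to registered consumers. All names are hypothetical:

      from dataclasses import dataclass
      from datetime import datetime, timezone

      @dataclass
      class GlucoseReading:
          mg_dl: float
          taken_at: datetime
          device_id: str

      class Middleware:
          def __init__(self):
              self._consumers = []
          def register(self, consumer):          # e.g. the CDSS intake
              self._consumers.append(consumer)
          def on_device_data(self, raw: dict):   # called by the device driver
              reading = GlucoseReading(raw["value"], raw["ts"], raw["device"])
              for consume in self._consumers:
                  consume(reading)

      mw = Middleware()
      mw.register(lambda r: print("CDSS received:", r))
      mw.on_device_data({"value": 128.0,
                         "ts": datetime.now(timezone.utc), "device": "meter-01"})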

  8. Designing Endocrine Disruption Out of the Next Generation of Chemicals

    PubMed Central

    Schug, T.T; Abagyan, R.; Blumberg, B.; Collins, T.J.; Crews, D.; DeFur, P.L.; Dickerson, S.M.; Edwards, T.M.; Gore, A.C.; Guillette, L.J.; Hayes, T.; Heindel, J.J.; Moores, A.; Patisaul, H.B.; Tal, T.L.; Thayer, K.A.; Vandenberg, L.N.; Warner, J.; Watson, C.S.; Saal, F.S. vom; Zoeller, R.T.; O’Brien, K.P.; Myers, J.P.

    2013-01-01

    A central goal of green chemistry is to avoid hazard in the design of new chemicals. This objective is best achieved when information about a chemical’s potential hazardous effects is obtained as early in the design process as feasible. Endocrine disruption is a type of hazard that to date has been inadequately addressed by both industrial and regulatory science. To aid chemists in avoiding this hazard, we propose an endocrine disruption testing protocol for use by chemists in the design of new chemicals. The Tiered Protocol for Endocrine Disruption (TiPED) has been created under the oversight of a scientific advisory committee composed of leading representatives from both green chemistry and the environmental health sciences. TiPED is conceived as a tool for new chemical design, thus it starts with a chemist theoretically at “the drawing board.” It consists of five testing tiers ranging from broad in silico evaluation up through specific cell- and whole organism-based assays. To be effective at detecting endocrine disruption, a testing protocol must be able to measure potential hormone-like or hormone-inhibiting effects of chemicals, as well as the many possible interactions and signaling sequelae such chemicals may have with cell-based receptors. Accordingly, we have designed this protocol to broadly interrogate the endocrine system. The proposed protocol will not detect all possible mechanisms of endocrine disruption, because scientific understanding of these phenomena is advancing rapidly. To ensure that the protocol remains current, we have established a plan for incorporating new assays into the protocol as the science advances. In this paper we present the principles that should guide the science of testing new chemicals for endocrine disruption, as well as principles by which to evaluate individual assays for applicability, and laboratories for reliability. In a ‘proof-of-principle’ test, we ran 6 endocrine disrupting chemicals (EDCs) that act via different endocrinological mechanisms through the protocol using published literature. Each was identified as endocrine active by one or more tiers. We believe that this voluntary testing protocol will be a dynamic tool to facilitate efficient and early identification of potentially problematic chemicals, while ultimately reducing the risks to public health. PMID:25110461
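
    The tiered cascade logic lends itself to a short sketch: a candidate chemical passes through increasingly specific assays and is flagged as soon as any tier detects activity. The tier groupings and predicates below are simplified placeholders, not real assays:

      def in_silico(chem):      return chem.get("docks_to_ER", False)
      def cell_assay(chem):     return chem.get("receptor_activation", 0) > 0.2
      def organism_assay(chem): return chem.get("developmental_effect", False)

      TIERS = [("in silico", in_silico),
               ("cell-based", cell_assay),
               ("whole organism", organism_assay)]

      def screen(chem):
          for name, assay in TIERS:
              if assay(chem):
                  return f"flagged at {name} tier"
          return "no endocrine activity detected by this cascade"

      print(screen({"docks_to_ER": False, "receptor_activation": 0.4}))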

  9. Designing Endocrine Disruption Out of the Next Generation of Chemicals.

    PubMed

    Schug, T T; Abagyan, R; Blumberg, B; Collins, T J; Crews, D; DeFur, P L; Dickerson, S M; Edwards, T M; Gore, A C; Guillette, L J; Hayes, T; Heindel, J J; Moores, A; Patisaul, H B; Tal, T L; Thayer, K A; Vandenberg, L N; Warner, J; Watson, C S; Saal, F S Vom; Zoeller, R T; O'Brien, K P; Myers, J P

    2013-01-01

    A central goal of green chemistry is to avoid hazard in the design of new chemicals. This objective is best achieved when information about a chemical's potential hazardous effects is obtained as early in the design process as feasible. Endocrine disruption is a type of hazard that to date has been inadequately addressed by both industrial and regulatory science. To aid chemists in avoiding this hazard, we propose an endocrine disruption testing protocol for use by chemists in the design of new chemicals. The Tiered Protocol for Endocrine Disruption (TiPED) has been created under the oversight of a scientific advisory committee composed of leading representatives from both green chemistry and the environmental health sciences. TiPED is conceived as a tool for new chemical design, thus it starts with a chemist theoretically at "the drawing board." It consists of five testing tiers ranging from broad in silico evaluation up through specific cell- and whole organism-based assays. To be effective at detecting endocrine disruption, a testing protocol must be able to measure potential hormone-like or hormone-inhibiting effects of chemicals, as well as the many possible interactions and signaling sequelae such chemicals may have with cell-based receptors. Accordingly, we have designed this protocol to broadly interrogate the endocrine system. The proposed protocol will not detect all possible mechanisms of endocrine disruption, because scientific understanding of these phenomena is advancing rapidly. To ensure that the protocol remains current, we have established a plan for incorporating new assays into the protocol as the science advances. In this paper we present the principles that should guide the science of testing new chemicals for endocrine disruption, as well as principles by which to evaluate individual assays for applicability, and laboratories for reliability. In a 'proof-of-principle' test, we ran 6 endocrine disrupting chemicals (EDCs) that act via different endocrinological mechanisms through the protocol using published literature. Each was identified as endocrine active by one or more tiers. We believe that this voluntary testing protocol will be a dynamic tool to facilitate efficient and early identification of potentially problematic chemicals, while ultimately reducing the risks to public health.

  10. Automated processing of first-pass radioisotope ventriculography data to determine essential central circulation parameters

    NASA Astrophysics Data System (ADS)

    Krotov, Aleksei; Pankin, Victor

    2017-09-01

    The assessment of central circulation (including heart function) parameters is vital in the preventive diagnostics of congenital and acquired heart failure and during polychemotherapy. The protocols currently applied in Russia do not fully utilize the first-pass assessment (FPRNA), which results in poor data formalization, even though FPRNA is one of the fastest, most affordable and most compact of the radioisotope diagnostic protocols. A non-imaging algorithm based on existing protocols has been designed that uses the readings of an additional detector above the vena subclavia to determine the total blood volume (TBV), without requiring blood sampling, in contrast to current protocols. An automated processing of precordial detector readings is presented in order to determine the heart stroke volume (SV). Two techniques to estimate the ejection fraction (EF) of the heart are discussed.
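
    The count-based relation behind first-pass EF estimation is short enough to show: counts are proportional to ventricular blood volume, so background-corrected end-diastolic (ED) and end-systolic (ES) counts give the ejection fraction without absolute volume calibration. The values are toy numbers:

      def ejection_fraction(ed_counts, es_counts, background):
          ed = ed_counts - background
          es = es_counts - background
          return (ed - es) / ed

      print(f"EF = {ejection_fraction(9200, 4300, 800):.2f}")  # ~0.58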

  11. Impact of Process Protocol Design on Virtual Team Effectiveness

    ERIC Educational Resources Information Center

    Cordes, Christofer Sean

    2013-01-01

    This dissertation examined the influence of action process dimensions on team decision performance, and attitudes toward team work environment and procedures given different degrees of collaborative technology affordance. Process models were used to provide context for understanding team behavior in the experimental task, and clarify understanding…

  12. "Check, Change What You Need to Change and/or Keep What You Want": An Art Therapy Neurobiological-Based Trauma Protocol

    ERIC Educational Resources Information Center

    Hass-Cohen, Noah; Clyde Findlay, Joanna; Carr, Richard; Vanderlan, Jessica

    2014-01-01

    The Check ("Check, Change What You Need To Change and/or Keep What You Want") art therapy protocol is a sequence of directives for treating trauma that is grounded in neurobiological theory and designed to facilitate trauma narrative processing, autobiographical coherency, and the rebalancing of dysregulated responses to psychosocial…

  13. The Ontology of Clinical Research (OCRe): An Informatics Foundation for the Science of Clinical Research

    PubMed Central

    Sim, Ida; Tu, Samson W.; Carini, Simona; Lehmann, Harold P.; Pollock, Brad H.; Peleg, Mor; Wittkowski, Knut M.

    2013-01-01

    To date, the scientific process for generating, interpreting, and applying knowledge has received less informatics attention than operational processes for conducting clinical studies. The activities of these scientific processes — the science of clinical research — are centered on the study protocol, which is the abstract representation of the scientific design of a clinical study. The Ontology of Clinical Research (OCRe) is an OWL 2 model of the entities and relationships of study design protocols for the purpose of computationally supporting the design and analysis of human studies. OCRe’s modeling is independent of any specific study design or clinical domain. It includes a study design typology and a specialized module called ERGO Annotation for capturing the meaning of eligibility criteria. In this paper, we describe the key informatics use cases of each phase of a study’s scientific lifecycle, present OCRe and the principles behind its modeling, and describe applications of OCRe and associated technologies to a range of clinical research use cases. OCRe captures the central semantics that underlies the scientific processes of clinical research and can serve as an informatics foundation for supporting the entire range of knowledge activities that constitute the science of clinical research. PMID:24239612
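
    An illustrative fragment (assuming the third-party rdflib package) of how an OWL 2 study-design ontology can be declared and a study typed against it; the class and property names are stand-ins, not OCRe's actual vocabulary:

      from rdflib import Graph, Literal, Namespace
      from rdflib.namespace import OWL, RDF, RDFS

      OCRE = Namespace("http://example.org/ocre#")
      g = Graph()
      g.bind("ocre", OCRE)

      g.add((OCRE.Study, RDF.type, OWL.Class))
      g.add((OCRE.RandomizedTrial, RDF.type, OWL.Class))
      g.add((OCRE.RandomizedTrial, RDFS.subClassOf, OCRE.Study))
      g.add((OCRE.hasEligibilityCriterion, RDF.type, OWL.ObjectProperty))

      g.add((OCRE.Study001, RDF.type, OCRE.RandomizedTrial))
      g.add((OCRE.Study001, RDFS.label, Literal("example annotated study")))
      print(g.serialize(format="turtle"))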

  14. Toward the modelling of safety violations in healthcare systems.

    PubMed

    Catchpole, Ken

    2013-09-01

    When frontline staff do not adhere to policies, protocols, or checklists, managers often regard these violations as indicating poor practice or even negligence. More often than not, however, these policy and protocol violations reflect the efforts of well intentioned professionals to carry out their work efficiently in the face of systems poorly designed to meet the diverse demands of patient care. Thus, non-compliance with institutional policies and protocols often signals a systems problem, rather than a people problem, and can be influenced among other things by training, competing goals, context, process, location, case complexity, individual beliefs, the direct or indirect influence of others, job pressure, flexibility, rule definition, and clinician-centred design. Three candidates are considered for developing a model of safety behaviour and decision making. The dynamic safety model helps to understand the relationship between systems designs and human performance. The theory of planned behaviour suggests that intention is a function of attitudes, social norms and perceived behavioural control. The naturalistic decision making paradigm posits that decisions are based on a wider view of multiple patients, expertise, systems complexity, behavioural intention, individual beliefs and current understanding of the system. Understanding and predicting behavioural safety decisions could help us to encourage compliance to current processes and to design better interventions.

  15. DNA tetrominoes: the construction of DNA nanostructures using self-organised heterogeneous deoxyribonucleic acids shapes.

    PubMed

    Ong, Hui San; Rahim, Mohd Syafiq; Firdaus-Raih, Mohd; Ramlan, Effirul Ikhwan

    2015-01-01

    The unique programmability of nucleic acids offers an alternative approach to constructing excitable and functional nanostructures. This work introduces an autonomous protocol to construct DNA Tetris shapes (L-Shape, B-Shape, T-Shape and I-Shape) using modular DNA blocks. The protocol exploits the rich number of sequence combinations available from the nucleic acid alphabet, thus allowing diversity to be applied in designing various DNA nanostructures. Instead of a deterministic set of sequences corresponding to a particular design, the protocol promotes a large pool of DNA shapes that can assemble to conform to any desired structure. By utilising evolutionary programming in the design stage, DNA blocks are subjected to processes such as sequence insertion, deletion and base shifting in order to enrich the diversity of the resulting shapes based on a set of cascading filters. The optimisation algorithm allows mutation to be exerted indefinitely on the candidate sequences until they comply with all four fitness criteria. Candidates generated by the protocol are in agreement with the filter cascade and thermodynamic simulation. Further validation using gel electrophoresis indicated the formation of the designed shapes, supporting the plausibility of constructing DNA nanostructures in a more hierarchical, modular and interchangeable manner.
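
    A hedged sketch of that design loop: mutate candidate DNA block sequences until they pass a cascade of fitness filters. The three filters here (GC content, no long repeats, no hairpin seed) are simplified stand-ins for the paper's four criteria:

      import random
      random.seed(1)
      BASES = "ACGT"
      COMP = str.maketrans("ACGT", "TGCA")

      def gc_ok(s):      return 0.4 <= (s.count("G") + s.count("C")) / len(s) <= 0.6
      def no_repeats(s): return all(b * 5 not in s for b in BASES)
      def no_hairpin(s): return s[:6].translate(COMP)[::-1] not in s[6:]
      FILTERS = [gc_ok, no_repeats, no_hairpin]

      def mutate(s):
          i = random.randrange(len(s))
          ops = [lambda: s[:i] + random.choice(BASES) + s[i + 1:],  # base shift
                 lambda: s[:i] + random.choice(BASES) + s[i:],      # insertion
                 lambda: s[:i] + s[i + 1:] if len(s) > 12 else s]   # deletion
          return random.choice(ops)()

      seq = "".join(random.choice(BASES) for _ in range(20))
      while not all(f(seq) for f in FILTERS):   # mutate until compliant
          seq = mutate(seq)
      print(seq)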

  16. Application of SNMP on CATV

    NASA Astrophysics Data System (ADS)

    Huang, Hong-bin; Liu, Wei-ping; Chen, Shun-er; Zheng, Liming

    2005-02-01

    A new type of CATV network management system, developed on a general-purpose MCU and supporting SNMP, is proposed in this paper. From both the hardware and software points of view, the functions and methods of the modules inside the system, including physical-layer communications, protocol processing and data processing, are analyzed. In our design, the management system uses an IP MAN as the data transmission channel, and every controlled object in the management structure has an SNMP agent. The SNMP agent developed comprises four function modules: a physical-layer communication module, a protocol processing module, an internal data processing module and a MIB management module. The structure and function of every module are designed and demonstrated, while the related hardware circuits and software flow are tested and the experimental results presented. Furthermore, by introducing an RTOS into the software, the MCU can manage multiple threads, such as driving the fast Ethernet controller, TCP/IP processing and serial-port monitoring, which greatly improves CPU efficiency.

  17. Time Warp Operating System (TWOS)

    NASA Technical Reports Server (NTRS)

    Bellenot, Steven F.

    1993-01-01

    Designed to support parallel discrete-event simulation, TWOS is a complete implementation of the Time Warp mechanism, a distributed protocol for virtual-time synchronization based on process rollback and message annihilation.
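
    A compact toy illustration of the two ingredients named above (not TWOS itself): on receiving a straggler message in its past, a process rolls back to a checkpoint and emits anti-messages that annihilate its premature sends:

      class TWProcess:
          def __init__(self):
              self.lvt = 0                # local virtual time
              self.states = {0: "init"}   # checkpoints by timestamp
              self.sent = []              # (timestamp, msg) we may need to cancel

          def handle(self, ts, msg, out):
              if ts < self.lvt:           # straggler: roll back
                  past = max(t for t in self.states if t <= ts)
                  self.lvt = past
                  for t, m in [s for s in self.sent if s[0] > past]:
                      out.append(("anti", t, m))     # annihilate premature send
                      self.sent.remove((t, m))
              self.lvt = ts
              self.states[ts] = f"after {msg}"
              self.sent.append((ts, f"fwd:{msg}"))
              out.append(("msg", ts, f"fwd:{msg}"))

      p, out = TWProcess(), []
      for ts, m in [(10, "a"), (20, "b"), (15, "c")]:   # 15 arrives late
          p.handle(ts, m, out)
      print(out)   # includes an anti-message for the send made at t=20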

  18. Active SAmpling Protocol (ASAP) to Optimize Individual Neurocognitive Hypothesis Testing: A BCI-Inspired Dynamic Experimental Design.

    PubMed

    Sanchez, Gaëtan; Lecaignard, Françoise; Otman, Anatole; Maby, Emmanuel; Mattout, Jérémie

    2016-01-01

    The relatively young field of Brain-Computer Interfaces has promoted the use of electrophysiology and neuroimaging in real-time. In the meantime, cognitive neuroscience studies, which make extensive use of functional exploration techniques, have evolved toward model-based experiments and fine hypothesis testing protocols. Although these two developments are mostly unrelated, we argue that, brought together, they may trigger an important shift in the way experimental paradigms are being designed, which should prove fruitful to both endeavors. This change simply consists in using real-time neuroimaging in order to optimize advanced neurocognitive hypothesis testing. We refer to this new approach as the instantiation of an Active SAmpling Protocol (ASAP). As opposed to classical (static) experimental protocols, ASAP implements online model comparison, enabling the optimization of design parameters (e.g., stimuli) during the course of data acquisition. This follows the well-known principle of sequential hypothesis testing. What is radically new, however, is our ability to perform online processing of the huge amount of complex data that brain imaging techniques provide. This is all the more relevant at a time when physiological and psychological processes are beginning to be approached using more realistic, generative models which may be difficult to tease apart empirically. Based upon Bayesian inference, ASAP proposes a generic and principled way to optimize experimental design adaptively. In this perspective paper, we summarize the main steps in ASAP. Using synthetic data we illustrate its superiority in selecting the right perceptual model compared to a classical design. Finally, we briefly discuss its future potential for basic and clinical neuroscience as well as some remaining challenges.
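
    A toy illustration of the ASAP idea, with synthetic models and stimuli: keep posterior odds over two candidate response models online, and present the stimulus where their predictions disagree most:

      import math, random
      random.seed(0)

      STIMULI = [0.1 * k for k in range(11)]
      def p_resp(model, x):   # P(response = 1 | model, stimulus)
          return 1 / (1 + math.exp(-10 * (x - (0.3 if model == "A" else 0.6))))

      log_odds, truth = 0.0, "B"   # log P(A|data) - log P(B|data)
      for _ in range(15):
          x = max(STIMULI, key=lambda s: abs(p_resp("A", s) - p_resp("B", s)))
          y = 1 if random.random() < p_resp(truth, x) else 0
          pa, pb = p_resp("A", x), p_resp("B", x)
          log_odds += math.log((pa if y else 1 - pa) / (pb if y else 1 - pb))
      print("posterior log-odds favour", "A" if log_odds > 0 else "B", log_odds)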

  19. Development of a manualized protocol of massage therapy for clinical trials in osteoarthritis.

    PubMed

    Ali, Ather; Kahn, Janet; Rosenberger, Lisa; Perlman, Adam I

    2012-10-04

    Clinical trial design of manual therapies may be especially challenging as techniques are often individualized and practitioner-dependent. This paper describes our methods in creating a standardized Swedish massage protocol tailored to subjects with osteoarthritis of the knee while respectful of the individualized nature of massage therapy, as well as implementation of this protocol in two randomized clinical trials. Manualization involved collaboration between methodologic and clinical experts, with the explicit goals of creating a reproducible semi-structured protocol for massage therapy, while allowing some latitude for therapists' clinical judgment and maintaining consistency with a prior pilot study. The manualized protocol addressed identical specified body regions with distinct 30- and 60-min protocols, using standard Swedish strokes. Each protocol specifies the time allocated to each body region. The manualized 30- and 60-min protocols were implemented in a dual-site 24-week randomized dose-finding trial in patients with osteoarthritis of the knee, and are currently being implemented in a three-site 52-week efficacy trial of manualized Swedish massage therapy. In the dose-finding study, therapists adhered to the protocols and significant treatment effects were demonstrated. The massage protocol was manualized, using standard techniques, and made flexible for individual practitioner and subject needs. The protocol has been applied in two randomized clinical trials. This manualized Swedish massage protocol has real-world utility and can be readily utilized both in research and clinical settings. Trial registration: ClinicalTrials.gov NCT00970008 (18 August 2009).

  20. A psychoengineering paradigm for the neurocognitive mechanisms of biofeedback and neurofeedback.

    PubMed

    Gaume, A; Vialatte, A; Mora-Sánchez, A; Ramdani, C; Vialatte, F B

    2016-09-01

    We believe that the missing keystone for designing effective and efficient biofeedback and neurofeedback protocols is a comprehensive model of the mechanisms of feedback learning. In this manuscript we review the learning models in behavioral, developmental and cognitive psychology, and derive a synthetic model of the psychological perspective on biofeedback. We then review the neural correlates of feedback learning mechanisms and present a general neuroscience model of biofeedback. We subsequently show how biomedical engineering principles can be applied to design efficient feedback protocols. Finally, we present an integrative psychoengineering model of the feedback learning process and provide new guidelines for the efficient design of biofeedback and neurofeedback protocols. We identify five key properties: (1) perceptibility (can the subject perceive the biosignal?), (2) autonomy (can the subject regulate by himself?), (3) mastery (the degree of control over the biosignal), (4) motivation (the reward system of the biofeedback), and (5) learnability (the possibility of learning). We conclude with guidelines for the investigation and promotion of these properties in biofeedback protocols. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. An electronic specimen collection protocol schema (eSCPS). Document architecture for specimen management and the exchange of specimen collection protocols between biobanking information systems.

    PubMed

    Eminaga, O; Semjonow, A; Oezguer, E; Herden, J; Akbarov, I; Tok, A; Engelmann, U; Wille, S

    2014-01-01

    The integrity of collection protocols in biobanking is essential for a high-quality sample preparation process. However, there is not currently a well-defined universal method for integrating collection protocols in the biobanking information system (BIMS). Therefore, an electronic schema of the collection protocol that is based on Extensible Markup Language (XML) is required to maintain the integrity and enable the exchange of collection protocols. The development and implementation of an electronic specimen collection protocol schema (eSCPS) was performed at two institutions (Muenster and Cologne) in three stages. First, we analyzed the infrastructure that was already established at both the biorepository and the hospital information systems of these institutions and determined the requirements for the sufficient preparation of specimens and documentation. Second, we designed an eSCPS according to these requirements. Finally, a prospective study was conducted to implement and evaluate the novel schema in the current BIMS. We designed an eSCPS that provides all of the relevant information about collection protocols. Ten electronic collection protocols were generated using the supplementary Protocol Editor tool, and these protocols were successfully implemented in the existing BIMS. Moreover, an electronic list of collection protocols for the current studies being performed at each institution was included, new collection protocols were added, and the existing protocols were redesigned to be modifiable. The documentation time was significantly reduced after implementing the eSCPS (5 ± 2 min vs. 7 ± 3 min; p = 0.0002). The eSCPS improves the integrity and facilitates the exchange of specimen collection protocols in the existing open-source BIMS.
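
    The schema itself is not reproduced in the abstract, so the following is a hypothetical XML rendering of what a collection-protocol document might carry (protocol id, specimen type, processing steps), built with the standard library to show the exchange-format idea:

      import xml.etree.ElementTree as ET

      protocol = ET.Element("CollectionProtocol", id="CP-0001", version="1.0")
      ET.SubElement(protocol, "Specimen", type="serum", container="SST tube")
      steps = ET.SubElement(protocol, "Processing")
      ET.SubElement(steps, "Step", order="1", action="centrifuge", params="2000g,10min")
      ET.SubElement(steps, "Step", order="2", action="aliquot", params="0.5ml")
      ET.SubElement(steps, "Step", order="3", action="freeze", params="-80C")

      print(ET.tostring(protocol, encoding="unicode"))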

  2. Documentation of operational protocol for the use of MAMA software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Daniel S.

    2016-01-21

    Image analysis of Scanning Electron Microscope (SEM) micrographs is a complex process that can vary significantly between analysts. The factors causing the variation are numerous, and the purpose of Task 2b is to develop and test a set of protocols designed to minimize variation in image analysis between different analysts and laboratories, specifically using the MAMA software package, Version 2.1. The protocols were designed to be “minimally invasive”, so that expert SEM operators will not be overly constrained in the way they analyze particle samples. The protocols will be tested using a round-robin approach in which results from expert SEM users at Los Alamos National Laboratory, Lawrence Livermore National Laboratory, Pacific Northwest National Laboratory, Savannah River National Laboratory, and the National Institute of Standards and Technology will be compared. The variation of the results will be used to quantify uncertainty in the particle image analysis process. The round-robin exercise will proceed with 3 levels of rigor, each with its own set of protocols, as described below in Tasks 2b.1, 2b.2, and 2b.3. The uncertainty will be developed using NIST standard reference material SRM 1984 “Thermal Spray Powder – Particle Size Distribution, Tungsten Carbide/Cobalt (Acicular)” [Reference 1]. Full details are available in the Certificate of Analysis, posted on the NIST website (http://www.nist.gov/srm/).

  3. Structure-Based Virtual Screening for Drug Discovery: Principles, Applications and Recent Advances

    PubMed Central

    Lionta, Evanthia; Spyrou, George; Vassilatis, Demetrios K.; Cournia, Zoe

    2014-01-01

    Structure-based drug discovery (SBDD) is becoming an essential tool in assisting fast and cost-efficient lead discovery and optimization. The application of rational, structure-based drug design is proven to be more efficient than the traditional way of drug discovery since it aims to understand the molecular basis of a disease and utilizes the knowledge of the three-dimensional structure of the biological target in the process. In this review, we focus on the principles and applications of Virtual Screening (VS) within the context of SBDD and examine different procedures ranging from the initial stages of the process that include receptor and library pre-processing, to docking, scoring and post-processing of top-scoring hits. Recent improvements in structure-based virtual screening (SBVS) efficiency through ensemble docking, induced fit and consensus docking are also discussed. The review highlights advances in the field within the framework of several successful case studies that have led to nM inhibition directly from VS and provides recent trends in library design as well as discusses limitations of the method. Applications of SBVS in the design of substrates for engineered proteins that enable the discovery of new metabolic and signal transduction pathways and the design of inhibitors of multifunctional proteins are also reviewed. Finally, we contribute two promising VS protocols recently developed by us that aim to increase inhibitor selectivity. In the first protocol, we describe the discovery of micromolar inhibitors through SBVS designed to inhibit the mutant H1047R PI3Kα kinase. Second, we discuss a strategy for the identification of selective binders for the RXRα nuclear receptor. In this protocol, a set of target structures is constructed for ensemble docking based on binding site shape characterization and clustering, aiming to enhance the hit rate of selective inhibitors for the desired protein target through the SBVS process. PMID:25262799
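
    A small sketch of the consensus-docking idea mentioned above: average each compound's rank across several docking programs and keep the best consensus ranks. The scores and program names are made up:

      scores = {   # lower score = better pose, per docking program
          "dockA": {"cmpd1": -9.1, "cmpd2": -7.4, "cmpd3": -8.2},
          "dockB": {"cmpd1": -8.0, "cmpd2": -8.8, "cmpd3": -7.1},
          "dockC": {"cmpd1": -9.5, "cmpd2": -6.9, "cmpd3": -8.9},
      }

      def ranks(program):
          ordered = sorted(scores[program], key=scores[program].get)
          return {c: r for r, c in enumerate(ordered, start=1)}

      all_ranks = [ranks(p) for p in scores]
      consensus = {c: sum(r[c] for r in all_ranks) / len(all_ranks)
                   for c in next(iter(scores.values()))}
      print(sorted(consensus.items(), key=lambda kv: kv[1]))  # best hits first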

  4. DATA QUALITY OBJECTIVES AND STATISTICAL DESIGN SUPPORT FOR DEVELOPMENT OF A MONITORING PROTOCOL FOR RECREATIONAL WATERS

    EPA Science Inventory

    The purpose of this report is to describe the outputs of the Data Quality Objectives (DQOs) Process and discussions about developing a statistical design that will be used to implement the research study of recreational beach waters.

  5. Is There Room in the Graduate Curriculum to Learn How to Be a Grad Student? An Approach Using a Graduate-Level Biochemical Engineering Course

    ERIC Educational Resources Information Center

    Aucoin, Marc G.; Jolicoeur, Mario

    2009-01-01

    Undergraduate and graduate engineering training differ significantly. The former looks to established protocols and formulas to design and control processes while the latter often involves questioning established protocols and formulas to better suit and describe phenomena. Although we do not dispute the benefits of practical hands-on approaches,…

  6. A standard protocol for describing individual-based and agent-based models

    USGS Publications Warehouse

    Grimm, Volker; Berger, Uta; Bastiansen, Finn; Eliassen, Sigrunn; Ginot, Vincent; Giske, Jarl; Goss-Custard, John; Grand, Tamara; Heinz, Simone K.; Huse, Geir; Huth, Andreas; Jepsen, Jane U.; Jorgensen, Christian; Mooij, Wolf M.; Muller, Birgit; Pe'er, Guy; Piou, Cyril; Railsback, Steven F.; Robbins, Andrew M.; Robbins, Martha M.; Rossmanith, Eva; Ruger, Nadja; Strand, Espen; Souissi, Sami; Stillman, Richard A.; Vabo, Rune; Visser, Ute; DeAngelis, Donald L.

    2006-01-01

    Simulation models that describe autonomous individual organisms (individual based models, IBM) or agents (agent-based models, ABM) have become a widely used tool, not only in ecology, but also in many other disciplines dealing with complex systems made up of autonomous entities. However, there is no standard protocol for describing such simulation models, which can make them difficult to understand and to duplicate. This paper presents a proposed standard protocol, ODD, for describing IBMs and ABMs, developed and tested by 28 modellers who cover a wide range of fields within ecology. This protocol consists of three blocks (Overview, Design concepts, and Details), which are subdivided into seven elements: Purpose, State variables and scales, Process overview and scheduling, Design concepts, Initialization, Input, and Submodels. We explain which aspects of a model should be described in each element, and we present an example to illustrate the protocol in use. In addition, 19 examples are available in an Online Appendix. We consider ODD as a first step for establishing a more detailed common format of the description of IBMs and ABMs. Once initiated, the protocol will hopefully evolve as it becomes used by a sufficiently large proportion of modellers.
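
    The three blocks and seven elements from the abstract, rendered as a fill-in template (the block and element names are ODD's own; the placeholder contents are not):

      ODD_TEMPLATE = {
          "Overview": {
              "Purpose": "<why the model is built>",
              "State variables and scales": "<entities, variables, extents>",
              "Process overview and scheduling": "<what happens, in what order>",
          },
          "Design concepts": {
              "Design concepts": "<emergence, adaptation, stochasticity, ...>",
          },
          "Details": {
              "Initialization": "<initial state of the model world>",
              "Input": "<external drivers, data files>",
              "Submodels": "<equations and rules, one per process>",
          },
      }

      for block, elements in ODD_TEMPLATE.items():
          print(block, "->", ", ".join(elements))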

  7. Verification and validation of a reliable multicast protocol

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.

    1995-01-01

    This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.
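
    A sketch of deriving conformance tests from a state model in the spirit described: enumerate bounded transition paths from the start state and replay each as a test case. The tiny state machine is illustrative, not the actual RMP specification:

      TRANSITIONS = {   # state -> {event: next_state}
          "idle":       {"join": "member"},
          "member":     {"send": "member", "leave": "idle", "drop": "recovering"},
          "recovering": {"resync": "member"},
      }

      def paths(state="idle", depth=3, prefix=()):
          if depth == 0:
              yield prefix
              return
          for event, nxt in TRANSITIONS[state].items():
              yield from paths(nxt, depth - 1, prefix + ((state, event, nxt),))

      for test_case in paths():
          print(" -> ".join(f"{s}--{e}" for s, e, _ in test_case))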

  8. CT and MR Protocol Standardization Across a Large Health System: Providing a Consistent Radiologist, Patient, and Referring Provider Experience.

    PubMed

    Sachs, Peter B; Hunt, Kelly; Mansoubi, Fabien; Borgstede, James

    2017-02-01

    Building and maintaining a comprehensive yet simple set of standardized protocols for cross-sectional imaging can be a daunting task. A single department may have difficulty preventing "protocol creep," which almost inevitably occurs when an organized "playbook" of protocols does not exist and individual radiologists and technologists alter protocols at will and on a case-by-case basis. When multiple departments or groups function in a large health system, the lack of uniformity of protocols can increase exponentially. In 2012, the University of Colorado Hospital formed a large health system (UCHealth) and became a 5-hospital provider network. CT and MR imaging studies are conducted at multiple locations by different radiology groups. To facilitate consistency in ordering, acquisition, and appearance of a given study, regardless of location, we minimized the number of protocols across all scanners and sites of practice with a clinical indication-driven protocol selection and standardization process. Here we review the steps used to perform this process improvement task and ensure its stability over time. Actions included creation of a standardized protocol template, which allowed for changes in electronic storage and management of protocols, design of a change request form, and formation of a governance structure. We used rapid improvement events (1 day for CT, 2 days for MR) and reduced 248 CT protocols to 97 standardized protocols and 168 MR protocols to 66. Additional steps are underway to further standardize the output and reporting of imaging interpretation. This will result in an improved, consistent radiologist, patient, and provider experience across the system.

  9. The BACnet Campus Challenge - Part 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masica, Ken; Tom, Steve

    Here, the BACnet protocol was designed to achieve interoperability among building automation vendors and evolve over time to include new functionality as well as support new communication technologies such as the Ethernet and IP protocols as they became prevalent and economical in the market place. For large multi-building, multi-vendor campus environments, standardizing on the BACnet protocol as an implementation strategy can be a key component in meeting the challenge of an interoperable, flexible, and scalable building automation system. The interoperability of BACnet is especially important when large campuses with legacy equipment have DDC upgrades to facilities performed over different time frames and use different contractors that install equipment from different vendors under the guidance of different campus HVAC project managers. In these circumstances, BACnet can serve as a common foundation for interoperability when potential variability exists in approaches to the design-build process by numerous parties over time. Likewise, BACnet support for a range of networking protocols and technologies can be a key strategy for achieving flexible and scalable automation systems as campuses and enterprises expand networking infrastructures using standard interoperable protocols like IP and Ethernet.

  10. The BACnet Campus Challenge - Part 1

    DOE PAGES

    Masica, Ken; Tom, Steve

    2015-12-01

    Here, the BACnet protocol was designed to achieve interoperability among building automation vendors and evolve over time to include new functionality as well as support new communication technologies such as the Ethernet and IP protocols as they became prevalent and economical in the market place. For large multi-building, multi-vendor campus environments, standardizing on the BACnet protocol as an implementation strategy can be a key component in meeting the challenge of an interoperable, flexible, and scalable building automation system. The interoperability of BACnet is especially important when large campuses with legacy equipment have DDC upgrades to facilities performed over different time frames and use different contractors that install equipment from different vendors under the guidance of different campus HVAC project managers. In these circumstances, BACnet can serve as a common foundation for interoperability when potential variability exists in approaches to the design-build process by numerous parties over time. Likewise, BACnet support for a range of networking protocols and technologies can be a key strategy for achieving flexible and scalable automation systems as campuses and enterprises expand networking infrastructures using standard interoperable protocols like IP and Ethernet.

  11. Development of the protocol for purification of artemisinin based on combination of commercial and computationally designed adsorbents.

    PubMed

    Piletska, Elena V; Karim, Kal; Cutler, Malcolm; Piletsky, Sergey A

    2013-01-01

    A polymeric adsorbent for extraction of the antimalarial drug artemisinin from Artemisia annua L. was computationally designed. This polymer demonstrated a high capacity for artemisinin (120 mg g⁻¹), quantitative recovery (87%), and was found to be an effective material for purification of artemisinin from a complex plant matrix. Artemisinin quantification was conducted using an optimised HPLC-MS protocol, which was characterised by high precision and linearity in the concentration range between 0.05 and 2 μg mL⁻¹. Optimisation of the purification protocol also involved screening of commercial adsorbents for the removal of waxes and other interfering natural compounds, which inhibit the crystallisation of artemisinin. As a result of the two-step purification protocol, crystals of artemisinin were obtained, and artemisinin purity was evaluated as 75%. By performing the second stage of purification twice, the purity of artemisinin can be further improved to 99%. The developed protocol produced high-purity artemisinin using only a few purification steps, making it suitable for a large-scale industrial manufacturing process. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. A data transmission method for particle physics experiments based on Ethernet physical layer

    NASA Astrophysics Data System (ADS)

    Huang, Xi-Ru; Cao, Ping; Zheng, Jia-Jun

    2015-11-01

    Due to its advantages of universality, flexibility and high performance, fast Ethernet is widely used in readout system design for modern particle physics experiments. However, Ethernet is usually used together with the TCP/IP protocol stack, which complicates readout system implementation because designers have to use an operating system to process the protocol. Furthermore, TCP/IP degrades transmission efficiency and real-time performance. To maximize the performance of Ethernet in physics experiment applications, a data readout method based on the physical layer (PHY) is proposed. In this method, TCP/IP is replaced with a customized and simple protocol, which makes the readout easier to implement. On each readout module, data from the front-end electronics are first fed into an FPGA for protocol processing and then sent to a PHY chip controlled by this FPGA for transmission. This data path is implemented entirely in hardware. From the side of the data acquisition system (DAQ), however, the absence of a standard protocol causes problems for network-related applications. To solve this problem, in the operating system kernel space, data received by the network interface card are redirected from the traditional flow to a specified memory space by a customized program. This memory space can easily be accessed by applications in user space. For the purpose of verification, a prototype system has been designed and implemented. Preliminary test results show that this method can meet the requirements of data transmission from the readout module to the DAQ in an efficient and simple manner. Supported by the National Natural Science Foundation of China (11005107) and Independent Projects of the State Key Laboratory of Particle Detection and Electronics (201301).
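
    The receive-side idea (frames bypassing the TCP/IP stack) can be approximated in user space on Linux with a raw packet socket bound to a private EtherType. This is an illustrative sketch only: the EtherType 0x88B5 (an IEEE value set aside for local experiments) and the frame layout are assumptions, not the protocol from the paper, and the script requires root privileges.

    ```python
    import socket
    import struct

    ETH_P_EXPERIMENTAL = 0x88B5  # IEEE local-experimental EtherType (assumed)

    def receive_frames(ifname="eth0"):
        # AF_PACKET delivers whole layer-2 frames, bypassing the TCP/IP
        # stack, mirroring the idea of a custom protocol over the PHY/MAC.
        sock = socket.socket(socket.AF_PACKET, socket.SOCK_RAW,
                             socket.htons(ETH_P_EXPERIMENTAL))
        sock.bind((ifname, 0))
        while True:
            frame, _addr = sock.recvfrom(65535)
            dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])
            payload = frame[14:]  # the customized readout protocol would live here
            print(f"{src.hex(':')} -> {dst.hex(':')} "
                  f"type=0x{ethertype:04x} {len(payload)} payload bytes")

    if __name__ == "__main__":
        receive_frames()
    ```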

  13. Three steps to writing adaptive study protocols in the early phase clinical development of new medicines

    PubMed Central

    2014-01-01

    This article attempts to define terminology and to describe a process for writing adaptive, early phase study protocols which are transparent, self-intuitive and uniform. It provides a step-by-step guide, giving templates from projects which received regulatory authorisation and were successfully performed in the UK. During adaptive studies, evolving data are used to modify the trial design and conduct within the protocol-defined remit. Adaptations within that remit are documented using non-substantial protocol amendments which do not require regulatory or ethical review. This concept is efficient in gathering relevant data in exploratory early phase studies, and is ethical and time- and cost-effective. PMID:24980283

  14. In Situ Chemical Oxidation for Groundwater Remediation: Site-Specific Engineering & Technology Application

    DTIC Science & Technology

    2010-10-01

    Report fragments (performing organization: Colorado School of Mines, 1500 Illinois St, Golden, CO 80401): an overall ISCO protocol flow diagram is presented; in addition, laboratory studies may be used to select optimal chemistry parameters to maximize oxidant… Because of the complexity of these oxidants' chemistry and implementation, with much of the knowledge base residing with those…

  15. A New On-Line Diagnosis Protocol for the SPIDER Family of Byzantine Fault Tolerant Architectures

    NASA Technical Reports Server (NTRS)

    Geser, Alfons; Miner, Paul S.

    2004-01-01

    This paper presents the formal verification of a new protocol for online distributed diagnosis for the SPIDER family of architectures. An instance of the Scalable Processor-Independent Design for Electromagnetic Resilience (SPIDER) architecture consists of a collection of processing elements communicating over a Reliable Optical Bus (ROBUS). The ROBUS is a specialized fault-tolerant device that guarantees Interactive Consistency, Distributed Diagnosis (Group Membership), and Synchronization in the presence of a bounded number of physical faults. Formal verification of the original SPIDER diagnosis protocol provided a detailed understanding that led to the discovery of a significantly more efficient protocol. The original protocol was adapted from the formally verified protocol used in the MAFT architecture. It required O(N) message exchanges per defendant to correctly diagnose failures in a system with N nodes. The new protocol achieves the same diagnostic fidelity, but only requires O(1) exchanges per defendant. This paper presents this new diagnosis protocol and a formal proof of its correctness using PVS.

  16. Application of two segmentation protocols during the processing of virtual images in rapid prototyping: ex vivo study with human dry mandibles.

    PubMed

    Ferraz, Eduardo Gomes; Andrade, Lucio Costa Safira; dos Santos, Aline Rode; Torregrossa, Vinicius Rabelo; Rubira-Bullen, Izabel Regina Fischer; Sarmento, Viviane Almeida

    2013-12-01

    The aim of this study was to evaluate the accuracy of virtual three-dimensional (3D) reconstructions of human dry mandibles produced from two segmentation protocols ("outline only" and "all-boundary lines"). Twenty virtual 3D images were built from computed tomography (CT) exams of 10 dry mandibles, in which linear measurements between anatomical landmarks were obtained and compared at a 5% significance level. The results showed no statistically significant difference between the dry mandibles and the virtual 3D reconstructions produced from the segmentation protocols tested (p = 0.24). When designing a virtual 3D reconstruction, both the "outline only" and "all-boundary lines" segmentation protocols can be used. Virtual processing of CT images is the most complex stage in the manufacture of a biomodel; establishing a sound protocol for this phase allows the construction of a biomodel whose characteristics are closer to the original anatomical structures, which is essential for correct preoperative planning and suitable treatment.

  17. Application Protocol, Initial Graphics Exchange Specification (IGES), Layered Electrical Product

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O`Connell, L.J.

    1994-12-01

    An application protocol is an information systems engineering view of a specific product. The view represents an agreement on the generic activities needed to design and fabricate the product, the agreement on the information needed to support those activities, and the specific constructs of a product data standard for use in transferring some or all of the information required. This application protocol describes the data for electrical and electronic products in terms of a product description standard called the Initial Graphics Exchange Specification (IGES). More specifically, the Layered Electrical Product IGES Application Protocol (AP) specifies the mechanisms for defining and exchanging computer models and their associated data for those products which have been designed in two-dimensional geometry so as to be produced as a series of layers in IGES format. The AP defines the appropriateness of the data items for describing the geometry of the various parts of a product (shape and location), the connectivity, and the processing and material characteristics. Excluded are the behavioral requirements which the product was intended to satisfy, except as those requirements have been recorded as design rules or product testing requirements.

  18. Layered Electrical Product Application Protocol (AP). Draft: Initial Graphics Exchange Specification (IGES)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-09-01

    An application protocol is an information systems engineering view of a specific product. The view represents an agreement on the generic activities needed to design and fabricate the product, the agreement on the information needed to support those activities, and the specific constructs of a product data standard for use in transferring some or all of the information required. This application protocol describes the data for electrical and electronic products in terms of a product description standard called the Initial Graphics Exchange Specification (IGES). More specifically, the Layered Electrical Product IGES Application Protocol (AP) specifies the mechanisms for defining and exchanging computer models and their associated data for those products which have been designed in two-dimensional geometry so as to be produced as a series of layers in IGES format. The AP defines the appropriateness of the data items for describing the geometry of the various parts of a product (shape and location), the connectivity, and the processing and material characteristics. Excluded are the behavioral requirements which the product was intended to satisfy, except as those requirements have been recorded as design rules or product testing requirements.

  19. Designing of Roaming Protocol for Bluetooth Equipped Multi Agent Systems

    NASA Astrophysics Data System (ADS)

    Subhan, Fazli; Hasbullah, Halabi B.

    Bluetooth is an established standard for low-cost, low-power wireless personal area networks. Currently, Bluetooth does not support any roaming protocol in which handoff occurs dynamically when a Bluetooth device moves out of a piconet. If a device is losing its connection to the master device, no provision is made to transfer it to another master. Handoff is not possible within a piconet: in order to stay within the network, a slave must keep the same master, so by definition intra-piconet handoff is impossible. This research focuses on Bluetooth technology and the design of a roaming protocol for Bluetooth-equipped multi-agent systems. A mathematical model is derived for an agent; the idea behind the model is to determine when to initiate the roaming process for the agent. A desired trajectory for the agent is calculated from its x and y coordinates and simulated in SIMULINK. Various roaming techniques are also studied and discussed. The advantage of designing a roaming protocol is to ensure that Bluetooth-enabled roaming devices can move freely inside the network coverage without losing their connection or suffering a break in service when changing base stations.
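
    The roaming decision itself reduces to a threshold test on the agent's computed trajectory. The sketch below assumes a straight-line trajectory, a coverage radius, and a roaming margin, all illustrative values rather than the paper's derived model.

    ```python
    import math

    COVERAGE_RADIUS_M = 10.0   # assumed piconet coverage radius
    ROAM_MARGIN = 0.9          # initiate roaming at 90% of coverage (assumed)

    def position(t):
        """Assumed straight-line trajectory of the agent in the x-y plane."""
        return (0.5 * t, 0.2 * t)

    def should_initiate_roaming(agent_xy, master_xy):
        distance = math.hypot(agent_xy[0] - master_xy[0],
                              agent_xy[1] - master_xy[1])
        return distance >= ROAM_MARGIN * COVERAGE_RADIUS_M

    master = (0.0, 0.0)
    for t in range(30):
        if should_initiate_roaming(position(t), master):
            print(f"t={t}s: initiate handoff to a new master")
            break
    ```

    Starting the handoff before the coverage edge is reached is what avoids the break in service described above.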

  20. Design and Construction of a Multi-wavelength, Micromirror Total Internal Reflectance Fluorescence Microscope

    PubMed Central

    Larson, Joshua; Kirk, Matt; Drier, Eric A.; O’Brien, William; MacKay, James F.; Friedman, Larry; Hoskins, Aaron

    2015-01-01

    Colocalization Single Molecule Spectroscopy (CoSMoS) has proven to be a useful method for studying the composition, kinetics, and mechanisms of complex cellular machines. Key to the technique is the ability to simultaneously monitor multiple proteins and/or nucleic acids as they interact with one another. Here we describe a protocol for constructing a CoSMoS micromirror Total Internal Reflection Fluorescence Microscope (mmTIRFM). Design and construction of a scientific microscope often requires a number of custom components and a significant time commitment. In our protocol, we have streamlined this process by implementation of a commercially available microscopy platform designed to accommodate the optical components necessary for a mmTIRFM. The mmTIRF system eliminates the need for machining custom parts by the end-user and facilitates optical alignment. Depending on the experience-level of the microscope builder, these time-savings and the following protocol can enable mmTIRF construction to be completed within two months. PMID:25188633

  1. Design and construction of a multiwavelength, micromirror total internal reflectance fluorescence microscope.

    PubMed

    Larson, Joshua; Kirk, Matt; Drier, Eric A; O'Brien, William; MacKay, James F; Friedman, Larry J; Hoskins, Aaron A

    2014-10-01

    Colocalization single-molecule spectroscopy (CoSMoS) has proven to be a useful method for studying the composition, kinetics and mechanisms of complex cellular machines. Key to the technique is the ability to simultaneously monitor multiple proteins and/or nucleic acids as they interact with one another. Here we describe a protocol for constructing a CoSMoS micromirror total internal reflection fluorescence microscope (mmTIRFM). Design and construction of a scientific microscope often requires a number of custom components and a substantial time commitment. In our protocol, we have streamlined this process by implementation of a commercially available microscopy platform designed to accommodate the optical components necessary for an mmTIRFM. The mmTIRF system eliminates the need for machining custom parts by the end user and facilitates optical alignment. Depending on the experience level of the microscope builder, these time savings and the following protocol can enable mmTIRF construction to be completed within 2 months.

  2. Empirical studies of design software: Implications for software engineering environments

    NASA Technical Reports Server (NTRS)

    Krasner, Herb

    1988-01-01

    The empirical studies team of MCC's Design Process Group conducted three studies in 1986-87 in order to gather data on professionals designing software systems in a range of situations. The first study (the Lift Experiment) used thinking-aloud protocols in a controlled laboratory setting to study the cognitive processes of individual designers. The second study (the Object Server Project) involved the observation, videotaping, and data collection of a design team of a medium-sized development project over several months in order to study team dynamics. The third study (the Field Study) involved interviews with personnel from 19 large development projects at the MCC shareholder companies in order to study how the process of design is affected by organizational and project behavior. The focus of this report is on key observations of the design process (at several levels) and their implications for the design of environments.

  3. IONAC-Lite

    NASA Technical Reports Server (NTRS)

    Torgerson, Jordan L.; Clare, Loren P.; Pang, Jackson

    2011-01-01

    The Interplanetary Overlay Networking Protocol Accelerator (IONAC), described previously in "The Interplanetary Overlay Networking Protocol Accelerator" (NPO-45584), NASA Tech Briefs, Vol. 32, No. 10 (October 2008), p. 106 (http://www.techbriefs.com/component/content/article/3317), provides functions that implement the Delay Tolerant Networking (DTN) bundle protocol. New missions that require high-speed downlink-only use of DTN can now be accommodated by the unidirectional IONAC-Lite to support high-data-rate downlink mission applications. Due to constrained energy resources, a conventional software implementation of the DTN protocol can provide only limited throughput for any given reasonable energy consumption rate. The IONAC-Lite DTN Protocol Accelerator is able to reduce this energy consumption by an order of magnitude and increase the throughput capability by two orders of magnitude. In addition, a conventional DTN implementation requires a bundle database with a considerable storage requirement. In very high downlink data-rate missions such as near-Earth radar science missions, storage space utilization needs to be maximized for science data and minimized for communications protocol-related storage. The IONAC-Lite DTN Protocol Accelerator is implemented in a reconfigurable hardware device to accomplish exactly what is needed for high-throughput, downlink-only DTN scenarios. The salient features of the IONAC-Lite implementation are: an implementation of the Bundle Protocol for an environment that requires a very high bundle egress data rate, where the C&DH (command and data handling) subsystem is expected to be very constrained, so interaction with the C&DH processor and temporary storage is minimized; a fully pipelined design, so that a bundle processing database is not required; a lookup-table-based approach that eliminates the multi-pass processing requirement imposed by the Bundle Protocol header's length-field structure and the SDNV (self-delimiting numeric value) data field formatting; an 8-bit parallel datapath to support high-data-rate missions; and a reduced-resource implementation for missions that do not require custody transfer features. There was no known implementation of the DTN protocol in a field-programmable gate array (FPGA) device prior to the current implementation. The combination of energy and performance optimization that embodies this design makes the work novel.
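
    The SDNV fields mentioned above are what force multi-pass parsing in software: a field's length is unknown until its last byte is seen. Decoding one is simple; the accelerator's gain is doing it in a hardware pipeline. A minimal reference decoder for SDNVs as defined for the Bundle Protocol (RFC 5050), given as a sketch:

    ```python
    def decode_sdnv(buf, offset=0):
        """Decode one self-delimiting numeric value (SDNV, RFC 5050).

        Each byte contributes 7 value bits; a set high bit means another
        byte follows. Returns (value, offset just past the SDNV).
        """
        value = 0
        while True:
            byte = buf[offset]
            offset += 1
            value = (value << 7) | (byte & 0x7F)
            if not (byte & 0x80):
                return value, offset

    # 0x4268 encodes as the three-byte SDNV 0x81 0x84 0x68.
    assert decode_sdnv(bytes([0x81, 0x84, 0x68])) == (0x4268, 3)
    ```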

  4. Assessment of Language and Literacy: A Process of Hypothesis Testing for Individual Differences

    ERIC Educational Resources Information Center

    Scott, Cheryl M.

    2011-01-01

    Purpose: Older school-aged children and adolescents with persistent language and literacy impairments vary in their individual profiles of linguistic strengths and weaknesses. Given the multidimensional nature and complexity of language, designing an assessment protocol capable of uncovering linguistic variation is challenging. A process of…

  5. Field Demonstration, Optimization, and Rigorous Validation of Peroxygen-Based ISCO for the Remediation of Contaminated Groundwater - CHP Stabilization Protocol

    DTIC Science & Technology

    2014-05-01

    Report fragments: abbreviations include CoCs (contaminants of concern), GC (gas chromatography), DNAPL (dense nonaqueous phase liquid) and ISCO (in situ chemical oxidation). Treatability studies of the kind used for the design and scale-up of air strippers, ion exchange systems, precipitation reactors, and many other treatment processes provide definitive data on system dimensions and reagent dosages using linear or non-linear scale-up. Designing these processes without the…

  6. GLOBECOM '86 - Global Telecommunications Conference, Houston, TX, Dec. 1-4, 1986, Conference Record. Volumes 1, 2, & 3

    NASA Astrophysics Data System (ADS)

    Papers are presented on local area networks; formal methods for communication protocols; computer simulation of communication systems; spread spectrum and coded communications; tropical radio propagation; VLSI for communications; strategies for increasing software productivity; multiple access communications; advanced communication satellite technologies; and spread spectrum systems. Topics discussed include Space Station communication and tracking development and design; transmission networks; modulation; data communications; computer network protocols and performance; and coding and synchronization. Consideration is given to free space optical communications systems; VSAT communication networks; network topology design; advances in adaptive filtering echo cancellation and adaptive equalization; advanced signal processing for satellite communications; the elements, design, and analysis of fiber-optic networks; and advances in digital microwave systems.

  7. An Indoor Positioning-Based Mobile Payment System Using Bluetooth Low Energy Technology

    PubMed Central

    Winata, Doni

    2018-01-01

    The development of information technology has paved the way for faster and more convenient payment process flows and new methodology for the design and implementation of next-generation payment systems. The growth of smartphone usage nowadays has fostered a new and popular mobile payment environment. Most current-generation smartphones support Bluetooth Low Energy (BLE) technology to communicate with nearby BLE-enabled devices. It is plausible to construct an over-the-air BLE-based mobile payment system as one of the payment methods for people living in modern societies. In this paper, a secure indoor positioning-based mobile payment authentication protocol with BLE technology and the corresponding mobile payment system design are proposed. The proposed protocol consists of three phases: an initialization phase, a session key construction phase, and an authentication phase. When a customer moves toward the POS counter area, the proposed mobile payment system will automatically detect the position of the customer to confirm whether the customer is ready for the checkout process. Once the system has identified that the customer is standing within the payment-enabled area, the payment system will invoke an authentication process between the POS and the customer’s smartphone through the BLE communication channel to generate a secure session key and establish an authenticated communication session to perform the payment transaction accordingly. A prototype is implemented to assess the performance of the proposed mobile payment system design. In addition, a security analysis is conducted to evaluate the security strength of the proposed protocol. PMID:29587399

  8. An Indoor Positioning-Based Mobile Payment System Using Bluetooth Low Energy Technology.

    PubMed

    Yohan, Alexander; Lo, Nai-Wei; Winata, Doni

    2018-03-25

    The development of information technology has paved the way for faster and more convenient payment process flows and new methodology for the design and implementation of next-generation payment systems. The growth of smartphone usage nowadays has fostered a new and popular mobile payment environment. Most current-generation smartphones support Bluetooth Low Energy (BLE) technology to communicate with nearby BLE-enabled devices. It is plausible to construct an over-the-air BLE-based mobile payment system as one of the payment methods for people living in modern societies. In this paper, a secure indoor positioning-based mobile payment authentication protocol with BLE technology and the corresponding mobile payment system design are proposed. The proposed protocol consists of three phases: an initialization phase, a session key construction phase, and an authentication phase. When a customer moves toward the POS counter area, the proposed mobile payment system will automatically detect the position of the customer to confirm whether the customer is ready for the checkout process. Once the system has identified that the customer is standing within the payment-enabled area, the payment system will invoke an authentication process between the POS and the customer's smartphone through the BLE communication channel to generate a secure session key and establish an authenticated communication session to perform the payment transaction accordingly. A prototype is implemented to assess the performance of the proposed mobile payment system design. In addition, a security analysis is conducted to evaluate the security strength of the proposed protocol.
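
    The session key construction and authentication phases can be illustrated generically with nonce-based key derivation. The sketch below is not the paper's protocol: it assumes a secret shared during the initialization phase and uses HMAC-SHA-256 from the Python standard library to derive a per-transaction session key and a mutual-authentication tag.

    ```python
    import hashlib
    import hmac
    import secrets

    def derive_session_key(shared_secret, pos_nonce, phone_nonce):
        """Per-session key bound to both parties' fresh nonces (illustrative KDF)."""
        return hmac.new(shared_secret, b"session" + pos_nonce + phone_nonce,
                        hashlib.sha256).digest()

    def auth_tag(session_key, transcript):
        """Authentication tag over the protocol transcript."""
        return hmac.new(session_key, transcript, hashlib.sha256).digest()

    # Assumed exchange: both sides contribute nonces over BLE, then prove
    # knowledge of the derived key before the payment transaction proceeds.
    shared_secret = secrets.token_bytes(32)  # from the initialization phase
    pos_nonce = secrets.token_bytes(16)
    phone_nonce = secrets.token_bytes(16)
    k_pos = derive_session_key(shared_secret, pos_nonce, phone_nonce)
    k_phone = derive_session_key(shared_secret, pos_nonce, phone_nonce)
    assert hmac.compare_digest(auth_tag(k_pos, b"checkout#42"),
                               auth_tag(k_phone, b"checkout#42"))
    ```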

  9. Electronic protocol of respiratory physical therapy in patients with idiopathic adolescent scoliosis.

    PubMed

    Cano, Danila Vieira Baldini; Malafaia, Osvaldo; Alves, Vera Lúcia dos Santos; Avanzi, Osmar; Pinto, José Simão de Paula

    2011-01-01

    To create a clinical database of respiratory function in patients with adolescent idiopathic scoliosis; to computerize and store these clinical data using software; to incorporate this electronic protocol into the SINPE© (Integrated Electronic Protocols System); and to analyze a pilot project with interpretation of results. From the literature review, a computerized database of clinical data on postural deviations was set up (master protocol). Upon completion of the master protocol, a specific protocol of respiratory function in patients with adolescent idiopathic scoliosis was designed, and a pilot project was conducted to collect and analyze data from ten patients. It was possible to create the master protocol of postural deviations and the specific protocol of respiratory function in patients with adolescent idiopathic scoliosis. The data collected in the pilot project were processed by the SINPE ANALYZER©, generating charts and statistics. The establishment of the clinical database of adolescent idiopathic scoliosis was possible. Computerization and storage of clinical data using the software were viable. The electronic protocol of adolescent idiopathic scoliosis could be incorporated into the SINPE© and its use in the pilot project was successful.

  10. How do nurses, midwives and health visitors contribute to protocol-based care? A synthesis of the UK literature.

    PubMed

    Ilott, Irene; Booth, Andrew; Rick, Jo; Patterson, Malcolm

    2010-06-01

    To explore how nurses, midwives and health visitors contribute to the development, implementation and audit of protocol-based care. Protocol-based care refers to the use of documents that set standards for clinical care processes with the intent of reducing unacceptable variations in practice. Documents such as protocols, clinical guidelines and care pathways underpin evidence-based practice throughout the world. An interpretative review using the five-stage systematic literature review process. The data sources were the British Nursing Index, CINAHL, EMBASE, MEDLINE and Web of Science from inception to 2005. The Journal of Integrated Care Pathways was hand searched (1997-June 2006). Thirty-three studies about protocol-based care in the United Kingdom were appraised using the Qualitative Assessment and Review Instrument (QARI version 2). The literature was synthesized inductively and deductively, using an official 12-step guide for development as a framework for the deductive synthesis. Most papers were descriptive, offering practitioner knowledge and positive findings about a locally developed and owned protocol-based care. The majority were instigated in response to clinical need or service re-design. Development of protocol-based care was a non-linear, idiosyncratic process, with steps omitted, repeated or completed in a different order. The context and the multiple purposes of protocol-based care influenced the development process. Implementation and sustainability were rarely mentioned, or theorised as a change. The roles and activities of nurses were so understated as to be almost invisible. There were notable gaps in the literature about the resource use costs, the engagement of patients in the decision-making process, leadership and the impact of formalisation and new roles on inter-professional relations. Documents that standardise clinical care are part of the history of nursing as well as contemporary evidence-based care and expanded roles. Considering the proliferation and contested nature of protocol-based care, the dearth of literature about the contribution, experience and outcomes for nurses, midwives and health visitors is noteworthy and requires further investigation. © 2010 Elsevier Ltd. All rights reserved.

  11. A Conceptual Design for a Reliable Optical Bus (ROBUS)

    NASA Technical Reports Server (NTRS)

    Miner, Paul S.; Malekpour, Mahyar; Torres, Wilfredo

    2002-01-01

    The Scalable Processor-Independent Design for Electromagnetic Resilience (SPIDER) is a new family of fault-tolerant architectures under development at NASA Langley Research Center (LaRC). The SPIDER is a general-purpose computational platform suitable for use in ultra-reliable embedded control applications. The design scales from a small configuration supporting a single aircraft function to a large distributed configuration capable of supporting several functions simultaneously. SPIDER consists of a collection of simplex processing elements communicating via a Reliable Optical Bus (ROBUS). The ROBUS is an ultra-reliable, time-division multiple access broadcast bus with strictly enforced write access (no babbling idiots) providing basic fault-tolerant services using formally verified fault-tolerance protocols including Interactive Consistency (Byzantine Agreement), Internal Clock Synchronization, and Distributed Diagnosis. The conceptual design of the ROBUS is presented in this paper including requirements, topology, protocols, and the block-level design. Verification activities, including the use of formal methods, are also discussed.

  12. EXACT2: the semantics of biomedical protocols

    PubMed Central

    2014-01-01

    Background: The reliability and reproducibility of experimental procedures is a cornerstone of scientific practice. There is a pressing technological need for better representation of biomedical protocols to enable other agents (human or machine) to better reproduce results. A framework ensuring that all information required for the replication of experimental protocols is captured is essential to achieve reproducibility. Methods: We have developed the ontology EXACT2 (EXperimental ACTions), which is designed to capture the full semantics of biomedical protocols required for their reproducibility. To construct EXACT2 we manually inspected hundreds of published and commercial biomedical protocols from several areas of biomedicine. After establishing a clear pattern for extracting the required information, we utilized text-mining tools to translate the protocols into a machine-amenable format. We have verified the utility of EXACT2 through the successful processing of previously 'unseen' (not used for the construction of EXACT2) protocols. Results: The paper reports on a fundamentally new version of EXACT2 that supports the semantically defined representation of biomedical protocols. The ability of EXACT2 to capture the semantics of biomedical procedures was verified through a text-mining use case, in which EXACT2 is used as a reference model for text-mining tools to identify terms pertinent to experimental actions, and their properties, in biomedical protocols expressed in natural language. An EXACT2-based framework for the translation of biomedical protocols to a machine-amenable format is proposed. Conclusions: The EXACT2 ontology is sufficient to record, in a machine-processable form, the essential information about biomedical protocols. EXACT2 defines explicit semantics of experimental actions and can be used by various computer applications. It can serve as a reference model for the translation of biomedical protocols in natural language into a semantically defined format. PMID:25472549
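
    As a flavour of what a "machine amenable format" for a single protocol step might look like, here is a hypothetical structured rendering of one experimental action. The field names are illustrative assumptions, not EXACT2 terms (EXACT2 itself is an ontology with formally defined semantics).

    ```python
    # Hypothetical machine-readable rendering of one experimental action,
    # of the kind an EXACT2-based text-mining pipeline might emit.
    action = {
        "action_type": "centrifuge",
        "object": "cell lysate",
        "equipment": "benchtop microcentrifuge",
        "parameters": {
            "speed": {"value": 13000, "unit": "rpm"},
            "duration": {"value": 5, "unit": "min"},
            "temperature": {"value": 4, "unit": "degC"},
        },
        "source_sentence": "Spin the lysate at 13,000 rpm for 5 min at 4 C.",
    }

    def is_complete(act):
        """Crude reproducibility check: every parameter needs a value and a unit."""
        return all("value" in p and "unit" in p for p in act["parameters"].values())

    print(is_complete(action))  # True
    ```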

  13. Toward fidelity between specification and implementation

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.; Morrison, Jeff; Wu, Yunqing

    1994-01-01

    This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.

  14. On the Design of a Comprehensive Authorisation Framework for Service Oriented Architecture (SOA)

    DTIC Science & Technology

    2013-07-01

    Report fragments (glossary): …Authentication Server; AZM, Authorisation Manager; AZS, Authorisation Server; BP, Business Process; BPAA, Business Process Authorisation Architecture; BPAD, Business…; Internet Protocol Security; JAAS, Java Authentication and Authorisation Service; MAC, Mandatory Access Control; RBAC, Role Based Access Control; RCA, Regional… …the authentication process, make authorisation decisions using application-specific access control functions that results in the practice of…

  15. An Authentication Protocol for Future Sensor Networks.

    PubMed

    Bilal, Muhammad; Kang, Shin-Gak

    2017-04-28

    Authentication is one of the essential security services in Wireless Sensor Networks (WSNs) for ensuring secure data sessions. Sensor node authentication ensures the confidentiality and validity of data collected by the sensor node, whereas user authentication guarantees that only legitimate users can access the sensor data. In a mobile WSN, sensor and user nodes move across the network and exchange data with multiple nodes, thus experiencing the authentication process multiple times. The integration of WSNs with Internet of Things (IoT) brings forth a new kind of WSN architecture along with stricter security requirements; for instance, a sensor node or a user node may need to establish multiple concurrent secure data sessions. With concurrent data sessions, the frequency of the re-authentication process increases in proportion to the number of concurrent connections. Moreover, to establish multiple data sessions, it is essential that a protocol participant have the capability of running multiple instances of the protocol run, which makes the security issue even more challenging. The currently available authentication protocols were designed for the autonomous WSN and do not account for the above requirements. Hence, ensuring a lightweight and efficient authentication protocol has become more crucial. In this paper, we present a novel, lightweight and efficient key exchange and authentication protocol suite called the Secure Mobile Sensor Network (SMSN) Authentication Protocol. In the SMSN a mobile node goes through an initial authentication procedure and receives a re-authentication ticket from the base station. Later a mobile node can use this re-authentication ticket when establishing multiple data exchange sessions and/or when moving across the network. This scheme reduces the communication and computational complexity of the authentication process. We proved the strength of our protocol with rigorous security analysis (including formal analysis using the BAN-logic) and simulated the SMSN and previously proposed schemes in an automated protocol verifier tool. Finally, we compared the computational complexity and communication cost against well-known authentication protocols.

  16. An Authentication Protocol for Future Sensor Networks

    PubMed Central

    Bilal, Muhammad; Kang, Shin-Gak

    2017-01-01

    Authentication is one of the essential security services in Wireless Sensor Networks (WSNs) for ensuring secure data sessions. Sensor node authentication ensures the confidentiality and validity of data collected by the sensor node, whereas user authentication guarantees that only legitimate users can access the sensor data. In a mobile WSN, sensor and user nodes move across the network and exchange data with multiple nodes, thus experiencing the authentication process multiple times. The integration of WSNs with Internet of Things (IoT) brings forth a new kind of WSN architecture along with stricter security requirements; for instance, a sensor node or a user node may need to establish multiple concurrent secure data sessions. With concurrent data sessions, the frequency of the re-authentication process increases in proportion to the number of concurrent connections. Moreover, to establish multiple data sessions, it is essential that a protocol participant have the capability of running multiple instances of the protocol run, which makes the security issue even more challenging. The currently available authentication protocols were designed for the autonomous WSN and do not account for the above requirements. Hence, ensuring a lightweight and efficient authentication protocol has become more crucial. In this paper, we present a novel, lightweight and efficient key exchange and authentication protocol suite called the Secure Mobile Sensor Network (SMSN) Authentication Protocol. In the SMSN a mobile node goes through an initial authentication procedure and receives a re-authentication ticket from the base station. Later a mobile node can use this re-authentication ticket when establishing multiple data exchange sessions and/or when moving across the network. This scheme reduces the communication and computational complexity of the authentication process. We proved the strength of our protocol with rigorous security analysis (including formal analysis using the BAN-logic) and simulated the SMSN and previously proposed schemes in an automated protocol verifier tool. Finally, we compared the computational complexity and communication cost against well-known authentication protocols. PMID:28452937
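
    The ticket mechanism can be sketched with symmetric primitives: the base station binds a node identity and an expiry time under its own key, and later sessions verify the ticket without rerunning the full initial authentication. This is a simplified stand-in for the SMSN construction, not the actual protocol.

    ```python
    import hashlib
    import hmac
    import secrets
    import time

    BS_KEY = secrets.token_bytes(32)  # base-station master key (illustrative)

    def issue_ticket(node_id, lifetime_s=3600):
        """Base station binds (node_id, expiry) under its key after initial auth."""
        expiry = int(time.time()) + lifetime_s
        body = node_id + expiry.to_bytes(8, "big")
        tag = hmac.new(BS_KEY, body, hashlib.sha256).digest()
        return body + tag

    def verify_ticket(ticket, node_id):
        """Lightweight re-authentication: check only the tag and the expiry."""
        body, tag = ticket[:-32], ticket[-32:]
        expected = hmac.new(BS_KEY, body, hashlib.sha256).digest()
        expiry = int.from_bytes(body[-8:], "big")
        return (hmac.compare_digest(tag, expected)
                and body[:-8] == node_id
                and time.time() < expiry)

    ticket = issue_ticket(b"sensor-17")
    assert verify_ticket(ticket, b"sensor-17")
    ```

    Verifying a short MAC is far cheaper than a full key exchange, which is the source of the reduced communication and computational complexity claimed above.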

  17. ALFA: The new ALICE-FAIR software framework

    NASA Astrophysics Data System (ADS)

    Al-Turany, M.; Buncic, P.; Hristov, P.; Kollegger, T.; Kouzinopoulos, C.; Lebedev, A.; Lindenstruth, V.; Manafov, A.; Richter, M.; Rybalchenko, A.; Vande Vyvre, P.; Winckler, N.

    2015-12-01

    The commonalities between the ALICE and FAIR experiments and their computing requirements led to the development of large parts of a common software framework in an experiment-independent way. The FairRoot project has already shown the feasibility of such an approach for the FAIR experiments and extending it beyond FAIR to experiments at other facilities [1, 2]. The ALFA framework is a joint development between the ALICE Online-Offline (O2) and FairRoot teams. ALFA is designed as a flexible, elastic system, which balances reliability and ease of development with performance using multi-processing and multithreading. A message-based approach has been adopted; such an approach will support the use of the software on different hardware platforms, including heterogeneous systems. Each process in ALFA assumes limited communication and reliance on other processes. Such a design will add horizontal scaling (multiple processes) to the vertical scaling provided by multiple threads to meet computing and throughput demands. ALFA does not dictate any application protocols. Potentially, any content-based processor or any source can change the application protocol. The framework supports different serialization standards for data exchange between different hardware and software languages.
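
    The message-based, share-nothing multi-processing model can be mimicked with the Python standard library: independent processes exchanging self-contained messages through a queue. The device roles and message format below are assumptions for illustration; ALFA itself builds on dedicated message transport layers rather than `multiprocessing`.

    ```python
    from multiprocessing import Process, Queue

    def sampler(out):
        """Produces self-contained messages; assumes nothing about consumers."""
        for event_id in range(5):
            out.put({"event": event_id, "payload": list(range(event_id))})
        out.put(None)  # end-of-stream marker

    def processor(inp):
        """Horizontal scaling: more processes like this one could share the queue."""
        while (msg := inp.get()) is not None:
            print(f"event {msg['event']}: {len(msg['payload'])} words")

    if __name__ == "__main__":
        queue = Queue()
        workers = [Process(target=sampler, args=(queue,)),
                   Process(target=processor, args=(queue,))]
        for w in workers:
            w.start()
        for w in workers:
            w.join()
    ```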

  18. Design and Implementation of a Prospective Adult Congenital Heart Disease Biobank.

    PubMed

    Opotowsky, Alexander R; Loukas, Brittani; Ellervik, Christina; Moko, Lilamarie E; Singh, Michael N; Landzberg, Elizabeth I; Rimm, Eric B; Landzberg, Michael J

    2016-11-01

    Adults with congenital heart disease (ACHD) comprise a growing, increasingly complex population. The Boston Adult Congenital Heart Disease Biobank is a program for the collection and storage of biospecimens to provide a sustainable resource for scientific biomarker investigation in ACHD. We describe a protocol to collect, process, and store biospecimens for ACHD or associated diagnoses developed based on existing literature and consultation with cardiovascular biomarker epidemiologists. The protocol involves collecting urine and ∼48.5 mL of blood. A subset of the blood and urine undergoes immediate clinically relevant testing. The remaining biospecimens are processed soon after collection and stored at -80°C as aliquots of ethylenediaminetetraacetic acid (EDTA) and lithium heparin plasma, serum, red cell and buffy coat pellet, and urine supernatant. Including tubes with diverse anticoagulant and clot accelerator contents will enable flexible downstream use. Demographic and clinical data are entered into a database; data on biospecimen collection, processing, and storage are managed by an enterprise laboratory information management system. Since implementation in 2012, we have enrolled more than 650 unique participants (aged 18-80 years, 53.3% women); the Biobank contains over 11,000 biospecimen aliquots. The most common primary CHD diagnoses are single ventricle status-post Fontan procedure (18.8%), repaired tetralogy of Fallot with pulmonary stenosis or atresia (17.6%), and left-sided obstructive lesions (17.5%). We describe the design and implementation of biospecimen collection, handling, and storage protocols with multiple levels of quality assurance. These protocols are feasible and reflect the size and goals of the Boston ACHD Biobank. © The Author(s) 2016.

  19. Efficient multiparty quantum key agreement with collective detection.

    PubMed

    Huang, Wei; Su, Qi; Liu, Bin; He, Yuan-Hang; Fan, Fan; Xu, Bing-Jie

    2017-11-10

    As a burgeoning branch of quantum cryptography, quantum key agreement is a kind of key-establishing process in which the security and fairness of the established common key should be guaranteed simultaneously. However, the difficulty of designing a qualified quantum key agreement protocol increases significantly with the number of involved participants. Thus far, only a few of the existing multiparty quantum key agreement (MQKA) protocols can really achieve security and fairness. Nevertheless, these qualified MQKA protocols are either too inefficient or too impractical. In this paper, an MQKA protocol is proposed with single photons in travelling mode. Since only one eavesdropping detection is needed in the proposed protocol, its qubit efficiency and measurement efficiency are higher than those of the existing ones in theory. Compared with protocols which make use of entangled states or multi-particle measurements, the proposed protocol is more feasible with current technologies. Security and fairness analysis shows that the proposed protocol is not only immune to attacks from external eavesdroppers, but also free from attacks from internal betrayers.

  20. Application of QC_DR software for acceptance testing and routine quality control of direct digital radiography systems: initial experiences using the Italian Association of Physicist in Medicine quality control protocol.

    PubMed

    Nitrosi, Andrea; Bertolini, Marco; Borasi, Giovanni; Botti, Andrea; Barani, Adriana; Rivetti, Stefano; Pierotti, Luisa

    2009-12-01

    Ideally, medical x-ray imaging systems should be designed to deliver maximum image quality at an acceptable radiation risk to the patient. Quality assurance procedures are employed to ensure that these standards are maintained. A quality control protocol for direct digital radiography (DDR) systems is described and discussed. Software to automatically process and analyze the required images was developed. In this paper, initial results obtained on equipment from different DDR manufacturers are reported. The protocol was developed to highlight even small discrepancies in standard operating performance.

  1. Optimal protocols for slowly driven quantum systems.

    PubMed

    Zulkowski, Patrick R; DeWeese, Michael R

    2015-09-01

    The design of efficient quantum information processing will rely on optimal nonequilibrium transitions of driven quantum systems. Building on a recently developed geometric framework for computing optimal protocols for classical systems driven in finite time, we construct a general framework for optimizing the average information entropy for driven quantum systems. Geodesics on the parameter manifold endowed with a positive semidefinite metric correspond to protocols that minimize the average information entropy production in finite time. We use this framework to explicitly compute the optimal entropy production for a simple two-state quantum system coupled to a heat bath of bosonic oscillators, which has applications to quantum annealing.
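
    In this geometric picture the minimized quantity reduces, in the slow-driving limit, to a quadratic form in the control velocities. The following is a sketch consistent with the abstract, with symbols defined here rather than taken from the paper: lambda(t) is the vector of control parameters, tau the protocol duration, and g the positive-semidefinite metric on the parameter manifold.

    ```latex
    \langle \Delta S \rangle \;\approx\; \int_0^{\tau}
      \dot{\lambda}^{\mathsf{T}}\, g(\lambda)\, \dot{\lambda}\,\mathrm{d}t,
    \qquad
    \langle \Delta S \rangle_{\min} \;=\; \frac{\mathcal{L}^{2}}{\tau},
    \qquad
    \mathcal{L} \;=\; \int_0^{\tau}
      \sqrt{\dot{\lambda}^{\mathsf{T}}\, g(\lambda)\, \dot{\lambda}}\;\mathrm{d}t.
    ```

    Minimizers are geodesics of g traversed at constant metric speed, with L the geodesic length between the protocol endpoints, so doubling the protocol duration halves the minimal entropy production.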

  2. Model of a programmable quantum processing unit based on a quantum transistor effect

    NASA Astrophysics Data System (ADS)

    Ablayev, Farid; Andrianov, Sergey; Fetisov, Danila; Moiseev, Sergey; Terentyev, Alexandr; Urmanchev, Andrey; Vasiliev, Alexander

    2018-02-01

    In this paper we propose a model of a programmable quantum processing device realizable with existing nano-photonic technologies. It can be viewed as a basis for new high-performance hardware architectures. Protocols for the physical implementation of the device, based on controlled photon transfer and atomic transitions, are presented. These protocols are designed for executing basic single-qubit and multi-qubit gates forming a universal set. We analyze the possible operation of this quantum computing scheme. We then formalize the physical architecture by a mathematical model of a Quantum Processing Unit (QPU), which we use as a basis for the Quantum Programming Framework. This framework makes it possible to perform universal quantum computations in a multitasking environment.

  3. Software Design Document MCC CSCI (1). Volume 1 Sections 1.0-2.18

    DTIC Science & Technology

    1991-06-01

    Excerpts: AssociationUserProtocol is declared in /simnet/common/include/protocol/p_assoc.h; Primitive is a long (standard C type)… 2.2.1.4.2 ProcessMessage: ProcessMessage processes a message from another process; type describes the message as either one-way, synchronous, or… …Macintosh Consoles. This is sometimes necessary due to normal clock skew, so that operations among the MCC components will remain synchronized.

  4. The Impact of TCARE[R] on Service Recommendation, Use, and Caregiver Well-Being

    ERIC Educational Resources Information Center

    Kwak, Jung; Montgomery, Rhonda J. V.; Kosloski, Karl; Lang, Josh

    2011-01-01

    Purpose of the Study: Findings are reported from a study that examined the effects of the Tailored Caregiver Assessment and Referral (TCARE[R]) protocol, a care management process designed to help family caregivers, on care planning and caregiver outcomes. Design and Methods: A longitudinal, randomized controlled trial was conducted with 97…

  5. Design and Characterization of a Secure Automatic Dependent Surveillance-Broadcast Prototype

    DTIC Science & Technology

    2015-03-26

    Excerpts: the acknowledgements thank Mr. Dave Prentice of AFRL for providing the Aeroflex IFR 6000 baseband signals, upon which many design decisions… Figure 25: Example Aeroflex IFR 6000 signal. Glossary: GPS, Global Positioning System; HDL, hardware description language; I, in-phase; IFR, Instrument Flight Rules; IP, Internet Protocol; IP, intellectual property; IPSec…

  6. Modelling and regulating of cardio-respiratory response for the enhancement of interval training

    PubMed Central

    2014-01-01

    Background: The interval training method is a well-known exercise protocol which helps strengthen and improve one's cardiovascular fitness. Purpose: To develop an effective training protocol to improve cardiovascular fitness based on modelling and analysis of Heart Rate (HR) and Oxygen Uptake (VO2) dynamics. Methods: In order to model the cardiorespiratory response to the onset and offset of exercise, a gas analyzer (K4b2, Cosmed) was used to monitor and record the heart rate and oxygen uptake of ten healthy male subjects. An interval training protocol was developed for young healthy users and was simulated using a proposed RC switching model, which was presented to accommodate the variations of the cardiorespiratory dynamics in response to running exercises. A hybrid system model was presented to describe the adaptation process, and a multi-loop PI control scheme was designed for the tuning of the interval training regime. Results: By observing the original data for each subject, we can clearly identify that all subjects have similar HR and VO2 profiles. The proposed model is capable of simulating the exercise responses during onset and offset exercises; it ensures the continuity of the outputs within the interval training protocol. Under some mild assumptions, a hybrid system model can describe the adaptation process and accordingly a multi-loop PI controller can be designed for the tuning of the interval training protocol. The self-adaptation feature of the proposed controller gives the exerciser the opportunity to reach the desired setpoints after a certain number of training sessions. Conclusions: The established interval training protocol targets a range of 70-80% of HRmax, which is mainly a training zone for the purpose of cardiovascular system development and improvement. Furthermore, the proposed multi-loop feedback controller has the potential to tune the interval training protocol according to feedback from an individual exerciser. PMID:24499131
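
    The control idea can be reduced to a single loop: treat the heart-rate response to running speed as a first-order (RC-like) system and let a PI controller drive HR toward the training zone. All constants below (time constant, gain, controller tuning, subject parameters) are assumptions for illustration, not the identified model or the multi-loop scheme from the paper.

    ```python
    HR_REST, HR_MAX = 70.0, 190.0   # bpm, assumed subject
    TAU, GAIN = 60.0, 30.0          # assumed time constant (s) and bpm per speed unit
    KP, KI = 0.02, 0.0005           # assumed PI tuning

    def simulate(setpoint, t_end=900.0, dt=1.0):
        """Simulate a first-order HR response under PI control of running speed."""
        hr, integral = HR_REST, 0.0
        for _ in range(int(t_end / dt)):
            error = setpoint - hr
            integral += error * dt
            speed = max(0.0, KP * error + KI * integral)  # speed cannot be negative
            hr += dt * (-(hr - HR_REST) + GAIN * speed) / TAU
        return hr

    target = 0.75 * HR_MAX  # middle of the 70-80% HRmax training zone
    print(f"target {target:.0f} bpm, reached {simulate(target):.1f} bpm")
    ```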

  7. A distributed computing model for telemetry data processing

    NASA Astrophysics Data System (ADS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  8. A distributed computing model for telemetry data processing

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  9. Protecting Privacy and Securing the Gathering of Location Proofs - The Secure Location Verification Proof Gathering Protocol

    NASA Astrophysics Data System (ADS)

    Graham, Michelle; Gray, David

    As wireless networks become increasingly ubiquitous, the demand for a method of locating a device has increased dramatically. Location Based Services are now commonplace, but there are few methods of verifying or guaranteeing a location provided by a user without specialised hardware, especially in larger scale networks. We propose a system for the verification of location claims, using proof gathered from neighbouring devices. In this paper we introduce a protocol to protect this proof gathering process, protecting the privacy of all involved parties and securing it from intruders and malicious claiming devices. We present the protocol in stages, extending its security to allow for flexibility within its application. The Secure Location Verification Proof Gathering Protocol (SLVPGP) has been designed to function within the area of Vehicular Networks, although its application could be extended to any device with wireless and cryptographic capabilities.
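
    The core exchange can be sketched as a neighbour authenticating the tuple (claimant, position, timestamp). The snippet below uses a shared-key HMAC purely as a stand-in for whatever signature scheme the SLVPGP actually specifies; the key, identifiers, and message format are all hypothetical.

    ```python
    # Conceptual sketch of gathering one location proof from a neighbour.
    import hashlib, hmac, json, time

    NEIGHBOUR_KEY = b"demo-key-shared-with-verifier"  # hypothetical

    def endorse(claimant_id, position):
        claim = {"id": claimant_id, "pos": position, "t": int(time.time())}
        msg = json.dumps(claim, sort_keys=True).encode()
        proof = hmac.new(NEIGHBOUR_KEY, msg, hashlib.sha256).hexdigest()
        return claim, proof

    def verify(claim, proof):
        msg = json.dumps(claim, sort_keys=True).encode()
        expected = hmac.new(NEIGHBOUR_KEY, msg, hashlib.sha256).hexdigest()
        return hmac.compare_digest(proof, expected)

    claim, proof = endorse("vehicle-42", [53.38, -6.59])
    print(verify(claim, proof))  # True
    ```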

  10. Privacy-Preserving Health Data Collection for Preschool Children

    PubMed Central

    Zhang, Yuan; Ji, Yue

    2013-01-01

    With the development of network technology, more and more data are transmitted over the network, and privacy issues have become a research focus. In this paper, we study privacy in the collection of health data for preschool children and present a new identity-based encryption protocol for privacy protection. The background of the protocol is as follows. A physical examination of preschool children is needed every year out of consideration for the children's health. After the examination, data are transmitted through the Internet to the education authorities for analysis. In the process of data collection, it is unnecessary for the education authorities to know the identities of the children. Based on this, we designed a privacy-preserving protocol that delinks the children's identities from the examination data. Thus, the privacy of the children is preserved during data collection. We present the protocol in detail and prove its correctness. PMID:24285984
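
    The delinking step can be illustrated with a simple pseudonymization sketch: the collecting institution swaps each identity for a random token before transmission and keeps the mapping locally. This is only a conceptual stand-in for the paper's identity-based encryption protocol.

    ```python
    # Illustrative delinking of identities from examination data.
    import secrets

    def delink(records):
        """records: {child_name: exam_data} -> (data to transmit, local link table)."""
        link_table, anonymous = {}, {}
        for name, exam_data in records.items():
            pseudonym = secrets.token_hex(8)  # meaningless to the authority
            link_table[name] = pseudonym      # retained only by the school
            anonymous[pseudonym] = exam_data  # sent for analysis
        return anonymous, link_table

    anonymous, link_table = delink({"alice": {"height_cm": 104}, "bob": {"height_cm": 99}})
    print(anonymous)  # examination data keyed by pseudonyms only
    ```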

  11. Privacy-preserving health data collection for preschool children.

    PubMed

    Guan, Shaopeng; Zhang, Yuan; Ji, Yue

    2013-01-01

    With the development of network technology, more and more data are transmitted over the network, and privacy issues have become a research focus. In this paper, we study privacy in the collection of health data for preschool children and present a new identity-based encryption protocol for privacy protection. The background of the protocol is as follows. A physical examination of preschool children is needed every year out of consideration for the children's health. After the examination, data are transmitted through the Internet to the education authorities for analysis. In the process of data collection, it is unnecessary for the education authorities to know the identities of the children. Based on this, we designed a privacy-preserving protocol that delinks the children's identities from the examination data. Thus, the privacy of the children is preserved during data collection. We present the protocol in detail and prove its correctness.

  12. Design and Analysis of A Beacon-Less Routing Protocol for Large Volume Content Dissemination in Vehicular Ad Hoc Networks.

    PubMed

    Hu, Miao; Zhong, Zhangdui; Ni, Minming; Baiocchi, Andrea

    2016-11-01

    Large volume content dissemination is pursued by a growing number of high quality applications for Vehicular Ad hoc NETworks (VANETs), e.g., the live road surveillance service and the video-based overtaking assistant service. Given the highly dynamic vehicular network topology, beacon-less routing protocols have proven efficient in balancing system performance against control overhead. However, to the authors' best knowledge, routing design for large volume content has not been well considered in previous work, and it introduces new challenges, e.g., an enhanced connectivity requirement for a radio link. In this paper, a link Lifetime-aware Beacon-less Routing Protocol (LBRP) is designed for large volume content delivery in VANETs. Each vehicle makes its forwarding decision based on the message header information and its current state, including speed and position. A semi-Markov process analytical model is proposed to evaluate the expected delay in constructing one routing path for LBRP. Simulations show that the proposed LBRP scheme outperforms traditional dissemination protocols in providing a low end-to-end delay. The analytical model also shows a good match with Monte Carlo simulations on the delay estimation.
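
    The forwarding rule can be sketched as two checks per candidate relay: geographic progress toward the destination and an estimated link lifetime above a threshold. The 1-D road model, lifetime estimate, and constants below are assumptions for illustration, not LBRP's actual metric.

    ```python
    # Hypothetical beacon-less forwarding decision in the spirit of LBRP.
    import math

    RADIO_RANGE_M = 250.0
    MIN_LIFETIME_S = 5.0  # assumed minimum useful link lifetime

    def link_lifetime(sender_pos, sender_speed, my_pos, my_speed):
        """Seconds until the gap (1-D road model) exceeds radio range."""
        gap = abs(my_pos - sender_pos)
        closing_rate = abs(my_speed - sender_speed)
        return math.inf if closing_rate == 0 else (RADIO_RANGE_M - gap) / closing_rate

    def should_forward(sender_pos, sender_speed, my_pos, my_speed, dest_pos):
        progress = abs(dest_pos - my_pos) < abs(dest_pos - sender_pos)
        stable = link_lifetime(sender_pos, sender_speed, my_pos, my_speed) >= MIN_LIFETIME_S
        return progress and stable

    print(should_forward(0.0, 30.0, 120.0, 28.0, 2000.0))  # True: progress, stable link
    ```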

  13. Design and Analysis of A Beacon-Less Routing Protocol for Large Volume Content Dissemination in Vehicular Ad Hoc Networks

    PubMed Central

    Hu, Miao; Zhong, Zhangdui; Ni, Minming; Baiocchi, Andrea

    2016-01-01

    Large volume content dissemination is pursued by a growing number of high quality applications for Vehicular Ad hoc NETworks (VANETs), e.g., the live road surveillance service and the video-based overtaking assistant service. Given the highly dynamic vehicular network topology, beacon-less routing protocols have proven efficient in balancing system performance against control overhead. However, to the authors’ best knowledge, routing design for large volume content has not been well considered in previous work, and it introduces new challenges, e.g., an enhanced connectivity requirement for a radio link. In this paper, a link Lifetime-aware Beacon-less Routing Protocol (LBRP) is designed for large volume content delivery in VANETs. Each vehicle makes its forwarding decision based on the message header information and its current state, including speed and position. A semi-Markov process analytical model is proposed to evaluate the expected delay in constructing one routing path for LBRP. Simulations show that the proposed LBRP scheme outperforms traditional dissemination protocols in providing a low end-to-end delay. The analytical model also shows a good match with Monte Carlo simulations on the delay estimation. PMID:27809285

  14. Optimization of protocol design: a path to efficient, lower cost clinical trial execution

    PubMed Central

    Malikova, Marina A

    2016-01-01

    Managing clinical trials requires strategic planning and efficient execution. In order to achieve timely delivery of important clinical trial outcomes, it is useful to establish standardized trial management guidelines and develop a robust scoring methodology for evaluating study protocol complexity. This review explores the challenges clinical teams face in developing protocols to ensure that the right patients are enrolled and the right data are collected to demonstrate that a drug is safe and efficacious, while managing study costs and study complexity based on a proposed comprehensive scoring model. Key factors to consider when developing protocols and techniques to minimize complexity are discussed, along with a methodology to identify processes at the planning phase and approaches to increase fiscal return and mitigate fiscal compliance risk for clinical trials. PMID:28031939

  15. Simulating Autonomous Telecommunication Networks for Space Exploration

    NASA Technical Reports Server (NTRS)

    Segui, John S.; Jennings, Esther H.

    2008-01-01

    Currently, most interplanetary telecommunication systems require human intervention for command and control. However, considering the range from near Earth to deep space missions, combined with the increase in the number of nodes and advancements in processing capabilities, the benefits from communication autonomy will be immense. Likewise, greater mission science autonomy brings the need for unscheduled, unpredictable communication and network routing. While the terrestrial Internet protocols are highly developed, their suitability for space exploration has been questioned. JPL has developed the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) tool to help characterize network designs and protocols. The results will allow future mission planners to better understand the trade-offs of communication protocols. This paper discusses various issues with interplanetary networking and presents simulation results for interplanetary networking protocols.

  16. Protocol for the process evaluation of a complex intervention designed to increase the use of research in health policy and program organisations (the SPIRIT study).

    PubMed

    Haynes, Abby; Brennan, Sue; Carter, Stacy; O'Connor, Denise; Schneider, Carmen Huckel; Turner, Tari; Gallego, Gisselle

    2014-09-27

    Process evaluation is vital for understanding how interventions function in different settings, including if and why they have different effects or do not work at all. This is particularly important in trials of complex interventions in 'real world' organisational settings where causality is difficult to determine. Complexity presents challenges for process evaluation, and process evaluations that tackle complexity are rarely reported. This paper presents the detailed protocol for a process evaluation embedded in a randomised trial of a complex intervention known as SPIRIT (Supporting Policy In health with Research: an Intervention Trial). SPIRIT aims to build capacity for using research in health policy and program agencies. We describe the flexible and pragmatic methods used for capturing, managing and analysing data across three domains: (a) the intervention as it was implemented; (b) how people participated in and responded to the intervention; and (c) the contextual characteristics that mediated this relationship and may influence outcomes. Qualitative and quantitative data collection methods include purposively sampled semi-structured interviews at two time points, direct observation and coding of intervention activities, and participant feedback forms. We provide examples of the data collection and data management tools developed. This protocol provides a worked example of how to embed process evaluation in the design and evaluation of a complex intervention trial. It tackles complexity in the intervention and its implementation settings. To our knowledge, it is the only detailed example of the methods for a process evaluation of an intervention conducted as part of a randomised trial in policy organisations. We identify strengths and weaknesses, and discuss how the methods are functioning during early implementation. Using 'insider' consultation to develop methods is enabling us to optimise data collection while minimising discomfort and burden for participants. Embedding the process evaluation within the trial design is facilitating access to data, but may impair participants' willingness to talk openly in interviews. While it is challenging to evaluate the process of conducting a randomised trial of a complex intervention, our experience so far suggests that it is feasible and can add considerably to the knowledge generated.

  17. Improving Conduct and Feasibility of Clinical Trials to Evaluate Antibacterial Drugs to Treat Hospital-Acquired Bacterial Pneumonia and Ventilator-Associated Bacterial Pneumonia: Recommendations of the Clinical Trials Transformation Initiative Antibacterial Drug Development Project Team.

    PubMed

    Knirsch, Charles; Alemayehu, Demissie; Botgros, Radu; Comic-Savic, Sabrina; Friedland, David; Holland, Thomas L; Merchant, Kunal; Noel, Gary J; Pelfrene, Eric; Reith, Christina; Santiago, Jonas; Tiernan, Rosemary; Tenearts, Pamela; Goldsack, Jennifer C; Fowler, Vance G

    2016-08-15

    Hospital-acquired and ventilator-associated bacterial pneumonia (HABP/VABP) are often caused by multidrug-resistant organisms. The evaluation of new antibacterial drugs for efficacy in this population is important, as many antibacterial drugs have demonstrated limitations when studied in this population. HABP/VABP trials are expensive and challenging to conduct due to protocol complexity and low patient enrollment, among other factors. The Clinical Trials Transformation Initiative (CTTI) seeks to advance antibacterial drug development by streamlining HABP/VABP clinical trials to improve efficiency and feasibility while maintaining ethical rigor, patient safety, information value, and scientific validity. In 2013, CTTI engaged a multidisciplinary group of experts to discuss challenges impeding the conduct of HABP/VABP trials. Separate workstreams identified challenges associated with HABP/VABP protocol complexity. The Project Team developed potential solutions to streamline HABP/VABP trials using a Quality by Design approach. CTTI recommendations focus on 4 key areas to improve HABP/VABP trials: informed consent processes/practices, protocol design, choice of an institutional review board (IRB), and trial outcomes. Informed consent processes should include legally authorized representatives. Protocol design decisions should focus on eligibility criteria, prestudy antibacterial therapy considerations, use of new diagnostics, and sample size. CTTI recommends that sponsors use a central IRB and discuss trial endpoints with regulators, including defining a clinical failure and evaluating the impact of concomitant antibacterial drugs. Streamlining HABP/VABP trials by addressing key protocol elements can improve trial startup and patient recruitment/retention, reduce trial complexity and costs, and ensure patient safety while advancing antibacterial drug development.

  18. Computer-aided dental prostheses construction using reverse engineering.

    PubMed

    Solaberrieta, E; Minguez, R; Barrenetxea, L; Sierra, E; Etxaniz, O

    2014-01-01

    The implementation of computer-aided design/computer-aided manufacturing (CAD/CAM) systems with virtual articulators, which take into account the kinematics, constitutes a breakthrough in the construction of customised dental prostheses. This paper presents a multidisciplinary protocol involving CAM techniques to produce dental prostheses. This protocol includes a step-by-step procedure using innovative reverse engineering technologies to transform completely virtual design processes into customised prostheses. A special emphasis is placed on a novel method that permits a virtual location of the models. The complete workflow includes the optical scanning of the patient, the use of reverse engineering software and, if necessary, the use of rapid prototyping to produce CAD temporary prostheses.

  19. AHTD cracking protocol application with automated distress survey for design and management.

    DOT National Transportation Integrated Search

    2011-03-09

    Manual surveys of pavement cracking have problems associated with variability, repeatability, processing speed, and cost. If conducted in the field, safety and related liability of manual survey present challenges to highway agencies. Therefore a...

  20. Relevant Telecomputing Activities.

    ERIC Educational Resources Information Center

    Ross, Patricia

    1995-01-01

    Discusses the use of telecomputing in classrooms. Topics include telecomputing goals; use of the Internet; language arts and music FTP (file transfer protocol) sites; social studies FTP sites; science Telnet sites; social studies Telnet sites; skill building and learning processes; and instructional design. (LRW)

  1. Developing and Evaluating the GriefLink Web Site: Processes, Protocols, Dilemmas and Lessons Learned

    ERIC Educational Resources Information Center

    Clark, Sheila; Burgess, Teresa; Laven, Gillian; Bull, Michael; Marker, Julie; Browne, Eric

    2004-01-01

    Despite a profusion of recommendations regarding the quality of web sites and guidelines related to ethical issues surrounding health-related sites, there is little guidance for the design and evaluation of sites relating to loss and grief. This article, which addresses these deficiencies, results from a community consultation process of designing…

  2. Cross-layer protocols optimized for real-time multimedia services in energy-constrained mobile ad hoc networks

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    2003-07-01

    Mobile ad hoc networking (MANET) supports self-organizing, mobile infrastructures and enables an autonomous network of mobile nodes that can operate without a wired backbone. Ad hoc networks are characterized by multihop, wireless connectivity via packet radios and by the need for efficient dynamic protocols. All routers are mobile and can establish connectivity with other nodes only when they are within transmission range. Importantly, ad hoc wireless nodes are resource-constrained, having limited processing, memory, and battery capacity. Delivery of high quality-of-service (QoS), real-time multimedia services from Internet-based applications over a MANET is a challenge not yet achieved by proposed Internet Engineering Task Force (IETF) ad hoc network protocols in terms of standard performance metrics such as end-to-end throughput, packet error rate, and delay. In the distributed operations of route discovery and maintenance, strong interaction occurs across MANET protocol layers, in particular, the physical, media access control (MAC), network, and application layers. The QoS requirements are specified for the service classes by the application layer. The cross-layer design must also satisfy the battery-limited energy constraints by minimizing the distributed power consumption at the nodes and of selected routes. Interactions across the layers are modeled in terms of the set of concatenated design parameters including associated energy costs. Functional dependencies of the QoS metrics are described in terms of the concatenated control parameters. New cross-layer designs are sought that optimize layer interdependencies to achieve the "best" QoS available in an energy-constrained, time-varying network. The protocol design, based on a reactive MANET protocol, adapts the provisioned QoS to dynamic network conditions and residual energy capacities. The cross-layer optimization is based on stochastic dynamic programming conditions derived from time-dependent models of MANET packet flows. Regulation of network behavior is modeled by the optimal control of the conditional rates of multivariate point processes (MVPPs); these rates depend on the concatenated control parameters through a change of probability measure. The MVPP models capture behavior of many service applications, e.g., voice, video and the self-similar behavior of Internet data sessions. Performance verification of the cross-layer protocols, derived from the dynamic programming conditions, can be achieved by embedding the conditions in a reactive routing protocol for MANETs, in a simulation environment, such as the wireless extension of ns-2. A canonical MANET scenario consists of a distributed collection of battery-powered laptops or hand-held terminals, capable of hosting multimedia applications. Simulation details and performance tradeoffs, not presented, remain for a sequel to the paper.

  3. Research protocol for the Picture Talk Project: a qualitative study on research and consent with remote Australian Aboriginal communities

    PubMed Central

    Fitzpatrick, Emily F M; Carter, Maureen; Oscar, June; Lawford, Tom; Martiniuk, Alexandra L C; D’Antoine, Heather A; Elliott, Elizabeth J

    2017-01-01

    Introduction Research with Indigenous populations is not always designed with cultural sensitivity. Few publications evaluate or describe in detail seeking consent for research with Indigenous participants. When potential participants are not engaged in a culturally respectful manner, participation rates and research quality can be adversely affected. It is unethical to proceed with research without truly informed consent. Methods and analysis We describe a culturally appropriate research protocol that is invited by Aboriginal communities of the Fitzroy Valley in Western Australia. The Picture Talk Project is a research partnership with local Aboriginal leaders who are also chief investigators. We will interview Aboriginal leaders about research, community engagement and the consent process and hold focus groups with Aboriginal community members about individual consent. Cultural protocols will be applied to recruit and conduct research with participants. Transcripts will be analysed using NVivo10 qualitative software and themes synthesised to highlight the key issues raised by the community about the research process. This protocol will guide future research with the Aboriginal communities of the Fitzroy Valley and may inform the approach to research with other Indigenous communities of Australia or the world. It must be noted that no community is the same and all research requires local consultation and input. To conduct culturally sensitive research, respected local people from the community who have knowledge of cultural protocol and language are engaged to guide each step of the research process from the project design to the delivery of results. Ethics and dissemination Ethics approval was granted by the University of Sydney Human Research Ethics Committee (No. 2012/348, reference:14760), the Western Australia Country Health Service Ethics Committee (No. 2012:15), the Western Australian Aboriginal Health Ethics Committee and reviewed by the Kimberley Aboriginal Health Planning Forum Research Sub-Committee (No. 2012–008). Results will be disseminated through peer review articles, a local Fitzroy Valley report and conference presentations. PMID:29288181

  4. LINCS: Livermore's network architecture. [Octopus computing network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fletcher, J.G.

    1982-01-01

    Octopus, a local computing network that has been evolving at the Lawrence Livermore National Laboratory for over fifteen years, is currently undergoing a major revision. The primary purpose of the revision is to consolidate and redefine the variety of conventions and formats, which have grown up over the years, into a single standard family of protocols, the Livermore Interactive Network Communication Standard (LINCS). This standard treats the entire network as a single distributed operating system such that access to a computing resource is obtained in a single way, whether that resource is local (on the same computer as the accessing process) or remote (on another computer). LINCS encompasses not only communication but also such issues as the relationship of customer to server processes and the structure, naming, and protection of resources. The discussion includes: an overview of the Livermore user community and computing hardware, the functions and structure of each of the seven layers of LINCS protocol, the reasons why we have designed our own protocols, and why we are dissatisfied with the directions that current protocol standards are taking.

  5. Design and Analysis of Optimization Algorithms to Minimize Cryptographic Processing in BGP Security Protocols.

    PubMed

    Sriram, Vinay K; Montgomery, Doug

    2017-07-01

    The Internet is subject to attacks due to vulnerabilities in its routing protocols. One proposed approach to attain greater security is to cryptographically protect network reachability announcements exchanged between Border Gateway Protocol (BGP) routers. This study proposes and evaluates the performance and efficiency of various optimization algorithms for validation of digitally signed BGP updates. In particular, this investigation focuses on the BGPSEC (BGP with SECurity extensions) protocol, currently under consideration for standardization in the Internet Engineering Task Force. We analyze three basic BGPSEC update processing algorithms: Unoptimized, Cache Common Segments (CCS) optimization, and Best Path Only (BPO) optimization. We further propose and study cache management schemes to be used in conjunction with the CCS and BPO algorithms. The performance metrics used in the analyses are: (1) routing table convergence time after BGPSEC peering reset or router reboot events and (2) peak-second signature verification workload. Both analytical modeling and detailed trace-driven simulation were performed. Results show that the BPO algorithm is 330% to 628% faster than the unoptimized algorithm for routing table convergence in a typical Internet core-facing provider edge router.
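
    The intuition behind the Cache Common Segments optimization can be illustrated with a small cache keyed by a hash of each signed path segment: segments shared across many updates are cryptographically verified once and the result is reused. The verification stub and cache policy below are illustrative, not the paper's implementation.

    ```python
    # Illustrative Cache Common Segments (CCS) style reuse of verifications.
    import hashlib

    _cache = {}

    def verify_segment(segment_bytes):
        """Stand-in for expensive cryptographic validation of one segment."""
        return True  # assume valid in this sketch

    def verify_update(signed_segments):
        for seg in signed_segments:
            key = hashlib.sha256(seg).hexdigest()
            if key not in _cache:          # verified once, cached thereafter
                _cache[key] = verify_segment(seg)
            if not _cache[key]:
                return False
        return True

    # Two updates sharing the first segment: it is only verified once.
    print(verify_update([b"AS65001-sig", b"AS65002-sig"]))
    print(verify_update([b"AS65001-sig", b"AS65003-sig"]))
    ```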

  6. An approach to verification and validation of a reliable multicasting protocol: Extended Abstract

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.

    1995-01-01

    This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. This initial version did not handle off-nominal cases such as network partitions or site failures. Meanwhile, the V&V team concurrently developed a formal model of the requirements using a variant of SCR-based state tables. Based on these requirements tables, the V&V team developed test cases to exercise the implementation. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test in the model and implementation agreed, then the test either found a potential problem or verified a required behavior. However, if the execution of a test was different in the model and implementation, then the differences helped identify inconsistencies between the model and implementation. In either case, the dialogue between both teams drove the co-evolution of the model and implementation. We have found that this interactive, iterative approach to development allows software designers to focus on delivery of nominal functionality while the V&V team can focus on analysis of off nominal cases. Testing serves as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP. Although RMP has provided our research effort with a rich set of test cases, it also has practical applications within NASA. For example, RMP is being considered for use in the NASA EOSDIS project due to its significant performance benefits in applications that need to replicate large amounts of data to many network sites.

  7. A Survey on Underwater Acoustic Sensor Network Routing Protocols.

    PubMed

    Li, Ning; Martínez, José-Fernán; Meneses Chaus, Juan Manuel; Eckert, Martina

    2016-03-22

    Underwater acoustic sensor networks (UASNs) have become more and more important in ocean exploration applications, such as ocean monitoring, pollution detection, ocean resource management, underwater device maintenance, etc. In underwater acoustic sensor networks, since the routing protocol guarantees reliable and effective data transmission from the source node to the destination node, routing protocol design is an attractive topic for researchers. Many routing algorithms have been proposed in recent years. To present the current state of development of UASN routing protocols, we review herein the UASN routing protocol designs reported in recent years. In this paper, all the routing protocols have been classified into different groups according to their characteristics and routing algorithms, such as the non-cross-layer design routing protocol, the traditional cross-layer design routing protocol, and the intelligent algorithm based routing protocol. This is also the first paper that introduces intelligent algorithm-based UASN routing protocols. In addition, in this paper, we investigate the development trends of UASN routing protocols, which can provide researchers with clear and direct insights for further research.

  8. A Survey on Underwater Acoustic Sensor Network Routing Protocols

    PubMed Central

    Li, Ning; Martínez, José-Fernán; Meneses Chaus, Juan Manuel; Eckert, Martina

    2016-01-01

    Underwater acoustic sensor networks (UASNs) have become more and more important in ocean exploration applications, such as ocean monitoring, pollution detection, ocean resource management, underwater device maintenance, etc. In underwater acoustic sensor networks, since the routing protocol guarantees reliable and effective data transmission from the source node to the destination node, routing protocol design is an attractive topic for researchers. Many routing algorithms have been proposed in recent years. To present the current state of development of UASN routing protocols, we review herein the UASN routing protocol designs reported in recent years. In this paper, all the routing protocols have been classified into different groups according to their characteristics and routing algorithms, such as the non-cross-layer design routing protocol, the traditional cross-layer design routing protocol, and the intelligent algorithm based routing protocol. This is also the first paper that introduces intelligent algorithm-based UASN routing protocols. In addition, in this paper, we investigate the development trends of UASN routing protocols, which can provide researchers with clear and direct insights for further research. PMID:27011193

  9. Evaluating the Healthy Start program. Design development to evaluative assessment.

    PubMed

    Raykovich, K S; McCormick, M C; Howell, E M; Devaney, B L

    1996-09-01

    The national evaluation of the federally funded Healthy Start program involved translating a design for a process and outcomes evaluation and standard maternal and infant data set, both developed prior to the national evaluation contract award, into an evaluation design and client data collection protocol that could be used to evaluate 15 diverse grantees. This article discusses the experience of creating a process and outcomes evaluation design that was both substantively and methodologically appropriate given such issues as the diversity of grantees and their community-based intervention strategies; the process of accessing secondary data sources, including vital records; the quality of client level data submissions; and the need to incorporate both qualitative and quantitative approaches into the evaluation design. The relevance of this experience for the conduct of other field studies of public health interventions is discussed.

  10. A web service system supporting three-dimensional post-processing of medical images based on WADO protocol.

    PubMed

    He, Longjun; Xu, Lang; Ming, Xing; Liu, Qian

    2015-02-01

    Three-dimensional post-processing operations on the volume data generated by a series of CT or MR images have important significance for image reading and diagnosis. As a part of the DICOM standard, the WADO service defines how to access DICOM objects on the Web, but it does not address three-dimensional post-processing operations on series images. This paper analyzed the technical features of three-dimensional post-processing operations on volume data, and then designed and implemented a web service system for three-dimensional post-processing of medical images based on the WADO protocol. To improve the scalability of the proposed system, the business tasks and calculation operations were separated into two modules. The results showed that the proposed system could support three-dimensional post-processing of medical images for multiple clients at the same time, meeting the demand for accessing three-dimensional post-processing operations on volume data over the web.
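
    For context, a plain WADO-URI retrieval is a single HTTP GET carrying a request type, study/series/object UIDs, and a content type (DICOM PS3.18); the paper's three-dimensional post-processing operations sit on top of this as a custom extension and are not sketched here. The host and UIDs below are placeholders.

    ```python
    # Minimal WADO-URI object retrieval; host and UIDs are placeholders.
    import urllib.parse, urllib.request

    params = {
        "requestType": "WADO",
        "studyUID": "1.2.840.113619.2.55.3",
        "seriesUID": "1.2.840.113619.2.55.3.1",
        "objectUID": "1.2.840.113619.2.55.3.1.1",
        "contentType": "application/dicom",
    }
    url = "http://pacs.example.org/wado?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as resp:  # response body is the DICOM object
        dicom_bytes = resp.read()
    print(len(dicom_bytes))
    ```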

  11. Advanced teleprocessing systems

    NASA Astrophysics Data System (ADS)

    Kleinrock, L.; Gerla, M.

    1982-09-01

    This Annual Technical Report covers research conducted from October 1, 1981 to September 30, 1982. This contract has three primary designated research areas: packet radio systems, resource sharing and allocation, and distributed processing and control. This report contains abstracts of publications which summarize research results in these areas, followed by the main body of the report, which is devoted to a study of channel access protocols that are executed by the nodes of a network to schedule their transmissions on a multi-access broadcast channel. In particular, the main body consists of a Ph.D. dissertation, Channel Access Protocols for Multi-Hop Broadcast Packet Radio Networks. This work discusses some new channel access protocols useful for mobile radio networks. Included is an analysis of slotted ALOHA and some tight bounds on the performance of all possible protocols in a mobile environment.
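
    The slotted ALOHA analysis mentioned here rests on a classic closed form: with Poisson offered load G, a slot carries exactly one transmission (a success) with probability S = G * exp(-G), which peaks at 1/e (about 0.368) when G = 1.

    ```python
    # Classic slotted ALOHA throughput: S = G * exp(-G).
    import math

    def slotted_aloha_throughput(G):
        return G * math.exp(-G)

    for G in (0.5, 1.0, 2.0):
        print(f"G={G:.1f}  S={slotted_aloha_throughput(G):.3f}")
    # Peak throughput is 1/e ~ 0.368 at G = 1.
    ```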

  12. Baseband-processed SS-TDMA communication system architecture and design concepts

    NASA Technical Reports Server (NTRS)

    Attwood, S.; Sabourin, D.

    1982-01-01

    The architecture and system design for a commercial satellite communications system planned for the 1990's was developed by Motorola for NASA's Lewis Research Center. The system provides data communications between individual users via trunking and customer premises service terminals utilizing a central switching satellite operating in a time-division multiple-access (TDMA) mode. The major elements of the design incorporating baseband processing include: demand-assigned multiple access reservation protocol, spectral utilization, system synchronization, modulation technique and forward error control implementation. Motorola's baseband processor design, which is being proven in a proof-of-concept advanced technology development, will perform data regeneration and message routing for individual users on-board the spacecraft.

  13. Ultrasensitive detection of nucleic acids by template enhanced hybridization followed by rolling circle amplification and catalytic hairpin assembly.

    PubMed

    Song, Weiling; Zhang, Qiao; Sun, Wenbo

    2015-02-11

    An ultrasensitive protocol for fluorescent detection of DNA is designed by combining the template enhanced hybridization process (TEHP) with Rolling Circle Amplification (RCA) and Catalytic Hairpin Assembly (CHA), showing a remarkable amplification efficiency.

  14. Multimedia Networks: Mission Impossible?

    ERIC Educational Resources Information Center

    Weiss, Andrew M.

    1996-01-01

    Running multimedia on a network, often difficult because of the memory and processing power required, is becoming easier thanks to new protocols and products. Those developing network design criteria may wish to consider making use of Fast Ethernet, Asynchronous Transfer Mode (ATM), switches, "fat pipes", additional network…

  15. A Comparison of Five FMRI Protocols for Mapping Speech Comprehension Systems

    PubMed Central

    Binder, Jeffrey R.; Swanson, Sara J.; Hammeke, Thomas A.; Sabsevitz, David S.

    2008-01-01

    Aims Many fMRI protocols for localizing speech comprehension have been described, but there has been little quantitative comparison of these methods. We compared five such protocols in terms of areas activated, extent of activation, and lateralization. Methods FMRI BOLD signals were measured in 26 healthy adults during passive listening and active tasks using words and tones. Contrasts were designed to identify speech perception and semantic processing systems. Activation extent and lateralization were quantified by counting activated voxels in each hemisphere for each participant. Results Passive listening to words produced bilateral superior temporal activation. After controlling for pre-linguistic auditory processing, only a small area in the left superior temporal sulcus responded selectively to speech. Active tasks engaged an extensive, bilateral attention and executive processing network. Optimal results (consistent activation and strongly lateralized pattern) were obtained by contrasting an active semantic decision task with a tone decision task. There was striking similarity between the network of brain regions activated by the semantic task and the network of brain regions that showed task-induced deactivation, suggesting that semantic processing occurs during the resting state. Conclusions FMRI protocols for mapping speech comprehension systems differ dramatically in pattern, extent, and lateralization of activation. Brain regions involved in semantic processing were identified only when an active, non-linguistic task was used as a baseline, supporting the notion that semantic processing occurs whenever attentional resources are not controlled. Identification of these lexical-semantic regions is particularly important for predicting language outcome in patients undergoing temporal lobe surgery. PMID:18513352
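
    Lateralization from per-hemisphere voxel counts is conventionally summarized with a laterality index, LI = (L - R) / (L + R), ranging from -1 (fully right-lateralized) to +1 (fully left-lateralized). The abstract reports counting activated voxels in each hemisphere but does not spell out its index, so the formula below is the standard convention rather than necessarily the paper's exact measure.

    ```python
    # Standard laterality index from activated voxel counts per hemisphere.
    def laterality_index(left_voxels, right_voxels):
        total = left_voxels + right_voxels
        return 0.0 if total == 0 else (left_voxels - right_voxels) / total

    print(laterality_index(1200, 400))  # 0.5 -> strongly left-lateralized
    ```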

  16. GLobal Integrated Design Environment

    NASA Technical Reports Server (NTRS)

    Kunkel, Matthew; McGuire, Melissa; Smith, David A.; Gefert, Leon P.

    2011-01-01

    The GLobal Integrated Design Environment (GLIDE) is a collaborative engineering application built to resolve the design session issues of real-time passing of data between multiple discipline experts in a collaborative environment. Utilizing Web protocols and multiple programming languages, GLIDE allows engineers to use the applications to which they are accustomed (in this case, Excel) to send and receive datasets via the Internet to a database-driven Web server. Traditionally, a collaborative design session consists of one or more engineers representing each discipline meeting together in a single location. The discipline leads exchange parameters and iterate through their respective processes to converge on an acceptable dataset. In cases in which the engineers are unable to meet, their parameters are passed via e-mail, telephone, facsimile, or even postal mail. This slow process of data exchange could stretch a design session to weeks or even months. While the iterative process remains in place, software can now exchange parameters securely and efficiently, while at the same time allowing much more information about a design session to be made available. GLIDE is written in a combination of several programming languages, including REALbasic, PHP, and Microsoft Visual Basic. GLIDE client installers are available to download for both Microsoft Windows and Macintosh systems. The GLIDE client software is compatible with Microsoft Excel 2000 or later on Windows systems, and with Microsoft Excel X or later on Macintosh systems. GLIDE follows the client-server paradigm, transferring encrypted and compressed data via standard Web protocols. Currently, the engineers use Excel as a front end to the GLIDE Client, as many of their custom tools run in Excel.
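
    The pattern GLIDE implements, posting named parameters to a database-driven web server and reading them back over HTTP, can be sketched as below. The endpoint, field names, and JSON encoding are hypothetical illustrations, not GLIDE's actual API.

    ```python
    # Hypothetical parameter push/pull against a database-backed web server.
    import json, urllib.parse, urllib.request

    SERVER = "https://glide.example.org/api/parameters"  # placeholder URL

    def push_parameter(session, name, value):
        body = json.dumps({"session": session, "name": name, "value": value}).encode()
        req = urllib.request.Request(
            SERVER, data=body, headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return resp.status

    def pull_parameter(session, name):
        url = SERVER + "?" + urllib.parse.urlencode({"session": session, "name": name})
        with urllib.request.urlopen(url) as resp:
            return json.load(resp)["value"]

    # e.g. push_parameter("study-7", "dry_mass_kg", 812.0)
    ```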

  17. Rosetta:MSF: a modular framework for multi-state computational protein design.

    PubMed

    Löffler, Patrick; Schmitz, Samuel; Hupfeld, Enrico; Sterner, Reinhard; Merkl, Rainer

    2017-06-01

    Computational protein design (CPD) is a powerful technique to engineer existing proteins or to design novel ones that display desired properties. Rosetta is a software suite including algorithms for computational modeling and analysis of protein structures and offers many elaborate protocols created to solve highly specific tasks of protein engineering. Most of Rosetta's protocols optimize sequences based on a single conformation (i.e., design state). However, challenging CPD objectives like multi-specificity design or the concurrent consideration of positive and negative design goals demand the simultaneous assessment of multiple states. This is why we have developed the multi-state framework MSF, which facilitates the implementation of Rosetta's single-state protocols in a multi-state environment, and made available two frequently used protocols. Utilizing MSF, we demonstrated for one of these protocols that multi-state design yields a 15% higher performance than single-state design on a ligand-binding benchmark consisting of structural conformations. With this protocol, we designed de novo nine retro-aldolases on a conformational ensemble deduced from a (βα)8-barrel protein. All variants displayed measurable catalytic activity, testifying to a high success rate for this concept of multi-state enzyme design.
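
    The multi-state idea can be reduced to a scoring sketch: a candidate sequence is evaluated on every state in the ensemble, rewarding stability on positive-design states and penalizing it on negative ones. The aggregation rule below is an illustrative assumption, not Rosetta:MSF's actual fitness function.

    ```python
    # Illustrative multi-state fitness over an ensemble of design states.
    def multistate_fitness(energies_by_state, positive_states, negative_states):
        """energies_by_state: {state: energy} for one sequence; lower is better."""
        pos = sum(energies_by_state[s] for s in positive_states) / len(positive_states)
        if not negative_states:
            return pos
        neg = sum(energies_by_state[s] for s in negative_states) / len(negative_states)
        return pos - neg  # stabilize targets while destabilizing decoys

    energies = {"conf_A": -210.0, "conf_B": -205.0, "decoy": -190.0}
    print(multistate_fitness(energies, ["conf_A", "conf_B"], ["decoy"]))  # -17.5
    ```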

  18. Rosetta:MSF: a modular framework for multi-state computational protein design

    PubMed Central

    Hupfeld, Enrico; Sterner, Reinhard

    2017-01-01

    Computational protein design (CPD) is a powerful technique to engineer existing proteins or to design novel ones that display desired properties. Rosetta is a software suite including algorithms for computational modeling and analysis of protein structures and offers many elaborate protocols created to solve highly specific tasks of protein engineering. Most of Rosetta’s protocols optimize sequences based on a single conformation (i.e., design state). However, challenging CPD objectives like multi-specificity design or the concurrent consideration of positive and negative design goals demand the simultaneous assessment of multiple states. This is why we have developed the multi-state framework MSF, which facilitates the implementation of Rosetta’s single-state protocols in a multi-state environment, and made available two frequently used protocols. Utilizing MSF, we demonstrated for one of these protocols that multi-state design yields a 15% higher performance than single-state design on a ligand-binding benchmark consisting of structural conformations. With this protocol, we designed de novo nine retro-aldolases on a conformational ensemble deduced from a (βα)8-barrel protein. All variants displayed measurable catalytic activity, testifying to a high success rate for this concept of multi-state enzyme design. PMID:28604768

  19. Design and performance evaluation of a distributed OFDMA-based MAC protocol for MANETs.

    PubMed

    Park, Jaesung; Chung, Jiyoung; Lee, Hyungyu; Lee, Jung-Ryun

    2014-01-01

    In this paper, we propose a distributed MAC protocol for OFDMA-based wireless mobile ad hoc multihop networks, in which the resource reservation and data transmission procedures are operated in a distributed manner. A frame format is designed considering the characteristic of OFDMA that each node can transmit or receive data to or from multiple nodes simultaneously. Under this frame structure, we propose a distributed resource management method including network state estimation and resource reservation processes. We categorize five types of logical errors according to their root causes and show that two of the logical errors are inevitable while three of them are avoided under the proposed distributed MAC protocol. In addition, we provide a systematic method to determine the advertisement period of each node by presenting a clear relation between the accuracy of estimated network states and the signaling overhead. We evaluate the performance of the proposed protocol with respect to the reservation success rate and the data transmission success rate. Since our method focuses on avoiding logical errors, it can easily be placed on top of other resource allocation methods that focus on the physical layer issues of the resource management problem and be interworked with them.

  20. Fixation and Commitment while Designing and Its Measurement

    ERIC Educational Resources Information Center

    Gero, John S.

    2011-01-01

    This paper introduces the notion that fixation and commitment while designing can be measured by studying the protocol of the design session. It is hypothesized that the dynamic entropy of the linkograph of the protocol provides the basis for such a measurement. The hypothesis is empirically tested using a design protocol and the results…
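
    The entropy measurement referred to here can be sketched with Shannon entropy over link outcomes: treat each potential link between design moves as a binary event and compute the entropy of the realized link density (windowed over time in the dynamic version). The computation below shows the basic idea; the paper's exact windowing and measure may differ.

    ```python
    # Shannon entropy of a linkograph's link density (basic, unwindowed form).
    import math
    from itertools import combinations

    def linkograph_entropy(n_moves, links):
        """links: set of (i, j) pairs of connected design moves, i < j."""
        possible = list(combinations(range(n_moves), 2))
        p = len(links) / len(possible)  # realized link density
        if p in (0.0, 1.0):
            return 0.0                  # saturated or empty: no information
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    print(linkograph_entropy(5, {(0, 1), (1, 2), (0, 4)}))  # ~0.881 bits
    ```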

  1. The PD COMM trial: a protocol for the process evaluation of a randomised trial assessing the effectiveness of two types of SLT for people with Parkinson's disease.

    PubMed

    Masterson-Algar, Patricia; Burton, Christopher R; Brady, Marian C; Nicoll, Avril; Clarke, Carl E; Rick, Caroline; Hughes, Max; Au, Pui; Smith, Christina H; Sackley, Catherine M

    2017-08-29

    The PD COMM trial is a phase III multi-centre randomised controlled trial whose aim is to evaluate the effectiveness and cost-effectiveness of two approaches to speech and language therapy (SLT) compared with no SLT intervention (control) for people with Parkinson's disease who have self-reported or carer-reported problems with their speech or voice. Our protocol describes the process evaluation embedded within the outcome evaluation whose aim is to evaluate what happened at the time of the PD COMM intervention implementation and to provide findings that will assist in the interpretation of the PD COMM trial results. Furthermore, the aim of the PD COMM process evaluation is to investigate intervention complexity within a theoretical model of how the trialled interventions might work best and why. Drawing from the Normalization Process Theory and frameworks for implementation fidelity, a mixed method design will be used to address process evaluation research questions. Therapists' and participants' perceptions and experiences will be investigated via in-depth interviews. Critical incident reports, baseline survey data from therapists, treatment record forms and home practice diaries also will be collected at relevant time points throughout the running of the PD COMM trial. Process evaluation data will be analysed independently of the outcome evaluation before the two sets of data are then combined. To date, there are a limited number of published process evaluation protocols, and few are linked to trials investigating rehabilitation therapies. Providing a strong theoretical framework underpinning design choices and being tailored to meet the complex characteristics of the trialled interventions, our process evaluation has the potential to provide valuable insight into which components of the interventions being delivered in PD COMM worked best (and what did not), how they worked well and why. ISRCTN Registry, ISRCTN12421382 . Registered on 18 April 2016.

  2. GLobal Integrated Design Environment (GLIDE): A Concurrent Engineering Application

    NASA Technical Reports Server (NTRS)

    McGuire, Melissa L.; Kunkel, Matthew R.; Smith, David A.

    2010-01-01

    The GLobal Integrated Design Environment (GLIDE) is a client-server software application purpose-built to mitigate issues associated with real-time data sharing in concurrent engineering environments and to facilitate discipline-to-discipline interaction between multiple engineers and researchers. GLIDE is implemented in multiple programming languages utilizing standardized web protocols to enable secure parameter data sharing between engineers and researchers across the Internet in closed and/or widely distributed working environments. A well-defined, HyperText Transfer Protocol (HTTP) based Application Programming Interface (API) to the GLIDE client/server environment enables users to interact with GLIDE, and each other, within common and familiar tools. One such common tool, Microsoft Excel (Microsoft Corporation), paired with its add-in API for GLIDE, is discussed in this paper. The top-level examples given demonstrate how this interface improves the efficiency of the design process of a concurrent engineering study while reducing potential errors associated with manually sharing information between study participants.

  3. Design of a Clinical Information Management System to Support DNA Analysis Laboratory Operation

    PubMed Central

    Dubay, Christopher J.; Zimmerman, David; Popovich, Bradley

    1995-01-01

    The LabDirector system has been developed at the Oregon Health Sciences University to support the operation of our clinical DNA analysis laboratory. Through an iterative design process which has spanned two years, we have produced a system that is both highly tailored to a clinical genetics production laboratory and flexible in its implementation, to support the rapid growth and change of protocols and methodologies in use in the field. The administrative aspects of the system are integrated with an enterprise schedule management system. The laboratory side of the system is driven by a protocol modeling and execution system. The close integration between these two aspects of the clinical laboratory facilitates smooth operations, and allows management to accurately measure costs and performance. The entire application has been designed and documented to provide utility to a wide range of clinical laboratory environments.

  4. A hybrid MAC protocol design for energy-efficient very-high-throughput millimeter wave, wireless sensor communication networks

    NASA Astrophysics Data System (ADS)

    Jian, Wei; Estevez, Claudio; Chowdhury, Arshad; Jia, Zhensheng; Wang, Jianxin; Yu, Jianguo; Chang, Gee-Kung

    2010-12-01

    This paper presents an energy-efficient Medium Access Control (MAC) protocol for very-high-throughput millimeter-wave (mm-wave) wireless sensor communication networks (VHT-MSCNs) based on hybrid multiple access techniques of frequency division multiple access (FDMA) and time division multiple access (TDMA). An energy-efficient Superframe for wireless sensor communication networks employing directional mm-wave wireless access technologies is proposed for systems that require very high throughput, such as high definition video signals, for sensing, processing, transmitting, and actuating functions. Energy consumption modeling for each network element and comparisons among various multi-access technologies in terms of power and MAC layer operations are investigated for evaluating the energy-efficiency improvement of the proposed MAC protocol.
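
    Energy models of this kind typically sum power draw times residence time across radio states over a superframe. The sketch below shows that bookkeeping with illustrative power figures, not the paper's measured values.

    ```python
    # Generic per-superframe node energy model (illustrative numbers).
    def node_energy_j(t_tx_s, t_rx_s, t_sleep_s,
                      p_tx_w=1.2, p_rx_w=0.9, p_sleep_w=0.01):
        return p_tx_w * t_tx_s + p_rx_w * t_rx_s + p_sleep_w * t_sleep_s

    # TDMA-style duty cycling: sleep outside the assigned slot of a 100 ms frame.
    print(node_energy_j(t_tx_s=0.005, t_rx_s=0.010, t_sleep_s=0.085))  # ~0.0159 J
    ```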

  5. Bayesian adaptive survey protocols for resource management

    USGS Publications Warehouse

    Halstead, Brian J.; Wylie, Glenn D.; Coates, Peter S.; Casazza, Michael L.

    2011-01-01

    Transparency in resource management decisions requires a proper accounting of uncertainty at multiple stages of the decision-making process. As information becomes available, periodic review and updating of resource management protocols reduces uncertainty and improves management decisions. One of the most basic steps to mitigating anthropogenic effects on populations is determining if a population of a species occurs in an area that will be affected by human activity. Species are rarely detected with certainty, however, and falsely declaring a species absent can cause improper conservation decisions or even extirpation of populations. We propose a method to design survey protocols for imperfectly detected species that accounts for multiple sources of uncertainty in the detection process, is capable of quantitatively incorporating expert opinion into the decision-making process, allows periodic updates to the protocol, and permits resource managers to weigh the severity of consequences if the species is falsely declared absent. We developed our method using the giant gartersnake (Thamnophis gigas), a threatened species precinctive to the Central Valley of California, as a case study. Survey date was negatively related to the probability of detecting the giant gartersnake, and water temperature was positively related to the probability of detecting the giant gartersnake at a sampled location. Reporting sampling effort, timing and duration of surveys, and water temperatures would allow resource managers to evaluate the probability that the giant gartersnake occurs at sampled sites where it is not detected. This information would also allow periodic updates and quantitative evaluation of changes to the giant gartersnake survey protocol. Because it naturally allows multiple sources of information and is predicated upon the idea of updating information, Bayesian analysis is well-suited to solving the problem of developing efficient sampling protocols for species of conservation concern.
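
    The quantitative core of such a protocol is the occupancy posterior: with prior probability psi that a site is occupied and per-survey detection probability p, the probability the species is present despite n surveys without a detection is psi*(1-p)^n / (psi*(1-p)^n + 1 - psi). The sketch below is this standard calculation with illustrative numbers, not the giant gartersnake estimates.

    ```python
    # Posterior probability of presence after n surveys with no detections.
    def occupancy_posterior(psi, p, n_surveys):
        undetected = psi * (1 - p) ** n_surveys
        return undetected / (undetected + (1 - psi))

    for n in (1, 3, 5):
        print(n, round(occupancy_posterior(psi=0.5, p=0.4, n_surveys=n), 3))
    # Survey effort n can be chosen so this falls below an acceptable
    # false-absence risk before declaring the species absent.
    ```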

  6. Network Configuration Analysis for Formation Flying Satellites

    NASA Technical Reports Server (NTRS)

    Knoblock, Eric J.; Wallett, Thomas M.; Konangi, Vijay K.; Bhasin, Kul B.

    2001-01-01

    The performance of two networks to support autonomous multi-spacecraft formation flying systems is presented. Both systems are comprised of a ten-satellite formation, with one of the satellites designated as the central or 'mother ship.' All data is routed through the mother ship to the terrestrial network. The first system uses a TCP/IP over ATM protocol architecture within the formation, and the second system uses the IEEE 802.11 protocol architecture within the formation. The simulations consist of file transfers using either the File Transfer Protocol (FTP) or the Simple Automatic File Exchange (SAFE) Protocol. The results compare the IP queuing delay, IP queue size and IP processing delay at the mother ship as well as end-to-end delay for both systems. In all cases, using IEEE 802.11 within the formation yields less delay. Also, the throughput exhibited by SAFE is better than FTP.

  7. Design and Methodological Considerations of the Centers for Disease Control and Prevention Urologic and Renal Protocol for the Newborn and Young Child with Spina Bifida

    PubMed Central

    Routh, Jonathan C.; Cheng, Earl Y.; Austin, J. Christopher; Baum, Michelle A.; Gargollo, Patricio C.; Grady, Richard W.; Herron, Adrienne R.; Kim, Steven S.; King, Shelly J.; Koh, Chester J.; Paramsothy, Pangaja; Raman, Lisa; Schechter, Michael S.; Smith, Kathryn A.; Tanaka, Stacy T.; Thibadeau, Judy K.; Walker, William O.; Wallis, M. Chad; Wiener, John S.; Joseph, David B.

    2016-01-01

    Purpose Care of children with spina bifida has significantly advanced in the last half century, resulting in gains in longevity and quality of life for affected children and caregivers. Bladder dysfunction is the norm in patients with spina bifida and may result in infection, renal scarring and chronic kidney disease. However, the optimal urological management for spina bifida related bladder dysfunction is unknown. Materials and Methods In 2012 the Centers for Disease Control and Prevention convened a working group composed of pediatric urologists, nephrologists, epidemiologists, methodologists, community advocates and Centers for Disease Control and Prevention personnel to develop a protocol to optimize urological care of children with spina bifida from the newborn period through age 5 years. Results An iterative quality improvement protocol was selected. In this model participating institutions agree to prospectively treat all newborns with spina bifida using a single consensus based protocol. During the 5-year study period outcomes will be routinely assessed and the protocol adjusted as needed to optimize patient and process outcomes. Primary study outcomes include urinary tract infections, renal scarring, renal function and bladder characteristics. The protocol specifies the timing and use of testing (eg ultrasonography, urodynamics) and interventions (eg intermittent catheterization, prophylactic antibiotics, antimuscarinic medications). Starting in 2014 the Centers for Disease Control and Prevention began funding 9 study sites to implement and evaluate the protocol. Conclusions The Centers for Disease Control and Prevention Urologic and Renal Protocol for the Newborn and Young Child with Spina Bifida began accruing patients in 2015. Assessment in the first 5 years will focus on urinary tract infections, renal function, renal scarring and clinical process improvements. PMID:27475969

  8. Design and Methodological Considerations of the Centers for Disease Control and Prevention Urologic and Renal Protocol for the Newborn and Young Child with Spina Bifida.

    PubMed

    Routh, Jonathan C; Cheng, Earl Y; Austin, J Christopher; Baum, Michelle A; Gargollo, Patricio C; Grady, Richard W; Herron, Adrienne R; Kim, Steven S; King, Shelly J; Koh, Chester J; Paramsothy, Pangaja; Raman, Lisa; Schechter, Michael S; Smith, Kathryn A; Tanaka, Stacy T; Thibadeau, Judy K; Walker, William O; Wallis, M Chad; Wiener, John S; Joseph, David B

    2016-12-01

    Care of children with spina bifida has significantly advanced in the last half century, resulting in gains in longevity and quality of life for affected children and caregivers. Bladder dysfunction is the norm in patients with spina bifida and may result in infection, renal scarring and chronic kidney disease. However, the optimal urological management for spina bifida related bladder dysfunction is unknown. In 2012 the Centers for Disease Control and Prevention convened a working group composed of pediatric urologists, nephrologists, epidemiologists, methodologists, community advocates and Centers for Disease Control and Prevention personnel to develop a protocol to optimize urological care of children with spina bifida from the newborn period through age 5 years. An iterative quality improvement protocol was selected. In this model participating institutions agree to prospectively treat all newborns with spina bifida using a single consensus based protocol. During the 5-year study period outcomes will be routinely assessed and the protocol adjusted as needed to optimize patient and process outcomes. Primary study outcomes include urinary tract infections, renal scarring, renal function and bladder characteristics. The protocol specifies the timing and use of testing (eg ultrasonography, urodynamics) and interventions (eg intermittent catheterization, prophylactic antibiotics, antimuscarinic medications). Starting in 2014 the Centers for Disease Control and Prevention began funding 9 study sites to implement and evaluate the protocol. The Centers for Disease Control and Prevention Urologic and Renal Protocol for the Newborn and Young Child with Spina Bifida began accruing patients in 2015. Assessment in the first 5 years will focus on urinary tract infections, renal function, renal scarring and clinical process improvements. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  9. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    USGS Publications Warehouse

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, validating the method and characterizing the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.
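
    The purge-and-monitor logic such a system automates can be sketched as follows. This is a minimal illustration assuming a hypothetical read_sensors() callback; the stabilization tolerances are placeholders standing in for the criteria in USGS field protocols.

    ```python
    # Keep pumping until consecutive readings of the monitored field
    # parameters stabilize, then record the sample. Tolerances below are
    # illustrative placeholders, not the USGS National Field Manual values.
    TOLERANCES = {                     # max change between consecutive readings
        "specific_conductance": 0.03,  # relative (3%)
        "ph": 0.1,                     # standard units
        "water_temp": 0.2,             # degrees C
        "dissolved_oxygen": 0.3,       # mg/L
    }

    def is_stable(prev, curr):
        """True when every monitored parameter changed less than its tolerance."""
        for key, tol in TOLERANCES.items():
            if key == "specific_conductance":
                if abs(curr[key] - prev[key]) / prev[key] > tol:
                    return False
            elif abs(curr[key] - prev[key]) > tol:
                return False
        return True

    def purge_until_stable(read_sensors, max_readings=30):
        """Poll the sensors until readings stabilize or the purge limit is hit."""
        prev = read_sensors()
        for _ in range(max_readings - 1):
            curr = read_sensors()
            if is_stable(prev, curr):
                return curr            # stable: log this as the sample record
            prev = curr
        raise RuntimeError("well did not stabilize within purge limit")
    ```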

  10. Investigating nurse practitioners in the private sector: a theoretically informed research protocol.

    PubMed

    Adams, Margaret; Gardner, Glenn; Yates, Patsy

    2017-06-01

    To report a study protocol and the theoretical framework, normalisation process theory, that informs this protocol for a case study investigation of private sector nurse practitioners. Most research evaluating nurse practitioner service is focused on public, mainly acute care environments where nurse practitioner service is well established with strong structures for governance and sustainability. Conversely, there is a lack of clarity in governance for emerging models in the private sector. In a climate of healthcare reform, nurse practitioner service is extending beyond the familiar public health sector. Further research is required to inform knowledge of the practice, operational framework and governance of new nurse practitioner models. The proposed research will use a multiple exploratory case study design to examine private sector nurse practitioner service. Data collection includes interviews, surveys and audits. A sequential mixed-method approach to the analysis of each case will be conducted. Findings from within-case analysis will lead to a meta-synthesis across all four cases to gain a holistic understanding of the cases under study: private sector nurse practitioner service. Normalisation process theory will be used to guide the research process, specifically the coding and analysis of data using theory constructs and the relevant components associated with those constructs. This article provides a blueprint for the research and describes the theoretical framework, normalisation process theory, in terms of its flexibility as an analytical framework. Consistent with the goals of best research practice, this study protocol will inform the research community in the field of primary health care about emerging research in this field. Publishing a study protocol ensures researcher fidelity to the analysis plan and supports research collaboration across teams. © 2016 John Wiley & Sons Ltd.

  11. Extra-Vehicular Activity (EVA) glove evaluation test protocol

    NASA Technical Reports Server (NTRS)

    Hinman-Sweeney, E. M.

    1994-01-01

    One of the most critical components of a space suit is the gloves, yet gloves have traditionally presented significant design challenges. With continued efforts at glove development, a method for evaluating glove performance is needed. This paper presents a pressure-glove evaluation protocol. A description of this evaluation protocol, and its development is provided. The protocol allows comparison of one glove design to another, or any one design to bare-handed performance. Gloves for higher pressure suits may be evaluated at current and future design pressures to drive out differences in performance due to pressure effects. Using this protocol, gloves may be evaluated during design to drive out design problems and determine areas for improvement, or fully mature designs may be evaluated with respect to mission requirements. Several different test configurations are presented to handle these cases. This protocol was run on a prototype glove. The prototype was evaluated at two operating pressures and in the unpressurized state, with results compared to bare-handed performance. Results and analysis from this test series are provided, as is a description of the configuration used for this test.

  12. A Natural Language Processing-based Model to Automate MRI Brain Protocol Selection and Prioritization.

    PubMed

    Brown, Andrew D; Marotta, Thomas R

    2017-02-01

    Incorrect imaging protocol selection can contribute to increased healthcare cost and waste. To help healthcare providers improve the quality and safety of medical imaging services, we developed and evaluated three natural language processing (NLP) models to determine whether NLP techniques could be employed to aid in clinical decision support for protocoling and prioritization of magnetic resonance imaging (MRI) brain examinations. To test the feasibility of using an NLP model to support clinical decision making for MRI brain examinations, we designed three different medical imaging prediction tasks, each with a unique outcome: selecting an examination protocol, evaluating the need for contrast administration, and determining priority. We created three models for each prediction task, each using a different classification algorithm-random forest, support vector machine, or k-nearest neighbor-to predict outcomes based on the narrative clinical indications and demographic data associated with 13,982 MRI brain examinations performed from January 1, 2013 to June 30, 2015. Test datasets were used to calculate the accuracy, sensitivity and specificity, predictive values, and the area under the curve. Our optimal results show an accuracy of 82.9%, 83.0%, and 88.2% for the protocol selection, contrast administration, and prioritization tasks, respectively, demonstrating that predictive algorithms can be used to aid in clinical decision support for examination protocoling. NLP models developed from the narrative clinical information provided by referring clinicians and demographic data are feasible methods to predict the protocol and priority of MRI brain examinations. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
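
    One of the nine task-algorithm combinations described above can be sketched with standard tooling: TF-IDF features from the narrative clinical indication feeding a random forest that predicts the examination protocol. The indications, protocol labels and parameters below are invented for illustration; the study's demographic features and tuning are omitted.

    ```python
    # Minimal text-classification sketch: clinical indication -> protocol.
    from sklearn.pipeline import Pipeline
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.ensemble import RandomForestClassifier

    # Hypothetical training examples (real data: 13,982 narrative indications).
    indications = [
        "new onset seizures, rule out mass lesion",
        "chronic headache, no red flags",
        "acute stroke symptoms, left-sided weakness",
    ]
    protocols = ["tumor_protocol", "routine_brain", "stroke_protocol"]

    model = Pipeline([
        ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=1)),
        ("clf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ])
    model.fit(indications, protocols)
    print(model.predict(["sudden weakness and slurred speech"]))
    ```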

  13. A Secure, Intelligent, and Smart-Sensing Approach for Industrial System Automation and Transmission over Unsecured Wireless Networks

    PubMed Central

    Shahzad, Aamir; Lee, Malrey; Xiong, Neal Naixue; Jeong, Gisung; Lee, Young-Keun; Choi, Jae-Young; Mahesar, Abdul Wheed; Ahmad, Iftikhar

    2016-01-01

    In industrial supervisory control and data acquisition (SCADA) systems, the pseudo-transport layer of the distributed network protocol (DNP3) performs the functions of the transport layer and network layer of the open systems interconnection (OSI) model. This study used a simulated design of a water pumping system, in which the network nodes are directly and wirelessly connected with sensors and are monitored by the main controller as part of the wireless SCADA system. This study also focuses on the security issues inherent in the pseudo-transport layer of the DNP3 protocol. During disassembly and reassembly processes, the pseudo-transport layer keeps track of the byte sequence. However, no mechanism is available to verify the message or maintain the integrity of the bytes received from or transmitted to the data link layer, or of the send/respond exchanges between the main controller and sensors. To properly and sequentially keep track of the bytes, a mechanism is required that can perform verification while bytes are received from or transmitted to the lower layer of the DNP3 protocol, or sent to and received from field sensors. For security and byte verification purposes, this study proposes a mechanism for the pseudo-transport layer that employs a cryptographic algorithm. A dynamic choice security buffer (SB) is designed and employed during the security development. To achieve the desired goals of the proposed study, a pseudo-transport layer stack model is designed using the DNP3 protocol open library, and the security is deployed and tested without changing the original design. PMID:26950129

  14. A Secure, Intelligent, and Smart-Sensing Approach for Industrial System Automation and Transmission over Unsecured Wireless Networks.

    PubMed

    Shahzad, Aamir; Lee, Malrey; Xiong, Neal Naixue; Jeong, Gisung; Lee, Young-Keun; Choi, Jae-Young; Mahesar, Abdul Wheed; Ahmad, Iftikhar

    2016-03-03

    In industrial supervisory control and data acquisition (SCADA) systems, the pseudo-transport layer of the distributed network protocol (DNP3) performs the functions of the transport layer and network layer of the open systems interconnection (OSI) model. This study used a simulated design of a water pumping system, in which the network nodes are directly and wirelessly connected with sensors and are monitored by the main controller as part of the wireless SCADA system. This study also focuses on the security issues inherent in the pseudo-transport layer of the DNP3 protocol. During disassembly and reassembly processes, the pseudo-transport layer keeps track of the byte sequence. However, no mechanism is available to verify the message or maintain the integrity of the bytes received from or transmitted to the data link layer, or of the send/respond exchanges between the main controller and sensors. To properly and sequentially keep track of the bytes, a mechanism is required that can perform verification while bytes are received from or transmitted to the lower layer of the DNP3 protocol, or sent to and received from field sensors. For security and byte verification purposes, this study proposes a mechanism for the pseudo-transport layer that employs a cryptographic algorithm. A dynamic choice security buffer (SB) is designed and employed during the security development. To achieve the desired goals of the proposed study, a pseudo-transport layer stack model is designed using the DNP3 protocol open library, and the security is deployed and tested without changing the original design.
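
    The missing integrity check the study addresses can be illustrated generically: append a keyed message-authentication tag to each transport segment so the receiver can verify byte integrity and sequence during reassembly. This sketch uses a plain HMAC construction, not the paper's dynamic security-buffer design; the key and field sizes are assumptions.

    ```python
    # Per-segment integrity and sequence verification with a truncated HMAC.
    import hmac, hashlib

    KEY = b"shared-link-key"            # hypothetical pre-shared key
    TAG_LEN = 8                         # truncated tag to limit overhead

    def protect_segment(seq, payload):
        header = bytes([seq & 0x3F])    # 6-bit sequence number, as in DNP3
        tag = hmac.new(KEY, header + payload, hashlib.sha256).digest()[:TAG_LEN]
        return header + payload + tag

    def verify_segment(segment, expected_seq):
        header, payload = segment[:1], segment[1:-TAG_LEN]
        tag = segment[-TAG_LEN:]
        ok = hmac.compare_digest(
            tag, hmac.new(KEY, header + payload, hashlib.sha256).digest()[:TAG_LEN])
        return ok and header[0] == (expected_seq & 0x3F), payload

    seg = protect_segment(0, b"pump status: ON")
    valid, data = verify_segment(seg, expected_seq=0)
    print(valid, data)
    ```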

  15. Comprehensive protocol of traceability during IVF: the result of a multicentre failure mode and effect analysis.

    PubMed

    Rienzi, L; Bariani, F; Dalla Zorza, M; Albani, E; Benini, F; Chamayou, S; Minasi, M G; Parmegiani, L; Restelli, L; Vizziello, G; Costa, A Nanni

    2017-08-01

    Can traceability of gametes and embryos be ensured during IVF? The use of a simple and comprehensive traceability system that includes the most susceptible phases of the IVF process minimizes the risk of mismatches. Mismatches in IVF are very rare but unfortunately possible, with dramatic consequences for both patients and health care professionals. Traceability is thus a fundamental aspect of the treatment. A clear process of patient and cell identification involving witnessing protocols has to be in place in every unit. To identify potential failures in the traceability process and to develop strategies to mitigate the risk of mismatches, failure mode and effects analysis (FMEA) has previously been used effectively. The FMEA approach is, however, a subjective analysis, strictly related to specific protocols, and thus the results are not always widely applicable. To reduce subjectivity and to obtain a widely applicable, comprehensive protocol of traceability, a multicentre, centrally coordinated FMEA was performed. Seven representative Italian centres (three public and four private) were selected. The study had a duration of 21 months (from April 2015 to December 2016) and was centrally coordinated by a team of experts: a risk analysis specialist, an expert embryologist and a specialist in human factors. Principal investigators of each centre were first instructed about proactive risk assessment and FMEA methodology. A multidisciplinary team to perform the FMEA analysis was then formed in each centre. After mapping the traceability process, each team identified the possible causes of mistakes in their protocol. A risk priority number (RPN) was calculated for each identified potential failure mode. The results of the FMEA analyses were centrally investigated and consistent corrective measures suggested. The teams performed new FMEA analyses after the recommended implementations. In each centre, this study involved the laboratory director, the Quality Control & Quality Assurance officer, embryologists, gynaecologists, nurses and administrative staff. The FMEA analyses were performed according to the Joint Commission International. The FMEA teams identified seven main process phases: oocyte collection, sperm collection, gamete processing, insemination, embryo culture, embryo transfer and gamete/embryo cryopreservation. A mean of 19.3 (SD ± 5.8) associated process steps and 41.9 (SD ± 12.4) possible failure modes were recognized per centre. An RPN ≥15 was calculated in a mean of 6.4 steps (range 2-12, SD ± 3.60). A total of 293 failure modes were centrally analysed, 45 of which were considered at medium/high risk. After implementation of the corrective measures and re-evaluation, a significant reduction in the RPNs was observed in all centres (RPN <15 for all steps). A simple and comprehensive traceability system was designed as the result of the seven FMEA analyses. The validity of FMEA is in general questionable due to the subjectivity of the judgements involved; the design of this study, however, minimized this risk by introducing external experts for the analysis of the FMEA results. Specific situations such as sperm/oocyte donation, import/export and pre-implantation genetic testing were not taken into consideration. Finally, this study is limited to the analysis of failure modes that may lead to mismatches; other possible procedural mistakes are not accounted for.
Every single IVF centre should have a clear and reliable protocol for the identification of patients and the traceability of cells during manipulation. The results of this study can support IVF groups in better recognizing critical steps in their protocols and understanding the identification and witnessing process, in turn enhancing safety by introducing validated corrective measures. This study was designed by the Italian Society of Embryology Reproduction and Research (SIERR) and funded by the Italian National Transplant Centre (CNT) of the Italian National Institute of Health (ISS). The authors have no conflicts of interest. N/A. © The Author 2017. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
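
    The FMEA arithmetic used above is simple to make concrete: a risk priority number is the product of severity, occurrence and detection scores, and steps scoring RPN ≥15 were flagged for corrective action. The failure modes and scores below are invented for illustration.

    ```python
    # RPN = severity x occurrence x detection; flag steps at RPN >= 15.
    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        step: str
        severity: int      # consequence if the failure occurs
        occurrence: int    # how likely it is to occur
        detection: int     # how likely it is to go undetected

        @property
        def rpn(self):
            return self.severity * self.occurrence * self.detection

    modes = [
        FailureMode("oocyte collection: tube mislabelled", 5, 1, 3),
        FailureMode("embryo transfer: witness step skipped", 5, 1, 1),
    ]
    for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
        flag = "REVIEW" if m.rpn >= 15 else "ok"
        print(f"{m.step}: RPN={m.rpn} [{flag}]")
    ```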

  16. Resource use and costs of type 2 diabetes patients receiving managed or protocolized primary care: a controlled clinical trial.

    PubMed

    van der Heijden, Amber A W A; de Bruijne, Martine C; Feenstra, Talitha L; Dekker, Jacqueline M; Baan, Caroline A; Bosmans, Judith E; Bot, Sandra D M; Donker, Gé A; Nijpels, Giel

    2014-06-25

    The increasing prevalence of diabetes is associated with increased health care use and costs. Innovations are needed to improve the quality of care, manage the increasing demand for health care and control the growth of health care costs. The aim of this study is to evaluate the care process and costs of managed, protocolized and usual care for type 2 diabetes patients from a societal perspective. In two distinct regions of the Netherlands, both managed and protocolized diabetes care were implemented. Managed care was characterized by centralized organization, coordination, responsibility and centralized annual assessment. Protocolized care had a partly centralized organizational structure. Usual care was characterized by a decentralized organizational structure. Using a quasi-experimental control group pretest-posttest design, the care process (guideline adherence) and costs were compared between managed (n = 253), protocolized (n = 197), and usual care (n = 333). We made a distinction between direct health care costs, direct non-health care costs and indirect costs. Multivariate regression models were used to estimate differences in costs adjusted for confounding factors. Because of the skewed distribution of the costs, bootstrapping methods (5000 replications) with a bias-corrected and accelerated approach were used to estimate 95% confidence intervals (CI) around the differences in costs. Compared to usual and protocolized care, more patients in managed care were treated according to diabetes guidelines. Secondary health care use was higher in patients under usual care compared to managed and protocolized care. Compared to usual care, direct costs were significantly lower in managed care (€-1181; 95% CI: -2597 to -334), while indirect costs were higher (€758; 95% CI: -353 to 2701), although not significantly. Direct, indirect and total costs were lower in protocolized care compared to usual care (though not significantly). Compared to usual care, managed care was significantly associated with a better diabetes care process, fewer secondary care consultations and lower health care costs. The same trends were seen for protocolized care; however, they were not statistically significant. Current Controlled trials: ISRCTN66124817.
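
    The resampling scheme can be made concrete. The sketch below implements a plain percentile bootstrap of the difference in mean costs between two groups; the study's bias-corrected and accelerated (BCa) variant adjusts these percentiles further, and the cost vectors here are simulated stand-ins.

    ```python
    # Percentile bootstrap CI for a difference in mean costs (skewed data).
    import numpy as np

    rng = np.random.default_rng(0)
    costs_managed = rng.lognormal(7.5, 1.0, size=253)  # skewed, like real costs
    costs_usual = rng.lognormal(7.7, 1.0, size=333)

    def bootstrap_diff_ci(a, b, reps=5000, alpha=0.05):
        """CI for mean(a) - mean(b) from `reps` resampled replications."""
        diffs = np.empty(reps)
        for i in range(reps):
            diffs[i] = (rng.choice(a, size=a.size, replace=True).mean()
                        - rng.choice(b, size=b.size, replace=True).mean())
        return np.quantile(diffs, [alpha / 2, 1 - alpha / 2])

    lo, hi = bootstrap_diff_ci(costs_managed, costs_usual)
    print(f"95% CI for cost difference: {lo:.0f} to {hi:.0f}")
    ```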

  17. Resource use and costs of type 2 diabetes patients receiving managed or protocolized primary care: a controlled clinical trial

    PubMed Central

    2014-01-01

    Background The increasing prevalence of diabetes is associated with increased health care use and costs. Innovations are needed to improve the quality of care, manage the increasing demand for health care and control the growth of health care costs. The aim of this study is to evaluate the care process and costs of managed, protocolized and usual care for type 2 diabetes patients from a societal perspective. Methods In two distinct regions of the Netherlands, both managed and protocolized diabetes care were implemented. Managed care was characterized by centralized organization, coordination, responsibility and centralized annual assessment. Protocolized care had a partly centralized organizational structure. Usual care was characterized by a decentralized organizational structure. Using a quasi-experimental control group pretest-posttest design, the care process (guideline adherence) and costs were compared between managed (n = 253), protocolized (n = 197), and usual care (n = 333). We made a distinction between direct health care costs, direct non-health care costs and indirect costs. Multivariate regression models were used to estimate differences in costs adjusted for confounding factors. Because of the skewed distribution of the costs, bootstrapping methods (5000 replications) with a bias-corrected and accelerated approach were used to estimate 95% confidence intervals (CI) around the differences in costs. Results Compared to usual and protocolized care, more patients in managed care were treated according to diabetes guidelines. Secondary health care use was higher in patients under usual care compared to managed and protocolized care. Compared to usual care, direct costs were significantly lower in managed care (€-1181; 95% CI: -2597 to -334), while indirect costs were higher (€758; 95% CI: -353 to 2701), although not significantly. Direct, indirect and total costs were lower in protocolized care compared to usual care (though not significantly). Conclusions Compared to usual care, managed care was significantly associated with a better diabetes care process, fewer secondary care consultations and lower health care costs. The same trends were seen for protocolized care; however, they were not statistically significant. Trial registration Current Controlled trials: ISRCTN66124817. PMID:24966055

  18. Automated processing of forensic casework samples using robotic workstations equipped with nondisposable tips: contamination prevention.

    PubMed

    Frégeau, Chantal J; Lett, C Marc; Elliott, Jim; Yensen, Craig; Fourney, Ron M

    2008-05-01

    An automated process has been developed for the analysis of forensic casework samples using TECAN Genesis RSP 150/8 or Freedom EVO liquid handling workstations equipped exclusively with nondisposable tips. Robot tip cleaning routines have been incorporated strategically within the DNA extraction process as well as at the end of each session. Alternative options were examined for cleaning the tips and different strategies were employed to verify cross-contamination. A 2% sodium hypochlorite wash (1/5th dilution of the 10.8% commercial bleach stock) proved to be the best overall approach for preventing cross-contamination of samples processed using our automated protocol. The bleach wash steps do not adversely impact the short tandem repeat (STR) profiles developed from DNA extracted robotically and allow for major cost savings through the implementation of fixed tips. We have demonstrated that robotic workstations equipped with fixed pipette tips can be used with confidence with properly designed tip washing routines to process casework samples using an adapted magnetic bead extraction protocol.

  19. Integration and Analysis of Neighbor Discovery and Link Quality Estimation in Wireless Sensor Networks

    PubMed Central

    Radi, Marjan; Dezfouli, Behnam; Abu Bakar, Kamalrulnizam; Abd Razak, Shukor

    2014-01-01

    Network connectivity and link quality information are fundamental requirements of wireless sensor network protocols to perform their desired functionality. Most of the existing discovery protocols have focused only on the neighbor discovery problem, while only a few of them provide integrated neighbor search and link estimation. As these protocols require careful parameter adjustment before network deployment, they cannot provide scalable and accurate network initialization in large-scale dense wireless sensor networks with random topology. Furthermore, the performance of these protocols has not yet been fully evaluated. In this paper, we perform a comprehensive simulation study on the efficiency of employing adaptive protocols compared to the existing nonadaptive protocols for initializing sensor networks with random topology. In this regard, we propose adaptive network initialization protocols which integrate the initial neighbor discovery with the link quality estimation process to initialize large-scale dense wireless sensor networks without requiring any parameter adjustment before network deployment. To the best of our knowledge, this work is the first attempt to provide a detailed simulation study on the performance of integrated neighbor discovery and link quality estimation protocols for initializing sensor networks. This study can help system designers to determine the most appropriate approach for different applications. PMID:24678277
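
    The integration the authors argue for can be sketched in a few lines: every expected beacon during neighbor discovery also updates a link-quality estimate, here an exponentially weighted moving average of the packet reception ratio. The smoothing factor and admission threshold are illustrative assumptions.

    ```python
    # Neighbor discovery combined with EWMA link-quality estimation.
    ALPHA = 0.2            # EWMA smoothing factor (assumed)
    MIN_QUALITY = 0.7      # admit neighbors above this reception ratio (assumed)

    class NeighborTable:
        def __init__(self):
            self.quality = {}                      # node id -> estimated PRR

        def beacon_event(self, node, received):
            """Update on every expected beacon slot: received=True/False."""
            sample = 1.0 if received else 0.0
            old = self.quality.get(node, sample)   # seed with the first sample
            self.quality[node] = (1 - ALPHA) * old + ALPHA * sample

        def good_neighbors(self):
            return [n for n, q in self.quality.items() if q >= MIN_QUALITY]

    table = NeighborTable()
    for received in [True, True, False, True, True]:
        table.beacon_event("node-17", received)
    print(table.quality["node-17"], table.good_neighbors())
    ```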

  20. SAFARI digital processing unit: performance analysis of the SpaceWire links in case of a LEON3-FT based CPU

    NASA Astrophysics Data System (ADS)

    Giusi, Giovanni; Liu, Scige J.; Di Giorgio, Anna M.; Galli, Emanuele; Pezzuto, Stefano; Farina, Maria; Spinoglio, Luigi

    2014-08-01

    SAFARI (SpicA FAR infrared Instrument) is a far-infrared imaging Fourier Transform Spectrometer for the SPICA mission. The Digital Processing Unit (DPU) of the instrument implements the functions of controlling the overall instrument and of science data compression and packing. The DPU design is based on the use of a LEON family processor. In SAFARI, all instrument components are connected to the central DPU via SpaceWire links. On these links, science data, housekeeping and command flows are in some cases multiplexed; therefore, the interface control must be able to cope with variable throughput needs. The effective data transfer workload can be an issue for overall system performance and becomes a critical parameter for the on-board software design, both at the application layer and at lower, more hardware-related levels. To analyze the system behavior in the presence of the expected, demanding SAFARI science data flow, we carried out a series of performance tests using the standard GR-CPCI-UT699 LEON3-FT Development Board, provided by Aeroflex/Gaisler, connected to an emulator of the SAFARI science data links in a point-to-point topology. Two different communication protocols were used in the tests: the ECSS-E-ST-50-52C RMAP protocol and an internally defined one, the SAFARI internal data handling protocol. An incremental approach was adopted to measure the system performance at different levels of communication protocol complexity. In all cases the performance was evaluated by measuring the CPU workload and the bus latencies. The tests were executed initially in a custom low-level execution environment and finally using the Real-Time Executive for Multiprocessor Systems (RTEMS), which has been selected as the operating system to be used onboard SAFARI. The preliminary results of the performance analysis confirmed the possibility of using a LEON3 processor in the SAFARI DPU, but pointed out, in agreement with previous similar studies, the need to carefully design the overall architecture, implementing some of the DPU functionalities on additional processing devices.

  1. A Study of Novice Systems Analysis Problem Solving Behaviors Using Protocol Analysis

    DTIC Science & Technology

    1992-09-01

    conducted. Each subject was given the same task to perform. The task involved a case study (Appendix B) of a utility company’s customer order processing system...behavior (Ramesh, 1989). The task was to design a customer order processing system that utilized a centralized telephone answering service center...of the utility company’s customer order processing system that was developed based on information obtained by a large systems consulting firm during

  2. Built to last? The sustainability of health system improvements, interventions and change strategies: a study protocol for a systematic review

    PubMed Central

    Braithwaite, Jeffrey; Testa, Luke; Lamprell, Gina; Herkes, Jessica; Ludlow, Kristiana; McPherson, Elise; Campbell, Margie; Holt, Joanna

    2017-01-01

    Introduction The sustainability of healthcare interventions and change programmes is of increasing importance to researchers and healthcare stakeholders interested in creating sustainable health systems to cope with mounting stressors. The aim of this protocol is to extend earlier work and describe a systematic review to identify, synthesise and draw meaning from studies published within the last 5 years that measure the sustainability of interventions, improvement efforts and change strategies in the health system. Methods and analysis The protocol outlines a method by which to execute a rigorous systematic review. The design includes applying primary and secondary data collection techniques, consisting of a comprehensive database search complemented by contact with experts, and searching secondary databases and reference lists, using snowballing techniques. The review and analysis process will occur via an abstract review followed by a full-text screening process. The inclusion criteria include English-language, peer-reviewed, primary, empirical research articles published after 2011 in scholarly journals, for which the full text is available. No restrictions on location will be applied. The review that results from this protocol will synthesise and compare characteristics of the included studies. Ultimately, it is intended that this will help make it easier to identify and design sustainable interventions, improvement efforts and change strategies. Ethics and dissemination As no primary data were collected, ethical approval was not required. Results will be disseminated in conference presentations, peer-reviewed publications and among policymaker bodies interested in creating sustainable health systems. PMID:29133332

  3. A review of blood sample handling and pre-processing for metabolomics studies.

    PubMed

    Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta

    2017-09-01

    Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly in untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field, there is still a fundamental need to consider pre-analytical variability, which can introduce bias into the subsequent analytical process, decrease the reliability of the results and, moreover, confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample-related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design, sample handling and pre-processing are highly desired. Characterization of the range of variation among sample collection methods is necessary to prevent misinterpretation of results and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood-derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide and serve for the standardization of protocols and the reduction of undesirable variation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Microwave processing of a dental ceramic used in computer-aided design/computer-aided manufacturing.

    PubMed

    Pendola, Martin; Saha, Subrata

    2015-01-01

    Because of their favorable mechanical properties and natural esthetics, ceramics are widely used in restorative dentistry. However, the conventional sintering process required for their use is usually slow, and the equipment has high energy consumption. Sintering processes that use microwaves have several advantages over regular sintering: shorter processing times, lower energy consumption, and the capacity for volumetric heating. The objective of this study was to test the mechanical properties of a dental ceramic used in computer-aided design/computer-aided manufacturing (CAD/CAM) after the specimens were processed with microwave hybrid sintering. Density, hardness, and bending strength were measured. When ceramic specimens were sintered with microwaves, processing times were reduced and protocols were simplified. Hardness improved by almost 20% compared to regular sintering, and flexural strength measurements suggested that specimens were approximately 50% stronger than specimens sintered in a conventional system. Microwave hybrid sintering may preserve or improve the mechanical properties of dental ceramics designed for CAD/CAM processing systems, reducing processing and waiting times.

  5. Time-saving design of experiment protocol for optimization of LC-MS data processing in metabolomic approaches.

    PubMed

    Zheng, Hong; Clausen, Morten Rahr; Dalsgaard, Trine Kastrup; Mortensen, Grith; Bertram, Hanne Christine

    2013-08-06

    We describe a time-saving protocol for the processing of LC-MS-based metabolomics data that optimizes parameter settings in XCMS and threshold settings for removing noisy and low-intensity peaks, using design of experiment (DoE) approaches: Plackett-Burman design (PBD) for screening and central composite design (CCD) for optimization. A reliability index, based on evaluation of the linear response to a dilution series, was used as a parameter for the assessment of data quality. After identifying the significant parameters in the XCMS software by PBD, CCD was applied to determine their values by maximizing the reliability and group indexes. Optimal settings obtained by DoE resulted in improvements of 19.4% and 54.7% in the reliability index for a standard mixture and human urine, respectively, as compared with the default settings, and a total of 38 h was required to complete the optimization. Moreover, threshold settings were optimized using CCD for further improvement. The approach combining optimal parameter settings and the threshold method improved the reliability index about 9.5 times for the standard mixture and 14.5 times for the human urine data, and required a total of 41 h. Validation results also showed improvements in the reliability index of about 5-7 times, even for urine samples from different subjects. It is concluded that the proposed methodology can be used as a time-saving approach for improving the processing of LC-MS-based metabolomics data.
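
    The screening step is easy to make concrete. The sketch below generates the 12-run Plackett-Burman design (the standard N=12 generator row, cyclically shifted, plus an all-minus run) that would assign high/low levels to up to 11 XCMS parameters; mapping columns to the actual parameters is left out.

    ```python
    # 12-run Plackett-Burman screening design for up to 11 two-level factors.
    GENERATOR = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]  # standard N=12 row

    def plackett_burman_12():
        runs = []
        row = GENERATOR[:]
        for _ in range(11):
            runs.append(row[:])
            row = [row[-1]] + row[:-1]      # cyclic shift for the next run
        runs.append([-1] * 11)              # final run at all low levels
        return runs

    design = plackett_burman_12()
    for i, run in enumerate(design, 1):
        print(f"run {i:2d}: " + " ".join("+" if x > 0 else "-" for x in run))
    ```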

  6. Run-length encoding graphic rules, biochemically editable designs and steganographical numeric data embedment for DNA-based cryptographical coding system.

    PubMed

    Kawano, Tomonori

    2013-03-01

    There have been a wide variety of approaches to handling pieces of DNA as "unplugged" tools for digital information storage and processing, including a series of studies in security-related areas such as DNA-based digital barcodes, watermarks and cryptography. In the present article, novel designs of artificial genes are proposed as media for storing digitally compressed image data for bio-computing purposes, whereas natural genes principally encode proteins. Furthermore, the proposed system allows cryptographic application of DNA through biochemically editable designs with capacity for steganographic embedment of numeric data. As a model application of the image-coding DNA technique, combined numerical and biochemical protocols are employed for ciphering given "passwords" and/or secret numbers using DNA sequences. The "passwords" of interest were decomposed into single letters and translated into font images coded on separate DNA chains, comprising both coding regions, in which the images are encoded according to the novel run-length encoding rule, and non-coding regions designed for biochemical editing and remodeling processes that reveal the hidden orientation of the letters composing the original "passwords." The latter processes require molecular biological tools for digestion and ligation of the fragmented DNA molecules, targeting the polymerase chain reaction-engineered termini of the chains. Lastly, additional protocols for steganographic overwriting of the numeric data of interest onto the image-coding DNA are also discussed.
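
    Since the paper's exact graphic rule is not reproduced here, the following toy sketch only illustrates the general idea of run-length encoding a monochrome font-image row into DNA bases. The two-base mapping (one base for pixel value, one for run length) is invented for illustration, and the editable non-coding regions are omitted.

    ```python
    # Toy run-length encoding of a 0/1 pixel row into DNA and back.
    BASES = "ACGT"

    def encode_row(pixels):
        """RLE a row of 0/1 pixels as DNA; runs are capped at length 4."""
        dna = []
        i = 0
        while i < len(pixels):
            run = 1
            while i + run < len(pixels) and pixels[i + run] == pixels[i] and run < 4:
                run += 1
            # First base: pixel value (A=0, C=1); second base: run length 1..4.
            dna.append(("A" if pixels[i] == 0 else "C") + BASES[run - 1])
            i += run
        return "".join(dna)

    def decode_row(dna):
        pixels = []
        for j in range(0, len(dna), 2):
            value = 0 if dna[j] == "A" else 1
            pixels.extend([value] * (BASES.index(dna[j + 1]) + 1))
        return pixels

    row = [0, 0, 0, 1, 1, 0, 1, 1, 1, 1]
    print(encode_row(row), decode_row(encode_row(row)) == row)  # AGCCAACT True
    ```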

  7. Developing a Standard Method for Link-Layer Security of CCSDS Space Communications

    NASA Technical Reports Server (NTRS)

    Biggerstaff, Craig

    2009-01-01

    Communications security for space systems has been a specialized field generally far removed from considerations of mission interoperability and cross-support; in fact, these considerations often have been viewed as intrinsically opposed to security objectives. The space communications protocols defined by the Consultative Committee for Space Data Systems (CCSDS) have a twenty-five year history of successful use in over 400 missions. While the CCSDS Telemetry, Telecommand, and Advanced Orbiting Systems protocols for use at OSI Layer 2 are operationally mature, there has been no direct support within these protocols for communications security techniques. Link-layer communications security has been successfully implemented in the past using mission-unique methods, but never before with the objective of facilitating cross-support and interoperability. This paper discusses the design of a standard method for cryptographic authentication, encryption, and replay protection at the data link layer that can be integrated into existing CCSDS protocols without disruption to legacy communications services. Integrating cryptographic operations into existing data structures and processing sequences requires a careful assessment of the potential impediments within spacecraft, ground stations, and operations centers. The objective of this work is to provide a sound method for cryptographic encapsulation of frame data that also facilitates Layer 2 virtual channel switching, such that a mission may procure data transport services as needed without involving third parties in the cryptographic processing, or split independent data streams for separate cryptographic processing.
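
    The encapsulation goal described, encrypting the frame data field while leaving the header readable for virtual channel switching and binding the two together cryptographically, can be sketched with a generic AEAD construction. This uses AES-GCM from the Python cryptography package as a stand-in; the actual CCSDS method defines its own header fields and algorithm suite.

    ```python
    # Generic authenticated frame encapsulation: encrypt the data field,
    # authenticate it together with the cleartext header.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)
    aead = AESGCM(key)

    def protect_frame(header: bytes, data: bytes) -> bytes:
        """Encrypt data, bind it to the cleartext header, prepend the nonce."""
        nonce = os.urandom(12)
        return header + nonce + aead.encrypt(nonce, data, associated_data=header)

    def unprotect_frame(frame: bytes, header_len: int) -> bytes:
        """Verify the header binding and decrypt; raises on any tampering."""
        header, nonce = frame[:header_len], frame[header_len:header_len + 12]
        ciphertext = frame[header_len + 12:]
        return aead.decrypt(nonce, ciphertext, associated_data=header)

    frame = protect_frame(b"\x08\x42", b"telemetry payload")  # toy 2-byte header
    print(unprotect_frame(frame, header_len=2))
    ```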

  8. Designing of routing algorithms in autonomous distributed data transmission system for mobile computing devices with ‘WiFi-Direct’ technology

    NASA Astrophysics Data System (ADS)

    Nikitin, I. A.; Sherstnev, V. S.; Sherstneva, A. I.; Botygin, I. A.

    2017-02-01

    The results of research into existing routing protocols for wireless networks and their main features are discussed in this paper. Based on these protocols, routing protocols for wireless networks, including route search algorithms and phone directory exchange algorithms, are designed with the 'WiFi-Direct' technology. Algorithms that dispense with the IP protocol were designed, increasing efficiency by working only with the MAC addresses of the devices. The developed algorithms are expected to be used in mobile software engineering with the Android platform as the base. Simpler algorithms and formats than those of well-known routing protocols, together with the rejection of the IP protocol, enable the developed protocols to be used on more primitive mobile devices. Implementing the protocols in industry enables the creation of data transmission networks among workstations and mobile robots without any access points.
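
    The core idea, routing on MAC addresses alone with next-hop state learned from flooded route requests, can be sketched as follows; the message fields and flooding logic are simplified assumptions rather than the authors' protocol.

    ```python
    # Next-hop routing keyed purely by MAC addresses (no IP layer).
    class MacRouter:
        def __init__(self, mac):
            self.mac = mac
            self.next_hop = {}             # destination MAC -> (neighbor MAC, hops)

        def on_route_request(self, dest, from_neighbor, origin, hops):
            """Learn (or improve) the reverse route toward the request's origin."""
            known = self.next_hop.get(origin)
            if known is None or hops < known[1]:
                self.next_hop[origin] = (from_neighbor, hops)
            return dest == self.mac        # True: reply instead of re-flooding

        def route(self, dest):
            entry = self.next_hop.get(dest)
            return entry[0] if entry else None   # neighbor MAC to forward to

    r = MacRouter("aa:bb:cc:00:00:01")
    r.on_route_request(dest="aa:bb:cc:00:00:01",
                       from_neighbor="aa:bb:cc:00:00:02",
                       origin="aa:bb:cc:00:00:03", hops=2)
    print(r.route("aa:bb:cc:00:00:03"))    # forwards via ...:02
    ```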

  9. NEW APPROACHES TO ESTIMATION OF SOLID-WASTE QUANTITY AND COMPOSITION

    EPA Science Inventory

    Efficient and statistically sound sampling protocols for estimating the quantity and composition of solid waste over a stated period of time in a given location, such as a landfill site or at a specific point in an industrial or commercial process, are essential to the design ...

  10. Oxidative C-H activation of amines using protuberant lychee-like goethite

    EPA Science Inventory

    Goethite with protuberant lychee morphology has been synthesized that accomplishes C-H activation of N-methylanilines to generate α-aminonitriles; the catalyst takes oxygen from air and uses it as a cooxidant in the process. Inspired by nature, we aspired to design a protocol for...

  11. Experimental demonstration of the anti-maser

    NASA Astrophysics Data System (ADS)

    Mazzocco, Anthony; Aviles, Michael; Andrews, Jim; Dawson, Nathan; Crescimanno, Michael

    2012-10-01

    We denote by "anti-maser" a coherent perfect absorption (CPA) process in the radio frequency domain. We demonstrate several experimental realizations of the anti-maser suitable for an advanced undergraduate laboratory. Students designed, assembled and tested these devices, as well as the inexpensive laboratory setup and experimental protocol for displaying various CPA phenomena.

  12. Overview of a Linguistic Theory of Design. AI Memo 383A.

    ERIC Educational Resources Information Center

    Miller, Mark L.; Goldstein, Ira P.

    The SPADE theory, which uses linguistic formalisms to model the planning and debugging processes of computer programming, was simultaneously developed and tested in three separate contexts--computer uses in education, automatic programming (a traditional artificial intelligence arena), and protocol analysis (the domain of information processing…

  13. A network control concept for the 30/20 GHz communication system baseband processor

    NASA Technical Reports Server (NTRS)

    Sabourin, D. J.; Hay, R. E.

    1982-01-01

    The architecture and system design for a satellite-switched TDMA communication system employing on-board processing was developed by Motorola for NASA's Lewis Research Center. The system design is based on distributed processing techniques that provide extreme flexibility in the selection of a network control protocol without impacting the satellite or ground terminal hardware. A network control concept that includes system synchronization and allows burst synchronization to occur within the system operational requirement is described. This concept integrates the tracking and control links with the communication links via the baseband processor, resulting in an autonomous system operational approach.

  14. Single-cell transcriptome conservation in cryopreserved cells and tissues.

    PubMed

    Guillaumet-Adkins, Amy; Rodríguez-Esteban, Gustavo; Mereu, Elisabetta; Mendez-Lago, Maria; Jaitin, Diego A; Villanueva, Alberto; Vidal, August; Martinez-Marti, Alex; Felip, Enriqueta; Vivancos, Ana; Keren-Shaul, Hadas; Heath, Simon; Gut, Marta; Amit, Ido; Gut, Ivo; Heyn, Holger

    2017-03-01

    A variety of single-cell RNA preparation procedures have been described. So far, protocols require fresh material, which hinders complex study designs. We describe a sample preservation method that maintains transcripts in viable single cells, allowing one to disconnect time and place of sampling from subsequent processing steps. We sequence single-cell transcriptomes from >1000 fresh and cryopreserved cells using 3'-end and full-length RNA preparation methods. Our results confirm that the conservation process did not alter transcriptional profiles. This substantially broadens the scope of applications in single-cell transcriptomics and could lead to a paradigm shift in future study designs.

  15. Lightweight CoAP-Based Bootstrapping Service for the Internet of Things.

    PubMed

    Garcia-Carrillo, Dan; Marin-Lopez, Rafael

    2016-03-11

    The Internet of Things (IoT) is becoming increasingly important in several fields of industrial and personal applications, such as medical e-health, smart cities, etc. Research into protocols and security aspects of this area is continuously advancing to make these networks more reliable and secure, taking these aspects into account by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now, solutions have focused on re-using security protocols that were not developed for IoT constraints. For this reason, in this work we propose the design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large-scale deployment, accountability and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication, Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping, specifically for IoT. Second, we detail the bootstrapping service: the architecture, with entities and interfaces, and the flow of operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with PANATIKI, the most significant constrained representative of the bootstrapping solutions related to CoAP-EAP. As we show, our solution provides significant improvements, mainly due to an important reduction in message length.

  16. Lightweight CoAP-Based Bootstrapping Service for the Internet of Things

    PubMed Central

    Garcia-Carrillo, Dan; Marin-Lopez, Rafael

    2016-01-01

    The Internet of Things (IoT) is becoming increasingly important in several fields of industrial and personal applications, such as medical e-health, smart cities, etc. Research into protocols and security aspects of this area is continuously advancing to make these networks more reliable and secure, taking these aspects into account by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now, solutions have focused on re-using security protocols that were not developed for IoT constraints. For this reason, in this work we propose the design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large-scale deployment, accountability and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication, Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping, specifically for IoT. Second, we detail the bootstrapping service: the architecture, with entities and interfaces, and the flow of operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with PANATIKI, the most significant constrained representative of the bootstrapping solutions related to CoAP-EAP. As we show, our solution provides significant improvements, mainly due to an important reduction in message length. PMID:26978362

  17. [Design and piloting of a structured service medication dispensing process].

    PubMed

    Abaurre, Raquel; García-Delgado, Pilar; Maurandi, M Dolores; Arrebola, Cristóbal; Gastelurrutia, Miguel Ángel; Martínez-Martínez, Fernando

    2015-01-01

    The aim of this article is to design and pilot a protocol for the medication dispensing service. Using the requirements proposed in the Ministry of Health Pharmaceutical Care Consensus, a literature search was conducted and qualitative consensus techniques were applied. An observational, cross-sectional study was conducted from March to June 2009 in 53 community pharmacies from 24 Spanish provinces. Participants were patients who requested one or more particular medications, with or without a medical prescription, for their own use or for someone in their care. The personalised medication information (IPM), the problems associated with the medications (PRM) and the negative results associated with the medication (RNM) detected by the pharmacist each time medication was dispensed were recorded, as well as the pharmacist's perception of the workability of the protocol. A total of 870 medications were dispensed, and a lack of personalised medication information (IPM) was detected in 423 cases (48.6%). PRM were detected in 10.11% of the dispensed medications, as well as 68 (7.81%) suspected RNM: safety (n = 35; 51.5%), effectiveness (n = 29; 42.6%) and necessity (n = 4; 5.8%). Almost two-thirds (65.21%) of the pharmacists considered the protocol workable. The designed protocol helped to detect deficiencies in the information given to patients about their medications, as well as PRM and RNM, and proved to be a tool that is easy to use and apply. Copyright © 2013 Elsevier España, S.L.U. All rights reserved.

  18. Microwave-Assisted γ-Valerolactone Production for Biomass Lignin Extraction: A Cascade Protocol.

    PubMed

    Tabasso, Silvia; Grillo, Giorgio; Carnaroglio, Diego; Calcio Gaudino, Emanuela; Cravotto, Giancarlo

    2016-03-26

    The general need to slow the depletion of fossil resources and reduce carbon footprints has led to tremendous effort being invested in creating "greener" industrial processes and developing alternative means to produce fuels and synthesize platform chemicals. This work aims to design a microwave-assisted cascade process for a full biomass valorisation cycle. GVL (γ-valerolactone), a renewable green solvent, has been used in aqueous acidic solution to achieve complete biomass lignin extraction. After lignin precipitation, the levulinic acid (LA)-rich organic fraction was hydrogenated, which regenerated the starting solvent for further biomass delignification. This process does not require a purification step because GVL plays the dual role of solvent and product, while the reagent (LA) is a product of biomass delignification. In summary, this bio-refinery approach to lignin extraction is a cascade protocol in which the solvent loss is integrated into the conversion cycle, leading to simplified methods for biomass valorisation.

  19. SPAR: a security- and power-aware routing protocol for wireless ad hoc and sensor networks

    NASA Astrophysics Data System (ADS)

    Oberoi, Vikram; Chigan, Chunxiao

    2005-05-01

    Wireless Ad Hoc and Sensor Networks (WAHSNs) are vulnerable to extensive attacks as well as severe resource constraints. To fulfill the security needs, many security enhancements have been proposed. Likewise, from the resource constraint perspective, many power-aware schemes have been proposed to save battery power. However, we observe that for severely resource-limited and extremely vulnerable WAHSNs, taking security or power (or any other resource) alone into consideration for protocol design is inadequate for truly "secure-and-useful" WAHSNs. For example, from the resource constraint perspective, we identify one of the potential problems, the Security-Capable-Congestion (SCC) behavior, in WAHSNs routing protocols where only security is considered. On the other hand, the design approach where only scarce resources are considered, as in many power-aware WAHSNs protocols, leaves security unaddressed and is undesirable for many WAHSNs application scenarios. Motivated by these observations, we propose a co-design approach in which both high security and effective resource consumption are targeted in WAHSNs protocol design. Specifically, we propose a novel routing protocol, the Security- and Power-Aware Routing (SPAR) protocol, based on this co-design approach. In SPAR, routing decisions are made with both security and power as routing criteria. The SPAR mechanism is routing-protocol independent and can therefore be broadly integrated into any of the existing WAHSNs routing protocols. The simulation results show that SPAR significantly outperforms WAHSNs routing protocols in which security or power alone is considered. This finding demonstrates that the proposed security- and resource-aware co-design approach is promising for truly "secure-and-useful" WAHSNs.
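
    The co-design idea can be illustrated with a single link cost that blends energy expenditure with a security penalty, fed to an ordinary shortest-path search. The weighting factors and per-link scores below are illustrative; SPAR's actual metric and protocol machinery are defined in the paper.

    ```python
    # Shortest path over a combined security/power link cost.
    import heapq

    def link_cost(energy, security_score, alpha=1.0, beta=2.0):
        """Low energy and high security (score in (0,1]) both lower the cost."""
        return alpha * energy + beta * (1.0 - security_score)

    def best_route(graph, src, dst):
        """Dijkstra over combined costs. graph: node -> {neighbor: (E, S)}."""
        pq, seen = [(0.0, src, [src])], set()
        while pq:
            cost, node, path = heapq.heappop(pq)
            if node == dst:
                return cost, path
            if node in seen:
                continue
            seen.add(node)
            for nbr, (energy, sec) in graph.get(node, {}).items():
                if nbr not in seen:
                    heapq.heappush(pq, (cost + link_cost(energy, sec), nbr, path + [nbr]))
        return float("inf"), []

    graph = {
        "A": {"B": (1.0, 0.9), "C": (0.5, 0.4)},   # C is cheap but poorly secured
        "B": {"D": (1.0, 0.9)},
        "C": {"D": (0.5, 0.4)},
    }
    print(best_route(graph, "A", "D"))             # picks the secure A-B-D path
    ```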

  20. Research on TCP/IP network communication based on Node.js

    NASA Astrophysics Data System (ADS)

    Huang, Jing; Cai, Lixiong

    2018-04-01

    In the face of big data, long-lived connections and high concurrency, TCP/IP network communication can become a performance bottleneck because of its blocking, multi-threaded service model. This paper presents a method of TCP/IP network communication based on Node.js. On the basis of an analysis of the Node.js architecture and its asynchronous, non-blocking I/O model, the source of its efficiency is discussed; the network communication model of the TCP/IP protocol is then compared and analyzed to explain why the TCP/IP protocol stack is so widely used in network communication. Finally, to handle the large data volumes and high concurrency of a large-scale grape-growing environment monitoring process, a TCP server design based on Node.js is completed. The results show that the example runs stably and efficiently.
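
    A minimal sketch of the kind of event-driven TCP server the paper builds on, in TypeScript for Node.js: a single-threaded event loop multiplexes many long-lived connections without dedicating a blocking thread to each. The port and the ACK payload are illustrative choices, not taken from the paper.

    ```typescript
    import * as net from "net";

    // Single-threaded, event-driven TCP server: each connection is handled by
    // callbacks on the event loop rather than by a dedicated blocking thread.
    const server = net.createServer((socket) => {
      // 'data' events arrive asynchronously; nothing blocks waiting for I/O.
      socket.on("data", (chunk) => {
        console.log(`reading from ${socket.remoteAddress}: ${chunk.toString().trim()}`);
        socket.write("ACK\n"); // illustrative acknowledgement per monitoring reading
      });
      socket.on("error", (err) => console.error("socket error:", err.message));
    });

    // One process can multiplex many long-lived sensor connections this way.
    server.listen(9000, () => console.log("monitoring server listening on :9000"));
    ```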

  1. Experimental verification of multipartite entanglement in quantum networks

    PubMed Central

    McCutcheon, W.; Pappa, A.; Bell, B. A.; McMillan, A.; Chailloux, A.; Lawson, T.; Mafu, M.; Markham, D.; Diamanti, E.; Kerenidis, I.; Rarity, J. G.; Tame, M. S.

    2016-01-01

    Multipartite entangled states are a fundamental resource for a wide range of quantum information processing tasks. In particular, in quantum networks, it is essential for the parties involved to be able to verify if entanglement is present before they carry out a given distributed task. Here we design and experimentally demonstrate a protocol that allows any party in a network to check if a source is distributing a genuinely multipartite entangled state, even in the presence of untrusted parties. The protocol remains secure against dishonest behaviour of the source and other parties, including the use of system imperfections to their advantage. We demonstrate the verification protocol in a three- and four-party setting using polarization-entangled photons, highlighting its potential for realistic photonic quantum communication and networking applications. PMID:27827361

  2. Probabilistic Analysis of Hierarchical Cluster Protocols for Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Kaj, Ingemar

    Wireless sensor networks are designed to extract data from the deployment environment and combine sensing, data processing and wireless communication to provide useful information for the network users. Hundreds or thousands of small embedded units, which operate under a low-energy supply and with limited access to central network control, rely on interconnecting protocols to coordinate data aggregation and transmission. Energy efficiency is crucial, and it has been proposed that cluster-based and distributed architectures such as LEACH are particularly suitable. We analyse the random cluster hierarchy in this protocol and provide a solution for low-energy and limited-loss optimization. Moreover, we extend these results to a multi-level version of LEACH, where clusters of nodes again self-organize to form clusters of clusters, and so on.
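
    For reference, the randomized cluster-head election at the core of LEACH can be sketched as follows, where p is the target fraction of cluster heads and r the current round; nodes that have recently served are excluded until the cycle of 1/p rounds restarts. This is a minimal sketch of the standard threshold, not the paper's probabilistic analysis itself.

    ```typescript
    // LEACH cluster-head election threshold: a node that has not been a cluster
    // head in the last 1/p rounds elects itself with probability T(r).
    function leachThreshold(p: number, round: number): number {
      return p / (1 - p * (round % Math.round(1 / p)));
    }

    // One election round; `eligible` marks nodes that have not served as a
    // cluster head in the current 1/p-round cycle.
    function electClusterHeads(
      nodes: { id: number; eligible: boolean }[],
      p: number,
      round: number
    ): number[] {
      const t = leachThreshold(p, round);
      return nodes.filter((n) => n.eligible && Math.random() < t).map((n) => n.id);
    }

    // Example: 100 nodes, target 5% cluster heads, round 3.
    const sensorNodes = Array.from({ length: 100 }, (_, id) => ({ id, eligible: true }));
    console.log(electClusterHeads(sensorNodes, 0.05, 3));
    ```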

  3. [Problematizing the multidisciplinary residency in oncology: a practical teaching protocol from the perspective of nurse residents].

    PubMed

    Melo, Myllena Cândida de; Queluci, Gisella de Carvalho; Gouvêa, Mônica Villela

    2014-08-01

    The aim was to investigate the practical teaching of nurse residents in a multidisciplinary residency in oncology. A qualitative descriptive study was conducted, grounded in the problematization methodology and its steps, as represented by the Maguerez Arch. Data were analyzed using content analysis. Potentiating and limiting elements of the residency guided the design of a practical teaching protocol from the perspective of residents, structured in three stages: welcoming and ambience; nursing care for problem situations; and the evaluation process. Systematization of practical teaching promoted the autonomy of individuals and brought teaching closer to reality, making the residency less strenuous, stressful and distressing.

  4. Carbon Offsets in California: What Role for Earth Scientists in the Policy Process? (Invited)

    NASA Astrophysics Data System (ADS)

    Cullenward, D.; Strong, A. L.

    2013-12-01

    This talk addresses the policy structure in California for developing and approving carbon offset protocols, which rely on findings from the environmental and earth sciences communities. In addition to providing an overview of the legal requirements of carbon offsets, we describe a series of case studies of how scientists can engage with policymakers. Based on those experiences, we suggest ways for the earth sciences community to become more involved in climate policy development. California's climate law, known as AB 32, requires that major sectors of the state's economy reduce their emissions to 1990 levels by 2020. As part of AB 32, the California Air Resources Board created a cap-and-trade market to ensure compliance with the statutory target. Under this system, regulated companies have to acquire tradable emissions permits (called 'compliance instruments') for the greenhouse gas emissions they release. The State allocates a certain number of allowances to regulated entities through a mixture of auctions and free transfers, with the total number equal to the overall emissions target; these allowances, along with approved offsets credits, are the compliance instruments that regulated entities are required to obtain by law. One of the key policy design issues in California's cap-and-trade market concerns the use of carbon offsets. Under AB 32, the Air Resources Board can issue offset credits to project developers who reduce emissions outside of the capped sectors (electricity, industry, and transportation)--or even outside of California--pursuant to approved offset protocols. Project developers then sell the credits to regulated companies in California. Essentially, offsets allow regulated entities in California to earn credit for emissions reductions that take place outside the scope of AB 32. Many regulated entities and economists are in favor of offsets because they view them as a source of low-cost compliance instruments. On the other hand, critics argue that some offset protocols award credits for activities that would have occurred anyway; by replacing a company's need to acquire an allowance in the carbon market, critics believe that poorly designed offset protocols increase greenhouse gas emissions. Thus, the effectiveness of the policy approach depends on the scientific integrity of the offset protocols. To date, California has approved offset protocols for emissions reductions in four applications: (1) forestry, (2) urban forestry, (3) livestock, and (4) destruction of ozone-depleting substances. In addition, the State is currently considering protocols that would address (5) methane emissions from mining and (6) greenhouse gas reductions from improved rice cultivation practices. These protocols rely heavily on findings from the environmental and earth sciences communities, especially when the protocol subject involves land use or land use change. Yet, due to budget constraints, the Air Resources Board is relying primarily on third-party protocol developers to design and propose the detailed structures under which offset credits will be issued. Despite the fact that any member of the public may participate in the governance regime that leads to protocol approvals, few scientists or scientific organizations provide input into the policy process. We use case studies from several of the California protocols to illustrate ways scientists can apply their skills to a crucial stage of climate policy development.

  5. A detailed description of the implementation of inpatient insulin orders with a commercial electronic health record system.

    PubMed

    Neinstein, Aaron; MacMaster, Heidemarie Windham; Sullivan, Mary M; Rushakoff, Robert

    2014-07-01

    In the setting of Meaningful Use laws and professional society guidelines, hospitals are rapidly implementing electronic glycemic management order sets. There are a number of best practices established in the literature for glycemic management protocols and programs. We believe that this is the first published account of the detailed steps to be taken to design, implement, and optimize glycemic management protocols in a commercial computerized provider order entry (CPOE) system. Prior to CPOE implementation, our hospital already had a mature glycemic management program. To transition to CPOE, we underwent the following 4 steps: (1) preparation and requirements gathering, (2) design and build, (3) implementation and dissemination, and (4) optimization. These steps required more than 2 years of coordinated work between physicians, nurses, pharmacists, and programmers. With the move to CPOE, our complex glycemic management order sets were successfully implemented without any significant interruptions in care. With feedback from users, we have continued to refine the order sets, and this remains an ongoing process. Successful implementation of glycemic management protocols in CPOE is dependent on broad stakeholder input and buy-in. When using a commercial CPOE system, there may be limitations of the system, necessitating workarounds. There should be an upfront plan to apply resources for continuous process improvement and optimization after implementation. © 2014 Diabetes Technology Society.

  6. Estimated cost of a health visitor-led protocol for perinatal mental health.

    PubMed

    Oluboyede, Yemi; Lewis, Anne; Ilott, Irene; Lekka, Chrysanthi

    2010-06-01

    Anecdotally, protocols, care pathways and clinical guidelines are time-consuming to develop and sustain, but there is little research on the actual costs of their development, use and audit. This is a notable gap considering the pervasiveness of such documents, which are intended to reduce unacceptable variations in practice by standardising care processes. A case study research design was used to calculate the resource-use costs of a protocol for perinatal mental health, part of the core programme for health visitors in a primary care trust in the west of England. The methods were in-depth interviews with the operational lead for the protocol (a health visitor) and documentary analysis. The total estimated cost of staff time over a five-year period (2004 to 2008) was €73,598, comprising €36,162 (49%) for development and €37,436 (51%) for implementation. Although these are best estimates dependent upon retrospective data, they indicate the opportunity cost of staff time for a single protocol in one trust over five years. When new protocols, care pathways or clinical guidelines are proposed, the costs need to be considered and weighed against the benefits of engaging frontline staff in service improvements.

  7. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    NASA Astrophysics Data System (ADS)

    Magazzù, G.; Borgese, G.; Costantino, N.; Fanucci, L.; Incandela, J.; Saponara, S.

    2013-02-01

    In many research fields, such as high energy physics (HEP), astrophysics, nuclear medicine or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. A smart, well-tested top-down design flow for new protocols for the control/readout of front-end electronics is therefore very useful. To this aim, and to reduce development time, costs and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After describing the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the definition of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performance when adopting the new communication protocol and its interfaces in a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset that implements the rad-tolerant protocol interfaces. For every step, significant results are shown to underline the usefulness of this design and verification approach, which can be applied to the development of any new digital protocol for smart detectors in physics experiments.
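
    The figure-of-merit-driven design space exploration mentioned above can be illustrated generically: enumerate candidate protocol configurations, score each on competing figures of merit, and keep the Pareto-optimal set. Everything below (the configuration fields and the toy scoring functions) is an invented illustration, not FF-LYNX data.

    ```typescript
    // Hypothetical protocol configuration; the fields are illustrative, not FF-LYNX's.
    interface Config { frameBits: number; redundancy: number }
    interface Scored { cfg: Config; latency: number; overhead: number }

    // Toy figures of merit: more redundancy raises bandwidth overhead but (here)
    // lowers the retransmission latency penalty. Real figures of merit would come
    // from the SystemC Integrated Simulation Environment.
    function score(cfg: Config): Scored {
      const overhead = cfg.redundancy / cfg.frameBits;
      const latency = cfg.frameBits * (1 + 1 / (1 + cfg.redundancy));
      return { cfg, latency, overhead };
    }

    // Keep the Pareto front: a point survives if no other point is at least as
    // good on both figures of merit and strictly better on one.
    function paretoFront(points: Scored[]): Scored[] {
      return points.filter((a) =>
        !points.some(
          (b) =>
            b !== a &&
            b.latency <= a.latency &&
            b.overhead <= a.overhead &&
            (b.latency < a.latency || b.overhead < a.overhead)
        )
      );
    }

    const grid: Config[] = [];
    for (const frameBits of [32, 64, 128])
      for (const redundancy of [1, 2, 4]) grid.push({ frameBits, redundancy });
    console.log(paretoFront(grid.map(score)));
    ```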

  8. Research on low-latency MAC protocols for wireless sensor networks

    NASA Astrophysics Data System (ADS)

    He, Chenguang; Sha, Xuejun; Lee, Chankil

    2007-11-01

    Energy efficiency should not be the only design goal of MAC protocols for wireless sensor networks, which involve battery-operated computing and sensing devices. Low-latency operation becomes just as important as energy efficiency when the traffic load is very heavy or when real-time constraints are imposed by applications such as tracking or locating. This paper introduces some causes of the time delays that are inherent in a multi-hop network using existing WSN MAC protocols, highlights the importance of low-latency MAC design for wireless sensor networks, and presents three MACs as examples of low-latency protocols designed specifically to address sleep delay, wait delay and wakeup delay in wireless sensor networks, respectively. The paper also discusses design trade-offs with emphasis on low latency, points out their advantages and disadvantages, and offers some design considerations and suggestions for MAC protocols for future applications and research.
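
    The sleep delay discussed above can be made concrete with a small calculation: under a duty-cycled MAC with wake-up period T, a packet arriving at a random time waits on average T/2 at every hop, so end-to-end latency grows linearly with hop count. A sketch under those simplifying assumptions (contention and wakeup delays are ignored):

    ```typescript
    // Expected per-hop sleep delay under a duty-cycled MAC: with a wake-up period
    // of T ms, a packet arriving at a uniformly random time waits T/2 on average.
    function expectedSleepDelayMs(wakeupPeriodMs: number): number {
      return wakeupPeriodMs / 2;
    }

    // End-to-end latency over a multi-hop path: per-hop sleep delay plus a fixed
    // transmission time per hop. Wait (contention) delay is not modelled here.
    function multiHopLatencyMs(
      hops: number,
      wakeupPeriodMs: number,
      txTimeMs: number
    ): number {
      return hops * (expectedSleepDelayMs(wakeupPeriodMs) + txTimeMs);
    }

    // Example: 10 hops, 1000 ms wake-up period, 5 ms per transmission -> 5050 ms,
    // dominated by sleep delay; shortening the period or staggering wake-up
    // schedules is exactly what low-latency MAC designs target.
    console.log(multiHopLatencyMs(10, 1000, 5));
    ```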

  9. A New Cellular Architecture for Information Retrieval from Sensor Networks through Embedded Service and Security Protocols

    PubMed Central

    Shahzad, Aamir; Landry, René; Lee, Malrey; Xiong, Naixue; Lee, Jongho; Lee, Changhoon

    2016-01-01

    Substantial changes have occurred in the Information Technology (IT) sectors, and with these changes the demand for remote access to field sensor information has increased. This allows visualization, monitoring, and control through various electronic devices, such as laptops, tablets, iPads, PCs, and cellular phones. The smart phone is considered a more reliable, faster, and more efficient device for accessing and monitoring industrial systems and their corresponding information interfaces anywhere and anytime. This study describes the deployment of a protocol whereby industrial system information can be securely accessed by cellular phones via a Supervisory Control And Data Acquisition (SCADA) server. To achieve the study goals, proprietary protocol interconnectivity with non-proprietary protocols and the usage of interconnectivity services are considered in detail. They support the visualization of the SCADA system information, and the related operations, through smart phones. The intelligent sensors are configured and designated to process real-time information via cellular phones by employing information exchange services between the proprietary protocol and non-proprietary protocols. SCADA cellular access raises the issue of security flaws. To address these challenges, a cryptography-based security method is considered and deployed, and it could be considered part of a proprietary protocol. Subsequently, transmission flows from the smart phones through a cellular network. PMID:27314351
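
    A minimal sketch of the kind of cryptographic protection described, here using Node's built-in crypto module with AES-256-GCM as an assumed stand-in (the paper's proprietary method is not specified): the SCADA payload is encrypted and authenticated before it crosses the cellular network.

    ```typescript
    import * as crypto from "crypto";

    // Encrypt a SCADA reading with AES-256-GCM: confidentiality plus an auth tag
    // that lets the receiver detect tampering in transit over the cellular link.
    function encryptReading(key: Buffer, plaintext: string) {
      const iv = crypto.randomBytes(12); // unique nonce per message
      const cipher = crypto.createCipheriv("aes-256-gcm", key, iv);
      const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
      return { iv, ciphertext, tag: cipher.getAuthTag() };
    }

    function decryptReading(
      key: Buffer,
      msg: { iv: Buffer; ciphertext: Buffer; tag: Buffer }
    ): string {
      const decipher = crypto.createDecipheriv("aes-256-gcm", key, msg.iv);
      decipher.setAuthTag(msg.tag); // verification fails loudly if data was altered
      return Buffer.concat([decipher.update(msg.ciphertext), decipher.final()]).toString("utf8");
    }

    const key = crypto.randomBytes(32); // in practice, provisioned per device
    const msg = encryptReading(key, "pump7 pressure=2.4bar");
    console.log(decryptReading(key, msg)); // "pump7 pressure=2.4bar"
    ```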

  10. A decentralized fuzzy C-means-based energy-efficient routing protocol for wireless sensor networks.

    PubMed

    Alia, Osama Moh'd

    2014-01-01

    Energy conservation in wireless sensor networks (WSNs) is a vital consideration when designing wireless networking protocols. In this paper, we propose a Decentralized Fuzzy Clustering Protocol, named DCFP, which minimizes total network energy dissipation to promote maximum network lifetime. The process of constructing the infrastructure for a given WSN is performed only once, at the beginning of the protocol, at a base station, and the resulting infrastructure remains unchanged throughout the network's lifetime. In this initial construction step, a fuzzy C-means algorithm is adopted to allocate sensor nodes to their most appropriate clusters. Subsequently, the protocol runs in rounds, where each round is divided into a CH-Election phase and a Data Transmission phase. In the CH-Election phase, new cluster heads are elected locally in each cluster, with a new multicriteria objective function proposed to enhance the quality of the elected cluster heads. In the Data Transmission phase, each sensor node senses and transmits data to its respective cluster head, and the cluster heads in turn aggregate and send the sensed data to the base station. Simulation results demonstrate that the proposed protocol improves network lifetime, data delivery, and energy consumption compared to other well-known energy-efficient protocols.
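
    The fuzzy C-means step used to allocate nodes to clusters can be sketched as the generic membership/centroid update on 2-D node coordinates (fuzzifier m = 2); this is textbook FCM, not the DCFP implementation itself.

    ```typescript
    type Point = [number, number];

    const dist = (a: Point, b: Point) => Math.hypot(a[0] - b[0], a[1] - b[1]);

    // One fuzzy C-means iteration (fuzzifier m = 2): memberships u[i][j] say how
    // strongly node j belongs to cluster i; centroids are membership-weighted means.
    function fcmStep(nodes: Point[], centroids: Point[]): { u: number[][]; centroids: Point[] } {
      const m = 2;
      // Membership update: u_ij = 1 / sum_k (d_ij / d_kj)^(2/(m-1))
      const u = centroids.map((ci) =>
        nodes.map((x) => {
          const dij = dist(x, ci) || 1e-9;
          let s = 0;
          for (const ck of centroids) s += (dij / (dist(x, ck) || 1e-9)) ** (2 / (m - 1));
          return 1 / s;
        })
      );
      // Centroid update: v_i = sum_j u_ij^m x_j / sum_j u_ij^m
      const newCentroids = centroids.map((_, i): Point => {
        let sx = 0, sy = 0, sw = 0;
        nodes.forEach((x, j) => {
          const w = u[i][j] ** m;
          sx += w * x[0]; sy += w * x[1]; sw += w;
        });
        return [sx / sw, sy / sw];
      });
      return { u, centroids: newCentroids };
    }

    // Example: ten nodes, two clusters; iterate a few steps at the base station.
    let centroids: Point[] = [[0, 0], [10, 10]];
    const nodes: Point[] = Array.from({ length: 10 }, (_, j): Point => [j, j % 3]);
    for (let it = 0; it < 5; it++) centroids = fcmStep(nodes, centroids).centroids;
    console.log(centroids);
    ```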

  11. A Decentralized Fuzzy C-Means-Based Energy-Efficient Routing Protocol for Wireless Sensor Networks

    PubMed Central

    2014-01-01

    Energy conservation in wireless sensor networks (WSNs) is a vital consideration when designing wireless networking protocols. In this paper, we propose a Decentralized Fuzzy Clustering Protocol, named DCFP, which minimizes total network energy dissipation to promote maximum network lifetime. The process of constructing the infrastructure for a given WSN is performed only once, at the beginning of the protocol, at a base station, and the resulting infrastructure remains unchanged throughout the network's lifetime. In this initial construction step, a fuzzy C-means algorithm is adopted to allocate sensor nodes to their most appropriate clusters. Subsequently, the protocol runs in rounds, where each round is divided into a CH-Election phase and a Data Transmission phase. In the CH-Election phase, new cluster heads are elected locally in each cluster, with a new multicriteria objective function proposed to enhance the quality of the elected cluster heads. In the Data Transmission phase, each sensor node senses and transmits data to its respective cluster head, and the cluster heads in turn aggregate and send the sensed data to the base station. Simulation results demonstrate that the proposed protocol improves network lifetime, data delivery, and energy consumption compared to other well-known energy-efficient protocols. PMID:25162060

  12. A New Cellular Architecture for Information Retrieval from Sensor Networks through Embedded Service and Security Protocols.

    PubMed

    Shahzad, Aamir; Landry, René; Lee, Malrey; Xiong, Naixue; Lee, Jongho; Lee, Changhoon

    2016-06-14

    Substantial changes have occurred in the Information Technology (IT) sectors, and with these changes the demand for remote access to field sensor information has increased. This allows visualization, monitoring, and control through various electronic devices, such as laptops, tablets, iPads, PCs, and cellular phones. The smart phone is considered a more reliable, faster, and more efficient device for accessing and monitoring industrial systems and their corresponding information interfaces anywhere and anytime. This study describes the deployment of a protocol whereby industrial system information can be securely accessed by cellular phones via a Supervisory Control And Data Acquisition (SCADA) server. To achieve the study goals, proprietary protocol interconnectivity with non-proprietary protocols and the usage of interconnectivity services are considered in detail. They support the visualization of the SCADA system information, and the related operations, through smart phones. The intelligent sensors are configured and designated to process real-time information via cellular phones by employing information exchange services between the proprietary protocol and non-proprietary protocols. SCADA cellular access raises the issue of security flaws. To address these challenges, a cryptography-based security method is considered and deployed, and it could be considered part of a proprietary protocol. Subsequently, transmission flows from the smart phones through a cellular network.

  13. A Secure Region-Based Geographic Routing Protocol (SRBGR) for Wireless Sensor Networks

    PubMed Central

    Adnan, Ali Idarous; Hanapi, Zurina Mohd; Othman, Mohamed; Zukarnain, Zuriati Ahmad

    2017-01-01

    Owing to their lack of dependency on routing initiation and the inadequate sextant allocated to responding messages, secure geographic routing protocols for Wireless Sensor Networks (WSNs) have attracted considerable attention. However, the existing protocols are likely to drop packets when legitimate nodes fail to respond to the routing initiation messages while attackers in the allocated sextant manage to respond. Furthermore, these protocols are designed with an inefficient collection window and inadequate verification criteria, which may lead to a high number of attacker selections. To prevent the failure to find an appropriate relay node and undesirable packet retransmission, this paper presents the Secure Region-Based Geographic Routing Protocol (SRBGR), which increases the probability of selecting an appropriate relay node. By extending the allocated sextant and applying different message contention priorities, more legitimate nodes can be admitted to the routing process. The paper also proposes a bounded collection window for sufficient collection time, and a verification cost for both attacker identification and isolation. Extensive simulation experiments have been performed to evaluate the performance of the proposed protocol in comparison with other existing protocols. The results demonstrate that SRBGR increases network performance in terms of the packet delivery ratio and isolates attacks such as Sybil and black hole. PMID:28121992
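
    The allocated-sextant idea has a simple geometric reading: a forwarding node admits responders that lie within an angular sector toward the destination, and widening that sector admits more legitimate candidates. A sketch of the sector test alone, with SRBGR's contention priorities and verification omitted:

    ```typescript
    type Pos = { x: number; y: number };

    // Angle of the vector from `from` to `to`, in radians.
    const bearing = (from: Pos, to: Pos) => Math.atan2(to.y - from.y, to.x - from.x);

    // Smallest absolute difference between two angles, in [0, PI].
    function angleDiff(a: number, b: number): number {
      const d = Math.abs(a - b) % (2 * Math.PI);
      return d > Math.PI ? 2 * Math.PI - d : d;
    }

    // A neighbor is admitted as a relay candidate if it lies within the sector of
    // width `sectorRad` centred on the line toward the destination. A classic
    // sextant is 60 degrees; widening it admits more legitimate responders.
    function inAllocatedSector(
      me: Pos, dest: Pos, neighbor: Pos, sectorRad = Math.PI / 3
    ): boolean {
      return angleDiff(bearing(me, dest), bearing(me, neighbor)) <= sectorRad / 2;
    }

    const me = { x: 0, y: 0 }, dest = { x: 100, y: 0 };
    console.log(inAllocatedSector(me, dest, { x: 10, y: 2 }));  // true: nearly in line
    console.log(inAllocatedSector(me, dest, { x: 0, y: 10 }));  // false: 90 degrees off
    ```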

  14. A Secure Region-Based Geographic Routing Protocol (SRBGR) for Wireless Sensor Networks.

    PubMed

    Adnan, Ali Idarous; Hanapi, Zurina Mohd; Othman, Mohamed; Zukarnain, Zuriati Ahmad

    2017-01-01

    Owing to their lack of dependency on routing initiation and the inadequate sextant allocated to responding messages, secure geographic routing protocols for Wireless Sensor Networks (WSNs) have attracted considerable attention. However, the existing protocols are likely to drop packets when legitimate nodes fail to respond to the routing initiation messages while attackers in the allocated sextant manage to respond. Furthermore, these protocols are designed with an inefficient collection window and inadequate verification criteria, which may lead to a high number of attacker selections. To prevent the failure to find an appropriate relay node and undesirable packet retransmission, this paper presents the Secure Region-Based Geographic Routing Protocol (SRBGR), which increases the probability of selecting an appropriate relay node. By extending the allocated sextant and applying different message contention priorities, more legitimate nodes can be admitted to the routing process. The paper also proposes a bounded collection window for sufficient collection time, and a verification cost for both attacker identification and isolation. Extensive simulation experiments have been performed to evaluate the performance of the proposed protocol in comparison with other existing protocols. The results demonstrate that SRBGR increases network performance in terms of the packet delivery ratio and isolates attacks such as Sybil and black hole.

  15. A Passive Testing Approach for Protocols in Wireless Sensor Networks

    PubMed Central

    Che, Xiaoping; Maag, Stephane; Tan, Hwee-Xian; Tan, Hwee-Pink; Zhou, Zhangbing

    2015-01-01

    Smart systems are increasingly being developed today, and the number of wireless sensor devices is growing drastically. They are deployed in many contexts throughout our environment. The data sensed and transported in such ubiquitous systems are therefore important, and the way they are carried must be efficient and reliable. For that purpose, several routing protocols have been proposed for wireless sensor networks (WSN). However, one stage that is often neglected before their deployment is the conformance testing process, a crucial and challenging step. Compared to the active testing techniques commonly used in wired networks, passive approaches are more suitable for the WSN environment. While some works propose to specify the protocol with state models or to analyze it with simulators and emulators, we propose here a logic-based approach for formally specifying some functional requirements of a novel WSN routing protocol. We provide an algorithm to evaluate these properties on collected protocol execution traces. Further, we demonstrate the efficiency and suitability of our approach by applying it to common WSN functional properties, as well as to specific ones designed for our own routing protocol. We provide relevant testing verdicts through a real indoor testbed and the implementation of our protocol. Furthermore, the flexibility, genericity and practicability of our approach are borne out by the experimental results. PMID:26610495
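
    The trace-evaluation idea can be sketched as a passive check of one safety property ("a node only forwards a packet it previously received") over a collected execution trace. The event names below are invented; the paper expresses its properties in a dedicated logic.

    ```typescript
    // A collected protocol execution trace: one event per line in a log.
    type Event =
      | { kind: "recv"; node: string; pkt: string }
      | { kind: "fwd"; node: string; pkt: string };

    // Passive check of a safety property over a trace: every "fwd" of a packet by
    // a node must be preceded by a "recv" of that packet at the same node.
    function checkForwardAfterReceive(trace: Event[]): "pass" | "fail" {
      const seen = new Set<string>(); // "node|pkt" pairs already received
      for (const e of trace) {
        const key = `${e.node}|${e.pkt}`;
        if (e.kind === "recv") seen.add(key);
        else if (!seen.has(key)) return "fail"; // forwarded without receiving
      }
      return "pass";
    }

    const trace: Event[] = [
      { kind: "recv", node: "n3", pkt: "p1" },
      { kind: "fwd", node: "n3", pkt: "p1" },
      { kind: "fwd", node: "n4", pkt: "p1" }, // n4 never received p1
    ];
    console.log(checkForwardAfterReceive(trace)); // "fail"
    ```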

  16. A Passive Testing Approach for Protocols in Wireless Sensor Networks.

    PubMed

    Che, Xiaoping; Maag, Stephane; Tan, Hwee-Xian; Tan, Hwee-Pink; Zhou, Zhangbing

    2015-11-19

    Smart systems are increasingly being developed today, and the number of wireless sensor devices is growing drastically. They are deployed in many contexts throughout our environment. The data sensed and transported in such ubiquitous systems are therefore important, and the way they are carried must be efficient and reliable. For that purpose, several routing protocols have been proposed for wireless sensor networks (WSN). However, one stage that is often neglected before their deployment is the conformance testing process, a crucial and challenging step. Compared to the active testing techniques commonly used in wired networks, passive approaches are more suitable for the WSN environment. While some works propose to specify the protocol with state models or to analyze it with simulators and emulators, we propose here a logic-based approach for formally specifying some functional requirements of a novel WSN routing protocol. We provide an algorithm to evaluate these properties on collected protocol execution traces. Further, we demonstrate the efficiency and suitability of our approach by applying it to common WSN functional properties, as well as to specific ones designed for our own routing protocol. We provide relevant testing verdicts through a real indoor testbed and the implementation of our protocol. Furthermore, the flexibility, genericity and practicability of our approach are borne out by the experimental results.

  17. SPP: A data base processor data communications protocol

    NASA Technical Reports Server (NTRS)

    Fishwick, P. A.

    1983-01-01

    The design and implementation of a data communications protocol for the Intel Data Base Processor (DBP) are described. The protocol is termed SPP (Service Port Protocol) since it enables data transfer between the host computer and the DBP service port. The protocol implementation is extensible in that it is explicitly layered and the protocol functionality is hierarchically organized. Extensive trace and performance capabilities have been supplied with the protocol software to permit optional, efficient monitoring of the data transfer between the host and the Intel data base processor. Machine independence was considered an important attribute during the design and implementation of SPP. The protocol source is fully commented and is included in Appendix A of this report.
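
    As an illustration of the framing such a layered protocol involves, the sketch below uses a hypothetical frame layout (2-byte big-endian length, 1-byte message type, then payload); the real SPP layout is the one defined in the report's appendix.

    ```typescript
    // Hypothetical frame layout: 2-byte big-endian payload length, 1-byte message
    // type, then the payload. The actual SPP layout is defined in the report.
    function encodeFrame(msgType: number, payload: Buffer): Buffer {
      const header = Buffer.alloc(3);
      header.writeUInt16BE(payload.length, 0);
      header.writeUInt8(msgType, 2);
      return Buffer.concat([header, payload]);
    }

    // Decode one frame from the front of a byte stream; returns undefined when
    // fewer bytes than a complete frame have arrived (caller keeps buffering).
    function decodeFrame(buf: Buffer):
      | { msgType: number; payload: Buffer; rest: Buffer }
      | undefined {
      if (buf.length < 3) return undefined;
      const len = buf.readUInt16BE(0);
      if (buf.length < 3 + len) return undefined;
      return {
        msgType: buf.readUInt8(2),
        payload: buf.subarray(3, 3 + len),
        rest: buf.subarray(3 + len),
      };
    }

    const frame = encodeFrame(0x01, Buffer.from("SELECT * FROM t"));
    console.log(decodeFrame(frame)?.payload.toString()); // "SELECT * FROM t"
    ```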

  18. Improving delirium care in the intensive care unit: The design of a pragmatic study

    PubMed Central

    2011-01-01

    Background Delirium prevalence in the intensive care unit (ICU) is high. Numerous psychotropic agents are used to manage delirium in the ICU with limited data regarding their efficacy or harms. Methods/Design This is a randomized controlled trial of 428 patients aged 18 and older suffering from delirium and admitted to the ICU of Wishard Memorial Hospital in Indianapolis. Subjects assigned to the intervention group will receive a multicomponent pharmacological management protocol for delirium (PMD) and those assigned to the control group will receive no change in their usual ICU care. The primary outcomes of the trial are (1) delirium severity as measured by the Delirium Rating Scale revised-98 (DRS-R-98) and (2) delirium duration as determined by the Confusion Assessment Method for the ICU (CAM-ICU). The PMD protocol targets the three neurotransmitter systems thought to be compromised in delirious patients: dopamine, acetylcholine, and gamma-aminobutyric acid. The PMD protocol will target the reduction of anticholinergic medications and benzodiazepines, and introduce a low dose of haloperidol (0.5-1 mg) for 7 days. The protocol will be delivered by a combination of computer (artificial intelligence) and pharmacist (human intelligence) decision support systems to increase adherence to the PMD protocol. Discussion The proposed study will evaluate the content and the delivery process of a multicomponent pharmacological management program for delirium in the ICU. Trial Registration ClinicalTrials.gov: NCT00842608 PMID:21645330

  19. E-novo: an automated workflow for efficient structure-based lead optimization.

    PubMed

    Pearce, Bradley C; Langley, David R; Kang, Jia; Huang, Hongwei; Kulkarni, Amit

    2009-07-01

    An automated E-Novo protocol designed as a structure-based lead optimization tool was prepared through Pipeline Pilot with existing CHARMm components in Discovery Studio. A scaffold core having 3D binding coordinates of interest is generated from a ligand-bound protein structural model. Ligands of interest are generated from the scaffold using an R-group fragmentation/enumeration tool within E-Novo, with their cores aligned. The ligand side chains are conformationally sampled and are subjected to core-constrained protein docking, using a modified CHARMm-based CDOCKER method to generate top poses along with CDOCKER energies. In the final stage of E-Novo, a physics-based binding energy scoring function ranks the top ligand CDOCKER poses using a more accurate Molecular Mechanics-Generalized Born with Surface Area method. Correlation of the calculated ligand binding energies with experimental binding affinities was used to validate protocol performance. Inhibitors of Src tyrosine kinase, CDK2 kinase, beta-secretase, factor Xa, HIV protease, and thrombin were used to test the protocol with published ligand crystal structure data within reasonably defined binding sites. In-house Respiratory Syncytial Virus inhibitor data were used as a more challenging test set, with a hand-built binding model. Least squares fits for all data sets suggested reasonable validation of the protocol within the context of observed ligand binding poses. The E-Novo protocol provides a convenient all-in-one structure-based design process for rapid assessment and scoring of lead optimization libraries.
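
    The validation step, correlating calculated binding energies with experimental affinities, amounts to an ordinary least-squares fit. A generic sketch with toy numbers, not E-Novo output:

    ```typescript
    // Ordinary least-squares fit y = a*x + b, plus the Pearson r, for validating
    // computed binding energies (x) against experimental affinities (y).
    function leastSquares(x: number[], y: number[]) {
      const n = x.length;
      const mean = (v: number[]) => v.reduce((s, t) => s + t, 0) / v.length;
      const mx = mean(x), my = mean(y);
      let sxy = 0, sxx = 0, syy = 0;
      for (let i = 0; i < n; i++) {
        sxy += (x[i] - mx) * (y[i] - my);
        sxx += (x[i] - mx) ** 2;
        syy += (y[i] - my) ** 2;
      }
      const a = sxy / sxx;
      return { slope: a, intercept: my - a * mx, r: sxy / Math.sqrt(sxx * syy) };
    }

    // Toy numbers only: computed binding energies vs. experimental affinity values.
    const computed = [-42.1, -38.5, -35.0, -30.2];
    const experimental = [8.9, 8.1, 7.4, 6.2];
    console.log(leastSquares(computed, experimental));
    ```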

  20. The Design and Validation of an Early Childhood STEM Classroom Observational Protocol

    ERIC Educational Resources Information Center

    Milford, Todd; Tippett, Christine

    2015-01-01

    Across K-12 education, there has been recent attention to the learning opportunities available to students in science, technology, engineering, and mathematics (STEM). Early childhood education (ECE) has been excluded from this process. The scholarly literature contains good evidence for including science teaching and learning at the ECE…

  1. From Networked Learning to Operational Practice: Constructing and Transferring Superintendent Knowledge in a Regional Instructional Rounds Network

    ERIC Educational Resources Information Center

    Travis, Timothy J.

    2015-01-01

    Instructional rounds are an emerging network structure with processes and protocols designed to develop superintendents' knowledge and skills in leading large-scale improvement, to enable superintendents to build an infrastructure that supports the work of improvement, to assist superintendents in distributing leadership throughout their district,…

  2. Clinical guideline representation in a CDS: a human information processing method.

    PubMed

    Kilsdonk, Ellen; Riezebos, Rinke; Kremer, Leontien; Peute, Linda; Jaspers, Monique

    2012-01-01

    The Dutch Childhood Oncology Group (DCOG) has developed evidence-based guidelines for screening childhood cancer survivors for possible late complications of treatment. These paper-based guidelines appeared not to suit clinicians' information retrieval strategies; it was thus decided to communicate the guidelines through a Computerized Decision Support (CDS) tool. To ensure high usability of this tool, an analysis of clinicians' cognitive strategies in retrieving information from the paper-based guidelines was used as a requirements elicitation method. An information processing model was developed through an analysis of think-aloud protocols and used as input for the design of the CDS user interface. Usability analysis of the user interface showed that the navigational structure of the CDS tool fitted well with the mental strategies clinicians employ in deciding on survivor screening protocols. Clinicians were more efficient and more complete in deciding on patient-tailored screening procedures when supported by the CDS tool than by the paper-based guideline booklet. The think-aloud method provided detailed insight into users' clinical work patterns that supported the design of a highly usable CDS system.

  3. Built to last? The sustainability of health system improvements, interventions and change strategies: a study protocol for a systematic review.

    PubMed

    Braithwaite, Jeffrey; Testa, Luke; Lamprell, Gina; Herkes, Jessica; Ludlow, Kristiana; McPherson, Elise; Campbell, Margie; Holt, Joanna

    2017-11-12

    The sustainability of healthcare interventions and change programmes is of increasing importance to researchers and healthcare stakeholders interested in creating sustainable health systems to cope with mounting stressors. The aim of this protocol is to extend earlier work and describe a systematic review to identify, synthesise and draw meaning from studies published within the last 5 years that measure the sustainability of interventions, improvement efforts and change strategies in the health system. The protocol outlines a method by which to execute a rigorous systematic review. The design includes applying primary and secondary data collection techniques, consisting of a comprehensive database search complemented by contact with experts, and searching secondary databases and reference lists, using snowballing techniques. The review and analysis process will occur via an abstract review followed by a full-text screening process. The inclusion criteria include English-language, peer-reviewed, primary, empirical research articles published after 2011 in scholarly journals, for which the full text is available. No restrictions on location will be applied. The review that results from this protocol will synthesise and compare characteristics of the included studies. Ultimately, it is intended that this will help make it easier to identify and design sustainable interventions, improvement efforts and change strategies. As no primary data were collected, ethical approval was not required. Results will be disseminated in conference presentations, peer-reviewed publications and among policymaker bodies interested in creating sustainable health systems. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  4. Variability in Criteria for Emergency Medical Services Routing of Acute Stroke Patients to Designated Stroke Center Hospitals.

    PubMed

    Dimitrov, Nikolay; Koenig, William; Bosson, Nichole; Song, Sarah; Saver, Jeffrey L; Mack, William J; Sanossian, Nerses

    2015-09-01

    Comprehensive stroke systems of care include routing to the nearest designated stroke center hospital, bypassing non-designated hospitals. Routing protocols are implemented at the state or county level and vary in qualification criteria and determination of destination hospital. We surveyed all counties in the state of California for presence and characteristics of their prehospital stroke routing protocols. Each county's local emergency medical services agency (LEMSA) was queried for the presence of a stroke routing protocol. We reviewed these protocols for method of stroke identification and criteria for patient transport to a stroke center. Thirty-three LEMSAs serve 58 counties in California with populations ranging from 1,175 to nearly 10 million. Fifteen LEMSAs (45%) had stroke routing protocols, covering 23 counties (40%) and 68% of the state population. Counties with protocols had higher population density (1,500 vs. 140 persons per square mile). In the six counties without designated stroke centers, patients meeting criteria were transported out of county. Stroke identification in the field was achieved using the Cincinnati Prehospital Stroke Screen in 72%, Los Angeles Prehospital Stroke Screen in 7% and a county-specific protocol in 22%. California EMS prehospital acute stroke routing protocols cover 68% of the state population and vary in characteristics including activation by symptom onset time and destination facility features, reflecting matching of system design to local geographic resources.

  5. Motivational Interviewing and Medication Review in Coronary Heart Disease (MIMeRiC): Intervention Development and Protocol for the Process Evaluation.

    PubMed

    Östbring, Malin Johansson; Eriksson, Tommy; Petersson, Göran; Hellström, Lina

    2018-01-30

    Trials of complex interventions are often criticized for being difficult to interpret because the effects of apparently similar interventions vary across studies dependent on context, targeted groups, and the delivery of the intervention. The Motivational Interviewing and Medication Review in Coronary heart disease (MIMeRiC) trial is a randomized controlled trial (RCT) of an intervention aimed at improving pharmacological secondary prevention. Guidelines for the development and evaluation of complex interventions have recently highlighted the need for better reporting of the development of interventions, including descriptions of how the intervention is assumed to work, how this theory informed the process evaluation, and how the process evaluation relates to the outcome evaluation. This paper aims to describe how the intervention was designed and developed. The aim of the process evaluation is to better understand how and why the intervention in the MIMeRiC trial was effective or not effective. The research questions for evaluating the process are based on the conceptual model of change processes assumed in the intervention and will be analyzed by qualitative and quantitative methods. Quantitative data are used to evaluate the medication review in terms of drug-related problems, to describe how patients' beliefs about medicines are affected by the intervention, and to evaluate the quality of motivational interviewing. Qualitative data will be used to analyze whether patients experienced the intervention as intended, how cardiologists experienced the collaboration and intervention, and how the intervention affected patients' overall experience of care after coronary heart disease. The development and piloting of the intervention are described in relation to the theoretical framework. Data for the process evaluation will be collected until March 2018. Some process evaluation questions will be analyzed before, and others will be analyzed after the outcomes of the MIMeRiC RCT are known. This paper describes the framework for the design of the intervention tested in the MIMeRiC trial, development of the intervention from the pilot stage to the complete trial intervention, and the framework and methods for the process evaluation. Providing the protocol of the process evaluation allows prespecification of the processes that will be evaluated, because we hypothesize that they will determine the outcomes of the MIMeRiC trial. This protocol also constitutes a contribution to the new field of process evaluations as made explicit in health services research and clinical trials of complex interventions. ©Malin Johansson Östbring, Tommy Eriksson, Göran Petersson, Lina Hellström. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 30.01.2018.

  6. Roadmap for the protection of disaster research participants: findings from the World Trade Center Evacuation Study.

    PubMed

    Qureshi, Kristine A; Gershon, Robyn R M; Smailes, Elizabeth; Raveis, Victoria H; Murphy, Bridgette; Matzner, Frederick; Fleischman, Alan R

    2007-01-01

    This report addresses the development, implementation, and evaluation of a protocol designed to protect participants from inadvertent emotional harm or further emotional trauma due to their participation in the World Trade Center Evacuation (WTCE) Study research project. This project was designed to identify the individual, organizational, and structural (environmental) factors associated with evacuation from the World Trade Center Towers 1 and 2 on 11 September 2001. Following published recommended practices for protecting potentially vulnerable disaster research participants, protective strategies and quality assurance processes were implemented and evaluated, including an assessment of the impact of participation on study subjects enrolled in the qualitative phase of the WTCE Study. The implementation of a protocol designed to protect disaster study participants from further emotional trauma was feasible and effective in minimizing risk and monitoring for psychological injury associated with study participation. Details about this successful strategy provide a roadmap that can be applied in other post-disaster research investigations.

  7. Experimental Optimal Single Qubit Purification in an NMR Quantum Information Processor

    PubMed Central

    Hou, Shi-Yao; Sheng, Yu-Bo; Feng, Guan-Ru; Long, Gui-Lu

    2014-01-01

    High-quality single qubits are the building blocks of quantum information processing. But they are vulnerable to environmental noise. To overcome noise, purification techniques, which generate qubits with higher purity from qubits with lower purity, have been proposed. Purification has attracted much interest and been widely studied. However, the full experimental demonstration of the optimal single-qubit purification protocol proposed by Cirac, Ekert and Macchiavello [Phys. Rev. Lett. 82, 4344 (1999); the CEM protocol] more than a decade and a half ago still remains an experimental challenge, as it requires more complicated networks and a higher level of precision control. In this work, we design an experimental scheme that realizes the CEM protocol with explicit symmetrization of the wave functions. The purification scheme was successfully implemented in a nuclear magnetic resonance quantum information processor. The experiment fully demonstrated the purification protocol, and showed that it is an effective way of protecting qubits against errors and decoherence. PMID:25358758

  8. Run-length encoding graphic rules, biochemically editable designs and steganographical numeric data embedment for DNA-based cryptographical coding system

    PubMed Central

    Kawano, Tomonori

    2013-01-01

    There have been a wide variety of approaches to handling pieces of DNA as "unplugged" tools for digital information storage and processing, including a series of studies in security-related areas such as DNA-based digital barcodes, watermarks and cryptography. In the present article, novel designs of artificial genes are proposed as media for storing digitally compressed image data for bio-computing purposes, whereas natural genes principally encode proteins. Furthermore, the proposed system allows the cryptographic application of DNA through biochemically editable designs with capacity for the steganographic embedment of numeric data. As a model application of the image-coding DNA technique, combined numerical and biochemical protocols are employed to encipher given "passwords" and/or secret numbers using DNA sequences. The "passwords" of interest are decomposed into single letters and translated into font images coded on separate DNA chains. Each chain comprises coding regions, in which the images are encoded according to the novel run-length encoding rule, and non-coding regions designed for the biochemical editing and remodeling processes that reveal the hidden orientation of the letters composing the original "passwords." The latter processes require molecular biological tools for the digestion and ligation of the fragmented DNA molecules, targeting the polymerase chain reaction-engineered termini of the chains. Lastly, additional protocols for the steganographic overwriting of numeric data of interest onto the image-coding DNA are also discussed. PMID:23750303
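
    The run-length idea can be sketched generically: each row of a 1-bit glyph image compresses to alternating run lengths, which are then spelled out in DNA bases. The base-4 mapping and the separator below are invented for illustration; the paper defines its own run-length encoding graphic rule.

    ```typescript
    // Run-length encode one row of a 1-bit glyph image: runs alternate starting
    // with the count of leading 0s (so "0011100" -> [2, 3, 2]).
    function rle(row: (0 | 1)[]): number[] {
      const runs: number[] = [];
      let cur: 0 | 1 = 0, count = 0;
      for (const bit of row) {
        if (bit === cur) count++;
        else { runs.push(count); cur = bit; count = 1; }
      }
      runs.push(count);
      return runs;
    }

    // Spell each run length in base 4 with an invented base alphabet A,C,G,T, and
    // join runs with an illustrative separator. Purely for illustration; this
    // mapping is not the paper's encoding rule.
    const BASES = ["A", "C", "G", "T"] as const;
    function runsToDna(runs: number[]): string {
      return runs
        .map((n) => n.toString(4).split("").map((d) => BASES[Number(d)]).join(""))
        .join("AA"); // "AA" as an illustrative run separator
    }

    const row: (0 | 1)[] = [0, 0, 1, 1, 1, 0, 0];
    console.log(rle(row));            // [2, 3, 2]
    console.log(runsToDna(rle(row))); // "GAATAAG"
    ```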

  9. Development of an Ecological Momentary Assessment Mobile App for a Low-Literacy, Mexican American Population to Collect Disordered Eating Behaviors

    PubMed Central

    Stein, Karen F; Chaudry, Beenish; Trabold, Nicole

    2016-01-01

    Background Ecological momentary assessment (EMA) is a popular method for understanding population health in which participants report their experiences while in naturally occurring contexts in order to increase the reliability and ecological validity of the collected data (as compared to retrospective recall). EMA studies, however, have relied primarily on text-based questionnaires, effectively eliminating low-literacy populations from the samples. Objective To provide a case study of the design of an EMA mobile app for a low-literacy population. In particular, we present the design process and final design of an EMA mobile app for low-literate, Mexican American women to record unhealthy eating and weight control behaviors (UEWCBs). Methods An iterative, user-centered design process was employed to develop the mobile app. An existing EMA protocol to measure UEWCBs in college-enrolled Mexican American women was used as the starting point for the application. The app utilizes an icon interface, with optional audio prompts, that is culturally sensitive and usable by a low-literacy population. A total of 41 women participated over the course of 4 phases of the design process, which included 2 interview and task-based phases (n=8, n=11), focus groups (n=15), and a 5-day, in situ deployment (n=7). Results Participants’ mental models of UEWCBs differed substantially from prevailing definitions found in the literature, prompting a major reorganization of the app interface. Differences in health literacy and numeracy were better identified with the Newest Vital Sign tool, as compared with the Short Assessment of Health Literacy tool. Participants had difficulty imagining scenarios in the interviews to practice recording a specific UEWCB; instead, usability was best tested in situ. Participants were able to use the EMA mobile app over the course of 5 days to record UEWCBs. Conclusions Results suggest that the iterative, user-centered design process was essential for making the app usable by the target population. Simply taking the protocol designed for a higher-literacy population and replacing words with icons and/or audio would have been unsuccessful with this population. PMID:27418020

  10. Cognitive Protocol Stack Design

    DTIC Science & Technology

    2015-12-30

    Final report for the ARO "Cognitive Protocol Stack Design" project, in which cognitive networking solutions were proposed and published in international venues, addressing areas related to cognitive networking and opening new lines of research that could not be forecast at the beginning of the project.

  11. Flaws in current human training protocols for spontaneous Brain-Computer Interfaces: lessons learned from instructional design

    PubMed Central

    Lotte, Fabien; Larrue, Florian; Mühl, Christian

    2013-01-01

    While recent research on Brain-Computer Interfaces (BCI) has highlighted their potential for many applications, they remain barely used outside laboratories. The main reason is their lack of robustness. Indeed, with current BCIs, mental state recognition is usually slow and often incorrect. Spontaneous BCIs (i.e., mental imagery-based BCIs) often rely on mutual learning efforts by the user and the machine, with BCI users learning to produce stable ElectroEncephaloGraphy (EEG) patterns (spontaneous BCI control being widely acknowledged as a skill) while the computer learns to automatically recognize these EEG patterns, using signal processing. Most research so far has focused on signal processing, largely neglecting the human in the loop. However, how well the user masters the BCI skill is also a key element explaining BCI robustness. Indeed, if the user is not able to produce stable and distinct EEG patterns, then no signal processing algorithm will be able to recognize them. Unfortunately, despite the importance of BCI training protocols, they have scarcely been studied so far, and have been used mostly unchanged for years. In this paper, we argue that current human training approaches for spontaneous BCI are most likely inappropriate. We notably study the instructional design literature in order to identify the key requirements and guidelines for a successful training procedure that promotes good and efficient skill learning. This literature study highlights that current spontaneous BCI user training procedures satisfy very few of these requirements and are hence likely to be suboptimal. We therefore identify the flaws in BCI training protocols according to instructional design principles, at several levels: in the instructions provided to the user, in the tasks he/she has to perform, and in the feedback provided. For each level, we propose new research directions that are theoretically expected to address some of these flaws and to help users learn the BCI skill more efficiently. PMID:24062669

  12. Development and evaluation of a study design typology for human research.

    PubMed

    Carini, Simona; Pollock, Brad H; Lehmann, Harold P; Bakken, Suzanne; Barbour, Edward M; Gabriel, Davera; Hagler, Herbert K; Harper, Caryn R; Mollah, Shamim A; Nahm, Meredith; Nguyen, Hien H; Scheuermann, Richard H; Sim, Ida

    2009-11-14

    A systematic classification of study designs would be useful for researchers, systematic reviewers, readers, and research administrators, among others. As part of the Human Studies Database Project, we developed the Study Design Typology to standardize the classification of study designs in human research. We then performed a multiple observer masked evaluation of active research protocols in four institutions according to a standardized protocol. Thirty-five protocols were classified by three reviewers each into one of nine high-level study designs for interventional and observational research (e.g., N-of-1, Parallel Group, Case Crossover). Rater classification agreement was moderately high for the 35 protocols (Fleiss' kappa = 0.442) and higher still for the 23 quantitative studies (Fleiss' kappa = 0.463). We conclude that our typology shows initial promise for reliably distinguishing study design types for quantitative human research.
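
    The agreement statistic reported here, Fleiss' kappa for a fixed number of raters per subject, can be computed as in the sketch below (toy counts, not the study's data):

    ```typescript
    // Fleiss' kappa: counts[i][j] = number of raters assigning subject i to
    // category j; every subject must be rated by the same number of raters n.
    function fleissKappa(counts: number[][]): number {
      const N = counts.length;                        // subjects
      const k = counts[0].length;                     // categories
      const n = counts[0].reduce((s, c) => s + c, 0); // raters per subject

      // Per-subject agreement P_i = (sum_j n_ij^2 - n) / (n (n - 1))
      const pBar =
        counts.reduce((acc, row) => {
          const sumSq = row.reduce((s, c) => s + c * c, 0);
          return acc + (sumSq - n) / (n * (n - 1));
        }, 0) / N;

      // Chance agreement from the marginal category proportions p_j
      let pe = 0;
      for (let j = 0; j < k; j++) {
        const pj = counts.reduce((s, row) => s + row[j], 0) / (N * n);
        pe += pj * pj;
      }
      return (pBar - pe) / (1 - pe);
    }

    // Three raters; three categories here for brevity (the study used nine
    // high-level design types). Toy data only.
    const counts = [
      [3, 0, 0], [2, 1, 0], [0, 3, 0], [1, 1, 1], [0, 0, 3],
    ];
    console.log(fleissKappa(counts).toFixed(3)); // ≈ 0.493 on this toy data
    ```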

  13. Oncosurgical Management of Liver Limited Stage IV Colorectal Cancer: Preliminary Data and Protocol for a Randomized Controlled Trial.

    PubMed

    Sutton, Paul; Vimalachandran, Dale; Poston, Graeme; Fenwick, Stephen; Malik, Hassan

    2018-05-09

    Colorectal cancer is the fourth commonest cancer and the second commonest cause of cancer-related death in the United Kingdom. Almost 15% of patients have metastases on presentation. An increasing number of surgical strategies and better neoadjuvant treatment options mean that more patients are undergoing resection of liver metastases, with prolonged survival in a select group of patients who present with synchronous disease. The optimal strategy for the management of these patients nevertheless remains unclear, and there is a complete absence of Level 1 evidence in the literature. The objective of this study is to undertake preliminary work and devise an outline trial protocol to inform the future development of clinical studies investigating the management of patients with liver-limited stage IV colorectal cancer. We have undertaken this preliminary work and begun the process of designing a randomized controlled trial, and we present a draft trial protocol here. This study is at the protocol development stage only, and as such no results are available. There is no funding in place for this study, and no anticipated start date. We have presented preliminary work and an outline trial protocol which we anticipate will inform the future development of clinical studies investigating the management of patients with liver-limited stage IV colorectal cancer. We do not believe that the trial we have designed will answer the most significant clinical questions, nor that it is feasible to deliver within the United Kingdom's National Health Service at the current time. ©Paul Sutton, Dale Vimalachandran, Graeme Poston, Stephen Fenwick, Hassan Malik. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 09.05.2018.

  14. Model Checking Failed Conjectures in Theorem Proving: A Case Study

    NASA Technical Reports Server (NTRS)

    Pike, Lee; Miner, Paul; Torres-Pomales, Wilfredo

    2004-01-01

    Interactive mechanical theorem proving can provide high assurance of correct design, but it can also be a slow iterative process. Much time is spent determining why a proof of a conjecture is not forthcoming. In some cases, the conjecture is false and in others, the attempted proof is insufficient. In this case study, we use the SAL family of model checkers to generate a concrete counterexample to an unproven conjecture specified in the mechanical theorem prover, PVS. The focus of our case study is the ROBUS Interactive Consistency Protocol. We combine the use of a mechanical theorem prover and a model checker to expose a subtle flaw in the protocol that occurs under a particular scenario of faults and processor states. Uncovering the flaw allows us to mend the protocol and complete its general verification in PVS.
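
    In spirit, what the model checker contributes is an exhaustive search of the reachable state space for a state violating the conjecture, returned as a concrete trace. A generic explicit-state sketch on a toy model follows; it is not the ROBUS model or SAL itself.

    ```typescript
    // Generic explicit-state reachability check: search the state graph
    // breadth-first for a state violating an invariant, returning the path to it
    // as a concrete counterexample (what a model checker hands back to the prover).
    function findCounterexample<S>(
      initial: S,
      next: (s: S) => S[],
      invariant: (s: S) => boolean,
      key: (s: S) => string
    ): S[] | undefined {
      const parent = new Map<string, S | null>([[key(initial), null]]);
      const queue: S[] = [initial];
      while (queue.length > 0) {
        const s = queue.shift()!;
        if (!invariant(s)) {
          // Reconstruct the concrete trace from the initial state to the violation.
          const path: S[] = [s];
          let cur = parent.get(key(s))!;
          while (cur !== null) { path.unshift(cur); cur = parent.get(key(cur))!; }
          return path;
        }
        for (const t of next(s)) {
          if (!parent.has(key(t))) { parent.set(key(t), s); queue.push(t); }
        }
      }
      return undefined; // the invariant holds in every reachable state
    }

    // Toy model: a counter that a faulty transition can push past its bound.
    const trace = findCounterexample(
      0,
      (s) => (s < 3 ? [s + 1, s + 2] : []),
      (s) => s <= 3, // conjecture: the counter never exceeds 3
      (s) => String(s)
    );
    console.log(trace); // [0, 2, 4]: a concrete run driving the counter past its bound
    ```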

  15. Returning Individual Research Results: Development of a Cancer Genetics Education and Risk Communication Protocol

    PubMed Central

    Roberts, J. Scott; Shalowitz, David I.; Christensen, Kurt D.; Everett, Jessica N.; Kim, Scott Y. H.; Raskin, Leon; Gruber, Stephen B.

    2011-01-01

    The obligations of researchers to disclose clinically and/or personally significant individual research results are highly debated, but few empirical studies have addressed this topic. We describe the development of a protocol for returning research results to participants at one site of a multicenter study of the genetic epidemiology of melanoma. Protocol development involved numerous challenges: (1) deciding whether genotype results merited disclosure; (2) achieving an appropriate format for communicating results; (3) developing education materials; (4) deciding whether to retest samples for additional laboratory validation; (5) identifying and notifying selected participants; and (6) assessing the impact of disclosure. Our experience suggests potential obstacles depending on researcher resources and the design of the parent study, but offers a process by which researchers can responsibly return individual study results and evaluate the impact of disclosure. PMID:20831418

  16. Atomic entanglement purification and concentration using coherent state input-output process in low-Q cavity QED regime.

    PubMed

    Cao, Cong; Wang, Chuan; He, Ling-Yan; Zhang, Ru

    2013-02-25

    We investigate an atomic entanglement purification protocol based on the coherent state input-output process, working with low-Q cavities in the atom-cavity intermediate coupling region. The information of the entangled states is encoded in three-level single atoms confined in separate one-sided optical micro-cavities. Using the coherent state input-output process, we design a two-qubit parity check module (PCM), which allows quantum nondemolition measurement of the atomic qubits, and show its use by remote parties to distill a high-fidelity atomic entangled ensemble from an initial mixed-state ensemble nonlocally. The proposed scheme can further be used for entanglement concentration of unknown atomic states. Also by exploiting the PCM, we describe a modified scheme for atomic entanglement concentration that introduces ancillary single atoms. As the coherent state input-output process is robust and scalable in realistic applications, and the detection in the PCM is based on the intensity of the outgoing coherent state, the present protocols may be widely used in large-scale, solid-state-based quantum repeaters and quantum information processing.
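
    The following numpy sketch illustrates, in the abstract, what a two-qubit parity check module accomplishes: it projects a state onto the even- or odd-parity subspace without revealing the individual qubit values. It models only the measurement postulate, not the cavity-QED physics of the paper.

```python
import numpy as np

# Basis order: |00>, |01>, |10>, |11>
P_even = np.diag([1, 0, 0, 1]).astype(float)   # spans |00>, |11>
P_odd  = np.diag([0, 1, 1, 0]).astype(float)   # spans |01>, |10>

def parity_check(state):
    """Measure two-qubit parity; return outcome and post-measurement state."""
    p_even = (state.conj() @ P_even @ state).real
    outcome = "even" if np.random.rand() < p_even else "odd"
    proj = P_even if outcome == "even" else P_odd
    post = proj @ state
    return outcome, post / np.linalg.norm(post)

# Example: a mixed-parity superposition (|00> + |01>)/sqrt(2)
state = np.array([1, 1, 0, 0], dtype=complex) / np.sqrt(2)
outcome, post = parity_check(state)
print(outcome, np.round(post, 3))
```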

  17. Building distributed rule-based systems using the AI Bus

    NASA Technical Reports Server (NTRS)

    Schultz, Roger D.; Stobie, Iain C.

    1990-01-01

    The AI Bus software architecture was designed to support the construction of large-scale, production-quality applications in areas of high technology flux, running in heterogeneous distributed environments and utilizing a mix of knowledge-based and conventional components. These goals led to its current development as a layered, object-oriented library for cooperative systems. This paper describes the concepts and design of the AI Bus and its implementation status as a library of reusable and customizable objects, structured by layers from operating system interfaces up to high-level knowledge-based agents. Each agent is a semi-autonomous process with specialized expertise, and consists of a number of knowledge sources (a knowledge base and inference engine). Inter-agent communication mechanisms are based on blackboards and Actors-style acquaintances. As a conservative first implementation, we used C++ on top of Unix, and wrapped an embedded CLIPS with methods for the knowledge source class. This involved designing standard protocols for communication, and functions which use these protocols in rules. Embedding several CLIPS objects within a single process was an unexpected problem because of global variables; the solution involved constructing and recompiling a C++ version of CLIPS. We are currently working on a more radical approach to incorporating CLIPS, by separating out its pattern matcher, rule and fact representations, and other components as true object-oriented modules.
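
    A minimal sketch of the blackboard-style coordination described above: semi-autonomous agents, each wrapping one specialized knowledge source, read from and post to a shared blackboard. The class and method names are illustrative stand-ins, not the AI Bus API.

```python
class Blackboard:
    """Shared workspace agents read from and post results to."""
    def __init__(self):
        self.entries = []

    def post(self, topic, data):
        self.entries.append((topic, data))

    def read(self, topic):
        return [d for t, d in self.entries if t == topic]

class Agent:
    """An agent with one specialized knowledge source (condition -> action)."""
    def __init__(self, name, watches, rule):
        self.name, self.watches, self.rule = name, watches, rule

    def step(self, board):
        for datum in board.read(self.watches):
            result = self.rule(datum)
            if result is not None:
                board.post(self.name, result)   # results posted under agent name

board = Blackboard()
board.post("sensor", 42)
# One agent converts raw sensor readings; another raises alerts on them.
agents = [
    Agent("converter", "sensor", lambda v: v * 1.8 + 32),
    Agent("alarm", "converter", lambda v: "ALERT" if v > 100 else None),
]
for agent in agents:
    agent.step(board)
print(board.entries)   # sensor reading, converted value, then the alert
```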

  18. Feasibility and Efficacy of Nurse-Driven Acute Stroke Care.

    PubMed

    Mainali, Shraddha; Stutzman, Sonja; Sengupta, Samarpita; Dirickson, Amanda; Riise, Laura; Jones, Donald; Yang, Julian; Olson, DaiWai M

    2017-05-01

    Acute stroke care requires rapid assessment and intervention. Replacing traditional sequential algorithms in stroke care with parallel processing using telestroke consultation could be useful in the management of acute stroke patients. The purpose of this study was to assess the feasibility of a nurse-driven acute stroke protocol using a parallel processing model. This is a prospective, nonrandomized feasibility study of a quality improvement initiative. Stroke team members had a 1-month training phase, after which the protocol was implemented for 6 months and data were collected on a "run-sheet." The primary outcome of this study was to determine whether a nurse-driven acute stroke protocol is feasible and assists in decreasing door-to-needle (intravenous tissue plasminogen activator [IV-tPA]) times. Of the 153 stroke patients seen during the protocol implementation phase, 57 were designated as "level 1" (symptom onset <4.5 hours) strokes requiring acute stroke management. Among these strokes, 78% were nurse-driven, and 75% of the telestroke encounters were also nurse-driven. The average door-to-computerized-tomography time was significantly reduced in nurse-driven codes (38.9 minutes versus 24.4 minutes; P < .04). The use of a nurse-driven protocol is feasible and effective. When used in conjunction with a telestroke specialist, it may be of value in improving patient outcomes by decreasing door-to-decision time for IV-tPA. Copyright © 2017 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  19. Protocol-based care: the standardisation of decision-making?

    PubMed

    Rycroft-Malone, Jo; Fontenla, Marina; Seers, Kate; Bick, Debra

    2009-05-01

    To explore how protocol-based care affects clinical decision-making. In the context of evidence-based practice, protocol-based care is a mechanism for facilitating the standardisation of care and streamlining decision-making through rationalising the information with which to make judgements and ultimately decisions. However, whether protocol-based care does, in the reality of practice, standardise decision-making is unknown. This paper reports on a study that explored the impact of protocol-based care on nurses' decision-making. Theoretically informed by realistic evaluation and the promoting action on research implementation in health services framework, a case study design using ethnographic methods was used. Two sites were purposively sampled: a diabetic and endocrine unit and a cardiac medical unit. Within each site, data collection included observation, postobservation semi-structured interviews with staff and patients, field notes, feedback sessions and document review. Data were inductively and thematically analysed. Decisions made by nurses in both sites varied according to many different and interacting factors. While several standardised care approaches were available for use, in reality, a variety of information sources informed decision-making. The primary approach to knowledge exchange and acquisition was person-to-person; decision-making was a social activity. Rarely were standardised care approaches obviously referred to; nurses described following a mental flowchart, not necessarily linked to a particular guideline or protocol. When standardised care approaches were used, it was reported that they were used flexibly and particularised. While the logic of protocol-based care is algorithmic, in the reality of clinical practice, other sources of information supported nurses' decision-making process. This has significant implications for the political goal of standardisation. The successful implementation and judicious use of tools such as protocols and guidelines will likely be dependent on approaches that facilitate the development of nurses' decision-making processes in parallel to paying attention to the influence of context.

  20. PAT: an intelligent authoring tool for facilitating clinical trial design.

    PubMed

    Tagaris, Anastasios; Andronikou, Vassiliki; Karanastasis, Efstathios; Chondrogiannis, Efthymios; Tsirmpas, Charalambos; Varvarigou, Theodora; Koutsouris, Dimitris

    2014-01-01

    Although great investments are made by both private and public funds and a wealth of research findings is published, the research and development pipeline faces quite low productivity and tremendous delays. In this paper, we present a novel authoring tool which has been designed and developed for facilitating study design. Its underlying models are based on a thorough analysis by domain experts of existing clinical trial protocols (CTPs) and eligibility criteria (EC) published in clinicaltrials.gov. Moreover, its integration with intelligent decision support services and mechanisms linking the study design process with healthcare patient data, as well as its direct access to literature, designate it as a powerful tool offering great support to researchers during clinical trial design.

  1. Fast and accurate semantic annotation of bioassays exploiting a hybrid of machine learning and user confirmation.

    PubMed

    Clark, Alex M; Bunin, Barry A; Litterman, Nadia K; Schürer, Stephan C; Visser, Ubbo

    2014-01-01

    Bioinformatics and computer aided drug design rely on the curation of a large number of protocols for biological assays that measure the ability of potential drugs to achieve a therapeutic effect. These assay protocols are generally published by scientists in the form of plain text, which needs to be more precisely annotated in order to be useful to software methods. We have developed a pragmatic approach to describing assays according to the semantic definitions of the BioAssay Ontology (BAO) project, using a hybrid of machine learning based on natural language processing, and a simplified user interface designed to help scientists curate their data with minimum effort. We have carried out this work based on the premise that pure machine learning is insufficiently accurate, and that expecting scientists to find the time to annotate their protocols manually is unrealistic. By combining these approaches, we have created an effective prototype for which annotation of bioassay text within the domain of the training set can be accomplished very quickly. Well-trained annotations require single-click user approval, while annotations from outside the training set domain can be identified using the search feature of a well-designed user interface, and subsequently used to improve the underlying models. By drastically reducing the time required for scientists to annotate their assays, we can realistically advocate for semantic annotation to become a standard part of the publication process. Once even a small proportion of the public body of bioassay data is marked up, bioinformatics researchers can begin to construct sophisticated and useful searching and analysis algorithms that will provide a diverse and powerful set of tools for drug discovery researchers.
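
    As a rough illustration of the hybrid approach, the sketch below trains a small text classifier that proposes an annotation for an assay description and routes low-confidence cases to manual curation. The labels and training snippets are invented placeholders, not BAO terms, and the model is far simpler than the one described in the paper.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training examples standing in for curated assay protocol text.
train_texts = [
    "luciferase reporter assay measuring transcriptional activity",
    "fluorescence polarization binding assay against kinase target",
    "cell viability assay using ATP luminescence readout",
]
train_labels = ["reporter gene assay", "binding assay", "viability assay"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(train_texts, train_labels)

def suggest_and_confirm(protocol_text, threshold=0.5):
    """Propose the top annotation; fall back to manual curation when unsure."""
    probs = model.predict_proba([protocol_text])[0]
    best = probs.argmax()
    label, confidence = model.classes_[best], probs[best]
    if confidence >= threshold:
        return label, "one-click approval suggested"
    return label, "low confidence: route to search/manual curation"

print(suggest_and_confirm("ATP-based luminescent readout of cell viability"))
```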

  2. Fast and accurate semantic annotation of bioassays exploiting a hybrid of machine learning and user confirmation

    PubMed Central

    Bunin, Barry A.; Litterman, Nadia K.; Schürer, Stephan C.; Visser, Ubbo

    2014-01-01

    Bioinformatics and computer aided drug design rely on the curation of a large number of protocols for biological assays that measure the ability of potential drugs to achieve a therapeutic effect. These assay protocols are generally published by scientists in the form of plain text, which needs to be more precisely annotated in order to be useful to software methods. We have developed a pragmatic approach to describing assays according to the semantic definitions of the BioAssay Ontology (BAO) project, using a hybrid of machine learning based on natural language processing, and a simplified user interface designed to help scientists curate their data with minimum effort. We have carried out this work based on the premise that pure machine learning is insufficiently accurate, and that expecting scientists to find the time to annotate their protocols manually is unrealistic. By combining these approaches, we have created an effective prototype for which annotation of bioassay text within the domain of the training set can be accomplished very quickly. Well-trained annotations require single-click user approval, while annotations from outside the training set domain can be identified using the search feature of a well-designed user interface, and subsequently used to improve the underlying models. By drastically reducing the time required for scientists to annotate their assays, we can realistically advocate for semantic annotation to become a standard part of the publication process. Once even a small proportion of the public body of bioassay data is marked up, bioinformatics researchers can begin to construct sophisticated and useful searching and analysis algorithms that will provide a diverse and powerful set of tools for drug discovery researchers. PMID:25165633

  3. What the drivers do and do not tell you: using verbal protocol analysis to investigate driver behaviour in emergency situations.

    PubMed

    Banks, Victoria A; Stanton, Neville A; Harvey, Catherine

    2014-01-01

    Although task analysis of pedestrian detection can provide us with useful insights into how a driver may behave in emergency situations, the cognitive elements of driver decision-making are less well understood. To assist in the design of future Advanced Driver Assistance Systems, such as Autonomous Emergency Brake systems, it is essential that the cognitive elements of the driving task are better understood. This paper uses verbal protocol analysis in an exploratory fashion to uncover the thought processes underlying behavioural outcomes represented by hard data collected using the Southampton University Driving Simulator.

  4. The NAIMS cooperative pilot project: Design, implementation and future directions.

    PubMed

    Oh, Jiwon; Bakshi, Rohit; Calabresi, Peter A; Crainiceanu, Ciprian; Henry, Roland G; Nair, Govind; Papinutto, Nico; Constable, R Todd; Reich, Daniel S; Pelletier, Daniel; Rooney, William; Schwartz, Daniel; Tagge, Ian; Shinohara, Russell T; Simon, Jack H; Sicotte, Nancy L

    2017-10-01

    The North American Imaging in Multiple Sclerosis (NAIMS) Cooperative represents a network of 27 academic centers focused on accelerating the pace of magnetic resonance imaging (MRI) research in multiple sclerosis (MS) through idea exchange and collaboration. Recently, NAIMS completed its first project evaluating the feasibility of implementation and reproducibility of quantitative MRI measures derived from scanning a single MS patient using a high-resolution 3T protocol at seven sites. The results showed the feasibility of utilizing advanced quantitative MRI measures in multicenter studies and demonstrated the importance of careful standardization of scanning protocols, central image processing, and strategies to account for inter-site variability.

  5. The protocol and design of a randomised controlled study on training of attention within the first year after acquired brain injury.

    PubMed

    Bartfai, Aniko; Markovic, Gabriela; Sargenius Landahl, Kristina; Schult, Marie-Louise

    2014-05-08

    To describe the design of a study examining intensive targeted cognitive rehabilitation of attention in the acute (<4 months) and subacute (4-12 months) rehabilitation phases after acquired brain injury (ABI), and to evaluate the effects on function, activity and participation (return to work). Within a prospective, randomised, controlled study, 120 consecutive patients with stroke or traumatic brain injury were randomised to 20 hours of intensive attention training by Attention Process Training or by standard, activity-based training. Progress was evaluated by Statistical Process Control and by pre- and post-measurement of functional and activity levels. Return to work was also evaluated in the post-acute phase. Primary endpoints were changes in the attention measure (the Paced Auditory Serial Addition Test) and changes in work ability. Secondary endpoints included measurement of cognitive functions, activity and return to work. There were 3-, 6- and 12-month follow-ups focusing on health economics. The study will provide information on rehabilitation of attention in the early phases after ABI and its effects on function, activity and return to work. Further, the application of Statistical Process Control might enable closer investigation of the cognitive changes after acquired brain injury and demonstrate the usefulness of process measures in rehabilitation. The study was registered at ClinicalTrials.gov: NCT02091453, registered 19 March 2014.
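
    A minimal sketch of the Statistical Process Control idea mentioned above: an individuals (X-mR) control chart over repeated attention-test scores, flagging points outside the 3-sigma limits. The scores are invented, and this chart is only one of several SPC variants such a study could employ.

```python
import numpy as np

scores = np.array([42, 45, 44, 47, 46, 48, 55, 47, 49, 61])  # invented weekly scores

moving_range = np.abs(np.diff(scores))
mr_bar = moving_range.mean()
center = scores.mean()
sigma_hat = mr_bar / 1.128            # d2 constant for subgroups of size 2

ucl = center + 3 * sigma_hat          # upper control limit
lcl = center - 3 * sigma_hat          # lower control limit

for week, score in enumerate(scores, start=1):
    flag = " <- outside control limits" if not (lcl <= score <= ucl) else ""
    print(f"week {week:2d}: {score}{flag}")
print(f"center={center:.1f}, LCL={lcl:.1f}, UCL={ucl:.1f}")
```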

  6. Solvent replacement for green processing.

    PubMed Central

    Sherman, J; Chin, B; Huibers, P D; Garcia-Valls, R; Hatton, T A

    1998-01-01

    The implementation of the Montreal Protocol, the Clean Air Act, and the Pollution Prevention Act of 1990 has resulted in increased awareness of organic solvent use in chemical processing. The advances made in the search to find "green" replacements for traditional solvents are reviewed, with reference to solvent alternatives for cleaning, coatings, and chemical reaction and separation processes. The development of solvent databases and computational methods that aid in the selection and/or design of feasible or optimal environmentally benign solvent alternatives for specific applications is also discussed. PMID:9539018

  7. Identifying Evidence-Based Educational Practices: Which Research Designs Provide Findings That Can Influence Social Change?

    ERIC Educational Resources Information Center

    Schirmer, Barbara R.; Lockman, Alison S.; Schirmer, Todd N.

    2016-01-01

    We conducted this conceptual study to determine if the Institute of Education Sciences/National Science Foundation pipeline of evidence guidelines could be applied as a protocol that researchers could follow in establishing evidence of effective instructional practices. To do this, we compared these guidelines, the new drug development process, and…

  8. Methods and Management of the Healthy Brain Study: A Large Multisite Qualitative Research Project

    ERIC Educational Resources Information Center

    Laditka, Sarah B.; Corwin, Sara J.; Laditka, James N.; Liu, Rui; Friedman, Daniela B.; Mathews, Anna E.; Wilcox, Sara

    2009-01-01

    Purpose of the study: To describe processes used in the Healthy Brain project to manage data collection, coding, and data distribution in a large qualitative project, conducted by researchers at 9 universities in 9 states. Design and Methods: Project management protocols included: (a) managing audiotapes and surveys to ensure data confidentiality,…

  9. Self-Regulated Learning Using Multimedia Programs in Dentistry Postgraduate Students: A Multimethod Approach

    ERIC Educational Resources Information Center

    Lloret, Miguel; Aguila, Estela; Lloret, Alejandro

    2009-01-01

    The purpose of this study was to examine the effect of a multimedia computing program on the production of activities and self-regulated learning processes in 18 dentistry postgraduate students (Celaya, Mexico). A multi-method design (quasi-experimental, pretest-posttest, and qualitative: think-aloud protocol) was used. Self-regulated…

  10. Inquiry in the Large-Enrollment Science Classroom: Simulating a Research Investigation

    ERIC Educational Resources Information Center

    Reeve, Suzanne; Hammond, Jennetta W.; Bradshaw, William S.

    2004-01-01

    We conduct research workshops twice each semester in our cell biology lecture course. Instead of solely analyzing data obtained by others, students form groups to design research questions and experimental protocols on a given topic. The main focus is the process of scientific thinking, not simply obtaining a correct product. (Contains 3 tables…

  11. RICIS Symposium 1992: Mission and Safety Critical Systems Research and Applications

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This conference deals with mission- and safety-critical systems: computer systems that control systems whose failure to operate correctly could produce loss of life and/or property. Topics covered are: the work of standards groups, computer systems design and architecture, software reliability, process control systems, knowledge-based expert systems, and computer and telecommunication protocols.

  12. Internet Data Delivery for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Rash, James; Hogie, Keith; Casasanta, Ralph; Hennessy, Joseph F. (Technical Monitor)

    2002-01-01

    This paper presents work being done at NASA/GSFC (Goddard Space Flight Center) on applying standard Internet applications and protocols to meet the technology challenge of future satellite missions. Internet protocols (IP) can provide seamless dynamic communication among heterogeneous instruments, spacecraft, ground stations, and constellations of spacecraft. A primary component of this work is to design and demonstrate automated end-to-end transport of files in a dynamic space environment using off-the-shelf, low-cost, commodity-level standard applications and protocols. These functions and capabilities will become increasingly significant in the years to come as both Earth and space science missions fly more sensors and the present labor-intensive, mission-specific techniques for processing and routing data become prohibitively expensive. This paper describes how an IP-based communication architecture can support existing operations concepts and how it will enable some new and complex communication and science concepts. The authors identify specific end-to-end file transfers all the way from instruments to control centers and scientists, and then describe how each data flow can be supported using standard Internet protocols and applications. The scenarios include normal data downlink and command uplink as well as recovery scenarios for both onboard and ground failures. The scenarios are based on an Earth orbiting spacecraft with data rates and downlink capabilities from 300 Kbps to 4 Mbps. Many examples are based on designs currently being investigated for the Global Precipitation Measurement (GPM) mission.
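
    A minimal sketch of the underlying claim that commodity Internet protocols can move files end to end: a TCP socket pair standing in for an instrument-to-ground downlink, with localhost substituting for the space link. This is illustrative only, not the GSFC implementation.

```python
import socket
import threading

PAYLOAD = b"telemetry-file-contents" * 100  # stand-in for an instrument file

def ground_station(server_sock):
    """Accept one connection and read the whole 'downlinked' file."""
    conn, _ = server_sock.accept()
    received = b""
    while chunk := conn.recv(4096):          # recv returns b"" at EOF
        received += chunk
    conn.close()
    print(f"ground received {len(received)} bytes")

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))                # ephemeral port
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=ground_station, args=(server,))
t.start()

spacecraft = socket.create_connection(("127.0.0.1", port))
spacecraft.sendall(PAYLOAD)
spacecraft.close()                           # closing signals end of file
t.join()
server.close()
```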

  13. A Study of Quality of Service Communication for High-Speed Packet-Switching Computer Sub-Networks

    NASA Technical Reports Server (NTRS)

    Cui, Zhenqian

    1999-01-01

    In this thesis, we analyze various factors that affect quality of service (QoS) communication in high-speed, packet-switching sub-networks. We hypothesize that sub-network-wide bandwidth reservation and guaranteed CPU processing power at endpoint systems for handling data traffic are indispensable to achieving hard end-to-end quality of service. Different bandwidth reservation strategies, traffic characterization schemes, and scheduling algorithms affect network resource and CPU usage as well as the extent to which QoS can be achieved. In order to analyze those factors, we design and implement a communication layer. Our experimental analysis supports our research hypothesis. The Resource ReSerVation Protocol (RSVP) is designed to realize resource reservation. Our analysis of RSVP shows that using RSVP alone is insufficient to provide hard end-to-end quality of service in a high-speed sub-network. Analysis of the IEEE 802.1p protocol also supports the research hypothesis.
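
    A minimal sketch of token-bucket traffic characterization, the kind of (rate, burst) flow description that RSVP-style reservations rely on. The parameters are illustrative.

```python
class TokenBucket:
    def __init__(self, rate_bps, burst_bits):
        self.rate = rate_bps          # sustained rate r (bits/second)
        self.burst = burst_bits       # bucket depth b (bits)
        self.tokens = burst_bits      # bucket starts full
        self.last = 0.0

    def conforms(self, packet_bits, now):
        """True if the packet conforms to the (r, b) traffic contract."""
        # Refill tokens for the elapsed time, capped at the bucket depth.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bits <= self.tokens:
            self.tokens -= packet_bits
            return True
        return False                  # would be shaped, delayed, or dropped

bucket = TokenBucket(rate_bps=1_000_000, burst_bits=8_000)
for t, size in [(0.000, 8_000), (0.001, 8_000), (0.010, 8_000)]:
    print(f"t={t:.3f}s packet={size}b conforms={bucket.conforms(size, t)}")
# The back-to-back second packet is non-conforming; after a 9 ms gap the
# bucket has refilled and the third packet conforms again.
```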

  14. Knowledge about the research and ethics committee at Makerere University, Kampala.

    PubMed

    Ibingira, B R; Ochieng, J

    2013-12-01

    All research involving human participants should be reviewed by a competent and independent institutional research and ethics committee. Research conducted at Makerere University College of Health Sciences should be subjected to a rigorous review process by the ethics committee in order to protect human participants' interests, rights and welfare. To evaluate researchers' knowledge about the functions and ethical review process of the College of Health Sciences research and ethics committee (REC). A cross-sectional study. 135 researchers consented to participate in the study, but only 70 questionnaires were answered, giving a 52% response rate. Ages ranged from 30 to 61 years, with the majority of participants aged 30-39 years. Most of the respondents agreed that the REC functions include protocol review (86%), protection of research participants (84.3%), and monitoring of ongoing research. During ethical review, the REC pays special attention to scientific design (79.7%) and ethical issues (75.3%), but less to the budget and literature review. More than 97% of the respondents believe that the REC is either average or very good, while 2.8% rank it below average. Respondents knew the major functions of the committee, including protection of the rights and welfare of research participants, protocol review, and monitoring of ongoing research; the elements of protocol review given most attention were scientific design and ethical issues. Overall performance of the REC was ranked as average by respondents. The committee should limit delays in approval and effectively handle all of its functions.

  15. A protocol for better design, application, and communication of population viability analyses.

    PubMed

    Pe'er, Guy; Matsinos, Yiannis G; Johst, Karin; Franz, Kamila W; Turlure, Camille; Radchuk, Viktoriia; Malinowska, Agnieszka H; Curtis, Janelle M R; Naujokaitis-Lewis, Ilona; Wintle, Brendan A; Henle, Klaus

    2013-08-01

    Population viability analyses (PVAs) contribute to conservation theory, policy, and management. Most PVAs focus on single species within a given landscape and address a specific problem. This specificity often is reflected in the organization of published PVA descriptions. Many lack structure, making them difficult to understand, assess, repeat, or use for drawing generalizations across PVA studies. In an assessment comparing published PVAs and existing guidelines, we found that model selection was rarely justified; important parameters remained neglected or their implementation was described vaguely; limited details were given on parameter ranges, sensitivity analysis, and scenarios; and results were often reported too inconsistently to enable repeatability and comparability. Although many guidelines exist on how to design and implement reliable PVAs and standards exist for documenting and communicating ecological models in general, there is a lack of organized guidelines for designing, applying, and communicating PVAs that account for their diversity of structures and contents. To fill this gap, we integrated published guidelines and recommendations for PVA design and application, protocols for documenting ecological models in general and individual-based models in particular, and our collective experience in developing, applying, and reviewing PVAs. We devised a comprehensive protocol for the design, application, and communication of PVAs (DAC-PVA), which has 3 primary elements. The first defines what a useful PVA is; the second element provides a workflow for the design and application of a useful PVA and highlights important aspects that need to be considered during these processes; and the third element focuses on communication of PVAs to ensure clarity, comprehensiveness, repeatability, and comparability. Thereby, DAC-PVA should strengthen the credibility and relevance of PVAs for policy and management, and improve the capacity to generalize PVA findings across studies. © 2013 Society for Conservation Biology.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ishihara, T

    Currently, the problem at hand is distributing identical copies of OEP and filter software to a large number of farm nodes. One common method of transferring this software is unicast. The unicast protocol faces the problem of repeatedly sending the same data over the network; since the sending rate is limited, this process becomes a bottleneck. One possible solution therefore lies in creating a reliable multicast protocol. A specific type of multicast protocol is the Bulk Multicast Protocol [4]. This system consists of one sender distributing data to many receivers. The sender delivers data at a given rate of data packets. In response, each receiver replies to the sender with a status packet containing information about packet loss in the form of Negative Acknowledgments. The probability that a receiver sends a status packet back to the sender is 1/N, where N is the number of receivers, so the protocol is designed to generate approximately 1 status packet for each data packet sent. In this project, we were able to show that the complete transfer of a file to multiple receivers was about 12 times faster with multicast than with unicast. The implementation of this experimental protocol shows remarkable improvement in mass data transfer to a large number of farm machines.
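
    A minimal simulation of the feedback-suppression scheme described above (the 1/N reply probability is reconstructed from the stated goal of about one status packet per data packet): each receiver returns a status packet with probability 1/N, so the expected feedback stays near one packet regardless of the number of receivers.

```python
import random

def status_packets_per_data_packet(n_receivers, n_data_packets=10_000):
    """Average number of status packets the sender sees per data packet."""
    total = 0
    for _ in range(n_data_packets):
        total += sum(random.random() < 1.0 / n_receivers
                     for _ in range(n_receivers))
    return total / n_data_packets

for n in (2, 10, 100):
    avg = status_packets_per_data_packet(n)
    print(f"N={n:4d}: {avg:.3f} status packets per data packet")
# Each N prints a value close to 1.0, independent of the receiver count.
```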

  17. Treatment outcomes after implementation of an adapted WHO protocol for severe sepsis and septic shock in Haiti.

    PubMed

    Papali, Alfred; Eoin West, T; Verceles, Avelino C; Augustin, Marc E; Nathalie Colas, L; Jean-Francois, Carl H; Patel, Devang M; Todd, Nevins W; McCurdy, Michael T

    2017-10-01

    The World Health Organization (WHO) has developed a simplified algorithm, specific to resource-limited settings, for the treatment of severe sepsis emphasizing early fluids and antibiotics. However, this protocol's clinical effectiveness is unknown. We describe patient outcomes before and after implementation of an adapted WHO severe sepsis protocol at a community hospital in Haiti. Using a before-and-after study design, we retrospectively enrolled 99 adult Emergency Department patients with severe sepsis from January through March 2012. After protocol implementation in January 2014, we compared outcomes to 67 patients with severe sepsis retrospectively enrolled from February to April 2014. We defined sepsis according to the WHO's Integrated Management of Adult Illness guidelines and severe sepsis as sepsis plus organ dysfunction. After protocol implementation, the quantity of fluid administered increased and the physicians' differential diagnoses more often included sepsis. Patients were more likely to have follow-up vital signs taken sooner, a radiograph performed, and a lactic acid level tested. There were no improvements in mortality or in time to fluids or antimicrobials. Use of a simplified sepsis protocol based primarily on physiologic parameters allows for substantial improvements in process measures in the care of severely septic patients in a resource-constrained setting. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Development of an alcohol withdrawal protocol: CNS collaborative exemplar.

    PubMed

    Phillips, Susan; Haycock, Camille; Boyle, Deborah

    2006-01-01

    The purpose of this process improvement project was to develop an Alcohol Withdrawal Syndrome (AWS) management protocol for acute care. The prevalence of alcohol abuse in our society presents challenges for health professionals, and few nurses have received formal education on the identification and treatment of AWS, which has frequently resulted in ineffective, nonstandardized care. However, nurses practicing in medical-surgical, emergency, trauma, and critical care settings must be astute in the assessment and management of AWS. DESIGN/BACKGROUND/RATIONALE: Following an analysis of existing management protocols, a behavioral health clinical nurse specialist was asked to lead a work team composed of physicians, pharmacists, and nurses to develop a new evidence-based alcohol withdrawal protocol for acute care. By implementing a standardized assessment tool and treatment protocol, clinical nurse specialists empowered nursing staff with strategies to prevent the serious medical complications associated with AWS. FINDINGS/OUTCOMES: The development and integration of a safe and effective treatment protocol to manage AWS was facilitated by collaborative, evidence-based decision making. Clinical experience and specialty expertise were integrated by clinical nurse specialists skilled in group dynamics, problem-solving, and the implementation of change. Improving care of patients in AWS is an exemplar for clinical nurse specialist roles as change agent and patient advocate.

  19. Considerations in establishing a post-mortem brain and tissue bank for the study of myalgic encephalomyelitis/chronic fatigue syndrome: a proposed protocol.

    PubMed

    Nacul, Luis; O'Donovan, Dominic G; Lacerda, Eliana M; Gveric, Djordje; Goldring, Kirstin; Hall, Alison; Bowman, Erinna; Pheby, Derek

    2014-06-18

    Having previously investigated, through a qualitative study involving extensive discussions with experts and patients, the issues involved in establishing and maintaining a disease-specific brain and tissue bank for myalgic encephalomyelitis/chronic fatigue syndrome (ME/CFS), our aim was to develop a protocol for a UK ME/CFS repository of high-quality human tissue from well-characterised subjects with ME/CFS and controls, suitable for a broad range of research applications. This would involve a specific donor program coupled with rapid tissue collection and processing, supplemented by comprehensive prospectively collected clinical, laboratory and self-assessment data from cases and controls. We reviewed the operations of existing tissue banks from published literature and from their internal protocols and standard operating procedures (SOPs). On this basis, we developed the protocol presented here, which was designed to meet high technical and ethical standards and legal requirements, and was based on recommendations of the MRC UK Brain Banks Network. The facility would be most efficient and cost-effective if incorporated into an existing tissue bank. Tissue collection would be rapid and follow robust protocols to ensure preservation sufficient for a wide range of research uses. A central tissue bank would have resources both for wide-scale donor recruitment and for rapid response to donor death, permitting prompt harvesting and processing of tissue. An ME/CFS brain and tissue bank could be established using this protocol. Success would depend on careful consideration of logistic, technical, legal and ethical issues, continuous consultation with patients and the donor population, and a sustainable model of funding ideally involving research councils, health services, and patient charities. This initiative could revolutionise the understanding of this still poorly-understood disease and enhance development of diagnostic biomarkers and treatments.

  20. Guest Editor's Introduction: Special section on dependable distributed systems

    NASA Astrophysics Data System (ADS)

    Fetzer, Christof

    1999-09-01

    We rely more and more on computers. For example, the Internet reshapes the way we do business. A 'computer outage' can cost a company a substantial amount of money, not only with respect to the business lost during an outage, but also with respect to the negative publicity the company receives. This is especially true for Internet companies: after recent computer outages of Internet companies, we have seen a drastic fall in the shares of the affected companies.

    There are multiple causes for computer outages. Although computer hardware is becoming more reliable, hardware-related outages remain an important issue. For example, some recent computer outages were caused by failed memory and system boards, and even by crashed disks - a failure type which can easily be masked using disk mirroring. Transient hardware failures might also look like software failures and, hence, might be incorrectly classified as such. However, many outages are software related: faulty system software, middleware, and application software can crash a system.

    Dependable computing systems are systems we can rely on. Dependable systems are, by definition, reliable, available, safe and secure [3]. This special section focuses on issues related to dependable distributed systems. Distributed systems have the potential to be more dependable than a single computer because the probability that all computers in a distributed system fail is smaller than the probability that a single computer fails. However, if a distributed system is not built well, it is potentially less dependable than a single computer, since the probability that at least one computer in a distributed system fails is higher than the probability that one computer fails. For example, if the crash of any computer in a distributed system can bring the complete system to a halt, the system is less dependable than a single-computer system.

    Building dependable distributed systems is an extremely difficult task, and there is no silver bullet solution. Instead one has to apply a variety of engineering techniques [2]: fault-avoidance (minimize the occurrence of faults, e.g. by using a proper design process), fault-removal (remove faults before they occur, e.g. by testing), fault-evasion (predict faults by monitoring and reconfigure the system before failures occur), and fault-tolerance (mask and/or contain failures).

    Building a system from scratch is an expensive and time-consuming effort. To reduce the cost of building dependable distributed systems, one would choose to use commercial off-the-shelf (COTS) components whenever possible. The usage of COTS components has several potential advantages beyond minimizing costs. For example, through the widespread usage of a COTS component, design failures might be detected and fixed before the component is used in a dependable system; custom-designed components have to mature without the widespread in-field testing of COTS components. COTS components also have potential disadvantages when used in dependable systems. For example, minimizing the time to market might lead to the release of components with inherent design faults (e.g. use of 'shortcuts' that only work most of the time). In addition, the components might be more complex than needed and, hence, potentially have more design faults than simpler components.
    However, given economic constraints and the ability to cope with some of the problems using fault-evasion and fault-tolerance, only for a small percentage of systems can one justify not using COTS components. Distributed systems built from current COTS components are asynchronous systems in the sense that there exists no a priori known bound on the transmission delay of messages or the execution time of processes. When designing a distributed algorithm, one would like to make sure (e.g. by testing or verification) that it is correct, i.e. satisfies its specification. Many distributed algorithms make use of consensus (eventually all non-crashed processes have to agree on a value), leader election (a crashed leader is eventually replaced by a new leader, but at any time there is at most one leader) or a group membership detection service (a crashed process is eventually suspected to have crashed, but only crashed processes are suspected). From a theoretical point of view, the service specifications given for such services are not implementable in asynchronous systems; in particular, for each implementation one can derive a counterexample in which the service violates its specification. From a practical point of view, the consensus, leader election, and membership detection problems are solvable in asynchronous distributed systems.

    In this special section, Raynal and Tronel bridge this difference by showing how to implement the group membership detection problem with a negligible probability [1] of failure in an asynchronous system. The group membership detection problem is specified by a liveness condition (L) and a safety property (S): (L) if a process p crashes, then eventually every non-crashed process q has to suspect that p has crashed; and (S) if a process q suspects p, then p has indeed crashed. One can show that either (L) or (S) is implementable, but one cannot implement both (L) and (S) at the same time in an asynchronous system. In practice, one only needs to implement (L) and (S) such that the probability that (L) or (S) is violated becomes negligible. Raynal and Tronel propose and analyse a protocol that implements (L) with certainty and that can be tuned such that the probability that (S) is violated becomes negligible.

    Designing and implementing distributed fault-tolerant protocols for asynchronous systems is a difficult but not an impossible task. A fault-tolerant protocol has to detect and mask certain failure classes, e.g. crash failures and message omission failures. There is a trade-off between the performance of a fault-tolerant protocol and the failure classes the protocol can tolerate: one wants to tolerate as many failure classes as needed to satisfy the stochastic requirements of the protocol [1] while still maintaining sufficient performance. Since clients of a protocol have different requirements with respect to the performance/fault-tolerance trade-off, one would like to be able to customize protocols such that one can select an appropriate trade-off. In this special section, Hiltunen et al describe how one can compose protocols from micro-protocols in their Cactus system. They show how a group RPC system can be tailored to the needs of a client and, in particular, how considering additional failure classes affects the performance of a group RPC system.
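
    A minimal sketch of the (L)/(S) tension discussed above: a timeout-based heartbeat detector satisfies liveness (L), since a crashed process stops sending heartbeats and is eventually suspected, but any sufficiently slow process is also suspected, violating safety (S). Raising the timeout shrinks the probability of violating (S) at the cost of slower detection. The timings below are illustrative.

```python
class HeartbeatDetector:
    def __init__(self, timeout):
        self.timeout = timeout
        self.last_heartbeat = {}

    def heartbeat(self, process, now):
        self.last_heartbeat[process] = now

    def suspects(self, now):
        """Processes silent longer than the timeout are suspected."""
        return {p for p, t in self.last_heartbeat.items()
                if now - t > self.timeout}

detector = HeartbeatDetector(timeout=3.0)
detector.heartbeat("p1", now=0.0)
detector.heartbeat("p2", now=0.0)
detector.heartbeat("p1", now=2.0)      # p1 alive; p2 silent (crashed or slow)

print(detector.suspects(now=4.0))      # {'p2'}: correct if p2 crashed, but a
                                       # violation of (S) if p2 is merely slow
```
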
    References
    [1] Cristian F 1991 Understanding fault-tolerant distributed systems Communications of the ACM 34 (2) 56-78
    [2] Heimerdinger W L and Weinstock C B 1992 A conceptual framework for system fault tolerance Technical Report 92-TR-33, CMU/SEI
    [3] Laprie J C (ed) 1992 Dependability: Basic Concepts and Terminology (Vienna: Springer)

  1. [PROtocol-based MObilizaTION on intensive care units: Design of a cluster randomized pilot study].

    PubMed

    Nydahl, P; Diers, A; Günther, U; Haastert, B; Hesse, S; Kerschensteiner, C; Klarmann, S; Köpke, S

    2017-10-12

    Despite convincing evidence for early mobilization of patients on intensive care units (ICUs), implementation in practice is limited. Protocols for early mobilization, including in- and exclusion criteria, assessments, safety criteria, and step schemes, may increase the rates of implementation and mobilization. The hypothesis is that patients (population) on ICUs with a protocol for early mobilization (intervention) will be mobilized more frequently (outcome) than patients on ICUs without such a protocol (control). A multicenter, stepped-wedge, cluster-randomized pilot study is presented. Five ICUs will receive an adapted, interprofessional protocol for early mobilization in randomized order. Before and after implementation, mobilization of ICU patients will be evaluated by randomized monthly one-day point prevalence surveys. The primary outcome is the percentage of patients mobilized out of bed, operationalized as a score of ≥3 on the ICU Mobility Scale. Secondary outcome parameters are the presence and/or duration of mechanical ventilation, delirium, and ICU and hospital stay, as well as barriers to early mobilization, adverse events, and process parameters such as identified barriers, strategies used, and adaptations to local conditions. The study will also provide an exploratory evaluation of feasibility and estimates of effect sizes as the basis for a future explanatory study.

  2. Droplet-based pyrosequencing using digital microfluidics.

    PubMed

    Boles, Deborah J; Benton, Jonathan L; Siew, Germaine J; Levy, Miriam H; Thwar, Prasanna K; Sandahl, Melissa A; Rouse, Jeremy L; Perkins, Lisa C; Sudarsan, Arjun P; Jalili, Roxana; Pamula, Vamsee K; Srinivasan, Vijay; Fair, Richard B; Griffin, Peter B; Eckhardt, Allen E; Pollack, Michael G

    2011-11-15

    The feasibility of implementing pyrosequencing chemistry within droplets using electrowetting-based digital microfluidics is reported. An array of electrodes patterned on a printed-circuit board was used to control the formation, transportation, merging, mixing, and splitting of submicroliter-sized droplets contained within an oil-filled chamber. A three-enzyme pyrosequencing protocol was implemented in which individual droplets contained enzymes, deoxyribonucleotide triphosphates (dNTPs), and DNA templates. The DNA templates were anchored to magnetic beads which enabled them to be thoroughly washed between nucleotide additions. Reagents and protocols were optimized to maximize signal over background, linearity of response, cycle efficiency, and wash efficiency. As an initial demonstration of feasibility, a portion of a 229 bp Candida parapsilosis template was sequenced using both a de novo protocol and a resequencing protocol. The resequencing protocol generated over 60 bp of sequence with 100% sequence accuracy based on raw pyrogram levels. Excellent linearity was observed for all of the homopolymers (two, three, or four nucleotides) contained in the C. parapsilosis sequence. With improvements in microfluidic design it is expected that longer reads, higher throughput, and improved process integration (i.e., "sample-to-sequence" capability) could eventually be achieved using this low-cost platform.
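
    A minimal sketch of how a pyrogram is decoded: nucleotides are dispensed in a fixed cycle, and the normalized light signal at each dispensation is roughly proportional to the homopolymer length incorporated. The dispensation order and peak heights below are invented, not data from the paper.

```python
from itertools import cycle

dispensation_order = "TACG"
# Normalized peak heights for successive dispensations (0 = no incorporation)
pyrogram = [1.02, 0.00, 2.05, 0.97, 0.00, 0.99, 3.01, 0.00]

def decode(signals, order):
    """Turn dispensation signals into a base sequence."""
    seq = []
    for signal, base in zip(signals, cycle(order)):
        seq.append(base * round(signal))   # homopolymer of length ~signal
    return "".join(seq)

print(decode(pyrogram, dispensation_order))  # -> "TCCGACCC" for these signals
```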

  3. Droplet-Based Pyrosequencing Using Digital Microfluidics

    PubMed Central

    Boles, Deborah J.; Benton, Jonathan L.; Siew, Germaine J.; Levy, Miriam H.; Thwar, Prasanna K.; Sandahl, Melissa A.; Rouse, Jeremy L.; Perkins, Lisa C.; Sudarsan, Arjun P.; Jalili, Roxana; Pamula, Vamsee K.; Srinivasan, Vijay; Fair, Richard B.; Griffin, Peter B.; Eckhardt, Allen E.; Pollack, Michael G.

    2013-01-01

    The feasibility of implementing pyrosequencing chemistry within droplets using electrowetting-based digital microfluidics is reported. An array of electrodes patterned on a printed-circuit board was used to control the formation, transportation, merging, mixing, and splitting of submicroliter-sized droplets contained within an oil-filled chamber. A three-enzyme pyrosequencing protocol was implemented in which individual droplets contained enzymes, deoxyribonucleotide triphosphates (dNTPs), and DNA templates. The DNA templates were anchored to magnetic beads which enabled them to be thoroughly washed between nucleotide additions. Reagents and protocols were optimized to maximize signal over background, linearity of response, cycle efficiency, and wash efficiency. As an initial demonstration of feasibility, a portion of a 229 bp Candida parapsilosis template was sequenced using both a de novo protocol and a resequencing protocol. The resequencing protocol generated over 60 bp of sequence with 100% sequence accuracy based on raw pyrogram levels. Excellent linearity was observed for all of the homopolymers (two, three, or four nucleotides) contained in the C. parapsilosis sequence. With improvements in microfluidic design it is expected that longer reads, higher throughput, and improved process integration (i.e., “sample-to-sequence” capability) could eventually be achieved using this low-cost platform. PMID:21932784

  4. An Energy Balanced and Lifetime Extended Routing Protocol for Underwater Sensor Networks.

    PubMed

    Wang, Hao; Wang, Shilian; Zhang, Eryang; Lu, Luxi

    2018-05-17

    Energy limitation is a central problem in designing routing protocols for underwater sensor networks (UWSNs). To prolong the network lifetime with limited battery power, an energy balanced and efficient routing protocol, called the energy balanced and lifetime extended routing protocol (EBLE), is proposed in this paper. The proposed EBLE not only balances traffic loads according to residual energy, but also optimizes data transmissions by selecting low-cost paths. The EBLE data transmission process operates in two phases: (1) a candidate forwarding set selection phase and (2) a data transmission phase. In the candidate forwarding set selection phase, nodes update their candidate forwarding nodes by broadcasting position and residual energy level information; the cost value of available nodes is calculated and stored in each sensor node. Then, in the data transmission phase, high-residual-energy and relatively low-cost paths are selected based on the cost function and residual energy level information. We also introduce a detailed analysis of optimal energy consumption in UWSNs. Numerical simulation results on a variety of node distributions and data load distributions show that EBLE outperforms other routing protocols (BTM, BEAR and direct transmission) in terms of network lifetime and energy efficiency.
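
    A minimal sketch of the kind of cost-based forwarder selection EBLE describes: each candidate node is scored on residual energy and remaining distance to the sink, and the lowest-cost candidate is chosen. The weights, scaling, and node data are illustrative; the paper's actual cost function may differ.

```python
import math

def cost(node, sink, w_energy=0.5, w_dist=0.5):
    """Lower residual energy and longer remaining distance both raise the cost."""
    dist = math.dist(node["pos"], sink)
    return w_energy * (1.0 - node["energy"]) + w_dist * dist / 1000.0

sink = (0.0, 0.0)
candidates = [
    {"id": "A", "pos": (300.0, 400.0), "energy": 0.9},  # energy as a fraction
    {"id": "B", "pos": (100.0, 200.0), "energy": 0.2},
    {"id": "C", "pos": (250.0, 250.0), "energy": 0.7},
]

best = min(candidates, key=lambda n: cost(n, sink))
for n in candidates:
    print(n["id"], round(cost(n, sink), 3))
print("forward via", best["id"])
# Node B is nearest but nearly drained, so the energy term steers
# traffic to node A: the load-balancing effect the protocol aims for.
```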

  5. Recovery After Prolonged Bed-Rest Deconditioning

    NASA Technical Reports Server (NTRS)

    Greenleaf, John E.; Quach, David T.

    2003-01-01

    Recovery data were analyzed from normal healthy test subjects maintained in the horizontal or head-down body position in well-controlled bed rest (BR) studies in which adherence to the well-designed protocol was monitored. Because recovery data were almost always of secondary importance to the data collected during the BR period, there was little consistency in the recovery experimental designs regarding control factors (e.g., diet or exercise), duration, or timing of data collection; thus, only about half of the BR studies that provided appropriate data could be analyzed here. These recovery data were sorted into two groups: those from BR protocols of less than 37 days, and those from protocols of more than 36 days. There was great disparity between the two groups in the responses that remained unchanged at the end of BR, and likewise in the variables that required more than 40 days for recovery; some immune variables, for example, required more than 180 days. Knowledge of the recovery process after BR in healthy people should assist rehabilitation workers in differentiating "healthy" BR recovery responses from those of the infirmity of sick or injured patients; this should result in more appropriate and efficient health care.

  6. Development of a method for measuring femoral torsion using real-time ultrasound.

    PubMed

    Hafiz, Eliza; Hiller, Claire E; Nicholson, Leslie L; Nightingale, E Jean; Clarke, Jillian L; Grimaldi, Alison; Eisenhuth, John P; Refshauge, Kathryn M

    2014-07-01

    Excessive femoral torsion has been associated with various musculoskeletal and neurological problems. To explore this relationship, it is essential to be able to measure femoral torsion accurately in the clinic. Computerized tomography (CT) and magnetic resonance imaging (MRI) are thought to provide the most accurate measurements, but CT involves significant radiation exposure and MRI is expensive. The aim of this study was to design a method for measuring femoral torsion in the clinic, and to determine the reliability of this method. Details of the design process, including construction of a jig, the protocol developed, and the reliability of the method are presented. The protocol used ultrasound to image a ridge on the greater trochanter, with a customized jig placed on the femoral condyles as reference points. An inclinometer attached to the customized jig allowed quantification of the degree of femoral torsion. Measurements taken with this protocol had excellent intra- and inter-rater reliability (ICC(2,1) = 0.98 and 0.97, respectively) and permitted measurement of femoral torsion with a high degree of accuracy. The method is applicable to the research setting and, with minor adjustments, will be applicable to the clinical setting.
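
    For reference, ICC(2,1) can be computed from a subjects-by-raters matrix with the standard Shrout-Fleiss two-way random-effects formulas, as in the sketch below. The measurement values are invented, not the study's data.

```python
import numpy as np

ratings = np.array([      # rows: subjects, columns: raters (degrees of torsion)
    [14.0, 15.0],
    [22.0, 21.5],
    [ 9.0, 10.0],
    [18.0, 18.5],
    [30.0, 29.0],
])
n, k = ratings.shape
grand = ratings.mean()

# Two-way ANOVA decomposition of the total sum of squares.
ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()
ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()
ss_total = ((ratings - grand) ** 2).sum()
ss_err = ss_total - ss_rows - ss_cols

ms_rows = ss_rows / (n - 1)
ms_cols = ss_cols / (k - 1)
ms_err = ss_err / ((n - 1) * (k - 1))

# Shrout & Fleiss ICC(2,1): two-way random effects, single measures.
icc_2_1 = (ms_rows - ms_err) / (
    ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)
print(f"ICC(2,1) = {icc_2_1:.3f}")   # ~0.99 for these highly consistent raters
```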

  7. Design and Evaluation of Complex Moving HIFU Treatment Protocols

    NASA Astrophysics Data System (ADS)

    Kargl, Steven G.; Andrew, Marilee A.; Kaczkowski, Peter J.; Brayman, Andrew A.; Crum, Lawrence A.

    2005-03-01

    The use of moving high-intensity focused ultrasound (HIFU) treatment protocols is of interest in achieving efficient formation of large-volume thermal lesions in tissue. Judicious protocol design is critical in order to avoid collateral damage to healthy tissues outside the treatment zone. A KZK-BHTE model, extended to simulate multiple, moving scans in tissue, is used to investigate protocol design considerations. Predictions and experimental observations are presented which (1) validate the model, (2) illustrate how to assess the effects of acoustic nonlinearity, and (3) demonstrate how to assess and control collateral damage such as prefocal lesion formation and lesion formation resulting from thermal conduction without direct HIFU exposure. Experimental data consist of linear and circular scan protocols delivered over a range of exposure regimes in ex vivo bovine liver.
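
    A minimal 1-D finite-difference sketch of the bioheat (BHTE) half of such a simulation: tissue heated by a focal source term, with thermal conduction and a lumped perfusion loss. All constants and the source strength are illustrative, and the acoustic (KZK) half of the model is omitted entirely.

```python
import numpy as np

nx, dx = 101, 1e-3                 # 10 cm of tissue, 1 mm grid
alpha = 1.4e-7                     # thermal diffusivity (m^2/s), typical tissue
perfusion = 1e-3                   # lumped perfusion cooling rate (1/s), assumed
dt = 0.4 * dx**2 / (2 * alpha)     # inside the explicit-scheme stability limit
T_arterial = 37.0

T = np.full(nx, 37.0)              # initial body temperature (deg C)
source = np.zeros(nx)
source[45:56] = 2.0                # focal heating, deg C per second, ~1 cm zone

for _ in range(round(10.0 / dt)):  # 10 s HIFU exposure
    lap = (np.roll(T, 1) - 2 * T + np.roll(T, -1)) / dx**2
    T += dt * (alpha * lap - perfusion * (T - T_arterial) + source)
    T[0] = T[-1] = 37.0            # body-temperature boundary conditions

print(f"peak temperature after exposure: {T.max():.1f} C")
```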

  8. Microfabricated Modular Scale-Down Device for Regenerative Medicine Process Development

    PubMed Central

    Reichen, Marcel; Macown, Rhys J.; Jaccard, Nicolas; Super, Alexandre; Ruban, Ludmila; Griffin, Lewis D.; Veraitch, Farlan S.; Szita, Nicolas

    2012-01-01

    The capacity of milli and micro litre bioreactors to accelerate process development has been successfully demonstrated in traditional biotechnology. However, for regenerative medicine, present smaller-scale culture methods cannot cope with the wide range of processing variables that need to be evaluated. Existing microfabricated culture devices, which could test different culture variables with a minimum amount of resources (e.g. expensive culture medium), are typically not designed with process development in mind. We present a novel, autoclavable, and microfabricated scale-down device designed for regenerative medicine process development. The microfabricated device contains a re-sealable culture chamber that facilitates use of standard culture protocols, creating a link with traditional small-scale culture devices for validation and scale-up studies. Further, the modular design can easily accommodate investigation of different culture substrate/extra-cellular matrix combinations. Inactivated mouse embryonic fibroblasts (iMEF) and human embryonic stem cell (hESC) colonies were successfully seeded on gelatine-coated tissue culture polystyrene (TC-PS) using standard static seeding protocols. The microfluidic chip included in the device offers precise and accurate control over the culture medium flow rate and resulting shear stresses in the device. Cells were cultured for two days with media perfused at 300 µl h⁻¹, resulting in a modelled shear stress of 1.1×10⁻⁴ Pa. Following perfusion, hESC colonies stained positively for different pluripotency markers and retained an undifferentiated morphology. An image processing algorithm was developed which permits quantification of co-cultured colony-forming cells from phase contrast microscope images. hESC colony sizes were quantified against the background of the feeder cells (iMEF) in less than 45 seconds for high-resolution images, which will permit real-time monitoring of culture progress in future experiments. The presented device is a first step to harness the advantages of microfluidics for regenerative medicine process development. PMID:23284952
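
    As a back-of-envelope check, wall shear stress in a shallow rectangular chamber is commonly estimated with the parallel-plate formula tau = 6*mu*Q/(w*h^2). The chamber dimensions below are assumptions for illustration (the device's geometry is not given here); under them the estimate lands on the same order of magnitude as the 1.1×10⁻⁴ Pa figure above.

```python
mu = 8.9e-4                      # medium viscosity, Pa*s (approx. water at 25 C)
Q = 300e-9 / 3600.0              # 300 microlitres/hour converted to m^3/s
w = 10e-3                        # chamber width, m (assumed)
h = 0.5e-3                       # chamber height, m (assumed)

tau = 6 * mu * Q / (w * h**2)    # parallel-plate wall shear stress estimate
print(f"wall shear stress ~ {tau:.2e} Pa")   # ~1.8e-4 Pa with these assumptions
```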

  9. DIVA V2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CHEN, JOANNA; SIMIRENKO, LISA; TAPASWI, MANJIRI

    The DIVA software supports a process in which researchers design their DNA with a web-based graphical user interface, submit their designs to a central queue, and a few weeks later receive their sequence-verified clonal constructs. Each researcher independently designs the DNA to be constructed with a web-based BioCAD tool, and presses a button to submit their designs to a central queue. Researchers have web-based access to their DNA design queues, and can track the progress of their submitted designs as they progress from "evaluation", to "waiting for reagents", to "in progress", to "complete". Researchers access their completed constructs through the central DNA repository. Along the way, all DNA construction success/failure rates are captured in a central database. Once a design has been submitted to the queue, a small number of dedicated staff evaluate the design for feasibility and provide feedback to the responsible researcher if the design is either unreasonable (e.g., encompasses a combinatorial library of a billion constructs) or small design changes could significantly facilitate the downstream implementation process. The dedicated staff then use DNA assembly design automation software to optimize the DNA construction process for the design, leveraging existing parts from the DNA repository where possible and ordering synthetic DNA where necessary. SynTrack software manages the physical locations and availability of the various requisite reagents and process inputs (e.g., DNA templates). Once all requisite process inputs are available, the design progresses from "waiting for reagents" to "in progress" in the design queue. Human-readable and machine-parseable DNA construction protocols output by the DNA assembly design automation software are then executed by the dedicated staff, exploiting lab automation devices wherever possible. Since all employed DNA construction methods are sequence-agnostic and standardized (they utilize the same enzymatic master mixes and reaction conditions), completely independent DNA construction tasks can be aggregated into the same multi-well plates and pursued in parallel. The resulting sets of cloned constructs can then be screened by high-throughput next-gen sequencing platforms for sequence correctness. A combination of long read-length (e.g., PacBio) and paired-end read platforms (e.g., Illumina) would be exploited depending on the particular task at hand (e.g., PacBio might be sufficient to screen a set of pooled constructs with significant gene divergence). Post sequence verification, designs for which at least one correct clone was identified will progress to a "complete" status, while designs for which no correct clones were identified will progress to a "failure" status. Depending on the failure mode (e.g., no transformants), and how many prior attempts/variations of assembly protocol have already been made for a given design, subsequent attempts may be made or the design can progress to a "permanent failure" state. All success and failure rate information will be captured during the process, including the stage at which a given clonal construction procedure failed (e.g., no PCR product) and what the exact failure was (e.g., assembly piece 2 missing). This success/failure rate data can be leveraged to refine the DNA assembly design process.
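
    A minimal sketch of the queue status tracking described above, written as a small state machine whose states and transitions mirror the text; the class itself is illustrative, not DIVA code.

```python
# Allowed transitions between the design-queue states named in the text.
ALLOWED = {
    "evaluation": {"waiting for reagents", "failure"},
    "waiting for reagents": {"in progress"},
    "in progress": {"complete", "failure"},
    "failure": {"in progress", "permanent failure"},  # retry or give up
}

class Design:
    def __init__(self, name):
        self.name, self.state = name, "evaluation"

    def advance(self, new_state):
        if new_state not in ALLOWED.get(self.state, set()):
            raise ValueError(f"{self.state} -> {new_state} not allowed")
        self.state = new_state

d = Design("construct-42")
for s in ("waiting for reagents", "in progress", "complete"):
    d.advance(s)
print(d.name, d.state)        # construct-42 complete
```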

  10. An example problem illustrating the application of the national lime association mixture design and testing protocol (MDTP) to ascertain engineering properties of lime-treated subgrades for mechanistic pavement design/analysis.

    DOT National Transportation Integrated Search

    2001-09-01

This document presents an example of mechanistic design and analysis using a mix design and testing protocol. More specifically, it addresses the structural properties of lime-treated subgrade, subbase, and base layers through mechanistic design ...

  11. Assessment of an improved bone washing protocol for deceased donor human bone.

    PubMed

    Eagle, M J; Man, J; Rooney, P; Hogg, P; Kearney, J N

    2015-03-01

    NHSBT Tissue Services issues bone to surgeons in the UK in two formats, fresh-frozen unprocessed bone from living donors and processed bone from deceased donors. Processed bone may be frozen or freeze dried and all processed bone is currently subjected to a washing protocol to remove blood and bone marrow. In this study we have improved the current bone washing protocol for cancellous bone and assessed the success of the protocol by measuring the removal of the bone marrow components: soluble protein, DNA and haemoglobin at each step in the process, and residual components in the bone at the end of the process. The bone washing protocol is a combination of sonication, warm water washes, centrifugation and chemical (ethanol and hydrogen peroxide) treatments. We report that the bone washing protocol is capable of removing up to 99.85 % soluble protein, 99.95 % DNA and 100 % of haemoglobin from bone. The new bone washing protocol does not render any bone cytotoxic as shown by contact cytotoxicity assays. No microbiological cell growth was detected in any of the wash steps. This process is now in use for processed cancellous bone issued by NHSBT.

  12. Complete Bell-state analysis for superconducting-quantum-interference-device qubits with a transitionless tracking algorithm

    NASA Astrophysics Data System (ADS)

    Kang, Yi-Hao; Chen, Ye-Hong; Shi, Zhi-Cheng; Huang, Bi-Hua; Song, Jie; Xia, Yan

    2017-08-01

    We propose a protocol for complete Bell-state analysis for two superconducting-quantum-interference-device qubits. The Bell-state analysis could be completed by using a sequence of microwave pulses designed by the transitionless tracking algorithm, which is a useful method in the technique of shortcut to adiabaticity. After the whole process, the information for distinguishing four Bell states will be encoded on two auxiliary qubits, while the Bell states remain unchanged. One can read out the information by detecting the auxiliary qubits. Thus the Bell-state analysis is nondestructive. The numerical simulations show that the protocol possesses a high success probability of distinguishing each Bell state with current experimental technology even when decoherence is taken into account. Thus, the protocol may have potential applications for the information readout in quantum communications and quantum computations in superconducting quantum networks.
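    In the paper the distinguishing information is encoded nondestructively on auxiliary qubits; as a purely numerical illustration of why the four Bell states are separable at all, two joint parities (ZZ and XX) already give each state a unique two-bit signature. A minimal sketch, independent of the SQUID implementation:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
ZZ, XX = np.kron(Z, Z), np.kron(X, X)

s = 1 / np.sqrt(2)   # Bell states in the |00>, |01>, |10>, |11> basis
bell = {
    "Phi+": s * np.array([1.0, 0.0, 0.0, 1.0]),
    "Phi-": s * np.array([1.0, 0.0, 0.0, -1.0]),
    "Psi+": s * np.array([0.0, 1.0, 1.0, 0.0]),
    "Psi-": s * np.array([0.0, 1.0, -1.0, 0.0]),
}

for name, psi in bell.items():
    zz = psi @ ZZ @ psi   # joint parity, +1 or -1
    xx = psi @ XX @ psi
    print(name, int(round(zz)), int(round(xx)))   # each state gets a unique (zz, xx) pair
```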

  13. Asynchronous Message Service Reference Implementation

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott C.

    2011-01-01

This software provides a library of middleware functions with a simple application programming interface, enabling implementation of distributed applications in conformance with the CCSDS AMS (Consultative Committee for Space Data Systems Asynchronous Message Service) specification. The AMS service, and its protocols, implement an architectural concept under which the modules of mission systems may be designed as if they were to operate in isolation, each one producing and consuming mission information without explicit awareness of which other modules are currently operating. Communication relationships among such modules are self-configuring; this tends to minimize complexity in the development and operations of modular data systems. A system built on this model is a society of generally autonomous, inter-operating modules that may fluctuate freely over time in response to changing mission objectives, module functional upgrades, and recovery from individual module failure. The purpose of AMS, then, is to reduce mission cost and risk by providing standard, reusable infrastructure for the exchange of information among data system modules in a manner that is simple to use, highly automated, flexible, robust, scalable, and efficient. The implementation is designed to spawn multiple threads of AMS functionality under the control of an AMS application program. These threads enable all members of an AMS-based, distributed application to discover one another in real time, subscribe to messages on specific topics, and publish messages on specific topics. The query/reply (client/server) communication model is also supported. Message exchange is optionally subject to encryption (to support confidentiality) and authorization. Fault tolerance measures in the discovery protocol minimize the likelihood of overall application failure due to any single operational error anywhere in the system. The multi-threaded design simplifies processing while enabling application nodes to operate at high speeds; linked lists protected by mutex semaphores and condition variables are used for efficient, inter-thread communication. Applications may use a variety of transport protocols underlying AMS itself, including TCP (Transmission Control Protocol), UDP (User Datagram Protocol), and message queues.
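    AMS itself is defined by the CCSDS specification; as a toy illustration of the topic-based publish/subscribe pattern with thread-safe queues (Python's Queue uses a mutex and condition variables internally, mirroring the mechanism described above), consider:

```python
import threading
import queue
from collections import defaultdict

class MessageBus:
    """Toy topic-based publish/subscribe; not the AMS wire protocol."""

    def __init__(self):
        self._lock = threading.Lock()
        self._subscribers = defaultdict(list)   # topic -> list of queues

    def subscribe(self, topic):
        q = queue.Queue()
        with self._lock:
            self._subscribers[topic].append(q)
        return q

    def publish(self, topic, message):
        with self._lock:
            targets = list(self._subscribers[topic])
        for q in targets:                        # fan out to every subscriber
            q.put(message)

bus = MessageBus()
inbox = bus.subscribe("telemetry")
threading.Thread(target=bus.publish, args=("telemetry", b"hello"), daemon=True).start()
print(inbox.get(timeout=1))                      # b'hello'
```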

  14. Special Plans and Operations: Assessment of Allegations Concerning Traumatic Brain Injury Research Integrity in Iraq

    DTIC Science & Technology

    2011-03-31

protocols conducted in Iraq. His office had been designated by the ... A research protocol is a formal document detailing the study methodology and the ... Human Research Protections Program plan requires scientific peer review to ensure that research is scientifically sound in its design and methods, and ... of the approved research protocol and IRB minutes, revealed that there was no mention of "active rehabilitation and exercise" under the design

  15. The Xpress Transfer Protocol (XTP): A tutorial (expanded version)

    NASA Technical Reports Server (NTRS)

    Sanders, Robert M.; Weaver, Alfred C.

    1990-01-01

The Xpress Transfer Protocol (XTP) is a reliable, real-time, lightweight transfer layer protocol. Current transport layer protocols such as DoD's Transmission Control Protocol (TCP) and ISO's Transport Protocol (TP) were not designed for the next generation of high speed, interconnected reliable networks such as fiber distributed data interface (FDDI) and the gigabit/second wide area networks. Unlike all previous transport layer protocols, XTP is being designed to be implemented in hardware as a VLSI chip set. By streamlining the protocol, combining the transport and network layers, and utilizing the increased speed and parallelization possible with a VLSI implementation, XTP will be able to provide the end-to-end data transmission rates demanded in high speed networks without compromising reliability and functionality. This paper describes the operation of the XTP protocol and, in particular, its error, flow, and rate control; inter-networking addressing mechanisms; and multicast support features, as defined in the XTP Protocol Definition Revision 3.4.

  16. Allergen Challenge Chamber: an innovative solution in allergic rhinitis diagnosis.

    PubMed

    Krzych-Fałta, Edyta; Sowa, Jerzy; Wojas, Oksana; Piekarska, Barbara; Sybilski, Adam; Samoliński, Bolesław

    2015-12-01

The Allergen Challenge Chamber (ACC) is both a serious technical challenge and an innovative solution in allergic rhinitis diagnosis. Gradual validation of the chamber (according to the test protocol) will allow for standardisation, a process being undertaken by centres worldwide. The design of a consistent system that recreates conditions like those of natural inhalation exposure took into account all the aspects (technical specification) necessary to ensure appropriate inhalation.

  17. Cell Therapy Regulation in Taiwan

    PubMed Central

    Chen, Yuan-Chuan; Cheng, Hwei-Fang; Yeh, Ming-Kung

    2017-01-01

Cell therapy is not only a novel medical practice but also a medicinal product [cell therapy product (CTP)]. More and more CTPs are being approved for marketing globally because of the rapid development of bio-medicine in cell culture, preservation, and preparation. However, regulation is the most important criterion for the development of CTPs. Regulations must be flexible to expedite the process of marketing new CTPs. Recently, the Taiwan Food and Drug Administration (TFDA) updated the related regulations, covering the regulation of development, the current regulatory framework and process, and the application and evaluation processes. As the quality of CTPs improves significantly, their safety and efficacy are further ensured. The treatment protocol, a new design for adaptive licensing in current clinical practice, is a rapid process for patients with life-threatening diseases or serious conditions for which there are no suitable drugs, medical devices, or other therapeutic methods available. The hospital can submit the treatment protocol to apply for cell therapy as a medical practice, which may result in easier and faster cell therapy development, and personalized treatment for individual patients will evolve quickly. PMID:27697103

  18. Evaluating newly acquired authority of nurse practitioners and physician assistants for reserved medical procedures in the Netherlands: a study protocol

    PubMed Central

    De Bruijn-Geraets, Daisy P; Van Eijk-Hustings, Yvonne JL; Vrijhoef, Hubertus JM

    2014-01-01

    Aim The study protocol is designed to evaluate the effects of granting independent authorization for medical procedures to nurse practitioners and physician assistants on processes and outcomes of health care. Background Recent (temporarily) enacted legislation in Dutch health care authorizes nurse practitioners and physician assistants to indicate and perform specified medical procedures, i.e. catheterization, cardioversion, defibrillation, endoscopy, injection, puncture, prescribing and simple surgical procedures, independently. Formerly, these procedures were exclusively reserved to physicians, dentists and midwives. Design A triangulation mixed method design is used to collect quantitative (surveys) and qualitative (interviews) data. Methods Outcomes are selected from evidence-based frameworks and models for assessing the impact of advanced nursing on quality of health care. Data are collected in various manners. Surveys are structured around the domains: (i) quality of care; (ii) costs; (iii) healthcare resource use; and (iv) patient centredness. Focus group and expert interviews aim to ascertain facilitators and barriers to the implementation process. Data are collected before the amendment of the law, 1 and 2·5 years thereafter. Groups of patients, nurse practitioners, physician assistants, supervising physicians and policy makers all participate in this national study. The study is supported by a grant from the Dutch Ministry of Health, Welfare and Sport in March 2011. Research Ethics Committee approval was obtained in July 2011. Conclusion This study will provide information about the effects of granting independent authorization for medical procedures to nurse practitioners and physician assistants on processes and outcomes of health care. Study findings aim to support policy makers and other stakeholders in making related decisions. The study design enables a cross-national comparative analysis. PMID:24684631

  19. A Protocol for Evaluating Contextual Design Principles

    PubMed Central

    Stamps, Arthur

    2014-01-01

This paper explains how scientific data can be incorporated into urban design decisions, such as evaluating contextual design principles. The recommended protocols are based on the Cochrane Reviews that have been widely used in medical research. The major concepts of a Cochrane Review are explained, as well as the underlying mathematics, which is meta-analysis. Data are reported for three applications and seven contextual design policies. It is suggested that use of the Cochrane protocols will be of great assistance to planners by providing scientific data that can be used to evaluate the efficacies of contextual design policies prior to implementing those policies. PMID:25431448
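    The meta-analytic core of such a review is the inverse-variance weighted pooled effect. A minimal sketch of the fixed-effect model (a full Cochrane-style analysis also covers random-effects models and heterogeneity statistics, omitted here; the study values below are hypothetical):

```python
import math

def fixed_effect_meta(effects, variances):
    """Pool study effects with inverse-variance weights (fixed-effect model)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)   # 95% CI

# Three hypothetical studies of one contextual design policy:
print(fixed_effect_meta([0.30, 0.45, 0.25], [0.01, 0.04, 0.02]))
```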

  20. An ICT-Based Platform to Monitor Protocols in the Healthcare Environment.

    PubMed

    Rorís, Víctor M Alonso; Gago, Juan M Santos; Sabucedo, Luis Álvarez; Merino, Mateo Ramos; Valero, Javier Sanz

    2016-10-01

Procedures from the healthcare domain involve highly critical actions, as they may pose a risk to patients' lives. Therefore, a large effort is devoted to standardization in clinical praxis and to quality control for these protocols in order to minimize hazards. In this line, this work aims to provide ICT-based support to carry out these controls in a simple and effective manner. Using a methodology based on HACCP and taking advantage of semantic tools, a holistic platform of services for traceability and control of processes has been designed and implemented. The applied paradigm is based on the use of Control Points as singular points to generate traces using observations and measures relevant to the processes considered. Based on these, it is possible to offer services for advanced querying and knowledge inference. Local deployment requires only regular mobile phones or tablets, making this solution cost-effective and easily replicable.
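    A minimal sketch of the kind of trace record such a Control Point might emit, with HACCP-style critical limits; the field names are assumptions for illustration, not the platform's actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ControlPointObservation:
    control_point: str                 # e.g. "medication fridge #2"
    measure: str                       # e.g. "temperature_C"
    value: float
    critical_limits: tuple             # (low, high), HACCP-style
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    @property
    def compliant(self) -> bool:
        low, high = self.critical_limits
        return low <= self.value <= high

obs = ControlPointObservation("medication fridge #2", "temperature_C", 9.5, (2.0, 8.0))
print(obs.compliant)   # False -> would raise a traceable quality event
```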

  1. Communication-Gateway Software For NETEX, DECnet, And TCP/IP

    NASA Technical Reports Server (NTRS)

    Keith, B.; Ferry, D.; Fendler, E.

    1990-01-01

Communications gateway software, GATEWAY, provides process-to-process communication between remote applications programs in different protocol domains. Communicating peer processes may be resident on any paired combination of NETEX, DECnet, or TCP/IP hosts. Provides necessary mapping from one protocol to another and facilitates practical intermachine communications in a cost-effective manner by eliminating the need to standardize on a single protocol or to implement multiple protocols in host computers. Written in Ada.
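    GATEWAY itself is written in Ada and bridges NETEX, DECnet, and TCP/IP; as a language-neutral illustration of the relay pattern (accept in one domain, forward into another), here is a bare TCP-to-TCP sketch. A real protocol gateway would translate message semantics at the marked point rather than copy bytes verbatim:

```python
import socket
import threading

def pipe(src, dst):
    """Copy bytes one way until EOF. A real gateway would map message
    formats between protocol domains here instead of copying verbatim."""
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)
    dst.close()

def gateway(listen_port, remote_host, remote_port):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("0.0.0.0", listen_port))
    srv.listen()
    while True:
        client, _ = srv.accept()
        remote = socket.create_connection((remote_host, remote_port))
        threading.Thread(target=pipe, args=(client, remote), daemon=True).start()
        threading.Thread(target=pipe, args=(remote, client), daemon=True).start()
```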

  2. Real-Time QoS Routing Protocols in Wireless Multimedia Sensor Networks: Study and Analysis.

    PubMed

    Alanazi, Adwan; Elleithy, Khaled

    2015-09-02

Many routing protocols have been proposed for wireless sensor networks. These routing protocols are almost always based on energy efficiency. However, recent advances in complementary metal-oxide semiconductor (CMOS) cameras and small microphones have led to the development of Wireless Multimedia Sensor Networks (WMSN) as a class of wireless sensor networks which pose additional challenges. The transmission of imaging and video data needs routing protocols with both energy efficiency and Quality of Service (QoS) characteristics in order to guarantee the efficient use of the sensor nodes and effective access to the collected data. Also, with the integration of real-time applications in Wireless Sensor Networks (WSNs), the use of QoS routing protocols is not only becoming a significant topic, but is also gaining the attention of researchers. In designing an efficient QoS routing protocol, the reliability and guarantee of end-to-end delay are critical concerns, alongside conserving energy. Thus, considerable research has been focused on designing energy efficient and robust QoS routing protocols. In this paper, we present a state-of-the-art review of the real-time QoS routing protocols for WMSNs that have already been proposed. This paper categorizes the real-time QoS routing protocols into probabilistic and deterministic protocols. In addition, both categories are classified into soft and hard real-time protocols by highlighting the QoS issues including the limitations and features of each protocol. Furthermore, we have compared the performance of mobility-aware, query-based real-time QoS routing protocols from each category using Network Simulator-2 (NS2). This paper also focuses on the design challenges and future research directions as well as highlights the characteristics of each QoS routing protocol.

  3. Real-Time QoS Routing Protocols in Wireless Multimedia Sensor Networks: Study and Analysis

    PubMed Central

    Alanazi, Adwan; Elleithy, Khaled

    2015-01-01

Many routing protocols have been proposed for wireless sensor networks. These routing protocols are almost always based on energy efficiency. However, recent advances in complementary metal-oxide semiconductor (CMOS) cameras and small microphones have led to the development of Wireless Multimedia Sensor Networks (WMSN) as a class of wireless sensor networks which pose additional challenges. The transmission of imaging and video data needs routing protocols with both energy efficiency and Quality of Service (QoS) characteristics in order to guarantee the efficient use of the sensor nodes and effective access to the collected data. Also, with the integration of real-time applications in Wireless Sensor Networks (WSNs), the use of QoS routing protocols is not only becoming a significant topic, but is also gaining the attention of researchers. In designing an efficient QoS routing protocol, the reliability and guarantee of end-to-end delay are critical concerns, alongside conserving energy. Thus, considerable research has been focused on designing energy efficient and robust QoS routing protocols. In this paper, we present a state-of-the-art review of the real-time QoS routing protocols for WMSNs that have already been proposed. This paper categorizes the real-time QoS routing protocols into probabilistic and deterministic protocols. In addition, both categories are classified into soft and hard real-time protocols by highlighting the QoS issues including the limitations and features of each protocol. Furthermore, we have compared the performance of mobility-aware, query-based real-time QoS routing protocols from each category using Network Simulator-2 (NS2). This paper also focuses on the design challenges and future research directions as well as highlights the characteristics of each QoS routing protocol. PMID:26364639

  4. Resident Choice and the Survey Process: The Need for Standardized Observation and Transparency

    ERIC Educational Resources Information Center

    Schnelle, John F.; Bertrand, Rosanna; Hurd, Donna; White, Alan; Squires, David; Feuerberg, Marvin; Hickey, Kelly; Simmons, Sandra F.

    2009-01-01

    Purpose: To describe a standardized observation protocol to determine if nursing home (NH) staff offer choice to residents during 3 morning activities of daily living (ADL) and compare the observational data with deficiency statements cited by state survey staff. Design and Methods: Morning ADL care was observed in 20 NHs in 5 states by research…

  5. Interdisciplinary Approach to the Development of Accessible Computer-Administered Measurement Instruments.

    PubMed

    Magasi, Susan; Harniss, Mark; Heinemann, Allen W

    2018-01-01

Principles of fairness in testing require that all test takers, including people with disabilities, have an equal opportunity to demonstrate their capacity on the construct being measured. Measurement design features and assessment protocols can pose barriers for people with disabilities. Fairness in testing is a fundamental validity issue at all phases in the design, administration, and interpretation of measurement instruments in clinical practice and research. There is limited guidance for instrument developers on how to develop and evaluate the accessibility and usability of measurement instruments. This article describes a 6-stage iterative process for developing accessible computer-administered measurement instruments grounded in the procedures implemented across several major measurement initiatives. A key component of this process is interdisciplinary teams of accessibility experts, content and measurement experts, information technology experts, and people with disabilities working together to ensure that measurement instruments are accessible and usable by a wide range of users. The development of accessible measurement instruments is not only an ethical requirement, it also ensures better science by minimizing measurement bias, missing data, and attrition due to mismatches between the target population and test administration platform and protocols.

  6. In-Office Endoscopic Laryngeal Laser Procedures: A Patient Safety Initiative.

    PubMed

    Anderson, Jennifer; Bensoussan, Yael; Townsley, Richard; Kell, Erika

    2018-05-01

Objective To review complications of in-office endoscopic laryngeal laser procedures after implementation of a standardized safety protocol. Methods A retrospective review was conducted of the first 2 years of in-office laser procedures at St Michaels Hospital after the introduction of a standardized safety protocol. The protocol included patient screening and a procedure checklist with standardized reporting of processes, medications, and complications. Primary outcomes measured were complication rates of in-office laryngeal laser procedures. Secondary outcomes included hemodynamic changes, local anesthetic dose, laser settings, total laser/procedure time, and incidence of sedation. Results A total of 145 in-office KTP procedures performed on 65 patients were reviewed. In 98% of cases, the safety protocol was fully implemented. The overall complication rate was 4.8%. No major complications were encountered. Minor complications included vasovagal episodes and patient intolerance. The rate of patient intolerance resulting in early termination of the anticipated procedure was 13.1%. Total local anesthetic dose averaged 172.9 mg lidocaine per procedure. The mean amount of laser energy dispersed was 261.2 J, with mean total procedure time of 48.3 minutes. Sixteen percent of patients had preprocedure sedation. Vital signs were found to vary modestly. Systolic blood pressure was lower postprocedure in 13.8% and symptomatic in 4.1%. Discussion The review of our standardized safety protocol has revealed that in-office laser treatment for laryngeal pathology has extremely low complication rates with safe patient outcomes. Implications for Practice The trend of shifting procedures out of the operating room into the office/clinic setting requires new processes designed to promote patient safety.

  7. Design of an instrument to measure the quality of care in Physical Therapy.

    PubMed

    Cavalheiro, Leny Vieira; Eid, Raquel Afonso Caserta; Talerman, Claudia; Prado, Cristiane do; Gobbi, Fátima Cristina Martorano; Andreoli, Paola Bruno de Araujo

    2015-01-01

To design an instrument composed of domains that demonstrate physical therapy activities and generate a consistent index to represent the quality of care in physical therapy. The Lean Six Sigma methodology was used to design the tool. The discussion involved staff from seven different management groups. By means of brainstorming and a Cause & Effect Matrix, we set up the process map. Five requirements composed the quality of care index in physical therapy after application of the Cause & Effect Matrix tool. The following requirements were assessed: physical therapist performance, a care outcome indicator, adherence to physical therapy protocols, whether the prognosis and treatment outcome were achieved, and infrastructure. The proposed design allowed evaluating several items related to the physical therapy service, enabling customization, reproducibility, and benchmarking with other organizations. For management, this index provides the opportunity to identify areas for improvement and the strengths of the team and the process of physical therapy care.

  8. Device USB interface and software development for electric parameter measuring instrument

    NASA Astrophysics Data System (ADS)

    Li, Deshi; Chen, Jian; Wu, Yadong

    2003-09-01

Aimed at general device development, this paper discusses the development of a USB interface and its software. Using the PDIUSBD12, which supports a parallel interface, as an example, the paper analyzes its technical characteristics; designs different interface circuits with an 80C52 single-chip microcomputer and a TMS320C54-series digital signal processor; and analyzes address allocation and register access. Following the USB 1.1 standard protocol, the device software and application-layer protocol were designed, the data exchange protocol was defined, and the system functions were implemented.

  9. Networks for Autonomous Formation Flying Satellite Systems

    NASA Technical Reports Server (NTRS)

    Knoblock, Eric J.; Konangi, Vijay K.; Wallett, Thomas M.; Bhasin, Kul B.

    2001-01-01

The performance of three communications networks to support autonomous multi-spacecraft formation flying systems is presented. All systems are comprised of a ten-satellite formation arranged in a star topology, with one of the satellites designated as the central or "mother ship." All data is routed through the mother ship to the terrestrial network. The first system uses a TCP/IP over ATM protocol architecture within the formation; the second system uses the IEEE 802.11 protocol architecture within the formation; and the last system uses both of the previous architectures with a constellation of geosynchronous satellites serving as an intermediate point-of-contact between the formation and the terrestrial network. The simulations consist of file transfers using either the File Transfer Protocol (FTP) or the Simple Automatic File Exchange (SAFE) protocol. The results compare the IP queuing delay and IP processing delay at the mother ship, as well as application-level round-trip time, for the systems. In all cases, using IEEE 802.11 within the formation yields less delay. Also, the throughput exhibited by SAFE is better than that of FTP.

  10. SpaceWire Protocol ID: What Does It Mean To You?

    NASA Technical Reports Server (NTRS)

    Rakow, Glenn; Schnurr, Richard; Gilley, Daniel; Parks, Steve

    2006-01-01

    Spacewire is becoming a popular solution for satellite high-speed data buses because it is a simple standard that provides great flexibility for a wide range of system requirements. It is simple in packet format and protocol, allowing users to easily tailor their implementation for their specific application. Some of the attractive aspects of Spacewire that make it easy to implement also make it hard for future reuse. Protocol reuse is difficult because Spacewire does not have a defined mechanism to communicate with the higher layers of the protocol stack. This has forced users of Spacewire to define unique packet formats and define how these packets are to be processed. Each mission writes their own Interface Control Document (ICD) and tailors Spacewire for their specific requirements making reuse difficult. Part of the reason for this habit may be because engineers typically optimize designs for their own requirements in the absence of a standard. This is an inefficient use of project resources and costs more to develop missions. A new packet format for Spacewire has been defined as a solution for this problem. This new packet format is a compliment to the Spacewire standard that will support protocol development upon Spacewire. The new packet definition does not replace the current packet structure, i.e., does not make the standard obsolete, but merely extends the standard for those who want to develop protocols over Spacewire. The Spacewire packet is defined with the first part being the Destination Address, which may be one or more bytes. This is followed by the packet cargo, which is user defined. The cargo is truncated with an End-Of-Packet (EOP) marker. This packet structure offers low packet overhead and allows the user to define how the contents are to be formatted. It also provides for many different addressing schemes, which provide flexibility in the system. This packet flexibility is typically an attractive part of the Spacewire. The new extended packet format adds one new field to the packet that greatly enhances the capability of Spacewire. This new field called the Protocol Identifier (ID) is used to identify the packet contents and the associated processing for the packet. This feature along with the restriction in the packet format that uses the Protocol ID, allows a deterministic method of decoding packets that was not before possible. The first part of the packet is still the Destination Address, which still conforms to the original standard but with one restriction. The restriction is that the first byte seen at the destination by the user needs to be a logical address, independent of the addressing scheme used. The second field is defined as the Protocol ID, which is usually one byte in length. The packet cargo (user defined) follows the Protocol ID. After the packet cargo is the EOP, which defines the end of packet. The value of the Protocol ID is assigned by the Spacewire working group and the protocol description published for others to use. The development of Protocols for Spacewire is currently the area of greatest activity by the Spacewire working group. The first protocol definition by the working group has been completed and is now in the process of formal standardization. There are many other protocols in development for missions that have not yet received formal Protocol ID assignment, but even if the protocols are not formally assigned a value, this effort will provide synergism for future developments.
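    The extended packet layout described above (logical address, Protocol ID, cargo, EOP) is simple enough to sketch. Note that on a real link the EOP is a link-level control code rather than a data byte, so it is not represented in the byte string below; the Protocol ID value used is 0x01, the first formally assigned identifier (RMAP), and the handler is purely illustrative:

```python
def build_packet(logical_address: int, protocol_id: int, cargo: bytes) -> bytes:
    """Extended SpaceWire-style packet: address byte, Protocol ID byte, cargo.
    The on-wire EOP marker is a link control code and is not represented here."""
    return bytes([logical_address, protocol_id]) + cargo

def parse_packet(packet: bytes):
    """Deterministic decode, enabled by requiring a logical address first."""
    logical_address, protocol_id = packet[0], packet[1]
    return logical_address, protocol_id, packet[2:]

# Dispatch on the Protocol ID; the handler body is a placeholder.
HANDLERS = {0x01: lambda cargo: print("RMAP-like handler got", cargo)}

addr, pid, cargo = parse_packet(build_packet(0x20, 0x01, b"\xde\xad"))
HANDLERS[pid](cargo)
```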

  11. Manufacture of a human mesenchymal stem cell population using an automated cell culture platform.

    PubMed

    Thomas, Robert James; Chandra, Amit; Liu, Yang; Hourd, Paul C; Conway, Paul P; Williams, David J

    2007-09-01

    Tissue engineering and regenerative medicine are rapidly developing fields that use cells or cell-based constructs as therapeutic products for a wide range of clinical applications. Efforts to commercialise these therapies are driving a need for capable, scaleable, manufacturing technologies to ensure therapies are able to meet regulatory requirements and are economically viable at industrial scale production. We report the first automated expansion of a human bone marrow derived mesenchymal stem cell population (hMSCs) using a fully automated cell culture platform. Differences in cell population growth profile, attributed to key methodological differences, were observed between the automated protocol and a benchmark manual protocol. However, qualitatively similar cell output, assessed by cell morphology and the expression of typical hMSC markers, was obtained from both systems. Furthermore, the critical importance of minor process variation, e.g. the effect of cell seeding density on characteristics such as population growth kinetics and cell phenotype, was observed irrespective of protocol type. This work highlights the importance of careful process design in therapeutic cell manufacture and demonstrates the potential of automated culture for future optimisation and scale up studies required for the translation of regenerative medicine products from the laboratory to the clinic.

  12. An Adaptive Jitter Mechanism for Reactive Route Discovery in Sensor Networks

    PubMed Central

    Cordero, Juan Antonio; Yi, Jiazi; Clausen, Thomas

    2014-01-01

    This paper analyses the impact of jitter when applied to route discovery in reactive (on-demand) routing protocols. In multi-hop non-synchronized wireless networks, jitter—a small, random variation in the timing of message emission—is commonly employed, as a means to avoid collisions of simultaneous transmissions by adjacent routers over the same channel. In a reactive routing protocol for sensor and ad hoc networks, jitter is recommended during the route discovery process, specifically, during the network-wide flooding of route request messages, in order to avoid collisions. Commonly, a simple uniform jitter is recommended. Alas, this is not without drawbacks: when applying uniform jitter to the route discovery process, an effect called delay inversion is observed. This paper, first, studies and quantifies this delay inversion effect. Second, this paper proposes an adaptive jitter mechanism, designed to alleviate the delay inversion effect and thereby to reduce the route discovery overhead and (ultimately) allow the routing protocol to find more optimal paths, as compared to uniform jitter. This paper presents both analytical and simulation studies, showing that the proposed adaptive jitter can effectively decrease the cost of route discovery and increase the path quality. PMID:25111238
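    The delay-inversion effect can be reproduced in a few lines: under uniform jitter each forwarding hop adds an independent uniform delay, so a route with more hops (even if better by the routing metric) accumulates more expected delay and tends to lose the flooding race. A minimal sketch, assuming per-hop jitter is the only source of delay:

```python
import random

JITTER_MAX = 0.5   # seconds; uniform jitter window per forwarding hop

def flood_arrival_time(hops):
    """Total delay of one flooded route-request copy: one uniform draw per hop."""
    return sum(random.uniform(0.0, JITTER_MAX) for _ in range(hops))

def inversion_rate(good_path_hops=4, poor_path_hops=3, trials=100_000):
    """Fraction of discoveries in which the shorter-but-worse path wins the race."""
    losses = sum(
        flood_arrival_time(good_path_hops) > flood_arrival_time(poor_path_hops)
        for _ in range(trials)
    )
    return losses / trials

print(f"delay inversion in ~{inversion_rate():.0%} of route discoveries")
```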

  13. Design and Development of Layered Security: Future Enhancements and Directions in Transmission

    PubMed Central

    Shahzad, Aamir; Lee, Malrey; Kim, Suntae; Kim, Kangmin; Choi, Jae-Young; Cho, Younghwa; Lee, Keun-Kwang

    2016-01-01

    Today, security is a prominent issue when any type of communication is being undertaken. Like traditional networks, supervisory control and data acquisition (SCADA) systems suffer from a number of vulnerabilities. Numerous end-to-end security mechanisms have been proposed for the resolution of SCADA-system security issues, but due to insecure real-time protocol use and the reliance upon open protocols during Internet-based communication, these SCADA systems can still be compromised by security challenges. This study reviews the security challenges and issues that are commonly raised during SCADA/protocol transmissions and proposes a secure distributed-network protocol version 3 (DNP3) design, and the implementation of the security solution using a cryptography mechanism. Due to the insecurities found within SCADA protocols, the new development consists of a DNP3 protocol that has been designed as a part of the SCADA system, and the cryptographically derived security is deployed within the application layer as a part of the DNP3 stack. PMID:26751443

  14. Design and Development of Layered Security: Future Enhancements and Directions in Transmission.

    PubMed

    Shahzad, Aamir; Lee, Malrey; Kim, Suntae; Kim, Kangmin; Choi, Jae-Young; Cho, Younghwa; Lee, Keun-Kwang

    2016-01-06

    Today, security is a prominent issue when any type of communication is being undertaken. Like traditional networks, supervisory control and data acquisition (SCADA) systems suffer from a number of vulnerabilities. Numerous end-to-end security mechanisms have been proposed for the resolution of SCADA-system security issues, but due to insecure real-time protocol use and the reliance upon open protocols during Internet-based communication, these SCADA systems can still be compromised by security challenges. This study reviews the security challenges and issues that are commonly raised during SCADA/protocol transmissions and proposes a secure distributed-network protocol version 3 (DNP3) design, and the implementation of the security solution using a cryptography mechanism. Due to the insecurities found within SCADA protocols, the new development consists of a DNP3 protocol that has been designed as a part of the SCADA system, and the cryptographically derived security is deployed within the application layer as a part of the DNP3 stack.
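    The paper's exact cryptographic construction inside the DNP3 stack is not reproduced here; as a generic sketch of the pattern it describes (authenticated encryption applied at the application layer, with the header authenticated but left readable for routing), using the widely available `cryptography` package:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def protect(key: bytes, header: bytes, payload: bytes) -> bytes:
    """Encrypt the payload and authenticate header+payload (AES-GCM).
    The header stays in the clear so intermediate devices can route it."""
    nonce = os.urandom(12)                       # unique per message
    return nonce + AESGCM(key).encrypt(nonce, payload, header)

def unprotect(key: bytes, header: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, header)  # raises on tampering

key = AESGCM.generate_key(bit_length=256)
header, payload = b"DNP3-hdr", b"analog point 7 = 42"    # illustrative contents
assert unprotect(key, header, protect(key, header, payload)) == payload
```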

  15. An application protocol for CAD to CAD transfer of electronic information

    NASA Technical Reports Server (NTRS)

    Azu, Charles C., Jr.

    1993-01-01

    The exchange of Computer Aided Design (CAD) information between dissimilar CAD systems is a problem. This is especially true for transferring electronics CAD information such as multi-chip module (MCM), hybrid microcircuit assembly (HMA), and printed circuit board (PCB) designs. Currently, there exists several neutral data formats for transferring electronics CAD information. These include IGES, EDIF, and DXF formats. All these formats have limitations for use in exchanging electronic data. In an attempt to overcome these limitations, the Navy's MicroCIM program implemented a project to transfer hybrid microcircuit design information between dissimilar CAD systems. The IGES (Initial Graphics Exchange Specification) format is used since it is well established within the CAD industry. The goal of the project is to have a complete transfer of microelectronic CAD information, using IGES, without any data loss. An Application Protocol (AP) is being developed to specify how hybrid microcircuit CAD information will be represented by IGES entity constructs. The AP defines which IGES data items are appropriate for describing HMA geometry, connectivity, and processing as well as HMA material characteristics.

  16. Absorbable energy monitoring scheme: new design protocol to test vehicle structural crashworthiness.

    PubMed

    Ofochebe, Sunday M; Enibe, Samuel O; Ozoegwu, Chigbogu G

    2016-05-01

In vehicle crashworthiness design optimization, detailed system evaluation capable of producing reliable results is typically achieved through high-order numerical computational (HNC) models such as the dynamic finite element model, the mesh-free model, etc. However, the application of these models, especially during optimization studies, is challenged by their inherently high demand on computational resources, the conditional stability of the solution process, and the lack of knowledge of a viable parameter range for detailed optimization studies. The absorbable energy monitoring scheme (AEMS) presented in this paper suggests a new design protocol that attempts to overcome such problems in the evaluation of vehicle structures for crashworthiness. The implementation of the AEMS involves studying the crash performance of vehicle components at various absorbable energy ratios based on a 2DOF lumped-mass-spring (LMS) vehicle impact model. This allows prompt prediction of useful parameter values in a given design problem. The application of the classical one-dimensional LMS model in vehicle crash analysis is further improved in the present work by developing a critical load matching criterion, which allows quantitative interpretation of the results of the abstract model in a typical vehicle crash design. The adequacy of the proposed AEMS for preliminary vehicle crashworthiness design is demonstrated in this paper; however, its extension to a full-scale design-optimization problem involving a full vehicle model with greater structural detail requires more theoretical development.
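    A minimal sketch of the kind of 2DOF lumped-mass-spring impact model underlying such a scheme: two masses coupled by a spring, with the front mass restrained by a second spring representing the crush zone against a rigid barrier. All parameter values are illustrative assumptions, not the paper's; the absorbable-energy split follows from the peak strain energies:

```python
# Illustrative parameters (not from the paper)
m1, m2 = 1200.0, 300.0     # kg: compartment mass, front-structure mass
k1, k2 = 4.0e5, 1.2e6      # N/m: coupling spring, barrier (crush-zone) spring
v0 = 15.6                  # m/s: ~56 km/h impact speed

def simulate(dt=1e-5, t_end=0.15):
    x1 = x2 = 0.0
    v1, v2 = v0, v0
    d1_max = d2_max = 0.0
    for _ in range(int(t_end / dt)):                 # semi-implicit Euler
        f_couple = k1 * (x1 - x2)                    # coupling spring force
        f_barrier = k2 * max(x2, 0.0)                # barrier resists only in compression
        v1 += dt * (-f_couple) / m1
        v2 += dt * (f_couple - f_barrier) / m2
        x1 += dt * v1
        x2 += dt * v2
        d1_max = max(d1_max, x1 - x2)                # peak deflections
        d2_max = max(d2_max, x2)
    e1 = 0.5 * k1 * d1_max**2                        # peak strain energies
    e2 = 0.5 * k2 * d2_max**2
    return e1, e2

e1, e2 = simulate()
print(f"absorbable energy ratio e1/e2 = {e1 / e2:.2f}")
```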

  17. Neuroimaging Study Designs, Computational Analyses and Data Provenance Using the LONI Pipeline

    PubMed Central

    Dinov, Ivo; Lozev, Kamen; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Pierce, Jonathan; Zamanyan, Alen; Chakrapani, Shruthi; Van Horn, John; Parker, D. Stott; Magsipoc, Rico; Leung, Kelvin; Gutman, Boris; Woods, Roger; Toga, Arthur

    2010-01-01

    Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges—management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu. PMID:20927408

  18. Packet based serial link realized in FPGA dedicated for high resolution infrared image transmission

    NASA Astrophysics Data System (ADS)

    Bieszczad, Grzegorz

    2015-05-01

In this article, the external digital interface designed for a thermographic camera built at the Military University of Technology is described. The aim of the article is to illustrate challenges encountered during the design of a thermal vision camera, especially those related to infrared data processing and transmission. The article explains the main requirements for an interface to transfer infrared or video digital data and describes the solution we elaborated based on the Low Voltage Differential Signaling (LVDS) physical layer and signaling scheme. The elaborated image transmission link is built using an FPGA with built-in high-speed serial transceivers achieving up to 2.5 Gbps throughput. Image transmission is realized using a proprietary packet protocol. The transmission protocol engine was described in VHDL and tested in FPGA hardware. The link is able to transmit 1280x1024@60Hz 24-bit video data using one signal pair. The link was tested by transmitting the thermal camera picture to a remote monitor. Constructing a dedicated video link reduces power consumption compared to solutions with ASIC-based encoders and decoders realizing video links such as DVI or packet-based DisplayPort, while simultaneously reducing the wiring needed to establish the link to a single pair. The article describes the functions of the modules integrated in the FPGA design, realizing several tasks such as synchronization to the video source, video stream packetization, interfacing the transceiver module, and dynamic clock generation for video standard conversion.

  19. 23 CFR 1340.5 - Selection of observation sites.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.5 Selection of... observation sites. The survey design shall include at a minimum the following protocols: (1) Protocol when...

  20. 23 CFR 1340.5 - Selection of observation sites.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.5 Selection of... observation sites. The survey design shall include at a minimum the following protocols: (1) Protocol when...

  1. 23 CFR 1340.5 - Selection of observation sites.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.5 Selection of... observation sites. The survey design shall include at a minimum the following protocols: (1) Protocol when...

  2. IRB Process Improvements: A Machine Learning Analysis.

    PubMed

    Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A

    2017-06-01

Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on initial identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process including the type of IRB review to be conducted, whether a protocol falls under Veterans Administration purview, and the specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
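    A minimal sketch of the modeling step, assuming a historical extract of submissions with hypothetical columns (`review_type`, `va_purview`, `staff_id`, `processing_days`); the authors' actual feature set and model choice are not specified here:

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

df = pd.read_csv("irb_submissions.csv")    # hypothetical export of past protocol timelines
X = pd.get_dummies(df[["review_type", "va_purview", "staff_id"]])
y = df["processing_days"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("held-out R^2:", model.score(X_te, y_te))
# Large importances point at candidate process bottlenecks (e.g. review type, staffing):
for imp, col in sorted(zip(model.feature_importances_, X.columns), reverse=True)[:5]:
    print(f"{col}: {imp:.3f}")
```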

  3. Safe bunker designing for the 18 MV Varian 2100 Clinac: a comparison between Monte Carlo simulation based upon data and new protocol recommendations.

    PubMed

    Beigi, Manije; Afarande, Fatemeh; Ghiasi, Hosein

    2016-01-01

The aim of this study was to compare two bunkers designed using, respectively, only protocol recommendations and Monte Carlo (MC) derived data for an 18 MV Varian 2100 Clinac accelerator. High-energy radiation therapy is associated with fast and thermal photoneutrons, and adequate shielding against the contaminant neutrons has been recommended by the new IAEA and NCRP protocols. The latest protocols released by the IAEA (Safety Report No. 47) and NCRP Report No. 151 were used for the bunker design calculations, and MC-derived data were obtained as well. Two bunkers, one designed using the protocols and one using the MC data, were compared and discussed. For the door, the thicknesses obtained by the MC simulation and by the Wu-McGinley analytical method were close in both BPE and lead. In the case of the primary and secondary barriers, MC simulation resulted in 440.11 mm for the ordinary concrete, and a total concrete thickness of 1709 mm was required. Calculating the same parameter values with the recommended analytical methods resulted in a required thickness of 1762 mm, using the recommended TVL of 445 mm for concrete. Additionally, for the secondary barrier a thickness of 752.05 mm was obtained. Our results showed that MC simulation and the protocol recommendations are in good agreement in the contaminant radiation dose calculation. The difference between the analytical and MC simulation methods revealed that the application of only one method for bunker design may lead to underestimation or overestimation in dose and shielding calculations.
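    The protocol-based side of such a comparison reduces to the standard NCRP 151 barrier equation: required transmission B = P d^2 / (W U T), number of tenth-value layers n = log10(1/B), and thickness = TVL1 + (n - 1) TVLe. A minimal sketch with illustrative input values (not the paper's):

```python
import math

def barrier_thickness_mm(P, d, W, U, T, tvl1_mm, tvle_mm):
    """NCRP 151-style barrier sizing.
    P: shielding design goal (Sv/week), d: distance to occupied point (m),
    W: workload (Gy/week at 1 m), U: use factor, T: occupancy factor."""
    B = P * d**2 / (W * U * T)            # required barrier transmission
    n = math.log10(1.0 / B)               # number of tenth-value layers
    return tvl1_mm + (n - 1.0) * tvle_mm

# Illustrative inputs for a primary concrete barrier (values are assumptions):
print(barrier_thickness_mm(P=1e-4, d=6.0, W=500.0, U=0.25, T=1.0,
                           tvl1_mm=445.0, tvle_mm=445.0))
```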

  4. Paperless protocoling of CT and MRI requests at an outpatient imaging center.

    PubMed

    Bassignani, Matthew J; Dierolf, David A; Roberts, David L; Lee, Steven

    2010-04-01

We created our imaging center (IC) to move outpatient imaging from our busy inpatient imaging suite off-site to a location that is more inviting to ambulatory patients. Nevertheless, patients scanned at our IC still represent the depth and breadth of illness complexity seen with our tertiary care population. Thus, we protocol exams on an individualized basis to ensure that the referring clinician's question is fully answered by the exam performed. Previously, paper-based protocoling was a laborious process for all those involved, where the IC business office would fax the requests to various reading rooms for protocoling by the subspecialist radiologists, who are 3 miles away at the main hospital. Once protocoled, reading room coordinators would fax back the protocoled request to the IC technical area in preparation for the next day's scheduled exams. At any breakdown in this process (e.g., lost paperwork), patient exams were delayed and clinicians and patients became upset. To improve this process, we developed a paper-free process whereby protocoling is accomplished through scanning of exam requests into our PACS. Using the common worklist functionality found in most PACS, we created "protocoling worklists" that contain these scanned documents. Radiologists protocol these studies in the PACS worklist (with the added benefit of having all imaging and report data available), and subsequently, the technologists can see and act on the protocols they find in PACS. This process has significantly decreased interruptions in our busy reading rooms and decreased rework of IC staff.

  5. Design of the Protocol Processor for the ROBUS-2 Communication System

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo; Malekpour, Mahyar R.; Miner, Paul S.

    2005-01-01

The ROBUS-2 Protocol Processor (RPP) is a custom-designed hardware component implementing the functionality of the ROBUS-2 fault-tolerant communication system. The Reliable Optical Bus (ROBUS) is the core communication system of the Scalable Processor-Independent Design for Enhanced Reliability (SPIDER), a general-purpose fault-tolerant integrated modular architecture currently under development at NASA Langley Research Center. ROBUS is a time-division multiple access (TDMA) broadcast communication system with medium access control by means of a time-indexed communication schedule. ROBUS-2 is a developmental version of the ROBUS providing guaranteed fault-tolerant services to the attached processing elements (PEs), in the presence of a bounded number of faults. These services include message broadcast (Byzantine Agreement), dynamic communication schedule update, time reference (clock synchronization), and distributed diagnosis (group membership). ROBUS also features fault-tolerant startup and restart capabilities. ROBUS-2 tolerates internal as well as PE faults, and incorporates a dynamic self-reconfiguration capability driven by the internal diagnostic system. ROBUS consists of RPPs connected to each other by a lower-level physical communication network. The RPP has a pipelined architecture and the design is parameterized in the behavioral and structural domains. The design of the RPP enables the bus to achieve a PE-message throughput that approaches the available bandwidth at the physical layer.

  6. Data-driven CT protocol review and management—experience from a large academic hospital.

    PubMed

    Zhang, Da; Savage, Cristy A; Li, Xinhua; Liu, Bob

    2015-03-01

Protocol review plays a critical role in CT quality assurance, but large numbers of protocols and inconsistent protocol names on scanners and in exam records make thorough protocol review formidable. In this investigation, we report on a data-driven cataloging process that can be used to assist in the reviewing and management of CT protocols. We collected lists of scanner protocols, as well as 18 months of recent exam records, for 10 clinical scanners. We developed computer algorithms to automatically deconstruct the protocol names on the scanner and in the exam records into core names and descriptive components. Based on the core names, we were able to group the scanner protocols into a much smaller set of "core protocols," and to easily link exam records with the scanner protocols. We calculated the percentage of usage for each core protocol, from which the most heavily used protocols were identified. From the percentage-of-usage data, we found that, on average, 18, 33, and 49 core protocols per scanner covered 80%, 90%, and 95%, respectively, of all exams. These numbers are one order of magnitude smaller than the typical numbers of protocols that are loaded on a scanner (200-300, as reported in the literature). Duplicated, outdated, and rarely used protocols on the scanners were easily pinpointed in the cataloging process. The data-driven cataloging process can facilitate the task of protocol review.
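    A minimal sketch of the cataloging idea: deconstruct free-text protocol names into a core name, group exam records by core, and count how many core protocols cover a target fraction of exam volume. The normalization rule below is a toy stand-in for the authors' algorithm:

```python
import re
from collections import Counter

MODIFIERS = r"\b(with|without|w|wo)\s*contrast\b|\broutine\b|\bpeds?\b"

def core_name(protocol_name: str) -> str:
    """Toy deconstruction: lowercase, strip common modifiers and punctuation."""
    name = re.sub(MODIFIERS, " ", protocol_name.lower())
    name = re.sub(r"[^a-z ]", " ", name)
    return " ".join(name.split())

def protocols_for_coverage(exam_protocol_names, target=0.90):
    """How many core protocols cover `target` fraction of all exams."""
    counts = Counter(core_name(n) for n in exam_protocol_names)
    total, cumulative = sum(counts.values()), 0
    for i, (_, n) in enumerate(counts.most_common(), start=1):
        cumulative += n
        if cumulative / total >= target:
            return i

exams = ["CT Head WO Contrast", "ct head w contrast", "CT Chest routine", "CT HEAD wo contrast"]
print(protocols_for_coverage(exams, 0.9))   # 2 core protocols cover >= 90% here
```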

  7. Interface design for CMOS-integrated Electrochemical Impedance Spectroscopy (EIS) biosensors.

    PubMed

    Manickam, Arun; Johnson, Christopher Andrew; Kavusi, Sam; Hassibi, Arjang

    2012-10-29

    Electrochemical Impedance Spectroscopy (EIS) is a powerful electrochemical technique to detect biomolecules. EIS has the potential of carrying out label-free and real-time detection, and in addition, can be easily implemented using electronic integrated circuits (ICs) that are built through standard semiconductor fabrication processes. This paper focuses on the various design and optimization aspects of EIS ICs, particularly the bio-to-semiconductor interface design. We discuss, in detail, considerations such as the choice of the electrode surface in view of IC manufacturing, surface linkers, and development of optimal bio-molecular detection protocols. We also report experimental results, using both macro- and micro-electrodes to demonstrate the design trade-offs and ultimately validate our optimization procedures.
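    For intuition, the impedance such a front-end must resolve is often modeled with a simplified Randles cell: solution resistance in series with a charge-transfer resistance shunted by the double-layer capacitance. A minimal sketch (Warburg element omitted; parameter values are illustrative, not from the paper):

```python
import numpy as np

def randles_impedance(freq_hz, r_s=100.0, r_ct=10e3, c_dl=1e-6):
    """Z(w) = Rs + Rct / (1 + j*w*Rct*Cdl)  (simplified Randles cell, no Warburg)."""
    w = 2.0 * np.pi * np.asarray(freq_hz, dtype=float)
    return r_s + r_ct / (1.0 + 1j * w * r_ct * c_dl)

freqs = np.logspace(-1, 5, 7)
for f, z in zip(freqs, randles_impedance(freqs)):
    print(f"{f:10.1f} Hz  |Z| = {abs(z):10.1f} ohm  phase = {np.degrees(np.angle(z)):6.1f} deg")
# Binding of target molecules typically shifts Rct (charge-transfer resistance),
# which is the change the readout circuit must detect.
```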

  8. Design of on-board Bluetooth wireless network system based on fault-tolerant technology

    NASA Astrophysics Data System (ADS)

    You, Zheng; Zhang, Xiangqi; Yu, Shijie; Tian, Hexiang

    2007-11-01

In this paper, Bluetooth wireless data transmission technology is applied to an on-board computer system to realize wireless data transmission between the peripherals of a micro-satellite integrated electronic system. In view of the high reliability demanded of a micro-satellite, a design of a Bluetooth wireless network based on fault-tolerant technology is introduced. The reliability of two fault-tolerant systems is first estimated using a Markov model; the structural design of the fault-tolerant system is then introduced. Several protocols are established to make the system operate correctly, and some related problems are listed and analyzed, with emphasis on the fault auto-diagnosis system, the active-standby switchover design, and the data-integrity process.
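    The Markov estimate for an active-standby pair has a convenient closed form when both units share a constant failure rate lambda and switchover succeeds with probability p: R(t) = e^(-lambda*t) * (1 + p*lambda*t). A minimal sketch comparing it with a single unit (the numbers are illustrative, not the paper's):

```python
import math

def reliability_single(lmbda, t):
    """Single unit with constant failure rate lambda."""
    return math.exp(-lmbda * t)

def reliability_standby(lmbda, t, p_switch=0.95):
    """Active unit plus one cold standby; Markov-chain closed form
    with imperfect switchover probability p_switch."""
    return math.exp(-lmbda * t) * (1.0 + p_switch * lmbda * t)

lmbda = 1e-4   # failures per hour (illustrative)
for hours in (1_000, 5_000, 10_000):
    print(hours,
          round(reliability_single(lmbda, hours), 4),
          round(reliability_standby(lmbda, hours), 4))
```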

  9. Gender-Specific Combination HIV Prevention for Youth in High-Burden Settings: The MP3 Youth Observational Pilot Study Protocol

    PubMed Central

    Agot, Kawango

    2017-01-01

    Background Nearly three decades into the epidemic, sub-Saharan Africa (SSA) remains the region most heavily affected by human immunodeficiency virus (HIV), with nearly 70% of the 34 million people living with HIV globally residing in the region. In SSA, female and male youth (15 to 24 years) are at a disproportionately high risk of HIV infection compared to adults. As such, there is a need to target HIV prevention strategies to youth and to tailor them to a gender-specific context. This protocol describes the process for the multi-staged approach in the design of the MP3 Youth pilot study, a gender-specific, combination, HIV prevention intervention for youth in Kenya. Objective The objective of this multi-method protocol is to outline a rigorous and replicable methodology for a gender-specific combination HIV prevention pilot study for youth in high-burden settings, illustrating the triangulated methods undertaken to ensure that age, sex, and context are integral in the design of the intervention. Methods The mixed-methods, cross-sectional, longitudinal cohort pilot study protocol was developed by first conducting a systematic review of the literature, which shaped focus group discussions around prevention package and delivery options, and that also informed age- and sex- stratified mathematical modeling. The review, qualitative data, and mathematical modeling created a triangulated evidence base of interventions to be included in the pilot study protocol. To design the pilot study protocol, we convened an expert panel to select HIV prevention interventions effective for youth in SSA, which will be offered in a mobile health setting. The goal of the pilot study implementation and evaluation is to apply lessons learned to more effective HIV prevention evidence and programming. Results The combination HIV prevention package in this protocol includes (1) offering HIV testing and counseling for all youth; (2) voluntary medical circumcision and condoms for males; (3) pre-exposure prophylaxis (PrEP), conditional cash transfer (CCT), and contraceptives for females; and (4) referrals for HIV care among those identified as HIV-positive. The combination package platform selected is mobile health teams in an integrated services delivery model. A cross-sectional analysis will be conducted to determine the uptake of the interventions. To determine long-term impact, the protocol outlines enrolling selected participants in mutually exclusive longitudinal cohorts (HIV-positive, PrEP, CCT, and HIV-negative) followed by using mobile phone text messages (short message service, SMS) and in-person surveys to prospectively assess prevention method uptake, adherence, and risk compensation behaviors. Cross-sectional and sub-cohort analyses will be conducted to determine intervention packages uptake. Conclusions The literature review, focus groups, and modeling indicate that offering age- and gender- specific combination HIV prevention interventions that include biomedical, behavioral, and structural interventions can have an impact on HIV risk reduction. Implementing this protocol will show the feasibility of delivering these services at scale. The MP3 Youth study is one of the few combination HIV prevention intervention protocols incorporating youth- and gender-specific interventions in one delivery setting. Lessons learned from the design of the protocol can be incorporated into the national guidance for combination HIV prevention for youth in Kenya and other high-burden SSA settings. 
Trial Registration ClinicalTrials.gov NCT01571128; http://clinicaltrials.gov/ct2/show/NCT01571128?term=MP3+youth&rank=1 (Archived by WebCite at http://www.webcitation.org/6nmioPd54) PMID:28274904

  10. Contamination Mitigation Strategies for Long Duration Human Spaceflight Missions

    NASA Technical Reports Server (NTRS)

    Lewis, Ruthan; Lupisella, Mark; Bleacher, Jake; Farrell, William

    2017-01-01

    Contamination control issues are particularly challenging for long-term human spaceflight and are associated with the search for life, dynamic environmental conditions, human-robotic-environment interaction, sample collection and return, biological processes, waste management, long-term environmental disturbance, etc. These issues impact mission success, human health, planetary protection, and research and discovery. Mitigation and control techniques and strategies may include and integrate long-term environmental monitoring and reporting, contamination control and planetary protection protocols, habitation site design, habitat design, and surface exploration and traverse pathways and area access planning.

  11. Internet Technology for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Hennessy, Joseph F. (Technical Monitor); Rash, James; Casasanta, Ralph; Hogie, Keith

    2002-01-01

    Ongoing work at National Aeronautics and Space Administration Goddard Space Flight Center (NASA/GSFC) seeks to apply standard Internet applications and protocols to meet the technology challenge of future satellite missions. Internet protocols and technologies are under study as a future means to provide seamless dynamic communication among heterogeneous instruments, spacecraft, ground stations, constellations of spacecraft, and science investigators. The primary objective is to design and demonstrate in the laboratory the automated end-to-end transport of files in a simulated dynamic space environment using off-the-shelf, low-cost, commodity-level standard applications and protocols. The demonstrated functions and capabilities will become increasingly significant in the years to come as both earth and space science missions fly more sensors and the present labor-intensive, mission-specific techniques for processing and routing data become prohibitively expensive. This paper describes how an IP-based communication architecture can support all existing operations concepts and how it will enable some new and complex communication and science concepts. The authors identify specific end-to-end data flows from the instruments to the control centers and scientists, and then describe how each data flow can be supported using standard Internet protocols and applications. The scenarios include normal data downlink and command uplink as well as recovery scenarios for both onboard and ground failures. The scenarios are based on an Earth orbiting spacecraft with downlink data rates from 300 Kbps to 4 Mbps. Included examples are based on designs currently being investigated for potential use by the Global Precipitation Measurement (GPM) mission.
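
    To make the end-to-end concept concrete, the sketch below moves a file over a plain TCP connection using only Python's standard library, in the spirit of the commodity-level tools the paper advocates. The port, chunk size, and file paths are illustrative assumptions, not values from the report.

```python
# Minimal commodity file transport over TCP: one sender, one receiver.
# Port, chunk size, and paths are illustrative assumptions.
import socket

CHUNK = 4096  # bytes per read/write

def send_file(path: str, host: str, port: int) -> None:
    """Stream a file to the receiver over a plain TCP connection."""
    with socket.create_connection((host, port)) as sock, open(path, "rb") as f:
        while chunk := f.read(CHUNK):
            sock.sendall(chunk)

def receive_file(path: str, port: int) -> None:
    """Accept a single connection and write the received bytes to disk."""
    with socket.create_server(("", port)) as server:
        conn, _addr = server.accept()
        with conn, open(path, "wb") as f:
            while data := conn.recv(CHUNK):
                f.write(data)

# Usage (two hosts or two processes):
#   receive_file("downlink.dat", 5000)           # ground side
#   send_file("telemetry.dat", "ground", 5000)   # spacecraft side
```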

  12. Mechanical Stimulation Protocols of Human Derived Cells in Articular Cartilage Tissue Engineering - A Systematic Review.

    PubMed

    Khozoee, Baktash; Mafi, Pouya; Mafi, Reza; Khan, Wasim S

    2017-01-01

    Mechanical stimulation is a key factor in articular cartilage generation and maintenance. Bioreactor systems have been designed and built in order to deliver specific types of mechanical stimulation. The focus has been twofold: applying a type of preconditioning in order to stimulate cell differentiation, and simulating in vivo conditions in order to gain further insight into how cells respond to different stimulatory patterns. Due to the complex forces at work within joints, it is difficult to simulate mechanical conditions using a bioreactor. The aim of this review is to gain a deeper understanding of the complexities of mechanical stimulation protocols by comparing those employed in bioreactors in the context of tissue engineering for articular cartilage, and to consider their effects on cultured cells. Allied and Complementary Medicine 1985 to 2016, Ovid MEDLINE® 1946 to 2016, and Embase 1974 to 2016 were searched using key terms. Results were subjected to inclusion and exclusion criteria; key findings were summarised into a table and subsequently discussed. Based on this review it is overwhelmingly clear that mechanical stimulation leads to increased chondrogenic properties in the context of bioreactor articular cartilage tissue engineering using human cells. However, given the variability and lack of controlled factors between research articles, results are difficult to compare, and a standardised method of evaluating stimulation protocols proved challenging. With improved standardisation in mechanical stimulation protocol reporting, bioreactor design and building processes, along with a better understanding of joint behaviours, we hope to perform a meta-analysis on stimulation protocols and methods. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  13. A protocol for eliciting nonmaterial values through a cultural ecosystem services frame.

    PubMed

    Gould, Rachelle K; Klain, Sarah C; Ardoin, Nicole M; Satterfield, Terre; Woodside, Ulalia; Hannahs, Neil; Daily, Gretchen C; Chan, Kai M

    2015-04-01

    Stakeholders' nonmaterial desires, needs, and values often critically influence the success of conservation projects. These considerations are challenging to articulate and characterize, resulting in their limited uptake in management and policy. We devised an interview protocol designed to enhance understanding of cultural ecosystem services (CES). The protocol begins with discussion of ecosystem-related activities (e.g., recreation, hunting) and management and then addresses CES, prompting for values encompassing concepts identified in the Millennium Ecosystem Assessment (2005) and explored in other CES research. We piloted the protocol in Hawaii and British Columbia. In each location, we interviewed 30 individuals from diverse backgrounds. We analyzed results from the 2 locations to determine the effectiveness of the interview protocol in elucidating nonmaterial values. The qualitative and spatial components of the protocol helped characterize cultural, social, and ethical values associated with ecosystems in multiple ways. Maps and situational, or vignette-like, questions helped respondents articulate difficult-to-discuss values. Open-ended prompts allowed respondents to express a diversity of ecosystem-related values and proved sufficiently flexible for interviewees to communicate values for which the protocol did not explicitly probe. Finally, the results suggest that certain values, those mentioned frequently throughout the interview, are particularly salient for particular populations. The protocol can provide efficient, contextual, and place-based data on the importance of particular ecosystem attributes for human well-being. Qualitative data are complementary to quantitative and spatial assessments in the comprehensive representation of people's values pertaining to ecosystems, and this protocol may assist in incorporating values frequently overlooked in decision making processes. © 2014 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.

  14. The stem cell laboratory: design, equipment, and oversight.

    PubMed

    Wesselschmidt, Robin L; Schwartz, Philip H

    2011-01-01

    This chapter describes some of the major issues to be considered when setting up a laboratory for the culture of human pluripotent stem cells (hPSCs). The process of establishing an hPSC laboratory can be divided into two equally important parts. One is completely administrative and includes developing protocols, seeking approval, and establishing reporting processes and documentation. The other involves the physical plant and includes design, equipment, and personnel. Proper planning of laboratory operations and proper design of the physical layout of the stem cell laboratory, so that it meets the scope of planned operations, is a major undertaking, but the time spent upfront will pay long-term returns in operational efficiency and effectiveness. A well-planned, organized, and properly equipped laboratory supports research activities by increasing efficiency and reducing lost time and wasted resources.

  15. Study and Simulation of Enhancements for TCP (Transmission Control Protocol) Performance Over Noisy, High-Latency Links

    NASA Technical Reports Server (NTRS)

    Shepard, Timothy J.; Partridge, Craig; Coulter, Robert

    1997-01-01

    The designers of the TCP/IP protocol suite explicitly included support of satellites in their design goals. The goal of the Internet Project was to design a protocol which could be layered over different networking technologies to allow them to be concatenated into an internet. The results of this project included two protocols, IP and TCP. IP is the protocol used by all elements in the network and it defines the standard packet format for IP datagrams. TCP is the end-to-end transport protocol commonly used between end systems on the Internet to derive a reliable bi-directional byte-pipe service from the underlying unreliable IP datagram service. Satellite links are explicitly mentioned in Vint Cerf's 2-page article which appeared in 1980 in CCR [2] to introduce the specifications for IP and TCP. In the past fifteen years, TCP has been demonstrated to work over many differing networking technologies, including over paths that include satellite links. So if satellite links were in the minds of the designers from the beginning, what is the problem? The problem is that the performance of TCP has in some cases been disappointing. A goal of the authors of the original specification of TCP was to specify only enough behavior to ensure interoperability. The specification left a number of important decisions, in particular how much data is to be sent when, to the implementor. This was deliberately done. By leaving performance-related decisions to the implementor, TCP could be tuned and adapted to different networks and situations in the future without the need to revise the specification of the protocol or break interoperability. Interoperability would continue while future implementations would be allowed flexibility to adapt to needs which could not be anticipated at the time of the original protocol design.
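
    The performance problem on satellite paths is usually framed in terms of the bandwidth-delay product: how much unacknowledged data the sender must keep in flight to fill a long-latency pipe. A small worked example follows; the link rate and round-trip time are illustrative assumptions, not figures from the study.

```python
# Bandwidth-delay product: the TCP window needed to keep a
# long-latency link full. All numbers are illustrative.
def bdp_bytes(bandwidth_bps: float, rtt_s: float) -> float:
    """Bytes in flight needed to fill the link for one round trip."""
    return bandwidth_bps * rtt_s / 8  # bits -> bytes

link_bps = 1_544_000   # a T1-class satellite channel
geo_rtt_s = 0.56       # ~560 ms round trip via geostationary orbit

window = bdp_bytes(link_bps, geo_rtt_s)
print(f"window needed: {window / 1024:.0f} KiB")  # ~106 KiB

# The original 16-bit TCP window caps at 64 KiB, so an implementation
# without window scaling stalls and leaves such a link partly idle.
```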

  16. In-Space Networking on NASA's SCAN Testbed

    NASA Technical Reports Server (NTRS)

    Brooks, David E.; Eddy, Wesley M.; Clark, Gilbert J.; Johnson, Sandra K.

    2016-01-01

    The NASA Space Communications and Navigation (SCaN) Testbed, an external payload onboard the International Space Station, is equipped with three software defined radios and a flight computer for supporting in-space communication research. New technologies being studied using the SCaN Testbed include advanced networking, coding, and modulation protocols designed to support the transition of NASA's mission systems from primarily point-to-point data links and preplanned routes towards the adaptive, autonomous internetworked operations needed to meet future mission objectives. Networking protocols implemented on the SCaN Testbed include the Advanced Orbiting Systems (AOS) link-layer protocol, Consultative Committee for Space Data Systems (CCSDS) Encapsulation Packets, Internet Protocol (IP), Space Link Extension (SLE), CCSDS File Delivery Protocol (CFDP), and Delay-Tolerant Networking (DTN) protocols including the Bundle Protocol (BP) and Licklider Transmission Protocol (LTP). The SCaN Testbed end-to-end system provides three S-band data links and one Ka-band data link to exchange space and ground data through NASA's Tracking Data Relay Satellite System or a direct-to-ground link to ground stations. The multiple data links and nodes provide several upgradable elements on both the space and ground systems. This paper will provide a general description of the testbed's system design and capabilities, discuss in detail the design and lessons learned in the implementation of the network protocols, and describe future plans for continuing research to meet the communication needs of evolving global space systems.

  17. Quantum communication and information processing

    NASA Astrophysics Data System (ADS)

    Beals, Travis Roland

    Quantum computers enable dramatically more efficient algorithms for solving certain classes of computational problems, but, in doing so, they create new problems. In particular, Shor's Algorithm allows for efficient cryptanalysis of many public-key cryptosystems. As public key cryptography is a critical component of present-day electronic commerce, it is crucial that a working, secure replacement be found. Quantum key distribution (QKD), first developed by C.H. Bennett and G. Brassard, offers a partial solution, but many challenges remain, both in terms of hardware limitations and in designing cryptographic protocols for a viable large-scale quantum communication infrastructure. In Part I, I investigate optical lattice-based approaches to quantum information processing. I look at details of a proposal for an optical lattice-based quantum computer, which could potentially be used for both quantum communications and for more sophisticated quantum information processing. In Part III, I propose a method for converting and storing photonic quantum bits in the internal state of periodically-spaced neutral atoms by generating and manipulating a photonic band gap and associated defect states. In Part II, I present a cryptographic protocol which allows for the extension of present-day QKD networks over much longer distances without the development of new hardware. I also present a second, related protocol which effectively solves the authentication problem faced by a large QKD network, thus making QKD a viable, information-theoretic secure replacement for public key cryptosystems.
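
    As a concrete illustration of the QKD scheme of Bennett and Brassard referenced above, the toy simulation below performs classical BB84 basis sifting only; it omits the quantum channel itself as well as eavesdropping detection, error correction, and privacy amplification.

```python
# Toy BB84 sifting: Alice and Bob keep only the positions where
# their randomly chosen bases happen to agree.
import random

def bb84_sift(n_bits: int, seed: int = 0) -> list[int]:
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("+x") for _ in range(n_bits)]
    bob_bases = [rng.choice("+x") for _ in range(n_bits)]
    key = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if a_basis == b_basis:   # matching basis: Bob measures the bit
            key.append(bit)
        # mismatched basis: the outcome is random, so it is discarded
    return key

print(len(bb84_sift(64)), "sifted key bits")  # ~32 bits on average
```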

  18. Developing family planning nurse practitioner protocols.

    PubMed

    Hawkins, J W; Roberto, D

    1984-01-01

    This article focuses on the process of development of protocols for family planning nurse practitioners. A rationale for the use of protocols, a definition of the types and examples, and the pros and cons of practice with protocols are presented. A how-to description for the development process follows, including methods and a suggested tool for critique and evaluation. The aim of the article is to assist nurse practitioners in developing protocols for their practice.

  19. A Novel Process Audit for Standardized Perioperative Handoff Protocols.

    PubMed

    Pallekonda, Vinay; Scholl, Adam T; McKelvey, George M; Amhaz, Hassan; Essa, Deanna; Narreddy, Spurthy; Tan, Jens; Templonuevo, Mark; Ramirez, Sasha; Petrovic, Michelle A

    2017-11-01

    A perioperative handoff protocol provides a standardized delivery of communication during a handoff that occurs from the operating room to the postanestheisa care unit or ICU. The protocol's success is dependent, in part, on its continued proper use over time. A novel process audit was developed to help ensure that a perioperative handoff protocol is used accurately and appropriately over time. The Audit Observation Form is used for the Audit Phase of the process audit, while the Audit Averages Form is used for the Data Analysis Phase. Employing minimal resources and using quantitative methods, the process audit provides the necessary means to evaluate the proper execution of any perioperative handoff protocol. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.

  20. Protocol for evaluation of the cost-effectiveness of ePrescribing systems and candidate prototype for other related health information technologies

    PubMed Central

    2014-01-01

    Background This protocol concerns the assessment of cost-effectiveness of hospital health information technology (HIT) in four hospitals. Two of these hospitals are acquiring ePrescribing systems incorporating extensive decision support, while the other two will implement systems incorporating more basic clinical algorithms. Implementation of an ePrescribing system will have diffuse effects over myriad clinical processes, so the protocol has to deal with a large amount of information collected at various ‘levels’ across the system. Methods/Design The method we propose is use of Bayesian ideas as a philosophical guide. Assessment of cost-effectiveness requires a number of parameters in order to measure incremental cost utility or benefit – the effectiveness of the intervention in reducing frequency of preventable adverse events; utilities for these adverse events; costs of HIT systems; and cost consequences of adverse events averted. There is no single end-point that adequately and unproblematically captures the effectiveness of the intervention; we therefore plan to observe changes in error rates and adverse events in four error categories (death, permanent disability, moderate disability, minimal effect). For each category we will elicit and pool subjective probability densities from experts for reductions in adverse events, resulting from deployment of the intervention in a hospital with extensive decision support. The experts will have been briefed with quantitative and qualitative data from the study and external data sources prior to elicitation. Following this, there will be a process of deliberative dialogues so that experts can “re-calibrate” their subjective probability estimates. The consolidated densities assembled from the repeat elicitation exercise will then be used to populate a health economic model, along with salient utilities. The credible limits from these densities can define thresholds for sensitivity analyses. Discussion The protocol we present here was designed for evaluation of ePrescribing systems. However, the methodology we propose could be used whenever research cannot provide a direct and unbiased measure of comparative effectiveness. PMID:25038609
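
    To make the elicitation-and-pooling step concrete, the sketch below combines hypothetical expert densities with a linear opinion pool and reads off credible limits of the kind the protocol uses for sensitivity thresholds. The Beta parametric form, the parameters, and the weights are assumptions for illustration only; the protocol does not prescribe them.

```python
# Linear opinion pool over elicited expert densities for the relative
# reduction in adverse events. All densities and weights are invented.
import numpy as np
from scipy import stats

experts = [stats.beta(8, 4), stats.beta(5, 5), stats.beta(10, 7)]
weights = np.array([0.5, 0.25, 0.25])  # illustrative credibility weights

x = np.linspace(0.0, 1.0, 2001)
pooled_pdf = sum(w * e.pdf(x) for w, e in zip(weights, experts))

# Approximate 95% credible limits from the pooled density; these could
# define the thresholds for the sensitivity analyses described above.
cdf = np.cumsum(pooled_pdf) / pooled_pdf.sum()
lo = x[np.searchsorted(cdf, 0.025)]
hi = x[np.searchsorted(cdf, 0.975)]
print(f"95% credible interval for the reduction: ({lo:.2f}, {hi:.2f})")
```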

  1. A cloud-based X73 ubiquitous mobile healthcare system: design and implementation.

    PubMed

    Ji, Zhanlin; Ganchev, Ivan; O'Droma, Máirtín; Zhang, Xin; Zhang, Xueji

    2014-01-01

    Based on the user-centric paradigm for next generation networks, this paper describes a ubiquitous mobile healthcare (uHealth) system based on the ISO/IEEE 11073 personal health data (PHD) standards (X73) and cloud computing techniques. A number of design issues associated with the system implementation are outlined. The system includes middleware on the user side, providing a plug-and-play environment for heterogeneous wireless sensors and mobile terminals utilizing different communication protocols, and a distributed "big data" processing subsystem in the cloud. The design and implementation of this system are envisaged as an efficient solution for the next generation of uHealth systems.

  2. Rethinking the NTCIP Design and Protocols - Analyzing the Issues

    DOT National Transportation Integrated Search

    1998-03-03

    This working paper discusses the issues involved in changing the current draft NTCIP standard from an X.25-based protocol stack to an Internet-based protocol stack. It contains a methodology which could be used to change NTCIP's base protocols. This ...

  3. Factors that impact on the use of mechanical ventilation weaning protocols in critically ill adults and children: a qualitative evidence-synthesis.

    PubMed

    Jordan, Joanne; Rose, Louise; Dainty, Katie N; Noyes, Jane; Blackwood, Bronagh

    2016-10-04

    Prolonged mechanical ventilation is associated with a longer intensive care unit (ICU) length of stay and higher mortality. Consequently, methods to improve ventilator weaning processes have been sought. Two recent Cochrane systematic reviews in ICU adult and paediatric populations concluded that protocols can be effective in reducing the duration of mechanical ventilation, but there was significant heterogeneity in study findings. Growing awareness of the benefits of understanding the contextual factors impacting on effectiveness has encouraged the integration of qualitative evidence syntheses with effectiveness reviews, which has delivered important insights into the reasons underpinning (differential) effectiveness of healthcare interventions. 1. To locate, appraise and synthesize qualitative evidence concerning the barriers and facilitators of the use of protocols for weaning critically-ill adults and children from mechanical ventilation;2. To integrate this synthesis with two Cochrane effectiveness reviews of protocolized weaning to help explain observed heterogeneity by identifying contextual factors that impact on the use of protocols for weaning critically-ill adults and children from mechanical ventilation;3. To use the integrated body of evidence to suggest the circumstances in which weaning protocols are most likely to be used. We used a range of search terms identified with the help of the SPICE (Setting, Perspective, Intervention, Comparison, Evaluation) mnemonic. Where available, we used appropriate methodological filters for specific databases. We searched the following databases: Ovid MEDLINE, Embase, OVID, PsycINFO, CINAHL Plus, EBSCOHost, Web of Science Core Collection, ASSIA, IBSS, Sociological Abstracts, ProQuest and LILACS on the 26th February 2015. In addition, we searched: the grey literature; the websites of professional associations for relevant publications; and the reference lists of all publications reviewed. We also contacted authors of the trials included in the effectiveness reviews as well as of studies (potentially) included in the qualitative synthesis, conducted citation searches of the publications reporting these studies, and contacted content experts.We reran the search on 3rd July 2016 and found three studies, which are awaiting classification. We included qualitative studies that described: the circumstances in which protocols are designed, implemented or used, or both, and the views and experiences of healthcare professionals either involved in the design, implementation or use of weaning protocols or involved in the weaning of critically-ill adults and children from mechanical ventilation not using protocols. We included studies that: reflected on any aspect of the use of protocols, explored contextual factors relevant to the development, implementation or use of weaning protocols, and reported contextual phenomena and outcomes identified as relevant to the effectiveness of protocolized weaning from mechanical ventilation. At each stage, two review authors undertook designated tasks, with the results shared amongst the wider team for discussion and final development. We independently reviewed all retrieved titles, abstracts and full papers for inclusion, and independently extracted selected data from included studies. We used the findings of the included studies to develop a new set of analytic themes focused on the barriers and facilitators to the use of protocols, and further refined them to produce a set of summary statements. 
We used the Confidence in the Evidence from Reviews of Qualitative Research (CERQual) framework to arrive at a final assessment of the overall confidence of the evidence used in the synthesis. We included all studies but undertook two sensitivity analyses to determine how the removal of certain bodies of evidence impacted on the content and confidence of the synthesis. We deployed a logic model to integrate the findings of the qualitative evidence synthesis with those of the Cochrane effectiveness reviews. We included 11 studies in our synthesis, involving 267 participants (one study did not report the number of participants). Five more studies are awaiting classification and will be dealt with when we update the review.The quality of the evidence was mixed; of the 35 summary statements, we assessed 17 as 'low', 13 as 'moderate' and five as 'high' confidence. Our synthesis produced nine analytical themes, which report potential barriers and facilitators to the use of protocols. The themes are: the need for continual staff training and development; clinical experience as this promotes felt and perceived competence and confidence to wean; the vulnerability of weaning to disparate interprofessional working; an understanding of protocols as militating against a necessary proactivity in clinical practice; perceived nursing scope of practice and professional risk; ICU structure and processes of care; the ability of protocols to act as a prompt for shared care and consistency in weaning practice; maximizing the use of protocols through visibility and ease of implementation; and the ability of protocols to act as a framework for communication with parents. There is a clear need for weaning protocols to take account of the social and cultural environment in which they are to be implemented. Irrespective of its inherent strengths, a protocol will not be used if it does not accommodate these complexities. In terms of protocol development, comprehensive interprofessional input will help to ensure broad-based understanding and a sense of 'ownership'. In terms of implementation, all relevant ICU staff will benefit from general weaning as well as protocol-specific training; not only will this help secure a relevant clinical knowledge base and operational understanding, but will also demonstrate to others that this knowledge and understanding is in place. In order to maximize relevance and acceptability, protocols should be designed with the patient profile and requirements of the target ICU in mind. Predictably, an under-resourced ICU will impact adversely on protocol implementation, as staff will prioritize management of acutely deteriorating and critically-ill patients.

  4. Computational Enzyme Design: Advances, hurdles and possible ways forward

    PubMed Central

    Linder, Mats

    2012-01-01

    This mini review addresses recent developments in computational enzyme design. Successful protocols as well as known issues and limitations are discussed from an energetic perspective. It will be argued that improved results can be obtained by including a dynamic treatment in the design protocol. Finally, a molecular dynamics-based approach for evaluating and refining computational designs is presented. PMID:24688650

  5. The OPERA trial: a protocol for the process evaluation of a randomised trial of an exercise intervention for older people in residential and nursing accommodation

    PubMed Central

    2011-01-01

    Background The OPERA trial is a large cluster randomised trial testing a physical activity intervention to address depression amongst people living in nursing and residential homes for older people. A process evaluation was commissioned alongside the trial and we report the protocol for this process evaluation. Challenges included the cognitive and physical ability of the participants, the need to respect the privacy of all home residents, including study non-participants, and the physical structure of the homes. Evaluation activity had to be organised around the structured timetable of homes, leaving limited opportunities for data collection. The aims of this process evaluation are to provide findings that will assist in the interpretation of the clinical trial results, and to inform potential implementation of the physical activity intervention on a wider scale. Methods/design Quantitative data on recruitment of homes and individuals are being collected. For homes in the intervention arm, data on the dose and fidelity of the intervention delivered, including individual rates of participation in exercise classes, are collected. In the control homes, uptake and delivery of depression awareness training is monitored. These data will be combined with qualitative data from an in-depth study of a purposive sample of eight homes (six intervention and two control). Discussion Although process evaluations are increasingly funded alongside trials, it is still rare to see the findings published, and even rarer to see the protocol for such an evaluation published. Process evaluations have the potential to assist in interpreting and understanding trial results as well as informing future roll-outs of interventions. If such evaluations are funded they should also be reported and reviewed in a similar way to the trial outcome evaluation. Trial Registration ISRCTN No: ISRCTN43769277 PMID:21288341

  6. Software Assurance Curriculum Project Volume 1: Master of Software Assurance Reference Curriculum

    DTIC Science & Technology

    2010-08-01

    activity by providing a check on the relevance and currency of the process used to develop the MSwA2010 curriculum content. Figure 2 is an expansion of...random oracle model, symmetric crypto primitives, modes of operation, asymmetric crypto primitives (Chapter 5) [16] Detailed design...encryption, public key encryption, digital signatures, message authentication codes, crypto protocols, cryptanalysis, and further detailed crypto

  7. A Methodology and Software Environment for Testing Process Model’s Sequential Predictions with Protocols

    DTIC Science & Technology

    1992-12-21

    in preparation). Foundations of artificial intelligence. Cambridge, MA: MIT Press. O’Reilly, R. C. (1991). X3DNet: An X-Based Neural Network ...2.2.3 Trace based protocol analysis; 2.2.4 Summary of important data features; 2.3 Tools related to process model testing; 2.3.1 Tools for building...algorithm; 3. Requirements for testing process models using trace based protocol analysis; 3.1 Definition of trace based protocol analysis (TBPA)

  8. Design of structurally distinct proteins using strategies inspired by evolution

    DOE PAGES

    Jacobs, T. M.; Williams, B.; Williams, T.; ...

    2016-05-06

    Natural recombination combines pieces of preexisting proteins to create new tertiary structures and functions. In this paper, we describe a computational protocol, called SEWING, which is inspired by this process and builds new proteins from connected or disconnected pieces of existing structures. Helical proteins designed with SEWING contain structural features absent from other de novo designed proteins and, in some cases, remain folded at more than 100°C. High-resolution structures of the designed proteins CA01 and DA05R1 were solved by x-ray crystallography (2.2 angstrom resolution) and nuclear magnetic resonance, respectively, and there was excellent agreement with the design models. Finally, this method provides a new strategy to rapidly create large numbers of diverse and designable protein scaffolds.

  9. A technology training protocol for meeting QSEN goals: Focusing on meaningful learning.

    PubMed

    Luo, Shuhong; Kalman, Melanie

    2018-01-01

    The purpose of this paper is to describe and discuss how we designed and developed a 12-step technology training protocol. The protocol is meant to improve meaningful learning in technology education so that nursing students are able to meet the informatics requirements of Quality and Safety Education in Nursing competencies. When designing and developing the training protocol, we used a simplified experiential learning model that addressed the core features of meaningful learning: to connect new knowledge with students' prior knowledge and real-world workflow. Before training, we identified students' prior knowledge and workflow tasks. During training, students learned by doing, reflected on their prior computer skills and workflow, designed individualized procedures for integration into their workflow, and practiced the self-designed procedures in real-world settings. The trainer was a facilitator who provided a meaningful learning environment, asked the right questions to guide reflective conversation, and offered scaffoldings at critical moments. This training protocol could significantly improve nurses' competencies in using technologies and increase their desire to adopt new technologies. © 2017 Wiley Periodicals, Inc.

  10. A Novel Cross-Layer Routing Protocol Based on Network Coding for Underwater Sensor Networks.

    PubMed

    Wang, Hao; Wang, Shilian; Bu, Renfei; Zhang, Eryang

    2017-08-08

    Underwater wireless sensor networks (UWSNs) have attracted increasing attention in recent years because of their numerous applications in ocean monitoring, resource discovery and tactical surveillance. However, the design of reliable and efficient transmission and routing protocols is a challenge due to the low acoustic propagation speed and complex channel environment in UWSNs. In this paper, we propose a novel cross-layer routing protocol based on network coding (NCRP) for UWSNs, which utilizes network coding and cross-layer design to greedily forward data packets to sink nodes efficiently. The proposed NCRP takes full advantage of multicast transmission and decodes packets jointly from the encoded packets received from multiple potential nodes across the network. The transmission power is optimized in our design to extend the life cycle of the network. Moreover, we design a real-time routing maintenance protocol to update the route when inefficient relay nodes are detected. Substantial simulations of the underwater environment in Network Simulator 3 (NS-3) show that NCRP significantly improves network performance in terms of energy consumption, end-to-end delay and packet delivery ratio compared with other routing protocols for UWSNs.
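
    A minimal sketch of the greedy forwarding step described above: among neighbours that make progress toward the sink, pick the one closest to it. Coordinates and node positions are hypothetical, and the real NCRP metric additionally weighs network coding opportunities and transmission power, which this omits.

```python
# Greedy next-hop selection toward a surface sink. Positions are
# hypothetical (x, y, depth) coordinates in metres.
import math

def next_hop(node, neighbours, sink):
    """Neighbour closest to the sink, if it improves on the current node."""
    progress = [n for n in neighbours
                if math.dist(n, sink) < math.dist(node, sink)]
    return min(progress, key=lambda n: math.dist(n, sink), default=None)

node = (0.0, 0.0, -500.0)                  # sensor 500 m deep
sink = (0.0, 0.0, 0.0)                     # surface buoy
neighbours = [(50.0, 0.0, -400.0), (0.0, 60.0, -480.0)]
print(next_hop(node, neighbours, sink))    # (50.0, 0.0, -400.0)
```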

  11. Evaluating Data Abstraction Assistant, a novel software application for data abstraction during systematic reviews: protocol for a randomized controlled trial.

    PubMed

    Saldanha, Ian J; Schmid, Christopher H; Lau, Joseph; Dickersin, Kay; Berlin, Jesse A; Jap, Jens; Smith, Bryant T; Carini, Simona; Chan, Wiley; De Bruijn, Berry; Wallace, Byron C; Hutfless, Susan M; Sim, Ida; Murad, M Hassan; Walsh, Sandra A; Whamond, Elizabeth J; Li, Tianjing

    2016-11-22

    Data abstraction, a critical systematic review step, is time-consuming and prone to errors. Current standards for approaches to data abstraction rest on a weak evidence base. We developed the Data Abstraction Assistant (DAA), a novel software application designed to facilitate the abstraction process by allowing users to (1) view study article PDFs juxtaposed to electronic data abstraction forms linked to a data abstraction system, (2) highlight (or "pin") the location of the text in the PDF, and (3) copy relevant text from the PDF into the form. We describe the design of a randomized controlled trial (RCT) that compares the relative effectiveness of (A) DAA-facilitated single abstraction plus verification by a second person, (B) traditional (non-DAA-facilitated) single abstraction plus verification by a second person, and (C) traditional independent dual abstraction plus adjudication to ascertain the accuracy and efficiency of abstraction. This is an online, randomized, three-arm, crossover trial. We will enroll 24 pairs of abstractors (i.e., sample size is 48 participants), each pair comprising one less and one more experienced abstractor. Pairs will be randomized to abstract data from six articles, two under each of the three approaches. Abstractors will complete pre-tested data abstraction forms using the Systematic Review Data Repository (SRDR), an online data abstraction system. The primary outcomes are (1) proportion of data items abstracted that constitute an error (compared with an answer key) and (2) total time taken to complete abstraction (by two abstractors in the pair, including verification and/or adjudication). The DAA trial uses a practical design to test a novel software application as a tool to help improve the accuracy and efficiency of the data abstraction process during systematic reviews. Findings from the DAA trial will provide much-needed evidence to strengthen current recommendations for data abstraction approaches. The trial is registered at National Information Center on Health Services Research and Health Care Technology (NICHSR) under Registration # HSRP20152269: https://wwwcf.nlm.nih.gov/hsr_project/view_hsrproj_record.cfm?NLMUNIQUE_ID=20152269&SEARCH_FOR=Tianjing%20Li . All items from the World Health Organization Trial Registration Data Set are covered at various locations in this protocol. Protocol version and date: This is version 2.0 of the protocol, dated September 6, 2016. As needed, we will communicate any protocol amendments to the Institutional Review Boards (IRBs) of Johns Hopkins Bloomberg School of Public Health (JHBSPH) and Brown University. We also will make appropriate as-needed modifications to the NICHSR website in a timely fashion.

  12. Dependency of image quality on acquisition protocol and image processing in chest tomosynthesis-a visual grading study based on clinical data.

    PubMed

    Jadidi, Masoud; Båth, Magnus; Nyrén, Sven

    2018-04-09

    To compare the quality of images obtained with two protocols with different acquisition times, and the influence of image post-processing, in a chest digital tomosynthesis (DTS) system. 20 patients with suspected lung cancer were imaged with chest X-ray equipment with a tomosynthesis option. Two examination protocols with different acquisition times (6.3 and 12 s) were performed on each patient. Images from both protocols were presented with two different post-processings (standard DTS processing and more advanced processing optimised for chest radiography). Thus, 4 series from each patient, altogether 80 series, were presented anonymously and in a random order. Five observers rated the quality of the reconstructed section images according to predefined quality criteria in three different classes. Visual grading characteristics (VGC) was used to analyse the data, and the area under the VGC curve (AUC_VGC) was used as the figure-of-merit. The 12 s protocol and the standard DTS processing were used as references in the analyses. The protocol with 6.3 s acquisition time had a statistically significant advantage over the vendor-recommended protocol with 12 s acquisition time for the classes of criteria Demarcation (AUC_VGC = 0.56, p = 0.009) and Disturbance (AUC_VGC = 0.58, p < 0.001). A similar value of AUC_VGC was found for the class Structure (definition of bone structures in the spine) (0.56), but it could not be statistically separated from 0.5 (p = 0.21). For the image processing, the VGC analysis showed a small but statistically significant advantage for the standard DTS processing over the more advanced processing for the classes of criteria Demarcation (AUC_VGC = 0.45, p = 0.017) and Disturbance (AUC_VGC = 0.43, p = 0.005). A similar value of AUC_VGC was found for the class Structure (0.46), but it could not be statistically separated from 0.5 (p = 0.31). The study indicates that the protocol with 6.3 s acquisition time yields slightly better image quality than the vendor-recommended protocol with acquisition time 12 s for several anatomical structures. Furthermore, the standard gradation processing (the vendor-recommended post-processing for DTS) yields some advantage over the gradation processing/multiobjective frequency processing/flexible noise control processing in terms of image quality for all classes of criteria. Advances in knowledge: The study shows that image quality may be strongly affected by the selection of the DTS protocol and that the vendor-recommended protocol may not always be the optimal choice.
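
    For readers unfamiliar with the figure-of-merit, the area under a VGC curve is, like an ROC area, estimable nonparametrically as the rank statistic P(A > B) + 0.5 P(A = B) over pairs of ordinal ratings. The sketch below uses invented ratings and ignores the paired design and confidence-interval machinery of a full VGC analysis.

```python
# Nonparametric AUC estimate from ordinal image-quality ratings.
# Ratings are invented; values above 0.5 favour the first protocol.
def auc_vgc(ratings_a, ratings_b):
    """P(A > B) + 0.5 * P(A == B) over all rating pairs."""
    wins = ties = 0
    for a in ratings_a:
        for b in ratings_b:
            wins += a > b
            ties += a == b
    return (wins + 0.5 * ties) / (len(ratings_a) * len(ratings_b))

short_protocol = [4, 4, 3, 5, 4, 3, 4]  # hypothetical 6.3 s ratings
long_protocol = [3, 4, 3, 4, 3, 3, 4]   # hypothetical 12 s ratings
print(f"AUC_VGC = {auc_vgc(short_protocol, long_protocol):.2f}")
```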

  13. Application of Metagenomic Sequencing to Food Safety: Detection of Shiga Toxin-Producing Escherichia coli on Fresh Bagged Spinach

    PubMed Central

    Leonard, Susan R.; Mammel, Mark K.; Lacher, David W.

    2015-01-01

    Culture-independent diagnostics reduce the reliance on traditional (and slower) culture-based methodologies. Here we capitalize on advances in next-generation sequencing (NGS) to apply this approach to food pathogen detection utilizing NGS as an analytical tool. In this study, spiking spinach with Shiga toxin-producing Escherichia coli (STEC) following an established FDA culture-based protocol was used in conjunction with shotgun metagenomic sequencing to determine the limits of detection, sensitivity, and specificity levels and to obtain information on the microbiology of the protocol. We show that an expected level of contamination (∼10 CFU/100 g) could be adequately detected (including key virulence determinants and strain-level specificity) within 8 h of enrichment at a sequencing depth of 10,000,000 reads. We also rationalize the relative benefit of static versus shaking culture conditions and the addition of selected antimicrobial agents, thereby validating the long-standing culture-based parameters behind such protocols. Moreover, the shotgun metagenomic approach was informative regarding the dynamics of microbial communities during the enrichment process, including initial surveys of the microbial loads associated with bagged spinach; the microbes found included key genera such as Pseudomonas, Pantoea, and Exiguobacterium. Collectively, our metagenomic study highlights and considers various parameters required for transitioning to such sequencing-based diagnostics for food safety and the potential to develop better enrichment processes in a high-throughput manner not previously possible. Future studies will investigate new species-specific DNA signature target regimens, rational design of medium components in concert with judicious use of additives, such as antibiotics, and alterations in the sample processing protocol to enhance detection. PMID:26386062

  14. An evaluation of MPI message rate on hybrid-core processors

    DOE PAGES

    Barrett, Brian W.; Brightwell, Ron; Grant, Ryan; ...

    2014-11-01

    Power and energy concerns are motivating chip manufacturers to consider future hybrid-core processor designs that may combine a small number of traditional cores optimized for single-thread performance with a large number of simpler cores optimized for throughput performance. This trend is likely to impact the way in which compute resources for network protocol processing functions are allocated and managed. In particular, the performance of MPI match processing is critical to achieving high message throughput. In this paper, we analyze the ability of simple and more complex cores to perform MPI matching operations for various scenarios in order to gain insight into how MPI implementations for future hybrid-core processors should be designed.
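
    A sketch of the matching operation in question: each incoming message is checked against the posted-receive queue on (communicator, source, tag), honouring the MPI_ANY_SOURCE and MPI_ANY_TAG wildcards. The linear walk below is what makes long queues costly on cores built for throughput rather than single-thread latency; the data structures are simplified assumptions, not any MPI implementation's internals.

```python
# Simplified MPI posted-receive matching. ANY stands in for the
# MPI_ANY_SOURCE / MPI_ANY_TAG wildcards.
ANY = -1

def match(posted, msg):
    """Return and remove the first posted receive matching the message."""
    for i, recv in enumerate(posted):
        if (recv["comm"] == msg["comm"]
                and recv["source"] in (ANY, msg["source"])
                and recv["tag"] in (ANY, msg["tag"])):
            return posted.pop(i)
    return None  # no match: message would join the unexpected queue

posted = [{"comm": 0, "source": 3, "tag": 7},
          {"comm": 0, "source": ANY, "tag": 9}]
print(match(posted, {"comm": 0, "source": 5, "tag": 9}))  # wildcard hit
```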

  15. Telecommunications issues of intelligent database management for ground processing systems in the EOS era

    NASA Technical Reports Server (NTRS)

    Touch, Joseph D.

    1994-01-01

    Future NASA earth science missions, including the Earth Observing System (EOS), will be generating vast amounts of data that must be processed and stored at various locations around the world. Here we present a stepwise refinement of the intelligent database management (IDM) architecture of the distributed active archive center (DAAC, one of seven regionally located EOSDIS archive sites) to showcase the telecommunications issues involved. We develop this architecture into a general overall design. We show that the current evolution of protocols is sufficient to support IDM at Gbps rates over large distances. We also show that the network design can accommodate a flexible data-ingestion storage pipeline and a user extraction and visualization engine, without interference between the two.

  16. Safe bunker designing for the 18 MV Varian 2100 Clinac: a comparison between Monte Carlo simulation based upon data and new protocol recommendations

    PubMed Central

    Beigi, Manije; Afarande, Fatemeh; Ghiasi, Hosein

    2016-01-01

    Aim The aim of this study was to compare two bunkers, one designed using only protocol recommendations and one using Monte Carlo (MC) based data, derived for an 18 MV Varian 2100 Clinac accelerator. Background High energy radiation therapy is associated with fast and thermal photoneutrons. Adequate shielding against the contaminant neutrons has been recommended by the new IAEA and NCRP protocols. Materials and methods The latest protocols released by the IAEA (safety report No. 47) and NCRP report No. 151 were used for the bunker design calculations. MC-based data were also derived. Two bunkers, one based on the protocols and one on the MC data, were designed and discussed. Results Regarding door thickness, the MC simulation and the Wu–McGinley analytical method were closer in both BPE and lead thickness. For the primary and secondary barriers, MC simulation resulted in 440.11 mm for the ordinary concrete, with a total concrete thickness of 1709 mm required. Calculating the same parameters with the recommended analytical methods resulted in a required thickness of 1762 mm, using the 445 mm TVL recommended for concrete. Additionally, a thickness of 752.05 mm was obtained for the secondary barrier. Conclusion Our results showed that the MC simulation and the protocol recommendations are in good agreement in the contamination dose calculations. Differences between the analytical and MC simulation methods revealed that applying only one method to bunker design may lead to underestimation or overestimation in dose and shielding calculations. PMID:26900357
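
    The analytical calculations referenced above follow the standard tenth-value-layer (TVL) barrier formula, thickness = TVL1 + (n - 1) * TVLe, where n is log10 of the required attenuation factor. A worked sketch using the 445 mm concrete TVL cited in the abstract; the attenuation factor and the equal TVL1/TVLe values are assumptions for illustration.

```python
# Tenth-value-layer barrier sizing. The 445 mm TVL is the concrete
# value cited above; the attenuation factor is an assumed example.
import math

def barrier_thickness(attenuation_factor, tvl1_mm, tvle_mm):
    """Barrier thickness for a required dose attenuation factor."""
    n = math.log10(attenuation_factor)  # number of tenth-value layers
    return tvl1_mm + (n - 1) * tvle_mm

print(f"{barrier_thickness(1e4, 445, 445):.0f} mm")  # 1780 mm, close to
# the ~1762 mm analytical barrier figure quoted in the abstract
```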

  17. Optimization of a sample processing protocol for recovery of Bacillus anthracis spores from soil

    USGS Publications Warehouse

    Silvestri, Erin E.; Feldhake, David; Griffin, Dale; Lisle, John T.; Nichols, Tonya L.; Shah, Sanjiv; Pemberton, A; Schaefer III, Frank W

    2016-01-01

    Following a release of Bacillus anthracis spores into the environment, there is a potential for lasting environmental contamination in soils. There is a need for detection protocols for B. anthracis in environmental matrices. However, identification of B. anthracis within a soil is a difficult task. Processing soil samples helps to remove debris, chemical components, and biological impurities that can interfere with microbiological detection. This study aimed to optimize a previously used indirect processing protocol, which included a series of washing and centrifugation steps. Optimization of the protocol included identifying an ideal extraction diluent and varying the number of wash steps, the initial centrifugation speed, and the sonication and shaking mechanisms. The optimized protocol was demonstrated at two laboratories in order to evaluate the recovery of spores from loamy and sandy soils. The new protocol demonstrated an improved limit of detection for loamy and sandy soils over the non-optimized protocol, with an approximate matrix limit of detection of 14 spores/g of soil. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol will be robust enough to use at multiple laboratories while achieving comparable recoveries.

  18. Rendezvous Protocols and Dynamic Frequency Hopping Interference Design for Anti-Jamming Satellite Communication

    DTIC Science & Technology

    2013-11-25

    previously considered this proactive approach to combat unintentional, persistent (non-reactive) interference. In this project, we plan on extending our...channel” (or code) by chance, through public knowledge of the underlying protocol semantics, or by compromising one of the network devices. An alternative...AFRL-RV-PS-TR-2013-0142 RENDEZVOUS PROTOCOLS AND DYNAMIC FREQUENCY HOPPING INTERFERENCE DESIGN FOR ANTI-JAMMING

  19. The Unanticipated Challenges Associated With Implementing an Observational Study Protocol in a Large-Scale Physical Activity and Global Positioning System Data Collection.

    PubMed

    McCrorie, Paul; Walker, David; Ellaway, Anne

    2018-04-30

    Large-scale primary data collections are complex, costly, and time-consuming. Study protocols for trial-based research are now commonplace, with a growing number of similar pieces of work being published on observational research. However, useful additions to the literature base are publications that describe the issues and challenges faced while conducting observational studies. These can provide researchers with insightful knowledge that can inform funding proposals or project development work. In this study, we identify and reflectively discuss the unforeseen or often unpublished issues associated with organizing and implementing a large-scale objectively measured physical activity and global positioning system (GPS) data collection. The SPACES (Studying Physical Activity in Children's Environments across Scotland) study was designed to collect objectively measured physical activity and GPS data from 10- to 11-year-old children across Scotland, using a postal delivery method. The 3 main phases of the project (recruitment, delivery of project materials, and data collection and processing) are described within a 2-stage framework: (1) intended design and (2) implementation of the intended design. Unanticipated challenges arose, which influenced the data collection process; these encompass four main impact categories: (1) cost, budget, and funding; (2) project timeline; (3) participation and engagement; and (4) data challenges. The main unforeseen issues that impacted our timeline included the informed consent process for children under the age of 18 years; the use of, and coordination with, the postal service to deliver study information and equipment; and the variability associated with when participants began data collection and the time taken to send devices and consent forms back (1-12 months). Unanticipated budgetary issues included the identification of some study materials (AC power adapter) not fitting through letterboxes, as well as the employment of fieldworkers to increase recruitment and the return of consent forms. Finally, we encountered data issues when processing physical activity and GPS data that had been initiated across daylight saving time. We present learning points and recommendations that may benefit future studies of similar methodology in their early stages of development. ©Paul McCrorie, David Walker, Anne Ellaway. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 30.04.2018.

  20. Quality Assurance Program Plan for SFR Metallic Fuel Data Qualification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benoit, Timothy; Hlotke, John Daniel; Yacout, Abdellatif

    2017-07-05

    This document contains an evaluation of the applicability of the current Quality Assurance Standards from the American Society of Mechanical Engineers Standard NQA-1 (NQA-1) criteria and identifies and describes the quality assurance process(es) by which attributes of historical, analytical, and other data associated with sodium-cooled fast reactor [SFR] metallic fuel and/or related reactor fuel designs and constituency will be evaluated. This process is being instituted to facilitate validation of data to the extent that such data may be used to support future licensing efforts associated with advanced reactor designs. The initial data to be evaluated under this program were generated during the US Integral Fast Reactor program between 1984 and 1994, where the data includes, but is not limited to, research and development data and associated documents, test plans and associated protocols, operations and test data, technical reports, and information associated with past United States Nuclear Regulatory Commission reviews of SFR designs.

  1. DNA Assembly in 3D Printed Fluidics

    PubMed Central

    Patrick, William G.; Nielsen, Alec A. K.; Keating, Steven J.; Levy, Taylor J.; Wang, Che-Wei; Rivera, Jaime J.; Mondragón-Palomino, Octavio; Carr, Peter A.; Voigt, Christopher A.; Oxman, Neri; Kong, David S.

    2015-01-01

    The process of connecting genetic parts—DNA assembly—is a foundational technology for synthetic biology. Microfluidics present an attractive solution for minimizing use of costly reagents, enabling multiplexed reactions, and automating protocols by integrating multiple protocol steps. However, microfluidics fabrication and operation can be expensive and requires expertise, limiting access to the technology. With advances in commodity digital fabrication tools, it is now possible to directly print fluidic devices and supporting hardware. 3D printed micro- and millifluidic devices are inexpensive, easy to make and quick to produce. We demonstrate Golden Gate DNA assembly in 3D-printed fluidics with reaction volumes as small as 490 nL, channel widths as fine as 220 microns, and per unit part costs ranging from $0.61 to $5.71. A 3D-printed syringe pump with an accompanying programmable software interface was designed and fabricated to operate the devices. Quick turnaround and inexpensive materials allowed for rapid exploration of device parameters, demonstrating a manufacturing paradigm for designing and fabricating hardware for synthetic biology. PMID:26716448
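
    As a back-of-envelope for the stepper-driven syringe pump the authors print, the volume delivered per motor step follows from the plunger cross-section and the leadscrew travel per step. Every number below is an assumption for illustration, not the paper's hardware specification.

```python
# Volume resolution of a hypothetical stepper-driven syringe pump.
import math

steps_per_rev = 200 * 16    # 1.8-degree motor with 16x microstepping
lead_mm = 2.0               # leadscrew travel per revolution
syringe_diam_mm = 4.6       # ~1 mL syringe barrel

area_mm2 = math.pi * (syringe_diam_mm / 2) ** 2
nl_per_step = area_mm2 * (lead_mm / steps_per_rev) * 1000  # 1 mm^3 = 1 uL
print(f"{nl_per_step:.1f} nL per step")                    # ~10.4 nL

# Steps for the 490 nL reaction volume reported above: ~47 steps,
# i.e. dosing at that scale is comfortably within reach.
print(round(490 / nl_per_step), "steps for 490 nL")
```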

  2. MolIDE: a homology modeling framework you can click with.

    PubMed

    Canutescu, Adrian A; Dunbrack, Roland L

    2005-06-15

    Molecular Integrated Development Environment (MolIDE) is an integrated application designed to provide homology modeling tools and protocols under a uniform, user-friendly graphical interface. Its main purpose is to combine the most frequent modeling steps in a semi-automatic, interactive way, guiding the user from the target protein sequence to the final three-dimensional protein structure. The typical basic homology modeling process is composed of building sequence profiles of the target sequence family, secondary structure prediction, sequence alignment with PDB structures, assisted alignment editing, side-chain prediction and loop building. All of these steps are available through a graphical user interface. MolIDE's user-friendly and streamlined interactive modeling protocol allows the user to focus on the important modeling questions, hiding from the user the raw data generation and conversion steps. MolIDE was designed from the ground up as an open-source, cross-platform, extensible framework. This allows developers to integrate additional third-party programs into MolIDE. http://dunbrack.fccc.edu/molide/molide.php rl_dunbrack@fccc.edu.

  3. Molecular beacon-enabled purification of living cells by targeting cell type-specific mRNAs.

    PubMed

    Wile, Brian M; Ban, Kiwon; Yoon, Young-Sup; Bao, Gang

    2014-10-01

    Molecular beacons (MBs) are dual-labeled oligonucleotides that fluoresce only in the presence of complementary mRNA. The use of MBs to target specific mRNAs allows sorting of specific cells from a mixed cell population. In contrast to existing approaches that are limited by available surface markers or selectable metabolic characteristics, the MB-based method enables the isolation of a wide variety of cells. For example, the ability to purify specific cell types derived from pluripotent stem cells (PSCs) is important for basic research and therapeutics. In addition to providing a general protocol for MB design, validation and nucleofection into cells, we describe how to isolate a specific cell population from differentiating PSCs. By using this protocol, we have successfully isolated cardiomyocytes differentiated from mouse or human PSCs (hPSCs) with ∼97% purity, as confirmed by electrophysiology and immunocytochemistry. After the MBs are designed, ordering and validation require 2 weeks, and the isolation process requires 3 h.

  4. Paediatric CT protocol optimisation: a design of experiments to support the modelling and optimisation process.

    PubMed

    Rani, K; Jahnen, A; Noel, A; Wolf, D

    2015-07-01

    In the last decade, several studies have emphasised the need to understand and optimise computed tomography (CT) procedures in order to reduce the radiation dose applied to paediatric patients. To evaluate the influence of the technical parameters on the radiation dose and the image quality, a statistical model has been developed using the design of experiments (DOE) method, which has been used successfully in various fields (industry, biology and finance), here applied to CT procedures for the abdomen of paediatric patients. A Box-Behnken DOE was used in this study. Three mathematical models (contrast-to-noise ratio, noise and CTDIvol) depending on three factors (tube current, tube voltage and level of iterative reconstruction) were developed and validated. They will serve as a basis for the development of a CT protocol optimisation model. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
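
    For reference, a three-factor Box-Behnken design comprises the 12 edge midpoints (two factors at coded levels ±1, the third at 0) plus replicated centre points. The sketch below generates the coded design matrix; the factor names come from the abstract, while the choice of three centre points is an assumption.

```python
# Coded Box-Behnken design matrix for three factors.
from itertools import combinations, product

def box_behnken(n_factors, centre_points=3):
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a, b in product((-1, 1), repeat=2):
            row = [0] * n_factors
            row[i], row[j] = a, b       # two factors at +/-1, rest at 0
            runs.append(tuple(row))
    runs += [(0,) * n_factors] * centre_points
    return runs

factors = ("tube current", "tube voltage", "iterative reconstruction level")
design = box_behnken(len(factors))
print(len(design), "runs")  # 15: 12 edge midpoints + 3 centre points
```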

  5. Template for success: using a resident-designed sign-out template in the handover of patient care.

    PubMed

    Clark, Clancy J; Sindell, Sarah L; Koehler, Richard P

    2011-01-01

    Report our implementation of a standardized handover process in a general surgery residency program. The standardized handover process, sign-out template, method of implementation, and continuous quality improvement process were designed by general surgery residents with support of faculty and senior hospital administration using standard work principles and business models of the Virginia Mason Production System and the Toyota Production System. Nonprofit, tertiary referral teaching hospital. General surgery residents, residency faculty, patient care providers, and hospital administration. After instruction in quality improvement initiatives, a team of general surgery residents designed a sign-out process using an electronic template and standard procedures. The initial implementation phase resulted in 73% compliance. Using resident-driven continuous quality improvement processes, real-time feedback enabled residents to modify and improve this process, eventually attaining 100% compliance and acceptance by residents. The creation of a standardized template and protocol for patient handovers might eliminate communication failures. Encouraging residents to participate in this process can establish the groundwork for successful implementation of a standardized handover process. Integrating a continuous quality-improvement process into such an initiative can promote active participation of busy general surgery residents and lead to successful implementation of standard procedures. Copyright © 2011 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  6. Auto-Generated Semantic Processing Services

    NASA Technical Reports Server (NTRS)

    Davis, Rodney; Hupf, Greg

    2009-01-01

    Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating- system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer- based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: In effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms is replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.
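
    The core idea, specification over implementation, can be illustrated with a toy adapter generator: a declarative message description is "compiled" into pack/unpack functions for a binary stream. The spec format and field names below are hypothetical, not AGSP's actual metadata model.

```python
# Toy semantic adapter generation: a platform-independent message spec is
# turned into functions that pack/unpack the binary wire format. Field
# names and the spec layout are hypothetical illustrations.
import struct

SPEC = [                      # hypothetical message spec: (field, C type)
    ("msg_id",    "H"),       # uint16
    ("timestamp", "I"),       # uint32
    ("temp_cK",   "h"),       # int16, centi-Kelvin
]

def make_adapter(spec, byte_order=">"):      # big-endian wire format assumed
    fmt = byte_order + "".join(t for _, t in spec)
    names = [n for n, _ in spec]
    def pack(msg):
        return struct.pack(fmt, *(msg[n] for n in names))
    def unpack(raw):
        return dict(zip(names, struct.unpack(fmt, raw)))
    return pack, unpack

pack, unpack = make_adapter(SPEC)
wire = pack({"msg_id": 7, "timestamp": 123456, "temp_cK": -250})
print(unpack(wire))   # {'msg_id': 7, 'timestamp': 123456, 'temp_cK': -250}
```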

  7. A Systematic Comprehensive Computational Model for Stake Estimation in Mission Assurance: Applying Cyber Security Econometrics System (CSES) to Mission Assurance Analysis Protocol (MAAP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T; Grimaila, Michael R

    2010-01-01

    In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. In this paper, we discuss how this infrastructure can be used in the subject domain of mission assurance, defined as the full life-cycle engineering process to identify and mitigate design, production, test, and field support deficiencies that threaten mission success. We address the opportunity to apply the Cyberspace Security Econometrics System (CSES) to Carnegie Mellon University and Software Engineering Institute's Mission Assurance Analysis Protocol (MAAP) in this context.
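
    The stake-estimation arithmetic behind a CSES-style analysis can be sketched as a chain of matrix products that propagates stakeholder stakes through requirement, component, and threat layers. The shapes and numbers below are illustrative assumptions, not the paper's data or its exact formulation.

```python
# Toy CSES-style mean failure cost: stakes (stakeholder x requirement)
# propagated through requirement-component dependencies and
# component-threat impacts to a threat probability vector.
# All matrices and values are illustrative assumptions.
import numpy as np

stakes = np.array([[900.0, 200.0],   # 2 stakeholders x 2 requirements ($/failure)
                   [100.0, 700.0]])
depend = np.array([[0.8, 0.2],       # 2 requirements x 2 components
                   [0.3, 0.9]])      #   P(req fails | component fails)
impact = np.array([[0.5, 0.1],       # 2 components x 2 threats
                   [0.2, 0.6]])      #   P(component fails | threat)
threat_p = np.array([0.01, 0.03])    # threat probabilities per unit time

mean_failure_cost = stakes @ depend @ impact @ threat_p  # $ per stakeholder
print(mean_failure_cost)
```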

  8. Group protocol to mitigate disaster stress and enhance social support in adolescents exposed to Hurricane Hugo.

    PubMed

    Stewart, J B; Hardin, S B; Weinrich, S; McGeorge, S; Lopez, J; Pesut, D

    1992-01-01

    Literature reports that cognitive understanding and social support can mitigate stress in both adults and adolescents. As a subcomponent of the Carolina Adolescent Health Project (CAHP), this research evaluated the efficacy of a Cognitive Social Support (CSS) group protocol designed to mitigate the disaster stress of adolescents who had been seriously exposed to Hurricane Hugo. A purposive sample of 259 students participated in and evaluated the CSS. This article reports the specific structure, content, process, rationale, and cost of the CSS. Evaluations indicated that 82% of the students rated the small-group component of the CSS as "very good" or "excellent," while 70% rated the large-group component as "very good" or "excellent."

  9. The Design of Finite State Machine for Asynchronous Replication Protocol

    NASA Astrophysics Data System (ADS)

    Wang, Yanlong; Li, Zhanhuai; Lin, Wei; Hei, Minglei; Hao, Jianhua

    Data replication is a key way to design a disaster tolerance system and to achieve reliability and availability. It is difficult for a replication protocol to deal with the diverse and complex environment, which means that data is often less well replicated than it ought to be. To reduce data loss and to optimize replication protocols, we (1) present a finite state machine, (2) run it to manage an asynchronous replication protocol and (3) report a simple evaluation of the asynchronous replication protocol based on our state machine. Our evaluation shows that the state machine keeps the asynchronous replication protocol in the proper state to the largest extent possible across a range of possible events. It can also help in building replication-based disaster tolerance systems that ensure business continuity.
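
    A finite state machine of this kind reduces to a transition table plus an event handler that rejects events invalid in the current state. The states, events, and transitions in this Python sketch are illustrative assumptions distilled from the abstract, not the paper's actual state set.

```python
# Toy finite state machine managing an asynchronous replication protocol.
# States, events, and transitions are illustrative assumptions.

TRANSITIONS = {
    ("idle",        "write_logged"):  "buffering",
    ("buffering",   "batch_ready"):   "replicating",
    ("replicating", "ack_received"):  "idle",
    ("replicating", "link_failed"):   "degraded",
    ("degraded",    "link_restored"): "replicating",  # resume pending batch
}

class ReplicationFSM:
    def __init__(self):
        self.state = "idle"

    def handle(self, event: str) -> str:
        next_state = TRANSITIONS.get((self.state, event))
        if next_state is None:
            raise ValueError(f"event {event!r} invalid in state {self.state!r}")
        self.state = next_state
        return self.state

fsm = ReplicationFSM()
for ev in ["write_logged", "batch_ready", "link_failed",
           "link_restored", "ack_received"]:
    print(ev, "->", fsm.handle(ev))
```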

  10. A Web Resource for Standardized Benchmark Datasets, Metrics, and Rosetta Protocols for Macromolecular Modeling and Design.

    PubMed

    Ó Conchúir, Shane; Barlow, Kyle A; Pache, Roland A; Ollikainen, Noah; Kundert, Kale; O'Meara, Matthew J; Smith, Colin A; Kortemme, Tanja

    2015-01-01

    The development and validation of computational macromolecular modeling and design methods depend on suitable benchmark datasets and informative metrics for comparing protocols. In addition, if a method is intended to be adopted broadly in diverse biological applications, there needs to be information on appropriate parameters for each protocol, as well as metrics describing the expected accuracy compared to experimental data. In certain disciplines, there exist established benchmarks and public resources where experts in a particular methodology are encouraged to supply their most efficient implementation of each particular benchmark. We aim to provide such a resource for protocols in macromolecular modeling and design. We present a freely accessible web resource (https://kortemmelab.ucsf.edu/benchmarks) to guide the development of protocols for protein modeling and design. The site provides benchmark datasets and metrics to compare the performance of a variety of modeling protocols using different computational sampling methods and energy functions, providing a "best practice" set of parameters for each method. Each benchmark has an associated downloadable benchmark capture archive containing the input files, analysis scripts, and tutorials for running the benchmark. The captures may be run with any suitable modeling method; we supply command lines for running the benchmarks using the Rosetta software suite. We have compiled initial benchmarks for the resource spanning three key areas: prediction of energetic effects of mutations, protein design, and protein structure prediction, each with associated state-of-the-art modeling protocols. With the help of the wider macromolecular modeling community, we hope to expand the variety of benchmarks included on the website and continue to evaluate new iterations of current methods as they become available.
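
    A typical metric such a resource reports is the agreement between predicted and experimental values over a benchmark set, for instance the Pearson correlation for mutation ddG predictions. The sketch below shows only the computation; the numbers are placeholders, and real inputs would come from a downloaded benchmark capture.

```python
# Compute a benchmark-style agreement metric between predicted and
# experimental ddG values. The values below are placeholders purely to
# make the snippet runnable.
from scipy.stats import pearsonr

experimental = [1.2, -0.4, 3.1, 0.0, 2.2]   # ddG (kcal/mol), placeholders
predicted    = [0.9, -0.1, 2.5, 0.4, 1.8]   # protocol output, placeholders

r, _ = pearsonr(experimental, predicted)
print(f"Pearson r = {r:.2f}")
```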

  11. Post-task Effects on EEG Brain Activity Differ for Various Differential Learning and Contextual Interference Protocols

    PubMed Central

    Henz, Diana; John, Alexander; Merz, Christian; Schöllhorn, Wolfgang I.

    2018-01-01

    A large body of research has shown superior learning rates in variable practice compared to repetitive practice. More specifically, this has been demonstrated in the contextual interference (CI) and the differential learning (DL) approaches, which are both representatives of variable practice. Behavioral studies have indicated different learning processes in CI and DL. The aim of the present study was to examine immediate post-task effects on electroencephalographic (EEG) brain activation patterns after CI and DL protocols that reveal underlying neural processes at the early stage of motor consolidation. Additionally, we tested two DL protocols (gradual DL, chaotic DL) to examine the effect of different degrees of stochastic fluctuations within the DL approach, with a low degree of fluctuations in gradual DL and a high degree of fluctuations in chaotic DL. Twenty-two subjects performed badminton serves according to three variable practice protocols (CI, gradual DL, chaotic DL) and a repetitive learning protocol in a within-subjects design. Spontaneous EEG activity was measured before and immediately after each 20-min practice session from 19 electrodes. Results showed distinguishable neural processes after CI, DL, and repetitive learning. Increases in EEG theta and alpha power were obtained in somatosensory regions (electrodes P3, P7, Pz, P4, P8) in both DL conditions compared to CI and repetitive learning. Increases in theta and alpha activity in motor areas (electrodes C3, Cz, C4) were found after chaotic DL compared to gradual DL and CI. Anterior areas (electrodes F3, F7, Fz, F4, F8) showed increased activity in the beta and gamma bands after CI. Alpha activity was increased in occipital areas (electrodes O1, O2) after repetitive learning. Post-task EEG brain activation patterns suggest that DL stimulates the somatosensory and motor systems and engages more regions of the cortex than repetitive learning, due to a tighter stimulation of the motor and somatosensory systems during DL practice. CI seems to activate specifically executively controlled processing in anterior brain areas. We discuss the obtained patterns of post-training EEG traces as evidence for different underlying neural processes in CI, DL, and repetitive learning at the early stage of motor learning. PMID:29445334
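
    The band-power quantities reported here (theta, alpha, beta, gamma) are standard integrals of the power spectral density. A minimal sketch, assuming a 250 Hz sampling rate and a synthetic single-electrode signal in place of real EEG:

```python
# Welch power spectrum for one electrode, integrated over the theta
# (4-8 Hz) and alpha (8-13 Hz) bands. Sampling rate and the synthetic
# signal are assumptions standing in for recorded EEG.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

fs = 250                                   # Hz, assumed sampling rate
rng = np.random.default_rng(0)
eeg = rng.standard_normal(fs * 60)         # 1 min of fake single-channel EEG

freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)

def band_power(freqs, psd, lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return trapezoid(psd[mask], freqs[mask])  # integrate PSD over the band

print("theta:", band_power(freqs, psd, 4, 8))
print("alpha:", band_power(freqs, psd, 8, 13))
```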

  12. Considerations in establishing a post-mortem brain and tissue bank for the study of myalgic encephalomyelitis/chronic fatigue syndrome: a proposed protocol

    PubMed Central

    2014-01-01

    Background Our aim, having previously investigated through a qualitative study involving extensive discussions with experts and patients the issues involved in establishing and maintaining a disease specific brain and tissue bank for myalgic encephalomyelitis/chronic fatigue syndrome (ME/CFS), was to develop a protocol for a UK ME/CFS repository of high quality human tissue from well characterised subjects with ME/CFS and controls suitable for a broad range of research applications. This would involve a specific donor program coupled with rapid tissue collection and processing, supplemented by comprehensive prospectively collected clinical, laboratory and self-assessment data from cases and controls. Findings We reviewed the operations of existing tissue banks from published literature and from their internal protocols and standard operating procedures (SOPs). On this basis, we developed the protocol presented here, which was designed to meet high technical and ethical standards and legal requirements and was based on recommendations of the MRC UK Brain Banks Network. The facility would be most efficient and cost-effective if incorporated into an existing tissue bank. Tissue collection would be rapid and follow robust protocols to ensure preservation sufficient for a wide range of research uses. A central tissue bank would have resources both for wide-scale donor recruitment and rapid response to donor death for prompt harvesting and processing of tissue. Conclusion An ME/CFS brain and tissue bank could be established using this protocol. Success would depend on careful consideration of logistic, technical, legal and ethical issues, continuous consultation with patients and the donor population, and a sustainable model of funding ideally involving research councils, health services, and patient charities. This initiative could revolutionise the understanding of this still poorly-understood disease and enhance development of diagnostic biomarkers and treatments. PMID:24938650

  13. Blockchain protocols in clinical trials: Transparency and traceability of consent.

    PubMed

    Benchoufi, Mehdi; Porcher, Raphael; Ravaud, Philippe

    2017-01-01

    Clinical trial consent for protocols and their revisions should be transparent for patients and traceable for stakeholders. Our goal is to implement a process allowing for collection of patients' informed consent, which is bound to protocol revisions, storing and tracking the consent in a secure, unfalsifiable and publicly verifiable way, and enabling the sharing of this information in real time. For that, we build a consent workflow using a trending technology called Blockchain. This is a distributed technology that brings a built-in layer of transparency and traceability. From a more general and prospective point of view, we believe Blockchain technology brings a paradigmatic shift to the entire clinical research field. We designed a Proof-of-Concept protocol consisting of time-stamping each step of the patient's consent collection using Blockchain, thus archiving and historicising the consent through cryptographic validation in a securely unfalsifiable and transparent way. For each protocol revision, consent was sought again. We obtained a single document, in an open format, that accounted for the whole consent collection process: a time-stamped consent status regarding each version of the protocol. This document cannot be corrupted and can be checked on any dedicated public website. It should be considered robust proof of the consent data. However, in a live clinical trial, the authentication system should be strengthened to remove the need for third parties, here trial stakeholders, and give participative control to the peer users. In the future, the complex data flow of a clinical trial could be tracked by using Blockchain, whose core functionality, named Smart Contract, could help prevent clinical trial events from occurring out of chronological order, for example including patients before they consented or analysing case report form data before freezing the database. Globally, Blockchain could help with reliability, security and transparency, and could be a consistent step toward reproducibility.
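
    The time-stamping mechanism can be illustrated with a toy hash chain: each consent event is hashed together with the protocol version and the previous block's hash, so any later alteration breaks the chain. This is a local sketch of the principle only, not the paper's actual Blockchain deployment or its public verification service.

```python
# Toy hash-chained consent log: each block commits to its payload, its
# timestamp, and the previous block's hash, so tampering is detectable.
import hashlib, json, time

def add_block(chain, payload):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"payload": payload, "timestamp": time.time(), "prev": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    chain.append(block)

chain = []
add_block(chain, {"patient": "P-001", "protocol_version": "v1",
                  "consent": "given"})
add_block(chain, {"patient": "P-001", "protocol_version": "v2",
                  "consent": "renewed"})

# Verification: recompute every hash and check the links.
for i, block in enumerate(chain):
    body = {k: v for k, v in block.items() if k != "hash"}
    assert block["hash"] == hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    assert block["prev"] == (chain[i - 1]["hash"] if i else "0" * 64)
print("chain verified,", len(chain), "consent events")
```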

  14. Blockchain protocols in clinical trials: Transparency and traceability of consent

    PubMed Central

    Benchoufi, Mehdi; Porcher, Raphael; Ravaud, Philippe

    2018-01-01

    Clinical trial consent for protocols and their revisions should be transparent for patients and traceable for stakeholders. Our goal is to implement a process allowing for collection of patients’ informed consent, which is bound to protocol revisions, storing and tracking the consent in a secure, unfalsifiable and publicly verifiable way, and enabling the sharing of this information in real time. For that, we build a consent workflow using a trending technology called Blockchain. This is a distributed technology that brings a built-in layer of transparency and traceability. From a more general and prospective point of view, we believe Blockchain technology brings a paradigmatic shift to the entire clinical research field. We designed a Proof-of-Concept protocol consisting of time-stamping each step of the patient’s consent collection using Blockchain, thus archiving and historicising the consent through cryptographic validation in a securely unfalsifiable and transparent way. For each protocol revision, consent was sought again. We obtained a single document, in an open format, that accounted for the whole consent collection process: a time-stamped consent status regarding each version of the protocol. This document cannot be corrupted and can be checked on any dedicated public website. It should be considered robust proof of the consent data. However, in a live clinical trial, the authentication system should be strengthened to remove the need for third parties, here trial stakeholders, and give participative control to the peer users. In the future, the complex data flow of a clinical trial could be tracked by using Blockchain, whose core functionality, named Smart Contract, could help prevent clinical trial events from occurring out of chronological order, for example including patients before they consented or analysing case report form data before freezing the database. Globally, Blockchain could help with reliability, security and transparency, and could be a consistent step toward reproducibility. PMID:29167732

  15. Internet-Protocol-Based Satellite Bus Architecture Designed

    NASA Technical Reports Server (NTRS)

    Slywczak, Richard A.

    2004-01-01

    NASA is designing future complex satellite missions ranging from single satellites and constellations to space networks and sensor webs. These missions require more interoperability, autonomy, and coordination than previous missions; in addition, a desire exists to have scientists retrieve data directly from the satellite rather than a central distribution source. To meet these goals, NASA has been studying the possibility of extending the Transmission Control Protocol/Internet Protocol (TCP/IP) suite for spacebased applications.

  16. Integration of Molecular Dynamics Based Predictions into the Optimization of De Novo Protein Designs: Limitations and Benefits.

    PubMed

    Carvalho, Henrique F; Barbosa, Arménio J M; Roque, Ana C A; Iranzo, Olga; Branco, Ricardo J F

    2017-01-01

    Recent advances in de novo protein design have gained considerable insight from the intrinsic dynamics of proteins, thanks to the integration of molecular dynamics simulation protocols into the state-of-the-art de novo protein design protocols used today. With this protocol we illustrate how to set up and run a molecular dynamics simulation followed by a functional protein dynamics analysis. New users will be introduced to some useful open-source computational tools, including the GROMACS molecular dynamics simulation software package and ProDy for protein structural dynamics analysis.

  17. Experimental high-speed network

    NASA Astrophysics Data System (ADS)

    McNeill, Kevin M.; Klein, William P.; Vercillo, Richard; Alsafadi, Yasser H.; Parra, Miguel V.; Dallas, William J.

    1993-09-01

    Many existing local area networking protocols currently applied in medical imaging were originally designed for relatively low-speed, low-volume networking. These protocols utilize small packet sizes appropriate for text-based communication. Local area networks of this type typically provide raw bandwidth under 125 MHz. These older network technologies are not optimized for the low-delay, high-traffic environment of a totally digital radiology department. Some current implementations use point-to-point links when greater bandwidth is required. However, the use of point-to-point communications for a total digital radiology department network presents many disadvantages. This paper describes work on an experimental multi-access local area network called XFT. The work includes the protocol specification and the design and implementation of network interface hardware and software. The protocol specifies the Physical and Data Link layers (OSI layers 1 & 2) for a fiber-optic-based token ring providing a raw bandwidth of 500 MHz. The design and implementation of the XFT interface hardware include many features that optimize image transfer and provide flexibility for future enhancements, including a modular hardware design supporting easy portability to a variety of host system buses, a versatile message buffer design providing 16 MB of memory, and the capability to extend the raw bandwidth of the network to 3.0 GHz.

  18. Novice Interpretations of Visual Representations of Geosciences Data

    NASA Astrophysics Data System (ADS)

    Burkemper, L. K.; Arthurs, L.

    2013-12-01

    Past cognition research on individuals' perception and comprehension of bar and line graphs is substantive enough that it has resulted in the generation of graph design principles and graph comprehension theories; however, gaps remain in our understanding of how people process visual representations of data, especially of geologic and atmospheric data. This pilot project serves to build on others' prior research and begin filling the existing gaps. The primary objectives of this pilot project are to: (i) design a novel data collection protocol based on a combination of paper-based surveys, think-aloud interviews, and eye-tracking tasks to investigate student data handling skills with simple to complex visual representations of geologic and atmospheric data, (ii) demonstrate that the protocol yields results that shed light on student data handling skills, and (iii) generate preliminary findings upon which tentative but perhaps helpful recommendations, on how to more effectively present these data to the non-scientist community and teach essential data handling skills, can be based. An effective protocol for the combined use of paper-based surveys, think-aloud interviews, and computer-based eye-tracking tasks for investigating cognitive processes involved in perceiving, comprehending, and interpreting visual representations of geologic and atmospheric data is instrumental to future research in this area. The outcomes of this pilot study provide the foundation upon which future, more in-depth and scaled-up investigations can build. Furthermore, findings of this pilot project are sufficient for making at least tentative recommendations that can help inform (i) the design of physical attributes of visual representations of data, especially more complex representations, that may aid in improving students' data handling skills and (ii) instructional approaches that have the potential to aid students in more effectively handling visual representations of geologic and atmospheric data that they might encounter in a course, television news, newspapers and magazines, and websites. Such recommendations would also be the potential subject of future investigations and have the potential to impact design features when data are presented to the public, as well as instructional strategies not only in geoscience courses but also in other science, technology, engineering, and mathematics (STEM) courses.

  19. Assessment of Healthcare Worker Protocol Deviations and Self-Contamination During Personal Protective Equipment Donning and Doffing.

    PubMed

    Kwon, Jennie H; Burnham, Carey-Ann D; Reske, Kimberly A; Liang, Stephen Y; Hink, Tiffany; Wallace, Meghan A; Shupe, Angela; Seiler, Sondra; Cass, Candice; Fraser, Victoria J; Dubberke, Erik R

    2017-09-01

    OBJECTIVE To evaluate healthcare worker (HCW) risk of self-contamination when donning and doffing personal protective equipment (PPE) using fluorescence and MS2 bacteriophage. DESIGN Prospective pilot study. SETTING Tertiary-care hospital. PARTICIPANTS A total of 36 HCWs were included in this study: 18 donned/doffed contact precaution (CP) PPE and 18 donned/doffed Ebola virus disease (EVD) PPE. INTERVENTIONS HCWs donned PPE according to standard protocols. Fluorescent liquid and MS2 bacteriophage were applied to HCWs. HCWs then doffed their PPE. After doffing, HCWs were scanned for fluorescence and swabbed for MS2. MS2 detection was performed using reverse transcriptase PCR. The donning and doffing processes were videotaped, and protocol deviations were recorded. RESULTS Overall, 27% of EVD PPE HCWs and 50% of CP PPE HCWs made ≥1 protocol deviation while donning, and 100% of EVD PPE HCWs and 67% of CP PPE HCWs made ≥1 protocol deviation while doffing (P=.02). The median number of doffing protocol deviations among EVD PPE HCWs was 4, versus 1 among CP PPE HCWs. Also, 15 EVD PPE protocol deviations were committed by doffing assistants and/or trained observers. Fluorescence was detected on 8 EVD PPE HCWs (44%) and 5 CP PPE HCWs (28%), most commonly on hands. MS2 was recovered from 2 EVD PPE HCWs (11%) and 3 CP PPE HCWs (17%). CONCLUSIONS Protocol deviations were common during both EVD and CP PPE doffing, and some deviations during EVD PPE doffing were committed by the HCW doffing assistant and/or the trained observer. Self-contamination was common. PPE donning/doffing are complex and deserve additional study. Infect Control Hosp Epidemiol 2017;38:1077-1083.

  20. [Design and implementation of the ELSA-Brasil biobank: a prospective study in a Brazilian population].

    PubMed

    Pereira, Alexandre C; Bensenor, Isabela M; Fedeli, Ligia M; Castilhos, Cristina; Vidigal, Pedro G; Maniero, Viviane; Leite, Claudia M; Pimentel, Robercia A; Duncan, Bruce B; Mill, Jose Geraldo; Lotufo, Paulo A

    2013-06-01

    The Brazilian Longitudinal Study for Adult Health (ELSA-Brasil) is a multicenter prospective cohort of civil servants designed to assess the determinants of chronic diseases, especially cardiovascular diseases and type 2 diabetes. The present article describes the main design and implementation points of the ELSA-Brasil biobank project. Economic, political, logistical and technological aspects of this study are characterized. Additionally, it discusses the final biorepository protocol and the facilities implemented to achieve this objective. The design and implementation process of the ELSA-Brasil biobank took three years to be performed. Both the central and local biobanks were built according to the best biorepository techniques, using different technological solutions for the distinct needs expected in this study.

  1. The Design and Development of a Regenerative Separatory Column Using Calixarenes as a Polymeric Backbone for the Purification of Water from Urine

    NASA Technical Reports Server (NTRS)

    Polk, M.

    1999-01-01

    The objective of this research project is to design calixarenes, cup-shaped molecules, with specific binding sites for sodium chloride and the nitrogen-containing components of urine, such as urea and uric acid. The research is partitioned as follows to accomplish this objective: (1) functionalization of calixarene, (2) development of a calixarene-based medium for the separatory process, and (3) design of the column regeneration protocol. Work was also accomplished in the area of temperature-sensitive paint (TSP). Research was undertaken to design a TSP with insulating properties. An important part of this research project is to determine the thermal conductivity of polymers for TSP.

  2. The Effects of Alternative Resuscitation Strategies on Acute Kidney Injury in Patients with Septic Shock.

    PubMed

    Kellum, John A; Chawla, Lakhmir S; Keener, Christopher; Singbartl, Kai; Palevsky, Paul M; Pike, Francis L; Yealy, Donald M; Huang, David T; Angus, Derek C

    2016-02-01

    Septic shock is a common cause of acute kidney injury (AKI), and fluid resuscitation is a major part of therapy. To determine if structured resuscitation designed to alter fluid, blood, and vasopressor use affects the development or severity of AKI or outcomes. Ancillary study to the ProCESS (Protocolized Care for Early Septic Shock) trial of alternative resuscitation strategies (two protocols vs. usual care) for septic shock. We studied 1,243 patients and classified AKI using serum creatinine and urine output. We determined recovery status at hospital discharge, examined rates of renal replacement therapy and fluid overload, and measured biomarkers of kidney damage. Among patients without evidence of AKI at enrollment, 37.6% of protocolized care and 38.1% of usual care patients developed kidney injury (P = 0.90). AKI duration (P = 0.59) and rates of renal replacement therapy did not differ between study arms (6.9% for protocolized care and 4.3% for usual care; P = 0.08). Fluid overload occurred in 8.3% of protocolized care and 6.3% of usual care patients (P = 0.26). Among patients with severe AKI, complete and partial recovery was 50.7 and 13.2% for protocolized patients and 49.1 and 13.4% for usual care patients (P = 0.93). Sixty-day hospital mortality was 6.2% for patients without AKI, 16.8% for those with stage 1, and 27.7% for stages 2 to 3. In patients with septic shock, AKI is common and associated with adverse outcomes, but it is not influenced by protocolized resuscitation compared with usual care.
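
    For illustration, AKI can be staged from serum creatinine using the common KDIGO-style ratio thresholds; the sketch below uses those thresholds as an assumption and omits the urine-output criterion that the study also applied.

```python
# Toy creatinine-based AKI staging using common KDIGO-style thresholds
# (1.5-1.9x, 2.0-2.9x, >=3.0x baseline). This is a simplified
# illustration, not the study's exact operational definition.
def aki_stage(baseline_scr: float, current_scr: float) -> int:
    """Return 0 (no AKI) or stage 1-3 from serum creatinine (mg/dL)."""
    ratio = current_scr / baseline_scr
    if ratio >= 3.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5 or (current_scr - baseline_scr) >= 0.3:
        return 1
    return 0

print(aki_stage(1.0, 1.2))   # 0
print(aki_stage(1.0, 1.6))   # 1
print(aki_stage(0.8, 2.5))   # 3
```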

  3. The Stem Cell Laboratory: Design, Equipment, and Oversight

    PubMed Central

    Wesselschmidt, Robin L.; Schwartz, Philip H.

    2013-01-01

    This chapter describes some of the major issues to be considered when setting up a laboratory for the culture of human pluripotent stem cells (hPSCs). The process of establishing a hPSC laboratory can be divided into two equally important parts. One is completely administrative and includes developing protocols, seeking approval, and establishing reporting processes and documentation. The other part of establishing a hPSC laboratory involves the physical plant and includes design, equipment and personnel. Proper planning of laboratory operations and proper design of the physical layout of the stem cell laboratory, so that it meets the scope of planned operations, is a major undertaking, but the time spent upfront will pay long-term returns in operational efficiency and effectiveness. A well-planned, organized, and properly equipped laboratory supports research activities by increasing efficiency and reducing lost time and wasted resources. PMID:21822863

  4. Interplanetary Overlay Network Bundle Protocol Implementation

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott C.

    2011-01-01

    The Interplanetary Overlay Network (ION) system's BP package, an implementation of the Delay-Tolerant Networking (DTN) Bundle Protocol (BP) and supporting services, has been specifically designed to be suitable for use on deep-space robotic vehicles. Although the ION BP implementation is unique in its use of zero-copy objects for high performance, and in its use of resource-sensitive rate control, it is fully interoperable with other implementations of the BP specification (Internet RFC 5050). The ION BP implementation is built using the same software infrastructure that underlies the implementation of the CCSDS (Consultative Committee for Space Data Systems) File Delivery Protocol (CFDP) built into the flight software of Deep Impact. It is designed to minimize resource consumption, while maximizing operational robustness. For example, no dynamic allocation of system memory is required. Like all the other ION packages, ION's BP implementation is designed to port readily between Linux and Solaris (for easy development and for ground system operations) and VxWorks (for flight systems operations). The exact same source code is exercised in both environments. Initially included in the ION BP implementation are the following: libraries of functions used in constructing bundle forwarders and convergence-layer (CL) input and output adapters; a simple prototype bundle forwarder and associated CL adapters designed to run over an IP-based local area network; administrative tools for managing a simple DTN infrastructure built from these components; a background daemon process that silently destroys bundles whose time-to-live intervals have expired; a library of functions exposed to applications, enabling them to issue and receive data encapsulated in DTN bundles; and some simple applications that can be used for system checkout and benchmarking.
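
    One of the listed components, the daemon that silently destroys expired bundles, comes down to a periodic sweep over creation time plus time-to-live. A minimal Python sketch under assumed bundle fields (real RFC 5050 bundles carry many more):

```python
# Toy sweep of a bundle store: drop bundles whose creation time plus TTL
# lies in the past. Bundle fields are illustrative assumptions.
import time

def sweep_expired(bundles, now=None):
    """Keep only bundles whose time-to-live has not yet elapsed."""
    now = time.time() if now is None else now
    return [b for b in bundles if b["created"] + b["ttl_s"] > now]

t0 = time.time()
bundles = [
    {"id": 1, "created": t0 - 120, "ttl_s": 60},    # expired
    {"id": 2, "created": t0 - 10,  "ttl_s": 3600},  # still live
]
print([b["id"] for b in sweep_expired(bundles)])    # [2]
```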

  5. Reducing Physical Risk Factors in Construction Work Through a Participatory Intervention: Protocol for a Mixed-Methods Process Evaluation.

    PubMed

    Ajslev, Jeppe; Brandt, Mikkel; Møller, Jeppe Lykke; Skals, Sebastian; Vinstrup, Jonas; Jakobsen, Markus Due; Sundstrup, Emil; Madeleine, Pascal; Andersen, Lars Louis

    2016-05-26

    Previous research has shown that reducing physical workload among workers in the construction industry is complicated. In order to address this issue, we developed a process evaluation in a formative mixed-methods design, drawing on existing knowledge of the potential barriers for implementation. We present the design of a mixed-methods process evaluation of the organizational, social, and subjective practices that play roles in the intervention study, integrating technical measurements to detect excessive physical exertion measured with electromyography and accelerometers, video documentation of working tasks, and a 3-phased workshop program. The evaluation is designed in an adapted process evaluation framework, addressing recruitment, reach, fidelity, satisfaction, intervention delivery, intervention received, and context of the intervention companies. Observational studies, interviews, and questionnaires among 80 construction workers organized in 20 work gangs, as well as health and safety staff, contribute to the creation of knowledge about these phenomena. At the time of publication, the process of participant recruitment is underway. Intervention studies are challenging to conduct and evaluate in the construction industry, often because of narrow time frames and ever-changing contexts. The mixed-methods design presents opportunities for obtaining detailed knowledge of the practices intra-acting with the intervention, while offering the opportunity to customize parts of the intervention.

  6. Time Warp Operating System, Version 2.5.1

    NASA Technical Reports Server (NTRS)

    Bellenot, Steven F.; Gieselman, John S.; Hawley, Lawrence R.; Peterson, Judy; Presley, Matthew T.; Reiher, Peter L.; Springer, Paul L.; Tupman, John R.; Wedel, John J., Jr.; Wieland, Frederick P.; hide

    1993-01-01

    Time Warp Operating System, TWOS, is a special-purpose computer program designed to support parallel simulation of discrete events. It is a complete implementation of the Time Warp software mechanism, which implements a distributed protocol for virtual synchronization based on rollback of processes and annihilation of messages. It supports simulations and other computations in which both virtual time and dynamic load balancing are used. The program utilizes the underlying resources of the operating system. Written in the C programming language.

  7. In silico design of ligand triggered RNA switches.

    PubMed

    Findeiß, Sven; Hammer, Stefan; Wolfinger, Michael T; Kühnl, Felix; Flamm, Christoph; Hofacker, Ivo L

    2018-04-13

    This contribution sketches a work flow to design an RNA switch that is able to adopt two structural conformations in a ligand-dependent way. A well characterized RNA aptamer, i.e., one with known Kd and adaptive structural features, is an essential ingredient of the described design process. We exemplify the principles using the well-known theophylline aptamer throughout this work. The aptamer in its ligand-binding competent structure represents one structural conformation of the switch, while an alternative fold that disrupts the binding-competent structure forms the other conformation. To keep it simple we do not incorporate any regulatory mechanism to control transcription or translation. We elucidate a commonly used design process by explicitly dissecting and explaining the necessary steps in detail. We developed a novel objective function which specifies the mechanistics of this simple, ligand-triggered riboswitch and describe an extensive in silico analysis pipeline to evaluate important kinetic properties of the designed sequences. This protocol and the developed software can be easily extended or adapted to fit novel design scenarios and thus can serve as a template for future needs. Copyright © 2018. Published by Elsevier Inc.
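
    The flavor of such an objective function can be sketched as follows: the two target conformations should be nearly isoenergetic once the ligand's binding free energy (derived from its Kd) is accounted for. This sketch assumes the ViennaRNA Python bindings (the RNA module) are installed; the toy sequence and structures stand in for the actual theophylline switch, and the objective is a simplified stand-in for the paper's.

```python
# Toy two-state switch objective: penalize deviation from the condition
# that ligand binding energy (~ RT ln Kd) exactly tips the balance
# between the OFF fold and the binding-competent ON fold.
# Assumes the ViennaRNA Python bindings are installed.
import math
import RNA   # ViennaRNA Python bindings

RT = 0.6     # kcal/mol near 310 K (approximation)

def objective(seq: str, struct_on: str, struct_off: str, kd: float) -> float:
    fc = RNA.fold_compound(seq)
    e_on = fc.eval_structure(struct_on)    # energy of binding-competent fold
    e_off = fc.eval_structure(struct_off)  # energy of alternative fold
    dg_ligand = RT * math.log(kd)          # binding free energy from Kd (M)
    return abs((e_off - e_on) + dg_ligand)

seq        = "GGGGAAAACCCC"    # toy hairpin, not the theophylline switch
struct_on  = "((((....))))"    # fully paired stem
struct_off = "..((....)).."    # partially disrupted stem
print(objective(seq, struct_on, struct_off, kd=1e-6))
```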

  8. Ultrasensitive electrochemical detection of nucleic acids by template enhanced hybridization followed with rolling circle amplification.

    PubMed

    Ji, Hanxu; Yan, Feng; Lei, Jianping; Ju, Huangxian

    2012-08-21

    An ultrasensitive protocol for electrochemical detection of DNA is designed with quantum dots (QDs) as a signal tag by combining the template enhanced hybridization process (TEHP) and rolling circle amplification (RCA). Upon the recognition of the molecular beacon (MB) to target DNA, the MB hybridizes with assistants and target DNA to form a ternary "Y-junction". The target DNA can be dissociated from the structure by the action of a nicking endonuclease to initiate the next hybridization process. The template enhanced MB fragments further act as the primers of the RCA reaction to produce thousands of repeated oligonucleotide sequences, which can bind with oligonucleotide-functionalized QDs. The attached signal tags can be easily read out by square-wave voltammetry after dissolving with acid. Because of the cascade signal amplification and the specific TEHP and RCA reactions, this newly designed protocol provides ultrasensitive electrochemical detection of DNA down to the attomolar level (11 aM) with a linear range of 6 orders of magnitude (from 1 × 10^-17 to 1 × 10^-11 M) and can discriminate mismatched DNA from perfectly matched target DNA with high selectivity. The high sensitivity and specificity give this method great potential for early diagnosis of gene-related diseases.

  9. Medical education research and IRB review: an analysis and comparison of the IRB review process at six institutions.

    PubMed

    Dyrbye, Liselotte N; Thomas, Matthew R; Mechaber, Alex J; Eacker, Anne; Harper, William; Massie, F Stanford; Power, David V; Shanafelt, Tait D

    2007-07-01

    To compare how different institutional review boards (IRBs) process and evaluate the same multiinstitutional educational research proposal of medical students' quality of life. Prospective collection in 2005 of key variables regarding the IRB submission and review process of the same educational research proposal involving medical students, which was submitted to six IRBs, each associated with a different medical school. Four IRBs determined the protocol was appropriate for expedited review, and the remaining two required full review. Substantial variation existed in the time to review the protocol by an IRB administrator/IRB member (range 1-101 days) and by the IRB committee (range 6-115 days). One IRB committee approved the study as written. The remaining five IRB committees had a median of 13 requests for additional information/changes to the protocol. Sixty-eight percent of requests (36 of 53) pertained to the informed consent letter; one third (12 of 36) of these requests were unique modifications requested by one IRB but not the others. Although five IRB committees approved the survey after a median of 47 days (range 6-73), one committee had not responded six months after submission (164 days), preventing that school from participating. The findings suggest variability in the timeliness and consistency of IRB review of medical education research across institutions that may hinder multi-institutional research and slow evidence-based medical education reform. The findings demonstrate the difficulties of having medical education research reviewed by IRBs, which are typically designed to review clinical trials, and suggest that the review process for medical education research needs reform.

  10. Designing typefaces for maps. A protocol of tests.

    NASA Astrophysics Data System (ADS)

    Biniek, Sébastien; Touya, Guillaume; Rouffineau, Gilles; Huot-Marchand, Thomas

    2018-05-01

    Text management in map design is generally discussed in terms of placement and composition issues, whereas the question of type design itself is rarely addressed, or at best only partially. Moreover, typefaces designed especially for maps are rare. This paper presents a protocol of tests to evaluate characters for digital topographic maps, along with fonts designed for the screen that were evaluated in geographical information systems using this protocol. The work was launched by the Atelier National de Recherche Typographique (ANRT, located in Nancy, France) and took place during its 'post-master' course in 2013. The purpose is to isolate different issues inherent to text in a topographic map: map background, nonlinear text placement and toponymic hierarchies. Further research is necessary to improve this kind of approach.

  11. Evaluation of Patient Handoff Methods on an Inpatient Teaching Service

    PubMed Central

    Craig, Steven R.; Smith, Hayden L.; Downen, A. Matthew; Yost, W. John

    2012-01-01

    Background The patient handoff process can be a highly variable and unstructured period at risk for communication errors. The morning sign-in process used by resident physicians at teaching hospitals typically involves less rigorous handoff protocols than the resident evening sign-out process. Little research has been conducted on best practices for handoffs during morning sign-in exchanges between resident physicians. Research must evaluate optimal protocols for the resident morning sign-in process. Methods Three morning handoff protocols consisting of written, electronic, and face-to-face methods were implemented over 3 study phases during an academic year. Study participants included all interns covering the internal medicine inpatient teaching service at a tertiary hospital. Study measures entailed intern survey-based interviews analyzed for failures in handoff protocols with or without missed pertinent information. Descriptive and comparative analyses examined study phase differences. Results A scheduled face-to-face handoff process had the fewest protocol deviations and demonstrated best communication of essential patient care information between cross-covering teams compared to written and electronic sign-in protocols. Conclusion Intern patient handoffs were more reliable when the sign-in protocol included scheduled face-to-face meetings. This method provided the best communication of patient care information and allowed for open exchanges of information. PMID:23267259

  12. Characterizing College Science Assessments: The Three-Dimensional Learning Assessment Protocol

    PubMed Central

    Underwood, Sonia M.; Matz, Rebecca L.; Posey, Lynmarie A.; Carmel, Justin H.; Caballero, Marcos D.; Fata-Hartley, Cori L.; Ebert-May, Diane; Jardeleza, Sarah E.; Cooper, Melanie M.

    2016-01-01

    Many calls to improve science education in college and university settings have focused on improving instructor pedagogy. Meanwhile, science education at the K-12 level is undergoing significant changes as a result of the emphasis on scientific and engineering practices, crosscutting concepts, and disciplinary core ideas. This framework of “three-dimensional learning” is based on the literature about how people learn science and how we can help students put their knowledge to use. Recently, similar changes are underway in higher education by incorporating three-dimensional learning into college science courses. As these transformations move forward, it will become important to assess three-dimensional learning both to align assessments with the learning environment, and to assess the extent of the transformations. In this paper we introduce the Three-Dimensional Learning Assessment Protocol (3D-LAP), which is designed to characterize and support the development of assessment tasks in biology, chemistry, and physics that align with transformation efforts. We describe the development process used by our interdisciplinary team, discuss the validity and reliability of the protocol, and provide evidence that the protocol can distinguish between assessments that have the potential to elicit evidence of three-dimensional learning and those that do not. PMID:27606671

  13. A network coding based routing protocol for underwater sensor networks.

    PubMed

    Wu, Huayang; Chen, Min; Guan, Xin

    2012-01-01

    Due to the particularities of the underwater environment, several negative factors seriously interfere with data transmission rates, reliability of data communication, communication range, and the network throughput and energy consumption of underwater sensor networks (UWSNs). Thus, full consideration of node energy savings, while maintaining quick, correct and effective data transmission and extending the network life cycle, is essential when routing protocols for underwater sensor networks are studied. In this paper, we propose a novel routing algorithm for UWSNs. To increase energy consumption efficiency and extend network lifetime, we propose a time-slot based routing algorithm (TSR). We designed a probability balanced mechanism and applied it to TSR, yielding time-slot based balanced routing (TSBR). The theory of network coding is then introduced to TSBR to meet the requirement of further reducing node energy consumption and extending network lifetime; hence, time-slot based balanced network coding (TSBNC) comes into being. We evaluated the proposed time-slot based balanced routing algorithm and compared it with other classical underwater routing protocols. The simulation results show that the proposed protocol can reduce the probability of node conflicts, shorten the routing construction process, balance the energy consumption of each node and effectively prolong the network lifetime.
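
    The network-coding ingredient can be illustrated with the classic XOR relay example: one coded broadcast replaces two transmissions, and each receiver recovers the packet it is missing from the one it already holds. This sketch shows the principle only, not TSBNC's actual slot scheduling.

```python
# Classic XOR network coding at a relay: nodes A and B exchange packets
# via a relay, which broadcasts their XOR once instead of forwarding
# each packet separately, saving one (costly underwater) transmission.
def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

pkt_a = b"sensor-A"   # sent by node A (node A still holds a copy)
pkt_b = b"sensor-B"   # sent by node B (node B still holds a copy)

coded = xor_bytes(pkt_a, pkt_b)   # relay broadcasts one coded packet

print(xor_bytes(coded, pkt_b))    # node B recovers pkt_a -> b'sensor-A'
print(xor_bytes(coded, pkt_a))    # node A recovers pkt_b -> b'sensor-B'
```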

  14. Investigating Linguistic Sources of Differential Item Functioning Using Expert Think-Aloud Protocols in Science Achievement Tests

    NASA Astrophysics Data System (ADS)

    Roth, Wolff-Michael; Oliveri, Maria Elena; Dallie Sandilands, Debra; Lyons-Thomas, Juliette; Ercikan, Kadriye

    2013-03-01

    Even if national and international assessments are designed to be comparable, subsequent psychometric analyses often reveal differential item functioning (DIF). Central to achieving comparability is examining the presence of DIF and, if DIF is found, investigating its sources to ensure that differentially functioning items do not lead to bias. In this study, sources of DIF were examined using think-aloud protocols. Think-aloud protocols of expert reviewers were conducted to compare the English and French versions of 40 items previously identified as DIF (N = 20) and non-DIF (N = 20). Three highly trained and experienced experts in verifying and accepting/rejecting multilingual versions of curriculum and testing materials for government purposes participated in this study. Although there is a considerable amount of agreement in the identification of differentially functioning items, experts do not consistently identify and distinguish DIF and non-DIF items. Our analyses of the think-aloud protocols identified particular linguistic, general pedagogical, content-related, and cognitive factors related to sources of DIF. Implications are provided for the process of arriving at the identification of DIF prior to the actual administration of tests at national and international levels.

  15. A Network Coding Based Routing Protocol for Underwater Sensor Networks

    PubMed Central

    Wu, Huayang; Chen, Min; Guan, Xin

    2012-01-01

    Due to the particularities of the underwater environment, several negative factors seriously interfere with data transmission rates, reliability of data communication, communication range, and the network throughput and energy consumption of underwater sensor networks (UWSNs). Thus, full consideration of node energy savings, while maintaining quick, correct and effective data transmission and extending the network life cycle, is essential when routing protocols for underwater sensor networks are studied. In this paper, we propose a novel routing algorithm for UWSNs. To increase energy consumption efficiency and extend network lifetime, we propose a time-slot based routing algorithm (TSR). We designed a probability balanced mechanism and applied it to TSR, yielding time-slot based balanced routing (TSBR). The theory of network coding is then introduced to TSBR to meet the requirement of further reducing node energy consumption and extending network lifetime; hence, time-slot based balanced network coding (TSBNC) comes into being. We evaluated the proposed time-slot based balanced routing algorithm and compared it with other classical underwater routing protocols. The simulation results show that the proposed protocol can reduce the probability of node conflicts, shorten the routing construction process, balance the energy consumption of each node and effectively prolong the network lifetime. PMID:22666045

  16. Characterizing College Science Assessments: The Three-Dimensional Learning Assessment Protocol.

    PubMed

    Laverty, James T; Underwood, Sonia M; Matz, Rebecca L; Posey, Lynmarie A; Carmel, Justin H; Caballero, Marcos D; Fata-Hartley, Cori L; Ebert-May, Diane; Jardeleza, Sarah E; Cooper, Melanie M

    2016-01-01

    Many calls to improve science education in college and university settings have focused on improving instructor pedagogy. Meanwhile, science education at the K-12 level is undergoing significant changes as a result of the emphasis on scientific and engineering practices, crosscutting concepts, and disciplinary core ideas. This framework of "three-dimensional learning" is based on the literature about how people learn science and how we can help students put their knowledge to use. Recently, similar changes are underway in higher education by incorporating three-dimensional learning into college science courses. As these transformations move forward, it will become important to assess three-dimensional learning both to align assessments with the learning environment, and to assess the extent of the transformations. In this paper we introduce the Three-Dimensional Learning Assessment Protocol (3D-LAP), which is designed to characterize and support the development of assessment tasks in biology, chemistry, and physics that align with transformation efforts. We describe the development process used by our interdisciplinary team, discuss the validity and reliability of the protocol, and provide evidence that the protocol can distinguish between assessments that have the potential to elicit evidence of three-dimensional learning and those that do not.

  17. Digital gene expression analysis with sample multiplexing and PCR duplicate detection: A straightforward protocol.

    PubMed

    Rozenberg, Andrey; Leese, Florian; Weiss, Linda C; Tollrian, Ralph

    2016-01-01

    Tag-Seq is a high-throughput approach used for discovering SNPs and characterizing gene expression. In comparison to RNA-Seq, Tag-Seq eases data processing and allows detection of rare mRNA species using only one tag per transcript molecule. However, reduced library complexity raises the issue of PCR duplicates, which distort gene expression levels. Here we present a novel Tag-Seq protocol that uses the least biased methods for RNA library preparation combined with a novel approach for joint PCR template and sample labeling. In our protocol, input RNA is fragmented by hydrolysis, and poly(A)-bearing RNAs are selected and directly ligated to mixed DNA-RNA P5 adapters. The P5 adapters contain i5 barcodes composed of sample-specific (moderately) degenerate base regions (mDBRs), which later allow detection of PCR duplicates. The P7 adapter is attached via reverse transcription with individual i7 barcodes added during the amplification step. The resulting libraries can be sequenced on an Illumina sequencer. After sample demultiplexing and PCR duplicate removal with a free software tool we designed, the data are ready for downstream analysis. Our protocol was tested on RNA samples from predator-induced and control Daphnia microcrustaceans.
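
    The duplicate-detection logic that the mDBR barcodes enable reduces to collapsing reads that share the same sample barcode, mDBR, and tag. The read representation below is a simplifying assumption for illustration, not the tool's actual file format.

```python
# Toy PCR-duplicate removal for Tag-Seq-style data: reads sharing the
# same (sample barcode, mDBR, tag) triple are collapsed to one molecule,
# then unique molecules are counted per (sample, tag) for expression.
reads = [
    ("S1", "ACGT", "TTAGGCATTAGGCA"),  # (sample barcode, mDBR, tag)
    ("S1", "ACGT", "TTAGGCATTAGGCA"),  # same mDBR + tag -> PCR duplicate
    ("S1", "GGTA", "TTAGGCATTAGGCA"),  # different mDBR -> new molecule
    ("S2", "ACGT", "CCGGAATTCCGGAA"),
]

unique = set(reads)                    # collapse exact duplicates
counts = {}
for sample, _mdbr, tag in unique:      # per-sample expression counts
    counts[(sample, tag)] = counts.get((sample, tag), 0) + 1

print(counts)
# {('S1', 'TTAGGCATTAGGCA'): 2, ('S2', 'CCGGAATTCCGGAA'): 1}
```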

  18. Participatory design of a collaborative clinical trial protocol writing system.

    PubMed

    Weng, Chunhua; McDonald, David W; Sparks, Dana; McCoy, Jason; Gennari, John H

    2007-06-01

    To explore concrete approaches to socio-technical design of collaborative healthcare information systems and to design a groupware technology for collaborative clinical trial protocol writing. We conducted "quick and dirty ethnography" through semi-structured interviews, observational studies, and work artifacts analysis to understand the group work for protocol development. We used participatory design through evolutionary prototyping to explore the feature space of a collaborative writing system. Our design strategies include role-based user advocacy, formative evaluation, and change management. Quick and dirty ethnography helped us efficiently understand relevant work practice, and participatory design helped us engage users into design and bring out their tacit work knowledge. Our approach that intertwined both techniques helped achieve a "work-informed and user-oriented" design. This research leads to a collaborative writing system that supports in situ communication, group awareness, and effective work progress tracking. The usability evaluation results have been satisfactory. The system design is being transferred to an organizational tool for daily use.

  19. Protocol for determining bull trout presence

    USGS Publications Warehouse

    Peterson, James; Dunham, Jason B.; Howell, Philip; Thurow, Russell; Bonar, Scott

    2002-01-01

    The Western Division of the American Fisheries Society was requested to develop protocols for determining presence/absence and potential habitat suitability for bull trout. The general approach adopted is similar to the process for the marbled murrelet, whereby interim guidelines are initially used and the protocols are subsequently refined as data are collected. Current data were considered inadequate to precisely identify suitable habitat but could be useful in stratifying sampling units for presence/absence surveys. The presence/absence protocol builds on previous approaches (Hillman and Platts 1993; Bonar et al. 1997), except that it uses the variation in observed bull trout densities instead of a minimum threshold density and adjusts for measured differences in sampling efficiency due to gear types and habitat characteristics. The protocol consists of: 1. recommended sample sizes with 80% and 95% detection probabilities for juvenile and resident adult bull trout for day and night snorkeling and electrofishing, adjusted for varying habitat characteristics, for 50m and 100m sampling units; 2. sampling design considerations, including possible habitat characteristics for stratification; 3. habitat variables to be measured in the sampling units; and 4. guidelines for training sampling crews. Criteria for habitat strata consist of coarse, watershed-scale characteristics (e.g., mean annual air temperature) and fine-scale, reach- and habitat-specific features (e.g., water temperature, channel width). The protocols will be revised in the future using data from ongoing presence/absence surveys, additional research on sampling efficiencies, and development of models of habitat/species occurrence.
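
    The arithmetic behind such detection-probability sample sizes is worth making explicit: if one sampling unit detects bull trout with probability p when they are present, then n units detect them at least once with probability 1-(1-p)^n, so n >= ln(1-target)/ln(1-p). A sketch with illustrative p values (not the protocol's calibrated gear- and habitat-specific efficiencies):

```python
# Sample size for a target cumulative detection probability, assuming
# independent units with per-unit detection probability p when present.
# The p values are illustrative, not the protocol's calibrated values.
import math

def units_needed(p_single: float, target: float) -> int:
    return math.ceil(math.log(1 - target) / math.log(1 - p_single))

for p in (0.2, 0.4, 0.6):
    print(f"p={p}: n={units_needed(p, 0.80)} for 80%, "
          f"n={units_needed(p, 0.95)} for 95% detection")
```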

  20. TRIGA: Telecommunications Protocol Processing Subsystem Using Reconfigurable Interoperable Gate Arrays

    NASA Technical Reports Server (NTRS)

    Pang, Jackson; Pingree, Paula J.; Torgerson, J. Leigh

    2006-01-01

    We present the Telecommunications protocol processing subsystem using Reconfigurable Interoperable Gate Arrays (TRIGA), a novel approach that unifies fault tolerance, error correction coding and interplanetary communication protocol off-loading to implement CCSDS File Delivery Protocol and Datalink layers. The new reconfigurable architecture offers more than one order of magnitude throughput increase while reducing footprint requirements in memory, command and data handling processor utilization, communication system interconnects and power consumption.

  1. A Family of ACO Routing Protocols for Mobile Ad Hoc Networks.

    PubMed

    Rupérez Cañas, Delfín; Sandoval Orozco, Ana Lucila; García Villalba, Luis Javier; Kim, Tai-Hoon

    2017-05-22

    In this work, an ACO routing protocol for mobile ad hoc networks based on AntHocNet is specified. Like its predecessor, this new protocol, called AntOR, is hybrid in the sense that it contains elements of both reactive and proactive routing. Specifically, it combines a reactive route setup process with a proactive route maintenance and improvement process. Key aspects of the AntOR protocol are the disjoint-link and disjoint-node routes, the separation between the regular pheromone and the virtual pheromone in the diffusion process, and the exploration of routes taking into consideration the number of hops in the best routes. In this work, a family of ACO routing protocols based on AntOR is also specified. These protocols are based on successive refinements of the protocol. We also present a parallelized version of AntOR that we call PAntOR. Using multiprocessor programming based on shared memory, PAntOR allows tasks to run in parallel using threads. This parallelization is applicable in the route setup phase, the route local repair process, and link failure notification. In addition, a variant of PAntOR that has more than one interface, which we call PAntOR-MI (PAntOR-Multiple Interface), is specified. This approach parallelizes the sending of broadcast messages per interface through threads.
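
    As a rough illustration of the ACO mechanism underlying AntHocNet-style protocols such as AntOR, a node forwards toward a neighbour with probability proportional to a power of the pheromone on that link. The sketch below is generic; the exponent and pheromone values are illustrative, not AntOR's actual parameters.

    ```python
    import random

    def choose_next_hop(pheromone: dict, beta: float = 2.0) -> str:
        """Pick a neighbour with probability proportional to its pheromone
        value raised to beta (illustrative ACO routing rule, not AntOR's
        exact formula)."""
        weights = {n: tau ** beta for n, tau in pheromone.items()}
        total = sum(weights.values())
        r = random.uniform(0.0, total)
        acc = 0.0
        for neighbour, w in weights.items():
            acc += w
            if r <= acc:
                return neighbour
        return neighbour  # fallback for floating-point edge cases

    # Hypothetical pheromone table toward some destination at one node:
    table = {"B": 0.9, "C": 0.4, "E": 0.1}
    print(choose_next_hop(table))
    ```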

  2. Outcome of patients after lower limb fracture with partial weight bearing postoperatively treated with or without anti-gravity treadmill (alter G®) during six weeks of rehabilitation - a protocol of a prospective randomized trial.

    PubMed

    Henkelmann, Ralf; Schneider, Sebastian; Müller, Daniel; Gahr, Ralf; Josten, Christoph; Böhme, Jörg

    2017-03-14

    Partial or complete immobilization leads to different adjustment processes, such as a higher risk of muscle atrophy or a decrease in general performance. The present study is designed to prove the efficacy of the anti-gravity treadmill (alter G®) compared to a standard rehabilitation protocol in patients with tibial plateau (group 1) or ankle fractures (group 2) with six weeks of partial weight bearing of 20 kg. This prospective randomized study will include a total of 60 patients for each group according to predefined inclusion and exclusion criteria. 1:1 randomization will be performed centrally via fax supported by the Clinical Trial Centre Leipzig (ZKS Leipzig). Patients in the treatment arm will be treated with an anti-gravity treadmill (alter G®) instead of physiotherapy. The protocol is designed parallel to standard physiotherapy, with two to three treadmill training sessions per week, each lasting 20 min, for six weeks. To date, no published randomized controlled trial with an anti-gravity treadmill is available. The findings of this study can help to modify the rehabilitation of patients with partial weight bearing due to their injury or postoperative protocol, and will show whether an anti-gravity treadmill is useful in the rehabilitation of those patients. Further ongoing studies will identify different indications for an anti-gravity treadmill; in connection with those studies, a more valid statement regarding safety and efficacy will be possible. NCT02790229, registered on May 29, 2016.

  3. Bridging the gap in complementary and alternative medicine research: manualization as a means of promoting standardization and flexibility of treatment in clinical trials of acupuncture.

    PubMed

    Schnyer, Rosa N; Allen, John J B

    2002-10-01

    An important methodological challenge encountered in acupuncture clinical research involves the design of treatment protocols that help ensure standardization and replicability while allowing for the necessary flexibility to tailor treatments to each individual. Manualization of protocols used in clinical trials of acupuncture and other traditionally-based complementary and alternative medicine (CAM) systems facilitates the systematic delivery of replicable and standardized, yet individually-tailored treatments. To facilitate high-quality CAM acupuncture research by outlining a method for the systematic design and implementation of protocols used in CAM clinical trials based on the concept of treatment manualization. A series of treatment manuals was developed to systematically articulate the Chinese medical theoretical and clinical framework for a given Western-defined illness, to increase the quality and consistency of treatment, and to standardize the technical aspects of the protocol. In all, three manuals were developed for National Institutes of Health (NIH)-funded clinical trials of acupuncture for depression, spasticity in cerebral palsy, and repetitive stress injury. In Part I, the rationale underlying these manuals and the challenges encountered in creating them are discussed, and qualitative assessments of their utility are provided. In Part II, a methodology to develop treatment manuals for use in clinical trials is detailed, and examples are given. A treatment manual provides a precise way to train and supervise practitioners, enable evaluation of conformity and competence, facilitate the training process, and increase the ability to identify the active therapeutic ingredients in clinical trials of acupuncture.

  4. Using Ontologies to Formalize Services Specifications in Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Breitman, Karin Koogan; Filho, Aluizio Haendchen; Haeusler, Edward Hermann

    2004-01-01

    One key issue in multi-agent systems (MAS) is their ability to interact and exchange information autonomously across applications. To secure agent interoperability, designers must rely on a communication protocol that allows software agents to exchange meaningful information. In this paper we propose using ontologies as such a communication protocol. Ontologies capture the semantics of the operations and services provided by agents, allowing interoperability and information exchange in a MAS. Ontologies are a formal, machine-processable representation that captures the semantics of a domain and allows meaningful information to be derived by way of logical inference. In our proposal we use a formal knowledge representation language (OWL) that translates into Description Logics (a subset of first-order logic), thus eliminating ambiguities and providing a solid base for machine-based inference. The main contribution of this approach is to make the requirements explicit and centralize the specification in a single document (the ontology itself), while at the same time providing a formal, unambiguous representation that can be processed by automated inference machines.
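
    To make the proposal concrete, the sketch below uses the third-party rdflib package (assumed installed) to build a tiny shared vocabulary and serialize it for exchange between agents; the namespace and class names are invented for illustration and are not from the paper.

    ```python
    from rdflib import Graph, Namespace, Literal, RDF, RDFS

    # Hypothetical shared ontology namespace; agents that load the same
    # vocabulary can interpret each other's statements unambiguously.
    MAS = Namespace("http://example.org/mas#")

    g = Graph()
    g.bind("mas", MAS)
    g.add((MAS.BookingService, RDF.type, RDFS.Class))
    g.add((MAS.agent42, RDF.type, MAS.BookingService))
    g.add((MAS.agent42, RDFS.label, Literal("Flight booking agent")))

    # Serialize the statements as Turtle for exchange between agents.
    print(g.serialize(format="turtle"))
    ```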

  5. Cyclotide isolation and characterization.

    PubMed

    Craik, David J; Henriques, Sonia Troeira; Mylne, Joshua S; Wang, Conan K

    2012-01-01

    Cyclotides are disulfide-rich cyclic peptides produced by plants with the presumed natural function of defense agents against insect pests. They are present in a wide range of plant tissues, being ribosomally synthesized via precursor proteins that are posttranslationally processed to produce mature peptides with a characteristic cyclic backbone and cystine knot motif associated with their six conserved cysteine residues. Their processing is not fully understood but involves asparaginyl endoproteinase activity. In addition to interest in their defense roles and their unique topologies, cyclotides have attracted attention as potential templates in peptide-based drug design applications. This chapter provides protocols for the isolation of cyclotides from plants, their detection and sequencing by mass spectrometry, and their structural analysis by NMR, as well as describing methods for the isolation of nucleic acid sequences that encode their precursor proteins. Assays to assess their membrane-binding interactions are also described. These protocols provide a "starter kit" for researchers entering the cyclotide field. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. Instructor Handbook for the Protocol Modules on Classroom Management. Utah Protocol Materials Project.

    ERIC Educational Resources Information Center

    Langer, Philip; Borg, Walter R.

    This handbook is designed to acquaint the teacher educator with the training materials in classroom management prepared by the Utah State University Protocol Training Project. It deals with the protocol materials generally and with each module specifically, and includes the following sections: (a) an introduction to and rationale for protocol…

  7. A Novel Cross-Layer Routing Protocol Based on Network Coding for Underwater Sensor Networks

    PubMed Central

    Wang, Hao; Wang, Shilian; Bu, Renfei; Zhang, Eryang

    2017-01-01

    Underwater wireless sensor networks (UWSNs) have attracted increasing attention in recent years because of their numerous applications in ocean monitoring, resource discovery and tactical surveillance. However, the design of reliable and efficient transmission and routing protocols is a challenge due to the low acoustic propagation speed and complex channel environment in UWSNs. In this paper, we propose a novel cross-layer routing protocol based on network coding (NCRP) for UWSNs, which utilizes network coding and cross-layer design to greedily forward data packets to sink nodes efficiently. The proposed NCRP takes full advantage of multicast transmission and decodes packets jointly with encoded packets received from multiple potential nodes in the entire network. The transmission power is optimized in our design to extend the life cycle of the network. Moreover, we design a real-time routing maintenance protocol to update the route when detecting inefficient relay nodes. Substantial simulations in an underwater environment using Network Simulator 3 (NS-3) show that NCRP significantly improves network performance in terms of energy consumption, end-to-end delay and packet delivery ratio compared with other routing protocols for UWSNs. PMID:28786915
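
    The gain network coding offers can be seen in the classic two-flow butterfly example: a relay XORs two packets into one transmission, and each sink recovers the packet it is missing using the one it already holds. The Python sketch below shows only this textbook idea, not NCRP's actual coding scheme.

    ```python
    def xor_bytes(a: bytes, b: bytes) -> bytes:
        """Bytewise XOR of two equal-length packets."""
        return bytes(x ^ y for x, y in zip(a, b))

    # Classic butterfly example: the relay combines p1 and p2 into one
    # coded transmission instead of forwarding them separately.
    p1 = b"sensor-A"
    p2 = b"sensor-B"
    coded = xor_bytes(p1, p2)

    # A sink that already overheard p1 recovers p2 from the coded packet:
    assert xor_bytes(coded, p1) == p2
    ```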

  8. Running key mapping in a quantum stream cipher by the Yuen 2000 protocol

    NASA Astrophysics Data System (ADS)

    Shimizu, Tetsuya; Hirota, Osamu; Nagasako, Yuki

    2008-03-01

    A quantum stream cipher based on the Yuen 2000 protocol (the so-called Y00 protocol or αη scheme), consisting of a linear feedback shift register with a short key, is very attractive for implementing secure 40 Gbit/s optical data transmission, which is expected in next-generation networks. However, a basic model of the Y00 protocol with a very short key needs a careful design against fast correlation attacks, as pointed out by Donnet. This Brief Report clarifies the effectiveness of irregular mapping between the running key and the physical signals in the driver for selection of the M-ary basis in the transmitter, and gives a design method. Consequently, a quantum stream cipher by the Y00 protocol with our mapping has immunity against the proposed fast correlation attacks on a basic model of the Y00 protocol even if the key is very short.
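
    As a toy illustration of the components named above, the sketch below drives an M-ary basis choice from an LFSR keystream using a plain (regular) mapping; the paper's contribution is precisely to replace such a regular mapping with an irregular one. The register length, taps, and seed are arbitrary.

    ```python
    def lfsr_bits(state: int, taps: tuple):
        """Galois-style LFSR emitting one running-key bit per step
        (illustrative toy; real Y00 drivers use engineered keystreams)."""
        while True:
            out = state & 1
            state >>= 1
            if out:
                for t in taps:
                    state ^= 1 << t
            yield out

    def next_basis(bits, m_log2: int) -> int:
        """Consume m_log2 running-key bits and map them to a basis index
        in [0, 2**m_log2). A regular mapping like this is what an
        irregular mapping is designed to replace."""
        idx = 0
        for _ in range(m_log2):
            idx = (idx << 1) | next(bits)
        return idx

    bits = lfsr_bits(state=0b1011001, taps=(6, 5))
    print([next_basis(bits, 4) for _ in range(5)])  # five 16-ary basis choices
    ```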

  9. Design of an instrument to measure the quality of care in Physical Therapy

    PubMed Central

    Cavalheiro, Leny Vieira; Eid, Raquel Afonso Caserta; Talerman, Claudia; do Prado, Cristiane; Gobbi, Fátima Cristina Martorano; Andreoli, Paola Bruno de Araujo

    2015-01-01

    ABSTRACT Objective: To design an instrument composed of domains that would demonstrate physical therapy activities and generate a consistent index to represent the quality of care in physical therapy. Methods: The Lean Six Sigma methodology was used to design the tool. The discussion involved staff from seven different management groups. By means of brainstorming and a Cause & Effect Matrix, we set up the process map. Results: After application of the Cause & Effect Matrix, five requirements composed the quality of care index in physical therapy: physical therapist performance, a care outcome indicator, adherence to physical therapy protocols, whether the prognosis and treatment outcome were achieved, and infrastructure. Conclusion: The proposed design allowed evaluating several items related to the physical therapy service, enabling customization, reproducibility, and benchmarking with other organizations. For management, this index provides the opportunity to identify areas for improvement and the strengths of the team and the process of physical therapy care. PMID:26154548

  10. Protocols for second-generation business satellites systems

    NASA Astrophysics Data System (ADS)

    Evans, B. G.; Coakley, F. P.; El Amin, M. H. M.

    The paper discusses the nature and mix of traffic in business satellite systems and describes the limitations imposed on the protocol by the differing impairments of speech, video, and data. A simple TDMA system protocol is presented which meets the requirements of mixed-service operation. The efficiency of the protocol, together with its implications for allocation, scheduling and synchronisation, is discussed. Future-generation satellites will probably use on-board processing. Some initial work on protocols that make use of on-board processing, and the implications for satellite and earth-station equipment, is presented.
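
    A mixed-service TDMA frame of the kind described can be pictured as fixed slots for delay-sensitive speech with the remainder served from a data queue. The following toy scheduler sketches that idea only; it is not the protocol proposed in the paper.

    ```python
    def build_frame(n_slots: int, voice_circuits: int, data_queue: list) -> list:
        """Assign a TDMA frame: fixed slots for delay-sensitive voice,
        remaining slots served FIFO from the data queue (toy model)."""
        frame = [("voice", c) for c in range(voice_circuits)]
        free = n_slots - len(frame)
        frame += [("data", data_queue.pop(0))
                  for _ in range(min(free, len(data_queue)))]
        frame += [("idle", None)] * (n_slots - len(frame))
        return frame

    queue = ["d1", "d2", "d3", "d4"]
    print(build_frame(n_slots=8, voice_circuits=3, data_queue=queue))
    ```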

  11. A Cloud-Based X73 Ubiquitous Mobile Healthcare System: Design and Implementation

    PubMed Central

    Ji, Zhanlin; O'Droma, Máirtín; Zhang, Xin; Zhang, Xueji

    2014-01-01

    Based on the user-centric paradigm for next generation networks, this paper describes a ubiquitous mobile healthcare (uHealth) system based on the ISO/IEEE 11073 personal health data (PHD) standards (X73) and cloud computing techniques. A number of design issues associated with the system implementation are outlined. The system includes a middleware on the user side, providing a plug-and-play environment for heterogeneous wireless sensors and mobile terminals utilizing different communication protocols and a distributed “big data” processing subsystem in the cloud. The design and implementation of this system are envisaged as an efficient solution for the next generation of uHealth systems. PMID:24737958

  12. A CoAP-Based Network Access Authentication Service for Low-Power Wide Area Networks: LO-CoAP-EAP.

    PubMed

    Garcia-Carrillo, Dan; Marin-Lopez, Rafael; Kandasamy, Arunprabhu; Pelov, Alexander

    2017-11-17

    The Internet-of-Things (IoT) landscape is expanding with new radio technologies. In addition to the Low-Rate Wireless Personal Area Network (LR-WPAN), the recent set of technologies making up the so-called Low-Power Wide Area Networks (LP-WAN) offers long-range communications, allowing one to send small pieces of information at a reduced energy cost, which promotes the creation of new IoT applications and services. However, LP-WAN technologies pose new challenges since they have strong limitations in the available bandwidth. In general, a first step prior to a smart object gaining access to the network is the process of network access authentication. It involves authentication, authorization and key management operations. This process is of vital importance for operators to control network resources. However, proposals for managing network access authentication in LP-WAN are tailored to the specifics of each technology, which could introduce interoperability problems in the future. In this sense, little effort has been put so far into providing a wireless-independent solution for network access authentication in the area of LP-WAN. To fill this gap, we propose a service named Low-Overhead CoAP-EAP (LO-CoAP-EAP), which is based on previous work designed for LR-WPAN. LO-CoAP-EAP integrates the use of Authentication, Authorization and Accounting (AAA) infrastructures and the Extensible Authentication Protocol (EAP). For this integration, we use the Constrained Application Protocol (CoAP) to design a network authentication service independent of the type of LP-WAN technology. LO-CoAP-EAP represents a trade-off between flexibility, wireless technology independence, scalability and performance in LP-WAN.

  13. A CoAP-Based Network Access Authentication Service for Low-Power Wide Area Networks: LO-CoAP-EAP

    PubMed Central

    Garcia-Carrillo, Dan; Marin-Lopez, Rafael; Kandasamy, Arunprabhu; Pelov, Alexander

    2017-01-01

    The Internet-of-Things (IoT) landscape is expanding with new radio technologies. In addition to the Low-Rate Wireless Personal Area Network (LR-WPAN), the recent set of technologies making up the so-called Low-Power Wide Area Networks (LP-WAN) offers long-range communications, allowing one to send small pieces of information at a reduced energy cost, which promotes the creation of new IoT applications and services. However, LP-WAN technologies pose new challenges since they have strong limitations in the available bandwidth. In general, a first step prior to a smart object gaining access to the network is the process of network access authentication. It involves authentication, authorization and key management operations. This process is of vital importance for operators to control network resources. However, proposals for managing network access authentication in LP-WAN are tailored to the specifics of each technology, which could introduce interoperability problems in the future. In this sense, little effort has been put so far into providing a wireless-independent solution for network access authentication in the area of LP-WAN. To fill this gap, we propose a service named Low-Overhead CoAP-EAP (LO-CoAP-EAP), which is based on previous work designed for LR-WPAN. LO-CoAP-EAP integrates the use of Authentication, Authorization and Accounting (AAA) infrastructures and the Extensible Authentication Protocol (EAP). For this integration, we use the Constrained Application Protocol (CoAP) to design a network authentication service independent of the type of LP-WAN technology. LO-CoAP-EAP represents a trade-off between flexibility, wireless technology independence, scalability and performance in LP-WAN. PMID:29149040
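
    The "low-overhead" claim is easiest to appreciate against CoAP's fixed 4-byte header (RFC 7252). The sketch below packs just that header plus a token; the EAP payload and CoAP option encoding, and hence the actual LO-CoAP-EAP message format, are omitted.

    ```python
    import struct

    def coap_header(msg_type: int, code: int, message_id: int,
                    token: bytes = b"") -> bytes:
        """Pack the 4-byte fixed CoAP header (RFC 7252) plus token.
        msg_type: 0=CON, 1=NON, 2=ACK, 3=RST; code 0.02 (POST) = 0x02."""
        ver = 1
        first = (ver << 6) | (msg_type << 4) | len(token)
        return struct.pack("!BBH", first, code, message_id) + token

    # A confirmable POST carrying a 2-byte token costs only 6 bytes of
    # framing -- the kind of overhead budget that matters on LP-WAN links.
    hdr = coap_header(msg_type=0, code=0x02, message_id=0x1234, token=b"\x01\x02")
    print(hdr.hex(), len(hdr), "bytes")
    ```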

  14. Gender-Specific Combination HIV Prevention for Youth in High-Burden Settings: The MP3 Youth Observational Pilot Study Protocol.

    PubMed

    Buttolph, Jasmine; Inwani, Irene; Agot, Kawango; Cleland, Charles M; Cherutich, Peter; Kiarie, James N; Osoti, Alfred; Celum, Connie L; Baeten, Jared M; Nduati, Ruth; Kinuthia, John; Hallett, Timothy B; Alsallaq, Ramzi; Kurth, Ann E

    2017-03-08

    Nearly three decades into the epidemic, sub-Saharan Africa (SSA) remains the region most heavily affected by human immunodeficiency virus (HIV), with nearly 70% of the 34 million people living with HIV globally residing in the region. In SSA, female and male youth (15 to 24 years) are at a disproportionately high risk of HIV infection compared to adults. As such, there is a need to target HIV prevention strategies to youth and to tailor them to a gender-specific context. This protocol describes the process for the multi-staged approach in the design of the MP3 Youth pilot study, a gender-specific combination HIV prevention intervention for youth in Kenya. The objective of this multi-method protocol is to outline a rigorous and replicable methodology for a gender-specific combination HIV prevention pilot study for youth in high-burden settings, illustrating the triangulated methods undertaken to ensure that age, sex, and context are integral in the design of the intervention. The mixed-methods, cross-sectional, longitudinal cohort pilot study protocol was developed by first conducting a systematic review of the literature, which shaped focus group discussions around prevention package and delivery options, and which also informed age- and sex-stratified mathematical modeling. The review, qualitative data, and mathematical modeling created a triangulated evidence base of interventions to be included in the pilot study protocol. To design the pilot study protocol, we convened an expert panel to select HIV prevention interventions effective for youth in SSA, which will be offered in a mobile health setting. The goal of the pilot study implementation and evaluation is to apply lessons learned to more effective HIV prevention evidence and programming. The combination HIV prevention package in this protocol includes (1) offering HIV testing and counseling for all youth; (2) voluntary medical circumcision and condoms for males; (3) pre-exposure prophylaxis (PrEP), conditional cash transfer (CCT), and contraceptives for females; and (4) referrals for HIV care among those identified as HIV-positive. The combination package platform selected is mobile health teams in an integrated services delivery model. A cross-sectional analysis will be conducted to determine the uptake of the interventions. To determine long-term impact, the protocol outlines enrolling selected participants in mutually exclusive longitudinal cohorts (HIV-positive, PrEP, CCT, and HIV-negative), followed using mobile phone text messages (short message service, SMS) and in-person surveys to prospectively assess prevention method uptake, adherence, and risk compensation behaviors. Cross-sectional and sub-cohort analyses will be conducted to determine intervention package uptake. The literature review, focus groups, and modeling indicate that offering age- and gender-specific combination HIV prevention interventions that include biomedical, behavioral, and structural components can have an impact on HIV risk reduction. Implementing this protocol will show the feasibility of delivering these services at scale. The MP3 Youth study is one of the few combination HIV prevention intervention protocols incorporating youth- and gender-specific interventions in one delivery setting. Lessons learned from the design of the protocol can be incorporated into the national guidance for combination HIV prevention for youth in Kenya and other high-burden SSA settings.
ClinicalTrials.gov NCT01571128; http://clinicaltrials.gov/ct2/show/NCT01571128?term=MP3+youth&rank=1 (Archived by WebCite at http://www.webcitation.org/6nmioPd54). ©Jasmine Buttolph, Irene Inwani, Kawango Agot, Charles M Cleland, Peter Cherutich, James N Kiarie, Alfred Osoti, Connie L Celum, Jared M Baeten, Ruth Nduati, John Kinuthia, Timothy B Hallett, Ramzi Alsallaq, Ann E Kurth. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 08.03.2017.

  15. Automated monitoring of medical protocols: a secure and distributed architecture.

    PubMed

    Alsinet, T; Ansótegui, C; Béjar, R; Fernández, C; Manyà, F

    2003-03-01

    The control of the correct application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully or semi-autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, so specialized domain agents are independent of negotiation processes, and autonomous system agents perform the monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, which is responsible for obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity and authentication during the process of exchanging information between agents.
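
    A minimal way to picture the agent architecture is a mailbox-driven broker that answers queries posted by other agents. The toy sketch below is our own stand-in for the paper's database broker agent, with invented record data.

    ```python
    import queue

    class DatabaseBrokerAgent:
        """Toy stand-in for the paper's database broker agent: answers
        clinical-history queries posted by other agents (illustrative only)."""
        def __init__(self, records: dict):
            self.inbox = queue.Queue()
            self.records = records

        def step(self):
            # Each inbox item pairs a patient id with a reply mailbox.
            patient_id, reply_to = self.inbox.get()
            reply_to.put(self.records.get(patient_id, "no record"))

    broker = DatabaseBrokerAgent({"p1": "history of p1"})
    reply: queue.Queue = queue.Queue()
    broker.inbox.put(("p1", reply))
    broker.step()
    print(reply.get())
    ```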

  16. Design Science Research toward Designing/Prototyping a Repeatable Model for Testing Location Management (LM) Algorithms for Wireless Networking

    ERIC Educational Resources Information Center

    Peacock, Christopher

    2012-01-01

    The purpose of this research effort was to develop a model that provides repeatable Location Management (LM) testing using a network simulation tool, QualNet version 5.1 (2011). The model will provide current and future protocol developers a framework to simulate stable protocol environments for development. This study used the Design Science…

  17. 22 CFR 2.4 - Designation of official guests.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Designation of official guests. 2.4 Section 2.4... Protocol. The designation of a person as an official guest is final. Pursuant to section 2658 of title 22... official guests is hereby delegated to the Chief of Protocol. (22 U.S.C. 2658) [45 FR 55716, Aug. 21, 1980] ...

  18. Sperm Cell Population Dynamics in Ram Semen during the Cryopreservation Process

    PubMed Central

    Ramón, Manuel; Pérez-Guzmán, M. Dolores; Jiménez-Rabadán, Pilar; Esteso, Milagros C.; García-Álvarez, Olga; Maroto-Morales, Alejandro; Anel-López, Luis; Soler, Ana J.; Fernández-Santos, M. Rocío; Garde, J. Julián

    2013-01-01

    Background Sperm cryopreservation has become an indispensable tool in biology. Initially, studies were aimed at the development of efficient freezing protocols in different species that would allow for efficient storage of semen samples for long periods of time, ensuring their viability. Nowadays, it is widely known that an important individual component exists in the cryoresistance of semen, and efforts are aimed at identifying those sperm characteristics that may allow us to predict this cryoresistance. This knowledge would lead, ultimately, to the design of freezing protocols optimized for the sperm characteristics of each male. Methodology/Principal Findings We have evaluated the changes that occur in the sperm head dimensions throughout the cryopreservation process. We have found three different patterns of response, each of them related to a different sperm quality at thawing. We have been able to characterize males based on these patterns. For each male, its pattern remained constant among different ejaculates. The latter would imply that males always respond in the same way to freezing, giving even more importance to this sperm feature. Conclusions/Significance Changes in the sperm head during the cryopreservation process have proven useful for identifying the freezability of males' semen. We suggest that analysis of these response patterns would represent an important tool to characterize the cryoresistance of males when implemented within breeding programs. We also propose follow-up experiments to examine the outcomes of using different freezing protocols depending on the males' pattern of response. PMID:23544054

  19. The use of participant-observation protocol in an industrial engineering research.

    PubMed

    Silveira e Silva, Renato da; Sznelwar, Laerte Idal; D'Afonseca e Silva, Victor

    2012-01-01

    Based on the literature, this article presents the "participant-observation" research protocol and its practical application in the industrial engineering field, more specifically within the area of design development and, in the case shown in this article, of interior design. The main aim is to identify the concept of the method, i.e., to build from its characteristics a general understanding of the subject, so that the protocol can be used in different areas of knowledge, especially those committed to scientific research that involves the expertise of researchers and the subjective feelings and opinions of the users of an engineering product, and to show how this knowledge can benefit product design, contributing from the earliest stage of design.

  20. Object oriented design (OOD) in real-time hardware-in-the-loop (HWIL) simulations

    NASA Astrophysics Data System (ADS)

    Morris, Joe; Richard, Henri; Lowman, Alan; Youngren, Rob

    2006-05-01

    Using Object Oriented Design (OOD) concepts in AMRDEC's Hardware-in-the-Loop (HWIL) real-time simulations allows the user to interchange parts of the simulation to meet test requirements. A large-scale three-spectral-band simulator, connected via a high-speed reflective memory ring for time-critical data transfers to PC controllers connected by non-real-time Ethernet protocols, is used to separate software objects into logical entities close to their respective controlled hardware. Each standalone object does its own dynamic initialization, real-time processing, and end-of-run processing; therefore it can be easily maintained and updated. A Resource Allocation Program (RAP) is also utilized, along with a device table, to allocate, organize, and document the communication protocol between the software and hardware components. A GUI display program lists all allocations and deallocations of HWIL memory and hardware resources. This interactive program is also used to clean up defunct allocations of dead processes. Three examples are presented using the OOD and RAP concepts. The first is the control of an ACUTRONICS-built three-axis flight table using the same control for calibration and real-time functions. The second is the transportability of a six-degree-of-freedom (6-DOF) simulation from an Onyx host to a Linux PC. The third is the replacement of the 6-DOF simulation with a replay program to drive the facility with archived run data for demonstration or analysis purposes.
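
    The standalone-object pattern described (each object owning its dynamic initialization, real-time step, and end-of-run processing) can be sketched as a common interface that the simulation executive iterates over. The class and method names below are ours, not AMRDEC's.

    ```python
    class SimObject:
        """Standalone HWIL-style simulation object: each object owns its
        dynamic initialization, real-time step, and end-of-run processing
        (interface modeled on the paper's description; names are ours)."""
        def initialize(self) -> None: ...
        def step(self, t: float) -> None: ...
        def finalize(self) -> None: ...

    class FlightTableController(SimObject):
        def initialize(self) -> None:
            self.angles = (0.0, 0.0, 0.0)
        def step(self, t: float) -> None:
            # Command the three-axis table toward the trajectory at time t.
            self.angles = (t * 0.1, t * 0.05, 0.0)
        def finalize(self) -> None:
            print("final attitude:", self.angles)

    # The executive treats every object uniformly, so parts of the
    # simulation can be swapped without touching the main loop.
    objects = [FlightTableController()]
    for obj in objects:
        obj.initialize()
    for frame in range(3):
        for obj in objects:
            obj.step(frame * 0.01)
    for obj in objects:
        obj.finalize()
    ```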

  1. PNNI Performance Validation Test Report

    NASA Technical Reports Server (NTRS)

    Dimond, Robert P.

    1999-01-01

    Two Private Network-Network Interface (PNNI) neighboring peers were monitored with a protocol analyzer to understand and document how PNNI works with regard to initialization and recovery processes. With the processes documented, pertinent events were identified and measured to determine the protocol's behavior in several environments involving congestion and/or delay. Subsequent testing of the protocol in these environments was conducted to determine the protocol's suitability for use in satellite-terrestrial network architectures.

  2. BODY-ORIENTED THERAPY IN RECOVERY FROM CHILD SEXUAL ABUSE: AN EFFICACY STUDY

    PubMed Central

    Price, Cynthia

    2007-01-01

    Context There has been little research on body therapy for women in sexual abuse recovery. This study examines body-oriented therapy—an approach focused on body awareness and involving the combination of bodywork and the emotional processing of psychotherapy. Objective To examine the efficacy and the perceived influence on abuse recovery of body-oriented therapy. Massage therapy served as a relative control condition to address the lack of touch-based comparisons in bodywork research. Design A 2-group, repeated measures design was employed, involving randomization to either the body-oriented therapy or the massage group, conducted in 8 one-hour sessions by 1 of 4 research clinicians. Statistical and qualitative analysis was employed to provide both empirical and experiential perspectives on the study process. Setting Participants were seen in treatment rooms of a university in the northwestern United States and in clinicians’ private offices. Participants Twenty-four adult females in psychotherapy for child sexual abuse. Interventions The body-oriented therapy protocol was delivered in three stages, involving massage, body awareness exercises, and an inner-body focusing process. The massage therapy protocol was standardized. Both protocols were delivered over clothes. Main Outcome Measures The outcomes reflected 3 key constructs—psychological well-being, physical well-being, and body connection. Repeated measures included: Brief Symptom Inventory, Dissociative Experiences Scale, Crime-Related Post Traumatic Stress Disorder Scale, Medical Symptoms Checklist, Scale of Body Connection and Scale of Body Investment. Results were gathered at 6 time points: baseline, 2 times during intervention, post-intervention, and at 1 month and 3 months follow-up. To examine the experiential perspective of the study process, written questionnaires were administered before and after intervention and at 1 month and 3 months follow-up. Results Repeated measures analysis of variance (ANOVA) indicated significant improvement on all outcome measures for both intervention groups, providing support for the efficacy of body therapy in recovery from childhood sexual abuse. There were no statistically significant differences between groups; however, qualitative analysis of open-ended questions about participant intervention experience revealed that the groups differed in their perceived experience of the intervention and its influence on therapeutic recovery. PMID:16189948

  3. iLAP: a workflow-driven software for experimental protocol development, data acquisition and analysis

    PubMed Central

    2009-01-01

    Background In recent years, the genome biology community has expended considerable effort to confront the challenges of managing heterogeneous data in a structured and organized way and has developed laboratory information management systems (LIMS) for both raw and processed data. On the other hand, electronic notebooks were developed to record and manage scientific data and facilitate data-sharing. Software that enables both the management of large datasets and the digital recording of laboratory procedures would serve a real need in laboratories using medium- and high-throughput techniques. Results We have developed iLAP (Laboratory data management, Analysis, and Protocol development), a workflow-driven information management system specifically designed to create and manage experimental protocols and to analyze and share laboratory data. The system combines experimental protocol development, wizard-based data acquisition, and high-throughput data analysis into a single, integrated system. We demonstrate the power and flexibility of the platform using a microscopy case study based on a combinatorial multiple fluorescence in situ hybridization (m-FISH) protocol and 3D-image reconstruction. iLAP is freely available under the open source license AGPL from http://genome.tugraz.at/iLAP/. Conclusion iLAP is a flexible and versatile information management system which has the potential to close the gap between electronic notebooks and LIMS and can therefore be of great value for a broad scientific community. PMID:19941647

  4. Open solutions to distributed control in ground tracking stations

    NASA Technical Reports Server (NTRS)

    Heuser, William Randy

    1994-01-01

    The advent of high-speed local area networks has made it possible to interconnect small, powerful computers to function together as a single large computer. Today, distributed computer systems are the new paradigm for large-scale computing systems. However, the communications provided by the local area network are only one part of the solution. The services and protocols used by the application programs to communicate across the network are as indispensable as the local area network itself, and the selection of services and protocols that do not match the system requirements will limit the capabilities, performance, and expansion of the system. Proprietary solutions are available but are usually limited to a select set of equipment. However, there are two solutions based on 'open' standards. The question that must be answered is 'which one is the best one for my job?' This paper examines a model for tracking stations and their requirements for interprocessor communications in the next century. The model and requirements are matched with the model and services provided by the five different software architectures and supporting protocol solutions. Several key services are examined in detail to determine which services and protocols most closely match the requirements for the tracking station environment. The study reveals that the protocols are tailored to the problem domains for which they were originally designed. Further, the study reveals that the process control model is the closest match to the tracking station model.

  5. GoActive: a protocol for the mixed methods process evaluation of a school-based physical activity promotion programme for 13-14year old adolescents.

    PubMed

    Jong, Stephanie T; Brown, Helen Elizabeth; Croxson, Caroline H D; Wilkinson, Paul; Corder, Kirsten L; van Sluijs, Esther M F

    2018-05-21

    Process evaluations are critical for interpreting and understanding outcome trial results. By understanding how interventions function across different settings, process evaluations have the capacity to inform future dissemination of interventions. The complexity of Get others Active (GoActive), a 12-week, school-based physical activity intervention implemented in eight schools, highlights the need to investigate how implementation is achieved across a variety of school settings. This paper describes the mixed methods GoActive process evaluation protocol that is embedded within the outcome evaluation. In this detailed process evaluation protocol, we describe the flexible and pragmatic methods that will be used to capture the process evaluation data. A mixed methods design will be used for the process evaluation, including quantitative data collected in both the control and intervention arms of the GoActive trial, and qualitative data collected in the intervention arm. Data collection methods will include purposively sampled, semi-structured interviews and focus group interviews, direct observation, and participant questionnaires (completed by students, teachers, older adolescent mentors, and local authority-funded facilitators). Data will be analysed thematically within and across datasets. Overall synthesis of findings will address the process of GoActive implementation and the ways in which this process affects outcomes, with careful attention to the context of the school environment. This process evaluation will explore the experience of participating in GoActive from the perspectives of key groups, providing a greater understanding of the acceptability and process of implementation of the intervention across the eight intervention schools. This will allow for appraisal of the intervention's conceptual base, inform potential dissemination, and help optimise post-trial sustainability. The process evaluation will also assist in contextualising the trial effectiveness results with respect to how the intervention may or may not have worked and, if it was found to be effective, what might be required for it to be sustained in the 'real world'. Furthermore, it will offer suggestions for the development and implementation of future initiatives to promote physical activity within schools. ISRCTN, ISRCTN31583496 . Registered on 18 February 2014.

  6. Design of a functional cyclic HSV1-TK reporter and its application to PET imaging of apoptosis

    PubMed Central

    Wang, Zhe; Wang, Fu; Hida, Naoki; Kiesewetter, Dale O; Tian, Jie; Niu, Gang; Chen, Xiaoyuan

    2017-01-01

    Positron emission tomography (PET) is a sensitive and noninvasive imaging method that is widely used to explore molecular events in living subjects. PET can precisely and quantitatively evaluate cellular apoptosis, which has a crucial role in various physiological and pathological processes. In this protocol, we describe the design and use of an engineered cyclic herpes simplex virus 1–thymidine kinase (HSV1-TK) PET reporter whose kinase activity is specifically switched on by apoptosis. The expression of cyclic TK (cTK) in healthy cells leads to inactive product, whereas the activation of apoptosis through the caspase-3 pathway cleaves cTK, thus restoring its activity and enabling PET imaging. In addition to detailing the design and construction of the cTK plasmid in this protocol, we include assays for evaluating the function and specificity of the cTK reporter in apoptotic cells, such as assays for measuring the cell uptake of PET tracer in apoptotic cells, correlating doxorubicin (Dox)-induced cell apoptosis to cTK function recovery, and in vivo PET imaging of cancer cell apoptosis, and we also include corresponding data acquisition methods. The time to build the entire cTK reporter is ~2–3 weeks. The selection of a stable cancer cell line takes ~4–6 weeks. The time to implement assays regarding cTK function in apoptotic cells and the in vivo imaging varies depending on the experiment. The cyclization strategy described in this protocol can also be adapted to create other reporter systems for broad biomedical applications. PMID:25927390

  7. An alternative method for processing northern blots after capillary transfer.

    PubMed

    Nilsen, Timothy W

    2015-03-02

    Different laboratories use different methods for the prehybridization, hybridization, and washing steps of the northern blotting procedure. In this protocol, a northern blot is pretreated with Church and Gilbert hybridization buffer to block nonspecific probe-binding sites. The immobilized RNA is then hybridized to a DNA probe specific for the RNA of interest. Finally, the membrane is washed and subjected to autoradiography or phosphorimaging. The solutions and conditions described here may be ideal for those who prefer to use fewer ingredients in their solutions. This protocol is designed to achieve the same goals as other northern blotting approaches. It minimizes background (nonspecific adherence of probe to membrane and nonspecific hybridization) and maximizes specific hybridization to RNAs immobilized on a membrane. © 2015 Cold Spring Harbor Laboratory Press.

  8. LANES - LOCAL AREA NETWORK EXTENSIBLE SIMULATOR

    NASA Technical Reports Server (NTRS)

    Gibson, J.

    1994-01-01

    The Local Area Network Extensible Simulator (LANES) provides a method for simulating the performance of high speed local area network (LAN) technology. LANES was developed as a design and analysis tool for networking on board the Space Station. The load, network, link and physical layers of a layered network architecture are all modeled. LANES models two different lower-layer protocols: the Fiber Distributed Data Interface (FDDI) and the Star*Bus. The load and network layers are included in the model as a means of introducing upper-layer processing delays associated with message transmission; they do not model any particular protocols. FDDI is an American National Standard and an International Organization for Standardization (ISO) draft standard for a 100 megabit-per-second fiber-optic token ring. Specifications for the LANES model of FDDI are taken from the Draft Proposed American National Standard FDDI Token Ring Media Access Control (MAC), document number X3T9.5/83-16 Rev. 10, February 28, 1986. This is a mature document describing the FDDI media-access-control protocol. Star*Bus, also known as the Fiber Optic Demonstration System, is a protocol for a 100 megabit-per-second fiber-optic star-topology LAN. This protocol, along with a hardware prototype, was developed by Sperry Corporation under contract to NASA Goddard Space Flight Center as a candidate LAN protocol for the Space Station. LANES can be used to analyze performance of a networking system based on either FDDI or Star*Bus under a variety of loading conditions. Delays due to upper-layer processing can easily be nullified, allowing analysis of FDDI or Star*Bus as stand-alone protocols. LANES is a parameter-driven simulation; it provides considerable flexibility in specifying both protocol and run-time parameters. Code has been optimized for fast execution and detailed tracing facilities have been included. LANES was written in FORTRAN 77 for implementation on a DEC VAX under VMS 4.6. It consists of two programs, a simulation program and a user-interface program. The simulation program requires the SLAM II simulation library from Pritsker and Associates, W. Lafayette IN; the user interface is implemented using the Ingres database manager from Relational Technology, Inc. Information about running the simulation program without the user-interface program is contained in the documentation. The memory requirement is 129,024 bytes. LANES was developed in 1988.

  9. How to benchmark methods for structure-based virtual screening of large compound libraries.

    PubMed

    Christofferson, Andrew J; Huang, Niu

    2012-01-01

    Structure-based virtual screening is a useful computational technique for ligand discovery. To systematically evaluate different docking approaches, it is important to have a consistent benchmarking protocol that is both relevant and unbiased. Here, we describe the design of a benchmarking data set for docking screen assessment, a standard docking screening process, and the analysis and presentation of the enrichment of annotated ligands among a background decoy database.
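
    A consistent benchmarking protocol typically reports enrichment: how much more often annotated ligands appear near the top of the ranked database than chance would predict. Below is a minimal sketch of the standard enrichment-factor calculation, run on an invented ranking.

    ```python
    def enrichment_factor(ranked_is_active: list, fraction: float) -> float:
        """EF at a given fraction of the ranked database: the rate of
        actives found in the top slice divided by the rate expected at
        random."""
        n_top = max(1, int(len(ranked_is_active) * fraction))
        hits_top = sum(ranked_is_active[:n_top])
        total_hits = sum(ranked_is_active)
        return (hits_top / n_top) / (total_hits / len(ranked_is_active))

    # Hypothetical docking screen: 1 = annotated ligand, 0 = decoy,
    # ordered best score first.
    ranking = [1, 1, 0, 1, 0, 0, 0, 1, 0, 0]
    print(enrichment_factor(ranking, fraction=0.2))  # EF at the top 20% -> 2.5
    ```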

  10. Refining MARGINS Mini-Lessons Using Classroom Observations

    NASA Astrophysics Data System (ADS)

    Iverson, E. A.; Manduca, C. A.; McDaris, J. R.; Lee, S.

    2009-12-01

    One of the challenges that we face in developing teaching materials or activities from research findings is testing the materials to determine that they work as intended. Traditionally faculty develop material for their own class, notice what worked and didn’t, and improve them the next year. However, as we move to a community process of creating and sharing teaching materials, a community-based process for testing materials is appropriate. The MARGINS project has piloted such a process for testing teaching materials and activities developed as part of its mini-lesson project (http://serc.carleton.edu/margins/index.html). Building on prior work developing mechanisms for community review of teaching resources (e.g. Kastens, 2002; Hancock and Manduca, 2005; Mayhew and Hall, 2007), the MARGINS evaluation team developed a structured classroom observation protocol. The goals of field testing are to a) gather structured, consistent feedback for the lesson authors based on classroom use; b) guide reviewers of these lessons to reflect on research-based educational practice as a framework for their comments; c) collect information on the data and observations that the reviewer used to underpin their review; d) determine which mini-lessons are ready to be made widely available on the website. The protocol guides faculty observations on why they used the activity, the effectiveness of the activity in their classroom, the success of the activity in leading to the desired learning, and what other faculty need to successfully use the activity. Available online (http://serc.carleton.edu/margins/protocol.html), the protocol can be downloaded and completed during instruction with the activity. In order to encourage review of mini-lessons using the protocol, a workshop focused on review and revision of activities was held in May 2009. In preparation for the workshop, 13 of the 28 participants chose to field test a mini-lesson prior to the workshop and reported that they found this process instructive. Activity authors found the observations very helpful and the first mini-lessons have now been revised using feedback from testers. Initial results show that the tested mini-lessons give students hands-on experience with scientific data and help students make connections between geologic phenomena and data. Productive feedback ranged from suggestions for improving activity design, adaptations for other audiences, suggestions for clearer presentation, and tips for using the materials. The team plans to broaden the use of the protocol to test and refine all of the mini-lessons in the MARGINS collection.

  11. Health-care management of an unexpected case of Ebola virus disease at the Alcorcón Foundation University Teaching Hospital.

    PubMed

    Rodríguez-Caravaca, Gil; Timermans, Rafael; Parra-Ramírez, Juan Manuel; Domínguez-Hernández, Francisco Javier; Algora-Weber, Alejandro; Delgado-Iribarren, Alberto; Hermida-Gutiérrez, Guillermo

    2015-04-01

    The first Ebola virus infected patient outside Africa was diagnosed and treated at the Alcorcón Foundation University Teaching Hospital (AFUTH). We describe the integrated management strategy (medical, occupational health, preventive and public health) applied to the case. Descriptive study of the health-care management of an unexpected case of Ebola virus disease (EVD) at the AFUTH treated on 6 October 2014. We describe the clinical evolution of the patient while attended in the Emergency Department, the drawing-up process of the action protocol, the training of hospital staff, the administrative management for transferring the patient to the referral centre, and the measures implemented for cleaning, disinfection and management of waste. Qualitative variables are expressed as percentages. From May to October, our centre designed and updated five versions of the action and care protocol for patients with EVD. The protocol was in force at the AFUTH when a nursing assistant was attended on 6 October 2014. All preventive, diagnostic and therapeutic measures outlined in the protocol were applied, and 206 professionals had received training and information about care procedures for a suspect case. Health-care management of an unexpected case of EVD was adequate, and as a result there were no secondary cases among our staff. All available resources should be used to fight EVD. Copyright © 2015 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.

  12. Dual sensory loss: development of a dual sensory loss protocol and design of a randomized controlled trial

    PubMed Central

    2013-01-01

    Background Dual sensory loss (DSL) has a negative impact on health and wellbeing, and its prevalence is expected to increase due to demographic aging. However, specialized care or rehabilitation programs for DSL are scarce. Until now, low vision rehabilitation has not sufficiently targeted concurrent impairments in vision and hearing. This study aims to 1) develop a DSL protocol (for occupational therapists working in low vision rehabilitation) which focuses on optimal use of the senses and teaches DSL patients and their communication partners to use effective communication strategies, and 2) describe the multicenter parallel randomized controlled trial (RCT) designed to test the effectiveness and cost-effectiveness of the DSL protocol. Methods/design To develop a DSL protocol, the literature was reviewed and content was discussed with professionals in eye/ear care (interviews/focus groups) and DSL patients (interviews). A pilot study was conducted to test and confirm the DSL protocol. In addition, a two-armed international multi-center RCT will evaluate the effectiveness and cost-effectiveness of the DSL protocol compared to waiting list controls, in 124 patients in low vision rehabilitation centers in the Netherlands and Belgium. Discussion This study provides a treatment protocol for rehabilitation of DSL within low vision rehabilitation, which aims to be a valuable addition to general low vision rehabilitation care. Trial registration Netherlands Trial Register (NTR) identifier: NTR2843 PMID:23941667

  13. Authentication techniques for smart cards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, R.A.

    1994-02-01

    Smart card systems are most cost efficient when implemented as a distributed system, which is a system without central host interaction or a local database of card numbers for verifying transaction approval. A distributed system, as such, presents special card and user authentication problems. Fortunately, smart cards offer processing capabilities that provide solutions to authentication problems, provided the system is designed with proper data integrity measures. Smart card systems maintain data integrity through a security design that controls data sources and limits data changes. A good security design is usually a result of a system analysis that provides a thorough understanding of the application needs. Once designers understand the application, they may specify authentication techniques that mitigate the risk of system compromise or failure. Current authentication techniques include cryptography, passwords, challenge/response protocols, and biometrics. The security design includes these techniques to help prevent counterfeit cards, unauthorized use, or information compromise. This paper discusses card authentication and user identity techniques that enhance security for microprocessor card systems. It also describes the analysis process used for determining proper authentication techniques for a system.
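
    Of the techniques listed, challenge/response is the easiest to sketch: the reader issues a fresh random challenge and the card proves knowledge of its key without revealing it. The example below uses HMAC-SHA-256 as a modern stand-in; cards of that era would have used DES-based schemes.

    ```python
    import hashlib
    import hmac
    import secrets

    def card_response(card_key: bytes, challenge: bytes) -> bytes:
        """What the smart card computes on-chip from the reader's challenge."""
        return hmac.new(card_key, challenge, hashlib.sha256).digest()

    # Reader side: issue a fresh random challenge, then verify the card's
    # answer against the key it expects the genuine card to hold.
    key = secrets.token_bytes(16)          # provisioned into the card
    challenge = secrets.token_bytes(16)    # unique per transaction
    response = card_response(key, challenge)
    assert hmac.compare_digest(response, card_response(key, challenge))
    ```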

  14. A Family of ACO Routing Protocols for Mobile Ad Hoc Networks

    PubMed Central

    Rupérez Cañas, Delfín; Sandoval Orozco, Ana Lucila; García Villalba, Luis Javier; Kim, Tai-hoon

    2017-01-01

    In this work, an ACO routing protocol for mobile ad hoc networks based on AntHocNet is specified. Like its predecessor, this new protocol, called AntOR, is hybrid in the sense that it contains elements of both reactive and proactive routing. Specifically, it combines a reactive route setup process with a proactive route maintenance and improvement process. Key aspects of the AntOR protocol are the disjoint-link and disjoint-node routes, the separation between the regular pheromone and the virtual pheromone in the diffusion process, and the exploration of routes taking into consideration the number of hops in the best routes. In this work, a family of ACO routing protocols based on AntOR is also specified. These protocols are based on successive refinements of the protocol. We also present a parallelized version of AntOR that we call PAntOR. Using multiprocessor programming based on shared memory, PAntOR allows tasks to run in parallel using threads. This parallelization is applicable in the route setup phase, the route local repair process, and link failure notification. In addition, a variant of PAntOR that has more than one interface, which we call PAntOR-MI (PAntOR-Multiple Interface), is specified. This approach parallelizes the sending of broadcast messages per interface through threads. PMID:28531159

  15. Environmental Compliance Assessment Protocol-Centers for Disease Control and Prevention (ECAP-CDC)

    DTIC Science & Technology

    1993-10-01

    propellers, or appliances. 2. Military weapons or equipment designed for combat use. 3. Rockets or equipment designed for research, or experimental or...should be reproduced and used during the assessment to take notes. It is designed to be inserted between each page of the protocols, allowing the...procedures are designed as an aid and should not be considered exhaustive. Use of the guide requires the evaluator's judgement to play a role in

  16. RetroPath2.0: A retrosynthesis workflow for metabolic engineers.

    PubMed

    Delépine, Baudoin; Duigou, Thomas; Carbonell, Pablo; Faulon, Jean-Loup

    2018-01-01

    Synthetic biology applied to industrial biotechnology is transforming the way we produce chemicals. However, despite advances in the scale and scope of metabolic engineering, the research and development process still remains costly. In order to expand the chemical repertoire for the production of next generation compounds, a major engineering biology effort is required in the development of novel design tools that target chemical diversity through rapid and predictable protocols. Addressing that goal involves retrosynthesis approaches that explore the chemical biosynthetic space. However, the complexity associated with the large combinatorial retrosynthesis design space has often been recognized as the main challenge hindering the approach. Here, we provide RetroPath2.0, an automated open source workflow for retrosynthesis based on generalized reaction rules that performs the retrosynthesis search from chassis to target through an efficient and well-controlled protocol. Its ease of use and the versatility of its applications make this tool a valuable addition to the biological engineer's bench desk. We show through several examples the application of the workflow to biotechnologically relevant problems, including the identification of alternative biosynthetic routes through enzyme promiscuity and the development of biosensors. In this way, we demonstrate the ability of the workflow to streamline retrosynthesis pathway design and its major role in reshaping the design, build, test and learn pipeline by driving the process toward the objective of optimizing bioproduction. The RetroPath2.0 workflow is built using tools developed by the bioinformatics and cheminformatics communities; because it is open source, we anticipate that community contributions will further expand its features. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
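
    The core retrosynthesis loop can be caricatured as a backwards search: apply reaction rules to the target until every open compound is native to the chassis. The sketch below works on invented, string-named compounds and rules; RetroPath2.0 itself applies generalized reaction rules to molecular structures, not names.

    ```python
    from collections import deque

    # Toy rule set: product -> possible substrate sets (invented compounds).
    RULES = {
        "target": [("A", "B"), ("C",)],
        "A": [("glucose",)],
        "B": [("pyruvate",)],
        "C": [("acetyl-CoA", "B")],
    }
    CHASSIS = {"glucose", "pyruvate", "acetyl-CoA"}

    def retro_bfs(target: str, max_depth: int = 5):
        """Breadth-first retrosynthesis: expand the target backwards until
        every open compound is natively available in the chassis."""
        frontier = deque([(frozenset([target]), [])])
        while frontier:
            open_set, path = frontier.popleft()
            if not open_set:            # everything grounded in the chassis
                return path
            if len(path) >= max_depth:
                continue
            compound = next(iter(open_set))
            for substrates in RULES.get(compound, []):
                new_open = (open_set - {compound}) | (set(substrates) - CHASSIS)
                frontier.append((frozenset(new_open),
                                 path + [(compound, substrates)]))
        return None

    print(retro_bfs("target"))  # one rule sequence grounding the target
    ```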

  17. Flexible continuous manufacturing platforms for solid dispersion formulations

    NASA Astrophysics Data System (ADS)

    Karry-Rivera, Krizia Marie

    In 2013, 16,000 people died in the US from overdoses of prescription drugs and synthetic narcotics. As of that same year, 90% of new molecular entities in the pharmaceutical drug pipeline were classified as poorly water-soluble. The work in this dissertation aims to design, develop, and validate platforms that solubilize weak acids and can potentially deter drug abuse. These platforms are based on processing solid dispersions via solvent-casting and hot-melt extrusion methods to produce oral transmucosal films and melt tablets. To develop these platforms, nanocrystalline suspensions and glassy solutions were solvent-cast in the form of films after physicochemical characterization of drug-excipient interactions and design-of-experiments approaches. A second-order model was fitted to the emulsion diffusion process to predict average nanoparticle size and to support process optimization. To further validate the manufacturing flexibility of the formulations, glassy solutions were also extruded and molded into tablets. This process included a systematic quality-by-design (QbD) approach that served to identify the factors affecting the critical quality attributes (CQAs) of the melt tablets. These products, due to their novelty, lack discriminatory performance tests that serve as predictors of their compliance and stability. Consequently, Process Analytical Technology (PAT) tools were integrated into the continuous manufacturing platform for films. Near-infrared (NIR) spectroscopy, including chemical imaging, combined with deconvolution algorithms was utilized for a holistic assessment of the effect of formulation and process variables on the product's CQAs. Biorelevant dissolution protocols were then established to improve the in vitro-in vivo correlation of the oral transmucosal films. In conclusion, the work in this dissertation supports the delivery of poorly water-soluble drugs in products that may deter abuse. Drug nanocrystals ensured high bioavailability, while glassy solutions enabled drug solubilization in polymer matrices. PAT tools helped characterize the micro- and macrostructure of the product and also served as a control strategy for manufacturing. The systematic QbD assessment enabled identification of the variables that significantly affected melt tablet performance and their potential for an abuse-deterrent product. Because these glassy products are novel systems, biorelevant protocols for testing the dissolution performance of films were also developed.
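
    The dissertation reports fitting a second-order model to the emulsion diffusion process to predict average nanoparticle size. The sketch below shows what such a quadratic response-surface fit looks like with numpy, using synthetic data and hypothetical factor names; it is not the author's dataset or code.

        import numpy as np

        # Synthetic design-of-experiments data (hypothetical): two coded factors,
        # e.g. surfactant concentration x1 and homogenization speed x2, with
        # measured average nanoparticle size y (nm).
        x1 = np.array([-1, -1, 1, 1, 0, 0, -1, 1, 0])
        x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, -1])
        y = np.array([310, 265, 240, 205, 250, 252, 280, 222, 268])

        # Second-order (quadratic) response-surface model:
        # y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
        X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)

        for name, c in zip(["b0", "b1", "b2", "b12", "b11", "b22"], coeffs):
            print(f"{name} = {c:.2f}")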

  18. SU-E-P-03: Implementing a Low Dose Lung Screening CT Program Meeting Regulatory Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaFrance, M; Marsh, S; O'Donnell, G

    Purpose: To provide guidance to Radiology Departments intending to implement a Low Dose CT Screening Program using different CT scanners with multiple techniques within the framework of the required state regulations. Method: Meeting state requirements for a Low Dose CT Lung Protocol required working with the Radiology and Pulmonary Departments to set up a Low Dose Screening Protocol designed to reduce the radiation burden to the patients enrolled. Radiation dose measurements (CTDIvol) were made on scanners from various CT manufacturers (Siemens 16, Siemens 64, Philips 64, and Neusoft 128) for three different weight-based protocols. All scans were reviewed by the Radiologist. Prior to starting a low dose lung screening protocol, information had to be submitted to the state for approval. Performing a Healing Arts protocol requires extensive information: not only the name and address of the applicant, but a detailed description of the disease, the x-ray examination, and the population to be examined. The unit had to be tested by a qualified expert using the technique charts. The credentials of all the operators, the supervisors, and the Radiologists had to be submitted to the state. Results: All the appropriate documentation was sent to the state for review. The measured results for the Low Dose Protocol versus the default Adult Chest Protocol showed a dose reduction of 65% for small (100-150 lbs) patients, 75% for medium (151-250 lbs) patients, and 55% for large (over 250 lbs) patients. Conclusion: Measured results indicated that the Low Dose Protocol indeed lowered the screening patient's radiation dose, and the institution was able to submit the protocol to the State's regulators.
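
    The reported reductions are straightforward percentage differences in CTDIvol between the default adult chest protocol and the low-dose protocol. The short computation below reproduces the reported percentages from hypothetical CTDIvol values chosen for illustration; the abstract itself reports only the percentages.

        # Hypothetical CTDIvol values (mGy); the abstract reports only the
        # percentage reductions, so these numbers are illustrative.
        default_ctdi = {"small": 8.0, "medium": 12.0, "large": 16.0}
        low_dose_ctdi = {"small": 2.8, "medium": 3.0, "large": 7.2}

        for size in default_ctdi:
            reduction = 100 * (default_ctdi[size] - low_dose_ctdi[size]) / default_ctdi[size]
            print(f"{size}: {reduction:.0f}% dose reduction")
        # prints 65%, 75%, and 55%, matching the reported reductions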

  19. Using the Emanuel et al. framework to assess ethical issues raised by a biomedical research ethics committee in South Africa.

    PubMed

    Tsoka-Gwegweni, Joyce M; Wassenaar, Douglas R

    2014-12-01

    The Emanuel, Wendler, and Grady framework was designed as a universal tool for use in many settings, including developing countries. However, it is not known whether the work of African health research ethics committees (RECs) is compatible with this framework. The absence of any normative or empirical weighting of the eight principles within this framework suggests that different health RECs may raise some ethical issues more frequently than others when reviewing protocols. We used the Emanuel et al. framework to assess, code, and rank the most frequent ethical issues considered by a biomedical REC during review of research protocols for the years 2008 to 2012. We extracted data from the recorded minutes of a South African biomedical REC for the years 2008 to 2012, designed the data collection sheet according to the Emanuel et al. framework, and removed all identifiers during data processing and analysis. From the 98 protocols that we assessed, the most frequent issues that emerged were informed consent, scientific validity, fair participant selection, and ongoing respect for participants. This study represents the first known attempt to analyze REC responses/minutes using the Emanuel et al. framework, and suggests that this framework may be useful in describing and categorizing the core activities of an REC. © The Author(s) 2014.

  20. MicroSEQ® Salmonella spp. Detection Kit Using the Pathatrix® 10-Pooling Salmonella spp. Kit Linked Protocol Method Modification.

    PubMed

    Wall, Jason; Conrad, Rick; Latham, Kathy; Liu, Eric

    2014-03-01

    Real-time PCR methods for detecting foodborne pathogens offer the advantages of simplicity and quick time to results compared to traditional culture methods. The addition of a recirculating pooled immunomagnetic separation method prior to real-time PCR analysis increases processing output while reducing both cost and labor. This AOAC Research Institute method modification study validates the MicroSEQ® Salmonella spp. Detection Kit [AOAC Performance Tested Method (PTM) 031001] linked with the Pathatrix® 10-Pooling Salmonella spp. Kit (AOAC PTM 090203C) in diced tomatoes, chocolate, and deli ham. The Pathatrix 10-Pooling protocol represents a method modification of the enrichment portion of the MicroSEQ Salmonella spp. method. The results of the method modification were compared to standard cultural reference methods for diced tomatoes, chocolate, and deli ham. All three matrixes were analyzed in a paired study design. An additional set of chocolate test portions was analyzed using an alternative enrichment medium in an unpaired study design. For all matrixes tested, there were no statistically significant differences in the number of positive test portions detected by the modified candidate method compared to the appropriate reference method. The MicroSEQ Salmonella spp. protocol linked with the Pathatrix individual or 10-Pooling procedure demonstrated reliability as a rapid, simplified method for the preparation of samples and subsequent detection of Salmonella in diced tomatoes, chocolate, and deli ham.

  1. An intelligent case-adjustment algorithm for the automated design of population-based quality auditing protocols.

    PubMed

    Advani, Aneel; Jones, Neil; Shahar, Yuval; Goldstein, Mary K; Musen, Mark A

    2004-01-01

    We develop a method and algorithm for deciding the optimal approach to creating quality-auditing protocols for guideline-based clinical performance measures. An important element of the audit protocol design problem is deciding which guideline elements to audit. Specifically, the problem is how and when to aggregate individual patient case-specific guideline elements into population-based quality measures. The key statistical issue involved is the trade-off between increased reliability with more general population-based quality measures versus increased validity from individually case-adjusted but more restricted measures done at a greater audit cost. Our intelligent algorithm for auditing protocol design is based on hierarchically modeling incrementally case-adjusted quality constraints. We select quality constraints to measure using an optimization criterion based on statistical generalizability coefficients. We present results of the approach from a deployed decision support system for a hypertension guideline.
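
    A minimal sketch of the statistical trade-off described above, assuming a simple one-facet generalizability model: each candidate quality measure has person and error variance components, and the G coefficient for n audited cases is var_p / (var_p + var_e / n). The measure names, variance components, costs, and greedy selection rule are all hypothetical; the deployed algorithm models hierarchically case-adjusted constraints and is more sophisticated than this.

        # Candidates: (name, var_person, var_error, cost_per_case) -- assumed values.
        candidates = [
            ("bp_controlled_overall", 4.0, 9.0, 1.0),
            ("bp_controlled_case_adjusted", 6.0, 25.0, 3.0),
            ("med_intensified_on_high_bp", 3.0, 16.0, 2.0),
        ]

        def g_coefficient(var_p, var_e, n):
            """Generalizability coefficient for n audited cases per provider."""
            return var_p / (var_p + var_e / n)

        budget, n_cases, threshold = 100.0, 30, 0.80
        selected = []
        for name, var_p, var_e, cost in candidates:
            g = g_coefficient(var_p, var_e, n_cases)
            total_cost = cost * n_cases
            # Keep measures that are reliable enough and fit the audit budget.
            if g >= threshold and total_cost <= budget:
                selected.append((name, round(g, 3), total_cost))
                budget -= total_cost

        print(selected)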

  2. 77 FR 36281 - Solicitation of Information and Recommendations for Revising OIG's Provider Self-Disclosure Protocol

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-18

    ...] Solicitation of Information and Recommendations for Revising OIG's Provider Self-Disclosure Protocol AGENCY... Register notice informs the public that OIG: (1) Intends to update the Provider Self-Disclosure Protocol... Provider Self-Disclosure Protocol (the Protocol) to establish a process for health care providers to...

  3. Personalised Hip Therapy: development of a non-operative protocol to treat femoroacetabular impingement syndrome in the FASHIoN randomised controlled trial.

    PubMed

    Wall, Peter Dh; Dickenson, Edward J; Robinson, David; Hughes, Ivor; Realpe, Alba; Hobson, Rachel; Griffin, Damian R; Foster, Nadine E

    2016-10-01

    Femoroacetabular impingement (FAI) syndrome is increasingly recognised as a cause of hip pain. As part of the design of a randomised controlled trial (RCT) of arthroscopic surgery for FAI syndrome, we developed a protocol for non-operative care and evaluated its feasibility. In phase one, we developed a protocol for non-operative care for FAI in the UK National Health Service (NHS), through a process of systematic review and consensus gathering. In phase two, the protocol was tested in an internal pilot RCT for protocol adherence and adverse events. The final protocol, called Personalised Hip Therapy (PHT), consists of four core components led by physiotherapists: detailed patient assessment, education and advice, help with pain relief and an exercise-based programme that is individualised, supervised and progressed over time. PHT is delivered over 12-26 weeks in 6-10 physiotherapist-patient contacts, supplemented by a home exercise programme. In the pilot RCT, 42 patients were recruited and 21 randomised to PHT. Review of treatment case report forms, completed by physiotherapists, showed that 13 patients (62%) received treatment that had closely followed the PHT protocol. 13 patients reported some muscle soreness at 6 weeks, but there were no serious adverse events. PHT provides a structure for the non-operative care of FAI and offers guidance to clinicians and researchers in an evolving area with limited evidence. PHT was deliverable within the National Health Service, is safe, and now forms the comparator to arthroscopic surgery in the UK FASHIoN trial (ISRCTN64081839). ISRCTN 09754699. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  4. Personalised Hip Therapy: development of a non-operative protocol to treat femoroacetabular impingement syndrome in the FASHIoN randomised controlled trial

    PubMed Central

    Wall, Peter DH; Dickenson, Edward J; Robinson, David; Hughes, Ivor; Realpe, Alba; Hobson, Rachel; Griffin, Damian R; Foster, Nadine E

    2016-01-01

    Introduction Femoroacetabular impingement (FAI) syndrome is increasingly recognised as a cause of hip pain. As part of the design of a randomised controlled trial (RCT) of arthroscopic surgery for FAI syndrome, we developed a protocol for non-operative care and evaluated its feasibility. Methods In phase one, we developed a protocol for non-operative care for FAI in the UK National Health Service (NHS), through a process of systematic review and consensus gathering. In phase two, the protocol was tested in an internal pilot RCT for protocol adherence and adverse events. Results The final protocol, called Personalised Hip Therapy (PHT), consists of four core components led by physiotherapists: detailed patient assessment, education and advice, help with pain relief and an exercise-based programme that is individualised, supervised and progressed over time. PHT is delivered over 12–26 weeks in 6–10 physiotherapist-patient contacts, supplemented by a home exercise programme. In the pilot RCT, 42 patients were recruited and 21 randomised to PHT. Review of treatment case report forms, completed by physiotherapists, showed that 13 patients (62%) received treatment that had closely followed the PHT protocol. 13 patients reported some muscle soreness at 6 weeks, but there were no serious adverse events. Conclusion PHT provides a structure for the non-operative care of FAI and offers guidance to clinicians and researchers in an evolving area with limited evidence. PHT was deliverable within the National Health Service, is safe, and now forms the comparator to arthroscopic surgery in the UK FASHIoN trial (ISRCTN64081839). Trial registration number ISRCTN 09754699. PMID:27629405

  5. Time Synchronization and Distribution Mechanisms for Space Networks

    NASA Technical Reports Server (NTRS)

    Woo, Simon S.; Gao, Jay L.; Clare, Loren P.; Mills, David L.

    2011-01-01

    This work discusses research on the problems of synchronizing and distributing time information between spacecraft based on the Network Time Protocol (NTP), a standard time synchronization protocol widely used in terrestrial networks. The Proximity-1 Space Link Interleaved Time Synchronization (PITS) Protocol was designed and developed for synchronizing spacecraft in proximity, i.e., less than 100,000 km apart. A particular application is synchronization between a Mars orbiter and rover; lunar scenarios as well as outer-planet deep space mother-ship-probe missions may also apply. The spacecraft with more accurate time information functions as a time server, and the other spacecraft functions as a time client. PITS can be easily integrated into the CCSDS Proximity-1 Space Link Protocol with minor modifications. In particular, PITS can take advantage of the timestamping strategy that the underlying link layer provides for accurate time offset calculation. The PITS algorithm achieves time synchronization with eight consecutive space network time packet exchanges between two spacecraft. PITS can detect and avoid possible errors from receiving duplicate and out-of-order packets by comparing them with the current state variables and timestamps. Further, PITS is able to detect error events and autonomously recover from unexpected events that can occur during the time synchronization and distribution process. This capability achieves an additional level of protocol protection on top of CRC or error correction codes. PITS is a lightweight and efficient protocol, eliminating the need for explicit frame sequence numbers and long buffer storage. The PITS protocol is capable of providing time synchronization and distribution services for a more general domain where multiple entities need to achieve time synchronization using a single point-to-point link.
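
    The standard NTP-style offset and delay computation from four timestamps, which an NTP-derived protocol like PITS can apply per packet exchange using link-layer timestamps, is shown below; the timestamp values are illustrative (roughly a 0.3 s one-way light time and a client clock 0.05 s behind the server).

        # t1 = client send, t2 = server receive, t3 = server send, t4 = client receive.
        def offset_and_delay(t1, t2, t3, t4):
            offset = ((t2 - t1) + (t3 - t4)) / 2.0  # server clock minus client clock
            delay = (t4 - t1) - (t3 - t2)           # round-trip path delay
            return offset, delay

        # Illustrative values: ~0.3 s one-way light time (about 90,000 km),
        # client clock 0.05 s behind the server.
        offset, delay = offset_and_delay(t1=100.00, t2=100.35, t3=100.40, t4=100.65)
        print(round(offset, 3), round(delay, 3))  # 0.05 0.6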

  6. Network protocols for real-time applications

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.

    1987-01-01

    The Fiber Distributed Data Interface (FDDI) and the SAE AE-9B High Speed Ring Bus (HSRB) are emerging standards for high-performance token ring local area networks. FDDI was designed to be a general-purpose high-performance network. HSRB was designed specifically for military real-time applications. A workshop was conducted at NASA Ames Research Center in January, 1987 to compare and contrast these protocols with respect to their ability to support real-time applications. This report summarizes workshop presentations and includes an independent comparison of the two protocols. A conclusion reached at the workshop was that current protocols for the upper layers of the Open Systems Interconnection (OSI) network model are inadequate for real-time applications.

  7. Bringing memory fMRI to the clinic: comparison of seven memory fMRI protocols in temporal lobe epilepsy.

    PubMed

    Towgood, Karren; Barker, Gareth J; Caceres, Alejandro; Crum, William R; Elwes, Robert D C; Costafreda, Sergi G; Mehta, Mitul A; Morris, Robin G; von Oertzen, Tim J; Richardson, Mark P

    2015-04-01

    fMRI is increasingly implemented in the clinic to assess memory function. There are multiple approaches to memory fMRI, but limited data on the advantages and reliability of different methods. Here, we compared effect size, activation lateralisation, and between-sessions reliability of seven memory fMRI protocols: Hometown Walking (block design), Scene encoding (block design and event-related design), Picture encoding (block and event-related), and Word encoding (block and event-related). All protocols were performed on three occasions in 16 patients with temporal lobe epilepsy (TLE). Group T-maps showed activity bilaterally in the medial temporal lobe for all protocols. Using ANOVA, there was an interaction between hemisphere and seizure-onset lateralisation (P = 0.009) and between hemisphere, protocol and seizure-onset lateralisation (P = 0.002), showing that the distribution of memory-related activity between left and right temporal lobes differed between protocols and between patients with left-onset and right-onset seizures. Using the voxelwise intraclass correlation coefficient, between-sessions reliability was best for Hometown and Scenes (block and event). The between-sessions spatial overlap of activated voxels was also greatest for Hometown and Scenes. Lateralisation of activity between hemispheres was most reliable for Scenes (block and event) and Words (event). Using receiver operating characteristic analysis to explore the ability of each fMRI protocol to classify patients as left-onset or right-onset TLE, only the Words (event) protocol achieved a significantly above-chance classification of patients at all three sessions. We conclude that the Words (event) protocol shows the best combination of between-sessions reliability of the distribution of activity between hemispheres and reliable ability to distinguish between left-onset and right-onset patients. © 2015 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
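
    For reference, a minimal numpy implementation of the between-sessions reliability statistic named above: ICC(3,1) under the two-way mixed-effects, consistency, single-measures formulation, applied to synthetic subject-by-session data (a voxelwise map would repeat this per voxel). This is the generic textbook formula, not the paper's analysis code.

        import numpy as np

        def icc_3_1(x):
            """ICC(3,1) for x of shape (n_subjects, k_sessions)."""
            n, k = x.shape
            grand = x.mean()
            ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()
            ss_total = ((x - grand) ** 2).sum()
            ss_error = ss_total - ms_rows * (n - 1) - ss_cols
            ms_error = ss_error / ((n - 1) * (k - 1))
            return (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)

        # Synthetic example: 16 subjects, 3 sessions, true subject variance 1.0
        # and session noise variance 0.25, so ICC should be near 0.8.
        rng = np.random.default_rng(0)
        subject_effect = rng.normal(0, 1.0, size=(16, 1))
        data = subject_effect + rng.normal(0, 0.5, size=(16, 3))
        print(round(icc_3_1(data), 3))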

  8. Potential of Wake-Up Radio-Based MAC Protocols for Implantable Body Sensor Networks (IBSN)—A Survey

    PubMed Central

    Karuppiah Ramachandran, Vignesh Raja; Ayele, Eyuel D.; Meratnia, Nirvana; Havinga, Paul J. M.

    2016-01-01

    With the advent of nano-technology, medical sensors and devices are becoming highly miniaturized. Consequently, the number of sensors and medical devices being implanted to accurately monitor and diagnose a disease is increasing. By measuring the symptoms and controlling a medical device as close as possible to the source, these implantable devices are able to save lives. A wireless link between medical sensors and implantable medical devices is essential in the case of closed-loop medical devices, in which symptoms of the diseases are monitored by sensors that are not placed in close proximity of the therapeutic device. Medium Access Control (MAC) is crucial to make it possible for several medical devices to communicate using a shared wireless medium in such a way that minimum delay, maximum throughput, and increased network lifetime are guaranteed. To guarantee this Quality of Service (QoS), the MAC protocols control the main sources of limited resource wastage, namely idle-listening, packet collisions, over-hearing, and packet loss. Traditional MAC protocols designed for body sensor networks are not directly applicable to Implantable Body Sensor Networks (IBSN) because of the dynamic nature of the radio channel within the human body and the strict QoS requirements of IBSN applications. Although numerous MAC protocols are available in the literature, the majority of them are designed for Body Sensor Network (BSN) and Wireless Sensor Network (WSN). To the best of our knowledge, there is so far no research paper that explores the impact of these MAC protocols specifically for IBSN. MAC protocols designed for implantable devices are still in their infancy and one of their most challenging objectives is to be ultra-low-power. One of the technological solutions to achieve this objective is to integrate the concept of Wake-up radio (WuR) into the MAC design. In this survey, we present a taxonomy of MAC protocols based on their use of WuR technology and identify their bottlenecks to be used in IBSN applications. Furthermore, we present a number of open research challenges and requirements for designing an energy-efficient and reliable wireless communication protocol for IBSN. PMID:27916822

  9. Potential of Wake-Up Radio-Based MAC Protocols for Implantable Body Sensor Networks (IBSN)-A Survey.

    PubMed

    Karuppiah Ramachandran, Vignesh Raja; Ayele, Eyuel D; Meratnia, Nirvana; Havinga, Paul J M

    2016-11-29

    With the advent of nano-technology, medical sensors and devices are becoming highly miniaturized. Consequently, the number of sensors and medical devices being implanted to accurately monitor and diagnose a disease is increasing. By measuring the symptoms and controlling a medical device as close as possible to the source, these implantable devices are able to save lives. A wireless link between medical sensors and implantable medical devices is essential in the case of closed-loop medical devices, in which symptoms of the diseases are monitored by sensors that are not placed in close proximity of the therapeutic device. Medium Access Control (MAC) is crucial to make it possible for several medical devices to communicate using a shared wireless medium in such a way that minimum delay, maximum throughput, and increased network lifetime are guaranteed. To guarantee this Quality of Service (QoS), the MAC protocols control the main sources of limited resource wastage, namely idle-listening, packet collisions, over-hearing, and packet loss. Traditional MAC protocols designed for body sensor networks are not directly applicable to Implantable Body Sensor Networks (IBSN) because of the dynamic nature of the radio channel within the human body and the strict QoS requirements of IBSN applications. Although numerous MAC protocols are available in the literature, the majority of them are designed for Body Sensor Network (BSN) and Wireless Sensor Network (WSN). To the best of our knowledge, there is so far no research paper that explores the impact of these MAC protocols specifically for IBSN. MAC protocols designed for implantable devices are still in their infancy and one of their most challenging objectives is to be ultra-low-power. One of the technological solutions to achieve this objective is to integrate the concept of Wake-up radio (WuR) into the MAC design. In this survey, we present a taxonomy of MAC protocols based on their use of WuR technology and identify their bottlenecks to be used in IBSN applications. Furthermore, we present a number of open research challenges and requirements for designing an energy-efficient and reliable wireless communication protocol for IBSN.
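
    The ultra-low-power argument for WuR can be made concrete with a back-of-the-envelope idle-listening energy comparison between a duty-cycled main radio and an always-on wake-up receiver. All power figures and the duty cycle below are assumed values chosen for illustration, not measurements from the surveyed protocols.

        P_MAIN_RX = 10e-3      # main radio listen power (W), assumed
        P_WUR_LISTEN = 10e-6   # wake-up receiver listen power (W), assumed
        DUTY_CYCLE = 0.01      # fraction of time the main radio listens, assumed

        def idle_energy_per_day(duty_cycled: bool) -> float:
            """Idle-listening energy over 24 h for the two MAC strategies."""
            seconds = 24 * 3600
            if duty_cycled:
                return P_MAIN_RX * DUTY_CYCLE * seconds
            return P_WUR_LISTEN * seconds

        print(f"duty-cycled MAC: {idle_energy_per_day(True):.2f} J/day")
        print(f"wake-up radio MAC: {idle_energy_per_day(False):.2f} J/day")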

  10. Optimization of a vitrification protocol for hatched blastocysts from the dromedary camel (Camelus dromedarius).

    PubMed

    Herrid, M; Billah, M; Malo, C; Skidmore, J A

    2016-03-01

    The objective of this study was to modify and optimize a vitrification protocol (open pulled straw) that was originally designed for human oocytes and embryos, to make it suitable for the cryopreservation of camel hatched blastocysts. The original open pulled straw protocol was a complex process, with a 15-minute exposure of oocytes/embryos to 7.5% ethylene glycol (EG) and 7.5% dimethyl sulfoxide (Me2SO) for equilibration, and cooling in 16% EG + 16% Me2SO + 1 M sucrose. Recognizing a need to better control the cryoprotectant (CPA) concentrations while avoiding toxicity to the embryos, the effects on the survival rate and developmental potential of camel embryos in vitro were investigated using two different methods of loading the CPAs into the embryos (stepwise and semicontinuous increases in concentration), two different loading temperatures/times (room temperature, ∼24 °C for 15 minutes, and body temperature, 37 °C for 3 minutes), and the replacement of Me2SO with EG alone or in combination with glycerol (Gly). A total of 145 in vivo-derived embryos were subjected to these processes, and after warming, their morphological quality, integrity, and re-expansion were assessed after 0, 2, 24, 48, 72, and 96 hours of culture. Exposure of embryos by the stepwise method was more beneficial to embryo survival than the semicontinuous process, and loading of CPAs at 37 °C with a short exposure time (3 minutes) gave an outcome comparable to the original processing at room temperature with a longer exposure time (15 minutes). The replacement of the Me2SO + EG mixture with EG only, or with a combination of EG + Gly, in the vitrification medium significantly improved the outcome on all of these evaluation criteria (P < 0.05). The modified protocol of loading EG at 37 °C for 3 minutes increased embryo survival from 67% with the original protocol to 91%, and the developmental rate from 57% to 83%, at 5 days of culture. These results were comparable to or better than those reported in humans or other species, indicating that this optimized method is well suited to any commercial embryo transfer program in the dromedary camel. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Design and Development of a Portable WiFi enabled BIA device

    NASA Astrophysics Data System (ADS)

    Križaj, D.; Baloh, M.; Brajkovič, R.; Žagar, T.

    2013-04-01

    A bioimpedance (BIA) device for the evaluation of sarcopenia - age-related muscle mass loss - was designed, developed, and evaluated. The requirements were a lightweight design; flexible, user-defined incorporation of measurement protocols; and use of the WiFi protocol for remote device control, full internet integration, and fast development and deployment of measurement protocols. The current design is based on a microcontroller with integrated AD/DA converters. The prototype system was assembled, and its operation and connectivity to different handheld devices and laptop computers were successfully tested. The designed BIA device can be accessed using TCP sockets, and once the connection is established the data transfer runs successfully at the specified speed. The accuracy of the currently developed prototype is about 5% for the impedance modulus and 5 degrees for the phase at frequencies below 20 kHz, with an unfiltered excitation signal and no additional amplifiers employed.
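
    The abstract states only that the device is controlled over TCP sockets; a minimal Python client along those lines is sketched below. The address, port, and command syntax are entirely hypothetical, since the paper does not publish its wire protocol.

        import socket

        # Hypothetical device endpoint and command; replace with the real ones.
        DEVICE_ADDR = ("192.168.1.50", 5000)

        def query_impedance(command: bytes = b"MEASURE?\n") -> bytes:
            """Open a TCP connection, send a measurement command, return the reply
            (e.g. impedance modulus and phase, in whatever format the device uses)."""
            with socket.create_connection(DEVICE_ADDR, timeout=5.0) as conn:
                conn.sendall(command)
                return conn.recv(1024)

        if __name__ == "__main__":
            print(query_impedance())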

  12. "WWW.MDTF.ORG": a World Wide Web forum for developing open-architecture, freely distributed, digital teaching file software by participant consensus.

    PubMed

    Katzman, G L; Morris, D; Lauman, J; Cochella, C; Goede, P; Harnsberger, H R

    2001-06-01

    To foster a community-supported evaluation process for open-source digital teaching file (DTF) development and maintenance. The mechanisms used to support this process include standard web browsers, web servers, forum software, and custom additions to the forum software to potentially enable a mediated voting protocol. The web server will also serve as a focal point for beta and release software distribution, which is the desired end goal of this process. We foresee that www.mdtf.org will provide for widespread distribution of open-source DTF software that will include function and interface design decisions from community participation on the website forums.

  13. Whole Building Cost and Performance Measurement: Data Collection Protocol Revision 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fowler, Kimberly M.; Spees, Kathleen L.; Kora, Angela R.

    2009-03-27

    This protocol was written for the Department of Energy's Federal Energy Management Program (FEMP) to be used by the public as a tool for assessing building cost and performance measurement. The primary audiences are sustainable design professionals, asset owners, building managers, and research professionals within the Federal sector. The protocol was developed based on the need for measured performance and cost data on sustainable design projects. Historically there has not been a significant driver in the public or private sector to quantify whole building performance in comparable terms. The deployment of sustainable design into the building sector has initiated many questions on the performance and operational cost of these buildings.

  14. A protocol of rope skipping exercise for primary school children: A pilot test

    NASA Astrophysics Data System (ADS)

    Radzi, A. N. M.; Rambely, A. S.; Chellapan, K.

    2014-06-01

    This paper aims to investigate the methods and samples used in rope skipping as an exercise approach. A systematic literature review was conducted to identify skipping performance measures in related research. The methods were compared to determine the best methodological approach for the targeted skipping-based research measures. A pilot test was performed among seven students below 12 years of age. As the outcome of the review, a skipping protocol design is proposed for 10-year-old primary school students. The proposed protocol design is to be submitted to the PPUKM Ethical Committee for approval prior to its implementation in investigating memory enhancement in relation to the designed skipping activities.

  15. On the relevance of glycolysis process on brain gliomas.

    PubMed

    Kounelakis, M G; Zervakis, M E; Giakos, G C; Postma, G J; Buydens, L M C; Kotsiakis, X

    2013-01-01

    The proposed analysis considers aspects of both statistical and biological validation of the glycolysis effect on brain gliomas, at both the genomic and metabolic levels. In particular, two independent datasets are analyzed in parallel, one comprising genomic (microarray expression) data and the other metabolomic (magnetic resonance spectroscopy imaging) data. The aim of this study is twofold: first, to show that, apart from the already studied genes (markers), other genes, such as those involved in human cell glycolysis, contribute significantly to glioma discrimination; and second, to demonstrate how the glycolysis process can open new ways towards the design of patient-specific therapeutic protocols. The results of our analysis demonstrate that the combination of genes participating in the glycolytic process (ALDOA, ALDOC, ENO2, GAPDH, HK2, LDHA, LDHB, MDH1, PDHB, PFKM, PGI, PGK1, PGM1 and PKLR) with the already known tumor suppressors (PTEN, Rb, TP53), oncogenes (CDK4, EGFR, PDGF) and HIF-1 enhances the discrimination of low- versus high-grade gliomas, providing high prediction ability in a cross-validated framework. Following these results, and supported by the biological effect of glycolytic genes on cancer cells, we address the study of glycolysis for the development of new treatment protocols.

  16. Evaluating mixed samples as a source of error in non-invasive genetic studies using microsatellites

    USGS Publications Warehouse

    Roon, David A.; Thomas, M.E.; Kendall, K.C.; Waits, L.P.

    2005-01-01

    The use of noninvasive genetic sampling (NGS) for surveying wild populations is increasing rapidly. Currently, only a limited number of studies have evaluated potential biases associated with NGS. This paper evaluates the potential errors associated with analysing mixed samples drawn from multiple animals. Most NGS studies assume that mixed samples will be identified and removed during the genotyping process. We evaluated this assumption by creating 128 mixed samples of extracted DNA from brown bear (Ursus arctos) hair samples. These mixed samples were genotyped and screened for errors at six microsatellite loci according to protocols consistent with those used in other NGS studies. Five mixed samples produced acceptable genotypes after the first screening. However, all mixed samples produced multiple alleles at one or more loci, amplified as only one of the source samples, or yielded inconsistent electropherograms by the final stage of the error-checking process. These processes could potentially reduce the number of individuals observed in NGS studies, but errors should be conservative within demographic estimates. Researchers should be aware of the potential for mixed samples and carefully design gel analysis criteria and error checking protocols to detect mixed samples.

  17. Deposition and Characterization of Thin Films on Metallic Substrates

    NASA Technical Reports Server (NTRS)

    Gatica, Jorge E.

    2005-01-01

    A CVD method was successfully developed to produce conversion coatings on aluminum alloy surfaces with reproducible results for a variety of precursors. A well-defined protocol for preparing the precursor solutions, formulated in previous research, was extended to other additives. It was demonstrated that solutions prepared following such a protocol could be used to systematically generate protective coatings on aluminum surfaces. Experiments with a variety of formulations revealed that a refined deposition protocol yields reproducible conversion coatings of controlled composition. A preliminary correlation between solution formulations and successful precursors was derived. Coatings were tested for enhancement of adhesion properties for commercial paints. A standard testing method was followed and clear trends were identified. Only one precursor was tested systematically; anticipated work on other precursors should allow a better characterization of the effect of intermetallics on the production of conversion/protective coatings on metals and ceramics. The significance of this work was the practical demonstration that chemical vapor deposition (CVD) techniques can be used to systematically generate protective/conversion coatings on non-ferrous surfaces. In order to become an effective approach to replacing chromate-based pre-treatment processes, namely in the aerospace or automobile industry, the process parameters must be defined more precisely. Moreover, the feasibility of scale-up designs necessitates a more comprehensive characterization of the fluid flow, transport phenomena, and chemical kinetics interacting in the process. Kinetic characterization showed a significantly different effect of magnesium-based precursors when compared to iron-based precursors. Future work will concentrate on refining the process through computer simulations and further experimental studies on the effect of other transition metals to induce deposition of conversion/protective films on aluminum and other metallic substrates.

  18. Zebrafish embryology and cartilage staining protocols for high school students.

    PubMed

    Emran, Farida; Brooks, Jacqueline M; Zimmerman, Steven R; Johnson, Susan L; Lue, Robert A

    2009-06-01

    The Life Sciences-Howard Hughes Medical Institute Outreach Program at Harvard University supports high school science education by offering an on-campus program for students and their teachers to participate in investigative, hands-on laboratory sessions. The outreach program has recently designed and launched a successful zebrafish embryology protocol that we present here. The main objectives of this protocol are to introduce students to zebrafish as a model research organism and to provide students with direct experience with current techniques used in embryological research. The content of the lab is designed to generate discussions on embryology, genetics, fertilization, natural selection, and animal adaptation. The protocol produces reliable results in a time-efficient manner using a minimum of reagents. The protocol presented here consists of three sections: observations of live zebrafish larvae at different developmental stages, cartilage staining of zebrafish larvae, and a mutant hunt involving identification of two zebrafish mutants (nacre and chokh). Here, we describe the protocol, show the results obtained for each section, and suggest possible alternatives for different lab settings.

  19. A proposed group management scheme for XTP multicast

    NASA Technical Reports Server (NTRS)

    Dempsey, Bert J.; Weaver, Alfred C.

    1990-01-01

    The purpose of a group management scheme is to enable its associated transfer layer protocol to be responsive to user-determined reliability requirements for multicasting. Group management (GM) must assist the client process in coordinating multicast group membership, allow the user to express the subset of the multicast group that a particular multicast distribution must reach in order to be successful (reliable), and provide the transfer layer protocol with the group membership information necessary to guarantee delivery to this subset. GM provides services and mechanisms that respond to the need of the client process or process-level management protocols to coordinate, modify, and determine attributes of the multicast group, especially membership. XTP GM provides a link between process groups and their multicast groups by maintaining a group membership database that identifies members in a name space understood by the underlying transfer layer protocol. Other attributes of the multicast group useful to both the client process and the data transfer protocol may be stored in the database. Examples include the relative dispersion, most recent update, and default delivery parameters of a group.

  20. Assisted closed-loop optimization of SSVEP-BCI efficiency

    PubMed Central

    Fernandez-Vargas, Jacobo; Pfaff, Hanns U.; Rodríguez, Francisco B.; Varona, Pablo

    2012-01-01

    We designed a novel assisted closed-loop optimization protocol to improve the efficiency of brain-computer interfaces (BCI) based on steady state visually evoked potentials (SSVEP). In traditional paradigms, control over BCI performance depends entirely on the subjects' ability to learn from the given feedback cues. By contrast, in the proposed protocol both the subject and the machine share information and control over the BCI goal. Generally, the innovative assistance consists in the delivery of online information together with the online adaptation of BCI stimuli properties. In our case, this adaptive optimization process is realized by (1) a closed-loop search for the best set of SSVEP flicker frequencies and (2) feedback of actual SSVEP magnitudes to both the subject and the machine. These closed-loop interactions between subject and machine are evaluated in real time by continuous measurement of their efficiencies, which are used as online criteria to adapt the BCI control parameters. The proposed protocol aims to compensate for variability in possibly unknown subject state and trait dimensions. In a study with N = 18 subjects, we found significant evidence that our protocol outperformed classic SSVEP-BCI control paradigms. Evidence is presented that it indeed takes interindividual variability into account: e.g., under the new protocol, baseline resting-state EEG measures predict subjects' BCI performances. This paper illustrates the promising potential of assisted closed-loop protocols in BCI systems. Their applicability might plausibly be expanded to innovative uses, e.g., as possible new diagnostic/therapeutic tools for clinical contexts and as new paradigms for basic research. PMID:23443214

  1. Assisted closed-loop optimization of SSVEP-BCI efficiency.

    PubMed

    Fernandez-Vargas, Jacobo; Pfaff, Hanns U; Rodríguez, Francisco B; Varona, Pablo

    2013-01-01

    We designed a novel assisted closed-loop optimization protocol to improve the efficiency of brain-computer interfaces (BCI) based on steady state visually evoked potentials (SSVEP). In traditional paradigms, control over BCI performance depends entirely on the subjects' ability to learn from the given feedback cues. By contrast, in the proposed protocol both the subject and the machine share information and control over the BCI goal. Generally, the innovative assistance consists in the delivery of online information together with the online adaptation of BCI stimuli properties. In our case, this adaptive optimization process is realized by (1) a closed-loop search for the best set of SSVEP flicker frequencies and (2) feedback of actual SSVEP magnitudes to both the subject and the machine. These closed-loop interactions between subject and machine are evaluated in real time by continuous measurement of their efficiencies, which are used as online criteria to adapt the BCI control parameters. The proposed protocol aims to compensate for variability in possibly unknown subject state and trait dimensions. In a study with N = 18 subjects, we found significant evidence that our protocol outperformed classic SSVEP-BCI control paradigms. Evidence is presented that it indeed takes interindividual variability into account: e.g., under the new protocol, baseline resting-state EEG measures predict subjects' BCI performances. This paper illustrates the promising potential of assisted closed-loop protocols in BCI systems. Their applicability might plausibly be expanded to innovative uses, e.g., as possible new diagnostic/therapeutic tools for clinical contexts and as new paradigms for basic research.
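
    A toy sketch of step (1) of the closed-loop search: present each candidate flicker frequency, measure the evoked magnitude, and keep the strongest responders. Here measure_ssvep_magnitude() is a stand-in for real-time EEG spectral analysis at the stimulation frequency, and the subject response curve is fabricated for illustration.

        import random

        def measure_ssvep_magnitude(freq_hz: float) -> float:
            # Placeholder: a subject-specific response curve (peak near 12 Hz)
            # plus measurement noise; a real system would analyze EEG here.
            return max(0.0, 10.0 - abs(freq_hz - 12.0)) + random.gauss(0, 0.5)

        def select_best_frequencies(candidates, n_keep=4, n_trials=5):
            """Average the evoked magnitude over trials and keep the top n_keep."""
            scores = {}
            for f in candidates:
                scores[f] = sum(measure_ssvep_magnitude(f) for _ in range(n_trials)) / n_trials
            return sorted(scores, key=scores.get, reverse=True)[:n_keep]

        candidates = [8.0, 10.0, 12.0, 14.0, 16.0, 18.0]
        print(select_best_frequencies(candidates))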

  2. Construction and Setup of a Bench-scale Algal Photosynthetic Bioreactor with Temperature, Light, and pH Monitoring for Kinetic Growth Tests.

    PubMed

    Karam, Amanda L; McMillan, Catherine C; Lai, Yi-Chun; de Los Reyes, Francis L; Sederoff, Heike W; Grunden, Amy M; Ranjithan, Ranji S; Levis, James W; Ducoste, Joel J

    2017-06-14

    The optimal design and operation of photosynthetic bioreactors (PBRs) for microalgal cultivation is essential for improving the environmental and economic performance of microalgae-based biofuel production. Models that estimate microalgal growth under different conditions can help to optimize PBR design and operation. To be effective, the growth parameters used in these models must be accurately determined. Algal growth experiments are often constrained by the dynamic nature of the culture environment, and control systems are needed to accurately determine the kinetic parameters. The first step in setting up a controlled batch experiment is live data acquisition and monitoring. This protocol outlines a process for the assembly and operation of a bench-scale photosynthetic bioreactor that can be used to conduct microalgal growth experiments. This protocol describes how to size and assemble a flat-plate, bench-scale PBR from acrylic. It also details how to configure a PBR with continuous pH, light, and temperature monitoring using a data acquisition and control unit, analog sensors, and open-source data acquisition software.
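
    A minimal sketch of the live data-acquisition step described above, assuming placeholder sensor-read functions. The actual protocol uses a data acquisition and control unit with analog pH, light, and temperature sensors plus open-source DAQ software, whose APIs are not reproduced here.

        import csv
        import time

        # Placeholder sensor reads; in practice these would query the DAQ unit.
        def read_ph(): return 7.02
        def read_temp_c(): return 25.3
        def read_light_umol(): return 210.0

        def log_conditions(path="pbr_log.csv", interval_s=60, n_samples=3):
            """Append timestamped pH/temperature/light readings to a CSV log."""
            with open(path, "w", newline="") as f:
                writer = csv.writer(f)
                writer.writerow(["timestamp", "pH", "temp_C", "light_umol_m2_s"])
                for _ in range(n_samples):
                    writer.writerow([time.time(), read_ph(), read_temp_c(),
                                     read_light_umol()])
                    time.sleep(interval_s)

        log_conditions(interval_s=1)  # short interval for demonstration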

  3. Construction and Setup of a Bench-scale Algal Photosynthetic Bioreactor with Temperature, Light, and pH Monitoring for Kinetic Growth Tests

    PubMed Central

    Karam, Amanda L.; McMillan, Catherine C.; Lai, Yi-Chun; de los Reyes, Francis L.; Sederoff, Heike W.; Grunden, Amy M.; Ranjithan, Ranji S.; Levis, James W.; Ducoste, Joel J.

    2017-01-01

    The optimal design and operation of photosynthetic bioreactors (PBRs) for microalgal cultivation is essential for improving the environmental and economic performance of microalgae-based biofuel production. Models that estimate microalgal growth under different conditions can help to optimize PBR design and operation. To be effective, the growth parameters used in these models must be accurately determined. Algal growth experiments are often constrained by the dynamic nature of the culture environment, and control systems are needed to accurately determine the kinetic parameters. The first step in setting up a controlled batch experiment is live data acquisition and monitoring. This protocol outlines a process for the assembly and operation of a bench-scale photosynthetic bioreactor that can be used to conduct microalgal growth experiments. This protocol describes how to size and assemble a flat-plate, bench-scale PBR from acrylic. It also details how to configure a PBR with continuous pH, light, and temperature monitoring using a data acquisition and control unit, analog sensors, and open-source data acquisition software. PMID:28654054

  4. Laser direct-write for fabrication of three-dimensional paper-based devices.

    PubMed

    He, P J W; Katis, I N; Eason, R W; Sones, C L

    2016-08-16

    We report the use of a laser-based direct-write (LDW) technique that allows the design and fabrication of three-dimensional (3D) structures within a paper substrate that enables implementation of multi-step analytical assays via a 3D protocol. The technique is based on laser-induced photo-polymerisation, and through adjustment of the laser writing parameters such as the laser power and scan speed we can control the depths of hydrophobic barriers that are formed within a substrate which, when carefully designed and integrated, produce 3D flow paths. So far, we have successfully used this depth-variable patterning protocol for stacking and sealing of multi-layer substrates, for assembly of backing layers for two-dimensional (2D) lateral flow devices and finally for fabrication of 3D devices. Since the 3D flow paths can also be formed via a single laser-writing process by controlling the patterning parameters, this is a distinct improvement over other methods that require multiple complicated and repetitive assembly procedures. This technique is therefore suitable for cheap, rapid and large-scale fabrication of 3D paper-based microfluidic devices.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitzpatrick, Stephen W.

    This project involved a three-year program managed by BioMetics, Inc. (Waltham, MA) to demonstrate the commercial feasibility of Biofine thermochemical process technology for conversion of cellulose-containing wastes or renewable materials into levulinic acid, a versatile platform chemical. The program, commencing in October 1995, involved the design, procurement, construction and operation of a plant utilizing the Biofine process to convert 1 dry ton per day of paper sludge waste. The plant was successfully designed, constructed, and commissioned in 1997. It was operated for a period of one year on paper sludge from a variety of source paper mills to collect data to verify the design for a commercial-scale plant. Operational results were obtained for four different feedstock varieties. Stable, continuous operation was achieved for two of the feedstocks. Continuous operation of the plant at demonstration scale provided the opportunity for process optimization, development of operational protocols, operator training, and identification of suitable materials of construction for scale-up to commercial operation. Separated fiber from municipal waste was also successfully processed. The project team consisted of BioMetics Inc., Great Lakes Chemical Corporation (West Lafayette, IN), and the New York State Energy Research and Development Authority (Albany, NY).

  6. A Taxonomy of Attacks on the DNP3 Protocol

    NASA Astrophysics Data System (ADS)

    East, Samuel; Butts, Jonathan; Papa, Mauricio; Shenoi, Sujeet

    Distributed Network Protocol (DNP3) is the predominant SCADA protocol in the energy sector - more than 75% of North American electric utilities currently use DNP3 for industrial control applications. This paper presents a taxonomy of attacks on the protocol. The attacks are classified based on targets (control center, outstation devices and network/communication paths) and threat categories (interception, interruption, modification and fabrication). To facilitate risk analysis and mitigation strategies, the attacks are associated with the specific DNP3 protocol layers they exploit. Also, the operational impact of the attacks is categorized in terms of three key SCADA objectives: process confidentiality, process awareness and process control. The attack taxonomy clarifies the nature and scope of the threats to DNP3 systems, and can provide insights into the relative costs and benefits of implementing mitigation strategies.
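
    The taxonomy's classification axes map naturally onto a small record type; the sketch below encodes them in Python with an illustrative attack instance. The field values shown are hypothetical and are not drawn from the paper's attack catalog.

        from dataclasses import dataclass

        @dataclass
        class DNP3Attack:
            """One entry in an attack taxonomy, classified along the paper's axes."""
            name: str
            target: str          # control center | outstation | network path
            threat: str          # interception | interruption | modification | fabrication
            layer: str           # DNP3 layer exploited, e.g. data link, application
            impact: tuple        # affected objectives: confidentiality, awareness, control

        # Illustrative (hypothetical) instance:
        spoofed_unsolicited = DNP3Attack(
            name="spoofed unsolicited response",
            target="control center",
            threat="fabrication",
            layer="application",
            impact=("process awareness",),
        )
        print(spoofed_unsolicited)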

  7. Update on the MRI Core of the Alzheimer's Disease Neuroimaging Initiative

    PubMed Central

    Jack, Clifford R; Bernstein, Matt A; Borowski, Bret J; Gunter, Jeffrey L; Fox, Nick C; Thompson, Paul M; Schuff, Norbert; Krueger, Gunnar; Killiany, Ronald J; DeCarli, Charles S; Dale, Anders M; Weiner, Michael W

    2010-01-01

    Functions of the ADNI MRI core fall into three categories: (1) those of the central MRI core lab at Mayo Clinic, Rochester, Minnesota, needed to generate high quality MRI data in all subjects at each time point; (2) those of the funded ADNI MRI core imaging analysis groups responsible for analyzing the MRI data, and (3) the joint function of the entire MRI core in designing and problem solving MR image acquisition, pre-processing and analyses methods. The primary objective of ADNI was and continues to be improving methods for clinical trials in Alzheimer's disease. Our approach to the present (“ADNI-GO”) and future (“ADNI-2”, if funded) MRI protocol will be to maintain MRI methodological consistency in previously enrolled “ADNI-1” subjects who are followed longitudinally in ADNI-GO and ADNI-2. We will modernize and expand the MRI protocol for all newly enrolled ADNI-GO and ADNI-2 subjects. All newly enrolled subjects will be scanned at 3T with a core set of three sequence types: 3D T1-weighted volume, FLAIR, and a long TE gradient echo volumetric acquisition for micro hemorrhage detection. In addition to this core ADNI-GO and ADNI-2 protocol, we will perform vendor specific pilot sub-studies of arterial spin labeling perfusion, resting state functional connectivity and diffusion tensor imaging. One each of these sequences will be added to the core protocol on systems from each MRI vendor. These experimental sub-studies are designed to demonstrate the feasibility of acquiring useful data in a multi-center (but single vendor) setting for these three emerging MRI applications. PMID:20451869

  8. Update on the magnetic resonance imaging core of the Alzheimer's disease neuroimaging initiative.

    PubMed

    Jack, Clifford R; Bernstein, Matt A; Borowski, Bret J; Gunter, Jeffrey L; Fox, Nick C; Thompson, Paul M; Schuff, Norbert; Krueger, Gunnar; Killiany, Ronald J; Decarli, Charles S; Dale, Anders M; Carmichael, Owen W; Tosun, Duygu; Weiner, Michael W

    2010-05-01

    Functions of the Alzheimer's Disease Neuroimaging Initiative (ADNI) magnetic resonance imaging (MRI) core fall into three categories: (1) those of the central MRI core laboratory at Mayo Clinic, Rochester, Minnesota, needed to generate high quality MRI data in all subjects at each time point; (2) those of the funded ADNI MRI core imaging analysis groups responsible for analyzing the MRI data; and (3) the joint function of the entire MRI core in designing and problem solving MR image acquisition, pre-processing, and analyses methods. The primary objective of ADNI was and continues to be improving methods for clinical trials in Alzheimer's disease. Our approach to the present ("ADNI-GO") and future ("ADNI-2," if funded) MRI protocol will be to maintain MRI methodological consistency in the previously enrolled "ADNI-1" subjects who are followed up longitudinally in ADNI-GO and ADNI-2. We will modernize and expand the MRI protocol for all newly enrolled ADNI-GO and ADNI-2 subjects. All newly enrolled subjects will be scanned at 3T with a core set of three sequence types: 3D T1-weighted volume, FLAIR, and a long TE gradient echo volumetric acquisition for micro hemorrhage detection. In addition to this core ADNI-GO and ADNI-2 protocol, we will perform vendor-specific pilot sub-studies of arterial spin-labeling perfusion, resting state functional connectivity, and diffusion tensor imaging. One each of these sequences will be added to the core protocol on systems from each MRI vendor. These experimental sub-studies are designed to demonstrate the feasibility of acquiring useful data in a multicenter (but single vendor) setting for these three emerging MRI applications. Copyright 2010 The Alzheimer

  9. Using APEX to Model Anticipated Human Error: Analysis of a GPS Navigational Aid

    NASA Technical Reports Server (NTRS)

    VanSelst, Mark; Freed, Michael; Shefto, Michael (Technical Monitor)

    1997-01-01

    The interface development process can be dramatically improved by predicting design-facilitated human error at an early stage in the design process. The approach we advocate is to SIMULATE the behavior of a human agent carrying out tasks with a well-specified user interface, ANALYZE the simulation for instances of human error, and then REFINE the interface or protocol to minimize predicted error. This approach, incorporated into the APEX modeling architecture, differs from past approaches to human simulation in its emphasis on error rather than, e.g., learning rate or speed of response. The APEX model consists of two major components: (1) a powerful action selection component capable of simulating behavior in complex, multiple-task environments; and (2) a resource architecture which constrains cognitive, perceptual, and motor capabilities to within empirically demonstrated limits. The model mimics human errors arising from interactions between limited human resources and elements of the computer interface whose design fails to anticipate those limits. We analyze the design of a hand-held Global Positioning System (GPS) device used for tactical and navigational decisions in small yacht races. The analysis demonstrates how human system modeling can be an effective design aid, helping to accelerate the process of refining a product (or procedure).

  10. Developing psychotherapists’ competence through clinical supervision: protocol for a qualitative study of supervisory dyads

    PubMed Central

    2013-01-01

    Background Mental health professionals face unique demands and stressors in their work, resulting in high rates of burnout and distress. Clinical supervision is a widely adopted and valued mechanism of professional support, development, and accountability, despite the very limited evidence of specific impacts on therapist or client outcomes. The current study aims to address this by exploring how psychotherapists develop competence through clinical supervision and what impact this has on the supervisees’ practice and their clients’ outcomes. This paper provides a rationale for the study and describes the protocol for an in-depth qualitative study of supervisory dyads, highlighting how it addresses gaps in the literature. Methods/Design The study of 16–20 supervisor-supervisee dyads uses a qualitative mixed method design, with two phases. In phase one, supervisors who are nominated as expert by their peers are interviewed about their supervision practice. In phase two, supervisors record a supervision session with a consenting supervisee; interpersonal process recall interviews are conducted separately with supervisor and supervisee to reflect in depth on the teaching and learning processes occurring. All interviews will be transcribed, coded and analysed to identify the processes that build competence, using a modified form of Consensual Qualitative Research (CQR) strategies. Using a theory-building case study method, data from both phases of the study will be integrated to develop a model describing the processes that build competence and support wellbeing in practising psychotherapists, reflecting the accumulated wisdom of the expert supervisors. Discussion The study addresses past study limitations by examining expert supervisors and their supervisory interactions, by reflecting on actual supervision sessions, and by using dyadic analysis of the supervisory pairs. The study findings will inform the development of future supervision training and practice and identify fruitful avenues for future research. PMID:23298408

  11. Co-Designing Mobile Apps to Assist in Clinical Nursing Education: A Study Protocol.

    PubMed

    O'Connor, Siobhan; Andrews, Tom

    2016-01-01

    The use of mobile applications (apps) to train health professionals is gaining momentum as the benefits of mobile learning (mLearning) become apparent in complex clinical environments. However, most educational apps are generic, off-the-shelf pieces of software that do not take into consideration the unique needs of nursing students. The proposed study will apply a user-centred design process to create a tailored mobile app for nursing students to learn and apply clinical skills in practice. The app will be piloted and evaluated to understand how nursing students use mobile technology in clinical settings to support their learning and educational needs.

  12. Event detection in an assisted living environment.

    PubMed

    Stroiescu, Florin; Daly, Kieran; Kuris, Benjamin

    2011-01-01

    This paper presents the design of a wireless event-detection and in-building location-awareness system. The system architecture is based on a body-worn sensor that detects events, such as falls, where they occur in an assisted living environment. This process involves developing event-detection algorithms and transmitting detected events wirelessly to an in-house network based on the 802.15.4 protocol. The network then generates alerts both in the assisted living facility and remotely to an offsite monitoring facility. The focus of this paper is on the design of the system architecture and the compliance challenges in applying this technology.
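
    As a rough illustration of the detect-then-alert flow described above, the sketch below flags a fall when a near-free-fall dip in acceleration magnitude is followed by an impact spike, then hands the event to a stand-in alerting function. The thresholds, window size, and the print-based "radio" are illustrative assumptions; the paper's actual algorithms and 802.15.4 stack are not described in the abstract.

```python
import math
from collections import deque

G = 9.81
FREE_FALL = 0.5 * G   # illustrative thresholds, not the paper's values
IMPACT = 2.5 * G

def detect_fall(samples, window=25):
    """Flag a fall when a near-free-fall dip is followed by an impact spike."""
    recent = deque(maxlen=window)
    for ax, ay, az in samples:
        recent.append(math.sqrt(ax * ax + ay * ay + az * az))
        mags = list(recent)
        if min(mags) < FREE_FALL and max(mags) > IMPACT:
            if mags.index(min(mags)) < mags.index(max(mags)):  # dip precedes spike
                return True
    return False

def send_alert(event):
    """Stand-in for the 802.15.4 hop to the in-house network and offsite relay."""
    for target in ("facility-station", "offsite-monitor"):
        print(f"ALERT -> {target}: {event}")

# Synthetic trace: steady 1 g, a free-fall dip, an impact spike, then rest.
trace = [(0, 0, 9.8)] * 10 + [(0, 0, 1.0)] * 5 + [(0, 0, 30.0)] + [(0, 0, 9.8)] * 10
if detect_fall(trace):
    send_alert({"type": "fall", "room": "B12"})
```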

  13. GCS programmer's manual

    NASA Technical Reports Server (NTRS)

    Lowman, Douglas S.; Withers, B. Edward; Shagnea, Anita M.; Dent, Leslie A.; Hayhurst, Kelly J.

    1990-01-01

    A variety of instructions to be used in the development of implementations of software for the Guidance and Control Software (GCS) project is described. This document fulfills the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, 'Software Considerations in Airborne Systems and Equipment Certification' requirements for document No. 4, which specifies the information necessary for understanding and programming the host computer, and document No. 12, which specifies the software design and implementation standards that are applicable to the software development and testing process. Information on the following subjects is contained: activity recording, communication protocol, coding standards, change management, error handling, design standards, problem reporting, module testing logs, documentation formats, accuracy requirements, and programmer responsibilities.

  14. OSI-compatible protocols for mobile-satellite communications: The AMSS experience

    NASA Technical Reports Server (NTRS)

    Moher, Michael

    1990-01-01

    The protocol structure of the international aeronautical mobile satellite service (AMSS) is reviewed with emphasis on those aspects of protocol performance, validation, and conformance which are peculiar to mobile services. This is in part an analysis of what can be learned from the AMSS experience with protocols which is relevant to the design of other mobile satellite data networks, e.g., land mobile.

  15. CDC WONDER: a cooperative processing architecture for public health.

    PubMed Central

    Friede, A; Rosen, D H; Reid, J A

    1994-01-01

    CDC WONDER is an information management architecture designed for public health. It provides access to information and communications without the user's needing to know the location of data or communication pathways and mechanisms. CDC WONDER users have access to extractions from some 40 databases; electronic mail (e-mail); and surveillance data processing. System components include the Remote Client, the Communications Server, the Queue Managers, and Data Servers and Process Servers. The Remote Client software resides in the user's machine; other components are at the Centers for Disease Control and Prevention (CDC). The Remote Client, the Communications Server, and the Applications Server provide access to the information and functions in the Data Servers and Process Servers. The system architecture is based on cooperative processing, and components are coupled via pure message passing, using several protocols. This architecture allows flexibility in the choice of hardware and software. One system limitation is that final results from some subsystems are obtained slowly. Although designed for public health, CDC WONDER could be useful for other disciplines that need flexible, integrated information exchange. PMID:7719813
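
    The coupling style the abstract names, components connected by pure message passing, can be sketched in a few lines. Below, a toy "data server" and "client" share only a message format and a pair of queues; the component names and message fields are hypothetical, not CDC WONDER's actual interfaces.

```python
import queue
import threading

# Each component owns an inbox and knows nothing about the others' location
# or implementation; only the message format couples them.
inboxes = {"data_server": queue.Queue(), "client": queue.Queue()}

def data_server():
    while True:
        msg = inboxes["data_server"].get()
        if msg["type"] == "query":
            rows = [("1994", 42)]   # pretend database extraction
            inboxes[msg["reply_to"]].put({"type": "result", "rows": rows})
        elif msg["type"] == "shutdown":
            break

threading.Thread(target=data_server, daemon=True).start()
inboxes["data_server"].put({"type": "query", "dataset": "mortality", "reply_to": "client"})
print(inboxes["client"].get(timeout=5))   # {'type': 'result', 'rows': [('1994', 42)]}
inboxes["data_server"].put({"type": "shutdown"})
```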

  16. Small-molecule ligand docking into comparative models with Rosetta

    PubMed Central

    Combs, Steven A; DeLuca, Samuel L; DeLuca, Stephanie H; Lemmon, Gordon H; Nannemann, David P; Nguyen, Elizabeth D; Willis, Jordan R; Sheehan, Jonathan H; Meiler, Jens

    2017-01-01

    Structure-based drug design is frequently used to accelerate the development of small-molecule therapeutics. Although substantial progress has been made in X-ray crystallography and nuclear magnetic resonance (NMR) spectroscopy, the availability of high-resolution structures is limited owing to the frequent inability to crystallize or obtain sufficient NMR restraints for large or flexible proteins. Computational methods can be used to both predict unknown protein structures and model ligand interactions when experimental data are unavailable. This paper describes a comprehensive and detailed protocol using the Rosetta modeling suite to dock small-molecule ligands into comparative models. In the protocol presented here, we review the comparative modeling process, including sequence alignment, threading and loop building. Next, we cover docking a small-molecule ligand into the protein comparative model. In addition, we discuss criteria that can improve ligand docking into comparative models. Finally, and importantly, we present a strategy for assessing model quality. The entire protocol is presented on a single example selected solely for didactic purposes. The results are therefore not representative and do not replace benchmarks published elsewhere. We also provide an additional tutorial so that the user can gain hands-on experience in using Rosetta. The protocol should take 5–7 h, with additional time allocated for computer generation of models. PMID:23744289

  17. Prevention of Osmotic Injury to Human Umbilical Vein Endothelial Cells for Biopreservation: A First Step Toward Biobanking of Endothelial Cells for Vascular Tissue Engineering.

    PubMed

    Niu, Dan; Zhao, Gang; Liu, Xiaoli; Zhou, Ping; Cao, Yunxia

    2016-03-01

    High-survival-rate cryopreservation of endothelial cells plays a critical role in vascular tissue engineering, and minimizing osmotic injury is the first step toward successful cryopreservation. We designed a low-cost, easy-to-use, microfluidics-based microperfusion chamber to investigate the osmotic responses of human umbilical vein endothelial cells (HUVECs) at different temperatures, and then optimized the protocols for using cryoprotective agents (CPAs) to minimize osmotic injuries and improve processing before freezing and after thawing. The fundamental cryobiological parameters were measured using the microperfusion chamber, and the protocols optimized with these parameters were then confirmed by survival evaluation and cell proliferation experiments. It was revealed for the first time that HUVECs have an unusually small permeability coefficient for Me2SO. Even at concentrations well established for slow freezing of cells (1.5 M), one-step removal of CPAs from HUVECs might result in inevitable osmotic injuries, indicating that multiple-step removal is essential. Further experiments revealed that multistep removal of 1.5 M Me2SO at 25°C was the best protocol investigated, in good agreement with theory. These results should prove invaluable for the optimization of cryopreservation protocols for HUVECs.
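
    The osmotic argument in this abstract (slowly permeating Me2SO makes one-step removal dangerous) follows from the standard two-parameter transport model, sketched below. The parameter values are illustrative placeholders, not the coefficients measured in the study; the point is only the qualitative contrast between one-step and staged dilution.

```python
# Standard two-parameter osmotic transport model (water + permeating CPA),
# used here only to contrast one-step vs. multistep Me2SO removal.  All
# parameter values are illustrative, not the coefficients measured in the study.

def simulate_removal(steps, Lp=0.1, Ps=0.005, dt=0.1, t_hold=600.0):
    """Return the peak relative cell volume reached during CPA removal.

    steps: extracellular Me2SO concentrations (osm/L) applied in sequence.
    Lp:    hydraulic conductivity; Ps: CPA permeability (small for Me2SO
           in HUVECs, per the abstract).  Units are normalized/illustrative.
    """
    V0 = 1.0                      # isotonic cell volume (normalized)
    Vw = 0.7 * V0                 # osmotically active water volume
    Ns = 1.5 * Vw                 # intracellular CPA (starts equilibrated at 1.5 M)
    Ni = 0.3 * Vw                 # impermeant intracellular solutes (fixed osmoles)
    peak = 1.0
    for c_out in steps:
        t = 0.0
        while t < t_hold:
            c_in = Ns / Vw
            osm_in = c_in + Ni / Vw
            osm_out = c_out + 0.3                      # CPA + isotonic salts outside
            Vw += -Lp * (osm_out - osm_in) * dt        # water follows the gradient
            Ns += Ps * (c_out - c_in) * dt             # CPA crosses the membrane slowly
            peak = max(peak, (Vw + 0.3 * V0) / V0)     # water + osmotically inactive part
            t += dt
    return peak

print("one-step :", round(simulate_removal([0.0]), 2))            # large swelling excursion
print("multistep:", round(simulate_removal([1.0, 0.5, 0.0]), 2))  # stays nearer tolerable limits
```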

  18. Accelerated In Vitro Degradation of Optically Clear Low β-Sheet Silk Films by Enzyme-Mediated Pretreatment

    PubMed Central

    Shang, Ke; Rnjak-Kovacina, Jelena; Lin, Yinan; Hayden, Rebecca S.; Tao, Hu; Kaplan, David L.

    2013-01-01

    Purpose: To design patterned, transparent silk films with fast degradation rates for the purpose of tissue engineering corneal stroma. Methods: β-sheet (crystalline) content of silk films was decreased significantly by using a short water annealing time. Additionally, a protocol combining short water annealing time with enzymatic pretreatment of silk films with protease XIV was developed. Results: Low β-sheet content (17%–18%) and enzymatic pretreatment provided film stability in aqueous environments and accelerated degradation of the silk films in the presence of human corneal fibroblasts in vitro. The results demonstrate a direct relationship between reduced β-sheet content and enzymatic pretreatment, and overall degradation rate of the protein films. Conclusions: The novel protocol developed here provides new approaches to modulate the regeneration rate of silk biomaterials for corneal tissue regeneration needs. Translational Relevance: Patterned silk protein films possess desirable characteristics for corneal tissue engineering, including optical transparency, biocompatibility, cell alignment, and tunable mechanical properties, but current fabrication protocols do not provide adequate degradation rates to match the regeneration properties of the human cornea. This novel processing protocol makes silk films more suitable for the construction of human corneal stroma tissue and a promising way to tune silk film degradation properties to match corneal tissue regeneration. PMID:24049717

  20. Analytical platform for metabolome analysis of microbial cells using methyl chloroformate derivatization followed by gas chromatography-mass spectrometry.

    PubMed

    Smart, Kathleen F; Aggio, Raphael B M; Van Houtte, Jeremy R; Villas-Bôas, Silas G

    2010-09-01

    This protocol describes an analytical platform for the analysis of intra- and extracellular metabolites of microbial cells (yeast, filamentous fungi and bacteria) using gas chromatography-mass spectrometry (GC-MS). The protocol is subdivided into sampling, sample preparation, chemical derivatization of metabolites, GC-MS analysis and data processing and analysis. This protocol uses two robust quenching methods for microbial cultures, the first of which, cold glycerol-saline quenching, causes reduced leakage of intracellular metabolites, thus allowing a more reliable separation of intra- and extracellular metabolites with simultaneous stopping of cell metabolism. The second, fast filtration, is specifically designed for quenching filamentous micro-organisms. These sampling techniques are combined with an easy sample-preparation procedure and a fast chemical derivatization reaction using methyl chloroformate. This reaction takes place at room temperature, in aqueous medium, and is less prone to matrix effect compared with other derivatizations. This protocol takes an average of 10 d to complete and enables the simultaneous analysis of hundreds of metabolites from the central carbon metabolism (amino and nonamino organic acids, phosphorylated organic acids and fatty acid intermediates) using an in-house MS library and a data analysis pipeline consisting of two free software programs (Automated Mass Deconvolution and Identification System (AMDIS) and R).

  1. Optimization of a Sample Processing Protocol for Recovery of ...

    EPA Pesticide Factsheets

    Following a release of Bacillus anthracis spores into the environment, there is a potential for lasting environmental contamination in soils. There is a need for detection protocols for B. anthracis in environmental matrices; however, identification of B. anthracis within soil is a difficult task. Processing soil samples helps to remove debris, chemical components, and biological impurities that can interfere with microbiological detection. This study aimed to optimize a previously used indirect processing protocol, which included a series of washing and centrifugation steps.

  2. Better movers and thinkers (BMT): A quasi-experimental study into the impact of physical education on children's cognition-A study protocol.

    PubMed

    Dalziell, Andrew; Boyle, James; Mutrie, Nanette

    2015-01-01

    This study will extend a pilot study by evaluating the impact of a novel approach to PE, Better Movers and Thinkers (BMT), on students' cognition, physical activity habits, and gross motor coordination (GMC). The study will involve six mainstream state schools with students aged 9-11 years. Three schools will be allocated to the intervention condition and three to the control condition. The design of the study is a 16-week intervention with pre-, post- and 6-month follow-up measurements taken using the Cognitive Assessment System (CAS), GMC tests, and the Physical Activity Habits Questionnaire for Children (PAQ-C). Qualitative data will be gathered using student focus groups and class teacher interviews in each of the six schools. ANCOVA will be used to evaluate any effect of the intervention, comparing pre-test scores with post-test scores and then pre-test scores with 6-month follow-up scores. Qualitative data will be analysed through an iterative process using grounded theory. This protocol provides the details of the rationale and design of the study and details of the intervention, outcome measures, and the recruitment process. The study will address gaps within current research by evaluating whether a change of approach in the delivery of PE within schools has an effect on children's cognition, PA habits, and GMC within a Scottish setting.

  3. Covalent docking of selected boron-based serine beta-lactamase inhibitors

    NASA Astrophysics Data System (ADS)

    Sgrignani, Jacopo; Novati, Beatrice; Colombo, Giorgio; Grazioso, Giovanni

    2015-05-01

    AmpC β-lactamase is a hydrolytic enzyme conferring resistance to β-lactam antibiotics in multiple Gram-negative bacteria. Therefore, identification of non-β-lactam compounds able to inhibit the enzyme is crucial for the development of novel antibacterial therapies. In general, AmpC inhibitors have to engage the highly solvent-exposed catalytic site of the enzyme. Therefore, understanding the implications of ligand-protein induced fit and water-mediated interactions behind the inhibitor-enzyme recognition process is fundamental for undertaking a structure-based drug design process. Here, we focus on boronic acids, a promising class of beta-lactamase covalent inhibitors. First, we optimized a docking protocol able to reproduce the experimentally determined binding mode of AmpC inhibitors bearing a boronic group. This goal was pursued by (1) performing rigid and flexible docking calculations aiming to establish the role of the side chain conformations; and (2) investigating the role of specific water molecules in shaping the enzyme active site and mediating ligand-protein interactions. Our calculations showed that some water molecules, conserved in the majority of the considered X-ray structures, are needed to correctly predict the binding pose of known covalent AmpC inhibitors. On this basis, we formalized our findings in a docking and scoring protocol that could be useful for the structure-based design of new boronic acid AmpC inhibitors.

  4. An empirical evaluation of graphical interfaces to support flight planning

    NASA Technical Reports Server (NTRS)

    Smith, Philip J.; Mccoy, Elaine; Layton, Chuck; Bihari, Tom

    1995-01-01

    Whether optimization techniques or expert systems technologies are used, the underlying inference processes and the model or knowledge base for a computerized problem-solving system are likely to be incomplete for any given complex, real-world task. To deal with the resultant brittleness, it has been suggested that 'cooperative' rather than 'automated' problem-solving systems be designed. Such cooperative systems are proposed to explicitly enhance the collaboration of people and the computer system when working in partnership to solve problems. This study evaluates the impact of alternative design concepts on the performance of airline pilots interacting with such a cooperative system designed to support enroute flight planning. Thirty pilots were studied using three different versions of the system. The results clearly demonstrate that different system design concepts can strongly influence the cognitive processes of users. Indeed, one of the designs studied caused four times as many pilots to accept a poor flight amendment. Based on think-aloud protocols, cognitive models are proposed to account for how features of the computer system interacted with specific types of scenarios to influence exploration and decision-making by the pilots. The results are then used to develop recommendations for guiding the design of cooperative systems.

  5. Microfluidic Transduction Harnesses Mass Transport Principles to Enhance Gene Transfer Efficiency.

    PubMed

    Tran, Reginald; Myers, David R; Denning, Gabriela; Shields, Jordan E; Lytle, Allison M; Alrowais, Hommood; Qiu, Yongzhi; Sakurai, Yumiko; Li, William C; Brand, Oliver; Le Doux, Joseph M; Spencer, H Trent; Doering, Christopher B; Lam, Wilbur A

    2017-10-04

    Ex vivo gene therapy using lentiviral vectors (LVs) is a proven approach to treat and potentially cure many hematologic disorders and malignancies but remains stymied by cumbersome, cost-prohibitive, and scale-limited production processes that cannot meet the demands of current clinical protocols for widespread clinical utilization. These limitations in LV manufacture are compounded by inefficient transduction protocols that require significant excess amounts of vector. Herein, we describe a microfluidic, mass transport-based approach that overcomes the diffusion limitations of current transduction platforms to enhance LV gene transfer kinetics and efficiency. This novel ex vivo LV transduction platform is flexible in design, easy to use, scalable, and compatible with standard cell transduction reagents and LV preparations. Using hematopoietic cell lines, primary human T cells, and primary hematopoietic stem and progenitor cells (HSPCs) of both murine (Sca-1+) and human (CD34+) origin, microfluidic transduction using clinically processed LVs occurs up to 5-fold faster and requires as little as one-twentieth of the LV. As an in vivo validation of the microfluidic-based transduction technology, HSPC gene therapy was performed in hemophilia A mice using limiting amounts of LV. Compared to the standard static well-based transduction protocols, only animals transplanted with microfluidic-transduced cells displayed clotting levels restored to normal. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  6. Validation of the Preverbal Visual Assessment (PreViAs) questionnaire.

    PubMed

    García-Ormaechea, Inés; González, Inmaculada; Duplá, María; Andres, Eva; Pueyo, Victoria

    2014-10-01

    Visual cognitive integrative functions need to be evaluated by a behavioral assessment, which requires an experienced evaluator. The Preverbal Visual Assessment (PreViAs) questionnaire was designed to evaluate these functions, both in the general pediatric population and in children at high risk of visual cognitive problems, through primary caregivers' answers. We aimed to validate the PreViAs questionnaire by comparing caregiver reports with results from a comprehensive clinical protocol. A total of 220 infants (<2 years old) were divided into two groups according to visual development, as determined by the clinical protocol. Their primary caregivers completed the PreViAs questionnaire, which consists of 30 questions related to one or more visual domains: visual attention, visual communication, visual-motor coordination, and visual processing. Questionnaire answers were compared with the results of behavioral assessments performed by three pediatric ophthalmologists. The clinical protocol classified 128 infants as having normal visual maturation and 92 as having abnormal visual maturation. The specificity of the PreViAs questionnaire was >80%, and sensitivity was 64%-79%. More than 80% of the infants were correctly classified, and test-retest reliability exceeded 0.9 for all domains. The PreViAs questionnaire is useful to detect abnormal visual maturation in infants from birth to 24 months of age. It improves the anamnesis process in infants at risk of visual dysfunctions. Copyright © 2014. Published by Elsevier Ireland Ltd.

  7. Simulation as a Tool to Facilitate Practice Changes in Teams Taking Care of Patients Under Investigation for Ebola Virus Disease in Spain.

    PubMed

    Rojo, Elena; Oruña, Clara; Sierra, Dolores; García, Gema; Del Moral, Ignacio; Maestre, Jose M

    2016-04-01

    We analyzed the impact of simulation-based training on clinical practice and work processes on teams caring for patients with possible Ebola virus disease (EVD) in Cantabria, Spain. The Government of Spain set up a special committee for the management of EVD, and the Spanish Ministry of Health and foreign health services created an action protocol. Each region is responsible for selecting a reference hospital and an in-house care team to care for patients under investigation. Laboratory-confirmed cases of EVD have to be transferred to the Carlos III Health Institute in Madrid. Predeployment training and follow-up support are required to help personnel work safely and effectively. Simulation-based scenarios were designed to give staff the opportunity to practice before encountering a real-life situation. Lessons learned by each team during debriefings were listed, and a survey administered 3 months later assessed the implementation of practice and system changes. Implemented changes were related to clinical practice (eg, teamwork principles application), protocol implementation (eg, addition of new processes and rewriting of confusing parts), and system and workflow (eg, change of shift schedule and rearrangement of room equipment). Simulation can be used to detect needed changes in protocol or guidelines or can be adapted to meet the needs of a specific team.

  8. Development of a dynamic quality assurance testing protocol for multisite clinical trial DCE-CT accreditation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Driscoll, B.; Keller, H.; Jaffray, D.

    2013-08-15

    Purpose: Credentialing can have an impact on whether or not a clinical trial produces useful, quality data that are comparable between various institutions and scanners. With the recent increase in dynamic contrast enhanced-computed tomography (DCE-CT) usage as a companion biomarker in clinical trials, effective quality assurance and control methods are required to ensure there is minimal deviation in the results between different scanners and protocols at various institutions. This paper attempts to address this problem by utilizing a dynamic flow imaging phantom to develop and evaluate a DCE-CT quality assurance (QA) protocol. Methods: A previously designed flow phantom, capable of producing predictable and reproducible time-concentration curves from contrast injection, was fully validated and then utilized to design a DCE-CT QA protocol. The QA protocol involved a set of quantitative metrics including injected and total mass error, as well as goodness-of-fit comparison to the known truth concentration curves. An additional region of interest (ROI) sensitivity analysis was also developed to provide additional detail on intrascanner variability and determine appropriate ROI sizes for quantitative analysis. Both the QA protocol and ROI sensitivity analysis were utilized to test variations in DCE-CT results using different imaging parameters (tube voltage and current) as well as alternate reconstruction methods and imaging techniques. The developed QA protocol and ROI sensitivity analysis were then applied at three institutions that were part of a clinical trial involving DCE-CT, and the results were compared. Results: The inherent robustness of the phantom was determined through calculation of the total intraday variability, which was less than 2.2 ± 1.1% (total calculated output contrast mass error) with a goodness of fit (R²) greater than 0.99 ± 0.0035 (n = 10). The DCE-CT QA protocol was capable of detecting significant deviations from the expected phantom result when scanning at low mAs and low kVp in terms of quantitative metrics (injected mass error 15.4%), goodness of fit (R² of 0.91), and ROI sensitivity (increase in minimum input function ROI radius by 146 ± 86%). These tests also confirmed that the ASIR reconstruction process was beneficial in reducing noise without substantially increasing partial volume effects and that vendor-specific modes (e.g., axial shuttle) did not significantly affect the phantom results. The phantom and QA protocol were finally able to quickly (<90 min) and successfully validate the DCE-CT imaging protocol utilized at the three separate institutions of a multicenter clinical trial, thereby enhancing confidence in the patient data collected. Conclusions: A DCE QA protocol was developed that, in combination with a dynamic multimodality flow phantom, allows intrascanner variability to be separated from other sources of variability, such as the impact of injection protocol and ROI selection. This provides a valuable resource that can be utilized at various clinical trial institutions to test conformance with imaging protocols and accuracy requirements, as well as ensure that the scanners are performing as expected for dynamic scans.
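
    The quantitative metrics named above (mass error against the known injected contrast, and goodness of fit against the truth time-concentration curve) reduce to a few lines of arithmetic. The sketch below computes them on synthetic data; the curve shape, flow rate, and flag thresholds are illustrative, not the study's values.

```python
import numpy as np

def qa_metrics(measured, truth, flow_rate, dt):
    """Total contrast-mass error (%) and goodness of fit (R^2) against the
    phantom's known truth time-concentration curve."""
    mass_meas = measured.sum() * flow_rate * dt    # integrate concentration x flow
    mass_true = truth.sum() * flow_rate * dt
    mass_err = 100.0 * (mass_meas - mass_true) / mass_true
    ss_res = ((measured - truth) ** 2).sum()
    ss_tot = ((truth - truth.mean()) ** 2).sum()
    return mass_err, 1.0 - ss_res / ss_tot

t = np.arange(0, 60, 1.0)                          # s
truth = 5.0 * np.exp(-((t - 20.0) / 8.0) ** 2)     # idealized bolus, mg/ml
measured = truth + np.random.default_rng(1).normal(0, 0.1, t.size)

err, r2 = qa_metrics(measured, truth, flow_rate=2.0, dt=1.0)
print(f"mass error {err:+.1f}%  R^2 {r2:.3f}")     # e.g., flag |err| > 15% or R^2 < 0.95
```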

  9. Linked Orders Improve Safety in Scheduling and Administration of Chemotherapeutic Agents

    PubMed Central

    Whipple, Nancy; Boulware, Joy; Danca, Kala; Boyarin, Kirill; Ginsberg, Eliot; Poon, Eric; Sweet, Micheal; Schade, Sue; Rogala, Jennifer

    2010-01-01

    The pharmacologic treatment of cancer must adhere to complex, finely orchestrated treatment plans, including not only chemotherapy medications but also pre/post-hydration, anti-emetics, anti-anxiety, and other medications that are given before, during and after chemotherapy doses. The treatment plans specify the medications and dictate precise dosing, frequency, and timing. This is a challenge to most Computerized Physician Order Entry (CPOE), Pharmacy, and Electronic Medication Administration Record (eMAR) systems. Medications are scheduled on specific dates, referred to as chemo days, from the onset of the treatment, and precisely timed on the designated chemo day. For patients enrolled in research protocols, adherence to the defined schedule takes on additional import, since variation is a violation of the protocol. If the oncologist determines that medications must be administered outside the defined constraints, the patient must be un-enrolled from the protocol and the course of therapy re-written. Pharmacy and eMAR systems utilized in processing chemotherapy medications must be able to support the intricate relationships between the drugs defined in the treatment plans. PMID:21347104

  10. Evidence of Absence software

    USGS Publications Warehouse

    Dalthorp, Daniel; Huso, Manuela M. P.; Dail, David; Kenyon, Jessica

    2014-01-01

    Evidence of Absence software (EoA) is a user-friendly application used for estimating bird and bat fatalities at wind farms and designing search protocols. The software is particularly useful in addressing whether the number of fatalities has exceeded a given threshold and what search parameters are needed to give assurance that thresholds were not exceeded. The software is applicable even when zero carcasses have been found in searches. Depending on the effectiveness of the searches, such an absence of evidence of mortality may or may not be strong evidence that few fatalities occurred. Under a search protocol in which carcasses are detected with nearly 100 percent certainty, finding zero carcasses would be convincing evidence that overall mortality rate was near zero. By contrast, with a less effective search protocol with low probability of detecting a carcass, finding zero carcasses does not rule out the possibility that large numbers of animals were killed but not detected in the searches. EoA uses information about the search process and scavenging rates to estimate detection probabilities to determine a maximum credible number of fatalities, even when zero or few carcasses are observed.
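
    The core inference is simple to sketch: if each fatality is detected with probability g, the chance of finding zero carcasses among M fatalities is (1 - g)^M, so a low detection probability leaves large M credible. The snippet below is a simplified frequentist stand-in for EoA's Bayesian machinery, with illustrative numbers.

```python
def max_credible_fatalities(g, alpha=0.05, m_max=100_000):
    """Largest fatality count M still credible after finding zero carcasses.

    With per-fatality detection probability g, P(find none | M) = (1 - g)**M;
    we return the largest M with that probability still >= alpha.  This is a
    simplified stand-in for EoA's Bayesian estimate, for the zero-carcass case.
    """
    m = 0
    while m < m_max and (1 - g) ** (m + 1) >= alpha:
        m += 1
    return m

for g in (0.9, 0.5, 0.1, 0.01):
    print(f"detection prob {g:4}: up to {max_credible_fatalities(g):3} fatalities credible")
```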

  11. Patient-centred screening for primary immunodeficiency, a multi-stage diagnostic protocol designed for non-immunologists: 2011 update

    PubMed Central

    de Vries, E

    2012-01-01

    Members of the European Society for Immunodeficiencies (ESID) and other colleagues have updated the multi-stage expert-opinion-based diagnostic protocol for non-immunologists incorporating newly defined primary immunodeficiency diseases (PIDs). The protocol presented here aims to increase the awareness of PIDs among doctors working in different fields. Prompt identification of PID is important for prognosis, but this may not be an easy task. The protocol therefore starts from the clinical presentation of the patient. Because PIDs may present at all ages, this protocol is aimed at both adult and paediatric physicians. The multi-stage design allows cost-effective screening for PID of the large number of potential cases in the early phases, with more expensive tests reserved for definitive classification in collaboration with a specialist in the field of immunodeficiency at a later stage. PMID:22132890

  12. A survey on temperature-aware routing protocols in wireless body sensor networks.

    PubMed

    Oey, Christian Henry Wijaya; Moh, Sangman

    2013-08-02

    The rapid growth of the elderly population in the world and the rising cost of healthcare pose major challenges for healthcare and medical monitoring. A Wireless Body Sensor Network (WBSN) is comprised of small sensor nodes attached inside, on or around a human body, the main purpose of which is to monitor the functions and surroundings of the human body. However, the heat generated by a node's circuitry and antenna could cause damage to human tissue. Therefore, in designing a routing protocol for WBSNs, it is important to reduce the heat by incorporating temperature into the routing metric. The main contribution of this paper is to survey existing temperature-aware routing protocols that have been proposed for WBSNs. In this paper, we present a brief overview of WBSNs, review the existing routing protocols comparatively and discuss challenging open issues in the design of routing protocols.
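
    The survey's central design idea, folding node temperature into the routing metric, can be illustrated with a toy composite-cost shortest-path search: each hop pays a fixed price plus a penalty for the relay's temperature, so routes detour around hotspots. The weights and graph below are invented for illustration and do not correspond to any one surveyed protocol.

```python
import heapq

def coolest_path(graph, temps, src, dst, w_hop=1.0, w_temp=2.0):
    """Dijkstra with a composite cost: each hop pays a fixed price plus a
    penalty proportional to the relay node's temperature, so routes detour
    around hotspots.  Weights are illustrative, not from any one protocol."""
    pq, seen = [(0.0, src, [src])], set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt in graph[node]:
            if nxt not in seen:
                heapq.heappush(pq, (cost + w_hop + w_temp * temps[nxt], nxt, path + [nxt]))
    return None

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
temps = {"A": 0.0, "B": 0.9, "C": 0.1, "D": 0.0}   # node B is running hot
print(coolest_path(graph, temps, "A", "D"))        # -> (2.2, ['A', 'C', 'D'])
```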

  14. Standards of yellow fever vaccination and travel medicine practice in the Republic of Ireland: A questionnaire-based evaluation.

    PubMed

    Noone, Peter; Hamza, Mohammed; Tang, John; Flaherty, Gerard

    2015-01-01

    The Department of Health regulates the designation of yellow fever vaccination centres (YFVCs) in the Republic of Ireland to ensure appropriate standards in the safe, effective use of yellow fever vaccine for overseas travellers. The process of designating YFVCs is delegated to Directors of Public Health, who direct Principal Medical Officers. Variation exists in the implementation of specific criteria for designation, and no formal follow-up inspection is carried out. This survey of all designated YFVCs in the Republic of Ireland aimed to assess compliance with standards to ensure the objectives of the national yellow fever vaccination programme were met. A questionnaire devised from a United Kingdom (UK) YFVC survey was developed and piloted in five YFVCs. The questionnaire was then adapted for a postal survey capturing data on professional training, reference sources, services provided, and physical facilities and supplies, and was distributed, with a stamped addressed envelope, to all 655 YFVCs designated in the Republic of Ireland during the period 2010-2011. Responses were received from 246 centres (38% response rate), 91% of which were in general practice. Deficiencies were identified in respect of vaccine refrigeration protocols, record keeping, attendance at YFVC training sessions, and clinical protocols for adverse events. Specific deficiencies in relation to training, vaccine storage, administration and documentation should be addressed to ensure standardised YFVC practices and thus align them with best international practice. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. SU-F-R-11: Designing Quality and Safety Informatics Through Implementation of a CT Radiation Dose Monitoring Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, JM; Samei, E; Departments of Physics, Electrical and Computer Engineering, and Biomedical Engineering, and Medical Physics Graduate Program, Duke University, Durham, NC

    2016-06-15

    Purpose: Recent legislative and accreditation requirements have driven rapid development and implementation of CT radiation dose monitoring solutions. Institutions must determine how to improve quality, safety, and consistency of their clinical performance. The purpose of this work was to design a strategy and meaningful characterization of results from an in-house, clinically-deployed dose monitoring solution. Methods: A dose monitoring platform was designed by our imaging physics group that focused on extracting protocol parameters, dose metrics, and patient demographics and size. Compared to most commercial solutions, which focus on individual exam alerts and global thresholds, the program sought to characterize overall consistency and targeted thresholds based on eight analytic interrogations. Those were based on explicit questions related to protocol application, national benchmarks, protocol- and size-specific dose targets, operational consistency, outliers, temporal trends, intra-system variability, and consistent use of electronic protocols. Using historical data since the start of 2013, 95% and 99% intervals were used to establish yellow and amber parameterized dose alert thresholds, respectively, as a function of protocol, scanner, and size. Results: Quarterly reports have been generated for three hospitals for 3 quarters of 2015, totaling 27880, 28502, and 30631 exams, respectively. Four adult and two pediatric protocols were higher than external institutional benchmarks. Four protocol dose levels were being inconsistently applied as a function of patient size. For the three hospitals, the minimum and maximum amber outlier percentages were [1.53%, 2.28%], [0.76%, 1.8%], and [0.94%, 1.17%], respectively. Compared with the electronic protocols, 10 protocols were found to be used with some inconsistency. Conclusion: Dose monitoring can satisfy requirements with global alert thresholds and patient dose records, but the real value is in optimizing patient-specific protocols, balancing the image quality trade-offs that dose-reduction strategies promise, and improving the performance and consistency of a clinical operation. Data plots that capture patient demographics and scanner performance demonstrate that value.
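
    The thresholding scheme described above (95% and 99% historical intervals as yellow and amber alerts, parameterized by protocol, scanner, and size) maps naturally onto a group-wise quantile computation. The sketch below uses pandas on toy data; the column names and dose distribution are assumptions, not the platform's schema.

```python
import numpy as np
import pandas as pd

# Toy historical dose records; in practice these come from the monitoring
# platform's database.  Column names are illustrative.
rng = np.random.default_rng(7)
hist = pd.DataFrame({
    "protocol": rng.choice(["head", "chest", "abdomen"], 5000),
    "scanner":  rng.choice(["CT1", "CT2"], 5000),
    "size_bin": rng.choice(["S", "M", "L"], 5000),
    "ctdi_vol": rng.lognormal(mean=2.0, sigma=0.3, size=5000),
})

# Yellow/amber thresholds = 95th/99th percentiles of historical dose,
# parameterized by protocol, scanner, and patient size, as in the abstract.
thresholds = (hist.groupby(["protocol", "scanner", "size_bin"])["ctdi_vol"]
                  .quantile([0.95, 0.99])
                  .unstack()
                  .rename(columns={0.95: "yellow", 0.99: "amber"}))

def classify(exam):
    yellow, amber = thresholds.loc[(exam.protocol, exam.scanner, exam.size_bin)]
    return "amber" if exam.ctdi_vol > amber else "yellow" if exam.ctdi_vol > yellow else "ok"

new_exam = pd.Series({"protocol": "chest", "scanner": "CT1", "size_bin": "M", "ctdi_vol": 16.0})
print(classify(new_exam))
```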

  16. Optimum processing of mammographic film.

    PubMed

    Sprawls, P; Kitts, E L

    1996-03-01

    Underprocessing of mammographic film can result in reduced contrast and visibility of breast structures and an unnecessary increase in radiation dose to the patient. Underprocessing can be caused by physical factors (low developer temperature, inadequate development time, insufficient developer agitation) or chemical factors (developer not optimized for film type; overdiluted, underreplenished, contaminated, or frequently changed developer). Conventional quality control programs are designed to produce consistent processing but do not address the issue of optimum processing. Optimum processing is defined as the level of processing that produces the film performance characteristics (contrast and sensitivity) specified by the film manufacturer. Optimum processing of mammographic film can be achieved by following a two-step protocol. The first step is to set up the processing conditions according to recommendations from the film and developer chemistry manufacturers. The second step is to verify the processing results by comparing them with sensitometric data provided by the film manufacturer.

  17. Quarantine and protocol

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The purpose of the Orbiting Quarantine Facility is to provide maximum protection of the terrestrial biosphere by ensuring that the returned Martian samples are safe to bring to Earth. The protocol designed to detect the presence of biologically active agents in the Martian soil is described. The protocol determines one of two things about the sample: (1) that it is free from nonterrestrial life forms and can be sent to a terrestrial containment facility where extensive chemical, biochemical, geological, and physical investigations can be conducted; or (2) that it exhibits "biological effects" of the type that dictate second order testing. The quarantine protocol is designed to be conducted on a small portion of the returned sample, leaving the bulk of the sample undisturbed for study on Earth.

  18. Application of Game Theory Approaches in Routing Protocols for Wireless Networks

    NASA Astrophysics Data System (ADS)

    Javidi, Mohammad M.; Aliahmadipour, Laya

    2011-09-01

    An important and essential issue for wireless networks is routing protocol design, which is a major technical challenge because it shapes the functioning of the whole network. Game theory is a powerful mathematical tool for analyzing the strategic interactions among multiple decision makers, and research results show that applying game theory to routing protocols improves network performance by reducing overhead and by motivating selfish nodes to collaborate. This paper presents a review and comparison of typical representative routing protocols that apply game-theoretic approaches in various wireless networks, such as ad hoc networks, mobile ad hoc networks and sensor networks.

  19. Registered nurses' clinical reasoning in home healthcare clinical practice: A think-aloud study with protocol analysis.

    PubMed

    Johnsen, Hege Mari; Slettebø, Åshild; Fossum, Mariann

    2016-05-01

    The home healthcare context can be unpredictable and complex, and requires registered nurses with a high level of clinical reasoning skills and professional autonomy. Thus, additional knowledge about registered nurses' clinical reasoning performance during patient home care is required. The aim of this study is to describe the cognitive processes and thinking strategies used by recently graduated registered nurses while caring for patients in home healthcare clinical practice. An exploratory qualitative think-aloud design with protocol analysis was used. Home healthcare visits were made to patients with stroke, diabetes, and chronic obstructive pulmonary disease in seven healthcare districts in southern Norway. A purposeful sample of eight registered nurses with one year of experience participated. Each nurse was interviewed using the concurrent think-aloud technique during three different patient home healthcare visits, for a total of 24 visits. Follow-up interviews were conducted with each participant. The think-aloud sessions were transcribed and analysed using three-step protocol analysis. Recently graduated registered nurses focused both on general nursing concepts and on concepts specific to the domains and tasks of home healthcare services and to different patient groups. Additionally, participants used several assertion types, cognitive processes, and thinking strategies. Our results showed that recently graduated registered nurses used both simple and complex cognitive processes involving both inductive and deductive reasoning. However, their reasoning was more reactive than proactive. The results may contribute to nursing practice in terms of developing effective nursing education programmes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Exploiting IoT Technologies and Open Source Components for Smart Seismic Network Instrumentation

    NASA Astrophysics Data System (ADS)

    Germenis, N. G.; Koulamas, C. A.; Foundas, P. N.

    2017-12-01

    The data collection infrastructure of any seismic network poses a number of requirements and trade-offs related to accuracy, reliability, power autonomy and installation and operational costs. Given the right hardware design at the edge of this infrastructure, the embedded software running inside the instruments is the heart of the pre-processing and communication services and of their integration with the central storage and processing facilities of the seismic network. This work demonstrates the feasibility and benefits of exploiting software components from heterogeneous sources in order to realize a smart seismic data logger, achieving higher reliability, faster integration, and lower development and testing costs for critical functionality that is in turn responsible for the cost- and power-efficient operation of the device. The instrument's software builds on top of widely used open source components around the Linux kernel with real-time extensions, the core Debian Linux distribution, the earthworm and seiscomp tooling frameworks, as well as components from the Internet of Things (IoT) world, such as the CoAP and MQTT protocols for the signaling plane, besides the widely used de facto standards of the application domain at the data plane, such as the SeedLink protocol. By using an innovative integration of features based on lower-level GPL components of the seiscomp suite with higher-level processing earthworm components, coupled with IoT protocol extensions to the latter, the instrument can implement smart functionality such as network-controlled, event-triggered data transmission in parallel with edge archiving and on-demand, short-term historical data retrieval.
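
    A minimal sketch of the signaling-plane side of such "smart functionality": an STA/LTA trigger (a classic seismic event detector, not necessarily the one this instrument uses) publishes a trigger message over MQTT via the paho-mqtt client, leaving the waveform itself to the data plane (e.g., SeedLink). The broker address, topic, and thresholds are hypothetical.

```python
import json
import time
import numpy as np
import paho.mqtt.client as mqtt   # paho-mqtt, 1.x-style client constructor

def sta_lta(x, n_sta=50, n_lta=500):
    """Classic short-term/long-term average ratio used as an event trigger."""
    sta = np.convolve(np.abs(x), np.ones(n_sta) / n_sta, mode="valid")
    lta = np.convolve(np.abs(x), np.ones(n_lta) / n_lta, mode="valid")
    return sta[-len(lta):] / np.maximum(lta, 1e-12)

client = mqtt.Client()
client.connect("broker.example.org", 1883)   # hypothetical broker address
client.loop_start()

rng = np.random.default_rng(0)
samples = rng.normal(0, 1, 5000)
samples[3000:3200] += 20 * np.sin(np.linspace(0, 40, 200))   # injected "event"

ratio = sta_lta(samples)
if ratio.max() > 4.0:                         # illustrative trigger threshold
    event = {"station": "ST01", "ratio": float(ratio.max()), "t": time.time()}
    # Signaling plane: announce the trigger; the data plane (e.g., SeedLink)
    # would then stream or backfill the corresponding waveform window.
    client.publish("seismic/ST01/triggers", json.dumps(event), qos=1)

client.loop_stop()
```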

  1. An implementation of the SNR high speed network communication protocol (Receiver part)

    NASA Astrophysics Data System (ADS)

    Wan, Wen-Jyh

    1995-03-01

    This thesis implements the receiver part of the SNR high-speed network transport protocol. The approach was to use the Systems of Communicating Machines (SCM) as the formal definition of the protocol. Programs were developed on top of the Unix system using the C programming language. The Unix features adopted for this implementation were multitasking, signals, shared memory, semaphores, sockets, timers and process control. The problems encountered, and solved, were signal loss, shared memory conflicts, process synchronization, scheduling, data alignment and errors in the SCM specification itself. The result was a correctly functioning program which implemented the SNR protocol. The system was tested using different connection modes, lost packets, duplicate packets and large data transfers. The contributions of this thesis are: (1) implementation of the receiver part of the SNR high-speed transport protocol; (2) testing and integration with the transmitter part of the SNR transport protocol on an FDDI data-link-layered network; (3) demonstration of the functions of the SNR transport protocol, such as connection management, sequenced delivery, flow control and error recovery using selective-repeat retransmission; and (4) modifications to the SNR transport protocol specification, such as corrections for incorrect predicate conditions, definition of additional packet-type formats, and solutions for signal-loss and process-contention problems.
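
    Of the functions listed, selective-repeat error recovery is the most mechanical, and its receiver-side bookkeeping can be sketched compactly: buffer out-of-order packets, release in-order runs to the application, and report only the missing sequence numbers for retransmission. This is the generic mechanism, not the SNR protocol's SCM specification.

```python
class SelectiveRepeatReceiver:
    """Receiver-side bookkeeping for sequenced delivery with selective repeat:
    out-of-order packets are buffered, only missing sequence numbers are
    re-requested, and data is released to the application in order."""

    def __init__(self, window=8):
        self.window = window
        self.next_seq = 0          # next sequence number owed to the application
        self.buffer = {}           # out-of-order packets awaiting the gap fill

    def receive(self, seq, data):
        if not (self.next_seq <= seq < self.next_seq + self.window):
            return [], []                       # duplicate or outside window: drop
        self.buffer[seq] = data
        delivered = []
        while self.next_seq in self.buffer:     # release any in-order run
            delivered.append(self.buffer.pop(self.next_seq))
            self.next_seq += 1
        missing = [s for s in range(self.next_seq, max(self.buffer, default=self.next_seq))
                   if s not in self.buffer]
        return delivered, missing               # missing -> selective NAKs to the sender

rx = SelectiveRepeatReceiver()
print(rx.receive(0, "p0"))   # (['p0'], [])
print(rx.receive(2, "p2"))   # ([], [1])        packet 1 lost: re-request just that one
print(rx.receive(1, "p1"))   # (['p1', 'p2'], [])
```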

  2. Generic Protocol for the Verification of Ballast Water Treatment Technology. Version 5.1

    DTIC Science & Technology

    2010-09-01

    [Indexed excerpt: only fragments of the document's front matter survive. The table of contents covers the verification testing process and water volumes, containers and processing; the glossary defines bias as a systematic or persistent distortion of a measurement process that causes errors in one direction, and challenge water as water supplied to a treatment system under …]

  3. Probability Distributions over Cryptographic Protocols

    DTIC Science & Technology

    2009-06-01

    [Indexed excerpt: only table-of-contents fragments survive. Listed sections include an artificial immune algorithm, design decisions, message-creation algorithms for unbounded and unbounded naive distributions, a protocol-creation algorithm for intended-run distributions, and a combined protocol and message creation algorithm for a realistic distribution.]

  4. Evaluation of Interoperability Protocols in Repositories of Electronic Theses and Dissertations

    ERIC Educational Resources Information Center

    Hakimjavadi, Hesamedin; Masrek, Mohamad Noorman

    2013-01-01

    Purpose: The purpose of this study is to evaluate the status of eight interoperability protocols within repositories of electronic theses and dissertations (ETDs) as an introduction to further studies on feasibility of deploying these protocols in upcoming areas of interoperability. Design/methodology/approach: Three surveys of 266 ETD…

  5. Automation of metabolic stability studies in microsomes, cytosol and plasma using a 215 Gilson liquid handler.

    PubMed

    Linget, J M; du Vignaud, P

    1999-05-01

    A 215 Gilson liquid handler was used to automate enzymatic incubations using microsomes, cytosol and plasma. The design of the automated protocols is described. They were based on the use of 96-deep-well plates and on HPLC-based methods for assaying the substrate. The protocols were assessed by comparing manual and automated incubations and by testing the reliability and reproducibility of automated incubations in microsomes and cytosol. Examples of the use of these programs in metabolic studies in drug research, i.e. metabolic screening in microsomes and plasma, are shown. Even rapid processes (with disappearance half-lives as low as 1 min) can be analysed. This work demonstrates how stability studies can be automated to save time, render experiments involving human biological media less hazardous, and perhaps improve inter-laboratory reproducibility.

  6. Patient-derived Models of Human Breast Cancer: Protocols for In vitro and In vivo Applications in Tumor Biology and Translational Medicine

    PubMed Central

    DeRose, Yoko S.; Gligorich, Keith M.; Wang, Guoying; Georgelas, Ann; Bowman, Paulette; Courdy, Samir J.; Welm, Alana L.; Welm, Bryan E.

    2013-01-01

    Research models that replicate the diverse genetic and molecular landscape of breast cancer are critical for developing the next generation of therapeutic entities that can target specific cancer subtypes. Patient-derived tumorgrafts, generated by transplanting primary human tumor samples into immune-compromised mice, are a valuable method for modeling the clinical diversity of breast cancer in mice, and are a potential resource in personalized medicine. Primary tumorgrafts also enable in vivo testing of therapeutics and make possible the use of patient cancer tissue for in vitro screens. Described in this unit are a variety of protocols, including tissue collection, biospecimen tracking, tissue processing, transplantation, and 3-dimensional culturing of xenografted tissue, that enable the use of bona fide uncultured human tissue in designing and validating cancer therapies. PMID:23456611

  7. RNA-seq mixology: designing realistic control experiments to compare protocols and analysis methods

    PubMed Central

    Holik, Aliaksei Z.; Law, Charity W.; Liu, Ruijie; Wang, Zeya; Wang, Wenyi; Ahn, Jaeil; Asselin-Labat, Marie-Liesse; Smyth, Gordon K.

    2017-01-01

    Carefully designed control experiments provide a gold standard for benchmarking different genomics research tools. A shortcoming of many gene expression control studies is that replication involves profiling the same reference RNA sample multiple times. This leads to low, pure technical noise that is atypical of regular studies. To achieve a more realistic noise structure, we generated a RNA-sequencing mixture experiment using two cell lines of the same cancer type. Variability was added by extracting RNA from independent cell cultures and degrading particular samples. The systematic gene expression changes induced by this design allowed benchmarking of different library preparation kits (standard poly-A versus total RNA with Ribozero depletion) and analysis pipelines. Data generated using the total RNA kit had more signal for introns and various RNA classes (ncRNA, snRNA, snoRNA) and less variability after degradation. For differential expression analysis, voom with quality weights marginally outperformed other popular methods, while for differential splicing, DEXSeq was simultaneously the most sensitive and the most inconsistent method. For sample deconvolution analysis, DeMix outperformed IsoPure convincingly. Our RNA-sequencing data set provides a valuable resource for benchmarking different protocols and data pre-processing workflows. The extra noise mimics routine lab experiments more closely, ensuring any conclusions are widely applicable. PMID:27899618

  8. Involving Latina/o parents in patient-centered outcomes research: Contributions to research study design, implementation and outcomes.

    PubMed

    Pérez Jolles, Mónica; Martinez, Maria; Garcia, San Juanita; Stein, Gabriela L; Thomas, Kathleen C

    2017-10-01

    Comparative effectiveness research (CER) is supported by policymakers as a way to provide service providers and patients with evidence-based information to make better health-care decisions and ultimately improve services for patients. However, Latina/o patients are rarely involved as study advisors, and there is a lack of documentation on how their voices contribute to the research process when they are included as collaborators. The purpose of this article was to contribute to the literature by presenting concrete contributions of Latina/o parent involvement to study design, implementation and outcomes in the context of a CER study called Padres Efectivos (Parent Activation). Researchers facilitated a collaborative relationship with parents by establishing a mentor parent group. The contributions of parent involvement in the following stages of the research process are described: (i) proposal development, (ii) implementation of protocols, (iii) analysis plan and (iv) dissemination of results. Mentor parents' contributions helped tailor the content of the intervention to their needs during proposal, increased recruitment, validated the main outcome measure and added two important outcome measures, emphasized the importance of controlling for novice treatment status and developed innovative dissemination strategies. Mentor parents' guidance to the researchers has contributed to reaching recruitment goals, strengthened the study protocol, expanded findings, supported broad ownership of study implications and enriched the overall study data collection efforts. These findings can inform future research efforts seeking an active Latino parent collaboration and the timely incorporation of parent voices in each phase of the research process. © 2017 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  9. Quality by design: optimization of a freeze-drying cycle via design space in case of heterogeneous drying behavior and influence of the freezing protocol.

    PubMed

    Pisano, Roberto; Fissore, Davide; Barresi, Antonello A; Brayard, Philippe; Chouvenc, Pierre; Woinet, Bertrand

    2013-02-01

    This paper shows how to optimize the primary drying phase of a parenteral formulation, for both product quality and drying time, via the design space. A non-steady-state model, parameterized with experimentally determined heat and mass transfer coefficients, is used to define the design space when the heat transfer coefficient varies with the position of the vial in the array. The calculations recognize both equipment and product constraints, and also take into account model parameter uncertainty. Examples are given of cycles designed for the same formulation but varying the freezing conditions and the freeze-dryer scale; these are then compared in terms of drying time. Furthermore, the impact of inter-vial variability on the design space, and therefore on the optimized cycle, is addressed. In this regard, a simplified method is presented for cycle design, which reduces the experimental effort required for system qualification. The use of mathematical modeling is demonstrated to be very effective not only for cycle development but also for solving problems of process transfer. This study showed that inter-vial variability remains significant when vials are loaded on plastic trays, and how inter-vial variability can be taken into account during process design.
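
    A design-space calculation of the kind described can be sketched with a pseudo-steady primary-drying balance: heat delivered to the product, Kv(Ts - Tp), equals the heat consumed by sublimation, dHs*Jw, with flux Jw = (P_ice(Tp) - Pc)/Rp. The snippet below scans shelf temperature and chamber pressure against a critical product temperature; the vapor-pressure correlation is standard, but Kv, Rp, and the temperature limit are illustrative, not the paper's formulation.

```python
import numpy as np

# Pseudo-steady primary-drying balance: Kv*(Ts - Tp) = dHs * Jw, with
# Jw = (p_ice(Tp) - Pc) / Rp.  The ice vapor-pressure correlation is the
# standard one; Kv, Rp and T_CRIT are illustrative, not the studied product.
dHs = 2.8e6      # J/kg, heat of sublimation
Kv = 20.0        # W/m^2/K, vial heat-transfer coefficient (illustrative)
Rp = 1.0e5       # Pa*m^2*s/kg, dried-cake resistance (illustrative)
T_CRIT = 238.0   # K, maximum allowed product temperature (illustrative)

def p_ice(T):
    return 3.597e12 * np.exp(-6144.96 / T)   # Pa, vapor pressure over ice

def product_temp(Ts, Pc):
    """Solve the heat/mass balance for Tp by bisection (Tp in [200 K, Ts])."""
    lo, hi = 200.0, Ts
    for _ in range(50):
        Tp = 0.5 * (lo + hi)
        heat_in = Kv * (Ts - Tp)
        heat_used = dHs * max(p_ice(Tp) - Pc, 0.0) / Rp
        lo, hi = (Tp, hi) if heat_in > heat_used else (lo, Tp)
    return 0.5 * (lo + hi)

print("Pc (Pa)   shelf temperatures keeping Tp < T_CRIT")
for Pc in (5.0, 10.0, 20.0):
    ok = [Ts for Ts in np.arange(240.0, 300.0, 5.0) if product_temp(Ts, Pc) < T_CRIT]
    print(f"{Pc:7.1f}   {min(ok):.0f}-{max(ok):.0f} K" if ok else f"{Pc:7.1f}   none")
```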

  10. Recasting a traditional laboratory practical as a "Design-your-own protocol" to teach a universal research skill.

    PubMed

    Whitworth, David E

    2016-07-08

    Laboratory-based practical classes are a common feature of life science teaching, during which students learn how to perform experiments and generate/interpret data. Practical classes are typically instructional, concentrating on providing topic- and technique-specific skills; however, to produce research-capable graduates it is also important to develop generic practical skills. To provide an opportunity for students to develop the skills needed to create bespoke protocols for experimental benchwork, a traditional practical was repurposed. Students were given a list of available resources and an experimental goal, and directed to create a bench protocol to achieve the aim (measuring the iron in hemoglobin). In a series of teaching events, students received feedback from staff, and peers prototyped the protocols, before the protocols were finally implemented. Graduates highlighted this exercise as one of the most important of their degrees, primarily because of the clear relevance of the skills acquired to professional practice. The exercise exemplifies a range of pedagogic principles, but arguably its most important innovation is that it repurposed a pre-existing practical. This had the benefits of automatically providing scaffolding to direct the students' thought processes, while retaining the advantages of a "discovery learning" exercise, and allowing facile adoption of the approach across the sector. © 2016 The International Union of Biochemistry and Molecular Biology, 44(4):377-380, 2016.

  11. Design and implementation of a medium speed communications interface and protocol for a low cost, refreshed display computer

    NASA Technical Reports Server (NTRS)

    Phyne, J. R.; Nelson, M. D.

    1975-01-01

    The design and implementation of hardware and software systems involved in using a 40,000 bit/second communication line as the connecting link between an IMLAC PDS 1-D display computer and a Univac 1108 computer system were described. The IMLAC consists of two independent processors sharing a common memory. The display processor generates the deflection and beam control currents as it interprets a program contained in the memory; the minicomputer has a general instruction set and is responsible for starting and stopping the display processor and for communicating with the outside world through the keyboard, teletype, light pen, and communication line. The processing time associated with each data byte was minimized by designing the input and output processes as finite state machines which automatically sequence from each state to the next. Several tests of the communication link and the IMLAC software were made using a special low capacity computer grade cable between the IMLAC and the Univac.
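
    A minimal sketch of the finite-state-machine idea for per-byte input processing is shown below. The framing (sync byte, length, payload, additive checksum) is hypothetical, not the IMLAC/Univac wire format; the point is that each incoming byte drives exactly one state transition, so the per-byte processing cost stays small and constant.

    ```python
    from enum import Enum, auto

    class RxState(Enum):
        WAIT_SYNC = auto()   # discard bytes until the sync mark
        LENGTH    = auto()   # next byte gives the payload length
        PAYLOAD   = auto()   # accumulate payload bytes
        CHECKSUM  = auto()   # final byte is a simple additive checksum

    SYNC = 0x16              # hypothetical sync byte

    class ByteFSM:
        """Per-byte receive automaton: each input byte drives one transition."""
        def __init__(self):
            self.state = RxState.WAIT_SYNC
            self.buf = bytearray()
            self.remaining = 0

        def feed(self, byte):
            """Process one byte; return a completed frame (bytes) or None."""
            if self.state is RxState.WAIT_SYNC:
                if byte == SYNC:
                    self.state = RxState.LENGTH
            elif self.state is RxState.LENGTH:
                self.remaining = byte
                self.buf.clear()
                self.state = RxState.PAYLOAD if byte else RxState.CHECKSUM
            elif self.state is RxState.PAYLOAD:
                self.buf.append(byte)
                self.remaining -= 1
                if self.remaining == 0:
                    self.state = RxState.CHECKSUM
            elif self.state is RxState.CHECKSUM:
                self.state = RxState.WAIT_SYNC
                if byte == sum(self.buf) & 0xFF:
                    return bytes(self.buf)   # good frame
            return None

    # Usage: bytes arrive one at a time, as from a serial line
    fsm = ByteFSM()
    stream = bytes([SYNC, 3, 0x41, 0x42, 0x43, (0x41 + 0x42 + 0x43) & 0xFF])
    frames = [f for b in stream if (f := fsm.feed(b)) is not None]
    print(frames)   # [b'ABC']
    ```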

  12. Comparison of a Stimulus Equivalence Protocol and Traditional Lecture for Teaching Single-Subject Designs

    ERIC Educational Resources Information Center

    Lovett, Sadie; Rehfeldt, Ruth Anne; Garcia, Yors; Dunning, Johnna

    2011-01-01

    This study compared the effects of a computer-based stimulus equivalence protocol to a traditional lecture format in teaching single-subject experimental design concepts to undergraduate students. Participants were assigned to either an equivalence or a lecture group, and performance on a paper-and-pencil test that targeted relations among the…

  13. Adaptive Probabilistic Protocols for Advanced Networks/Assuring the Integrity of Highly Decentralized Communications Systems

    DTIC Science & Technology

    2005-03-01

    This effort developed a new "compositional" method for protocol design and implementation, in which small microprotocols are combined to obtain a protocol customized to the needs of a specific setting, under control of an automated theorem proving system that can guarantee … and Network Centric Enterprise (NCES) visions. This final report documents a wide range of contributions and technology transitions, including …
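
    The compositional idea can be sketched as a stack of small, single-purpose layers, each transforming messages on the way down (send) and on the way up (receive). The layers and framing below are illustrative only, not the report's actual microprotocols or theorem-proving machinery.

    ```python
    import zlib

    class Layer:
        """A microprotocol: transforms messages on send (down) and receive (up)."""
        def down(self, msg: bytes) -> bytes: return msg
        def up(self, msg: bytes) -> bytes: return msg

    class Checksum(Layer):
        # Append a CRC32 on send; verify and strip it on receive.
        def down(self, msg):
            return msg + zlib.crc32(msg).to_bytes(4, "big")
        def up(self, msg):
            body, crc = msg[:-4], int.from_bytes(msg[-4:], "big")
            if zlib.crc32(body) != crc:
                raise ValueError("corrupted message")
            return body

    class Sequencer(Layer):
        # Prefix a sequence number on send; check ordering on receive.
        def __init__(self):
            self.tx = 0
            self.rx = 0
        def down(self, msg):
            hdr = self.tx.to_bytes(4, "big")
            self.tx += 1
            return hdr + msg
        def up(self, msg):
            seq, body = int.from_bytes(msg[:4], "big"), msg[4:]
            if seq != self.rx:
                raise ValueError(f"out of order: got {seq}, expected {self.rx}")
            self.rx += 1
            return body

    def compose(*layers):
        """Stack microprotocols: send passes top-down, receive bottom-up."""
        def send(msg):
            for layer in layers:
                msg = layer.down(msg)
            return msg
        def recv(msg):
            for layer in reversed(layers):
                msg = layer.up(msg)
            return msg
        return send, recv

    # One Sequencer instance serves both ends of this loopback demo.
    send, recv = compose(Sequencer(), Checksum())
    wire = send(b"hello")
    print(recv(wire))   # b'hello'
    ```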

  14. Secure authentication protocol for Internet applications over CATV network

    NASA Astrophysics Data System (ADS)

    Chin, Le-Pond

    1998-02-01

    An authentication protocol is proposed in this paper to implement secure functions, including two-way authentication and key management between end users and the head-end. The protocol can protect transmissions from fraud and from attacks such as replay and wiretapping. Location privacy is also achieved. A reset protocol is designed to restore the system when it fails. Security is verified by taking several security and privacy requirements into consideration.
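
    The paper's message flow is not reproduced in the abstract, but the two-way authentication it describes can be sketched generically as an HMAC-based challenge-response over a pre-shared key, with fresh nonces providing replay resistance and a per-session key derived at the end. All names and message formats below are assumptions, not the paper's protocol.

    ```python
    import hmac, hashlib, os

    SHARED_KEY = os.urandom(32)   # long-term secret shared by user and head-end (assumed setup)

    def mac(key, *parts):
        return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

    # 1. Each side issues a fresh random nonce, so old transcripts cannot be replayed.
    user_nonce = os.urandom(16)
    head_nonce = os.urandom(16)

    # 2. The head-end proves knowledge of the key against the user's nonce...
    head_proof = mac(SHARED_KEY, b"head-end", user_nonce, head_nonce)
    # ...and the user proves it against the head-end's nonce.
    user_proof = mac(SHARED_KEY, b"user", head_nonce, user_nonce)

    # 3. Each side verifies the other's proof (constant-time comparison).
    assert hmac.compare_digest(head_proof, mac(SHARED_KEY, b"head-end", user_nonce, head_nonce))
    assert hmac.compare_digest(user_proof, mac(SHARED_KEY, b"user", head_nonce, user_nonce))

    # 4. Both sides derive the same fresh key for the CATV session.
    session_key = mac(SHARED_KEY, b"session", user_nonce, head_nonce)
    print("mutual authentication ok, session key:", session_key.hex()[:16], "...")
    ```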

  15. Contextual information management: An example of independent-checking in the review of laboratory-based bloodstain pattern analysis.

    PubMed

    Osborne, Nikola K P; Taylor, Michael C

    2018-05-01

    This article describes a New Zealand forensic agency's contextual information management protocol for bloodstain pattern evidence examined in the laboratory. In an effort to create a protocol that would have minimal impact on the current workflow, while still effectively removing task-irrelevant contextual information, the protocol was designed following an in-depth consultation with management and forensic staff. The resulting design was for a protocol of independent checking (i.e. blind peer review), where the checker's interpretation of the evidence is conducted in the absence of case information and the original examiner's notes or interpretation(s). At the conclusion of a ten-case trial period, there was widespread agreement that the protocol had minimal impact on the number of people required, the cost, or the time to complete an item examination. The agency is now looking to adopt the protocol into standard operating procedures, and in some cases the protocol has been extended to cover other laboratory-based examinations (e.g. fabric damage, shoeprint examination, and physical fits). The protocol developed during this trial provides a useful example for agencies seeking to adopt contextual information management into their workflow.
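
    The core of such a protocol can be sketched as a simple filtering step: the checker receives only task-relevant material, with case context and the original examiner's interpretation withheld until the comparison stage. The field names below are hypothetical; the agency's actual record structure is not described in the abstract.

    ```python
    # Hypothetical field names illustrating task-relevance filtering.
    TASK_RELEVANT = {"item_id", "photographs", "substrate", "stain_measurements"}

    def checker_packet(case_record: dict) -> dict:
        """Return only task-relevant material for the blind check, excluding
        case context and the original examiner's notes/interpretations."""
        return {k: v for k, v in case_record.items() if k in TASK_RELEVANT}

    case = {
        "item_id": "EX-041",
        "photographs": ["img_001.jpg", "img_002.jpg"],
        "substrate": "cotton t-shirt",
        "stain_measurements": {"count": 37, "max_width_mm": 4.2},
        "case_summary": "suspect statement ...",          # task-irrelevant context
        "examiner_notes": "consistent with impact ...",   # withheld until comparison
    }
    print(checker_packet(case))
    ```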

  16. Statistical aspects of the TNK-S2B trial of tenecteplase versus alteplase in acute ischemic stroke: an efficient, dose-adaptive, seamless phase II/III design.

    PubMed

    Levin, Bruce; Thompson, John L P; Chakraborty, Bibhas; Levy, Gilberto; MacArthur, Robert; Haley, E Clarke

    2011-08-01

    TNK-S2B, an innovative, randomized, seamless phase II/III trial of tenecteplase versus rt-PA for acute ischemic stroke, was terminated for slow enrollment before regulatory approval of the use of phase II patients in phase III. The aims were (1) to review the trial design and the comprehensive type I error rate simulations and (2) to discuss issues raised during regulatory review, to facilitate future approval of similar designs. In phase II, an early (24-h) outcome and an adaptive sequential procedure selected one of three tenecteplase doses for phase III comparison with rt-PA. Decision rules comparing this dose to rt-PA would cause stopping for futility at the end of phase II, or continuation to phase III. Phase III incorporated two co-primary hypotheses, allowing for a treatment effect at either end of the trichotomized Rankin scale. Assuming no early termination, four interim analyses and one final analysis of 1908 patients provided an experiment-wise type I error rate of <0.05. Across over 1,000 distribution scenarios, each involving 40,000 replications, the maximum type I error in phase III was 0.038. Inflation from the dose selection was more than offset by the one-half continuity correction in the test statistics. Inflation from repeated interim analyses was more than offset by the reduction from the clinical stopping rules for futility at the first interim analysis. Design complexity and evolving regulatory requirements lengthened the review process. (1) The design was innovative and efficient. Per protocol, type I error was well controlled for the co-primary phase III hypothesis tests, and experiment-wise. (2a) Time must be allowed for communications with regulatory reviewers from the first design stages. (2b) Adequate type I error control must be demonstrated. (2c) Greater clarity is needed on (i) whether this includes demonstration of type I error control if the protocol is violated and (ii) whether simulations of type I error control are acceptable. (2d) Regulatory agency concerns that protocols for futility stopping may not be followed may be allayed by submitting interim analysis results to reviewers as these analyses occur.
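
    A toy Monte Carlo (not the trial's actual simulation framework) illustrates why such simulations are needed: under the global null, carrying the phase II patients on the selected dose forward into the phase III test inflates the naive type I error above its nominal 0.05 level unless the analysis corrects for selection. All sample sizes and response rates below are assumed.

    ```python
    import random

    def binom(n, p):
        """Number of responders among n patients with response probability p."""
        return sum(random.random() < p for _ in range(n))

    def one_trial(n2=50, n3=250, crit=1.96, p0=0.4):
        """One replication under the global null: all three doses equal control."""
        # Phase II: n2 patients per dose; select the dose with the most responders.
        phase2 = [binom(n2, p0) for _ in range(3)]
        best = max(range(3), key=lambda d: phase2[d])
        # Seamless feature: phase II patients on the selected dose are carried
        # forward into phase III, plus fresh patients on both arms.
        n_t = n2 + n3
        x_t = phase2[best] + binom(n3, p0)
        x_c = binom(n_t, p0)
        # Naive pooled two-proportion z-test at nominal two-sided 0.05.
        p_t, p_c = x_t / n_t, x_c / n_t
        p_pool = (x_t + x_c) / (2 * n_t)
        se = (2 * p_pool * (1 - p_pool) / n_t) ** 0.5
        return se > 0 and abs(p_t - p_c) / se > crit

    random.seed(1)
    reps = 20_000
    rate = sum(one_trial() for _ in range(reps)) / reps
    print(f"naive type I error with selection carried forward: {rate:.3f}")  # > 0.05
    ```

    Correcting for the selection step (as the trial's design did, via decision rules, continuity corrections, and futility stopping) is what brings the experiment-wise error back under the nominal level.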

  17. Redesigning a large school-based clinical trial in response to changes in community practice

    PubMed Central

    Gerald, Lynn B; Gerald, Joe K; McClure, Leslie A; Harrington, Kathy; Erwin, Sue; Bailey, William C

    2011-01-01

    Background Asthma exacerbations are seasonal with the greatest risk in elementary-age students occurring shortly after returning to school following summer break. Recent research suggests that this seasonality in children is primarily related to viral respiratory tract infections. Regular hand washing is the most effective method to prevent the spread of viral respiratory infections; unfortunately, achieving hand washing recommendations in schools is difficult. Therefore, we designed a study to evaluate the effect of hand sanitizer use in elementary schools on exacerbations among children with asthma. Purpose To describe the process of redesigning the trial in response to changes in the safety profile of the hand sanitizer as well as changes in hand hygiene practice in the schools. Methods The original trial was a randomized, longitudinal, subject-blinded, placebo-controlled, community-based crossover trial. The primary aim was to evaluate the incremental effectiveness of hand sanitizer use in addition to usual hand hygiene practices to decrease asthma exacerbations in elementary-age children. Three events occurred that required major modifications to the original study protocol: (1) safety concerns arose regarding the hand sanitizer’s active ingredient; (2) no substitute placebo hand sanitizer was available; and (3) community preferences changed regarding hand hygiene practices in the schools. Results The revised protocol is a randomized, longitudinal, community-based crossover trial. The primary aim is to evaluate the incremental effectiveness of a two-step hand hygiene process (hand hygiene education plus institutionally provided alcohol-based hand sanitizer) versus usual care to decrease asthma exacerbations. Enrollment was completed in May 2009 with 527 students from 30 schools. The intervention began in August 2009 and will continue through May 2011. Study results should be available at the end of 2011. Limitations The changed design does not allow us to directly measure the effectiveness of hand sanitizer use as a supplement to traditional hand washing practices. Conclusions The need to balance a rigorous study design with one that is acceptable to the community requires investigators to be actively involved with community collaborators and able to adapt study protocols to fit changing community practices. PMID:21730079

  18. Redesigning a large school-based clinical trial in response to changes in community practice.

    PubMed

    Gerald, Lynn B; Gerald, Joe K; McClure, Leslie A; Harrington, Kathy; Erwin, Sue; Bailey, William C

    2011-06-01

    Asthma exacerbations are seasonal with the greatest risk in elementary-age students occurring shortly after returning to school following summer break. Recent research suggests that this seasonality in children is primarily related to viral respiratory tract infections. Regular hand washing is the most effective method to prevent the spread of viral respiratory infections; unfortunately, achieving hand washing recommendations in schools is difficult. Therefore, we designed a study to evaluate the effect of hand sanitizer use in elementary schools on exacerbations among children with asthma. To describe the process of redesigning the trial in response to changes in the safety profile of the hand sanitizer as well as changes in hand hygiene practice in the schools. The original trial was a randomized, longitudinal, subject-blinded, placebo-controlled, community-based crossover trial. The primary aim was to evaluate the incremental effectiveness of hand sanitizer use in addition to usual hand hygiene practices to decrease asthma exacerbations in elementary-age children. Three events occurred that required major modifications to the original study protocol: (1) safety concerns arose regarding the hand sanitizer's active ingredient; (2) no substitute placebo hand sanitizer was available; and (3) community preferences changed regarding hand hygiene practices in the schools. The revised protocol is a randomized, longitudinal, community-based crossover trial. The primary aim is to evaluate the incremental effectiveness of a two-step hand hygiene process (hand hygiene education plus institutionally provided alcohol-based hand sanitizer) versus usual care to decrease asthma exacerbations. Enrollment was completed in May 2009 with 527 students from 30 schools. The intervention began in August 2009 and will continue through May 2011. Study results should be available at the end of 2011. The changed design does not allow us to directly measure the effectiveness of hand sanitizer use as a supplement to traditional hand washing practices. The need to balance a rigorous study design with one that is acceptable to the community requires investigators to be actively involved with community collaborators and able to adapt study protocols to fit changing community practices.

  19. Application of the Sketch Match method in Sulina coastal study area within PEGASO project

    NASA Astrophysics Data System (ADS)

    Marin, Eugenia; Nichersu, Iuliana; Mierla, Marian; Trifanov, Cristian; Nichersu, Iulian

    2013-04-01

    The Sketch Match approach for the Sulina pilot case was carried out in the frame of the project "People for Ecosystem Based Governance in Assessing Sustainable Development of Ocean and Coast" (PEGASO), funded by the Seventh Framework Programme. The PEGASO project has been designed to identify common threats and solutions in relation to the long-term sustainable development and environmental protection of coastal zones bordering the Mediterranean and Black Seas, in ways relevant to the implementation of the Integrated Coastal Zone Management (ICZM) Protocol for the Mediterranean. PEGASO will use the model of the existing ICZM Protocol for the Mediterranean and adjust it to the needs of the Black Sea through innovative actions, one of them being to refine and develop efficient, easy-to-use tools for making sustainability assessments in the coastal zone, tested through a number of relevant pilot sites. For the Romanian case study, the Sketch Match approach was selected: an interactive public-participation planning method developed by the Dutch Government, applied in the Sulina area to stimulate support and involvement from stakeholders in the ICZM Protocol by consulting and involving them in the planning process, making use of a coherent package of interactive methods. Participants represented a wide range of stakeholders, from local fishermen to representatives of the Local and County Councils and the Danube Delta Biosphere Reserve Authority. They took part in a two-day design session focused on the problems and potentials of the area, with the aim of working out possible solutions for integrated coastal spatial planning, focusing on the parallel enhancement of the various local functions in the spatial design (coastal protection alongside industry, tourism, nature, recreation, and other activities).

  20. The development of a fear of falling interdisciplinary intervention program

    PubMed Central

    Gomez, Fernando; Curcio, Carmen-Lucia

    2007-01-01

    Objective: To describe the development process of a protocol for an interdisciplinary fear-of-falling intervention program based on the main factors associated with fear of falling. Design/methods: The process of developing the protocol consisted of defining the target population, selecting the initial assessment components, and adapting the intervention program based on findings about fear of falling and restriction of activities in this population. Settings: A university-affiliated outpatient vertigo, dizziness, and falls clinic in the coffee-growing zone of the Colombian Andes. Results: An intervention program was developed based on three main conceptual models of falling: a medical intervention based on a biomedical and pathophysiological model, a physiotherapeutic intervention based on a postural-control model, and a psychological intervention based on a biological-behavioral model. Conclusion: The interdisciplinary fear-of-falling intervention program developed is based on the particular characteristics of the target population, with differences in the inclusion criteria and the program intervention components, and with emphasis on medical (evaluation and management of recurrent falls and dizziness), psychological (cognitive-behavioral therapy), and physiotherapeutic (balance and transfer training) components. PMID:18225468
