Sample records for information processing protocols

  1. 77 FR 36281 - Solicitation of Information and Recommendations for Revising OIG's Provider Self-Disclosure Protocol

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-18

[...] Solicitation of Information and Recommendations for Revising OIG's Provider Self-Disclosure Protocol AGENCY... Register notice informs the public that OIG: (1) Intends to update the Provider Self-Disclosure Protocol... Provider Self-Disclosure Protocol (the Protocol) to establish a process for health care providers to...

  2. Evaluation of Patient Handoff Methods on an Inpatient Teaching Service

    PubMed Central

    Craig, Steven R.; Smith, Hayden L.; Downen, A. Matthew; Yost, W. John

    2012-01-01

    Background The patient handoff process can be a highly variable and unstructured period at risk for communication errors. The morning sign-in process used by resident physicians at teaching hospitals typically involves less rigorous handoff protocols than the resident evening sign-out process. Little research has been conducted on best practices for handoffs during morning sign-in exchanges between resident physicians. Research must evaluate optimal protocols for the resident morning sign-in process. Methods Three morning handoff protocols consisting of written, electronic, and face-to-face methods were implemented over 3 study phases during an academic year. Study participants included all interns covering the internal medicine inpatient teaching service at a tertiary hospital. Study measures entailed intern survey-based interviews analyzed for failures in handoff protocols with or without missed pertinent information. Descriptive and comparative analyses examined study phase differences. Results A scheduled face-to-face handoff process had the fewest protocol deviations and demonstrated best communication of essential patient care information between cross-covering teams compared to written and electronic sign-in protocols. Conclusion Intern patient handoffs were more reliable when the sign-in protocol included scheduled face-to-face meetings. This method provided the best communication of patient care information and allowed for open exchanges of information. PMID:23267259

  3. DARPA Internet Program. Internet and Transmission Control Specifications.

    DTIC Science & Technology

    1981-09-01

Internet Program Protocol Specification", RFC 791, USC/Information Sciences Institute, September 1981. [34] Postel, J., ed., "Transmission Control Protocol...DARPA Internet Program Protocol Specification", RFC 793, USC/Information Sciences Institute, September 1981. [35] Postel, J., "Echo Process", RFC 347...Newman, March 1981. [53] Postel, J., "Internet Control Message Protocol - DARPA Internet Program Protocol Specification", RFC 792, USC/Information
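The RFCs cited in this record define concrete wire formats that are easy to exercise directly. As a minimal, self-contained illustration (not part of the record itself), the sketch below packs and checksums an IPv4 header per RFC 791 using only the Python standard library:

```python
import struct

def ipv4_checksum(header: bytes) -> int:
    """One's-complement sum of 16-bit words, as specified in RFC 791."""
    if len(header) % 2:
        header += b"\x00"
    total = sum(struct.unpack("!%dH" % (len(header) // 2), header))
    while total >> 16:                       # fold carries back in
        total = (total & 0xFFFF) + (total >> 16)
    return ~total & 0xFFFF

def build_ipv4_header(src: str, dst: str, payload_len: int, proto: int = 6) -> bytes:
    """Minimal 20-byte IPv4 header; proto=6 is TCP (RFC 793)."""
    ver_ihl = (4 << 4) | 5                   # version 4, header length 5 words
    total_len = 20 + payload_len

    def ip2bytes(ip: str) -> bytes:
        return bytes(int(part) for part in ip.split("."))

    # Pack with a zero checksum field, then splice the real checksum in.
    hdr = struct.pack("!BBHHHBBH4s4s", ver_ihl, 0, total_len, 0, 0,
                      64, proto, 0, ip2bytes(src), ip2bytes(dst))
    csum = ipv4_checksum(hdr)
    return hdr[:10] + struct.pack("!H", csum) + hdr[12:]

hdr = build_ipv4_header("10.0.0.1", "10.0.0.2", payload_len=100)
assert ipv4_checksum(hdr) == 0   # a correctly checksummed header re-sums to zero
```

A header whose checksum field is filled in correctly re-checksums to zero, which is the receiver-side validity test the RFC describes.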

  4. Automated monitoring of medical protocols: a secure and distributed architecture.

    PubMed

    Alsinet, T; Ansótegui, C; Béjar, R; Fernández, C; Manyà, F

    2003-03-01

The control of the correct application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully or semi-autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, and so specialized domain agents are independent of negotiation processes and autonomous system agents perform monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, that is responsible for obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity and authentication during the process of exchanging information between agents.

  5. Error characterization and quantum control benchmarking in liquid state NMR using quantum information processing techniques

    NASA Astrophysics Data System (ADS)

    Laforest, Martin

Quantum information processing has been the subject of countless discoveries since the early 1990s. It is believed to be the way of the future for computation: using quantum systems permits one to perform computation exponentially faster than on a regular classical computer. Unfortunately, quantum systems that are not isolated do not behave well. They tend to lose their quantum nature due to the presence of the environment. If key information is known about the noise present in the system, methods such as quantum error correction have been developed in order to reduce the errors introduced by the environment during a given quantum computation. In order to harness the quantum world and implement the theoretical ideas of quantum information processing and quantum error correction, it is imperative to understand and quantify the noise present in the quantum processor and benchmark the quality of the control over the qubits. Usual techniques to estimate the noise or the control are based on quantum process tomography (QPT), which, unfortunately, demands an exponential amount of resources. This thesis presents work towards the characterization of noisy processes in an efficient manner. The protocols are developed from a purely abstract setting with no system-dependent variables. To circumvent the exponential nature of quantum process tomography, three different efficient protocols are proposed and experimentally verified. The first protocol uses the idea of quantum error correction to extract relevant parameters about a given noise model, namely the correlation between the dephasing of two qubits. Following that is a protocol using randomization and symmetrization to extract the probability that a given number of qubits are simultaneously corrupted in a quantum memory, regardless of the specifics of the error and which qubits are affected. 
Finally, a last protocol, still using randomization ideas, is developed to estimate the average fidelity per computational gates for single and multi qubit systems. Even though liquid state NMR is argued to be unsuitable for scalable quantum information processing, it remains the best test-bed system to experimentally implement, verify and develop protocols aimed at increasing the control over general quantum information processors. For this reason, all the protocols described in this thesis have been implemented in liquid state NMR, which then led to further development of control and analysis techniques.
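For context, the average gate fidelity that such randomized protocols estimate is related to the entanglement fidelity $F_e$ of a channel $\mathcal{E}$ acting on a $d$-dimensional system by a standard identity (stated here for orientation, not quoted from the thesis):

```latex
\bar{F}(\mathcal{E}) \;=\; \frac{d\,F_{e}(\mathcal{E}) + 1}{d + 1}
```

Randomized benchmarking extracts an average of this quantity over a gate set without performing full process tomography, which is what makes it efficient.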

  6. Implementation of Quantum Private Queries Using Nuclear Magnetic Resonance

    NASA Astrophysics Data System (ADS)

    Wang, Chuan; Hao, Liang; Zhao, Lian-Jie

    2011-08-01

We present a modified protocol for the realization of a quantum private query process on a classical database. Using a one-qubit query and a CNOT operation, the query process can be realized in a two-mode database. In the query process, data privacy is preserved: the database reveals no information beyond the user's queried item, and the database provider cannot retain any information about the query. We implement the quantum private query protocol in a nuclear magnetic resonance system. The density matrices of the memory registers are constructed.
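The one-qubit query plus CNOT step described in the abstract can be sketched numerically. The following is an illustrative NumPy simulation of the gate's action on basis states, not the authors' NMR implementation:

```python
import numpy as np

# Computational basis states for a single qubit
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# CNOT on two qubits: flips the target (second) qubit iff the control (first) is |1>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def query(control, target):
    """Apply CNOT to the joint state |control>|target>."""
    return CNOT @ np.kron(control, target)

# A |1> query flips the database qubit; a |0> query leaves it unchanged
assert np.allclose(query(ket1, ket0), np.kron(ket1, ket1))
assert np.allclose(query(ket0, ket0), np.kron(ket0, ket0))
```

In the actual protocol the query register is prepared in a superposition, which is what prevents the database from learning which item was requested.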

  7. Optimal protocol for maximum work extraction in a feedback process with a time-varying potential

    NASA Astrophysics Data System (ADS)

    Kwon, Chulan

    2017-12-01

    The nonequilibrium nature of information thermodynamics is characterized by the inequality or non-negativity of the total entropy change of the system, memory, and reservoir. Mutual information change plays a crucial role in the inequality, in particular if work is extracted and the paradox of Maxwell's demon is raised. We consider the Brownian information engine where the protocol set of the harmonic potential is initially chosen by the measurement and varies in time. We confirm the inequality of the total entropy change by calculating, in detail, the entropic terms including the mutual information change. We rigorously find the optimal values of the time-dependent protocol for maximum extraction of work both for the finite-time and the quasi-static process.
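The inequality the abstract confirms is usually written in the following standard form from information thermodynamics (notation assumed here, not quoted from the paper), bounding the extracted work by the free-energy change plus the mutual information gained by the measurement:

```latex
\langle W_{\mathrm{ext}} \rangle \;\le\; -\,\Delta F \;+\; k_{B} T \,\langle I \rangle
```

Equality is approached by the optimal feedback protocol in the quasi-static limit, which is the regime the paper analyzes explicitly for the time-varying harmonic potential.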

  8. Protocol Analysis: A Methodology for Exploring the Information Processing of Gifted Students.

    ERIC Educational Resources Information Center

    Anderson, Margaret A.

    1986-01-01

    Protocol analysis techniques, in which subjects are taught to think aloud, can provide information on the mental operations used by gifted learners. Concerns over the use of such data are described and new directions for the technique are proposed. (CL)

  9. Representing the work of medical protocols for organizational simulation.

    PubMed Central

    Fridsma, D. B.

    1998-01-01

Developing and implementing patient care protocols within a specific organizational setting requires knowledge of the protocol, the organization, and the way in which the organization does its work. Computer-based simulation tools have been used in many industries to provide managers with prospective insight into problems of work process and organization design mismatch. Many of these simulation tools are designed for well-understood routine work processes in which there are few contingent tasks. In this paper, we describe theoretical extensions that make it possible to simulate medical protocols using an information-processing theory framework. These simulations will allow medical administrators to test different protocol and organizational designs before actually using them within a particular clinical setting. PMID:9929231

  10. Application Transparent HTTP Over a Disruption Tolerant Smartnet

    DTIC Science & Technology

    2014-09-01

American Standard Code for Information Interchange BP Bundle Protocol BPA bundle protocol agent CLA convergence layer adapters CPU central processing...forwarding them through the plugin pipeline. The initial version of the DTNInput plugin uses the BBN Spindle bundle protocol agent (BPA) implementation

  11. A New Cellular Architecture for Information Retrieval from Sensor Networks through Embedded Service and Security Protocols

    PubMed Central

    Shahzad, Aamir; Landry, René; Lee, Malrey; Xiong, Naixue; Lee, Jongho; Lee, Changhoon

    2016-01-01

Substantial changes have occurred in the Information Technology (IT) sectors and with these changes, the demand for remote access to field sensor information has increased. This allows visualization, monitoring, and control through various electronic devices, such as laptops, tablets, iPads, PCs, and cellular phones. The smart phone is considered a more reliable, faster and more efficient device to access and monitor industrial systems and their corresponding information interfaces anywhere and anytime. This study describes the deployment of a protocol whereby industrial system information can be securely accessed by cellular phones via a Supervisory Control And Data Acquisition (SCADA) server. To achieve the study goals, proprietary protocol interconnectivity with non-proprietary protocols and the usage of interconnectivity services are considered in detail. They support the visualization of the SCADA system information, and the related operations through smart phones. The intelligent sensors are configured and designated to process real information via cellular phones by employing information exchange services between the proprietary protocol and non-proprietary protocols. SCADA cellular access raises the issue of security flaws. For these challenges, a cryptography-based security method is considered and deployed, and it could be considered as a part of a proprietary protocol. Subsequently, transmission flows from the smart phones through a cellular network. PMID:27314351
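The paper's cryptography-based security method is proprietary and not detailed in the abstract. As a generic stand-in, the sketch below shows how a shared-key message authentication code (HMAC-SHA256, Python standard library) lets a SCADA server detect tampering with sensor readings in transit; the field names and values are invented for illustration:

```python
import hmac, hashlib, os, json

def protect(key: bytes, reading: dict) -> dict:
    """Tag a sensor reading so the receiver can detect tampering in transit."""
    payload = json.dumps(reading, sort_keys=True).encode()
    nonce = os.urandom(16)                         # freshness against replay
    tag = hmac.new(key, nonce + payload, hashlib.sha256).hexdigest()
    return {"nonce": nonce.hex(), "payload": payload.decode(), "tag": tag}

def verify(key: bytes, msg: dict) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(key, bytes.fromhex(msg["nonce"]) + msg["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, msg["tag"])

key = os.urandom(32)
msg = protect(key, {"sensor": "pump-7", "value": 3.2})
assert verify(key, msg)

# Any modification to the payload invalidates the tag
tampered = dict(msg, payload=msg["payload"].replace("3.2", "9.9"))
assert not verify(key, tampered)
```

An integrity tag alone provides authentication, not confidentiality; a deployed SCADA channel would also encrypt the payload.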

  12. A New Cellular Architecture for Information Retrieval from Sensor Networks through Embedded Service and Security Protocols.

    PubMed

    Shahzad, Aamir; Landry, René; Lee, Malrey; Xiong, Naixue; Lee, Jongho; Lee, Changhoon

    2016-06-14

Substantial changes have occurred in the Information Technology (IT) sectors and with these changes, the demand for remote access to field sensor information has increased. This allows visualization, monitoring, and control through various electronic devices, such as laptops, tablets, iPads, PCs, and cellular phones. The smart phone is considered a more reliable, faster and more efficient device to access and monitor industrial systems and their corresponding information interfaces anywhere and anytime. This study describes the deployment of a protocol whereby industrial system information can be securely accessed by cellular phones via a Supervisory Control And Data Acquisition (SCADA) server. To achieve the study goals, proprietary protocol interconnectivity with non-proprietary protocols and the usage of interconnectivity services are considered in detail. They support the visualization of the SCADA system information, and the related operations through smart phones. The intelligent sensors are configured and designated to process real information via cellular phones by employing information exchange services between the proprietary protocol and non-proprietary protocols. SCADA cellular access raises the issue of security flaws. For these challenges, a cryptography-based security method is considered and deployed, and it could be considered as a part of a proprietary protocol. Subsequently, transmission flows from the smart phones through a cellular network.

  13. 34 CFR Appendix to Part 5 - Unknown Title

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the Department. Research protocol, design, processing, and other technical information to the extent... report submitted for comment prior to acceptance. Research protocol, design, processing, and other...-10) Pt. 5, App. Appendix to Part 5 [The following are some examples of specific records (or specific...

  14. Gaussian error correction of quantum states in a correlated noisy channel.

    PubMed

    Lassen, Mikael; Berni, Adriano; Madsen, Lars S; Filip, Radim; Andersen, Ulrik L

    2013-11-01

    Noise is the main obstacle for the realization of fault-tolerant quantum information processing and secure communication over long distances. In this work, we propose a communication protocol relying on simple linear optics that optimally protects quantum states from non-Markovian or correlated noise. We implement the protocol experimentally and demonstrate the near-ideal protection of coherent and entangled states in an extremely noisy channel. Since all real-life channels are exhibiting pronounced non-Markovian behavior, the proposed protocol will have immediate implications in improving the performance of various quantum information protocols.

  15. Advanced medical imaging protocol workflow-a flexible electronic solution to optimize process efficiency, care quality and patient safety in the National VA Enterprise.

    PubMed

    Medverd, Jonathan R; Cross, Nathan M; Font, Frank; Casertano, Andrew

    2013-08-01

    Radiologists routinely make decisions with only limited information when assigning protocol instructions for the performance of advanced medical imaging examinations. Opportunity exists to simultaneously improve the safety, quality and efficiency of this workflow through the application of an electronic solution leveraging health system resources to provide concise, tailored information and decision support in real-time. Such a system has been developed using an open source, open standards design for use within the Veterans Health Administration. The Radiology Protocol Tool Recorder (RAPTOR) project identified key process attributes as well as inherent weaknesses of paper processes and electronic emulators of paper processes to guide the development of its optimized electronic solution. The design provides a kernel that can be expanded to create an integrated radiology environment. RAPTOR has implications relevant to the greater health care community, and serves as a case model for modernization of legacy government health information systems.

  16. General A Scheme to Share Information via Employing Discrete Algorithm to Quantum States

    NASA Astrophysics Data System (ADS)

    Kang, Guo-Dong; Fang, Mao-Fa

    2011-02-01

We propose a protocol for information sharing between two legitimate parties (Bob and Alice) via public-key cryptography. In particular, we specialize the protocol by employing a discrete algorithm under mod that maps integers to quantum states via photon rotations. Based on this algorithm, we find that the protocol is secure under various classes of attacks. Specifically, owing to the algorithm, the security of the classical privacy contained in the quantum public key and the corresponding ciphertext is guaranteed. The protocol is also robust against the impersonation attack and the active wiretapping attack by the design of a particular checking process; thus the protocol is valid.
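The abstract's mapping of integers to photon rotations via a modular-exponentiation ("discrete algorithm under mod") step can be sketched classically. The parameters below are toy values chosen for illustration, and the quantum encoding itself is not modeled:

```python
import math

def rotation_angle(g: int, secret: int, m: int) -> float:
    """Map the integer g**secret mod m to a photon-rotation angle in [0, 2*pi).

    Computing the angle from `secret` is easy; recovering `secret` from the
    angle is a discrete-logarithm problem, which is what the scheme's
    classical privacy rests on. Toy parameters, for illustration only.
    """
    value = pow(g, secret, m)          # fast modular exponentiation
    return 2.0 * math.pi * value / m   # hard to invert for large m

theta = rotation_angle(g=5, secret=12, m=97)
assert 0.0 <= theta < 2.0 * math.pi
```

For realistic security the modulus would be a large prime (hundreds of digits), not the toy value used here.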

  17. Subconscious Visual Cues during Movement Execution Allow Correct Online Choice Reactions

    PubMed Central

    Leukel, Christian; Lundbye-Jensen, Jesper; Christensen, Mark Schram; Gollhofer, Albert; Nielsen, Jens Bo; Taube, Wolfgang

    2012-01-01

    Part of the sensory information is processed by our central nervous system without conscious perception. Subconscious processing has been shown to be capable of triggering motor reactions. In the present study, we asked the question whether visual information, which is not consciously perceived, could influence decision-making in a choice reaction task. Ten healthy subjects (28±5 years) executed two different experimental protocols. In the Motor reaction protocol, a visual target cue was shown on a computer screen. Depending on the displayed cue, subjects had to either complete a reaching movement (go-condition) or had to abort the movement (stop-condition). The cue was presented with different display durations (20–160 ms). In the second Verbalization protocol, subjects verbalized what they experienced on the screen. Again, the cue was presented with different display durations. This second protocol tested for conscious perception of the visual cue. The results of this study show that subjects achieved significantly more correct responses in the Motor reaction protocol than in the Verbalization protocol. This difference was only observed at the very short display durations of the visual cue. Since correct responses in the Verbalization protocol required conscious perception of the visual information, our findings imply that the subjects performed correct motor responses to visual cues, which they were not conscious about. It is therefore concluded that humans may reach decisions based on subconscious visual information in a choice reaction task. PMID:23049749

  18. EXACT2: the semantics of biomedical protocols

    PubMed Central

    2014-01-01

Background The reliability and reproducibility of experimental procedures is a cornerstone of scientific practice. There is a pressing technological need for the better representation of biomedical protocols to enable other agents (human or machine) to better reproduce results. A framework that ensures that all information required for the replication of experimental protocols is recorded is essential to achieving reproducibility. Methods We have developed the ontology EXACT2 (EXperimental ACTions) that is designed to capture the full semantics of biomedical protocols required for their reproducibility. To construct EXACT2 we manually inspected hundreds of published and commercial biomedical protocols from several areas of biomedicine. After establishing a clear pattern for extracting the required information we utilized text-mining tools to translate the protocols into a machine amenable format. We have verified the utility of EXACT2 through the successful processing of previously 'unseen' (not used for the construction of EXACT2) protocols. Results The paper reports on a fundamentally new version of EXACT2 that supports the semantically-defined representation of biomedical protocols. The ability of EXACT2 to capture the semantics of biomedical procedures was verified through a text mining use case. In this use case, EXACT2 serves as a reference model for text mining tools to identify terms pertinent to experimental actions, and their properties, in biomedical protocols expressed in natural language. An EXACT2-based framework for the translation of biomedical protocols to a machine amenable format is proposed. Conclusions The EXACT2 ontology is sufficient to record, in a machine processable form, the essential information about biomedical protocols. EXACT2 defines explicit semantics of experimental actions, and can be used by various computer applications. 
It can serve as a reference model for the translation of biomedical protocols in natural language into a semantically-defined format. PMID:25472549

  19. Wireless access to a pharmaceutical database: a demonstrator for data driven Wireless Application Protocol (WAP) applications in medical information processing.

    PubMed

    Schacht Hansen, M; Dørup, J

    2001-01-01

    The Wireless Application Protocol technology implemented in newer mobile phones has built-in facilities for handling much of the information processing needed in clinical work. To test a practical approach we ported a relational database of the Danish pharmaceutical catalogue to Wireless Application Protocol using open source freeware at all steps. We used Apache 1.3 web software on a Linux server. Data containing the Danish pharmaceutical catalogue were imported from an ASCII file into a MySQL 3.22.32 database using a Practical Extraction and Report Language script for easy update of the database. Data were distributed in 35 interrelated tables. Each pharmaceutical brand name was given its own card with links to general information about the drug, active substances, contraindications etc. Access was available through 1) browsing therapeutic groups and 2) searching for a brand name. The database interface was programmed in the server-side scripting language PHP3. A free, open source Wireless Application Protocol gateway to a pharmaceutical catalogue was established to allow dial-in access independent of commercial Wireless Application Protocol service providers. The application was tested on the Nokia 7110 and Ericsson R320s cellular phones. We have demonstrated that Wireless Application Protocol-based access to a dynamic clinical database can be established using open source freeware. The project opens perspectives for a further integration of Wireless Application Protocol phone functions in clinical information processing: Global System for Mobile communication telephony for bilateral communication, asynchronous unilateral communication via e-mail and Short Message Service, built-in calculator, calendar, personal organizer, phone number catalogue and Dictaphone function via answering machine technology. An independent Wireless Application Protocol gateway may be placed within hospital firewalls, which may be an advantage with respect to security. 
However, if Wireless Application Protocol phones are to become effective tools for physicians, special attention must be paid to the limitations of the devices. Input tools of Wireless Application Protocol phones should be improved, for instance by increased use of speech control.
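The two access paths described in this record (browsing therapeutic groups and searching by brand name) can be sketched against a toy relational schema. The following uses SQLite in place of the project's 35-table MySQL database; the table names and brand names are invented for illustration:

```python
import sqlite3

# In-memory stand-in for the catalogue's relational tables (schema illustrative)
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE therapeutic_group (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE brand (id INTEGER PRIMARY KEY, name TEXT, group_id INTEGER);
    INSERT INTO therapeutic_group VALUES (1, 'Analgesics');
    INSERT INTO brand VALUES (1, 'Panodil', 1), (2, 'Ipren', 1);
""")

def search_brand(prefix: str):
    """Brand-name prefix search, the second access path described above."""
    rows = db.execute(
        "SELECT b.name, g.name FROM brand b "
        "JOIN therapeutic_group g ON b.group_id = g.id "
        "WHERE b.name LIKE ? ORDER BY b.name", (prefix + "%",))
    return rows.fetchall()

assert search_brand("Pan") == [("Panodil", "Analgesics")]
```

In the original system, a PHP3 layer rendered each result row as a WAP card with links to the drug's general information, active substances and contraindications.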

  20. Wireless access to a pharmaceutical database: A demonstrator for data driven Wireless Application Protocol applications in medical information processing

    PubMed Central

    Hansen, Michael Schacht

    2001-01-01

    Background The Wireless Application Protocol technology implemented in newer mobile phones has built-in facilities for handling much of the information processing needed in clinical work. Objectives To test a practical approach we ported a relational database of the Danish pharmaceutical catalogue to Wireless Application Protocol using open source freeware at all steps. Methods We used Apache 1.3 web software on a Linux server. Data containing the Danish pharmaceutical catalogue were imported from an ASCII file into a MySQL 3.22.32 database using a Practical Extraction and Report Language script for easy update of the database. Data were distributed in 35 interrelated tables. Each pharmaceutical brand name was given its own card with links to general information about the drug, active substances, contraindications etc. Access was available through 1) browsing therapeutic groups and 2) searching for a brand name. The database interface was programmed in the server-side scripting language PHP3. Results A free, open source Wireless Application Protocol gateway to a pharmaceutical catalogue was established to allow dial-in access independent of commercial Wireless Application Protocol service providers. The application was tested on the Nokia 7110 and Ericsson R320s cellular phones. Conclusions We have demonstrated that Wireless Application Protocol-based access to a dynamic clinical database can be established using open source freeware. The project opens perspectives for a further integration of Wireless Application Protocol phone functions in clinical information processing: Global System for Mobile communication telephony for bilateral communication, asynchronous unilateral communication via e-mail and Short Message Service, built-in calculator, calendar, personal organizer, phone number catalogue and Dictaphone function via answering machine technology. 
An independent Wireless Application Protocol gateway may be placed within hospital firewalls, which may be an advantage with respect to security. However, if Wireless Application Protocol phones are to become effective tools for physicians, special attention must be paid to the limitations of the devices. Input tools of Wireless Application Protocol phones should be improved, for instance by increased use of speech control. PMID:11720946

  1. A Family of Quantum Protocols

    NASA Astrophysics Data System (ADS)

    Devetak, Igor; Harrow, Aram W.; Winter, Andreas

    2004-12-01

    We introduce three new quantum protocols involving noisy quantum channels and entangled states, and relate them operationally and conceptually with four well-known old protocols. Two of the new protocols (the mother and father) can generate the other five “child” protocols by direct application of teleportation and superdense coding, and can be derived in turn by making the old protocols “coherent.” This gives very simple proofs for two famous old protocols (the hashing inequality and quantum channel capacity) and provides the basis for optimal trade-off curves in several quantum information processing tasks.
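In the resource-inequality notation standard in this literature (with $[qq]$ denoting an ebit, $[q\to q]$ one use of a noiseless qubit channel, and $I(A;B)$ the quantum mutual information), the mother and father protocols are commonly written as follows (stated here for context, not quoted from the paper):

```latex
\langle \rho^{AB} \rangle + \tfrac{1}{2} I(A;E)\,[q\to q] \;\ge\; \tfrac{1}{2} I(A;B)\,[qq]
\qquad\text{(mother)}
```
```latex
\langle \mathcal{N} \rangle + \tfrac{1}{2} I(A;E)\,[qq] \;\ge\; \tfrac{1}{2} I(A;B)\,[q\to q]
\qquad\text{(father)}
```

Appending teleportation or superdense coding to these two inequalities yields the five child protocols mentioned in the abstract.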

  2. Bayesian adaptive survey protocols for resource management

    USGS Publications Warehouse

    Halstead, Brian J.; Wylie, Glenn D.; Coates, Peter S.; Casazza, Michael L.

    2011-01-01

    Transparency in resource management decisions requires a proper accounting of uncertainty at multiple stages of the decision-making process. As information becomes available, periodic review and updating of resource management protocols reduces uncertainty and improves management decisions. One of the most basic steps to mitigating anthropogenic effects on populations is determining if a population of a species occurs in an area that will be affected by human activity. Species are rarely detected with certainty, however, and falsely declaring a species absent can cause improper conservation decisions or even extirpation of populations. We propose a method to design survey protocols for imperfectly detected species that accounts for multiple sources of uncertainty in the detection process, is capable of quantitatively incorporating expert opinion into the decision-making process, allows periodic updates to the protocol, and permits resource managers to weigh the severity of consequences if the species is falsely declared absent. We developed our method using the giant gartersnake (Thamnophis gigas), a threatened species precinctive to the Central Valley of California, as a case study. Survey date was negatively related to the probability of detecting the giant gartersnake, and water temperature was positively related to the probability of detecting the giant gartersnake at a sampled location. Reporting sampling effort, timing and duration of surveys, and water temperatures would allow resource managers to evaluate the probability that the giant gartersnake occurs at sampled sites where it is not detected. This information would also allow periodic updates and quantitative evaluation of changes to the giant gartersnake survey protocol. 
Because it naturally allows multiple sources of information and is predicated upon the idea of updating information, Bayesian analysis is well-suited to solving the problem of developing efficient sampling protocols for species of conservation concern.
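The core calculation such a survey protocol supports, the posterior probability that a species is present given repeated non-detections, follows directly from Bayes' rule. The sketch below uses an illustrative prior and per-survey detection probability, not values estimated in the study:

```python
def posterior_occupancy(psi: float, p: float, n_surveys: int) -> float:
    """Posterior probability the species is present after n_surveys surveys
    with no detections, given prior occupancy psi and per-survey detection
    probability p (assumed constant here; the paper models p from covariates
    such as survey date and water temperature)."""
    missed = psi * (1.0 - p) ** n_surveys   # present but never detected
    absent = 1.0 - psi                      # truly absent
    return missed / (missed + absent)

# With a 0.5 prior and 30% detection per survey, five empty surveys still
# leave a non-trivial chance the snake is present at the site.
post = posterior_occupancy(psi=0.5, p=0.3, n_surveys=5)
assert 0.0 < post < 0.5
```

A resource manager can invert this calculation to choose the number of surveys needed before the risk of falsely declaring the species absent drops below an acceptable threshold.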

  3. Investigating nurse practitioners in the private sector: a theoretically informed research protocol.

    PubMed

    Adams, Margaret; Gardner, Glenn; Yates, Patsy

    2017-06-01

To report a study protocol and the theoretical framework, normalisation process theory, that informs this protocol for a case study investigation of private sector nurse practitioners. Most research evaluating nurse practitioner service is focused on public, mainly acute care environments where nurse practitioner service is well established with strong structures for governance and sustainability. Conversely, there is a lack of clarity in governance for emerging models in the private sector. In a climate of healthcare reform, nurse practitioner service is extending beyond the familiar public health sector. Further research is required to inform knowledge of the practice, operational framework and governance of new nurse practitioner models. The proposed research will use a multiple exploratory case study design to examine private sector nurse practitioner service. Data collection includes interviews, surveys and audits. A sequential mixed method approach to analysis of each case will be conducted. Findings from within-case analysis will lead to a meta-synthesis across all four cases to gain a holistic understanding of the cases under study, private sector nurse practitioner service. Normalisation process theory will be used to guide the research process, specifically coding and analysis of data using theory constructs and the relevant components associated with those constructs. This article provides a blueprint for the research and describes a theoretical framework, normalisation process theory, in terms of its flexibility as an analytical framework. Consistent with the goals of best research practice, this study protocol will inform the research community in the field of primary health care about emerging research in this field. Publishing a study protocol ensures researcher fidelity to the analysis plan and supports research collaboration across teams. © 2016 John Wiley & Sons Ltd.

  4. IRB Process Improvements: A Machine Learning Analysis.

    PubMed

    Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A

    2017-06-01

Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on initial identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process, including the type of IRB review to be conducted, whether a protocol falls under Veterans Administration purview and the specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
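The single-variable analysis the abstract describes amounts to comparing mean processing times across categories of each candidate predictor. A minimal sketch with invented submission records (fields and values are illustrative, not the study's data):

```python
from statistics import mean
from collections import defaultdict

# Toy protocol-submission records (illustrative only)
submissions = [
    {"review_type": "full_board", "va": True,  "days": 62},
    {"review_type": "full_board", "va": False, "days": 45},
    {"review_type": "expedited",  "va": False, "days": 14},
    {"review_type": "expedited",  "va": True,  "days": 21},
    {"review_type": "exempt",     "va": False, "days": 7},
]

def mean_days_by(field: str) -> dict:
    """Single-variable analysis: mean processing time per category of `field`."""
    groups = defaultdict(list)
    for s in submissions:
        groups[s[field]].append(s["days"])
    return {category: mean(times) for category, times in groups.items()}

by_type = mean_days_by("review_type")
assert by_type["full_board"] > by_type["expedited"] > by_type["exempt"]
```

The study's machine learning step would then combine such predictors into a model of processing time, rather than examining them one at a time.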

  5. Gunslinger Effect and Müller-Lyer Illusion: Examining Early Visual Information Processing for Late Limb-Target Control.

    PubMed

    Roberts, James W; Lyons, James; Garcia, Daniel B L; Burgess, Raquel; Elliott, Digby

    2017-07-01

    The multiple process model contends that there are two forms of online control for manual aiming: impulse regulation and limb-target control. This study examined the impact of visual information processing on limb-target control. We amalgamated the Gunslinger protocol (i.e., faster movements following a reaction to an external trigger compared with the spontaneous initiation of movement) and Müller-Lyer target configurations into the same aiming protocol. The results showed that the Gunslinger effect was isolated to the early portions of the movement (peak acceleration and peak velocity). Reacted aims reached a longer displacement at peak deceleration, but there were no differences at movement termination. The target configurations manifested terminal biases consistent with the illusion. We suggest the visual information processing demands imposed by reacted aims can be accommodated by integrating early feedforward information for limb-target control.

  6. Quantum Tomography Protocols with Positivity are Compressed Sensing Protocols (Open Access)

    DTIC Science & Technology

    2015-12-08

    Amir Kalev, Robert L Kosut and Ivan H Deutsch. Characterising complex quantum systems is a vital task in quantum information science. Quantum tomography, the standard tool used for this purpose, uses a well-designed measurement record to reconstruct quantum states and processes. It is, however, notoriously inefficient. Recently, the classical signal…

  7. Protocol-based care: the standardisation of decision-making?

    PubMed

    Rycroft-Malone, Jo; Fontenla, Marina; Seers, Kate; Bick, Debra

    2009-05-01

    To explore how protocol-based care affects clinical decision-making. In the context of evidence-based practice, protocol-based care is a mechanism for facilitating the standardisation of care and streamlining decision-making through rationalising the information with which to make judgements and ultimately decisions. However, whether protocol-based care does, in the reality of practice, standardise decision-making is unknown. This paper reports on a study that explored the impact of protocol-based care on nurses' decision-making. Theoretically informed by realistic evaluation and the promoting action on research implementation in health services framework, a case study design using ethnographic methods was used. Two sites were purposively sampled: a diabetic and endocrine unit and a cardiac medical unit. Within each site, data collection included observation, postobservation semi-structured interviews with staff and patients, field notes, feedback sessions and document review. Data were inductively and thematically analysed. Decisions made by nurses in both sites varied according to many different and interacting factors. While several standardised care approaches were available for use, in reality, a variety of information sources informed decision-making. The primary approach to knowledge exchange and acquisition was person-to-person; decision-making was a social activity. Rarely were standardised care approaches obviously referred to; nurses described following a mental flowchart, not necessarily linked to a particular guideline or protocol. When standardised care approaches were used, it was reported that they were used flexibly and particularised. While the logic of protocol-based care is algorithmic, in the reality of clinical practice, other sources of information supported nurses' decision-making process. This has significant implications for the political goal of standardisation. The successful implementation and judicious use of tools such as protocols and guidelines will likely be dependent on approaches that facilitate the development of nurses' decision-making processes in parallel to paying attention to the influence of context.

  8. An electronic specimen collection protocol schema (eSCPS). Document architecture for specimen management and the exchange of specimen collection protocols between biobanking information systems.

    PubMed

    Eminaga, O; Semjonow, A; Oezguer, E; Herden, J; Akbarov, I; Tok, A; Engelmann, U; Wille, S

    2014-01-01

    The integrity of collection protocols in biobanking is essential for a high-quality sample preparation process. However, there is not currently a well-defined universal method for integrating collection protocols in the biobanking information system (BIMS). Therefore, an electronic schema of the collection protocol that is based on Extensible Markup Language (XML) is required to maintain the integrity and enable the exchange of collection protocols. The development and implementation of an electronic specimen collection protocol schema (eSCPS) was performed at two institutions (Muenster and Cologne) in three stages. First, we analyzed the infrastructure that was already established at both the biorepository and the hospital information systems of these institutions and determined the requirements for the sufficient preparation of specimens and documentation. Second, we designed an eSCPS according to these requirements. Finally, a prospective study was conducted to implement and evaluate the novel schema in the current BIMS. We designed an eSCPS that provides all of the relevant information about collection protocols. Ten electronic collection protocols were generated using the supplementary Protocol Editor tool, and these protocols were successfully implemented in the existing BIMS. Moreover, an electronic list of collection protocols for the current studies being performed at each institution was included, new collection protocols were added, and the existing protocols were redesigned to be modifiable. The documentation time was significantly reduced after implementing the eSCPS (5 ± 2 min vs. 7 ± 3 min; p = 0.0002). The eSCPS improves the integrity and facilitates the exchange of specimen collection protocols in the existing open-source BIMS.
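
    To make the idea of an XML-based collection protocol document concrete, the sketch below parses a minimal, invented example with Python's standard library. The element and attribute names are hypothetical illustrations, not the actual eSCPS schema.

```python
# Hypothetical sketch: an XML collection-protocol document is parsed and the
# specimen requirements extracted. Element/attribute names are invented.
import xml.etree.ElementTree as ET

doc = """<collectionProtocol id="CP-01" study="ProstateBiobank">
  <specimen type="serum" aliquots="4" volumeMl="0.5"/>
  <specimen type="tissue" aliquots="2" storage="-80C"/>
</collectionProtocol>"""

root = ET.fromstring(doc)
# Collect (specimen type, aliquot count) pairs in document order.
specimens = [(s.get("type"), int(s.get("aliquots"))) for s in root.iter("specimen")]
print(root.get("id"), specimens)
```

    Because the protocol is a plain XML document, two biobanking information systems can exchange it and validate it against a shared schema, which is the interoperability point the abstract makes.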

  9. Optimal protocols for slowly driven quantum systems.

    PubMed

    Zulkowski, Patrick R; DeWeese, Michael R

    2015-09-01

    The design of efficient quantum information processing will rely on optimal nonequilibrium transitions of driven quantum systems. Building on a recently developed geometric framework for computing optimal protocols for classical systems driven in finite time, we construct a general framework for optimizing the average information entropy for driven quantum systems. Geodesics on the parameter manifold endowed with a positive semidefinite metric correspond to protocols that minimize the average information entropy production in finite time. We use this framework to explicitly compute the optimal entropy production for a simple two-state quantum system coupled to a heat bath of bosonic oscillators, which has applications to quantum annealing.

  10. Protocols for the Investigation of Information Processing in Human Assessment of Fundamental Movement Skills.

    PubMed

    Ward, Brodie J; Thornton, Ashleigh; Lay, Brendan; Rosenberg, Michael

    2017-01-01

    Fundamental movement skill (FMS) assessment remains an important tool in classifying individuals' level of FMS proficiency. The collection of FMS performances for assessment and monitoring has remained unchanged over the last few decades, but new motion capture technologies offer opportunities to automate this process. To achieve this, a greater understanding of the human process of movement skill assessment is required. The authors present the rationale and protocols of a project in which they aim to investigate the visual search patterns and information extraction employed by human assessors during FMS assessment, as well as the implementation of the Kinect system for FMS capture.

  11. Studying frequency processing of the brain to enhance long-term memory and develop a human brain protocol.

    PubMed

    Friedrich, Wernher; Du, Shengzhi; Balt, Karlien

    2015-01-01

    The temporal lobe in conjunction with the hippocampus is responsible for memory processing. The gamma wave is involved with this process. To develop a human brain protocol, a better understanding of the relationship between gamma and long-term memory is vital. A more comprehensive understanding of the human brain and the specific analogue waves it uses will support the development of a human brain protocol. Fifty-eight participants aged between 6 and 60 years participated in long-term memory experiments. It is envisaged that the brain could be stimulated through binaural beats (sound frequency) at 40 Hz (gamma) to enhance long-term memory capacity. EEG recordings were transformed to sound and then to an information standard, namely ASCII. Statistical analysis showed a proportional relationship between long-term memory and gamma activity. Results from EEG recordings indicate a pattern. The pattern was obtained through the de-codification of an EEG recording to sound and then to ASCII. Stimulation of gamma should enhance long-term memory capacity. More research is required to unlock the human brain's protocol key. This key will enable the processing of information directly to and from human memory via gamma, the hippocampus and the temporal lobe.

  12. Growth and Visual Information Processing in Infants in Southern Ethiopia

    ERIC Educational Resources Information Center

    Kennedy, Tay; Thomas, David G.; Woltamo, Tesfaye; Abebe, Yewelsew; Hubbs-Tait, Laura; Sykova, Vladimira; Stoecker, Barbara J.; Hambidge, K. Michael

    2008-01-01

    Speed of information processing and recognition memory can be assessed in infants using a visual information processing (VIP) paradigm. In a sample of 100 infants 6-8 months of age from Southern Ethiopia, we assessed relations between growth and VIP. The 69 infants who completed the VIP protocol had a mean weight z score of -1.12 plus or minus…

  13. A proposed group management scheme for XTP multicast

    NASA Technical Reports Server (NTRS)

    Dempsey, Bert J.; Weaver, Alfred C.

    1990-01-01

    The purpose of a group management scheme is to enable its associated transfer layer protocol to be responsive to user-determined reliability requirements for multicasting. Group management (GM) must assist the client process in coordinating multicast group membership, allow the user to express the subset of the multicast group that a particular multicast distribution must reach in order to be successful (reliable), and provide the transfer layer protocol with the group membership information necessary to guarantee delivery to this subset. GM provides services and mechanisms that respond to the need of the client process or process-level management protocols to coordinate, modify, and determine attributes of the multicast group, especially membership. XTP GM provides a link between process groups and their multicast groups by maintaining a group membership database that identifies members in a name space understood by the underlying transfer layer protocol. Other attributes of the multicast group useful to both the client process and the data transfer protocol may be stored in the database. Examples include the relative dispersion, most recent update, and default delivery parameters of a group.
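
    The group membership database described above can be sketched as a small data structure: it records the members of a multicast group (in the transfer layer's name space), the subset that must be reached for a delivery to count as reliable, and other group attributes. Names and structure below are illustrative only, not the XTP GM design itself.

```python
class MulticastGroup:
    """Illustrative group-management record for one multicast group."""

    def __init__(self, name):
        self.name = name
        self.members = set()    # addresses in the transfer layer's name space
        self.required = set()   # subset that must be reached for success
        self.attributes = {}    # e.g. default delivery parameters

    def join(self, addr):
        self.members.add(addr)

    def require(self, addr):
        """Mark a current member as required for reliable delivery."""
        if addr in self.members:
            self.required.add(addr)

    def delivery_successful(self, acked):
        """A distribution is reliable iff every required member acknowledged."""
        return self.required <= set(acked)

g = MulticastGroup("telemetry")
for a in ("10.0.0.1", "10.0.0.2", "10.0.0.3"):
    g.join(a)
g.require("10.0.0.1")
g.require("10.0.0.2")
print(g.delivery_successful({"10.0.0.1", "10.0.0.2"}))  # → True
print(g.delivery_successful({"10.0.0.1", "10.0.0.3"}))  # → False
```

    The transfer layer protocol would consult such a record to decide whether a multicast distribution has met the user's reliability requirement.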

  14. Preparing Protocols for Institutional Review Boards.

    ERIC Educational Resources Information Center

    Lyons, Charles M.

    1983-01-01

    Introduces the process by which Institutional Review Boards (IRBs) review proposals for research involving human subjects. Describes the composition of IRBs. Presents the Nuremberg code, the elements of informed consent, the judging criteria for proposals, and a sample protocol format. References newly published regulations governing research with…

  15. Design and Verification of a Distributed Communication Protocol

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.

  16. Cryptanalysis and improvement of a quantum communication-based online shopping mechanism

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Yang, Ying-Hui; Jia, Heng-Yue

    2015-06-01

    Recently, Chou et al. (Electron Commer Res 14:349-367, 2014) presented a novel controlled quantum secure direct communication protocol which can be used for online shopping. The authors claimed that their protocol was immune to attacks from both an external eavesdropper and an internal betrayer. However, we find that this protocol is vulnerable to attack by an internal betrayer. In this paper, we analyze the security of this protocol to show that the controller in this protocol is able to eavesdrop on the secret information of the sender (i.e., the customer's shopping information), which indicates that it cannot be used for secure online shopping as the authors expected. Accordingly, an improvement of this protocol, which can resist the controller's attack, is proposed. In addition, we present another protocol which is more appropriate for online shopping. Finally, we discuss how the quantum secure direct communication process for online shopping differs in detail from regular quantum communication.

  17. The Classroom Observation Protocol for Undergraduate STEM (COPUS): a new instrument to characterize university STEM classroom practices.

    PubMed

    Smith, Michelle K; Jones, Francis H M; Gilbert, Sarah L; Wieman, Carl E

    2013-01-01

    Instructors and the teaching practices they employ play a critical role in improving student learning in college science, technology, engineering, and mathematics (STEM) courses. Consequently, there is increasing interest in collecting information on the range and frequency of teaching practices at department-wide and institution-wide scales. To help facilitate this process, we present a new classroom observation protocol known as the Classroom Observation Protocol for Undergraduate STEM or COPUS. This protocol allows STEM faculty, after a short 1.5-hour training period, to reliably characterize how faculty and students are spending their time in the classroom. We present the protocol, discuss how it differs from existing classroom observation protocols, and describe the process by which it was developed and validated. We also discuss how the observation data can be used to guide individual and institutional change.

  18. The Classroom Observation Protocol for Undergraduate STEM (COPUS): A New Instrument to Characterize University STEM Classroom Practices

    PubMed Central

    Smith, Michelle K.; Jones, Francis H. M.; Gilbert, Sarah L.; Wieman, Carl E.

    2013-01-01

    Instructors and the teaching practices they employ play a critical role in improving student learning in college science, technology, engineering, and mathematics (STEM) courses. Consequently, there is increasing interest in collecting information on the range and frequency of teaching practices at department-wide and institution-wide scales. To help facilitate this process, we present a new classroom observation protocol known as the Classroom Observation Protocol for Undergraduate STEM or COPUS. This protocol allows STEM faculty, after a short 1.5-hour training period, to reliably characterize how faculty and students are spending their time in the classroom. We present the protocol, discuss how it differs from existing classroom observation protocols, and describe the process by which it was developed and validated. We also discuss how the observation data can be used to guide individual and institutional change. PMID:24297289

  19. Feedback Specificity, Information Processing, and Transfer of Training

    ERIC Educational Resources Information Center

    Goodman, Jodi S.; Wood, Robert E.; Chen, Zheng

    2011-01-01

    This study examines the effects of feedback specificity on transfer of training and the mechanisms through which feedback can enhance or inhibit transfer. We used concurrent verbal protocol methodology to elicit and operationalize the explicit information processing activities used by 48 trainees performing the Furniture Factory computer…

  20. Cochrane Qualitative and Implementation Methods Group guidance series-paper 2: methods for question formulation, searching, and protocol development for qualitative evidence synthesis.

    PubMed

    Harris, Janet L; Booth, Andrew; Cargo, Margaret; Hannes, Karin; Harden, Angela; Flemming, Kate; Garside, Ruth; Pantoja, Tomas; Thomas, James; Noyes, Jane

    2018-05-01

    This paper updates previous Cochrane guidance on question formulation, searching, and protocol development, reflecting recent developments in methods for conducting qualitative evidence syntheses to inform Cochrane intervention reviews. Examples are used to illustrate how decisions about boundaries for a review are formed via an iterative process of constructing lines of inquiry and mapping the available information to ascertain whether evidence exists to answer questions related to effectiveness, implementation, feasibility, appropriateness, economic evidence, and equity. The process of question formulation allows reviewers to situate the topic in relation to how it informs and explains effectiveness, using the criteria of meaningfulness, appropriateness, feasibility, and implementation. Questions related to complex interventions can be structured by drawing on an increasingly wide range of question frameworks. Logic models and theoretical frameworks are useful tools for conceptually mapping the literature to illustrate the complexity of the phenomenon of interest. Furthermore, protocol development may require iterative question formulation and searching. Consequently, the final protocol may function as a guide rather than a prescriptive route map, particularly in qualitative reviews that ask more exploratory and open-ended questions. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Government Open Systems Interconnection Profile (GOSIP) transition strategy

    NASA Astrophysics Data System (ADS)

    Laxen, Mark R.

    1993-09-01

    This thesis analyzes the Government Open Systems Interconnection Profile (GOSIP) and the requirements of Federal Information Processing Standard (FIPS) Publication 146-1. It begins by examining the International Organization for Standardization (ISO) Open Systems Interconnection (OSI) architecture and protocol suites and the distinctions between GOSIP versions one and two. Additionally, it explores some of the GOSIP protocol details and discusses the process by which standards organizations have developed their recommendations. Implementation considerations from both government and vendor perspectives illustrate the barriers and requirements faced by information systems managers, as well as basic transition strategies. The thesis concludes that, given extensive legacy systems and the unavailability of GOSIP products, transition is best managed through an extended and coordinated period of coexistence. Recommendations for GOSIP protocol standards to include capabilities outside the OSI model are also presented.

  2. Quantum-key-distribution protocol with pseudorandom bases

    NASA Astrophysics Data System (ADS)

    Trushechkin, A. S.; Tregubov, P. A.; Kiktenko, E. O.; Kurochkin, Y. V.; Fedorov, A. K.

    2018-01-01

    Quantum key distribution (QKD) offers a way for establishing information-theoretical secure communications. An important part of QKD technology is a high-quality random number generator for the quantum-state preparation and for post-processing procedures. In this work, we consider a class of prepare-and-measure QKD protocols, utilizing additional pseudorandomness in the preparation of quantum states. We study one of such protocols and analyze its security against the intercept-resend attack. We demonstrate that, for single-photon sources, the considered protocol gives better secret key rates than the BB84 and the asymmetric BB84 protocols. However, the protocol strongly requires single-photon sources.
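
    The classical ingredient of the idea above, shared pseudorandomness for basis choice, can be illustrated in a few lines: two parties seeded identically derive the same preparation-basis sequence. This toy sketch says nothing about the protocol's quantum states or its security analysis; the seed value and basis labels are illustrative.

```python
# Toy illustration of pseudorandom basis choice: identically seeded PRNGs
# yield identical basis sequences. Not a security argument.
import random

def basis_sequence(seed, n):
    """Derive a preparation-basis sequence ('Z' or 'X') from a PRNG seed."""
    rng = random.Random(seed)
    return [rng.choice("ZX") for _ in range(n)]

alice = basis_sequence(42, 8)
bob = basis_sequence(42, 8)
print(alice == bob)  # → True
```

    In the actual protocol, this shared pseudorandomness supplements, rather than replaces, the true randomness required for state preparation and post-processing.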

  3. The effects of sweep numbers per average and protocol type on the accuracy of the p300-based concealed information test.

    PubMed

    Dietrich, Ariana B; Hu, Xiaoqing; Rosenfeld, J Peter

    2014-03-01

    In the first of two experiments, we compared the accuracy of the P300 concealed information test protocol as a function of the number of trials experienced by subjects and ERP averages analyzed by investigators. Contrary to Farwell et al. (Cogn Neurodyn 6(2):115-154, 2012), we found no evidence that 100-trial averages are more accurate than 66- or 33-trial averages (all numbers led to accuracies of 84-94%). There was actually a trend favoring the lowest trial numbers. The second study compared the numbers of irrelevant stimuli recalled and recognized in the 3-stimulus protocol versus the complex trial protocol (Rosenfeld in Memory detection: theory and application of the concealed information test, Cambridge University Press, New York, pp 63-89, 2011). Again, in contrast to expectations from Farwell et al. (Cogn Neurodyn 6(2):115-154, 2012), there were no differences between protocols, although more irrelevant stimuli were recognized than recalled, and irrelevant 4-digit number group stimuli were neither recalled nor recognized as well as irrelevant city name stimuli. We therefore conclude that stimulus processing in the P300-based complex trial protocol, with no more than 33-sweep averages, is adequate to allow accurate detection of concealed information.

  4. Evaluating the Process of Generating a Clinical Trial Protocol

    PubMed Central

    Franciosi, Lui G.; Butterfield, Noam N.; MacLeod, Bernard A.

    2002-01-01

    The research protocol is the principal document in the conduct of a clinical trial. Its generation requires knowledge about the research problem, the potential experimental confounders, and the relevant Good Clinical Practices for conducting the trial. However, such information is not always available to authors during the writing process. A checklist of over 80 items has been developed to better understand the considerations made by authors in generating a protocol. It is based on the most cited requirements for designing and implementing the randomised controlled trial. Items are categorised according to the trial's research question, experimental design, statistics, ethics, and standard operating procedures. This quality assessment tool evaluates the extent to which a generated protocol deviates from the best-planned clinical trial.
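
    The categorised-checklist idea can be sketched as a small scoring routine: items are grouped by category and a draft protocol is scored by the fraction of items it addresses in each. The item names below are invented; the actual instrument contains over 80 items.

```python
# Illustrative sketch of checklist-based protocol assessment.
# Categories mirror the abstract; item names are hypothetical.
checklist = {
    "research_question": ["primary outcome stated", "population defined"],
    "design": ["randomisation method", "blinding described"],
    "statistics": ["sample size justified"],
}

def coverage(addressed):
    """Fraction of checklist items addressed, per category."""
    return {cat: sum(i in addressed for i in items) / len(items)
            for cat, items in checklist.items()}

done = {"primary outcome stated", "randomisation method", "sample size justified"}
print(coverage(done))
```

    Low coverage in a category flags where the generated protocol deviates from the best-planned trial.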

  5. Method and Apparatus for Processing UDP Data Packets

    NASA Technical Reports Server (NTRS)

    Murphy, Brandon M. (Inventor)

    2017-01-01

    A method and apparatus for processing a plurality of data packets. A data packet is received. A determination is made as to whether a portion of the data packet follows a selected digital recorder standard protocol based on a header of the data packet. Raw data in the data packet is converted into human-readable information in response to a determination that the portion of the data packet follows the selected digital recorder standard protocol.
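
    The described flow, inspecting a packet header to decide whether the payload follows a digital-recorder standard and only then decoding raw bytes into human-readable form, can be sketched as follows. The 4-byte magic value and header layout are invented for illustration and are not taken from the patent or any real recorder standard.

```python
# Minimal sketch: header-based protocol check, then payload conversion.
# Magic value and header layout are hypothetical.
import struct

RECORDER_MAGIC = 0x52454301  # invented marker, not a real standard value

def process_packet(packet: bytes):
    """Return decoded payload if the header matches, else None."""
    if len(packet) < 8:
        return None
    magic, length = struct.unpack("!II", packet[:8])
    if magic != RECORDER_MAGIC:
        return None  # not recorder-standard data; leave it unprocessed
    return packet[8:8 + length].decode("ascii", errors="replace")

pkt = struct.pack("!II", RECORDER_MAGIC, 5) + b"hello"
print(process_packet(pkt))  # → hello
```

    Packets whose headers do not match the expected protocol are passed over rather than misdecoded, which is the dispatch decision the abstract describes.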

  6. Controlled Bidirectional Quantum Secure Direct Communication

    PubMed Central

    Chou, Yao-Hsin; Lin, Yu-Ting; Zeng, Guo-Jyun; Lin, Fang-Jhu; Chen, Chi-Yuan

    2014-01-01

    We propose a novel protocol for controlled bidirectional quantum secure communication based on a nonlocal swap gate scheme. Our proposed protocol would be applied to a system in which a controller (supervisor/Charlie) controls the bidirectional communication with quantum information or secret messages between legitimate users (Alice and Bob). In this system, the legitimate users must obtain permission from the controller in order to exchange their respective quantum information or secret messages simultaneously; the controller is unable to obtain any quantum information or secret messages from the decoding process. Moreover, the presence of the controller also avoids the problem of one legitimate user receiving the quantum information or secret message before the other, and then refusing to help the other user decode the quantum information or secret message. Our proposed protocol is aimed at protecting against external and participant attacks on such a system, and the cost of transmitting quantum bits using our protocol is less than that achieved in other studies. Based on the nonlocal swap gate scheme, the legitimate users exchange their quantum information or secret messages without transmission in a public channel, thus protecting against eavesdroppers stealing the secret messages. PMID:25006596

  7. A Multidirectional Model for Assessing Learning Disabled Students' Intelligence: An Information-Processing Framework.

    ERIC Educational Resources Information Center

    Swanson, H. Lee

    1982-01-01

    An information processing approach to the assessment of learning disabled students' intellectual performance is presented. The model is based on the assumption that intelligent behavior is comprised of a variety of problem- solving strategies. An account of child problem solving is explained and illustrated with a "thinking aloud" protocol.…

  8. Experimental Quantum Randomness Processing Using Superconducting Qubits

    NASA Astrophysics Data System (ADS)

    Yuan, Xiao; Liu, Ke; Xu, Yuan; Wang, Weiting; Ma, Yuwei; Zhang, Fang; Yan, Zhaopeng; Vijay, R.; Sun, Luyan; Ma, Xiongfeng

    2016-07-01

    Coherently manipulating multipartite quantum correlations leads to remarkable advantages in quantum information processing. A fundamental question is whether such quantum advantages persist only by exploiting multipartite correlations, such as entanglement. Recently, Dale, Jennings, and Rudolph answered this question in the negative by showing that a randomness-processing task, the quantum Bernoulli factory, is strictly more powerful when implemented with quantum coherence than with classical mechanics. In this Letter, focusing on the same scenario, we propose a theoretical protocol that is classically impossible but can be implemented solely using quantum coherence without entanglement. We demonstrate the protocol by exploiting the high-fidelity quantum state preparation and measurement with a superconducting qubit in the circuit quantum electrodynamics architecture and a nearly quantum-limited parametric amplifier. Our experiment shows the advantage of using quantum coherence of a single qubit for information processing even when multipartite correlation is not present.

  9. Collaborative international research: ethical and regulatory issues pertaining to human biological materials at a South African institutional research ethics committee.

    PubMed

    Sathar, Aslam; Dhai, Amaboo; van der Linde, Stephan

    2014-12-01

    Human Biological Materials (HBMs) are an invaluable resource in biomedical research. To determine if researchers and a Research Ethics Committee (REC) at a South African institution addressed ethical issues pertaining to HBMs in collaborative research with developed countries. Ethically approved retrospective cross-sectional descriptive audit. Of the 1305 protocols audited, 151 (11.57%) fulfilled the study's inclusion criteria. Compared to other developed countries, a majority of sponsors (90) were from the USA (p = 0.0001). The principal investigators (PIs) in all 151 protocols informed the REC of their intent to store HBMs. Only 132 protocols informed research participants (p < 0.0001). In 148 protocols informed consent (IC) was obtained from research participants; 116 protocols (76.8%) solicited broad consent compared to specific consent (32; 21.2%) [p < 0.0001]. In 105 cases a code was used to maintain confidentiality. HBMs were anonymised in 14 protocols [p < 0.0001]. More protocols informed the REC (90) than the research participants (67) that HBMs would be exported (p = 0.011). Export permits (EPs) and Material Transfer Agreements (MTAs) were not available in 109 and 143 protocols, respectively. Researchers and the REC did not adequately address the inter-related ethical and regulatory issues pertaining to HBMs. There was a lack of congruence between the ethical guidelines of developed countries and their actions which are central to the access to HBMs in collaborative research. HBMs may be leaving South Africa without EPs and MTAs during the process of international collaborative research. © 2013 John Wiley & Sons Ltd.

  10. Lyceum: A Multi-Protocol Digital Library Gateway

    NASA Technical Reports Server (NTRS)

    Maa, Ming-Hokng; Nelson, Michael L.; Esler, Sandra L.

    1997-01-01

    Lyceum is a prototype scalable query gateway that provides a logically central interface to multi-protocol and physically distributed, digital libraries of scientific and technical information. Lyceum processes queries to multiple syntactically distinct search engines used by various distributed information servers from a single logically central interface without modification of the remote search engines. A working prototype (http://www.larc.nasa.gov/lyceum/) demonstrates the capabilities, potentials, and advantages of this type of meta-search engine by providing access to over 50 servers covering over 20 disciplines.
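
    The core gateway idea, translating one logical query into the syntax each remote engine expects without modifying the engines themselves, can be sketched in a few lines. The backend names and query syntaxes below are invented for illustration and are not Lyceum's actual target engines.

```python
# Sketch of a meta-search gateway's query translation layer.
# Backend names and syntaxes are hypothetical.
def to_backend_query(terms, backend):
    """Render one logical query in the syntax a given backend expects."""
    if backend == "boolean":
        return " AND ".join(terms)
    if backend == "plus_prefix":
        return " ".join("+" + t for t in terms)
    if backend == "comma_list":
        return ",".join(terms)
    raise ValueError(f"unknown backend: {backend}")

terms = ["wind", "tunnel"]
queries = {b: to_backend_query(terms, b)
           for b in ("boolean", "plus_prefix", "comma_list")}
print(queries)
```

    A full gateway would then dispatch each rendered query to its server and merge the result sets behind the single central interface.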

  11. Quantum communication and information processing

    NASA Astrophysics Data System (ADS)

    Beals, Travis Roland

    Quantum computers enable dramatically more efficient algorithms for solving certain classes of computational problems, but, in doing so, they create new problems. In particular, Shor's Algorithm allows for efficient cryptanalysis of many public-key cryptosystems. As public key cryptography is a critical component of present-day electronic commerce, it is crucial that a working, secure replacement be found. Quantum key distribution (QKD), first developed by C.H. Bennett and G. Brassard, offers a partial solution, but many challenges remain, both in terms of hardware limitations and in designing cryptographic protocols for a viable large-scale quantum communication infrastructure. In Part I, I investigate optical lattice-based approaches to quantum information processing. I look at details of a proposal for an optical lattice-based quantum computer, which could potentially be used for both quantum communications and for more sophisticated quantum information processing. In Part II, I present a cryptographic protocol which allows for the extension of present-day QKD networks over much longer distances without the development of new hardware. I also present a second, related protocol which effectively solves the authentication problem faced by a large QKD network, thus making QKD a viable, information-theoretic secure replacement for public key cryptosystems. In Part III, I propose a method for converting and storing photonic quantum bits in the internal state of periodically-spaced neutral atoms by generating and manipulating a photonic band gap and associated defect states.

  12. Experimental Optimal Single Qubit Purification in an NMR Quantum Information Processor

    PubMed Central

    Hou, Shi-Yao; Sheng, Yu-Bo; Feng, Guan-Ru; Long, Gui-Lu

    2014-01-01

    High-quality single qubits are the building blocks of quantum information processing, but they are vulnerable to environmental noise. To overcome noise, purification techniques, which generate qubits with higher purities from qubits with lower purities, have been proposed. Purification has attracted much interest and been widely studied. However, the full experimental demonstration of the optimal single-qubit purification protocol proposed by Cirac, Ekert and Macchiavello [Phys. Rev. Lett. 82, 4344 (1999), the CEM protocol] more than one and a half decades ago still remains an experimental challenge, as it requires more complicated networks and a higher level of precision control. In this work, we design an experimental scheme that realizes the CEM protocol with explicit symmetrization of the wave functions. The purification scheme was successfully implemented in a nuclear magnetic resonance quantum information processor. The experiment fully demonstrated the purification protocol and showed that it is an effective way of protecting qubits against errors and decoherence. PMID:25358758

  13. A Typology for Modeling Processes in Clinical Guidelines and Protocols

    NASA Astrophysics Data System (ADS)

    Tu, Samson W.; Musen, Mark A.

    We analyzed the graphical representations that are used by various guideline-modeling methods to express process information embodied in clinical guidelines and protocols. From this analysis, we distilled four modeling formalisms and the processes they typically model: (1) flowcharts for capturing problem-solving processes, (2) disease-state maps that link decision points in managing patient problems over time, (3) plans that specify sequences of activities that contribute toward a goal, and (4) workflow specifications that model care processes in an organization. We characterized the four approaches and showed that each captures some aspect of what a guideline may specify. We believe that a general guideline-modeling system must provide explicit representation for each type of process.
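
    Two of the four formalisms can be made concrete with tiny data structures, a flowchart of decision/action steps and a goal-directed plan. This is an illustrative sketch with invented step names, not the authors' representation.

```python
# Sketch: minimal representations of formalisms (1) flowchart and (3) plan.
from dataclasses import dataclass, field

@dataclass
class FlowStep:
    name: str
    kind: str                          # "decision" or "action"
    next_steps: list = field(default_factory=list)

@dataclass
class Plan:
    goal: str
    activities: list                   # ordered activities contributing to goal

triage = FlowStep("assess fever", "decision",
                  [FlowStep("order blood culture", "action")])
plan = Plan(goal="control infection",
            activities=["culture", "start antibiotics", "reassess at 48h"])
print(triage.next_steps[0].name)       # order blood culture
print(len(plan.activities))            # 3
```
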

  14. Measuring Spontaneous and Instructed Evaluation Processes during Web Search: Integrating Concurrent Thinking-Aloud Protocols and Eye-Tracking Data

    ERIC Educational Resources Information Center

    Gerjets, Peter; Kammerer, Yvonne; Werner, Benita

    2011-01-01

    Web searching for complex information requires appropriately evaluating diverse sources of information. Information science studies identified different criteria applied by searchers to evaluate Web information. However, the explicit evaluation instructions used in these studies might have resulted in a distortion of spontaneous evaluation…

  15. Constructing Meaning: Think-Aloud Protocols of ELLs on English and Spanish Word Problems.

    ERIC Educational Resources Information Center

    Celedon-Pattichis, Sylvia

    This one-year qualitative study analyzed how nine middle school English language learners (ELLs) of Mexican descent constructed meaning on think-aloud protocols of Spanish and English word problems. Strategies used by these students to process information from English to their native language included translating to Spanish, reading the problem at…

  16. An Extensible Information Grid for Risk Management

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Bell, David G.

    2003-01-01

    This paper describes recent work on developing an extensible information grid for risk management at NASA - a RISK INFORMATION GRID. This grid is being developed by integrating information grid technology with risk management processes for a variety of risk related applications. To date, RISK GRID applications are being developed for three main NASA processes: risk management - a closed-loop iterative process for explicit risk management, program/project management - a proactive process that includes risk management, and mishap management - a feedback loop for learning from historical risks that escaped other processes. This is enabled through an architecture involving an extensible database, structuring information with XML, schemaless mapping of XML, and secure server-mediated communication using standard protocols.
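
    The "schemaless mapping of XML" the architecture relies on can be illustrated by folding an XML risk record into nested dictionaries without any predefined schema. The element names and sample record below are hypothetical, not NASA's actual RISK GRID format.

```python
# Sketch: schemaless mapping of an XML record into nested dicts.
import xml.etree.ElementTree as ET

def xml_to_dict(elem):
    # Leaf elements become their text; interior elements become dicts.
    # (Repeated sibling tags would collapse; fine for this sketch.)
    children = list(elem)
    if not children:
        return elem.text
    return {c.tag: xml_to_dict(c) for c in children}

record = ET.fromstring(
    "<risk><id>R-42</id><likelihood>low</likelihood>"
    "<mitigation><owner>ops</owner></mitigation></risk>")
doc = xml_to_dict(record)
print(doc["id"])                       # R-42
print(doc["mitigation"]["owner"])      # ops
```

    The appeal of this style is extensibility: new elements can appear in records without any database schema change, which matches the paper's "extensible database" goal.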

  17. Protocols for Image Processing based Underwater Inspection of Infrastructure Elements

    NASA Astrophysics Data System (ADS)

    O'Byrne, Michael; Ghosh, Bidisha; Schoefs, Franck; Pakrashi, Vikram

    2015-07-01

    Image processing can be an important tool for inspecting underwater infrastructure elements like bridge piers and pile wharves. Underwater inspection often relies on visual descriptions from divers who are not necessarily trained in the specifics of structural degradation, and the information may be vague, prone to error, or open to significant variation in interpretation. Underwater vehicles, on the other hand, can be quite expensive to deploy for such inspections. Additionally, there is now significant encouragement globally towards the deployment of more offshore wind turbines and wave energy devices, and the requirement for underwater inspection can be expected to increase significantly in the coming years. While the merit of image-processing-based assessment of the condition of underwater structures is understood to a certain degree, there is no existing protocol for such image-based methods. This paper discusses and describes an image processing protocol for underwater inspection of structures. A stereo imaging method is considered in this regard, and protocols are suggested for image storage, imaging, diving, and inspection. A combined underwater imaging protocol is finally presented which can be used for a variety of situations within a range of image scenes and environmental conditions affecting the imaging. An example of detecting marine growth on a structure in Cork Harbour, Ireland, is presented.
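
    The detection step of such a protocol can be reduced, for illustration, to a colour threshold on pixel data: flag pixels where the green channel dominates as a crude marine-growth cue. This is a toy stand-in for the paper's stereo imaging method; the threshold and the tiny synthetic "image" are invented.

```python
# Sketch: crude marine-growth detection via a green-dominance threshold
# on a 2x2 synthetic RGB image (pure Python, no imaging library).
def detect_growth(image, green_min=120):
    """Return (x, y) coordinates of pixels whose green channel dominates."""
    flagged = []
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            if g >= green_min and g > r and g > b:
                flagged.append((x, y))
    return flagged

img = [[(40, 200, 60), (90, 80, 70)],
       [(30, 150, 50), (200, 90, 80)]]
print(detect_growth(img))  # [(0, 0), (0, 1)]
```

    A real protocol would standardize lighting, range, and storage precisely so that a fixed threshold like this remains meaningful across dives.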

  18. A DNA-based pattern classifier with in vitro learning and associative recall for genomic characterization and biosensing without explicit sequence knowledge.

    PubMed

    Lee, Ju Seok; Chen, Junghuei; Deaton, Russell; Kim, Jin-Woo

    2014-01-01

    Genetic material extracted from in situ microbial communities has high promise as an indicator of biological system status. However, the challenge is to access genomic information from all organisms at the population or community scale to monitor the biosystem's state. Hence, there is a need for a better diagnostic tool that provides a holistic view of a biosystem's genomic status. Here, we introduce an in vitro methodology for genomic pattern classification of biological samples that taps large amounts of genetic information from all genes present and uses that information to detect changes in genomic patterns and classify them. We developed a biosensing protocol, termed Biological Memory, that has in vitro computational capabilities to "learn" and "store" genomic sequence information directly from genomic samples without knowledge of their explicit sequences, and that discovers differences in vitro between previously unknown inputs and learned memory molecules. The Memory protocol was designed and optimized based upon (1) common in vitro recombinant DNA operations using 20-base random probes, including polymerization, nuclease digestion, and magnetic bead separation, to capture a snapshot of the genomic state of a biological sample as a DNA memory and (2) the thermal stability of DNA duplexes between new input and the memory to detect similarities and differences. For efficient read out, a microarray was used as an output method. When the microarray-based Memory protocol was implemented to test its capability and sensitivity using genomic DNA from two model bacterial strains, i.e., Escherichia coli K12 and Bacillus subtilis, results indicate that the Memory protocol can "learn" input DNA, "recall" similar DNA, differentiate between dissimilar DNA, and detect relatively small concentration differences in samples. 
This study demonstrated not only the in vitro information processing capabilities of DNA, but also its promise as a genomic pattern classifier that could access information from all organisms in a biological system without explicit genomic information. The Memory protocol has high potential for many applications, including in situ biomonitoring of ecosystems, screening for diseases, biosensing of pathological features in water and food supplies, and non-biological information processing in memory devices, among many others.
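
    The "recall" comparison, where duplex thermal stability distinguishes similar from dissimilar DNA, can be abstracted in silico as a base-match score between a stored memory strand and new input. The sequences and the 0.9 threshold below are invented for illustration; the actual protocol performs this comparison chemically, not computationally.

```python
# Sketch: "recall" as sequence similarity between a 20-base memory strand
# and new input, proxying duplex stability by the fraction of matching bases.
def similarity(a, b):
    return sum(x == y for x, y in zip(a, b)) / min(len(a), len(b))

memory = "ATCGGCTAAGCTTAGGCCTA"        # hypothetical learned strand
close  = "ATCGGCTAAGCTTAGGCCTT"        # one mismatch
far    = "TTTTTTTTTTTTTTTTTTTT"

print(similarity(memory, close) > 0.9)  # True  -> "recalled" as similar
print(similarity(memory, far) > 0.9)    # False -> differentiated
```
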

  19. Data exchange technology based on handshake protocol for industrial automation system

    NASA Astrophysics Data System (ADS)

    Astafiev, A. V.; Shardin, T. O.

    2018-05-01

    This article considers a data exchange technology based on the handshake protocol for industrial automation systems. Methods of organizing the technology in client-server applications are analyzed. The main threats to client-server applications that arise during information interaction between users are identified. A comparative analysis of analogous systems was also carried out, from which the most suitable option was chosen for further use. The basic schemes of operation of the handshake protocol are shown, as well as the overall scheme of the implemented application, which describes the entire process of interaction between the client and the server.
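
    The handshake pattern itself, agree on a session before any data flows, can be sketched as two small state machines exchanging three messages. The message names are generic (SYN/SYN-ACK/ACK style), not the authors' protocol.

```python
# Sketch: a three-message handshake establishing both endpoints before data
# exchange begins.
def run_handshake():
    client_state, server_state = "INIT", "LISTEN"
    log = []
    log.append("client->server: HELLO");     server_state = "HELLO_RCVD"
    log.append("server->client: HELLO_ACK"); client_state = "ESTABLISHED"
    log.append("client->server: ACK");       server_state = "ESTABLISHED"
    return client_state, server_state, log

c, s, log = run_handshake()
print(c, s)       # ESTABLISHED ESTABLISHED
print(len(log))   # 3
```

    Only after both sides reach ESTABLISHED would application data be exchanged, which is what lets each side reject unsolicited or replayed traffic.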

  20. The Federal Government and Information Technology Standards: Building the National Information Infrastructure.

    ERIC Educational Resources Information Center

    Radack, Shirley M.

    1994-01-01

    Examines the role of the National Institute of Standards and Technology (NIST) in the development of the National Information Infrastructure (NII). Highlights include the standards process; voluntary standards; Open Systems Interconnection problems; Internet Protocol Suite; consortia; government's role; and network security. (16 references) (LRW)

  1. Full-Scale Wind-Tunnel Investigation of Wing-Cooling Ducts Effects of Propeller Slipstream, Special Report

    NASA Technical Reports Server (NTRS)

    Nickle, F. R.; Freeman, Arthur B.

    1939-01-01

    The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.
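
    The flavor of such verification can be shown on a toy command/acknowledgment protocol: enumerate every reachable behavior and check an invariant in each state. The real work uses mechanized proof over a formal specification, not brute-force enumeration, and the window-of-two protocol below is invented.

```python
# Sketch: exhaustive exploration of a toy vehicle-command protocol, checking
# the invariant "unacknowledged commands never exceed the window".
from itertools import product

def step(state, action):
    sent, acked = state
    if action == "send" and sent - acked < 2:   # window of 2 outstanding cmds
        return (sent + 1, acked)
    if action == "ack" and acked < sent:
        return (sent, acked + 1)
    return state                                # action not enabled: no-op

invariant_ok = True
for actions in product(["send", "ack"], repeat=6):
    s = (0, 0)
    for a in actions:
        s = step(s, a)
        invariant_ok &= (0 <= s[0] - s[1] <= 2)
print(invariant_ok)  # True: the invariant holds along every 6-step behavior
```

    Proof-based approaches scale where enumeration cannot, and, as the abstract notes, compositional proofs let each component protocol be verified once and reused as the design evolves.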

  2. Adolescent Self-Consent for Biomedical Human Immunodeficiency Virus Prevention Research.

    PubMed

    Gilbert, Amy Lewis; Knopf, Amelia S; Fortenberry, J Dennis; Hosek, Sybil G; Kapogiannis, Bill G; Zimet, Gregory D

    2015-07-01

    The Adolescent Medicine Trials Network Protocol 113 (ATN113) is an open-label, multisite demonstration project and Phase II safety study of human immunodeficiency virus (HIV) preexposure prophylaxis with 15- to 17-year-old young men who have sex with men that requires adolescent consent for participation. The purpose of this study was to examine factors related to the process by which Institutional Review Boards (IRBs) and researchers made decisions regarding whether to approve and implement ATN113 so as to inform future biomedical HIV prevention research with high-risk adolescent populations. Participants included 17 researchers at 13 sites in 12 states considering ATN113 implementation. Qualitative descriptive methods were used. Data sources included interviews and documents generated during the initiation process. A common process for initiating ATN113 emerged, and informants described how they identified and addressed practical, ethical, and legal challenges that arose. Informants described the process as responding to the protocol, preparing for IRB submission, abstaining from or proceeding with submission, responding to IRB concerns, and reacting to the outcomes. A complex array of factors impacting approval and implementation were identified, and ATN113 was ultimately implemented in seven of 13 sites. Informants also reflected on lessons learned that may help inform future biomedical HIV prevention research with high-risk adolescent populations. The results illustrate factors for consideration in determining whether to implement such trials, demonstrate that such protocols have the potential to be approved, and highlight a need for clearer standards regarding biomedical HIV prevention research with high-risk adolescent populations. Copyright © 2015 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  3. Improving cardiac operating room to intensive care unit handover using a standardised handover process.

    PubMed

    Gleicher, Yehoshua; Mosko, Jeffrey David; McGhee, Irene

    2017-01-01

    Handovers from the cardiovascular operating room (CVOR) to the cardiovascular intensive care unit (CVICU) are complex processes involving the transfer of information, equipment and responsibility, at a time when the patient is most vulnerable. This transfer is typically variable in structure, content and execution. This variability can lead to the omission and miscommunication of critical information leading to patient harm. We set out to improve the quality of patient handover from the CVOR to the CVICU by introducing a standardised handover protocol. This study is an interventional time-series study over a 4-month period at an adult cardiac surgery centre. A standardised handover protocol was developed using quality improvement methodologies. The protocol included a handover content checklist and introduction of a formal 'sterile cockpit' timeout. Implementation of the protocol was refined using monthly iterative Plan-Do-Study-Act. The primary outcome was the quality of handovers, measured by a Handover Score, comprising handover content, teamwork and patient care planning indicators. Secondary outcomes included handover duration, adherence to the standardised handover protocol and handover team satisfaction surveys. 37 handovers were observed (6 pre intervention and 31 post intervention). The mean handover score increased from 6.5 to 14.0 (maximum 18 points). Specific improvements included fewer handover interruptions and more frequent postoperative patient care planning. Average handover duration increased slightly from 2:40 to 2:57 min. Caregivers noted improvements in teamwork, content received and patient care planning. The majority (>95%) agreed that the intervention was a valuable addition to the CVOR to CVICU handover process. Implementation of a standardised handover protocol for postcardiac surgery patients was associated with fewer interruptions during handover, more reliable transfer of critical content and improved patient care planning.

  4. Learning time-dependent noise to reduce logical errors: real time error rate estimation in quantum error correction

    NASA Astrophysics Data System (ADS)

    Huo, Ming-Xia; Li, Ying

    2017-12-01

    Quantum error correction is important to quantum information processing, allowing us to reliably process information encoded in quantum error correction codes. Efficient quantum error correction benefits from knowledge of the error rates. We propose a protocol for monitoring error rates in real time without interrupting the quantum error correction. No adaptation of the quantum error correction code or its implementation circuit is required. The protocol can be applied directly to the most advanced quantum error correction techniques, e.g., the surface code. A Gaussian-process algorithm is used to estimate and predict error rates based on past error correction data. We find that, using these estimated error rates, the probability of error correction failure can be significantly reduced, by a factor that increases with the code distance.
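
    The key idea, estimating a time-varying error rate online from correction outcomes without interrupting them, can be illustrated with a simple online smoother. An exponentially weighted average stands in here for the paper's Gaussian-process estimator; the data stream and smoothing constant are invented.

```python
# Sketch: online tracking of a time-dependent error rate from a stream of
# error-correction outcomes (1 = error detected this round, 0 = none).
def track_error_rate(outcomes, alpha=0.1, initial=0.05):
    rate, history = initial, []
    for o in outcomes:
        rate = (1 - alpha) * rate + alpha * o   # update from past data only
        history.append(rate)
    return history

stream = [0, 0, 1, 0, 1, 1, 1, 0]               # errors cluster late
rates = track_error_rate(stream)
print(rates[-1] > rates[0])  # True: the estimate rises as errors cluster
```

    A decoder fed these running estimates can weight likely error locations more accurately, which is the mechanism behind the reduced failure probability the abstract reports.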

  5. Using semantics for representing experimental protocols.

    PubMed

    Giraldo, Olga; García, Alexander; López, Federico; Corcho, Oscar

    2017-11-13

    An experimental protocol is a sequence of tasks and operations executed to perform experimental research in biological and biomedical areas, e.g. biology, genetics, immunology, neurosciences, virology. Protocols often include references to equipment, reagents, descriptions of critical steps, troubleshooting and tips, as well as any other information that researchers deem important for facilitating the reusability of the protocol. Although experimental protocols are central to reproducibility, the descriptions are often cursory. There is a need for a unified framework with respect to the syntactic structure and the semantics for representing experimental protocols. In this paper we present the SMART Protocols ontology, an ontology for representing experimental protocols. Our ontology represents the protocol as a workflow with domain-specific knowledge embedded within a document. We also present the Sample Instrument Reagent Objective (SIRO) model, which represents the minimal common information shared across experimental protocols. SIRO was conceived in the same realm as the Patient Intervention Comparison Outcome (PICO) model that supports search, retrieval and classification purposes in evidence-based medicine. We evaluate our approach against a set of competency questions modeled as SPARQL queries and processed against a set of published and unpublished protocols modeled with the SP Ontology and the SIRO model. Our approach makes it possible to answer queries such as "Which protocols use tumor tissue as a sample?" Improving reporting structures for experimental protocols requires collective efforts from authors, peer reviewers, editors and funding bodies. The SP Ontology is a contribution towards this goal. We build upon previous experience and bring together the views of researchers managing protocols in their laboratory work. Website: https://smartprotocols.github.io/ .
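
    The competency question "Which protocols use tumor tissue as a sample?" can be answered by a simple pattern match over subject-predicate-object triples. The toy in-memory store below stands in for a SPARQL query against the SP Ontology; the protocol IDs and the `siro:hasSample` predicate name are hypothetical.

```python
# Sketch: a competency query over a toy triple store.
triples = [
    ("protocol:P1", "siro:hasSample", "tumor tissue"),
    ("protocol:P2", "siro:hasSample", "blood plasma"),
    ("protocol:P3", "siro:hasSample", "tumor tissue"),
]

def query(pred, obj):
    """Return subjects matching the pattern (?s, pred, obj)."""
    return [s for s, p, o in triples if p == pred and o == obj]

print(query("siro:hasSample", "tumor tissue"))
# ['protocol:P1', 'protocol:P3']
```

    The equivalent SPARQL would bind `?s` in a pattern like `?s siro:hasSample "tumor tissue"`; the value of the ontology is that every protocol exposes its samples through the same predicate.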

  6. Cross-layer protocol design for QoS optimization in real-time wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    2010-04-01

    The metrics of quality of service (QoS) for each sensor type in a wireless sensor network can be associated with metrics for multimedia that describe the quality of fused information, e.g., throughput, delay, jitter, packet error rate, information correlation, etc. These QoS metrics are typically set at the highest, or application, layer of the protocol stack to ensure that performance requirements for each type of sensor data are satisfied. Application-layer metrics, in turn, depend on the support of the lower protocol layers: session, transport, network, data link (MAC), and physical. The dependencies of the QoS metrics on the performance of the higher layers of the Open System Interconnection (OSI) reference model of the WSN protocol, together with that of the lower three layers, are the basis for a comprehensive approach to QoS optimization for multiple sensor types in a general WSN model. The cross-layer design accounts for the distributed power consumption along energy-constrained routes and their constituent nodes. Following the author's previous work, the cross-layer interactions in the WSN protocol are represented by a set of concatenated protocol parameters and enabling resource levels. The "best" cross-layer designs to achieve optimal QoS are established by applying the general theory of martingale representations to the parameterized multivariate point processes (MVPPs) for discrete random events occurring in the WSN. Adaptive control of network behavior through the cross-layer design is realized through the parametric factorization of the stochastic conditional rates of the MVPPs. The cross-layer protocol parameters for optimal QoS are determined in terms of solutions to stochastic dynamic programming conditions derived from models of transient flows for heterogeneous sensor data and aggregate information over a finite time horizon. 
Markov state processes, embedded within the complex combinatorial history of WSN events, are more computationally tractable and lead to simplifications for any simulated or analytical performance evaluations of the cross-layer designs.

  7. The Reference Encounter Model.

    ERIC Educational Resources Information Center

    White, Marilyn Domas

    1983-01-01

    Develops model of the reference interview which explicitly incorporates human information processing, particularly schema ideas presented by Marvin Minsky and other theorists in cognitive processing and artificial intelligence. Questions are raised concerning use of content analysis of transcribed verbal protocols as methodology for studying…

  8. Complete Bell-state analysis for superconducting-quantum-interference-device qubits with a transitionless tracking algorithm

    NASA Astrophysics Data System (ADS)

    Kang, Yi-Hao; Chen, Ye-Hong; Shi, Zhi-Cheng; Huang, Bi-Hua; Song, Jie; Xia, Yan

    2017-08-01

    We propose a protocol for complete Bell-state analysis for two superconducting-quantum-interference-device qubits. The Bell-state analysis could be completed by using a sequence of microwave pulses designed by the transitionless tracking algorithm, which is a useful method in the technique of shortcut to adiabaticity. After the whole process, the information for distinguishing four Bell states will be encoded on two auxiliary qubits, while the Bell states remain unchanged. One can read out the information by detecting the auxiliary qubits. Thus the Bell-state analysis is nondestructive. The numerical simulations show that the protocol possesses a high success probability of distinguishing each Bell state with current experimental technology even when decoherence is taken into account. Thus, the protocol may have potential applications for the information readout in quantum communications and quantum computations in superconducting quantum networks.

  9. Informal trail monitoring protocols: Denali National Park and Preserve. Final Report, October 2011

    USGS Publications Warehouse

    Marion, Jeffrey L.; Wimpey, Jeremy F.

    2011-01-01

    Managers at Alaska's Denali National Park and Preserve (DENA) sponsored this research to assess and monitor visitor-created informal trails (ITs). DENA is located in south-central Alaska and managed as a six million acre wilderness park. This program of research was guided by the following objectives: (1) Investigate alternative methods for monitoring the spatial distribution, aggregate lineal extent, and tread conditions of informal (visitor-created) trails within the park. (2) In consultation with park staff, develop, pilot test, and refine cost-effective and scientifically defensible trail monitoring procedures that are fully integrated with the park's Geographic Information System. (3) Prepare a technical report that compiles and presents research results and their management implications. This report presents the protocol development and field testing process, illustrates the types of data produced by their application, and provides guidance for their application and use. The protocols described provide managers with an efficient means to document and monitor IT conditions in settings ranging from pristine to intensively visited.

  10. A distributed computing model for telemetry data processing

    NASA Astrophysics Data System (ADS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.
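
    The client-server half of the hybrid model can be reduced to a tiny in-process publish/subscribe hub: a server-side hub fans processed telemetry out to flight-controller clients that registered interest. The class, parameter names, and values below are illustrative, not the authors' information-sharing protocol.

```python
# Sketch: a publish/subscribe hub distributing processed telemetry values
# to whichever subscribers registered for each parameter.
class TelemetryHub:
    def __init__(self):
        self.subscribers = {}          # parameter name -> list of callbacks

    def subscribe(self, parameter, callback):
        self.subscribers.setdefault(parameter, []).append(callback)

    def publish(self, parameter, value):
        for cb in self.subscribers.get(parameter, []):
            cb(parameter, value)

received = []
hub = TelemetryHub()
hub.subscribe("cabin_pressure", lambda p, v: received.append((p, v)))
hub.publish("cabin_pressure", 14.2)
hub.publish("unrelated_param", 0)      # no subscribers; silently dropped
print(received)  # [('cabin_pressure', 14.2)]
```

    The same hub object could also relay peer-to-peer by letting one subscriber republish derived values, which is the spirit of the hybrid model described above.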

  11. A distributed computing model for telemetry data processing

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  12. Analytical Models of Cross-Layer Protocol Optimization in Real-Time Wireless Sensor Ad Hoc Networks

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    The real-time interactions among the nodes of a wireless sensor network (WSN) to cooperatively process data from multiple sensors are modeled. Quality-of-service (QoS) metrics are associated with the quality of fused information: throughput, delay, packet error rate, etc. Multivariate point process (MVPP) models of discrete random events in WSNs establish stochastic characteristics of optimal cross-layer protocols. Discrete-event, cross-layer interactions in mobile ad hoc network (MANET) protocols have been modeled using a set of concatenated design parameters and associated resource levels by the MVPPs. Characterization of the "best" cross-layer designs for a MANET is formulated by applying the general theory of martingale representations to controlled MVPPs. Performance is described in terms of concatenated protocol parameters and controlled through conditional rates of the MVPPs. Modeling limitations to determination of closed-form solutions versus explicit iterative solutions for ad hoc WSN controls are examined.

  13. Generation of an arbitrary concatenated Greenberger-Horne-Zeilinger state with single photons

    NASA Astrophysics Data System (ADS)

    Chen, Shan-Shan; Zhou, Lan; Sheng, Yu-Bo

    2017-02-01

    The concatenated Greenberger-Horne-Zeilinger (C-GHZ) state is a new kind of logic-qubit entangled state, which may have extensive applications in future quantum communication. In this letter, we propose a protocol for constructing an arbitrary C-GHZ state with single photons. We exploit the cross-Kerr nonlinearity for this purpose. This protocol has some advantages over previous protocols. First, it only requires two kinds of cross-Kerr nonlinearities to generate single phase shifts  ±θ. Second, it is not necessary to use sophisticated m-photon Toffoli gates. Third, this protocol is deterministic and can be used to generate an arbitrary C-GHZ state. This protocol may be useful in future quantum information processing based on the C-GHZ state.
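
    For orientation, the target of such constructions is the GHZ form (|0...0> + |1...1>)/sqrt(2). The sketch below only builds that statevector numerically; it does not model the cross-Kerr circuit or the logic-qubit (C-GHZ) encoding itself.

```python
# Sketch: an n-qubit GHZ statevector, equal amplitude on |00...0> and |11...1>.
import math

def ghz(n):
    dim = 2 ** n
    amp = 1 / math.sqrt(2)
    state = [0.0] * dim
    state[0] = amp          # |00...0>
    state[dim - 1] = amp    # |11...1>
    return state

psi = ghz(3)
norm = sum(a * a for a in psi)
print(round(norm, 10))           # 1.0
print(psi[0] == psi[7] != 0.0)   # True: equal weight on |000> and |111>
```
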

  14. Semantic Technologies and Bio-Ontologies.

    PubMed

    Gutierrez, Fernando

    2017-01-01

    As the information available through data repositories constantly grows, the need for automated mechanisms for linking, querying, and sharing data has become a relevant factor both in research and industry. This situation is most evident in research fields such as the life sciences, where new experiments by different research groups are constantly generating new information regarding a wide variety of related study objects. However, current methods for representing information and knowledge are not suited to machine processing. The Semantic Technologies are a set of standards and protocols intended to provide methods for representing and handling data that encourage reusability of information and are machine-readable. In this chapter, we provide a brief introduction to Semantic Technologies and how these protocols and standards have been incorporated into the life sciences to facilitate dissemination of and access to information.

  15. Energy Efficient and QoS sensitive Routing Protocol for Ad Hoc Networks

    NASA Astrophysics Data System (ADS)

    Saeed Tanoli, Tariq; Khalid Khan, Muhammad

    2013-12-01

    Efficient routing is an important part of wireless ad hoc networks. Since ad hoc networks have limited resources, there are many constraints on bandwidth, battery consumption, processing cycles, etc. Reliability is also necessary, since there is no allowance for invalid or incomplete information (and expired data is useless). Various protocols perform routing by considering one parameter while ignoring others. In this paper we present a protocol that finds routes on the basis of the bandwidth, energy, and mobility of the nodes participating in the communication.
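
    Combining the three metrics into a single route choice can be sketched as a weighted score: higher bandwidth and residual energy are preferred, higher mobility is penalized. The weights and candidate routes below are hypothetical, not the paper's.

```python
# Sketch: multi-metric route selection over bandwidth, energy, and mobility
# (all values normalized to [0, 1]).
def route_score(route, w_bw=0.4, w_energy=0.4, w_mobility=0.2):
    return (w_bw * route["bandwidth"]
            + w_energy * route["energy"]
            - w_mobility * route["mobility"])   # mobile nodes break routes

routes = [
    {"id": "A", "bandwidth": 0.9, "energy": 0.3, "mobility": 0.8},
    {"id": "B", "bandwidth": 0.7, "energy": 0.8, "mobility": 0.2},
]
best = max(routes, key=route_score)
print(best["id"])  # B: slightly less bandwidth, but stabler and longer-lived
```

    Route A has the most bandwidth, yet B wins because a single-parameter choice would ignore A's low residual energy and high mobility, exactly the trade-off the abstract argues for.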

  16. PCoD Lite - Using an Interim PCoD Protocol to Assess the Effects of Disturbance Associated with US Navy Exercises on Marine Mammal Populations

    DTIC Science & Technology

    2015-09-30

    1 DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. PCoD Lite - Using an Interim PCoD Protocol to Assess...US National Research Council (NRC 2005). Here, we provide an insight into how the Interim PCoD approach (Harwood et al. 2014, King et al. 2015...illustrate how the interim PCoD protocol can be used to inform the process of determining whether or not Navy activities are likely to have an impact on

  17. Remote Entanglement by Coherent Multiplication of Concurrent Quantum Signals

    NASA Astrophysics Data System (ADS)

    Roy, Ananda; Jiang, Liang; Stone, A. Douglas; Devoret, Michel

    2015-10-01

    Concurrent remote entanglement of distant, noninteracting quantum entities is a crucial function for quantum information processing. In contrast with the existing protocols which employ the addition of signals to generate entanglement between two remote qubits, the continuous variable protocol we present is based on the multiplication of signals. This protocol can be straightforwardly implemented by a novel Josephson junction mixing circuit. Our scheme would be able to generate provable entanglement even in the presence of practical imperfections: finite quantum efficiency of detectors and undesired photon loss in current state-of-the-art devices.

  18. Community of Practice: A Path to Strategic Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nancy M. Carlson

    2003-04-01

    To explore the concept of community of practice, the research initially concentrates on a strategic business process in a research and applied engineering laboratory discovering essential communication tools and processes needed to cultivate a high functioning cross-disciplinary team engaged in proposal preparation. Qualitative research in the human ecology of the proposal process blends topic-oriented ethnography and grounded theory and includes an innovative addition to qualitative interviewing, called meta-inquiry. Meta-inquiry uses an initial interview protocol with a homogeneous pool of informants to enhance the researcher's sensitivity to the unique cultures involved in the proposal process before developing a formal interview protocol. In this study the preanalysis process uses data from editors, graphic artists, text processors, and production coordinators to assess, modify, enhance, and focus the formal interview protocol with scientists, engineers, and technical managers, the heterogeneous informants. Thus this human ecology-based interview protocol values homogeneous and heterogeneous informant data and acquires data from which concepts, categories, properties, and both substantive and formal theory emerges. The research discovers the five essential processes of owning, visioning, reviewing, producing, and contributing for strategic learning to occur in a proposal community of practice. The apprenticeship, developmental, and nurturing perspectives of adult learning provide the proposal community of practice with cohesion, interdependence, and caring, while core and boundary practices provide insight into the tacit and explicit dimensions of the proposal process. By making these dimensions explicit, the necessary competencies, absorptive capacity, and capabilities needed for strategic learning are discovered.
Substantive theory emerges and provides insight into the ability of the proposal community of practice to evolve, flourish, and adapt to the strategic advantage of the laboratory. The substantive theory explores the dimensions of owning, visioning, reviewing, producing, and contributing and their interrelationship to community learning dynamics. Through dialogue, creative tension, and imagination, the proposal community of practice focuses on actionable goals linked by proactively participating in practice, creating possibilities, evaluating and enhancing potential, producing a valued product, and confirming strategic value. Lastly, a formal theory emerges linking competency-capacity-capability, cohesion, interdependence, and caring as essential attributes of strategic learning communities.

  19. Using Ontologies to Formalize Services Specifications in Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Breitman, Karin Koogan; Filho, Aluizio Haendchen; Haeusler, Edward Hermann

    2004-01-01

    One key issue in multi-agent systems (MAS) is their ability to interact and exchange information autonomously across applications. To secure agent interoperability, designers must rely on a communication protocol that allows software agents to exchange meaningful information. In this paper we propose using ontologies as such a communication protocol. Ontologies capture the semantics of the operations and services provided by agents, allowing interoperability and information exchange in a MAS. An ontology is a formal, machine-processable representation that captures the semantics of a domain and from which meaningful information can be derived by logical inference. In our proposal we use a formal knowledge representation language (OWL) that translates into Description Logics (a subset of first-order logic), thus eliminating ambiguities and providing a solid basis for machine-based inference. The main contribution of this approach is to make the requirements explicit and centralize the specification in a single document (the ontology itself), while providing a formal, unambiguous representation that can be processed by automated inference machines.

  20. Analytical approach to cross-layer protocol optimization in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    2008-04-01

    In the distributed operations of route discovery and maintenance, strong interaction occurs across mobile ad hoc network (MANET) protocol layers. Quality of service (QoS) requirements of multimedia service classes must be satisfied by the cross-layer protocol, along with minimization of the distributed power consumption at nodes and along routes, subject to battery-limited energy constraints. In previous work by the author, cross-layer interactions in the MANET protocol are modeled in terms of a set of concatenated design parameters and associated resource levels by multivariate point processes (MVPPs). Determination of the "best" cross-layer design is carried out using the optimal control of martingale representations of the MVPPs. In contrast to the competitive interaction among nodes in a MANET for multimedia services using limited resources, the interaction among the nodes of a wireless sensor network (WSN) is distributed and collaborative, based on the processing of data from a variety of sensors at nodes to satisfy common mission objectives. Sensor data originates at the nodes at the periphery of the WSN, is successively transported to other nodes for aggregation based on information-theoretic measures of correlation and ultimately sent as information to one or more destination (decision) nodes. The "multimedia services" in the MANET model are replaced by multiple types of sensors, e.g., audio, seismic, imaging, thermal, etc., at the nodes; the QoS metrics associated with MANETs become those associated with the quality of fused information flow, i.e., throughput, delay, packet error rate, data correlation, etc. Significantly, the essential analytical approach to MANET cross-layer optimization, now based on the MVPPs for discrete random events occurring in the WSN, can be applied to develop the stochastic characteristics and optimality conditions for cross-layer designs of sensor network protocols. 
Functional dependencies of WSN performance metrics are described in terms of the concatenated protocol parameters. New source-to-destination routes are sought that optimize cross-layer interdependencies to achieve the "best available" performance in the WSN. The protocol design, modified from a known reactive protocol, adapts the achievable performance to the transient network conditions and resource levels. Control of network behavior is realized through the conditional rates of the MVPPs. Optimal cross-layer protocol parameters are determined by stochastic dynamic programming conditions derived from models of transient packetized sensor data flows. Moreover, the defining conditions for WSN configurations, grouping sensor nodes into clusters and establishing data aggregation at processing nodes within those clusters, lead to computationally tractable solutions to the stochastic differential equations that describe network dynamics. Closed-form solution characteristics provide an alternative to the "directed diffusion" methods for resource-efficient WSN protocols published previously by other researchers. Performance verification of the resulting cross-layer designs is found by embedding the optimality conditions for the protocols in actual WSN scenarios replicated in a wireless network simulation environment. Performance tradeoffs among protocol parameters are left for a sequel to this paper.
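
    At a much-simplified level, choosing optimal concatenated protocol parameters under an energy constraint reduces to a constrained search over the parameter grid. The utility and energy models below are toy assumptions for illustration, not the paper's martingale-based stochastic formulation.

```python
import itertools

# Hypothetical concatenated cross-layer parameters.
powers = [1, 2, 3]   # transmit power levels
rates = [250, 500]   # data rates (kbps)
ENERGY_BUDGET = 5.0  # per-node energy budget (arbitrary units)


def utility(p, r):
    """Toy throughput model: higher power improves link success."""
    return r * (1 - 0.5 / p)


def energy(p, r):
    """Toy energy-cost model combining power and rate."""
    return p * 1.2 + r / 500.0


# Exhaustive search over the feasible (power, rate) grid: the cross-layer
# "design" is the parameter pair maximizing utility within the budget.
best = max(
    ((p, r) for p, r in itertools.product(powers, rates)
     if energy(p, r) <= ENERGY_BUDGET),
    key=lambda pr: utility(*pr),
)
```

The paper's contribution is doing this kind of optimization dynamically, via conditional rates of the point processes, rather than by static enumeration as sketched here.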

  1. Advanced Map For Real-Time Process Control

    NASA Astrophysics Data System (ADS)

    Shiobara, Yasuhisa; Matsudaira, Takayuki; Sashida, Yoshio; Chikuma, Makoto

    1987-10-01

    MAP, a communications protocol for factory automation proposed by General Motors [1], has been accepted by users throughout the world and is rapidly becoming a user standard. In fact, it is now a LAN standard for factory automation. MAP is intended to interconnect different devices, such as computers and programmable devices, made by different manufacturers, enabling them to exchange information. It is based on the OSI intercomputer communications protocol standard under development by the ISO. With progress and standardization, MAP is being investigated for application to process control fields other than factory automation [2]. The transmission response time of the network system and centralized management of data exchanged with various devices for distributed control are important in the case of a real-time process control with programmable controllers, computers, and instruments connected to a LAN system. MAP/EPA and MINI MAP aim at reduced overhead in protocol processing and enhanced transmission response. If applied to real-time process control, a protocol based on point-to-point and request-response transactions limits throughput and transmission response. This paper describes an advanced MAP LAN system applied to real-time process control by adding a new data transmission control that performs multicasting communication voluntarily and periodically in the priority order of data to be exchanged.
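
    The priority-ordered periodic multicast described above can be sketched with a priority queue: each transmission cycle, the scheduler drains pending data items in priority order and multicasts them without waiting for requests. Item names and priority values are hypothetical.

```python
import heapq

# Pending data items as (priority, name); lower number = higher priority.
items = [(2, "temperature"), (1, "alarm"), (3, "log"), (1, "pressure")]
heapq.heapify(items)


def next_cycle(heap):
    """Drain one periodic multicast cycle in priority order."""
    order = []
    while heap:
        _, name = heapq.heappop(heap)
        order.append(name)  # in a real system: multicast this item now
    return order


order = next_cycle(items)
```

Sending high-priority process data first and unsolicited is what lets the scheme bound response time better than request-response transactions.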

  2. Generation and confirmation of a (100 x 100)-dimensional entangled quantum system.

    PubMed

    Krenn, Mario; Huber, Marcus; Fickler, Robert; Lapkiewicz, Radek; Ramelow, Sven; Zeilinger, Anton

    2014-04-29

    Entangled quantum systems have properties that have fundamentally overthrown the classical worldview. Increasing the complexity of entangled states by expanding their dimensionality allows the implementation of novel fundamental tests of nature, and moreover also enables genuinely new protocols for quantum information processing. Here we present the creation of a (100 × 100)-dimensional entangled quantum system, using spatial modes of photons. For its verification we develop a novel nonlinear criterion which infers entanglement dimensionality of a global state by using only information about its subspace correlations. This allows very practical experimental implementation as well as highly efficient extraction of entanglement dimensionality information. Applications in quantum cryptography and other protocols are very promising.

  3. Generation and confirmation of a (100 × 100)-dimensional entangled quantum system

    PubMed Central

    Krenn, Mario; Huber, Marcus; Fickler, Robert; Lapkiewicz, Radek; Ramelow, Sven; Zeilinger, Anton

    2014-01-01

    Entangled quantum systems have properties that have fundamentally overthrown the classical worldview. Increasing the complexity of entangled states by expanding their dimensionality allows the implementation of novel fundamental tests of nature, and moreover also enables genuinely new protocols for quantum information processing. Here we present the creation of a (100 × 100)-dimensional entangled quantum system, using spatial modes of photons. For its verification we develop a novel nonlinear criterion which infers entanglement dimensionality of a global state by using only information about its subspace correlations. This allows very practical experimental implementation as well as highly efficient extraction of entanglement dimensionality information. Applications in quantum cryptography and other protocols are very promising. PMID:24706902

  4. A Novel College Network Resource Management Method using Cloud Computing

    NASA Astrophysics Data System (ADS)

    Lin, Chen

    At present, the informatization of colleges consists mainly of campus network construction and management information systems, and many problems arise during this process. Cloud computing is a development of distributed processing, parallel processing, and grid computing in which data are stored in the cloud and software and services are placed in the cloud, built on top of various standards and protocols, and accessible through all kinds of devices. This article introduces cloud computing and its functions, analyzes the existing problems of college network resource management, and applies cloud computing technology and methods to the construction of a college information sharing platform.

  5. Task-technology fit of video telehealth for nurses in an outpatient clinic setting.

    PubMed

    Cady, Rhonda G; Finkelstein, Stanley M

    2014-07-01

    Incorporating telehealth into outpatient care delivery supports management of consumer health between clinic visits. Task-technology fit is a framework for understanding how technology helps and/or hinders a person during work processes. Evaluating the task-technology fit of video telehealth for personnel working in a pediatric outpatient clinic and providing care between clinic visits ensures the information provided matches the information needed to support work processes. The workflow of advanced practice registered nurse (APRN) care coordination provided via telephone and video telehealth was described and measured using a mixed-methods workflow analysis protocol that incorporated cognitive ethnography and time-motion study. Qualitative and quantitative results were merged and analyzed within the task-technology fit framework to determine the workflow fit of video telehealth for APRN care coordination. Incorporating video telehealth into APRN care coordination workflow provided visual information unavailable during telephone interactions. Despite additional tasks and interactions needed to obtain the visual information, APRN workflow efficiency, as measured by time, was not significantly changed. Analyzed within the task-technology fit framework, the increased visual information afforded by video telehealth supported the assessment and diagnostic information needs of the APRN. Telehealth must provide the right information to the right clinician at the right time. Evaluating task-technology fit using a mixed-methods protocol ensured rigorous analysis of fit within work processes and identified workflows that benefit most from the technology.

  6. Improving investigational drug service operations through development of an innovative computer system.

    PubMed

    Sweet, Burgunda V; Tamer, Helen R; Siden, Rivka; McCreadie, Scott R; McGregory, Michael E; Benner, Todd; Tankanow, Roberta M

    2008-05-15

    The development of a computerized system for protocol management, dispensing, inventory accountability, and billing by the investigational drug service (IDS) of a university health system is described. After an unsuccessful search for a commercial system that would accommodate the variation among investigational protocols and meet regulatory requirements, the IDS worked with the health-system pharmacy's information technology staff and informatics pharmacists to develop its own system. The informatics pharmacists observed work-flow and information capture in the IDS and identified opportunities for improved efficiency with an automated system. An iterative build-test-design process was used to provide the flexibility needed for individual protocols. The intent was to design a system that would support most IDS processes, using components that would allow automated backup and redundancies. A browser-based system was chosen to allow remote access. Servers, bar-code scanners, and printers were integrated into the final system design. Initial implementation involved 10 investigational protocols chosen on the basis of dispensing volume and complexity of study design. Other protocols were added over a two-year period; all studies whose drugs were dispensed from the IDS were added, followed by those for which the drugs were dispensed from decentralized pharmacy areas. The IDS briefly used temporary staff to free pharmacist and technician time for system implementation. Decentralized pharmacy areas that rarely dispense investigational drugs continue to use manual processes, with subsequent data transcription into the system. Through the university's technology transfer division, the system was licensed by an external company for sale to other IDSs. The WebIDS system has improved daily operations, enhanced safety and efficiency, and helped meet regulatory requirements for investigational drugs.

  7. From Wagons to Race Cars, At Least Now We Have A Chassis

    NASA Astrophysics Data System (ADS)

    Morrow, A. L.

    2003-12-01

    For the past 30 years Very-Long-Baseline Interferometry (VLBI) has provided astronomers with the most accurate measurements to date of both distant radio sources and the tectonic plates. The resolutions attainable through VLBI are orders of magnitude better than those of other instruments. To transmit radio signals collected at different sites to a correlator for processing, the VLBI data were stored on magnetic tapes, which were then shipped through the mail to the central processing site. This was not only arduous and inefficient, it was also costly. Now this means of shipment can be replaced by global high-speed networks; this means of transmission is called e-VLBI. New protocols must be developed so that e-VLBI can become a proficient high-bandwidth background user. The protocol agreed upon uses a Real-time Transport Protocol (RTP) framework to preserve timing information and synchronization. The RTP stream is transported using the Internet User Datagram Protocol (UDP), with the RTP Control Protocol (RTCP) monitoring the network's performance. When this protocol is fully functional, astronomers will be able to observe all over the world and receive results in real time.
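
    The essence of the RTP-over-UDP approach is that timing metadata travels inside each datagram, since UDP itself preserves neither ordering nor timing. The following is a minimal sketch of such a header, far simpler than the real RTP layout defined in RFC 3550; field widths are illustrative.

```python
import struct

# Toy RTP-like header: 16-bit sequence number + 32-bit timestamp,
# network byte order. Real RTP carries more fields (SSRC, payload type...).
HEADER = struct.Struct("!HI")


def make_packet(seq: int, timestamp: int, payload: bytes) -> bytes:
    """Prepend sequence/timestamp so the receiver can reorder and
    re-time samples that UDP may deliver out of order."""
    return HEADER.pack(seq, timestamp) + payload


def parse_packet(pkt: bytes):
    seq, ts = HEADER.unpack_from(pkt)
    return seq, ts, pkt[HEADER.size:]


seq, ts, payload = parse_packet(make_packet(7, 123456, b"vlbi-samples"))
```

The receiver uses the sequence numbers to detect loss and the timestamps to keep telescope data streams synchronized at the correlator.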

  8. Pilot evaluation of a method to assess prescribers' information processing of medication alerts.

    PubMed

    Russ, Alissa L; Melton, Brittany L; Daggy, Joanne K; Saleem, Jason J

    2017-02-01

    Prescribers commonly receive alerts during medication ordering. Prescribers work in a complex, time-pressured environment; to enhance the effectiveness of safety alerts, the effort needed to cognitively process these alerts should be minimized. Methods to evaluate the extent to which computerized alerts support prescribers' information processing are lacking. To develop a methodological protocol to assess the extent to which alerts support prescribers' information processing at-a-glance; specifically, the incorporation of information into their working memory. We hypothesized that the method would be feasible and that we would be able to detect a significant difference in prescribers' information processing with a revised alert display that incorporates warning design guidelines compared to the original alert display. A counterbalanced, within-subject study was conducted with 20 prescribers in a human-computer interaction laboratory. We tested a single alert that was displayed in two different ways. Prescribers were informed that an alert would appear for 10s. After the alert was shown, a white screen was displayed, and prescribers were asked to verbally describe what they saw; indicate how many total warnings; and describe anything else they remembered about the alert. We measured information processing via the accuracy of prescribers' free recall and their ability to identify that three warning messages were present. Two analysts independently evaluated participants' responses against a comprehensive catalog of alert elements and then discussed discrepancies until reaching consensus. This feasibility study demonstrated that the method seemed to be effective for evaluating prescribers' information processing of medication alert displays. With this method, we were able to detect significant differences in prescribers' recall of alert information. 
The proportion of total data elements that prescribers were able to accurately recall was significantly greater for the revised versus original alert display (p=0.006). With the revised display, more prescribers accurately reported that three warnings were shown (p=0.002). The methodological protocol was feasible for evaluating the alert display and yielded important findings on prescribers' information processing. Study methods supplement traditional usability evaluation methods and may be useful for evaluating information processing of other healthcare technologies. Published by Elsevier Inc.

  9. A slotted access control protocol for metropolitan WDM ring networks

    NASA Astrophysics Data System (ADS)

    Baziana, P. A.; Pountourakis, I. E.

    2009-03-01

    In this study we focus on the serious scalability problems that many access protocols for WDM ring networks introduce due to the use of a dedicated wavelength per access node for either transmission or reception. We propose an efficient slotted MAC protocol suitable for WDM ring metropolitan area networks. The proposed network architecture employs a separate wavelength for control information exchange prior to the data packet transmission. Each access node is equipped with a pair of tunable transceivers for data communication and a pair of fixed-tuned transceivers for control information exchange. Also, each access node includes a set of fixed delay lines for synchronization, to hold the data packets while the control information is processed. An efficient access algorithm is applied to avoid both data-wavelength and receiver collisions. In our protocol, each access node is capable of transmitting and receiving over any of the data wavelengths, addressing the scalability issues. Two different slot reuse schemes are assumed: the source and the destination stripping schemes. For both schemes, performance measures are evaluated via an analytic model. The analytical results are validated by a discrete event simulation model that uses Poisson traffic sources. Simulation results show that the proposed protocol achieves efficient bandwidth utilization, especially under high load. Also, comparative simulation results prove that our protocol achieves significant performance improvement as compared with other WDMA protocols which restrict transmission over a dedicated data wavelength. Finally, performance measures are explored for diverse buffer sizes, numbers of access nodes, and numbers of data wavelengths.
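
    The control-wavelength arbitration that avoids data-wavelength and receiver collisions can be sketched as a grant loop over the requests seen in a control slot: a request is deferred if its data wavelength or its destination receiver is already claimed. Node names and the first-come tie-breaking rule are illustrative assumptions, not the paper's exact algorithm.

```python
def arbitrate(requests):
    """Grant (node, wavelength, destination) requests for one slot,
    skipping any request that would collide on a wavelength or receiver."""
    used_wavelengths, busy_receivers, granted = set(), set(), []
    for node, wavelength, dest in requests:
        if wavelength in used_wavelengths or dest in busy_receivers:
            continue  # collision: node retries in a later slot
        used_wavelengths.add(wavelength)
        busy_receivers.add(dest)
        granted.append(node)
    return granted


reqs = [("A", 1, "X"),  # granted
        ("B", 1, "Y"),  # deferred: wavelength 1 already claimed by A
        ("C", 2, "X"),  # deferred: receiver X already busy with A
        ("D", 3, "Z")]  # granted
granted = arbitrate(reqs)
```

Because every node can tune to any data wavelength, only the per-slot arbitration (not a dedicated home wavelength) limits access, which is the scalability gain claimed above.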

  10. Deterministic quantum state transfer and remote entanglement using microwave photons.

    PubMed

    Kurpiers, P; Magnard, P; Walter, T; Royer, B; Pechal, M; Heinsoo, J; Salathé, Y; Akin, A; Storz, S; Besse, J-C; Gasparinetti, S; Blais, A; Wallraff, A

    2018-06-01

    Sharing information coherently between nodes of a quantum network is fundamental to distributed quantum information processing. In this scheme, the computation is divided into subroutines and performed on several smaller quantum registers that are connected by classical and quantum channels [1]. A direct quantum channel, which connects nodes deterministically rather than probabilistically, achieves larger entanglement rates between nodes and is advantageous for distributed fault-tolerant quantum computation [2]. Here we implement deterministic state-transfer and entanglement protocols between two superconducting qubits fabricated on separate chips. Superconducting circuits [3] constitute a universal quantum node [4] that is capable of sending, receiving, storing and processing quantum information [5-8]. Our implementation is based on an all-microwave cavity-assisted Raman process [9], which entangles or transfers the qubit state of a transmon-type artificial atom [10] with a time-symmetric itinerant single photon. We transfer qubit states by absorbing these itinerant photons at the receiving node, with a probability of 98.1 ± 0.1 per cent, achieving a transfer-process fidelity of 80.02 ± 0.07 per cent for a protocol duration of only 180 nanoseconds. We also prepare remote entanglement on demand with a fidelity as high as 78.9 ± 0.1 per cent at a rate of 50 kilohertz. Our results are in excellent agreement with numerical simulations based on a master-equation description of the system. This deterministic protocol has the potential to be used for quantum computing distributed across different nodes of a cryogenic network.

  11. Atomic entanglement purification and concentration using coherent state input-output process in low-Q cavity QED regime.

    PubMed

    Cao, Cong; Wang, Chuan; He, Ling-Yan; Zhang, Ru

    2013-02-25

    We investigate an atomic entanglement purification protocol based on the coherent state input-output process, working with a low-Q cavity in the atom-cavity intermediate-coupling region. The information of the entangled states is encoded in three-level single atoms confined in separate one-sided optical micro-cavities. Using the coherent state input-output process, we design a two-qubit parity check module (PCM), which allows quantum nondemolition measurement of the atomic qubits, and show its use for remote parties to distill a high-fidelity atomic entangled ensemble from an initial mixed-state ensemble nonlocally. The proposed scheme can further be used for entanglement concentration of unknown atomic states. Also, by exploiting the PCM, we describe a modified scheme for atomic entanglement concentration that introduces ancillary single atoms. As the coherent state input-output process is robust and scalable in realistic applications, and the detection in the PCM is based on the intensity of the outgoing coherent state, the present protocols may be widely used in large-scale, solid-state quantum repeaters and quantum information processing.
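
    The purification logic enabled by such a PCM can be caricatured with a classical Monte Carlo sketch: compare two noisy pairs and keep one only when the parity outcomes agree, which raises the fidelity of the kept ensemble above the input fidelity f. The binary error model below is a toy assumption standing in for the cavity-QED physics.

```python
import random


def pcm_round(f, n, rng):
    """One purification round over n pair-comparisons. Each 'pair' is
    good with probability f; agreeing outcomes (toy even parity) keep
    the first pair. Returns the fidelity of the kept ensemble."""
    kept, good = 0, 0
    for _ in range(n):
        a = rng.random() < f
        b = rng.random() < f
        if a == b:          # parity agreement: keep pair a
            kept += 1
            good += a
    return good / kept


# For f = 0.8 the kept-ensemble fidelity approaches f^2 / (f^2 + (1-f)^2).
fidelity = pcm_round(0.8, 100_000, random.Random(1))
```

The analytic limit for f = 0.8 is 0.64/0.68 ≈ 0.94, so one round already improves markedly on the input ensemble; the quantum protocol pays for this with discarded pairs, just as the sketch does.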

  12. Squeezed-state quantum key distribution with a Rindler observer

    NASA Astrophysics Data System (ADS)

    Zhou, Jian; Shi, Ronghua; Guo, Ying

    2018-03-01

    Lengthening the maximum transmission distance of quantum key distribution plays a vital role in quantum information processing. In this paper, we propose a directional squeezed-state protocol with signals detected by a Rindler observer in the relativistic quantum field framework. We derive an analytical solution to the transmission problem of squeezed states from the inertial sender to the accelerated receiver. The variance of the involved signal mode is closer to optimality than that of the coherent-state-based protocol. Simulation results show that the proposed protocol has better performance than the coherent-state counterpart especially in terms of the maximal transmission distance.

  13. Proposal for founding mistrustful quantum cryptography on coin tossing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kent, Adrian; Hewlett-Packard Laboratories, Filton Road, Stoke Gifford, Bristol BS34 8QZ,

    2003-07-01

    A significant branch of classical cryptography deals with the problems which arise when mistrustful parties need to generate, process, or exchange information. As Kilian showed a while ago, mistrustful classical cryptography can be founded on a single protocol, oblivious transfer, from which general secure multiparty computations can be built. The scope of mistrustful quantum cryptography is limited by no-go theorems, which rule out, inter alia, unconditionally secure quantum protocols for oblivious transfer or general secure two-party computations. These theorems apply even to protocols which take relativistic signaling constraints into account. The best that can be hoped for, in general, are quantum protocols which are computationally secure against quantum attack. Here a method is described for building a classically certified bit commitment, and hence every other mistrustful cryptographic task, from a secure coin-tossing protocol. No security proof is attempted, but reasons are sketched why these protocols might resist quantum computational attack.
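
    For contrast with the quantum construction, the classical notion of a computationally secure bit commitment can be sketched with a standard hash commitment; this is a stand-in illustration of the commit/reveal interface, not the paper's coin-tossing-based scheme.

```python
import hashlib
import secrets


def commit(bit: int):
    """Commit to a bit: publish the digest, keep the nonce secret.
    Hiding and binding hold only computationally (hash assumptions)."""
    nonce = secrets.token_bytes(16)
    digest = hashlib.sha256(bytes([bit]) + nonce).hexdigest()
    return digest, nonce


def reveal_ok(digest: str, bit: int, nonce: bytes) -> bool:
    """Verifier recomputes the digest from the revealed bit and nonce."""
    return hashlib.sha256(bytes([bit]) + nonce).hexdigest() == digest


c, n = commit(1)
```

The committer cannot later open the same digest as the other bit without finding a hash collision, which is exactly the computational (rather than unconditional) security the abstract says is the best achievable.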

  14. Conducting Clinically Based Intimate Partner Violence Research: Safety Protocol Recommendations.

    PubMed

    Anderson, Jocelyn C; Glass, Nancy E; Campbell, Jacquelyn C

    Maintaining safety is of utmost importance during research involving participants who have experienced intimate partner violence (IPV). Limited guidance on safety protocols to protect participants is available, particularly information related to technology-based approaches to informed consent, data collection, and contacting participants during the course of a study. The purpose of the article is to provide details on the safety protocol developed and utilized with women receiving care at an urban HIV clinic and who were taking part in an observational study of IPV, mental health symptoms, and substance abuse and their relationship to HIV treatment adherence. The protocol presents the technological strategies to promote safety and allow autonomy in participant decision-making throughout the research process, including Voice over Internet Protocol telephone numbers, and tablet-based eligibility screening and data collection. Protocols for management of participants at risk for suicide and/or intimate partner homicide that included automated high-risk messaging to participants and research staff and facilitated disclosure of risk to clinical staff based on participant preferences are discussed. Use of technology and partnership with clinic staff helped to provide an environment where research regarding IPV could be conducted without undue burden or risk to participants. Utilizing tablet-based survey administration provided multiple practical and safety benefits for participants. Most women who screened into high-risk categories for suicide or intimate partner homicide did not choose to have their results shared with their healthcare providers, indicating the importance of allowing participants control over information sharing whenever possible.

  15. Thinking graphically: Connecting vision and cognition during graph comprehension.

    PubMed

    Ratwani, Raj M; Trafton, J Gregory; Boehm-Davis, Deborah A

    2008-03-01

    Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive integration. During visual integration, pattern recognition processes are used to form visual clusters of information; these visual clusters are then used to reason about the graph during cognitive integration. In 3 experiments, the processes required to extract specific information and to integrate information were examined by collecting verbal protocol and eye movement data. Results supported the task analytic theories for specific information extraction and the processes of visual and cognitive integration for integrative questions. Further, the integrative processes scaled up as graph complexity increased, highlighting the importance of these processes for integration in more complex graphs. Finally, based on this framework, design principles to improve both visual and cognitive integration are described. PsycINFO Database Record (c) 2008 APA, all rights reserved

  16. A streamlined collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, exemplified by the Indonesian Biodiversity Discovery and Information System (IndoBioSys).

    PubMed

    Schmidt, Olga; Hausmann, Axel; Cancian de Araujo, Bruno; Sutrisno, Hari; Peggie, Djunijanti; Schmidt, Stefan

    2017-01-01

    Here we present a general collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, and a comparison with alternative preserving and vouchering methods. About 98% of the sequenced specimens processed using the present collecting and preparation protocol yielded sequences with more than 500 base pairs. The study is based on the first outcomes of the Indonesian Biodiversity Discovery and Information System (IndoBioSys). IndoBioSys is a German-Indonesian research project that is conducted by the Museum für Naturkunde in Berlin and the Zoologische Staatssammlung München, in close cooperation with the Research Center for Biology - Indonesian Institute of Sciences (RCB-LIPI, Bogor).

  17. Application Protocol, Initial Graphics Exchange Specification (IGES), Layered Electrical Product

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O`Connell, L.J.

    1994-12-01

    An application protocol is an information systems engineering view of a specific product. The view represents an agreement on the generic activities needed to design and fabricate the product, the agreement on the information needed to support those activities, and the specific constructs of a product data standard for use in transferring some or all of the information required. This application protocol describes the data for electrical and electronic products in terms of a product description standard called the Initial Graphics Exchange Specification (IGES). More specifically, the Layered Electrical Product IGES Application Protocol (AP) specifies the mechanisms for defining and exchanging computer models and their associated data for those products which have been designed in two-dimensional geometry so as to be produced as a series of layers in IGES format. The AP defines the appropriateness of the data items for describing the geometry of the various parts of a product (shape and location), the connectivity, and the processing and material characteristics. Excluded are the behavioral requirements which the product was intended to satisfy, except as those requirements have been recorded as design rules or product testing requirements.

  18. Layered Electrical Product Application Protocol (AP). Draft: Initial Graphics Exchange Specification (IGES)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-09-01

    An application protocol is an information systems engineering view of a specific product. The view represents an agreement on the generic activities needed to design and fabricate the product, the agreement on the information needed to support those activities, and the specific constructs of a product data standard for use in transferring some or all of the information required. This application protocol describes the data for electrical and electronic products in terms of a product description standard called the Initial Graphics Exchange Specification (IGES). More specifically, the Layered Electrical Product IGES Application Protocol (AP) specifies the mechanisms for defining and exchanging computer models and their associated data for those products which have been designed in two-dimensional geometry so as to be produced as a series of layers in IGES format. The AP defines the appropriateness of the data items for describing the geometry of the various parts of a product (shape and location), the connectivity, and the processing and material characteristics. Excluded are the behavioral requirements which the product was intended to satisfy, except as those requirements have been recorded as design rules or product testing requirements.

  19. Bearer channel control protocol for the dynamic VB5.2 interface in ATM access networks

    NASA Astrophysics Data System (ADS)

    Fragoulopoulos, Stratos K.; Mavrommatis, K. I.; Venieris, Iakovos S.

    1996-12-01

    In multi-vendor systems, a customer connected to an Access Network (AN) must be capable of selecting a specific Service Node (SN) according to the services the SN provides. The multiplicity of technologically varying ANs calls for the definition of a standard reference point between the AN and the SN, widely known as the VB interface. Two versions are currently offered: the VB5.1 is simpler to implement but is not as flexible as the VB5.2, which supports switched connections. The VB5.2 functionality is closely coupled to the Broadband Bearer Channel Connection Protocol (B-BCCP). The B-BCCP is used for conveying the necessary information for dynamic resource allocation, traffic policing and routing in the AN, as well as for exchanging information about the status of the AN before a new call is established by the SN. By relying on such a protocol for the exchange of information, instead of intercepting and interpreting signalling messages in the AN, the architecture of the AN is simplified because processing-related functionality is not duplicated. In this paper a prominent B-BCCP candidate is defined, called the Service node Access network Interaction Protocol.

  20. [Prevention and detection of obstetric violence: A need in the Spanish delivery rooms?].

    PubMed

    Freire Barja, Natalia; Luces Lago, Ana María; Mosquera Pan, Lucía; Tizón Bouza, Eva

    2016-01-01

    Obstetric violence (OV) is a type of violence perpetrated against pregnant women through acts such as disrespect for their autonomy and their freedom to decide. The increasing medicalization of the labour process seems to be associated with this type of violence. Our objective is to provide health professionals with the knowledge necessary to inform their patients about their rights and to recognize situations that may constitute violence during the care process. The literature search was conducted in the following databases: PubMed, Cochrane Database of Systematic Reviews, EMBASE, Joanna Briggs Institute, UpToDate and CUIDEN. The search was limited to articles published during the last five years. The following medical subject headings were used, in both English and Spanish: "humanizing delivery", "obstetrics", "medicalization" and "violence". The performance of harmful practices and the unjustified medicalization of the labour process represent a potential harm to pregnant women by action, violating their rights as a result. To prevent and eradicate this, new, less interventionist lines of work are being proposed. As health professionals we should promote the humanization of labour and inform women about the existing legislation, protocols and guidelines that offer adequate information based on the latest evidence, and promote their active role as patients. The health institutions are responsible for initiating this change by implementing protocols to guide the practice of the health professionals involved in the care of women during labour. These protocols should be based on the WHO recommendations.

  1. Kennedy Space Center Timing and Countdown Interface to Kennedy Ground Control Subsystem

    NASA Technical Reports Server (NTRS)

    Olsen, James C.

    2015-01-01

    Kennedy Ground Control System (KGCS) engineers at the National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) are developing a time-tagging process to enable reconstruction of the events during a launch countdown. Such a process can be useful in the case of anomalies or other situations where it is necessary to know the exact time an event occurred. It is thus critical for the timing information to be accurate. KGCS will synchronize all items with Coordinated Universal Time (UTC) obtained from the Timing and Countdown (T&CD) organization. Network Time Protocol (NTP) is the protocol currently in place for synchronizing UTC. However, NTP has a peak error that is too high for today's standards. Precision Time Protocol (PTP) is a newer protocol with a much smaller peak error. The focus of this project has been to implement a PTP solution on the network to increase timing accuracy while introducing and configuring the implementation of a firewall between T&CD and the KGCS network.
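    To make the synchronization mechanics concrete, the sketch below decodes the NTP 64-bit timestamp wire format (RFC 5905: 32-bit seconds since 1900 plus a 32-bit fraction) and computes the standard four-timestamp clock-offset estimate. This is a minimal stdlib illustration of how NTP-style synchronization works, not part of the KGCS or T&CD implementation.

    ```python
    # Minimal sketch of NTP timestamp handling (RFC 5905 wire format).
    # Assumption: we only decode the timestamp field and compute the textbook
    # offset estimate; this is not a full NTP (or PTP) client.

    NTP_UNIX_DELTA = 2208988800  # seconds between the NTP epoch (1900) and Unix epoch (1970)

    def ntp_to_unix(ntp_seconds: int, ntp_fraction: int) -> float:
        """Convert an NTP 64-bit timestamp (32-bit seconds + 32-bit fraction) to Unix time."""
        return (ntp_seconds - NTP_UNIX_DELTA) + ntp_fraction / 2**32

    def clock_offset(t1: float, t2: float, t3: float, t4: float) -> float:
        """Standard NTP offset estimate from the four exchange timestamps:
        t1 = client send, t2 = server receive, t3 = server send, t4 = client receive."""
        return ((t2 - t1) + (t3 - t4)) / 2
    ```

    The 32-bit fraction field bounds NTP's timestamp resolution to about 232 picoseconds; in practice the peak error is dominated by asymmetric network delay, which is what PTP's hardware timestamping reduces.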

  2. Age differences in decision making: a process methodology for examining strategic information processing.

    PubMed

    Johnson, M M

    1990-03-01

    This study explored the use of process tracing techniques in examining the decision-making processes of older and younger adults. Thirty-six college-age and thirty-six retirement-age participants decided which one of six cars they would purchase on the basis of computer-accessed data. They provided information search protocols. Results indicate that total time to reach a decision did not differ according to age. However, retirement-age participants used less information, spent more time viewing, and re-viewed fewer bits of information than college-age participants. Information search patterns differed markedly between age groups. Patterns of retirement-age adults indicated their use of noncompensatory decision rules which, according to decision-making literature (Payne, 1976), reduce cognitive processing demands. The patterns of the college-age adults indicated their use of compensatory decision rules, which have higher processing demands.
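    The compensatory versus noncompensatory distinction above can be sketched in code. The attribute names, values, and weights below are illustrative, not data from the study: a compensatory (weighted-additive) rule scores every attribute of every option, while a noncompensatory (conjunctive/cutoff) rule discards options on a single failed criterion, which is why it demands less information and less processing.

    ```python
    # Hypothetical car-choice data; names and numbers are illustrative only.
    cars = {
        "A": {"price": 7, "safety": 9, "mileage": 5},
        "B": {"price": 9, "safety": 6, "mileage": 8},
        "C": {"price": 4, "safety": 8, "mileage": 9},
    }

    def weighted_additive(options, weights):
        """Compensatory rule: a weakness on one attribute can be offset by
        strength on another, so every attribute of every option is inspected."""
        score = lambda attrs: sum(weights[a] * v for a, v in attrs.items())
        return max(options, key=lambda name: score(options[name]))

    def conjunctive(options, cutoffs):
        """Noncompensatory rule: reject any option failing a single cutoff;
        requires far fewer attribute inspections."""
        return [name for name, attrs in options.items()
                if all(attrs[a] >= c for a, c in cutoffs.items())]
    ```

    Under equal weights the compensatory rule picks the best overall trade-off, whereas the conjunctive rule simply filters, mirroring the reduced processing demands Payne (1976) describes.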

  3. Usefulness of multiqubit W-type states in quantum information processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, P.; Adhikari, S.; Kumar, A., E-mail: atulk@iitj.ac.in

    We analyze the efficiency of multiqubit W-type states as resources for quantum information. For this, we identify and generalize four-qubit W-type states. Our results show that these states can be used as resources for deterministic quantum information processing. The utility of the results, however, is limited by the availability of experimental setups to perform and distinguish multiqubit measurements. We therefore emphasize protocols where two users want to establish an optimal bipartite entanglement using the partially entangled W-type states. We find that for such practical purposes, four-qubit W-type states can be a better resource than three-qubit W-type states. For a dense coding protocol, our states can be used deterministically to send two bits of classical message by locally manipulating a single qubit. In addition, we propose a realistic experimental method to prepare the four-qubit W-type states using standard unitary operations and weak measurements.
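    The dense coding idea referenced above (two classical bits sent by manipulating a single qubit) can be simulated numerically. The sketch below implements the standard two-qubit protocol with a Bell pair, not the paper's four-qubit W-state scheme, as a minimal illustration of the mechanism: Alice's choice of Pauli operation on her half of the pair steers the joint state to one of four orthogonal Bell states, which Bob distinguishes with a CNOT-plus-Hadamard circuit.

    ```python
    import numpy as np

    # Toy simulation of standard two-qubit dense coding with a Bell pair.
    # NOTE: this is NOT the four-qubit W-state protocol of the paper; it is a
    # generic sketch of how two bits travel via one locally manipulated qubit.

    I2 = np.eye(2)
    X = np.array([[0, 1], [1, 0]])
    Z = np.diag([1, -1])
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    CNOT = np.array([[1, 0, 0, 0],   # control = first (Alice's) qubit
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])
    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

    def dense_code(b1: int, b2: int) -> tuple:
        """Alice applies Z^b1 X^b2 to her qubit; Bob decodes with CNOT then
        Hadamard and reads both bits in the computational basis."""
        encode = np.linalg.matrix_power(Z, b1) @ np.linalg.matrix_power(X, b2)
        state = np.kron(encode, I2) @ bell        # Alice's local operation
        state = np.kron(H, I2) @ (CNOT @ state)   # Bob's Bell-basis measurement circuit
        idx = int(np.argmax(np.abs(state)))       # final state is a basis vector (up to phase)
        return idx >> 1, idx & 1
    ```

    All four two-bit messages round-trip deterministically, which is the property the authors extend to four-qubit W-type resource states.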

  4. Information trade-offs for optical quantum communication.

    PubMed

    Wilde, Mark M; Hayden, Patrick; Guha, Saikat

    2012-04-06

    Recent work has precisely characterized the achievable trade-offs between three key information processing tasks-classical communication (generation or consumption), quantum communication (generation or consumption), and shared entanglement (distribution or consumption), measured in bits, qubits, and ebits per channel use, respectively. Slices and corner points of this three-dimensional region reduce to well-known protocols for quantum channels. A trade-off coding technique can attain any point in the region and can outperform time sharing between the best-known protocols for accomplishing each information processing task by itself. Previously, the benefits of trade-off coding that had been found were too small to be of practical value (viz., for the dephasing and the universal cloning machine channels). In this Letter, we demonstrate that the associated performance gains are in fact remarkably high for several physically relevant bosonic channels that model free-space or fiber-optic links, thermal-noise channels, and amplifiers. We show that significant performance gains from trade-off coding also apply when trading photon-number resources between transmitting public and private classical information simultaneously over secret-key-assisted bosonic channels. © 2012 American Physical Society

  5. Clinical guideline representation in a CDS: a human information processing method.

    PubMed

    Kilsdonk, Ellen; Riezebos, Rinke; Kremer, Leontien; Peute, Linda; Jaspers, Monique

    2012-01-01

    The Dutch Childhood Oncology Group (DCOG) has developed evidence-based guidelines for screening childhood cancer survivors for possible late complications of treatment. These paper-based guidelines appeared to not suit clinicians' information retrieval strategies; it was thus decided to communicate the guidelines through a Computerized Decision Support (CDS) tool. To ensure high usability of this tool, an analysis of clinicians' cognitive strategies in retrieving information from the paper-based guidelines was used as requirements elicitation method. An information processing model was developed through an analysis of think aloud protocols and used as input for the design of the CDS user interface. Usability analysis of the user interface showed that the navigational structure of the CDS tool fitted well with the clinicians' mental strategies employed in deciding on survivors screening protocols. Clinicians were more efficient and more complete in deciding on patient-tailored screening procedures when supported by the CDS tool than by the paper-based guideline booklet. The think-aloud method provided detailed insight into users' clinical work patterns that supported the design of a highly usable CDS system.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lima Bernardo, Bertúlio de, E-mail: bertulio.fisica@gmail.com

    We describe a novel quantum information protocol, which probabilistically entangles two distant photons that have never interacted. Unlike the entanglement swapping protocol, which requires two pairs of maximally entangled photons as input states as well as a Bell-state measurement (BSM), the present scheme requires only three photons: two to be entangled and a third to mediate the correlation, with no BSM, in a process that we call "entanglement mediation". Furthermore, in analyzing the paths of the photons in our arrangement, we conclude that one of them, the mediator, exchanges information with the two others simultaneously, which seems to be a new quantum-mechanical feature.

  7. iLAP: a workflow-driven software for experimental protocol development, data acquisition and analysis

    PubMed Central

    2009-01-01

    Background In recent years, the genome biology community has expended considerable effort to confront the challenges of managing heterogeneous data in a structured and organized way and developed laboratory information management systems (LIMS) for both raw and processed data. On the other hand, electronic notebooks were developed to record and manage scientific data, and facilitate data-sharing. Software that enables both management of large datasets and digital recording of laboratory procedures would serve a real need in laboratories using medium and high-throughput techniques. Results We have developed iLAP (Laboratory data management, Analysis, and Protocol development), a workflow-driven information management system specifically designed to create and manage experimental protocols, and to analyze and share laboratory data. The system combines experimental protocol development, wizard-based data acquisition, and high-throughput data analysis into a single, integrated system. We demonstrate the power and the flexibility of the platform using a microscopy case study based on a combinatorial multiple fluorescence in situ hybridization (m-FISH) protocol and 3D-image reconstruction. iLAP is freely available under the open source license AGPL from http://genome.tugraz.at/iLAP/. Conclusion iLAP is a flexible and versatile information management system, which has the potential to close the gap between electronic notebooks and LIMS and can therefore be of great value for a broad scientific community. PMID:19941647

  8. A Data Scheduling and Management Infrastructure for the TEAM Network

    NASA Astrophysics Data System (ADS)

    Andelman, S.; Baru, C.; Chandra, S.; Fegraus, E.; Lin, K.; Unwin, R.

    2009-04-01

    The objective of the Tropical Ecology Assessment and Monitoring Network (www.teamnetwork.org) is "To generate real time data for monitoring long-term trends in tropical biodiversity through a global network of TEAM sites (i.e. field stations in tropical forests), providing an early warning system on the status of biodiversity to effectively guide conservation action". To achieve this, the TEAM Network operates by collecting data via standardized protocols at TEAM Sites. The standardized TEAM protocols include the Climate, Vegetation and Terrestrial Vertebrate Protocols. Some sites also implement additional protocols. There are currently 7 TEAM Sites with plans to grow the network to 15 by June 30, 2009 and 50 TEAM Sites by the end of 2010. Climate Protocol The Climate Protocol entails the collection of climate data via meteorological stations located at the TEAM Sites. This includes information such as precipitation, temperature, wind direction and strength and various solar radiation measurements. Vegetation Protocol The Vegetation Protocol collects standardized information on tropical forest trees and lianas. A TEAM Site will have between 6-9 1ha plots where trees and lianas larger than a pre-specified size are mapped, identified and measured. This results in each TEAM Site repeatedly measuring between 3000-5000 trees annually. Terrestrial Vertebrate Protocol The Terrestrial Vertebrate Protocol collects standardized information on mid-sized tropical forest fauna (i.e. birds and mammals). This information is collected via camera traps (i.e. digital cameras with motion sensors housed in weather proof casings). The images taken by the camera trap are reviewed to identify what species are captured in the image by the camera trap. The image and the interpretation of what is in the image are the data for the Terrestrial Vertebrate Protocol. The amount of data collected through the TEAM protocols provides a significant yet exciting IT challenge. 
The TEAM Network is currently partnering with the San Diego Super Computer Center to build the data management infrastructure. Data collected from the three core protocols as well as others are currently made available through the TEAM Network portal, which provides the content management framework, the data scheduling and management framework, an administrative framework to implement and manage TEAM sites, collaborative tools and a number of tools and applications utilizing Google Map and Google Earth products. A critical element of the TEAM Network data management infrastructure is to make the data publicly available in as close to real-time as possible (the TEAM Network Data Use Policy: http://www.teamnetwork.org/en/data/policy). This requires two essential tasks to be accomplished, 1) A data collection schedule has to be planned, proposed and approved for a given TEAM site. This is a challenging process since TEAM sites are geographically distributed across the tropics and hence have different seasons where they schedule field sampling for the different TEAM protocols. Capturing this information and ensuring that TEAM sites follow the outlined legal contract is key to the data collection process and 2) A stream-lined and efficient information management system to ensure data collected from the field meet the minimum data standards (i.e. are of the highest scientific quality) and are securely transferred, archived, processed and be rapidly made publicaly available, as a finished consumable product via the TEAM Network portal. The TEAM Network is achieving these goals by implementing an end-to-end framework consisting of the Sampling Scheduler application and the Data Management Framework. Sampling Scheduler The Sampling Scheduler is a project management, calendar based portal application that will allow scientists at a TEAM site to schedule field sampling for each of the TEAM protocols implemented at that site. 
The sampling scheduler addresses the specific requirements established in the TEAM protocols along with the logistical scheduling needs of each TEAM Site. For example, each TEAM protocol defines when data must be collected (e.g. time of day, number of times per year, during which seasons, etc.) as well as where data must be collected (from which sampling units, which trees, etc.). Each TEAM Site has a limited number of resources and must create plans that both satisfy the requirements of the protocols and are logistically feasible for that TEAM Site. With 15 TEAM Sites (and many more coming soon), the schedules of each TEAM Site must be communicated to the Network Office to ensure data are being collected as scheduled and to address the many problems that arise when working in difficult environments like tropical forests. The Sampling Scheduler provides built-in proposal and approval functionality to ensure that the TEAM Sites and the Network Office are in sync, as well as the capability to modify schedules when needed. The Data Management Framework The Data Management Framework is a three-tier data ingestion, editing, and review application for protocols defined in the TEAM Network. The data ingestion framework provides online web forms for field personnel to submit and edit data collected at TEAM Sites. These web forms are accessible from the TEAM content management site. Once the data are securely uploaded, curated, processed, and approved, they are made publicly available for consumption by the scientific community. The Data Management Framework, when combined with the Sampling Scheduler, provides a closed-loop data scheduling and management infrastructure: everything is included, from the data collection plan, through tools to input, modify, and curate data, to reviewing and running QA/QC tests and verifying that data are collected as planned. Finally, TEAM Network data are available for download via the Data Query and Download Application. 
This application utilizes a Google Maps custom interface to search, visualize, and download TEAM Network data. References • TEAM Network, http://www.teamnetwork.org • Center for Applied Biodiversity Science, Conservation International. http://science.conservation.org/portal/server.pt • TEAM Data Query and Download Application, http://www.teamnetwork.org/en/data/query

  9. SU-E-P-03: Implementing a Low Dose Lung Screening CT Program Meeting Regulatory Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaFrance, M; Marsh, S; O'Donnell, G

    Purpose: To provide guidance to radiology departments intending to implement a low dose CT lung screening program using different CT scanners and multiple techniques, within the framework of the required state regulations. Method: Meeting the state requirements for implementing a low dose CT lung protocol required working with the Radiology and Pulmonary Departments to set up a low dose screening protocol designed to reduce the radiation burden to the patients enrolled. Radiation dose measurements (CTDIvol) were made for various CT manufacturers (Siemens 16, Siemens 64, Philips 64, and Neusoft 128) for three different weight-based protocols. All scans were reviewed by the radiologist. Prior to starting a low dose lung screening protocol, information had to be submitted to the state for approval. Performing a healing arts screening protocol requires extensive information: not only the name and address of the applicant but a detailed description of the disease, the x-ray examination, and the population to be examined. The unit had to be tested by a qualified expert using the technique charts, and the credentials of all operators, supervisors, and radiologists had to be submitted to the state. Results: All the appropriate documentation was sent to the state for review. The measured results comparing the low dose protocol with the default adult chest protocol showed a dose reduction of 65% for small (100-150 lb) patients, 75% for medium (151-250 lb) patients, and 55% for large (over 250 lb) patients. Conclusion: Measured results indicated that the low dose protocol indeed lowered the screening patient's radiation dose, and the institution was able to submit the protocol to the state's regulators.

  10. An Energy Balanced and Lifetime Extended Routing Protocol for Underwater Sensor Networks.

    PubMed

    Wang, Hao; Wang, Shilian; Zhang, Eryang; Lu, Luxi

    2018-05-17

    Energy limitation is an adverse problem in designing routing protocols for underwater sensor networks (UWSNs). To prolong the network lifetime with limited battery power, an energy balanced and efficient routing protocol, called energy balanced and lifetime extended routing protocol (EBLE), is proposed in this paper. The proposed EBLE not only balances traffic loads according to the residual energy, but also optimizes data transmissions by selecting low-cost paths. Two phases are operated in the EBLE data transmission process: (1) candidate forwarding set selection phase and (2) data transmission phase. In candidate forwarding set selection phase, nodes update candidate forwarding nodes by broadcasting the position and residual energy level information. The cost value of available nodes is calculated and stored in each sensor node. Then in data transmission phase, high residual energy and relatively low-cost paths are selected based on the cost function and residual energy level information. We also introduce detailed analysis of optimal energy consumption in UWSNs. Numerical simulation results on a variety of node distributions and data load distributions prove that EBLE outperforms other routing protocols (BTM, BEAR and direct transmission) in terms of network lifetime and energy efficiency.
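    The two-phase selection described above can be sketched as follows. The cost function here (weighting distance-to-sink against depleted energy) and the field names are illustrative assumptions; the paper's exact cost formula and messaging format are not reproduced.

    ```python
    # Hedged sketch of EBLE-style forwarder selection: filter candidates by
    # residual energy (load balancing), then pick the lowest-cost path.
    # The cost function and thresholds below are illustrative, not the paper's.

    def cost(node, alpha=0.5):
        """Lower is better: prefer nodes closer to the sink with more residual energy.
        `dist_to_sink` and `energy` are assumed to be normalized to [0, 1]."""
        return alpha * node["dist_to_sink"] + (1 - alpha) * (1 - node["energy"])

    def select_forwarder(candidates, energy_floor=0.2):
        """Phase 1: drop candidates below the residual-energy floor so nearly
        depleted nodes are spared. Phase 2: choose the cheapest survivor
        (falling back to all candidates if none pass the floor)."""
        viable = [n for n in candidates if n["energy"] >= energy_floor]
        return min(viable or candidates, key=cost)["id"]
    ```

    Filtering before cost minimization is what balances traffic load: a node that would otherwise sit on every cheapest path stops being selected once its battery drains toward the floor.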

  11. Optimal molecular profiling of tissue and tissue components: defining the best processing and microdissection methods for biomedical applications.

    PubMed

    Bova, G Steven; Eltoum, Isam A; Kiernan, John A; Siegal, Gene P; Frost, Andra R; Best, Carolyn J M; Gillespie, John W; Su, Gloria H; Emmert-Buck, Michael R

    2005-02-01

    Isolation of well-preserved pure cell populations is a prerequisite for sound studies of the molecular basis of any tissue-based biological phenomenon. This article reviews current methods for obtaining anatomically specific signals from molecules isolated from tissues, a basic requirement for productive linking of phenotype and genotype. The quality of samples isolated from tissue and used for molecular analysis is often glossed over or omitted from publications, making interpretation and replication of data difficult or impossible. Fortunately, recently developed techniques allow life scientists to better document and control the quality of samples used for a given assay, creating a foundation for improvement in this area. Tissue processing for molecular studies usually involves some or all of the following steps: tissue collection, gross dissection/identification, fixation, processing/embedding, storage/archiving, sectioning, staining, microdissection/annotation, and pure analyte labeling/identification and quantification. We provide a detailed comparison of some current tissue microdissection technologies, and provide detailed example protocols for tissue component handling upstream and downstream from microdissection. We also discuss some of the physical and chemical issues related to optimal tissue processing, and include methods specific to cytology specimens. We encourage each laboratory to use these as a starting point for optimization of their overall process of moving from collected tissue to high quality, appropriately anatomically tagged scientific results. A lack of optimized protocols is a source of inefficiency in current life science research. Improvement in this area will significantly increase life science quality and productivity. The article is divided into introduction, materials, protocols, and notes sections. Because many protocols are covered in each of these sections, information relating to a single protocol is not contiguous. 
To get the greatest benefit from this article, readers are advised to read through the entire article first, identify protocols appropriate to their laboratory for each step in their workflow, and then reread entries in each section pertaining to each of these single protocols.

  12. A service protocol for post-processing of medical images on the mobile device

    NASA Astrophysics Data System (ADS)

    He, Longjun; Ming, Xing; Xu, Lang; Liu, Qian

    2014-03-01

    With computing capability and display size growing, the mobile device has been used as a tool to help clinicians view patient information and medical images anywhere and anytime. Transferring medical images with large data sizes from a picture archiving and communication system to a mobile client is difficult and time-consuming, since the wireless network is unstable and limited in bandwidth. Besides, limited by computing capability, memory, and battery endurance, it is hard to provide a satisfactory quality of experience for radiologists handling complex post-processing of medical images on the mobile device, such as real-time direct interactive three-dimensional visualization. In this work, remote rendering technology is employed to implement the post-processing of medical images instead of local rendering, and a service protocol is developed to standardize the communication between the render server and the mobile client. In order to enable mobile devices on different platforms to access post-processing of medical images, the Extensible Markup Language is used to describe this protocol, which contains four main parts: user authentication, medical image query/retrieval, 2D post-processing (e.g. window leveling, pixel value retrieval) and 3D post-processing (e.g. maximum intensity projection, multi-planar reconstruction, curved planar reformation and direct volume rendering). An instance was then implemented to verify the protocol; it enables a mobile device to access post-processing services on the render server via a client application or a web page.
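    A message in such an XML-described protocol might be built and parsed as below. The element and attribute names (`request`, `windowLevel`, etc.) are hypothetical, since the paper does not publish its schema; the sketch only shows the general shape of a 2D post-processing (window-leveling) request.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical message format for a window-leveling request; names are
    # illustrative, not the protocol's actual schema.

    def build_window_level_request(series_uid: str, center: int, width: int) -> str:
        """Serialize a 2D post-processing request for one image series."""
        req = ET.Element("request", {"type": "post2d"})
        ET.SubElement(req, "series").text = series_uid
        wl = ET.SubElement(req, "windowLevel")
        wl.set("center", str(center))
        wl.set("width", str(width))
        return ET.tostring(req, encoding="unicode")

    def parse_request(xml_text: str) -> dict:
        """Server-side decoding of the same message."""
        root = ET.fromstring(xml_text)
        wl = root.find("windowLevel")
        return {"type": root.get("type"),
                "series": root.findtext("series"),
                "center": int(wl.get("center")),
                "width": int(wl.get("width"))}
    ```

    Describing the protocol in XML rather than a binary format is what lets heterogeneous mobile platforms (native apps and web pages alike) speak to the same render server.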

  13. [Analysis of palliative sedation in hospitalised elderly patients: Effectiveness of a protocol].

    PubMed

    Mateos-Nozal, Jesús; García-Cabrera, Lorena; Montero Errasquín, Beatriz; Cruz-Jentoft, Alfonso José; Rexach Cano, Lourdes

    2016-01-01

    To measure changes in the practice of palliative sedation during agony in hospitalised elderly patients before and after the implementation of a palliative sedation protocol. A retrospective before-after study was performed in hospitalised patients over 65 years old who received midazolam during hospital admission and died in the hospital in two 3-month periods, before and after the implementation of the protocol. Non-sedative uses of midazolam and patients in intensive care were excluded. Patient and admission characteristics, the consent process, withdrawal of life-sustaining treatments, and the sedation process (refractory symptom treated, drug doses, assessment and use of other drugs) were recorded. Association was analysed using the chi-squared and Student's t tests. A total of 143 patients were included, with no significant differences between groups in demographic characteristics or symptoms. Do not resuscitate (DNR) orders were recorded in approximately 70% of the subjects in each group, and informed consent for sedation was recorded in 91% before vs. 84% after the protocol. Induction and maintenance doses of midazolam followed protocol recommendations in 1.3% before vs. 10.4% after the protocol was implemented (P=.02), and adequate rescue doses were used in 1.3% vs. 11.9% respectively (P=.01). Midazolam doses were significantly lower (9.86 mg vs. 18.67 mg, P<.001) when the protocol was used than when it was not. The Ramsay sedation score was used in 8% vs. 12% of cases, and the Palliative Care Team was involved in 35.5% and 16.4% of the cases (P=.008) before and after the protocol, respectively. Use of midazolam improved slightly after the implementation of a hospital protocol on palliative sedation. The percentage of adequate sedations and the general process of sedation were mostly unchanged by the protocol. More education and further assessment are needed to gauge the effect of these measures in the future. Copyright © 2015 SEGG. Published by Elsevier España. All rights reserved.

  14. Secured Communication for Business Process Outsourcing Using Optimized Arithmetic Cryptography Protocol Based on Virtual Parties

    NASA Astrophysics Data System (ADS)

    Pathak, Rohit; Joshi, Satyadhar

    In a span of just over a decade, India has become one of the most favored destinations in the world for Business Process Outsourcing (BPO) operations, rapidly achieving the status of the most preferred BPO destination for companies located in the US and Europe. Security and privacy are the two major issues that the Indian software industry needs to address to win larger and longer-term outsourcing contracts from the US. Another important issue is the sharing of employees' information, to ensure that the data and vital information of an outsourcing company are secured and protected. To ensure that the confidentiality of a client's information is maintained, BPOs need to implement data security measures. In this paper, we propose a new protocol specifically for BPO Secure Multi-Party Computation (SMC). Many computations and surveys involve confidential data from many parties or organizations, and since these data are the property of the organization, their preservation and security are of prime importance for such computations. Although the computation requires data from all the parties, none of the associated parties would want to reveal their data to the others. We have proposed a new efficient and scalable protocol to perform computation on encrypted information. The information is encrypted in a manner that does not affect the result of the computation. The protocol uses modifier tokens, which are distributed among virtual parties and finally used in the computation. The computation function uses the acquired data and modifier tokens to compute the right result from the encrypted data. Thus, without revealing the data, the right result can be computed and the privacy of the parties is maintained. We give a probabilistic analysis of the security of the protocol against hacking and show how zero hacking security can be achieved. We also analyze the specific case of Indian BPOs.
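    The abstract does not specify the token construction, but the underlying idea (splitting each private input into random shares held by virtual parties so that only the aggregate is ever revealed) can be sketched with additive secret sharing. All names and parameters below are illustrative, not the paper's scheme:

```python
import random

MODULUS = 2**61 - 1  # a large prime; the size is chosen for illustration

def make_shares(value, n_virtual):
    """Split a private value into n_virtual additive shares mod MODULUS.
    Any subset of fewer than n_virtual shares reveals nothing about the
    value; only the sum of all shares reconstructs it."""
    shares = [random.randrange(MODULUS) for _ in range(n_virtual - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

def secure_sum(private_inputs, n_virtual=3):
    """Each real party splits its input among virtual parties; each
    virtual party publishes only its running total, so no individual
    input is ever revealed."""
    totals = [0] * n_virtual
    for value in private_inputs:
        for i, share in enumerate(make_shares(value, n_virtual)):
            totals[i] = (totals[i] + share) % MODULUS
    return sum(totals) % MODULUS
```

    For example, three companies can learn their combined payroll, `secure_sum([70_000, 85_000, 92_000])`, without any one of them disclosing its own figure.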

  16. The evaluation of complex clinical trial protocols: resources available to research ethics committees and the use of clinical trial registries--a case study.

    PubMed

    Homedes, Núria; Ugalde, Antonio

    2015-06-01

    To assess the potential role of clinical trial (CT) registries and other resources available to research ethics committees (RECs) in the evaluation of complex CT protocols in low-income and middle-income countries. Using a case study approach, the authors examined the decision-making process of a REC in Argentina and its efforts to use available resources to decide on a complex protocol. We also analysed the information in the USA and other CT registries and consulted 24 CT experts in seven countries. Information requested by the Argentinean REC from other national RECs and ethics experts was not useful for verifying the adequacy of the REC's decision on whether or not to approve the CT. The responses from the national regulatory agency and the sponsor were not helpful either. The identification of international resources that could assist was beyond the REC's capability. The information in the USA and other CT registries is limited and at times misleading, and its accuracy is not verified by register keepers. RECs have limited access to experts and institutions that could assist them in their deliberations. Sponsors do not always answer RECs' requests for the information needed to properly conduct the ethical and methodological assessment of CT protocols. The usefulness of the CT registries is curtailed by the lack of appropriate codes and by data errors. Information about the reasons for rejection, withdrawal, or suspension of a trial should be included in the registries. Establishing formal channels of communication among national and foreign RECs and with independent international reference centres could strengthen the ethical review of CT protocols.

  17. Discrimination of correlated and entangling quantum channels with selective process tomography

    DOE PAGES

    Dumitrescu, Eugene; Humble, Travis S.

    2016-10-10

    The accurate and reliable characterization of quantum dynamical processes underlies efforts to validate quantum technologies, where discrimination between competing models of observed behaviors informs efforts to fabricate and operate qubit devices. We present a protocol for quantum channel discrimination that leverages advances in direct characterization of quantum dynamics (DCQD) codes. We demonstrate that DCQD codes enable selective process tomography to improve discrimination between entangling and correlated quantum dynamics. Numerical simulations show that selective process tomography requires only a few measurement configurations to achieve a low false-alarm rate and that the DCQD encoding improves the resilience of the protocol to hidden sources of noise. Lastly, our results show that selective process tomography with DCQD codes is useful for efficiently distinguishing sources of correlated crosstalk from uncorrelated noise in current and future experimental platforms.

  18. Task–Technology Fit of Video Telehealth for Nurses in an Outpatient Clinic Setting

    PubMed Central

    Finkelstein, Stanley M.

    2014-01-01

    Background: Incorporating telehealth into outpatient care delivery supports management of consumer health between clinic visits. Task–technology fit is a framework for understanding how technology helps and/or hinders a person during work processes. Evaluating the task–technology fit of video telehealth for personnel working in a pediatric outpatient clinic and providing care between clinic visits ensures the information provided matches the information needed to support work processes. Materials and Methods: The workflow of advanced practice registered nurse (APRN) care coordination provided via telephone and video telehealth was described and measured using a mixed-methods workflow analysis protocol that incorporated cognitive ethnography and time–motion study. Qualitative and quantitative results were merged and analyzed within the task–technology fit framework to determine the workflow fit of video telehealth for APRN care coordination. Results: Incorporating video telehealth into APRN care coordination workflow provided visual information unavailable during telephone interactions. Despite additional tasks and interactions needed to obtain the visual information, APRN workflow efficiency, as measured by time, was not significantly changed. Analyzed within the task–technology fit framework, the increased visual information afforded by video telehealth supported the assessment and diagnostic information needs of the APRN. Conclusions: Telehealth must provide the right information to the right clinician at the right time. Evaluating task–technology fit using a mixed-methods protocol ensured rigorous analysis of fit within work processes and identified workflows that benefit most from the technology. PMID:24841219

  19. Connecting the Library's Patron Database to Campus Administrative Software: Simplifying the Library's Accounts Receivable Process

    ERIC Educational Resources Information Center

    Oliver, Astrid; Dahlquist, Janet; Tankersley, Jan; Emrich, Beth

    2010-01-01

    This article discusses the processes that occurred when the Library, Controller's Office, and Information Technology Department agreed to create an interface between the Library's Innovative Interfaces patron database and campus administrative software, Banner, using file transfer protocol, in an effort to streamline the Library's accounts…
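    The article names the components (the Innovative Interfaces patron database, Banner, and file transfer protocol) but not the file layout or endpoints. A minimal sketch of such an interface using Python's standard ftplib, where the host, credentials, path, and pipe-delimited record layout are all invented for illustration:

```python
from ftplib import FTP
from io import BytesIO

def format_fines(records):
    """Render (student_id, amount_cents, fee_code) tuples as a
    pipe-delimited extract; the layout is illustrative, not Banner's."""
    return "".join("%s|%d|%s\n" % rec for rec in records)

def upload_fines_export(host, user, password, records,
                        remote_path="/banner/incoming/library_fines.txt"):
    """Push the accounts-receivable extract to the campus ERP's
    drop directory via FTP (hypothetical path and filename)."""
    payload = format_fines(records).encode("ascii")
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.storbinary("STOR " + remote_path, BytesIO(payload))
```

    A nightly job of this shape replaces manual re-keying of fines into the ERP, which is the kind of streamlining the article describes.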

  20. A streamlined collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, exemplified by the Indonesian Biodiversity Discovery and Information System (IndoBioSys)

    PubMed Central

    Hausmann, Axel; Cancian de Araujo, Bruno; Sutrisno, Hari; Peggie, Djunijanti; Schmidt, Stefan

    2017-01-01

    Here we present a general collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, and a comparison with alternative preserving and vouchering methods. About 98% of the sequenced specimens processed using the present collecting and preparation protocol yielded sequences with more than 500 base pairs. The study is based on the first outcomes of the Indonesian Biodiversity Discovery and Information System (IndoBioSys). IndoBioSys is a German-Indonesian research project that is conducted by the Museum für Naturkunde in Berlin and the Zoologische Staatssammlung München, in close cooperation with the Research Center for Biology – Indonesian Institute of Sciences (RCB-LIPI, Bogor). PMID:29134041

  1. Software Design Document MCC CSCI (1). Volume 1 Sections 1.0-2.18

    DTIC Science & Technology

    1991-06-01

    AssociationUserProtocol /simnet/common/include/protocol/p_assoc.h Primitive long Standard C type … Information. 2.2.1.4.2 ProcessMessage: processes a message from another process; type describes the message as either one-way, synchronous, or … Macintosh Consoles. This is sometimes necessary due to normal clock skew so that operations among the MCC components will remain synchronized.

  2. Influence of measurement error on Maxwell's demon

    NASA Astrophysics Data System (ADS)

    Sørdal, Vegard; Bergli, Joakim; Galperin, Y. M.

    2017-06-01

    In any general cycle of measurement, feedback, and erasure, the measurement will reduce the entropy of the system when information about the state is obtained, while erasure, according to Landauer's principle, is accompanied by a corresponding increase in entropy due to the compression of logical and physical phase space. The total process can in principle be fully reversible. A measurement error reduces the information obtained and the entropy decrease in the system. The erasure still gives the same increase in entropy, and the total process is irreversible. Another consequence of measurement error is that incorrect feedback is applied, which further increases the entropy production if a protocol adapted to the expected error rate is not used. We consider the effect of measurement error on a realistic single-electron-box Szilard engine, and we find the optimal protocol for the cycle as a function of the desired power P and error rate ε.
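    The trade-off the abstract describes can be made concrete: for a symmetric error rate ε, the measurement yields 1 − h(ε) bits of mutual information, where h is the binary entropy, while Landauer erasure of the memory still costs a full bit. A small sketch (the symmetric-error model is an illustrative simplification of the paper's single-electron-box setting):

```python
import math

def binary_entropy(p):
    """h(p), the binary Shannon entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def info_gain(eps):
    """Mutual information (bits) between the particle's position and
    the detector outcome when the detector errs with probability eps.
    eps = 0 recovers the full 1 bit of an ideal Szilard measurement;
    eps = 0.5 yields 0 bits (the outcome is pure noise)."""
    return 1.0 - binary_entropy(eps)
```

    Since erasure still dissipates kT·ln 2 per cycle while at most info_gain(ε)·kT·ln 2 of work can be extracted, any ε > 0 makes the cycle irreversible, matching the abstract's argument.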

  3. Optimal diabatic dynamics of Majorana-based quantum gates

    NASA Astrophysics Data System (ADS)

    Rahmani, Armin; Seradjeh, Babak; Franz, Marcel

    2017-08-01

    In topological quantum computing, unitary operations on qubits are performed by adiabatic braiding of non-Abelian quasiparticles, such as Majorana zero modes, and are protected from local environmental perturbations. In the adiabatic regime, with timescales set by the inverse gap of the system, the errors can be made arbitrarily small by performing the process more slowly. To enhance the performance of quantum information processing with Majorana zero modes, we apply the theory of optimal control to the diabatic dynamics of Majorana-based qubits. While we sacrifice complete topological protection, we impose constraints on the optimal protocol to take advantage of the nonlocal nature of topological information and increase the robustness of our gates. Using Pontryagin's maximum principle, we show that robust gates equivalent to perfect adiabatic braiding can be implemented in finite times through optimal pulses. In our implementation, modifications to the device Hamiltonian are avoided. Focusing on thermally isolated systems, we study the effects of calibration errors and of external white and 1/f (pink) noise on Majorana-based gates. While a noise-induced antiadiabatic behavior, where a slower process creates more diabatic excitations, prohibits indefinite enhancement of the robustness of the adiabatic scheme, our fast optimal protocols exhibit remarkable stability to noise and have the potential to significantly enhance the practical performance of Majorana-based information processing.

  4. Mediating Emotive Empathy With Informational Text: Three Students' Think-Aloud Protocols of "Gettysburg: The Graphic Novel"

    ERIC Educational Resources Information Center

    Chisholm, James S.; Shelton, Ashley L.; Sheffield, Caroline C.

    2017-01-01

    Although the popularity and use of graphic novels in literacy instruction has increased in the last decade, few sustained analyses have examined adolescents' reading processes with informational texts in social studies classrooms. Recent research that has foregrounded visual, emotional, and embodied textual responses situates this qualitative…

  5. A General Approach to Access Morphologies of Polyoxometalates in Solution by Using SAXS: An Ab Initio Modeling Protocol.

    PubMed

    Li, Mu; Wang, Weiyu; Yin, Panchao

    2018-05-02

    Herein, we report a general protocol for an ab initio modeling approach to deduce structural information on polyoxometalates (POMs) in solution from scattering data collected by the small-angle X-ray scattering (SAXS) technique. To validate the protocol, the morphologies of a series of known POMs in either aqueous or organic solvents were analyzed. The obtained particle morphologies were compared with, and confirmed by, previously reported crystal structures. To extend the feasibility of the protocol to an unknown system, aqueous solutions of Na2MoO4 with pH ranging from -1 to 8.35 were examined, and the formation of {Mo36} clusters was probed, identified, and confirmed by SAXS. The approach was further optimized with multi-processing capability to achieve fast analysis of experimental data, thereby facilitating in situ studies of the formation of POMs in solution. The advantage of this approach is that it generates intuitive 3D models of POMs in solution without requiring confining information such as symmetries and possible sizes.
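    Ab initio shape reconstruction from SAXS data typically scores candidate bead (dummy-atom) models against the measured curve via the Debye equation. The paper's specific pipeline is not detailed in the abstract, so the following is a generic, illustrative scorer with a uniform form factor:

```python
import math

def debye_intensity(coords, q, f=1.0):
    """Scattering intensity of a bead (dummy-atom) model at momentum
    transfer q, via the Debye equation:
        I(q) = sum_ij f_i * f_j * sin(q*r_ij) / (q*r_ij)
    coords: list of (x, y, z) bead positions; a uniform form factor f
    is an illustrative simplification."""
    total = 0.0
    for xi, yi, zi in coords:
        for xj, yj, zj in coords:
            r = math.dist((xi, yi, zi), (xj, yj, zj))
            qr = q * r
            total += f * f * (1.0 if qr == 0.0 else math.sin(qr) / qr)
    return total
```

    At q → 0 every term tends to 1, so I(0) = (n·f)² for n beads; candidate bead arrangements are refined by comparing such model curves against the experimental I(q).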

  6. AFRL Projects to Replace Cadmium

    DTIC Science & Technology

    2005-03-01

    Protocol does not: identify/select a material or process; impose processing restrictions on candidates; implement a material or process into production … within proper limits. Use XRF to measure composition and thickness. Strippability: remove coating within 60 minutes; replate coating and pass … product information available? Magnetron Sputtering to Replace Cd. Task 2: Coating Deposition and Screening: selection of qualified vendors and

  7. Security of quantum key distribution with multiphoton components

    PubMed Central

    Yin, Hua-Lei; Fu, Yao; Mao, Yingqiu; Chen, Zeng-Bing

    2016-01-01

    Most qubit-based quantum key distribution (QKD) protocols extract the secure key merely from the single-photon component of the attenuated laser pulses. However, with the Scarani-Acin-Ribordy-Gisin 2004 (SARG04) QKD protocol, an unconditionally secure key can be extracted from the two-photon component by modifying the classical post-processing procedure of the BB84 protocol. Combining the merits of the SARG04 protocol and six-state preparation, one can extract a secure key from components of one to four photons. In this paper, we provide the exact relations between the secure key rate and the bit error rate in a six-state SARG04 protocol with single-photon, two-photon, three-photon, and four-photon sources. By restricting the mutual information between the phase error and the bit error, we obtain a higher secure bit-error-rate threshold for the multiphoton components than previous works. In addition, we compare the performance of the six-state SARG04 protocol with other prepare-and-measure QKD protocols using decoy states. PMID:27383014
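    The component-wise relations are derived in the paper itself; as a generic illustration, one-way secret-key fractions are commonly bounded by r = 1 − h(e_bit) − h(e_phase), with h the binary entropy. Treating the two error rates as independent inputs is a simplification (the paper's point is precisely the relation between them):

```python
import math

def h2(p):
    """Binary Shannon entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def key_rate_bound(e_bit, e_phase):
    """Generic one-way secret-key fraction
        r = 1 - h2(e_bit) - h2(e_phase),
    clamped at zero. For illustration only: the protocol-specific
    link between e_phase and e_bit for each n-photon component is
    exactly what the paper derives."""
    return max(0.0, 1.0 - h2(e_bit) - h2(e_phase))
```

    The key rate vanishes once the two entropy terms consume the full bit of raw correlation, which is why tightening the phase-error bound raises the tolerable bit error rate.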

  8. Continuing oversight through site monitoring: experiences of an institutional ethics committee in an Indian tertiary-care hospital.

    PubMed

    Shetty, Yashashri C; Marathe, Padmaja; Kamat, Sandhya; Thatte, Urmila

    2012-01-01

    WHO-TDR and the Indian Council of Medical Research recommend site visits by institutional ethics committees (IECs) for continued oversight, to ensure the ethical conduct of research. Our IEC conducted seven site visits in 2008-2009 using a standardised format to monitor adherence to protocol and the informed consent process. The study identified issues related to informed consent (6/7), deviation from protocol (5/7), reporting of study progress to the IEC (3/7), recruiting additional participants without IEC approval (2/7), reporting of serious adverse events (1/7), investigator's lack of awareness of protocol and the informed consent document (2/7) and other findings. Investigators were informed about the findings and were asked to submit an explanation. The IEC issued warnings about not repeating such lapses in the future (5/7), restricted enrollment of new participants (2/7), recommended continued good clinical practice training to the study team (4/7), advised the recruitment of additional study coordinators (2/7), and requested the submission of adverse event reports (2/7) or sponsors' audit reports (2/7). Our study showed that the ethical conduct of studies can be ensured by conducting routine site monitoring.

  9. Remote control of the industry processes. POWERLINK protocol application

    NASA Astrophysics Data System (ADS)

    Wóbel, A.; Paruzel, D.; Paszkiewicz, B.

    2017-08-01

    Current technological developments enable solutions with lower failure rates and greater working precision, allowing efficient, high-speed production and reliable individual components. The main scope of this article is the application of the POWERLINK protocol for communication with a B&R controller over Ethernet to record process parameters. This enables the production cycle to be controlled over an internal industrial network connected to a PC. Knowledge of the most important production parameters in real time allows a failure to be detected immediately after it occurs. For this purpose, a diagnostic station based on a B&R X20CP1301 controller was set up to record measurement data such as pressure, valve temperature, and the torque required to change the valve setting. The use of the POWERLINK protocol allows status information to be transmitted every 200 μs.
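    POWERLINK is a cyclic, deterministic Ethernet protocol: a managing node polls each device every cycle (200 μs in the article). The data-acquisition pattern can be sketched as a cyclic polling loop; the channel names and values below are invented stand-ins for the X20 I/O, not a real POWERLINK API:

```python
CYCLE_US = 200  # cycle time reported in the article

def poll_cycle(read_channels, log, cycles):
    """Simulate the managing node's cyclic (isochronous) phase: on each
    cycle, read every mapped process variable and append one timestamped
    sample to the log.

    read_channels: dict of name -> zero-argument callable returning a
    value; these stand in for the controller's I/O channels."""
    for k in range(cycles):
        sample = {name: read() for name, read in read_channels.items()}
        sample["t_us"] = k * CYCLE_US
        log.append(sample)

# Illustrative channels standing in for the recorded parameters.
log = []
channels = {"pressure_bar": lambda: 4.2,
            "valve_temp_C": lambda: 61.0,
            "torque_Nm":    lambda: 12.5}
poll_cycle(channels, log, cycles=3)
```

    Logging one sample per cycle is what makes immediate failure detection possible: a threshold check on each sample flags a fault within one 200 μs cycle of its occurrence.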

  10. Application of SEM and EDX in studying biomineralization in plant tissues.

    PubMed

    He, Honghua; Kirilak, Yaowanuj

    2014-01-01

    This chapter describes protocols using formalin-acetic acid-alcohol (FAA) to fix plant tissues for studying biomineralization by means of scanning electron microscopy (SEM) and qualitative energy-dispersive X-ray microanalysis (EDX). Specimen preparation protocols for SEM and EDX mainly include fixation, dehydration, critical point drying (CPD), mounting, and coating. Gold-coated specimens are used for SEM imaging, while gold- and carbon-coated specimens are prepared for qualitative X-ray microanalyses separately to obtain complementary information on the elemental compositions of biominerals. During the specimen preparation procedure for SEM, some biominerals may be dislodged or scattered, making it difficult to determine their accurate locations, and light microscopy is used to complement SEM studies. Specimen preparation protocols for light microscopy generally include fixation, dehydration, infiltration and embedding with resin, microtome sectioning, and staining. In addition, microwave processing methods are adopted here to speed up the specimen preparation process for both SEM and light microscopy.

  11. Improving Conduct and Feasibility of Clinical Trials to Evaluate Antibacterial Drugs to Treat Hospital-Acquired Bacterial Pneumonia and Ventilator-Associated Bacterial Pneumonia: Recommendations of the Clinical Trials Transformation Initiative Antibacterial Drug Development Project Team.

    PubMed

    Knirsch, Charles; Alemayehu, Demissie; Botgros, Radu; Comic-Savic, Sabrina; Friedland, David; Holland, Thomas L; Merchant, Kunal; Noel, Gary J; Pelfrene, Eric; Reith, Christina; Santiago, Jonas; Tiernan, Rosemary; Tenearts, Pamela; Goldsack, Jennifer C; Fowler, Vance G

    2016-08-15

    Hospital-acquired and ventilator-associated bacterial pneumonia (HABP/VABP) are often caused by multidrug-resistant organisms. Evaluating new antibacterial drugs for efficacy in this population is important, as many antibacterial drugs have demonstrated limitations when studied in it. HABP/VABP trials are expensive and challenging to conduct due to protocol complexity and low patient enrollment, among other factors. The Clinical Trials Transformation Initiative (CTTI) seeks to advance antibacterial drug development by streamlining HABP/VABP clinical trials to improve efficiency and feasibility while maintaining ethical rigor, patient safety, information value, and scientific validity. In 2013, CTTI engaged a multidisciplinary group of experts to discuss challenges impeding the conduct of HABP/VABP trials. Separate workstreams identified challenges associated with HABP/VABP protocol complexity. The Project Team developed potential solutions to streamline HABP/VABP trials using a Quality by Design approach. CTTI recommendations focus on 4 key areas to improve HABP/VABP trials: informed consent processes/practices, protocol design, choice of an institutional review board (IRB), and trial outcomes. Informed consent processes should include legally authorized representatives. Protocol design decisions should focus on eligibility criteria, prestudy antibacterial therapy considerations, use of new diagnostics, and sample size. CTTI recommends that sponsors use a central IRB and discuss trial endpoints with regulators, including defining a clinical failure and evaluating the impact of concomitant antibacterial drugs. Streamlining HABP/VABP trials by addressing key protocol elements can improve trial startup and patient recruitment/retention, reduce trial complexity and costs, and ensure patient safety while advancing antibacterial drug development.

  12. The Montreal Protocol treaty and its illuminating history of science-policy decision-making

    NASA Astrophysics Data System (ADS)

    Grady, C.

    2017-12-01

    The Montreal Protocol on Substances that Deplete the Ozone Layer, hailed as one of the most effective environmental treaties of all time, has a thirty-year history of science-policy decision-making. The partnership between the Parties to the Montreal Protocol and its technical assessment panels serves as a basis for understanding the successes, and evaluating the stumbles, of global environmental decision-making. Real-world environmental treaty negotiations can be highly time-sensitive, politically motivated, and resource-constrained; thus scientists and policymakers alike are often unable to confront the uncertainties associated with the multitude of choices. The science-policy relationship built within the framework of the Montreal Protocol has helped constrain uncertainty and inform policy decisions, but has also highlighted the limitations of the use of scientific understanding in political decision-making. This talk will describe the evolution of the scientist-policymaker relationship over the history of the Montreal Protocol. Examples will illustrate how the Montreal Protocol's technical panels inform the decisions of country governments and will characterize the different approaches pursued by different countries, with a particular focus on the recently adopted Kigali Amendment. In addition, this talk will take a deeper dive with an analysis of the historic technical panel assessments on estimating the financial resources necessary to enable compliance with the Montreal Protocol, compared with the political financial decisions made through the Protocol's Multilateral Fund replenishment negotiation process. Finally, this talk will describe the useful lessons and challenges from these interactions and how they may be applicable to other environmental management frameworks across multiple scales under changing climatic conditions.

  13. Quantum information processing in phase space: A modular variables approach

    NASA Astrophysics Data System (ADS)

    Ketterer, A.; Keller, A.; Walborn, S. P.; Coudreau, T.; Milman, P.

    2016-08-01

    Binary quantum information can be fault-tolerantly encoded in states defined in infinite-dimensional Hilbert spaces. Such states define a computational basis, and permit a perfect equivalence between continuous and discrete universal operations. The drawback of this encoding is that the corresponding logical states are unphysical, meaning infinitely localized in phase space. We use the modular variables formalism to show that, in a number of protocols relevant for quantum information and for the realization of fundamental tests of quantum mechanics, it is possible to loosen the requirements on the logical subspace without jeopardizing their usefulness or their successful implementation. Such protocols involve measurements of appropriately chosen modular variables that permit the readout of the encoded discrete quantum information from the corresponding logical states. Finally, we demonstrate the experimental feasibility of our approach by applying it to the transverse degrees of freedom of single photons.
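    The modular decomposition underlying the formalism can be stated compactly (these are the standard definitions following Aharonov and co-workers; the protocol-specific logical encodings are given in the paper itself). Position and momentum split into integer and modular parts:

```latex
\hat{x} = \hat{N}\,\ell + \hat{\bar{x}}, \qquad \hat{\bar{x}} \in [0,\ell),
\qquad
\hat{p} = \hat{M}\,\frac{2\pi\hbar}{\ell} + \hat{\bar{p}}, \qquad \hat{\bar{p}} \in \left[0, \frac{2\pi\hbar}{\ell}\right),
\qquad
[\hat{\bar{x}},\, \hat{\bar{p}}] = 0,
```

    where N̂ and M̂ have integer spectra. Because the two modular parts commute, they can be measured simultaneously, which is what allows discrete (binary) information to be encoded in, and read out from, continuous-variable states.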

  14. Hospital Casemix Protocol - Medibank Private Perspective.

    PubMed

    Szakiel, John

    2010-06-01

    Hospital Casemix Protocol data provide a brief summary outlining morbidity data and costs associated with an episode of care. Federal government legislation requires that hospitals report this information to private health insurers who, in turn, merge these data with benefit outlays and report their findings to the Department of Health and Ageing (DoHA). This article gives a brief outline of the collection, cleansing and processing of these data and subsequent reporting to DoHA by Medibank Private, which accounts for approximately 30% of collected data.

  15. Security of quantum key distribution with iterative sifting

    NASA Astrophysics Data System (ADS)

    Tamaki, Kiyoshi; Lo, Hoi-Kwong; Mizutani, Akihiro; Kato, Go; Lim, Charles Ci Wen; Azuma, Koji; Curty, Marcos

    2018-01-01

    Several quantum key distribution (QKD) protocols employ iterative sifting. After each quantum transmission round, Alice and Bob disclose part of their setting information (including their basis choices) for the detected signals. This quantum phase then ends when the basis-dependent termination conditions are met, i.e., the numbers of detected signals per basis exceed certain pre-agreed threshold values. Recently, however, Pfister et al (2016 New J. Phys. 18 053001) showed that the basis-dependent termination condition makes QKD insecure, especially in the finite-key regime, and they suggested disclosing all the setting information after finishing the quantum phase. However, this protocol has two main drawbacks: it requires that Alice possess a large memory, and she also needs some a priori knowledge about the transmission rate of the quantum channel. Here we solve these two problems by introducing a basis-independent termination condition to the iterative sifting in the finite-key regime. The use of this condition, in combination with Azuma's inequality, provides a precise estimation of the amount of privacy amplification that needs to be applied, thus leading to the security of QKD protocols, including the loss-tolerant protocol (Tamaki et al 2014 Phys. Rev. A 90 052314), with iterative sifting. Our analysis indicates that announcing the basis information after each quantum transmission round does not compromise the key generation rate of the loss-tolerant protocol. Our result allows the implementation of wider classes of classical post-processing techniques in QKD with quantified security.
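    The fix the abstract describes can be illustrated with a toy simulation: terminate after a pre-agreed number of rounds, never as a function of per-basis counts, and only then sift. The detection and basis-choice probabilities below are illustrative stand-ins for the channel model, not the paper's parameters:

```python
import random

def sift_fixed_rounds(n_rounds, p_detect=0.5, p_z=0.5, seed=0):
    """Basis-independent termination: run a pre-agreed number of
    transmission rounds and only afterwards count detections per basis.
    The stopping rule never inspects per-basis counts, which is the
    property the security analysis relies on."""
    rng = random.Random(seed)
    counts = {"Z": 0, "X": 0}
    for _ in range(n_rounds):
        if rng.random() < p_detect:  # the signal reached the detector
            counts["Z" if rng.random() < p_z else "X"] += 1
    return counts
```

    A basis-dependent rule ("stop once both counts exceed a threshold") would instead make the round at which the protocol halts leak information about the basis sequence, which is the vulnerability identified by Pfister et al.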

  16. Reliability of the Nursing Home Survey Process: A Simultaneous Survey Approach

    ERIC Educational Resources Information Center

    Lee, Robert H.; Gajewski, Byron J.; Thompson, Sarah

    2006-01-01

    Purpose: We designed this study to examine the reliability of the nursing home survey process in the state of Kansas using regular and simultaneous survey teams. In particular, the study examined how two survey teams exposed to the same information at the same time differed in their interpretations. Design and Methods: The protocol for…

  17. Improvements in Clinical Trials Information Will Improve the Reproductive Health and Fertility of Cancer Patients.

    PubMed

    Dauti, Angela; Gerstl, Brigitte; Chong, Serena; Chisholm, Orin; Anazodo, Antoinette

    2017-06-01

    There are a number of barriers that result in cancer patients not being referred for oncofertility care, including knowledge about the reproductive risks of antineoplastic agents. Without this information, clinicians do not always make recommendations for oncofertility care. The objective of this study was to describe the level of reproductive information and recommendations that clinicians have available in clinical trial protocols regarding oncofertility management and follow-up, and the information that patients may receive in clinical trial patient information sheets or consent forms. A literature review of the 71 antineoplastic drugs included in the 68 clinical trial protocols showed that 68% of the antineoplastic drugs had gonadotoxic animal data, 32% had gonadotoxic human data, 83% had teratogenic animal data, and 32% had teratogenic human data. When the clinical trial protocols were reviewed, only 22% of the protocols reported the teratogenic risks and 32% reported the gonadotoxic risks. Only 56% of phase 3 protocols had gonadotoxic information and 13% had teratogenic information. Nine percent of the protocols provided fertility preservation recommendations and 4% provided reproductive information for the follow-up and survivorship period. Twenty-six percent had a section in the clinical trial protocol in which oncofertility information could be identified easily. When the gonadotoxic and teratogenic effects of treatment were known, they were not consistently included in the clinical trial protocols, and the lack of data for new drugs was not reported. Very few protocols gave recommendations for oncofertility management and follow-up following the completion of cancer treatment. The research team proposes a number of recommendations that should be required of clinicians and pharmaceutical companies developing new trials.

  18. Research protocol for the Picture Talk Project: a qualitative study on research and consent with remote Australian Aboriginal communities

    PubMed Central

    Fitzpatrick, Emily F M; Carter, Maureen; Oscar, June; Lawford, Tom; Martiniuk, Alexandra L C; D’Antoine, Heather A; Elliott, Elizabeth J

    2017-01-01

    Introduction Research with Indigenous populations is not always designed with cultural sensitivity. Few publications evaluate or describe in detail seeking consent for research with Indigenous participants. When potential participants are not engaged in a culturally respectful manner, participation rates and research quality can be adversely affected. It is unethical to proceed with research without truly informed consent. Methods and analysis We describe a culturally appropriate research protocol that is invited by Aboriginal communities of the Fitzroy Valley in Western Australia. The Picture Talk Project is a research partnership with local Aboriginal leaders who are also chief investigators. We will interview Aboriginal leaders about research, community engagement and the consent process and hold focus groups with Aboriginal community members about individual consent. Cultural protocols will be applied to recruit and conduct research with participants. Transcripts will be analysed using NVivo10 qualitative software and themes synthesised to highlight the key issues raised by the community about the research process. This protocol will guide future research with the Aboriginal communities of the Fitzroy Valley and may inform the approach to research with other Indigenous communities of Australia or the world. It must be noted that no community is the same and all research requires local consultation and input. To conduct culturally sensitive research, respected local people from the community who have knowledge of cultural protocol and language are engaged to guide each step of the research process from the project design to the delivery of results. Ethics and dissemination Ethics approval was granted by the University of Sydney Human Research Ethics Committee (No. 2012/348, reference:14760), the Western Australia Country Health Service Ethics Committee (No. 
2012:15), the Western Australian Aboriginal Health Ethics Committee and reviewed by the Kimberley Aboriginal Health Planning Forum Research Sub-Committee (No. 2012–008). Results will be disseminated through peer review articles, a local Fitzroy Valley report and conference presentations. PMID:29288181

  19. The OPERA trial: a protocol for the process evaluation of a randomised trial of an exercise intervention for older people in residential and nursing accommodation

    PubMed Central

    2011-01-01

    Background The OPERA trial is a large cluster randomised trial testing a physical activity intervention to address depression amongst people living in nursing and residential homes for older people. A process evaluation was commissioned alongside the trial and we report the protocol for this process evaluation. Challenges included the cognitive and physical ability of the participants, the need to respect the privacy of all home residents, including study non-participants, and the physical structure of the homes. Evaluation activity had to be organised around the structured timetable of homes, leaving limited opportunities for data collection. The aims of this process evaluation are to provide findings that will assist in the interpretation of the clinical trial results, and to inform potential implementation of the physical activity intervention on a wider scale. Methods/design Quantitative data on recruitment of homes and individuals are being collected. For homes in the intervention arm, data on the dose and fidelity of the intervention delivered, including individual rates of participation in exercise classes, are collected. In the control homes, uptake and delivery of depression awareness training is monitored. These data will be combined with qualitative data from an in-depth study of a purposive sample of eight homes (six intervention and two control). Discussion Although process evaluations are increasingly funded alongside trials, it is still rare to see the findings published, and even rarer to see the protocol for such an evaluation published. Process evaluations have the potential to assist in interpreting and understanding trial results as well as informing future roll-outs of interventions. If such evaluations are funded they should also be reported and reviewed in a similar way to the trial outcome evaluation. Trial Registration ISRCTN No: ISRCTN43769277 PMID:21288341

  20. Text-Processing Differences in Adolescent Adequate and Poor Comprehenders Reading Accessible and Challenging Narrative and Informational Text

    ERIC Educational Resources Information Center

    Denton, Carolyn A.; Enos, Mischa; York, Mary J.; Francis, David J.; Barnes, Marcia A.; Kulesz, Paulina A.; Fletcher, Jack M.; Carter, Suzanne

    2015-01-01

    Based on the analysis of 620 think-aloud verbal protocols from students in grades 7, 9, and 11, we examined students' conscious engagement in inference generation, paraphrasing, verbatim text repetition, and monitoring while reading narrative or informational texts that were either at or above the students' current reading levels. Students were…

  1. An Approach to Verification and Validation of a Reliable Multicasting Protocol

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.

    1994-01-01

    This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or offnominal behaviors predicted by the current model. If the execution of a test was different between the model and implementation, then the differences helped identify inconsistencies between the model and implementation. The dialogue between both teams drove the co-evolution of the model and implementation. Testing served as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP.

  2. An approach to verification and validation of a reliable multicasting protocol

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.

    1995-01-01

    This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test was different between the model and implementation, then the differences helped identify inconsistencies between the model and implementation. The dialogue between both teams drove the co-evolution of the model and implementation. Testing served as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP.

  3. Space Communications Technology Conference: Onboard Processing and Switching

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Papers and presentations from the conference are presented. The topics covered include the following: satellite network architecture, network control and protocols, fault tolerance and autonomy, multichannel demultiplexing and demodulation, information switching and routing, modulation and coding, and planned satellite communications systems.

  4. Home Energy Management System - VOLTTRON Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zandi, Helia

    In most Home Energy Management Systems (HEMS) available in the market, different devices running different communication protocols cannot interact with each other and exchange information. As a result of this integration, the information about different devices running different communication protocols can be accessed by other agents and devices running on the VOLTTRON platform. The integration process can be used by any HEMS available in the market regardless of the programming language it uses. If the existing HEMS provides an Application Programming Interface (API) based on the RESTful architecture, that API can be used for integration. Our candidate HEMS in this project is home-assistant (Hass). An agent is implemented which can communicate with the Hass API and receives information about the devices loaded on the API. The agent publishes the information it receives on the VOLTTRON message bus so other agents can have access to this information. On the other side, for each type of device, an agent is implemented, such as a Climate Agent, Lock Agent, Switch Agent, or Light Agent. Each of these agents subscribes to the messages published on the message bus about its associated devices. These agents can also change the status of the devices by sending appropriate service calls to the API. Other agents and services on the platform can also access this information and coordinate their decision-making processes based on it.
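
The publish/subscribe pattern described above can be sketched without VOLTTRON itself; the bus class, topic name, and message fields below are invented stand-ins for the real VOLTTRON message bus and Hass entities, not the project's actual code:

```python
from collections import defaultdict

class MessageBus:
    """Minimal in-process stand-in for a VOLTTRON-style message bus."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver the message to every callback registered for the topic.
        for callback in self.subscribers[topic]:
            callback(topic, message)

bus = MessageBus()
received = []

# A hypothetical Switch Agent subscribes to state updates for switches.
bus.subscribe("devices/switch", lambda topic, msg: received.append(msg))

# The Hass-facing agent publishes device information it pulled from the API.
bus.publish("devices/switch", {"entity_id": "switch.porch", "state": "on"})

print(received)  # [{'entity_id': 'switch.porch', 'state': 'on'}]
```

The point of the indirection is that the Hass-facing agent and the device agents never call each other directly, so either side can be replaced without touching the other.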

  5. Application of Metagenomic Sequencing to Food Safety: Detection of Shiga Toxin-Producing Escherichia coli on Fresh Bagged Spinach

    PubMed Central

    Leonard, Susan R.; Mammel, Mark K.; Lacher, David W.

    2015-01-01

    Culture-independent diagnostics reduce the reliance on traditional (and slower) culture-based methodologies. Here we capitalize on advances in next-generation sequencing (NGS) to apply this approach to food pathogen detection utilizing NGS as an analytical tool. In this study, spiking spinach with Shiga toxin-producing Escherichia coli (STEC) following an established FDA culture-based protocol was used in conjunction with shotgun metagenomic sequencing to determine the limits of detection, sensitivity, and specificity levels and to obtain information on the microbiology of the protocol. We show that an expected level of contamination (∼10 CFU/100 g) could be adequately detected (including key virulence determinants and strain-level specificity) within 8 h of enrichment at a sequencing depth of 10,000,000 reads. We also rationalize the relative benefit of static versus shaking culture conditions and the addition of selected antimicrobial agents, thereby validating the long-standing culture-based parameters behind such protocols. Moreover, the shotgun metagenomic approach was informative regarding the dynamics of microbial communities during the enrichment process, including initial surveys of the microbial loads associated with bagged spinach; the microbes found included key genera such as Pseudomonas, Pantoea, and Exiguobacterium. Collectively, our metagenomic study highlights and considers various parameters required for transitioning to such sequencing-based diagnostics for food safety and the potential to develop better enrichment processes in a high-throughput manner not previously possible. Future studies will investigate new species-specific DNA signature target regimens, rational design of medium components in concert with judicious use of additives, such as antibiotics, and alterations in the sample processing protocol to enhance detection. PMID:26386062
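
The detection-limit reasoning above comes down to simple arithmetic: at a fixed sequencing depth, the target organism must contribute enough reads to clear a calling threshold. A rough sketch, where the post-enrichment STEC fraction and read cut-off are invented for illustration (only the 10,000,000-read depth comes from the study):

```python
# Back-of-the-envelope reads calculation; the depth matches the study, but the
# pathogen fraction and read cut-off below are illustrative assumptions.
depth = 10_000_000        # total shotgun reads, as used in the study
target_fraction = 5e-4    # assumed fraction of post-enrichment reads that are STEC
min_reads_to_call = 100   # assumed minimum reads needed for a confident call

expected_reads = depth * target_fraction
print(int(expected_reads))  # 5000
```

In practice the post-enrichment fraction depends on the starting CFU level and the enrichment conditions, which is exactly what the study's static/shaking and antimicrobial comparisons probe.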

  6. Optimal molecular profiling of tissue and tissue components: defining the best processing and microdissection methods for biomedical applications.

    PubMed

    Rodriguez-Canales, Jaime; Hanson, Jeffrey C; Hipp, Jason D; Balis, Ulysses J; Tangrea, Michael A; Emmert-Buck, Michael R; Bova, G Steven

    2013-01-01

    Isolation of well-preserved pure cell populations is a prerequisite for sound studies of the molecular basis of any tissue-based biological phenomenon. This updated chapter reviews current methods for obtaining anatomically specific signals from molecules isolated from tissues, a basic requirement for productive linking of phenotype and genotype. The quality of samples isolated from tissue and used for molecular analysis is often glossed over or omitted from publications, making interpretation and replication of data difficult or impossible. Fortunately, recently developed techniques allow life scientists to better document and control the quality of samples used for a given assay, creating a foundation for improvement in this area. Tissue processing for molecular studies usually involves some or all of the following steps: tissue collection, gross dissection/identification, fixation, processing/embedding, storage/archiving, sectioning, staining, microdissection/annotation, and pure analyte labeling/identification and quantification. We provide a detailed comparison of some current tissue microdissection technologies and provide detailed example protocols for tissue component handling upstream and downstream from microdissection. We also discuss some of the physical and chemical issues related to optimal tissue processing and include methods specific to cytology specimens. We encourage each laboratory to use these as a starting point for optimization of their overall process of moving from collected tissue to high-quality, appropriately anatomically tagged scientific results. Improvement in this area will significantly increase life science quality and productivity. The chapter is divided into introduction, materials, protocols, and notes subheadings. Because many protocols are covered in each of these sections, information relating to a single protocol is not contiguous. 
To get the greatest benefit from this chapter, readers are advised to read through the entire chapter first, identify protocols appropriate to their laboratory for each step in their workflow, and then reread entries in each section pertaining to each of these single protocols.

  7. Open PHACTS computational protocols for in silico target validation of cellular phenotypic screens: knowing the knowns (electronic supplementary information available: Pipeline Pilot protocols, KNIME workflows, and supplementary figures; see DOI: 10.1039/c6md00065g)

    PubMed Central

    Zdrazil, B.; Neefs, J.-M.; Van Vlijmen, H.; Herhaus, C.; Caracoti, A.; Brea, J.; Roibás, B.; Loza, M. I.; Queralt-Rosinach, N.; Furlong, L. I.; Gaulton, A.; Bartek, L.; Senger, S.; Chichester, C.; Engkvist, O.; Evelo, C. T.; Franklin, N. I.; Marren, D.; Ecker, G. F.

    2016-01-01

    Phenotypic screening is in a renaissance phase and is expected by many academic and industry leaders to accelerate the discovery of new drugs for new biology. Given that phenotypic screening is by definition target agnostic, the emphasis of in silico and in vitro follow-up work is on the exploration of possible molecular mechanisms and efficacy targets underlying the biological processes interrogated by the phenotypic screening experiments. Herein, we present six exemplar computational protocols for the interpretation of cellular phenotypic screens based on the integration of compound, target, pathway, and disease data established by the IMI Open PHACTS project. The protocols annotate phenotypic hit lists and allow follow-up experiments and mechanistic conclusions. The annotations included are from ChEMBL, ChEBI, GO, WikiPathways and DisGeNET. Also provided are protocols which select selective compounds from the IUPHAR/BPS Guide to PHARMACOLOGY interaction file to probe potential targets, and a correlation robot which systematically aims to identify an overlap of active compounds in both the phenotypic assay and any kinase assay. The protocols are applied to a phenotypic pre-lamin A/C splicing assay selected from the ChEMBL database to illustrate the process. The computational protocols make use of the Open PHACTS API and data and are built within the Pipeline Pilot and KNIME workflow tools. PMID:27774140

  8. Internet Protocol Transition Workbook

    DTIC Science & Technology

    1982-03-01

    Internet Protocol Transition Workbook, March 1982. Network Information Center, SRI International, Menlo Park, CA 94025. Contact: E. Feinler, Network Information Center, SRI International, Menlo Park, California 94025, (415) 859-3695, FEINLER@SRI-NIC (online mail). Cited reference: Postel, J., "Internet Control Message Protocol - DARPA Internet Program Protocol Specification," RFC 792, USC/Information Sciences Institute.

  9. Experimental purification of two-atom entanglement.

    PubMed

    Reichle, R; Leibfried, D; Knill, E; Britton, J; Blakestad, R B; Jost, J D; Langer, C; Ozeri, R; Seidelin, S; Wineland, D J

    2006-10-19

    Entanglement is a necessary resource for quantum applications--entanglement established between quantum systems at different locations enables private communication and quantum teleportation, and facilitates quantum information processing. Distributed entanglement is established by preparing an entangled pair of quantum particles in one location, and transporting one member of the pair to another location. However, decoherence during transport reduces the quality (fidelity) of the entanglement. A protocol to achieve entanglement 'purification' has been proposed to improve the fidelity after transport. This protocol uses separate quantum operations at each location and classical communication to distil high-fidelity entangled pairs from lower-fidelity pairs. Proof-of-principle experiments distilling entangled photon pairs have been carried out. However, these experiments obtained distilled pairs with a low probability of success and required destruction of the entangled pairs, rendering them unavailable for further processing. Here we report efficient and non-destructive entanglement purification with atomic quantum bits. Two noisy entangled pairs were created and distilled into one higher-fidelity pair available for further use. Success probabilities were above 35 per cent. The many applications of entanglement purification make it one of the most important techniques in quantum information processing.

  10. Protocols — EDRN Public Portal

    Cancer.gov

    EDRN investigator protocols. The following is a list of the EDRN protocols that have been captured and curated. Additional information will be added as it becomes available. Contact information is provided as part of the detail for each protocol.

  11. Talking the Test: Using Verbal Report Data in Looking at the Processing of Cloze Tasks.

    ERIC Educational Resources Information Center

    Gibson, Bob

    1997-01-01

    The use of verbal report procedures as a research tool for gaining insight into the language learning process is discussed. Specifically, having second language students complete think-aloud protocols when they take cloze tests can provide useful information about what is being measured and how it has been learned. Use of such introspective…

  12. Controlled information destruction: the final frontier in preserving information security for every organisation

    NASA Astrophysics Data System (ADS)

    Curiac, Daniel-Ioan; Pachia, Mihai

    2015-05-01

    Information security represents the cornerstone of every data processing system that resides in an organisation's trusted network, implementing all necessary protocols, mechanisms and policies to be one step ahead of possible threats. Starting from the need to strengthen the set of security services, in this article we introduce a new and innovative process named controlled information destruction (CID) that is meant to secure sensitive data that are no longer needed for the organisation's future purposes but would be very damaging if revealed. The disposal of this type of data has to be controlled carefully in order to delete not only the information itself but also all its splinters spread throughout the network, thus denying any possibility of recovering the information after its alleged destruction. This process leads to a modified model of information assurance and also reconfigures the architecture of any information security management system. The scheme we envisioned relies on a reshaped information lifecycle, which reveals the impact of the CID procedure directly upon the information states.

  13. An application protocol for CAD to CAD transfer of electronic information

    NASA Technical Reports Server (NTRS)

    Azu, Charles C., Jr.

    1993-01-01

    The exchange of Computer Aided Design (CAD) information between dissimilar CAD systems is a problem. This is especially true for transferring electronics CAD information such as multi-chip module (MCM), hybrid microcircuit assembly (HMA), and printed circuit board (PCB) designs. Currently, there exist several neutral data formats for transferring electronics CAD information, including the IGES, EDIF, and DXF formats. All of these formats have limitations for exchanging electronic data. In an attempt to overcome these limitations, the Navy's MicroCIM program implemented a project to transfer hybrid microcircuit design information between dissimilar CAD systems. The IGES (Initial Graphics Exchange Specification) format is used since it is well established within the CAD industry. The goal of the project is a complete transfer of microelectronic CAD information, using IGES, without any data loss. An Application Protocol (AP) is being developed to specify how hybrid microcircuit CAD information will be represented by IGES entity constructs. The AP defines which IGES data items are appropriate for describing HMA geometry, connectivity, and processing, as well as HMA material characteristics.

  14. Brain Monitoring with Electroencephalography and the Electroencephalogram-Derived Bispectral Index During Cardiac Surgery

    PubMed Central

    Kertai, Miklos D.; Whitlock, Elizabeth L.; Avidan, Michael S.

    2011-01-01

    Cardiac surgery presents particular challenges for the anesthesiologist. In addition to standard and advanced monitors typically used during cardiac surgery, anesthesiologists may consider monitoring the brain with raw or processed electroencephalography (EEG). There is strong evidence that a protocol incorporating the processed EEG Bispectral Index (BIS) decreases the incidence of intraoperative awareness compared with standard practice. However, there is conflicting evidence that incorporating the BIS into cardiac anesthesia practice improves “fast-tracking,” decreases anesthetic drug use, or detects cerebral ischemia. Recent research, including many cardiac surgical patients, shows that a protocol based on BIS monitoring is not superior to a protocol based on end-tidal anesthetic concentration monitoring in preventing awareness. There has been a resurgence of interest in the anesthesia literature in limited-montage EEG monitoring, including nonproprietary processed indices. This has been accompanied by research showing that with structured training, anesthesiologists can glean useful information from the raw EEG trace. In this review, we discuss both the hypothesized benefits and limitations of BIS and frontal channel EEG monitoring in the cardiac surgical population. PMID:22253267

  15. A gossip based information fusion protocol for distributed frequent itemset mining

    NASA Astrophysics Data System (ADS)

    Sohrabi, Mohammad Karim

    2018-07-01

    The computational complexity, huge memory space requirement, and time-consuming nature of the frequent pattern mining process are the most important motivations for distributing and parallelizing this mining process. On the other hand, the emergence of distributed computational and operational environments, which causes the production and maintenance of data on different distributed data sources, makes the parallelization and distribution of the knowledge discovery process inevitable. In this paper, a gossip based distributed itemset mining (GDIM) algorithm is proposed to extract frequent itemsets, which are special types of frequent patterns, in a wireless sensor network environment. In this algorithm, local frequent itemsets of each sensor are extracted using a bit-wise horizontal approach (LHPM) from nodes that are clustered using a LEACH-based protocol. Cluster heads communicate with each other using a gossip based protocol to find the patterns whose global support is equal to or greater than the specified support threshold. Experimental results show that the proposed algorithm outperforms the best existing gossip based algorithm in terms of execution time.
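
The gossip step at the cluster-head level can be illustrated with a toy pairwise-averaging round; the local counts, node count, and threshold below are invented, and GDIM's actual message format is not reproduced here:

```python
import random

# Hypothetical local support counts for one itemset at four cluster heads.
local_counts = [12.0, 7.0, 0.0, 9.0]
n_nodes = len(local_counts)
global_threshold = 20  # minimum global support for the itemset to be frequent

# Push-style gossip: repeatedly pick a random pair of nodes and average
# their values. The sum is preserved, so values converge to the global mean.
values = local_counts[:]
random.seed(0)
for _ in range(200):
    i, j = random.sample(range(n_nodes), 2)
    avg = (values[i] + values[j]) / 2.0
    values[i] = values[j] = avg

# Any node can now estimate the global support as (local estimate * n_nodes).
estimated_global = values[0] * n_nodes
print(estimated_global >= global_threshold)  # True (global support is 28)
```

Gossip averaging only needs each node to talk to one random peer per round, which is why it suits energy-constrained sensor networks better than flooding every local count to every head.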

  16. Faithful Remote Information Concentration Based on the Optimal Universal 1→2 Telecloning of Arbitrary Two-Qubit States

    NASA Astrophysics Data System (ADS)

    Peng, Jia-Yin; Lei, Hong-Xuan; Mo, Zhi-Wen

    2014-05-01

    The previous protocols of remote quantum information concentration were focused on the reverse process of quantum telecloning of single-qubit states. We here investigate the reverse process of optimal universal 1→2 telecloning of arbitrary two-qubit states. The aim of this telecloning is to distribute the quantum information to two groups of spatially separated receivers from a group of two senders situated at two different locations. Our scheme shows that the distributed quantum information can be remotely concentrated back onto a group of two receivers with unit probability by utilizing a maximally entangled four-particle cluster state and a four-particle GHZ state as the quantum channel.

  17. Blockchain protocols in clinical trials: Transparency and traceability of consent.

    PubMed

    Benchoufi, Mehdi; Porcher, Raphael; Ravaud, Philippe

    2017-01-01

    Clinical trial consent for protocols and their revisions should be transparent for patients and traceable for stakeholders. Our goal is to implement a process allowing for collection of patients' informed consent, which is bound to protocol revisions, storing and tracking the consent in a secure, unfalsifiable and publicly verifiable way, and enabling the sharing of this information in real time. For that, we build a consent workflow using a trending technology called Blockchain. This is a distributed technology that brings a built-in layer of transparency and traceability. From a more general and prospective point of view, we believe Blockchain technology brings a paradigmatic shift to the entire clinical research field. We designed a Proof-of-Concept protocol consisting of time-stamping each step of the patient's consent collection using Blockchain, thus archiving and historicising the consent through cryptographic validation in a securely unfalsifiable and transparent way. For each protocol revision, consent was sought again. We obtained a single document, in an open format, that accounted for the whole consent collection process: a time-stamped consent status regarding each version of the protocol. This document cannot be corrupted and can be checked on any dedicated public website. It should be considered a robust proof of data. However, in a live clinical trial, the authentication system should be strengthened to remove the need for third parties, here trial stakeholders, and give participative control to the peer users. In the future, the complex data flow of a clinical trial could be tracked by using Blockchain, whose core functionality, named Smart Contract, could help prevent clinical trial events from occurring out of chronological order, for example, including patients before they consented or analysing case report form data before freezing the database.
Globally, Blockchain could help with reliability, security, transparency and could be a consistent step toward reproducibility.
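
The time-stamping idea can be sketched as a simple hash chain, where each consent event commits to the previous one; the field names and fixed timestamp below are illustrative assumptions, not the authors' implementation:

```python
import hashlib
import json

def record_consent(chain, patient_id, protocol_version, consent_given):
    """Append a consent event whose hash covers the previous block."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {
        "patient_id": patient_id,
        "protocol_version": protocol_version,
        "consent": consent_given,
        "timestamp": 1700000000,  # fixed for reproducibility; use a real clock in practice
        "prev_hash": prev_hash,
    }
    # The hash is computed over the block body (the "hash" key is added after).
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return chain + [block]

def verify(chain):
    """Re-derive every hash; any tampering breaks the chain."""
    prev = "0" * 64
    for block in chain:
        body = {k: v for k, v in block.items() if k != "hash"}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != prev or recomputed != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = record_consent([], "P-001", "v1.0", True)
chain = record_consent(chain, "P-001", "v2.0", True)  # re-consent after revision
print(verify(chain))   # True
chain[0]["consent"] = False  # attempted tampering
print(verify(chain))   # False
```

A real Blockchain adds distributed consensus on top of this chaining, so no single party (including the sponsor) can rewrite history unilaterally.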

  18. Blockchain protocols in clinical trials: Transparency and traceability of consent

    PubMed Central

    Benchoufi, Mehdi; Porcher, Raphael; Ravaud, Philippe

    2018-01-01

    Clinical trial consent for protocols and their revisions should be transparent for patients and traceable for stakeholders. Our goal is to implement a process allowing for collection of patients’ informed consent, which is bound to protocol revisions, storing and tracking the consent in a secure, unfalsifiable and publicly verifiable way, and enabling the sharing of this information in real time. For that, we build a consent workflow using a trending technology called Blockchain. This is a distributed technology that brings a built-in layer of transparency and traceability. From a more general and prospective point of view, we believe Blockchain technology brings a paradigmatic shift to the entire clinical research field. We designed a Proof-of-Concept protocol consisting of time-stamping each step of the patient’s consent collection using Blockchain, thus archiving and historicising the consent through cryptographic validation in a securely unfalsifiable and transparent way. For each protocol revision, consent was sought again. We obtained a single document, in an open format, that accounted for the whole consent collection process: a time-stamped consent status regarding each version of the protocol. This document cannot be corrupted and can be checked on any dedicated public website. It should be considered a robust proof of data. However, in a live clinical trial, the authentication system should be strengthened to remove the need for third parties, here trial stakeholders, and give participative control to the peer users. In the future, the complex data flow of a clinical trial could be tracked by using Blockchain, whose core functionality, named Smart Contract, could help prevent clinical trial events from occurring out of chronological order, for example, including patients before they consented or analysing case report form data before freezing the database.
Globally, Blockchain could help with reliability, security, transparency and could be a consistent step toward reproducibility. PMID:29167732
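    The time-stamped, tamper-evident consent trail described in this abstract can be sketched as a plain hash chain, a minimal stand-in for the Blockchain layer; the record fields and function names below are illustrative, not the authors' implementation:

```python
import hashlib
import json

def chain_consent(events, genesis="0" * 64):
    """Hash-chain consent events (e.g. {"patient", "protocol_version", "ts"}).
    Each record's hash covers the previous hash, so any later edit breaks the chain."""
    chain, prev = [], genesis
    for ev in events:
        body = {"event": ev, "prev": prev}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        chain.append({**body, "hash": digest})
        prev = digest
    return chain

def chain_is_valid(chain, genesis="0" * 64):
    """Recompute every hash and link; False as soon as anything was altered."""
    prev = genesis
    for rec in chain:
        body = {"event": rec["event"], "prev": rec["prev"]}
        if rec["prev"] != prev:
            return False
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != rec["hash"]:
            return False
        prev = rec["hash"]
    return True
```

    Public verifiability in a real deployment comes from anchoring the chain's head hash on a public blockchain; the chain structure itself is what makes the consent history tamper-evident.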

  19. Characterization of addressability by simultaneous randomized benchmarking.

    PubMed

    Gambetta, Jay M; Córcoles, A D; Merkel, S T; Johnson, B R; Smolin, John A; Chow, Jerry M; Ryan, Colm A; Rigetti, Chad; Poletto, S; Ohki, Thomas A; Ketchen, Mark B; Steffen, M

    2012-12-14

    The control and handling of errors arising from cross talk and unwanted interactions in multiqubit systems is an important issue in quantum information processing architectures. We introduce a benchmarking protocol that provides information about the amount of addressability present in the system and implement it on coupled superconducting qubits. The protocol consists of randomized benchmarking experiments run both individually and simultaneously on pairs of qubits. A relevant figure of merit for the addressability is then related to the differences in the measured average gate fidelities in the two experiments. We present results from two similar samples with differing cross talk and unwanted qubit-qubit interactions. The results agree with predictions based on simple models of the classical cross talk and Stark shifts.
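    The figure of merit described here, the shift in measured average gate fidelity between individual and simultaneous benchmarking, reduces to arithmetic on the fitted RB decay parameters. A hedged sketch, assuming the standard single-qubit RB relation rather than the paper's exact definitions:

```python
def avg_gate_infidelity(p, d=2):
    """Standard RB relation between the fitted decay parameter p and the
    average gate infidelity r for a system of dimension d."""
    return (d - 1) * (1 - p) / d

def addressability_shift(p_alone, p_simultaneous, d=2):
    """Extra average error picked up when the neighbouring qubit is driven
    at the same time; zero indicates perfect addressability."""
    return avg_gate_infidelity(p_simultaneous, d) - avg_gate_infidelity(p_alone, d)
```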

  20. The Satisfaction and Use of Research Ethics Board Information Systems in Canada.

    PubMed

    Detlor, Brian; Wilson, Michael J

    2015-10-01

    This article reports findings from a national survey of Research Ethics Board (REB) personnel across Canada on the satisfaction and use of information systems that support the review and administration of research ethics protocols. Findings indicate that though a wide variety of REB systems are utilized, the majority fall short of desired characteristics. Despite these shortcomings, most respondents are satisfied with their current REB systems. Satisfaction is dependent on the volume of protocols processed in relation to the robustness of the system. Boards with higher volumes are more satisfied with full-fledged systems; however, the satisfaction of REBs with lower volumes is not affected by the robustness of the REB system used. Recommendations are provided. © The Author(s) 2015.

  1. Modeling abundance using multinomial N-mixture models

    USGS Publications Warehouse

    Royle, Andy

    2016-01-01

    Multinomial N-mixture models are a generalization of the binomial N-mixture models described in Chapter 6 to allow for more complex and informative sampling protocols beyond simple counts. Many commonly used protocols such as multiple observer sampling, removal sampling, and capture-recapture produce a multivariate count frequency that has a multinomial distribution and for which multinomial N-mixture models can be developed. Such protocols typically result in more precise estimates than binomial mixture models because they provide direct information about parameters of the observation process. We demonstrate the analysis of these models in BUGS using several distinct formulations that afford great flexibility in the types of models that can be developed, and we demonstrate likelihood analysis using the unmarked package. Spatially stratified capture-recapture models are one class of models that fall into the multinomial N-mixture framework, and we discuss analysis of stratified versions of classical models such as model Mb, Mh and other classes of models that are only possible to describe within the multinomial N-mixture framework.
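    For removal sampling, one of the protocols named above, the multinomial cell probabilities have a simple closed form. A hedged Python sketch of the cells and the resulting multinomial log-likelihood term (notation only; this is not the BUGS or unmarked code, and it assumes a nonzero "never removed" probability):

```python
import math

def removal_cell_probs(p, K):
    """Cell probabilities for K-pass removal sampling with detection p:
    pi_k = p * (1 - p)**(k - 1), plus the 'never removed' remainder."""
    pis = [p * (1 - p) ** (k - 1) for k in range(1, K + 1)]
    return pis, 1.0 - sum(pis)

def multinomial_logpmf(counts, N, pis, p0):
    """Log-probability of the observed removal counts given local abundance N."""
    n0 = N - sum(counts)          # individuals never removed
    if n0 < 0:
        return float("-inf")
    logp = math.lgamma(N + 1) - math.lgamma(n0 + 1) + n0 * math.log(p0)
    for y, pi in zip(counts, pis):
        logp += y * math.log(pi) - math.lgamma(y + 1)
    return logp
```

    The N-mixture part of the model then sums this likelihood over a prior on N (e.g. Poisson), which is what makes abundance estimable from the counts alone.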

  2. A covert authentication and security solution for GMOs.

    PubMed

    Mueller, Siguna; Jafari, Farhad; Roth, Don

    2016-09-21

    Proliferation and expansion of security risks necessitate new measures to ensure authenticity and validation of GMOs. Watermarking and other cryptographic methods are available which conceal and recover the original signature, but in the process reveal the authentication information. In many scenarios watermarking and standard cryptographic methods are necessary but not sufficient, and new, more advanced cryptographic protocols are needed. Herein, we present a new crypto protocol that is applicable in broader settings and embeds the authentication string indistinguishably from a random element in the signature space, so that the string can be verified or denied without disclosing the actual signature. Results show that in a nucleotide string of 1000, the algorithm gives a correlation of 0.98 or higher between the distribution of the codon and that of E. coli, making the signature virtually invisible. This algorithm may be used to securely authenticate and validate GMOs without disclosing the actual signature. While this protocol uses watermarking, its novelty is in the use of more complex cryptographic techniques based on zero-knowledge proofs to encode information.

  3. A reference model for scientific information interchange

    NASA Technical Reports Server (NTRS)

    Reich, Lou; Sawyer, Don; Davis, Randy

    1993-01-01

    This paper presents an overview of an Information Interchange Reference Model (IIRM) currently being developed by individuals participating in the Consultative Committee for Space Data Systems (CCSDS) Panel 2, the Planetary Data Systems (PDS), and the Committee on Earth Observing Satellites (CEOS). This is an ongoing research activity and is not an official position by these bodies. This reference model provides a framework for describing and assessing current and proposed methodologies for information interchange within and among the space agencies. It is hoped that this model will improve interoperability between the various methodologies. As such, this model attempts to address key information interchange issues as seen by the producers and users of space-related data and to put them into a coherent framework. Information is understood as the knowledge (e.g., the scientific content) represented by data. Therefore, concern is not primarily on mechanisms for transferring data from user to user (e.g., compact disk read-only memory (CD-ROM), wide-area networks, optical tape, and so forth) but on how information is encoded as data and how the information content is maintained with minimal loss or distortion during transmittal. The model assumes open systems, which means that the protocols or methods used should be fully described and the descriptions publicly available. Ideally these protocols are promoted by recognized standards organizations using processes that permit involvement by those most likely to be affected, thereby enhancing the protocol's stability and the likelihood of wide support.

  4. SNMP-SI: A Network Management Tool Based on Slow Intelligence System Approach

    NASA Astrophysics Data System (ADS)

    Colace, Francesco; de Santo, Massimo; Ferrandino, Salvatore

    The last decade has witnessed an intense spread of computer networks, further accelerated by the introduction of wireless networks. This growth has been accompanied by a significant increase in network management problems. Especially in small companies, where no personnel are assigned to these tasks, the management of such networks is often complex and malfunctions can have significant impacts on their businesses. A possible solution is the adoption of the Simple Network Management Protocol (SNMP), a standard protocol used to exchange network management information. It is part of the Transmission Control Protocol/Internet Protocol (TCP/IP) protocol suite. SNMP provides a tool for network administrators to manage network performance, find and solve network problems, and plan for network growth. SNMP has a big disadvantage: its simple design means that the information it deals with is neither detailed nor well organized enough to deal with expanding modern networking requirements. Over the past years much effort has been devoted to overcoming the limitations of the Simple Network Management Protocol, and new frameworks have been developed; a promising approach involves the use of Ontology. This is the starting point of this paper, where a novel approach to network management based on the use of Slow Intelligence System methodologies and Ontology-based techniques is proposed. A Slow Intelligence System is a general-purpose system characterized by being able to improve performance over time through a process involving enumeration, propagation, adaptation, elimination and concentration. The proposed approach therefore aims to develop a system able to acquire, according to the SNMP standard, information from the various hosts in the managed networks and apply solutions in order to solve problems. To check the feasibility of this model, first experimental results in a real scenario are shown.

  5. Implementation of an anonymisation tool for clinical trials using a clinical trial processor integrated with an existing trial patient data information system.

    PubMed

    Aryanto, Kadek Y E; Broekema, André; Oudkerk, Matthijs; van Ooijen, Peter M A

    2012-01-01

    To present an adapted Clinical Trial Processor (CTP) test set-up for receiving, anonymising and saving Digital Imaging and Communications in Medicine (DICOM) data using external input from the original database of an existing clinical study information system to guide the anonymisation process. Two methods are presented for an adapted CTP test set-up. In the first method, images are pushed from the Picture Archiving and Communication System (PACS) using the DICOM protocol through a local network. In the second method, images are transferred through the internet using the HTTPS protocol. In total 25,000 images from 50 patients were moved from the PACS, anonymised and stored within roughly 2 h using the first method. In the second method, an average of 10 images per minute were transferred and processed over a residential connection. In both methods, no duplicated images were stored when previous images were retransferred. The anonymised images are stored in appropriate directories. The CTP can transfer and process DICOM images correctly in a very easy set-up providing a fast, secure and stable environment. The adapted CTP allows easy integration into an environment in which patient data are already included in an existing information system.
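    The externally driven anonymisation step this entry describes can be illustrated with plain dictionaries standing in for DICOM headers. This is a hypothetical sketch: the tag names mirror common DICOM attribute keywords, and the lookup dict plays the role of the existing trial information system that guides the CTP's anonymisation:

```python
# Identifying attributes to strip (illustrative subset of DICOM PHI tags).
PHI_TAGS = {"PatientName", "PatientID", "PatientBirthDate"}

def anonymise(header, trial_lookup):
    """Strip identifying attributes and substitute the trial pseudonym
    supplied by the external study database (trial_lookup)."""
    pseudonym = trial_lookup[header["PatientID"]]
    out = {k: v for k, v in header.items() if k not in PHI_TAGS}
    out["PatientID"] = pseudonym   # replace identity with the trial ID
    return out
```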

  6. Medical education research and IRB review: an analysis and comparison of the IRB review process at six institutions.

    PubMed

    Dyrbye, Liselotte N; Thomas, Matthew R; Mechaber, Alex J; Eacker, Anne; Harper, William; Massie, F Stanford; Power, David V; Shanafelt, Tait D

    2007-07-01

    To compare how different institutional review boards (IRBs) process and evaluate the same multi-institutional educational research proposal of medical students' quality of life. Prospective collection in 2005 of key variables regarding the IRB submission and review process of the same educational research proposal involving medical students, which was submitted to six IRBs, each associated with a different medical school. Four IRBs determined the protocol was appropriate for expedited review, and the remaining two required full review. Substantial variation existed in the time to review the protocol by an IRB administrator/IRB member (range 1-101 days) and by the IRB committee (range 6-115 days). One IRB committee approved the study as written. The remaining five IRB committees had a median of 13 requests for additional information/changes to the protocol. Sixty-eight percent of requests (36 of 53) pertained to the informed consent letter; one third (12 of 36) of these requests were unique modifications requested by one IRB but not the others. Although five IRB committees approved the survey after a median of 47 days (range 6-73), one committee had not responded six months after submission (164 days), preventing that school from participating. The findings suggest variability in the timeliness and consistency of IRB review of medical education research across institutions that may hinder multi-institutional research and slow evidence-based medical education reform. The findings demonstrate the difficulties of having medical education research reviewed by IRBs, which are typically designed to review clinical trials, and suggest that the review process for medical education research needs reform.

  7. Improving the readability and processability of a pediatric informed consent document: effects on parents' understanding.

    PubMed

    Tait, Alan R; Voepel-Lewis, Terri; Malviya, Shobha; Philipson, Sandra J

    2005-04-01

    To examine whether a consent document modified to conform with the federal guidelines for readability and processability would result in greater parental understanding compared with a standard form. Randomized clinical study. The preoperative waiting area of a large tertiary care children's hospital. A total of 305 parents of children scheduled for minor elective surgical procedures. Parents were randomized to receive information about a clinical study in 1 of 4 ways: (1) standard consent form alone, (2) standard consent form with verbal disclosure, (3) modified form alone (standard form modified to meet the federal guidelines for readability and processability), and (4) modified form with verbal disclosure. Parents were interviewed to determine their understanding of 11 elements of consent, including study purpose, protocol, risks, benefits to child (direct), benefit to others (indirect), freedom to withdraw, alternatives, duration of study, voluntariness, confidentiality, and whom to contact. Their responses were scored by 2 independent assessors. Understanding of the protocol, study duration, risks, and direct benefits, together with overall understanding, was greater among parents who received the modified form (P<.001). Additionally, parents reported that the modified form had greater clarity (P = .009) and improved layout compared with the standard form (P<.001). When parents were shown both forms, 81.2% preferred the modified version. Results suggest that a consent form written according to federal guidelines for readability and processability can improve parent understanding and thus will be important in enhancing the informed consent process.

  8. Intelligent routing protocol for ad hoc wireless network

    NASA Astrophysics Data System (ADS)

    Peng, Chaorong; Chen, Chang Wen

    2006-05-01

    A novel routing scheme for mobile ad hoc networks (MANETs), which combines hybrid and multi-inter-routing path properties with a distributed topology discovery route mechanism using control agents, is proposed in this paper. In recent years, a variety of hybrid routing protocols for mobile ad hoc wireless networks (MANETs) have been developed, which proactively maintain routing information for a local neighborhood while reactively acquiring routes to destinations beyond it. Hybrid protocols reduce routing discovery latency and end-to-end delay by providing high connectivity without requiring much of the scarce network capacity. On the other hand, hybrid routing protocols in MANETs, like the Zone Routing Protocol, still need route "re-discovery" time when a link on a route between zones breaks, since the topology update information needs to be broadcast as routing requests in the local zone. Due to this delay, such routing protocols may not be applicable for real-time data and multimedia communication. We utilize the advantages of a clustering organization and multiple routing paths in the routing protocol to achieve several goals at the same time. Firstly, IRP efficiently saves network bandwidth and reduces route reconstruction time when a routing path fails; the IRP protocol does not require global periodic routing advertisements, and local control agents automatically monitor and repair broken links. Secondly, it efficiently reduces congestion and traffic "bottlenecks" for ClusterHeads in the clustered network. Thirdly, it reduces the significant overheads associated with maintaining clusters. Fourthly, it improves cluster stability when the dynamic topology changes frequently. In this paper, we present the Intelligent Routing Protocol. First, we discuss the problem of routing in ad hoc networks and the motivation for IRP. We describe the hierarchical architecture of IRP, then describe the routing process and illustrate it with an example. Further, we describe the control management mechanisms used to control active routes and reduce the traffic generated by the route discovery procedure. Finally, numerical experiments are given to show the effectiveness of the IRP routing protocol.

  9. Routing protocol for wireless quantum multi-hop mesh backbone network based on partially entangled GHZ state

    NASA Astrophysics Data System (ADS)

    Xiong, Pei-Ying; Yu, Xu-Tao; Zhang, Zai-Chen; Zhan, Hai-Tao; Hua, Jing-Yu

    2017-08-01

    Quantum multi-hop teleportation is important in the field of quantum communication. In this study, we propose a quantum multi-hop communication model and a quantum routing protocol with multi-hop teleportation for wireless mesh backbone networks. Based on an analysis of quantum multi-hop protocols, a partially entangled Greenberger-Horne-Zeilinger (GHZ) state is selected as the quantum channel for the proposed protocol. Both quantum and classical wireless channels exist between two neighboring nodes along the route. With the proposed routing protocol, quantum information can be transmitted hop by hop from the source node to the destination node. Using multi-hop teleportation based on the partially entangled GHZ state, a quantum route is established with the minimum number of hops. The difference between our routing protocol and the classical one is that in the former, the processes used to find a quantum route and establish quantum channel entanglement occur simultaneously. The Bell state measurement results of each hop are piggybacked onto quantum route finding information. This method reduces the total number of packets and the magnitude of air interface delay. The derivation of the establishment of a quantum channel between source and destination is also presented here. The final success probability of quantum multi-hop teleportation in wireless mesh backbone networks was simulated and analyzed. Our research shows that quantum multi-hop teleportation in wireless mesh backbone networks through a partially entangled GHZ state is feasible.

  10. Measuring bioenergetics in T cells using a Seahorse Extracellular Flux Analyzer

    PubMed Central

    van der Windt, Gerritje J.W.; Chang, Chih-Hao; Pearce, Erika L.

    2016-01-01

    This unit contains several protocols to determine the energy utilization of T cells in real-time using a Seahorse Extracellular Flux Analyzer (www.seahorsebio.com). The advantages to using this machine over traditional metabolic assays include the simultaneous measurement of glycolysis and mitochondrial respiration, in real-time, on relatively small numbers of cells, without any radioactivity. The Basic Protocol describes a standard mitochondrial stress test on the XFe96, which yields information about oxidative phosphorylation and glycolysis, two energy-generating pathways. The alternate protocols provide examples of adaptations to the Basic Protocol, including adjustments for the use of the XFe24. A protocol for real-time bioenergetic responses to T cell activation allows for the analysis of immediate metabolic changes after T cell receptor stimulation. Specific substrate utilization can be determined by the use of differential assay media, or the injection of drugs that specifically affect certain metabolic processes. Accurate cell numbers, purity, and viability are critical to obtain reliable results. PMID:27038461

  11. Measuring Bioenergetics in T Cells Using a Seahorse Extracellular Flux Analyzer.

    PubMed

    van der Windt, Gerritje J W; Chang, Chih-Hao; Pearce, Erika L

    2016-04-01

    This unit contains several protocols to determine the energy utilization of T cells in real-time using a Seahorse Extracellular Flux Analyzer (http://www.seahorsebio.com). The advantages to using this machine over traditional metabolic assays include the simultaneous measurement of glycolysis and mitochondrial respiration, in real-time, on relatively small numbers of cells, without any radioactivity. The Basic Protocol describes a standard mitochondrial stress test on the XFe96, which yields information about oxidative phosphorylation and glycolysis, two energy-generating pathways. The alternate protocols provide examples of adaptations to the Basic Protocol, including adjustments for the use of the XFe24. A protocol for real-time bioenergetic responses to T cell activation allows for the analysis of immediate metabolic changes after T cell receptor stimulation. Specific substrate utilization can be determined by the use of differential assay media, or the injection of drugs that specifically affect certain metabolic processes. Accurate cell numbers, purity, and viability are critical to obtain reliable results. Copyright © 2016 John Wiley & Sons, Inc.
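    The standard mitochondrial stress test described in the two entries above yields its canonical parameters from simple differences between the phase-wise OCR plateaus. A hedged sketch: the phase ordering and parameter definitions follow common practice for this assay, not a specific kit manual:

```python
def stress_test_parameters(baseline, oligomycin, fccp, rot_aa):
    """Derive canonical mitochondrial stress-test parameters from mean
    oxygen consumption rates (OCR) after each injection phase."""
    non_mito = rot_aa                      # OCR left after rotenone/antimycin A
    basal = baseline - non_mito            # basal mitochondrial respiration
    atp_linked = baseline - oligomycin     # respiration blocked by oligomycin
    maximal = fccp - non_mito              # uncoupled (FCCP) respiration
    spare = maximal - basal                # spare respiratory capacity
    return {"basal": basal, "ATP-linked": atp_linked,
            "maximal": maximal, "spare": spare}
```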

  12. Quantum gates by inverse engineering of a Hamiltonian

    NASA Astrophysics Data System (ADS)

    Santos, Alan C.

    2018-01-01

    Inverse engineering of a Hamiltonian (IEH) from an evolution operator is a useful technique for the protocol of quantum control with potential applications in quantum information processing. In this paper we introduce a particular protocol to perform IEH and we show how this scheme can be used to implement a set of quantum gates by using minimal quantum resources (such as entanglement, interactions between more than two qubits or auxiliary qubits). Remarkably, while previous protocols request three-qubit interactions and/or auxiliary qubits to implement such gates, our protocol requires just two-qubit interactions and no auxiliary qubits. By using this approach we can obtain a large class of Hamiltonians that allow us to implement single and two-qubit gates necessary for quantum computation. To conclude this article we analyze the performance of our scheme against systematic errors related to amplitude noise, where we show that the free parameters introduced in our scheme can be useful for enhancing the robustness of the protocol against such errors.

  13. A Protocol for Advanced Psychometric Assessment of Surveys

    PubMed Central

    Squires, Janet E.; Hayduk, Leslie; Hutchinson, Alison M.; Cranley, Lisa A.; Gierl, Mark; Cummings, Greta G.; Norton, Peter G.; Estabrooks, Carole A.

    2013-01-01

    Background and Purpose. In this paper, we present a protocol for advanced psychometric assessments of surveys based on the Standards for Educational and Psychological Testing. We use the Alberta Context Tool (ACT) as an exemplar survey to which this protocol can be applied. Methods. Data mapping, acceptability, reliability, and validity are addressed. Acceptability is assessed with missing data frequencies and the time required to complete the survey. Reliability is assessed with internal consistency coefficients and information functions. A unitary approach to validity consisting of accumulating evidence based on instrument content, response processes, internal structure, and relations to other variables is taken. We also address assessing performance of survey data when aggregated to higher levels (e.g., nursing unit). Discussion. In this paper we present a protocol for advanced psychometric assessment of survey data using the Alberta Context Tool (ACT) as an exemplar survey; application of the protocol to the ACT survey is underway. Psychometric assessment of any survey is essential to obtaining reliable and valid research findings. This protocol can be adapted for use with any nursing survey. PMID:23401759

  14. Protecting single-photon entanglement with practical entanglement source

    NASA Astrophysics Data System (ADS)

    Zhou, Lan; Ou-Yang, Yang; Wang, Lei; Sheng, Yu-Bo

    2017-06-01

    Single-photon entanglement (SPE) is important for quantum communication and quantum information processing. However, SPE is sensitive to photon loss. In this paper, we discuss a linear optical amplification protocol for protecting SPE. Different from the previous protocols, we exploit the practical spontaneous parametric down-conversion (SPDC) source to realize the amplification, for the ideal entanglement source is unavailable in current quantum technology. Moreover, we prove that the amplification using the entanglement generated from SPDC source as auxiliary is better than the amplification assisted with single photons. The reason is that the vacuum state from SPDC source will not affect the amplification, so that it can be eliminated automatically. This protocol may be useful in future long-distance quantum communications.

  15. Relativistic (2,3)-threshold quantum secret sharing

    NASA Astrophysics Data System (ADS)

    Ahmadi, Mehdi; Wu, Ya-Dong; Sanders, Barry C.

    2017-09-01

    In quantum secret sharing protocols, the usual presumption is that the distribution of quantum shares and players' collaboration are both performed inertially. Here we develop a quantum secret sharing protocol that relaxes these assumptions wherein we consider the effects due to the accelerating motion of the shares. Specifically, we solve the (2,3)-threshold continuous-variable quantum secret sharing in noninertial frames. To this aim, we formulate the effect of relativistic motion on the quantum field inside a cavity as a bosonic quantum Gaussian channel. We investigate how the fidelity of quantum secret sharing is affected by nonuniform motion of the quantum shares. Furthermore, we fully characterize the canonical form of the Gaussian channel, which can be utilized in quantum-information-processing protocols to include relativistic effects.

  16. Quantum enigma cipher as a generalization of the quantum stream cipher

    NASA Astrophysics Data System (ADS)

    Kato, Kentaro

    2016-09-01

    Various types of randomizations for the quantum stream cipher by Y00 protocol have been developed so far. In particular, it must be noted that the analysis of immunity against correlation attacks with a new type of randomization by Hirota and Kurosawa prompted a new look at the quantum stream cipher by Y00 protocol (Quant. Inform. Process. 6(2) 2007). From the preceding study on the quantum stream cipher, we recognized that the quantum stream cipher by Y00 protocol could be generalized to a new type of physical cipher that has the potential to exceed the Shannon limit by installing additional randomization mechanisms, in accordance with the laws of quantum mechanics. We call this new type of physical random cipher the quantum enigma cipher. In this article, we introduce the recent developments for the quantum stream cipher by Y00 protocol and future plans toward the quantum enigma cipher.

  17. Adaptive tracking of a time-varying field with a quantum sensor

    NASA Astrophysics Data System (ADS)

    Bonato, Cristian; Berry, Dominic W.

    2017-05-01

    Sensors based on single spins can enable magnetic-field detection with very high sensitivity and spatial resolution. Previous work has concentrated on sensing of a constant magnetic field or a periodic signal. Here, we instead investigate the problem of estimating a field with nonperiodic variation described by a Wiener process. We propose and study, by numerical simulations, an adaptive tracking protocol based on Bayesian estimation. The tracking protocol updates the probability distribution for the magnetic field based on measurement outcomes and adapts the choice of sensing time and phase in real time. By taking the statistical properties of the signal into account, our protocol strongly reduces the required measurement time. This leads to a reduction of the error in the estimation of a time-varying signal by up to a factor of four compared with protocols that do not take this information into account.
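    The Bayesian update underlying such a tracking protocol can be sketched for a Gaussian belief: the Wiener process inflates the variance at each step, and each measurement triggers a standard Gaussian (Kalman-type) update. This is a generic sketch that omits the paper's key ingredient, the adaptive choice of sensing time and phase:

```python
def track_field(measurements, sigma_w, sigma_m, mu0=0.0, var0=1.0):
    """Track a Wiener-process field from noisy measurements.
    sigma_w: process (diffusion) noise per step; sigma_m: measurement noise."""
    mu, var = mu0, var0
    estimates = []
    for z in measurements:
        var += sigma_w ** 2                 # predict: the field diffuses
        gain = var / (var + sigma_m ** 2)   # weight given to the new measurement
        mu += gain * (z - mu)               # update the posterior mean
        var *= (1 - gain)                   # update the posterior variance
        estimates.append(mu)
    return estimates
```

    Taking the signal statistics into account enters through `sigma_w`: a belief that tracks the diffusion stays close to the field, whereas ignoring it (treating the field as constant) lets the error grow.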

  18. Advanced information processing system: Authentication protocols for network communication

    NASA Technical Reports Server (NTRS)

    Harper, Richard E.; Adams, Stuart J.; Babikyan, Carol A.; Butler, Bryan P.; Clark, Anne L.; Lala, Jaynarayan H.

    1994-01-01

    In safety critical I/O and intercomputer communication networks, reliable message transmission is an important concern. Difficulties of communication and fault identification in networks arise primarily because the sender of a transmission cannot be identified with certainty, an intermediate node can corrupt a message without certainty of detection, and a babbling node cannot be identified and silenced without lengthy diagnosis and reconfiguration. Authentication protocols use digital signature techniques to verify the authenticity of messages with high probability. Such protocols appear to provide an efficient solution to many of these problems. The objective of this program is to develop, demonstrate, and evaluate intercomputer communication architectures which employ authentication. As a context for the evaluation, the authentication protocol-based communication concept was demonstrated under this program by hosting a real-time flight critical guidance, navigation and control algorithm on a distributed, heterogeneous, mixed redundancy system of workstations and embedded fault-tolerant computers.
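    The core idea of authenticated messaging can be illustrated with a keyed MAC. This is a simplification: the report describes digital-signature techniques, which are asymmetric, whereas HMAC is symmetric; the sketch only shows how a cryptographic tag lets a receiver verify the sender and detect corruption:

```python
import hashlib
import hmac

def authenticate(key, message):
    """Attach a keyed MAC tag so the receiver can check both the
    sender (possession of the key) and message integrity."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def check(key, message, tag):
    """Recompute the tag; constant-time comparison avoids timing leaks."""
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)
```

    A corrupted message, or one from a node without the key (a "babbler"), fails verification, which is what allows faulty senders to be identified without lengthy diagnosis.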

  19. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    USGS Publications Warehouse

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.

  20. A Gossip-based Energy Efficient Protocol for Robust In-network Aggregation in Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Fauji, Shantanu

    We consider the problem of energy-efficient and fault-tolerant in-network aggregation for wireless sensor networks (WSNs). In-network aggregation is the process of aggregating data while collecting it from the sensors to the base station. This process should be energy efficient due to the limited energy at the sensors and tolerant to the high failure rates common in sensor networks. Tree-based in-network aggregation protocols, although energy efficient, are not robust to network failures. Multipath routing protocols are robust to failures to a certain degree but are not energy efficient due to the overhead in the maintenance of multiple paths. We propose a new protocol for in-network aggregation in WSNs which is energy efficient, achieves high lifetime, and is robust to changes in the network topology. Our protocol, the gossip-based protocol for in-network aggregation (GPIA), is based on the spreading of information via gossip. GPIA is not only adaptive to failures and changes in the network topology, but is also energy efficient. The energy efficiency of GPIA comes from all nodes being capable of selective message reception and from detecting convergence of the aggregation early. We experimentally show that GPIA provides significant improvement over some other competitors like Ridesharing, Synopsis Diffusion and the pure version of gossip. GPIA shows tenfold, fivefold and twofold improvements over the pure gossip, Synopsis Diffusion and Ridesharing protocols in terms of network lifetime, respectively. Further, GPIA retains gossip's robustness to failures and improves upon the accuracy of Synopsis Diffusion and Ridesharing.
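    The gossip-based spreading that GPIA builds on can be illustrated with the classic pairwise-averaging scheme: two random nodes repeatedly average their values, and every node converges to the global mean. This is a generic sketch of gossip aggregation, not the GPIA algorithm itself:

```python
import random

def gossip_average(values, rounds=2000, seed=0):
    """Pairwise-averaging gossip: each round, two random nodes average
    their values. The sum is conserved, so all nodes approach the mean."""
    rng = random.Random(seed)
    vals = list(values)
    for _ in range(rounds):
        i = rng.randrange(len(vals))
        j = rng.randrange(len(vals))
        vals[i] = vals[j] = (vals[i] + vals[j]) / 2
    return vals
```

    Because no tree or multipath structure has to be maintained, a node failure only slows convergence instead of breaking the aggregate, which is the robustness property gossip protocols trade on.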

  1. A Natural Language Processing-based Model to Automate MRI Brain Protocol Selection and Prioritization.

    PubMed

    Brown, Andrew D; Marotta, Thomas R

    2017-02-01

    Incorrect imaging protocol selection can contribute to increased healthcare cost and waste. To help healthcare providers improve the quality and safety of medical imaging services, we developed and evaluated three natural language processing (NLP) models to determine whether NLP techniques could be employed to aid in clinical decision support for protocoling and prioritization of magnetic resonance imaging (MRI) brain examinations. To test the feasibility of using an NLP model to support clinical decision making for MRI brain examinations, we designed three different medical imaging prediction tasks, each with a unique outcome: selecting an examination protocol, evaluating the need for contrast administration, and determining priority. We created three models for each prediction task, each using a different classification algorithm (random forest, support vector machine, or k-nearest neighbor) to predict outcomes based on the narrative clinical indications and demographic data associated with 13,982 MRI brain examinations performed from January 1, 2013 to June 30, 2015. Test datasets were used to calculate the accuracy, sensitivity and specificity, predictive values, and the area under the curve. Our optimal results show an accuracy of 82.9%, 83.0%, and 88.2% for the protocol selection, contrast administration, and prioritization tasks, respectively, demonstrating that predictive algorithms can be used to aid in clinical decision support for examination protocoling. NLP models developed from the narrative clinical information provided by referring clinicians and demographic data are feasible methods to predict the protocol and priority of MRI brain examinations. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
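
    The paper's models are not reproduced here, but the k-nearest-neighbor variant it mentions can be sketched in a few lines of plain Python: represent each narrative indication as a bag of words, score similarity with the cosine measure, and vote among the k closest labeled examples. The indications and protocol labels below are invented for illustration, not taken from the study.

```python
from collections import Counter
import math

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def knn_predict(train, text, k=3):
    """Predict a label by majority vote among the k most similar
    training indications (train: list of (text, label) pairs)."""
    q = Counter(text.lower().split())
    ranked = sorted(train, reverse=True,
                    key=lambda tl: cosine(Counter(tl[0].lower().split()), q))
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# Hypothetical clinical indications and protocol labels.
train = [
    ("rule out acute stroke left sided weakness", "stroke"),
    ("sudden weakness slurred speech rule out stroke", "stroke"),
    ("known glioma follow up with contrast", "tumor"),
    ("brain tumor surveillance contrast", "tumor"),
]
print(knn_predict(train, "acute left sided weakness possible stroke"))  # -> stroke
```

    A production system would add TF-IDF weighting and far more training data, but the voting structure is the same.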

  2. A Protocol Analysis of the Influence of Technology on Students' Actions, Verbal Commentary, and Thought Processes During the Performance of Acid-Base Titrations.

    ERIC Educational Resources Information Center

    Nakhleh, Mary B.; Krajcik, Joseph S.

    1993-01-01

    From an analysis of 14 secondary students' actions and thought processes, it was found that the technology's level of information affected the focus of student observations. The microcomputer group focused primarily on the graph, while other groups exhibited multiple foci. The discussion data also reveal that students have three main ideas about how…

  3. Validation of Autoclave Protocols for Successful Decontamination of Category A Medical Waste Generated from Care of Patients with Serious Communicable Diseases

    PubMed Central

    Reimers, Mallory; Ernst, Neysa; Bova, Gregory; Nowakowski, Elaine; Bukowski, James; Ellis, Brandon C.; Smith, Chris; Sauer, Lauren; Dionne, Kim; Carroll, Karen C.; Maragakis, Lisa L.; Parrish, Nicole M.

    2016-01-01

    ABSTRACT In response to the Ebola outbreak in 2014, many hospitals designated specific areas to care for patients with Ebola and other highly infectious diseases. The safe handling of category A infectious substances is a unique challenge in this environment. One solution is on-site waste treatment with a steam sterilizer or autoclave. The Johns Hopkins Hospital (JHH) installed two pass-through autoclaves in its biocontainment unit (BCU). The JHH BCU and The Johns Hopkins biosafety level 3 (BSL-3) clinical microbiology laboratory designed and validated waste-handling protocols with simulated patient trash to ensure adequate sterilization. The results of the validation process revealed that autoclave factory default settings are potentially ineffective for certain types of medical waste and highlighted the critical role of waste packaging in successful sterilization. The lessons learned from the JHH validation process can inform the design of waste management protocols to ensure effective treatment of highly infectious medical waste. PMID:27927920

  4. Validation of Autoclave Protocols for Successful Decontamination of Category A Medical Waste Generated from Care of Patients with Serious Communicable Diseases.

    PubMed

    Garibaldi, Brian T; Reimers, Mallory; Ernst, Neysa; Bova, Gregory; Nowakowski, Elaine; Bukowski, James; Ellis, Brandon C; Smith, Chris; Sauer, Lauren; Dionne, Kim; Carroll, Karen C; Maragakis, Lisa L; Parrish, Nicole M

    2017-02-01

    In response to the Ebola outbreak in 2014, many hospitals designated specific areas to care for patients with Ebola and other highly infectious diseases. The safe handling of category A infectious substances is a unique challenge in this environment. One solution is on-site waste treatment with a steam sterilizer or autoclave. The Johns Hopkins Hospital (JHH) installed two pass-through autoclaves in its biocontainment unit (BCU). The JHH BCU and The Johns Hopkins biosafety level 3 (BSL-3) clinical microbiology laboratory designed and validated waste-handling protocols with simulated patient trash to ensure adequate sterilization. The results of the validation process revealed that autoclave factory default settings are potentially ineffective for certain types of medical waste and highlighted the critical role of waste packaging in successful sterilization. The lessons learned from the JHH validation process can inform the design of waste management protocols to ensure effective treatment of highly infectious medical waste. Copyright © 2017 American Society for Microbiology.

  5. iSBatch: a batch-processing platform for data analysis and exploration of live-cell single-molecule microscopy images and other hierarchical datasets.

    PubMed

    Caldas, Victor E A; Punter, Christiaan M; Ghodke, Harshad; Robinson, Andrew; van Oijen, Antoine M

    2015-10-01

    Recent technical advances have made it possible to visualize single molecules inside live cells. Microscopes with single-molecule sensitivity enable the imaging of low-abundance proteins, allowing for a quantitative characterization of molecular properties. Such data sets contain information on a wide spectrum of important molecular properties, with different aspects highlighted in different imaging strategies. The time-lapsed acquisition of images provides information on protein dynamics over long time scales, giving insight into expression dynamics and localization properties. Rapid burst imaging reveals properties of individual molecules in real-time, informing on their diffusion characteristics, binding dynamics and stoichiometries within complexes. This richness of information, however, adds significant complexity to analysis protocols. In general, large datasets of images must be collected and processed in order to produce statistically robust results and identify rare events. More importantly, as live-cell single-molecule measurements remain on the cutting edge of imaging, few protocols for analysis have been established and thus analysis strategies often need to be explored for each individual scenario. Existing analysis packages are geared towards either single-cell imaging data or in vitro single-molecule data and typically operate with highly specific algorithms developed for particular situations. Our tool, iSBatch, instead allows users to exploit the inherent flexibility of the popular open-source package ImageJ, providing a hierarchical framework in which existing plugins or custom macros may be executed over entire datasets or portions thereof. This strategy affords users freedom to explore new analysis protocols within large imaging datasets, while maintaining hierarchical relationships between experiments, samples, fields of view, cells, and individual molecules.
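
    iSBatch's actual API is not shown in the abstract; the sketch below only illustrates the hierarchical idea it describes (experiments containing samples, fields of view, and cells), with an operation applied at the leaves while the path back up the hierarchy is preserved. All names and the data layout are hypothetical.

```python
# Hypothetical hierarchical dataset: experiment > sample > field of view > cell.
dataset = {
    "exp1": {"sampleA": {"fov1": ["cell1", "cell2"]},
             "sampleB": {"fov1": ["cell3"]}},
}

def for_each_cell(tree, op):
    """Walk the hierarchy and apply op to every leaf (cell), returning
    results keyed by the full path so hierarchical relationships between
    experiments, samples, fields of view, and cells are preserved."""
    out = {}
    for exp, samples in tree.items():
        for sample, fovs in samples.items():
            for fov, cells in fovs.items():
                for cell in cells:
                    out[(exp, sample, fov, cell)] = op(cell)
    return out

results = for_each_cell(dataset, lambda c: c.upper())
print(len(results))  # -> 3, one result per cell
```

    Running the same operation over "entire datasets or portions thereof" then amounts to choosing which subtree to pass in.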

  6. The use of high-fidelity human patient simulation as an evaluative tool in the development of clinical research protocols and procedures.

    PubMed

    Wright, Melanie C; Taekman, Jeffrey M; Barber, Linda; Hobbs, Gene; Newman, Mark F; Stafford-Smith, Mark

    2005-12-01

    Errors in clinical research can be costly, in terms of patient safety, data integrity, and data collection. Data inaccuracy in early subjects of a clinical study may be associated with problems in the design of the protocol, procedures, and data collection tools. High-fidelity patient simulation centers provide an ideal environment to apply human-centered design to clinical trial development. A draft of a complex clinical protocol was designed, evaluated and modified using a high-fidelity human patient simulator in the Duke University Human Simulation and Patient Safety Center. The process included walk-throughs, detailed modifications of the protocol and development of procedural aids. Training of monitors and coordinators provided an opportunity for observation of performance that was used to identify further improvements to the protocol. Evaluative steps were used to design the research protocol and procedures. Iterative modifications were made to the protocol and data collection tools. The success in use of human simulation in the preparation of a complex clinical drug trial suggests the benefits of human patient simulation extend beyond training and medical equipment evaluation. Human patient simulation can provide a context for informal expert evaluation of clinical protocol design and for formal "rehearsal" to evaluate the efficacy of procedures and support tools.

  7. Integration and Analysis of Neighbor Discovery and Link Quality Estimation in Wireless Sensor Networks

    PubMed Central

    Radi, Marjan; Dezfouli, Behnam; Abu Bakar, Kamalrulnizam; Abd Razak, Shukor

    2014-01-01

    Network connectivity and link quality information are the fundamental requirements of wireless sensor network protocols to perform their desired functionality. Most of the existing discovery protocols have focused only on the neighbor discovery problem, while only a few provide integrated neighbor search and link estimation. As these protocols require careful parameter adjustment before network deployment, they cannot provide scalable and accurate network initialization in large-scale dense wireless sensor networks with random topology. Furthermore, the performance of these protocols has not yet been fully evaluated. In this paper, we perform a comprehensive simulation study on the efficiency of employing adaptive protocols compared to the existing nonadaptive protocols for initializing sensor networks with random topology. In this regard, we propose adaptive network initialization protocols which integrate the initial neighbor discovery with the link quality estimation process to initialize large-scale dense wireless sensor networks without requiring any parameter adjustment before network deployment. To the best of our knowledge, this work is the first attempt to provide a detailed simulation study on the performance of integrated neighbor discovery and link quality estimation protocols for initializing sensor networks. This study can help system designers to determine the most appropriate approach for different applications. PMID:24678277

  8. GoActive: a protocol for the mixed methods process evaluation of a school-based physical activity promotion programme for 13-14 year old adolescents.

    PubMed

    Jong, Stephanie T; Brown, Helen Elizabeth; Croxson, Caroline H D; Wilkinson, Paul; Corder, Kirsten L; van Sluijs, Esther M F

    2018-05-21

    Process evaluations are critical for interpreting and understanding outcome trial results. By understanding how interventions function across different settings, process evaluations have the capacity to inform future dissemination of interventions. The complexity of Get others Active (GoActive), a 12-week, school-based physical activity intervention implemented in eight schools, highlights the need to investigate how implementation is achieved across a variety of school settings. This paper describes the mixed methods GoActive process evaluation protocol that is embedded within the outcome evaluation. In this detailed process evaluation protocol, we describe the flexible and pragmatic methods that will be used for capturing the process evaluation data. A mixed methods design will be used for the process evaluation, including quantitative data collected in both the control and intervention arms of the GoActive trial, and qualitative data collected in the intervention arm. Data collection methods will include purposively sampled, semi-structured interviews and focus group interviews, direct observation, and participant questionnaires (completed by students, teachers, older adolescent mentors, and local authority-funded facilitators). Data will be analysed thematically within and across datasets. Overall synthesis of findings will address the process of GoActive implementation and the mechanisms through which this process affects outcomes, with careful attention to the context of the school environment. This process evaluation will explore the experience of participating in GoActive from the perspectives of key groups, providing a greater understanding of the acceptability and process of implementation of the intervention across the eight intervention schools. This will allow for appraisal of the intervention's conceptual base, inform potential dissemination, and help optimise post-trial sustainability.
The process evaluation will also assist in contextualising the trial effectiveness results with respect to how the intervention may or may not have worked and, if it was found to be effective, what might be required for it to be sustained in the 'real world'. Furthermore, it will offer suggestions for the development and implementation of future initiatives to promote physical activity within schools. ISRCTN, ISRCTN31583496 . Registered on 18 February 2014.

  9. 76 FR 11447 - Agency Information Collection Activities; Proposed Collection; Comment Request; Protection of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-02

    ... Protection regulations, the science of ozone layer depletion, and related topics. SUPPLEMENTARY INFORMATION... compliance with the Montreal Protocol on Substances that Deplete the Ozone Layer (Protocol) and the CAA.... obligations under Article 2H of the Montreal Protocol on Substances that Deplete the Ozone Layer (Protocol...

  10. A Study of Novice Systems Analysis Problem Solving Behaviors Using Protocol Analysis

    DTIC Science & Technology

    1992-09-01

    conducted. Each subject was given the same task to perform. The task involved a case study (Appendix B) of a utility company’s customer order processing system...behavior (Ramesh, 1989). The task was to design a customer order processing system that utilized a centralized telephone answering service center...of the utility company’s customer order processing system that was developed based on information obtained by a large systems consulting firm during

  11. Methods for Monitoring Fish Communities of Buffalo National River and Ozark National Scenic Riverways in the Ozark Plateaus of Arkansas and Missouri: Version 1.0

    USGS Publications Warehouse

    Petersen, James C.; Justus, B.G.; Dodd, H.R.; Bowles, D.E.; Morrison, L.W.; Williams, M.H.; Rowell, G.A.

    2008-01-01

    Buffalo National River, located in north-central Arkansas, and Ozark National Scenic Riverways, located in southeastern Missouri, are the two largest units of the National Park Service in the Ozark Plateaus physiographic province. The purpose of this report is to provide a protocol that will be used by the National Park Service to sample fish communities and collect related water-quality, habitat, and stream discharge data of Buffalo National River and Ozark National Scenic Riverways to meet inventory and long-term monitoring objectives. The protocol includes (1) a protocol narrative, (2) several standard operating procedures, and (3) supplemental information helpful for implementation of the protocol. The protocol narrative provides background information, such as the rationale for why a particular resource or resource issue was selected for monitoring, information concerning the resource or resource issue of interest, a description of how monitoring results will inform management decisions, and a discussion of the linkages between this and other monitoring projects. The standard operating procedures cover preparation, training, reach selection, water-quality sampling, fish community sampling, physical habitat collection, measuring stream discharge, equipment maintenance and storage, data management and analysis, reporting, and protocol revision procedures. Much of the information in the standard operating procedures was gathered from existing protocols of the U.S. Geological Survey National Water Quality Assessment program or other sources. Supplemental information helpful for implementing the protocol is also included, covering fish species known or suspected to occur in the parks, sample sites, sample design, fish species traits, index of biotic integrity metrics, sampling equipment, and field forms.

  12. Forest, Trees, Dynamics: Results from a Novel Wisconsin Card Sorting Test Variant Protocol for Studying Global-Local Attention and Complex Cognitive Processes

    PubMed Central

    Cowley, Benjamin; Lukander, Kristian

    2016-01-01

    Background: Recognition of objects and their context relies heavily on the integrated functioning of global and local visual processing. In a realistic setting such as work, this processing becomes a sustained activity, implying a consequent interaction with executive functions. Motivation: There have been many studies of either global-local attention or executive functions; however it is relatively novel to combine these processes to study a more ecological form of attention. We aim to explore the phenomenon of global-local processing during a task requiring sustained attention and working memory. Methods: We develop and test a novel protocol for global-local dissociation, with task structure including phases of divided (“rule search”) and selective (“rule found”) attention, based on the Wisconsin Card Sorting Task (WCST). We test it in a laboratory study with 25 participants, and report on behavior measures (physiological data was also gathered, but not reported here). We develop novel stimuli with more naturalistic levels of information and noise, based primarily on face photographs, with consequently more ecological validity. Results: We report behavioral results indicating that sustained difficulty when participants test their hypotheses impacts matching-task performance, and diminishes the global precedence effect. Results also show a dissociation between subjectively experienced difficulty and objective dimension of performance, and establish the internal validity of the protocol. Contribution: We contribute an advance in the state of the art for testing global-local attention processes in concert with complex cognition. With three results we establish a connection between global-local dissociation and aspects of complex cognition. Our protocol also improves ecological validity and opens options for testing additional interactions in future work. PMID:26941689

  13. Factors affecting adoption, implementation fidelity, and sustainability of the Redesigned Community Health Fund in Tanzania: a mixed methods protocol for process evaluation in the Dodoma region

    PubMed Central

    Kalolo, Albino; Radermacher, Ralf; Stoermer, Manfred; Meshack, Menoris; De Allegri, Manuela

    2015-01-01

    Background Despite the implementation of various initiatives to address low enrollment in voluntary micro health insurance (MHI) schemes in sub-Saharan Africa, the problem of low enrollment remains unresolved. The lack of process evaluations of such interventions makes it difficult to ascertain whether their poor results are because of design failures or implementation weaknesses. Objective In this paper, we describe a process evaluation protocol aimed at opening the ‘black box’ to evaluate the implementation processes of the Redesigned Community Health Fund (CHF) program in the Dodoma region of Tanzania. Design The study employs a cross-sectional mixed methods design and is being carried out 3 years after the launch of the Redesigned CHF program. The study is grounded in a conceptual framework which rests on the Diffusion of Innovation Theory and the Implementation Fidelity Framework. The study utilizes a mixture of quantitative and qualitative data collection tools (questionnaires, focus group discussions, in-depth interviews, and document review), and aligns the evaluation to the Theory of Intervention developed by our team. Quantitative data will be used to measure program adoption, implementation fidelity, and their moderating factors. Qualitative data will be used to explore the responses of stakeholders to the intervention, contextual factors, and moderators of adoption, implementation fidelity, and sustainability. Discussion This protocol describes a systematic process evaluation in relation to the implementation of a reformed MHI. We trust that the theoretical approaches and methodologies described in our protocol may be useful to inform the design of future process evaluations focused on the assessment of complex interventions, such as MHI schemes. PMID:26679408

  14. REPHLEX II: An information management system for the ARS Water Data Base

    NASA Astrophysics Data System (ADS)

    Thurman, Jane L.

    1993-08-01

    The REPHLEX II computer system is an on-line information management system which allows scientists, engineers, and other researchers to retrieve data from the ARS Water Data Base using asynchronous communications. The system features two phone lines handling baud rates from 300 to 2400, customized menus to facilitate browsing, help screens, direct access to information and data files, electronic mail processing, file transfers using the XMODEM protocol, and log-in procedures which capture information on new users, process passwords, and log activity for a permanent audit trail. The primary data base on the REPHLEX II system is the ARS Water Data Base which consists of rainfall and runoff data from experimental agricultural watersheds located in the United States.

  15. ISPyB: an information management system for synchrotron macromolecular crystallography.

    PubMed

    Delagenière, Solange; Brenchereau, Patrice; Launer, Ludovic; Ashton, Alun W; Leal, Ricardo; Veyrier, Stéphanie; Gabadinho, José; Gordon, Elspeth J; Jones, Samuel D; Levik, Karl Erik; McSweeney, Seán M; Monaco, Stéphanie; Nanao, Max; Spruce, Darren; Svensson, Olof; Walsh, Martin A; Leonard, Gordon A

    2011-11-15

    Individual research groups now analyze thousands of samples per year at synchrotron macromolecular crystallography (MX) resources. The efficient management of experimental data is thus essential if the best possible experiments are to be performed and the best possible data used in downstream processes in structure determination pipelines. Information System for Protein crystallography Beamlines (ISPyB), a Laboratory Information Management System (LIMS) with an underlying data model allowing for the integration of analyses down-stream of the data collection experiment was developed to facilitate such data management. ISPyB is now a multisite, generic LIMS for synchrotron-based MX experiments. Its initial functionality has been enhanced to include improved sample tracking and reporting of experimental protocols, the direct ranking of the diffraction characteristics of individual samples and the archiving of raw data and results from ancillary experiments and post-experiment data processing protocols. This latter feature paves the way for ISPyB to play a central role in future macromolecular structure solution pipelines and validates the application of the approach used in ISPyB to other experimental techniques, such as biological solution Small Angle X-ray Scattering and spectroscopy, which have similar sample tracking and data handling requirements.

  16. Formal Logic and Flowchart for Diagnosis Validity Verification and Inclusion in Clinical Decision Support Systems

    NASA Astrophysics Data System (ADS)

    Sosa, M.; Grundel, L.; Simini, F.

    2016-04-01

    Logical reasoning has been part of medical practice since its origins. Modern Medicine has included information-intensive tools to refine diagnostics and treatment protocols. We are introducing formal logic teaching in Medical School prior to Clinical Internship, to foster medical practice. Two simple examples (Acute Myocardial Infarction and Diabetes Mellitus) are given in terms of formal logic expressions and truth tables. Flowcharts of both diagnostic processes help understand the procedures and validate them logically. A particularity of medical information is that it is often accompanied by "missing data", which suggests adapting formal logic to a three-valued ("three state") logic in the future. Medical Education must include formal logic to understand complex protocols and best practices, as well as their mutual interactions.
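
    The "three state" logic the authors anticipate can be sketched with Kleene's strong three-valued connectives, using None to stand for missing data. The diagnostic rule below is a simplified illustration of the idea, not the authors' actual Acute Myocardial Infarction flowchart.

```python
# Kleene strong three-valued logic: True, False, None ("missing data").
def k_and(a, b):
    if a is False or b is False:
        return False          # one definite False decides the conjunction
    if a is None or b is None:
        return None           # otherwise missing data propagates
    return True

def k_or(a, b):
    if a is True or b is True:
        return True           # one definite True decides the disjunction
    if a is None or b is None:
        return None
    return False

# Hypothetical rule: chest_pain AND (st_elevation OR troponin_high).
def ami_suspected(chest_pain, st_elevation, troponin_high):
    return k_and(chest_pain, k_or(st_elevation, troponin_high))

print(ami_suspected(True, None, True))   # -> True: missing ECG, but troponin decides
print(ami_suspected(True, None, None))   # -> None: inconclusive, order more tests
```

    The practical payoff is that "inconclusive" becomes a first-class outcome of the protocol rather than an error.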

  17. Time Synchronization and Distribution Mechanisms for Space Networks

    NASA Technical Reports Server (NTRS)

    Woo, Simon S.; Gao, Jay L.; Clare, Loren P.; Mills, David L.

    2011-01-01

    This work discusses research on the problems of synchronizing and distributing time information between spacecraft based on the Network Time Protocol (NTP), a standard time synchronization protocol widely used in terrestrial networks. The Proximity-1 Space Link Interleaved Time Synchronization (PITS) Protocol was designed and developed for synchronizing spacecraft that are in proximity, defined here as distances of less than 100,000 km. A particular application is synchronization between a Mars orbiter and rover; lunar scenarios as well as outer-planet deep space mother-ship-probe missions may also apply. The spacecraft with more accurate time information functions as a time server, and the other spacecraft functions as a time client. PITS can be easily integrated into and adapted to the CCSDS Proximity-1 Space Link Protocol with minor modifications. In particular, PITS can take advantage of the timestamping strategy that the underlying link layer functionality provides for accurate time offset calculation. The PITS algorithm achieves time synchronization with eight consecutive space network time packet exchanges between two spacecraft. PITS can detect and avoid possible errors from receiving duplicate and out-of-order packets by comparing them with the current state variables and timestamps. Further, PITS is able to detect error events and autonomously recover from unexpected events that can occur during the time synchronization and distribution process. This capability achieves an additional level of protocol protection on top of CRC or error correction codes. PITS is a lightweight and efficient protocol, eliminating the need for explicit frame sequence numbers and long buffer storage. The PITS protocol is capable of providing time synchronization and distribution services in a more general domain where multiple entities need to achieve time synchronization using a single point-to-point link.
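
    The abstract does not spell out PITS's offset arithmetic, but since PITS builds on NTP, the standard NTP four-timestamp exchange gives the flavor of how a time client estimates its offset from a time server. The timestamp values below are invented for illustration.

```python
def ntp_offset_delay(t1, t2, t3, t4):
    """Standard NTP four-timestamp exchange:
    t1: client transmit, t2: server receive,
    t3: server transmit, t4: client receive.
    Returns (clock offset, round-trip delay), assuming a symmetric link."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Hypothetical exchange: server clock runs 5 s ahead, one-way delay 2 s.
t1 = 100.0   # client transmit (client clock)
t2 = 107.0   # server receive  (server clock: 100 + 2 s delay + 5 s offset)
t3 = 107.5   # server transmit (server clock)
t4 = 104.5   # client receive  (client clock: 107.5 - 5 s offset + 2 s delay)
print(ntp_offset_delay(t1, t2, t3, t4))  # -> (5.0, 4.0)
```

    PITS repeats such exchanges (eight consecutive packet exchanges per the abstract) and can draw its timestamps from the link layer for better accuracy.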

  18. [Construction of chemical information database based on optical structure recognition technique].

    PubMed

    Lv, C Y; Li, M N; Zhang, L R; Liu, Z M

    2018-04-18

    To create a protocol that can be used to construct a chemical information database from the scientific literature quickly and automatically. Scientific literature, patents, and technical reports from different chemical disciplines were collected and stored in PDF format as fundamental datasets. Chemical structures were transformed from published documents and images into machine-readable data by using name conversion technology and the optical structure recognition tool CLiDE. In the process of molecular structure information extraction, Markush structures were enumerated into well-defined monomer molecules by means of the QueryTools in the molecule editor ChemDraw. The document management software EndNote X8 was applied to acquire bibliographical references involving title, author, journal, and year of publication. The text mining toolkit ChemDataExtractor was adopted to retrieve information from figures, tables, and textual paragraphs that could be used to populate the structured chemical database. After this step, detailed manual revision and annotation were conducted in order to ensure the accuracy and completeness of the data. In addition to the literature data, the computing simulation platform Pipeline Pilot 7.5 was utilized to calculate physical and chemical properties and predict molecular attributes. Furthermore, the open database ChEMBL was linked to fetch known bioactivities, such as indications and targets. After information extraction and data expansion, five separate metadata files were generated, including a molecular structure data file, molecular information, bibliographical references, predictable attributes, and known bioactivities. With the canonical simplified molecular-input line-entry specification (SMILES) as the primary key, the metadata files were associated through common key nodes, including molecular number and PDF number, to construct an integrated chemical information database. A reasonable construction protocol for a chemical information database was created successfully.
A total of 174 research articles and 25 reviews published in Marine Drugs from January 2015 to June 2016 were collected as the essential data source, and an elementary marine natural product database named PKU-MNPD was built in accordance with this protocol, containing 3,262 molecules and 19,821 records. This data aggregation protocol greatly improves the accuracy, comprehensiveness, and efficiency of constructing a chemical information database from original documents. The structured chemical information database can facilitate access to medical intelligence and accelerate the transformation of scientific research achievements.

  19. The View from Here: Emergence of Graphical Literacy

    ERIC Educational Resources Information Center

    Roberts, Kathryn L.; Brugar, Kristy A.

    2017-01-01

    The purpose of this study is to describe upper elementary students' understandings of four graphical devices that frequently occur in social studies texts: captioned images, maps, tables, and timelines. Using verbal protocol data collection procedures, we collected information on students' metacognitive processes when they were explicitly asked to…

  20. Overview of a Linguistic Theory of Design. AI Memo 383A.

    ERIC Educational Resources Information Center

    Miller, Mark L.; Goldstein, Ira P.

    The SPADE theory, which uses linguistic formalisms to model the planning and debugging processes of computer programming, was simultaneously developed and tested in three separate contexts--computer uses in education, automatic programming (a traditional artificial intelligence arena), and protocol analysis (the domain of information processing…

  1. Improving specialist drug prescribing in primary care using task and error analysis: an observational study.

    PubMed

    Chana, Narinder; Porat, Talya; Whittlesea, Cate; Delaney, Brendan

    2017-03-01

    Electronic prescribing has benefited from computerised clinical decision support systems (CDSSs); however, no published studies have evaluated the potential for a CDSS to support GPs in prescribing specialist drugs. To identify potential weaknesses and errors in the existing process of prescribing specialist drugs that could be addressed in the development of a CDSS. Semi-structured interviews with key informants followed by an observational study involving GPs in the UK. Twelve key informants were interviewed to investigate the use of CDSSs in the UK. Nine GPs were observed while performing case scenarios depicting requests from hospitals or patients to prescribe a specialist drug. Activity diagrams, hierarchical task analysis, and systematic human error reduction and prediction approach analyses were performed. The current process of prescribing specialist drugs by GPs is prone to error. Errors of omission due to lack of information were the most common errors, which could potentially result in a GP prescribing a specialist drug that should only be prescribed in hospitals, or prescribing a specialist drug without reference to a shared care protocol. Half of all possible errors in the prescribing process had a high probability of occurrence. A CDSS supporting GPs during the process of prescribing specialist drugs is needed. This could, first, support the decision making of whether or not to undertake prescribing, and, second, provide drug-specific parameters linked to shared care protocols, which could reduce the errors identified and increase patient safety. © British Journal of General Practice 2017.

  2. Semi-Structured Interview Protocol for Constructing Logic Models

    ERIC Educational Resources Information Center

    Gugiu, P. Cristian; Rodriguez-Campos, Liliana

    2007-01-01

    This paper details a semi-structured interview protocol that evaluators can use to develop a logic model of a program's services and outcomes. The protocol presents a series of questions, which evaluators can ask of specific program informants, that are designed to: (1) identify key informants' basic background and contextual information, (2)…

  3. Assisted closed-loop optimization of SSVEP-BCI efficiency

    PubMed Central

    Fernandez-Vargas, Jacobo; Pfaff, Hanns U.; Rodríguez, Francisco B.; Varona, Pablo

    2012-01-01

    We designed a novel assisted closed-loop optimization protocol to improve the efficiency of brain-computer interfaces (BCI) based on steady state visually evoked potentials (SSVEP). In traditional paradigms, control over BCI performance depends entirely on the subjects' ability to learn from the given feedback cues. By contrast, in the proposed protocol both the subject and the machine share information and control over the BCI goal. The assistance consists of delivering online information together with online adaptation of the BCI stimulus properties. In our case, this adaptive optimization process is realized by (1) a closed-loop search for the best set of SSVEP flicker frequencies and (2) feedback of actual SSVEP magnitudes to both the subject and the machine. These closed-loop interactions between subject and machine are evaluated in real time by continuous measurement of their efficiencies, which serve as online criteria for adapting the BCI control parameters. The proposed protocol aims to compensate for variability in possibly unknown state and trait dimensions of the subjects. In a study with N = 18 subjects, we found significant evidence that our protocol outperformed classic SSVEP-BCI control paradigms. Evidence is presented that it indeed takes interindividual variability into account: e.g., under the new protocol, baseline resting state EEG measures predict subjects' BCI performances. This paper illustrates the promising potential of assisted closed-loop protocols in BCI systems. Their applicability might also be expanded to innovative uses, e.g., as possible new diagnostic/therapeutic tools for clinical contexts and as new paradigms for basic research. PMID:23443214
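
    The closed-loop frequency search described above can be caricatured in a few lines. This is a hedged sketch, not the authors' algorithm: the candidate frequencies, the pruning schedule, and the `fake_magnitude` response curve are all invented stand-ins for measured SSVEP magnitudes.

```python
# Toy sketch of the assisted closed-loop idea: the machine iteratively narrows
# a set of candidate flicker frequencies to the one evoking the strongest
# response. The response values are synthetic stand-ins for EEG measurements.

def select_best_frequency(candidates, measure, keep_ratio=0.5, rounds=3):
    """Repeatedly measure responses and discard the weakest frequencies."""
    pool = list(candidates)
    for _ in range(rounds):
        if len(pool) == 1:
            break
        scored = sorted(pool, key=measure, reverse=True)
        pool = scored[:max(1, int(len(scored) * keep_ratio))]
    return pool[0]

# Synthetic response curve peaking at 15 Hz (purely illustrative).
def fake_magnitude(freq_hz):
    return 1.0 / (1.0 + abs(freq_hz - 15.0))

best = select_best_frequency([8, 10, 12, 15, 20, 30], fake_magnitude)
```

    In the real protocol the `measure` callback would be replaced by online SSVEP magnitude estimates fed back to both subject and machine.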

  4. Assisted closed-loop optimization of SSVEP-BCI efficiency.

    PubMed

    Fernandez-Vargas, Jacobo; Pfaff, Hanns U; Rodríguez, Francisco B; Varona, Pablo

    2013-01-01

    We designed a novel assisted closed-loop optimization protocol to improve the efficiency of brain-computer interfaces (BCI) based on steady state visually evoked potentials (SSVEP). In traditional paradigms, control over BCI performance depends entirely on the subjects' ability to learn from the given feedback cues. By contrast, in the proposed protocol both the subject and the machine share information and control over the BCI goal. The assistance consists of delivering online information together with online adaptation of the BCI stimulus properties. In our case, this adaptive optimization process is realized by (1) a closed-loop search for the best set of SSVEP flicker frequencies and (2) feedback of actual SSVEP magnitudes to both the subject and the machine. These closed-loop interactions between subject and machine are evaluated in real time by continuous measurement of their efficiencies, which serve as online criteria for adapting the BCI control parameters. The proposed protocol aims to compensate for variability in possibly unknown state and trait dimensions of the subjects. In a study with N = 18 subjects, we found significant evidence that our protocol outperformed classic SSVEP-BCI control paradigms. Evidence is presented that it indeed takes interindividual variability into account: e.g., under the new protocol, baseline resting state EEG measures predict subjects' BCI performances. This paper illustrates the promising potential of assisted closed-loop protocols in BCI systems. Their applicability might also be expanded to innovative uses, e.g., as possible new diagnostic/therapeutic tools for clinical contexts and as new paradigms for basic research.

  5. [Design and piloting of a structured service medication dispensing process].

    PubMed

    Abaurre, Raquel; García-Delgado, Pilar; Maurandi, M Dolores; Arrebola, Cristóbal; Gastelurrutia, Miguel Ángel; Martínez-Martínez, Fernando

    2015-01-01

    The aim of this article is to design and pilot a protocol for the medication dispensing service. Using the requirements proposed in the Ministry of Health Pharmaceutical Care Consensus, a literature search was performed, applying qualitative consensus techniques. An observational, cross-sectional study was conducted from March to June 2009 in a total of 53 community pharmacies from 24 Spanish provinces. Participants were patients who requested one or more particular medications, with or without a medical prescription, for their own use or for someone in their care. The personalised medication information (IPM), the problems associated with the medications (PRM), and the negative results associated with the medication (RNM) detected by the pharmacist each time medication was dispensed were recorded, as well as the pharmacist's perception of the operability of the protocol. A total of 870 medications were dispensed, with 423 (48.6%) cases of lack of personalised medication information (IPM) being detected. PRM were detected in 10.11% of the dispensed medications, as well as 68 (7.81%) suspected RNM: safety (n = 35; 51.5%), effectiveness (n = 29; 42.6%) and necessity (n = 4; 5.8%). Almost two-thirds (65.21%) of the pharmacists considered the protocol workable. The designed protocol helped to detect deficiencies in the information given to patients about their medications, as well as the PRM and RNM, and is shown to be a tool that is easy to use and apply. Copyright © 2013 Elsevier España, S.L.U. All rights reserved.

  6. Protocol for evaluation of the cost-effectiveness of ePrescribing systems and candidate prototype for other related health information technologies

    PubMed Central

    2014-01-01

    Background This protocol concerns the assessment of cost-effectiveness of hospital health information technology (HIT) in four hospitals. Two of these hospitals are acquiring ePrescribing systems incorporating extensive decision support, while the other two will implement systems incorporating more basic clinical algorithms. Implementation of an ePrescribing system will have diffuse effects over myriad clinical processes, so the protocol has to deal with a large amount of information collected at various ‘levels’ across the system. Methods/Design The method we propose is use of Bayesian ideas as a philosophical guide. Assessment of cost-effectiveness requires a number of parameters in order to measure incremental cost utility or benefit – the effectiveness of the intervention in reducing frequency of preventable adverse events; utilities for these adverse events; costs of HIT systems; and cost consequences of adverse events averted. There is no single end-point that adequately and unproblematically captures the effectiveness of the intervention; we therefore plan to observe changes in error rates and adverse events in four error categories (death, permanent disability, moderate disability, minimal effect). For each category we will elicit and pool subjective probability densities from experts for reductions in adverse events, resulting from deployment of the intervention in a hospital with extensive decision support. The experts will have been briefed with quantitative and qualitative data from the study and external data sources prior to elicitation. Following this, there will be a process of deliberative dialogues so that experts can “re-calibrate” their subjective probability estimates. The consolidated densities assembled from the repeat elicitation exercise will then be used to populate a health economic model, along with salient utilities. The credible limits from these densities can define thresholds for sensitivity analyses. 
Discussion The protocol we present here was designed for evaluation of ePrescribing systems. However, the methodology we propose could be used whenever research cannot provide a direct and unbiased measure of comparative effectiveness. PMID:25038609
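
    The elicitation-and-pooling step can be illustrated with a toy example. The linear opinion pool below is one standard way to consolidate expert densities; whether it matches the pooling method the protocol ultimately used is an assumption, and all numbers are invented.

```python
# Illustrative sketch of pooling expert opinion: each expert supplies a
# discretized probability distribution over the relative reduction in adverse
# events, and a weighted linear opinion pool averages them. The credible
# interval from the pooled density can then bound sensitivity analyses.

def linear_pool(expert_pmfs, weights=None):
    """Weighted average of discrete distributions defined on the same grid."""
    n = len(expert_pmfs)
    weights = weights or [1.0 / n] * n
    pooled = {}
    for pmf, w in zip(expert_pmfs, weights):
        for x, p in pmf.items():
            pooled[x] = pooled.get(x, 0.0) + w * p
    return pooled

def credible_interval(pmf, level=0.9):
    """Central credible interval from a discrete pooled distribution."""
    tail = (1.0 - level) / 2.0
    xs = sorted(pmf)
    cum, lo, hi = 0.0, xs[0], xs[-1]
    for x in xs:
        if cum < tail <= cum + pmf[x]:
            lo = x
        cum += pmf[x]
        if cum >= 1.0 - tail:
            hi = x
            break
    return lo, hi

# Two hypothetical experts on the fractional reduction in adverse events.
e1 = {0.1: 0.2, 0.2: 0.5, 0.3: 0.3}
e2 = {0.1: 0.4, 0.2: 0.4, 0.3: 0.2}
pooled = linear_pool([e1, e2])
```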

  7. Analysis and Improvement of Large Payload Bidirectional Quantum Secure Direct Communication Without Information Leakage

    NASA Astrophysics Data System (ADS)

    Liu, Zhi-Hao; Chen, Han-Wu

    2018-02-01

    As we know, the information leakage problem should be avoided in a secure quantum communication protocol. Unfortunately, it is found that this problem does exist in the large payload bidirectional quantum secure direct communication (BQSDC) protocol (Ye Int. J. Quantum. Inf. 11(5), 1350051 2013) which is based on entanglement swapping between any two Greenberger-Horne-Zeilinger (GHZ) states. To be specific, one half of the information interchanged in this protocol is leaked out unconsciously without any active attack from an eavesdropper. Afterward, this BQSDC protocol is revised to the one without information leakage. It is shown that the improved BQSDC protocol is secure against the general individual attack and has some obvious features compared with the original one.

  8. Do we need 3D tube current modulation information for accurate organ dosimetry in chest CT? Protocols dose comparisons.

    PubMed

    Lopez-Rendon, Xochitl; Zhang, Guozhi; Coudyzer, Walter; Develter, Wim; Bosmans, Hilde; Zanca, Federica

    2017-11-01

    To compare the lung and breast dose associated with three chest protocols: standard, organ-based tube current modulation (OBTCM) and fast-speed scanning; and to estimate the error associated with organ dose when modelling the longitudinal (z-)TCM versus the 3D-TCM in Monte Carlo (MC) simulations for these three protocols. Five adult and three paediatric cadavers with different BMI were scanned. The CTDIvol of the OBTCM and fast-speed protocols was matched to the patient-specific CTDIvol of the standard protocol. Lung and breast doses were estimated using MC simulations with both the z- and 3D-TCM modelled, and compared between protocols. The fast-speed scanning protocol delivered the highest doses. A slight reduction in breast dose (up to 5.1%) was observed for two of the three female cadavers with the OBTCM in comparison to the standard protocol. For both adult and paediatric cadavers, using the z-TCM data only for organ dose estimation yielded doses accurate to within 10.0% for the standard and fast-speed protocols, while relative dose differences were up to 15.3% for the OBTCM protocol. At identical CTDIvol values, the standard protocol delivered the lowest overall doses. Only for the OBTCM protocol is the 3D-TCM needed if accurate (<10.0%) organ dosimetry is desired. • The z-TCM information is sufficient for accurate dosimetry for standard protocols. • The z-TCM information is sufficient for accurate dosimetry for fast-speed scanning protocols. • For organ-based TCM schemes, the 3D-TCM information is necessary for accurate dosimetry. • At identical CTDIvol, the fast-speed scanning protocol delivered the highest doses. • Lung dose was higher with XCare than with the standard protocol at identical CTDIvol.

  9. Experimental verification of multipartite entanglement in quantum networks

    PubMed Central

    McCutcheon, W.; Pappa, A.; Bell, B. A.; McMillan, A.; Chailloux, A.; Lawson, T.; Mafu, M.; Markham, D.; Diamanti, E.; Kerenidis, I.; Rarity, J. G.; Tame, M. S.

    2016-01-01

    Multipartite entangled states are a fundamental resource for a wide range of quantum information processing tasks. In particular, in quantum networks, it is essential for the parties involved to be able to verify if entanglement is present before they carry out a given distributed task. Here we design and experimentally demonstrate a protocol that allows any party in a network to check if a source is distributing a genuinely multipartite entangled state, even in the presence of untrusted parties. The protocol remains secure against dishonest behaviour of the source and other parties, including the use of system imperfections to their advantage. We demonstrate the verification protocol in a three- and four-party setting using polarization-entangled photons, highlighting its potential for realistic photonic quantum communication and networking applications. PMID:27827361

  10. Probabilistic Analysis of Hierarchical Cluster Protocols for Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Kaj, Ingemar

    Wireless sensor networks are designed to extract data from the deployment environment and combine sensing, data processing and wireless communication to provide useful information for the network users. Hundreds or thousands of small embedded units, which operate under low-energy supply and with limited access to central network control, rely on interconnecting protocols to coordinate data aggregation and transmission. Energy efficiency is crucial and it has been proposed that cluster based and distributed architectures such as LEACH are particularly suitable. We analyse the random cluster hierarchy in this protocol and provide a solution for low-energy and limited-loss optimization. Moreover, we extend these results to a multi-level version of LEACH, where clusters of nodes again self-organize to form clusters of clusters, and so on.
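
    The LEACH mechanism this analysis builds on is compact enough to sketch. The threshold formula T(n) = p/(1 - p·(r mod 1/p)) is the published LEACH election rule; the network size and random seed below are arbitrary.

```python
import random

# Sketch of LEACH's randomized cluster-head election. Each round r, a node
# that has not yet served as cluster head in the current epoch elects itself
# with probability T(n) = p / (1 - p * (r mod 1/p)), where p is the desired
# cluster-head fraction. The threshold grows over the epoch so that every
# node eventually serves, balancing energy use.

def leach_threshold(p, r):
    return p / (1.0 - p * (r % int(round(1.0 / p))))

def elect_cluster_heads(node_ids, eligible, p, r, rng):
    """Return the set of nodes that elect themselves cluster head this round."""
    t = leach_threshold(p, r)
    return {n for n in node_ids if n in eligible and rng.random() < t}

rng = random.Random(42)
heads = elect_cluster_heads(range(100), set(range(100)), p=0.1, r=0, rng=rng)
```

    Note that by round r = 1/p - 1 the threshold reaches 1, so every remaining eligible node elects itself; the hierarchical (multi-level) variant applies the same rule again among the elected heads.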

  11. Ethical dilemmas in genetic testing: examples from the Cuban program for predictive diagnosis of hereditary ataxias.

    PubMed

    Mariño, Tania Cruz; Armiñán, Rubén Reynaldo; Cedeño, Humberto Jorge; Mesa, José Miguel Laffita; Zaldivar, Yanetza González; Rodríguez, Raúl Aguilera; Santos, Miguel Velázquez; Mederos, Luis Enrique Almaguer; Herrera, Milena Paneque; Pérez, Luis Velázquez

    2011-06-01

    Predictive testing protocols are intended to help patients affected with hereditary conditions understand their condition and make informed reproductive choices. However, predictive protocols may expose clinicians and patients to ethical dilemmas that interfere with genetic counseling and the decision making process. This paper describes ethical dilemmas in a series of five cases involving predictive testing for hereditary ataxias in Cuba. The examples herein present evidence of the deeply controversial situations faced by both individuals at risk and professionals in charge of these predictive studies, suggesting a need for expanded guidelines to address such complexities.

  12. Systems Imaging of the Immune Synapse.

    PubMed

    Ambler, Rachel; Ruan, Xiangtao; Murphy, Robert F; Wülfing, Christoph

    2017-01-01

    Three-dimensional live cell imaging of the interaction of T cells with antigen-presenting cells (APCs) visualizes the subcellular distributions of signaling intermediates during T cell activation at thousands of resolved positions within a cell. These information-rich maps of local protein concentrations are a valuable resource in understanding T cell signaling. Here, we describe a protocol for the efficient acquisition of such imaging data and their computational processing to create four-dimensional maps of local concentrations. This protocol allows quantitative analysis of T cell signaling as it occurs inside live cells with resolution in time and space across thousands of cells.

  13. Improving security of the ping-pong protocol

    NASA Astrophysics Data System (ADS)

    Zawadzki, Piotr

    2013-01-01

    A security layer for the asymptotically secure ping-pong protocol is proposed and analyzed in the paper. The operation of the improvement exploits inevitable errors introduced by the eavesdropping in the control and message modes. Its role is similar to the privacy amplification algorithms known from the quantum key distribution schemes. Messages are processed in blocks which guarantees that an eavesdropper is faced with a computationally infeasible problem as long as the system parameters are within reasonable limits. The introduced additional information preprocessing does not require quantum memory registers and confidential communication is possible without prior key agreement or some shared secret.
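
    As a purely illustrative analogue of the block-wise preprocessing idea (this is not Zawadzki's construction), one can compress blocks of raw bits to parity bits, so that an eavesdropper who knows only some of the raw bits learns little about the shorter output:

```python
# Toy privacy-amplification-style compression: each block of raw bits is
# reduced to its parity. An eavesdropper missing even one bit of a block
# has no information about that block's output bit.

def parity(bits):
    acc = 0
    for b in bits:
        acc ^= b
    return acc

def amplify(raw_bits, block_size=3):
    """Compress each full block of raw bits to a single parity bit."""
    return [parity(raw_bits[i:i + block_size])
            for i in range(0, len(raw_bits) - block_size + 1, block_size)]

key = amplify([1, 0, 1, 1, 1, 0, 0, 0, 1])
```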

  14. “Quantumness” versus “classicality” of quantum states and quantum protocols

    NASA Astrophysics Data System (ADS)

    Brodutch, Aharon; Groisman, Berry; Kenigsberg, Dan; Mor, Tal

    Entanglement is one of the pillars of quantum mechanics and quantum information processing, and as a result, the quantumness of nonentangled states has typically been overlooked and unrecognized until the last decade. We give a robust definition for the classicality versus quantumness of a single multipartite quantum state, a set of states, and a protocol using quantum states. We show a variety of nonentangled (separable) states that exhibit interesting quantum properties, and we explore the “zoo” of separable states; several interesting subclasses are defined based on the diagonalizing bases of the states, and their nonclassical behavior is investigated.

  15. Contextual information management: An example of independent-checking in the review of laboratory-based bloodstain pattern analysis.

    PubMed

    Osborne, Nikola K P; Taylor, Michael C

    2018-05-01

    This article describes a New Zealand forensic agency's contextual information management protocol for bloodstain pattern evidence examined in the laboratory. In an effort to create a protocol that would have minimal impact on current work-flow, while still effectively removing task-irrelevant contextual information, the protocol was designed following an in-depth consultation with management and forensic staff. The resulting design was for a protocol of independent-checking (i.e. blind peer-review) where the checker's interpretation of the evidence is conducted in the absence of case information and the original examiner's notes or interpretation(s). At the conclusion of a ten-case trial period, there was widespread agreement that the protocol had minimal impact on the number of people required, the cost, or the time to complete an item examination. The agency is now looking to adopt the protocol into standard operating procedures and in some cases the protocol has been extended to cover other laboratory-based examinations (e.g. fabric damage, shoeprint examination, and physical fits). The protocol developed during this trial provides a useful example for agencies seeking to adopt contextual information management into their workflow. Copyright © 2018 The Chartered Society of Forensic Sciences. Published by Elsevier B.V. All rights reserved.

  16. Design and Analysis of A Beacon-Less Routing Protocol for Large Volume Content Dissemination in Vehicular Ad Hoc Networks.

    PubMed

    Hu, Miao; Zhong, Zhangdui; Ni, Minming; Baiocchi, Andrea

    2016-11-01

    Large volume content dissemination is pursued by the growing number of high-quality applications for Vehicular Ad hoc NETworks (VANETs), e.g., live road surveillance and video-based overtaking assistance services. For the highly dynamic vehicular network topology, beacon-less routing protocols have proven efficient in balancing system performance against control overhead. However, to the authors' best knowledge, routing design for large volume content has not been well considered in previous work, and it introduces new challenges, e.g., an enhanced connectivity requirement for a radio link. In this paper, a link Lifetime-aware Beacon-less Routing Protocol (LBRP) is designed for large volume content delivery in VANETs. Each vehicle makes its forwarding decision based on the message header information and its current state, including speed and position. A semi-Markov process analytical model is proposed to evaluate the expected delay in constructing one routing path for LBRP. Simulations show that the proposed LBRP scheme outperforms traditional dissemination protocols in providing a low end-to-end delay, and the analytical model's delay estimates match Monte Carlo simulations well.
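
    The notion of link lifetime that a lifetime-aware forwarding decision relies on can be sketched with a one-dimensional constant-speed model. This simplified geometry is an assumption for illustration, not the formula used in the paper:

```python
# Link lifetime between two vehicles on a straight road: given positions,
# constant speeds, and a radio range R, compute how long until their
# separation exceeds R (i.e., the radio link breaks).

def link_lifetime(x1, v1, x2, v2, radio_range):
    """Seconds until two vehicles moving at constant speed lose connectivity."""
    gap = x2 - x1
    rel_v = v2 - v1
    if rel_v == 0:
        return float('inf') if abs(gap) <= radio_range else 0.0
    # Solve |gap + rel_v * t| = radio_range for the latest non-negative t.
    t_exit = max((radio_range - gap) / rel_v, (-radio_range - gap) / rel_v)
    return max(t_exit, 0.0)

# Follower at 0 m doing 30 m/s, leader at 100 m doing 25 m/s, range 300 m.
t = link_lifetime(0.0, 30.0, 100.0, 25.0, 300.0)
```

    A forwarder would prefer next hops whose predicted lifetime comfortably exceeds the time needed to transfer the large content chunk.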

  17. Design and Analysis of A Beacon-Less Routing Protocol for Large Volume Content Dissemination in Vehicular Ad Hoc Networks

    PubMed Central

    Hu, Miao; Zhong, Zhangdui; Ni, Minming; Baiocchi, Andrea

    2016-01-01

    Large volume content dissemination is pursued by the growing number of high-quality applications for Vehicular Ad hoc NETworks (VANETs), e.g., live road surveillance and video-based overtaking assistance services. For the highly dynamic vehicular network topology, beacon-less routing protocols have proven efficient in balancing system performance against control overhead. However, to the authors' best knowledge, routing design for large volume content has not been well considered in previous work, and it introduces new challenges, e.g., an enhanced connectivity requirement for a radio link. In this paper, a link Lifetime-aware Beacon-less Routing Protocol (LBRP) is designed for large volume content delivery in VANETs. Each vehicle makes its forwarding decision based on the message header information and its current state, including speed and position. A semi-Markov process analytical model is proposed to evaluate the expected delay in constructing one routing path for LBRP. Simulations show that the proposed LBRP scheme outperforms traditional dissemination protocols in providing a low end-to-end delay, and the analytical model's delay estimates match Monte Carlo simulations well. PMID:27809285

  18. Field validation of protocols developed to evaluate in-line mastitis detection systems.

    PubMed

    Kamphuis, C; Dela Rue, B T; Eastwood, C R

    2016-02-01

    This paper reports on a field validation of previously developed protocols for evaluating the performance of in-line mastitis-detection systems. The protocols outlined 2 requirements of these systems: (1) to detect cows with clinical mastitis (CM) promptly and accurately to enable timely and appropriate treatment and (2) to identify cows with high somatic cell count (SCC) to manage bulk milk SCC levels. Gold standard measures, evaluation tests, performance measures, and performance targets were proposed. The current study validated the protocols on commercial dairy farms with automated in-line mastitis-detection systems using both electrical conductivity (EC) and SCC sensor systems that both monitor at whole-udder level. The protocol for requirement 1 was applied on 3 commercial farms. For requirement 2, the protocol was applied on 6 farms; 3 of them had low bulk milk SCC (128×10³ cells/mL) and were the same farms as used for field evaluation of requirement 1. Three farms with high bulk milk SCC (270×10³ cells/mL) were additionally enrolled. The field evaluation methodology and results were presented at a workshop including representation from 7 international suppliers of in-line mastitis-detection systems. Feedback was sought on the acceptance of standardized performance evaluation protocols and recommended refinements to the protocols. Although the methodology for requirement 1 was relatively labor intensive and required organizational skills over an extended period, no major issues were encountered during the field validation of both protocols. The validation, thus, proved the protocols to be practical. Also, no changes to the data collection process were recommended by the technology supplier representatives.
However, 4 recommendations were made to refine the protocols: inclusion of an additional analysis that ignores small (low-density) clot observations in the definition of CM, extension of the time window from 4 to 5 milkings for timely alerts for CM, setting a maximum number of 10 milkings for the time window to detect a CM episode, and presentation of sensitivity for a larger range of false alerts per 1,000 milkings replacing minimum performance targets. The recommended refinements are discussed with suggested changes to the original protocols. The information presented is intended to inform further debate toward achieving international agreement on standard protocols to evaluate performance of in-line mastitis-detection systems. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
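
    The two headline performance measures in these protocols, sensitivity for clinical mastitis and false alerts per 1,000 milkings, can be computed as below. The alert/outcome data are invented for illustration:

```python
# Detection metrics for an in-line mastitis alert system, computed from
# per-milking (alerted, has_mastitis) pairs: sensitivity = TP / (TP + FN),
# and false alerts normalized per 1,000 milkings.

def detection_metrics(records):
    """records: list of (alerted: bool, has_mastitis: bool) per milking."""
    tp = sum(1 for a, m in records if a and m)
    fn = sum(1 for a, m in records if not a and m)
    fp = sum(1 for a, m in records if a and not m)
    n = len(records)
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    false_alerts_per_1000 = 1000.0 * fp / n if n else 0.0
    return sensitivity, false_alerts_per_1000

# 2,000 milkings: 8 true cases detected, 2 missed, 12 false alerts.
records = ([(True, True)] * 8 + [(False, True)] * 2 +
           [(True, False)] * 12 + [(False, False)] * 1978)
sens, fa = detection_metrics(records)
```

    Reporting sensitivity across a range of false-alert rates, as the refined protocols recommend, amounts to sweeping the alert threshold and recomputing these two numbers.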

  19. The Native Plant Propagation Protocol Database: 16 years of sharing information

    Treesearch

    R. Kasten Dumroese; Thomas D. Landis

    2016-01-01

    The Native Plant Propagation Protocol Database was launched in 2001 to provide an online mechanism for sharing information about growing native plants. It relies on plant propagators to upload their protocols (detailed directions for growing particular native plants) so that others may benefit from their experience. Currently the database has nearly 3000 protocols and...

  20. The implementation of a new Malaria Treatment Protocol in Timor-Leste: challenges and constraints

    PubMed Central

    Martins, João Soares; Zwi, Anthony B; Hobday, Karen; Bonaparte, Fernando; Kelly, Paul M

    2012-01-01

    Background Timor-Leste changed its malaria treatment protocol in 2007, replacing the first-line for falciparum malaria from sulphadoxine-pyrimethamine to artemether-lumefantrine. This study explored the factors affecting the implementation of the revised treatment protocol, with an emphasis on identifying key constraints. Methods A mixed method approach drew on both qualitative and quantitative data. The study included data from District Health Services in seven districts, community health centres in 14 sub-districts, four hospitals, five private clinics, one private pharmacy and the country's autonomous medical store. In-depth interviews with 36 key informants, five group interviews and 15 focus group discussions were conducted. A survey was also undertaken at community health centres and hospitals to assess the availability of a physical copy of the Malaria Treatment Protocol, as well as the availability and utilization of artemether-lumefantrine and sulphadoxine-pyrimethamine. Results Many factors impeded the implementation of the new malaria protocol. These included: inadequate introduction and training around the revised treatment protocol; unclear phasing out of sulphadoxine-pyrimethamine and phasing in of the revised treatment, artemether-lumefantrine, and the rapid diagnostic test (RDT); lack of supervision; lack of adherence to the revised guidelines by foreign health workers; lack of access to the new drug by the private sector; obstacles in the procurement process; and the use of trade names rather than generic drug description. Insufficient understanding of the rapid diagnostic test and the untimely supply of drugs further hampered implementation. Conclusion To effectively implement a revised malaria treatment protocol, barriers should be identified during the policy formulation process and those emerging during implementation should be recognized promptly and addressed. PMID:22460007

  1. A General Self-Organized Tree-Based Energy-Balance Routing Protocol for Wireless Sensor Network

    NASA Astrophysics Data System (ADS)

    Han, Zhao; Wu, Jie; Zhang, Jie; Liu, Liefeng; Tian, Kaiyun

    2014-04-01

    Wireless sensor network (WSN) is a system composed of a large number of low-cost micro-sensors. This network is used to collect and send various kinds of messages to a base station (BS). WSN consists of low-cost nodes with limited battery power, and battery replacement is not easy for a WSN with thousands of physically embedded nodes, which means an energy-efficient routing protocol should be employed to extend the network lifetime. To achieve this aim, we need not only to minimize total energy consumption but also to balance the WSN load. Researchers have proposed many protocols such as LEACH, HEED, PEGASIS, TBC and PEDAP. In this paper, we propose a General Self-Organized Tree-Based Energy-Balance routing protocol (GSTEB) which builds a routing tree using a process where, for each round, the BS assigns a root node and broadcasts this selection to all sensor nodes. Subsequently, each node selects its parent by considering only its own and its neighbors' information, thus making GSTEB a dynamic protocol. Simulation results show that GSTEB performs better than other protocols in balancing energy consumption, thus prolonging the lifetime of WSN.
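
    A minimal sketch of the per-round tree construction: the BS fixes a root, then each node picks a parent using only neighborhood information. The parent-selection criterion here (highest remaining energy among neighbors closer to the root) is a plausible stand-in, not necessarily GSTEB's exact metric:

```python
import math

# Per-round routing-tree construction using only local information: each
# non-root node considers neighbors within radio range that are closer to
# the root than itself, and picks the one with the most remaining energy.

def build_tree(nodes, root, radio_range):
    """nodes: {id: (x, y, energy)}. Returns {child_id: parent_id}."""
    def dist(a, b):
        return math.hypot(nodes[a][0] - nodes[b][0], nodes[a][1] - nodes[b][1])
    parent = {}
    for n in nodes:
        if n == root:
            continue
        candidates = [m for m in nodes
                      if m != n and dist(n, m) <= radio_range
                      and dist(m, root) < dist(n, root)]
        if candidates:
            parent[n] = max(candidates, key=lambda m: nodes[m][2])
    return parent

# Four nodes: (x, y, remaining energy); node 0 is this round's root.
nodes = {0: (0, 0, 9.0), 1: (1, 0, 5.0), 2: (1, 1, 7.0), 3: (2, 1, 6.0)}
tree = build_tree(nodes, root=0, radio_range=1.5)
```

    Re-running this each round with a different root, as the abstract describes, is what spreads the relaying load across the network.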

  2. Verification and validation of a reliable multicast protocol

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.

    1995-01-01

    This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.
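
    The V&V team's technique of deriving test cases from state-transition paths can be shown on a toy state machine. The handshake states below are hypothetical, not RMP's actual model:

```python
# Derive test cases from paths through a state model: each event sequence
# reachable from the start state (up to a depth bound) becomes one test case
# to exercise against the implementation.

def paths(transitions, start, max_depth):
    """Enumerate all non-empty transition paths from `start` up to max_depth."""
    out = []
    def walk(state, path):
        out.append(path)
        if len(path) >= max_depth:
            return
        for event, nxt in transitions.get(state, []):
            walk(nxt, path + [event])
    walk(start, [])
    return [p for p in out if p]

# Hypothetical membership handshake: {state: [(event, next_state), ...]}.
transitions = {
    'closed': [('open', 'syncing')],
    'syncing': [('ack', 'member'), ('timeout', 'closed')],
    'member': [('leave', 'closed')],
}
tests = paths(transitions, 'closed', max_depth=3)
```

    Each returned event sequence is then replayed against the implementation, and any divergence signals a fidelity gap between model and code.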

  3. CT protocol management: simplifying the process by using a master protocol concept.

    PubMed

    Szczykutowicz, Timothy P; Bour, Robert K; Rubert, Nicholas; Wendt, Gary; Pozniak, Myron; Ranallo, Frank N

    2015-07-08

    This article explains a method for creating CT protocols for a wide range of patient body sizes and clinical indications, using detailed tube current information from a small set of commonly used protocols. Analytical expressions were created relating CT technical acquisition parameters which can be used to create new CT protocols on a given scanner or customize protocols from one scanner to another. Plots of mA as a function of patient size for specific anatomical regions were generated and used to identify the tube output needs for patients as a function of size for a single master protocol. Tube output data were obtained from the DICOM header of clinical images from our PACS and patient size was measured from CT localizer radiographs under IRB approval. This master protocol was then used to create 11 additional master protocols. The 12 master protocols were further combined to create 39 single and multiphase clinical protocols. Radiologist acceptance rate of exams scanned using the clinical protocols was monitored for 12,857 patients to analyze the effectiveness of the presented protocol management methods using a two-tailed Fisher's exact test. A single routine adult abdominal protocol was used as the master protocol to create 11 additional master abdominal protocols of varying dose and beam energy. Situations in which the maximum tube current would have been exceeded are presented, and the trade-offs between increasing the effective tube output via 1) decreasing pitch, 2) increasing the scan time, or 3) increasing the kV are discussed. Out of 12 master protocols customized across three different scanners, only one had a statistically significant acceptance rate that differed from the scanner it was customized from. The difference, however, was only 1% and was judged to be negligible. All other master protocols differed in acceptance rate insignificantly between scanners. 
The methodology described in this paper allows a small set of master protocols to be adapted among different clinical indications on a single scanner and among different CT scanners.
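
    The pitch and scan-time trade-offs mentioned above follow from the standard effective-mAs relation. A minimal sketch, with a hypothetical tube-current limit (not the paper's actual model):

```python
# Illustrative sketch, not the paper's model: the standard relation
# effective mAs = mA * rotation_time / pitch shows how the trade-offs
# recover a dose target once a (hypothetical) tube-current limit is hit.

def effective_mas(ma, rotation_time_s, pitch):
    """Effective tube current-time product, normalized by pitch."""
    return ma * rotation_time_s / pitch

MA_MAX = 800  # hypothetical scanner tube-current limit, for illustration

# Target output that would require 1000 mA at 0.5 s rotation, pitch 1.0.
target = effective_mas(1000, 0.5, 1.0)

# Option 1: hold mA at the limit and decrease pitch.
via_pitch = effective_mas(MA_MAX, 0.5, 0.8)

# Option 2: hold mA at the limit and lengthen the rotation (scan) time.
via_time = effective_mas(MA_MAX, 0.625, 1.0)

print(round(target, 3), round(via_pitch, 3), round(via_time, 3))
# -> 500.0 500.0 500.0: both options recover the target output
```

    The third option, raising the kV, also increases tube output but changes image contrast as well, which is why the paper treats it as a separate trade-off.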

  4. Emergency and Disaster Information Service

    NASA Astrophysics Data System (ADS)

    Boszormenyi, Zsolt

    2010-05-01

    The Hungarian National Association of Radio Distress-Signalling and Infocommunications (RSOE) operates the Emergency and Disaster Information Service (EDIS) within its own website, with the objective of monitoring and documenting all events on Earth that may cause a disaster or emergency. The service uses the speed and breadth of the internet to gather information, monitoring and processing data from several foreign organisations to obtain fast, verified information. The EDIS website, operated jointly by the General-Directorate of National Disaster Management (OKF) and RSOE in co-operation with the Crisis Management Centre of the Ministry of Foreign Affairs, provides useful information on emergency situations and their prevention. Extraordinary events in Hungary, Europe, and other areas of the world are monitored 24 hours a day. All events processed by RSOE EDIS are displayed in real time on a secure website according to the CAP protocol, for the sake of international compatibility. To ensure transparency, all events are categorised separately in the RSS directory (e.g., earthquake, fire, flood, landslide, nuclear event, tornado, volcano). RSOE EDIS also contributes to the dissemination of the CAP protocol in Hungary. Besides official information, special programs monitor nearly 900-1,000 internet press publications, and items containing predefined keywords are processed. Such "news" cannot be considered official, reliable information, but critical information has often been learned first from the internet press. Incoming information is screened and stored in a central database, sorted by category; after processing, it is sent immediately via e-mail (or another format) to the organisations and persons who have requested it (e.g., National Disaster Management, United Nations, etc.). We aim for the processed data to be validated and reliable in all cases, to avoid panic caused by false information, which is why we try to establish and maintain contact with every organisation that can provide validated information for operating RSOE EDIS. We publish all incoming data and information on our website to keep citizens up to date, together with useful background knowledge: a knowledge database contains the information needed to help citizens in an emergency situation, and we intend to supplement the published data with information for the population on prevention and other highly relevant topics.
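
    CAP alerts are XML documents with a fixed namespace, so they can be read with any standard XML parser. A minimal sketch using the Python standard library (the sample alert below is invented for illustration; real CAP alerts carry many more elements):

```python
# Minimal sketch of reading a CAP (Common Alerting Protocol) 1.2 alert
# with the standard library; the sample XML is invented for illustration.
import xml.etree.ElementTree as ET

CAP_NS = "{urn:oasis:names:tc:emergency:cap:1.2}"  # CAP 1.2 namespace

sample = """<alert xmlns="urn:oasis:names:tc:emergency:cap:1.2">
  <identifier>EDIS-2010-0001</identifier>
  <info>
    <category>Geo</category>
    <event>Earthquake</event>
  </info>
</alert>"""

root = ET.fromstring(sample)
ident = root.findtext(CAP_NS + "identifier")
event = root.findtext(CAP_NS + "info/" + CAP_NS + "event")
print(ident, event)  # -> EDIS-2010-0001 Earthquake
```

    Categorising each parsed event (earthquake, fire, flood, and so on) then reduces to reading elements such as `category` and `event`, which is what makes a common protocol like CAP valuable for aggregation services.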

  5. Providing Hemostatic and Blood Conservation Options for Jehovah's Witness Patients in a Large Medical System.

    PubMed

    Bai, Yu; Castillo, Brian S; Tchakarov, Amanda; Escobar, Miguel A; Cotton, Bryan A; Holcomb, John B; Brown, Robert E

    2016-12-01

    People of the Jehovah's Witness faith believe that they shall "abstain from blood." Because of this belief, we encounter challenges from Jehovah's Witness patients who actively seek medical care for themselves and their children but refuse the transfusion of blood products, which may result in increased morbidity and mortality in this patient population. With the development and availability of new hemostatic/coagulation products and advances in medical technology, we, in collaboration with our clinical colleagues and our local Jehovah's Witness leadership, have developed a clinical guideline comprising a medical protocol and surgical strategies for patients refusing blood products. The medical protocol includes an informative handout to help treating physicians and patients make informed decisions about transfusion alternatives. Together, we have entered the medical protocol into the electronic system of the entire Memorial Hermann Hospital. We report the detailed development and implementation process in order to share our experience and encourage others to develop their own management plans for this patient population. © 2016 by the Association of Clinical Scientists, Inc.

  6. Selection and application of microbial source tracking tools for water-quality investigations

    USGS Publications Warehouse

    Stoeckel, Donald M.

    2005-01-01

    Microbial source tracking (MST) is a complex process that includes many decision-making steps. Once a contamination problem has been defined, the potential user of MST tools must thoroughly consider study objectives before deciding upon a source identifier, a detection method, and an analytical approach to apply to the problem. Regardless of which MST protocol is chosen, underlying assumptions can affect the results and interpretation. It is crucial to incorporate tests of those assumptions in the study quality-control plan to help validate results and facilitate interpretation. Detailed descriptions of MST objectives, protocols, and assumptions are provided in this report to assist in selection and application of MST tools for water-quality investigations. Several case studies illustrate real-world applications of MST protocols over a range of settings, spatial scales, and types of contamination. Technical details of many available source identifiers and detection methods are included as appendixes. By use of this information, researchers should be able to formulate realistic expectations for the information that MST tools can provide and, where possible, successfully execute investigations to characterize sources of fecal contamination to resource waters.

  7. Protocol evaluation for effective music therapy for persons with nonfluent aphasia.

    PubMed

    Kim, Mijin; Tomaino, Concetta M

    2008-01-01

    Although the notion of the language specificity of neural correlates has been widely accepted in the past (e.g., left-hemispheric dominance including Broca's and Wernicke's areas, the N400 ERP component of semantic processing, and the P600 ERP component of syntactic processing), recent studies have shown that music and language share some important neurological aspects in their processing, both involving bilateral hemispheric activity. In line with this are frequent behavioral clinical observations that persons with aphasia show improved articulation and prosody of speech in musically assisted phrases. Connecting recent neurological findings with clinical observations would not only inform clinical practice but also enhance understanding of the neurological mechanisms involved in the processing of speech/language and music. This article presents a music therapy treatment protocol study of 7 patients with nonfluent aphasia. The data and findings are discussed with regard to some of the recent focuses and issues addressed in experimental studies using cognitive-behavioral, electrophysiological, and brain-imaging techniques.

  8. Real-time spectral characterization of a photon pair source using a chirped supercontinuum seed.

    PubMed

    Erskine, Jennifer; England, Duncan; Kupchak, Connor; Sussman, Benjamin

    2018-02-15

    Photon pair sources have wide ranging applications in a variety of quantum photonic experiments and protocols. Many of these protocols require well controlled spectral correlations between the two output photons. However, due to low cross-sections, measuring the joint spectral properties of photon pair sources has historically been a challenging and time-consuming task. Here, we present an approach for the real-time measurement of the joint spectral properties of a fiber-based four wave mixing source. We seed the four wave mixing process using a broadband chirped pulse, studying the stimulated process to extract information regarding the spontaneous process. In addition, we compare stimulated emission measurements with the spontaneous process to confirm the technique's validity. Joint spectral measurements have taken many hours historically and several minutes with recent techniques. Here, measurements have been demonstrated in 5-30 s depending on resolution, offering substantial improvement. Additional benefits of this approach include flexible resolution, large measurement bandwidth, and reduced experimental overhead.

  9. Non-adiabatic quantum state preparation and quantum state transport in chains of Rydberg atoms

    NASA Astrophysics Data System (ADS)

    Ostmann, Maike; Minář, Jiří; Marcuzzi, Matteo; Levi, Emanuele; Lesanovsky, Igor

    2017-12-01

    Motivated by recent progress in the experimental manipulation of cold atoms in optical lattices, we study three different protocols for non-adiabatic quantum state preparation and state transport in chains of Rydberg atoms. The protocols we discuss are based on the blockade mechanism between atoms which, when excited to a Rydberg state, interact through a van der Waals potential, and they rely on single-site addressing. Specifically, we discuss protocols for the efficient creation of an antiferromagnetic GHZ state and of a class of matrix product states (including a so-called Rydberg crystal), and for the transport of a single-qubit quantum state between the two ends of a chain of atoms. We identify system parameters that allow the protocols to operate on timescales shorter than the lifetime of the Rydberg states while yielding high-fidelity output states. We discuss the effect of positional disorder on the resulting states and comment on limitations due to other sources of noise, such as radiative decay of the Rydberg states. The proposed protocols provide a testbed for benchmarking the performance of quantum information processing platforms based on Rydberg atoms.

  10. Obtaining i.v. fosfomycin through an expanded-access protocol.

    PubMed

    Frederick, Corey M; Burnette, Jennifer; Aragon, Laura; Gauthier, Timothy P

    2016-08-15

    One hospital's experience with procuring i.v. fosfomycin via an expanded-access protocol to treat a panresistant infection is described. In mid-2014, a patient at a tertiary care institution had an infection caused by a gram-negative pathogen expressing notable drug resistance. Once it was determined by the infectious diseases (ID) attending physician that i.v. fosfomycin was a possible treatment for this patient, the ID pharmacist began the process of drug procurement. The research and ID pharmacists completed an investigational new drug (IND) application, which required patient-specific details and contributions from the ID physician. After obtaining approval of the IND, an Internet search identified a product vendor in the United Kingdom, who was then contacted to begin the drug purchasing and acquisition processes. Authorization of the transaction required signatures from key senior hospital administrators, including the chief financial officer and the chief operating officer. Approximately 6 days after beginning the acquisition process, the research pharmacist arranged for the wholesaler to expedite product delivery. The ID pharmacist contacted the wholesaler's shipping company at the U.S. Customs Office, providing relevant contact information to ensure that any unexpected circumstances could be quickly addressed. The product arrived at the U.S. Customs Office 8 days after beginning the acquisition process and was held in the U.S. Customs Office for 2 days. The patient received the first dose of i.v. fosfomycin 13 days after starting the expanded-access protocol process. I.V. fosfomycin was successfully procured through an FDA expanded-access protocol by coordinating efforts among ID physicians, pharmacists, and hospital executives. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  11. Quantum secret information equal exchange protocol based on dense coding

    NASA Astrophysics Data System (ADS)

    Jiang, Ying-Hua; Zhang, Shi-Bin; Dai, Jin-Qiao; Shi, Zhi-Ping

    2018-04-01

    In this paper, we design a novel quantum protocol for the equal exchange of secret information, which implements the exchange between two parties with the help of a semi-trusted third party (TP). In the protocol, EPR pairs prepared by the TP are distributed to the two communicating parties. Each party performs a Pauli operation on its particle and returns the new particle to the TP, who measures each returned pair in the Bell basis and announces the measurement results. From the announced results, each party deduces the other's secret information. Finally, the security analysis shows that this protocol solves the problem of the equal exchange of secret information between two parties while remaining secure against the semi-trusted TP, and proves that the protocol can effectively resist glitch attacks, intercept-resend attacks, and entanglement attacks.

  12. Creating a Controlled Vocabulary for the Ethics of Human Research: Towards a Biomedical Ethics Ontology

    PubMed Central

    Koepsell, David; Arp, Robert; Fostel, Jennifer; Smith, Barry

    2009-01-01

    Ontologies describe reality in specific domains in ways that can bridge various disciplines and languages. They allow easier access and integration of information that is collected by different groups. Ontologies are currently used in the biomedical sciences, geography, and law. A Biomedical Ethics Ontology (BMEO) would benefit members of ethics committees who deal with protocols and consent forms spanning numerous fields of inquiry. There already exists the Ontology for Biomedical Investigations (OBI); the proposed BMEO would interoperate with OBI, creating a powerful information tool. We define a domain ontology and begin to construct a BMEO, focused on the process of evaluating human research protocols. Finally, we show how our BMEO can have practical applications for ethics committees. This paper describes ongoing research and a strategy for its broader continuation and cooperation. PMID:19374479

  13. Anti-Noise Bidirectional Quantum Steganography Protocol with Large Payload

    NASA Astrophysics Data System (ADS)

    Qu, Zhiguo; Chen, Siyi; Ji, Sai; Ma, Songya; Wang, Xiaojun

    2018-06-01

    An anti-noise bidirectional quantum steganography protocol with large payload is proposed in this paper. In the new protocol, Alice and Bob can transmit classical information bits to each other while covertly teleporting secret quantum states. The protocol introduces bidirectional quantum remote state preparation into bidirectional quantum secure communication, not only expanding the secret information from classical bits to quantum states, but also extracting the phase and amplitude values of the secret quantum state to greatly enlarge the capacity of the secret information. The new protocol also achieves better imperceptibility, since an eavesdropper can hardly detect the hidden channel or obtain effective secret quantum states. Compared with previous quantum steganography schemes, its bidirectional design gives the new protocol higher transmission efficiency and better availability. Furthermore, theoretical analysis shows that the new algorithm can effectively resist quantum noise. Finally, the performance analysis supports the conclusion that the new protocol combines good imperceptibility and high security with a large payload.

  15. Validating visual disturbance types and classes used for forest soil monitoring protocols

    Treesearch

    D. S. Page-Dumroese; A. M. Abbott; M. P. Curran; M. F. Jurgensen

    2012-01-01

    We describe several methods for validating visual soil disturbance classes used during forest soil monitoring after specific management operations. Site-specific vegetative, soil, and hydrologic responses to soil disturbance are needed to identify sensitive and resilient soil properties and processes; therefore, validation of ecosystem responses can provide information...

  16. U.S. Army Research Institute Program in Basic Research-FY 2010

    DTIC Science & Technology

    2010-11-01

    2007). Do learning protocols support learning strategies and outcomes? The role of cognitive and metacognitive prompts. Learning and Instruction ... Achievement in Complex Learning Environments as a Function of Information Processing Ability ... Development and Validation of a Situational Judgment Test to Predict Attrition Incrementally Over General Cognitive Ability and a Forced-Choice

  17. 75 FR 70268 - Submission for OMB Review; Comment Request; NIH NCI Central Institutional Review Board (CIRB...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-17

    ... number. Proposed Collection: Title: NIH NCI Central Institutional Review Board (CIRB). Type of... is absent of conflicts of interest with the protocols under review. Tools utilized to accomplish this... information on workflow and processes of CIRB operations as well as a non-disclosure agreement. A conflict of...

  18. Applications and Methods Utilizing the Simple Semantic Web Architecture and Protocol (SSWAP) for Bioinformatics Resource Discovery and Disparate Data and Service Integration

    USDA-ARS?s Scientific Manuscript database

    Scientific data integration and computational service discovery are challenges for the bioinformatic community. This process is made more difficult by the separate and independent construction of biological databases, which makes the exchange of scientific data between information resources difficu...

  19. Successful Outcomes from a Structured Curriculum Used in the Veterans Affairs Low Vision Intervention Trial

    ERIC Educational Resources Information Center

    Stelmack, Joan A.; Rinne, Stephen; Mancil, Rickilyn M.; Dean, Deborah; Moran, D'Anna; Tang, X. Charlene; Cummings, Roger; Massof, Robert W.

    2008-01-01

    A low vision rehabilitation program with a structured curriculum was evaluated in a randomized controlled trial. The treatment group demonstrated large improvements in self-reported visual function (reading, mobility, visual information processing, visual motor skills, and overall). The team approach and the protocols of the treatment program are…

  20. Finite-key analysis for quantum key distribution with weak coherent pulses based on Bernoulli sampling

    NASA Astrophysics Data System (ADS)

    Kawakami, Shun; Sasaki, Toshihiko; Koashi, Masato

    2017-07-01

    An essential step in quantum key distribution is the estimation of parameters related to the leaked amount of information, which is usually done by sampling of the communication data. When the data size is finite, the final key rate depends on how the estimation process handles statistical fluctuations. Many of the present security analyses are based on the method with simple random sampling, where hypergeometric distribution or its known bounds are used for the estimation. Here we propose a concise method based on Bernoulli sampling, which is related to binomial distribution. Our method is suitable for the Bennett-Brassard 1984 (BB84) protocol with weak coherent pulses [C. H. Bennett and G. Brassard, Proceedings of the IEEE Conference on Computers, Systems and Signal Processing (IEEE, New York, 1984), Vol. 175], reducing the number of estimated parameters to achieve a higher key generation rate compared to the method with simple random sampling. We also apply the method to prove the security of the differential-quadrature-phase-shift (DQPS) protocol in the finite-key regime. The result indicates that the advantage of the DQPS protocol over the phase-encoding BB84 protocol in terms of the key rate, which was previously confirmed in the asymptotic regime, persists in the finite-key regime.
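
    The role of the statistical-fluctuation correction can be illustrated with a generic Hoeffding-style deviation term (an assumption chosen for illustration; the paper's Bernoulli-sampling bounds differ in detail):

```python
# Illustrative finite-size estimation sketch, not the paper's exact bound:
# with n sampled signals and k observed errors, a Hoeffding-type term
# bounds the true error rate except with failure probability eps.
import math

def error_rate_upper_bound(k, n, eps=1e-10):
    """Observed rate plus deviation term sqrt(ln(1/eps) / (2n))."""
    return k / n + math.sqrt(math.log(1 / eps) / (2 * n))

# At a fixed 2% observed error rate, the penalty shrinks as n grows,
# so the key rate approaches its asymptotic value.
for n in (10**3, 10**5, 10**7):
    print(n, round(error_rate_upper_bound(0.02 * n, n), 4))
```

    Reducing the number of estimated parameters, as the Bernoulli-sampling method does, means fewer such deviation terms are accumulated, which is why it yields a higher finite-size key rate.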

  1. A Simple XML Producer-Consumer Protocol

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy

    2000-01-01

    This document describes a simple XML-based protocol that producers of events can use to communicate with consumers of events. The protocol described here is not meant to be the most efficient protocol, the most logical protocol, or the best protocol in any way. It was defined quickly, and its intent is to give us a reasonable protocol that we can implement relatively easily and then use to gain experience with distributed event services. This experience will help us evaluate proposals for event representations, XML-based encodings of information, and communication protocols. The next section of this document describes how we represent events in this protocol and then defines the two events that we chose to use for our initial experiments. These definitions are made by example so that they are informal and easy to understand. The following section then defines the producer-consumer protocol we have agreed upon for our initial experiments.
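
    In the spirit of such a protocol, a minimal event message might be serialized and parsed as follows. The element and attribute names here are invented for illustration and are not the report's actual event definitions:

```python
# Sketch of a producer encoding an event as XML and a consumer decoding
# it; names ("event", "job.started", etc.) are illustrative only.
import xml.etree.ElementTree as ET

def make_event(name, source, **fields):
    """Serialize an event with named fields as a small XML document."""
    ev = ET.Element("event", {"name": name, "source": source})
    for key, value in fields.items():
        ET.SubElement(ev, key).text = str(value)
    return ET.tostring(ev, encoding="unicode")

# Producer side: encode an event.
msg = make_event("job.started", "producer-1", host="node42", pid=1234)

# Consumer side: decode it back.
parsed = ET.fromstring(msg)
assert parsed.get("name") == "job.started"
assert parsed.findtext("host") == "node42"
print(msg)
```

    Defining events "by example" in this way keeps the protocol easy to read while still being trivially machine-parseable.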

  2. Mercury Assessment and Monitoring Protocol for the Bear Creek Watershed, Colusa County, California

    USGS Publications Warehouse

    Suchanek, Thomas H.; Hothem, Roger L.; Rytuba, James J.; Yee, Julie L.

    2010-01-01

    This report summarizes the known information on the occurrence and distribution of mercury (Hg) in physical/chemical and biological matrices within the Bear Creek watershed. Based on these data, a matrix-specific monitoring protocol for the evaluation of the effectiveness of activities designed to remediate Hg contamination in the Bear Creek watershed is presented. The monitoring protocol documents procedures for collecting and processing water, sediment, and biota for estimation of total Hg (TotHg) and monomethyl mercury (MMeHg) in the Bear Creek watershed. The concurrent sampling of TotHg and MMeHg in biota as well as water and sediment from 10 monitoring sites is designed to assess the relative bioavailability of Hg released from Hg sources in the watershed and identify environments conducive to Hg methylation. These protocols are designed to assist landowners, land managers, water quality regulators, and scientists in determining whether specific restoration/mitigation actions lead to significant progress toward achieving water quality goals to reduce Hg in Bear and Sulphur Creeks.

  3. Bell nonlocality: a resource for device-independent quantum information protocols

    NASA Astrophysics Data System (ADS)

    Acin, Antonio

    2015-05-01

    Bell nonlocality is not only one of the most fundamental properties of quantum physics, but has also recently acquired the status of an information resource for device-independent quantum information protocols. In the device-independent approach, protocols are designed so that their performance is independent of the internal workings of the devices used in the implementation. We discuss all these ideas and argue that device-independent protocols are especially relevant for cryptographic applications, as they are insensitive to hacking attacks that exploit imperfections in the modelling of the devices.

  4. Optimizing radiologist e-prescribing of CT oral contrast agent using a protocoling portal.

    PubMed

    Wasser, Elliot J; Galante, Nicholas J; Andriole, Katherine P; Farkas, Cameron; Khorasani, Ramin

    2013-12-01

    The purpose of this study is to quantify the time expenditure associated with radiologist ordering of CT oral contrast media when using an integrated protocoling portal and to determine radiologists' perceptions of the ordering process. This prospective study was performed at a large academic tertiary care facility. Detailed timing information for CT inpatient oral contrast orders placed via the computerized physician order entry (CPOE) system was gathered over a 14-day period. Analyses evaluated the amount of physician time required for each component of the ordering process. Radiologists' perceptions of the ordering process were assessed by survey. Descriptive statistics and chi-square analysis were performed. A total of 96 oral contrast agent orders were placed by 13 radiologists during the study period. The average time necessary to create a protocol for each case was 40.4 seconds (average range by subject, 20.0-130.0 seconds; SD, 37.1 seconds), and the average total time to create and sign each contrast agent order was 27.2 seconds (range, 10.0-50.0 seconds; SD, 22.4 seconds). Overall, 52.5% (21/40) of survey respondents indicated that radiologist entry of oral contrast agent orders improved patient safety. A minority of respondents (15% [6/40]) indicated that contrast agent order entry was either very or extremely disruptive to workflow. Radiologist e-prescribing of CT oral contrast agents using CPOE can be embedded in a protocol workflow. Integration of health IT tools can help to optimize user acceptance and adoption.

  5. LC-MS based analysis of endogenous steroid hormones in human hair.

    PubMed

    Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias

    2016-09-01

    The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormones analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Mercury Shopping Cart Interface

    NASA Technical Reports Server (NTRS)

    Pfister, Robin; McMahon, Joe

    2006-01-01

    Mercury Shopping Cart Interface (MSCI) is a reusable component of the Power User Interface 5.0 (PUI) program described in another article. MSCI is a means of encapsulating the logic and information needed to describe an orderable item consistent with Mercury Shopping Cart service protocol. Designed to be used with Web-browser software, MSCI generates Hypertext Markup Language (HTML) pages on which ordering information can be entered. MSCI comprises two types of Practical Extraction and Report Language (PERL) modules: template modules and shopping-cart logic modules. Template modules generate HTML pages for entering the required ordering details and enable submission of the order via a Hypertext Transfer Protocol (HTTP) post. Shopping cart modules encapsulate the logic and data needed to describe an individual orderable item to the Mercury Shopping Cart service. These modules evaluate information entered by the user to determine whether it is sufficient for the Shopping Cart service to process the order. Once an order has been passed from MSCI to a deployed Mercury Shopping Cart server, there is no further interaction with the user.

  7. Experimental protocol for high-fidelity heralded photon-to-atom quantum state transfer.

    PubMed

    Kurz, Christoph; Schug, Michael; Eich, Pascal; Huwer, Jan; Müller, Philipp; Eschner, Jürgen

    2014-11-21

    A quantum network combines the benefits of quantum systems regarding secure information transmission and computational speed-up by employing quantum coherence and entanglement to store, transmit and process information. A promising platform for implementing such a network is provided by atom-based quantum memories and processors, interconnected by photonic quantum channels. A crucial building block in this scenario is the conversion of quantum states between single photons and single atoms through controlled emission and absorption. Here we present an experimental protocol for photon-to-atom quantum state conversion, whereby the polarization state of an absorbed photon is mapped onto the spin state of a single absorbing atom with >95% fidelity, while successful conversion is heralded by a single emitted photon. Heralded high-fidelity conversion without affecting the converted state is a main experimental challenge in making the transferred information reliably available for further operations. We record >80 successful state-transfer events per second out of 18,000 repetitions per second.

  8. Efficiency Improvement in a Busy Radiology Practice: Determination of Musculoskeletal Magnetic Resonance Imaging Protocol Using Deep-Learning Convolutional Neural Networks.

    PubMed

    Lee, Young Han

    2018-04-04

    The purposes of this study are to evaluate the feasibility of protocol determination with a convolutional neural network (CNN) classifier based on short-text classification, and to evaluate agreement by comparing protocols determined by the CNN with those determined by musculoskeletal radiologists. Following institutional review board approval, the database of a hospital information system (HIS) was queried for lists of MRI examinations, referring department, patient age, and patient gender. These were exported to a local workstation for analysis: 5258 and 1018 consecutive musculoskeletal MRI examinations were used for the training and test datasets, respectively. The classification targets were routine versus tumor protocols, and the inputs were word combinations of the referring department, region, contrast media (or not), gender, and age. The CNN embedded-vector classifier was used with Word2Vec Google News vectors. The test set was evaluated with each classification model, and the results were output as routine or tumor protocols. The CNN determinations were evaluated using receiver operating characteristic (ROC) curves, with accuracy assessed against radiologist-confirmed protocols as the reference. The optimal cut-off value for distinguishing routine from tumor protocols was 0.5067, with a sensitivity of 92.10%, a specificity of 95.76%, and an area under the curve (AUC) of 0.977. The overall accuracy was 94.2% for the ConvNet model. All MRI protocols were correct in the pelvic bone, upper arm, wrist, and lower leg MRIs. Deep-learning-based convolutional neural networks were thus clinically utilized to determine musculoskeletal MRI protocols. CNN-based text learning and its applications could be extended to other radiologic tasks besides image interpretation, improving the work performance of the radiologist.
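
    The thresholding step of such an evaluation can be sketched independently of the model: apply the reported cut-off to classifier scores and compute sensitivity and specificity against reference labels. The scores and labels below are invented toy data; the CNN itself is not reproduced here:

```python
# Evaluation sketch only: apply the reported 0.5067 cut-off to scores
# and compute sensitivity/specificity against reference labels
# (1 = tumor protocol, 0 = routine protocol).

def sens_spec(scores, labels, cutoff=0.5067):
    tp = sum(s >= cutoff and y == 1 for s, y in zip(scores, labels))
    tn = sum(s < cutoff and y == 0 for s, y in zip(scores, labels))
    fn = sum(s < cutoff and y == 1 for s, y in zip(scores, labels))
    fp = sum(s >= cutoff and y == 0 for s, y in zip(scores, labels))
    return tp / (tp + fn), tn / (tn + fp)

# Invented toy scores and reference labels, for illustration only.
scores = [0.9, 0.7, 0.4, 0.2, 0.6, 0.1]
labels = [1, 1, 1, 0, 0, 0]
sens, spec = sens_spec(scores, labels)
print(round(sens, 3), round(spec, 3))  # -> 0.667 0.667
```

    Sweeping the cut-off over all score values and plotting sensitivity against (1 - specificity) yields the ROC curve from which the optimal cut-off and AUC are read off.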

  9. Informed consent in human research: what to say and how to say it.

    PubMed

    Reiman, Robert E

    2013-02-01

    To ensure that the possibility of harm to human research subjects is minimized, clinical trials and other research protocols are subject to oversight by Institutional Review Boards (IRBs). IRBs require that subjects be fully informed about the real or potential risks of participation in a research study. The use of radiological examinations in research protocols subjects the participants to exposure to ionizing radiation, which in theory carries a risk of stochastic effects such as radiation-induced cancer, and in practice may lead to deterministic effects such as skin injury. Because IRB members and clinical study coordinators may have little knowledge of radiation effects or how best to communicate the risk to the research subjects, they will consult with institutional Radiation Safety Committees and radiation protection professionals regarding how to integrate radiation risk information into the informed consent process. Elements of radiation informed consent include: (1) comparison of the radiation dose to some benchmark that enables the study subjects to make a value judgment regarding the acceptability of the risk; (2) a quantitative expression of the absolute risk of stochastic effects; (3) an expression of uncertainty in the risk; and (4) understandability. Standardized risk statement templates may be created for specific radiological examinations. These standardized risk statements may be deployed as paper forms or electronically in the form of internet-based applications. The technical nature of creating useful radiation risk statements represents an opportunity for radiation protection professionals to participate productively in the clinical research process.
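    The four elements listed above can be combined into a standardized risk-statement template. The sketch below is purely illustrative: the background-dose benchmark (~3 mSv/year) and the nominal risk coefficient (~5% per sievert) are commonly cited approximations adopted here as assumptions, not values from the paper:

```python
# Illustrative risk-statement template combining the abstract's four
# elements: a benchmark comparison, an absolute stochastic risk estimate,
# an uncertainty statement, and plain language. The constants below are
# commonly cited approximations used here as assumptions.

BACKGROUND_MSV_PER_YEAR = 3.0   # typical natural background dose (assumption)
RISK_PER_MSV = 5e-5             # ~5%/Sv nominal cancer risk (assumption)

def risk_statement(exam_name, dose_msv):
    years = dose_msv / BACKGROUND_MSV_PER_YEAR
    risk = dose_msv * RISK_PER_MSV
    return (
        f"The {exam_name} in this study delivers about {dose_msv} mSv, "
        f"roughly {years:.1f} years of natural background radiation. "
        f"The estimated lifetime chance of a radiation-induced cancer is "
        f"about 1 in {round(1 / risk):,}. This estimate is uncertain and "
        f"may overstate or understate the true risk."
    )

print(risk_statement("chest CT", 6.0))
```

    Such a template can be filled in per examination and deployed on paper or through a web form, as the abstract suggests.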

  10. Multiparty quantum key agreement protocol based on locally indistinguishable orthogonal product states

    NASA Astrophysics Data System (ADS)

    Jiang, Dong-Huan; Xu, Guang-Bao

    2018-07-01

    Based on locally indistinguishable orthogonal product states, we propose a novel multiparty quantum key agreement (QKA) protocol. In this protocol, the private key information of each party is encoded as orthogonal product states that cannot be perfectly distinguished by local operations and classical communication. To ensure the security of the protocol with a small number of decoy particles, the different particles of each product state are transmitted separately. This protocol not only enables each participant to negotiate a shared key fairly but also avoids information leakage to the maximum extent. We give a detailed security proof of this protocol. Comparison with existing QKA protocols shows that the new protocol is more efficient.

  11. Compositional Verification of a Communication Protocol for a Remotely Operated Vehicle

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.; Munoz, Cesar A.

    2009-01-01

    This paper presents the specification and verification in the Prototype Verification System (PVS) of a protocol intended to facilitate communication in an experimental remotely operated vehicle used by NASA researchers. The protocol is defined as a stack-layered composition of simpler protocols. It can be seen as the vertical composition of protocol layers, where each layer performs input and output message processing, and the horizontal composition of different processes concurrently inhabiting the same layer, where each process satisfies a distinct requirement. It is formally proven that the protocol components satisfy certain delivery guarantees. Compositional techniques are used to prove that these guarantees also hold in the composed system. Although the protocol itself is not novel, the methodology employed in its verification extends existing techniques by automating the tedious and usually cumbersome parts of the proof, thereby making the iterative design process of protocols feasible.
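    The vertical composition described above can be sketched as a stack in which each layer encodes messages on send and decodes them on receive; the Framing and Checksum layers below are hypothetical illustrations, and the delivery guarantee is reduced to a simple round-trip property:

```python
# Hypothetical sketch of vertical protocol composition: each layer performs
# output processing (encode) on the way down the stack and input processing
# (decode) on the way up. The composed "delivery guarantee" shown here is
# just the round-trip property receive(send(m)) == m.

class Layer:
    def encode(self, msg): return msg
    def decode(self, msg): return msg

class Framing(Layer):
    def encode(self, msg): return b"\x02" + msg + b"\x03"   # STX ... ETX
    def decode(self, msg): return msg[1:-1]

class Checksum(Layer):
    def encode(self, msg): return msg + bytes([sum(msg) % 256])
    def decode(self, msg):
        body, ck = msg[:-1], msg[-1]
        assert sum(body) % 256 == ck, "corrupted frame"
        return body

class Stack:
    def __init__(self, layers): self.layers = layers
    def send(self, msg):
        for layer in self.layers:              # top of stack first
            msg = layer.encode(msg)
        return msg
    def receive(self, msg):
        for layer in reversed(self.layers):    # unwrap in reverse order
            msg = layer.decode(msg)
        return msg

stack = Stack([Checksum(), Framing()])
wire = stack.send(b"hello")
assert stack.receive(wire) == b"hello"
```

    A compositional proof establishes such a property once per layer and then shows it is preserved by stacking, rather than re-verifying the whole stack monolithically.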

  12. Image processing and Quality Control for the first 10,000 brain imaging datasets from UK Biobank.

    PubMed

    Alfaro-Almagro, Fidel; Jenkinson, Mark; Bangerter, Neal K; Andersson, Jesper L R; Griffanti, Ludovica; Douaud, Gwenaëlle; Sotiropoulos, Stamatios N; Jbabdi, Saad; Hernandez-Fernandez, Moises; Vallee, Emmanuel; Vidaurre, Diego; Webster, Matthew; McCarthy, Paul; Rorden, Christopher; Daducci, Alessandro; Alexander, Daniel C; Zhang, Hui; Dragonu, Iulius; Matthews, Paul M; Miller, Karla L; Smith, Stephen M

    2018-02-01

    UK Biobank is a large-scale prospective epidemiological study with all data accessible to researchers worldwide. It is currently in the process of bringing back 100,000 of the original participants for brain, heart and body MRI, carotid ultrasound and low-dose bone/fat x-ray. The brain imaging component covers 6 modalities (T1, T2 FLAIR, susceptibility weighted MRI, Resting fMRI, Task fMRI and Diffusion MRI). Raw and processed data from the first 10,000 imaged subjects has recently been released for general research access. To help convert this data into useful summary information we have developed an automated processing and QC (Quality Control) pipeline that is available for use by other researchers. In this paper we describe the pipeline in detail, following a brief overview of UK Biobank brain imaging and the acquisition protocol. We also describe several quantitative investigations carried out as part of the development of both the imaging protocol and the processing pipeline. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Device-independent quantum private query

    NASA Astrophysics Data System (ADS)

    Maitra, Arpita; Paul, Goutam; Roy, Sarbani

    2017-04-01

    In quantum private query (QPQ), a client obtains values corresponding to his or her query only, and nothing else from the server, and the server does not get any information about the queries. V. Giovannetti et al. [Phys. Rev. Lett. 100, 230502 (2008)], 10.1103/PhysRevLett.100.230502 gave the first QPQ protocol and since then quite a few variants and extensions have been proposed. However, none of the existing protocols are device independent; i.e., all of them assume implicitly that the entangled states supplied to the client and the server are of a certain form. In this work, we exploit the idea of a local CHSH game and connect it with the scheme of Y. G. Yang et al. [Quantum Info. Process. 13, 805 (2014)], 10.1007/s11128-013-0692-8 to present the concept of a device-independent QPQ protocol.
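    The CHSH game mentioned above is what enables device independence: any classical strategy (deterministic, or a mixture of deterministic strategies) wins at most 3/4 of the time, so observing a higher winning rate certifies the untrusted devices. A minimal brute-force check of the classical bound:

```python
# Sketch of the CHSH game underlying device-independence tests: two parties
# receive bits x, y and answer bits a, b; they win iff a XOR b == x AND y.
# Enumerating all deterministic classical strategies confirms the classical
# winning probability is capped at 3/4 (shared randomness cannot help,
# since it only mixes deterministic strategies).

from itertools import product

best = 0.0
# A deterministic strategy is a pair of functions a(x) for Alice, b(y) for Bob.
for a0, a1, b0, b1 in product([0, 1], repeat=4):
    wins = 0
    for x, y in product([0, 1], repeat=2):
        a = a0 if x == 0 else a1
        b = b0 if y == 0 else b1
        if (a ^ b) == (x & y):
            wins += 1
    best = max(best, wins / 4)

print(best)  # 0.75
```

    Quantum strategies can reach about 0.854, and it is this gap that the local CHSH game exploits to certify the entangled states supplied to client and server.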

  14. Heralded noiseless amplification for single-photon entangled state with polarization feature

    NASA Astrophysics Data System (ADS)

    Wang, Dan-Dan; Jin, Yu-Yu; Qin, Sheng-Xian; Zu, Hao; Zhou, Lan; Zhong, Wei; Sheng, Yu-Bo

    2018-03-01

    Heralded noiseless amplification is a promising method to overcome transmission photon loss in practical noisy quantum channels and can effectively lengthen the quantum communication distance. Single-photon entanglement is an important resource in current quantum communications. Here, we construct two single-photon-assisted heralded noiseless amplification protocols for the single-photon two-mode entangled state and the single-photon three-mode W state, respectively, where the single-photon qubit has an arbitrary unknown polarization. After the amplification, the fidelity of the single-photon entangled state is increased, while the polarization of the single-photon qubit is well preserved. Both protocols require only linear optical elements, so they can be realized under current experimental conditions. Our protocols may be useful in current and future quantum information processing.

  15. Oncosurgical Management of Liver Limited Stage IV Colorectal Cancer: Preliminary Data and Protocol for a Randomized Controlled Trial.

    PubMed

    Sutton, Paul; Vimalachandran, Dale; Poston, Graeme; Fenwick, Stephen; Malik, Hassan

    2018-05-09

    Colorectal cancer is the fourth commonest cancer and second commonest cause of cancer-related death in the United Kingdom. Almost 15% of patients have metastases on presentation. An increasing number of surgical strategies and better neoadjuvant treatment options are responsible for more patients undergoing resection of liver metastases, with prolonged survival in a select group of patients who present with synchronous disease. The optimal strategy for the management of these patients nevertheless remains unclear, and there is a complete absence of Level 1 evidence in the literature. The objective of this study is to undertake preliminary work and devise an outline trial protocol to inform the future development of clinical studies to investigate the management of patients with liver limited stage IV colorectal cancer. We have undertaken some preliminary work and begun the process of designing a randomized controlled trial and present a draft trial protocol here. This study is at the protocol development stage only, and as such no results are available. There is no funding in place for this study, and no anticipated start date. We have presented preliminary work and an outline trial protocol which we anticipate will inform the future development of clinical studies to investigate the management of patients with liver limited stage IV colorectal cancer. We do not believe that the trial we have designed will answer the most significant clinical questions, nor that it is feasible to be delivered within the United Kingdom's National Health Service at this current time. ©Paul Sutton, Dale Vimalachandran, Graeme Poston, Stephen Fenwick, Hassan Malik. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 09.05.2018.

  16. A post-gene silencing bioinformatics protocol for plant-defence gene validation and underlying process identification: case study of the Arabidopsis thaliana NPR1.

    PubMed

    Yocgo, Rosita E; Geza, Ephifania; Chimusa, Emile R; Mazandu, Gaston K

    2017-11-23

    Advances in forward and reverse genetic techniques have enabled the discovery and identification of several plant defence genes based on quantifiable disease phenotypes in mutant populations. Existing models for testing the effect of gene inactivation, or of genes causing these phenotypes, do not account for potential uncertainty in these datasets and noise inherent in the biological experiments used, which may mask downstream analysis and limit the use of these datasets. Moreover, elucidating the biological mechanisms driving the induced disease resistance and influencing these observable disease phenotypes has never been systematically tackled, highlighting the need for an efficient model to characterize the gene target under consideration completely. We developed a post-gene silencing bioinformatics (post-GSB) protocol which accounts for potential biases in the disease phenotype datasets when assessing the contribution of the gene target to the plant defence response. The post-GSB protocol uses Gene Ontology semantic similarity and pathway datasets to generate an enriched process regulatory network based on the functional degeneracy of the plant proteome, to help understand the induced plant defence response. We applied this protocol to investigate the effect of NPR1 gene silencing on changes in Arabidopsis thaliana plants following Pseudomonas syringae pathovar tomato strain DC3000 infection. Results indicated that the presence of a functionally active NPR1 reduced the plant's susceptibility to the infection, with about 99% variability in Pseudomonas spore growth between npr1 mutant and wild-type samples. Moreover, the post-GSB protocol revealed the coordinate action of target-associated genes and pathways through an enriched process regulatory network, summarizing the potential target-based induced disease resistance mechanism.
This protocol can improve the characterization of the gene target and, potentially, elucidate induced defence response by more effectively utilizing available phenotype information and plant proteome functional knowledge.

  17. Compliance with AAPM Practice Guideline 1.a: CT Protocol Management and Review — from the perspective of a university hospital

    PubMed Central

    Bour, Robert K.; Pozniak, Myron; Ranallo, Frank N.

    2015-01-01

    The purpose of this paper is to describe our experience with the AAPM Medical Physics Practice Guideline 1.a: “CT Protocol Management and Review Practice Guideline”. Specifically, we will share how our institution's quality management system addresses the suggestions within the AAPM practice report. We feel this paper is needed as it was beyond the scope of the AAPM practice guideline to provide specific details on fulfilling individual guidelines. Our hope is that other institutions will be able to emulate some of our practices and that this article will encourage other types of centers (e.g., community hospitals) to share their methodology for approaching CT protocol optimization and quality control. Our institution has had a functioning CT protocol optimization process, albeit informal, since we began using CT. Recently, we made our protocol development and validation process compliant with a number of the ISO 9001:2008 clauses, and this required us to formalize the roles of the members of our CT protocol optimization team. We rely heavily on PACS‐based IT solutions for acquiring radiologist feedback on the performance of our CT protocols and the performance of our CT scanners in terms of dose (scanner output) and the function of the automatic tube current modulation. Specific details on our quality management system covering both quality control and ongoing optimization have been provided. The roles of each CT protocol team member have been defined, and the critical role that IT solutions play in the management of files and the monitoring of CT protocols has been reviewed. In addition, the invaluable role management provides by being a champion for the project has been explained; lack of a project champion will undermine the efforts of a CT protocol optimization team. 
Meeting the guidelines set forth in the AAPM practice guideline was not inherently difficult, but did, in our case, require the cooperation of radiologists, technologists, physicists, IT, administrative staff, and hospital management. Some of the IT solutions presented in this paper are novel and currently unique to our institution. PACS number: 87.57.Q PMID:26103176

  18. Secure quantum private information retrieval using phase-encoded queries

    NASA Astrophysics Data System (ADS)

    Olejnik, Lukasz

    2011-08-01

    We propose a quantum solution to the classical private information retrieval (PIR) problem, which allows one to query a database in a private manner. The protocol offers privacy thresholds and allows the user to obtain information from a database in a way that offers the potential adversary, in this model the database owner, no possibility of deterministically establishing the query contents. This protocol may also be viewed as a solution to the symmetrically private information retrieval problem in that it can offer database security (inability for a querying user to steal its contents). Compared to classical solutions, the protocol offers substantial improvement in terms of communication complexity. In comparison with the recent quantum private queries protocol [Phys. Rev. Lett. 100, 230502 (2008)], 10.1103/PhysRevLett.100.230502, it is more efficient in terms of communication complexity and the number of rounds, while offering a clear privacy parameter. We discuss the security of the protocol and analyze its strengths and conclude that using this technique makes it challenging to obtain the unconditional (in the information-theoretic sense) privacy degree; nevertheless, in addition to being simple, the protocol still offers a privacy level. The oracle used in the protocol is inspired both by the classical computational PIR solutions as well as the Deutsch-Jozsa oracle.

  19. Secure quantum private information retrieval using phase-encoded queries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olejnik, Lukasz

    We propose a quantum solution to the classical private information retrieval (PIR) problem, which allows one to query a database in a private manner. The protocol offers privacy thresholds and allows the user to obtain information from a database in a way that offers the potential adversary, in this model the database owner, no possibility of deterministically establishing the query contents. This protocol may also be viewed as a solution to the symmetrically private information retrieval problem in that it can offer database security (inability for a querying user to steal its contents). Compared to classical solutions, the protocol offers substantial improvement in terms of communication complexity. In comparison with the recent quantum private queries [Phys. Rev. Lett. 100, 230502 (2008)] protocol, it is more efficient in terms of communication complexity and the number of rounds, while offering a clear privacy parameter. We discuss the security of the protocol and analyze its strengths and conclude that using this technique makes it challenging to obtain the unconditional (in the information-theoretic sense) privacy degree; nevertheless, in addition to being simple, the protocol still offers a privacy level. The oracle used in the protocol is inspired both by the classical computational PIR solutions as well as the Deutsch-Jozsa oracle.

  20. CDC WONDER: a cooperative processing architecture for public health.

    PubMed Central

    Friede, A; Rosen, D H; Reid, J A

    1994-01-01

    CDC WONDER is an information management architecture designed for public health. It provides access to information and communications without the user's needing to know the location of data or communication pathways and mechanisms. CDC WONDER users have access to extractions from some 40 databases; electronic mail (e-mail); and surveillance data processing. System components include the Remote Client, the Communications Server, the Queue Managers, and Data Servers and Process Servers. The Remote Client software resides in the user's machine; other components are at the Centers for Disease Control and Prevention (CDC). The Remote Client, the Communications Server, and the Applications Server provide access to the information and functions in the Data Servers and Process Servers. The system architecture is based on cooperative processing, and components are coupled via pure message passing, using several protocols. This architecture allows flexibility in the choice of hardware and software. One system limitation is that final results from some subsystems are obtained slowly. Although designed for public health, CDC WONDER could be useful for other disciplines that need flexible, integrated information exchange. PMID:7719813
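    The cooperative-processing idea can be sketched schematically: components never call each other's internals directly but exchange messages through queues. The component names follow the abstract; the routing, message shapes, and data below are illustrative assumptions, not CDC WONDER's actual protocol:

```python
# Schematic sketch of cooperative processing via pure message passing
# (component names follow the abstract; messages and data are invented).
# A Remote Client request is routed by a Communications Server through a
# queue to a Data Server, and the reply travels back the same way.

from queue import Queue

data_server_inbox = Queue()
client_inbox = Queue()

def data_server():
    msg = data_server_inbox.get()
    # Stand-in for a database extraction for the requested dataset.
    client_inbox.put({"reply_to": msg["from"], "rows": [("measles", 1993, 312)]})

def communications_server(request):
    data_server_inbox.put(request)   # coupling is by message, not by call
    data_server()
    return client_inbox.get()

reply = communications_server({"from": "remote-client", "query": "measles 1993"})
print(reply["rows"])
```

    Because coupling is only through messages, each component can run on different hardware and software, which is exactly the flexibility the architecture claims; the cost is latency, matching the noted limitation that some subsystems return final results slowly.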

  1. The U.S. Culture Collection Network Responding to the Requirements of the Nagoya Protocol on Access and Benefit Sharing

    PubMed Central

    Barker, Katharine B.; Barton, Hazel A.; Boundy-Mills, Kyria; Brown, Daniel R.; Coddington, Jonathan A.; Cook, Kevin; Desmeth, Philippe; Geiser, David; Glaeser, Jessie A.; Greene, Stephanie; Kang, Seogchan; Lomas, Michael W.; Melcher, Ulrich; Miller, Scott E.; Nobles, David R.; Owens, Kristina J.; Reichman, Jerome H.; da Silva, Manuela; Wertz, John; Whitworth, Cale; Smith, David

    2017-01-01

    The U.S. Culture Collection Network held a meeting to share information about how culture collections are responding to the requirements of the recently enacted Nagoya Protocol on Access to Genetic Resources and the Fair and Equitable Sharing of Benefits Arising from their Utilization to the Convention on Biological Diversity (CBD). The meeting included representatives of many culture collections and other biological collections, the U.S. Department of State, U.S. Department of Agriculture, Secretariat of the CBD, interested scientific societies, and collection groups, including Scientific Collections International and the Global Genome Biodiversity Network. The participants learned about the policies of the United States and other countries regarding access to genetic resources, the definition of genetic resources, and the status of historical materials and genetic sequence information. Key topics included what constitutes access and how the CBD Access and Benefit-Sharing Clearing-House can help guide researchers through the process of obtaining Prior Informed Consent on Mutually Agreed Terms. U.S. scientists and their international collaborators are required to follow the regulations of other countries when working with microbes originally isolated outside the United States, and the local regulations required by the Nagoya Protocol vary by the country of origin of the genetic resource. Managers of diverse living collections in the United States described their holdings and their efforts to provide access to genetic resources. This meeting laid the foundation for cooperation in establishing a set of standard operating procedures for U.S. and international culture collections in response to the Nagoya Protocol. PMID:28811341

  2. The U.S. Culture Collection Network Responding to the Requirements of the Nagoya Protocol on Access and Benefit Sharing.

    PubMed

    McCluskey, Kevin; Barker, Katharine B; Barton, Hazel A; Boundy-Mills, Kyria; Brown, Daniel R; Coddington, Jonathan A; Cook, Kevin; Desmeth, Philippe; Geiser, David; Glaeser, Jessie A; Greene, Stephanie; Kang, Seogchan; Lomas, Michael W; Melcher, Ulrich; Miller, Scott E; Nobles, David R; Owens, Kristina J; Reichman, Jerome H; da Silva, Manuela; Wertz, John; Whitworth, Cale; Smith, David

    2017-08-15

    The U.S. Culture Collection Network held a meeting to share information about how culture collections are responding to the requirements of the recently enacted Nagoya Protocol on Access to Genetic Resources and the Fair and Equitable Sharing of Benefits Arising from their Utilization to the Convention on Biological Diversity (CBD). The meeting included representatives of many culture collections and other biological collections, the U.S. Department of State, U.S. Department of Agriculture, Secretariat of the CBD, interested scientific societies, and collection groups, including Scientific Collections International and the Global Genome Biodiversity Network. The participants learned about the policies of the United States and other countries regarding access to genetic resources, the definition of genetic resources, and the status of historical materials and genetic sequence information. Key topics included what constitutes access and how the CBD Access and Benefit-Sharing Clearing-House can help guide researchers through the process of obtaining Prior Informed Consent on Mutually Agreed Terms. U.S. scientists and their international collaborators are required to follow the regulations of other countries when working with microbes originally isolated outside the United States, and the local regulations required by the Nagoya Protocol vary by the country of origin of the genetic resource. Managers of diverse living collections in the United States described their holdings and their efforts to provide access to genetic resources. This meeting laid the foundation for cooperation in establishing a set of standard operating procedures for U.S. and international culture collections in response to the Nagoya Protocol.

  3. Adaptive hybrid optimal quantum control for imprecisely characterized systems.

    PubMed

    Egger, D J; Wilhelm, F K

    2014-06-20

    Optimal quantum control theory holds great promise for quantum technology. Its experimental application, however, is often hindered by imprecise knowledge of the input variables, the quantum system's parameters. We show how to overcome this by adaptive hybrid optimal control, using a protocol named Ad-HOC. This protocol combines open- and closed-loop optimal control by first performing a gradient search towards a near-optimal control pulse and then an experimental fidelity estimation with a gradient-free method. For typical settings in solid-state quantum information processing, adaptive hybrid optimal control enhances gate fidelities by an order of magnitude, making optimal control theory applicable and useful.
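    The closed-loop stage of such a hybrid scheme can be sketched with a toy gradient-free search: a simple shrinking-step pattern search (an illustrative stand-in, not the authors' actual method) refining a pulse parameter against a simulated fidelity measurement:

```python
# Toy sketch of the closed-loop stage of a hybrid scheme like Ad-HOC:
# starting from a near-optimal pulse parameter (as the open-loop gradient
# search would supply), refine it with a gradient-free pattern search
# against a "measured" fidelity. The quadratic fidelity and its optimum
# are stand-ins, not a real experiment.

TRUE_OPT = 0.731                  # unknown system parameter (toy assumption)

def measured_fidelity(p):
    return 1.0 - (p - TRUE_OPT) ** 2

def refine(p, step=0.1, shrink=0.5, iters=40):
    for _ in range(iters):
        candidates = (p - step, p, p + step)
        best = max(candidates, key=measured_fidelity)
        if best == p:
            step *= shrink        # no improvement: tighten the search
        p = best
    return p

p = refine(0.5)                   # near-optimal seed from the open loop
assert measured_fidelity(p) > 0.999
```

    The key point the abstract makes survives even in this toy: the closed loop needs only fidelity values, not an accurate model of the system's parameters.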

  4. Single-shot secure quantum network coding on butterfly network with free public communication

    NASA Astrophysics Data System (ADS)

    Owari, Masaki; Kato, Go; Hayashi, Masahito

    2018-01-01

    Quantum network coding on the butterfly network has been studied as a typical example of a quantum multiple-cast network. We propose a secure quantum network code for the butterfly network with free public classical communication in the multiple-unicast setting, under restricted eavesdropper power. This protocol transmits quantum states with certainty when there is no attack. We also show secrecy, with shared randomness as an additional resource, when the eavesdropper wiretaps one of the channels in the butterfly network and also obtains the information sent through public classical communication. Our protocol does not require a verification process, which ensures single-shot security.
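    As background, the classical butterfly network illustrates the coding structure such protocols build on: the bottleneck node forwards the XOR of the two source bits, and each sink recovers both bits. A minimal sketch:

```python
# Classical warm-up for the butterfly network: two sources send bits b1 and
# b2; the bottleneck edge carries only b1 XOR b2, yet each sink recovers
# both bits by combining the coded bit with the bit it sees directly.

def butterfly(b1, b2):
    coded = b1 ^ b2                 # bottleneck edge carries the XOR
    sink1 = (b1, coded ^ b1)        # sees b1 directly, recovers b2
    sink2 = (coded ^ b2, b2)        # sees b2 directly, recovers b1
    return sink1, sink2

# Every input pair is delivered to both sinks despite the shared bottleneck.
for b1 in (0, 1):
    for b2 in (0, 1):
        assert butterfly(b1, b2) == ((b1, b2), (b1, b2))
```

    The quantum setting replaces bits with quantum states and adds the secrecy requirements described in the abstract; this sketch only shows the underlying routing idea.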

  5. Focus on Quantum Memory

    NASA Astrophysics Data System (ADS)

    Brennen, Gavin; Giacobino, Elisabeth; Simon, Christoph

    2015-05-01

    Quantum memories are essential for quantum information processing and long-distance quantum communication. The field has recently seen a lot of progress, and the present focus issue offers a glimpse of these developments, showing both experimental and theoretical results from many of the leading groups around the world. On the experimental side, it shows work on cold gases, warm vapors, rare-earth ion doped crystals and single atoms. On the theoretical side there are in-depth studies of existing memory protocols, proposals for new protocols including approaches based on quantum error correction, and proposals for new applications of quantum storage. Looking forward, we anticipate many more exciting results in this area.

  6. Quantum Private Comparison of Equality Based on Five-Particle Cluster State

    NASA Astrophysics Data System (ADS)

    Chang, Yan; Zhang, Wen-Bo; Zhang, Shi-Bin; Wang, Hai-Chun; Yan, Li-Li; Han, Gui-Hua; Sheng, Zhi-Wei; Huang, Yuan-Yuan; Suo, Wang; Xiong, Jin-Xin

    2016-12-01

    A protocol for quantum private comparison of equality (QPCE) is proposed based on a five-particle cluster state, with the help of a semi-honest third party (TP). In our protocol, TP is allowed to misbehave on its own but cannot conspire with either of the two parties. Compared with most two-user QPCE protocols, our protocol can compare not only two groups of private information (each group having two users) in one execution, but also just two pieces of private information. Compared with the previously proposed multi-user QPCE protocol, our protocol is safer, with more reasonable assumptions about TP. The qubit efficiency is computed and analyzed. Our protocol can also be generalized to the case of 2N participants with one TP. The 2N-participant protocol can compare two groups (each group having N pieces of private information) in one execution, or just N pieces of private information. Supported by NSFC under Grant Nos. 61402058, 61572086, the Fund for Middle and Young Academic Leaders of CUIT under Grant No. J201511, the Science and Technology Support Project of Sichuan Province of China under Grant No. 2013GZX0137, the Fund for Young Persons Project of Sichuan Province of China under Grant No. 12ZB017, and the Foundation of Cyberspace Security Key Laboratory of Sichuan Higher Education Institutions under Grant No. szjj2014-074

  7. Shortcomings of protocols of drug trials in relation to sponsorship as identified by Research Ethics Committees: analysis of comments raised during ethical review.

    PubMed

    van Lent, Marlies; Rongen, Gerard A; Out, Henk J

    2014-12-10

    Submission of study protocols to research ethics committees (RECs) constitutes one of the earliest stages at which planned trials are documented in detail. Previous studies have investigated the amendments requested from researchers by RECs, but the type of issues raised during REC review have not been compared by sponsor type. The objective of this study was to identify recurring shortcomings in protocols of drug trials based on REC comments and to assess whether these were more common among industry-sponsored or non-industry trials. Retrospective analysis of 226 protocols of drug trials approved in 2010-2011 by three RECs affiliated to academic medical centres in The Netherlands. For each protocol, information on sponsorship, number of participating centres, participating countries, study phase, registration status of the study drug, and type and number of subjects was retrieved. REC comments were extracted from decision letters sent to investigators after review and were classified using a predefined checklist that was based on legislation and guidelines on clinical drug research and previous literature. Most protocols received comments regarding participant information and consent forms (n = 182, 80.5%), methodology and statistical analyses (n = 160, 70.8%), and supporting documentation, including trial agreements and certificates of insurance (n = 154, 68.1%). Of the submitted protocols, 122 (54.0%) were non-industry and 104 (46.0%) were industry-sponsored trials. Non-industry trials more often received comments on subject selection (n = 44, 36.1%) than industry-sponsored trials (n = 18, 17.3%; RR, 1.58; 95% CI, 1.01 to 2.47), and on methodology and statistical analyses (n = 95, 77.9% versus n = 65, 62.5%, respectively; RR, 1.18; 95% CI, 1.01 to 1.37). Non-industry trials less often received comments on supporting documentation (n = 72, 59.0%) than industry-sponsored trials (n = 82, 78.8%; RR, 0.83; 95% CI, 0.72 to 0.95). 
RECs identified important ethical and methodological shortcomings in protocols of both industry-sponsored and non-industry drug trials. Investigators, especially of non-industry trials, should better prepare their research protocols in order to facilitate the ethical review process.
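    The relative risks quoted above follow the standard log-scale formula for an RR and its 95% CI; the sketch below uses illustrative counts, not the paper's data:

```python
# Sketch of a relative risk (RR) with a 95% CI via the standard log-scale
# formula, using illustrative counts (not an attempt to reproduce the
# paper's figures).

import math

def relative_risk(a, n1, b, n2):
    """a/n1 events in group 1, b/n2 events in group 2."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)   # SE of ln(RR)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

rr, lo, hi = relative_risk(30, 100, 20, 100)
# RR = 1.5; the CI spans roughly 0.92 to 2.46, so it crosses 1
```

    A CI that excludes 1, as in the comparisons reported above, indicates that the difference between sponsor types is unlikely to be due to chance alone.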

  8. Protocol Coordinator | Center for Cancer Research

    Cancer.gov

    PROGRAM DESCRIPTION Within the Leidos Biomedical Research Inc.’s Clinical Research Directorate, the Clinical Monitoring Research Program (CMRP) provides high-quality comprehensive and strategic operational support to the high-profile domestic and international clinical research initiatives of the National Cancer Institute (NCI), National Institute of Allergy and Infectious Diseases (NIAID), Clinical Center (CC), National Heart, Lung, and Blood Institute (NHLBI), National Institute of Arthritis and Musculoskeletal and Skin Diseases (NIAMS), National Center for Advancing Translational Sciences (NCATS), National Institute of Neurological Disorders and Stroke (NINDS), and the National Institute of Mental Health (NIMH). Since its inception in 2001, CMRP’s ability to provide rapid responses and high-quality solutions, and to recruit and retain experts with a variety of backgrounds to meet the growing research portfolios of NCI, NIAID, CC, NHLBI, NIAMS, NCATS, NINDS, and NIMH, has led to the considerable expansion of the program and its repertoire of support services. CMRP’s support services are strategically aligned with the program’s mission to provide comprehensive, dedicated support to assist National Institutes of Health researchers in providing the highest quality of clinical research in compliance with applicable regulations and guidelines, maintaining data integrity, and protecting human subjects. For the scientific advancement of clinical research, CMRP services include comprehensive clinical trials, regulatory, pharmacovigilance, protocol navigation and development, and programmatic and project management support for facilitating the conduct of 400+ Phase I, II, and III domestic and international trials on a yearly basis. 
These trials investigate the prevention, diagnosis, treatment of, and therapies for cancer, influenza, HIV, and other infectious diseases and viruses such as hepatitis C, tuberculosis, malaria, and Ebola virus; heart, lung, and blood diseases and conditions; parasitic infections; rheumatic and inflammatory diseases; and rare and neglected diseases.  CMRP’s collaborative approach to clinical research and the expertise and dedication of staff to the continuation and success of the program’s mission has contributed to improving the overall standards of public health on a global scale. The Clinical Monitoring Research Program (CMRP) provides comprehensive clinical and administrative support to the National Cancer Institute’s Center for Cancer Research’s (CCR), Office of Regulatory Affairs for protocol development review, regulatory review, and the implementation process as well as oversees medical writing/editing, regulatory/ compliance, and protocol coordination/navigation and administration. 
KEY ROLES/RESPONSIBILITIES - THIS POSITION IS CONTINGENT UPON FUNDING APPROVAL The Protocol Coordinator II: Provides programmatic and logistical support for the operations of clinical research for Phase I and Phase II clinical trials. Provides deployment of clinical support services for clinical research. Streamlines the protocol development timeline. Provides data and document collection and compilation for regulatory filing with the FDA and other regulatory authorities. Provides administrative coordination and general logistical support for regulatory activities. Ensures the provision of training for investigators and associate staff to reinforce and enhance a GCP culture. Provides quality assurance and quality control oversight. Performs regulatory review of clinical protocols, informed consent and other clinical documents. Tracks and facilitates a portfolio of protocols through each process step (IRB, RAC, DSMB, Office of Protocol Services). Assists clinical investigators in preparing clinical research protocols, including writing and formatting protocol documents and consent forms. Prepares protocol packages for review and ensures that protocol packages include all of the required material and comply with CCR, NCI and NIH policies. Collaborates with investigators to resolve any protocol/data issues. Coordinates submission of protocols for scientific and ethical review by the Branch scientific review committees, the NCI Institutional Review Board (IRB) and the clinical trial sponsor or the FDA. Monitors the review process and maintains detailed, complete and accurate records for each protocol of the approvals at the various stages of the review process, including new protocol submissions, amendments to protocols, and continuing reviews, as well as other submissions such as adverse events. Attends and prepares minutes for the Branch Protocol Review Committees. For protocols that are performed with other research centers: contacts coordinators at other centers to obtain review committee approvals at these centers, maintains records of these approvals at the outside centers in the protocol files, and sends protocol amendments and other reports to the participating centers. Maintains a schedule of all review committee submission deadline dates and meeting dates. Assists clinical investigators in understanding and complying with the entire review process. Works closely with the NCI Protocol Review Office in establishing and maintaining a paperless automated document management and tracking system for NCI protocols. Converts protocols from Word format to PDF with bookmarks. Maintains the PDF version of the most current approved version of each active clinical protocol on a central server. This position has the option to be located in Frederick or Rockville, Maryland.

  9. Design and implementation of a smart card based healthcare information system.

    PubMed

    Kardas, Geylani; Tunali, E Turhan

    2006-01-01

Smart cards are used in information technology as portable integrated devices with data storage and data processing capabilities. As in other fields, smart card use in health systems has become popular due to their increased capacity and performance, and their efficient, easy and fast data access makes them particularly widespread in security systems. In this paper, a smart card based healthcare information system is developed. The system uses smart cards for personal identification and transfer of health data, and provides data communication via a distributed protocol developed specifically for this study. Two smart card software modules are implemented that run on patient and healthcare professional smart cards, respectively. In addition to personal information, general health information about the patient is also loaded onto the patient smart card. Healthcare providers use their own smart cards to be authenticated on the system and to access data on patient cards. Encryption keys and digital signature keys stored on the system's smart cards are used for secure and authenticated data communication between clients and database servers over a distributed object protocol. The system is developed on the Java platform using object-oriented architecture and design patterns.
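
The authentication step the abstract describes (a provider card proving itself to the system before reading patient-card data) can be sketched as a generic keyed challenge-response. This is a minimal illustration only; the paper's actual protocol uses Java smart card modules and signature keys, and the names and HMAC construction below are our own assumptions.

```python
import hashlib
import hmac
import os

CARD_KEY = os.urandom(32)              # secret stored on the professional's card
SERVER_KEYS = {"hcp-0042": CARD_KEY}   # server's synchronized copy, by card ID

def card_respond(key: bytes, challenge: bytes) -> bytes:
    """Card side: prove key possession by MACing the server's random challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def server_verify(card_id: str, challenge: bytes, response: bytes) -> bool:
    """Server side: recompute the MAC and compare in constant time."""
    expected = hmac.new(SERVER_KEYS[card_id], challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)                    # server -> card
response = card_respond(CARD_KEY, challenge)  # card -> server
assert server_verify("hcp-0042", challenge, response)
```

Because each challenge is fresh and random, a recorded response is useless for a later session, which is the property that lets the card authorize access to patient data.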

  10. An integrated healthcare information system for end-to-end standardized exchange and homogeneous management of digital ECG formats.

    PubMed

    Trigo, Jesús Daniel; Martínez, Ignacio; Alesanco, Alvaro; Kollmann, Alexander; Escayola, Javier; Hayn, Dieter; Schreier, Günter; García, José

    2012-07-01

This paper investigates the application of the enterprise information system (EIS) paradigm to standardized cardiovascular condition monitoring. There are many specifications in cardiology, particularly in the ECG standardization arena. The existence of ECG formats, however, does not guarantee the implementation of homogeneous, standardized solutions for ECG management. In fact, hospital management services need to cope with various ECG formats and, moreover, several different visualization applications. This heterogeneity hampers the normalization of integrated, standardized healthcare information systems, hence the need for finding an appropriate combination of ECG formats and a suitable EIS-based software architecture that enables standardized exchange and homogeneous management of ECG formats. Determining such a combination is one objective of this paper. The second aim is to design and develop the integrated healthcare information system that satisfies the requirements posed by the previous determination. The ECG formats selected include ISO/IEEE11073, the Standard Communications Protocol for Computer-Assisted Electrocardiography, and an ECG ontology. The EIS-enabling techniques and technologies selected include web services, the Simple Object Access Protocol (SOAP), the Extensible Markup Language (XML), and the Business Process Execution Language (BPEL). Such a selection ensures the standardized exchange of ECGs within, or across, healthcare information systems while providing modularity and accessibility.

  11. Agents Based e-Commerce and Securing Exchanged Information

    NASA Astrophysics Data System (ADS)

    Al-Jaljouli, Raja; Abawajy, Jemal

Mobile agents have been implemented in e-Commerce to search and filter information of interest from electronic markets. When the information is very sensitive and critical, it is important to develop a novel security protocol that can efficiently protect the information from malicious tampering as well as unauthorized disclosure, or at least detect any malicious act of intruders. In this chapter, we describe robust security techniques that ensure sound security of information gathered throughout the agent’s itinerary against various security attacks, as well as truncation attacks. A sound security protocol is described, which implements the various security techniques that would jointly prevent, or at least detect, any malicious act of intruders. We reason about the soundness of the protocol using the Symbolic Trace Analyzer (STA), a formal verification tool that is based on symbolic techniques. We analyze the protocol in key configurations and show that it is free of flaws. We also show that the protocol fulfils the various security requirements of exchanged information in MAS, including data-integrity, data-confidentiality, data-authenticity, origin confidentiality and data non-repudiability.

  12. Using Fluorescent Reporters to Monitor Autophagy in the Female Germline Cells in Drosophila melanogaster.

    PubMed

    Jacomin, Anne-Claire; Nezis, Ioannis P

    2016-01-01

    Oogenesis is a fundamental biological process for the transmission of genetic information to the next generations. Drosophila has proven to be a valuable model for elucidating the molecular and cellular mechanisms involved in this developmental process. It has been shown that autophagy participates in the maturation of the egg chamber. Here we provide a protocol for monitoring and quantification of the autophagic process in the Drosophila germline cells using the fluorescent reporters mCherry-DmAtg8a and GFP-mCherry-DmAtg8a.

  13. Practical quantum appointment scheduling

    NASA Astrophysics Data System (ADS)

    Touchette, Dave; Lovitz, Benjamin; Lütkenhaus, Norbert

    2018-04-01

    We propose a protocol based on coherent states and linear optics operations for solving the appointment-scheduling problem. Our main protocol leaks strictly less information about each party's input than the optimal classical protocol, even when considering experimental errors. Along with the ability to generate constant-amplitude coherent states over two modes, this protocol requires the ability to transfer these modes back-and-forth between the two parties multiple times with very low losses. The implementation requirements are thus still challenging. Along the way, we develop tools to study quantum information cost of interactive protocols in the finite regime.

  14. Robust quantum entanglement generation and generation-plus-storage protocols with spin chains

    NASA Astrophysics Data System (ADS)

    Estarellas, Marta P.; D'Amico, Irene; Spiller, Timothy P.

    2017-04-01

Reliable quantum communication and/or processing links between modules are a necessary building block for various quantum processing architectures. Here we consider a spin-chain system with alternating strength couplings and containing three defects, which impose three domain walls between topologically distinct regions of the chain. We show that—in addition to its useful, high-fidelity, quantum state transfer properties—an entangling protocol can be implemented in this system, with optional localization and storage of the entangled states. We demonstrate both numerically and analytically that, given a suitable initial product-state injection, the natural dynamics of the system produces a maximally entangled state at a given time. We present detailed investigations of the effects of fabrication errors, analyzing random static disorder both in the diagonal and off-diagonal terms of the system Hamiltonian. Our results show that the entangled state formation is very robust against perturbations of up to ~10% of the weaker chain coupling, and also robust against timing injection errors. We propose a further protocol, which manipulates the chain in order to localize and store each of the entangled qubits. The engineering of a system with such characteristics would thus provide a useful device for quantum information processing tasks involving the creation and storage of entangled resources.

  15. Technical Assessment: Integrated Photonics

    DTIC Science & Technology

    2015-10-01

    in global internet protocol traffic as a function of time by local access technology. Photonics continues to play a critical role in enabling this...communication networks. This has enabled services like the internet , high performance computing, and power-efficient large-scale data centers. The...signal processing, quantum information science, and optics for free space applications. However major obstacles challenge the implementation of

  16. Data Integration in Computer Distributed Systems

    NASA Astrophysics Data System (ADS)

    Kwiecień, Błażej

In this article the author analyzes the problem of data integration in computer distributed systems. Exchange of information between different levels of the integrated pyramid of enterprise processes is fundamental to efficient enterprise operation. Communication and data exchange between levels are not always uniform because of the need for different network protocols, communication media, system response times, etc.

  17. Signaling Task Awareness in Think-Aloud Protocols from Students Selecting Relevant Information from Text

    ERIC Educational Resources Information Center

    Schellings, Gonny L. M.; Broekkamp, Hein

    2011-01-01

    Self-regulated learning has been described as an adaptive process: students adapt their learning strategies for attaining different learning goals. In order to be adaptive, students must have a clear notion of what the task requirements consist of. Both trace data and questionnaire data indicate that students adapt study strategies in limited ways…

  18. Threshold Things That Think: Authorisation for Resharing

    NASA Astrophysics Data System (ADS)

    Peeters, Roel; Kohlweiss, Markulf; Preneel, Bart

As we are evolving towards ubiquitous computing, users carry an increasing number of mobile devices with sensitive information. The security of this information can be protected using threshold cryptography, in which secret computations are shared between multiple devices. Threshold cryptography can be made more robust by resharing protocols, which allow recovery from partial compromises. This paper introduces user-friendly and secure protocols for the authorisation of resharing protocols. We present both automatic and manual protocols, utilising a group manual authentication protocol to add a new device. We analyse the security of these protocols: our analysis considers permanent and temporary compromises, denial of service attacks and manual authentication errors of the user.
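
The threshold cryptography this abstract builds on can be illustrated by its simplest instance, Shamir's (t, n) secret sharing: a secret split across n devices so that any t of them recover it, while fewer learn nothing. The sketch below is our own toy illustration over a small prime field, not the paper's resharing protocol.

```python
import random

P = 2**61 - 1  # a Mersenne prime; arithmetic is done in the field GF(P)

def split(secret: int, t: int, n: int):
    """Split `secret` into n shares; any t of them reconstruct it."""
    # Random degree-(t-1) polynomial with the secret as constant term.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = split(123456789, t=3, n=5)
assert reconstruct(shares[:3]) == 123456789  # any 3 of the 5 shares suffice
```

Resharing, as studied in the paper, replaces an old share set with a fresh one for the same secret, which is why authorising that operation safely matters.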

  19. A Secure RFID Tag Authentication Protocol with Privacy Preserving in Telecare Medicine Information System.

    PubMed

    Li, Chun-Ta; Weng, Chi-Yao; Lee, Cheng-Chi

    2015-08-01

Radio Frequency Identification (RFID) based solutions are widely used for providing many healthcare applications, including patient monitoring, object traceability, drug administration systems and telecare medicine information systems (TMIS). In order to reduce malpractice and ensure patient privacy, in 2015, Srivastava et al. proposed a hash based RFID tag authentication protocol in TMIS. Their protocol uses a lightweight hash operation and a synchronized secret value shared between the back-end server and the tag, which is more secure and efficient than other related RFID authentication protocols. Unfortunately, in this paper, we demonstrate that Srivastava et al.'s tag authentication protocol has a serious security problem: an adversary may use a stolen/lost reader to connect to the medical back-end server that stores information associated with tagged objects, and this privacy damage means the adversary could reveal medical data obtained from stolen/lost readers in a malicious way. Therefore, we propose a secure and efficient RFID tag authentication protocol to overcome these security flaws and improve system efficiency. Compared with Srivastava et al.'s protocol, the proposed protocol not only inherits the advantages of Srivastava et al.'s authentication protocol for TMIS but also provides better security with high system efficiency.

  20. The Remote Maxwell Demon as Energy Down-Converter

    NASA Astrophysics Data System (ADS)

    Hossenfelder, S.

    2016-04-01

    It is demonstrated that Maxwell's demon can be used to allow a machine to extract energy from a heat bath by use of information that is processed by the demon at a remote location. The model proposed here effectively replaces transmission of energy by transmission of information. For that we use a feedback protocol that enables a net gain by stimulating emission in selected fluctuations around thermal equilibrium. We estimate the down conversion rate and the efficiency of energy extraction from the heat bath.

  1. Alert Exchange Process Protocol

    NASA Technical Reports Server (NTRS)

    Groen, Frank

    2015-01-01

The National Aeronautics and Space Administration of the United States of America (NASA), the European Space Agency (ESA), and the Japanese Aerospace Exploration Agency (JAXA) acknowledge that they have a mutual interest in exchanging Alerts and Alert Status Lists to enhance the information base for each system participant while fortifying the general level of cooperation between the policy agreement subscribers. Each Party will exchange Alert listings on a regular basis and detailed Alert information on a need-to-know basis to the extent permitted by law.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ishihara, T

Currently, the problem at hand is in distributing identical copies of OEP and filter software to a large number of farm nodes. One of the common methods used to transfer this software is unicast. The unicast protocol faces the problem of repetitiously sending the same data over the network; since the sending rate is limited, this process becomes a bottleneck. Therefore, one possible solution to this problem lies in creating a reliable multicast protocol. A specific type of multicast protocol is the Bulk Multicast Protocol [4]. This system consists of one sender distributing data to many receivers. The sender delivers data at a given rate of data packets. In response, each receiver replies to the sender with a status packet which contains information about packet loss in the form of Negative Acknowledgments. The probability that a status packet is sent back to the sender is 1/N, where N is the number of receivers, so the protocol is designed to have approximately 1 status packet for each data packet sent. In this project, we were able to show that the time taken for the complete transfer of a file to multiple receivers was about 12 times faster with multicast than with unicast. The implementation of this experimental protocol shows remarkable improvement in mass data transfer to a large number of farm machines.
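
The feedback-suppression idea in this record (each of N receivers returns a status packet with probability 1/N, so the sender sees roughly one status packet per data packet regardless of N) can be checked with a short simulation. The parameters below are our own illustrative choices, not values from the project.

```python
import random

def avg_status_packets(n_receivers: int, n_data_packets: int, seed: int = 1) -> float:
    """Average number of status packets per data packet, by simulation."""
    rng = random.Random(seed)
    total = sum(
        1
        for _ in range(n_data_packets)    # for each data packet sent...
        for _ in range(n_receivers)       # ...each receiver independently
        if rng.random() < 1.0 / n_receivers  # replies with probability 1/N
    )
    return total / n_data_packets

# The feedback load stays near 1 whether there are 10 or 1000 receivers:
feedback = {n: avg_status_packets(n, n_data_packets=2000) for n in (10, 100, 1000)}
```

This constant expected feedback is what keeps the sender from being flooded with acknowledgments as the number of farm nodes grows.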

  3. Protocol Coordinator | Center for Cancer Research

    Cancer.gov

PROGRAM DESCRIPTION Within the Leidos Biomedical Research Inc.’s Clinical Research Directorate, the Clinical Monitoring Research Program (CMRP) provides high-quality comprehensive and strategic operational support to the high-profile domestic and international clinical research initiatives of the National Cancer Institute (NCI), National Institute of Allergy and Infectious Diseases (NIAID), Clinical Center (CC), National Heart, Lung, and Blood Institute (NHLBI), National Institute of Arthritis and Musculoskeletal and Skin Diseases (NIAMS), National Center for Advancing Translational Sciences (NCATS), National Institute of Neurological Disorders and Stroke (NINDS), and the National Institute of Mental Health (NIMH). Since its inception in 2001, CMRP’s ability to provide rapid responses, high-quality solutions, and to recruit and retain experts with a variety of backgrounds to meet the growing research portfolios of NCI, NIAID, CC, NHLBI, NIAMS, NCATS, NINDS, and NIMH has led to the considerable expansion of the program and its repertoire of support services. CMRP’s support services are strategically aligned with the program’s mission to provide comprehensive, dedicated support to assist National Institutes of Health researchers in providing the highest quality of clinical research in compliance with applicable regulations and guidelines, maintaining data integrity, and protecting human subjects. For the scientific advancement of clinical research, CMRP services include comprehensive clinical trials, regulatory, pharmacovigilance, protocol navigation and development, and programmatic and project management support for facilitating the conduct of 400+ Phase I, II, and III domestic and international trials on a yearly basis.
These trials investigate the prevention, diagnosis, treatment of, and therapies for cancer, influenza, HIV, and other infectious diseases and viruses such as hepatitis C, tuberculosis, malaria, and Ebola virus; heart, lung, and blood diseases and conditions; parasitic infections; rheumatic and inflammatory diseases; and rare and neglected diseases. CMRP’s collaborative approach to clinical research and the expertise and dedication of staff to the continuation and success of the program’s mission has contributed to improving the overall standards of public health on a global scale. The Clinical Monitoring Research Program (CMRP) provides comprehensive clinical and administrative support to the National Cancer Institute’s Center for Cancer Research’s (CCR) Protocol Support Office (PSO) for protocol development review, regulatory review, and the implementation process as well as oversees medical writing/editing, regulatory/compliance, and protocol coordination/navigation and administration. KEY ROLES/RESPONSIBILITIES The Protocol Coordinator III: Provides programmatic and logistical support for the operations of clinical research for Phase I and Phase II clinical trials. Provides deployment of clinical support services for clinical research. Streamlines the protocol development timeline. Provides data and document collection and compilation for regulatory filing with the U.S. Food and Drug Administration (FDA) and other regulatory authorities. Provides technical review and report preparation. Provides administrative coordination and general logistical support for regulatory activities. Ensures the provision of training for investigators and associate staff to reinforce and enhance a Good Clinical Practices (GCP) culture. Oversees quality assurance and quality control, performs regulatory review of clinical protocols, informed consent and other clinical documents.
Tracks and facilitates a portfolio of protocols through each process step (Institutional Review Board [IRB], Regulatory Affairs Compliance [RAC], Data Safety Monitoring Board [DSMB], Office of Protocol Services). Assists clinical investigators in preparing clinical research protocols, including writing and formatting consent forms. Prepares protocol packages for review and ensures that protocol packages include all of the required material and comply with CCR, NCI and NIH policies. Collaborates with investigators to resolve any protocol/data issues. Coordinates submission of protocols for scientific and ethical review by the Branch scientific review committees, the NCI IRB, and the clinical trial sponsor or the FDA. Monitors the review process and maintains detailed, complete and accurate records for each protocol of the approvals at the various stages of the review process, including new protocol submissions, amendments to protocols, and continuing reviews, as well as other submissions such as adverse events. Attends and prepares minutes for the Branch Protocol Review Committees. Contacts coordinators at other centers for protocols that are performed there to obtain review committee approvals at those centers, maintains records of these approvals and sends protocol amendments and other reports to the participating centers. Maintains a schedule of all review committee submission deadline dates and meeting dates. Assists clinical investigators in understanding and complying with the entire review process. Works closely with the NCI Protocol Review Office in establishing and maintaining a paperless automated document and tracking system for NCI protocols. Converts protocols from Word format to .pdf with bookmarks. Maintains the .pdf version of the most current approved version of each active clinical protocol on a central server. This position is located in Rockville, Maryland.

  4. Integrating biomedical and herbal medicine in Ghana - experiences from the Kumasi South Hospital: a qualitative study.

    PubMed

    Boateng, Millicent Addai; Danso-Appiah, Anthony; Turkson, Bernard Kofi; Tersbøl, Britt Pinkowski

    2016-07-07

Over the past decade there has been growing interest in the use of herbal medicine in both developed and developing countries. Given the high proportion of patients using herbal medicine in Ghana, some health facilities have initiated implementation of herbal medicine as a component of their healthcare delivery. However, the extent to which herbal medicine has been integrated in Ghanaian health facilities, and how integration is implemented and perceived by different stakeholders, has not been documented. The study sought to explore these critical issues at the Kumasi South Hospital (KSH) and outline the challenges and motivations of the integration process. A qualitative phenomenological exploratory study design involving fieldwork observations, a focus group discussion, in-depth interviews and key informant interviews was employed to collect data. Policies and protocols outlining the definition, process and goals of integration were lacking, with respondents sharing different views about the purpose and value of integration of herbal medicine within public health facilities. Key informants were supportive of the initiative. Whilst biomedical health workers perceived the system as parallel rather than integrated, health personnel providing herbal medicine perceived the system as integrated. Most patients were not aware of the herbal clinic in the hospital, but those who had utilized services of the herbal clinic viewed the clinic as part of the hospital. The lack of a regulatory policy and protocol for the integration seems to have led to these differing perceptions of the integration. A policy and protocol to guide the integration are key recommendations.

  5. Environmental Characteristics and Geographic Information System Applications for the Development of Nutrient Thresholds in Oklahoma Streams

    USGS Publications Warehouse

    Masoner, Jason R.; Haggard, Brian E.; Rea, Alan

    2002-01-01

The U.S. Environmental Protection Agency has developed nutrient criteria using ecoregions to manage and protect rivers and streams in the United States. Individual states and tribes are encouraged by the U.S. Environmental Protection Agency to modify or improve upon the ecoregion approach. The Oklahoma Water Resources Board uses a dichotomous process that stratifies streams using environmental characteristics such as stream order and stream slope. This process is called the Use Support Assessment Protocols, subchapter 15. The Use Support Assessment Protocols can be used to identify streams threatened by excessive amounts of nutrients, dependent upon a beneficial use designation for each stream. The Use Support Assessment Protocols, subchapter 15, uses nutrient and environmental characteristic thresholds developed from a study conducted in the Netherlands, but the Oklahoma Water Resources Board wants to modify the thresholds to reflect hydrologic and ecological conditions relevant to Oklahoma streams and rivers. Environmental characteristics thought to affect impairment from nutrient concentrations in Oklahoma streams and rivers were determined for 798 water-quality sites in Oklahoma. Nutrient, chlorophyll, water-properties, and location data were retrieved from the U.S. Environmental Protection Agency STORET database, including data from the U.S. Geological Survey, Oklahoma Conservation Commission, and Oklahoma Water Resources Board. Drainage-basin area, stream order, stream slope, and land-use proportions were determined for each site using a Geographic Information System. The methods, procedures, and data sets used to determine the environmental characteristics are described.

  6. Addressing the potential adverse effects of school-based BMI assessments on children's wellbeing.

    PubMed

    Gibbs, Lisa; O'Connor, Thea; Waters, Elizabeth; Booth, Michael; Walsh, Orla; Green, Julie; Bartlett, Jenny; Swinburn, Boyd

    2008-01-01

INTRODUCTION. Do child obesity prevention research and intervention measures have the potential to generate adverse concerns about body image by focussing on food, physical activity and body weight? Research findings now demonstrate the emergence of body image concerns in children as young as 5 years. In the context of a large school-community-based child health promotion and obesity prevention study, we aimed to address the potential negative effects of height and weight measures on child wellbeing by developing and implementing an evidence-informed protocol to protect against and prevent body image concerns. fun 'n healthy in Moreland! is a cluster randomised controlled trial of a child health promotion and obesity prevention intervention in 23 primary schools in an inner urban area of Melbourne, Australia. Body image considerations were incorporated into the study philosophies, aims, methods, staff training, language, data collection and reporting procedures of this study. This was informed by the published literature, professional body image expertise, and pilot testing, and was implemented in the conduct of baseline data collection and the intervention. This study is the first record of a body image protection protocol being an integral part of the research processes of a child obesity prevention study. Whilst we are yet to measure its impact and outcome, we have developed and tested a protocol based on the evidence and with support from stakeholders in order to minimise the adverse impact of study processes on child body image concerns.

  7. Bound entangled states with a private key and their classical counterpart.

    PubMed

    Ozols, Maris; Smith, Graeme; Smolin, John A

    2014-03-21

    Entanglement is a fundamental resource for quantum information processing. In its pure form, it allows quantum teleportation and sharing classical secrets. Realistic quantum states are noisy and their usefulness is only partially understood. Bound-entangled states are central to this question--they have no distillable entanglement, yet sometimes still have a private classical key. We present a construction of bound-entangled states with a private key based on classical probability distributions. From this emerge states possessing a new classical analogue of bound entanglement, distinct from the long-sought bound information. We also find states of smaller dimensions and higher key rates than previously known. Our construction has implications for classical cryptography: we show that existing protocols are insufficient for extracting private key from our distributions due to their "bound-entangled" nature. We propose a simple extension of existing protocols that can extract a key from them.

  8. Cycle-Triggered Cortical Stimulation during Slow Wave Sleep Facilitates Learning a BMI Task: A Case Report in a Non-Human Primate

    PubMed Central

    Rembado, Irene; Zanos, Stavros; Fetz, Eberhard E.

    2017-01-01

    Slow wave sleep (SWS) has been identified as the sleep stage involved in consolidating newly acquired information. A growing body of evidence has shown that delta (1–4 Hz) oscillatory activity, the characteristic electroencephalographic signature of SWS, is involved in coordinating interaction between the hippocampus and the neocortex and is thought to take a role in stabilizing memory traces related to a novel task. This case report describes a new protocol that uses neuroprosthetics training of a non-human primate to evaluate the effects of surface cortical electrical stimulation triggered from SWS cycles. The results suggest that stimulation phase-locked to SWS oscillatory activity promoted learning of the neuroprosthetic task. This protocol could be used to elucidate mechanisms of synaptic plasticity underlying off-line learning during sleep and offers new insights into the role of brain oscillations in information processing and memory consolidation. PMID:28450831

  9. The Quantum Steganography Protocol via Quantum Noisy Channels

    NASA Astrophysics Data System (ADS)

    Wei, Zhan-Hong; Chen, Xiu-Bo; Niu, Xin-Xin; Yang, Yi-Xian

    2015-08-01

    As a promising branch of quantum information hiding, quantum steganography aims to transmit secret messages covertly over public quantum channels. However, owing to environmental noise and decoherence, quantum states easily decay and change, so it is important for a quantum information hiding protocol to work over noisy quantum channels. In this paper, we further investigate a quantum steganography protocol for noisy quantum channels. We prove that the protocol can transmit secret messages covertly over a noisy quantum channel, and we describe the protocol explicitly. Without the cover data being published, legal receivers can extract the secret message with a certain probability, giving the protocol good secrecy. Moreover, the protocol has independent security and can be used in general quantum communications. Communication in the protocol does not require entangled states, so it can be used without the limitation of entanglement resources. Most importantly, because it works over noisy quantum channels, the protocol could find wide use in future quantum communication.

  10. CT protocol management: simplifying the process by using a master protocol concept

    PubMed Central

    Bour, Robert K.; Rubert, Nicholas; Wendt, Gary; Pozniak, Myron; Ranallo, Frank N.

    2015-01-01

    This article explains a method for creating CT protocols for a wide range of patient body sizes and clinical indications, using detailed tube current information from a small set of commonly used protocols. Analytical expressions were created relating CT technical acquisition parameters which can be used to create new CT protocols on a given scanner or customize protocols from one scanner to another. Plots of mA as a function of patient size for specific anatomical regions were generated and used to identify the tube output needs for patients as a function of size for a single master protocol. Tube output data were obtained from the DICOM header of clinical images from our PACS and patient size was measured from CT localizer radiographs under IRB approval. This master protocol was then used to create 11 additional master protocols. The 12 master protocols were further combined to create 39 single and multiphase clinical protocols. Radiologist acceptance rate of exams scanned using the clinical protocols was monitored for 12,857 patients to analyze the effectiveness of the presented protocol management methods using a two‐tailed Fisher's exact test. A single routine adult abdominal protocol was used as the master protocol to create 11 additional master abdominal protocols of varying dose and beam energy. Situations in which the maximum tube current would have been exceeded are presented, and the trade‐offs between increasing the effective tube output via 1) decreasing pitch, 2) increasing the scan time, or 3) increasing the kV are discussed. Out of 12 master protocols customized across three different scanners, only one had a statistically significant acceptance rate that differed from the scanner it was customized from. The difference, however, was only 1% and was judged to be negligible. All other master protocols differed in acceptance rate insignificantly between scanners. 
The methodology described in this paper allows a small set of master protocols to be adapted among different clinical indications on a single scanner and among different CT scanners. PACS number: 87.57.Q PMID:26219005
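
    The size-dependent tube-output lookup underlying the master-protocol approach can be sketched as follows. The reference curve, scanner tube-current limit, and function name below are hypothetical illustrations, not values from the study:

```python
import numpy as np

# Hypothetical reference curve of tube current vs. patient lateral width,
# standing in for the mA-vs-size data mined from DICOM headers; the numbers
# are illustrative only.
size_cm = np.array([20.0, 28.0, 36.0, 44.0])   # patient width (cm)
mA_ref  = np.array([80.0, 160.0, 320.0, 560.0])

MAX_MA = 500.0   # scanner tube-current limit (hypothetical)

def required_mA(width_cm, pitch=1.0):
    """Interpolated tube current for a patient of the given width.

    If the interpolated value exceeds the tube limit, effective output must
    instead be raised by lowering pitch, lengthening rotation time, or
    increasing kV, as discussed in the article.
    """
    mA = np.interp(width_cm, size_cm, mA_ref) / pitch
    return min(mA, MAX_MA), mA > MAX_MA   # (deliverable mA, limit exceeded?)

print(required_mA(32.0))   # mid-size patient: within limits
print(required_mA(44.0))   # large patient: tube limit exceeded
```

    For a large patient the second return value signals the trade-off situation the authors describe, where pitch, scan time, or kV must be adjusted instead.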

  11. The Effects of Training Contingency Awareness During Attention Bias Modification on Learning and Stress Reactivity.

    PubMed

    Lazarov, Amit; Abend, Rany; Seidner, Shiran; Pine, Daniel S; Bar-Haim, Yair

    2017-09-01

    Current attention bias modification (ABM) procedures are designed to implicitly train attention away from threatening stimuli with the hope of reducing stress reactivity and anxiety symptoms. However, the mechanisms underlying effective ABM delivery are not well understood, with awareness of the training contingency suggested as one possible factor contributing to ABM efficacy. Here, 45 high-anxious participants were trained to divert attention away from threat in two ABM sessions. They were randomly assigned to one of three training protocols: an implicit protocol, comprising two standard implicit ABM training sessions; an explicit protocol, comprising two sessions with explicit instruction as to the attention training contingency; and an implicit-explicit protocol, in which participants were not informed of the training contingency in the first ABM session and informed of it at the start of the second session. We examined learning processes and stress reactivity following a stress-induction task. Results indicate that relative to implicit instructions, explicit instructions led to stronger learning during the first training session. Following rest, the explicit and implicit groups exhibited consolidation-related improvement in performance, whereas no such improvement was noted for the implicit-explicit group. Finally, although stress reactivity was reduced after training, contingency awareness did not yield a differential effect on stress reactivity measured using both self-reports and skin conductance, within and across sessions. These results suggest that explicit ABM administration leads to greater initial learning during the training protocol while not differing from standard implicit administration in terms of off-line learning and stress reactivity. Copyright © 2017. Published by Elsevier Ltd.

  12. Joining the quantum state of two photons into one

    NASA Astrophysics Data System (ADS)

    Vitelli, Chiara; Spagnolo, Nicolò; Aparo, Lorenzo; Sciarrino, Fabio; Santamato, Enrico; Marrucci, Lorenzo

    2013-07-01

    Photons are the ideal carriers of quantum information for communication. Each photon can have a single or multiple qubits encoded in its internal quantum state, as defined by optical degrees of freedom such as polarization, wavelength, transverse modes and so on. However, as photons do not interact, multiplexing and demultiplexing the quantum information across photons has not been possible hitherto. Here, we introduce and demonstrate experimentally a physical process, named `quantum joining', in which the two-dimensional quantum states (qubits) of two input photons are combined into a single output photon, within a four-dimensional Hilbert space. The inverse process is also proposed, in which the four-dimensional quantum state of a single photon is split into two photons, each carrying a qubit. Both processes can be iterated, and hence provide a flexible quantum interconnect to bridge multiparticle protocols of quantum information with multidegree-of-freedom ones, with possible applications in future quantum networking.
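
    The Hilbert-space bookkeeping behind quantum joining can be illustrated numerically: the joint state of two qubits occupies the same four-dimensional space as one four-level photon, so "joining" maps a tensor-product state onto a single carrier. A minimal sketch (the amplitudes are illustrative; the actual photonic implementation is far more involved):

```python
import numpy as np

# Two single-photon qubit states (illustrative amplitudes).
psi = np.array([1.0, 1.0]) / np.sqrt(2)   # qubit 1: (|0> + |1>)/sqrt(2)
phi = np.array([1.0, 0.0])                # qubit 2: |0>

# The joint two-qubit state lives in the same 4-dimensional space as a
# single four-level photon (a "ququart").
joint = np.kron(psi, phi)   # amplitudes over |00>, |01>, |10>, |11>

print(joint)                                    # [0.707.. 0. 0.707.. 0.]
print(np.isclose(np.linalg.norm(joint), 1.0))   # normalization preserved
```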

  13. Assessment of an improved bone washing protocol for deceased donor human bone.

    PubMed

    Eagle, M J; Man, J; Rooney, P; Hogg, P; Kearney, J N

    2015-03-01

    NHSBT Tissue Services issues bone to surgeons in the UK in two formats, fresh-frozen unprocessed bone from living donors and processed bone from deceased donors. Processed bone may be frozen or freeze dried and all processed bone is currently subjected to a washing protocol to remove blood and bone marrow. In this study we have improved the current bone washing protocol for cancellous bone and assessed the success of the protocol by measuring the removal of the bone marrow components: soluble protein, DNA and haemoglobin at each step in the process, and residual components in the bone at the end of the process. The bone washing protocol is a combination of sonication, warm water washes, centrifugation and chemical (ethanol and hydrogen peroxide) treatments. We report that the bone washing protocol is capable of removing up to 99.85 % soluble protein, 99.95 % DNA and 100 % of haemoglobin from bone. The new bone washing protocol does not render any bone cytotoxic as shown by contact cytotoxicity assays. No microbiological cell growth was detected in any of the wash steps. This process is now in use for processed cancellous bone issued by NHSBT.

  14. Clinical Trials Management | Division of Cancer Prevention

    Cancer.gov

    Information for researchers about developing, reporting, and managing NCI-funded cancer prevention clinical trials. The Protocol Information Office is the central clearinghouse for clinical trials management within the Division of Cancer Prevention. Read more about the Protocol Information Office.

  15. Unification of quantum information theory

    NASA Astrophysics Data System (ADS)

    Abeyesinghe, Anura

    We present the unification of many previously disparate results in noisy quantum Shannon theory and the unification of all of noiseless quantum Shannon theory. More specifically we deal here with bipartite, unidirectional, and memoryless quantum Shannon theory. We find all the optimal protocols and quantify the relationship between the resources used, both for the one-shot and for the ensemble case, for what is arguably the most fundamental task in quantum information theory: sharing entangled states between a sender and a receiver. We find that all of these protocols are derived from our one-shot superdense coding protocol and relate nicely to each other. We then move on to noisy quantum information theory and give a simple, direct proof of the "mother" protocol, or rather her generalization to the Fully Quantum Slepian-Wolf protocol (FQSW). FQSW simultaneously accomplishes two goals: quantum communication-assisted entanglement distillation, and state transfer from the sender to the receiver. As a result, in addition to her other "children," the mother protocol generates the state merging primitive of Horodecki, Oppenheim, and Winter as well as a new class of distributed compression protocols for correlated quantum sources, which are optimal for sources described by separable density operators. Moreover, the mother protocol described here is easily transformed into the so-called "father" protocol, demonstrating that the division of single-sender/single-receiver protocols into two families was unnecessary: all protocols in the family are children of the mother.
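
    The one-shot superdense coding primitive on which this construction rests is easy to verify numerically: two classical bits are encoded by a local Pauli operation on one half of a Bell pair and recovered by a Bell-basis measurement. A minimal sketch of textbook superdense coding, not the paper's generalized protocol:

```python
import numpy as np

I = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

# Alice encodes two classical bits with a local Pauli on her qubit alone.
encode = {'00': I, '01': X, '10': Z, '11': X @ Z}

# Bob decodes by measuring in the Bell basis (overlap amplitudes).
bell_basis = {
    '00': np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2),
    '01': np.array([0.0, 1.0, 1.0, 0.0]) / np.sqrt(2),
    '10': np.array([1.0, 0.0, 0.0, -1.0]) / np.sqrt(2),
    '11': np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2),
}

def superdense(bits):
    state = np.kron(encode[bits], I) @ bell
    # The encoded state is, up to a global phase, exactly one Bell vector.
    return max(bell_basis, key=lambda b: abs(bell_basis[b] @ state))

for bits in encode:
    assert superdense(bits) == bits
print("all four 2-bit messages recovered from one transmitted qubit")
```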

  16. 48 CFR 3439.701 - Internet Protocol version 6.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... REGULATION SPECIAL CATEGORIES OF CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY Department Requirements for Acquisition of Information Technology 3439.701 Internet Protocol version 6. The contracting...

  17. Internet Protocol Implementation Guide.

    DTIC Science & Technology

    1982-08-01

    [OCR fragment] Internet Protocol Implementation Guide, Network Information Center, SRI International, Menlo Park, CA, August 1982. A recoverable passage concerns recorded return routes: "...this is more information than the receiving Internet module needs. The specified procedure is to take the return route recorded in the first..."

  18. Quantum protocols within Spekkens' toy model

    NASA Astrophysics Data System (ADS)

    Disilvestro, Leonardo; Markham, Damian

    2017-05-01

    Quantum mechanics is known to provide significant improvements in information processing tasks when compared to classical models. These advantages range from computational speedups to security improvements. A key question is where these advantages come from. The toy model developed by Spekkens [R. W. Spekkens, Phys. Rev. A 75, 032110 (2007), 10.1103/PhysRevA.75.032110] mimics many of the features of quantum mechanics regarded as being important in this regard, such as entanglement and no cloning, despite being a local hidden variable theory. In this work, we study several protocols within Spekkens' toy model and see that it can also mimic the advantages and limitations shown in the quantum case. We first provide explicit proofs for the impossibility of toy bit commitment and for the existence of a toy error correction protocol and consequent k-threshold secret sharing. Then, defining a toy computational model based on the quantum one-way computer, we prove the existence of blind and verified protocols. Importantly, these last two quantum protocols are known to achieve better-than-classical security. Our results suggest that such quantum improvements need not arise from any Bell-type nonlocality or contextuality, but rather as a consequence of steering correlations.

  19. Toward fidelity between specification and implementation

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.; Morrison, Jeff; Wu, Yunqing

    1994-01-01

    This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.

  20. A Synopsis of Technical Issues of Concern for Monitoring Trace Elements in Highway and Urban Runoff

    USGS Publications Warehouse

    Breault, Robert F.; Granato, Gregory E.

    2000-01-01

    Trace elements, which are regulated for aquatic life protection, are a primary concern in highway- and urban-runoff studies because stormwater runoff may transport these constituents from the land surface to receiving waters. Many of these trace elements are essential for biological activity and become detrimental only when geologic or anthropogenic sources exceed concentrations beyond ranges typical of the natural environment. The Federal Highway Administration and State Transportation Agencies are concerned about the potential effects of highway runoff on the watershed scale and for the management and protection of watersheds. Transportation agencies need information that is documented as valid, current, and scientifically defensible to support planning and management decisions. There are many technical issues of concern for monitoring trace elements; therefore, trace-element data commonly are considered suspect, and the responsibility to provide data-quality information to support the validity of reported results rests with the data-collection agency. Paved surfaces are fundamentally different physically, hydraulically, and chemically from the natural surfaces typical of most freshwater systems that have been the focus of many trace-element monitoring studies. Existing scientific conceptions of the behavior of trace elements in the environment are based largely upon research on natural systems, rather than on systems typical of pavement runoff. Additionally, the logistics of stormwater sampling are difficult because of the great uncertainty in the occurrence and magnitude of storm events. Therefore, trace-element monitoring programs may be enhanced if monitoring and sampling programs are automated. Automation would standardize the process and provide a continuous record of the variations in flow and water-quality characteristics.
Great care is required to collect and process samples in a manner that will minimize potential contamination or attenuation of trace elements and other sources of bias and variability in the sampling process. Trace elements have both natural and anthropogenic sources that may affect the sampling process, including the sample-collection and handling materials used in many trace-element monitoring studies. Trace elements also react with these materials within the timescales typical for collection, processing and analysis of runoff samples. To study the characteristics and potential effects of trace elements in highway and urban runoff, investigators typically sample one or more operationally defined matrixes including: whole water, dissolved (filtered water), suspended sediment, bottom sediment, biological tissue, and contaminant sources. The sampling and analysis of each of these sample matrixes can provide specific information about the occurrence and distribution of trace elements in runoff and receiving waters. There are, however, technical concerns specific to each matrix that must be understood and addressed through use of proper collection and processing protocols. Valid protocols are designed to minimize inherent problems and to maximize the accuracy, precision, comparability, and representativeness of data collected. Documentation, including information about monitoring protocols, quality assurance and quality control efforts, and ancillary data also is necessary to establish data quality. This documentation is especially important for evaluation of historical trace-element monitoring data, because trace-element monitoring protocols and analysis methods have been constantly changing over the past 30 years.

  1. TH-C-18A-08: A Management Tool for CT Dose Monitoring, Analysis, and Protocol Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, J; Chan, F; Newman, B

    2014-06-15

    Purpose: To develop a customizable tool for enterprise-wide management of CT protocols and analysis of radiation dose information of CT exams for a variety of quality control applications. Methods: All clinical CT protocols implemented on the 11 CT scanners at our institution were extracted in digital format. The original protocols had been preset by our CT management team. A commercial CT dose tracking software (DoseWatch, GE Healthcare, WI) was used to collect exam information (exam date, patient age, etc.), scanning parameters, and radiation doses for all CT exams. We developed a Matlab-based program (MathWorks, MA) with a graphical user interface which allows the user to analyze the scanning protocols together with the actual dose estimates, and to compare the data to national (ACR, AAPM) and internal reference values for CT quality control. Results: The CT protocol review portion of our tool allows the user to look up the scanning and image reconstruction parameters of any protocol on any of the installed CT systems, among about 120 protocols per scanner. In the dose analysis tool, dose information of all CT exams (from 05/2013 to 02/2014) was stratified on a protocol level, and within a protocol down to series level, i.e., each individual exposure event. This allows numerical and graphical review of dose information for any combination of scanner models, protocols, and series. The key functions of the tool include: statistics of CTDI, DLP, and SSDE; dose monitoring using user-set CTDI/DLP/SSDE thresholds; look-up of dose data for any CT exam; and CT protocol review. Conclusion: Our in-house CT management tool provides radiologists, technologists, and administrators first-hand, near real-time, enterprise-wide knowledge of CT dose levels for different exam types. Medical physicists use this tool to manage CT protocols and to compare and optimize dose levels across different scanner models. It provides technologists feedback on CT scanning operation, and knowledge of important dose baselines and thresholds.
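
    The threshold-based dose monitoring described above can be sketched in a few lines. The protocol names and CTDIvol/DLP limits below are hypothetical placeholders, not the institution's actual values:

```python
# Hypothetical per-protocol dose thresholds (units: CTDIvol in mGy, DLP in
# mGy*cm); real values would come from ACR/AAPM references or local baselines.
THRESHOLDS = {"abdomen_routine": {"CTDIvol": 25.0, "DLP": 1000.0}}

exams = [
    {"protocol": "abdomen_routine", "CTDIvol": 18.2, "DLP": 820.0},
    {"protocol": "abdomen_routine", "CTDIvol": 31.5, "DLP": 1190.0},
]

def flag_exams(exams, thresholds):
    """Return exams whose CTDIvol or DLP exceeds the protocol threshold."""
    flagged = []
    for exam in exams:
        limits = thresholds.get(exam["protocol"])
        if limits and (exam["CTDIvol"] > limits["CTDIvol"]
                       or exam["DLP"] > limits["DLP"]):
            flagged.append(exam)
    return flagged

print(flag_exams(exams, THRESHOLDS))   # only the second exam is flagged
```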

  2. 48 CFR 3439.701 - Internet Protocol version 6.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY Department Requirements for Acquisition of Information Technology 3439.701 Internet Protocol version 6. The contracting...

  3. 48 CFR 3439.701 - Internet Protocol version 6.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY Department Requirements for Acquisition of Information Technology 3439.701 Internet Protocol version 6. The contracting...

  4. A protocol for generating a high-quality genome-scale metabolic reconstruction.

    PubMed

    Thiele, Ines; Palsson, Bernhard Ø

    2010-01-01

    Network reconstructions are a common denominator in systems biology. Bottom-up metabolic network reconstructions have been developed over the last 10 years. These reconstructions represent structured knowledge bases that abstract pertinent information on the biochemical transformations taking place within specific target organisms. The conversion of a reconstruction into a mathematical format facilitates a myriad of computational biological studies, including evaluation of network content, hypothesis testing and generation, analysis of phenotypic characteristics and metabolic engineering. To date, genome-scale metabolic reconstructions for more than 30 organisms have been published and this number is expected to increase rapidly. However, these reconstructions differ in quality and coverage that may minimize their predictive potential and use as knowledge bases. Here we present a comprehensive protocol describing each step necessary to build a high-quality genome-scale metabolic reconstruction, as well as the common trials and tribulations. Therefore, this protocol provides a helpful manual for all stages of the reconstruction process.
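
    The "conversion of a reconstruction into a mathematical format" conventionally means assembling a stoichiometric matrix S and imposing the steady-state constraint S v = 0 on flux vectors v. A toy sketch with a hypothetical three-metabolite, four-reaction network:

```python
import numpy as np

# Toy network (hypothetical, for illustration only).
# Rows = metabolites A, B, C; columns = reactions:
#   R1: -> A      R2: A -> B      R3: B -> C      R4: C ->
S = np.array([
    [ 1, -1,  0,  0],   # A
    [ 0,  1, -1,  0],   # B
    [ 0,  0,  1, -1],   # C
])

v = np.array([2.0, 2.0, 2.0, 2.0])   # candidate flux distribution

# Steady-state constraint of constraint-based modelling: S v = 0.
print(np.allclose(S @ v, 0))   # True: the fluxes are mass-balanced
```

    Real genome-scale reconstructions have thousands of reactions, and analyses such as flux balance analysis add an objective and flux bounds on top of this constraint.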

  5. A protocol for generating a high-quality genome-scale metabolic reconstruction

    PubMed Central

    Thiele, Ines; Palsson, Bernhard Ø.

    2011-01-01

    Network reconstructions are a common denominator in systems biology. Bottom-up metabolic network reconstructions have developed over the past 10 years. These reconstructions represent structured knowledge-bases that abstract pertinent information on the biochemical transformations taking place within specific target organisms. The conversion of a reconstruction into a mathematical format facilitates myriad computational biological studies including evaluation of network content, hypothesis testing and generation, analysis of phenotypic characteristics, and metabolic engineering. To date, genome-scale metabolic reconstructions for more than 30 organisms have been published and this number is expected to increase rapidly. However, these reconstructions differ in quality and coverage that may minimize their predictive potential and use as knowledge-bases. Here, we present a comprehensive protocol describing each step necessary to build a high-quality genome-scale metabolic reconstruction as well as common trials and tribulations. Therefore, this protocol provides a helpful manual for all stages of the reconstruction process. PMID:20057383

  6. Integrative Application of Life Cycle Assessment and Risk Assessment to Environmental Impacts of Anthropogenic Pollutants at a Watershed Scale.

    PubMed

    Lin, Xiaodan; Yu, Shen; Ma, Hwongwen

    2018-01-01

    Intense human activities have led to increasing deterioration of the watershed environment via pollutant discharge, which threatens human health and ecosystem function. To meet a need of comprehensive environmental impact/risk assessment for sustainable watershed development, a biogeochemical process-based life cycle assessment and risk assessment (RA) integration for pollutants aided by geographic information system is proposed in this study. The integration is to frame a conceptual protocol of "watershed life cycle assessment (WLCA) for pollutants". The proposed WLCA protocol consists of (1) geographic and environmental characterization mapping; (2) life cycle inventory analysis; (3) integration of life-cycle impact assessment (LCIA) with RA via characterization factor of pollutant of interest; and (4) result analysis and interpretation. The WLCA protocol can visualize results of LCIA and RA spatially for the pollutants of interest, which might be useful for decision or policy makers for mitigating impacts of watershed development.

  7. Department of Defense picture archiving and communication system acceptance testing: results and identification of problem components.

    PubMed

    Allison, Scott A; Sweet, Clifford F; Beall, Douglas P; Lewis, Thomas E; Monroe, Thomas

    2005-09-01

    The PACS implementation process is complicated, requiring a tremendous amount of time, resources, and planning. The Department of Defense (DOD) has significant experience in developing and refining PACS acceptance testing (AT) protocols that assure contract compliance, clinical safety, and functionality. The DOD's AT experience under the initial Medical Diagnostic Imaging Support System contract led to the current Digital Imaging Network-Picture Archiving and Communications Systems (DIN-PACS) contract AT protocol. To identify the most common system and component deficiencies under the current DIN-PACS AT protocol, 14 tri-service sites were evaluated during 1998-2000. Sixteen system deficiency citations with 154 separate types of limitations were noted, with problems involving the workstation, interfaces, and the Radiology Information System comprising more than 50% of the citations. Larger PACS deployments were associated with a higher number of deficiencies. The most commonly cited system deficiencies were among the most expensive components of the PACS.

  8. Superior memory efficiency of quantum devices for the simulation of continuous-time stochastic processes

    NASA Astrophysics Data System (ADS)

    Elliott, Thomas J.; Gu, Mile

    2018-03-01

    Continuous-time stochastic processes pervade everyday experience, and the simulation of models of these processes is of great utility. Classical models of systems operating in continuous-time must typically track an unbounded amount of information about past behaviour, even for relatively simple models, enforcing limits on precision due to the finite memory of the machine. However, quantum machines can require less information about the past than even their optimal classical counterparts to simulate the future of discrete-time processes, and we demonstrate that this advantage extends to the continuous-time regime. Moreover, we show that this reduction in the memory requirement can be unboundedly large, allowing for arbitrary precision even with a finite quantum memory. We provide a systematic method for finding superior quantum constructions, and a protocol for analogue simulation of continuous-time renewal processes with a quantum machine.

  9. Novel method of extracting motion from natural movies.

    PubMed

    Suzuki, Wataru; Ichinohe, Noritaka; Tani, Toshiki; Hayami, Taku; Miyakawa, Naohisa; Watanabe, Satoshi; Takeichi, Hiroshige

    2017-11-01

    The visual system in primates can be segregated into motion and shape pathways. Interaction occurs at multiple stages along these pathways. Processing of shape-from-motion and biological motion is considered to be a higher-order integration process involving motion and shape information. However, relatively limited types of stimuli have been used in previous studies on these integration processes. We propose a new algorithm to extract object motion information from natural movies and to move random dots in accordance with the information. The object motion information is extracted by estimating the dynamics of local normal vectors of the image intensity projected onto the x-y plane of the movie. An electrophysiological experiment on two adult common marmoset monkeys (Callithrix jacchus) showed that the natural and random dot movies generated with this new algorithm yielded comparable neural responses in the middle temporal visual area. In principle, this algorithm provided random dot motion stimuli containing shape information for arbitrary natural movies. This new method is expected to expand the neurophysiological and psychophysical experimental protocols to elucidate the integration processing of motion and shape information in biological systems. The novel algorithm proposed here was effective in extracting object motion information from natural movies and provided new motion stimuli to investigate higher-order motion information processing. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
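
    The gradient-based idea, estimating motion from the dynamics of local intensity normals, can be sketched with finite differences. This is a generic normal-flow computation in the same spirit, not the authors' exact algorithm:

```python
import numpy as np

def normal_flow(frame1, frame2, eps=1e-6):
    """Per-pixel motion component along the intensity gradient (normal flow).

    A rough finite-difference sketch of gradient-based motion extraction;
    not the paper's exact algorithm.
    """
    Iy, Ix = np.gradient(frame1.astype(float))   # spatial gradients
    It = frame2.astype(float) - frame1           # temporal derivative
    mag2 = Ix**2 + Iy**2 + eps
    # Brightness constancy: Ix*u + Iy*v + It = 0. Only the flow component
    # along the local normal (gradient) direction is recoverable.
    return -It * Ix / mag2, -It * Iy / mag2

# A bright vertical bar shifted one pixel to the right between frames.
f1 = np.zeros((5, 8))
f1[:, 3] = 1.0
f2 = np.zeros((5, 8))
f2[:, 4] = 1.0
u, v = normal_flow(f1, f2)
print(u[2, 4])   # positive: rightward motion detected at the leading edge
print(v[2, 4])   # ~0: no vertical motion along the bar
```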

  10. Communication-Gateway Software For NETEX, DECnet, And TCP/IP

    NASA Technical Reports Server (NTRS)

    Keith, B.; Ferry, D.; Fendler, E.

    1990-01-01

    Communications gateway software, GATEWAY, provides process-to-process communication between remote applications programs in different protocol domains. Communicating peer processes may be resident on any paired combination of NETEX, DECnet, or TCP/IP hosts. Provides necessary mapping from one protocol to another and facilitates practical intermachine communications in cost-effective manner by eliminating need to standardize on single protocol or to implement multiple protocols in host computers. Written in Ada.
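
    The core gateway idea, re-framing messages from one protocol convention into another, can be sketched in a few lines. The two framing formats below are hypothetical stand-ins, not the actual NETEX/DECnet/TCP/IP mappings (and GATEWAY itself was written in Ada):

```python
import struct

def to_length_prefixed(stream: bytes) -> bytes:
    """Newline-delimited stream -> 4-byte big-endian length-prefixed frames."""
    out = b""
    for msg in stream.split(b"\n"):
        if msg:
            out += struct.pack(">I", len(msg)) + msg
    return out

def to_newline_delimited(frames: bytes) -> bytes:
    """Length-prefixed frames -> newline-delimited stream."""
    out, i = b"", 0
    while i < len(frames):
        (n,) = struct.unpack_from(">I", frames, i)
        out += frames[i + 4 : i + 4 + n] + b"\n"
        i += 4 + n
    return out

# A gateway maps each message across domains without the peers having to
# implement both conventions themselves.
stream = b"hello\nworld\n"
framed = to_length_prefixed(stream)
assert to_newline_delimited(framed) == stream   # round-trip preserved
```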

  11. Patients' perspectives and experiences concerning barriers to accessing information about bilateral prophylactic mastectomy.

    PubMed

    Glassey, Rachael; O'Connor, Moira; Ives, Angela; Saunders, Christobel; kConFab Investigators; O'Sullivan, Sarah; Hardcastle, Sarah J

    2018-05-11

    To explore the barriers to and experiences of accessing information for women who have received genetic risk assessment/testing results for breast cancer (BC) and are considering a bilateral prophylactic mastectomy (BPM), and to explore participants' preferences concerning information and support needs. A qualitative retrospective study guided by interpretative phenomenological analysis was used. Semi-structured interviews were conducted with forty-six women who were either considering BPM or had already undergone the surgery. Three themes identified barriers to accessing information: difficulty accessing information, inconsistent information, and clinical focus/medicalized information. A fourth theme, preferences for information and support needs, comprised three subthemes: psychological support, clearly defined processes, and photos of mastectomies/reconstruction surgeries. Barriers to accessing information appeared to be widespread. A lack of integrated services contributed to inconsistent information, and the medicalized terminology and clinical focus of consultations further complicated understanding. Preferences for information included clearly defined processes, so that women know the pathways after confirmation of familial BC risk. Clinical implications include a multidisciplinary team approach and a protocol that reflects current practice. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. A Guide to Writing a Qualitative Systematic Review Protocol to Enhance Evidence-Based Practice in Nursing and Health Care.

    PubMed

    Butler, Ashleigh; Hall, Helen; Copnell, Beverley

    2016-06-01

    The qualitative systematic review is a rapidly developing area of nursing research. In order to present trustworthy, high-quality recommendations, such reviews should be based on a review protocol to minimize bias and enhance transparency and reproducibility. Although there are a number of resources available to guide researchers in developing a quantitative review protocol, very few resources exist for qualitative reviews. To guide researchers through the process of developing a qualitative systematic review protocol, using an example review question. The key elements required in a systematic review protocol are discussed, with a focus on application to qualitative reviews: Development of a research question; formulation of key search terms and strategies; designing a multistage review process; critical appraisal of qualitative literature; development of data extraction techniques; and data synthesis. The paper highlights important considerations during the protocol development process, and uses a previously developed review question as a working example. This paper will assist novice researchers in developing a qualitative systematic review protocol. By providing a worked example of a protocol, the paper encourages the development of review protocols, enhancing the trustworthiness and value of the completed qualitative systematic review findings. Qualitative systematic reviews should be based on well planned, peer reviewed protocols to enhance the trustworthiness of results and thus their usefulness in clinical practice. Protocols should outline, in detail, the processes which will be used to undertake the review, including key search terms, inclusion and exclusion criteria, and the methods used for critical appraisal, data extraction and data analysis to facilitate transparency of the review process. 
Additionally, journals should encourage and support the publication of review protocols, and should require reference to a protocol prior to publication of the review results. © 2016 Sigma Theta Tau International.

  13. Design and Implementation of a Prospective Adult Congenital Heart Disease Biobank.

    PubMed

    Opotowsky, Alexander R; Loukas, Brittani; Ellervik, Christina; Moko, Lilamarie E; Singh, Michael N; Landzberg, Elizabeth I; Rimm, Eric B; Landzberg, Michael J

    2016-11-01

    Adults with congenital heart disease (ACHD) comprise a growing, increasingly complex population. The Boston Adult Congenital Heart Disease Biobank is a program for the collection and storage of biospecimens to provide a sustainable resource for scientific biomarker investigation in ACHD. We describe a protocol to collect, process, and store biospecimens for ACHD or associated diagnoses developed based on existing literature and consultation with cardiovascular biomarker epidemiologists. The protocol involves collecting urine and ∼48.5 mL of blood. A subset of the blood and urine undergoes immediate clinically relevant testing. The remaining biospecimens are processed soon after collection and stored at -80°C as aliquots of ethylenediaminetetraacetic acid (EDTA) and lithium heparin plasma, serum, red cell and buffy coat pellet, and urine supernatant. Including tubes with diverse anticoagulant and clot accelerator contents will enable flexible downstream use. Demographic and clinical data are entered into a database; data on biospecimen collection, processing, and storage are managed by an enterprise laboratory information management system. Since implementation in 2012, we have enrolled more than 650 unique participants (aged 18-80 years, 53.3% women); the Biobank contains over 11,000 biospecimen aliquots. The most common primary CHD diagnoses are single ventricle status-post Fontan procedure (18.8%), repaired tetralogy of Fallot with pulmonary stenosis or atresia (17.6%), and left-sided obstructive lesions (17.5%). We describe the design and implementation of biospecimen collection, handling, and storage protocols with multiple levels of quality assurance. These protocols are feasible and reflect the size and goals of the Boston ACHD Biobank. © The Author(s) 2016.

  14. Asserting National Sovereignty in Cyberspace: The Case for Internet Border Inspection

    DTIC Science & Technology

    2003-06-01

    Influencing Foreign Policy, in Internet and International Systems: Information Technology and American Foreign Policy Decisionmaking Workshop, 1999. … investigative: agencies that investigate violations of federal law; IO: Information Operations, military operations in the information realm; IP: Internet Protocol, a specific format for Internet packet headers; IW: Information Warfare, part of information operations; NCP: Network Control Protocol; NSA: …

  15. High-resolution Single Particle Analysis from Electron Cryo-microscopy Images Using SPHIRE

    PubMed Central

    Moriya, Toshio; Saur, Michael; Stabrin, Markus; Merino, Felipe; Voicu, Horatiu; Huang, Zhong; Penczek, Pawel A.; Raunser, Stefan; Gatsogiannis, Christos

    2017-01-01

    SPHIRE (SPARX for High-Resolution Electron Microscopy) is a novel open-source, user-friendly software suite for the semi-automated processing of single particle electron cryo-microscopy (cryo-EM) data. The protocol presented here describes in detail how to obtain a near-atomic resolution structure starting from cryo-EM micrograph movies, guiding users through all steps of the single particle structure determination pipeline. These steps are controlled from the new SPHIRE graphical user interface and require minimal user intervention. Using this protocol, a 3.5 Å structure of TcdA1, a Tc toxin complex from Photorhabdus luminescens, was derived from only 9500 single particles. This streamlined approach will help novice users without extensive processing experience or a priori structural information to obtain noise-free and unbiased atomic models of their purified macromolecular complexes in their native state. PMID:28570515

  16. Survey on Security Issues in File Management in Cloud Computing Environment

    NASA Astrophysics Data System (ADS)

    Gupta, Udit

    2015-06-01

    Cloud computing has pervaded every aspect of information technology in the past decade. With the advent of cloud networks, it has become easier to process, in real time, the plethora of data generated by various devices. The privacy of users' data is maintained by data centers around the world, making it feasible to operate on that data from lightweight portable devices. But with ease of processing comes the security aspect of the data. One such security aspect is secure file transfer, either internally within a cloud or externally from one cloud network to another. File management is central to cloud computing, and it is paramount to address the security concerns that arise from it. This survey paper aims to elucidate the various protocols that can be used for secure file transfer and to analyze the ramifications of using each protocol.
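Whatever transfer protocol is chosen, end-to-end integrity checking is a common building block: hash the file before upload and verify the digest after download. A minimal sketch follows; the survey itself does not prescribe this, and SHA-256 is just an illustrative choice.

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    # End-to-end integrity check: sender and receiver compare digests.
    return hashlib.sha256(data).hexdigest()

payload = b"contents of report.pdf"       # stand-in for a real file
sent = sha256_digest(payload)             # computed before upload
received = sha256_digest(payload)         # recomputed after download
transfer_ok = (sent == received)
```

A mismatch between the two digests would indicate corruption or tampering in transit, regardless of which transfer protocol carried the bytes.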

  17. Robust Characterization of Loss Rates

    NASA Astrophysics Data System (ADS)

    Wallman, Joel J.; Barnhill, Marie; Emerson, Joseph

    2015-08-01

    Many physical implementations of qubits—including ion traps, optical lattices and linear optics—suffer from loss. A nonzero probability of irretrievably losing a qubit can be a substantial obstacle to fault-tolerant methods of processing quantum information, requiring new techniques to safeguard against loss that introduce an additional overhead that depends upon the loss rate. Here we present a scalable and platform-independent protocol for estimating the average loss rate (averaged over all input states) resulting from an arbitrary Markovian noise process, as well as an independent estimate of detector efficiency. Moreover, we show that our protocol gives an additional constraint on estimated parameters from randomized benchmarking that improves the reliability of the estimated error rate and provides a new indicator for non-Markovian signatures in the experimental data. We also derive a bound for the state-dependent loss rate in terms of the average loss rate.

  18. Event-Driven Messaging for Offline Data Quality Monitoring at ATLAS

    NASA Astrophysics Data System (ADS)

    Onyisi, Peter

    2015-12-01

    During LHC Run 1, the information flow through the offline data quality monitoring in ATLAS relied heavily on chains of processes polling each other's outputs for handshaking purposes. This resulted in a fragile architecture with many possible points of failure and an inability to monitor the overall state of the distributed system. We report on the status of a project undertaken during the LHC shutdown to replace the ad hoc synchronization methods with a uniform message queue system. This enables the use of standard protocols to connect processes on multiple hosts; reliable transmission of messages between possibly unreliable programs; easy monitoring of the information flow; and the removal of inefficient polling-based communication.
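The polling-to-messaging migration described above can be illustrated with a toy in-process message queue: workers publish a message when they finish, and the consumer blocks on the queue instead of repeatedly inspecting each worker's output. Python's standard `queue` module stands in here for the real broker, whose identity the report does not fix.

```python
import queue
import threading

# A minimal stand-in for a broker-based message queue: each process
# announces its own completion instead of being polled by a neighbor.
bus = queue.Queue()

def worker(job_id):
    # ... process data, run quality checks, etc. ...
    bus.put({"job": job_id, "status": "done"})

threads = [threading.Thread(target=worker, args=(i,)) for i in range(3)]
for t in threads:
    t.start()

# The consumer blocks until messages arrive; no polling loop needed.
done = [bus.get(timeout=5) for _ in range(3)]
for t in threads:
    t.join()
```

With a real broker the producers and consumer would live on different hosts, but the control flow is the same: reliable delivery and easy monitoring of the message stream replace fragile chains of polling processes.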

  19. Experimental realization of a feedback optical parametric amplifier with four-wave mixing

    NASA Astrophysics Data System (ADS)

    Pan, Xiaozhou; Chen, Hui; Wei, Tianxiang; Zhang, Jun; Marino, Alberto M.; Treps, Nicolas; Glasser, Ryan T.; Jing, Jietai

    2018-04-01

    Optical parametric amplifiers (OPAs) play a fundamental role in the generation of quantum correlation for quantum information processing and quantum metrology. Increasing the communication fidelity of quantum information protocols and the measurement precision of quantum metrology requires a high degree of quantum correlation. In this Rapid Communication we report a feedback optical parametric amplifier that employs a four-wave mixing (FWM) process as the underlying OPA and a beam splitter as the feedback controller. We first construct a theoretical model for this feedback-based FWM process and then experimentally study the effect of the feedback control on the quantum properties of the system. Specifically, we find that the quantum correlation between the output fields can be enhanced by tuning the strength of the feedback.

  20. Evaluation of counterfactuality in counterfactual communication protocols

    NASA Astrophysics Data System (ADS)

    Arvidsson-Shukur, D. R. M.; Barnes, C. H. W.; Gottfries, A. N. O.

    2017-12-01

    We provide an in-depth investigation of parameter estimation in nested Mach-Zehnder interferometers (NMZIs) using two information measures: the Fisher information and the Shannon mutual information. Protocols for counterfactual communication have, so far, been based on two different definitions of counterfactuality. In particular, some schemes have been based on NMZI devices, and have recently been subject to criticism. We provide a methodology for evaluating the counterfactuality of these protocols, based on an information-theoretical framework. More specifically, we make the assumption that any realistic quantum channel in MZI structures will have some weak uncontrolled interaction. We then use the Fisher information of this interaction to measure counterfactual violations. The measure is used to evaluate the suggested counterfactual communication protocol of H. Salih et al. [Phys. Rev. Lett. 110, 170502 (2013), 10.1103/PhysRevLett.110.170502]. The protocol of D. R. M. Arvidsson-Shukur and C. H. W. Barnes [Phys. Rev. A 94, 062303 (2016), 10.1103/PhysRevA.94.062303], based on a different definition, is evaluated with a probability measure. Our results show that the definition of Arvidsson-Shukur and Barnes is satisfied by their scheme, while that of Salih et al. is only satisfied by perfect quantum channels. For realistic devices the latter protocol does not achieve its objective.

  1. Normalization of cortical thickness measurements across different T1 magnetic resonance imaging protocols by novel W-Score standardization.

    PubMed

    Chung, Jinyong; Yoo, Kwangsun; Lee, Peter; Kim, Chan Mi; Roh, Jee Hoon; Park, Ji Eun; Kim, Sang Joon; Seo, Sang Won; Shin, Jeong-Hyeon; Seong, Joon-Kyung; Jeong, Yong

    2017-10-01

    The use of different 3D T1-weighted magnetic resonance (T1 MR) imaging protocols induces image incompatibility across multicenter studies, negating the many advantages of multicenter studies. A few methods have been developed to address this problem, but significant image incompatibility still remains. Thus, we developed a novel and convenient method to improve image compatibility. W-score standardization creates quality reference values by using a healthy group to obtain normalized disease values. We developed a protocol-specific w-score standardization to control the protocol effect, which is applied to each protocol separately. We used three data sets. In dataset 1, brain T1 MR images of normal controls (NC) and patients with Alzheimer's disease (AD) from two centers, acquired with different T1 MR protocols, were used (Protocol 1 and 2, n = 45/group). In dataset 2, data from six subjects, who underwent MRI with two different protocols (Protocol 1 and 2), were used with different repetition times, echo times, and slice thicknesses. In dataset 3, T1 MR images from a large number of healthy normal controls (Protocol 1: n = 148, Protocol 2: n = 343) were collected for w-score standardization. The protocol effect and disease effect on subjects' cortical thickness were analyzed before and after the application of protocol-specific w-score standardization. As expected, different protocols resulted in differing cortical thickness measurements in both NC and AD subjects. Different measurements were obtained for the same subject when imaged with different protocols. Multivariate pattern difference between measurements was observed between the protocols. Classification accuracy between two protocols was nearly 90%. After applying protocol-specific w-score standardization, the differences between the protocols substantially decreased. 
Most importantly, protocol-specific w-score standardization reduced both univariate and multivariate differences in the images while maintaining the AD disease effect. Compared to conventional regression methods, our method showed the best performance in terms of controlling the protocol effect while preserving disease information. Protocol-specific w-score standardization effectively resolved the concerns of conventional regression methods. It showed the best performance for improving the compatibility of a T1 MR post-processed feature, cortical thickness. Copyright © 2017 Elsevier Inc. All rights reserved.
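The w-score idea above can be sketched in a few lines: standardize each measurement against healthy controls scanned with the same protocol, so that protocol offsets cancel out. The numbers below are hypothetical, and the paper's actual model also accounts for covariates; this is only the core of the standardization step.

```python
import numpy as np

def w_score(value, control_values):
    # Protocol-specific standardization: each protocol gets its own
    # healthy-control reference distribution.
    mu = control_values.mean()
    sigma = control_values.std(ddof=1)
    return (value - mu) / sigma

# Hypothetical cortical-thickness measurements (mm) from two protocols.
controls_p1 = np.array([2.50, 2.55, 2.60, 2.45, 2.52])
controls_p2 = np.array([2.30, 2.35, 2.40, 2.25, 2.32])  # systematically thinner

# The same patient measured with each protocol.
patient_p1 = 2.20
patient_p2 = 2.00

w1 = w_score(patient_p1, controls_p1)
w2 = w_score(patient_p2, controls_p2)
# After standardization the two protocols agree on how abnormal the
# patient is, even though the raw values differ by 0.2 mm.
```

Because each protocol is referenced to its own control distribution, the systematic offset between scanners drops out of the w-scores while the disease-related deviation from normal is preserved.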

  2. An XML-Based Protocol for Distributed Event Services

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    A recent trend in distributed computing is the construction of high-performance distributed systems called computational grids. One difficulty we have encountered is that there is no standard format for the representation of performance information and no standard protocol for transmitting this information. This limits the types of performance analysis that can be undertaken in complex distributed systems. To address this problem, we present an XML-based protocol for transmitting performance events in distributed systems and evaluate the performance of this protocol.
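A minimal version of such an XML event message can be sketched with the standard library. The element names below are hypothetical, since the paper's actual schema is not reproduced here.

```python
import time
import xml.etree.ElementTree as ET

def make_event(source, name, value):
    # Hypothetical element names; the paper's real schema may differ.
    ev = ET.Element("event")
    ET.SubElement(ev, "source").text = source
    ET.SubElement(ev, "name").text = name
    ET.SubElement(ev, "value").text = str(value)
    ET.SubElement(ev, "timestamp").text = str(time.time())
    return ET.tostring(ev, encoding="unicode")

def parse_event(xml_text):
    # Any XML-capable consumer can decode the event, which is the
    # interoperability argument for a standard event format.
    return {child.tag: child.text for child in ET.fromstring(xml_text)}

msg = make_event("node42.cluster", "cpu_load", 0.87)
decoded = parse_event(msg)
```

The appeal of a standard textual format is exactly what the round trip shows: producer and consumer need agree only on the schema, not on a shared binary representation or library.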

  3. Generation of concatenated Greenberger-Horne-Zeilinger-type entangled coherent state based on linear optics

    NASA Astrophysics Data System (ADS)

    Guo, Rui; Zhou, Lan; Gu, Shi-Pu; Wang, Xing-Fu; Sheng, Yu-Bo

    2017-03-01

    The concatenated Greenberger-Horne-Zeilinger (C-GHZ) state is a new type of multipartite entangled state with potential applications in future quantum information. In this paper, we propose a protocol for approximately constructing an arbitrary C-GHZ entangled state. Different from previous protocols, each logic qubit is encoded in a coherent state. The protocol is based on linear optics, which is feasible with current experimental technology. This protocol may be useful for quantum information based on the C-GHZ state.

  4. Objective and automated protocols for the evaluation of biomedical search engines using No Title Evaluation protocols.

    PubMed

    Campagne, Fabien

    2008-02-29

    The evaluation of information retrieval techniques has traditionally relied on human judges to determine which documents are relevant to a query and which are not. This protocol is used in the Text Retrieval Evaluation Conference (TREC), organized annually for the past 15 years, to support the unbiased evaluation of novel information retrieval approaches. The TREC Genomics Track has recently been introduced to measure the performance of information retrieval for biomedical applications. We describe two protocols for evaluating biomedical information retrieval techniques without human relevance judgments. We call these protocols No Title Evaluation (NT Evaluation). The first protocol measures performance for focused searches, where only one relevant document exists for each query. The second protocol measures performance for queries expected to have potentially many relevant documents per query (high-recall searches). Both protocols take advantage of the clear separation of titles and abstracts found in Medline. We compare the performance obtained with these evaluation protocols to results obtained by reusing the relevance judgments produced in the 2004 and 2005 TREC Genomics Track and observe significant correlations between performance rankings generated by our approach and TREC. Spearman's correlation coefficients in the range of 0.79-0.92 are observed comparing bpref measured with NT Evaluation or with TREC evaluations. For comparison, coefficients in the range 0.86-0.94 can be observed when evaluating the same set of methods with data from two independent TREC Genomics Track evaluations. We discuss the advantages of NT Evaluation over the TRels and the data fusion evaluation protocols introduced recently. Our results suggest that the NT Evaluation protocols described here could be used to optimize some search engine parameters before human evaluation. 
Further research is needed to determine if NT Evaluation or variants of these protocols can fully substitute for human evaluations.
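The first NT Evaluation protocol can be sketched as follows: treat each article's title as a query whose only relevant document is that article's own abstract, then score a retrieval method by how highly it ranks the matching abstract. The toy corpus and the naive token-overlap scorer below are illustrative stand-ins for Medline and a real search engine, and mean reciprocal rank stands in for the paper's bpref measure.

```python
def score(query, doc):
    # Naive token-overlap scorer standing in for a real retrieval engine.
    return len(set(query.lower().split()) & set(doc.lower().split()))

# Toy Medline-like records: (title, abstract). Under NT Evaluation, the
# title is the query and the article's own abstract is the single
# relevant document -- no human relevance judgments required.
records = [
    ("quantum loss rate estimation",
     "we estimate the average loss rate of a quantum channel"),
    ("vanet clustering routing protocol",
     "a clustering based routing protocol for vehicular networks"),
    ("cortical thickness normalization",
     "normalizing cortical thickness across mri protocols"),
]
abstracts = [abstract for _, abstract in records]

reciprocal_ranks = []
for i, (title, _) in enumerate(records):
    ranked = sorted(range(len(abstracts)),
                    key=lambda j: score(title, abstracts[j]),
                    reverse=True)
    reciprocal_ranks.append(1.0 / (ranked.index(i) + 1))

mrr = sum(reciprocal_ranks) / len(reciprocal_ranks)
```

Because the title/abstract split supplies the relevance judgments for free, two retrieval methods can be compared simply by comparing their scores over the same corpus.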

  5. Cryptanalysis and improvement of an improved two factor authentication protocol for telecare medical information systems.

    PubMed

    Chaudhry, Shehzad Ashraf; Naqvi, Husnain; Shon, Taeshik; Sher, Muhammad; Farash, Mohammad Sabzinejad

    2015-06-01

    Telecare medical information systems (TMIS) provide rapid and convenient health care services remotely. Efficient authentication is a prerequisite to guarantee the security and privacy of patients in TMIS. Authentication is used to verify the legality of the patients and the TMIS server during remote access. Very recently, Islam et al. (J. Med. Syst. 38(10):135, 2014) proposed a two-factor authentication protocol for TMIS using elliptic curve cryptography (ECC) to improve Xu et al.'s (J. Med. Syst. 38(1):9994, 2014) protocol. They claimed their improved protocol to be efficient and to provide all security requirements. However, our analysis reveals that Islam et al.'s protocol suffers from user impersonation and server impersonation attacks. We therefore propose an enhanced protocol. The proposed protocol, while delivering all the virtues of Islam et al.'s protocol, resists all known attacks.

  6. SU-F-P-04: Implementation of Dose Monitoring Software: Successes and Pitfalls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Och, J

    2016-06-15

    Purpose: to successfully install a dose monitoring software (DMS) application to assist in CT protocol and dose management. Methods: Upon selecting the DMS, we began our implementation of the application. A working group composed of Medical Physics, Radiology Administration, Information Technology, and CT technologists was formed. On-site training in the application was supplied by the vendor. The decision was made to apply the process to all CT protocols on all platforms at all facilities. Protocols were painstakingly mapped to the correct masters, and the system went 'live'. Results: We are routinely using DMS as a tool in our Clinical Performance CT QA program. It is useful in determining the effectiveness of revisions to existing protocols and establishing performance baselines for new units. However, the implementation was not without difficulty. We identified several pitfalls and obstacles which frustrated progress, including training deficiencies, nomenclature problems, communication, and DICOM variability. Conclusion: Dose monitoring software can be a potent tool for QA. However, implementation of the program can be problematic and requires planning, organization and commitment.

  7. Study protocol of a randomized controlled trial comparing integrative body-mind-spirit intervention and cognitive behavioral therapy in fostering quality of life of patients with lung cancer and their family caregivers.

    PubMed

    Lau, Bobo Hi-Po; Chow, Amy Y M; Wong, Daniel F K; Chan, Jessie S M; Chan, Celia H Y; Ho, Rainbow T H; So, Tsz-Him; Lam, Tai-Chung; Lee, Victor Ho-Fun; Lee, Anne W M; Chow, Sau Fong; Chan, Cecilia L W

    2018-01-01

    Compared to cancers at other sites, lung cancer often results in greater psychosocial distress to both the patients and their caregivers, due to the poor prognosis and survival rate, as well as the heavy symptom burden. In recent years, making protocols of proposed or on-going studies publicly available via clinical trial registries and/or peer-reviewed journals has benefited health sciences with timely communication of the latest research trends and improved transparency in reporting. However, such practice is yet to be a common sight in evidence-informed social work. Hence, this paper discusses the value of publishing protocols in social work research and presents the protocol of a randomized controlled trial that compares the effectiveness of integrative body-mind-spirit intervention with cognitive behavioral therapy for enhancing quality of life of patients with lung cancer and their family caregivers. The data collection process was still on-going at the time of manuscript submission.

  8. VANET Clustering Based Routing Protocol Suitable for Deserts.

    PubMed

    Nasr, Mohammed Mohsen Mohammed; Abdelgader, Abdeldime Mohamed Salih; Wang, Zhi-Gong; Shen, Lian-Feng

    2016-04-06

    In recent years, applications of vehicular ad hoc networks (VANETs) have emerged for security, safety, rescue, exploration, military and communication redundancy systems in non-populated areas, in addition to their ordinary use in urban environments as an essential part of intelligent transportation systems (ITS). This paper proposes a novel algorithm for organizing a cluster structure and performing cluster head election (CHE) suitable for VANETs. Moreover, it presents a robust clustering-based routing protocol, which is appropriate for deserts and can achieve high communication efficiency, ensuring reliable information delivery and optimal exploitation of the equipment on each vehicle. A comprehensive simulation is conducted to evaluate the performance of the proposed CHE and routing algorithms.
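The paper's specific CHE algorithm is not described in the abstract, but a generic clustering heuristic of this kind can be sketched: elect as head the vehicle whose position and speed are closest to the cluster average, so the head is likely to stay reachable by its members the longest. All names and numbers below are illustrative.

```python
def elect_head(vehicles):
    # vehicles: list of (vehicle_id, position_km, speed_kmh)
    n = len(vehicles)
    avg_pos = sum(v[1] for v in vehicles) / n
    avg_spd = sum(v[2] for v in vehicles) / n
    # Head = vehicle closest to the cluster's average motion state.
    return min(vehicles,
               key=lambda v: abs(v[1] - avg_pos) + abs(v[2] - avg_spd))[0]

cluster = [("car1", 0.0, 80.0), ("car2", 0.5, 85.0), ("car3", 1.0, 90.0)]
head = elect_head(cluster)
```

Real CHE schemes weigh additional factors (link quality, remaining connectivity time, equipment capability), but the core step is the same: rank candidates by a stability metric and pick the best.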

  10. Summary Report Panel 1: The Need for Protocols and Standards in Research on Underwater Noise Impacts on Marine Life.

    PubMed

    Erbe, Christine; Ainslie, Michael A; de Jong, Christ A F; Racca, Roberto; Stocker, Michael

    2016-01-01

    As concern about anthropogenic noise and its impacts on marine fauna is increasing around the globe, data are being compared across populations, species, noise sources, geographic regions, and time. However, much of the raw and processed data are not comparable due to differences in measurement methodology, analysis and reporting, and a lack of metadata. Common protocols and more formal, international standards are needed to ensure the effectiveness of research, conservation, regulation and practice, and unambiguous communication of information and ideas. Developing standards takes time and effort, is largely driven by a few expert volunteers, and would benefit from stakeholders' contribution and support.

  11. Experimental demonstration of graph-state quantum secret sharing.

    PubMed

    Bell, B A; Markham, D; Herrera-Martí, D A; Marin, A; Wadsworth, W J; Rarity, J G; Tame, M S

    2014-11-21

    Quantum communication and computing offer many new opportunities for information processing in a connected world. Networks using quantum resources with tailor-made entanglement structures have been proposed for a variety of tasks, including distributing, sharing and processing information. Recently, a class of states known as graph states has emerged, providing versatile quantum resources for such networking tasks. Here we report an experimental demonstration of graph state-based quantum secret sharing--an important primitive for a quantum network with applications ranging from secure money transfer to multiparty quantum computation. We use an all-optical setup, encoding quantum information into photons representing a five-qubit graph state. We find that one can reliably encode, distribute and share quantum information amongst four parties, with various access structures based on the complex connectivity of the graph. Our results show that graph states are a promising approach for realising sophisticated multi-layered communication protocols in quantum networks.

  12. The British Services Dhaulagiri Medical Research Expedition 2016: a unique military and civilian research collaboration.

    PubMed

    Mellor, Adrian; Bakker-Dyos, J; Howard, M; Boos, C; Cooke, M; Vincent, E; Scott, P; O'Hara, J; Clarke, S B; Barlow, M; Matu, J; Deighton, K; Hill, N; Newman, C; Cruttenden, R; Holdsworth, D; Woods, D

    2017-12-01

    High-altitude environments present a significant physiological challenge and disease processes that can be life threatening; operational effectiveness at high altitude can be severely compromised. UK military research is investigating ways of mitigating the physiological effects of high altitude. The British Service Dhaulagiri Research Expedition took place from March to May 2016, and military personnel were invited to consent to a variety of study protocols investigating adaptation to high altitude and the diagnosis of high-altitude illness. The studies took place in remote and austere environments at altitudes of up to 7500 m. This paper gives an overview of the individual research protocols investigated, the execution of the expedition and the challenges involved. 129 servicemen and women were involved at altitudes of up to 7500 m; 8 research protocols were investigated. The outputs from these studies will help to individualise the acclimatisation process and inform strategies for pre-acclimatisation should troops ever need to deploy to high altitude at short notice. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  13. Paperless protocoling of CT and MRI requests at an outpatient imaging center.

    PubMed

    Bassignani, Matthew J; Dierolf, David A; Roberts, David L; Lee, Steven

    2010-04-01

    We created our imaging center (IC) to move outpatient imaging from our busy inpatient imaging suite to an off-site location that is more inviting to ambulatory patients. Nevertheless, patients scanned at our IC still represent the depth and breadth of illness complexity seen in our tertiary care population. Thus, we protocol exams on an individualized basis to ensure that the referring clinician's question is fully answered by the exam performed. Previously, paper-based protocoling was a laborious process for all involved: the IC business office would fax requests to various reading rooms for protocoling by the subspecialist radiologists, who are 3 miles away at the main hospital. Once a request was protocoled, reading room coordinators would fax it back to the IC technical area in preparation for the next day's scheduled exams. At any breakdown in this process (e.g., lost paperwork), patient exams were delayed and clinicians and patients became upset. To improve this process, we developed a paper-free workflow whereby protocoling is accomplished by scanning exam requests into our PACS. Using the common worklist functionality found in most PACS, we created "protocoling worklists" that contain these scanned documents. Radiologists protocol these studies in the PACS worklist (with the added benefit of having all imaging and report data available), and subsequently the technologists can see and act on the protocols they find in PACS. This process has significantly decreased interruptions in our busy reading rooms and decreased rework by IC staff.

  14. Data-driven CT protocol review and management—experience from a large academic hospital.

    PubMed

    Zhang, Da; Savage, Cristy A; Li, Xinhua; Liu, Bob

    2015-03-01

    Protocol review plays a critical role in CT quality assurance, but large numbers of protocols and inconsistent protocol names on scanners and in exam records make thorough protocol review formidable. In this investigation, we report on a data-driven cataloging process that can be used to assist in the reviewing and management of CT protocols. We collected lists of scanner protocols, as well as 18 months of recent exam records, for 10 clinical scanners. We developed computer algorithms to automatically deconstruct the protocol names on the scanner and in the exam records into core names and descriptive components. Based on the core names, we were able to group the scanner protocols into a much smaller set of "core protocols," and to easily link exam records with the scanner protocols. We calculated the percentage of usage for each core protocol, from which the most heavily used protocols were identified. From the percentage-of-usage data, we found that, on average, 18, 33, and 49 core protocols per scanner covered 80%, 90%, and 95%, respectively, of all exams. These numbers are one order of magnitude smaller than the typical numbers of protocols that are loaded on a scanner (200-300, as reported in the literature). Duplicated, outdated, and rarely used protocols on the scanners were easily pinpointed in the cataloging process. The data-driven cataloging process can facilitate the task of protocol review. Copyright © 2015 American College of Radiology. Published by Elsevier Inc. All rights reserved.
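The cataloging idea can be sketched in a few lines: normalize each protocol name to a core name, count usage per core protocol from the exam records, and walk the sorted counts to see how few core protocols cover most exams. The normalization rules and data below are illustrative only; the paper's actual deconstruction of names into core and descriptive components is institution-specific.

```python
import re
from collections import Counter

def core_name(protocol_name):
    # Strip descriptive components (contrast tags, numbering) to recover
    # a "core" protocol name. This regex is only illustrative.
    name = protocol_name.lower()
    name = re.sub(r"\b(w|wo|with|without|contrast|routine|\d+)\b", " ", name)
    return " ".join(name.split())

# Hypothetical exam records (one protocol name per exam).
exam_records = (
    ["HEAD ROUTINE"] * 50 + ["Head routine WO contrast"] * 30 +
    ["CHEST W CONTRAST"] * 15 + ["ABD PELVIS 2"] * 5
)

usage = Counter(core_name(r) for r in exam_records)

# Walk the sorted counts: how many core protocols cover 80% of exams?
total = sum(usage.values())
covered, n_core = 0, 0
for _, count in usage.most_common():
    covered += count
    n_core += 1
    if covered / total >= 0.80:
        break
```

Run over real exam records, the same walk yields the paper's headline numbers (on average 18, 33, and 49 core protocols covering 80%, 90%, and 95% of exams), and core protocols with zero or near-zero usage flag duplicated or outdated entries on the scanner.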

  15. Fractional optical cryptographic protocol for data containers in a noise-free multiuser environment

    NASA Astrophysics Data System (ADS)

    Jaramillo, Alexis; Barrera, John Fredy; Zea, Alejandro Vélez; Torroba, Roberto

    2018-03-01

    Optical encryption systems have great potential for flexible and high-performance data protection, making them an area of rapid development. However, most approaches present two main issues, namely, the presence of speckle noise and the degree of security they offer. Here we introduce an experimental implementation of an optical encryption protocol that tackles these issues by taking advantage of recent developments in the field. These developments include the introduction of information containers for noise-free information retrieval, the use of multiplexing to allow for a multiple-user environment, and an architecture based on the joint fractional Fourier transform that allows increased degrees of freedom and simplifies the experimental requirements. Thus, data handling via QR code containers involving multiple users, processed in a fractional joint transform correlator, produces coded information with increased security and ease of use. In this way, we can guarantee that only the user with the correct combination of encryption key and security parameters can achieve noise-free information after deciphering. We analyze the performance of the system when the order of the fractional Fourier transform is changed during decryption. We show experimental results that confirm the validity of our proposal.

  16. Semi-quantum Dialogue Based on Single Photons

    NASA Astrophysics Data System (ADS)

    Ye, Tian-Yu; Ye, Chong-Qiang

    2018-02-01

    In this paper, we propose two semi-quantum dialogue (SQD) protocols using single photons as the quantum carriers, where one requires the classical party to possess measurement capability and the other does not. The security against active attacks from an outside Eve in the first SQD protocol is guaranteed by the complete robustness of present semi-quantum key distribution (SQKD) protocols, the classical one-time pad encryption, the classical party's randomization operation and the decoy photon technology. The information leakage problem of the first SQD protocol is overcome by the classical party's classical basis measurements on the single photons carrying messages, which make him share their initial states with the quantum party. The security against active attacks from Eve in the second SQD protocol is guaranteed by the classical party's randomization operation, the complete robustness of present SQKD protocols and the classical one-time pad encryption. The information leakage problem of the second SQD protocol is overcome by the quantum party's classical basis measurements on each two adjacent single photons carrying messages, which make her share their initial states with the classical party. Compared with traditional information leakage resistant QD protocols, the advantage of the proposed SQD protocols lies in that they only require one party to have quantum capabilities. Compared with the existing SQD protocol, the advantage of the proposed SQD protocols lies in that they employ single photons rather than two-photon entangled states as the quantum carriers. The proposed SQD protocols can be implemented with present quantum technologies.

  17. A Simple XML Producer-Consumer Protocol

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    There are many different projects from government, academia, and industry that provide services for delivering events in distributed environments. The problem with these event services is that they are not general enough to support all uses and they speak different protocols so that they cannot interoperate. We require such interoperability when we, for example, wish to analyze the performance of an application in a distributed environment. Such an analysis might require performance information from the application, computer systems, networks, and scientific instruments. In this work we propose and evaluate a standard XML-based protocol for the transmission of events in distributed systems. One recent trend in government and academic research is the development and deployment of computational grids. Computational grids are large-scale distributed systems that typically consist of high-performance compute, storage, and networking resources. Examples of such computational grids are the DOE Science Grid, the NASA Information Power Grid (IPG), and the NSF Partnerships for Advanced Computing Infrastructure (PACIs). The major effort to deploy these grids is in the area of developing the software services to allow users to execute applications on these large and diverse sets of resources. These services include security, execution of remote applications, managing remote data, access to information about resources and services, and so on. There are several toolkits for providing these services such as Globus, Legion, and Condor. As part of these efforts to develop computational grids, the Global Grid Forum is working to standardize the protocols and APIs used by various grid services. This standardization will allow interoperability between the client and server software of the toolkits that are providing the grid services. 
The goal of the Performance Working Group of the Grid Forum is to standardize protocols and representations related to the storage and distribution of performance data. These standard protocols and representations must support tasks such as profiling parallel applications, monitoring the status of computers and networks, and monitoring the performance of services provided by a computational grid. This paper describes a proposed protocol and data representation for the exchange of events in a distributed system. The protocol exchanges messages formatted in XML, and it can be layered atop any low-level communication protocol such as TCP or UDP. Further, we describe Java and C++ implementations of this protocol and discuss their performance. The next section provides some further background information. Section 3 describes the main communication patterns of our protocol. Section 4 describes how we represent events and related information using XML. Section 5 describes our protocol and Section 6 discusses the performance of two implementations of the protocol. Finally, an appendix provides the XML Schema definition of our protocol and event information.
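    The event exchange described above can be sketched in miniature. The element and attribute names below are illustrative only (the real schema is the Grid Forum's, given in the paper's appendix); a hypothetical producer sends one XML-formatted event to a consumer over a loopback TCP socket.

```python
import socket
import threading
import xml.etree.ElementTree as ET

def make_event(source, name, value, timestamp):
    """Serialize one performance event as XML (illustrative schema)."""
    ev = ET.Element("event", {"source": source, "timestamp": timestamp})
    ET.SubElement(ev, "metric", {"name": name}).text = str(value)
    return ET.tostring(ev)

def consumer(server_sock, results):
    """Accept one connection, read the whole event, parse it back out of XML."""
    conn, _ = server_sock.accept()
    chunks = []
    while True:
        chunk = conn.recv(4096)
        if not chunk:
            break
        chunks.append(chunk)
    conn.close()
    root = ET.fromstring(b"".join(chunks))
    results["source"] = root.get("source")
    results["value"] = float(root.find("metric").text)

# Wire producer and consumer together over a loopback TCP socket.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
results = {}
t = threading.Thread(target=consumer, args=(srv, results))
t.start()

producer = socket.socket()
producer.connect(("127.0.0.1", srv.getsockname()[1]))
producer.sendall(make_event("node42.example", "cpu_load", 0.73, "2001-01-01T00:00:00Z"))
producer.close()
t.join()
srv.close()
print(results)
```

    Because the payload is self-describing XML, the same consumer works regardless of which transport the message arrived over, which is the interoperability point the paper argues for.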

  18. Attacks on quantum key distribution protocols that employ non-ITS authentication

    NASA Astrophysics Data System (ADS)

    Pacher, C.; Abidin, A.; Lorünser, T.; Peev, M.; Ursin, R.; Zeilinger, A.; Larsson, J.-Å.

    2016-01-01

    We demonstrate how adversaries with large computing resources can break quantum key distribution (QKD) protocols which employ a particular message authentication code suggested previously. This authentication code, featuring low key consumption, is not information-theoretically secure (ITS), since for each message the eavesdropper has intercepted she is able to send a different message from a set of messages that she can calculate by finding collisions of a cryptographic hash function. However, when this authentication code was introduced, it was shown to prevent straightforward man-in-the-middle (MITM) attacks against QKD protocols. In this paper, we prove that the set of messages that collide with any given message under this authentication code contains, with high probability, a message that has small Hamming distance to any other given message. Based on this fact, we present extended MITM attacks against different versions of BB84 QKD protocols using the addressed authentication code; for three protocols, we describe every single action taken by the adversary. For all protocols, the adversary can obtain complete knowledge of the key, and for most protocols her success probability in doing so approaches unity. Since the attacks work against all authentication methods that allow colliding messages to be calculated, the underlying building blocks of the presented attacks expose the potential pitfalls arising as a consequence of non-ITS authentication in QKD post-processing. We propose countermeasures that increase the eavesdropper's demand for computational power, and also prove necessary and sufficient conditions for upgrading the discussed authentication code to the ITS level.
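    The premise of the attack, that a non-ITS authentication tag admits colliding messages an adversary can enumerate by brute force, can be illustrated with a toy example. The 16-bit truncated tag below is our own assumption chosen for speed, not the attacked code; a real attacker runs the analogous search against a full-size tag with far more hardware.

```python
import hashlib

def tag(message: bytes, key: bytes) -> int:
    """Toy 16-bit authentication tag: a severely truncated keyed hash."""
    return int.from_bytes(hashlib.sha256(key + message).digest()[:2], "big")

key = b"shared-secret"            # hypothetical authentication key
target = tag(b"original message", key)

# Brute-force a different message carrying the same tag. With a 16-bit tag,
# roughly 2**16 attempts suffice on average, so a million candidates all but
# guarantee a collision.
collision = None
for i in range(1 << 20):
    candidate = b"forged message %d" % i
    if candidate != b"original message" and tag(candidate, key) == target:
        collision = candidate
        break

print(collision)
```

    An ITS (Wegman-Carter style) scheme avoids this by making every tag value equally likely for every message under the unknown key, so no amount of computation helps the forger.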

  19. Protocol Processing for 100 Gbit/s and Beyond - A Soft Real-Time Approach in Hardware and Software

    NASA Astrophysics Data System (ADS)

    Büchner, Steffen; Lopacinski, Lukasz; Kraemer, Rolf; Nolte, Jörg

    2017-09-01

    100 Gbit/s wireless communication protocol processing stresses all parts of a communication system to their limits. The efficient use of upcoming 100 Gbit/s and beyond transmission technology requires rethinking the way protocols are processed by the communication endpoints. This paper summarizes the achievements of the project End2End100. We present a comprehensive soft real-time stream processing approach that allows the protocol designer to develop, analyze, and plan scalable protocols for ultra-high data rates of 100 Gbit/s and beyond. Furthermore, we present an ultra-low-power, adaptable, and massively parallelized FEC (Forward Error Correction) scheme that detects and corrects bit errors at line rate with an energy consumption between 1 pJ/bit and 13 pJ/bit. The evaluation results discussed in this publication show that our comprehensive approach allows end-to-end communication with a very low protocol processing overhead.
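    As a minimal illustration of the detect-and-correct step that any line-rate FEC block performs, here is a classic Hamming(7,4) single-error-correcting code; the End2End100 scheme itself is far more elaborate and massively parallel, so this is only a conceptual sketch.

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit codeword (positions 1,2,4 are parity)."""
    p1 = d[0] ^ d[1] ^ d[3]   # covers codeword positions 1,3,5,7
    p2 = d[0] ^ d[2] ^ d[3]   # covers codeword positions 2,3,6,7
    p3 = d[1] ^ d[2] ^ d[3]   # covers codeword positions 4,5,6,7
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def hamming74_decode(c):
    """Correct up to one flipped bit and return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 | (s2 << 1) | (s3 << 2)  # 1-based position of the error, 0 if none
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
code = hamming74_encode(data)
code[3] ^= 1                        # flip one bit in the channel
print(hamming74_decode(code) == data)  # True: the single error is corrected
```

    A hardware FEC pipeline applies exactly this kind of syndrome computation, but over much stronger codes and to many codewords in parallel to sustain line rate.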

  20. Secure quantum communication using classical correlated channel

    NASA Astrophysics Data System (ADS)

    Costa, D.; de Almeida, N. G.; Villas-Boas, C. J.

    2016-10-01

    We propose a secure protocol to send quantum information from one party to another without a quantum channel. In our protocol, which resembles quantum teleportation, a sender (Alice) and a receiver (Bob) share classical correlated states instead of EPR ones, with Alice performing measurements in two different bases and then communicating her results to Bob through a classical channel. Our secure quantum communication protocol requires the same amount of classical bits as the standard quantum teleportation protocol. In our scheme, as in the usual quantum teleportation protocol, once the classical channel is established in a secure way, a spy (Eve) will never be able to recover the information of the unknown quantum state, even if she is aware of Alice's measurement results. Security, advantages, and limitations of our protocol are discussed and compared with the standard quantum teleportation protocol.
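    A purely classical toy can illustrate the resource this kind of scheme relies on: shared correlated randomness lets Alice convey her measurement results so that an intercepting Eve, who lacks the correlations, gains nothing. This one-time-pad analogy is our own illustration, not the authors' quantum protocol.

```python
import secrets

def share_correlated_bits(n):
    """Distribute identical random bit strings to Alice and Bob beforehand."""
    bits = [secrets.randbelow(2) for _ in range(n)]
    return bits, list(bits)

alice_key, bob_key = share_correlated_bits(8)
measurement_results = [1, 0, 1, 1, 0, 0, 1, 0]  # Alice's classical results (invented)

# Alice masks each result with her half of the shared correlations.
channel = [m ^ k for m, k in zip(measurement_results, alice_key)]

# Bob unmasks with his identical half; Eve only ever sees `channel`,
# which without the correlations is uniformly random.
recovered = [c ^ k for c, k in zip(channel, bob_key)]
print(recovered == measurement_results)  # True
```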

  1. Developing family planning nurse practitioner protocols.

    PubMed

    Hawkins, J W; Roberto, D

    1984-01-01

    This article focuses on the process of development of protocols for family planning nurse practitioners. A rationale for the use of protocols, a definition of the types and examples, and the pros and cons of practice with protocols are presented. A how-to description for the development process follows, including methods and a suggested tool for critique and evaluation. The aim of the article is to assist nurse practitioners in developing protocols for their practice.

  2. A Novel Process Audit for Standardized Perioperative Handoff Protocols.

    PubMed

    Pallekonda, Vinay; Scholl, Adam T; McKelvey, George M; Amhaz, Hassan; Essa, Deanna; Narreddy, Spurthy; Tan, Jens; Templonuevo, Mark; Ramirez, Sasha; Petrovic, Michelle A

    2017-11-01

    A perioperative handoff protocol provides a standardized delivery of communication during a handoff that occurs from the operating room to the postanesthesia care unit or ICU. The protocol's success is dependent, in part, on its continued proper use over time. A novel process audit was developed to help ensure that a perioperative handoff protocol is used accurately and appropriately over time. The Audit Observation Form is used for the Audit Phase of the process audit, while the Audit Averages Form is used for the Data Analysis Phase. Employing minimal resources and using quantitative methods, the process audit provides the necessary means to evaluate the proper execution of any perioperative handoff protocol. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.

  3. 78 FR 3431 - Proposed Information Collection Activity; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-16

    ... protocols to collect further qualitative information through interviews and/or focus groups with program... Readiness Goals and Head Start Program Functioning'' research project. The purpose of this study is to... functioning. ACF is proposing to use a semi-structured telephone interview protocol to collect information...

  4. OSI in the NASA science internet: An analysis

    NASA Technical Reports Server (NTRS)

    Nitzan, Rebecca

    1990-01-01

    The Open Systems Interconnection (OSI) protocol suite is a result of a world-wide effort to develop international standards for networking. OSI is formalized through the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). The goal of OSI is to provide interoperability between network products without relying on one particular vendor, and to do so on a multinational basis. The National Institute for Standards and Technology (NIST) has developed a Government OSI Profile (GOSIP) that specified a subset of the OSI protocols as a Federal Information Processing Standard (FIPS 146). GOSIP compatibility has been adopted as the direction for all U.S. government networks. OSI is extremely diverse, and therefore adherence to a profile will facilitate interoperability within OSI networks. All major computer vendors have indicated current or future support of GOSIP-compliant OSI protocols in their products. The NASA Science Internet (NSI) is an operational network, serving user requirements under NASA's Office of Space Science and Applications. NSI consists of the Space Physics Analysis Network (SPAN) that uses the DECnet protocols and the NASA Science Network (NSN) that uses TCP/IP protocols. The NSI Project Office is currently working on an OSI integration analysis and strategy. A long-term goal is to integrate SPAN and NSN into one unified network service, using a full OSI protocol suite, which will support the OSSA user community.

  5. Improving Treatment Response for Paediatric Anxiety Disorders: An Information-Processing Perspective.

    PubMed

    Ege, Sarah; Reinholdt-Dunne, Marie Louise

    2016-12-01

    Cognitive behavioural therapy (CBT) is considered the treatment of choice for paediatric anxiety disorders, yet there remains substantial room for improvement in treatment outcomes. This paper examines whether theory and research into the role of information-processing in the underlying psychopathology of paediatric anxiety disorders indicate possibilities for improving treatment response. Using a critical review of recent theoretical, empirical and academic literature, the paper examines the role of information-processing biases in paediatric anxiety disorders, the extent to which CBT targets information-processing biases, and possibilities for improving treatment response. The literature reviewed indicates a role for attentional and interpretational biases in anxious psychopathology. While there is theoretical grounding and limited empirical evidence to indicate that CBT ameliorates interpretational biases, evidence regarding the effects of CBT on attentional biases is mixed. Novel treatment methods including attention bias modification training, attention feedback awareness and control training, and mindfulness-based therapy may hold potential in targeting attentional biases, and thereby in improving treatment response. The integration of novel interventions into an existing evidence-based protocol is a complex issue and faces important challenges with regard to determining the optimal treatment package. Novel interventions targeting information-processing biases may hold potential in improving response to CBT for paediatric anxiety disorders. Many important questions remain to be answered.

  6. From workshop to work practice: An exploration of context and facilitation in the development of evidence-based practice.

    PubMed

    Ellis, Isabelle; Howard, Peter; Larson, Ann; Robertson, Jeanette

    2005-01-01

    This article examines the process of translating evidence into practice using a facilitation model developed by the Western Australian Centre for Evidence Based Nursing and Midwifery. Using the conceptual framework Promoting Action on Research Implementation in Health Services (PARIHS), the aims of the study were (1) to explore the relative and combined importance of context and facilitation in the successful implementation of a new evidence-based clinical practice protocol and (2) to examine the establishment of more lasting change to individuals and organizations that resulted in greater incorporation of the principles of evidence-based practice (EBP). Data were collected through a pre-workshop, semi-structured telephone survey with 16 nurse managers in six rural hospitals; a summative evaluation immediately post-workshop with 54 participants; and follow-up, semi-structured interviews with 23 workshop participants. The contexts in each of the participating hospitals were very different; of the six hospitals, only one had not implemented the new protocol. Five had reviewed their practices and brought them in line with the protocol developed at the workshop. The rate of adoption varied considerably, from 2 weeks to months. The participants reported being better informed about EBP in general and were positive about their ability to improve their practice and search more efficiently for best practice information. Underlying motivations for protocol development should be included in the PARIHS framework. IMPLICATIONS FOR EDUCATION: Good facilitation appears to be more influential than context in overcoming the barriers to the uptake of EBP.

  7. Security Issues for Mobile Medical Imaging: A Primer.

    PubMed

    Choudhri, Asim F; Chatterjee, Arindam R; Javan, Ramin; Radvany, Martin G; Shih, George

    2015-10-01

    The end-user of mobile device apps in the practice of clinical radiology should be aware of security measures that prevent unauthorized use of the device, including passcode policies, methods for dealing with failed login attempts, network manager-controllable passcode enforcement, and passcode enforcement for the protection of the mobile device itself. Protection of patient data must be in place that complies with the Health Insurance Portability and Accountability Act and U.S. Federal Information Processing Standards. Device security measures for data protection include methods for locally stored data encryption, hardware encryption, and the ability to locally and remotely clear data from the device. As these devices transfer information over both local wireless networks and public cell phone networks, wireless network security protocols, including wired equivalent privacy and Wi-Fi protected access, are important components in the chain of security. Specific virtual private network protocols, Secure Sockets Layer and related protocols (especially in the setting of hypertext transfer protocols), native apps, virtual desktops, and nonmedical commercial off-the-shelf apps require consideration in the transmission of medical data over both private and public networks. Enterprise security and management of both personal and enterprise mobile devices are discussed. Finally, specific standards for hardware and software platform security, including prevention of hardware tampering, protection from malicious software, and application authentication methods, are vital components in establishing a secure platform for the use of mobile devices in the medical field. © RSNA, 2015.
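    As one concrete piece of the transport-security chain discussed above, the sketch below shows Python's standard ssl module configured the way a client connection carrying medical data should be: certificate verification and hostname checking enforced, legacy protocol versions refused. The TLS 1.2 floor is our assumption, not a requirement stated in the primer.

```python
import ssl

# Client-side context with secure defaults: server certificates are validated
# against the system trust store and hostnames are checked against the cert.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse SSL 3.0 / TLS 1.0 / 1.1

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True
```

    Disabling either check (as some apps do to silence certificate errors) reintroduces exactly the man-in-the-middle exposure the primer warns about.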

  8. Analysis of energy efficient routing protocols for implementation of a ubiquitous health system

    NASA Astrophysics Data System (ADS)

    Kwon, Jongwon; Park, Yongman; Koo, Sangjun; Ayurzana, Odgeral; Kim, Hiesik

    2007-12-01

    Innovative ubiquitous health (U-Health) care was born through the convergence of medical services with up-to-date information technologies and ubiquitous IT. U-Health can be applied to a variety of situations to manage the functions of each medical center efficiently. This paper focuses on the evaluation of various routing protocols for the implementation of a U-Health monitoring system. To facilitate wireless communication over the network, a routing protocol on the network layer is used to establish a precise and efficient route between sensor nodes so that information acquired from sensors may be delivered in a timely manner. Route establishment should minimize overhead, data loss and power consumption, because wireless networks for U-Health are organized from a large number of sensor nodes that are small in size and have limited processing power, memory and battery life. This paper gives an overview of commonly known wireless sensor network technologies and evaluates three multi-hop routing protocols, flooding, gossiping and a modified low-energy adaptive clustering hierarchy (LEACH), for use with these networks using the TOSSIM simulator. Based on this evaluation, an integrated wireless sensor board was developed. The board is an embedded device based on the AVR128, running TinyOS, and employs a biosensor that measures blood pressure and pulse frequency together with a ZigBee module for wireless communication. This work accelerates the digital convergence age through continual research and development of U-Health-related technologies.
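    The difference between two of the evaluated protocols can be sketched with a toy simulation: in flooding every node re-forwards a received packet, while in gossiping each node re-forwards only with some probability, trading coverage for fewer (energy-costly) transmissions. The topology, probabilities and counts below are invented for illustration and unrelated to the paper's TOSSIM results.

```python
import random

def build_network(n, degree, seed=1):
    """Random directed topology: each node gets `degree` outgoing neighbors."""
    rng = random.Random(seed)
    return {u: rng.sample([v for v in range(n) if v != u], degree)
            for u in range(n)}

def propagate(neigh, src, p_forward, rng):
    """Broadcast from src. Every reached node receives the packet; each newly
    reached node re-forwards it with probability p_forward.
    p_forward = 1.0 is flooding; p_forward < 1.0 is gossiping."""
    seen, frontier, transmissions = {src}, [src], 0
    while frontier:
        nxt = []
        for u in frontier:
            transmissions += 1  # each forwarding node transmits once
            for v in neigh[u]:
                if v not in seen:
                    seen.add(v)
                    if rng.random() < p_forward:
                        nxt.append(v)
        frontier = nxt
    return len(seen), transmissions

rng = random.Random(7)
net = build_network(n=100, degree=4)
flood_cov, flood_tx = propagate(net, 0, 1.0, rng)
gossip_cov, gossip_tx = propagate(net, 0, 0.7, rng)
print("flooding: ", flood_cov, "nodes reached,", flood_tx, "transmissions")
print("gossiping:", gossip_cov, "nodes reached,", gossip_tx, "transmissions")
```

    LEACH goes further by electing rotating cluster heads so that most nodes transmit only short-range to their head, which is why it tends to win on energy in such comparisons.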

  9. The effects of Cognitive Bias Modification training and oxytocin administration on trust in maternal support: study protocol for a randomized controlled trial.

    PubMed

    Verhees, Martine W F T; Ceulemans, Eva; Bakermans-Kranenburg, Marian J; van IJzendoorn, Marinus H; de Winter, Simon; Bosmans, Guy

    2017-07-14

    Lack of trust in parental support is a transdiagnostic risk factor for the development of psychological problems throughout the lifespan. Research suggests that children's cognitive attachment representations and related information processing biases could be an important target for interventions aiming to build trust in the parent-child relationship. A paradigm that can alter these biases and increase trust is that of Cognitive Bias Modification (CBM), during which a target processing bias is systematically trained. Trust-related CBM training effects could possibly be enhanced by oxytocin, a neuropeptide that has been proposed to play an important role in social information processing and social relationships. The present article describes the study protocol for a double-blind randomized controlled trial (RCT) aimed at testing the individual and combined effects of CBM training and oxytocin administration on trust in maternal support. One hundred children (aged 8-12 years) are randomly assigned to one of four intervention conditions. Participants inhale a nasal spray that either contains oxytocin (OT) or a placebo. Additionally, they receive either a CBM training aimed at positively modifying trust-related information processing bias or a neutral placebo training aimed to have no trust-related effects. Main and interaction effects of the interventions are assessed on three levels of trust-related outcome measures: trust-related interpretation bias; self-reported trust; and mother-child interactional behavior. Importantly, side-effects of a single administration of OT in middle childhood are monitored closely to provide further information on the safety of OT administration in this age group. The present RCT is the first study to combine CBM training with oxytocin to test for individual and combined effects on trust in mother. 
If effective, CBM training and oxytocin could be easily applicable and nonintrusive additions to interventions that target trust in the context of the parent-child relationship. ClinicalTrials.gov, ID: NCT02737254 . Registered on 23 March 2016.

  10. Partial-mouth periodontal examination protocols for the determination of the prevalence and extent of gingival bleeding in adolescents.

    PubMed

    Machado, Michely Ediani; Tomazoni, Fernanda; Casarin, Maísa; Ardenghi, Thiago M; Zanatta, Fabricio Batistin

    2017-10-01

    To compare the performance of partial-mouth periodontal examination (PMPE) protocols with different cut-off points to the full-mouth examination (FME) in the assessment of the prevalence and extent of gingival bleeding in adolescents. A cross-sectional study was conducted involving 12-year-old adolescents. Following a systematic two-stage cluster sampling process, 1134 individuals were evaluated. Different PMPE protocols were compared to the FME with six sites per tooth. Sensitivity, specificity, area under the ROC curve (AUC), intraclass correlation coefficient (ICC), relative and absolute biases and the inflation factor were assessed for each PMPE protocol with different cut-off points for the severity of gingival bleeding. The highest AUC values were found for the six-site two-diagonal quadrant (2-4) (0.97), six-site random half-mouth (0.95) and Community Periodontal Index (0.95) protocols. The assessment of three sites [mesiobuccal (MB), buccal (B) and distolingual (DL)] in two diagonal quadrants and the random half-mouth protocol had higher sensitivity and lower specificity than the same protocols with distobuccal (DB) sites. However, the use of DB sites led to better specificity and improved the balance between sensitivity and specificity, except for the two-diagonal quadrant (1-3) protocol. The ≥1 cut-off point led to the most discrepant results from the FME. Six-site two-diagonal quadrant (2-4) and random half-mouth assessments perform better in the evaluation of gingival bleeding in adolescents. However, when a faster protocol is needed, a two-diagonal quadrant assessment using only MB, B and DL sites can be used with no important loss of information. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
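    The headline metrics can be made concrete with a small sketch: each subject's full-mouth examination serves as the reference standard, and sensitivity and specificity summarize how often a partial-mouth protocol agrees with it. The counts below are invented for illustration, not the study's data.

```python
def sensitivity_specificity(pairs):
    """pairs: (pmpe_positive, fme_positive) per subject; FME is the reference."""
    tp = sum(1 for p, f in pairs if p and f)
    fn = sum(1 for p, f in pairs if not p and f)
    tn = sum(1 for p, f in pairs if not p and not f)
    fp = sum(1 for p, f in pairs if p and not f)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical cohort: 6 subjects positive on FME, 4 negative.
pairs = ([(True, True)] * 5 + [(False, True)]        # 5 detected, 1 missed
         + [(False, False)] * 3 + [(True, False)])   # 3 correct negatives, 1 false positive
sens, spec = sensitivity_specificity(pairs)
print(round(sens, 3), spec)  # 0.833 0.75
```

    Adding DB sites to a protocol shifts such counts toward fewer false positives (higher specificity) at the cost of some missed bleeding sites, which is exactly the trade-off the study quantifies.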

  11. Junior high school students' cognitive process in solving the developed algebraic problems based on information processing taxonomy model

    NASA Astrophysics Data System (ADS)

    Purwoko, Saad, Noor Shah; Tajudin, Nor'ain Mohd

    2017-05-01

    This study aims to: i) develop problem-solving questions on Linear Equations Systems of Two Variables (LESTV) based on the levels of the IPT Model; ii) explain the level of students' information-processing skill in solving LESTV problems; iii) explain students' information-processing skill in solving LESTV problems; and iv) explain students' cognitive process in solving LESTV problems. This study involves three phases: i) development of LESTV problem questions based on the Tessmer Model; ii) a quantitative survey method analyzing students' level of information-processing skill; and iii) a qualitative case study method analyzing students' cognitive process. The population of the study was 545 eighth-grade students, represented by a sample of 170 students from five junior high schools in the Hilir Barat Zone, Palembang (Indonesia), chosen using cluster sampling. Fifteen of these students were drawn as a sample for the interview session, with saturated information obtained. The data were collected using the LESTV problem-solving test and the interview protocol. The quantitative data were analyzed using descriptive statistics, while the qualitative data were analyzed using content analysis. The findings indicated that students' cognitive process reached only the step of identifying external sources and fluently executing algorithms in short-term memory. Only 15.29% of students could retrieve type A information and 5.88% could retrieve type B information from long-term memory. The implication is that the developed LESTV problems validated the IPT Model for assessing students at different levels of the hierarchy.

  12. Chemistry in Bioinformatics

    PubMed Central

    Murray-Rust, Peter; Mitchell, John BO; Rzepa, Henry S

    2005-01-01

    Chemical information is now seen as critical for most areas of life sciences. But unlike Bioinformatics, where data is openly available and freely re-usable, most chemical information is closed and cannot be re-distributed without permission. This has led to a failure to adopt modern informatics and software techniques and therefore paucity of chemistry in bioinformatics. New technology, however, offers the hope of making chemical data (compounds and properties) free during the authoring process. We argue that the technology is already available; we require a collective agreement to enhance publication protocols. PMID:15941476

  13. Remote preparation of an atomic quantum memory.

    PubMed

    Rosenfeld, Wenjamin; Berner, Stefan; Volz, Jürgen; Weber, Markus; Weinfurter, Harald

    2007-02-02

    Storage and distribution of quantum information are key elements of quantum information processing and future quantum communication networks. Here, using atom-photon entanglement as the main physical resource, we experimentally demonstrate the preparation of a distant atomic quantum memory. Applying a quantum teleportation protocol on a locally prepared state of a photonic qubit, we realized this so-called remote state preparation on a single, optically trapped 87Rb atom. We evaluated the performance of this scheme by the full tomography of the prepared atomic state, reaching an average fidelity of 82%.

  14. Association between heart rhythm and cortical sound processing.

    PubMed

    Marcomini, Renata S; Frizzo, Ana Claúdia F; de Góes, Viviane B; Regaçone, Simone F; Garner, David M; Raimundo, Rodrigo D; Oliveira, Fernando R; Valenti, Vitor E

    2018-04-26

    Sound signal processing is an important factor in human conscious communication, and it may be assessed through cortical auditory evoked potentials (CAEP). Heart rate variability (HRV) provides information about autonomic regulation of heart rate. We investigated the association between resting HRV and CAEP. We evaluated resting HRV in the time and frequency domains and the CAEP components. The subjects remained at rest for 10 minutes for HRV recording, then performed the CAEP examinations through frequency and duration protocols in both ears. Linear regression indicated that the amplitude of the N2 wave of the CAEP in the left ear (not the right ear) was significantly influenced by the standard deviation of normal-to-normal RR-intervals (17.7%) and the percentage of adjacent RR-intervals differing in duration by more than 50 milliseconds (25.3%), both time-domain HRV indices, in the frequency protocol. In the duration protocol and in the left ear, the latency of the P2 wave was significantly influenced by the low-frequency (LF) (20.8%) and high-frequency (HF) bands in normalized units (21%) and the LF/HF ratio (22.4%) indices of HRV spectral analysis. The latency of the N2 wave was significantly influenced by LF (25.8%), HF (25.9%) and LF/HF (28.8%). In conclusion, we propose that resting heart rhythm is associated with the thalamo-cortical, cortico-cortical and auditory cortex pathways involved in auditory processing in the right hemisphere.

  15. A Cartesian reflex assessment of face processing.

    PubMed

    Polewan, Robert J; Vigorito, Christopher M; Nason, Christopher D; Block, Richard A; Moore, John W

    2006-03-01

    Commands to blink were embedded within pictures of faces and simple geometric shapes or forms. The faces and shapes were conditioned stimuli (CSs), and the required responses were conditioned responses, or more properly, Cartesian reflexes (CRs). As in classical conditioning protocols, response times (RTs) were measured from CS onset. RTs provided a measure of the processing cost (PC) of attending to a CS. A PC is the extra time required to respond relative to RTs to unconditioned stimulus (US) commands presented alone. They reflect the interplay between attentional processing of the informational content of a CS and its signaling function with respect to the US command. This resulted in longer RTs to embedded commands. Differences between PCs of faces and geometric shapes represent a starting place for a new mental chronometry based on the traditional idea that differences in RT reflect differences in information processing.

  16. An approach for setting evidence-based and stakeholder-informed research priorities in low- and middle-income countries.

    PubMed

    Rehfuess, Eva A; Durão, Solange; Kyamanywa, Patrick; Meerpohl, Joerg J; Young, Taryn; Rohwer, Anke

    2016-04-01

    To derive evidence-based and stakeholder-informed research priorities for implementation in African settings, the international research consortium Collaboration for Evidence-Based Healthcare and Public Health in Africa (CEBHA+) developed and applied a pragmatic approach. First, an online survey and face-to-face consultation between CEBHA+ partners and policy-makers generated priority research areas. Second, evidence maps for these priority research areas identified gaps and related priority research questions. Finally, study protocols were developed for inclusion within a grant proposal. Policy and practice representatives were involved throughout the process. Tuberculosis, diabetes, hypertension and road traffic injuries were selected as priority research areas. Evidence maps covered screening and models of care for diabetes and hypertension, population-level prevention of diabetes and hypertension and their risk factors, and prevention and management of road traffic injuries. Analysis of these maps yielded three priority research questions on hypertension and diabetes and one on road traffic injuries. The four resulting study protocols employ a broad range of primary and secondary research methods; a fifth promotes an integrated methodological approach across all research activities. The CEBHA+ approach, in particular evidence mapping, helped to formulate research questions and study protocols that would be owned by African partners, fill gaps in the evidence base, address policy and practice needs and be feasible given the existing research infrastructure and expertise. The consortium believes that the continuous involvement of decision-makers throughout the research process is an important means of ensuring that studies are relevant to the African context and that findings are rapidly implemented.

  17. A Methodology and Software Environment for Testing Process Model’s Sequential Predictions with Protocols

    DTIC Science & Technology

    1992-12-21

    …(in preparation). Foundations of artificial intelligence. Cambridge, MA: MIT Press. O'Reilly, R. C. (1991). X3DNet: An X-Based Neural Network… 2.2.3 Trace based protocol analysis… 2.2.4 Summary of important data features… 2.3 Tools related to process model testing… 2.3.1 Tools for building… algorithm… 3. Requirements for testing process models using trace based protocol analysis… 3.1 Definition of trace based protocol analysis (TBPA)

  18. Integrating human impacts and ecological integrity into a risk-based protocol for conservation planning

    USGS Publications Warehouse

    Mattson, K.M.; Angermeier, P.L.

    2007-01-01

    Conservation planning aims to protect biodiversity by sustaining the natural physical, chemical, and biological processes within representative ecosystems. Often data to measure these components are inadequate or unavailable. The impact of human activities on ecosystem processes complicates integrity assessments and might alter ecosystem organization at multiple spatial scales. Freshwater conservation targets, such as populations and communities, are influenced by both intrinsic aquatic properties and the surrounding landscape, and locally collected data might not accurately reflect potential impacts. We suggest that changes in five major biotic drivers (energy sources, physical habitat, flow regime, water quality, and biotic interactions) might be used as surrogates to inform conservation planners of the ecological integrity of freshwater ecosystems. Threats to freshwater systems might be evaluated based on their impact to these drivers to provide an overview of potential risk to conservation targets. We developed a risk-based protocol, the Ecological Risk Index (ERI), to identify watersheds with least/most risk to conservation targets. Our protocol combines risk-based components, specifically the frequency and severity of human-induced stressors, with biotic drivers and mappable land- and water-use data to provide a summary of relative risk to watersheds. We illustrate application of our protocol with a case study of the upper Tennessee River basin, USA. Differences in risk patterns among the major drainages in the basin reflect dominant land uses, such as mining and agriculture. A principal components analysis showed that localized, moderately severe threats accounted for most of the threat-composition differences among our watersheds. We also found that the relative importance of threats is sensitive to the spatial grain of the analysis. Our case study demonstrates that the ERI is useful for evaluating the frequency and severity of ecosystem-wide risk, which can inform local and regional conservation planning. © 2006 Springer Science+Business Media, Inc.

  19. NASA STI Program Coordinating Council Twelfth Meeting: Standards

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The theme of this NASA Scientific and Technical Information Program Coordinating Council Meeting was standards and their formation and application. Topics covered included scientific and technical information architecture, the Open Systems Interconnection and Transmission Control Protocol/Internet Protocol standards, Machine-Readable Cataloging (MARC), open system environment procurement, and the Government Information Locator Service.

  20. Telemetry Standards, RCC Standard 106-17. Chapter 21. Telemetry Network Standard Introduction

    DTIC Science & Technology

    2017-07-01

    Critical RF radio frequency RFC Request for Comment SNMP Simple Network Management Protocol TA test article TCP Transmission Control Protocol...chapters might be of most interest for a particular reader. In order to guide the reader toward the chapters of further interest, the applicable... Simple Network Management Protocol (SNMP) to pass management information through the system. The SNMP management information bases (MIBs) provide

  1. Dependency of image quality on acquisition protocol and image processing in chest tomosynthesis-a visual grading study based on clinical data.

    PubMed

    Jadidi, Masoud; Båth, Magnus; Nyrén, Sven

    2018-04-09

    To compare the quality of images obtained with two protocols with different acquisition times, and the influence of image post-processing, in a chest digital tomosynthesis (DTS) system. 20 patients with suspected lung cancer were imaged with chest X-ray equipment with a tomosynthesis option. Two examination protocols with different acquisition times (6.3 and 12 s) were performed on each patient. Both protocols were presented with two different image post-processings (standard DTS processing and a more advanced processing optimised for chest radiography). Thus, 4 series from each patient, altogether 80 series, were presented anonymously and in random order. Five observers rated the quality of the reconstructed section images according to predefined quality criteria in three different classes. Visual grading characteristics (VGC) analysis was used to analyse the data, with the area under the VGC curve (AUC_VGC) as figure-of-merit. The 12 s protocol and the standard DTS processing were used as references in the analyses. The protocol with 6.3 s acquisition time had a statistically significant advantage over the vendor-recommended protocol with 12 s acquisition time for the classes of criteria Demarcation (AUC_VGC = 0.56, p = 0.009) and Disturbance (AUC_VGC = 0.58, p < 0.001). A similar AUC_VGC value was found for the class Structure (definition of bone structures in the spine) (0.56), but it could not be statistically separated from 0.5 (p = 0.21). For the image processing, the VGC analysis showed a small but statistically significant advantage for the standard DTS processing over the more advanced processing for the classes of criteria Demarcation (AUC_VGC = 0.45, p = 0.017) and Disturbance (AUC_VGC = 0.43, p = 0.005). A similar AUC_VGC value was found for the class Structure (0.46), but it could not be statistically separated from 0.5 (p = 0.31). The study indicates that the protocol with 6.3 s acquisition time yields slightly better image quality than the vendor-recommended 12 s protocol for several anatomical structures. Furthermore, the standard gradation processing (the vendor-recommended post-processing for DTS) yields a modest advantage over the gradation processing/multiobjective frequency processing/flexible noise control processing in terms of image quality for all classes of criteria. Advances in knowledge: The study shows that image quality may be strongly affected by the selection of DTS protocol and that the vendor-recommended protocol may not always be the optimal choice.
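    The AUC_VGC figure-of-merit used above can be sketched in a few lines: for each rating threshold, the cumulative proportion of ratings for one protocol is plotted against that of the other, and the area under the resulting curve is computed (0.5 means no quality difference). A minimal Python sketch with hypothetical ratings, not the study's data or analysis code:

```python
import numpy as np

def vgc_auc(ratings_a, ratings_b, scale=(1, 2, 3, 4, 5)):
    """Area under the visual grading characteristics (VGC) curve.

    For each threshold t (highest first), x is the fraction of protocol-B
    ratings >= t and y the fraction of protocol-A ratings >= t; the area
    under the piecewise-linear curve from (0, 0) to (1, 1) is AUC_VGC.
    """
    a = np.asarray(ratings_a)
    b = np.asarray(ratings_b)
    xs, ys = [0.0], [0.0]
    for t in sorted(scale, reverse=True):
        xs.append(float(np.mean(b >= t)))
        ys.append(float(np.mean(a >= t)))
    xs, ys = np.array(xs), np.array(ys)
    # Trapezoidal rule over the cumulative-proportion points.
    return float(np.sum((xs[1:] - xs[:-1]) * (ys[1:] + ys[:-1]) / 2.0))

# Hypothetical ratings (1 = worst, 5 = best) for two protocols:
short_protocol = [4, 4, 5, 3, 4, 5, 4, 3]
long_protocol = [3, 4, 4, 3, 3, 4, 4, 3]
auc = vgc_auc(short_protocol, long_protocol)  # -> 0.6875, i.e. > 0.5
```

    A value above 0.5 favours the first protocol; swapping the arguments yields the complementary area.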

  2. Non-Markovianity and reservoir memory of quantum channels: a quantum information theory perspective

    PubMed Central

    Bylicka, B.; Chruściński, D.; Maniscalco, S.

    2014-01-01

    Quantum technologies rely on the ability to coherently transfer information encoded in quantum states along quantum channels. Decoherence induced by the environment sets limits on the efficiency of any quantum-enhanced protocol. Generally, the longer a quantum channel, the worse its capacity. We show that for non-Markovian quantum channels this is not always true: surprisingly, the capacity of a longer channel can be greater than that of a shorter one. We introduce a general theoretical framework linking non-Markovianity to the capacities of quantum channels and demonstrate how harnessing non-Markovianity may improve the efficiency of quantum information processing and communication. PMID:25043763

  3. Optimization of a sample processing protocol for recovery of Bacillus anthracis spores from soil

    USGS Publications Warehouse

    Silvestri, Erin E.; Feldhake, David; Griffin, Dale; Lisle, John T.; Nichols, Tonya L.; Shah, Sanjiv; Pemberton, A; Schaefer III, Frank W

    2016-01-01

    Following a release of Bacillus anthracis spores into the environment, there is a potential for lasting environmental contamination in soils. There is a need for detection protocols for B. anthracis in environmental matrices. However, identification of B. anthracis within a soil is a difficult task. Processing soil samples helps to remove debris, chemical components, and biological impurities that can interfere with microbiological detection. This study aimed to optimize a previously used indirect processing protocol, which included a series of washing and centrifugation steps. Optimization of the protocol included identifying an ideal extraction diluent and varying the number of wash steps, the initial centrifugation speed, and the sonication and shaking mechanisms. The optimized protocol was demonstrated at two laboratories in order to evaluate the recovery of spores from loamy and sandy soils. The new protocol demonstrated an improved limit of detection for loamy and sandy soils over the non-optimized protocol, with an approximate matrix limit of detection of 14 spores/g of soil. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol will be robust enough to use at multiple laboratories while achieving comparable recoveries.

  4. Illustration Watermarking for Digital Images: An Investigation of Hierarchical Signal Inheritances for Nested Object-based Embedding

    DTIC Science & Technology

    2007-02-23

    approach for signal-level watermark inheritance. 15. SUBJECT TERMS EOARD, Steganography, Image Fusion, Data Mining, Image ...in watermarking algorithms, a program interface and protocol has been developed, which allows control of the embedding and retrieval processes by the...watermarks in an image. Watermarking algorithm (DLL) - Watermarking editor (Delphi) - User marks all objects: ci - class information, oi - object instance

  5. Intervention mapping protocol for developing a theory-based diabetes self-management education program.

    PubMed

    Song, Misoon; Choi, Suyoung; Kim, Se-An; Seo, Kyoungsan; Lee, Soo Jin

    2015-01-01

    Development of behavior theory-based health promotion programs is encouraged with the paradigm shift from contents to behavior outcomes. This article describes the development process of the diabetes self-management program for older Koreans (DSME-OK) using the intervention mapping (IM) protocol. The IM protocol includes needs assessment, defining goals and objectives, identifying theory and determinants, developing a matrix to form change objectives, selecting strategies and methods, structuring the program, and planning for evaluation and pilot testing. The DSME-OK adopted seven behavior objectives developed by the American Association of Diabetes Educators as behavioral outcomes. The program applied the information-motivation-behavioral skills (IMB) model, and interventions were targeted to its 3 determinants to change health behaviors. Specific methods were selected to achieve each objective, guided by the IM protocol. As the final step, program evaluation was planned, including a pilot test. The DSME-OK was structured so that the 3 determinants of the IMB model were addressed to achieve the behavior objectives in each session. The program has 12 weekly 90-min sessions tailored for older adults. Using the IM protocol in developing a theory-based self-management program was beneficial in terms of providing a systematic guide to developing theory-based and behavior outcome-focused health education programs.

  6. Public health and terrorism preparedness: cross-border issues.

    PubMed

    Olson, Debra; Leitheiser, Aggie; Atchison, Christopher; Larson, Susan; Homzik, Cassandra

    2005-01-01

    On December 15, 2003, the Centers for Public Health Preparedness at the University of Minnesota and the University of Iowa convened the "Public Health and Terrorism Preparedness: Cross-Border Issues Roundtable." The purpose of the roundtable was to gather public health professionals and government agency representatives at the state, provincial, and local levels to identify unmet cross-border emergency preparedness and response needs and develop strategies for addressing these needs. Representatives from six state and local public health departments and three provincial governments were invited to identify cross-border needs and issues using a nominal group process. The result of the roundtable was identification of the needs considered most important and most doable across all the focus groups. The need to collaborate on and exchange plans and protocols among agencies was identified as most important and most doable across all groups. Development of contact protocols and creation and maintenance of a contact database was also considered important and doable for a majority of groups. Other needs ranked important across the majority of groups included specific isolation and quarantine protocols for multi-state responses; a system for rapid and secure exchange of information; specific protocols for sharing human resources across borders, including emergency credentials for physicians and health care workers; and a specific protocol to coordinate Strategic National Stockpile mechanisms across border communities.

  7. Moving Triadic Gaze Intervention Into Practice: Measuring Clinician Attitude and Implementation Fidelity

    PubMed Central

    Olswang, Lesley B.; Greenslade, Kathryn; Pinder, Gay Lloyd; Dowden, Patricia; Madden, Jodi

    2017-01-01

    Purpose This research investigated a first step in implementing the dynamic assessment (DA) component of Triadic Gaze Intervention (Olswang, Feuerstein, Pinder, & Dowden, 2013; Olswang et al., 2014), an evidence-based protocol for teaching early signals of communication to young children with physical disabilities. Clinician attitudes about adopting external evidence into practice and implementation fidelity in DA protocol delivery were examined following training. Method Seven early intervention clinicians from multiple disciplines were trained to deliver the four essential elements of the DA protocol: (a) provide communication opportunity, (b) recognize child's potentially communicative signal, (c) shape child's signal toward triadic gaze, and (d) reinforce with play. Clinician attitude regarding adopting evidence into practice was measured at baseline and follow-up, with the Evidence-Based Practice Attitude Scale (Aarons, 2004). Implementation fidelity in delivering the protocol was measured for adherence (accuracy) and competence (quality) during trial implementation. Results Clinicians' attitudes about trying new evidence that at first was perceived as incongruent with their practice improved over the course of the research. Clinicians demonstrated strong adherence to the DA protocol; however, competence varied across clinicians and appeared related to child performance. Conclusions The results provided insight into moving Triadic Gaze Intervention into practice and yielded valuable information regarding the implementation process, with implications for future research. PMID:28525577

  8. A Standard Mutual Authentication Protocol for Cloud Computing Based Health Care System.

    PubMed

    Mohit, Prerna; Amin, Ruhul; Karati, Arijit; Biswas, G P; Khan, Muhammad Khurram

    2017-04-01

    Telecare Medical Information System (TMIS) provides a standard platform for patients to receive necessary medical treatment from doctor(s) via Internet communication. Security protection is important for patients' medical records (data) because of the very sensitive information they contain. Besides, patient anonymity is another important property that must be protected. Most recently, Chiou et al. suggested an authentication protocol for TMIS utilizing the concept of a cloud environment. They claimed that their protocol preserves patient anonymity and is well protected. We reviewed their protocol and found that it completely fails to preserve patient anonymity. Further, the same protocol is not protected against a stolen mobile device attack. In order to improve the security level and complexity, we design a lightweight authentication protocol for the same environment. Our security analysis ensures resilience against all possible security attacks. The performance of our protocol is relatively standard in comparison with the related previous research.
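    For illustration only, the kind of primitive such lightweight mutual authentication protocols build on can be sketched as an HMAC-based challenge-response handshake. Every name, message, and key in this sketch is an assumption; it is not the scheme proposed in the paper:

```python
import hashlib
import hmac
import os

# Illustrative pre-shared key between patient device and cloud server.
SHARED_KEY = os.urandom(32)

def tag(key, *parts):
    """Keyed authentication tag over the concatenated message parts."""
    return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

# 1. Device -> Server: a fresh nonce (no identity sent in the clear, which is
#    how such schemes keep the patient anonymous on the wire).
nd = os.urandom(16)
# 2. Server -> Device: its own nonce plus a proof over both nonces.
ns = os.urandom(16)
server_proof = tag(SHARED_KEY, b"server", nd, ns)
# 3. Device verifies the server's proof, then answers with its own.
assert hmac.compare_digest(server_proof, tag(SHARED_KEY, b"server", nd, ns))
device_proof = tag(SHARED_KEY, b"device", nd, ns)
# 4. Server verifies the device's proof; both sides then derive a session key
#    bound to this exchange for protecting the medical records.
assert hmac.compare_digest(device_proof, tag(SHARED_KEY, b"device", nd, ns))
session_key = tag(SHARED_KEY, b"session", nd, ns)
```

    Binding both nonces into every tag is what defeats simple replay; the role labels keep the two proofs from being reflected back.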

  9. Development of Information Assurance Protocol for Low Bandwidth Nanosatellite Communications

    DTIC Science & Technology

    2017-09-01

    INFORMATION ASSURANCE PROTOCOL FOR LOW BANDWIDTH NANOSATELLITE COMMUNICATIONS by Cervando A. Banuelos II September 2017 Thesis Advisor...

  10. Research ethics board approval for an international thromboprophylaxis trial.

    PubMed

    Lutz, Kristina; Wilton, Kelly; Zytaruk, Nicole; Julien, Lisa; Hall, Richard; Harvey, Johanne; Skrobik, Yoanna; Vlahakis, Nicholas; Meade, Laurie; Matte, Andrea; Meade, Maureen; Burns, Karen; Albert, Martin; Cash, Bronwyn Barlow; Vallance, Shirley; Klinger, James; Heels-Ansdell, Diane; Cook, Deborah

    2012-06-01

    Research ethics board (REB) review of scientific protocols is essential, ensuring participants' dignity, safety, and rights. The objectives of this study were to examine the time from submission to approval, to analyze predictors of approval time, and to describe the scope of conditions from REBs evaluating an international thromboprophylaxis trial. We generated survey items through literature review and investigators' discussions, creating 4 domains: respondent and institutional demographics, the REB application process, and alternate consent models. We conducted a document analysis that involved duplicate assessment of themes from REB critique of the protocol and informed consent forms (ICF). Approval was granted from 65 REB institutions, requiring 58 unique applications. We analyzed 44 (75.9%) of 58 documents and surveys. Survey respondents completing the applications had 8 (5-12) years of experience; 77% completed 4 or more REB applications in previous 5 years. Critical care personnel were represented on 54% of REBs. The time to approval was a median (interquartile range) of 75 (42, 150) days, taking longer for sites with national research consortium membership (89.1 vs 31.0 days, P = .03). Document analysis of the application process and ICF yielded 5 themes: methodology, data management, consent procedures, cataloguing, and miscellaneous. Protocol-specific themes focused on trial implementation, external critiques, and budget. The only theme specific to the ICF was risks and benefits. The most frequent comments on the protocol and ICF were about methodology and miscellaneous issues; ICF comments also addressed study risks and benefits. More studies on methods to enhance efficiency and consistency of the REB approval processes for clinical trials are needed while still maintaining high ethical standards. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. A laboratory information management system for the analysis of tritium (3H) in environmental waters.

    PubMed

    Belachew, Dagnachew Legesse; Terzer-Wassmuth, Stefan; Wassenaar, Leonard I; Klaus, Philipp M; Copia, Lorenzo; Araguás, Luis J Araguás; Aggarwal, Pradeep

    2018-07-01

    Accurate and precise measurements of low levels of tritium (³H) in environmental waters are difficult to attain due to complex steps of sample preparation, electrolytic enrichment, liquid scintillation decay counting, and extensive data processing. We present a Microsoft Access™ relational database application, TRIMS (Tritium Information Management System), to assist with sample and data processing of tritium analysis by managing the processes from sample registration and analysis to reporting and archiving. A complete uncertainty propagation algorithm ensures tritium results are reported with robust uncertainty metrics. TRIMS will help to increase laboratory productivity and improve the accuracy and precision of ³H assays. The software supports several enrichment protocols and LSC counter types. TRIMS is available for download at no cost from the IAEA at www.iaea.org/water. Copyright © 2018 Elsevier Ltd. All rights reserved.
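    The core of such an uncertainty propagation, combining the Poisson counting uncertainty with the relative uncertainties of counting efficiency and electrolytic enrichment in quadrature, can be sketched as follows. This is an illustrative toy, not the TRIMS algorithm, and all parameter values are hypothetical:

```python
import math

def tritium_result(gross_cpm, bkg_cpm, count_min, efficiency, u_eff_rel,
                   enrichment, u_enr_rel):
    """Activity (arbitrary units) and its 1-sigma uncertainty for a tritium
    assay by enriched liquid scintillation counting (illustrative sketch)."""
    # Net count rate and its Poisson uncertainty from the accumulated counts.
    net = gross_cpm - bkg_cpm
    u_net = math.sqrt(gross_cpm / count_min + bkg_cpm / count_min)
    # Correct for counting efficiency and enrichment factor; relative
    # uncertainties of independent factors combine in quadrature.
    activity = net / (efficiency * enrichment)
    u_rel = math.sqrt((u_net / net) ** 2 + u_eff_rel ** 2 + u_enr_rel ** 2)
    return activity, activity * u_rel

# Hypothetical numbers: 2.0 cpm gross, 1.5 cpm background, 1000 min counting,
# 25% efficiency (2% rel. unc.), enrichment factor 20 (3% rel. unc.).
activity, unc = tritium_result(2.0, 1.5, 1000, 0.25, 0.02, 20.0, 0.03)
```

    With near-background count rates, the counting term dominates the combined uncertainty, which is why long count times and high enrichment factors matter for low-level ³H work.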

  12. Multimode entanglement in reconfigurable graph states using optical frequency combs

    PubMed Central

    Cai, Y.; Roslund, J.; Ferrini, G.; Arzani, F.; Xu, X.; Fabre, C.; Treps, N.

    2017-01-01

    Multimode entanglement is an essential resource for quantum information processing and quantum metrology. However, multimode entangled states are generally constructed by targeting a specific graph configuration. This results in a fixed experimental setup with reduced versatility and scalability. Here we demonstrate an on-demand, reconfigurable optical multimode entangled state, using an intrinsically multimode quantum resource and a homodyne detection apparatus. Without altering either the initial squeezing source or the experimental architecture, we realize the construction of thirteen cluster states of various sizes and connectivities as well as the implementation of a secret sharing protocol. In particular, this system enables the interrogation of quantum correlations and fluctuations for any multimode Gaussian state. This opens an avenue for implementing on-demand quantum information processing by adapting only the measurement process, not the experimental layout. PMID:28585530

  13. Deterministic quantum teleportation and information splitting via a peculiar W-class state

    NASA Astrophysics Data System (ADS)

    Mei, Feng; Yu, Ya-Fei; Zhang, Zhi-Ming

    2010-02-01

    In the paper (Phys. Rev. A 74, 062320 (2006)), Agrawal et al. introduced a kind of W-class state which can be used for the quantum teleportation of a single-particle state via a three-particle von Neumann measurement, and they thought that the state could not be used to teleport an unknown state by making two-particle and one-particle measurements. Here we reconsider the features of the W-class state and the quantum teleportation process via the W-class state. We show that, by introducing a unitary operation, the quantum teleportation can be achieved deterministically by making two-particle and one-particle measurements. In addition, our protocol is extended to the processes of teleporting a two-particle state and of splitting information.
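    The measure-then-correct structure underlying all such schemes can be simulated with state vectors. The sketch below runs the textbook single-qubit teleportation over a shared Bell pair (not the W-class-state protocol of the paper) and checks that every measurement outcome, after its Pauli correction, reproduces the input state:

```python
import numpy as np

# Pauli operators used for the classical-bit corrections.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

psi = np.array([0.6, 0.8j])                                # unknown state at Alice
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # shared pair
state = np.kron(psi, bell)                                 # qubits 1,2 Alice; 3 Bob

# The four Bell-measurement outcomes on qubits 1,2 and matching corrections.
s2 = 1 / np.sqrt(2)
outcomes = [
    (np.array([1, 0, 0, 1]) * s2, I),       # Phi+ -> no correction
    (np.array([1, 0, 0, -1]) * s2, Z),      # Phi- -> Z
    (np.array([0, 1, 1, 0]) * s2, X),       # Psi+ -> X
    (np.array([0, 1, -1, 0]) * s2, Z @ X),  # Psi- -> ZX
]

results = []
for bvec, corr in outcomes:
    bob = np.kron(bvec.conj()[None, :], I) @ state  # project qubits 1,2 out
    prob = np.vdot(bob, bob).real                   # probability of this outcome
    bob = corr @ (bob / np.sqrt(prob))              # apply the correction
    fidelity = abs(np.vdot(psi, bob)) ** 2
    results.append((prob, fidelity))
```

    Each outcome occurs with probability 1/4, and the corrected fidelity is 1 in every branch, which is what "deterministic" teleportation means.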

  14. Linking community-based monitoring to water policy: Perceptions of citizen scientists.

    PubMed

    Carlson, Tyler; Cohen, Alice

    2018-05-05

    This paper examines the relationships between Community-Based Water Monitoring (CBM) and government-led water initiatives. Drawing on a cross-Canada survey of over one hundred organizations, we explore the reasons why communities undertake CBM, the monitoring protocols they follow, and the extent to which CBM program members feel their findings are incorporated into formal (i.e., government-led) decision-making processes. Our results indicate that despite following standardized and credible monitoring protocols, fewer than half of CBM organizations report that their data is being used to inform water policy at any level of government. Moreover, respondents report higher rates of cooperation and data-sharing between CBM organizations themselves than between CBM organizations and their respective governments. These findings are significant, because many governments continue to express support for CBM. We explore the barriers between CBM data collection and government policy, and suggest that structural barriers include lack of multi-year funding, inconsistent protocols, and poor communication. More broadly, we argue that the distinction between formal and informal programming is unclear, and that addressing known CBM challenges will rely on a change in perception: CBM cannot simply be a less expensive alternative to government-driven data collection. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. A technique for automatically extracting useful field of view and central field of view images.

    PubMed

    Pandey, Anil Kumar; Sharma, Param Dev; Aheer, Deepak; Kumar, Jay Prakash; Sharma, Sanjay Kumar; Patel, Chetan; Kumar, Rakesh; Bal, Chandra Sekhar

    2016-01-01

    It is essential to ensure the uniform response of a single photon emission computed tomography gamma camera system before using it for clinical studies by exposing it to a uniform flood source. Vendor-specific acquisition and processing protocols provide for studying flood source images along with quantitative uniformity parameters such as integral and differential uniformity. However, a significant difficulty is that the time required to acquire a flood source image varies from 10 to 35 min, depending on both the activity of the Cobalt-57 flood source and the prespecified counts in the vendor's protocol (usually 4000K-10,000K counts). If the acquired total counts are less than the prespecified counts, the vendor's uniformity processing protocol does not proceed with the computation of the quantitative uniformity parameters. In this study, we have developed and verified a technique for reading the flood source image, removing unwanted information, and automatically extracting and saving the useful field of view and central field of view images for the calculation of the uniformity parameters. This was implemented using MATLAB R2013b running on the Ubuntu operating system and was verified by subjecting it to simulated and real flood source images. The accuracy of the technique was found to be encouraging, especially in view of practical difficulties with vendor-specific protocols. It may be used as a preprocessing step while calculating uniformity parameters of the gamma camera in less time and with fewer constraints.
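    The extraction step described above can be sketched in a few lines of NumPy rather than the authors' MATLAB. This is a minimal sketch under stated assumptions: a roughly rectangular exposed detector area, and the conventional 95% (UFOV) and 75% (CFOV) fractions applied to its dimensions; the 0.1 threshold is likewise an illustrative choice:

```python
import numpy as np

def extract_fovs(flood, ufov_frac=0.95, cfov_frac=0.75):
    """Crop the useful (UFOV) and central (CFOV) field-of-view regions
    from a flood-source image."""
    # Locate the exposed detector area by thresholding away the background.
    mask = flood > 0.1 * flood.max()
    rows = np.where(mask.any(axis=1))[0]
    cols = np.where(mask.any(axis=0))[0]
    detector = flood[rows[0]:rows[-1] + 1, cols[0]:cols[-1] + 1]

    def central_crop(img, frac):
        h, w = img.shape
        dh = int(h * (1 - frac)) // 2   # margin stripped from each edge
        dw = int(w * (1 - frac)) // 2
        return img[dh:h - dh, dw:w - dw]

    return central_crop(detector, ufov_frac), central_crop(detector, cfov_frac)

def integral_uniformity(region):
    """NEMA-style integral uniformity: 100 * (max - min) / (max + min)."""
    return 100.0 * (region.max() - region.min()) / (region.max() + region.min())
```

    Once the UFOV and CFOV arrays are isolated, the usual uniformity figures can be computed regardless of how many counts the acquisition actually accumulated, which is the constraint the vendor protocol imposes.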

  16. The Unanticipated Challenges Associated With Implementing an Observational Study Protocol in a Large-Scale Physical Activity and Global Positioning System Data Collection.

    PubMed

    McCrorie, Paul; Walker, David; Ellaway, Anne

    2018-04-30

    Large-scale primary data collections are complex, costly, and time-consuming. Study protocols for trial-based research are now commonplace, with a growing number of similar pieces of work being published on observational research. However, useful additions to the literature base are publications that describe the issues and challenges faced while conducting observational studies. These can provide researchers with insightful knowledge that can inform funding proposals or project development work. In this study, we identify and reflectively discuss the unforeseen or often unpublished issues associated with organizing and implementing a large-scale objectively measured physical activity and global positioning system (GPS) data collection. The SPACES (Studying Physical Activity in Children's Environments across Scotland) study was designed to collect objectively measured physical activity and GPS data from 10- to 11-year-old children across Scotland, using a postal delivery method. The 3 main phases of the project (recruitment, delivery of project materials, and data collection and processing) are described within a 2-stage framework: (1) intended design and (2) implementation of the intended design. Unanticipated challenges arose, which influenced the data collection process; these encompass four main impact categories: (1) cost, budget, and funding; (2) project timeline; (3) participation and engagement; and (4) data challenges. The main unforeseen issues that impacted our timeline included the informed consent process for children under the age of 18 years; the use of, and coordination with, the postal service to deliver study information and equipment; and the variability associated with when participants began data collection and the time taken to send devices and consent forms back (1-12 months). 
Unanticipated budgetary issues included the identification of some study materials (AC power adapter) not fitting through letterboxes, as well as the employment of fieldworkers to increase recruitment and the return of consent forms. Finally, we encountered data issues when processing physical activity and GPS data that had been initiated across daylight saving time. We present learning points and recommendations that may benefit future studies of similar methodology in their early stages of development. ©Paul McCrorie, David Walker, Anne Ellaway. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 30.04.2018.

  17. DDN (Defense Data Network) Protocol Handbook. Volume 1. DoD Military Standard Protocols

    DTIC Science & Technology

    1985-12-01

    official Military Standard communication protocols in use on the DDN are included, as are several ARPANET (Advanced Research Projects Agency Network... research protocols which are currently in use, and some protocols currently undergoing review. Tutorial information and auxiliary documents are also...compatible with DoD needs, by researchers wishing to improve the protocols, and by implementors of local area networks (LANs) wishing their

  18. Organizational principles of cloud storage to support collaborative biomedical research.

    PubMed

    Kanbar, Lara J; Shalish, Wissam; Robles-Rubio, Carlos A; Precup, Doina; Brown, Karen; Sant'Anna, Guilherme M; Kearney, Robert E

    2015-08-01

    This paper describes organizational guidelines and an anonymization protocol for the management of sensitive information in interdisciplinary, multi-institutional studies with multiple collaborators. This protocol is flexible, automated, and suitable for use in cloud-based projects as well as for publication of supplementary information in journal papers. A sample implementation of the anonymization protocol is illustrated for an ongoing study dealing with Automated Prediction of EXtubation readiness (APEX).
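    One common way to automate such an anonymization protocol is deterministic keyed pseudonymization. The sketch below is a hedged illustration of that general approach; the key, prefix, and field names are hypothetical and are not taken from the APEX study:

```python
import hashlib
import hmac

# Hypothetical study key, held only by the coordinating site; cloud
# collaborators ever see only the derived codes.
STUDY_KEY = b"per-study secret kept off the cloud"

def pseudonym(patient_id, prefix="APEX"):
    """Deterministic keyed pseudonym: the same patient always maps to the
    same code, but the mapping cannot be reversed without the study key."""
    digest = hmac.new(STUDY_KEY, patient_id.encode(), hashlib.sha256).hexdigest()
    return f"{prefix}-{digest[:8].upper()}"

def anonymize_record(record, id_field="patient_id", drop=("name", "mrn")):
    """Replace the identifier with its pseudonym and drop direct identifiers."""
    clean = {k: v for k, v in record.items() if k not in drop}
    clean[id_field] = pseudonym(record[id_field])
    return clean
```

    Because the mapping is deterministic, records from different collaborators and sessions still link up under the same code, which is what makes the scheme usable both in shared cloud storage and in published supplementary data.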

  19. Buckets, Clusters and Dienst

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maly, Kurt; Shen, Stewart N. T.

    1997-01-01

    In this paper we describe NCSTRL+, a unified, canonical digital library for scientific and technical information (STI). NCSTRL+ is based on the Networked Computer Science Technical Report Library (NCSTRL), a World Wide Web (WWW) accessible digital library (DL) that provides access to over 80 university departments and laboratories. NCSTRL+ implements two new technologies: cluster functionality and publishing "buckets." We have extended the Dienst protocol, the protocol underlying NCSTRL, to provide the ability to "cluster" independent collections into a logically centralized digital library based upon subject category classification, type of organization, and genres of material. The concept of "buckets" provides a mechanism for publishing and managing logically linked entities with multiple data formats. The NCSTRL+ prototype DL contains the holdings of NCSTRL and the NASA Technical Report Server (NTRS). The prototype demonstrates the feasibility of publishing into a multi-cluster DL, searching across clusters, and storing and presenting buckets of information. We show that the overhead for these additional capabilities is minimal to both the author and the user when compared to the equivalent process within NCSTRL.

  20. Growth and Visual Information Processing in Infants in Southern Ethiopia

    PubMed Central

    Kennedy, Tay; Thomas, David G.; Woltamo, Tesfaye; Abebe, Yewelsew; Hubbs-Tait, Laura; Sykova, Vladimira; Stoecker, Barbara J.; Hambidge, K. Michael

    2009-01-01

    Speed of information processing and recognition memory can be assessed in infants using a visual information processing (VIP) paradigm. In a sample of 100 infants 6–8 months of age from Southern Ethiopia, we assessed relations between growth and VIP. The 69 infants who completed the VIP protocol had a mean weight z score of −1.12 ± 1.19 SD, and length z score of −1.05 ± 1.31. The age-appropriate novelty preference was shown by only 12 infants. When age was controlled, longest look duration during familiarization was predicted by weight (sr² = .16, p = .001) and length (sr² = .05, p = .058), and mean look duration during test phases was predicted by head circumference (sr² = .08, p = .018), implying that growth is associated with development of VIP. These data support the validity of VIP as a measure of infant cognitive development that is sensitive to nutritional factors and flexible enough to be adapted to individual cultures. PMID:19684873
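    The reported sr² values are squared semipartial correlations: the increment in R² when a predictor joins the control variables. A minimal sketch of that computation with simulated data (variable names and values are hypothetical, not the study's dataset):

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)

def squared_semipartial(X_control, x_new, y):
    """sr^2 of x_new: the gain in R^2 when x_new is added to the controls."""
    full = r_squared(np.column_stack([X_control, x_new]), y)
    return full - r_squared(X_control, y)

# Hypothetical example: does head circumference predict mean look duration
# over and above age? (All values simulated.)
rng = np.random.default_rng(0)
age = rng.uniform(6, 8, 69)                       # months
head = rng.normal(43, 1.5, 69)                    # cm
look = 0.5 * age + 0.8 * head + rng.normal(0, 1, 69)
sr2_head = squared_semipartial(age, head, look)
```

    Computing sr² as an R² difference keeps the control set explicit, matching the "when age was controlled" phrasing of the abstract.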

  1. Noise-Resilient Quantum Computing with a Nitrogen-Vacancy Center and Nuclear Spins.

    PubMed

    Casanova, J; Wang, Z-Y; Plenio, M B

    2016-09-23

    Selective control of qubits in a quantum register for the purposes of quantum information processing represents a critical challenge for dense spin ensembles in solid-state systems. Here we present a protocol that achieves a complete set of selective electron-nuclear gates and single nuclear rotations in such an ensemble in diamond facilitated by a nearby nitrogen-vacancy (NV) center. The protocol suppresses internuclear interactions as well as unwanted coupling between the NV center and other spins of the ensemble to achieve quantum gate fidelities well exceeding 99%. Notably, our method can be applied to weakly coupled, distant spins representing a scalable procedure that exploits the exceptional properties of nuclear spins in diamond as robust quantum memories.

  2. Structure and composition of a watershed-scale sediment information network

    USGS Publications Warehouse

    Osterkamp, W.R.; Gray, J.R.; Laronne, J.B.; Martin, J.R.

    2007-01-01

    A 'Watershed-Scale Sediment Information Network' (WaSSIN), designed to complement UNESCO's International Sedimentation Initiative, was endorsed as an initial project by the World Association for Sedimentation and Erosion Research. WaSSIN is to address global fluvial-sediment information needs through a network approach based on consistent protocols for the collection, analysis, and storage of fluvial-sediment and ancillary information at smaller spatial scales than those of the International Sedimentation Initiative. As a second step of implementation, it is proposed herein that the WaSSIN have a general structure of two components, (1) monitoring and data acquisition and (2) research. Monitoring is to be conducted in small watersheds, each of which has an established database for discharge of water and suspended sediment and possibly for bed load, bed material, and bed topography. Ideally, documented protocols have been used for collecting, analyzing, storing, and sharing the derivative data. The research component is to continue the collection and interpretation of data, to compare those data among candidate watersheds, and to determine gradients of fluxes and processes among the selected watersheds. To define gradients and evaluate processes, the initial watersheds will have several common attributes. Watersheds of the first group will be: (1) six to ten in number, (2) less than 1000 km2 in area, (3) generally in mid-latitudes of continents, and (4) of semiarid climate. Potential candidate watersheds presently include the Weany Creek Basin, northeastern Australia, the Zhi Fanggou catchment, northern China, the Eshtemoa Watershed, southern Israel, the Metsemotlhaba River Basin, Botswana, the Aiuaba Experimental Basin, Brazil, and the Walnut Gulch Experimental Watershed, southwestern United States.

  3. Families of quantum fingerprinting protocols

    NASA Astrophysics Data System (ADS)

    Lovitz, Benjamin; Lütkenhaus, Norbert

    2018-03-01

    We introduce several families of quantum fingerprinting protocols to evaluate the equality function on two n-bit strings in the simultaneous message passing model. The original quantum fingerprinting protocol uses a tensor product of a small number of O(log n)-qubit high-dimensional signals [H. Buhrman et al., Phys. Rev. Lett. 87, 167902 (2001), 10.1103/PhysRevLett.87.167902], whereas a recently proposed optical protocol uses a tensor product of O(n) single-qubit signals, while maintaining the O(log n) information leakage of the original protocol [J. M. Arazola and N. Lütkenhaus, Phys. Rev. A 89, 062305 (2014), 10.1103/PhysRevA.89.062305]. We find a family of protocols which interpolate between the original and optical protocols while maintaining the O(log n) information leakage, thus demonstrating a tradeoff between the number of signals sent and the dimension of each signal. There has been interest in experimental realization of the recently proposed optical protocol using coherent states [F. Xu et al., Nat. Commun. 6, 8735 (2015), 10.1038/ncomms9735; J.-Y. Guan et al., Phys. Rev. Lett. 116, 240502 (2016), 10.1103/PhysRevLett.116.240502], but as the required number of laser pulses grows linearly with the input size n, eventual challenges for the long-time stability of experimental setups arise. We find a coherent state protocol which reduces the number of signals by a factor 1/2 while also reducing the information leakage. Our reduction makes use of a simple modulation scheme in optical phase space, and we find that more complex modulation schemes are not advantageous. Using a similar technique, we improve a recently proposed coherent state protocol for evaluating the Euclidean distance between two real unit vectors [N. Kumar et al., Phys. Rev. A 95, 032337 (2017), 10.1103/PhysRevA.95.032337] by reducing the number of signals by a factor 1/2 and also reducing the information leakage.

  4. 48 CFR 3439.701 - Internet Protocol version 6.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 6. 3439.701 Section 3439.701 Federal Acquisition Regulations System DEPARTMENT OF EDUCATION ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY Department Requirements for Acquisition of Information Technology 3439.701 Internet Protocol version 6. The contracting...

  5. Ship to Shore Data Communication and Prioritization

    DTIC Science & Technology

    2011-12-01

    First Out; FTP File Transfer Protocol; GCCS-M Global Command and Control System Maritime; HAIPE High Assurance Internet Protocol Encryptor; HTTP Hypertext Transfer Protocol (World Wide Web protocol); IBS Integrated Bar Code System; IDEF0 Integration Definition; IER Information Exchange Requirements; INTEL Intelligence; IP Internet Protocol; IPT Integrated Product Team; ISEA In-Service Engineering Agent; ISNS Integrated Shipboard Network System; IT

  6. Relativistic quantum private database queries

    NASA Astrophysics Data System (ADS)

    Sun, Si-Jia; Yang, Yu-Guang; Zhang, Ming-Ou

    2015-04-01

    Recently, Jakobi et al. (Phys Rev A 83, 022301, 2011) suggested the first practical private database query protocol (J-protocol) based on the Scarani et al. (Phys Rev Lett 92, 057901, 2004) quantum key distribution protocol. Unfortunately, the J-protocol is just a cheat-sensitive private database query protocol. In this paper, we present an idealized relativistic quantum private database query protocol based on Minkowski causality and the properties of quantum information. Also, we prove that the protocol is secure in terms of the user security and the database security.

  7. A new, ultra-low latency data transmission protocol for Earthquake Early Warning Systems

    NASA Astrophysics Data System (ADS)

    Hill, P.; Hicks, S. P.; McGowan, M.

    2016-12-01

    One measure used to assess the performance of Earthquake Early Warning Systems (EEWS) is the delay time between earthquake origin and issued alert. EEWS latency is dependent on a number of sources (e.g. P-wave propagation, digitisation, transmission, receiver processing, triggering, event declaration). Many regional seismic networks use the SEEDlink protocol; however, packet size is fixed to 512-byte miniSEED records, resulting in transmission latencies of >0.5 s. Data packetisation is seen as one of the main sources of delays in EEWS (Brown et al., 2011). Optimising data-logger and telemetry configurations is a cost-effective strategy to improve EEWS alert times (Behr et al., 2015). Digitisers with smaller, selectable packets can result in faster alerts (Sokos et al., 2016). We propose a new seismic protocol for regional seismic networks benefiting low-latency applications such as EEWS. The protocol, based on Güralp's existing GDI-link format, is an efficient and flexible method to exchange data between seismic stations and data centers for a range of network configurations. The main principle is to stream data sample-by-sample instead of fixed-length packets to minimise transmission latency. Self-adaptive packetisation with compression maximises available telemetry bandwidth. Highly flexible metadata fields within GDI-link are compatible with existing miniSEED definitions. Data is sent as integers or floats, supporting a wide range of data formats, including discrete parameters such as Pd & τC for on-site earthquake early warning. Other advantages include: streaming station state-of-health information, instrument control, and support of backfilling and fail-over strategies during telemetry outages. Based on tests carried out on the Güralp Minimus data-logger, we show our new protocol can reduce transmission latency to as low as 1 ms. The low-latency protocol is currently being implemented with common processing packages.
The results of these tests will help to highlight latency levels that can be achieved with next-generation EEWS.
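
    The packetisation delay that fixed 512-byte miniSEED records introduce can be illustrated with simple arithmetic. The figures below (roughly 400 compressed samples per record, 100 samples/s) are illustrative assumptions, not measurements from the paper:

```python
# Illustrative sketch (assumed figures): why fixed-size records add latency.
# A 512-byte miniSEED record can hold on the order of 400 samples after
# compression; at 100 samples/s the oldest sample in a record waits several
# seconds before the record fills and can be transmitted.

def packet_fill_latency(samples_per_record: int, sample_rate_hz: float) -> float:
    """Worst-case wait (s) for the oldest sample in a fixed-size record."""
    return samples_per_record / sample_rate_hz

# Fixed 512-byte records (~400 compressed samples assumed) at 100 sps:
fixed = packet_fill_latency(400, 100.0)      # 4.0 s worst case
# Sample-by-sample streaming: every sample leaves as soon as it is digitised.
streaming = packet_fill_latency(1, 100.0)    # 0.01 s worst case

print(f"fixed-record worst-case latency: {fixed:.2f} s")
print(f"per-sample worst-case latency:  {streaming:.3f} s")
```

    This is why the abstract treats packetisation, rather than wave propagation alone, as a dominant and controllable latency source.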

  8. Iterative tailoring of optical quantum states with homodyne measurements.

    PubMed

    Etesse, Jean; Kanseri, Bhaskar; Tualle-Brouri, Rosa

    2014-12-01

    As they can travel long distances, free space optical quantum states are good candidates for carrying information in quantum information technology protocols. These states, however, are often complex to produce and require protocols whose success probability drops quickly as the mean photon number increases. Here we propose a new protocol for the generation and growth of arbitrary states, based on one-by-one coherent adjunctions of the simple state superposition α|0〉 + β|1〉. Because the protocol allows for the use of quantum memories, it can achieve high performance.

  9. Enhancing robustness of multiparty quantum correlations using weak measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Uttam, E-mail: uttamsingh@hri.res.in; Mishra, Utkarsh, E-mail: utkarsh@hri.res.in; Dhar, Himadri Shekhar, E-mail: dhar.himadri@gmail.com

    Multipartite quantum correlations are important resources for the development of quantum information and computation protocols. However, the resourcefulness of multipartite quantum correlations in practical settings is limited by their fragility under decoherence due to environmental interactions. Though there exist protocols to protect bipartite entanglement under decoherence, the implementation of such protocols for multipartite quantum correlations has not been sufficiently explored. Here, we study the effect of a local amplitude damping channel on the generalized Greenberger–Horne–Zeilinger state, and use a protocol of optimal reversal quantum weak measurement to protect the multipartite quantum correlations. We observe that the weak measurement reversal protocol enhances the robustness of multipartite quantum correlations. Further, it increases the critical damping value that corresponds to entanglement sudden death. To emphasize the efficacy of the technique in protection of multipartite quantum correlations, we investigate two proximately related quantum communication tasks, namely, quantum teleportation in a one-sender, many-receivers setting and multiparty quantum information splitting, through a local amplitude damping channel. We observe an increase in the average fidelity of both quantum communication tasks under the weak measurement reversal protocol. The method may prove beneficial, for combating external interactions, in other quantum information tasks using multipartite resources. - Highlights: • Extension of weak measurement reversal scheme to protect multiparty quantum correlations. • Protection of multiparty quantum correlation under local amplitude damping noise. • Enhanced fidelity of quantum teleportation in one sender and many receivers setting. • Enhanced fidelity of quantum information splitting protocol.

  10. Thermodynamic framework for information in nanoscale systems with memory

    NASA Astrophysics Data System (ADS)

    Arias-Gonzalez, J. Ricardo

    2017-11-01

    Information is represented by linear strings of symbols with memory that carry errors as a result of their stochastic nature. Proofreading and edition are assumed to improve certainty although such processes may not be effective. Here, we develop a thermodynamic theory for material chains made up of nanoscopic subunits with symbolic meaning in the presence of memory. This framework is based on the characterization of single sequences of symbols constructed under a protocol and is used to derive the behavior of ensembles of sequences similarly constructed. We then analyze the role of proofreading and edition in the presence of memory finding conditions to make revision an effective process, namely, to decrease the entropy of the chain. Finally, we apply our formalism to DNA replication and RNA transcription finding that Watson and Crick hybridization energies with which nucleotides are branched to the template strand during the copying process are optimal to regulate the fidelity in proofreading. These results are important in applications of information theory to a variety of solid-state physical systems and other biomolecular processes.
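
    One concrete way to quantify the role of memory in such symbol chains is the conditional per-symbol entropy H(X_n | X_{n-1}) estimated from adjacent-pair frequencies. This is an illustrative sketch of that standard estimator, not the paper's thermodynamic formalism:

```python
from collections import Counter
from math import log2

# Conditional entropy H(X_n | X_{n-1}) of a symbol chain, estimated from
# the empirical frequencies of adjacent pairs. A chain fully determined by
# one-step memory (e.g. strict alternation) has zero conditional entropy.

def conditional_entropy(seq: str) -> float:
    pairs = Counter(zip(seq, seq[1:]))     # counts of (previous, current)
    singles = Counter(seq[:-1])            # counts of the conditioning symbol
    n = len(seq) - 1
    h = 0.0
    for (a, b), c in pairs.items():
        p_pair = c / n                     # P(X_{n-1}=a, X_n=b)
        p_cond = c / singles[a]            # P(X_n=b | X_{n-1}=a)
        h -= p_pair * log2(p_cond)
    return h

print(round(conditional_entropy("ATATATAT"), 3))  # 0.0 (determined by memory)
print(round(conditional_entropy("AATTATAA"), 3))
```

    A proofreading step that lowers this quantity is, in the abstract's terms, an effective revision: it decreases the entropy of the chain.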

  11. Thermodynamic framework for information in nanoscale systems with memory.

    PubMed

    Arias-Gonzalez, J Ricardo

    2017-11-28

    Information is represented by linear strings of symbols with memory that carry errors as a result of their stochastic nature. Proofreading and edition are assumed to improve certainty although such processes may not be effective. Here, we develop a thermodynamic theory for material chains made up of nanoscopic subunits with symbolic meaning in the presence of memory. This framework is based on the characterization of single sequences of symbols constructed under a protocol and is used to derive the behavior of ensembles of sequences similarly constructed. We then analyze the role of proofreading and edition in the presence of memory finding conditions to make revision an effective process, namely, to decrease the entropy of the chain. Finally, we apply our formalism to DNA replication and RNA transcription finding that Watson and Crick hybridization energies with which nucleotides are branched to the template strand during the copying process are optimal to regulate the fidelity in proofreading. These results are important in applications of information theory to a variety of solid-state physical systems and other biomolecular processes.

  12. A network monitor for HTTPS protocol based on proxy

    NASA Astrophysics Data System (ADS)

    Liu, Yangxin; Zhang, Lingcui; Zhou, Shuguang; Li, Fenghua

    2016-10-01

    With the explosive growth of harmful Internet information such as pornography, violence, and hate messages, network monitoring is essential. Traditional network monitors are based mainly on bypass monitoring; however, bypass monitoring cannot filter network traffic. Meanwhile, few studies focus on network monitoring for the HTTPS protocol, because HTTPS data travels in encrypted traffic, which makes it difficult to monitor. This paper proposes a network monitor for the HTTPS protocol based on a proxy. We adopt OpenSSL to establish TLS secure tunnels between clients and servers. Epoll is used to handle a large number of concurrent client connections. We also adopt the Knuth-Morris-Pratt string searching algorithm (KMP algorithm) to speed up the search process. Besides, we modify request packets to reduce the risk of errors and modify response packets to improve security. Experiments show that our proxy can monitor the content of all tested HTTPS websites efficiently with little loss of network performance.
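
    The Knuth-Morris-Pratt algorithm the authors adopt for scanning decrypted payloads can be sketched in a few lines. This is a generic textbook implementation, not the paper's code:

```python
# Knuth-Morris-Pratt substring search: O(n + m) time, never re-examines
# matched text characters, which is why it suits scanning large payloads.

def kmp_search(text: str, pattern: str) -> int:
    """Return the index of the first occurrence of pattern in text, or -1."""
    if not pattern:
        return 0
    # Failure table: length of the longest proper prefix of pattern[:i+1]
    # that is also a suffix of it.
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the text, falling back via the table on a mismatch.
    k = 0
    for i, ch in enumerate(text):
        while k and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            return i - k + 1
    return -1

print(kmp_search("GET /index.html HTTP/1.1", "index"))  # 5
```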

  13. The Network Protocol Analysis Technique in Snort

    NASA Astrophysics Data System (ADS)

    Wu, Qing-Xiu

    Network protocol analysis is the technique a network sniffer needs in order to capture packets for further analysis and understanding. A network sniffer intercepts packets as binary-format raw message content. To obtain the information they contain, the packets must be restored, according to the TCP/IP protocol-stack specifications, to the protocol format and content of each layer, recovering the actual data transferred as well as the application tier.
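
    The layer-by-layer restoration described above can be illustrated with a minimal IPv4/TCP header decoder. This is a hedged sketch using the fixed header layouts of RFC 791 and RFC 793; a real analyzer such as Snort also handles options, fragmentation, and many more protocols:

```python
import struct

# Peel the IPv4 header off a raw packet and recover the TCP ports beneath it.

def decode_ipv4_tcp(frame: bytes):
    ver_ihl, = struct.unpack_from("!B", frame, 0)
    ihl = (ver_ihl & 0x0F) * 4                  # IPv4 header length in bytes
    proto, = struct.unpack_from("!B", frame, 9) # protocol field (6 = TCP)
    src, dst = struct.unpack_from("!4s4s", frame, 12)
    if proto != 6:
        return None                             # not TCP; stop at this layer
    sport, dport = struct.unpack_from("!HH", frame, ihl)
    dotted = lambda b: ".".join(str(x) for x in b)
    return dotted(src), sport, dotted(dst), dport

# A hand-built 20-byte IPv4 header (proto=TCP) plus a TCP header stub:
ip = bytes([0x45, 0, 0, 40, 0, 0, 0, 0, 64, 6, 0, 0,
            192, 168, 0, 1, 10, 0, 0, 2])
tcp = struct.pack("!HH", 51515, 80) + bytes(16)
print(decode_ipv4_tcp(ip + tcp))  # ('192.168.0.1', 51515, '10.0.0.2', 80)
```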

  14. Two-dimensional distributed-phase-reference protocol for quantum key distribution

    NASA Astrophysics Data System (ADS)

    Bacco, Davide; Christensen, Jesper Bjerge; Castaneda, Mario A. Usuga; Ding, Yunhong; Forchhammer, Søren; Rottwitt, Karsten; Oxenløwe, Leif Katsuo

    2016-12-01

    Quantum key distribution (QKD) and quantum communication enable the secure exchange of information between remote parties. Currently, the distributed-phase-reference (DPR) protocols, which are based on weak coherent pulses, are among the most practical solutions for long-range QKD. During the last 10 years, long-distance fiber-based DPR systems have been successfully demonstrated, although fundamental obstacles such as intrinsic channel losses limit their performance. Here, we introduce the first two-dimensional DPR-QKD protocol in which information is encoded in the time and phase of weak coherent pulses. The ability to extract two bits of information per detection event enables a higher secret key rate in specific realistic network scenarios. Moreover, despite the use of more dimensions, the proposed protocol remains simple, practical, and fully integrable.

  15. Two-dimensional distributed-phase-reference protocol for quantum key distribution.

    PubMed

    Bacco, Davide; Christensen, Jesper Bjerge; Castaneda, Mario A Usuga; Ding, Yunhong; Forchhammer, Søren; Rottwitt, Karsten; Oxenløwe, Leif Katsuo

    2016-12-22

    Quantum key distribution (QKD) and quantum communication enable the secure exchange of information between remote parties. Currently, the distributed-phase-reference (DPR) protocols, which are based on weak coherent pulses, are among the most practical solutions for long-range QKD. During the last 10 years, long-distance fiber-based DPR systems have been successfully demonstrated, although fundamental obstacles such as intrinsic channel losses limit their performance. Here, we introduce the first two-dimensional DPR-QKD protocol in which information is encoded in the time and phase of weak coherent pulses. The ability to extract two bits of information per detection event enables a higher secret key rate in specific realistic network scenarios. Moreover, despite the use of more dimensions, the proposed protocol remains simple, practical, and fully integrable.

  16. Two-dimensional distributed-phase-reference protocol for quantum key distribution

    PubMed Central

    Bacco, Davide; Christensen, Jesper Bjerge; Castaneda, Mario A. Usuga; Ding, Yunhong; Forchhammer, Søren; Rottwitt, Karsten; Oxenløwe, Leif Katsuo

    2016-01-01

    Quantum key distribution (QKD) and quantum communication enable the secure exchange of information between remote parties. Currently, the distributed-phase-reference (DPR) protocols, which are based on weak coherent pulses, are among the most practical solutions for long-range QKD. During the last 10 years, long-distance fiber-based DPR systems have been successfully demonstrated, although fundamental obstacles such as intrinsic channel losses limit their performance. Here, we introduce the first two-dimensional DPR-QKD protocol in which information is encoded in the time and phase of weak coherent pulses. The ability to extract two bits of information per detection event enables a higher secret key rate in specific realistic network scenarios. Moreover, despite the use of more dimensions, the proposed protocol remains simple, practical, and fully integrable. PMID:28004821

  17. TRIGA: Telecommunications Protocol Processing Subsystem Using Reconfigurable Interoperable Gate Arrays

    NASA Technical Reports Server (NTRS)

    Pang, Jackson; Pingree, Paula J.; Torgerson, J. Leigh

    2006-01-01

    We present the Telecommunications protocol processing subsystem using Reconfigurable Interoperable Gate Arrays (TRIGA), a novel approach that unifies fault tolerance, error correction coding and interplanetary communication protocol off-loading to implement CCSDS File Delivery Protocol and Datalink layers. The new reconfigurable architecture offers more than one order of magnitude throughput increase while reducing footprint requirements in memory, command and data handling processor utilization, communication system interconnects and power consumption.

  18. A Family of ACO Routing Protocols for Mobile Ad Hoc Networks.

    PubMed

    Rupérez Cañas, Delfín; Sandoval Orozco, Ana Lucila; García Villalba, Luis Javier; Kim, Tai-Hoon

    2017-05-22

    In this work, an ACO routing protocol for mobile ad hoc networks based on AntHocNet is specified. Like its predecessor, this new protocol, called AntOR, is hybrid in the sense that it contains elements from both reactive and proactive routing. Specifically, it combines a reactive route setup process with a proactive route maintenance and improvement process. Key aspects of the AntOR protocol are disjoint-link and disjoint-node routes, separation between the regular pheromone and the virtual pheromone in the diffusion process, and the exploration of routes taking into consideration the number of hops in the best routes. A family of ACO routing protocols based on AntOR, obtained through successive protocol refinements, is also specified. We also present a parallelized version of AntOR that we call PAntOR. Using multiprocessor programming based on shared memory, PAntOR allows running tasks in parallel using threads. This parallelization is applicable in the route setup phase, the route local repair process, and link failure notification. In addition, a variant of PAntOR with more than one interface, which we call PAntOR-MI (PAntOR-Multiple Interface), is specified. This approach parallelizes the sending of broadcast messages by interface through threads.
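
    The pheromone bookkeeping common to ACO routing protocols of this kind can be sketched generically. The evaporation rate, deposit amount, and proportional next-hop rule below are illustrative assumptions, not AntOR's exact update rules:

```python
import random

# Generic ant-colony routing sketch: a next hop is chosen with probability
# proportional to its pheromone value, and pheromone on the used link is
# reinforced while all links evaporate slightly.

def choose_next_hop(pheromone: dict, rng=random.random) -> str:
    """Pick a neighbor with probability proportional to its pheromone."""
    total = sum(pheromone.values())
    r = rng() * total
    acc = 0.0
    for hop, tau in pheromone.items():
        acc += tau
        if r <= acc:
            return hop
    return hop  # numerical fallback for rounding at the upper edge

def update(pheromone: dict, used_hop: str, rho=0.1, deposit=1.0) -> None:
    """Evaporate all links, then deposit pheromone on the link just used."""
    for hop in pheromone:
        pheromone[hop] *= (1.0 - rho)   # evaporation
    pheromone[used_hop] += deposit       # reinforcement

tau = {"B": 1.0, "C": 1.0}
update(tau, "B")
print(tau)  # {'B': 1.9, 'C': 0.9}
```

    Repeated reinforcement of good routes with steady evaporation of stale ones is what lets the proactive maintenance phase keep improving routes after the reactive setup.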

  19. Recommendations for Guidelines for Environment-Specific Magnetic-Field Measurements, Rapid Program Engineering Project #2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Electric Research and Management, Inc.; IIT Research Institute; Magnetic Measurements

    1997-03-11

    The purpose of this project was to document widely applicable methods for characterizing the magnetic fields in a given environment, recognizing the many sources co-existing within that space. The guidelines are designed to allow the reader to follow an efficient process to (1) plan the goals and requirements of a magnetic-field study, (2) develop a study structure and protocol, and (3) document and carry out the plan. These guidelines take the reader first through the process of developing a basic study strategy, then through planning and performing the data collection. Last, the critical factors of data management, analysis, reporting, and quality assurance are discussed. The guidelines are structured to allow the researcher to develop a protocol that responds to specific site and project needs. The Research and Public Information Dissemination Program (RAPID) is based on exposure to magnetic fields and the potential health effects. Therefore, the most important focus for these magnetic-field measurement guidelines is relevance to exposure. The assumed objective of an environment-specific measurement is to characterize the environment (given a set of occupants and magnetic-field sources) so that information about the exposure of the occupants may be inferred. Ideally, the researcher seeks to obtain complete or "perfect" information about these magnetic fields, so that personal exposure might also be modeled perfectly. However, complete data collection is not feasible. In fact, it has been made more difficult as the research field has moved to expand the list of field parameters measured, increasing the cost and complexity of performing a measurement and analyzing the data. The guidelines address this issue by guiding the user to design a measurement protocol that will gather the most exposure-relevant information based on the locations of people in relation to the sources. 
We suggest that the "microenvironment" become the base unit of area in a study, with boundaries defined by the occupants' activity patterns and the field variation from the sources affecting the area. Such a stratification allows the researcher to determine which microenvironments are of most interest, and to methodically focus on those areas in order to gather the most relevant set of data.

  20. Maternal antecedents of adiposity and studying the transgenerational role of hyperglycemia and insulin (MAASTHI): a prospective cohort study: Protocol of birth cohort at Bangalore, India.

    PubMed

    Babu, Giridhara R; Murthy, Gvs; Deepa, R; Yamuna; Prafulla; Kumar, H Kiran; Karthik, Maithili; Deshpande, Keerti; Benjamin Neelon, Sara E; Prabhakaran, D; Kurpad, Anura; Kinra, Sanjay

    2016-10-14

    India is experiencing an epidemic of obesity-hyperglycaemia, which coincides with childbearing age for women. The epidemic can be sustained and augmented through transgenerational transmission of adiposity and glucose intolerance in women. This presents an opportunity for exploring a clear strategy for the control of this epidemic in India. We conducted a study between November 2013 and May 2015 to inform the design of a large pregnancy cohort study. Based on the findings of this pilot, we developed the protocol for the proposed birth cohort of 5000 women, the recruitment for which will start in April 2016. The protocol of the study documents the processes which aim at advancing the available knowledge, linking several steps in the evolution of obesity-led hyperglycemia. Maternal Antecedents of Adiposity and Studying the Transgenerational role of Hyperglycemia and Insulin (MAASTHI) is a cohort study in the public health facilities in Bangalore, India. The objective of MAASTHI is to prospectively assess the effects of glucose levels in pregnancy on the risk of adverse infant outcomes, especially in predicting the possible risk markers of later chronic diseases. The primary objective of the proposed study is to investigate the effect of glucose levels in pregnancy on skinfold thickness (adiposity) in infancy as a marker of future obesity and diabetes in offspring. The secondary objective is to assess the association between psychosocial environment of mothers and adverse neonatal outcomes including adiposity. The study aims to recruit 5000 pregnant women and follow them and their offspring for a period of 4 years. The institutional review board at The Indian Institute of Public Health (IIPH)-H, Bangalore, Public Health Foundation of India has approved the protocol. All participants are required to provide written informed consent. The findings from this study may help to address important questions on screening and management of high blood sugar in pregnancy. 
It may provide critical information on the specific determinants driving the underweight-obesity-T2DM epidemic in India. The study can inform the policy regarding the potential impact of screening and management protocols in public healthcare facilities. The public health implications include prioritising issues of maternal glycemic control and weight management and better understanding of the lifecourse determinants in the development of T2DM.

  1. Parental views on informed consent for expanded newborn screening.

    PubMed

    Moody, Louise; Choudhry, Kubra

    2013-09-01

    An increasing array of rare inherited conditions can be detected as part of the universal newborn screening programme. The introduction and evaluation of these service developments require consideration of the ethical issues involved and appropriate mechanisms for informing parents and gaining consent if required. Exploration of parental views is needed to inform the debate and specifically consider whether more flexible protocols are needed to fit with the public perception of new developments in this context. This study has been undertaken to explore perceptions and attitudes of parents and future parents to an expanded newborn screening programme in the United Kingdom and the necessary information provision and consent processes. A mixed-methods study involving focus groups (n = 29) and a web survey (n = 142) was undertaken with parents and future parents. Parents want guaranteed information provision with clear decision-making powers and an awareness of the choices available to them. The difference between existing screening provision and expanded screening was not considered to be significant enough by participants to warrant formal written, informed consent for expanded screening. It is argued that the ethical review processes need to be more flexible towards the provision of information and consent processes for service developments in newborn screening. © 2011 John Wiley & Sons Ltd.

  2. A universal quantum frequency converter via four-wave-mixing processes

    NASA Astrophysics Data System (ADS)

    Cheng, Mingfei; Fang, Jinghuai

    2016-06-01

    We present a convenient and flexible way to realize a universal quantum frequency converter by using nondegenerate four-wave-mixing processes in the ladder-type three-level atomic system. It is shown that quantum state exchange between two fields with large frequency difference can be readily achieved, where one corresponds to the atomic resonant transition in the visible spectral region for quantum memory and the other to the telecommunication range wavelength (1550 nm) for long-distance transmission over optical fiber. This method would bring great facility in realistic quantum information processing protocols with atomic ensembles as quantum memory and low-loss optical fiber as transmission channel.

  3. Practical single-photon-assisted remote state preparation with non-maximally entanglement

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Huang, Ai-Jun; Sun, Wen-Yang; Shi, Jia-Dong; Ye, Liu

    2016-08-01

    Remote state preparation (RSP) and joint remote state preparation (JRSP) protocols for single-photon states are investigated via linear optical elements with partially entangled states. In our scheme, by choosing two-mode instances from a polarizing beam splitter, only the sender in the communication protocol needs to prepare an ancillary single photon and operate the entanglement preparation process in order to retrieve an arbitrary single-photon state from a photon pair in a partially entangled state. In the case of JRSP, i.e., a canonical model of RSP with multiple parties, we consider that the information of the desired state is split into many subsets and held in advance by spatially separated parties. Specifically, with the assistance of a single-photon state and a three-photon entangled state, it turns out that an arbitrary single-photon state can be jointly and remotely prepared with certain probability, which is characterized by the coefficients of both the employed entangled state and the target state. Remarkably, our protocol readily extends to RSP and JRSP of mixed states with all-optical means. Therefore, our protocol is promising for communication among optics-based multi-node quantum networks.

  4. Reliable Multihop Broadcast Protocol with a Low-Overhead Link Quality Assessment for ITS Based on VANETs in Highway Scenarios

    PubMed Central

    Galaviz-Mosqueda, Alejandro; Villarreal-Reyes, Salvador; Galeana-Zapién, Hiram; Rubio-Loyola, Javier; Covarrubias-Rosales, David H.

    2014-01-01

    Vehicular ad hoc networks (VANETs) have been identified as a key technology for intelligent transport systems (ITS), which aim to radically improve the safety, comfort, and greenness of vehicles on the road. However, several issues must be addressed in order to fully exploit the potential of VANETs. Because of the highly dynamic topology of VANETs and the impairments of the wireless channel, one key issue is the multihop dissemination of broadcast packets for safety and infotainment applications. In this paper a reliable low-overhead multihop broadcast (RLMB) protocol is proposed to address the well-known broadcast storm problem. The proposed RLMB takes advantage of the hello messages exchanged between vehicles, processing this information to intelligently select a relay set and reduce redundant broadcasts. Additionally, to reduce the dependency on the hello message rate, RLMB uses a point-to-zone link evaluation approach. RLMB's performance is compared with that of one of the leading multihop broadcast protocols to date. Performance results show that RLMB outperforms the leading protocol in terms of important metrics such as packet dissemination ratio, overhead, and delay. PMID:25133224
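
    The relay-selection idea can be illustrated with a small sketch. This is not the RLMB algorithm itself (the abstract does not give its selection rules or link-quality metric); it is a minimal, hypothetical zone-based selection from hello-message distances, with the function name, zone width, and example values all invented:

```python
def select_relays(neighbors, zone_width=100.0):
    """Pick one relay per distance zone: the farthest neighbor in each zone.

    neighbors: dict mapping node id -> distance (m) from the sender,
    as learned from periodic hello messages.
    """
    zones = {}
    for node, dist in neighbors.items():
        zone = int(dist // zone_width)
        # keep the farthest node seen so far in this zone
        if zone not in zones or dist > neighbors[zones[zone]]:
            zones[zone] = node
    return sorted(zones.values())

# Hypothetical neighbor table built from received hello messages
hellos = {"A": 40.0, "B": 95.0, "C": 150.0, "D": 180.0}
relays = select_relays(hellos)
```

    Selecting the farthest node per zone is one common heuristic for reducing redundant rebroadcasts; the actual RLMB relay set is chosen with additional link-quality information.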

  5. Protocols for second-generation business satellites systems

    NASA Astrophysics Data System (ADS)

    Evans, B. G.; Coakley, F. P.; El Amin, M. H. M.

    The paper discusses the nature and mix of traffic in business satellite systems and describes the limitations on the protocol imposed by the differing impairments of speech, video, and data. A simple TDMA system protocol is presented which meets the requirements of mixed-service operation. The efficiency of the protocol together with implications for allocation, scheduling and synchronisation are discussed. Future-generation satellites will probably use on-board processing. Some initial work on protocols that make use of on-board processing and the implications for satellite and earth-station equipment are presented.
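
    Mixed-service operation in a TDMA frame amounts to dividing a fixed number of slots among speech, video, and data. As a toy illustration (not the paper's protocol; the service names, weights, and largest-remainder rounding rule are assumptions):

```python
def build_tdma_frame(demands, frame_slots=24):
    """Allocate TDMA slots per service class in proportion to demand,
    handing out rounding leftovers by largest fractional remainder."""
    total = sum(demands.values())
    alloc = {s: (w * frame_slots) // total for s, w in demands.items()}
    leftover = frame_slots - sum(alloc.values())
    # services with the largest fractional remainders get the spare slots
    by_remainder = sorted(
        demands, key=lambda s: (demands[s] * frame_slots) % total, reverse=True)
    for s in by_remainder[:leftover]:
        alloc[s] += 1
    return alloc

frame = build_tdma_frame({"speech": 3, "video": 2, "data": 1})
```

    A real business-satellite protocol must also handle scheduling and synchronisation across earth stations, which this sketch ignores.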

  6. CD-ROM-aided Databases

    NASA Astrophysics Data System (ADS)

    Sano, Tomoyuki; Suzuki, Masataka; Nishida, Hideo

    A CAI system using CD-ROM and NAPLPS (North American Presentation Level Protocol Syntax) was developed at Himeji Dokkyo University. The characteristics of CD-ROM-based CAI, used in an information processing course for liberal arts students, are described. In this system, the computer program together with large amounts of voice and graphics data is stored on a CD-ROM. The system is very effective in improving students' learning ability.

  7. High speed polling protocol for multiple node network

    NASA Technical Reports Server (NTRS)

    Kirkham, Harold (Inventor)

    1995-01-01

    The invention is a multiple interconnected network of intelligent message-repeating remote nodes which employs a remote node polling process performed by a master node by transmitting a polling message generically addressed to all remote nodes associated with the master node. Each remote node responds upon receipt of the generically addressed polling message by transmitting a poll-answering informational message and by relaying the polling message to other adjacent remote nodes.
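
    The poll-and-relay behaviour described above can be sketched as a breadth-first flood over the node graph. This is a simplification (the patent's message formats and addressing scheme are not reproduced, and the example network is invented):

```python
from collections import deque

def poll_network(adjacency, master):
    """Flood a generically addressed poll from the master; each remote
    node answers on first receipt and relays the poll to its neighbors.
    Duplicate copies of the poll are ignored."""
    answered = []
    seen = {master}
    queue = deque(adjacency.get(master, []))
    while queue:
        node = queue.popleft()
        if node in seen:
            continue
        seen.add(node)
        answered.append(node)           # poll-answering message back to master
        queue.extend(adjacency.get(node, []))  # relay to adjacent nodes
    return answered

# Hypothetical topology: master "M" directly reaches "a" and "b"; "c" only via "a"
net = {"M": ["a", "b"], "a": ["M", "c"], "b": ["M"], "c": ["a"]}
```
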

  8. Probability- and model-based approaches to inference for proportion forest using satellite imagery as ancillary data

    Treesearch

    Ronald E. McRoberts

    2010-01-01

    Estimates of forest area are among the most common and useful information provided by national forest inventories. The estimates are used for local and national purposes and for reporting to international agreements such as the Montréal Process, the Ministerial Conference on the Protection of Forests in Europe, and the Kyoto Protocol. The estimates are usually based on...

  9. Direct Fabrication of a-Si:H Thin Film Transistor Arrays on Plastic and Metal Foils for Flexible Displays

    DTIC Science & Technology

    2008-12-01

    TFTs ) arrays for high information content active matrix flexible displays for Army applications. For all flexible substrates a manufacturable...impermeable flexible substrate systems “display-ready” materials and handling protocols, (ii) high performance TFT devices and circuits fabricated...processes for integration with the flexible TFT arrays. Approaches and solution to address each of these major challenges are described in the

  10. A mobile information management system used in textile enterprises

    NASA Astrophysics Data System (ADS)

    Huang, C.-R.; Yu, W.-D.

    2008-02-01

    The mobile information management system (MIMS) for textile enterprises is based on Microsoft Visual Studio .NET 2003 Server, Microsoft SQL Server 2000, the C++ language, and wireless application protocol (WAP) and wireless markup language (WML) technology. The portable MIMS has a three-layer structure, i.e. a presentation layer, an operating layer, and a data-access layer, corresponding to a port-link module, a processing module, and a database module. With the MIMS, information exchange becomes more convenient and easier, and built-in units reconcile a large information capacity with the small form factor of a cell phone while allowing the operating and design functions to be expanded. The MIMS developed here is suitable for use in textile enterprises.

  11. Pediatric oncologists' attitudes towards involving adolescents in decision-making concerning research participation.

    PubMed

    de Vries, Martine C; Wit, Jan M; Engberts, Dirk P; Kaspers, Gertjan J L; van Leeuwen, Evert

    2010-07-15

    Various regulations and guidelines stipulate the importance of involving adolescents in decision-making concerning research participation. Several studies have shown that in the context of pediatric oncology this involvement is difficult to achieve due to emotional stress, the complexity of research protocols and limited time. Still, up to 80% of adolescents with cancer enter onto a trial during their illness. The aim of this study was to determine clinicians' views and attitudes towards enrolling adolescents in research, considering the difficulties surrounding their involvement in decision-making. A qualitative multicenter study was performed, using in-depth semi-structured interviews on the informed consent process with 15 pediatric hemato-oncologists. Four central themes emerged that characterize clinicians' attitudes towards involving adolescents in the decision-making process: (1) clinicians regard most adolescents as not capable of participating meaningfully in discussions regarding research; (2) clinicians do not always provide adolescents with all information; (3) proxy consent from parents is obtained and is deemed sufficient; (4) clinician-investigator integrity: clinicians judge research protocols as not being harmful and even in the best interest of the adolescent. Clinicians justify not involving adolescents in research discussions by referring to best interest arguments (adolescents' incompetence, proxy consent, and investigator integrity), although this is not in line with legal regulations and ethical guidelines.

  12. Patient-Reported Outcome (PRO) Assessment in Clinical Trials: A Systematic Review of Guidance for Trial Protocol Writers

    PubMed Central

    Calvert, Melanie; Kyte, Derek; Duffy, Helen; Gheorghe, Adrian; Mercieca-Bebber, Rebecca; Ives, Jonathan; Draper, Heather; Brundage, Michael; Blazeby, Jane; King, Madeleine

    2014-01-01

    Background Evidence suggests there are inconsistencies in patient-reported outcome (PRO) assessment and reporting in clinical trials, which may limit the use of these data to inform patient care. For trials with a PRO endpoint, routine inclusion of key PRO information in the protocol may help improve trial conduct and the reporting and appraisal of PRO results; however, it is currently unclear exactly what PRO-specific information should be included. The aim of this review was to summarize the current PRO-specific guidance for clinical trial protocol developers. Methods and Findings We searched the MEDLINE, EMBASE, CINHAL and Cochrane Library databases (inception to February 2013) for PRO-specific guidance regarding trial protocol development. Further guidance documents were identified via Google, Google scholar, requests to members of the UK Clinical Research Collaboration registered clinical trials units and international experts. Two independent investigators undertook title/abstract screening, full text review and data extraction, with a third involved in the event of disagreement. 21,175 citations were screened and 54 met the inclusion criteria. Guidance documents were difficult to access: electronic database searches identified just 8 documents, with the remaining 46 sourced elsewhere (5 from citation tracking, 27 from hand searching, 7 from the grey literature review and 7 from experts). 162 unique PRO-specific protocol recommendations were extracted from included documents. A further 10 PRO recommendations were identified relating to supporting trial documentation. Only 5/162 (3%) recommendations appeared in ≥50% of guidance documents reviewed, indicating a lack of consistency. Conclusions PRO-specific protocol guidelines were difficult to access, lacked consistency and may be challenging to implement in practice. There is a need to develop easily accessible consensus-driven PRO protocol guidance. 
Guidance should be aimed at ensuring key PRO information is routinely included in appropriate trial protocols, in order to facilitate rigorous collection/reporting of PRO data, to effectively inform patient care. PMID:25333995

  13. Australian Aboriginal and Torres Strait Islander-focused primary healthcare social and emotional wellbeing research: a systematic review protocol.

    PubMed

    Farnbach, Sara; Eades, Anne-Marie; Hackett, Maree Lisa

    2015-12-30

    Research with a focus on Aboriginal and Torres Strait Islander Australians' (hereafter referred to as Indigenous(1)) needs is crucial to ensure culturally appropriate evidence-based strategies are developed to improve health. However, concerns surrounding this research exist, arising from some previous research lacking community consultation, resulting in little community benefit or infringing on important cultural values. Values and Ethics: Guidelines for Ethical conduct in Aboriginal and Torres Strait Islander Health Research (hereafter referred to as Values and Ethics), developed by The National Health and Medical Research Council of Australia in 2003, is the ethical standard for Indigenous-focused health research. Researchers must address its Values in research design and conduct. However, its impact on research processes is unclear. Local Protocols should also be considered. This review aims to systematically examine practices related to Values and Ethics, Local Protocols and the processes of conducting Indigenous-focused primary healthcare research in collaboration with external researchers. The following electronic databases and grey literature will be searched (2003 to current): MEDLINE, EMBASE, CINAHL, Informit and HealthInfoNet--an Indigenous-specific research and program website. Indigenous-focused research will be included. Research must be conducted in one or more primary healthcare services, in collaboration with external researchers and with a focus on social and emotional wellbeing. One reviewer will review titles and abstracts to remove obviously irrelevant research articles. Full-text research articles will be retrieved and independently examined by two reviewers. Data and quality assessment will be completed by one reviewer and verified by a second reviewer. Quality will be assessed using modified versions of established quality assessment tools. 
This review will provide information on research processes and the impact of Values and Ethics on Indigenous-focused primary healthcare research, informing communities and primary healthcare staff around research practices, and researchers and policy makers of strengths and weaknesses of practice. PROSPERO CRD42015024994.

  14. Optimizing radiotherapy protocols using computer automata to model tumour cell death as a function of oxygen diffusion processes.

    PubMed

    Paul-Gilloteaux, Perrine; Potiron, Vincent; Delpon, Grégory; Supiot, Stéphane; Chiavassa, Sophie; Paris, François; Costes, Sylvain V

    2017-05-23

    The concept of hypofractionation is gaining momentum in radiation oncology centres, enabled by recent advances in radiotherapy apparatus. The gain of efficacy of this innovative treatment must be defined. We present a computer model based on translational murine data for in silico testing and optimization of various radiotherapy protocols with respect to tumour resistance and the microenvironment heterogeneity. This model combines automata approaches with image processing algorithms to simulate the cellular response of tumours exposed to ionizing radiation, modelling the alteration of oxygen permeabilization in blood vessels against repeated doses, and introducing mitotic catastrophe (as opposed to arbitrary delayed cell-death) as a means of modelling radiation-induced cell death. Published data describing cell death in vitro as well as tumour oxygenation in vivo are used to inform parameters. Our model is validated by comparing simulations to in vivo data obtained from the radiation treatment of mice transplanted with human prostate tumours. We then predict the efficacy of untested hypofractionation protocols, hypothesizing that tumour control can be optimized by adjusting daily radiation dosage as a function of the degree of hypoxia in the tumour environment. Further biological refinement of this tool will permit the rapid development of more sophisticated strategies for radiotherapy.

  15. Development of yarn breakage detection software system based on machine vision

    NASA Astrophysics Data System (ADS)

    Wang, Wenyuan; Zhou, Ping; Lin, Xiangyu

    2017-10-01

    In spinning mills, yarn breakage often cannot be detected in a timely manner, which raises costs for textile enterprises. This paper presents a software system based on computer vision for real-time detection of yarn breakage. The system uses a Windows 8.1 tablet PC and a cloud server to perform yarn breakage detection and management. The software running on the tablet PC collects yarn images and location information for analysis and processing; the processed information is then sent over Wi-Fi via HTTP to the cloud server and stored in a Microsoft SQL Server 2008 database for follow-up query and management of yarn-break information. Results are also shown on a local display in real time to remind the operator to deal with broken yarn. The experimental results show that the missed-detection rate of the system is not more than 5‰, with no false detections.

  16. A Robust and Efficient Quantum Private Comparison of Equality Based on the Entangled Swapping of GHZ-like State and χ + State

    NASA Astrophysics Data System (ADS)

    Xu, Ling; Zhao, Zhiwen

    2017-08-01

    A new quantum protocol with the assistance of a semi-honest third party (TP) is proposed, which allows the participants to compare the equality of their private information without disclosing it. Different from previous protocols, this protocol transmits the classical information using quantum key distribution that is secure against collective-dephasing noise and collective-rotation noise, which is more robust and abandons fewer samples. In addition, this protocol utilizes the GHZ-like state and the χ + state to produce the entanglement swapping, and the Bell basis and the dual basis are used to measure the particle pairs, so that 3 bits of each participant's private information can be compared in each round, which is more efficient and requires fewer rounds. Meanwhile, there is no need for unitary operations or hash functions in this protocol. Finally, various kinds of outside attacks and participant attacks are discussed, analyzed, and shown to be ineffective, so the protocol can complete the comparison securely.

  17. Survey of risks and benefits communication strategies by research nurses.

    PubMed

    Nusbaum, Lika; Douglas, Brenda; Estrella-Luna, Neenah; Paasche-Orlow, Michael; Damus, Karla

    2017-01-01

    An ethical, informed consent process requires that potential participants understand the study, their rights, and the risks and benefits. Yet, despite strategies to improve communication, many participants still lack understanding of potential risks and benefits. Investigating attitudes and practices of research nurses can identify ways to improve the informed consent process. What are the attitudes, practices, and preparedness of nurses involved in the informed consent process regarding communication of risks and benefits? A survey was developed and administered online to a national purposive sample of 107 research nurses with experience obtaining informed consent for clinical trials. Survey responses stratified by selected work-related characteristics were analyzed. Ethical considerations: Participants were instructed they need not answer each question and could stop at any time. They consented by clicking "accept" on the email which linked to the survey. The study was approved by the Northeastern University Institutional Review Board, Boston, Massachusetts (NU-IRB Protocol #: 13-06-17). Most research nurses (87%) used a teach-back method to assess participant comprehension, while 72% relied on their intuition. About one-third did not feel prepared to communicate related statistics. About 20% did not feel prepared to tailor information, and half did not feel competent using supplemental materials to enhance risks and benefits comprehension. Only 70% had received training in the informed consent process which included in-person training (84%), case studies (69%), online courses (57%), feedback during practice sessions (54%), and simulation, such as role playing (49%) and viewing videos (45%). Perceived preparedness was significantly associated with greater informed consent experience and training. Research nurses may have inadequate training to encourage, support, and reinforce communication of risks and benefits during the informed consent process. 
Relevant purposeful education and training should help to improve and standardize the ethical informed consent process.

  18. A SOAP Web Service for accessing MODIS land product subsets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SanthanaVannan, Suresh K; Cook, Robert B; Pan, Jerry Yun

    2011-01-01

    Remote sensing data from satellites have provided valuable information on the state of the earth for several decades. Since March 2000, the Moderate Resolution Imaging Spectroradiometer (MODIS) sensors on board NASA's Terra and Aqua satellites have been providing estimates of several land parameters useful in understanding earth system processes at global, continental, and regional scales. However, the HDF-EOS file format, the specialized software needed to process HDF-EOS files, the data volume, and the high spatial and temporal resolution of MODIS data make it difficult for users wanting to extract small but valuable amounts of information from the MODIS record. To overcome this usability issue, the NASA-funded Distributed Active Archive Center (DAAC) for Biogeochemical Dynamics at Oak Ridge National Laboratory (ORNL) developed a Web service that provides subsets of MODIS land products using Simple Object Access Protocol (SOAP). The ORNL DAAC MODIS subsetting Web service is a unique way of serving satellite data that exploits an established and popular Internet protocol to allow users access to massive amounts of remote sensing data. The Web service provides MODIS land product subsets up to 201 x 201 km in a non-proprietary comma-delimited text file format. Users can programmatically query the Web service to extract MODIS land parameters for real-time data integration into models and decision support tools, or connect it to workflow software. Information regarding the MODIS SOAP subsetting Web service is available on the World Wide Web (WWW) at http://daac.ornl.gov/modiswebservice.
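
    A programmatic query to such a service is just a SOAP envelope sent over HTTP. The sketch below only assembles the XML request body; the operation and parameter names are illustrative guesses, not the service's documented interface:

```python
def soap_envelope(method, params, ns="http://daac.ornl.gov/MODIS_webservice"):
    """Build a minimal SOAP 1.1 request envelope for an RPC-style call.

    method/params are hypothetical; consult the service's WSDL for the
    real operation names and argument types.
    """
    args = "".join(f"<{k}>{v}</{k}>" for k, v in params.items())
    return (
        '<?xml version="1.0"?>'
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
        f'<soap:Body><m:{method} xmlns:m="{ns}">{args}</m:{method}></soap:Body>'
        '</soap:Envelope>'
    )

# Hypothetical subset request for a point of interest
req = soap_envelope("getsubset", {"lat": 45.0, "lon": -93.0, "product": "MOD13Q1"})
```

    In practice one would POST this body to the service endpoint (or use a SOAP client library that generates it from the WSDL) and parse the comma-delimited subset out of the response.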

  19. Strategies for preventing group B streptococcal infections in newborns: a nation-wide survey of Italian policies.

    PubMed

    Tzialla, Chryssoula; Berardi, Alberto; Farina, Claudio; Clerici, Pierangelo; Borghesi, Alessandro; Viora, Elsa; Scollo, Paolo; Stronati, Mauro

    2017-11-02

    There are no Italian data regarding the strategies for preventing neonatal group B streptococcal (GBS) infection. We conducted a national survey in order to explore obstetrical, neonatal and microbiological practices for GBS prevention. Three distinct questionnaires were sent to obstetricians, neonatologists and microbiologists. Questionnaires included data on prenatal GBS screening, maternal risk factors, intrapartum antibiotic prophylaxis, microbiological information concerning specimen processing and GBS antimicrobial susceptibility. All respondent obstetrical units used the culture-based screening approach to identify women who should receive intrapartum antibiotic prophylaxis, and more than half of the microbiological laboratories (58%) reported using specimen processing consistent with CDC guidelines. Most neonatal units (89 out of 107, 82%) reported using protocols for preventing GBS early-onset sepsis consistent with CDC guidelines. The screening-based strategy is largely prevalent in Italy, and most protocols for preventing GBS early-onset sepsis are consistent with CDC guidelines. However, we found discrepancies in practices among centers that may reflect the lack of Italian guidelines issued by public health organizations.

  20. Development of a real-time PCR protocol for the species origin confirmation of isolated animal particles detected by NIRM.

    PubMed

    Fumière, O; Marien, A; Fernández Pierna, J A; Baeten, V; Berben, G

    2010-08-01

    At present, European legislation totally prohibits the use of processed animal proteins in feed for all farmed animals (Commission Regulation (EC) No. 1234/2003, the extended feed ban). A softening of the feed ban for non-ruminants would nevertheless be considered if alternative methods could provide more information on the species origin of processed animal proteins than classical optical microscopy can. This would allow control provisions such as the ban on feeding animals with proteins from the same species, i.e. intra-species recycling (Regulation (EC) No. 1774/2002). Two promising alternative methods, near-infrared microscopy (NIRM) and real-time polymerase chain reaction (PCR), were combined to authenticate, at the species level, the presence of animal particles. The paper describes improvements made to the DNA extraction protocol of the real-time PCR method, allowing five PCR analyses to be performed with the DNA extracted from a single particle.

  1. Multiprocessor shared-memory information exchange

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santoline, L.L.; Bowers, M.D.; Crew, A.W.

    1989-02-01

    In distributed microprocessor-based instrumentation and control systems, the inter- and intra-subsystem communication requirements ultimately form the basis for the overall system architecture. This paper describes a software protocol which addresses the intra-subsystem communications problem. Specifically, the protocol allows multiple processors to exchange information via a shared-memory interface. The authors' primary goal is to provide a reliable means for information to be exchanged between central application processor boards (masters) and dedicated function processor boards (slaves) in a single computer chassis. The resultant Multiprocessor Shared-Memory Information Exchange (MSMIE) protocol, a standard master-slave shared-memory interface suitable for use in nuclear safety systems, is designed to pass unidirectional buffers of information between the processors while providing a minimum, deterministic cycle time for this data exchange.
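
    The unidirectional buffer-passing idea can be mimicked in a single-threaded sketch. This does not reproduce the real MSMIE handshake, its status words, or its determinism guarantees; the buffer states and class names are invented for illustration:

```python
# Invented status flags for a double-buffered, unidirectional exchange
EMPTY, WRITING, READY = "empty", "writing", "ready"

class SharedBuffers:
    """Toy slave-to-master exchange over two shared buffers: the slave
    publishes into any buffer not yet consumed; the master drains
    buffers marked READY."""

    def __init__(self):
        self.bufs = [{"status": EMPTY, "data": None} for _ in range(2)]

    def slave_write(self, data):
        for b in self.bufs:
            if b["status"] != READY:      # don't overwrite unread data
                b["status"] = WRITING
                b["data"] = data
                b["status"] = READY       # publish to the master
                return True
        return False                      # both buffers still unread

    def master_read(self):
        for b in self.bufs:
            if b["status"] == READY:
                data, b["status"], b["data"] = b["data"], EMPTY, None
                return data
        return None                       # nothing new to consume
```

    The real protocol arbitrates concurrent access by multiple processors over a hardware shared-memory interface, which a sequential sketch like this cannot capture.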

  2. Faster modified protocol for first order reversal curve measurements

    NASA Astrophysics Data System (ADS)

    De Biasi, Emilio

    2017-10-01

    In this work we present a faster modified protocol for first order reversal curve (FORC) measurements. The main idea of this procedure is to use the information of both the ascending and descending branches constructed through successive sweeps of the magnetic field. The new method reduces the number of field sweeps to almost one half compared with the traditional method, and the length of each branch is reduced faster than in the usual FORC protocol. The new method implies not only a new measurement protocol but also a new recipe for the prior treatment of the data; after this pre-processing, the FORC diagram can be obtained by the conventional methods. In the present work we show that the new FORC procedure leads to results identical to the conventional method if the system under study follows the Stoner-Wohlfarth model with interactions that do not depend on the magnetic state (up or down) of the entities, as in the Preisach model; more specifically, if the coercive and interaction fields are not correlated and the hysteresis loops have a square shape. Numerical examples compare the usual FORC procedure with the proposed one. We also discuss that some differences may appear in real systems due to magnetic interactions. From the point of view of the information to be obtained there is no reason to prefer one FORC method over the other; indeed, the use of both methods could open the door to a more accurate and deeper analysis.
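
    To see where the time cost of the conventional protocol comes from, one can enumerate its measurement grid: from each reversal field H_r (stepped down from saturation), the field is swept back up to +H_sat, measuring M(H_r, H) along the way. A small sketch (fields in arbitrary units; the modified protocol of the paper is not reproduced here):

```python
def forc_grid(h_sat, h_step):
    """Enumerate (H_r, H) measurement points of a conventional FORC run:
    one ascending branch from each reversal field H_r back to +H_sat."""
    branches = []
    h_r = h_sat - h_step
    while h_r >= -h_sat:
        h, branch = h_r, []
        while h <= h_sat:
            branch.append((round(h_r, 6), round(h, 6)))
            h += h_step
        branches.append(branch)
        h_r -= h_step
    return branches

grid = forc_grid(1.0, 0.5)
# branches grow longer as H_r approaches -H_sat, so the total number of
# field points scales roughly quadratically with the field resolution
```
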

  3. A security proof of the round-robin differential phase shift quantum key distribution protocol based on the signal disturbance

    NASA Astrophysics Data System (ADS)

    Sasaki, Toshihiko; Koashi, Masato

    2017-06-01

    The round-robin differential phase shift (RRDPS) quantum key distribution (QKD) protocol is a unique QKD protocol whose security has not been understood through an information-disturbance trade-off relation, and a sufficient amount of privacy amplification was given independently of signal disturbance. Here, we discuss the security of the RRDPS protocol in the asymptotic regime when a good estimate of the bit error rate is available as a measure of signal disturbance. The uniqueness of the RRDPS protocol shows up as a peculiar form of information-disturbance trade-off curve. When the length of a block of pulses used for encoding and the signal disturbance are both small, it provides a significantly better key rate than that from the original security proof. On the other hand, when the block length is large, the use of the signal disturbance makes little improvement in the key rate. Our analysis will bridge a gap between the RRDPS protocol and the conventional QKD protocols.

  4. Exploring the Implementation of Steganography Protocols on Quantum Audio Signals

    NASA Astrophysics Data System (ADS)

    Chen, Kehan; Yan, Fei; Iliyasu, Abdullah M.; Zhao, Jianping

    2018-02-01

    Two quantum audio steganography (QAS) protocols are proposed, each of which manipulates or modifies the least significant qubit (LSQb) of the host quantum audio signal that is encoded as FRQA (flexible representation of quantum audio) content. The first protocol (i.e. the conventional LSQb QAS protocol, or simply the cLSQ stego protocol) is built on exchanges between qubits encoding the quantum audio message and the LSQb of the amplitude information in the host quantum audio samples. In the second protocol, the embedding procedure implants information from a quantum audio message deep into the constraint-imposed most significant qubit (MSQb) of the host quantum audio samples; we refer to it as the pseudo MSQb QAS protocol, or simply the pMSQ stego protocol. The cLSQ stego protocol is designed to guarantee high imperceptibility between the host quantum audio and its stego version, whereas the pMSQ stego protocol ensures that the resulting stego quantum audio signal is better immune to illicit tampering and copyright violations (a.k.a. robustness). Built on the circuit model of quantum computation, the circuit networks to execute the embedding and extraction algorithms of both QAS protocols are determined, and simulation-based experiments are conducted to demonstrate their implementation. Outcomes attest that both protocols offer promising trade-offs in terms of imperceptibility and robustness.
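
    The classical analogue of the cLSQ embedding is ordinary least-significant-bit substitution on audio samples. A minimal sketch of that classical counterpart (the paper's quantum circuits are, of course, not captured by this):

```python
def embed_lsb(samples, bits):
    """Replace the least significant bit of each 8-bit sample with one
    message bit: the classical counterpart of LSQb embedding."""
    return [(s & ~1) | b for s, b in zip(samples, bits)]

def extract_lsb(stego):
    """Recover the embedded message by reading each sample's LSB."""
    return [s & 1 for s in stego]

stego = embed_lsb([200, 37, 18, 255], [1, 0, 1, 0])
```

    Changing only the LSB perturbs each sample by at most 1 out of 255, which is why LSB-style schemes (classical or quantum) achieve high imperceptibility at the cost of fragility to tampering.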

  5. PNNI Performance Validation Test Report

    NASA Technical Reports Server (NTRS)

    Dimond, Robert P.

    1999-01-01

    Two Private Network-Network Interface (PNNI) neighboring peers were monitored with a protocol analyzer to understand and document how PNNI works with regard to initialization and recovery processes. With the processes documented, pertinent events were identified and measured to determine the protocol's behavior in several environments involving congestion and/or delay. Subsequent testing of the protocol in these environments was conducted to determine its suitability for use in satellite-terrestrial network architectures.

  6. A Novel Re-keying Function Protocol (NRFP) For Wireless Sensor Network Security

    PubMed Central

    Abdullah, Maan Younis; Hua, Gui Wei; Alsharabi, Naif

    2008-01-01

    This paper describes a novel re-keying function protocol (NRFP) for wireless sensor network security. A re-keying process management system for sensor networks is designed to support in-network processing. The design of the protocol is motivated by decentralization key management for wireless sensor networks (WSNs), covering key deployment, key refreshment, and key establishment. NRFP supports the establishment of novel administrative functions for sensor nodes that derive/re-derive a session key for each communication session. The protocol proposes direct connection, in-direct connection and hybrid connection. NRFP also includes an efficient protocol for local broadcast authentication based on the use of one-way key chains. A salient feature of the authentication protocol is that it supports source authentication without precluding in-network processing. Security and performance analysis shows that it is very efficient in computation, communication and storage and, that NRFP is also effective in defending against many sophisticated attacks. PMID:27873963

  7. A Novel Re-keying Function Protocol (NRFP) For Wireless Sensor Network Security.

    PubMed

    Abdullah, Maan Younis; Hua, Gui Wei; Alsharabi, Naif

    2008-12-04

    This paper describes a novel re-keying function protocol (NRFP) for wireless sensor network security. A re-keying process management system for sensor networks is designed to support in-network processing. The design of the protocol is motivated by decentralization key management for wireless sensor networks (WSNs), covering key deployment, key refreshment, and key establishment. NRFP supports the establishment of novel administrative functions for sensor nodes that derive/re-derive a session key for each communication session. The protocol proposes direct connection, in-direct connection and hybrid connection. NRFP also includes an efficient protocol for local broadcast authentication based on the use of one-way key chains. A salient feature of the authentication protocol is that it supports source authentication without precluding in-network processing. Security and performance analysis shows that it is very efficient in computation, communication and storage and, that NRFP is also effective in defending against many sophisticated attacks.
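
    The one-way key chain underlying the local broadcast authentication can be sketched with a standard hash chain. SHA-256 is an assumption here (the abstract does not name NRFP's key-derivation functions), as are the function names:

```python
import hashlib

def make_key_chain(seed, length):
    """Build a one-way key chain: K_{i-1} = H(K_i), starting from a
    secret seed. The returned list is ordered so chain[0] is the public
    commitment K_0; later keys are disclosed one at a time."""
    chain = [seed]
    for _ in range(length - 1):
        chain.append(hashlib.sha256(chain[-1]).digest())
    chain.reverse()  # chain[0] = K_0 (commitment), chain[-1] = seed
    return chain

def verify(commitment, key, steps):
    """Check a disclosed key by hashing it back to the commitment."""
    k = key
    for _ in range(steps):
        k = hashlib.sha256(k).digest()
    return k == commitment
```

    Because hashing is one-way, a receiver holding only K_0 can authenticate each later key but cannot forge future ones, which is what lets broadcast authentication coexist with in-network processing.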

  8. Developing a monitoring protocol for visitor-created informal trails in Yosemite National Park, USA.

    PubMed

    Leung, Yu-Fai; Newburger, Todd; Jones, Marci; Kuhn, Bill; Woiderski, Brittany

    2011-01-01

    Informal trails created or perpetuated by visitors are a management challenge in many protected natural areas such as Yosemite National Park. This is a significant issue as informal trail networks penetrate and proliferate into protected landscapes and habitats, threatening ecological integrity, aesthetics, and visitor experiences. In order to develop effective strategies for addressing this problem under an adaptive management framework, indicators must be developed and a monitoring protocol must be established to gather timely and relevant data about the condition, extent, and distribution of these undesired trail segments. This article illustrates a process of developing and evaluating informal trail indicators for meadows in Yosemite Valley. Indicator measures developed in past research were reviewed to identify their appropriateness for the current application. Information gaps in existing indicator measures were addressed by creating two new indices to quantify the degree of informal trailing based on its land fragmentation effects. The selected indicator measures were applied to monitoring data collected between 2006 and 2008. The selected measures and indices were evaluated for their ability to characterize informal trail impacts at site and landscape scales. Results demonstrate the utility of indicator measures in capturing different characteristics of the informal trail problem, though several metrics are strongly related to each other. The two fragmentation indices were able to depict fragmentation without being too sensitive to changes in one constituent parameter. This study points to the need for a multiparameter approach to informal trail monitoring and integration with other monitoring data. Implications for monitoring programs and research are discussed.

  9. Developing a Monitoring Protocol for Visitor-Created Informal Trails in Yosemite National Park, USA

    NASA Astrophysics Data System (ADS)

    Leung, Yu-Fai; Newburger, Todd; Jones, Marci; Kuhn, Bill; Woiderski, Brittany

    2011-01-01

    Informal trails created or perpetuated by visitors are a management challenge in many protected natural areas such as Yosemite National Park. This is a significant issue as informal trail networks penetrate and proliferate into protected landscapes and habitats, threatening ecological integrity, aesthetics, and visitor experiences. In order to develop effective strategies for addressing this problem under an adaptive management framework, indicators must be developed and a monitoring protocol must be established to gather timely and relevant data about the condition, extent, and distribution of these undesired trail segments. This article illustrates a process of developing and evaluating informal trail indicators for meadows in Yosemite Valley. Indicator measures developed in past research were reviewed to identify their appropriateness for the current application. Information gaps in existing indicator measures were addressed by creating two new indices to quantify the degree of informal trailing based on its land fragmentation effects. The selected indicator measures were applied to monitoring data collected between 2006 and 2008. The selected measures and indices were evaluated for their ability to characterize informal trail impacts at site and landscape scales. Results demonstrate the utility of indicator measures in capturing different characteristics of the informal trail problem, though several metrics are strongly related to each other. The two fragmentation indices were able to depict fragmentation without being too sensitive to changes in one constituent parameter. This study points to the need for a multiparameter approach to informal trail monitoring and integration with other monitoring data. Implications for monitoring programs and research are discussed.

  10. From hospital information system components to the medical record and clinical guidelines & protocols.

    PubMed

    Veloso, M; Estevão, N; Ferreira, P; Rodrigues, R; Costa, C T; Barahona, P

    1997-01-01

    This paper introduces an ongoing project towards the development of a new generation HIS, aiming at the integration of clinical and administrative information within a common framework. Its design incorporates explicit knowledge about domain objects and professional activities to be processed by the system together with related knowledge management services and act management services. The paper presents the conceptual model of the proposed HIS architecture, that supports a rich and fully integrated patient data model, enabling the implementation of a dynamic electronic patient record tightly coupled with computerised guideline knowledge bases.

  11. Evaluating robustness in rank-based risk assessments of freshwater ecosystems

    USGS Publications Warehouse

    Mattson, K.M.; Angermeier, Paul

    2007-01-01

    Conservation planning aims to protect biodiversity by sustaining the natural physical, chemical, and biological processes within representative ecosystems. Often data to measure these components are inadequate or unavailable. The impact of human activities on ecosystem processes complicates integrity assessments and might alter ecosystem organization at multiple spatial scales. Freshwater conservation targets, such as populations and communities, are influenced by both intrinsic aquatic properties and the surrounding landscape, and locally collected data might not accurately reflect potential impacts. We suggest that changes in five major biotic drivers—energy sources, physical habitat, flow regime, water quality, and biotic interactions—might be used as surrogates to inform conservation planners of the ecological integrity of freshwater ecosystems. Threats to freshwater systems might be evaluated based on their impact on these drivers to provide an overview of potential risk to conservation targets. We developed a risk-based protocol, the Ecological Risk Index (ERI), to identify watersheds with least/most risk to conservation targets. Our protocol combines risk-based components, specifically the frequency and severity of human-induced stressors, with biotic drivers and mappable land- and water-use data to provide a summary of relative risk to watersheds. We illustrate application of our protocol with a case study of the upper Tennessee River basin, USA. Differences in risk patterns among the major drainages in the basin reflect dominant land uses, such as mining and agriculture. A principal components analysis showed that localized, moderately severe threats accounted for most of the threat composition differences among our watersheds. We also found that the relative importance of threats is sensitive to the spatial grain of the analysis. Our case study demonstrates that the ERI is useful for evaluating the frequency and severity of ecosystem-wide risk, which can inform local and regional conservation planning.
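    The abstract says the ERI combines the frequency and severity of human-induced stressors per watershed, but does not give the formula. Purely as a hedged illustration of that kind of rank-based scoring (the scale, weighting, and function names below are assumptions, not the published ERI definition):

```python
def ecological_risk_score(stressors):
    """Toy rank-based risk score: sum of frequency * severity over the
    stressors recorded for one watershed. Illustrative only; not the
    published ERI formula."""
    return sum(freq * sev for freq, sev in stressors)

def rank_watersheds(watersheds):
    # Order (name, stressor-list) pairs from most to least at-risk.
    return sorted(watersheds,
                  key=lambda kv: ecological_risk_score(kv[1]),
                  reverse=True)

# Example: watershed "A" has a frequent moderate stressor and a rare
# severe one; "B" has a single rare, mild stressor.
ranked = rank_watersheds([("A", [(3, 2), (1, 5)]), ("B", [(1, 1)])])
```

    In this toy version, "A" ranks first because its combined frequency-severity score (11) exceeds "B"'s (1); the real protocol additionally layers in biotic drivers and mapped land- and water-use data.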

  12. A participatory approach to designing and enhancing integrated health information technology systems for veterans: protocol.

    PubMed

    Haun, Jolie N; Nazi, Kim M; Chavez, Margeaux; Lind, Jason D; Antinori, Nicole; Gosline, Robert M; Martin, Tracey L

    2015-02-27

    The Department of Veterans Affairs (VA) has developed health information technologies (HIT) and resources to improve veteran access to health care programs and services, and to support a patient-centered approach to health care delivery. To improve VA HIT access and meaningful use by veterans, it is necessary to understand their preferences for interacting with various HIT resources to accomplish health management related tasks and to exchange information. The objective of this paper was to describe a novel protocol for: (1) developing a HIT Digital Health Matrix Model; (2) conducting an Analytic Hierarchy Process called pairwise comparison to understand how and why veterans want to use electronic health resources to complete tasks related to health management; and (3) developing visual modeling simulations that depict veterans' preferences for using VA HIT to manage their health conditions and exchange health information. The study uses participatory research methods to understand how veterans prefer to use VA HIT to accomplish health management tasks within a given context, and how they would like to interact with HIT interfaces (eg, look, feel, and function) in the future. This study includes two rounds of veteran focus groups with self-administered surveys and visual modeling simulation techniques. This study will also convene an expert panel to assist in the development of a VA HIT Digital Health Matrix Model, so that both expert panel members and veteran participants can complete an Analytic Hierarchy Process, pairwise comparisons to evaluate and rank the applicability of electronic health resources for a series of health management tasks. This protocol describes the iterative, participatory, and patient-centered process for: (1) developing a VA HIT Digital Health Matrix Model that outlines current VA patient-facing platforms available to veterans, describing their features and relevant contexts for use; and (2) developing visual model simulations based on direct veteran feedback that depict patient preferences for enhancing the synchronization, integration, and standardization of VA patient-facing platforms. Focus group topics include current uses, preferences, facilitators, and barriers to using electronic health resources; recommendations for synchronizing, integrating, and standardizing VA HIT; and preferences on data sharing and delegation within the VA system. This work highlights the practical, technological, and personal factors that facilitate and inhibit use of current VA HIT, and informs an integrated system redesign. The Digital Health Matrix Model and visual modeling simulations use knowledge of veteran preferences and experiences to directly inform enhancements to VA HIT and provide a more holistic and integrated user experience. These efforts are designed to support the adoption and sustained use of VA HIT to support patient self-management and clinical care coordination in ways that are directly aligned with veteran preferences.
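    The pairwise-comparison step described in this protocol is the core of the Analytic Hierarchy Process (AHP). The study's own tooling is not specified; a minimal sketch of how priority weights are commonly derived from a reciprocal pairwise-comparison matrix, using the row geometric-mean approximation to the principal-eigenvector method:

```python
import math

def ahp_priorities(matrix):
    """Approximate AHP priority weights from a pairwise-comparison
    matrix. matrix[i][j] holds how strongly item i is preferred over
    item j (with matrix[j][i] = 1 / matrix[i][j]). Uses the row
    geometric-mean method, a standard stand-in for the
    principal-eigenvector calculation."""
    n = len(matrix)
    geo = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo)
    return [g / total for g in geo]

# Example: three hypothetical health-management tasks compared pairwise
# on Saaty's 1-9 scale; the first dominates the other two.
weights = ahp_priorities([[1, 3, 5],
                          [1/3, 1, 3],
                          [1/5, 1/3, 1]])
```

    The resulting weights sum to 1 and rank the alternatives, which is the ranking output the protocol describes collecting from both veterans and the expert panel.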

  13. A Participatory Approach to Designing and Enhancing Integrated Health Information Technology Systems for Veterans: Protocol

    PubMed Central

    Nazi, Kim M; Chavez, Margeaux; Lind, Jason D; Antinori, Nicole; Gosline, Robert M; Martin, Tracey L

    2015-01-01

    Background The Department of Veterans Affairs (VA) has developed health information technologies (HIT) and resources to improve veteran access to health care programs and services, and to support a patient-centered approach to health care delivery. To improve VA HIT access and meaningful use by veterans, it is necessary to understand their preferences for interacting with various HIT resources to accomplish health management related tasks and to exchange information. Objective The objective of this paper was to describe a novel protocol for: (1) developing a HIT Digital Health Matrix Model; (2) conducting an Analytic Hierarchy Process called pairwise comparison to understand how and why veterans want to use electronic health resources to complete tasks related to health management; and (3) developing visual modeling simulations that depict veterans’ preferences for using VA HIT to manage their health conditions and exchange health information. Methods The study uses participatory research methods to understand how veterans prefer to use VA HIT to accomplish health management tasks within a given context, and how they would like to interact with HIT interfaces (eg, look, feel, and function) in the future. This study includes two rounds of veteran focus groups with self-administered surveys and visual modeling simulation techniques. This study will also convene an expert panel to assist in the development of a VA HIT Digital Health Matrix Model, so that both expert panel members and veteran participants can complete an Analytic Hierarchy Process, pairwise comparisons to evaluate and rank the applicability of electronic health resources for a series of health management tasks. Results This protocol describes the iterative, participatory, and patient-centered process for: (1) developing a VA HIT Digital Health Matrix Model that outlines current VA patient-facing platforms available to veterans, describing their features and relevant contexts for use; and (2) developing visual model simulations based on direct veteran feedback that depict patient preferences for enhancing the synchronization, integration, and standardization of VA patient-facing platforms. Focus group topics include current uses, preferences, facilitators, and barriers to using electronic health resources; recommendations for synchronizing, integrating, and standardizing VA HIT; and preferences on data sharing and delegation within the VA system. Conclusions This work highlights the practical, technological, and personal factors that facilitate and inhibit use of current VA HIT, and informs an integrated system redesign. The Digital Health Matrix Model and visual modeling simulations use knowledge of veteran preferences and experiences to directly inform enhancements to VA HIT and provide a more holistic and integrated user experience. These efforts are designed to support the adoption and sustained use of VA HIT to support patient self-management and clinical care coordination in ways that are directly aligned with veteran preferences. PMID:25803324

  14. An Indoor Positioning-Based Mobile Payment System Using Bluetooth Low Energy Technology

    PubMed Central

    Winata, Doni

    2018-01-01

    The development of information technology has paved the way for faster and more convenient payment process flows and new methodology for the design and implementation of next generation payment systems. The growth of smartphone usage nowadays has fostered a new and popular mobile payment environment. Most of the current generation smartphones support Bluetooth Low Energy (BLE) technology to communicate with nearby BLE-enabled devices. It is plausible to construct an Over-the-Air BLE-based mobile payment system as one of the payment methods for people living in modern societies. In this paper, a secure indoor positioning-based mobile payment authentication protocol with BLE technology and the corresponding mobile payment system design are proposed. The proposed protocol consists of three phases: initialization phase, session key construction phase, and authentication phase. When a customer moves toward the POS counter area, the proposed mobile payment system will automatically detect the position of the customer to confirm whether the customer is ready for the checkout process. Once the system has identified that the customer is standing within the payment-enabled area, the payment system will invoke an authentication process between the POS and the customer’s smartphone through a BLE communication channel to generate a secure session key and establish an authenticated communication session to perform the payment transaction accordingly. A prototype is implemented to assess the performance of the proposed design for the mobile payment system. In addition, security analysis is conducted to evaluate the security strength of the proposed protocol. PMID:29587399

  15. An Indoor Positioning-Based Mobile Payment System Using Bluetooth Low Energy Technology.

    PubMed

    Yohan, Alexander; Lo, Nai-Wei; Winata, Doni

    2018-03-25

    The development of information technology has paved the way for faster and more convenient payment process flows and new methodology for the design and implementation of next generation payment systems. The growth of smartphone usage nowadays has fostered a new and popular mobile payment environment. Most of the current generation smartphones support Bluetooth Low Energy (BLE) technology to communicate with nearby BLE-enabled devices. It is plausible to construct an Over-the-Air BLE-based mobile payment system as one of the payment methods for people living in modern societies. In this paper, a secure indoor positioning-based mobile payment authentication protocol with BLE technology and the corresponding mobile payment system design are proposed. The proposed protocol consists of three phases: initialization phase, session key construction phase, and authentication phase. When a customer moves toward the POS counter area, the proposed mobile payment system will automatically detect the position of the customer to confirm whether the customer is ready for the checkout process. Once the system has identified that the customer is standing within the payment-enabled area, the payment system will invoke an authentication process between the POS and the customer's smartphone through a BLE communication channel to generate a secure session key and establish an authenticated communication session to perform the payment transaction accordingly. A prototype is implemented to assess the performance of the proposed design for the mobile payment system. In addition, security analysis is conducted to evaluate the security strength of the proposed protocol.
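    The abstract names a session key construction phase but not its key schedule. One common pattern for such a phase, shown here only as a hedged sketch (the pre-shared secret and nonce exchange are assumptions, not the paper's design), is for the POS and the smartphone to exchange fresh nonces over BLE and each derive the same session key with a keyed hash:

```python
import hashlib
import hmac
import os

def derive_session_key(shared_secret: bytes,
                       pos_nonce: bytes,
                       phone_nonce: bytes) -> bytes:
    # Hypothetical construction: HMAC-SHA256 over both parties' nonces,
    # keyed with a secret provisioned during the initialization phase.
    return hmac.new(shared_secret, pos_nonce + phone_nonce,
                    hashlib.sha256).digest()

# Both sides exchange nonces over the BLE channel, then derive the
# session key locally; fresh nonces give a fresh key per checkout.
secret = b"provisioned-during-initialization"   # assumed from init phase
n_pos, n_phone = os.urandom(16), os.urandom(16)
k_pos = derive_session_key(secret, n_pos, n_phone)
k_phone = derive_session_key(secret, n_pos, n_phone)
assert k_pos == k_phone
```

    Because the key depends on both nonces, replaying an old transcript yields a different, useless key, which is the property a checkout session needs.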

  16. Developing the skills required for evidence-based practice.

    PubMed

    French, B

    1998-01-01

    The current health care environment requires practitioners with the skills to find and apply the best currently available evidence for effective health care, to contribute to the development of evidence-based practice protocols, and to evaluate the impact of utilizing validated research findings in practice. Current approaches to teaching research are based mainly on gaining skills by participation in the research process. Emphasis on the requirement for rigour in the process of creating new knowledge is assumed to lead to skill in the process of using research information created by others. This article reflects upon the requirements for evidence-based practice, and the degree to which current approaches to teaching research prepare practitioners who are able to find, evaluate and best use currently available research information. The potential for using the principles of systematic review as a teaching and learning strategy for research is explored, and some of the possible strengths and weaknesses of this approach are highlighted.

  17. Basic Microsurgery Training Using the Laboratory Rat (Rattus norvegicus)

    DTIC Science & Technology

    2018-03-01

    PROTOCOL #: FDG20170016A. DATE: 1 March 2018. PROTOCOL TITLE: “Basic Microsurgery Training...” Approval: Ryan M. Diepenbrock, Lt Col, USAF, DC. Attachment 1: Defense Technical Information Center (DTIC) Abstract Submission. Objectives: The purpose of this course is to...

  18. Audit of HIV counselling and testing services among primary healthcare facilities in Cameroon: a protocol for a multicentre national cross-sectional study.

    PubMed

    Tianyi, Frank-Leonel; Tochie, Joel Noutakdie; Agbor, Valirie Ndip; Kadia, Benjamin Momo

    2018-03-01

    HIV testing is an invaluable entry point to prevention, care and treatment services for people living with HIV and AIDS. Poor adherence to recommended protocols and guidelines reduces the performance of rapid diagnostic tests, leading to misdiagnosis and poor estimation of HIV seroprevalence. This study seeks to evaluate the adherence of primary healthcare facilities in Cameroon to recommended HIV counselling and testing (HCT) procedures and the impact this may have on the reliability of HIV test results. This will be an analytical cross-sectional study involving primary healthcare facilities from all the 10 regions of Cameroon, selected by multistage random sampling of primary care facilities in each region. The study will last for 9 months. A structured questionnaire will be used to collect general information concerning the health facility, laboratory and other departments involved in the HCT process. The investigators will directly observe at least 10 HIV testing processes in each facility and fill out the checklist accordingly. Clearance has been obtained from the National Ethical Committee to carry out the study. Informed consent will be sought from the patients to observe the HIV testing process. The final study will be published in a peer-reviewed journal and the findings presented to health policy-makers and the general public.

  19. The Development of a Design and Construction Process Protocol to Support the Home Modification Process Delivered by Occupational Therapists

    PubMed Central

    Ormerod, Marcus; Newton, Rita

    2018-01-01

    Modifying the home environments of older people as they age in place is a well-established health and social care intervention. Using design and construction methods to redress any imbalance caused by the ageing process or disability within the home environment, occupational therapists are seen as the experts in this field of practice. However, the process used by occupational therapists when modifying home environments has been criticised for being disorganised and not founded on theoretical principles and concepts underpinning the profession. To address this issue, research was conducted to develop a design and construction process protocol specifically for home modifications. A three-stage approach was taken for the analysis of qualitative data generated from an online survey, completed by 135 occupational therapists in the UK. Using both the existing occupational therapy intervention process model and the design and construction process protocol as the theoretical frameworks, a 4-phase, 9-subphase design and construction process protocol for home modifications was developed. Overall, the study is innovative in developing the first process protocol for home modifications, potentially providing occupational therapists with a systematic and effective approach to the design and delivery of home modification services for older and disabled people. PMID:29682348

  20. The Development of a Design and Construction Process Protocol to Support the Home Modification Process Delivered by Occupational Therapists.

    PubMed

    Russell, Rachel; Ormerod, Marcus; Newton, Rita

    2018-01-01

    Modifying the home environments of older people as they age in place is a well-established health and social care intervention. Using design and construction methods to redress any imbalance caused by the ageing process or disability within the home environment, occupational therapists are seen as the experts in this field of practice. However, the process used by occupational therapists when modifying home environments has been criticised for being disorganised and not founded on theoretical principles and concepts underpinning the profession. To address this issue, research was conducted to develop a design and construction process protocol specifically for home modifications. A three-stage approach was taken for the analysis of qualitative data generated from an online survey, completed by 135 occupational therapists in the UK. Using both the existing occupational therapy intervention process model and the design and construction process protocol as the theoretical frameworks, a 4-phase, 9-subphase design and construction process protocol for home modifications was developed. Overall, the study is innovative in developing the first process protocol for home modifications, potentially providing occupational therapists with a systematic and effective approach to the design and delivery of home modification services for older and disabled people.

  1. Informing clinical policy decision-making practices in ambulance services.

    PubMed

    Muecke, Sandy; Curac, Nada; Binks, Darryn

    2013-12-01

    This study aims to identify the processes and frameworks that support an evidence-based approach to clinical policy decision-making practices in ambulance services. This literature review focused on: (i) the setting (pre-hospital); and (ii) the process of evidence translation, for studies published after the year 2000. Searches of Medline, CINAHL and Google were undertaken. Reference lists of eligible publications were searched for relevant articles. A total of 954 articles were identified. Of these, 20 full text articles were assessed for eligibility and seven full text articles met the inclusion criteria. Three provided detailed descriptions of the evidence-based practice processes used to inform ambulance service protocol or guideline development or review. There is little published literature that describes the processes involved, and frameworks required, to inform clinical policy decision making within ambulance services. This review found that processes were iterative and involved collaborations across many internal and external stakeholders. In several jurisdictions, these were coordinated by a dedicated team. Success appears dependent on committed leadership and purposive human and structural resources. Although time consuming, structured processes have been developed in some jurisdictions to assist decision-making processes. Further insight is likely to be obtained from literature published by those from other disciplines.

  2. Experience in Construction and Operation of the Distributed Information Systems on the Basis of the Z39.50 Protocol

    NASA Astrophysics Data System (ADS)

    Zhizhimov, Oleg; Mazov, Nikolay; Skibin, Sergey

    Questions concerning the construction and operation of distributed information systems based on the ANSI/NISO Z39.50 Information Retrieval Protocol are discussed in the paper. The paper draws on the authors' practice in developing the ZooPARK server. The architecture of distributed information systems, the reliability of such systems, minimization of search time, and administration are examined. Problems in developing distributed information systems are also described.

  3. A Lightweight RFID Mutual Authentication Protocol Based on Physical Unclonable Function.

    PubMed

    Xu, He; Ding, Jie; Li, Peng; Zhu, Feng; Wang, Ruchuan

    2018-03-02

    With the fast development of the Internet of Things, Radio Frequency Identification (RFID) has been widely applied in many areas. Nevertheless, even as it makes life more convenient, security problems of RFID technology have gradually been exposed. In particular, the appearance of large numbers of fake and counterfeit goods has caused massive losses for both producers and customers, and the cloned tag is a serious security threat in this setting. If attackers acquire the complete information of a tag, they can obtain its unique identifier by technological means. In general, because a tag carries no extra identifier, it is difficult to distinguish an original tag from its clone, and once the legitimate tag data is obtained, attackers are able to clone the tag. This paper therefore presents an efficient RFID mutual verification protocol, based on a Physical Unclonable Function (PUF) and lightweight cryptography, to achieve efficient verification of a single tag. The protocol includes three processes: tag recognition, in which the reader recognizes the tag; mutual verification, in which the reader and tag verify each other's authenticity; and update, which maintains the latest secret key for subsequent verifications. Analysis results show that this protocol achieves a good balance between performance and security.
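    A PUF derives a device-unique response from physical variation, so a back-end can authenticate a tag against enrolled challenge-response pairs (CRPs) without the tag storing a clonable secret. The sketch below illustrates that general CRP flow only; it is not the paper's protocol, and the keyed hash standing in for the PUF is a software simulation, since unclonability cannot be reproduced in code:

```python
import hashlib
import hmac
import os

def simulated_puf(device_secret: bytes, challenge: bytes) -> bytes:
    # Software stand-in for a hardware PUF. A real PUF derives the
    # response from unclonable physical variation, not a stored secret.
    return hashlib.sha256(device_secret + challenge).digest()

class Verifier:
    """Back-end holding enrolled challenge-response pairs per tag."""

    def __init__(self):
        self.crp_db = {}

    def enroll(self, tag_id: str, device_secret: bytes, n: int = 4) -> None:
        # Record n random challenges and the tag's PUF responses.
        cs = [os.urandom(16) for _ in range(n)]
        self.crp_db[tag_id] = [(c, simulated_puf(device_secret, c))
                               for c in cs]

    def authenticate(self, tag_id: str, respond) -> bool:
        # Consume one stored CRP so the same challenge is never reused,
        # which blocks straightforward replay by an eavesdropper.
        challenge, expected = self.crp_db[tag_id].pop()
        return hmac.compare_digest(respond(challenge), expected)

# Usage: enroll a tag, then authenticate it by querying its PUF.
verifier = Verifier()
secret = os.urandom(32)
verifier.enroll("tag-1", secret)
ok = verifier.authenticate("tag-1", lambda c: simulated_puf(secret, c))
```

    A cloned tag without access to the physical PUF cannot answer a fresh challenge, which is the property the protocol's tag-recognition and mutual-verification steps build on.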

  4. A Lightweight RFID Mutual Authentication Protocol Based on Physical Unclonable Function

    PubMed Central

    Ding, Jie; Zhu, Feng; Wang, Ruchuan

    2018-01-01

    With the fast development of the Internet of Things, Radio Frequency Identification (RFID) has been widely applied in many areas. Nevertheless, even as it makes life more convenient, security problems of RFID technology have gradually been exposed. In particular, the appearance of large numbers of fake and counterfeit goods has caused massive losses for both producers and customers, and the cloned tag is a serious security threat in this setting. If attackers acquire the complete information of a tag, they can obtain its unique identifier by technological means. In general, because a tag carries no extra identifier, it is difficult to distinguish an original tag from its clone, and once the legitimate tag data is obtained, attackers are able to clone the tag. This paper therefore presents an efficient RFID mutual verification protocol, based on a Physical Unclonable Function (PUF) and lightweight cryptography, to achieve efficient verification of a single tag. The protocol includes three processes: tag recognition, in which the reader recognizes the tag; mutual verification, in which the reader and tag verify each other's authenticity; and update, which maintains the latest secret key for subsequent verifications. Analysis results show that this protocol achieves a good balance between performance and security. PMID:29498684

  5. Bluetooth Roaming for Sensor Network System in Clinical Environment.

    PubMed

    Kuroda, Tomohiro; Noma, Haruo; Takase, Kazuhiko; Sasaki, Shigeto; Takemura, Tadamasa

    2015-01-01

    A sensor network is key infrastructure for advancing a hospital information system (HIS). The authors proposed a method to provide roaming functionality for Bluetooth to realize a Bluetooth-based sensor network, which is suitable for connecting clinical devices. The proposed method makes the average response time of a Bluetooth connection less than one second by having the master device repeat the inquiry process endlessly and by modifying parameters of the inquiry process. The authors applied the developed sensor network to daily clinical activities in a university hospital, and confirmed the stability and effectiveness of the sensor network. As Bluetooth has become a quite common wireless interface for medical devices, the proposed protocol, which realizes a Bluetooth-based sensor network, enables an HIS to connect various clinical devices and, consequently, lets information and communication technologies advance clinical services.

  6. Improved compressed sensing-based cone-beam CT reconstruction using adaptive prior image constraints

    NASA Astrophysics Data System (ADS)

    Lee, Ho; Xing, Lei; Davidi, Ran; Li, Ruijiang; Qian, Jianguo; Lee, Rena

    2012-04-01

    Volumetric cone-beam CT (CBCT) images are acquired repeatedly during a course of radiation therapy and a natural question to ask is whether CBCT images obtained earlier in the process can be utilized as prior knowledge to reduce patient imaging dose in subsequent scans. The purpose of this work is to develop an adaptive prior image constrained compressed sensing (APICCS) method to solve this problem. Reconstructed images using full projections are taken on the first day of radiation therapy treatment and are used as prior images. The subsequent scans are acquired using a protocol of sparse projections. In the proposed APICCS algorithm, the prior images are utilized as an initial guess and are incorporated into the objective function in the compressed sensing (CS)-based iterative reconstruction process. Furthermore, the prior information is employed to detect any possible mismatched regions between the prior and current images for improved reconstruction. For this purpose, the prior images and the reconstructed images are classified into three anatomical regions: air, soft tissue and bone. Mismatched regions are identified by local differences of the corresponding groups in the two classified sets of images. A distance transformation is then introduced to convert the information into an adaptive voxel-dependent relaxation map. In constructing the relaxation map, the matched regions (unchanged anatomy) between the prior and current images are assigned with smaller weight values, which are translated into less influence on the CS iterative reconstruction process. On the other hand, the mismatched regions (changed anatomy) are associated with larger values and the regions are updated more by the new projection data, thus avoiding any possible adverse effects of prior images. The APICCS approach was systematically assessed by using patient data acquired under standard and low-dose protocols for qualitative and quantitative comparisons. 
The APICCS method provides an effective way for us to enhance the image quality at the matched regions between the prior and current images compared to the existing PICCS algorithm. Compared to the current CBCT imaging protocols, the APICCS algorithm allows an imaging dose reduction of 10-40 times due to the greatly reduced number of projections and lower x-ray tube current level coming from the low-dose protocol.
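The relaxation-map construction described above can be sketched in a few lines. Everything here is an illustrative reconstruction: the HU thresholds, weight range and exponential decay are assumptions made for the sketch, not parameters from the APICCS paper.

```python
# Hypothetical sketch of the APICCS relaxation-map idea: classify prior and
# current images into air/soft-tissue/bone, locate mismatched anatomy, and
# turn distance-to-mismatch into voxel-wise weights. All thresholds and
# weights below are illustrative assumptions.
import numpy as np
from scipy import ndimage

def classify(img, air_max=-300.0, bone_min=300.0):
    """Label voxels as 0=air, 1=soft tissue, 2=bone by (assumed) HU thresholds."""
    labels = np.ones_like(img, dtype=np.uint8)   # soft tissue by default
    labels[img <= air_max] = 0
    labels[img >= bone_min] = 2
    return labels

def relaxation_map(prior, current, w_matched=0.1, w_mismatched=1.0):
    """Voxel-wise weights: small where anatomy is unchanged (trust the prior),
    large where the two classifications disagree (trust the new projections)."""
    mismatch = classify(prior) != classify(current)
    # Distance from every voxel to the nearest mismatched voxel.
    dist = ndimage.distance_transform_edt(~mismatch)
    # Decay smoothly from w_mismatched at a mismatch toward w_matched far away.
    return w_matched + (w_mismatched - w_matched) * np.exp(-dist)
```

In a full reconstruction, this per-voxel map would scale how strongly the CS iteration pulls each voxel toward the prior image versus the new projection data.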

  7. Implementing the information prescription protocol in a family medicine practice: a case study

    PubMed Central

    Carey, Peggy; Haines, Laura; Lampson, Alan P; Pond, Fred

    2010-01-01

    Question: Can an information prescription protocol be successfully integrated into a family medicine practice seeking to enhance patient education and self-management? Setting: Milton Family Practice, an outpatient clinic and resident teaching site of the University of Vermont and Fletcher Allen Health Care, is located in a semirural area fifteen miles from main campus. Objectives: The objectives were to increase physicians' knowledge and use of information prescriptions, sustain integration of information prescription use, and increase physicians' ability to provide patient education information. Methods: Methods used were promotion of the National Library of Medicine's Information Rx, physician instruction, installation of patient and provider workstations, and a collaborative approach to practice integration. Main Results: A post-intervention survey showed increased physician knowledge and use of the Information Rx protocol. Support procedures were integrated at the practice. Conclusions: Sustainable integration of Information Rx in a primary care clinic requires not only promotion and education, but also attention to clinic organization and procedures. PMID:20648257

  8. A Family of ACO Routing Protocols for Mobile Ad Hoc Networks

    PubMed Central

    Rupérez Cañas, Delfín; Sandoval Orozco, Ana Lucila; García Villalba, Luis Javier; Kim, Tai-hoon

    2017-01-01

    In this work, an ACO routing protocol for mobile ad hoc networks based on AntHocNet is specified. Like its predecessor, this new protocol, called AntOR, is hybrid in the sense that it contains elements of both reactive and proactive routing. Specifically, it combines a reactive route setup process with a proactive route maintenance and improvement process. Key aspects of the AntOR protocol are the disjoint-link and disjoint-node routes, the separation between the regular pheromone and the virtual pheromone in the diffusion process, and the exploration of routes taking into consideration the number of hops in the best routes. In this work, a family of ACO routing protocols based on AntOR is also specified. These protocols are built by successive refinements of the base protocol. We also present a parallelized version of AntOR that we call PAntOR. Using multiprocessor programming based on shared memory, PAntOR allows tasks to run in parallel using threads. This parallelization is applicable in the route setup phase, the route local repair process and link failure notification. In addition, a variant of PAntOR with more than one interface, which we call PAntOR-MI (PAntOR-Multiple Interface), is specified. This approach parallelizes the sending of broadcast messages by interface through threads. PMID:28531159
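The pheromone bookkeeping that this family of protocols builds on can be sketched as follows. The evaporation rate, hop-count reinforcement rule and forwarding exponent are generic ACO choices, not AntOR's actual rules, and the class and method names are hypothetical.

```python
# Generic ACO routing sketch: backward ants reinforce pheromone on good
# routes, pheromone evaporates over time, and data packets are forwarded
# stochastically in proportion to pheromone strength. Parameters are
# illustrative, not taken from AntOR.
import random

class PheromoneTable:
    def __init__(self, evaporation=0.1):
        self.tau = {}                    # (next_hop, destination) -> pheromone
        self.evaporation = evaporation

    def reinforce(self, next_hop, dest, hops):
        """Backward ants deposit pheromone; shorter routes deposit more."""
        key = (next_hop, dest)
        self.tau[key] = self.tau.get(key, 0.0) + 1.0 / hops

    def evaporate(self):
        """Periodic decay so stale routes fade unless refreshed."""
        for key in self.tau:
            self.tau[key] *= (1.0 - self.evaporation)

    def choose_next_hop(self, dest, beta=2.0):
        """Stochastic forwarding: probability proportional to tau**beta."""
        candidates = [(nh, t) for (nh, d), t in self.tau.items() if d == dest]
        if not candidates:
            return None
        weights = [t ** beta for _, t in candidates]
        return random.choices([nh for nh, _ in candidates], weights)[0]
```

Reactive route setup corresponds to the first `reinforce` calls along newly discovered paths; proactive maintenance corresponds to periodic `evaporate` calls plus fresh reinforcement from sampler ants.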

  9. Sharing Service Resource Information for Application Integration in a Virtual Enterprise - Modeling the Communication Protocol for Exchanging Service Resource Information

    NASA Astrophysics Data System (ADS)

    Yamada, Hiroshi; Kawaguchi, Akira

    Grid computing and web service technologies enable us to use networked resources in a coordinated manner. An integrated service is made of individual services running on coordinated resources. In order to achieve such coordinated services autonomously, the initiator of a coordinated service needs to know detailed service resource information. This information ranges from static attributes like the IP address of the application server to highly dynamic ones like the CPU load. The most famous wide-area service discovery mechanism based on names is DNS. Its hierarchical tree organization and caching methods take advantage of the static information managed. However, in order to integrate business applications in a virtual enterprise, we need a discovery mechanism to search for the optimal resources based on a given set of criteria (search keys). In this paper, we propose a communication protocol for exchanging service resource information among wide-area systems. We introduce the concept of the service domain that consists of service providers managed under the same management policy. This concept of the service domain is similar to that of autonomous systems (ASs). In each service domain, the service information provider manages the service resource information of the service providers that exist in this service domain. The service resource information provider exchanges this information with other service resource information providers that belong to different service domains. We also verified the protocol's behavior and effectiveness using a simulation model developed for the proposed protocol.
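The service-domain idea can be illustrated with a toy registry. Every class, field and method name below is hypothetical, since the paper defines its own message formats, but the shape of the exchange (AS-like domains swapping locally managed records, then searching on given criteria) follows the description above.

```python
# Toy model of service domains exchanging resource records. Names and
# fields are invented for illustration; the paper's actual protocol
# messages differ.
from dataclasses import dataclass, field

@dataclass
class ServiceRecord:
    service: str
    host: str          # static attribute, e.g. application-server address
    cpu_load: float    # dynamic attribute, refreshed frequently

@dataclass
class DomainProvider:
    domain: str
    local: dict = field(default_factory=dict)    # records for this domain
    remote: dict = field(default_factory=dict)   # learned from peer domains

    def register(self, rec: ServiceRecord):
        self.local[(self.domain, rec.service, rec.host)] = rec

    def exchange(self, peer: "DomainProvider"):
        """Peers swap their locally managed records (AS-style)."""
        peer.remote.update(self.local)
        self.remote.update(peer.local)

    def find(self, service: str, max_load: float):
        """Search local and learned records against the given criteria."""
        every = list(self.local.values()) + list(self.remote.values())
        hits = [r for r in every if r.service == service and r.cpu_load <= max_load]
        return min(hits, key=lambda r: r.cpu_load) if hits else None
```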

  10. Two-qubit correlations revisited: average mutual information, relevant (and useful) observables and an application to remote state preparation

    NASA Astrophysics Data System (ADS)

    Giorda, Paolo; Allegra, Michele

    2017-07-01

    Understanding how correlations can be used for quantum communication protocols is a central goal of quantum information science. While many authors have linked the global measures of correlations such as entanglement or discord to the performance of specific protocols, in general the latter may require only correlations between specific observables. In this work, we first introduce a general measure of correlations for two-qubit states, based on the classical mutual information between local observables. Our measure depends on the state’s purity and the symmetry in the correlation distribution, according to which we provide a classification of maximally mixed marginal states (MMMS). We discuss the complementarity relation between correlations and coherence. By focusing on a simple yet paradigmatic example, i.e. the remote state preparation protocol, we introduce a method to systematically define the proper protocol-tailored measures of the correlations. The method is based on the identification of those correlations that are relevant (useful) for the protocol. On the one hand, the approach allows the role of the symmetry of the correlation distribution to be discussed in determining the efficiency of the protocol, both for MMMS and general two-qubit quantum states, and on the other hand, it allows an optimized protocol for non-MMMS to be devised, which is more efficient with respect to the standard one. Overall, our findings clarify how the key resources in simple communication protocols are the purity of the state used and the symmetry of the correlation distribution.
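The basic quantity underlying such a measure, the classical mutual information between the outcome distributions of two local observables, is easy to state concretely. The joint distributions used in the usage note are illustrative bit-pair examples, not states from the paper.

```python
# Classical mutual information I(A:B) from a joint outcome distribution
# p(a, b), as used when each party measures one local observable.
import numpy as np

def mutual_information(p_joint):
    """I(A:B) = sum_ab p(a,b) * log2( p(a,b) / (p(a) * p(b)) )."""
    p_joint = np.asarray(p_joint, dtype=float)
    pa = p_joint.sum(axis=1)          # marginal for observer A
    pb = p_joint.sum(axis=0)          # marginal for observer B
    outer = np.outer(pa, pb)          # product distribution p(a) * p(b)
    mask = p_joint > 0                # 0 * log(0) contributes nothing
    return float(np.sum(p_joint[mask] * np.log2(p_joint[mask] / outer[mask])))
```

For perfectly correlated outcomes, e.g. the joint distribution [[0.5, 0], [0, 0.5]], this returns 1 bit; for a product distribution it returns 0.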

  11. Process evaluation of a primary healthcare validation study of a culturally adapted depression screening tool for use by Aboriginal and Torres Strait Islander people: study protocol.

    PubMed

    Farnbach, Sara; Evans, John; Eades, Anne-Marie; Gee, Graham; Fernando, Jamie; Hammond, Belinda; Simms, Matty; DeMasi, Karrina; Hackett, Maree

    2017-11-03

    Process evaluations are conducted alongside research projects to identify the context, impact and consequences of research, determine whether it was conducted per protocol and to understand how, why and for whom an intervention is effective. We present a process evaluation protocol for the Getting it Right research project, which aims to determine validity of a culturally adapted depression screening tool for use by Aboriginal and Torres Strait Islander people. In this process evaluation, we aim to: (1) explore the context, impact and consequences of conducting Getting It Right, (2) explore primary healthcare staff and community representatives' experiences with the research project, (3) determine if it was conducted per protocol and (4) explore experiences with the depression screening tool, including perceptions about how it could be implemented into practice (if found to be valid). We also describe the partnerships established to conduct this process evaluation and how the national Values and Ethics: Guidelines for Ethical Conduct in Aboriginal and Torres Strait Islander Health Research is met. Realist and grounded theory approaches are used. Qualitative data include semistructured interviews with primary healthcare staff and community representatives involved with Getting it Right. Iterative data collection and analysis will inform a coding framework. Interviews will continue until saturation of themes is reached, or all participants are considered. Data will be triangulated against administrative data and patient feedback. An Aboriginal and Torres Strait Islander Advisory Group guides this research. Researchers will be blinded from validation data outcomes for as long as is feasible. The University of Sydney Human Research Ethics Committee, Aboriginal Health and Medical Research Council of New South Wales and six state ethics committees have approved this research. Findings will be submitted to academic journals and presented at conferences. 
ACTRN12614000705684. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  12. [Automated anesthesia record systems].

    PubMed

    Heinrichs, W; Mönk, S; Eberle, B

    1997-07-01

    The introduction of electronic anaesthesia documentation systems was attempted as early as in 1979, although their efficient application has become reality only in the past few years. The advantages of the electronic protocol are apparent: Continuous high quality documentation, comparability of data due to the availability of a data bank, reduction in the workload of the anaesthetist and availability of additional data. Disadvantages of the electronic protocol have also been discussed in the literature. By going through the process of entering data on the course of the anaesthetic procedure on the protocol sheet, the information is mentally absorbed and evaluated by the anaesthetist. This information may, however, be lost when the data are recorded fully automatically-without active involvement on the part of the anaesthetist. Recent publications state that by using intelligent alarms and/or integrated displays manual record keeping is no longer necessary for anaesthesia vigilance. The technical design of automated anaesthesia records depends on an integration of network technology into the hospital. It will be appropriate to connect the systems to the internet, but safety requirements have to be followed strictly. Concerning the database, client server architecture as well as language standards like SQL should be used. Object oriented databases will be available in the near future. Another future goal of automated anaesthesia record systems will be using knowledge based technologies within these systems. Drug interactions, disease related anaesthetic techniques and other information sources can be integrated. At this time, almost none of the commercially available systems has matured to a point where their purchase can be recommended without reservation. There is still a lack of standards for the subsequent exchange of data and a solution to a number of ergonomic problems still remains to be found. 
Nevertheless, electronic anaesthesia protocols will be required in the near future. The advantages of accurate documentation and quality control in the presence of careful planning outweigh cost considerations by far.

  13. LANES - LOCAL AREA NETWORK EXTENSIBLE SIMULATOR

    NASA Technical Reports Server (NTRS)

    Gibson, J.

    1994-01-01

    The Local Area Network Extensible Simulator (LANES) provides a method for simulating the performance of high-speed local area network (LAN) technology. LANES was developed as a design and analysis tool for networking on board the Space Station. The load, network, link and physical layers of a layered network architecture are all modeled. LANES models two different lower-layer protocols, the Fiber Distributed Data Interface (FDDI) and the Star*Bus. The load and network layers are included in the model as a means of introducing upper-layer processing delays associated with message transmission; they do not model any particular protocols. FDDI is an American National Standard and an International Organization for Standardization (ISO) draft standard for a 100 megabit-per-second fiber-optic token ring. Specifications for the LANES model of FDDI are taken from the Draft Proposed American National Standard FDDI Token Ring Media Access Control (MAC), document number X3T9.5/83-16 Rev. 10, February 28, 1986. This is a mature document describing the FDDI media-access-control protocol. Star*Bus, also known as the Fiber Optic Demonstration System, is a protocol for a 100 megabit-per-second fiber-optic star-topology LAN. This protocol, along with a hardware prototype, was developed by Sperry Corporation under contract to NASA Goddard Space Flight Center as a candidate LAN protocol for the Space Station. LANES can be used to analyze the performance of a networking system based on either FDDI or Star*Bus under a variety of loading conditions. Delays due to upper-layer processing can easily be nullified, allowing analysis of FDDI or Star*Bus as stand-alone protocols. LANES is a parameter-driven simulation; it provides considerable flexibility in specifying both protocol and run-time parameters. Code has been optimized for fast execution and detailed tracing facilities have been included. LANES was written in FORTRAN 77 for implementation on a DEC VAX under VMS 4.6.
It consists of two programs, a simulation program and a user-interface program. The simulation program requires the SLAM II simulation library from Pritsker and Associates, W. Lafayette IN; the user interface is implemented using the Ingres database manager from Relational Technology, Inc. Information about running the simulation program without the user-interface program is contained in the documentation. The memory requirement is 129,024 bytes. LANES was developed in 1988.
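As a rough illustration of the kind of figure such a simulator estimates, the mean token rotation time of a token ring is commonly approximated as the total token walk time divided by (1 - utilization). The function below uses that textbook approximation; the parameter values in the usage are illustrative, not Space Station figures, and LANES itself is a far more detailed discrete-event model.

```python
# Textbook approximation of mean token rotation time on a token ring:
# total walk time (token passing + propagation over all hops) inflated
# by offered load. Illustrative only; not the LANES model itself.

def token_rotation_time(stations, walk_time_per_hop, utilization):
    """Mean token rotation time in seconds.

    stations          -- number of stations on the ring
    walk_time_per_hop -- token passing + propagation delay per hop (s)
    utilization       -- offered load as a fraction of ring capacity
    """
    if not 0 <= utilization < 1:
        raise ValueError("utilization must be in [0, 1)")
    walk = stations * walk_time_per_hop
    return walk / (1.0 - utilization)
```

For example, a 10-station ring with a 1 microsecond walk time per hop rotates the token in 10 microseconds when idle, and twice that at 50% load.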

  14. Refining MARGINS Mini-Lessons Using Classroom Observations

    NASA Astrophysics Data System (ADS)

    Iverson, E. A.; Manduca, C. A.; McDaris, J. R.; Lee, S.

    2009-12-01

    One of the challenges that we face in developing teaching materials or activities from research findings is testing the materials to determine that they work as intended. Traditionally faculty develop material for their own class, notice what worked and didn’t, and improve them the next year. However, as we move to a community process of creating and sharing teaching materials, a community-based process for testing materials is appropriate. The MARGINS project has piloted such a process for testing teaching materials and activities developed as part of its mini-lesson project (http://serc.carleton.edu/margins/index.html). Building on prior work developing mechanisms for community review of teaching resources (e.g. Kastens, 2002; Hancock and Manduca, 2005; Mayhew and Hall, 2007), the MARGINS evaluation team developed a structured classroom observation protocol. The goals of field testing are to a) gather structured, consistent feedback for the lesson authors based on classroom use; b) guide reviewers of these lessons to reflect on research-based educational practice as a framework for their comments; c) collect information on the data and observations that the reviewer used to underpin their review; d) determine which mini-lessons are ready to be made widely available on the website. The protocol guides faculty observations on why they used the activity, the effectiveness of the activity in their classroom, the success of the activity in leading to the desired learning, and what other faculty need to successfully use the activity. Available online (http://serc.carleton.edu/margins/protocol.html), the protocol can be downloaded and completed during instruction with the activity. In order to encourage review of mini-lessons using the protocol, a workshop focused on review and revision of activities was held in May 2009. 
In preparation for the workshop, 13 of the 28 participants chose to field test a mini-lesson prior to the workshop and reported that they found this process instructive. Activity authors found the observations very helpful and the first mini-lessons have now been revised using feedback from testers. Initial results show that the tested mini-lessons give students hands-on experience with scientific data and help students make connections between geologic phenomena and data. Productive feedback included suggestions for improving activity design, adaptations for other audiences, suggestions for clearer presentation, and tips for using the materials. The team plans to broaden the use of the protocol to test and refine all of the mini-lessons in the MARGINS collection.

  15. A reference model for space data system interconnection services

    NASA Astrophysics Data System (ADS)

    Pietras, John; Theis, Gerhard

    1993-03-01

    The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous in the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).

  16. A reference model for space data system interconnection services

    NASA Technical Reports Server (NTRS)

    Pietras, John; Theis, Gerhard

    1993-01-01

    The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous in the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).

  17. An Educators' Guide to Information Access across the Internet.

    ERIC Educational Resources Information Center

    Hazari, Sunil

    1994-01-01

    A discussion of tools available for use of the Internet, particularly by college and university educators and students, offers information on use of various services, including electronic mailing list servers, data communications protocols for networking, inter-host connections, file transfer protocol, gopher software, bibliographic searching,…

  18. Health-care management of an unexpected case of Ebola virus disease at the Alcorcón Foundation University Teaching Hospital.

    PubMed

    Rodríguez-Caravaca, Gil; Timermans, Rafael; Parra-Ramírez, Juan Manuel; Domínguez-Hernández, Francisco Javier; Algora-Weber, Alejandro; Delgado-Iribarren, Alberto; Hermida-Gutiérrez, Guillermo

    2015-04-01

    The first Ebola virus infected patient outside Africa was diagnosed and treated at Alcorcón Foundation University Teaching Hospital (AFUTH). We describe the integrated management strategy (medical, occupational health, preventive and public health) applied to the case. Descriptive study of health-care management of an unexpected case of Ebola virus disease (EVD) at AFUTH treated on 6 October 2014. We describe the clinical evolution of the patient while he was attended at the Emergency Department, the drawing-up process of the action protocol, the process of training of hospital staff, the administrative management for transferring the patient to the referral centre, and the measures implemented for cleaning, disinfection and management of waste. Qualitative variables are expressed as percentages. Our centre designed and updated, from May to October, five versions of the acting and care protocol for patients with EVD. The protocol was in force at the AFUTH when a nursing assistant was attended on 6 October 2014. All preventive, diagnostic and therapeutic measures outlined in the protocol were applied and 206 professionals had received training and information about care procedures with a suspect case. Health-care management of an unexpected case of EVD was adequate and there were no secondary cases among our staff as a result. All resources available should be used to fight EVD. Copyright © 2015 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.

  19. Genome-wide analysis of replication timing by next-generation sequencing with E/L Repli-seq.

    PubMed

    Marchal, Claire; Sasaki, Takayo; Vera, Daniel; Wilson, Korey; Sima, Jiao; Rivera-Mulia, Juan Carlos; Trevilla-García, Claudia; Nogues, Coralin; Nafie, Ebtesam; Gilbert, David M

    2018-05-01

    This protocol is an extension to: Nat. Protoc. 6, 870-895 (2011); doi:10.1038/nprot.2011.328; published online 02 June 2011. Cycling cells duplicate their DNA content during S phase, following a defined program called replication timing (RT). Early- and late-replicating regions differ in terms of mutation rates, transcriptional activity, chromatin marks and subnuclear position. Moreover, RT is regulated during development and is altered in diseases. Here, we describe E/L Repli-seq, an extension of our Repli-chip protocol. E/L Repli-seq is a rapid, robust and relatively inexpensive protocol for analyzing RT by next-generation sequencing (NGS), allowing genome-wide assessment of how cellular processes are linked to RT. Briefly, cells are pulse-labeled with BrdU, and early and late S-phase fractions are sorted by flow cytometry. Labeled nascent DNA is immunoprecipitated from both fractions and sequenced. Data processing leads to a single bedGraph file containing the ratio of nascent DNA from early versus late S-phase fractions. The results are comparable to those of Repli-chip, with the additional benefits of genome-wide sequence information and an increased dynamic range. We also provide computational pipelines for downstream analyses, for parsing phased genomes using single-nucleotide polymorphisms (SNPs) to analyze RT allelic asynchrony, and for direct comparison to Repli-chip data. This protocol can be performed in up to 3 d before sequencing, and requires basic cellular and molecular biology skills, as well as a basic understanding of Unix and R.
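The final data-processing step described above, producing a log2(Early/Late) track in bedGraph format, can be sketched simply. The pseudocount and total-count normalization below are common choices for this kind of ratio track, not necessarily those of the published pipeline.

```python
# Sketch of turning per-bin read counts from early and late S-phase
# fractions into log2(Early/Late) bedGraph lines. Normalization scheme
# and pseudocount are illustrative assumptions.
import math

def el_ratio_bedgraph(bins, pseudocount=1.0):
    """bins: iterable of (chrom, start, end, early_count, late_count).
    Counts are normalized to library size before taking the ratio."""
    bins = list(bins)
    e_total = sum(b[3] for b in bins) or 1
    l_total = sum(b[4] for b in bins) or 1
    lines = []
    for chrom, start, end, e, l in bins:
        ratio = ((e + pseudocount) / e_total) / ((l + pseudocount) / l_total)
        lines.append(f"{chrom}\t{start}\t{end}\t{math.log2(ratio):.4f}")
    return lines
```

Positive values mark early-replicating bins and negative values late-replicating ones, matching the sign convention of a typical E/L ratio track.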

  20. Understanding information synthesis in oral surgery for the design of systems for clinical information technology.

    PubMed

    Suebnukarn, Siriwan; Chanakarn, Piyawadee; Phisutphatthana, Sirada; Pongpatarat, Kanchala; Wongwaithongdee, Udom; Oupadissakoon, Chanekrid

    2015-12-01

    An understanding of the processes of clinical decision-making is essential for the development of health information technology. In this study we have analysed the acquisition of information during decision-making in oral surgery, and analysed cognitive tasks using a "think-aloud" protocol. We studied the techniques of processing information that were used by novices and experts as they completed 4 oral surgical cases modelled from data obtained from electronic hospital records. We studied 2 phases of an oral surgeon's preoperative practice including the "diagnosis and planning of treatment" and "preparing for a procedure". A framework analysis approach was used to analyse the qualitative data, and a descriptive statistical analysis was made of the quantitative data. The results showed that novice surgeons used hypothetico-deductive reasoning, whereas experts recognised patterns to diagnose and manage patients. Novices provided less detail when they prepared for a procedure. Concepts regarding "signs", "importance", "decisions", and "process" occurred most often during acquisition of information by both novices and experts. Based on these results, we formulated recommendations for the design of clinical information technology that would help to improve the acquisition of clinical information required by oral surgeons at all levels of expertise in their clinical decision-making. Copyright © 2015 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  1. A review of blood sample handling and pre-processing for metabolomics studies.

    PubMed

    Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta

    2017-09-01

    Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field, there is still a fundamental need to consider pre-analytical variability, which can introduce bias into the subsequent analytical process, decrease the reliability of the results and, moreover, confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample-related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design and sample handling and pre-processing are highly desired. Characterization of the range of variation among sample collection methods is necessary to prevent misinterpretation of results and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide and serve for the standardization of protocols and reduction of undesirable variation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Patient Characterization Protocols for Psychophysiological Studies of Traumatic Brain Injury and Post-TBI Psychiatric Disorders

    DTIC Science & Technology

    2013-07-22

    injury and psychiatric disorders are receiving increased research attention, and ERP technologies are making contributions to this effort. This review has... technology like ERPs. They yield information not evident in RT and allow us to fractionate the stimulus input-response output process with greater...Homaifar et al. (98), who, writing specifically about depression following TBI, recommended that multiple means of assessment should be used when diagnosing

  3. PERFECTED enhanced recovery (PERFECT-ER) care versus standard acute care for patients admitted to acute settings with hip fracture identified as experiencing confusion: study protocol for a feasibility cluster randomized controlled trial.

    PubMed

    Hammond, Simon P; Cross, Jane L; Shepstone, Lee; Backhouse, Tamara; Henderson, Catherine; Poland, Fiona; Sims, Erika; MacLullich, Alasdair; Penhale, Bridget; Howard, Robert; Lambert, Nigel; Varley, Anna; Smith, Toby O; Sahota, Opinder; Donell, Simon; Patel, Martyn; Ballard, Clive; Young, John; Knapp, Martin; Jackson, Stephen; Waring, Justin; Leavey, Nick; Howard, Gregory; Fox, Chris

    2017-12-04

    Health and social care provision for an ageing population is a global priority. Provision for those with dementia and hip fracture has specific and growing importance. Older people who break their hip are recognised as exceptionally vulnerable to experiencing confusion (including but not exclusively, dementia and/or delirium and/or cognitive impairment(s)) before, during or after acute admissions. Older people experiencing hip fracture and confusion risk serious complications, linked to delayed recovery and higher mortality post-operatively. Specific care pathways acknowledging the differences in patient presentation and care needs are proposed to improve clinical and process outcomes. This protocol describes a multi-centre, feasibility, cluster-randomised, controlled trial (CRCT) to be undertaken across ten National Health Service hospital trusts in the UK. The trial will explore the feasibility of undertaking a CRCT comparing the multicomponent PERFECTED enhanced recovery intervention (PERFECT-ER), which acknowledges the differences in care needs of confused older patients experiencing hip fracture, with standard care. The trial will also have an integrated process evaluation to explore how PERFECT-ER is implemented and interacts with the local context. The study will recruit 400 hip fracture patients identified as experiencing confusion and will also recruit "suitable informants" (individuals in regular contact with participants who will complete proxy measures). We will also recruit NHS professionals for the process evaluation. This mixed methods design will produce data to inform a definitive evaluation of the intervention via a large-scale pragmatic randomised controlled trial (RCT). 
The trial will provide a preliminary estimate of the potential efficacy of PERFECT-ER versus standard care; assess service delivery variation; inform primary and secondary outcome selection; generate estimates of recruitment and retention rates, data collection difficulties, and completeness of outcome data; and provide an indication of potential economic benefits. The process evaluation will enhance knowledge of implementation delivery and receipt. ISRCTN, 99336264. Registered on 5 September 2016.

  4. Active Computer Network Defense: An Assessment

    DTIC Science & Technology

    2001-04-01

    sufficient base of knowledge in information technology can be assumed to be working on some form of computer network warfare, even if only defensive in...the Defense Information Infrastructure (DII) to attack. Transmission Control Protocol/ Internet Protocol (TCP/IP) networks are inherently resistant to...aims to create this part of information superiority, and computer network defense is one of its fundamental components. Most of these efforts center

  5. Accelerating Coagulation in Traumatic Injuries Using Inorganic Polyphosphate-Coated Silica Nanoparticles in a Swine (Sus scrofa) Model

    DTIC Science & Technology

    2018-03-13

    all information . Use additional pages if necessary.) PROTOCOL #: FDG20160012A DATE: 13 March 2018 PROTOCOL TITLE: Accelerating Coagulation...Investigator Attachments: Attachment 1: Defense Technical Information Center (DTIC) Abstract Submission (Mandatory) 4 FDG20160012A...Attachment 1 Defense Technical Information Center (DTIC) Abstract Submission This abstract requires a brief (no more than 200 words) factual summary of the

  6. Determining the Cardiovascular Effect of Partial versus Complete REBOA in a Porcine (Sus scrofa) Model of Hemorrhagic Shock.

    DTIC Science & Technology

    2018-03-09

    all information . Use additional pages if necessary.) PROTOCOL #: FDG20170005A DATE: 9 March 2018 PROTOCOL TITLE: Determining...Investigator Attachments: Attachment 1: Defense Technical Information Center (DTIC) Abstract Submission (Mandatory) 4 FDG20170005A...Attachment 1 Defense Technical Information Center (DTIC) Abstract Submission This abstract requires a brief (no more than 200 words) factual summary of the

  7. Guideline for Performing Systematic Approach to Evaluate and Qualify Legacy Documents that Support Advanced Reactor Technology Activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Honma, George

The establishment of a systematic process for the evaluation of historic technology information for use in advanced reactor licensing is described. Efforts are underway to recover and preserve Experimental Breeder Reactor II and Fast Flux Test Facility historical data. These efforts have generally emphasized preserving information from data-acquisition systems and hard-copy reports and entering it into modern electronic formats suitable for data retrieval and examination. The guidance contained in this document has been developed to facilitate consistent and systematic evaluation processes relating to quality attributes of historic technical information (with focus on sodium-cooled fast reactor (SFR) technology) that will be used to eventually support licensing of advanced reactor designs. The historical information may include, but is not limited to, design documents for SFRs, research-and-development (R&D) data and associated documents, test plans and associated protocols, operations and test data, international research data, technical reports, and information associated with past U.S. Nuclear Regulatory Commission (NRC) reviews of SFR designs. The evaluation process is prescribed in terms of SFR technology, but the process can be used to evaluate historical information for any type of advanced reactor technology. An appendix provides a discussion of typical issues that should be considered when evaluating and qualifying historical information for advanced reactor technology fuel and source terms, based on current light water reactor (LWR) requirements and recent experience gained from Next Generation Nuclear Plant (NGNP).

  8. 78 FR 40815 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-08

    ... using the NASDAQ Information Exchange (``QIX'') protocol,\\7\\ (ii) Financial Information Exchange (``FIX'') trading ports,\\8\\ and (iii) ports using other trading telecommunications protocols.\\9\\ Beginning July 1... because market participants may readily adjust their order routing practices, NASDAQ believes that the...

  9. 76 FR 65721 - Agency Information Collection Activities; Submission to OMB for Review and Approval; Comment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-24

    ... Stratospheric Ozone Protection regulations, the science of ozone layer depletion, and related topics... Layer (Protocol) and the CAA. Entities applying for this exemption are asked to submit to EPA... Substances that Deplete the Ozone Layer (Protocol). The information collection request is required to obtain...

  10. Nonbibliographic Applications of Z39.50.

    ERIC Educational Resources Information Center

    Kunze, John A.

    1992-01-01

    Describes the use of the Z39.50 information retrieval protocol as the basis for Infocal, a read-only, client/server-based campus information system. Technical considerations in adapting the protocol to nonbibliographic data, including semantic modules, dynamic attribute sets, and dynamic record syntax, are described in detail. (Contains 11…

  11. Managed traffic evacuation using distributed sensor processing

    NASA Astrophysics Data System (ADS)

    Ramuhalli, Pradeep; Biswas, Subir

    2005-05-01

    This paper presents an integrated sensor network and distributed event processing architecture for managed in-building traffic evacuation during natural and human-caused disasters, including earthquakes, fire and biological/chemical terrorist attacks. The proposed wireless sensor network protocols and distributed event processing mechanisms offer a new distributed paradigm for improving reliability in building evacuation and disaster management. The networking component of the system is constructed using distributed wireless sensors for measuring environmental parameters such as temperature, humidity, and detecting unusual events such as smoke, structural failures, vibration, biological/chemical or nuclear agents. Distributed event processing algorithms will be executed by these sensor nodes to detect the propagation pattern of the disaster and to measure the concentration and activity of human traffic in different parts of the building. Based on this information, dynamic evacuation decisions are taken for maximizing the evacuation speed and minimizing unwanted incidents such as human exposure to harmful agents and stampedes near exits. A set of audio-visual indicators and actuators are used for aiding the automated evacuation process. In this paper we develop integrated protocols, algorithms and their simulation models for the proposed sensor networking and the distributed event processing framework. Also, efficient harnessing of the individually low, but collectively massive, processing abilities of the sensor nodes is a powerful concept behind our proposed distributed event processing algorithms. Results obtained through simulation in this paper are used for a detailed characterization of the proposed evacuation management system and its associated algorithmic components.

  12. Network Security via Biometric Recognition of Patterns of Gene Expression

    NASA Technical Reports Server (NTRS)

    Shaw, Harry C.

    2016-01-01

    Molecular biology provides the ability to implement forms of information and network security completely outside the bounds of legacy security protocols and algorithms. This paper addresses an approach which instantiates the power of gene expression for security. Molecular biology provides a rich source of gene expression and regulation mechanisms, which can be adopted to use in the information and electronic communication domains. Conventional security protocols are becoming increasingly vulnerable due to more intensive, highly capable attacks on the underlying mathematics of cryptography. Security protocols are being undermined by social engineering and substandard implementations by IT (Information Technology) organizations. Molecular biology can provide countermeasures to these weak points with the current security approaches. Future advances in instruments for analyzing assays will also enable this protocol to advance from one of cryptographic algorithms to an integrated system of cryptographic algorithms and real-time assays of gene expression products.

  13. Framework for managing mycotoxin risks in the food industry.

    PubMed

    Baker, Robert C; Ford, Randall M; Helander, Mary E; Marecki, Janusz; Natarajan, Ramesh; Ray, Bonnie

    2014-12-01

    We propose a methodological framework for managing mycotoxin risks in the food processing industry. Mycotoxin contamination is a well-known threat to public health that has economic significance for the food processing industry; it is imperative to address mycotoxin risks holistically, at all points in the procurement, processing, and distribution pipeline, by tracking the relevant data, adopting best practices, and providing suitable adaptive controls. The proposed framework includes (i) an information and data repository, (ii) a collaborative infrastructure with analysis and simulation tools, (iii) standardized testing and acceptance sampling procedures, and (iv) processes that link the risk assessments and testing results to the sourcing, production, and product release steps. The implementation of suitable acceptance sampling protocols for mycotoxin testing is considered in some detail.
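The acceptance sampling step in the framework above turns on the operating characteristic of the chosen plan. A minimal sketch of how the acceptance probability of an attributes sampling plan can be computed (the plan parameters here are invented for illustration, not taken from the paper):

```python
from math import comb

def prob_accept(n, c, p):
    """Operating-characteristic value of an (n, c) attributes sampling plan:
    probability that at most c of n sampled units test positive when the
    true contamination rate is p (binomial model)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# Hypothetical plan: sample 10 lots, accept the shipment if at most 1 lot
# tests above the mycotoxin limit, assuming a 5% true contamination rate.
print(round(prob_accept(10, 1, 0.05), 3))  # about 0.914
```

Sweeping `p` over a range yields the full OC curve, which is how such plans are compared when linking testing results to product release decisions.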

  14. Wireless Sensor Node for Autonomous Monitoring and Alerts in Remote Environments

    NASA Technical Reports Server (NTRS)

    Panangadan, Anand V. (Inventor); Monacos, Steve P. (Inventor)

    2015-01-01

A method, apparatus, system, and computer program product provide personal alert and tracking capabilities using one or more nodes. Each node includes radio transceiver chips operating at different frequency ranges, a power amplifier, sensors, a display, and embedded software. The chips enable the node to operate as either a mobile sensor node or a relay base station node while providing a long-distance relay link between nodes. The power amplifier enables line-of-sight communication between nodes. The sensors provide GPS, temperature, and accelerometer information (used to trigger an alert condition). The embedded software captures and processes the sensor information, provides a multi-hop packet routing protocol to relay the sensor information to, and receive alert information from, a command center, and displays the alert information on the display.

  15. An Ontology for Identifying Cyber Intrusion Induced Faults in Process Control Systems

    NASA Astrophysics Data System (ADS)

    Hieb, Jeffrey; Graham, James; Guan, Jian

    This paper presents an ontological framework that permits formal representations of process control systems, including elements of the process being controlled and the control system itself. A fault diagnosis algorithm based on the ontological model is also presented. The algorithm can identify traditional process elements as well as control system elements (e.g., IP network and SCADA protocol) as fault sources. When these elements are identified as a likely fault source, the possibility exists that the process fault is induced by a cyber intrusion. A laboratory-scale distillation column is used to illustrate the model and the algorithm. Coupled with a well-defined statistical process model, this fault diagnosis approach provides cyber security enhanced fault diagnosis information to plant operators and can help identify that a cyber attack is underway before a major process failure is experienced.

  16. [Computerized clinical protocol for occlusion].

    PubMed

    Salsench, J; Ferrer, J; Nogueras, J

    1988-11-01

In making a protocol, it is necessary that all members of the team who will collect information share the same criteria for the different variables that compose it. Drawing up this document is as important as, or more important than, the protocol itself. In this work we present all the data collected in the protocol and explain each concept.

  17. Application Level Protocol Development for Library and Information Science Applications. Volume 1: Service Definition. Volume 2: Protocol Specification. Report No. TG.1.5; TG.50.

    ERIC Educational Resources Information Center

    Aagaard, James S.; And Others

    This two-volume document specifies a protocol that was developed using the Reference Model for Open Systems Interconnection (OSI), which provides a framework for communications within a heterogeneous network environment. The protocol implements the features necessary for bibliographic searching, record maintenance, and mail transfer between…

  18. Asynchronous Message Service Reference Implementation

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott C.

    2011-01-01

This software provides a library of middleware functions with a simple application programming interface, enabling implementation of distributed applications in conformance with the CCSDS AMS (Consultative Committee for Space Data Systems Asynchronous Message Service) specification. The AMS service, and its protocols, implement an architectural concept under which the modules of mission systems may be designed as if they were to operate in isolation, each one producing and consuming mission information without explicit awareness of which other modules are currently operating. Communication relationships among such modules are self-configuring; this tends to minimize complexity in the development and operations of modular data systems. A system built on this model is a society of generally autonomous, inter-operating modules that may fluctuate freely over time in response to changing mission objectives, functional upgrades to modules, and recovery from individual module failure. The purpose of AMS, then, is to reduce mission cost and risk by providing standard, reusable infrastructure for the exchange of information among data system modules in a manner that is simple to use, highly automated, flexible, robust, scalable, and efficient. The implementation is designed to spawn multiple threads of AMS functionality under the control of an AMS application program. These threads enable all members of an AMS-based, distributed application to discover one another in real time, subscribe to messages on specific topics, and publish messages on specific topics. The query/reply (client/server) communication model is also supported. Message exchange is optionally subject to encryption (to support confidentiality) and authorization. Fault tolerance measures in the discovery protocol minimize the likelihood of overall application failure due to any single operational error anywhere in the system.
The multi-threaded design simplifies processing while enabling application nodes to operate at high speeds; linked lists protected by mutex semaphores and condition variables are used for efficient, inter-thread communication. Applications may use a variety of transport protocols underlying AMS itself, including TCP (Transmission Control Protocol), UDP (User Datagram Protocol), and message queues.
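The inter-thread communication pattern described here, message lists guarded by a mutex and a condition variable, can be sketched as follows. This is an illustrative Python analogue of the pattern only, not the AMS reference implementation:

```python
import threading
from collections import deque

class MessageQueue:
    """Producer/consumer queue guarded by a mutex and condition variable,
    analogous to the mutex-protected linked lists described for AMS."""

    def __init__(self):
        self._items = deque()
        self._lock = threading.Lock()
        self._not_empty = threading.Condition(self._lock)

    def publish(self, msg):
        with self._not_empty:           # acquire the mutex
            self._items.append(msg)
            self._not_empty.notify()    # wake one waiting consumer

    def consume(self):
        with self._not_empty:
            while not self._items:      # loop guards against spurious wakeups
                self._not_empty.wait()
            return self._items.popleft()

# One consumer thread blocks until a publisher hands it a message.
q = MessageQueue()
results = []
consumer = threading.Thread(target=lambda: results.append(q.consume()))
consumer.start()
q.publish("telemetry/topic-1")          # hypothetical topic name
consumer.join()
print(results)  # -> ['telemetry/topic-1']
```

The wait-in-a-loop idiom is what lets many consumer threads share one queue safely; the mutex serializes list access while the condition variable avoids busy-waiting.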

  19. A Taxonomy of Attacks on the DNP3 Protocol

    NASA Astrophysics Data System (ADS)

    East, Samuel; Butts, Jonathan; Papa, Mauricio; Shenoi, Sujeet

Distributed Network Protocol (DNP3) is the predominant SCADA protocol in the energy sector - more than 75% of North American electric utilities currently use DNP3 for industrial control applications. This paper presents a taxonomy of attacks on the protocol. The attacks are classified based on targets (control center, outstation devices and network/communication paths) and threat categories (interception, interruption, modification and fabrication). To facilitate risk analysis and mitigation strategies, the attacks are associated with the specific DNP3 protocol layers they exploit. Also, the operational impact of the attacks is categorized in terms of three key SCADA objectives: process confidentiality, process awareness and process control. The attack taxonomy clarifies the nature and scope of the threats to DNP3 systems, and can provide insights into the relative costs and benefits of implementing mitigation strategies.
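The taxonomy's two classification axes (target and threat category) lend themselves to a simple tabular encoding. A hedged sketch: the axis labels are taken from the abstract, but the example attack and the helper function are invented for illustration, not reproduced from the paper:

```python
# Axis labels from the abstract; any (target, threat) pair names one cell
# of the two-dimensional taxonomy.
TARGETS = ("control center", "outstation", "network/communication path")
THREATS = ("interception", "interruption", "modification", "fabrication")

def classify(target, threat):
    """Validate an attack classification against the taxonomy axes."""
    if target not in TARGETS or threat not in THREATS:
        raise ValueError("unknown taxonomy cell")
    return (target, threat)

# e.g. a spoofed response injected on the wire would be classified as:
print(classify("network/communication path", "fabrication"))
```

Associating each cell with the DNP3 layer it exploits and the SCADA objective it degrades (confidentiality, awareness, control) would extend this to the full three-part scheme the paper describes.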

  20. Processing Protocol for Soil Samples Potentially ...

    EPA Pesticide Factsheets

Method Operating Procedures. This protocol describes the processing steps for 45 g and 9 g soil samples potentially contaminated with Bacillus anthracis spores. The protocol is designed to separate and concentrate the spores from bulk soil down to a pellet that can be used for further analysis. Soil extraction solution and mechanical shaking are used to disrupt soil particle aggregates and to aid in the separation of spores from soil particles. Soil samples are washed twice with soil extraction solution to maximize recovery. Differential centrifugation is used to separate spores from the majority of the soil material. The 45 g protocol has been demonstrated by two laboratories using both loamy and sandy soil types. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol would be robust enough to use at multiple laboratories while achieving comparable recoveries. The 45 g protocol has demonstrated a matrix limit of detection at 14 spores/gram of soil for loamy and sandy soils.

  1. Processing protocol for soil samples potentially contaminated with Bacillus anthracis spores [HS7.52.02 - 514

    USGS Publications Warehouse

    Silvestri, Erin E.; Griffin, Dale W.

    2017-01-01

    This protocol describes the processing steps for 45 g and 9 g soil samples potentially contaminated with Bacillus anthracis spores. The protocol is designed to separate and concentrate the spores from bulk soil down to a pellet that can be used for further analysis. Soil extraction solution and mechanical shaking are used to disrupt soil particle aggregates and to aid in the separation of spores from soil particles. Soil samples are washed twice with soil extraction solution to maximize recovery. Differential centrifugation is used to separate spores from the majority of the soil material. The 45 g protocol has been demonstrated by two laboratories using both loamy and sandy soil types. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol would be robust enough to use at multiple laboratories while achieving comparable recoveries. The 45 g protocol has demonstrated a matrix limit of detection at 14 spores/gram of soil for loamy and sandy soils.

  2. A Secure Three-Factor User Authentication and Key Agreement Protocol for TMIS With User Anonymity.

    PubMed

    Amin, Ruhul; Biswas, G P

    2015-08-01

Telecare medical information systems (TMIS) make an efficient and convenient connection between patients/users and doctors over the insecure internet. Therefore, data security, privacy and user authentication are enormously important for accessing sensitive medical data over insecure communication channels. Recently, many user authentication protocols for TMIS have been proposed in the literature, and it has been observed that most of them cannot satisfy the complete set of security requirements. In this paper, we scrutinize two remote user authentication protocols using smart cards (Mishra et al., Xu et al.) and show that both suffer from several security weaknesses. We then present a three-factor user authentication and key agreement protocol for TMIS that fixes the security pitfalls of the above-mentioned schemes. Informal cryptanalysis confirms that the proposed protocol offers strong protection against the relevant security attacks. Furthermore, the AVISPA simulation tool confirms that the protocol is secure against active and passive attacks, including replay and man-in-the-middle attacks. The security functionality and performance comparison analysis confirms that our protocol not only provides strong protection against security attacks but also achieves better complexities, along with an efficient login and password change phase and a session key verification property.
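Session key verification of the kind the abstract mentions is commonly built from a challenge-response over the freshly agreed key. A minimal illustrative fragment only; the paper's actual protocol additionally involves smart cards, passwords and biometrics (the three factors), none of which are modeled here, and all names below are hypothetical:

```python
import hashlib
import hmac
import os

def prove(session_key: bytes, challenge: bytes) -> bytes:
    """Return an HMAC-SHA256 tag over the peer's fresh challenge,
    proving possession of the session key without revealing it."""
    return hmac.new(session_key, challenge, hashlib.sha256).digest()

key = os.urandom(32)           # session key from the key agreement phase
server_nonce = os.urandom(16)  # fresh challenge; freshness resists replay
tag = prove(key, server_nonce)

# The verifier recomputes the tag and compares in constant time,
# avoiding timing side channels on the comparison itself.
assert hmac.compare_digest(tag, prove(key, server_nonce))
print("session key verification succeeded")
```

Because the challenge is a fresh random nonce, a recorded tag from an earlier session cannot be replayed, which is the property the AVISPA analysis in the abstract is checking for.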

  3. Reliable communication in the presence of failures

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Joseph, Thomas A.

    1987-01-01

The design and correctness of a communication facility for a distributed computer system are reported on. The facility provides support for fault-tolerant process groups in the form of a family of reliable multicast protocols that can be used in both local- and wide-area networks. These protocols attain high levels of concurrency, while respecting application-specific delivery ordering constraints, and have varying cost and performance that depend on the degree of ordering desired. In particular, a protocol that enforces causal delivery orderings is introduced and shown to be a valuable alternative to conventional asynchronous communication protocols. The facility also ensures that the processes belonging to a fault-tolerant process group will observe consistent orderings of events affecting the group as a whole, including process failures, recoveries, migration, and dynamic changes to group properties like member rankings. A review of several uses of the protocols in the ISIS system, which supports fault-tolerant resilient objects and bulletin boards, illustrates the significant simplification of higher-level algorithms made possible by this approach.
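Causal delivery ordering of the kind this protocol enforces is commonly implemented with vector clocks: a message is deliverable only once every message that causally precedes it has been delivered. A minimal sketch of that deliverability test (illustrative only; the paper's actual protocols are not reproduced here):

```python
def deliverable(msg_clock, sender, local_clock):
    """Vector-clock causal delivery test.
    msg_clock / local_clock: dicts mapping process id -> events seen.
    A message from `sender` is deliverable when it is the next event
    from the sender and depends on nothing we have not yet delivered."""
    for p, count in msg_clock.items():
        seen = local_clock.get(p, 0)
        expected = seen + 1 if p == sender else seen
        if count > expected:
            return False  # a causally earlier message is still missing
    return True

local = {"A": 1, "B": 0}  # this process has delivered 1 event from A
# B's message claims to have seen 2 events from A -> buffer it for now:
print(deliverable({"A": 2, "B": 1}, "B", local))  # False
# B's message depends only on what we already delivered -> deliver it:
print(deliverable({"A": 1, "B": 1}, "B", local))  # True
```

Messages that fail the test are buffered and re-checked as earlier messages arrive, which is what lets causal ordering remain cheaper than total ordering.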

  4. Systematic Evaluation of the Patient-Reported Outcome (PRO) Content of Clinical Trial Protocols

    PubMed Central

    Kyte, Derek; Duffy, Helen; Fletcher, Benjamin; Gheorghe, Adrian; Mercieca-Bebber, Rebecca; King, Madeleine; Draper, Heather; Ives, Jonathan; Brundage, Michael; Blazeby, Jane; Calvert, Melanie

    2014-01-01

    Background Qualitative evidence suggests patient-reported outcome (PRO) information is frequently absent from clinical trial protocols, potentially leading to inconsistent PRO data collection and risking bias. Direct evidence regarding PRO trial protocol content is lacking. The aim of this study was to systematically evaluate the PRO-specific content of UK National Institute for Health Research (NIHR) Health Technology Assessment (HTA) programme trial protocols. Methods and Findings We conducted an electronic search of the NIHR HTA programme database (inception to August 2013) for protocols describing a randomised controlled trial including a primary/secondary PRO. Two investigators independently reviewed the content of each protocol, using a specially constructed PRO-specific protocol checklist, alongside the ‘Standard Protocol Items: Recommendations for Interventional Trials’ (SPIRIT) checklist. Disagreements were resolved through discussion with a third investigator. 75 trial protocols were included in the analysis. Protocols included a mean of 32/51 (63%) SPIRIT recommendations (range 16–41, SD 5.62) and 11/33 (33%) PRO-specific items (range 4–18, SD 3.56). Over half (61%) of the PRO items were incomplete. Protocols containing a primary PRO included slightly more PRO checklist items (mean 14/33 (43%)). PRO protocol content was not associated with general protocol completeness; thus, protocols judged as relatively ‘complete’ using SPIRIT were still likely to have omitted a large proportion of PRO checklist items. Conclusions The PRO components of HTA clinical trial protocols require improvement. Information on the PRO rationale/hypothesis, data collection methods, training and management was often absent. This low compliance is unsurprising; evidence shows existing PRO guidance for protocol developers remains difficult to access and lacks consistency. 
Study findings suggest there are a number of PRO protocol checklist items that are not fully addressed by the current SPIRIT statement. We therefore advocate the development of consensus-based supplementary guidelines, aimed at improving the completeness and quality of PRO content in clinical trial protocols. PMID:25333349

  5. Secure Display of Space-Exploration Images

    NASA Technical Reports Server (NTRS)

    Cheng, Cecilia; Thornhill, Gillian; McAuley, Michael

    2006-01-01

    Java EDR Display Interface (JEDI) is software for either local display or secure Internet distribution, to authorized clients, of image data acquired from cameras aboard spacecraft engaged in exploration of remote planets. ( EDR signifies experimental data record, which, in effect, signifies image data.) Processed at NASA s Multimission Image Processing Laboratory (MIPL), the data can be from either near-realtime processing streams or stored files. JEDI uses the Java Advanced Imaging application program interface, plus input/output packages that are parts of the Video Image Communication and Retrieval software of the MIPL, to display images. JEDI can be run as either a standalone application program or within a Web browser as a servlet with an applet front end. In either operating mode, JEDI communicates using the HTTP(s) protocol(s). In the Web-browser case, the user must provide a password to gain access. For each user and/or image data type, there is a configuration file, called a "personality file," containing parameters that control the layout of the displays and the information to be included in them. Once JEDI has accepted the user s password, it processes the requested EDR (provided that user is authorized to receive the specific EDR) to create a display according to the user s personality file.

  6. Implementation and audit of 'Fast-Track Surgery' in gynaecological oncology surgery.

    PubMed

    Sidhu, Verinder S; Lancaster, Letitia; Elliott, David; Brand, Alison H

    2012-08-01

    Fast-track surgery is a multidisciplinary approach to surgery that results in faster recovery from surgery and decreased length of stay (LOS). The aims of this study were as follows: (i) to report on the processes required for the introduction of fast-track surgery to a gynaecological oncology unit and (ii) to report the results of a clinical audit conducted after the protocol's implementation. A fast-track protocol, specific to our unit, was developed after a series of multidisciplinary meetings. The protocol, agreed upon by those involved in the care of women in our unit, was then introduced into clinical practice. An audit was conducted of all women undergoing laparotomy, with known or suspected malignancy. Information on LOS, complication and readmission rates was collected. Descriptive statistics and Poisson regression were used for statistical analysis. The developed protocol involved a multidisciplinary approach to pre-, intra- and postoperative care. The audit included 104 consecutive women over a 6-month period, who were followed for 6 weeks postoperatively. The median LOS was 4 days. The readmission rate was 7% and the complication rate was 19% (1% intraoperative, 4% major and 14% minor). Multivariate analysis revealed that increased duration of surgery and increasing age were predictors of longer LOS. The development of a fast-track protocol is achievable in a gynaecological oncology unit, with input from a multidisciplinary team. Effective implementation of the protocol can result in a short LOS, with acceptable complication and readmission rates when applied non-selectively to gynaecological oncology patients. © 2012 The Authors ANZJOG © 2012 The Royal Australian and New Zealand College of Obstetricians and Gynaecologists.

  7. Schizophrenia research participants' responses to protocol safeguards: recruitment, consent, and debriefing.

    PubMed

    Roberts, Laura Weiss; Warner, Teddy D; Anderson, Charles T; Smithpeter, Megan V; Rogers, Melinda K

    2004-04-01

    To examine the perspectives and preferences regarding ethically important aspects of recruitment, consent, and debriefing of people with schizophrenia who volunteered for research protocols. A structured interview to assess research-related views of people with schizophrenia was developed and piloted. Data collection occurred at three sites. For this analysis, we examined the subset of responses from schizophrenia patients currently enrolled in a protocol. Data from 28 schizophrenia research volunteers were analyzed. Of these, 22 were men and 11 were voluntary inpatients. Most (n=23) recalled speaking with someone before enrolling in the protocol, and most (n=26) reported trusting the person who told them about it. Participants reported a moderate understanding of their protocols. All but one person (n=27) remembered signing a consent form. Twenty-one volunteers indicated that consent forms are meant to help both the patient and the researcher. Most (n=23) reported making the enrollment decision alone, with 22 making this decision prior to reviewing the consent form. The decision was described as relatively easy. Respondents felt some pressure to enroll, with women experiencing more pressure. Debriefing practices were strongly endorsed by participants. All 28 of the volunteers wished to be informed if a health problem (i.e., "something wrong") was discovered during the protocol. The persons living with schizophrenia who were interviewed for this project expressed interesting perspectives and preferences regarding ethically important aspects of recruitment, consent, and debriefing in clinical research that may help guide efforts to make research processes more attuned to participants and merit further inquiry.

  8. [Change of care model in natural childbirth: Implementation in La Ribera delivery room].

    PubMed

    Camacho-Morell, F; Romero-Martín, M J

To assess knowledge of, desire for inclusion in, and implementation of normal childbirth care protocols at La Ribera University Hospital; to identify the reasons why protocols are not applied; and to assess attendance at antepartum training activities. Cross-sectional descriptive study. A total of 186 surveys were carried out by convenience sampling among pregnant women attending fetal well-being monitoring at the hospital between 2014 and 2015. Data were collected on knowledge, desire for inclusion, protocol compliance and reasons for non-compliance, and attendance at antepartum training activities. Percentages and confidence intervals were calculated, and the chi-square test was used to compare categorical variables. Knowledge was 77% (95% CI: 75.5-78.5) and desire for inclusion 84.6% (95% CI: 82.5-86.7). Protocol compliance ranged from 6% (nitrous oxide administration) to 91% (skin-to-skin contact). The main reasons for non-compliance were circumstances of the childbirth process (56.3%, 95% CI: 51.1-61.5). Attendance at maternal education classes was 62%, mainly among primiparous women (p=0.0001) with a medium or high education level (p=0.001). Pregnant women have high knowledge of, and desire for inclusion in, normal childbirth care protocols. Attendance at antepartum training activities could be improved; the main reason for non-attendance is lack of information. Compliance is good in most protocols; when they are not applied, it is due to childbirth circumstances. Remaining tasks include the introduction of additional protocols and involving pregnant women in decision-making. Copyright © 2017 SECA. Published by Elsevier España, S.L.U. All rights reserved.

  9. From policy to practice: implementing frontline community health services for substance dependence--study protocol.

    PubMed

    Gill, Kathryn J; Campbell, Emily; Gauthier, Gail; Xenocostas, Spyridoula; Charney, Dara; Macaulay, Ann C

    2014-08-20

    Substance abuse is a worldwide public health concern. Extensive scientific research has shown that screening and brief interventions for substance use disorders administered in primary care provide substantial benefit at relatively low cost. Frontline health clinicians are well placed to detect and treat patients with substance use disorders. Despite the effectiveness shown in research, many factors affect the implementation of these practices in real-world clinical settings. Recently, the Ministry of Health and Social Services in Quebec, Canada, issued two policy documents aimed at introducing screening and early intervention for substance abuse into frontline healthcare clinics in Quebec. The current research protocol was developed to study the process of implementing evidence-based addiction treatment practices at three primary care clinics in Montreal (Phase 1). In addition, the research protocol was designed to examine the efficacy of overall policy implementation, including barriers and facilitators to addictions program development throughout Quebec (Phase 2). Phase 1 will provide an in-depth case study of knowledge translation and implementation. The study protocol will use an integrated knowledge translation strategy to build collaborative mechanisms for knowledge exchange between researchers, addiction specialists, and frontline practitioners (guided by the principles of participatory-action research), and directly examine the process of knowledge uptake and barriers to transfer using both qualitative and quantitative methodologies. Evaluation will involve multiple measures, time points and domains; program uptake and effectiveness will be determined by changes in healthcare service delivery, sustainability and outcomes. In Phase 2, qualitative methods will be used to examine the contextual facilitators and barriers that frontline organizations face in implementing services for substance dependence. Phase 2 will provide the first study exploring the wide-scale implementation of frontline services for substance dependence in the province of Quebec and yield needed information about how to effectively implement mandated policies into clinical practice and impact public health. Findings from this research program will contribute to the understanding of factors associated with the implementation of frontline services for substance dependence and help to inform future policy and organizational support for the implementation of evidence-based practices.

  10. Treatment algorithms and protocolized care.

    PubMed

    Morris, Alan H

    2003-06-01

    Excess information in complex ICU environments exceeds human decision-making limits and likely contributes to unnecessary variation in clinical care, increasing the likelihood of clinical errors. I reviewed recent critical care clinical trials, searching for information about the impact of protocol use on clinically pertinent outcomes. Several recently published clinical trials illustrate the importance of distinguishing efficacy and effectiveness trials. One of these trials illustrates the danger of conducting effectiveness trials before the efficacy of an intervention is established. The trials also illustrate the importance of distinguishing guidelines and inadequately explicit protocols from adequately explicit protocols. Only adequately explicit protocols contain enough detail to lead different clinicians to the same decision when faced with the same clinical scenario. The differences between guidelines and protocols are important: guidelines lack detail and provide general guidance that requires clinicians to fill in many gaps. Computerized or paper-based protocols are detailed and, when used for complex clinical ICU problems, can generate patient-specific, evidence-based therapy instructions that can be carried out by different clinicians with almost no interclinician variability. Individualization of patient therapy can be preserved by these protocols when they are driven by individual patient data. Explicit decision-support tools (e.g., guidelines and protocols) have favorable effects on clinician and patient outcomes and can reduce variation in clinical practice. Guidelines and protocols that aid ICU decision makers should be more widely distributed.
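
    The defining property of an "adequately explicit" protocol — that different clinicians given the same patient data reach the same instruction — can be illustrated as a deterministic decision rule. The following sketch is purely illustrative; the thresholds and the FiO2 adjustment logic are invented for this example and are not clinical guidance or the article's protocol:

    ```python
    def fio2_instruction(spo2: float, current_fio2: float) -> str:
        """Toy 'adequately explicit' rule: the same inputs always yield
        the same patient-specific instruction, with no gaps left for the
        clinician to fill in. Thresholds are hypothetical."""
        if spo2 < 88:
            return f"increase FiO2 from {current_fio2:.2f} to {min(current_fio2 + 0.10, 1.0):.2f}"
        if spo2 > 95 and current_fio2 > 0.40:
            return f"decrease FiO2 from {current_fio2:.2f} to {current_fio2 - 0.05:.2f}"
        return f"maintain FiO2 at {current_fio2:.2f}"
    ```

    A guideline, by contrast, might say only "titrate FiO2 to keep saturation acceptable", leaving the decision to interclinician judgment.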

  11. VehiHealth: An Emergency Routing Protocol for Vehicular Ad Hoc Network to Support Healthcare System.

    PubMed

    Bhoi, S K; Khilar, P M

    2016-03-01

    The survival of a patient can depend on effective data communication in a healthcare system. In this paper, an emergency routing protocol for Vehicular Ad hoc Networks (VANETs) is proposed to quickly forward current patient-status information from the ambulance to the hospital so that pre-medical treatment can be arranged. Because the ambulance takes time to reach the hospital, the ambulance doctor can provide emergency treatment to the patient by sending patient-status information to the hospital through nearby vehicles using vehicular communication. Experienced doctors at the hospital then respond by quickly sending treatment information back to the ambulance. In this protocol, data is forwarded along the path with the fewest link-breakage problems between vehicles. This is done by calculating an intersection value I_value for the neighboring intersections from current traffic information; data is then forwarded through the intersection with the minimum I_value. Simulation results show that VehiHealth performs better than the P-GEDIR, GyTAR, A-STAR and GSR routing protocols in terms of average end-to-end delay, number of link breakages, path length, and average response time.
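
    The abstract's routing rule — forward data through the neighboring intersection with the minimum I_value — can be sketched as follows. The way I_value is scored here (penalizing sparse traffic and long road segments, which raise the chance of link breakage) is an assumption for illustration; the paper derives its own formula from current traffic information:

    ```python
    def i_value(vehicle_density: float, segment_length: float) -> float:
        # Hypothetical score: sparser traffic and longer segments imply
        # a higher chance of link breakage, hence a larger (worse) I_value.
        return segment_length / max(vehicle_density, 1e-6)

    def next_intersection(neighbors: dict) -> str:
        # neighbors maps intersection id -> (vehicle_density, segment_length).
        # Data is forwarded toward the neighbor with the minimum I_value.
        return min(neighbors, key=lambda n: i_value(*neighbors[n]))

    neighbors = {"I1": (12.0, 300.0), "I2": (3.0, 250.0), "I3": (9.0, 400.0)}
    ```

    With these example figures, the densely trafficked intersection "I1" scores lowest and is chosen as the next hop.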

  12. Optimization of a Sample Processing Protocol for Recovery of ...

    EPA Pesticide Factsheets

    Following a release of Bacillus anthracis spores into the environment, there is a potential for lasting environmental contamination in soils. There is a need for detection protocols for B. anthracis in environmental matrices. However, identification of B. anthracis within a soil is a difficult task. Processing soil samples helps to remove debris, chemical components, and biological impurities that can interfere with microbiological detection. This study aimed to optimize a previously used indirect processing protocol, which included a series of washing and centrifugation steps.

  13. Quantum Private Queries

    NASA Astrophysics Data System (ADS)

    Giovannetti, Vittorio; Lloyd, Seth; Maccone, Lorenzo

    2008-06-01

    We propose a cheat-sensitive quantum protocol for performing a private search on a classical database that is efficient in terms of communication complexity. It allows a user to retrieve an item from the database provider without revealing which item was retrieved: if the provider tries to obtain information on the query, the person querying the database can detect it. The protocol also ensures perfect data privacy for the database: the information that the user can retrieve in a single query is bounded and does not depend on the size of the database. With respect to the known (quantum and classical) strategies for private information retrieval, our protocol displays an exponential reduction in communication complexity and in running-time computational complexity.

  14. Single-photon continuous-variable quantum key distribution based on the energy-time uncertainty relation.

    PubMed

    Qi, Bing

    2006-09-15

    We propose a new quantum key distribution protocol in which information is encoded on continuous variables of a single photon. In this protocol, Alice randomly encodes her information on either the central frequency of a narrowband single-photon pulse or the time delay of a broadband single-photon pulse, while Bob randomly chooses to do either frequency measurement or time measurement. The security of this protocol rests on the energy-time uncertainty relation, which prevents Eve from simultaneously determining both frequency and time information with arbitrarily high resolution. Since no interferometer is employed in this scheme, it is more robust against various channel noises, such as polarization and phase fluctuations.
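
    A purely classical simulation of the protocol's sifting step (the structure and names below are a sketch, not the paper's code): Alice randomly encodes each key bit in either the frequency or the time degree of freedom, Bob randomly measures one of the two, and only rounds where the choices coincide contribute to the raw key, so roughly half the rounds survive:

    ```python
    import random

    def sift(n_rounds: int, seed: int = 0) -> list:
        """Simulate basis sifting: keep only rounds where Alice's encoding
        choice matches Bob's measurement choice (about half on average)."""
        rng = random.Random(seed)
        raw_key = []
        for _ in range(n_rounds):
            bit = rng.randint(0, 1)                 # Alice's key bit
            a_choice = rng.choice(["freq", "time"]) # Alice's encoding DOF
            b_choice = rng.choice(["freq", "time"]) # Bob's measurement DOF
            if a_choice == b_choice:                # conclusive round
                raw_key.append(bit)
        return raw_key

    key = sift(1000)
    ```

    The energy-time uncertainty argument in the abstract concerns what an eavesdropper can learn per photon; this sketch only shows the classical bookkeeping of which rounds are kept.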

  15. Lab on a Chip Application Development for Exploration

    NASA Technical Reports Server (NTRS)

    Monaco, Lisa

    2004-01-01

    At Marshall Space Flight Center, a new capability has been established to aid the advancement of microfluidics for spaceflight monitoring systems. The Lab-On-a-Chip Application Development (LOCAD) team has created a program for quickly and economically advancing Lab-On-a-Chip (LOC) applications from Technology Readiness Levels (TRL) 1 and 2 to TRL 6 and 7. Scientists and engineers can use LOCAD's process to learn about microfluidics efficiently and determine whether microfluidics is applicable to their needs. Once applicability has been determined, LOCAD can perform tests to develop new fluidic protocols, which differ from macro-scale chemical reaction protocols. With this information, new micro-devices can be created, such as a microfluidic system to aid in the search for life, past and present, on Mars. Particular indicators in the Martian soil can provide direct evidence of life, but extracting that information from the soil and presenting it to the proper detectors requires multiple fluidic/chemical operations. This is where LOCAD provides its unique abilities.

  16. Three-step semiquantum secure direct communication protocol

    NASA Astrophysics Data System (ADS)

    Zou, XiangFu; Qiu, DaoWen

    2014-09-01

    Quantum secure direct communication is the direct communication of secret messages without the need to first establish a shared secret key. In the existing schemes, quantum secure direct communication is possible only when both parties are quantum. In this paper, we construct a three-step semiquantum secure direct communication (SQSDC) protocol based on single-photon sources in which the sender Alice is classical. In a semiquantum protocol, a person is termed classical if he or she can measure, prepare and send quantum states only in the fixed orthogonal quantum basis {|0>, |1>}. The security of the proposed SQSDC protocol is guaranteed by the complete robustness of semiquantum key distribution protocols and the unconditional security of classical one-time pad encryption; the proposed SQSDC protocol is therefore also completely robust. Complete robustness means that nonzero information acquired by an eavesdropper Eve on the secret message implies a nonzero probability that the legitimate participants can find errors on the bits tested by the protocol. We also suggest a method to check for Eve's disturbance in the photons' returning phase, so that Alice does not need to publicly announce any positions or their coded bit values after the photon transmission is completed. Moreover, the proposed SQSDC protocol can be implemented with existing techniques. Compared with many quantum secure direct communication protocols, the proposed SQSDC protocol has two merits: first, the sender needs only classical capabilities; second, no additional classical information is needed to check for Eve's disturbance after the transmission of the quantum states.
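
    The classical ingredient whose unconditional security the scheme inherits is one-time pad encryption: XORing the message bits with a pad of equal length, where applying the same pad twice recovers the message. A minimal sketch (not the paper's code):

    ```python
    def otp_xor(bits: list, pad: list) -> list:
        # One-time pad over a bit list: XOR with a pad of the same length.
        # Encryption and decryption are the same operation.
        assert len(bits) == len(pad)
        return [b ^ p for b, p in zip(bits, pad)]

    message = [1, 0, 1, 1, 0]
    pad     = [0, 1, 1, 0, 1]   # must be random and used only once
    cipher  = otp_xor(message, pad)   # -> [1, 1, 0, 1, 1]
    ```

    The pad's single use and uniform randomness are what make the cipher information-theoretically secure; the semiquantum steps in the protocol serve to establish and test such shared randomness.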

  17. Developing Ubiquitous Sensor Network Platform Using Internet of Things: Application in Precision Agriculture.

    PubMed

    Ferrández-Pastor, Francisco Javier; García-Chamizo, Juan Manuel; Nieto-Hidalgo, Mario; Mora-Pascual, Jerónimo; Mora-Martínez, José

    2016-07-22

    The application of Information Technologies to Precision Agriculture methods has clear benefits. Precision Agriculture optimises production efficiency, increases quality, minimises environmental impact and reduces the use of resources (energy, water); however, several barriers have delayed its wide adoption. Among the main barriers are expensive equipment, the difficulty of operating and maintaining the systems, and sensor-network standards that are still under development. Nowadays, new technological developments in embedded devices (hardware and communication protocols), the evolution of Internet technologies (Internet of Things) and ubiquitous computing (Ubiquitous Sensor Networks) allow the development of less expensive systems that are easier to control, install and maintain, using standard protocols with low power consumption. This work develops and tests a low-cost sensor/actuator network platform, based on the Internet of Things, integrating machine-to-machine and human-machine-interface protocols. Edge computing uses this multi-protocol approach to develop control processes in Precision Agriculture scenarios. A greenhouse with hydroponic crop production was developed and tested using Ubiquitous Sensor Network monitoring and edge control under the Internet of Things paradigm. The experimental results showed that Internet technologies and Smart Object Communication Patterns can be combined to encourage the development of Precision Agriculture, and demonstrated added benefits (cost, energy, smart development, acceptance by agricultural specialists) when a project is launched.
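
    The kind of edge-control rule such a platform runs — sensor readings arriving over a machine-to-machine protocol, a local decision, an actuator command — can be sketched as a pure function. The topic names, setpoints, and thresholds below are invented for illustration; the paper's platform defines its own MQTT-style topics and control parameters:

    ```python
    def edge_control(readings: dict, moisture_setpoint: float = 30.0) -> dict:
        """Map sensor topics to actuator commands at an edge node.
        readings: topic -> latest value (e.g. soil moisture in %, temp in C)."""
        commands = {}
        moisture = readings.get("greenhouse/soil/moisture")
        if moisture is not None:
            # Irrigate while the soil moisture reading is below the setpoint.
            commands["greenhouse/actuator/irrigation"] = (
                "ON" if moisture < moisture_setpoint else "OFF")
        temp = readings.get("greenhouse/air/temperature")
        if temp is not None and temp > 35.0:
            commands["greenhouse/actuator/vent"] = "OPEN"
        return commands
    ```

    Keeping the decision a pure function of the latest readings lets the same logic run unchanged on an embedded edge device or be unit-tested off-line.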

  18. Developing Ubiquitous Sensor Network Platform Using Internet of Things: Application in Precision Agriculture

    PubMed Central

    Ferrández-Pastor, Francisco Javier; García-Chamizo, Juan Manuel; Nieto-Hidalgo, Mario; Mora-Pascual, Jerónimo; Mora-Martínez, José

    2016-01-01

    The application of Information Technologies to Precision Agriculture methods has clear benefits. Precision Agriculture optimises production efficiency, increases quality, minimises environmental impact and reduces the use of resources (energy, water); however, several barriers have delayed its wide adoption. Among the main barriers are expensive equipment, the difficulty of operating and maintaining the systems, and sensor-network standards that are still under development. Nowadays, new technological developments in embedded devices (hardware and communication protocols), the evolution of Internet technologies (Internet of Things) and ubiquitous computing (Ubiquitous Sensor Networks) allow the development of less expensive systems that are easier to control, install and maintain, using standard protocols with low power consumption. This work develops and tests a low-cost sensor/actuator network platform, based on the Internet of Things, integrating machine-to-machine and human-machine-interface protocols. Edge computing uses this multi-protocol approach to develop control processes in Precision Agriculture scenarios. A greenhouse with hydroponic crop production was developed and tested using Ubiquitous Sensor Network monitoring and edge control under the Internet of Things paradigm. The experimental results showed that Internet technologies and Smart Object Communication Patterns can be combined to encourage the development of Precision Agriculture, and demonstrated added benefits (cost, energy, smart development, acceptance by agricultural specialists) when a project is launched. PMID:27455265

  19. TrhOnt: building an ontology to assist rehabilitation processes.

    PubMed

    Berges, Idoia; Antón, David; Bermúdez, Jesús; Goñi, Alfredo; Illarramendi, Arantza

    2016-10-04

    One of the current research efforts in the area of biomedicine is the representation of knowledge in a structured way so that reasoning can be performed on it. More precisely, in the field of physiotherapy, information such as a patient's physiotherapy record or treatment protocols for specific disorders must be adequately modeled, because they play a relevant role in managing the patient's evolutionary recovery process. In this scenario, we introduce TRHONT, an application ontology that can assist physiotherapists in the management of patients' evolution via reasoning supported by semantic technology. The ontology was developed following the NeOn Methodology. It integrates knowledge from ontological resources (e.g. the FMA ontology) and non-ontological resources (e.g. a database of movements, exercises and treatment protocols) as well as additional physiotherapy-related knowledge. We demonstrate how the ontology fulfills its purpose of providing a reference model for representing the physiotherapy-related information needed throughout a patient's treatment, from the moment they first step into the physiotherapist's office until they are discharged. More specifically, we present results for each of the intended uses of the ontology listed in its requirements document, and show how TRHONT can answer the competency questions defined within that document. Moreover, we detail the main steps of the process followed to build the TRHONT ontology in order to facilitate its reproducibility in a similar context. Finally, we evaluate the ontology from different perspectives. TRHONT has achieved the purpose of allowing a reasoning process that changes over time according to the patient's state and performance.
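
    The idea of answering competency questions by querying structured physiotherapy knowledge can be illustrated with a toy triple store. The facts, predicate names, and queries below are invented for illustration; TRHONT itself is an OWL ontology built on resources such as FMA and answered with semantic reasoners:

    ```python
    # Toy triple store: (subject, predicate, object) facts about a
    # hypothetical treatment protocol. Not TRHONT's actual content.
    triples = {
        ("shoulder_protocol", "treats", "rotator_cuff_injury"),
        ("shoulder_protocol", "includes_exercise", "pendulum_swing"),
        ("shoulder_protocol", "includes_exercise", "wall_crawl"),
        ("knee_protocol", "treats", "acl_sprain"),
    }

    def ask(predicate: str, obj: str) -> set:
        # Competency-question style query: "which protocols treat X?"
        return {s for (s, p, o) in triples if p == predicate and o == obj}

    def exercises_of(protocol: str) -> set:
        # "Which exercises does this protocol include?"
        return {o for (s, p, o) in triples if s == protocol and p == "includes_exercise"}
    ```

    A real ontology adds class hierarchies and logical axioms on top of such facts, so a reasoner can also answer questions whose answers are only implied, not stated.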

  20. Superposing pure quantum states with partial prior information

    NASA Astrophysics Data System (ADS)

    Dogra, Shruti; Thomas, George; Ghosh, Sibasish; Suter, Dieter

    2018-05-01

    The principle of superposition is an intriguing feature of quantum mechanics, which is regularly exploited in many different circumstances. A recent work [M. Oszmaniec et al., Phys. Rev. Lett. 116, 110403 (2016), 10.1103/PhysRevLett.116.110403] shows that the fundamentals of quantum mechanics restrict the process of superimposing two unknown pure states, even though it is possible to superimpose two quantum states with partial prior knowledge. The prior knowledge imposes geometrical constraints on the choice of input states. We discuss an experimentally feasible protocol to superimpose multiple pure states of a d-dimensional quantum system and carry out an explicit experimental realization for two single-qubit pure states with partial prior information on a two-qubit NMR quantum information processor.
