SAVAH: Source Address Validation with Host Identity Protocol
NASA Astrophysics Data System (ADS)
Kuptsov, Dmitriy; Gurtov, Andrei
The explosive growth of the Internet and the lack of mechanisms for validating the authenticity of a packet's source have produced serious security and accounting issues. In this paper, we propose validating source addresses in a LAN using the Host Identity Protocol (HIP) deployed at the first-hop router. Compared to alternative solutions such as CGA, our approach is suitable for both IPv4 and IPv6. We have implemented SAVAH in Wi-Fi access points and evaluated its overhead for clients and the first-hop router.
Remote Patron Validation: Posting a Proxy Server at the Digital Doorway.
ERIC Educational Resources Information Center
Webster, Peter
2002-01-01
Discussion of remote access to library services focuses on proxy servers as a method for remote access, based on experiences at Saint Mary's University (Halifax). Topics include Internet protocol user validation; browser-directed proxies; server software proxies; vendor alternatives for validating remote users; and Internet security issues. (LRW)
NASA Technical Reports Server (NTRS)
Bingham, Gail; Bates, Scott; Bugbee, Bruce; Garland, Jay; Podolski, Igor; Levinskikh, Rita; Sychev, Vladimir; Gushin, Vadim
2009-01-01
Validating Vegetable Production Unit (VPU) Plants, Protocols, Procedures and Requirements (P3R) Using Currently Existing Flight Resources (Lada-VPU-P3R) is a study to advance the technology required for plant growth in microgravity and to research related food safety issues. Lada-VPU-P3R also investigates the non-nutritional value to the flight crew of developing plants on-orbit. The Lada-VPU-P3R uses the Lada hardware on the ISS and falls under a cooperative agreement between National Aeronautics and Space Administration (NASA) and the Russian Federal Space Association (FSA). Research Summary: Validating Vegetable Production Unit (VPU) Plants, Protocols, Procedures and Requirements (P3R) Using Currently Existing Flight Resources (Lada-VPU-P3R) will optimize hardware and
The reliability and validity of fatigue measures during multiple-sprint work: an issue revisited.
Glaister, Mark; Howatson, Glyn; Pattison, John R; McInnes, Gill
2008-09-01
The ability to repeatedly produce a high-power output or sprint speed is a key fitness component of most field and court sports. The aim of this study was to evaluate the validity and reliability of eight different approaches to quantify this parameter in tests of multiple-sprint performance. Ten physically active men completed two trials of each of two multiple-sprint running protocols with contrasting recovery periods. Protocol 1 consisted of 12 x 30-m sprints repeated every 35 seconds; protocol 2 consisted of 12 x 30-m sprints repeated every 65 seconds. All testing was performed in an indoor sports facility, and sprint times were recorded using twin-beam photocells. All but one of the formulae showed good construct validity, as evidenced by similar within-protocol fatigue scores. However, the assumptions on which many of the formulae were based, combined with poor or inconsistent test-retest reliability (coefficient of variation range: 0.8-145.7%; intraclass correlation coefficient range: 0.09-0.75), suggested many problems regarding logical validity. In line with previous research, the results support the percentage decrement calculation as the most valid and reliable method of quantifying fatigue in tests of multiple-sprint performance.
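A minimal sketch of the percentage decrement calculation endorsed above, assuming sprint times in seconds (lower is better); the formulation shown (100 × total time / ideal time − 100, with ideal time = fastest sprint × number of sprints) is one common variant and is given purely for illustration.

```python
def percentage_decrement(sprint_times):
    """Percentage decrement score for a multiple-sprint test.

    sprint_times: sprint times in seconds (lower = faster).
    The ideal total assumes the fastest sprint is repeated every time;
    the score is how far the actual total exceeds that ideal, in percent.
    """
    n = len(sprint_times)
    total = sum(sprint_times)
    ideal = min(sprint_times) * n
    return 100.0 * total / ideal - 100.0

# Example: 12 x 30-m sprints, slowing from 4.50 s to 5.05 s
times = [4.50 + 0.05 * i for i in range(12)]
print(f"Percentage decrement: {percentage_decrement(times):.1f}%")  # ~6.1%
```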
Standards for Environmental Measurement Using GIS: Toward a Protocol for Protocols.
Forsyth, Ann; Schmitz, Kathryn H; Oakes, Michael; Zimmerman, Jason; Koepp, Joel
2006-02-01
Interdisciplinary research regarding how the built environment influences physical activity has recently increased. Many research projects conducted jointly by public health and environmental design professionals are using geographic information systems (GIS) to objectively measure the built environment. Numerous methodological issues remain, however, and environmental measurements have not been well documented with accepted, common definitions of valid, reliable variables. This paper proposes how to create and document standardized definitions for measures of environmental variables using GIS with the ultimate goal of developing reliable, valid measures. Inherent problems with software and data that hamper environmental measurement can be offset by protocols combining clear conceptual bases with detailed measurement instructions. Examples demonstrate how protocols can more clearly translate concepts into specific measurement. This paper provides a model for developing protocols to allow high quality comparative research on relationships between the environment and physical activity and other outcomes of public health interest.
Validation of Survivability Validation Protocols
1993-05-01
simulation fidelity. Physical testing of P.i SOS, in either aboveground tests (AGTs) or underground tests (UGTs), will usually be impossible, due ... with some simulation fidelity compromises) are possible in UGTs and/or AGTs. Hence proof tests, if done in statistically significant numbers, can ... level. Simulation fidelity and AGT/UGT/threat correlation will be validation issues here. Extrapolation to threat environments will be done via modeling
Condron, Robin; Farrokh, Choreh; Jordan, Kieran; McClure, Peter; Ross, Tom; Cerf, Olivier
2015-01-02
Studies on the heat resistance of dairy pathogens are a vital part of assessing the safety of dairy products. However, harmonized methodology for the study of heat resistance of food pathogens is lacking, even though there is a need for such harmonized experimental design protocols and for harmonized validation procedures for heat treatment studies. Such an approach is of particular importance to allow international agreement on appropriate risk management of emerging potential hazards for human and animal health. This paper is working toward establishment of a harmonized protocol for the study of the heat resistance of pathogens, identifying critical issues for establishment of internationally agreed protocols, including a harmonized framework for reporting and interpretation of heat inactivation studies of potentially pathogenic microorganisms. Copyright © 2014 Elsevier B.V. All rights reserved.
Schmettow, Martin; Schnittker, Raphaela; Schraagen, Jan Maarten
2017-05-01
This paper proposes and demonstrates an extended protocol for usability validation testing of medical devices. A review of currently used methods for the usability evaluation of medical devices revealed two main shortcomings: firstly, a lack of methods to closely trace interaction sequences and derive performance measures; secondly, a prevailing focus on cross-sectional validation studies, which ignores the issues of learnability and training. The U.S. Food and Drug Administration's recent proposal for a validation testing protocol for medical devices is then extended to address these shortcomings: (1) a novel process measure, 'normative path deviations', is introduced that is useful for both quantitative and qualitative usability studies, and (2) a longitudinal, completely within-subject study design is presented that assesses learnability and training effects and allows analysis of the diversity of users. A reference regression model is introduced to analyze data from this and similar studies, drawing upon generalized linear mixed-effects models and a Bayesian estimation approach. The extended protocol is implemented and demonstrated in a study comparing a novel syringe infusion pump prototype to an existing design with a sample of 25 healthcare professionals. Strong performance differences between designs were observed with a variety of usability measures, as well as varying training-on-the-job effects. We discuss our findings with regard to validation testing guidelines, reflect on the extensions and discuss the perspectives they add to the validation process. Copyright © 2017 Elsevier Inc. All rights reserved.
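A minimal sketch of the kind of longitudinal, within-subject analysis described above, assuming a long-format table with columns participant, design, session, and task_time; a frequentist linear mixed model from statsmodels is used as a stand-in for the paper's Bayesian generalized linear mixed-effects approach, and all column names, effect sizes, and the model formula are illustrative assumptions rather than the authors' exact specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format usability data: one row per participant x design x session.
rng = np.random.default_rng(0)
rows = []
for p in range(25):                      # 25 healthcare professionals
    skill = rng.normal(0, 0.1)           # random participant effect
    for design in ("novel", "existing"):
        for session in range(1, 4):      # repeated sessions -> training effect
            base = 60 if design == "novel" else 75
            t = base * (0.85 ** (session - 1)) * np.exp(skill + rng.normal(0, 0.05))
            rows.append(dict(participant=p, design=design, session=session, task_time=t))
df = pd.DataFrame(rows)

# Log task time ~ design, session (learning) and their interaction,
# with a random intercept per participant to respect the within-subject design.
df["log_time"] = np.log(df["task_time"])
model = smf.mixedlm("log_time ~ design * session", df, groups=df["participant"])
print(model.fit().summary())
```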
Field validation of protocols developed to evaluate in-line mastitis detection systems.
Kamphuis, C; Dela Rue, B T; Eastwood, C R
2016-02-01
This paper reports on a field validation of previously developed protocols for evaluating the performance of in-line mastitis-detection systems. The protocols outlined 2 requirements of these systems: (1) to detect cows with clinical mastitis (CM) promptly and accurately to enable timely and appropriate treatment and (2) to identify cows with high somatic cell count (SCC) to manage bulk milk SCC levels. Gold standard measures, evaluation tests, performance measures, and performance targets were proposed. The current study validated the protocols on commercial dairy farms with automated in-line mastitis-detection systems using both electrical conductivity (EC) and SCC sensor systems that both monitor at whole-udder level. The protocol for requirement 1 was applied on 3 commercial farms. For requirement 2, the protocol was applied on 6 farms; 3 of them had low bulk milk SCC (128×10³ cells/mL) and were the same farms as used for field evaluation of requirement 1. Three farms with high bulk milk SCC (270×10³ cells/mL) were additionally enrolled. The field evaluation methodology and results were presented at a workshop including representation from 7 international suppliers of in-line mastitis-detection systems. Feedback was sought on the acceptance of standardized performance evaluation protocols and recommended refinements to the protocols. Although the methodology for requirement 1 was relatively labor intensive and required organizational skills over an extended period, no major issues were encountered during the field validation of both protocols. The validation, thus, proved the protocols to be practical. Also, no changes to the data collection process were recommended by the technology supplier representatives. However, 4 recommendations were made to refine the protocols: inclusion of an additional analysis that ignores small (low-density) clot observations in the definition of CM, extension of the time window from 4 to 5 milkings for timely alerts for CM, setting a maximum number of 10 milkings for the time window to detect a CM episode, and presentation of sensitivity for a larger range of false alerts per 1,000 milkings replacing minimum performance targets. The recommended refinements are discussed with suggested changes to the original protocols. The information presented is intended to inform further debate toward achieving international agreement on standard protocols to evaluate performance of in-line mastitis-detection systems. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
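A minimal sketch of the two headline performance measures discussed above (sensitivity for clinical mastitis and false alerts per 1,000 milkings), assuming simple per-milking records of alert and gold-standard CM status; the protocols' time-window logic for timely alerts is deliberately omitted and the numbers are illustrative.

```python
def detection_performance(records):
    """records: iterable of (alert, has_cm) booleans, one entry per milking.

    Sensitivity  = alerted CM milkings / all CM milkings.
    False alerts = alerts on non-CM milkings, scaled per 1,000 milkings
    (definitions of the denominator vary between protocols).
    """
    n = true_pos = false_pos = cm = 0
    for alert, has_cm in records:
        n += 1
        cm += has_cm
        true_pos += alert and has_cm
        false_pos += alert and not has_cm
    sensitivity = true_pos / cm if cm else float("nan")
    false_per_1000 = 1000.0 * false_pos / n if n else float("nan")
    return sensitivity, false_per_1000

# Example: 5,000 milkings, 20 CM cases of which 16 alerted, 40 false alerts.
records = ([(True, True)] * 16 + [(False, True)] * 4 +
           [(True, False)] * 40 + [(False, False)] * 4940)
print(detection_performance(records))   # -> (0.8, 8.0)
```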
Lee, Joseph G L; Gregory, Kyle R; Baker, Hannah M; Ranney, Leah M; Goldstein, Adam O
2016-01-01
Most smokers become addicted to tobacco products before they are legally able to purchase these products. We systematically reviewed the literature on protocols to assess underage purchase and their ecological validity. We conducted a systematic search in May 2015 in PubMed and PsycINFO. We independently screened records for inclusion. We conducted a narrative review and examined implications of two types of legal authority for protocols that govern underage buy enforcement in the United States: criminal (state-level laws prohibiting sales to youth) and administrative (federal regulations prohibiting sales to youth). Ten studies experimentally assessed underage buy protocols and 44 studies assessed the association between youth characteristics and tobacco sales. Protocols that mimicked real-world youth behaviors were consistently associated with substantially greater likelihood of a sale to a youth. Many of the tested protocols appear to be designed for compliance with criminal law rather than administrative enforcement in ways that limited ecological validity. This may be due to concerns about entrapment. For administrative enforcement in particular, entrapment may be less of an issue than commonly thought. Commonly used underage buy protocols poorly represent the reality of youths' access to tobacco from retailers. Compliance check programs should allow youth to present themselves naturally and attempt to match the community's demographic makeup.
Lee, Joseph G. L.; Gregory, Kyle R.; Baker, Hannah M.; Ranney, Leah M.; Goldstein, Adam O.
2016-01-01
Most smokers become addicted to tobacco products before they are legally able to purchase these products. We systematically reviewed the literature on protocols to assess underage purchase and their ecological validity. We conducted a systematic search in May 2015 in PubMed and PsycINFO. We independently screened records for inclusion. We conducted a narrative review and examined implications of two types of legal authority for protocols that govern underage buy enforcement in the United States: criminal (state-level laws prohibiting sales to youth) and administrative (federal regulations prohibiting sales to youth). Ten studies experimentally assessed underage buy protocols and 44 studies assessed the association between youth characteristics and tobacco sales. Protocols that mimicked real-world youth behaviors were consistently associated with substantially greater likelihood of a sale to a youth. Many of the tested protocols appear to be designed for compliance with criminal law rather than administrative enforcement in ways that limited ecological validity. This may be due to concerns about entrapment. For administrative enforcement in particular, entrapment may be less of an issue than commonly thought. Commonly used underage buy protocols poorly represent the reality of youths' access to tobacco from retailers. Compliance check programs should allow youth to present themselves naturally and attempt to match the community’s demographic makeup. PMID:27050671
2013-06-01
accumulate and shelter sessile and mobile marine species. Fouling in sea chests and sea water pipework is also an operational issue for marine engineers, as it restricts water flow to essential vessel systems and may enhance biocorrosion [18, 19] ... subtidal marine communities worldwide and are considered as key species and important habitat engineers in benthic communities [30]. They possess high
Clemes, Stacy A; Biddle, Stuart J H
2013-02-01
Pedometers are increasingly being used to measure physical activity in children and adolescents. This review provides an overview of common measurement issues relating to their use. Studies addressing the following measurement issues in children/adolescents (aged 3-18 years) were included: pedometer validity and reliability, monitoring period, wear time, reactivity, and data treatment and reporting. Pedometer surveillance studies in children/adolescents (aged: 4-18 years) were also included to enable common measurement protocols to be highlighted. In children > 5 years, pedometers provide a valid and reliable, objective measure of ambulatory activity. Further evidence is required on pedometer validity in preschool children. Across all ages, optimal monitoring frames to detect habitual activity have yet to be determined; most surveillance studies use 7 days. It is recommended that standardized wear time criteria are established for different age groups, and that wear times are reported. As activity varies between weekdays and weekend days, researchers interested in habitual activity should include both types of day in surveillance studies. There is conflicting evidence on the presence of reactivity to pedometers. Pedometers are a suitable tool to objectively assess ambulatory activity in children (> 5 years) and adolescents. This review provides recommendations to enhance the standardization of measurement protocols.
Experimental control in software reliability certification
NASA Technical Reports Server (NTRS)
Trammell, Carmen J.; Poore, Jesse H.
1994-01-01
There is growing interest in software 'certification', i.e., confirmation that software has performed satisfactorily under a defined certification protocol. Regulatory agencies, customers, and prospective reusers all want assurance that a defined product standard has been met. In other industries, products are typically certified under protocols in which random samples of the product are drawn, tests characteristic of operational use are applied, analytical or statistical inferences are made, and products meeting a standard are 'certified' as fit for use. A warranty statement is often issued upon satisfactory completion of a certification protocol. This paper outlines specific engineering practices that must be used to preserve the validity of the statistical certification testing protocol. The assumptions associated with a statistical experiment are given, and their implications for statistical testing of software are described.
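A minimal sketch of the statistical certification idea outlined above: test cases are drawn at random from an operational usage profile and a reliability bound is inferred from the outcomes. The usage profile, sample size, and pass/fail oracle are illustrative assumptions; the zero-failure lower bound alpha**(1/n) is a standard result for this simple binomial setting, not the specific protocol the paper prescribes.

```python
import random

def certify(usage_model, run_case, n_tests=100, alpha=0.05):
    """Draw test cases from the operational usage profile and certify reliability.

    usage_model: dict mapping test case -> usage probability (sums to 1).
    run_case:    callable returning True if the case passes.
    Returns (failures, lower_bound) where lower_bound is the one-sided
    (1 - alpha) lower confidence bound on reliability when no failure occurs.
    """
    cases, weights = zip(*usage_model.items())
    sample = random.choices(cases, weights=weights, k=n_tests)
    failures = sum(not run_case(c) for c in sample)
    # With zero failures in n random tests, reliability p satisfies
    # p**n >= alpha at confidence 1 - alpha  =>  p >= alpha**(1/n).
    lower_bound = alpha ** (1.0 / n_tests) if failures == 0 else None
    return failures, lower_bound

# Example with a toy usage profile and a stubbed test oracle.
usage = {"login": 0.5, "search": 0.3, "report": 0.2}
failures, bound = certify(usage, run_case=lambda case: True, n_tests=100)
print(failures, bound)   # e.g. 0, ~0.9705
```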
Qiu, Shuming; Xu, Guoai; Ahmad, Haseeb; Guo, Yanhui
2018-01-01
The Session Initiation Protocol (SIP) is an extensive and esteemed communication protocol employed to regulate signaling as well as for controlling multimedia communication sessions. Recently, Kumari et al. proposed an improved smart card based authentication scheme for SIP based on Farash's scheme. Farash claimed that his protocol is resistant against various known attacks. But, we observe some accountable flaws in Farash's protocol. We point out that Farash's protocol is prone to key-compromise impersonation attack and is unable to provide pre-verification in the smart card, efficient password change and perfect forward secrecy. To overcome these limitations, in this paper we present an enhanced authentication mechanism based on Kumari et al.'s scheme. We prove that the proposed protocol not only overcomes the issues in Farash's scheme, but it can also resist against all known attacks. We also provide the security analysis of the proposed scheme with the help of widespread AVISPA (Automated Validation of Internet Security Protocols and Applications) software. At last, comparing with the earlier proposals in terms of security and efficiency, we conclude that the proposed protocol is efficient and more secure.
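The abstract above faults the earlier scheme for lacking perfect forward secrecy; as a generic illustration (not the protocol proposed by Qiu et al.), the sketch below shows how ephemeral Diffie-Hellman key agreement with X25519 from the pyca/cryptography package yields session keys that remain safe even if long-term credentials later leak. The HKDF context string is an arbitrary placeholder.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each side generates a fresh (ephemeral) key pair per session, so a later
# compromise of long-term credentials cannot expose past session keys.
client_eph = X25519PrivateKey.generate()
server_eph = X25519PrivateKey.generate()

client_shared = client_eph.exchange(server_eph.public_key())
server_shared = server_eph.exchange(client_eph.public_key())
assert client_shared == server_shared

# Derive a session key from the shared secret (context string is illustrative).
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"sip-session").derive(client_shared)
print(session_key.hex())
```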
A framework for the definition of standardized protocols for measuring upper-extremity kinematics.
Kontaxis, A; Cutti, A G; Johnson, G R; Veeger, H E J
2009-03-01
Increasing interest in upper extremity biomechanics has led to closer investigations of both segment movements and detailed joint motion. Unfortunately, conceptual and practical differences in the motion analysis protocols used to date reduce compatibility for pooled data analysis and cross-validation, and so weaken the body of knowledge. This difficulty highlights a need for standardised protocols, each addressing a set of questions of comparable content. The aim of this work is therefore to open a discussion and propose a flexible framework to support: (1) the definition of standardised protocols, (2) a standardised description of these protocols, and (3) the formulation of general recommendations. Proposal of a framework for the definition of standardised protocols. The framework is composed of two nested flowcharts. The first defines what a motion analysis protocol is by pointing out its role in a motion analysis study. The second flowchart describes the steps to build a protocol, which requires decisions on the joints or segments to be investigated and the description of their mechanical equivalent model, the definition of the anatomical or functional coordinate frames, the choice of marker or sensor configuration and the validity of their use, the definition of the activities to be measured and the refinements that can be applied to the final measurements. Finally, general recommendations are proposed for each of the steps based on the current literature, and open issues are highlighted for future investigation and standardisation. Standardisation of motion analysis protocols is urgent. The proposed framework can guide this process through the rationalisation of the approach.
Evaluating Management Information Systems, A Protocol for Automated Peer Review Systems
Black, Gordon C.
1980-01-01
This paper discusses key issues in evaluating an automated Peer Review System. Included are the conceptual base, design, steps in planning structural components, operational parameters, criteria, costs, and a detailed outline or protocol for use in the evaluation. At the heart of the Peer Review System are the criteria utilized for measuring quality. Criteria evaluation should embrace, as a minimum, appropriateness, validity and reliability, and completeness or comprehensiveness of content. Such an evaluation is not complete without determining the impact (clinical outcome) of the service system on the patient and the population served.
Tsoka-Gwegweni, Joyce M; Wassenaar, Douglas R
2014-12-01
The Emanuel, Wendler, and Grady framework was designed as a universal tool for use in many settings including developing countries. However, it is not known whether the work of African health research ethics committees (RECs) is compatible with this framework. The absence of any normative or empirical weighting of the eight principles within this framework suggests that different health RECs may raise some ethical issues more frequently than others when reviewing protocols. We used the Emanuel et al. framework to assess, code, and rank the most frequent ethical issues considered by a biomedical REC during review of research protocols for the years 2008 to 2012. We extracted data from the recorded minutes of a South African biomedical REC for the years 2008 to 2012, designed the data collection sheet according to the Emanuel et al. framework, and removed all identifiers during data processing and analysis. From the 98 protocols that we assessed, the most frequent issues that emerged were informed consent, scientific validity, fair participant selection, and ongoing respect for participants. This study represents the first known attempt to analyze REC responses/minutes using the Emanuel et al. framework, and suggests that this framework may be useful in describing and categorizing the core activities of an REC. © The Author(s) 2014.
How Non-Linearity and Grade-Level Differences Complicate the Validation of Observation Protocols
ERIC Educational Resources Information Center
Lazarev, Valeriy; Newman, Denis
2013-01-01
Teacher evaluation is currently a major policy issue at all levels of the K-12 system driven in large part by current US Department of Education requirements. The main objective of this study is to explore the patterns of relationship between observational scores and value-added measures of teacher performance in math classrooms and the variation…
2018-01-01
The Session Initiation Protocol (SIP) is an extensive and esteemed communication protocol employed to regulate signaling as well as for controlling multimedia communication sessions. Recently, Kumari et al. proposed an improved smart card based authentication scheme for SIP based on Farash’s scheme. Farash claimed that his protocol is resistant against various known attacks. But, we observe some accountable flaws in Farash’s protocol. We point out that Farash’s protocol is prone to key-compromise impersonation attack and is unable to provide pre-verification in the smart card, efficient password change and perfect forward secrecy. To overcome these limitations, in this paper we present an enhanced authentication mechanism based on Kumari et al.’s scheme. We prove that the proposed protocol not only overcomes the issues in Farash’s scheme, but it can also resist against all known attacks. We also provide the security analysis of the proposed scheme with the help of widespread AVISPA (Automated Validation of Internet Security Protocols and Applications) software. At last, comparing with the earlier proposals in terms of security and efficiency, we conclude that the proposed protocol is efficient and more secure. PMID:29547619
Kelly, Janet L; Hirsch, Irl B; Furnary, Anthony P
2006-01-01
Diabetes mellitus is the fourth most common comorbid condition among hospitalized patients, and 30% of patients undergoing open-heart surgery have diabetes. The link between hyperglycemia and poor outcome has been well described, and large clinical trials have shown that aggressive control of blood glucose with an insulin infusion can improve these outcomes. The barriers to implementing an insulin infusion protocol are numerous, despite the fact that doing so is paramount to clinical success. Barriers include safety concerns, such as fear of hypoglycemia, insufficient nursing staff to patient ratios, lack of administrative and physician support, various system and procedural issues, and resistance to change. Key steps to overcome the barriers include building support with multidisciplinary champions, involving key staff, educating staff and administrators about the clinical and economic benefits of improving glycemic control, setting realistic goals, selecting a validated insulin infusion protocol, and internally marketing the success of the protocol.
Advani, Aneel; Jones, Neil; Shahar, Yuval; Goldstein, Mary K; Musen, Mark A
2004-01-01
We develop a method and algorithm for deciding the optimal approach to creating quality-auditing protocols for guideline-based clinical performance measures. An important element of the audit protocol design problem is deciding which guideline elements to audit. Specifically, the problem is how and when to aggregate individual patient case-specific guideline elements into population-based quality measures. The key statistical issue involved is the trade-off between increased reliability with more general population-based quality measures versus increased validity from individually case-adjusted but more restricted measures done at a greater audit cost. Our intelligent algorithm for auditing protocol design is based on hierarchically modeling incrementally case-adjusted quality constraints. We select quality constraints to measure using an optimization criterion based on statistical generalizability coefficients. We present results of the approach from a deployed decision support system for a hypertension guideline.
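A minimal sketch of the reliability-versus-cost trade-off described above, using the classical generalizability coefficient E(rho^2) = var_provider / (var_provider + var_error / n) for a measure aggregated over n audited cases; the variance components, cost per case, and reliability target are illustrative assumptions, not values from the deployed hypertension system.

```python
def generalizability(var_provider, var_error, n_cases):
    """E-rho-squared for a quality measure averaged over n_cases audited cases."""
    return var_provider / (var_provider + var_error / n_cases)

def cheapest_reliable_design(var_provider, var_error, cost_per_case, target=0.8, max_n=200):
    """Smallest (and cheapest) number of audited cases meeting the reliability target."""
    for n in range(1, max_n + 1):
        if generalizability(var_provider, var_error, n) >= target:
            return n, n * cost_per_case
    return None, None

# Illustrative variance components for one guideline-based quality constraint.
n, cost = cheapest_reliable_design(var_provider=0.04, var_error=0.20, cost_per_case=12.0)
print(n, cost)   # -> 20 cases at a cost of 240.0
```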
2017-01-01
In an ideal plasmonic surface sensor, the bioactive area, where analytes are recognized by specific biomolecules, is surrounded by an area that is generally composed of a different material. The latter, often the surface of the supporting chip, is generally hard to functionalize selectively with respect to the active area. As a result, cross talk between the active area and the surrounding one may occur. In designing a plasmonic sensor, various issues must be addressed: the specificity of analyte recognition, the orientation of the immobilized biomolecule that acts as the analyte receptor, and the selectivity of surface coverage. The objective of this tutorial review is to introduce the main rational tools required for a correct and complete approach to chemically functionalize plasmonic surface biosensors. After a short introduction, the review discusses, in detail, the most common strategies for achieving effective surface functionalization. The most important issues, such as the orientation of active molecules and spatial and chemical selectivity, are considered. A list of well-defined protocols is suggested for the most common practical situations. Importantly, for the reported protocols, we also present direct comparisons in terms of costs, labor demand, and risk vs. benefit balance. In addition, a survey of the most used characterization techniques necessary to validate the chemical protocols is reported. PMID:28796479
How Many Batches Are Needed for Process Validation under the New FDA Guidance?
Yang, Harry
2013-01-01
The newly updated FDA Guidance for Industry on Process Validation: General Principles and Practices ushers in a life cycle approach to process validation. While the guidance no longer considers the use of traditional three-batch validation appropriate, it does not prescribe the number of validation batches for a prospective validation protocol, nor does it provide specific methods to determine it. This potentially could leave manufacturers in a quandary. In this paper, I develop a Bayesian method to address the issue. By combining process knowledge gained from Stage 1 Process Design (PD) with expected outcomes of Stage 2 Process Performance Qualification (PPQ), the number of validation batches for PPQ is determined to provide a high level of assurance that the process will consistently produce future batches meeting quality standards. Several examples based on simulated data are presented to illustrate the use of the Bayesian method in helping manufacturers make risk-based decisions for Stage 2 PPQ, and they highlight the advantages of the method over traditional Frequentist approaches. The discussions in the paper lend support for a life cycle and risk-based approach to process validation recommended in the new FDA guidance.
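A minimal sketch of one simple Bayesian framing of the question above, assuming per-batch conformance is Bernoulli with a Beta prior summarizing Stage 1 (PD) knowledge; after n conforming PPQ batches the posterior predictive probability that a future batch conforms is (alpha + n) / (alpha + beta + n). The prior pseudo-counts and assurance target are illustrative, and this is not the paper's exact model.

```python
def posterior_predictive_pass(alpha, beta, n_ppq):
    """P(a future batch conforms), given a Beta(alpha, beta) prior from Stage 1
    and n_ppq conforming PPQ batches: (alpha + n_ppq) / (alpha + beta + n_ppq)."""
    return (alpha + n_ppq) / (alpha + beta + n_ppq)

def batches_needed(alpha, beta, target=0.95, max_batches=30):
    """Smallest number of all-conforming PPQ batches giving the target assurance."""
    for n in range(1, max_batches + 1):
        if posterior_predictive_pass(alpha, beta, n) >= target:
            return n
    return None

# Illustrative Stage 1 prior: roughly 28 conforming and 1 nonconforming pseudo-batches.
print(batches_needed(alpha=28, beta=1, target=0.97))   # -> 5 PPQ batches
```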
NASA Astrophysics Data System (ADS)
Moriarty, Patrick; Sanz Rodrigo, Javier; Gancarski, Pawel; Churchfield, Matthew; Naughton, Jonathan W.; Hansen, Kurt S.; Machefaux, Ewan; Maguire, Eoghan; Castellani, Francesco; Terzi, Ludovico; Breton, Simon-Philippe; Ueda, Yuko
2014-06-01
Researchers within the International Energy Agency (IEA) Task 31: Wakebench have created a framework for the evaluation of wind farm flow models operating at the microscale level. The framework consists of a model evaluation protocol integrated with a web-based portal for model benchmarking (www.windbench.net). This paper provides an overview of the building-block validation approach applied to wind farm wake models, including best practices for the benchmarking and data processing procedures for validation datasets from wind farm SCADA and meteorological databases. A hierarchy of test cases has been proposed for wake model evaluation, from similarity theory of the axisymmetric wake and idealized infinite wind farm, to single-wake wind tunnel (UMN-EPFL) and field experiments (Sexbierum), to wind farm arrays in offshore (Horns Rev, Lillgrund) and complex terrain conditions (San Gregorio). A summary of results from the axisymmetric wake, Sexbierum, Horns Rev and Lillgrund benchmarks is used to discuss the state-of-the-art of wake model validation and highlight the most relevant issues for future development.
Cavero, Icilio; Guillon, Jean-Michel; Ballet, Veronique; Clements, Mike; Gerbeau, Jean-Frédéric; Holzgrefe, Henry
2016-01-01
The Comprehensive in vitro Proarrhythmia Assay (CiPA) is a nonclinical Safety Pharmacology paradigm for discovering electrophysiological mechanisms that are likely to confer proarrhythmic liability to drug candidates intended for human use. Key talks delivered at the 'CiPA on my mind' session, held during the 2015 Annual Meeting of the Safety Pharmacology Society (SPS), are summarized. Issues and potential solutions relating to crucial constituents [e.g., biological materials (ion channels and pluripotent stem cell-derived cardiomyocytes), study platforms, drug solutions, and data analysis] of CiPA core assays are critically examined. In order to advance the CiPA paradigm from the current testing and validation stages to a research and regulatory drug development strategy, systematic guidance by CiPA stakeholders is necessary to expedite solutions to pending and newly arising issues. Once a study protocol is proved to yield robust and reproducible results within and across laboratories, it can be implemented as a qualified regulatory procedure. Copyright © 2016 Elsevier Inc. All rights reserved.
da Silva, Fabiana Alves; Vidal, Cláudia Fernanda de Lacerda; de Araújo, Ednaldo Cavalcante
2015-01-01
Abstract Objective: to validate the content of the prevention protocol for early sepsis caused by Streptococcus agalactiae in newborns. Method: a cross-sectional, descriptive, methodological study with a quantitative approach. The sample was composed of 15 judges, 8 obstetricians and 7 pediatricians. The validation occurred through assessment of the content of the protocol by the judges, who received the instrument for data collection - a checklist - which contained 7 items representing the requisites to be met by the protocol. The validation of the content was achieved by applying the Content Validity Index. Result: in the judging process, all the items representing requirements considered by the protocol obtained concordance within the established level (Content Validity Index > 0.75). Of the 7 items, 6 obtained full concordance (Content Validity Index 1.0) and the feasibility item obtained a Content Validity Index of 0.93. The global assessment of the instrument obtained a Content Validity Index of 0.99. Conclusion: the content validation performed was an efficient tool for adjusting the protocol, according to the judgment of experienced professionals, which demonstrates the importance of conducting a prior validation of such instruments. It is expected that this study will serve as an incentive for the adoption of universal screening by other institutions through validated protocols. PMID:26444165
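A minimal sketch of the Content Validity Index computation referenced above, assuming each of the 15 judges rates every item on a 4-point relevance scale and that ratings of 3 or 4 count as agreement (a common convention); the item names and ratings are hypothetical, not the study's data.

```python
def item_cvi(ratings, agree_levels=(3, 4)):
    """Item-level CVI: share of judges rating the item 3 or 4 on a 4-point scale."""
    return sum(r in agree_levels for r in ratings) / len(ratings)

# Hypothetical ratings by 15 judges for two of the protocol's seven requirement items.
items = {
    "objectivity": [4] * 15,          # full agreement -> CVI 1.0
    "feasibility": [4] * 14 + [2],    # one dissenting judge -> CVI 0.93
}
item_scores = {name: item_cvi(r) for name, r in items.items()}
global_cvi = sum(item_scores.values()) / len(item_scores)
print(item_scores, round(global_cvi, 2))
```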
Measurement issues related to data collection on the World Wide Web.
Strickland, Ora L; Moloney, Margaret F; Dietrich, Alexa S; Myerburg, Stuart; Cotsonis, George A; Johnson, Robert V
2003-01-01
As the World Wide Web has become more prominent as a mode of communication, it has opened up new possibilities for research data collection. This article identifies measurement issues that occur with Internet data collection that are relevant to qualitative and quantitative research approaches as they occurred in a triangulated Internet study of perimenopausal women with migraine headaches. Issues associated with quantitative data collection over the Internet include (a) selecting and designing Internet data collection protocols that adequately address study aims while also taking advantage of the Internet, (b) ensuring the reliability and validity of Internet data collected, (c) adapting quantitative paper-and-pencil data collection protocols for the Internet, (d) making Internet data collection practical for respondents and researchers, and (e) ensuring the quality of quantitative data collected. Qualitative data collection over the Internet needs to remain true to the philosophical stance of the qualitative approach selected. Researcher expertise in qualitative data collection must be combined with expertise in computer technology and information services if data are to be of ultimate quality. The advantages and limitations of collecting qualitative data in real time or at a later time are explored, as well as approaches to enhance qualitative data collection over the Internet. It was concluded that like any research approach or method, Internet data collection requires considerable creativity, expertise, and planning to take advantage of the technology for the collection of reliable and valid research data.
Simulation Fidelity Issues for Nuclear Survivability Validation Protocols.
1992-11-01
Dewitt, James; Capistrant, Benjamin; Kohli, Nidhi; Mitteldorf, Darryl; Merengwa, Enyinnaya; West, William
2018-01-01
Background: While deduplication and cross-validation protocols have been recommended for large Web-based studies, protocols for survey response validation of smaller studies have not been published. Objective: This paper reports the challenges of survey validation inherent in small Web-based health survey research. Methods: The subject population was North American, gay and bisexual, prostate cancer survivors, who represent an under-researched, hidden, difficult-to-recruit, minority-within-a-minority population. In 2015-2016, advertising on a large Web-based cancer survivor support network, using email and social media, yielded 478 completed surveys. Results: Our manual deduplication and cross-validation protocol identified 289 survey submissions (289/478, 60.4%) as likely spam, most stemming from advertising on social media. The basic components of this deduplication and validation protocol are detailed. An unexpected challenge encountered was invalid survey responses evolving across the study period. This necessitated that the static detection protocol be augmented with a dynamic one. Conclusions: Five recommendations for validation of Web-based samples, especially with smaller difficult-to-recruit populations, are detailed. PMID:29691203
Caron, Alexandra G M; Thomas, Colette R; Berry, Kathryn L E; Motti, Cherie A; Ariel, Ellen; Brodie, Jon E
2018-02-01
Ocean contamination by plastics is a global issue. Although ingestion of plastic debris by sea turtles has been widely documented, contamination by microplastics (<5mm) is poorly known and likely to be under-reported. We developed a microplastic extraction protocol for examining green turtle (Chelonia mydas) chyme, which is multifarious in nature, by modifying and combining pre-established methods used to separate microplastics from organic matter and sediments. This protocol consists of visual inspection, nitric acid digestion, emulsification of residual fat, density separation, and chemical identification by Fourier transform infrared spectroscopy. This protocol enables the extraction of polyethylene, high-density polyethylene, (aminoethyl) polystyrene, polypropylene, and polyvinyl chloride microplastics >100μm. Two macroplastics and seven microplastics (two plastic paint chips and five synthetic fabric particles) were isolated from subsamples of two green turtles. Our results highlight the need for more research towards understanding the impact of microplastics on these threatened marine reptiles. Copyright © 2018 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barratt, B.I.P.; Moeed, A.; Malone, L.A.
2006-05-15
An analysis of established biosafety protocols for release into the environment of exotic plants and biological control agents for weeds and arthropod pests has been carried out to determine whether such protocols can be applied to relatively new and emerging technologies intended for the primary production industries, such as transgenic plants. Example case studies are described to indicate the scope of issues considered by regulators who make decisions on new organism releases. No transgenic plants have been released to date in New Zealand, but two field test approvals are described as examples. An analysis of the biosafety protocols has shown that, while many of the risk criteria considered for decision-making by regulators are similar for all new organisms, a case-by-case examination of risks and potential impacts is required in order to fully assess risk. The value of post-release monitoring and validation of decisions made by regulators is emphasised.
Ocean Optics Protocols for Satellite Ocean Color Sensor Validation. Volume 1; Revised
NASA Technical Reports Server (NTRS)
Mueller, James L. (Editor); Fargion, Giulietta (Editor); Mueller, J. L.; Trees, C.; Austin, R. W.; Pietras, C.; Hooker, S.; Holben, B.; McClain, Charles R.; Clark, D. K.;
2002-01-01
This document stipulates protocols for measuring bio-optical and radiometric data for the SIMBIOS Project. It supersedes the earlier version, and is organized into four parts: Introductory Background, Instrument Characteristics, Field Measurements and Data Analysis, Data Reporting and Archival. Changes in this revision include the addition of three new chapters: (1) Fundamental Definitions, Relationships and Conventions; (2) MOBY, A Radiometric Buoy for Performance Monitoring and Vicarious Calibration of Satellite Ocean Color Sensors: Measurement and Data Analysis Protocols; and (3) Normalized Water-Leaving Radiance and Remote Sensing Reflectance: Bidirectional Reflectance and Other Factors. Although the present document represents another significant, incremental improvement in the ocean optics protocols, there are several protocols that have either been overtaken by recent technological progress, or have been otherwise identified as inadequate. Revision 4 is scheduled for completion sometime in 2003. This technical report is not meant as a substitute for scientific literature. Instead, it will provide a ready and responsive vehicle for the multitude of technical reports issued by an operational Project. The contributions are published as submitted, after only minor editing to correct obvious grammatical or clerical errors.
Ocean Optics Protocols for Satellite Ocean Color Sensor Validation. Volume 2; Revised
NASA Technical Reports Server (NTRS)
Mueller, James L. (Editor); Fargion, Giulietta S. (Editor); Trees, C.; Austin, R. W.; Pietras, C. (Editor); Hooker, S.; Holben, B.; McClain, Charles R.; Clark, D. K.; Yuen, M.
2002-01-01
This document stipulates protocols for measuring bio-optical and radiometric data for the SIMBIOS Project. It supersedes the earlier version, and is organized into four parts: Introductory Background, Instrument Characteristics, Field Measurements and Data Analysis, Data Reporting and Archival. Changes in this revision include the addition of three new chapters: (1) Fundamental Definitions, Relationships and Conventions; (2) MOBY, A Radiometric Buoy for Performance Monitoring and Vicarious Calibration of Satellite Ocean Color Sensors: Measurement and Data Analysis Protocols; and (3) Normalized Water-Leaving Radiance and Remote Sensing Reflectance: Bidirectional Reflectance and Other Factors. Although the present document represents another significant, incremental improvement in the ocean optics protocols, there are several protocols that have either been overtaken by recent technological progress, or have been otherwise identified as inadequate. Revision 4 is scheduled for completion sometime in 2003. This technical report is not meant as a substitute for scientific literature. Instead, it will provide a ready and responsive vehicle for the multitude of technical reports issued by an operational Project. The contributions are published as submitted, after only minor editing to correct obvious grammatical or clerical errors.
You, Ilsun; Kwon, Soonhyun; Choudhary, Gaurav; Sharma, Vishal; Seo, Jung Taek
2018-06-08
The Internet of Things (IoT) utilizes algorithms to facilitate intelligent applications across cities in the form of smart-urban projects. As the majority of devices in IoT are battery operated, their applications should be facilitated with a low-power communication setup. Such facility is possible through the Low-Power Wide-Area Network (LPWAN), but at a constrained bit rate. For long-range communication over LPWAN, several approaches and protocols are adopted. One such protocol is the Long-Range Wide Area Network (LoRaWAN), which is a media access layer protocol for long-range communication between the devices and the application servers via LPWAN gateways. However, LoRaWAN comes with fewer security features as a much-secured protocol consumes more battery because of the exorbitant computational overheads. The standard protocol fails to support end-to-end security and perfect forward secrecy while being vulnerable to the replay attack that makes LoRaWAN limited in supporting applications where security (especially end-to-end security) is important. Motivated by this, an enhanced LoRaWAN security protocol is proposed, which not only provides the basic functions of connectivity between the application server and the end device, but additionally averts these listed security issues. The proposed protocol is developed with two options, the Default Option (DO) and the Security-Enhanced Option (SEO). The protocol is validated through Burrows-Abadi-Needham (BAN) logic and the Automated Validation of Internet Security Protocols and Applications (AVISPA) tool. The proposed protocol is also analyzed for overheads through system-based and low-power device-based evaluations. Further, a case study on a smart factory-enabled parking system is considered for its practical application. The results, in terms of network latency with reliability fitting and signaling overheads, show paramount improvements and better performance for the proposed protocol compared with the two handshake options, Pre-Shared Key (PSK) and Elliptic Curve Cryptography (ECC), of Datagram Transport Layer Security (DTLS).
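The abstract above highlights replay vulnerability in standard LoRaWAN; the sketch below illustrates the generic countermeasure of binding a strictly increasing frame counter into an AES-CMAC message integrity code (using the pyca/cryptography package). The key handling, 4-byte truncation, and framing here are simplified illustrations and this is not the Default Option / Security-Enhanced Option protocol itself.

```python
import os
from cryptography.hazmat.primitives.cmac import CMAC
from cryptography.hazmat.primitives.ciphers import algorithms

nwk_s_key = os.urandom(16)                 # illustrative 128-bit network session key

def mic(key, frame_counter, payload):
    """4-byte AES-CMAC tag over counter + payload (LoRaWAN-style truncated MIC)."""
    c = CMAC(algorithms.AES(key))
    c.update(frame_counter.to_bytes(4, "little") + payload)
    return c.finalize()[:4]

last_counter = -1

def accept(key, frame_counter, payload, tag):
    """Server-side check: valid MIC and a counter strictly greater than the last seen."""
    global last_counter
    if frame_counter <= last_counter or mic(key, frame_counter, payload) != tag:
        return False
    last_counter = frame_counter
    return True

msg, ctr = b"sensor reading", 7
tag = mic(nwk_s_key, ctr, msg)
print(accept(nwk_s_key, ctr, msg, tag))    # True  (fresh frame)
print(accept(nwk_s_key, ctr, msg, tag))    # False (replayed frame rejected)
```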
Determining drug release rates of hydrophobic compounds from nanocarriers
D’Addio, Suzanne M.; Bukari, Abdallah A.; Dawoud, Mohammed; Bunjes, Heike; Rinaldi, Carlos; Prud’homme, Robert K.
2016-01-01
Obtaining meaningful drug release profiles for drug formulations is essential prior to in vivo testing and for ensuring consistent quality. The release kinetics of hydrophobic drugs from nanocarriers (NCs) are not well understood because the standard protocols for maintaining sink conditions and sampling are not valid owing to mass transfer and solubility limitations. In this work, a new in vitro assay protocol based on ‘lipid sinks’ and magnetic separation produces release conditions that mimic the concentrations of lipid membranes and lipoproteins in vivo, facilitates separation, and thus allows determination of intrinsic release rates of drugs from NCs. The assay protocol is validated by (i) determining the magnetic separation efficiency, (ii) demonstrating that sink condition requirements are met, and (iii) accounting for drug by completing a mass balance. NCs of itraconazole and cyclosporine A (CsA) were prepared and the drug release profiles were determined. This release protocol has been used to compare the drug release from a polymer stabilized NC of CsA to a solid drug NP of CsA alone. These data have led to the finding that stabilizing block copolymer layers have a retarding effect on drug release from NCs, reducing the rate of CsA release fourfold compared with the nanoparticle without a polymer coating. This article is part of the themed issue ‘Soft interfacial materials: from fundamentals to formulation’. PMID:27298440
Determining drug release rates of hydrophobic compounds from nanocarriers.
D'Addio, Suzanne M; Bukari, Abdallah A; Dawoud, Mohammed; Bunjes, Heike; Rinaldi, Carlos; Prud'homme, Robert K
2016-07-28
Obtaining meaningful drug release profiles for drug formulations is essential prior to in vivo testing and for ensuring consistent quality. The release kinetics of hydrophobic drugs from nanocarriers (NCs) are not well understood because the standard protocols for maintaining sink conditions and sampling are not valid owing to mass transfer and solubility limitations. In this work, a new in vitro assay protocol based on 'lipid sinks' and magnetic separation produces release conditions that mimic the concentrations of lipid membranes and lipoproteins in vivo, facilitates separation, and thus allows determination of intrinsic release rates of drugs from NCs. The assay protocol is validated by (i) determining the magnetic separation efficiency, (ii) demonstrating that sink condition requirements are met, and (iii) accounting for drug by completing a mass balance. NCs of itraconazole and cyclosporine A (CsA) were prepared and the drug release profiles were determined. This release protocol has been used to compare the drug release from a polymer stabilized NC of CsA to a solid drug NP of CsA alone. These data have led to the finding that stabilizing block copolymer layers have a retarding effect on drug release from NCs, reducing the rate of CsA release fourfold compared with the nanoparticle without a polymer coating. This article is part of the themed issue 'Soft interfacial materials: from fundamentals to formulation'. © 2016 The Author(s).
Tharyan, Prathap; George, Aneesh Thomas; Kirubakaran, Richard; Barnabas, Jabez Paul
2013-01-01
We sought to evaluate if editorial policies and the reporting quality of randomized controlled trials (RCTs) had improved since our 2004-05 survey of 151 RCTs in 65 Indian journals, and to compare reporting quality of protocols in the Clinical Trials Registry-India (CTRI). An observational study of endorsement of Consolidated Standards for the Reporting of Trials (CONSORT) and International Committee of Medical Journal Editors (ICMJE) requirements in the instructions to authors in Indian journals, and compliance with selected requirements in all RCTs published during 2007-08 vs. our previous survey and between all RCT protocols in the CTRI on August 31, 2010 and published RCTs from both surveys. Journal policies endorsing the CONSORT statement (22/67, 33%) and ICMJE requirements (35/67, 52%) remained suboptimal, and only 4 of 13 CONSORT items were reported in more than 50% of the 145 RCTs assessed. Reporting of ethical issues had improved significantly, and that of methods addressing internal validity had not improved. Adequate methods were reported significantly more frequently in 768 protocols in the CTRI, than in the 296 published trials. The CTRI template facilitates the reporting of valid methods in registered trial protocols. The suboptimal compliance with CONSORT and ICMJE requirements in RCTs published in Indian journals reduces credibility in the reliability of their results. Copyright © 2013 Elsevier Inc. All rights reserved.
Dewitt, James; Capistrant, Benjamin; Kohli, Nidhi; Rosser, B R Simon; Mitteldorf, Darryl; Merengwa, Enyinnaya; West, William
2018-04-24
While deduplication and cross-validation protocols have been recommended for large Web-based studies, protocols for survey response validation of smaller studies have not been published. This paper reports the challenges of survey validation inherent in small Web-based health survey research. The subject population was North American, gay and bisexual, prostate cancer survivors, who represent an under-researched, hidden, difficult-to-recruit, minority-within-a-minority population. In 2015-2016, advertising on a large Web-based cancer survivor support network, using email and social media, yielded 478 completed surveys. Our manual deduplication and cross-validation protocol identified 289 survey submissions (289/478, 60.4%) as likely spam, most stemming from advertising on social media. The basic components of this deduplication and validation protocol are detailed. An unexpected challenge encountered was invalid survey responses evolving across the study period. This necessitated that the static detection protocol be augmented with a dynamic one. Five recommendations for validation of Web-based samples, especially with smaller difficult-to-recruit populations, are detailed. © James Dewitt, Benjamin Capistrant, Nidhi Kohli, B R Simon Rosser, Darryl Mitteldorf, Enyinnaya Merengwa, William West. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 24.04.2018.
Paoletti, Claudia; Esbensen, Kim H
2015-01-01
Material heterogeneity influences the effectiveness of sampling procedures. Most sampling guidelines used for assessment of food and/or feed commodities are based on classical statistical distribution requirements (the normal, binomial, and Poisson distributions) and almost universally rely on the assumption of randomness. However, this is unrealistic. The scientific food and feed community recognizes a strong preponderance of nonrandom distribution within commodity lots, which should be a more realistic prerequisite for definition of effective sampling protocols. Nevertheless, these heterogeneity issues are overlooked as the prime focus is often placed only on financial, time, equipment, and personnel constraints instead of mandating acquisition of documented representative samples under realistic heterogeneity conditions. This study shows how the principles promulgated in the Theory of Sampling (TOS) and practically tested over 60 years provide an effective framework for dealing with the complete set of adverse aspects of both compositional and distributional heterogeneity (material sampling errors), as well as with the errors incurred by the sampling process itself. The results of an empirical European Union study on genetically modified soybean heterogeneity, the Kernel Lot Distribution Assessment, are summarized, as they have a strong bearing on the issue of proper sampling protocol development. TOS principles apply universally in the food and feed realm and must therefore be considered the only basis for development of valid sampling protocols free from distributional constraints.
A Group Neighborhood Average Clock Synchronization Protocol for Wireless Sensor Networks
Lin, Lin; Ma, Shiwei; Ma, Maode
2014-01-01
Clock synchronization is a very important issue for the applications of wireless sensor networks. The sensors need to keep a strict clock so that users can know exactly what happens in the monitoring area at the same time. This paper proposes a novel internal distributed clock synchronization solution using group neighborhood averaging. Each sensor node collects the offsets and skew rates of its neighbors. Group averages of the offset and skew rate values are calculated instead of using the conventional point-to-point averaging method. The sensor node then returns the compensated values back to the neighbors. The propagation delay is considered and compensated. An analytical analysis of offset and skew compensation is presented. Simulation results validate the effectiveness of the protocol and reveal that the protocol allows sensor networks to quickly establish a consensus clock and maintain a small deviation from the consensus clock. PMID:25120163
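A minimal sketch of the group neighborhood averaging dynamics described above: in each round every node adjusts its clock toward the average of its own offset and its neighbors' offsets. True offsets are used here only for simulation bookkeeping; propagation-delay compensation and skew-rate averaging from the full protocol are omitted, and the topology and values are illustrative.

```python
def group_average_step(offsets, neighbors):
    """One synchronization round: every node moves its clock to the average
    offset of its neighborhood group (itself plus its neighbors).

    offsets:   dict node -> current clock offset in ms (simulation bookkeeping).
    neighbors: dict node -> list of neighboring node ids.
    """
    new_offsets = {}
    for node, nbrs in neighbors.items():
        group = [node] + nbrs
        new_offsets[node] = sum(offsets[n] for n in group) / len(group)
    return new_offsets

# Illustrative 4-node line network with initial offsets in milliseconds.
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
offsets = {0: 5.0, 1: -3.0, 2: 2.0, 3: -4.0}
for round_no in range(10):
    offsets = group_average_step(offsets, neighbors)
spread = max(offsets.values()) - min(offsets.values())
print(offsets, f"spread after 10 rounds: {spread:.3f} ms")
```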
Bort-Roig, Judit; Puig-Ribera, Anna; Contreras, Ruth S; Chirveches-Pérez, Emilia; Martori, Joan C; Gilson, Nicholas D; McKenna, Jim
2017-09-15
This study validated the Walk@Work-Application (W@W-App) for measuring occupational sitting and stepping. The W@W-App was installed on the smartphones of office-based employees (n=17; 10 women; 26±3 years). A prescribed 1-hour laboratory protocol plus two continuous hours of occupational free-living activities were performed. Intra-class correlation coefficients (ICC) compared mean differences of sitting time and step count measurements between the W@W-App and criterion measures (ActivPAL3TM and SW200 Yamax Digi-Walker). During the protocol, agreement between self-paced walking (ICC=0.85) and active working tasks step counts (ICC=0.80) was good. The smallest median difference was for sitting time (1.5 seconds). During free-living conditions, sitting time (ICC=0.99) and stepping (ICC=0.92) showed excellent agreement, with a difference of 0.5 minutes and 18 steps respectively. The W@W-App provided valid measures for monitoring occupational sedentary patterns in real-life conditions; a key issue for increasing awareness and changing occupational sedentariness. Copyright © 2017 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.
NASA Technical Reports Server (NTRS)
Lewis, Pattie
2005-01-01
Headquarters National Aeronautics and Space Administration (NASA) chartered the Acquisition Pollution Prevention (AP2) Office to coordinate agency activities affecting pollution prevention issues identified during system and component acquisition and sustainment processes. The primary objectives of the AP2 Office are to: (1) Reduce or eliminate the use of hazardous materials (HazMats) or hazardous processes at manufacturing, remanufacturing, and sustainment locations. (2) Avoid duplication of effort in actions required to reduce or eliminate HazMats through joint center cooperation and technology sharing. This project will identify, evaluate and approve alternative surface preparation technologies for use at NASA and Air Force Space Command (AFSPC) installations. Materials and processes will be evaluated with the goal of selecting those processes that will improve corrosion protection at critical systems, facilitate easier maintenance activity, extend maintenance cycles, eliminate flight hardware contamination and reduce the amount of hazardous waste generated. This Joint Test Protocol (JTP) contains the critical requirements and tests necessary to qualify alternative Low-Emission Surface Preparation/Depainting Technologies for Structural Steel Applications. These tests were derived from engineering, performance, and operational impact (supportability) requirements defined by a consensus of NASA and Air Force Space Command (AFSPC) participants. The Field Test Plan (FTP), entitled Joint Test Protocol for Validation of Alternative Low Emission Surface Preparation/Depainting Technologies for Structural Steel, prepared by ITB, defines the field evaluation and testing requirements for validating alternative surface preparation/depainting technologies and supplements the JTP.
Applicability of three alternative instruments for food authenticity analysis: GMO identification.
Burrell, A; Foy, C; Burns, M
2011-03-06
Ensuring foods are correctly labelled for ingredients derived from genetically modified organisms (GMOs) is an issue facing manufacturers, retailers, and enforcement agencies. DNA approaches for the determination of food authenticity often use the polymerase chain reaction (PCR), and PCR products can be detected using capillary or gel electrophoresis. This study examines the fitness for purpose of three laboratory electrophoresis instruments (Agilent Bioanalyzer 2100, Lab901 TapeStation, and Shimadzu MCE-202 MultiNA) for the detection of GMOs using PCR, based on a previously validated protocol. Whilst minor differences in the performance characteristics of bias and precision were observed, all three instruments demonstrated their applicability in using this protocol for the screening of GMO ingredients.
Guidance on validation and qualification of processes and operations involving radiopharmaceuticals.
Todde, S; Peitl, P Kolenc; Elsinga, P; Koziorowski, J; Ferrari, V; Ocak, E M; Hjelstuen, O; Patt, M; Mindt, T L; Behe, M
2017-01-01
Validation and qualification activities are nowadays an integral part of the day-to-day routine work in a radiopharmacy. This document is meant as an Appendix of Part B of the EANM "Guidelines on Good Radiopharmacy Practice (GRPP)" issued by the Radiopharmacy Committee of the EANM, covering the qualification and validation aspects related to the small-scale "in house" preparation of radiopharmaceuticals. The aim is to provide more detailed and practice-oriented guidance to those who are involved in the small-scale preparation of radiopharmaceuticals that are not intended for commercial purposes or distribution. The present guideline covers the validation and qualification activities following the well-known "validation chain", which begins with editing the general Validation Master Plan document, includes all the required documentation (e.g., User Requirement Specification, qualification protocols, etc.), and leads to the qualification of the equipment used in the preparation and quality control of radiopharmaceuticals, up to the final step of Process Validation. Specific guidance on the qualification and validation activities of small-scale hospital/academia radiopharmacies is provided here. Additional information, including practical examples, is also provided.
Validated protocols for evaluating maternally mediated mechanisms of early pregnancy failure in rodents are needed for use in the risk assessment process. To supplement previous efforts in the validation of a panel of protocols assembled for this purpose, bromoergocryptine (BEC) ...
Reveiz, Ludovic; Haby, Michelle M; Martínez-Vega, Ruth; Pinzón-Flores, Carlos E; Elias, Vanessa; Smith, Emma; Pinart, Mariona; Broutet, Nathalie; Becerra-Posada, Francisco; Aldighieri, Sylvain; Van Kerkhove, Maria D
2017-01-01
Given the severity and impact of the current Zika virus (ZIKV) outbreak in the Americas, numerous countries have rushed to develop research studies to assess ZIKV and its potential health consequences. In an effort to ensure that studies are comprehensive, both internally and externally valid, and with reliable results, the World Health Organization, the Pan American Health Organization, Institut Pasteur, the networks of Fiocruz, the Consortia for the Standardization of Influenza Seroepidemiology (CONSISE) and the International Severe Acute Respiratory and Emerging Infection Consortium (ISARIC) have generated six standardized clinical and epidemiological research protocols and questionnaires to address key public health questions on ZIKV. We conducted a systematic search of ongoing study protocols related to ZIKV research. We analyzed the content of protocols of 32 cohort studies and 13 case-control studies for systematic bias that could produce erroneous results. Additionally, we aimed to characterize the risks of bias and confounding in observational studies related to ZIKV and to propose ways to minimize them, including the use of six newly standardized research protocols. Observational studies of ZIKV face an array of challenges, including measurement of exposure and outcomes (microcephaly and Guillain-Barré Syndrome). Potential confounders need to be measured where known and controlled for in the analysis. Selection bias due to non-random selection is a significant issue, particularly in the case-control design, and loss to follow-up is equally important for the cohort design. Observational research seeking to answer key questions on ZIKV should consider these restrictions and take precautions to minimize bias in an effort to provide reliable and valid results. Utilization of the standardized research protocols developed by the WHO, PAHO, Institut Pasteur, and CONSISE will harmonize the key methodological aspects of each study design to minimize bias at different stages of the study. Biases need to be considered by researchers implementing the standardized protocols as well as by users of observational epidemiological studies of ZIKV.
Patel, Manesh R; Schardt, Connie M; Sanders, Linda L; Keitz, Sheri A
2006-10-01
The paper compares the speed, validity, and applicability of two different protocols for searching the primary medical literature. A randomized trial involving internal medicine residents was performed during an inpatient general medicine rotation. Thirty-two internal medicine residents were block randomized into four groups of eight. The success rate of each search protocol was measured by perceived search time, number of questions answered, and proportion of articles that were applicable and valid. Residents randomized to the MEDLINE-first group (protocol A) searched 120 questions, and residents randomized to the MEDLINE-last group (protocol B) searched 133 questions. Answers were found to 104 (86.7%) of the clinical questions in protocol A and to 117 (88%) in protocol B. In protocol A, residents reported that 26 (25.2%) of the answers were obtained quickly or rated as "fast" (<5 minutes), as opposed to 55 (51.9%) in protocol B (P = 0.0004). A subset of questions and articles (n = 79) was reviewed by faculty, who found that both protocols identified similar numbers of answer articles that addressed the questions and were judged valid using critical appraisal criteria. For resident-generated clinical questions, both protocols produced a similarly high percentage of applicable and valid articles. The MEDLINE-last search protocol was perceived to be faster. However, in the MEDLINE-last protocol, a significant portion of questions (23%) still required searching MEDLINE to find an answer.
Grobarczyk, Benjamin; Franco, Bénédicte; Hanon, Kevin; Malgrange, Brigitte
2015-10-01
Genome engineering and human iPS cells are two powerful technologies that can be combined to highlight phenotypic differences and identify pathological mechanisms of complex diseases by providing isogenic cellular material. However, very few data are available regarding precise gene correction in human iPS cells. Here, we describe an optimized stepwise protocol to deliver CRISPR/Cas9 plasmids into human iPS cells. We highlight technical issues, especially those associated with human stem cell culture and with the correction of a point mutation to obtain an isogenic iPS cell line, without inserting any resistance cassette. Based on a two-step clonal isolation protocol (mechanical picking followed by enzymatic dissociation), we succeeded in selecting and expanding a corrected human iPS cell line with high efficiency (more than 2% of the sequenced colonies). This protocol can also be used to obtain a knock-out cell line from a healthy iPS cell line via the NHEJ pathway (with about 15% efficiency) and reproduce a disease phenotype. In addition, we also provide protocols for functional validation tests after every critical step.
A standardised protocol for the validation of banking methodologies for arterial allografts.
Lomas, R J; Dodd, P D F; Rooney, P; Pegg, D E; Hogg, P A; Eagle, M E; Bennett, K E; Clarkson, A; Kearney, J N
2013-09-01
The objective of this study was to design and test a protocol for the validation of banking methodologies for arterial allografts. A series of in vitro biomechanical and biological assessments were derived, and applied to paired fresh and banked femoral arteries. The ultimate tensile stress and strain, suture pullout stress and strain, expansion/rupture under hydrostatic pressure, histological structure and biocompatibility properties of disinfected and cryopreserved femoral arteries were compared to those of fresh controls. No significant differences were detected in any of the test criteria. This validation protocol provides an effective means of testing and validating banking protocols for arterial allografts.
iSAFT Protocol Validation Platform for On-Board Data Networks
NASA Astrophysics Data System (ADS)
Tavoularis, Antonis; Kollias, Vangelis; Marinis, Kostas
2014-08-01
iSAFT is an integrated, powerful HW/SW environment for the simulation, validation and monitoring of satellite/spacecraft on-board data networks, supporting simultaneously a wide range of protocols (RMAP, PTP, CCSDS Space Packet, TM/TC, CANopen, etc.) and network interfaces (SpaceWire, ECSS MIL-STD-1553, ECSS CAN). It is based on over 20 years of TELETEL's experience in protocol validation in the telecommunications and aeronautical sectors, and it has been fully re-engineered by TELETEL in cooperation with ESA and space primes to comply with space on-board industrial validation requirements (ECSS, EGSE, AIT, AIV, etc.). iSAFT is highly modular and expandable to support new network interfaces and protocols, and it is based on the powerful iSAFT graphical tool chain (Protocol Analyser / Recorder, TestRunner, Device Simulator, Traffic Generator, etc.).
Rethinking the NTCIP Design and Protocols - Analyzing the Issues
DOT National Transportation Integrated Search
1998-03-03
This working paper discusses the issues involved in changing the current draft NTCIP standard from an X.25-based protocol stack to an Internet-based protocol stack. It contains a methodology which could be used to change NTCIP's base protocols. This ...
Exploring Assessment Tools for Research and Evaluation in Astronomy Education and Outreach
NASA Astrophysics Data System (ADS)
Buxner, S. R.; Wenger, M. C.; Dokter, E. F. C.
2011-09-01
The ability to effectively measure knowledge, attitudes, and skills in formal and informal educational settings is an important aspect of astronomy education research and evaluation. Assessments may take the form of interviews, observations, surveys, exams, or other probes to help unpack people's understandings or beliefs. In this workshop, we discussed characteristics of a variety of tools that exist to assess understandings of different concepts in astronomy as well as attitudes towards science and science teaching; these include concept inventories, surveys, interview protocols, observation protocols, card sorting, reflection videos, and other methods currently being used in astronomy education research and EPO program evaluations. In addition, we discussed common questions in the selection of assessment tools including issues of reliability and validity, time to administer, format of implementation, analysis, and human subject concerns.
Paes, Thaís; Belo, Letícia Fernandes; da Silva, Diego Rodrigues; Morita, Andrea Akemi; Donária, Leila; Furlanetto, Karina Couto; Sant'Anna, Thaís; Pitta, Fabio; Hernandes, Nidia Aparecida
2017-03-01
It is important to assess activities of daily living (ADL) in older adults due to impairment of independence and quality of life. However, there is no objective and standardized protocol available to assess this outcome. Thus, the aim of this study was to verify the reproducibility and validity of a new protocol for ADL assessment applied in physically independent adults age ≥50 y, the Londrina ADL protocol, and to establish an equation to predict reference values for the protocol. Ninety-three physically independent adults age ≥50 y had their performance in ADL evaluated by registering the time spent to complete the protocol. The protocol was performed twice. The 6-min walk test, which assesses functional exercise capacity, was used as a validation criterion. A multiple linear regression model was applied, including anthropometric and demographic variables that correlated with the protocol, to establish an equation to predict the protocol's reference values. In general, the protocol was reproducible (intraclass correlation coefficient 0.91). The average difference between the first and second performance of the protocol was 5.3%. The new protocol was valid to assess ADL performance in the studied subjects, presenting a moderate correlation with the 6-min walk test (r = -0.53). The time spent to perform the protocol correlated significantly with age (r = 0.45) but neither with weight (r = -0.17) nor with height (r = -0.17). A stepwise multiple regression model including sex and age showed that age was the only determinant factor for the Londrina ADL protocol, explaining 21% (P < .001) of its variability. The derived reference equation was: Londrina ADL protocol predicted (s) = 135.618 + (3.102 × age [y]). The Londrina ADL protocol was reproducible and valid in physically independent adults age ≥50 y. A reference equation for the protocol was established including only age as an independent variable (r² = 0.21), allowing a better interpretation of the protocol's results in clinical practice. Copyright © 2017 by Daedalus Enterprises.
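The reference equation above reduces to a one-line calculation; the snippet below simply evaluates it for an example age (the helper name and the example age are illustrative).

```python
# Predicted reference time (seconds) for the Londrina ADL protocol,
# using the equation reported in the abstract (age in years).
def londrina_adl_predicted_seconds(age_years):
    return 135.618 + 3.102 * age_years

print(londrina_adl_predicted_seconds(65))  # ~337.2 s for a 65-year-old adult
```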
Preparation of Formalin-fixed Paraffin-embedded Tissue Cores for both RNA and DNA Extraction.
Patel, Palak G; Selvarajah, Shamini; Boursalie, Suzanne; How, Nathan E; Ejdelman, Joshua; Guerard, Karl-Philippe; Bartlett, John M; Lapointe, Jacques; Park, Paul C; Okello, John B A; Berman, David M
2016-08-21
Formalin-fixed paraffin-embedded tissue (FFPET) represents a valuable, well-annotated substrate for molecular investigations. The utility of FFPET in molecular analysis is complicated both by heterogeneous tissue composition and by low yields when extracting nucleic acids. A literature search revealed a paucity of protocols addressing these issues, and none that showed a validated method for simultaneous extraction of RNA and DNA from regions of interest in FFPET. This method addresses both issues. Tissue specificity was achieved by mapping cancer areas of interest on microscope slides and transferring annotations onto FFPET blocks. Tissue cores were harvested from areas of interest using 0.6 mm microarray punches. Nucleic acid extraction was performed using a commercial FFPET extraction system, with modifications to the homogenization, deparaffinization, and Proteinase K digestion steps to improve tissue digestion and increase nucleic acid yields. The modified protocol yields sufficient quantity and quality of nucleic acids for use in a number of downstream analyses, including a multi-analyte gene expression platform, as well as reverse transcriptase-coupled real-time PCR analysis of mRNA expression and methylation-specific PCR (MSP) analysis of DNA methylation.
Effect of source tampering in the security of quantum cryptography
NASA Astrophysics Data System (ADS)
Sun, Shi-Hai; Xu, Feihu; Jiang, Mu-Sheng; Ma, Xiang-Chun; Lo, Hoi-Kwong; Liang, Lin-Mei
2015-08-01
The security of the source has become an increasingly important issue in quantum cryptography. In the framework of measurement-device-independent quantum key distribution (MDI-QKD), the source becomes the only region exploitable by a potential eavesdropper (Eve). Phase randomization is a cornerstone assumption in most discrete-variable (DV) quantum communication protocols (e.g., QKD, quantum coin tossing, weak-coherent-state blind quantum computing, and so on), and the violation of such an assumption is thus fatal to the security of those protocols. In this paper, we show a simple quantum hacking strategy, with commercial and homemade pulsed lasers, by which Eve can actively tamper with the source and violate such an assumption, without leaving a trace afterwards. Furthermore, our attack may also be valid for continuous-variable (CV) QKD, which is another main class of QKD protocol, since, beyond the phase randomization assumption, other parameters (e.g., intensity) that directly determine the security of CV-QKD could also be changed.
NASA Technical Reports Server (NTRS)
Zenie, Alexandre; Luguern, Jean-Pierre
1987-01-01
The specification, verification, validation, and evaluation steps that make up the CS-PN software process are outlined. The colored stochastic Petri net software is applied to a Wound/Wait protocol decomposable into two principal modules: a request, i.e., (transaction, granule) couple, treatment module and a wound treatment module. Each module is specified, verified, validated, and then evaluated separately, to derive a verification, validation and evaluation of the complete protocol. The colored stochastic Petri net tool is shown to be a natural extension of the stochastic tool, adapted to distributed systems and protocols, because color conveniently takes into account the numerous sites, transactions, granules and messages.
Stergiou, George S; Asmar, Roland; Myers, Martin; Palatini, Paolo; Parati, Gianfranco; Shennan, Andrew; Wang, Jiguang; O'Brien, Eoin
2018-03-01
The European Society of Hypertension (ESH) International Protocol (ESH-IP) for the validation of blood pressure (BP) measuring devices was published in 2002, with the main objective of simplifying the validation procedures, so that more BP monitors would be subjected to independent validation. This article provides an overview of the international impact of the ESH-IP and of the lessons learned from its use, to be able to justify further developments in validation protocols. A review of published (PubMed) validation studies from 2002 to 2017 was performed. One hundred and seventy-seven validation studies using the ESH-IP, 59 using the British Hypertension Society protocol, 46 using the Association for the Advancement of Medical Instrumentation (AAMI) standard and 23 using the International Organization for Standardization (ISO) standard were identified. Lists of validated office-clinic, home and ambulatory BP monitors are provided. Of the ESH-IP studies, 93% tested oscillometric devices, 80% upper arm, 71% home, 25% office and 7% ambulatory monitors (some had more than one function). The original goal of the ESH-IP has been fulfilled in that in the last decade the number of published validation studies has more than doubled. It is now recognized that the provision of accurate devices would be best served by having a universal protocol. An international initiative has been put in place by AAMI, ESH and ISO experts aiming to reach consensus for a universal validation protocol to be accepted worldwide, which will allow a more thorough evaluation of the accuracy and performance of future BP monitors.
NASA Technical Reports Server (NTRS)
Mueller, J. L. (Editor); Fargion, Giuletta S. (Editor); McClain, Charles R. (Editor); Pegau, Scott; Zaneveld, J. Ronald V.; Mitchell, B. Gregg; Kahru, Mati; Wieland, John; Stramska, Malgorzat
2003-01-01
This document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. The document is organized into 6 separate volumes as Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4. Volume I: Introduction, Background and Conventions; Volume II: Instrument Specifications, Characterization and Calibration; Volume III: Radiometric Measurements and Data Analysis Methods; Volume IV: Inherent Optical Properties: Instruments, Characterization, Field Measurements and Data Analysis Protocols; Volume V: Biogeochemical and Bio-Optical Measurements and Data Analysis Methods; Volume VI: Special Topics in Ocean Optics Protocols and Appendices. The earlier version of Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 3 (Mueller and Fargion 2002, Volumes 1 and 2) is entirely superseded by the six volumes of Revision 4 listed above.
NASA Technical Reports Server (NTRS)
Mueller, J. L.; Fargion, G. S.; McClain, C. R. (Editor); Pegau, S.; Zaneveld, J. R. V.; Mitchell, B. G.; Kahru, M.; Wieland, J.; Stramska, M.
2003-01-01
This document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. The document is organized into 6 separate volumes as Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4. Volume I: Introduction, Background, and Conventions; Volume II: Instrument Specifications, Characterization and Calibration; Volume III: Radiometric Measurements and Data Analysis Methods; Volume IV: Inherent Optical Properties: Instruments, Characterization, Field Measurements and Data Analysis Protocols; Volume V: Biogeochemical and Bio-Optical Measurements and Data Analysis Methods; Volume VI: Special Topics in Ocean Optics Protocols and Appendices. The earlier version of Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 3 is entirely superseded by the six volumes of Revision 4 listed above.
NASA Technical Reports Server (NTRS)
Mueller, J. L. (Editor); Fargion, Giulietta S. (Editor); McClain, Charles R. (Editor)
2003-01-01
This document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. The document is organized into 6 separate volumes as Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4. Volume I: Introduction, Background and Conventions; Volume II: Instrument Specifications, Characterization and Calibration; Volume III: Radiometric Measurements and Data Analysis Methods; Volume IV: Inherent Optical Properties: Instruments, Characterization, Field Measurements and Data Analysis Protocols; Volume V: Biogeochemical and Bio-Optical Measurements and Data Analysis Methods; Volume VI: Special Topics in Ocean Optics Protocols and Appendices. The earlier version of Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 3 (Mueller and Fargion 2002, Volumes 1 and 2) is entirely superseded by the six volumes of Revision 4 listed above.
An Argument Approach to Observation Protocol Validity
ERIC Educational Resources Information Center
Bell, Courtney A.; Gitomer, Drew H.; McCaffrey, Daniel F.; Hamre, Bridget K.; Pianta, Robert C.; Qi, Yi
2012-01-01
This article develops a validity argument approach for use on observation protocols currently used to assess teacher quality for high-stakes personnel and professional development decisions. After defining the teaching quality domain, we articulate an interpretive argument for observation protocols. To illustrate the types of evidence that might…
Yang, Xianjin; Debonneuil, Edouard; Zhavoronkov, Alex; Mishra, Bud
2016-01-01
Advances in financial engineering are radically reshaping the biomedical marketplace. For instance, new methods of pooling diversified drug development programs by placing them in a special purpose vehicle (SPV) have been proposed to create a securitized cancer megafund allowing for debt and equity participation. In this study, we perform theoretical and numerical simulations that highlight the role of empirical validation of the projects comprising a cancer megafund. We quantify the degree to which the deliberately designed structure of derivatives and investments is key to its liquidity. Research megafunds with comprehensive in silico and laboratory validation protocols and the ability to issue debt and equity, as well as hybrid financial products, may enable conservative investors, including pension funds and sovereign government funds, to profit from unique securitization opportunities. Thus, while hedging investors' longevity risk, such well-validated megafunds will contribute to the health, wellbeing and longevity of the global population. PMID:27275544
Yang, Xianjin; Debonneuil, Edouard; Zhavoronkov, Alex; Mishra, Bud
2016-09-06
Advances in financial engineering are radically reshaping the biomedical marketplace. For instance, new methods of pooling diversified drug development programs by placing them in a special purpose vehicle (SPV) have been proposed to create a securitized cancer megafund allowing for debt and equity participation. In this study, we perform theoretical and numerical simulations that highlight the role of empirical validation of the projects comprising a cancer megafund. We quantify the degree to which the deliberately designed structure of derivatives and investments is key to its liquidity. Research megafunds with comprehensive in silico and laboratory validation protocols and the ability to issue debt and equity, as well as hybrid financial products, may enable conservative investors, including pension funds and sovereign government funds, to profit from unique securitization opportunities. Thus, while hedging investors' longevity risk, such well-validated megafunds will contribute to the health, well-being and longevity of the global population.
Raggi, Alberto; Quintas, Rui; Russo, Emanuela; Martinuzzi, Andrea; Costardi, Daniela; Frisoni, Giovanni Battista; Franco, Maria Grazia; Andreotti, Alessandra; Ojala, Matti; Peña, Sebastián; Perales, Jaime; Chatterji, Somnath; Miret, Marta; Tobiasz-Adamczyk, Beata; Koskinen, Seppo; Frattura, Lucilla; Leonardi, Matilde
2014-01-01
The collaborative research on ageing in Europe protocol was based on that of the World Health Organization Study on global AGEing and adult health (SAGE) project that investigated the relationship between health and well-being and provided a set of instruments that can be used across countries to monitor health and health-related outcomes of older populations as well as the strategies for addressing issues concerning the ageing process. To evaluate the degree to which SAGE protocol covered the spectrum of disability given the scope of the World Health Organization International Classification of Functioning, Disability and Health (ICF), a mapping exercise was performed with SAGE protocol. Results show that the SAGE protocol covers ICF domains in a non-uniform way, with environmental factors categories being underrepresented, whereas mental, cardiovascular, sensory functions and mobility were overrepresented. To overcome this partial coverage of ICF functioning categories, new assessment instruments have been developed. PRACTITIONER MESSAGE: Mapping exercises are valid procedures to understand the extent to which a survey protocol covers the spectrum of functioning. The mapping exercise with SAGE protocol shows that it provides only a partial representation of body functions and activities and participation domains, and the coverage of environmental factors is poor. New instruments are therefore needed for researchers to properly understand the health and disability of ageing populations. Copyright © 2013 John Wiley & Sons, Ltd.
Wong, Adrian; Xiong, Yun-yun; Wang, Defeng; Lin, Shi; Chu, Winnie W C; Kwan, Pauline W K; Nyenhuis, David; Black, Sandra E; Wong, Ka Sing Lawrence; Mok, Vincent
2013-05-01
Vascular cognitive impairment (VCI) affects up to half of stroke survivors and predicts poor outcomes. Valid and reliable assessment for VCI is lacking, especially for the Chinese population. In 2005, the National Institute of Neurological Disorders and Stroke and Canadian Stroke Network (NINDS-CSN) Harmonisation workshop proposed a set of three neuropsychology protocols for VCI evaluation. This paper introduces the protocol design and reports the psychometric properties of the Chinese NINDS-CSN VCI protocols. Fifty patients with mild stroke (mean National Institute of Health Stroke Scale 2.2 (SD=3.2)) and 50 controls were recruited. The NINDS-CSN VCI protocols were adapted into Chinese. We assessed the protocols' (1) external validity, defined by how well the protocol summary scores differentiated patients from controls using receiver operating characteristic (ROC) curve analysis; (2) concurrent validity, by correlations with functional measures including the Stroke Impact Scale memory score and the Chinese Disability Assessment for Dementia; (3) internal consistency; and (4) ease of administration. All three protocols differentiated patients from controls (area under the ROC curve between 0.77 and 0.79, p<0.001) and correlated significantly with the functional measures (Pearson r ranged from 0.37 to 0.51). A cut-off of 19/20 on the MMSE identified only one-tenth of the patients classified as impaired on the 5-min protocol. Cronbach's α across the four cognitive domains of the 60-min protocol was 0.78 for all subjects and 0.76 for stroke patients. The Chinese NINDS-CSN VCI protocols are valid and reliable for cognitive assessment in Chinese patients with mild stroke.
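The internal-consistency figure reported above (Cronbach's α across the four cognitive domains) follows from a standard formula; the sketch below computes it from per-domain scores, using made-up illustrative numbers rather than the study's data.

```python
# Cronbach's alpha over domain scores (illustrative data, not study data).
def cronbach_alpha(domains):
    """domains: list of per-domain score lists, one inner list per domain,
    each containing one score per subject."""
    k = len(domains)
    n = len(domains[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(domain[i] for domain in domains) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(d) for d in domains) / var(totals))

scores = [[10, 12, 9, 14], [11, 13, 10, 15], [9, 12, 8, 13], [10, 14, 9, 15]]
print(round(cronbach_alpha(scores), 2))
```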
NASA Technical Reports Server (NTRS)
Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio
1992-01-01
This paper covers verification and protocol validation for distributed computer and communication systems using a computer-aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol applications that pass conformance testing are then checked to see whether they can operate together; this is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which addresses: (1) modeling of inter-layer representation for compatibility between conformance and interoperability testing; (2) computational improvement over current testing methods using the proposed model, including the formulation of new qualitative and quantitative measures and of time-dependent behavior; and (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.
A Lightweight Continuous Authentication Protocol for the Internet of Things.
Chuang, Yo-Hsuan; Lo, Nai-Wei; Yang, Cheng-Ying; Tang, Ssu-Wei
2018-04-05
Modern societies are moving toward an information-oriented environment. To gather and utilize information in people's modern lives, tiny devices with all kinds of sensors and gateways of various sizes need to be deployed and connected with each other through the Internet or proxy-based wireless sensor networks (WSNs). Within this kind of Internet of Things (IoT) environment, how two communicating devices authenticate each other is a fundamental security issue. As many IoT devices are battery powered and need to transmit sensed data periodically, it is necessary for IoT devices to adopt a lightweight authentication protocol to reduce their energy consumption when a device wants to authenticate and transmit data to its targeted peer. In this paper, a lightweight continuous authentication protocol for sensing devices and gateway devices in general IoT environments is introduced. The concept of a valid authentication time period is proposed to enhance the robustness of authentication between IoT devices. To construct the proposed lightweight continuous authentication protocol, a token technique and dynamic features of IoT devices are adopted in order to reach the design goals: reduction of the time consumed by consecutive authentications, and energy saving for authenticating devices by reducing the computational complexity of session establishment for continuous authentication. A security analysis is conducted to evaluate the security strength of the proposed protocol. In addition, performance analysis shows that the proposed protocol is a strong competitor among existing protocols for device-to-device authentication in IoT environments.
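The token-plus-validity-window idea can be illustrated in a few lines of code. The sketch below assumes a pre-shared key and HMAC-SHA256 as the lightweight primitive; the field names and message layout are illustrative assumptions, not the paper's concrete protocol.

```python
# Sketch of token-based continuous authentication with a valid time period.
import hashlib
import hmac
import os
import time

def issue_token(shared_key, device_id, validity_seconds=300):
    """Gateway issues a short-lived token after the initial authentication."""
    expiry = int(time.time()) + validity_seconds
    return {"device_id": device_id, "expiry": expiry, "nonce": os.urandom(8)}

def tag_message(shared_key, token, message):
    """Sensor tags each periodic data message during the validity window."""
    payload = token["nonce"] + token["expiry"].to_bytes(8, "big") + message
    return hmac.new(shared_key, payload, hashlib.sha256).digest()

def verify_message(shared_key, token, message, tag):
    """Gateway check: cheap per-message HMAC until the token expires."""
    if time.time() > token["expiry"]:
        return False  # expired token forces a full re-authentication
    payload = token["nonce"] + token["expiry"].to_bytes(8, "big") + message
    expected = hmac.new(shared_key, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

key = os.urandom(32)
token = issue_token(key, "sensor-01")
msg = b"temperature=21.4"
print(verify_message(key, token, msg, tag_message(key, token, msg)))  # True
```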
Virtual laboratories: new opportunities for collaborative water science
NASA Astrophysics Data System (ADS)
Ceola, Serena; Arheimer, Berit; Bloeschl, Guenter; Baratti, Emanuele; Capell, Rene; Castellarin, Attilio; Freer, Jim; Han, Dawei; Hrachowitz, Markus; Hundecha, Yeshewatesfa; Hutton, Christopher; Lindström, Goran; Montanari, Alberto; Nijzink, Remko; Parajka, Juraj; Toth, Elena; Viglione, Alberto; Wagener, Thorsten
2015-04-01
Reproducibility and repeatability of experiments are the fundamental prerequisites that allow researchers to validate results and share hydrological knowledge, experience and expertise in the light of global water management problems. Virtual laboratories offer new opportunities to enable these prerequisites since they allow experimenters to share data, tools and pre-defined experimental procedures (i.e. protocols). Here we present the outcomes of a first collaborative numerical experiment undertaken by five different international research groups in a virtual laboratory to address the key issues of reproducibility and repeatability. Moving from the definition of accurate and detailed experimental protocols, a rainfall-runoff model was independently applied to 15 European catchments by the research groups and model results were collectively examined through a web-based discussion. We found that a detailed modelling protocol was crucial to ensure the comparability and reproducibility of the proposed experiment across groups. Our results suggest that sharing comprehensive and precise protocols and running the experiments within a controlled environment (e.g. virtual laboratory) is as fundamental as sharing data and tools for ensuring experiment repeatability and reproducibility across the broad scientific community and thus advancing hydrology in a more coherent way.
Palomaki, Glenn E; Lee, Jo Ellen S; Canick, Jacob A; McDowell, Geraldine A; Donnenfeld, Alan E
2009-09-01
This statement is intended to augment the current general ACMG Standards and Guidelines for Clinical Genetics Laboratories and to address guidelines specific to first-trimester screening for Down syndrome. The aim is to provide the laboratory the necessary information to ensure accurate and reliable Down syndrome screening results given a screening protocol (e.g., combined first trimester and integrated testing). Information about various test combinations and their expected performance are provided, but other issues such as availability of reagents, patient interest in early test results, access to open neural tube defect screening, and availability of chorionic villus sampling are all contextual factors in deciding which screening protocol(s) will be selected by individual health care providers. Individual laboratories are responsible for meeting the quality assurance standards described by the Clinical Laboratory Improvement Act, the College of American Pathologists, and other regulatory agencies, with respect to appropriate sample documentation, assay validation, general proficiency, and quality control measures. These guidelines address first-trimester screening that includes ultrasound measurement and interpretation of nuchal translucency thickness and protocols that combine markers from both the first and second trimesters. Laboratories can use their professional judgment to make modification or additions.
LOPP: A Location Privacy Protected Anonymous Routing Protocol for Disruption Tolerant Network
NASA Astrophysics Data System (ADS)
Lu, Xiaofeng; Hui, Pan; Towsley, Don; Pu, Juhua; Xiong, Zhang
In this paper, we propose an anonymous routing protocol, LOPP, to protect the originator's location privacy in Delay/Disruption Tolerant Networks (DTNs). The goals of our study are to minimize the originator's probability of being localized (Pl) and to maximize the destination's probability of receiving the message (Pr). The idea of LOPP is to divide a sensitive message into k segments and send each of them to n different neighbors. Although message fragmentation may reduce the destination's probability of receiving the complete message, LOPP can decrease the originator's Pl. We validate LOPP on a real-world human mobility dataset. The simulation results show that LOPP can decrease the originator's Pl by over 54% with only a 5.7% decrease in the destination's Pr. We address the physical localization issue of DTNs, which has not been studied in the literature.
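The fragmentation step can be illustrated with a short sketch: the originator splits a sensitive message into k segments and hands each segment to n different neighbors. The neighbor selection and forwarding below are simplified assumptions, not the paper's routing logic.

```python
# Illustrative sketch of LOPP-style message fragmentation and dissemination.
import random

def split_message(message, k):
    """Split a byte string into k roughly equal segments."""
    size = -(-len(message) // k)  # ceiling division
    return [message[i * size:(i + 1) * size] for i in range(k)]

def disseminate(message, neighbors, k=4, n=2):
    """Assign every segment to n distinct neighbors; returns (index, neighbor, segment)."""
    assignments = []
    for idx, segment in enumerate(split_message(message, k)):
        for neighbor in random.sample(neighbors, n):
            assignments.append((idx, neighbor, segment))
    return assignments

plan = disseminate(b"sensitive report", ["A", "B", "C", "D", "E"], k=4, n=2)
for idx, neighbor, seg in plan:
    print(idx, neighbor, seg)
```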
NASA Technical Reports Server (NTRS)
Fargion, Giulietta S.; Barnes, Robert; McClain, Charles
2001-01-01
The purpose of this technical report is to provide current documentation of the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project Office activities on in situ aerosol optical thickness (i.e., protocols, and data QC and analysis). This documentation is necessary to ensure that critical information is relayed to the scientific community and NASA management. This critical information includes the technical difficulties and challenges of validating and combining ocean color data from an array of independent satellite systems to form consistent and accurate global bio-optical time series products. This technical report is not meant as a substitute for scientific literature. Instead, it provides a ready and responsive vehicle for the multitude of technical reports issued by an operational project.
O'Sullivan, Maureen
2007-02-01
Bond and Uysal (this issue) complain that expert lie detectors identified by O'Sullivan and Ekman (2004) are statistical flukes. They ignore one class of experts we have identified and misrepresent the procedures we use to identify the others. They also question the psychometric validity of the measures and protocol used. Many of their points are addressed in the chapter they criticize. The fruitfulness of the O'Sullivan-Ekman protocol is illustrated with respect to improved identification of expert lie detectors, as well as a replicated pattern of errors made by experts from different professional groups. The statistical arguments offered confuse the theoretical use of the binomial with the empirical use of the normal distribution. Data are provided that may clarify this distinction.
IoT security with one-time pad secure algorithm based on the double memory technique
NASA Astrophysics Data System (ADS)
Wiśniewski, Remigiusz; Grobelny, Michał; Grobelna, Iwona; Bazydło, Grzegorz
2017-11-01
Secure encryption of data in the Internet of Things is especially important, as a large amount of information is exchanged every day and the number of attack vectors on IoT elements keeps increasing. In the paper a novel symmetric encryption method is proposed. The idea is based on the one-time pad technique. The proposed solution applies a double-memory concept to secure the transmitted data. The presented algorithm is considered as part of a communication protocol and has been initially validated against known security issues.
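A minimal sketch of the underlying symmetric primitive is given below. The two key buffers only loosely stand in for the paper's double-memory idea; the actual key-handling scheme is an assumption here, not the authors' design.

```python
# One-time pad XOR sketch with two pre-shared key buffers (illustrative).
import os

def otp_xor(data, pad):
    if len(pad) < len(data):
        raise ValueError("one-time pad must be at least as long as the data")
    return bytes(d ^ p for d, p in zip(data, pad))

# One buffer is consumed while the other stays in reserve (assumed scheme).
pad_active = os.urandom(32)
pad_reserve = os.urandom(32)

plaintext = b"sensor reading: 21.4 C"
ciphertext = otp_xor(plaintext, pad_active)
recovered = otp_xor(ciphertext, pad_active)  # XOR with the same pad decrypts
assert recovered == plaintext
```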
Haby, Michelle M.; Martínez-Vega, Ruth; Pinzón-Flores, Carlos E.; Smith, Emma; Pinart, Mariona; Broutet, Nathalie; Becerra-Posada, Francisco; Aldighieri, Sylvain; Van Kerkhove, Maria D.
2017-01-01
Introduction: Given the severity and impact of the current Zika virus (ZIKV) outbreak in the Americas, numerous countries have rushed to develop research studies to assess ZIKV and its potential health consequences. In an effort to ensure that studies are comprehensive, both internally and externally valid, and with reliable results, the World Health Organization, the Pan American Health Organization, Institut Pasteur, the networks of Fiocruz, the Consortia for the Standardization of Influenza Seroepidemiology (CONSISE) and the International Severe Acute Respiratory and Emerging Infection Consortium (ISARIC) have generated six standardized clinical and epidemiological research protocols and questionnaires to address key public health questions on ZIKV. Methods: We conducted a systematic search of ongoing study protocols related to ZIKV research. We analyzed the content of protocols of 32 cohort studies and 13 case-control studies for systematic bias that could produce erroneous results. Additionally, we aimed to characterize the risks of bias and confounding in observational studies related to ZIKV and to propose ways to minimize them, including the use of six newly standardized research protocols. Results: Observational studies of ZIKV face an array of challenges, including measurement of exposure and outcomes (microcephaly and Guillain-Barré Syndrome). Potential confounders need to be measured where known and controlled for in the analysis. Selection bias due to non-random selection is a significant issue, particularly in the case-control design, and loss to follow-up is equally important for the cohort design. Conclusion: Observational research seeking to answer key questions on ZIKV should consider these restrictions and take precautions to minimize bias in an effort to provide reliable and valid results. Utilization of the standardized research protocols developed by the WHO, PAHO, Institut Pasteur, and CONSISE will harmonize the key methodological aspects of each study design to minimize bias at different stages of the study. Biases need to be considered by researchers implementing the standardized protocols as well as by users of observational epidemiological studies of ZIKV. PMID:28686621
Gaudin, V; Hedou, C; Rault, A; Verdon, E
2010-07-01
The STAR protocol is a Five Plate Test (FPT) developed several years ago at the Community Reference Laboratory (CRL) for the screening of antimicrobial residues in milk and muscle. This paper presents the validation of this method according to European Decision 2002/657/EC and to an internal guideline for validation. A validation protocol based on 'simulated tissues' and on a list of 16 representative antimicrobials to be validated was implemented in our laboratory during several months for the STAR protocol. The performance characteristics of the method were determined (specificity, detection capabilities CCβ, applicability, ruggedness). In conclusion, the STAR protocol is applicable to the broad-spectrum detection of antibiotic residues in muscles of different animal species (pig, cattle, sheep, poultry). The method has good specificity (false-positive rate = 4%). The detection capabilities were determined for 16 antibiotics from different families in relation to their respective maximum residue limit (MRL): beta-lactams (penicillins and cephalosporins ≤ MRL), tetracyclines (≤ MRL and ≤ 2.5 MRL), macrolides (2 MRL), quinolones (≤ 2 MRL), some sulphonamides (≤ 3 MRL), and trimethoprim (2 MRL). However, the sensitivity of the STAR protocol towards aminoglycosides (> 8 MRL) and florfenicol (≤ 10 MRL) was unsatisfactory (> MRL). The two objectives of this study were met: firstly, to validate the STAR protocol according to European Decision 2002/657/EC, then to demonstrate that the validation guideline developed to implement this decision is applicable to microbiological plate tests even for muscle. The use of simulated tissue appeared a good compromise between spiked discs with antibiotic solutions and incurred tissues. In addition, the choice of a list of representative antibiotics allowed the reduction of the scope of the validation, which was already costly in time and effort.
Epitope mapping of commercial antibodies that detect myocilin.
Patterson-Orazem, Athéna C; Hill, Shannon E; Fautsch, Michael P; Lieberman, Raquel L
2018-05-09
The presence of myocilin is often used in the process of validating trabecular meshwork (TM) cells and eye tissues, but the antibody reagents used for detection are poorly characterized. Indeed, for over a century, researchers have been using antibodies to track proteins of interest in a variety of biological contexts, but many antibodies remain ill-defined at the molecular level and in their target epitope. Such issues have prompted efforts from major funding agencies to validate reagents and combat reproducibility issues across biomedical sciences. Here we characterize the epitopes recognized by four commercial myocilin antibodies, aided by structurally and biochemically characterized myocilin fragments. All four antibodies recognize enriched myocilin secreted from human TM cell media. The detection of myocilin fragments by ELISA and Western blot reveal a variety of epitopes across the myocilin polypeptide chain. A more precise understanding of myocilin antibody targets, including conformational specificity, should aid the community in standardizing protocols across laboratories and in turn, lead to a better understanding of eye physiology and disease. Copyright © 2018 Elsevier Ltd. All rights reserved.
The biomarker-based diagnosis of Alzheimer's disease. 1-ethical and societal issues.
Porteri, Corinna; Albanese, Emiliano; Scerri, Charles; Carrillo, Maria C; Snyder, Heather M; Martensson, Birgitta; Baker, Mark; Giacobini, Ezio; Boccardi, Marina; Winblad, Bengt; Frisoni, Giovanni B; Hurst, Samia
2017-04-01
There is great interest in the use of biomarkers to assist in the timely identification of Alzheimer's disease (AD) in individuals with mild symptoms. However, the inclusion of AD biomarkers in clinical criteria poses socioethical challenges. The Geneva Task Force for the Roadmap of Alzheimer's Biomarkers was established to deliver a systematic strategic research agenda (aka roadmap) to promote efficient and effective validation of AD biomarkers and to foster their uptake in clinical practice. In this article, we summarize the workshop discussion of the Geneva Task Force "ethical and societal issues" working group, which comprised bioethicists, clinicians, health economists, and representatives of those affected by AD. The working group identified the following key issues that need to be included in the roadmap: improving access to services through timely diagnosis, the need for a diagnostic research protocol before moving to clinical routine, recruitment in diagnostic research protocols in the absence of effective therapy, respect for the autonomy of the individual with mild cognitive impairment in information and consent process and the right not to know biomarkers results, need for counseling programs, disclosure of the diagnosis in a structured environment and the involvement of family members, health policies including the individuals' views and the protection of their interests, and the economic costs for society. Copyright © 2016 Elsevier Inc. All rights reserved.
Fractional optical cryptographic protocol for data containers in a noise-free multiuser environment
NASA Astrophysics Data System (ADS)
Jaramillo, Alexis; Barrera, John Fredy; Zea, Alejandro Vélez; Torroba, Roberto
2018-03-01
Optical encryption systems have great potential for flexible and high-performance data protection, making them an area of rapid development. However, most approaches present two main issues, namely the presence of speckle noise and the degree of security they offer. Here we introduce an experimental implementation of an optical encryption protocol that tackles these issues by taking advantage of recent developments in the field. These developments include the introduction of information containers for noise-free information retrieval, the use of multiplexing to allow for a multiple-user environment, and an architecture based on the joint fractional Fourier transform that allows increased degrees of freedom and simplifies the experimental requirements. Thus, data handling via QR code containers involving multiple users processed in a fractional joint transform correlator produces coded information with increased security and ease of use. In this way, we can guarantee that only the user with the correct combination of encryption key and security parameters can retrieve noise-free information after deciphering. We analyze the performance of the system when the order of the fractional Fourier transform is changed during decryption. We show experimental results that confirm the validity of our proposal.
Validation protocols for blood pressure-measuring devices: status quo and development needs.
Beime, Beate; Deutsch, Cornelia; Gomez, Timothy; Zwingers, Thomas; Mengden, Thomas; Bramlage, Peter
2016-02-01
Hypertension is a major risk factor for cardiovascular morbidity and mortality. Blood pressure self-measuring devices have therefore emerged as valuable tools in patient care, and the accuracy of these instruments is of fundamental importance. For this reason, several validation procedures for assessing the efficacy of blood pressure monitoring devices have been developed, including protocols by the Association for the Advancement of Medical Instrumentation, the British Hypertension Society, the German Hypertension League (Prüfsiegelprotokoll), and the International Protocol of the Working Group on Blood Pressure Monitoring of the European Society of Hypertension. In the past, most of the protocols have been reviewed and modified because of experience gained during the validation studies carried out. However, each shows distinct differences, that is, in the number and characteristics of patients required, the blood pressure ranges covered, and the length of the validation procedure, which may result in unique advantages and/or limitations associated with their use. The continued standardization and evolution of these guidelines is essential to ensure the efficacy of blood pressure-measuring devices marketed for clinical and home use. Here, we aimed to compare four currently used validation protocols and to initiate a discussion on potential future improvements.
NASA Astrophysics Data System (ADS)
Vielhauer, Claus; Croce Ferri, Lucilla
2003-06-01
Our paper addresses two issues of a previously presented biometric authentication algorithm for ID cardholders, namely the security of the embedded reference data and the aging of the biometric data. We describe a protocol that allows two levels of verification, combining a biometric hash technique based on handwritten signatures and hologram watermarks with cryptographic signatures in a verification infrastructure. This infrastructure consists of a Trusted Central Public Authority (TCPA), which serves numerous Enrollment Stations (ES) in a secure environment. Each individual performs an enrollment at an ES, which provides the TCPA with the full biometric reference data and a document hash. The TCPA then calculates the authentication record (AR) from the biometric hash, a validity timestamp, and the document hash provided by the ES. The AR is then signed with a cryptographic signature function, initialized with the TCPA's private key, and embedded in the ID card as a watermark. Authentication is performed at Verification Stations (VS), where the ID card is scanned and the signed AR is retrieved from the watermark. Due to the timestamp mechanism and a two-level biometric verification technique based on offline and online features, the AR can deal with the aging of the biometric feature by forcing a re-enrollment of the user after expiry, making use of the ES infrastructure. We describe some attack scenarios and illustrate the watermark embedding, retrieval and dispute protocols, analyzing their prerequisites, advantages and disadvantages in relation to the security requirements.
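The AR construction and verification described above can be sketched in a few lines. The example below uses Ed25519 from the `cryptography` package as a stand-in signature scheme; the field layout, names, and validity period are illustrative assumptions, and the watermark embedding step itself is omitted.

```python
# Sketch: build and verify a signed authentication record (AR) consisting of
# a biometric hash, a validity timestamp, and a document hash.
import hashlib
import json
import time
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

tcpa_key = Ed25519PrivateKey.generate()   # held by the central authority (TCPA)
tcpa_public = tcpa_key.public_key()       # distributed to verification stations

def build_record(biometric_hash, document_hash, validity_days=365 * 5):
    record = {
        "biometric_hash": biometric_hash,
        "document_hash": document_hash,
        "expires": int(time.time()) + validity_days * 86400,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    return record, tcpa_key.sign(payload)

def verify_record(record, signature):
    payload = json.dumps(record, sort_keys=True).encode()
    try:
        tcpa_public.verify(signature, payload)
    except InvalidSignature:
        return False
    return time.time() < record["expires"]  # expired records force re-enrollment

bio_hash = hashlib.sha256(b"handwritten-signature-features").hexdigest()
doc_hash = hashlib.sha256(b"id-card-document-data").hexdigest()
record, signature = build_record(bio_hash, doc_hash)
print(verify_record(record, signature))  # True
```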
A Unified Fault-Tolerance Protocol
NASA Technical Reports Server (NTRS)
Miner, Paul; Geser, Alfons; Pike, Lee; Maddalon, Jeffrey
2004-01-01
Davies and Wakerly show that Byzantine fault tolerance can be achieved by a cascade of broadcasts and middle-value-select functions. We present an extension of the Davies and Wakerly protocol, the unified protocol, and its proof of correctness. We prove that it satisfies validity and agreement properties for communication of exact values. We then introduce bounded communication error into the model. Inexact communication is inherent to clock synchronization protocols. We prove that the validity and agreement properties hold for inexact communication, and that exact communication is a special case. As a running example, we illustrate the unified protocol using the SPIDER family of fault-tolerant architectures. In particular, we demonstrate that the SPIDER interactive consistency, distributed diagnosis, and clock synchronization protocols are instances of the unified protocol.
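The middle-value-select building block mentioned above is simple enough to sketch. The voter below is an illustration of the general idea, not the SPIDER or unified-protocol implementation.

```python
# Sketch of a middle-value-select voter used in Davies–Wakerly-style
# fault masking: each stage broadcasts values and selects the middle one.
def middle_value_select(values):
    """Return the middle of the received values (midpoint of the two middle
    values when the count is even). As long as fewer than half of the values
    are faulty, the result stays within the range of the correct values."""
    ordered = sorted(values)
    n = len(ordered)
    if n % 2:
        return ordered[n // 2]
    return (ordered[n // 2 - 1] + ordered[n // 2]) / 2

# Example: one faulty node reports a wildly wrong clock reading.
print(middle_value_select([100.1, 100.3, 99.9, 500.0]))  # 100.2
```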
A new method to assess the deformations of internal organs of the abdomen during impact.
Helfenstein-Didier, Clémentine; Rongiéras, Frédéric; Gennisson, Jean-Luc; Tanter, Mickaël; Beillas, Philippe
2016-11-16
Due to the limitations of classic imaging approaches, the internal response of abdominal organs is difficult to observe during an impact. Within the context of impact biomechanics for the protection of transport occupants, this could be an issue for human model validation and injury prediction. In the current study, a previously developed technique (ultrafast ultrasound imaging) was used as the basis to develop a protocol to observe the internal response of abdominal organs in situ at high imaging rates. The protocol was applied to 3 postmortem human surrogates to observe the liver and the colon during impacts delivered to the abdomen. The results show the sensitivity of liver motion to the impact location. Compression of the colon was also quantified and compared to the abdominal compression. These results illustrate the feasibility of the approach. Further tests and comparisons with simulations are under preparation.
A slotted access control protocol for metropolitan WDM ring networks
NASA Astrophysics Data System (ADS)
Baziana, P. A.; Pountourakis, I. E.
2009-03-01
In this study we focus on the serious scalability problems that many access protocols for WDM ring networks introduce due to the use of a dedicated wavelength per access node for either transmission or reception. We propose an efficient slotted MAC protocol suitable for WDM ring metropolitan area networks. The proposed network architecture employs a separate wavelength for control information exchange prior to the data packet transmission. Each access node is equipped with a pair of tunable transceivers for data communication and a pair of fixed-tuned transceivers for control information exchange. Also, each access node includes a set of fixed delay lines for synchronization purposes, to hold the data packets while the control information is processed. An efficient access algorithm is applied to avoid both data wavelength and receiver collisions. In our protocol, each access node is capable of transmitting and receiving over any of the data wavelengths, addressing the scalability issues. Two different slot reuse schemes are considered: the source and the destination stripping schemes. For both schemes, performance measures are evaluated via an analytic model. The analytical results are validated by a discrete-event simulation model that uses Poisson traffic sources. Simulation results show that the proposed protocol achieves efficient bandwidth utilization, especially under high load. Also, comparative simulation results show that our protocol achieves significant performance improvement compared with other WDMA protocols that restrict transmission to a dedicated data wavelength. Finally, performance is explored for various buffer sizes and numbers of access nodes and data wavelengths.
A Lightweight Continuous Authentication Protocol for the Internet of Things
Chuang, Yo-Hsuan; Yang, Cheng-Ying; Tang, Ssu-Wei
2018-01-01
Modern societies are moving toward an information-oriented environment. To gather and utilize information in people's modern lives, tiny devices with all kinds of sensors and gateways of various sizes need to be deployed and connected with each other through the Internet or proxy-based wireless sensor networks (WSNs). Within this kind of Internet of Things (IoT) environment, how two communicating devices authenticate each other is a fundamental security issue. As many IoT devices are battery powered and need to transmit sensed data periodically, it is necessary for IoT devices to adopt a lightweight authentication protocol to reduce their energy consumption when a device wants to authenticate and transmit data to its targeted peer. In this paper, a lightweight continuous authentication protocol for sensing devices and gateway devices in general IoT environments is introduced. The concept of a valid authentication time period is proposed to enhance the robustness of authentication between IoT devices. To construct the proposed lightweight continuous authentication protocol, a token technique and dynamic features of IoT devices are adopted in order to reach the design goals: reduction of the time consumed by consecutive authentications, and energy saving for authenticating devices by reducing the computational complexity of session establishment for continuous authentication. A security analysis is conducted to evaluate the security strength of the proposed protocol. In addition, performance analysis shows that the proposed protocol is a strong competitor among existing protocols for device-to-device authentication in IoT environments. PMID:29621168
Dandanell, Sune; Præst, Charlotte Boslev; Søndergård, Stine Dam; Skovborg, Camilla; Dela, Flemming; Larsen, Steen; Helge, Jørn Wulff
2017-04-01
Maximal fat oxidation (MFO) and the exercise intensity that elicits MFO (FatMax) are commonly determined by indirect calorimetry during graded exercise tests in both obese and normal-weight individuals. However, no protocol has been validated in individuals with obesity. Thus, the aims were to develop a graded exercise protocol for determination of FatMax in individuals with obesity, and to test its validity and inter-method reliability. Fat oxidation was assessed over a range of exercise intensities in 16 individuals (age: 28 (26-29) years; body mass index: 36 (35-38) kg·m⁻²; 95% confidence interval) on a cycle ergometer. The graded exercise protocol was validated against a short continuous exercise (SCE) protocol, in which FatMax was determined from fat oxidation at rest and during 10 min of continuous exercise at 35%, 50%, and 65% of maximal oxygen uptake. Intraclass and Pearson correlation coefficients between the protocols were 0.75 and 0.72, and the within-subject coefficient of variation (CV) was 5 (3-7)%. A Bland-Altman plot revealed a bias of -3% points of maximal oxygen uptake (limits of agreement: -12 to 7). A tendency towards a systematic difference (p = 0.06) was observed, where FatMax occurred at 42 (40-44)% and 45 (43-47)% of maximal oxygen uptake with the graded and the SCE protocol, respectively. In conclusion, there was a high to excellent correlation and a low CV between the 2 protocols, suggesting that the graded exercise protocol has high inter-method reliability. However, considerable intra-individual variation and a trend towards a systematic difference between the protocols reveal that further optimization of the graded exercise protocol is needed to improve validity.
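The abstract does not state which indirect-calorimetry equations were used; the sketch below assumes the widely cited Frayn (1983) stoichiometric equation to convert VO2/VCO2 into a fat oxidation rate and to pick FatMax from graded-test stages, using hypothetical data.

```python
def fat_oxidation_g_per_min(vo2_l_min: float, vco2_l_min: float) -> float:
    """Whole-body fat oxidation from indirect calorimetry.

    Uses the widely cited Frayn (1983) stoichiometric equation (an assumption
    here; the study does not state which equation was applied):
        fat (g/min) = 1.67 * VO2 - 1.67 * VCO2     [VO2, VCO2 in L/min]
    """
    return 1.67 * vo2_l_min - 1.67 * vco2_l_min

def fat_max(stages):
    """Return the stage (%VO2max) with the highest fat oxidation rate.

    `stages` is a list of (percent_vo2max, vo2_l_min, vco2_l_min) tuples
    taken from the graded test; the format is illustrative only.
    """
    best = max(stages, key=lambda s: fat_oxidation_g_per_min(s[1], s[2]))
    return best[0], fat_oxidation_g_per_min(best[1], best[2])

if __name__ == "__main__":
    # hypothetical graded-test data
    stages = [(35, 1.2, 1.06), (50, 1.7, 1.54), (65, 2.2, 2.12)]
    intensity, mfo = fat_max(stages)
    print(f"FatMax ~ {intensity}% VO2max, MFO ~ {mfo:.2f} g/min")
```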
Martín-Sabroso, Cristina; Tavares-Fernandes, Daniel Filipe; Espada-García, Juan Ignacio; Torres-Suárez, Ana Isabel
2013-12-15
In this work a protocol is developed to validate analytical procedures for the quantification of drug substances formulated in polymeric systems, covering both drug entrapped in the polymeric matrix (assay:content test) and drug released from the systems (assay:dissolution test). This protocol is applied to the validation of two isocratic HPLC analytical procedures for the analysis of dexamethasone phosphate disodium microparticles for parenteral administration. Preparation of authentic samples and artificially "spiked" and "unspiked" samples is described. Specificity (the ability to quantify dexamethasone phosphate disodium in the presence of constituents of the dissolution medium and other microparticle constituents), linearity, accuracy and precision are evaluated, in the range from 10 to 50 μg mL(-1) in the assay:content test procedure and from 0.25 to 10 μg mL(-1) in the assay:dissolution test procedure. The robustness of the analytical method to extract drug from microparticles is also assessed. The validation protocol developed allows us to conclude that both analytical methods are suitable for their intended purpose, but the lack of proportionality of the assay:dissolution analytical method should be taken into account. The validation protocol designed in this work could be applied to the validation of any analytical procedure for the quantification of drugs formulated in controlled release polymeric microparticles. Copyright © 2013 Elsevier B.V. All rights reserved.
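A minimal sketch of the validation arithmetic (linearity as a least-squares fit, accuracy as percent recovery, precision as relative standard deviation) is given below with hypothetical peak-area data; it requires Python 3.10+ for statistics.linear_regression and is not the study's actual computation.

```python
from statistics import linear_regression, mean, stdev

def linearity(concentrations, responses):
    """Least-squares fit of detector response vs. concentration."""
    slope, intercept = linear_regression(concentrations, responses)
    return slope, intercept

def recovery_percent(measured, nominal):
    """Accuracy expressed as percent recovery of a spiked sample."""
    return 100.0 * measured / nominal

def precision_rsd(replicates):
    """Precision as relative standard deviation (%) of replicate assays."""
    return 100.0 * stdev(replicates) / mean(replicates)

if __name__ == "__main__":
    conc = [10, 20, 30, 40, 50]                  # μg/mL, assay:content range
    resp = [101, 203, 298, 405, 500]             # hypothetical peak areas
    print("slope/intercept:", linearity(conc, resp))
    print("recovery %:", round(recovery_percent(29.4, 30.0), 1))
    print("RSD %:", round(precision_rsd([29.4, 29.9, 30.2, 29.7, 30.1]), 2))
```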
Computation of transmitted and received B1 fields in magnetic resonance imaging.
Milles, Julien; Zhu, Yue Min; Chen, Nan-Kuei; Panych, Lawrence P; Gimenez, Gérard; Guttmann, Charles R G
2006-05-01
Computation of B1 fields is a key issue for determination and correction of intensity nonuniformity in magnetic resonance images. This paper presents a new method for computing transmitted and received B1 fields. Our method combines a modified MRI acquisition protocol and an estimation technique based on the Levenberg-Marquardt algorithm and spatial filtering. It enables accurate estimation of transmitted and received B1 fields for both homogeneous and heterogeneous objects. The method is validated using numerical simulations and experimental data from phantom and human scans. The experimental results are in agreement with theoretical expectations.
Space human factors publications: 1980-1990
NASA Technical Reports Server (NTRS)
Dickson, Katherine J.
1991-01-01
A 10-year cumulative bibliography of publications resulting from research supported by the NASA Space Human Factors Program of the Life Sciences Division is provided. The goal of this program is to understand the basic mechanisms underlying behavioral adaptation to space and to develop and validate system design requirements, protocols, and countermeasures to ensure the psychological well-being, safety, and productivity of crewmembers. Subjects encompassed by this bibliography include selection and training, group dynamics, psychophysiological interactions, habitability issues, human-machine interactions, psychological support measures, and anthropometric data. Principal Investigators whose research tasks resulted in publication are identified by an asterisk.
Allen, K A; Bredero, B; Van Damme, T; Ulrich, D A; Simons, J
2017-03-01
The validity and reliability of the Test of Gross Motor Development-3 (TGMD-3) were measured, taking into consideration the preference for visual learning of children with autism spectrum disorder (ASD). The TGMD-3 was administered to 14 children with ASD (4-10 years) and 21 age-matched typically developing children under two conditions: the TGMD-3 traditional protocol and the TGMD-3 visual support protocol. Excellent levels of internal consistency, test-retest, interrater and intrarater reliability were achieved for the TGMD-3 visual support protocol. TGMD-3 raw scores of children with ASD were significantly lower than those of typically developing peers; however, they improved significantly using the TGMD-3 visual support protocol. This demonstrates that the TGMD-3 visual support protocol is a valid and reliable assessment of gross motor performance for children with ASD.
Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E
2016-01-01
A novel Protocol Ethics Tool Kit (‘Ethics Tool Kit’) has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. PMID:26811365
Validity of the modified back-saver sit-and-reach test: a comparison with other protocols.
Hui, S S; Yuen, P Y
2000-09-01
Studies have shown that the classical sit-and-reach (CSR) test, the modified sit-and-reach (MSR), and the newly developed back-saver sit-and-reach (BS) test have poor criterion-related validity in estimating low-back flexibility but moderate criterion-related validity for hamstring flexibility. The V sit-and-reach (VSR) test was found to be practical, but its validity has not been established. The purpose of this study was to propose a modified back-saver sit-and-reach (MBS) test, which incorporates the advantages of the various protocols, and to compare the criterion-related validity and reliability of all these tests. 158 college students (96 women, 62 men; age = 20.77 +/- 2.51 years) performed the CSR, VSR, BS (left and right leg), and MBS (left and right leg) tests in a randomized order. Scores from each test were then correlated with the criterion measures. For all sit-and-reach tests, intraclass reliability (single trial) was very high (r = 0.89-0.98). The MBS yielded the highest significant correlations with the low-back and hamstring criteria for men (r = 0.47-0.67) and women (r = 0.23-0.54). The low-back and right hamstring validity of the MBS for men was significantly (P < 0.01) higher than that of the BS and CSR, whereas no differences in criterion-related validity were found between the MBS and other protocols in women. Ratings of perceived comfort among the sit-and-reach protocols differed significantly (P < 0.001), and the MBS was rated the most comfortable test compared with the other protocols. The MBS test is not only a reliable test for hamstring and low-back flexibility, it is also more practical, with improved validity for hamstring and low-back flexibility in men, than previous protocols.
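Criterion-related validity in this study is reported as a Pearson correlation against the criterion flexibility measure; a minimal sketch with hypothetical scores (Python 3.10+) is shown below.

```python
from statistics import correlation

# hypothetical scores: MBS test (cm) vs. a criterion hamstring flexibility measure
mbs_scores = [24.0, 30.5, 18.2, 27.0, 33.1, 21.4, 29.8, 25.6]
criterion  = [78.0, 92.0, 65.0, 84.0, 97.0, 70.0, 90.0, 80.0]

# criterion-related validity expressed as Pearson's r
r = correlation(mbs_scores, criterion)
print(f"criterion-related validity r = {r:.2f}")
```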
Alternating-Offers Protocol for Multi-issue Bilateral Negotiation in Semantic-Enabled Marketplaces
NASA Astrophysics Data System (ADS)
Ragone, Azzurra; di Noia, Tommaso; di Sciascio, Eugenio; Donini, Francesco M.
We present a semantic-based approach to multi-issue bilateral negotiation for e-commerce. We use Description Logics to model advertisements, and relations among issues as axioms in a TBox. We then introduce a logic-based alternating-offers protocol, able to handle conflicting information, that merges non-standard reasoning services in Description Logics with utility theory to find the most suitable agreements. We illustrate and motivate the theoretical framework, the logical language, and the negotiation protocol.
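The following toy loop sketches only the alternating-offers pattern, replacing the Description Logic reasoning and the paper's utility computation with hand-written utility functions and a reservation threshold; all offers, weights, and acceptance rules are hypothetical.

```python
def alternating_offers(offers_a, offers_b, utility_a, utility_b, reserve=0.5):
    """Toy alternating-offers loop over discrete candidate agreements.

    Each agent proposes its next-best offer in turn, and the opponent accepts
    as soon as the proposed agreement meets its reservation utility.
    """
    proposals = []
    for rnd in range(max(len(offers_a), len(offers_b))):
        for proposer, offers, other_utility in (
                ("A", offers_a, utility_b), ("B", offers_b, utility_a)):
            if rnd < len(offers):
                offer = offers[rnd]
                proposals.append((proposer, offer))
                if other_utility(offer) >= reserve:
                    return offer, proposals
    return None, proposals          # disagreement: no offer was acceptable

if __name__ == "__main__":
    # offers are bundles of issue assignments, ranked best-first per agent
    offers_a = [{"price": 100, "warranty": 1}, {"price": 90, "warranty": 2}]
    offers_b = [{"price": 70, "warranty": 3}, {"price": 85, "warranty": 2}]
    u_a = lambda o: 1 - (100 - o["price"]) / 50 * 0.5 + 0.1 * o["warranty"]
    u_b = lambda o: (100 - o["price"]) / 50 + 0.1 * o["warranty"]
    print(alternating_offers(offers_a, offers_b, u_a, u_b))
```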
Stergiou, George S; Alpert, Bruce; Mieke, Stephan; Asmar, Roland; Atkins, Neil; Eckert, Siegfried; Frick, Gerhard; Friedman, Bruce; Graßl, Thomas; Ichikawa, Tsutomu; Ioannidis, John P; Lacy, Peter; McManus, Richard; Murray, Alan; Myers, Martin; Palatini, Paolo; Parati, Gianfranco; Quinn, David; Sarkis, Josh; Shennan, Andrew; Usuda, Takashi; Wang, Jiguang; Wu, Colin O; O'Brien, Eoin
2018-03-01
In the last 30 years, several organizations, such as the US Association for the Advancement of Medical Instrumentation (AAMI), the British Hypertension Society, the European Society of Hypertension (ESH) Working Group on Blood Pressure (BP) Monitoring and the International Organization for Standardization (ISO) have developed protocols for clinical validation of BP measuring devices. However, it is recognized that science, as well as patients, consumers and manufacturers would be best served if all BP measuring devices were assessed for accuracy according to an agreed single validation protocol that had global acceptance. Therefore, an international initiative was taken by AAMI, ESH and ISO experts who agreed to develop a universal standard for device validation. This statement presents the key aspects of a validation procedure, which were agreed by the AAMI, ESH and ISO representatives as the basis for a single universal validation protocol. As soon as the AAMI/ESH/ISO standard is fully developed, this will be regarded as the single universal standard and will replace all other previous standards/protocols.
Stergiou, George S; Alpert, Bruce; Mieke, Stephan; Asmar, Roland; Atkins, Neil; Eckert, Siegfried; Frick, Gerhard; Friedman, Bruce; Graßl, Thomas; Ichikawa, Tsutomu; Ioannidis, John P; Lacy, Peter; McManus, Richard; Murray, Alan; Myers, Martin; Palatini, Paolo; Parati, Gianfranco; Quinn, David; Sarkis, Josh; Shennan, Andrew; Usuda, Takashi; Wang, Jiguang; Wu, Colin O; O'Brien, Eoin
2018-03-01
In the past 30 years, several organizations, such as the US Association for the Advancement of Medical Instrumentation (AAMI), the British Hypertension Society, the European Society of Hypertension (ESH) Working Group on Blood Pressure (BP) Monitoring, and the International Organization for Standardization (ISO), have developed protocols for clinical validation of BP measuring devices. However, it is recognized that science, as well as patients, consumers, and manufacturers, would be best served if all BP measuring devices were assessed for accuracy according to an agreed single validation protocol that had global acceptance. Therefore, an international initiative was taken by the AAMI, ESH, and ISO experts who agreed to develop a universal standard for device validation. This statement presents the key aspects of a validation procedure, which were agreed by the AAMI, ESH, and ISO representatives as the basis for a single universal validation protocol. As soon as the AAMI/ESH/ISO standard is fully developed, this will be regarded as the single universal standard and will replace all other previous standards/protocols. © 2018 American Heart Association, Inc., and Wolters Kluwer Health, Inc.
NASA Astrophysics Data System (ADS)
Guillevic, P. C.; Nickeson, J. E.; Roman, M. O.; camacho De Coca, F.; Wang, Z.; Schaepman-Strub, G.
2016-12-01
The Global Climate Observing System (GCOS) has specified the need to systematically produce and validate Essential Climate Variables (ECVs). The Committee on Earth Observation Satellites (CEOS) Working Group on Calibration and Validation (WGCV), and in particular its subgroup on Land Product Validation (LPV), is playing a key coordination role, leveraging the international expertise required to address actions related to the validation of global land ECVs. The primary objective of the LPV subgroup is to set standards for validation methods and reporting in order to provide traceable and reliable uncertainty estimates for scientists and stakeholders. The subgroup comprises 9 focus areas that encompass 10 land surface variables. The activities of each focus area are coordinated by two international co-leads and currently include leaf area index (LAI) and fraction of absorbed photosynthetically active radiation (FAPAR), vegetation phenology, surface albedo, fire disturbance, snow cover, land cover and land use change, soil moisture, land surface temperature (LST) and emissivity. Recent additions to the focus areas include vegetation indices and biomass. The development of best practice validation protocols is a core activity of CEOS LPV, with the objective of standardizing the evaluation of land surface products. LPV has identified four validation levels corresponding to increasing spatial and temporal representativeness of the reference samples used to perform validation. Best practice validation protocols (1) provide the definition of variables, ancillary information and uncertainty metrics, (2) describe available data sources and methods to establish reference validation datasets with SI traceability, and (3) describe evaluation methods and reporting. An overview of validation best practice components will be presented based on the LAI and LST protocol efforts to date.
Development and validation of a remote home safety protocol.
Romero, Sergio; Lee, Mi Jung; Simic, Ivana; Levy, Charles; Sanford, Jon
2018-02-01
Environmental assessments and subsequent modifications conducted by healthcare professionals can enhance home safety and promote independent living. However, travel time, expense and the availability of qualified professionals can limit the broad application of this intervention. Remote technology has the potential to increase access to home safety evaluations. This study describes the development and validation of a remote home safety protocol that can be used by a caregiver of an elderly person to video-record their home environment for later viewing and evaluation by a trained professional. The protocol was developed based on literature reviews and evaluations from clinical and content experts. Cognitive interviews were conducted with a group of six caregivers to validate the protocol. The final protocol included step-by-step directions to record indoor and outdoor areas of the home. The validation process resulted in modifications related to safety, clarity of the protocol, readability, visual appearance, technical descriptions and usability. Our final protocol includes detailed instructions that a caregiver should be able to follow to record a home environment for subsequent evaluation by a home safety professional. Implications for Rehabilitation: The results of this study have several implications for rehabilitation practice. The remote home safety evaluation protocol can potentially improve access to rehabilitation services for clients in remote areas and prevent unnecessary delays for needed care. Using our protocol, a patient's caregiver can partner with therapists to quickly and efficiently evaluate a patient's home before they are released from the hospital. Caregiver narration, which reflects a caregiver's own perspective, is critical to evaluating home safety. In-home safety evaluations, currently not available to all who need them due to access barriers, can enhance a patient's independence and provide a safer home environment.
Topouchian, Jirar; Agnoletti, Davide; Blacher, Jacques; Youssef, Ahmed; Ibanez, Isabel; Khabouth, Jose; Khawaja, Salwa; Beaino, Layale; Asmar, Roland
2011-01-01
Four oscillometric devices for self-measurement of blood pressure (SBPM) were evaluated according to the European Society of Hypertension (ESH) international protocol and its 2010 revision in four separate studies. The Omron® M2, Omron M3, and Omron M6 measure blood pressure (BP) at the brachial level, while the Omron R2 measures BP at the wrist level. The international protocol requires a total of 33 subjects in whom the validation is performed. The Omron M2 and Omron R2 were validated in 2009 according to the ESH international protocol, while the Omron M3 and Omron M6 were validated in 2010-2011 according to the 2010 ESH international protocol revision. The protocol procedures were followed precisely. All four tested devices passed the validation process. The mean differences between the device and mercury readings were 2.7 ± 5.0 and -1.4 ± 3.2 mmHg for systolic and diastolic BP, respectively, using the Omron M2 device, 1.7 ± 3.2 and -0.9 ± 2.6 mmHg using the Omron M3, 1.6 ± 2.9 and -0.9 ± 2.5 mmHg using the Omron M6, and -1.1 ± 4.8 and -0.9 ± 4.3 mmHg using the Omron R2. The numbers of readings from the Omron M2, Omron M3, Omron M6, and Omron R2 differing from the mercury readings by less than 5, 10, and 15 mmHg fulfill the requirements of the ESH international protocol and its 2010 revision. Therefore, each of these four devices can be used by patients for SBPM.
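The bookkeeping in such validations counts paired device-reference differences falling within the 5, 10, and 15 mmHg bands; a minimal sketch with hypothetical readings is shown below (the per-phase pass thresholds are defined in the ESH protocol itself and are not reproduced here).

```python
def esh_difference_counts(device, reference):
    """Count paired readings whose absolute difference from the reference
    falls within the 5, 10 and 15 mmHg bands used by the ESH protocol."""
    diffs = [abs(d - r) for d, r in zip(device, reference)]
    return {band: sum(diff <= band for diff in diffs) for band in (5, 10, 15)}

if __name__ == "__main__":
    # hypothetical paired systolic readings (device vs. mercury reference)
    device    = [121, 118, 135, 142, 110, 128, 150, 133]
    reference = [119, 121, 131, 148, 111, 126, 145, 139]
    print(esh_difference_counts(device, reference))
```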
An Authentication Protocol for Future Sensor Networks.
Bilal, Muhammad; Kang, Shin-Gak
2017-04-28
Authentication is one of the essential security services in Wireless Sensor Networks (WSNs) for ensuring secure data sessions. Sensor node authentication ensures the confidentiality and validity of data collected by the sensor node, whereas user authentication guarantees that only legitimate users can access the sensor data. In a mobile WSN, sensor and user nodes move across the network and exchange data with multiple nodes, thus experiencing the authentication process multiple times. The integration of WSNs with Internet of Things (IoT) brings forth a new kind of WSN architecture along with stricter security requirements; for instance, a sensor node or a user node may need to establish multiple concurrent secure data sessions. With concurrent data sessions, the frequency of the re-authentication process increases in proportion to the number of concurrent connections. Moreover, to establish multiple data sessions, it is essential that a protocol participant have the capability of running multiple instances of the protocol run, which makes the security issue even more challenging. The currently available authentication protocols were designed for the autonomous WSN and do not account for the above requirements. Hence, ensuring a lightweight and efficient authentication protocol has become more crucial. In this paper, we present a novel, lightweight and efficient key exchange and authentication protocol suite called the Secure Mobile Sensor Network (SMSN) Authentication Protocol. In the SMSN a mobile node goes through an initial authentication procedure and receives a re-authentication ticket from the base station. Later a mobile node can use this re-authentication ticket when establishing multiple data exchange sessions and/or when moving across the network. This scheme reduces the communication and computational complexity of the authentication process. We proved the strength of our protocol with rigorous security analysis (including formal analysis using the BAN-logic) and simulated the SMSN and previously proposed schemes in an automated protocol verifier tool. Finally, we compared the computational complexity and communication cost against well-known authentication protocols.
An Authentication Protocol for Future Sensor Networks
Bilal, Muhammad; Kang, Shin-Gak
2017-01-01
Authentication is one of the essential security services in Wireless Sensor Networks (WSNs) for ensuring secure data sessions. Sensor node authentication ensures the confidentiality and validity of data collected by the sensor node, whereas user authentication guarantees that only legitimate users can access the sensor data. In a mobile WSN, sensor and user nodes move across the network and exchange data with multiple nodes, thus experiencing the authentication process multiple times. The integration of WSNs with Internet of Things (IoT) brings forth a new kind of WSN architecture along with stricter security requirements; for instance, a sensor node or a user node may need to establish multiple concurrent secure data sessions. With concurrent data sessions, the frequency of the re-authentication process increases in proportion to the number of concurrent connections. Moreover, to establish multiple data sessions, it is essential that a protocol participant have the capability of running multiple instances of the protocol run, which makes the security issue even more challenging. The currently available authentication protocols were designed for the autonomous WSN and do not account for the above requirements. Hence, ensuring a lightweight and efficient authentication protocol has become more crucial. In this paper, we present a novel, lightweight and efficient key exchange and authentication protocol suite called the Secure Mobile Sensor Network (SMSN) Authentication Protocol. In the SMSN a mobile node goes through an initial authentication procedure and receives a re-authentication ticket from the base station. Later a mobile node can use this re-authentication ticket when establishing multiple data exchange sessions and/or when moving across the network. This scheme reduces the communication and computational complexity of the authentication process. We proved the strength of our protocol with rigorous security analysis (including formal analysis using the BAN-logic) and simulated the SMSN and previously proposed schemes in an automated protocol verifier tool. Finally, we compared the computational complexity and communication cost against well-known authentication protocols. PMID:28452937
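A minimal sketch of the re-authentication-ticket idea follows: the base station signs a node identifier and expiry with a shared key, and a later verifier only checks the MAC and the expiry instead of re-running the full handshake. The ticket format and key handling are assumptions, not the SMSN message formats.

```python
import hashlib, hmac, os, time

BS_KEY = os.urandom(32)          # base-station key, assumed pre-provisioned

def issue_reauth_ticket(node_id: str, lifetime_s: int = 3600) -> bytes:
    """Base station issues a ticket after the initial (full) authentication."""
    expiry = int(time.time()) + lifetime_s
    body = f"{node_id}|{expiry}".encode()
    tag = hmac.new(BS_KEY, body, hashlib.sha256).digest()
    return body + b"|" + tag.hex().encode()

def reauthenticate(ticket: bytes) -> bool:
    """A verifier sharing BS_KEY accepts the ticket with only a MAC check
    and an expiry test, avoiding the full authentication exchange."""
    try:
        node_id, expiry, tag_hex = ticket.decode().split("|")
    except ValueError:
        return False
    body = f"{node_id}|{expiry}".encode()
    expected = hmac.new(BS_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag_hex, expected) and int(expiry) > time.time()

if __name__ == "__main__":
    ticket = issue_reauth_ticket("mobile-node-7")
    print("fast re-authentication accepted:", reauthenticate(ticket))
```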
Bires, Angela Macci; Lawson, Dori; Wasser, Thomas E; Raber-Baer, Donna
2013-12-01
Clinically valid cardiac evaluation via treadmill stress testing requires patients to achieve specific target heart rates and to successfully complete the cardiac examination. A comparison of the standard Bruce protocol and the ramped Bruce protocol was performed using data collected over a 1-y period from a targeted patient population with a body mass index (BMI) equal to or greater than 30 to determine which treadmill protocol provided more successful examination results. The functional capacity, metabolic equivalent units achieved, pressure-rate product, and total time on the treadmill measured for the obese patients were clinically valid and comparable to those of normal-weight and overweight patients (P < 0.001). Data gathered from each protocol demonstrated that the ramped Bruce protocol achieved more consistent results across all BMI groups in bringing patients to 80%-85% of their age-predicted maximum heart rate. This study did not adequately establish that the ramped Bruce protocol was superior to the standard Bruce protocol for the examination of patients with a BMI equal to or greater than 30.
Advancing implementation science through measure development and evaluation: a study protocol.
Lewis, Cara C; Weiner, Bryan J; Stanick, Cameo; Fischer, Sarah M
2015-07-22
Significant gaps related to measurement issues are among the most critical barriers to advancing implementation science. Three issues motivated the study aims: (a) the lack of stakeholder involvement in defining pragmatic measure qualities; (b) the dearth of measures, particularly for implementation outcomes; and (c) unknown psychometric and pragmatic strength of existing measures. Aim 1: Establish a stakeholder-driven operationalization of pragmatic measures and develop reliable, valid rating criteria for assessing the construct. Aim 2: Develop reliable, valid, and pragmatic measures of three critical implementation outcomes, acceptability, appropriateness, and feasibility. Aim 3: Identify Consolidated Framework for Implementation Research and Implementation Outcome Framework-linked measures that demonstrate both psychometric and pragmatic strength. For Aim 1, we will conduct (a) interviews with stakeholder panelists (N = 7) and complete a literature review to populate pragmatic measure construct criteria, (b) Q-sort activities (N = 20) to clarify the internal structure of the definition, (c) Delphi activities (N = 20) to achieve consensus on the dimension priorities, (d) test-retest and inter-rater reliability assessments of the emergent rating system, and (e) known-groups validity testing of the top three prioritized pragmatic criteria. For Aim 2, our systematic development process involves domain delineation, item generation, substantive validity assessment, structural validity assessment, reliability assessment, and predictive validity assessment. We will also assess discriminant validity, known-groups validity, structural invariance, sensitivity to change, and other pragmatic features. For Aim 3, we will refine our established evidence-based assessment (EBA) criteria, extract the relevant data from the literature, rate each measure using the EBA criteria, and summarize the data. The study outputs of each aim are expected to have a positive impact as they will establish and guide a comprehensive measurement-focused research agenda for implementation science and provide empirically supported measures, tools, and methods for accomplishing this work.
Mahmood, Zahid; Ning, Huansheng; Ghafoor, AtaUllah
2017-03-24
Wireless Sensor Networks (WSNs) consist of lightweight devices that measure sensitive data and are highly vulnerable to security attacks due to their constrained resources. In a similar manner, the internet-based lightweight devices used in the Internet of Things (IoT) face severe security and privacy issues because their connection to the internet makes them directly accessible. Complex and resource-intensive security schemes are infeasible and reduce the network lifetime. In this regard, we have explored the polynomial distribution-based key establishment schemes and identified an issue: the resultant polynomial value is either storage intensive or infeasible to compute when large values are multiplied. It becomes more costly when these polynomials are regenerated dynamically after each node join or leave operation and whenever the key is refreshed. To reduce the computation, we have proposed an Efficient Key Management (EKM) scheme for multiparty communication-based scenarios. The proposed session key management protocol is established by applying a symmetric polynomial for group members, and the group head acts as the responsible node. The polynomial generation method uses security credentials and a secure hash function. Symmetric cryptographic parameters are efficient in terms of computation, communication, and the storage required. The security justification of the proposed scheme has been completed using Rubin logic, which guarantees that the protocol strongly attains mutual validation and session key agreement among the participating entities. Simulation scenarios are performed using NS 2.35 to validate the results for storage, communication, latency, energy, and polynomial calculation costs during the authentication, session key generation, node migration, secure joining, and leaving phases. EKM is efficient regarding storage, computation, and communication overhead and can protect WSN-based IoT infrastructure.
Mahmood, Zahid; Ning, Huansheng; Ghafoor, AtaUllah
2017-01-01
Wireless Sensor Networks (WSNs) consist of lightweight devices that measure sensitive data and are highly vulnerable to security attacks due to their constrained resources. In a similar manner, the internet-based lightweight devices used in the Internet of Things (IoT) face severe security and privacy issues because their connection to the internet makes them directly accessible. Complex and resource-intensive security schemes are infeasible and reduce the network lifetime. In this regard, we have explored the polynomial distribution-based key establishment schemes and identified an issue: the resultant polynomial value is either storage intensive or infeasible to compute when large values are multiplied. It becomes more costly when these polynomials are regenerated dynamically after each node join or leave operation and whenever the key is refreshed. To reduce the computation, we have proposed an Efficient Key Management (EKM) scheme for multiparty communication-based scenarios. The proposed session key management protocol is established by applying a symmetric polynomial for group members, and the group head acts as the responsible node. The polynomial generation method uses security credentials and a secure hash function. Symmetric cryptographic parameters are efficient in terms of computation, communication, and the storage required. The security justification of the proposed scheme has been completed using Rubin logic, which guarantees that the protocol strongly attains mutual validation and session key agreement among the participating entities. Simulation scenarios are performed using NS 2.35 to validate the results for storage, communication, latency, energy, and polynomial calculation costs during the authentication, session key generation, node migration, secure joining, and leaving phases. EKM is efficient regarding storage, computation, and communication overhead and can protect WSN-based IoT infrastructure. PMID:28338632
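EKM builds on symmetric polynomials for key establishment. The sketch below shows the classical symmetric bivariate polynomial idea that such schemes build on (a node's univariate share evaluated at a peer's identifier yields a common pairwise key), with toy coefficients and a toy modulus; it is not EKM's exact group construction.

```python
Q = 2_147_483_647   # a prime modulus (toy size; real deployments use far larger values)

# symmetric bivariate polynomial f(x, y) = sum a[i][j] * x^i * y^j with a[i][j] == a[j][i]
COEFFS = [[7, 11, 3],
          [11, 5, 9],
          [3, 9, 13]]

def share(node_id: int):
    """Key server gives node `node_id` the univariate share g(y) = f(node_id, y)."""
    return [sum(COEFFS[i][j] * pow(node_id, i, Q) for i in range(3)) % Q
            for j in range(3)]

def pairwise_key(my_share, peer_id: int) -> int:
    """A node evaluates its share at the peer's id; symmetry gives both the same key."""
    return sum(my_share[j] * pow(peer_id, j, Q) for j in range(3)) % Q

if __name__ == "__main__":
    a, b = 17, 42                       # hypothetical node identifiers
    key_ab = pairwise_key(share(a), b)  # computed at node a
    key_ba = pairwise_key(share(b), a)  # computed at node b
    print(key_ab == key_ba, key_ab)     # True: f(a, b) == f(b, a)
```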
A Protocol for Advanced Psychometric Assessment of Surveys
Squires, Janet E.; Hayduk, Leslie; Hutchinson, Alison M.; Cranley, Lisa A.; Gierl, Mark; Cummings, Greta G.; Norton, Peter G.; Estabrooks, Carole A.
2013-01-01
Background and Purpose. In this paper, we present a protocol for advanced psychometric assessments of surveys based on the Standards for Educational and Psychological Testing. We use the Alberta Context Tool (ACT) as an exemplar survey to which this protocol can be applied. Methods. Data mapping, acceptability, reliability, and validity are addressed. Acceptability is assessed with missing data frequencies and the time required to complete the survey. Reliability is assessed with internal consistency coefficients and information functions. A unitary approach to validity consisting of accumulating evidence based on instrument content, response processes, internal structure, and relations to other variables is taken. We also address assessing performance of survey data when aggregated to higher levels (e.g., nursing unit). Discussion. In this paper we present a protocol for advanced psychometric assessment of survey data using the Alberta Context Tool (ACT) as an exemplar survey; application of the protocol to the ACT survey is underway. Psychometric assessment of any survey is essential to obtaining reliable and valid research findings. This protocol can be adapted for use with any nursing survey. PMID:23401759
ERIC Educational Resources Information Center
Williams, Harriet G.; Pfeiffer, Karin A.; Dowda, Marsha; Jeter, Chevy; Jones, Shaverra; Pate, Russell R.
2009-01-01
The purpose of this study was to develop a valid and reliable tool for use in assessing motor skills in preschool children in field-based settings. The development of the Children's Activity and Movement in Preschool Study Motor Skills Protocol included evidence of its reliability and validity for use in field-based environments as part of large…
Statistical Methods and Tools for Uxo Characterization (SERDP Final Technical Report)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pulsipher, Brent A.; Gilbert, Richard O.; Wilson, John E.
2004-11-15
The Strategic Environmental Research and Development Program (SERDP) issued a statement of need for FY01 titled Statistical Sampling for Unexploded Ordnance (UXO) Site Characterization that solicited proposals to develop statistically valid sampling protocols for cost-effective, practical, and reliable investigation of sites contaminated with UXO; protocols that could be validated through subsequent field demonstrations. The SERDP goal was the development of a sampling strategy for which a fraction of the site is initially surveyed by geophysical detectors to confidently identify clean areas and subsections (target areas, TAs) that had elevated densities of anomalous geophysical detector readings that could indicate the presence of UXO. More detailed surveys could then be conducted to search the identified TAs for UXO. SERDP funded three projects: those proposed by the Pacific Northwest National Laboratory (PNNL) (SERDP Project No. UXO 1199), Sandia National Laboratory (SNL), and Oak Ridge National Laboratory (ORNL). The projects were closely coordinated to minimize duplication of effort and facilitate use of shared algorithms where feasible. This final report for PNNL Project 1199 describes the methods developed by PNNL to address SERDP's statement-of-need for the development of statistically-based geophysical survey methods for sites where 100% surveys are unattainable or cost prohibitive.
Online Error Reporting for Managing Quality Control Within Radiology.
Golnari, Pedram; Forsberg, Daniel; Rosipko, Beverly; Sunshine, Jeffrey L
2016-06-01
Information technology systems within health care, such as the picture archiving and communication system (PACS) in radiology, can have a positive impact on production but can also risk compromising quality. The widespread use of PACS has removed the previous feedback loop between radiologists and technologists. Instead of direct communication of quality discrepancies found for an examination, the radiologist submitted a paper-based quality-control report. A web-based issue-reporting tool can help restore some of the feedback loop and also provide possibilities for more detailed analysis of submitted errors. The purpose of this study was to evaluate the hypothesis that data from the use of online error reporting software for quality control can focus our efforts within our department. For the 372,258 radiologic examinations conducted during the 6-month study period, 930 errors (390 exam protocol, 390 exam validation, and 150 exam technique) were submitted, corresponding to an error rate of 0.25%. Within the exam protocol category, technologist documentation had the highest number of submitted errors in ultrasonography (77 errors [44%]), while imaging protocol errors were the highest error subtype for the computed tomography modality (35 errors [18%]). Positioning and incorrect accession had the highest errors in the exam technique and exam validation error categories, respectively, for nearly all of the modalities. An error rate of less than 1% could signify a system with very high quality; however, a more likely explanation is that not all errors were detected or reported. Furthermore, staff reception of the error reporting system could also affect the reporting rate.
Murphy, Malia S Q; Hawken, Steven; Atkinson, Katherine M; Milburn, Jennifer; Pervin, Jesmin; Gravett, Courtney; Stringer, Jeffrey S A; Rahman, Anisur; Lackritz, Eve; Chakraborty, Pranesh; Wilson, Kumanan
2017-01-01
Background: Knowledge of gestational age (GA) is critical for guiding neonatal care and quantifying regional burdens of preterm birth. In settings where access to ultrasound dating is limited, postnatal estimates are frequently used despite the issues of accuracy associated with postnatal approaches. Newborn metabolic profiles are known to vary by severity of preterm birth. Recent work by our group and others has highlighted the accuracy of postnatal GA estimation algorithms derived from routinely collected newborn screening profiles. This protocol outlines the validation of a GA model originally developed in a North American cohort among international newborn cohorts. Methods: Our primary objective is to use blood spot samples collected from infants born in Zambia and Bangladesh to evaluate our algorithm's capacity to correctly classify GA within 1, 2, 3 and 4 weeks. Secondary objectives are to 1) determine the algorithm's accuracy in small-for-gestational-age and large-for-gestational-age infants, 2) determine its ability to correctly discriminate GA of newborns across dichotomous thresholds of preterm birth (≤34 weeks, <37 weeks GA), and 3) compare the relative performance of algorithms derived from newborn screening panels including all available analytes and those restricted to analyte subsets. The study population will consist of infants born to mothers already enrolled in one of two preterm birth cohorts in Lusaka, Zambia, and Matlab, Bangladesh. Dried blood spot samples will be collected and sent for analysis in Ontario, Canada, for model validation. Discussion: This study will determine the validity of a GA estimation algorithm across ethnically diverse infant populations and assess population-specific variations in newborn metabolic profiles. PMID:29104765
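The primary outcome, classification of GA within ±1 to ±4 weeks and agreement across the preterm cut-off, reduces to simple agreement counts; a sketch with hypothetical predicted and reference GAs is shown below.

```python
def within_k_weeks(predicted, actual, k):
    """Fraction of infants whose model-estimated GA falls within k weeks of
    the reference GA (hypothetical arrays, in weeks)."""
    pairs = list(zip(predicted, actual))
    return sum(abs(p - a) <= k for p, a in pairs) / len(pairs)

def preterm_agreement(predicted, actual, threshold=37.0):
    """Agreement on the dichotomous preterm cut-off (<37 weeks)."""
    agree = sum((p < threshold) == (a < threshold)
                for p, a in zip(predicted, actual))
    return agree / len(predicted)

if __name__ == "__main__":
    predicted = [38.5, 36.0, 40.1, 33.8, 39.0, 35.2]   # hypothetical model output
    actual    = [39.0, 37.5, 40.0, 34.0, 38.0, 36.5]   # hypothetical reference GA
    for k in (1, 2, 3, 4):
        print(f"within +/-{k} wk:", round(within_k_weeks(predicted, actual, k), 2))
    print("preterm (<37 wk) agreement:", round(preterm_agreement(predicted, actual), 2))
```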
Validation of a High Sampling Rate Inertial Measurement Unit for Acceleration During Running.
Provot, Thomas; Chiementin, Xavier; Oudin, Emeric; Bolaers, Fabrice; Murer, Sébastien
2017-08-25
The musculoskeletal response of athletes to various activities during training has become a critical issue in optimizing their performance and minimizing injuries. However, dynamic and kinematic measures of an athlete's activity are generally limited by constraints in data collection and technology. Thus, the choice of reliable and accurate sensors is crucial for gathering data in indoor and outdoor conditions. The aim of this study is to validate the use of the accelerometer of a high sampling rate (1344 Hz) Inertial Measurement Unit (IMU) in the frame of running activities. To this end, two validation protocols are imposed: a classical one on a shaker, followed by another one during running, the IMU being attached to a test subject. For each protocol, the response of the IMU Accelerometer (IMUA) is compared to a calibrated industrial accelerometer, considered as the gold standard for dynamic and kinematic data collection. The repeatability, the impact of signal frequency and amplitude (on the shaker), and the influence of speed (while running) are investigated. Results reveal that the IMUA exhibits good repeatability: the Coefficient of Variation (CV) is 1% (8.58 ± 0.06 m/s²) on the shaker and 3% (26.65 ± 0.69 m/s²) while running. However, the shaker test shows that the IMUA is affected by the signal frequency (error exceeds 10% beyond 80 Hz), an observation confirmed by the running test. Nevertheless, the IMUA provides a reliable measure in the range 0-100 Hz, i.e., the most relevant part of the energy spectrum over the range 0-150 Hz during running. In our view, these findings emphasize the validity of IMUs for the measurement of acceleration during running.
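Repeatability here is expressed as a coefficient of variation across repeated trials, alongside agreement against the reference accelerometer; a minimal sketch with hypothetical peak-acceleration values is shown below.

```python
from statistics import mean, stdev

def coefficient_of_variation(values):
    """Between-trial CV (%) used as a simple repeatability index."""
    return 100.0 * stdev(values) / mean(values)

def mean_percent_error(test_sensor, reference):
    """Mean absolute percent error of the IMU accelerometer against the
    reference accelerometer (hypothetical peak accelerations, m/s^2)."""
    errs = [abs(t - r) / r for t, r in zip(test_sensor, reference)]
    return 100.0 * mean(errs)

if __name__ == "__main__":
    imu_peaks = [26.1, 27.0, 26.4, 27.3, 26.8]       # repeated running trials
    ref_peaks = [25.8, 26.7, 26.9, 27.0, 26.5]
    print("CV %:", round(coefficient_of_variation(imu_peaks), 2))
    print("mean % error vs reference:", round(mean_percent_error(imu_peaks, ref_peaks), 2))
```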
Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E
2016-04-01
A novel Protocol Ethics Tool Kit ('Ethics Tool Kit') has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Design and Development of Layered Security: Future Enhancements and Directions in Transmission
Shahzad, Aamir; Lee, Malrey; Kim, Suntae; Kim, Kangmin; Choi, Jae-Young; Cho, Younghwa; Lee, Keun-Kwang
2016-01-01
Today, security is a prominent issue when any type of communication is being undertaken. Like traditional networks, supervisory control and data acquisition (SCADA) systems suffer from a number of vulnerabilities. Numerous end-to-end security mechanisms have been proposed for the resolution of SCADA-system security issues, but due to insecure real-time protocol use and the reliance upon open protocols during Internet-based communication, these SCADA systems can still be compromised by security challenges. This study reviews the security challenges and issues that are commonly raised during SCADA/protocol transmissions and proposes a secure distributed-network protocol version 3 (DNP3) design, and the implementation of the security solution using a cryptography mechanism. Due to the insecurities found within SCADA protocols, the new development consists of a DNP3 protocol that has been designed as a part of the SCADA system, and the cryptographically derived security is deployed within the application layer as a part of the DNP3 stack. PMID:26751443
Design and Development of Layered Security: Future Enhancements and Directions in Transmission.
Shahzad, Aamir; Lee, Malrey; Kim, Suntae; Kim, Kangmin; Choi, Jae-Young; Cho, Younghwa; Lee, Keun-Kwang
2016-01-06
Today, security is a prominent issue when any type of communication is being undertaken. Like traditional networks, supervisory control and data acquisition (SCADA) systems suffer from a number of vulnerabilities. Numerous end-to-end security mechanisms have been proposed for the resolution of SCADA-system security issues, but due to insecure real-time protocol use and the reliance upon open protocols during Internet-based communication, these SCADA systems can still be compromised by security challenges. This study reviews the security challenges and issues that are commonly raised during SCADA/protocol transmissions and proposes a secure distributed-network protocol version 3 (DNP3) design, and the implementation of the security solution using a cryptography mechanism. Due to the insecurities found within SCADA protocols, the new development consists of a DNP3 protocol that has been designed as a part of the SCADA system, and the cryptographically derived security is deployed within the application layer as a part of the DNP3 stack.
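As a generic illustration of placing cryptographic protection at the application layer of a SCADA stack, the sketch below wraps a payload with a sequence number and a keyed MAC before it is handed to the lower layers. It is neither the DNP3 Secure Authentication specification nor the paper's exact design; the key handling, frame layout, and payload are assumptions.

```python
import hashlib, hmac, os, struct

KEY = os.urandom(32)   # assumed pre-shared between SCADA master and outstation

def protect(app_payload: bytes, seq: int) -> bytes:
    """Wrap an application-layer payload with a sequence number and a keyed
    MAC before handing it to the lower protocol layers."""
    header = struct.pack("!I", seq)
    tag = hmac.new(KEY, header + app_payload, hashlib.sha256).digest()
    return header + app_payload + tag

def verify(frame: bytes, expected_seq: int):
    """Check integrity/authenticity and reject unexpected sequence numbers."""
    header, body, tag = frame[:4], frame[4:-32], frame[-32:]
    (seq,) = struct.unpack("!I", header)
    ok = hmac.compare_digest(tag, hmac.new(KEY, header + body, hashlib.sha256).digest())
    return (ok and seq == expected_seq), body

if __name__ == "__main__":
    frame = protect(b"read analog inputs", seq=1)
    valid, payload = verify(frame, expected_seq=1)
    print(valid, payload)
```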
A shortened protocol for assessing cognitive bias in rats.
Brydges, Nichola M; Hall, Lynsey
2017-07-15
Reliable measurement of affective state in animals is a significant goal of animal welfare. Such measurements would also improve the validity of pre-clinical mental health research which relies on animal models. However, at present, affective states in animals are inaccessible to direct measurement. In humans, changes in cognitive processing can give reliable indications of emotional state. Therefore, similar techniques are increasingly being used to gain proxy measures of affective states in animals. In particular, the 'cognitive bias' assay has gained popularity in recent years. Major disadvantages of this technique include length of time taken for animals to acquire the task (typically several weeks), negative experiences associated with task training, and issues of motivation. Here we present a shortened cognitive bias protocol using only positive reinforcers which must actively be responded to. The protocol took an average of 4 days to complete, and produced similar results to previous, longer methods (minimum 30 days). Specifically, rats housed in standard laboratory conditions demonstrated negative cognitive biases when presented with ambiguous stimuli, and took longer to make a decision when faced with an ambiguous stimulus. Compared to previous methods, this protocol is significantly shorter (average 4 days vs. minimum 30 days), utilises only positive reinforcers to avoid inducing negative affective states, and requires active responses to all cues, avoiding potential confounds of motivational state. We have successfully developed a shortened cognitive bias protocol, suitable for use with laboratory rats. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
Sauter, Jennifer L; Grogg, Karen L; Vrana, Julie A; Law, Mark E; Halvorson, Jennifer L; Henry, Michael R
2016-02-01
The objective of the current study was to establish a process for validating immunohistochemistry (IHC) protocols for use on the Cellient cell block (CCB) system. Thirty antibodies were initially tested on CCBs using IHC protocols previously validated on formalin-fixed, paraffin-embedded tissue (FFPE). Cytology samples were split to generate thrombin cell blocks (TCB) and CCBs. IHC was performed in parallel. Antibody immunoreactivity was scored, and concordance or discordance in immunoreactivity between the TCBs and CCBs for each sample was determined. Criteria for validation of an antibody were defined as concordant staining in expected positive and negative cells, in at least 5 samples each, and concordance in at least 90% of the samples total. Antibodies that failed initial validation were retested after alterations in IHC conditions. Thirteen of the 30 antibodies (43%) did not meet initial validation criteria. Of those, 8 antibodies (calretinin, clusters of differentiation [CD] 3, CD20, CDX2, cytokeratin 20, estrogen receptor, MOC-31, and p16) were optimized for CCBs and subsequently validated. Despite several alterations in conditions, 3 antibodies (Ber-EP4, D2-40, and paired box gene 8 [PAX8]) were not successfully validated. Nearly one-half of the antibodies tested in the current study failed initial validation using IHC conditions that were established in the study laboratory for FFPE material. Although some antibodies subsequently met validation criteria after optimization of conditions, a few continued to demonstrate inadequate immunoreactivity. These findings emphasize the importance of validating IHC protocols for methanol-fixed tissue before clinical use and suggest that optimization for alcohol fixation may be needed to obtain adequate immunoreactivity on CCBs. © 2016 American Cancer Society.
The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1995-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems, and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
Assessment and risk classification protocol for patients in emergency units
Silva, Michele de Freitas Neves; Oliveira, Gabriela Novelli; Pergola-Marconato, Aline Maino; Marconato, Rafael Silva; Bargas, Eliete Boaventura; Araujo, Izilda Esmenia Muglia
2014-01-01
Objective: to develop, validate the contents of, and verify the reliability of a risk classification protocol for an emergency unit. Method: the content validation was developed in a university hospital in a country town in the state of Sao Paulo and was carried out in two stages: the first with the individual assessment of specialists, and the second with a meeting between the researchers and the specialists. The use of the protocol followed a specific guide. Concerning reliability, the method of concordance, or equivalence, among observers was used. Results: the protocol developed showed content validity and, after the suggested changes were made, yielded excellent results concerning reliability. Conclusion: the assistance flow chart was shown to be easy to use and to facilitate the search for the complaint in each assistance priority. PMID:26107828
Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples.
Pindado Jiménez, Oscar; García Alonso, Susana; Pérez Pastor, Rosa María
2017-01-01
This study aims to provide recommendations concerning the validation of analytical protocols using routine samples. It is intended as a case study of how to validate analytical methods in different environmental matrices. In order to analyze the selected compounds (pesticides and polychlorinated biphenyls) in two different environmental matrices, the current work developed and validated two analytical procedures by GC-MS. A description is given of the validation of the two protocols by the analysis of more than 30 samples of water and sediments collected over nine months. The present work also addresses the uncertainty associated with both analytical protocols. In detail, the uncertainty for the water samples was estimated through a conventional approach, whereas for the sediment matrices the estimation of proportional/constant bias is also included because of their inhomogeneity. Results for the sediment matrix are reliable, showing 25-35% analytical variability associated with intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries, and the combined uncertainty ranges between 20 and 30%. Analyzing routine samples is rarely applied to assess the trueness of novel analytical methods, and until now this methodology had not been focused on organochlorine compounds in environmental matrices.
SeaWiFS technical report series. Volume 5: Ocean optics protocols for SeaWiFS validation
NASA Technical Reports Server (NTRS)
Mueller, James L.; Austin, Roswell W.; Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor)
1992-01-01
Protocols are presented for measuring optical properties, and other environmental variables, to validate the radiometric performance of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), and to develop and validate bio-optical algorithms for use with SeaWiFS data. The protocols are intended to establish the foundations for a measurement strategy to verify the challenging SeaWiFS accuracy goals of 5 percent in water-leaving radiances and 35 percent in chlorophyll a concentration. The protocols first specify the variables which must be measured, and briefly review the rationale. Subsequent chapters cover detailed protocols for instrument performance specifications, characterizing and calibrating instruments, methods of making measurements in the field, and methods of data analysis. These protocols were developed at a workshop sponsored by the SeaWiFS Project Office (SPO) and held at the Naval Postgraduate School in Monterey, California (9-12 April, 1991). This report is the proceedings of that workshop, as interpreted and expanded by the authors and reviewed by workshop participants and other members of the bio-optical research community. The protocols are a first prescription to approach the unprecedented measurement accuracies implied by the SeaWiFS goals, and research and development are needed to improve the state of the art in specific areas. The protocols should be periodically revised to reflect technical advances during the SeaWiFS Project cycle.
Intelligent Local Avoided Collision (iLAC) MAC Protocol for Very High Speed Wireless Network
NASA Astrophysics Data System (ADS)
Hieu, Dinh Chi; Masuda, Akeo; Rabarijaona, Verotiana Hanitriniala; Shimamoto, Shigeru
Future wireless communication systems aim at very high data rates. As the medium access control (MAC) protocol plays the central role in determining the overall performance of a wireless system, designing a suitable MAC protocol is critical to fully exploit the benefit of the high speed transmission that the physical layer (PHY) offers. In the latest 802.11n standard [2], the problem of long overhead has been addressed adequately, but the issue of excessive colliding transmissions, especially in congested situations, remains untouched. The procedure of setting the backoff value is the heart of the 802.11 distributed coordination function (DCF) for collision avoidance, in which each station makes its own decision on how to avoid collision in the next transmission. However, collision avoidance is a problem that cannot be solved by a single station. In this paper, we introduce a new MAC protocol called Intelligent Local Avoided Collision (iLAC) that redefines individual rationality in choosing the backoff counter value to avoid a colliding transmission. The distinguishing feature of iLAC is that it fundamentally changes this decision making process from collision avoidance to collaborative collision prevention. As a result, stations can avoid colliding transmissions with much greater precision. An analytical solution confirms the validity of this proposal, and simulation results show that the proposed algorithm outperforms the conventional algorithms by a large margin.
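The benefit of coordinating backoff choices can be illustrated with a toy simulation: below, stations either draw independent random backoff counters (DCF-style) or are idealized to draw distinct counters. Forcing distinct draws is only a stand-in for collaborative collision prevention; the real iLAC rule is distributed and more subtle.

```python
import random

def collision_rate(num_stations=10, cw=16, rounds=20000, coordinated=False, seed=3):
    """Fraction of rounds in which at least two stations pick the same
    (minimal) backoff counter and therefore collide."""
    rng = random.Random(seed)
    collisions = 0
    for _ in range(rounds):
        if coordinated:
            counters = rng.sample(range(cw), num_stations)   # all distinct (idealized)
        else:
            counters = [rng.randrange(cw) for _ in range(num_stations)]
        winner = min(counters)
        if counters.count(winner) > 1:
            collisions += 1
    return collisions / rounds

if __name__ == "__main__":
    print("DCF-style random backoff :", round(collision_rate(), 3))
    print("idealized coordination   :", round(collision_rate(coordinated=True), 3))
```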
Brubaker, Chad; Jana, Suman; Ray, Baishakhi; Khurshid, Sarfraz; Shmatikov, Vitaly
2014-01-01
Modern network security rests on the Secure Sockets Layer (SSL) and Transport Layer Security (TLS) protocols. Distributed systems, mobile and desktop applications, embedded devices, and all of secure Web rely on SSL/TLS for protection against network attacks. This protection critically depends on whether SSL/TLS clients correctly validate X.509 certificates presented by servers during the SSL/TLS handshake protocol. We design, implement, and apply the first methodology for large-scale testing of certificate validation logic in SSL/TLS implementations. Our first ingredient is "frankencerts," synthetic certificates that are randomly mutated from parts of real certificates and thus include unusual combinations of extensions and constraints. Our second ingredient is differential testing: if one SSL/TLS implementation accepts a certificate while another rejects the same certificate, we use the discrepancy as an oracle for finding flaws in individual implementations. Differential testing with frankencerts uncovered 208 discrepancies between popular SSL/TLS implementations such as OpenSSL, NSS, CyaSSL, GnuTLS, PolarSSL, MatrixSSL, etc. Many of them are caused by serious security vulnerabilities. For example, any server with a valid X.509 version 1 certificate can act as a rogue certificate authority and issue fake certificates for any domain, enabling man-in-the-middle attacks against MatrixSSL and GnuTLS. Several implementations also accept certificate authorities created by unauthorized issuers, as well as certificates not intended for server authentication. We also found serious vulnerabilities in how users are warned about certificate validation errors. When presented with an expired, self-signed certificate, NSS, Safari, and Chrome (on Linux) report that the certificate has expired-a low-risk, often ignored error-but not that the connection is insecure against a man-in-the-middle attack. These results demonstrate that automated adversarial testing with frankencerts is a powerful methodology for discovering security flaws in SSL/TLS implementations.
Enhanced parent selection algorithms in mintroute protocol
NASA Astrophysics Data System (ADS)
Kim, Ki-Il
2012-11-01
A low-rate, short-range wireless radio communication on a small device often hampers high reliability in wireless sensor networks. However, more applications are increasingly demanding high reliability. To meet this requirement, various approaches have been proposed at each layer. Among them, MintRoute is a well-known network-layer approach that develops a new link-quality metric for path selection towards the sink. By choosing the link with the highest measured value, it has a higher possibility of transmitting a packet over the link without error. However, several operational issues remain. In this paper, we propose revised algorithms to improve the MintRoute protocol. They include parent selection that considers distance and level from the sink node, and a fast recovery method against failures. Simulations and analysis are performed to validate the reduced end-to-end delay and fast recovery from failures, and thus the enhanced reliability of communication.
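A rough sketch of link-quality-driven parent selection of the kind MintRoute performs, extended in the spirit of the paper with the candidate's level (hop count from the sink) as a tie-breaker. The field names and the ordering rule are assumptions for illustration, not the paper's exact algorithm.

```python
from dataclasses import dataclass

# Illustrative sketch of MintRoute-style parent selection extended, in the
# spirit of the paper, with the candidate's level (hop count from the sink).
# Field names and the ordering rule are assumptions, not the paper's algorithm.
@dataclass
class Neighbor:
    node_id: int
    link_quality: float   # estimated delivery probability over the link, 0..1
    level: int            # hops from the sink; smaller means closer

def select_parent(neighbors):
    """Prefer the best link; break ties with the neighbor closest to the sink."""
    if not neighbors:
        return None
    return max(neighbors, key=lambda n: (n.link_quality, -n.level))

candidates = [Neighbor(1, 0.92, 3), Neighbor(2, 0.92, 2), Neighbor(3, 0.85, 1)]
print(select_parent(candidates).node_id)   # -> 2
```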
Bernard, Lise; Roche, Béatrice; Batisse, Marie; Maqdasy, Salwan; Terral, Daniel; Sautou, Valérie; Tauveron, Igor
2016-10-01
In non-critically ill patients, the use of an insulin syringe pump allows the management of temporary situations during which other therapies cannot be used (failure of subcutaneous injections, awaiting advice from the diabetes team, emergency situations, prolonged corticosteroid therapy, initiation of artificial nutrition, need for fasting status, etc.). To manage the risks related to this "never event", the use of a standard validated protocol for insulin administration and monitoring is an essential prerequisite. To this end, a multidisciplinary approach is recommended. With the support of our "Endocrinology-Diabetology" subcommission, we proceeded step by step to create such a standardized protocol: (1) review of all existing protocols in our hospital; (2) overview of the literature on insulin infusion protocols developed by multidisciplinary teams in France and abroad; (3) development of a standardized protocol for non-intensive care unit patients, respecting current recommendations and adapting it to the working habits of health teams; and (4) validation of the protocol. Two protocols based on the same structure but adapted to the health status of the patient have been developed. The protocols are divided into three parts: (1) golden rules to make the use of the protocol appropriate and safe; (2) the algorithm (a double-entry table) corresponding to a dynamic adaptation of insulin doses, clearly defining the target and the 'at risk' situations; and (3) practical aspects of the protocol: preparation of the syringe, treatment initiation and traceability. The protocols have been validated by the institution. Our standardized insulin infusion protocol is simple, easy to implement and safe, and is likely to be applicable in diverse care units. However, the efficiency, safety and workability of our protocols have to be clinically evaluated. © 2016 John Wiley & Sons, Ltd.
Jonker, Dirk; Gustafsson, Ewa; Rolander, Bo; Arvidsson, Inger; Nordander, Catarina
2015-01-01
A new health surveillance protocol for work-related upper-extremity musculoskeletal disorders has been validated by comparing the results with a reference protocol. The studied protocol, Health Surveillance in Adverse Ergonomics Conditions (HECO), is a new version of the reference protocol modified for application in the Occupational Health Service (OHS). The HECO protocol contains both a screening part and a diagnosing part. Sixty-three employees were examined. The screening in HECO did not miss any diagnosis found when using the reference protocol, but in comparison to the reference protocol considerable time savings could be achieved. Fair to good agreement between the protocols was obtained for one or more diagnoses in neck/shoulders (86%, k = 0.62) and elbow/hands (84%, k = 0.49). Therefore, the results obtained using the HECO protocol can be compared with a reference material collected with the reference protocol, and thus provide information of the magnitude of disorders in an examined work group. Practitioner Summary: The HECO protocol is a relatively simple physical examination protocol for identification of musculoskeletal disorders in the neck and upper extremities. The protocol is a reliable and cost-effective tool for the OHS to use for occupational health surveillance in order to detect workplaces at high risk for developing musculoskeletal disorders.
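The agreement figures quoted above pair raw percent agreement with the chance-corrected kappa statistic. The sketch below shows how Cohen's kappa is computed from a 2x2 agreement table; the counts are invented for illustration and do not reproduce the study's data.

```python
# Cohen's kappa for two protocols judging a diagnosis present/absent,
# kappa = (p_o - p_e) / (1 - p_e). The 2x2 counts below are invented for
# illustration and do not reproduce the study's data.
def cohens_kappa(both_yes, both_no, only_a, only_b):
    n = both_yes + both_no + only_a + only_b
    p_o = (both_yes + both_no) / n                    # observed agreement
    a_yes = (both_yes + only_a) / n
    b_yes = (both_yes + only_b) / n
    p_e = a_yes * b_yes + (1 - a_yes) * (1 - b_yes)   # agreement expected by chance
    return (p_o - p_e) / (1 - p_e)

# 63 examined employees split into illustrative agreement counts.
print(round(cohens_kappa(both_yes=20, both_no=34, only_a=5, only_b=4), 2))  # -> 0.7
```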
Power Conservation through Energy Efficient Routing in Wireless Sensor Networks.
Kandris, Dionisis; Tsioumas, Panagiotis; Tzes, Anthony; Nikolakopoulos, George; Vergados, Dimitrios D
2009-01-01
The power awareness issue is the primary concern within the domain of Wireless Sensor Networks (WSNs). Most power dissipation occurs during communication, thus routing protocols in WSNs mainly aim at power conservation. Moreover, a routing protocol should be scalable, so that its effectiveness does not degrade as the network size increases. In response to these issues, this work describes the development of an efficient routing protocol, named SHPER (Scaling Hierarchical Power Efficient Routing).
2017-01-01
Background Clinicians, such as respiratory therapists and physicians, are often required to set up pieces of medical equipment that use inconsistent terminology. Current lung ventilator terminology that is used by different manufacturers contributes to the risk of usage errors, and in turn the risk of ventilator-associated lung injuries and other conditions. Human factors and communication issues are often associated with ventilator-related sentinel events, and inconsistent ventilator terminology compounds these issues. This paper describes our proposed protocol, which will be implemented at the University of Waterloo, Canada when this project is externally funded. Objective We propose to determine whether a standardized vocabulary improves the ease of use, safety, and utility as it relates to the usability of medical devices, compared to legacy medical devices from multiple manufacturers, which use different terms. Methods We hypothesize that usage errors by clinicians will be lower when standardization is consistently applied by all manufacturers. The proposed study will experimentally examine the impact of standardized nomenclature on performance declines in the use of an unfamiliar ventilator product in clinically relevant scenarios. Participants will be respiratory therapy practitioners and trainees, and we propose studying approximately 60 participants. Results The work reported here is in the proposal phase. Once the protocol is implemented, we will report the results in a follow-up paper. Conclusions The proposed study will help us better understand the effects of standardization on medical device usability. The study will also help identify any terms in the International Organization for Standardization (ISO) Draft International Standard (DIS) 19223 that may be associated with recurrent errors. Amendments to the standard will be proposed if recurrent errors are identified. This report contributes a protocol that can be used to assess the effect of standardization in any given domain that involves equipment, multiple manufacturers, inconsistent vocabulary, symbology, audio tones, or patterns in interface navigation. Second, the protocol can be used to experimentally evaluate the ISO DIS 19223 for its effectiveness, as researchers around the world may wish to conduct such tests and compare results. PMID:28887292
Validity of Assessments of Youth Access to Tobacco: The Familiarity Effect
Landrine, Hope; Klonoff, Elizabeth A.
2003-01-01
Objectives. We examined the standard compliance protocol and its validity as a measure of youth access to tobacco. Methods. In Study 1, youth smokers reported buying cigarettes in stores where they are regular customers. In Study 2, youths attempted to purchase cigarettes by using the Standard Protocol, in which they appeared at stores once for cigarettes, and by using the Familiarity Protocol, in which they were rendered regular customers by purchasing nontobacco items 4 times and then requested cigarettes during their fifth visit. Results. Sales to youths aged 17 years in the Familiarity Protocol were significantly higher than sales to the same age group in the Standard Protocols (62.5% vs. 6%, respectively). Conclusions. The Standard Protocol does not match how youths obtain cigarettes. Access is low for stranger youths within compliance studies, but access is high for familiar youths outside of compliance studies. PMID:14600057
A Mutual Authentication Framework for Wireless Medical Sensor Networks.
Srinivas, Jangirala; Mishra, Dheerendra; Mukhopadhyay, Sourav
2017-05-01
Wireless medical sensor networks (WMSN) comprise distributed sensors that can sense human physiological signs and monitor the health condition of the patient. Providing privacy for the patient's data is an important and challenging issue. Information in a WMSN passes over a public channel; thus, the patient's sensitive information can be obtained by eavesdropping or by unauthorized use of the handheld devices that health professionals use to monitor the patient. Therefore, there is an essential need to restrict unauthorized access to patients' medical information, and an efficient authentication scheme for healthcare applications is needed to preserve the privacy of patients' vital signs. To ensure secure and authorized communication in WMSN, we design a symmetric key based authentication protocol for the WMSN environment. The proposed protocol uses only computationally efficient operations to achieve a lightweight attribute. We analyze the security of the proposed protocol. We use a formal security proof algorithm to show the scheme's security against known attacks. We also use the Automated Validation of Internet Security Protocols and Applications (AVISPA) simulator to show that the protocol is secure against man-in-the-middle and replay attacks. Additionally, we adopt an informal analysis to discuss the key attributes of the proposed scheme. From the formal proof of security, we can see that an attacker has a negligible probability of breaking the protocol security. The AVISPA simulator also demonstrates the proposed scheme's security against active attacks, namely man-in-the-middle and replay attacks. Additionally, a comparison of computational efficiency and security attributes with several recent results suggests that the proposed scheme is superior.
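The following is a generic sketch of nonce-based, symmetric-key mutual authentication of the kind such lightweight WMSN schemes build on: both parties prove knowledge of a pre-shared key over fresh nonces and derive a session key. It is an illustration under those assumptions, not the paper's exact protocol or message format.

```python
import hashlib
import hmac
import os

# Generic sketch of nonce-based, symmetric-key mutual authentication between a
# sensor node and a gateway with a pre-shared key. This is an illustration of
# the general pattern, not the paper's exact protocol or message format.
def tag(key: bytes, *parts: bytes) -> bytes:
    return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

shared_key = os.urandom(32)            # pre-shared between sensor and gateway

# 1. Sensor -> gateway: its identity plus a fresh nonce.
sensor_nonce = os.urandom(16)
# 2. Gateway -> sensor: its own nonce and a tag binding both nonces.
gateway_nonce = os.urandom(16)
gateway_tag = tag(shared_key, b"gateway", sensor_nonce, gateway_nonce)
# 3. Sensor verifies the gateway, then answers with its own tag.
assert hmac.compare_digest(gateway_tag, tag(shared_key, b"gateway", sensor_nonce, gateway_nonce))
sensor_tag = tag(shared_key, b"sensor", gateway_nonce, sensor_nonce)
# 4. Gateway verifies the sensor; both ends can now derive a session key.
assert hmac.compare_digest(sensor_tag, tag(shared_key, b"sensor", gateway_nonce, sensor_nonce))
session_key = tag(shared_key, b"session", sensor_nonce, gateway_nonce)
print("mutual authentication completed")
```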
Comment on "Dynamic quantum secret sharing"
NASA Astrophysics Data System (ADS)
Liao, Ci-Hong; Yang, Chun-Wei; Hwang, Tzonelish
2013-10-01
Hsu et al. (Quantum Inf Process 12:331-344, 2013) proposed a dynamic quantum secret sharing (DQSS) protocol using the entanglement swapping of Bell states to allow an agent to easily join (or leave) the system. In 2013, Wang and Li (Quantum Inf Process 12(5):1991-1997, 2013) proposed a collusion attack on Hsu et al.'s DQSS protocol. Nevertheless, this study points out a new security issue in Hsu et al.'s DQSS protocol regarding the honesty of a revoked agent. Without considering this issue, the DQSS protocol could fail to provide the secret sharing function.
Palm, Peter; Josephson, Malin; Mathiassen, Svend Erik; Kjellberg, Katarina
2016-06-01
We evaluated the intra- and inter-observer reliability and criterion validity of an observation protocol, developed in an iterative process involving practicing ergonomists, for assessment of working technique during cash register work for the purpose of preventing upper extremity symptoms. Two ergonomists independently assessed 17 15-min videos of cash register work on two occasions each, as a basis for examining reliability. Criterion validity was assessed by comparing these assessments with meticulous video-based analyses by researchers. Intra-observer reliability was acceptable (i.e. proportional agreement >0.7 and kappa >0.4) for 10/10 questions. Inter-observer reliability was acceptable for only 3/10 questions. An acceptable inter-observer reliability combined with an acceptable criterion validity was obtained only for one working technique aspect, 'Quality of movements'. Thus, major elements of the cashiers' working technique could not be assessed with an acceptable accuracy from short periods of observations by one observer, such as often desired by practitioners. Practitioner Summary: We examined an observation protocol for assessing working technique in cash register work. It was feasible in use, but inter-observer reliability and criterion validity were generally not acceptable when working technique aspects were assessed from short periods of work. We recommend the protocol to be used for educational purposes only.
Desmedt, Bart; Ates, Gamze; Courselle, Patricia; De Beer, Jacques O; Rogiers, Vera; Hendrickx, Benoit; Deconinck, Eric; De Paepe, Kristien
2016-01-01
In Europe, hydroquinone is a forbidden cosmetic ingredient. It is, however, still abundantly used because of its effective skin-whitening properties. The question arises as to whether the quantities of hydroquinone used become systemically available and may cause damage to human health. Dermal absorption studies can provide this information. In the EU, dermal absorption has to be assessed in vitro since the Cosmetic Regulation 1223/2009/EC forbids the use of animals. To obtain human-relevant data, a Franz diffusion cell protocol was validated using human skin. The results obtained were comparable to those from a multicentre validation study. The protocol was applied to hydroquinone and the dermal absorption ranged between 31 and 44%, which is within the range of published in vivo human values. This shows that a well-validated in vitro dermal absorption study using human skin provides relevant human data. The validated protocol was used to determine the dermal absorption of illegal skin-whitening cosmetics containing hydroquinone. All samples gave high dermal absorption values, rendering them all unsafe for human health. These results add to our knowledge of illegal cosmetics on the EU market, namely that they exhibit a negative toxicological profile and are likely to induce health problems. © 2017 S. Karger AG, Basel.
Stability and sensitivity of ABR flow control protocols
NASA Astrophysics Data System (ADS)
Tsai, Wie K.; Kim, Yuseok; Chiussi, Fabio; Toh, Chai-Keong
1998-10-01
This tutorial paper surveys the important issues in stability and sensitivity analysis of ABR flow control in ATM networks. The stability and sensitivity issues are formulated in a systematic framework. Four main causes of instability in ABR flow control are identified: unstable control laws, temporal variations of available bandwidth with delayed feedback control, misbehaving components, and interactions between higher-layer protocols and ABR flow control. Popular rate-based ABR flow control protocols are evaluated. Stability and sensitivity are shown to be the fundamental issues when the network has dynamically varying bandwidth. Simulation results confirming the theoretical studies are provided. Open research problems are discussed.
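One of the instability causes listed above, delayed feedback, can be illustrated with a toy discrete-time rate controller: with immediate feedback the source rate settles at the available bandwidth, while the same gain with stale queue feedback produces sustained oscillation. The control law, gain and delay values are illustrative assumptions, not taken from the surveyed protocols.

```python
# Toy discrete-time illustration of one instability cause named above: a
# proportional rate controller reacting to queue feedback that arrives late.
# The control law, gain and delay are illustrative, not from the surveyed protocols.
def simulate(gain: float, delay: int, steps: int = 80):
    capacity, target_queue = 100.0, 50.0
    queue = 0.0
    history = [queue] * (delay + 1)     # queue samples the source observes late
    rates = []
    for _ in range(steps):
        observed = history[-1 - delay]                           # stale feedback
        rate = max(0.0, capacity + gain * (target_queue - observed))
        queue = max(0.0, queue + rate - capacity)                # queue evolution
        history.append(queue)
        rates.append(rate)
    return rates

print([round(r) for r in simulate(gain=0.4, delay=0)[-5:]])   # settles near capacity
print([round(r) for r in simulate(gain=0.4, delay=6)[-5:]])   # keeps oscillating
```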
ERIC Educational Resources Information Center
Stevens, Christopher John; Dascombe, Ben James
2015-01-01
Sports performance testing is one of the most common and important measures used in sport science. Performance testing protocols must have high reliability to ensure any changes are not due to measurement error or inter-individual differences. High validity is also important to ensure test performance reflects true performance. Time-trial…
NASA Technical Reports Server (NTRS)
Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Acker, James G. (Editor); Mueller, James L.; Austin, Roswell W.
1995-01-01
This report presents protocols for measuring optical properties, and other environmental variables, to validate the radiometric performance of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), and to develop and validate bio-optical algorithms for use with SeaWiFS data. The protocols are intended to establish foundations for a measurement strategy to verify the challenging SeaWiFS uncertainty goals of 5 percent in water-leaving radiances and 35 percent in chlorophyll alpha concentration. The protocols first specify the variables which must be measured, and briefly review the rationale for measuring each variable. Subsequent chapters cover detailed protocols for instrument performance specifications, characterizing and calibrating instruments, methods of making measurements in the field, and methods of data analysis. These protocols were developed at a workshop sponsored by the SeaWiFS Project Office (SPO) and held at the Naval Postgraduate School in Monterey, California (9-12 April 1991). This report began as the proceedings of the workshop, as interpreted and expanded by the authors and reviewed by workshop participants and other members of the bio-optical research community. The protocols are an evolving prescription to allow the research community to approach the unprecedented measurement uncertainties implied by the SeaWiFS goals; research and development are needed to improve the state-of-the-art in specific areas. These protocols should be periodically revised to reflect technical advances during the SeaWiFS Project cycle. The present edition (Revision 1) incorporates new protocols in several areas, including expanded protocol descriptions for Case-2 waters and other improvements, as contributed by several members of the SeaWiFS Science Team.
Liu, Ze-Yu; Zhang, Qing-Han; Ye, Xiao-Lei; Liu, Da-Peng; Cheng, Kang; Zhang, Chun-Hai; Wan, Yi
2017-04-01
To validate the G.LAB MD2200 automated wrist blood pressure (BP) monitors according to the European Society of Hypertension International Protocol (ESH-IP) revision 2010, the British Hypertension Society (BHS), and the International Organization for Standardization (ISO) 81060-2:2013 protocols. The device was assessed on 33 participants according to the ESH requirements and was then tested on 85 participants according to the BHS and ISO 81060-2:2013 criteria. The validation procedures and data analysis followed the protocols precisely. The G.LAB MD2200 devices passed all parts of ESH-IP revision 2010 for both systolic and diastolic BP, with a device-observer difference of 2.15±5.51 and 1.51±5.16 mmHg, respectively. The device achieved A/A grading for the BHS protocol and it also fulfilled the criteria of ISO 81060-2:2013, with mean differences of systolic and diastolic BP between the device and the observer of 2.19±5.21 and 2.11±4.70 mmHg, respectively. The G.LAB MD2200 automated wrist BP monitor passed the ESH-IP revision 2010 and the ISO 81060-2:2013 protocol, and achieved the A/A grade of the BHS protocol, which can be recommended for self-measurement in the general population.
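A sketch of the summary arithmetic behind such validation reports: paired device-observer differences, counts within 5/10/15 mmHg, and the mean difference with its standard deviation. The readings are invented; pass/fail grading is then judged against the thresholds published in the ESH-IP 2010, BHS and ISO 81060-2 documents themselves.

```python
import statistics

# Summary arithmetic used in such validation studies: paired device-observer
# differences, counts within 5/10/15 mmHg, and mean difference +/- SD. The
# readings are invented; pass/fail grading is then judged against the
# thresholds published in the ESH-IP 2010, BHS and ISO 81060-2 documents.
device   = [121, 118, 135, 142, 110, 128, 150, 117, 133, 126]   # mmHg
observer = [123, 119, 131, 140, 113, 127, 152, 118, 130, 128]   # mmHg

diffs = [d - o for d, o in zip(device, observer)]
within = {t: sum(abs(x) <= t for x in diffs) for t in (5, 10, 15)}

print("differences within 5/10/15 mmHg:", within[5], within[10], within[15])
print(f"device-observer difference: {statistics.mean(diffs):.2f} "
      f"+/- {statistics.stdev(diffs):.2f} mmHg")
```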
Very High-Speed Report File System
1992-12-15
1.5 and 45 Mb/s and is expected to reach 150 Mb/s. These new technologies pose some challenges to the Internet Protocol (IP) family (IP... The Internet Engineering Task Force (IETF) has taken up the issue, but a definitive answer is probably some time away. The basic issues are the choice of AAL... by an IEEE 802.1a Subnetwork Access Protocol (SNAP) header. However, with a large number of networks... The third proposal identifies the protocol
Raina, Rupesh; Wang, Joseph; Krishnappa, Vinod; Ferris, Maria
2018-01-16
The transition from pediatric to adult medical services is an important time in the life of an adolescent or young adult with a renal transplant. Failure of proper transition can lead to medical non-adherence and subsequent loss of graft and/or return to dialysis. The aim of this study was to conduct a systematic review and survey to assess the challenges and existing practices in transition of renal transplant recipient children to adult services, and to develop a transition protocol. We conducted a literature review and performed a survey of pediatric nephrologists across the United States to examine the current state of transition care. A structured transition protocol was developed based on these results. Our literature review revealed that a transition program has a positive impact on decline in renal function and acute rejection episodes, and may improve long-term graft outcomes in pediatric kidney transplant patients. With a response rate of 40% (60/150) from nephrologists in 56% (49/87) of centers, our survey shows inconsistent use of validated tools despite their availability, inefficient communication between teams, and lack of use of dedicated clinics. To address these issues, we developed the "RISE to Transition" protocol, which relies on 4 competency areas: Recognition, Insight, Self-reliance, and Establishment of healthy habits. The transition program decreases acute graft rejection episodes, and the main challenges in transition care are the communication gap between health care providers and inconsistent use of transition tools. Our RISE to transition protocol incorporates transition tools, defines personnel, and aims to improve communication between teams.
Mostafa, Gehan M A; Shazly, Mona M; Sherief, Wafaa I
2009-01-01
Good healthcare waste management in a hospital depends on a dedicated waste management team, good administration, careful planning, sound organization, underpinning legislation, adequate financing, and full participation by trained staff. Hence, waste management protocols must be convenient and sensible. This study aimed to assess the knowledge and practice related to waste management among doctors, nurses, and housekeepers in the surgical departments at Al-Mansoura University Hospital, and to design and validate a waste management protocol for the health team in these settings. This cross-sectional study was carried out in the eight surgical departments at Al-Mansoura University Hospital. All health care personnel and their assistants were included: 38 doctors, 106 nurses, and 56 housekeepers. Two jury groups provided expert-opinion validation of the developed protocol, one from academia (30 members) and the other from service providers (30 members). Data were collected using a self-administered knowledge questionnaire for nurses and doctors, and an interview questionnaire for housekeepers. Observation checklists were used for assessment of performance. The researchers developed the first draft of the waste management protocol according to the results of the analysis of the data collected in the assessment phase. The protocol was then presented to the jury groups for validation, and then implemented. Only 27.4% of the nurses, 32.1% of the housekeepers, and 36.8% of the doctors had satisfactory knowledge. Concerning practice, 18.9% of the nurses, 7.1% of the housekeepers, and none of the doctors had adequate practice. Nurses' knowledge score had a statistically significant weak positive correlation with attendance of training courses (r=0.23, p<0.05). Validation of the developed protocol was carried out, and the percentage of agreement ranged between 60.0% and 96.7% for the service-provider group and between 60.0% and 90.0% for the academia group. The majority of the doctors, nurses, and housekeepers have unsatisfactory knowledge and inadequate practice related to health care waste management. Knowledge among nurses is positively affected by attendance of training programs. Based on the findings, a protocol for healthcare waste management was developed and validated. It is recommended to implement the developed waste management protocol in the surgical departments of the designated hospital, with establishment of waste management audits.
Towards an improved LAI collection protocol via simulated field-based PAR sensing
Yao, Wei; Van Leeuwen, Martin; Romanczyk, Paul; ...
2016-07-14
In support of NASA's next-generation spectrometer, the Hyperspectral Infrared Imager (HyspIRI), we are working towards assessing sub-pixel vegetation structure from imaging spectroscopy data. Of particular interest is Leaf Area Index (LAI), which is an informative, yet notoriously challenging parameter to efficiently measure in situ. While photosynthetically-active radiation (PAR) sensors have been validated for measuring crop LAI, there is limited literature on the efficacy of PAR-based LAI measurement in the forest environment. This study (i) validates PAR-based LAI measurement in forest environments, and (ii) proposes a suitable collection protocol, which balances efficiency with measurement variation, e.g., due to sun flecks and various-sized canopy gaps. A synthetic PAR sensor model was developed in the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model and used to validate LAI measurement based on first principles and explicitly known leaf geometry. Simulated collection parameters were adjusted to empirically identify optimal collection protocols. Furthermore, these collection protocols were then validated in the field by correlating PAR-based LAI measurement to the normalized difference vegetation index (NDVI) extracted from the "classic" Airborne Visible Infrared Imaging Spectrometer (AVIRIS-C) data (R² was 0.61). The results indicate that our proposed collection protocol is suitable for measuring the LAI of sparse forest (LAI < 3–5 m²/m²).
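A small sketch of the field-validation step described above: NDVI is computed from red and near-infrared reflectance and correlated with PAR-derived LAI estimates. The reflectance and LAI values are invented for illustration; only the NDVI formula and the R² computation reflect the analysis in the abstract.

```python
import statistics  # statistics.correlation requires Python 3.10+

# Sketch of the field-validation step: compute NDVI = (NIR - red) / (NIR + red)
# per plot and correlate it with PAR-derived LAI. All values below are invented
# for illustration; only the NDVI formula and the R^2 computation reflect the
# analysis described in the abstract.
red = [0.06, 0.05, 0.08, 0.04, 0.07]    # red-band reflectance per plot
nir = [0.42, 0.48, 0.35, 0.52, 0.40]    # near-infrared reflectance per plot
lai = [2.1, 2.8, 1.4, 3.2, 1.9]         # PAR-based LAI estimates, m^2/m^2

ndvi = [(n - r) / (n + r) for r, n in zip(red, nir)]
r = statistics.correlation(ndvi, lai)   # Pearson correlation coefficient
print(f"R^2 = {r ** 2:.2f}")
```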
2012-01-01
Background Management of cancer treatment-related symptoms is an important safety issue given that symptoms can become life-threatening and often occur when patients are at home. With funding from the Canadian Partnership Against Cancer, a pan-Canadian steering committee was established with representation from eight provinces to develop symptom protocols using a rigorous methodology (CAN-IMPLEMENT©). Each protocol is based on a systematic review of the literature to identify relevant clinical practice guidelines. Protocols were validated by cancer nurses from across Canada. The aim of this study is to build an effective and sustainable approach for implementing evidence-informed protocols for nurses to use when providing remote symptom assessment, triage, and guidance in self-management for patients experiencing symptoms while undergoing cancer treatments. Methods A prospective mixed-methods study design will be used. Guided by the Knowledge to Action Framework, the study will involve (a) establishing an advisory knowledge user team in each of three targeted settings; (b) assessing factors influencing nurses’ use of protocols using interviews/focus groups and a standardized survey instrument; (c) adapting protocols for local use, ensuring fidelity of the content; (d) selecting intervention strategies to overcome known barriers and implementing the protocols; (e) conducting think-aloud usability testing; (f) evaluating protocol use and outcomes by conducting an audit of 100 randomly selected charts at each of the three settings; and (g) assessing satisfaction with remote support using symptom protocols and change in nurses’ barriers to use using survey instruments. The primary outcome is sustained use of the protocols, defined as use in 75% of the calls. Descriptive analysis will be conducted for the barriers, use of protocols, and chart audit outcomes. Content analysis will be conducted on interviews/focus groups and usability testing with comparisons across settings. Discussion Given the importance of patient safety, patient-centered care, and delivery of quality services, learning how to effectively implement evidence-informed symptom protocols in oncology healthcare services is essential for ensuring safe, consistent, and effective care for individuals with cancer. This study is likely to have a significant contribution to the delivery of remote oncology services, as well as influence symptom management by patients at home. PMID:23164244
Formal Approach For Resilient Reachability based on End-System Route Agility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rauf, Usman; Gillani, Fida; Al-Shaer, Ehab
The deterministic nature of existing routing protocols has resulted in an ossified Internet with static and predictable network routes. This gives persistent attackers (e.g. eavesdroppers and DDoS attackers) plenty of time to study the network and identify the vulnerable (critical) links to plan a devastating and stealthy attack. Recently, route mutation approaches have been proposed to address such issues. However, these approaches incur significantly high overhead and depend upon the availability of disjoint routes in the network, which inherently limits their use for mission critical services. To cope with these issues, we extend the current routing architecture to consider end-hosts as routing elements, and present a formal-method-based agile defense mechanism to increase resiliency of the existing cyber infrastructure. The major contributions of this paper include: (1) formalization of the efficient and resilient End to End (E2E) reachability problem as a constraint satisfaction problem, which identifies the potential end-hosts to reach a destination while satisfying resilience and QoS constraints, (2) design and implementation of a novel decentralized End Point Route Mutation (EPRM) protocol, and (3) design and implementation of a planning algorithm to minimize the overlap between multiple flows, for the sake of maximizing the agility in the system. Our implementation and evaluation validate the correctness, effectiveness and scalability of the proposed approach.
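To give a flavour of the idea, the sketch below periodically re-selects an end-host relay that satisfies latency and capacity constraints while avoiding the most recently used relay, so the effective route keeps changing. Host names, metrics and thresholds are invented; the paper itself formulates relay selection as a constraint-satisfaction problem solved formally rather than by random sampling.

```python
import random

# Toy sketch of end-point route mutation: periodically pick a relay end-host
# whose path satisfies latency and capacity constraints, avoiding the relay
# used last so the effective route keeps changing. Hosts, metrics and
# thresholds are invented; the paper formulates this as a constraint
# satisfaction problem solved formally rather than by random sampling.
relays = {
    "hostA": {"latency_ms": 30, "capacity_mbps": 80},
    "hostB": {"latency_ms": 55, "capacity_mbps": 200},
    "hostC": {"latency_ms": 20, "capacity_mbps": 40},
    "hostD": {"latency_ms": 45, "capacity_mbps": 120},
}

def mutate_route(recent, max_latency=50, min_capacity=60):
    feasible = [h for h, q in relays.items()
                if q["latency_ms"] <= max_latency
                and q["capacity_mbps"] >= min_capacity
                and h not in recent]
    return random.choice(feasible) if feasible else random.choice(sorted(relays))

history = []
for _epoch in range(4):
    history.append(mutate_route(history[-1:]))   # avoid the most recent relay
print(history)                                    # e.g. ['hostA', 'hostD', 'hostA', 'hostD']
```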
Nano-immunosafety: issues in assay validation
NASA Astrophysics Data System (ADS)
Boraschi, Diana; Oostingh, Gertie J.; Casals, Eudald; Italiani, Paola; Nelissen, Inge; Puntes, Victor F.; Duschl, Albert
2011-07-01
Assessing the safety of engineered nanomaterials for human health must include a thorough evaluation of their effects on the immune system, which is responsible for defending the integrity of our body from damage and disease. An array of robust and representative assays should be set up and validated, which could be predictive of the effects of nanomaterials on immune responses. In a trans-European collaborative work, in vitro assays have been developed to this end. In vitro tests have been preferred for their suitability to standardisation and easier applicability. Adapting classical assays to testing the immunotoxicological effects of nanoparticulate materials has raised a series of issues that needed to be appropriately addressed in order to ensure reliability of results. Besides the exquisitely immunological problem of selecting representative endpoints predictive of the risk of developing disease, assay results turned out to be significantly biased by artefactual interference of the nanomaterials or contaminating agents with the assay protocol. Having addressed such problems, a series of robust and representative assays have been developed that describe the effects of engineered nanoparticles on professional and non-professional human defence cells. Two of such assays are described here, one based on primary human monocytes and the other employing human lung epithelial cells transfected with a reporter gene.
A Public-Key Based Authentication and Key Establishment Protocol Coupled with a Client Puzzle.
ERIC Educational Resources Information Center
Lee, M. C.; Fung, Chun-Kan
2003-01-01
Discusses network denial-of-service attacks which have become a security threat to the Internet community and suggests the need for reliable authentication protocols in client-server applications. Presents a public-key based authentication and key establishment protocol coupled with a client puzzle protocol and validates it through formal logic…
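As background for the client-puzzle idea, the sketch below shows a generic hash-preimage puzzle: the server issues a nonce and a difficulty, and the client must find a value whose hash carries enough leading zero bits before the server commits resources. This is a common construction used for illustration, not necessarily the article's exact puzzle protocol.

```python
import hashlib
import os
from itertools import count

# Generic hash-preimage client puzzle: the server issues a nonce and a
# difficulty; the client must find a candidate whose hash has that many
# leading zero bits before the server commits state. Cheap for the server to
# verify, costly for the client to solve. Illustrative construction only.
def leading_zero_bits(digest: bytes) -> int:
    bits = bin(int.from_bytes(digest, "big"))[2:].zfill(len(digest) * 8)
    return len(bits) - len(bits.lstrip("0"))

def solve(nonce: bytes, difficulty: int) -> int:
    for candidate in count():
        digest = hashlib.sha256(nonce + candidate.to_bytes(8, "big")).digest()
        if leading_zero_bits(digest) >= difficulty:
            return candidate

server_nonce = os.urandom(16)
solution = solve(server_nonce, difficulty=12)      # ~4096 hashes on average
check = hashlib.sha256(server_nonce + solution.to_bytes(8, "big")).digest()
assert leading_zero_bits(check) >= 12              # server-side verification
print("puzzle solved with candidate", solution)
```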
Pinheiro, Alexandre; Dias Canedo, Edna; de Sousa Junior, Rafael Timoteo; de Oliveira Albuquerque, Robson; García Villalba, Luis Javier; Kim, Tai-Hoon
2018-03-02
Cloud computing is considered an interesting paradigm due to its scalability, availability and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS) that is safe from the client point-of-view and to implement this CSS in public clouds since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should trust their data to be in the cloud for a long period of time, without the burden of keeping copies of the original data, nor of accessing the whole content for verifications regarding data preservation. Due to these requirements, integrity, availability, privacy and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance.
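The sketch below illustrates the general shape of such periodic integrity monitoring: before deleting its local copy, the client precomputes a small table of nonce-based challenges and expected answers, and later audits replay them so the provider cannot respond correctly without actually holding the data. It is a simplified illustration, not the specific trust-and-encryption protocol proposed in the paper.

```python
import hashlib
import os
import random

# Simplified illustration of periodic integrity monitoring: before deleting its
# local copy, the client precomputes a few nonce-based challenges and the
# expected answers; later audits replay one challenge, so the provider cannot
# answer correctly without actually holding the blocks. This is a generic
# sketch, not the specific trust-and-encryption protocol proposed in the paper.
BLOCK = 4096

def answer(blocks, indices, nonce):
    h = hashlib.sha256(nonce)
    for i in indices:
        h.update(blocks[i])
    return h.digest()

data = os.urandom(20 * BLOCK)
blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]

# Client: precompute challenges, keep only (nonce, indices, expected digest).
challenges = []
for _ in range(5):
    nonce, indices = os.urandom(16), sorted(random.sample(range(len(blocks)), 3))
    challenges.append((nonce, indices, answer(blocks, indices, nonce)))

stored = blocks        # the provider's copy; the client may now drop its own

# Periodic audit: replay one unused challenge and compare the response.
nonce, indices, expected = challenges.pop()
provider_response = answer(stored, indices, nonce)
print("audit passed" if provider_response == expected else "integrity violation")
```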
Wound-healing outcomes using standardized assessment and care in clinical practice.
Bolton, Laura; McNees, Patrick; van Rijswijk, Lia; de Leon, Jean; Lyder, Courtney; Kobza, Laura; Edman, Kelly; Scheurich, Anne; Shannon, Ron; Toth, Michelle
2004-01-01
Wound-healing outcomes applying standardized protocols have typically been measured within controlled clinical trials, not natural settings. Standardized protocols of wound care have been validated for clinical use, creating an opportunity to measure the resulting outcomes. Wound-healing outcomes were explored during clinical use of standardized validated protocols of care based on patient and wound assessments. This was a prospective multicenter study of wound-healing outcomes management in real-world clinical practice. Healing outcomes from March 26 to October 31, 2001, were recorded on patients in 3 long-term care facilities, 1 long-term acute care hospital, and 12 home care agencies for wounds selected by staff to receive care based on computer-generated validated wound care algorithms. After diagnosis, wound dimensions and status were assessed using a tool adapted from the Pressure Sore Status Tool for use on all wounds. Wound, ostomy, and continence nursing professionals accessed consistent protocols of care, via telemedicine in home care or paper forms in long-term care. A physician entered assessments into a desktop computer in the wound clinic. Based on evidence that healing proceeds faster with fewer infections in environments without gauze, the protocols generally avoided gauze dressings. Most of the 767 wounds selected to receive the standardized protocols of care were stage III-IV pressure ulcers (n = 373; mean healing time 62 days) or full-thickness venous ulcers (n = 124; mean healing time 57 days). Partial-thickness wounds healed faster than same-etiology full-thickness wounds. These results provide benchmarks for natural-setting healing outcomes and help to define and address wound care challenges. Outcomes primarily using nongauze protocols of care matched or surpassed the best previously published results on similar wounds using gauze-based protocols of care, including protocols applying gauze impregnated with growth factors or other agents.
Lindemann, Ulrich; Zijlstra, Wiebren; Aminian, Kamiar; Chastin, Sebastien F M; de Bruin, Eling D; Helbostad, Jorunn L; Bussmann, Johannes B J
2014-01-10
Physical activity is an important determinant of health and well-being in older persons and contributes to their social participation and quality of life. Hence, assessment tools are needed to study this physical activity in free-living conditions. Wearable motion sensing technology is used to assess physical activity. However, there is a lack of harmonisation of validation protocols and applied statistics, which makes it hard to compare available and future studies. Therefore, the aim of this paper is to formulate recommendations for assessing the validity of sensor-based activity monitoring in older persons, with a focus on the measurement of body postures and movements. Validation studies of body-worn devices providing parameters on body postures and movements were identified and summarized, and an extensive interactive process among the authors resulted in recommendations about the information on the assessed persons, the technical system, and the analysis of relevant parameters of physical activity, based on a standardized and semi-structured protocol. The recommended protocols can be regarded as a first attempt to standardize validity studies in the area of monitoring physical activity.
On Ramps: Options and Issues in Accessing the Internet.
ERIC Educational Resources Information Center
Bocher, Bob
1995-01-01
Outlines the basic options that schools and libraries have for accessing the Internet, focusing on four models: direct connection; dial access using SLIP/PPP (Serial Line Internet Protocol/Point-to-Point Protocol); dial-up using terminal emulation mode; and dial access through commercial online services. Discusses access option issues such as…
Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conover, David R.; Crawford, Aladsair J.; Fuller, Jason C.
The Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage Systems (PNNL-22010) was first issued in November 2012 as a first step toward providing a foundational basis for developing an initial standard for the uniform measurement and expression of energy storage system (ESS) performance. Based on experiences with the application and use of that document, and to include additional ESS applications and associated duty cycles, test procedures and performance metrics, a first revision of the November 2012 Protocol was issued in June 2014 (PNNL-22010 Rev. 1). As an update of the 2014 revision 1 to the Protocol, this document (the March 2016 revision 2 to the Protocol) is intended to supersede the June 2014 revision 1 to the Protocol and provide a more user-friendly yet more robust and comprehensive basis for measuring and expressing ESS performance.
A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation.
Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf
2017-07-01
This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes' inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of the evidence for a trace specimen, e.g. a fingermark, and a reference specimen, e.g. a fingerprint, to originate from common or different sources. Some theoretical aspects of probabilities necessary for this Guideline were discussed prior to its elaboration, which started after a workshop of forensic researchers and practitioners involved in this topic. In the workshop, the following questions were addressed: "which aspects of a forensic evaluation scenario need to be validated?", "what is the role of the LR as part of a decision process?" and "how to deal with uncertainty in the LR calculation?". The question "what to validate?" focuses on the validation methods and criteria, while "how to validate?" deals with the implementation of the validation protocol. Answers to these questions were deemed necessary with several objectives. First, concepts typical for validation standards [1], such as performance characteristics, performance metrics and validation criteria, will be adapted or applied by analogy to the LR framework. Second, a validation strategy will be defined. Third, validation methods will be described. Finally, a validation protocol and an example of a validation report will be proposed, which can be applied to the forensic fields developing and validating LR methods for the evaluation of the strength of evidence at source level under the following propositions. Copyright © 2016. Published by Elsevier B.V.
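The central quantity of the guideline can be written as follows; the log-likelihood-ratio cost (Cllr) is shown as one commonly used performance metric for validating LR methods, included here for illustration rather than as the guideline's complete set of criteria.

```latex
% Likelihood ratio for evidence E under same-source vs. different-source
% propositions, and the log-likelihood-ratio cost (Cllr), a commonly used
% performance metric for validating LR methods (shown here for illustration).
\[
  \mathrm{LR} = \frac{\Pr(E \mid H_{\text{same source}})}
                     {\Pr(E \mid H_{\text{different source}})}
\]
\[
  C_{\mathrm{llr}} = \frac{1}{2}\left(
      \frac{1}{N_{ss}} \sum_{i=1}^{N_{ss}} \log_2\!\left(1 + \frac{1}{\mathrm{LR}_i}\right)
    + \frac{1}{N_{ds}} \sum_{j=1}^{N_{ds}} \log_2\!\left(1 + \mathrm{LR}_j\right)
  \right)
\]
```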
ERIC Educational Resources Information Center
Ganek, Hillary V.; Eriks-Brophy, Alice
2018-01-01
The aim of this study was to present a protocol for the validation of the Language ENvironment Analysis (LENA) System's conversational turn count (CTC) for Vietnamese speakers. Ten families of children aged between 22 and 42 months, recruited near Ho Chi Minh City, participated in this project. Each child wore the LENA audio recorder for a full…
Telemetry Standards, RCC Standard 106-17. Chapter 26. TmNSDataMessage Transfer Protocol
2017-07-01
Channel (RTSPDataChannel); 26.4.3 Reliability Critical (RC) Delivery Protocol... The error status code specified in RFC 2326 for "Request-URI Too Large" is 414. 26.4.1.5 Request Types: RTSPDataSources shall return valid ... to the following requirements. Valid TmNSDataMessages shall be delivered containing the original Packages matching the requested
Reimers, Mallory; Ernst, Neysa; Bova, Gregory; Nowakowski, Elaine; Bukowski, James; Ellis, Brandon C.; Smith, Chris; Sauer, Lauren; Dionne, Kim; Carroll, Karen C.; Maragakis, Lisa L.; Parrish, Nicole M.
2016-01-01
In response to the Ebola outbreak in 2014, many hospitals designated specific areas to care for patients with Ebola and other highly infectious diseases. The safe handling of category A infectious substances is a unique challenge in this environment. One solution is on-site waste treatment with a steam sterilizer or autoclave. The Johns Hopkins Hospital (JHH) installed two pass-through autoclaves in its biocontainment unit (BCU). The JHH BCU and The Johns Hopkins biosafety level 3 (BSL-3) clinical microbiology laboratory designed and validated waste-handling protocols with simulated patient trash to ensure adequate sterilization. The results of the validation process revealed that autoclave factory default settings are potentially ineffective for certain types of medical waste and highlighted the critical role of waste packaging in successful sterilization. The lessons learned from the JHH validation process can inform the design of waste management protocols to ensure effective treatment of highly infectious medical waste. PMID:27927920
Garibaldi, Brian T; Reimers, Mallory; Ernst, Neysa; Bova, Gregory; Nowakowski, Elaine; Bukowski, James; Ellis, Brandon C; Smith, Chris; Sauer, Lauren; Dionne, Kim; Carroll, Karen C; Maragakis, Lisa L; Parrish, Nicole M
2017-02-01
In response to the Ebola outbreak in 2014, many hospitals designated specific areas to care for patients with Ebola and other highly infectious diseases. The safe handling of category A infectious substances is a unique challenge in this environment. One solution is on-site waste treatment with a steam sterilizer or autoclave. The Johns Hopkins Hospital (JHH) installed two pass-through autoclaves in its biocontainment unit (BCU). The JHH BCU and The Johns Hopkins biosafety level 3 (BSL-3) clinical microbiology laboratory designed and validated waste-handling protocols with simulated patient trash to ensure adequate sterilization. The results of the validation process revealed that autoclave factory default settings are potentially ineffective for certain types of medical waste and highlighted the critical role of waste packaging in successful sterilization. The lessons learned from the JHH validation process can inform the design of waste management protocols to ensure effective treatment of highly infectious medical waste. Copyright © 2017 American Society for Microbiology.
Intelligent QoS routing algorithm based on improved AODV protocol for Ad Hoc networks
NASA Astrophysics Data System (ADS)
Huibin, Liu; Jun, Zhang
2016-04-01
Mobile Ad Hoc Networks play an increasingly important role in disaster relief, military battlefields and scientific exploration. However, routing difficulties are increasingly prominent because of their inherent structure. This paper proposes an improved cuckoo-search-based Ad hoc On-Demand Distance Vector routing protocol (CSAODV). It designs the calculation method of the optimal routing algorithm used by the protocol and the transmission mechanism for communication packets. By adding QoS constraints to the cuckoo-search calculation of the optimal route, the route found can conform to the requirements of specified bandwidth and time delay, and a balance can be obtained among computation cost, bandwidth and time delay. NS2 simulation software is used to test the performance of the protocol in three scenarios and to validate the feasibility and validity of the CSAODV protocol. The results show that the CSAODV routing protocol adapts better to changes in network topology than AODV, effectively improves the packet delivery fraction, reduces the transmission delay of the network, reduces the extra control overhead imposed on the network, and improves the routing efficiency of the network.
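A much-simplified sketch of the idea: candidate routes are scored with a fitness that penalises violations of bandwidth and delay (QoS) constraints, and a cuckoo-search-style loop keeps replacing poor nests with new random candidates. Real cuckoo search uses Levy-flight steps, and the routes, constraints and parameters below are invented for illustration.

```python
import random

# Much-simplified sketch of QoS-constrained route selection in the spirit of
# CSAODV: candidate routes are scored by a fitness that penalises bandwidth and
# delay violations, and a cuckoo-search-style loop replaces poor nests with new
# random candidates. Real cuckoo search uses Levy-flight steps; all routes,
# constraints and parameters here are invented for illustration.
random.seed(1)
ROUTES = [{"delay_ms": random.randint(10, 120),
           "bandwidth_mbps": random.randint(5, 100)} for _ in range(30)]

def fitness(route, max_delay=60, min_bw=20):
    penalty = 10 * max(0, route["delay_ms"] - max_delay)        # delay violation
    penalty += 10 * max(0, min_bw - route["bandwidth_mbps"])    # bandwidth violation
    return route["delay_ms"] + penalty                          # lower is better

def cuckoo_search(candidates, nests=6, iterations=20, abandon=2):
    population = random.sample(candidates, nests)
    for _ in range(iterations):
        cuckoo = random.choice(candidates)                 # a new candidate route
        worst = max(population, key=fitness)
        if fitness(cuckoo) < fitness(worst):
            population[population.index(worst)] = cuckoo   # replace a worse nest
        population.sort(key=fitness)
        population[-abandon:] = random.sample(candidates, abandon)  # rebuild worst nests
    return min(population, key=fitness)

best = cuckoo_search(ROUTES)
print("selected route:", best, "fitness:", fitness(best))
```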
Love, Seth; Chalmers, Katy; Ince, Paul; Esiri, Margaret; Attems, Johannes; Kalaria, Raj; Jellinger, Kurt; Yamada, Masahito; McCarron, Mark; Minett, Thais; Matthews, Fiona; Greenberg, Steven; Mann, David; Kehoe, Patrick Gavin
2015-01-01
In a collaboration involving 11 groups with research interests in cerebral amyloid angiopathy (CAA), we used a two-stage process to develop and in turn validate a new consensus protocol and scoring scheme for the assessment of CAA and associated vasculopathic abnormalities in post-mortem brain tissue. Stage one used an iterative Delphi-style survey to develop the consensus protocol. The resultant scoring scheme was tested on a series of digital images and paraffin sections that were circulated blind to a number of scorers. The scoring scheme and choice of staining methods were refined by open-forum discussion. The agreed protocol scored parenchymal and meningeal CAA on a 0-3 scale, capillary CAA as present/absent and vasculopathy on 0-2 scale, in the 4 cortical lobes that were scored separately. A further assessment involving three centres was then undertaken. Neuropathologists in three centres (Bristol, Oxford and Sheffield) independently scored sections from 75 cases (25 from each centre) and high inter-rater reliability was demonstrated. Stage two used the results of the three-centre assessment to validate the protocol by investigating previously described associations between APOE genotype (previously determined), and both CAA and vasculopathy. Association of capillary CAA with or without arteriolar CAA with APOE ε4 was confirmed. However APOE ε2 was also found to be a strong risk factor for the development of CAA, not only in AD but also in elderly non-demented controls. Further validation of this protocol and scoring scheme is encouraged, to aid its wider adoption to facilitate collaborative and replication studies of CAA.[This corrects the article on p. 19 in vol. 3, PMID: 24754000.].
Love, Seth; Chalmers, Katy; Ince, Paul; Esiri, Margaret; Attems, Johannes; Jellinger, Kurt; Yamada, Masahito; McCarron, Mark; Minett, Thais; Matthews, Fiona; Greenberg, Steven; Mann, David; Kehoe, Patrick Gavin
2014-01-01
In a collaboration involving 11 groups with research interests in cerebral amyloid angiopathy (CAA), we used a two-stage process to develop and in turn validate a new consensus protocol and scoring scheme for the assessment of CAA and associated vasculopathic abnormalities in post-mortem brain tissue. Stage one used an iterative Delphi-style survey to develop the consensus protocol. The resultant scoring scheme was tested on a series of digital images and paraffin sections that were circulated blind to a number of scorers. The scoring scheme and choice of staining methods were refined by open-forum discussion. The agreed protocol scored parenchymal and meningeal CAA on a 0-3 scale, capillary CAA as present/absent and vasculopathy on 0-2 scale, in the 4 cortical lobes that were scored separately. A further assessment involving three centres was then undertaken. Neuropathologists in three centres (Bristol, Oxford and Sheffield) independently scored sections from 75 cases (25 from each centre) and high inter-rater reliability was demonstrated. Stage two used the results of the three-centre assessment to validate the protocol by investigating previously described associations between APOE genotype (previously determined), and both CAA and vasculopathy. Association of capillary CAA with or without arteriolar CAA with APOE ε4 was confirmed. However APOE ε2 was also found to be a strong risk factor for the development of CAA, not only in AD but also in elderly non-demented controls. Further validation of this protocol and scoring scheme is encouraged, to aid its wider adoption to facilitate collaborative and replication studies of CAA. PMID:24754000
Pini, Claudio; Pastori, Marco; Baccheschi, Jordan; Omboni, Stefano; Parati, Gianfranco
2007-06-01
There is evidence that blood pressure measurement outside the doctor's office can provide valuable information for the diagnostic evaluation of hypertensive patients and for monitoring their response to treatment. Home blood pressure monitoring devices have a major role in this setting, provided that their accuracy in measuring blood pressure is demonstrated by validation studies. This study aimed at verifying whether the automatic electronic oscillometric blood pressure measuring device Artsana CSI 610 complied with the standard of accuracy indicated by the ESH International Protocol. Sequential measurements of systolic and diastolic blood pressure were obtained in 33 participants using the mercury sphygmomanometer (two observers) and the test device (one supervisor). A standard adult cuff was always employed during the study. According to the ESH validation protocol, 99 couples of test device and reference blood pressure measurements were obtained during the two phases of the study (three pairs for each of the 33 participants). The Artsana CSI 610 device successfully passed phase 1 of study validation with the number of absolute differences between test and reference device never <35 within 5 mmHg and never <40 within 10 and 15 mmHg. The test device also passed phase 2 of the validation study with a mean (+/-SD) device-observer difference of -1.4+/-4.8 mmHg for systolic and -0.9+/-3.5 mmHg for diastolic blood pressure. According to the results of the validation study on the basis of the ESH International Protocol, the Artsana CSI 610 can be recommended for clinical use in adults.
Xie, Peigen; Wang, Yanling; Xu, Xiaoying; Huang, Fei; Pan, Jingru
2015-04-01
The objective of this study was to determine the accuracy of the Pangao PG-800A11 wrist blood pressure monitor according to the European Society of Hypertension International Protocol (ESH-IP) revision 2010 and the protocol of the British Hypertension Society (BHS). The device evaluations were performed in 85 participants, 33 of whom were included according to the ESH-IP revision 2010 and 52 of whom were included on the basis of the requirements of the BHS protocol. The validation procedure and data analysis followed the protocols precisely. The device achieved an A/A grading for the BHS protocol and passed all phases of the ESH-IP revision 2010 protocol. The mean difference ±SD for the ESH and BHS protocols, respectively, was -0.6±4.5 and -0.8±6.2 mmHg for systolic pressure and 1.2±4.6 and -0.5±5.1 mmHg for diastolic pressure. The device maintained its A/A grading throughout the low, medium, and high-pressure ranges. The Pangao PG-800A11 wrist blood pressure monitor passed all requirements of the ESH-IP revision 2010 and achieved A/A grade of the BHS protocol in an adult population.
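BHS grading works from the cumulative percentage of absolute device-observer differences within 5, 10 and 15 mmHg. The sketch below illustrates that calculation; the grade thresholds shown are illustrative placeholders rather than a reproduction of the BHS tables.

```python
# Illustrative BHS-style grading: compute the cumulative percentage of absolute
# device-observer differences within 5/10/15 mmHg and map them to a grade.
# The threshold table is passed in by the caller; the values shown in the demo
# are placeholders, not the official BHS requirements.

def cumulative_percentages(device, reference):
    n = len(device)
    diffs = [abs(d - r) for d, r in zip(device, reference)]
    return {band: 100.0 * sum(diff <= band for diff in diffs) / n for band in (5, 10, 15)}

def bhs_grade(pcts, thresholds):
    """thresholds: ordered mapping grade -> {band: minimum cumulative %}."""
    for grade, mins in thresholds.items():
        if all(pcts[band] >= mins[band] for band in (5, 10, 15)):
            return grade
    return "D"

demo_thresholds = {           # placeholder values for illustration only
    "A": {5: 60, 10: 85, 15: 95},
    "B": {5: 50, 10: 75, 15: 90},
    "C": {5: 40, 10: 65, 15: 85},
}
pcts = cumulative_percentages([120, 131, 142, 118, 125], [122, 130, 147, 119, 133])
print(pcts, bhs_grade(pcts, demo_thresholds))
```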
Yu, Wenbo; Lakkaraju, Sirish Kaushik; Raman, E. Prabhu; Fang, Lei; MacKerell, Alexander D.
2015-01-01
Receptor-based pharmacophore modeling is an efficient computer-aided drug design technique that uses the structure of the target protein to identify novel leads. However, most methods consider protein flexibility and desolvation effects in a very approximate way, which may limit their use in practice. The Site-Identification by Ligand Competitive Saturation (SILCS) assisted pharmacophore modeling protocol (SILCS-Pharm) was introduced recently to address these issues as SILCS naturally takes both protein flexibility and desolvation effects into account by using full MD simulations to determine 3D maps of the functional group-affinity patterns on a target receptor. In the present work, the SILCS-Pharm protocol is extended to use a wider range of probe molecules including benzene, propane, methanol, formamide, acetaldehyde, methylammonium, acetate and water. This approach removes the previous ambiguity brought by using water as both the hydrogen-bond donor and acceptor probe molecule. The new SILCS-Pharm protocol is shown to yield improved screening results as compared to the previous approach based on three target proteins. Further validation of the new protocol using five additional protein targets showed improved screening compared to those using common docking methods, further indicating improvements brought by the explicit inclusion of additional feature types associated with the wider collection of probe molecules in the SILCS simulations. The advantage of using complementary features and volume constraints, based on exclusion maps of the protein defined from the SILCS simulations, is presented. In addition, re-ranking using SILCS-based ligand grid free energies is shown to enhance the diversity of identified ligands for the majority of targets. These results suggest that the SILCS-Pharm protocol will be of utility in rational drug design. PMID:25622696
Use of enzymatic cleaners on US Navy ships. Research report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venkatachalam, R.S.
1996-03-01
The Naval Surface Warfare Center, Carderock Division, conducted a study to determine the feasibility of using enzymatic and bacterial products in cleaning applications aboard U.S. Navy ships. A review of the most recent technical literature and a survey of potential suppliers were conducted. In addition, shipboard systems, subsystems and housekeeping processes were evaluated to identify suitable applications for enzymatic and bacterial cleaners. The study identified numerous commercial products that, based on manufacturers' claims, would be effective and safe for use aboard ship to clean walls, floors, galley work surfaces, engine and machine parts, drains, pipes, grease traps, collection, holding and transfer (CHT) tanks, ballast tanks and bilge areas. However, the study also revealed the absence of standardized test protocols essential for validation of manufacturers' claims, and recommended the cooperative development of such protocols by representatives from the commercial sector, Government and academia. The need to obtain meaningful cost information based on actual use scenarios and to investigate any permitting issues associated with the discharge of related wastes to pierside facilities was also identified.
A Simplified Test for Blanching Susceptibility of Copper Alloys
NASA Technical Reports Server (NTRS)
Thomas-Ogbuji, Linus U.; Humphrey, Donald; Setlock, John
2003-01-01
GRCop-84 (Cu-8Cr-4Nb) is a dispersion-strengthened alloy developed for space-launch rocket engine applications, as a liner for the combustion chamber and nozzle ramp. Its main advantage over rival alloys, particularly NARloy-Z (Cu-Ag-Zr), the current liner alloy, is in high-temperature mechanical properties. Further validation required that the two alloys be compared with respect to service performance and durability. This has been done under conditions resembling those expected in reusable launch engine applications. GRCop-84 was found to have a superior resistance to static and cyclic oxidation up to approximately 700 °C. In order to improve its performance above 700 °C, Cu-Cr coatings have also been developed and evaluated. The major oxidative issue with Cu alloys is blanching, a mode of degradation induced by oxidation-reduction fluctuations in hydrogen-fueled engines. That fluctuation cannot be addressed with conventional static or cyclic oxidation testing. Hence, a further evaluation of the alloy substrates and Cu-Cr coating material necessitated our devising a test protocol that involves oxidation-reduction cycles. This paper describes the test protocols used and the results obtained.
Active surveillance for nonmuscle invasive bladder cancer.
Miyake, Makito; Fujimoto, Kiyohide; Hirao, Yoshihiko
2016-06-01
Nonmuscle invasive bladder cancer (NMIBC) is known to be a heterogeneous malignancy that requires varying treatment modalities and follow-up schedules. Low-grade Ta papillary tumors are categorized as low-risk NMIBC because of their favorable prognosis. There is a growing movement to avoid overdiagnosis and overtreatment, considering the economic impact and the patients' quality of life. It has been over 10 years since the initial assessment of active surveillance for low-risk NMIBC suggested its feasibility and safety. However, urologists are still unfamiliar with this treatment option, which can be ideal in appropriately selected patients. In this review article, we focus on active surveillance for low-risk NMIBC and discuss the evidence and rationale for this treatment option. There are several issues to resolve in order to advocate active surveillance as a standard option in selected patients. A specific follow-up protocol, including the intervals of cystoscopy, urine cytology, urine markers, and other radiographic examinations, needs to be optimized and validated. Finally, we integrate the available data into the follow-up strategy and propose a new protocol for active surveillance of recurrent low-risk bladder cancer.
De Jongh, A; Ten Broeke, E; Renssen, M R
1999-01-01
This paper considers the current empirical status of Eye Movement Desensitization and Reprocessing (EMDR) as a treatment method for specific phobias, along with some conceptual and practical issues in relation to its use. Both uncontrolled and controlled studies on the application of EMDR with specific phobias demonstrate that EMDR can produce significant improvements within a limited number of sessions. With regard to the treatment of childhood spider phobia, EMDR has been found to be more effective than a placebo control condition, but less effective than exposure in vivo. The empirical support for EMDR with specific phobias is still meagre; therefore, one should remain cautious. However, given that there is insufficient research to validate any method for complex or trauma-related phobias, that EMDR is a time-limited procedure, and that it can be used in cases for which an exposure in vivo approach is difficult to administer, the application of EMDR with specific phobias merits further clinical and research attention.
OSI-compatible protocols for mobile-satellite communications: The AMSS experience
NASA Technical Reports Server (NTRS)
Moher, Michael
1990-01-01
The protocol structure of the international aeronautical mobile satellite service (AMSS) is reviewed with emphasis on those aspects of protocol performance, validation, and conformance which are peculiar to mobile services. This is in part an analysis of what can be learned from the AMSS experience with protocols which is relevant to the design of other mobile satellite data networks, e.g., land mobile.
Wong, Adrian; Nyenhuis, David; Black, Sandra E; Law, Lorraine S N; Lo, Eugene S K; Kwan, Pauline W L; Au, Lisa; Chan, Anne Y Y; Wong, Lawrence K S; Nasreddine, Ziad; Mok, Vincent
2015-04-01
The National Institute of Neurological Disorders and Stroke-Canadian Stroke Network Vascular Cognitive Impairment Harmonization working group proposed a brief cognitive protocol for screening of vascular cognitive impairment. We investigated the validity, reliability, and feasibility of the Montreal Cognitive Assessment 5-minute protocol (MoCA 5-minute protocol) administered over the telephone. Four items examining attention, verbal learning and memory, executive functions/language, and orientation were extracted from the MoCA to form the MoCA 5-minute protocol. One hundred four patients with stroke or transient ischemic attack, including 53 with normal cognition (Clinical Dementia Rating, 0) and 51 with cognitive impairment (Clinical Dementia Rating, 0.5 or 1), were administered the MoCA in clinic and a month later, the MoCA 5-minute protocol over the telephone. Administration of the MoCA 5-minute protocol took 5 minutes over the telephone. Total score of the MoCA 5-minute protocol correlated negatively with age (r=-0.36; P<0.001) and positively with years of education (r=0.41; P<0.001) but not with sex (ρ=0.03; P=0.773). Total scores of the MoCA and MoCA 5-minute protocol were highly correlated (r=0.87; P<0.001). The MoCA 5-minute protocol performed equally well as the MoCA in differentiating patients with cognitive impairment from those without (areas under receiver operating characteristics curve for MoCA 5-minute protocol, 0.78; MoCA=0.74; P>0.05 for difference; Cohen d for group difference, 0.80-1.13). It differentiated cognitively impaired patients with executive domain impairment from those without (areas under receiver operating characteristics curve, 0.89; P<0.001; Cohen d=1.7 for group difference). Thirty-day test-retest reliability was excellent (intraclass correlation coefficient, 0.89). The MoCA 5-minute protocol is a free, valid, and reliable cognitive screen for stroke and transient ischemic attack. It is brief and highly feasible for telephone administration. © 2015 American Heart Association, Inc.
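Two of the reported analyses, discrimination (area under the ROC curve) and effect size (Cohen's d), can be illustrated with a short sketch on synthetic screen scores; the numbers below are invented, and scikit-learn is assumed to be available.

```python
# Illustrative sketch of the reported analyses: ROC AUC of a screening score for
# discriminating impaired vs non-impaired participants, plus Cohen's d for the
# group difference. Scores below are synthetic, not study data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
impaired = rng.normal(18, 4, 50)       # lower screen scores in the impaired group
not_impaired = rng.normal(24, 4, 50)

scores = np.concatenate([impaired, not_impaired])
labels = np.concatenate([np.ones(50), np.zeros(50)])   # 1 = cognitively impaired

# Lower scores indicate impairment, so negate the score when computing AUC.
auc = roc_auc_score(labels, -scores)

def cohens_d(a, b):
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    return (b.mean() - a.mean()) / pooled_sd

print(f"AUC = {auc:.2f}, Cohen's d = {cohens_d(impaired, not_impaired):.2f}")
```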
Security of a sessional blind signature based on quantum cryptograph
NASA Astrophysics Data System (ADS)
Wang, Tian-Yin; Cai, Xiao-Qiu; Zhang, Rui-Ling
2014-08-01
We analyze the security of a sessional blind signature protocol based on quantum cryptography and show that there are two security leaks in this protocol. One is that the legal user Alice can change the signed message after she gets a valid blind signature from the signatory Bob, and the other is that an external opponent Eve can also forge a valid blind message by a special attack; neither is permitted for a blind signature scheme. Therefore, this protocol is not secure in the sense that it does not satisfy the non-forgeability of blind signatures. Finally, we discuss methods to prevent these attack strategies.
A calibration protocol for population-specific accelerometer cut-points in children.
Mackintosh, Kelly A; Fairclough, Stuart J; Stratton, Gareth; Ridgers, Nicola D
2012-01-01
To test a field-based protocol using intermittent activities representative of children's physical activity behaviours, to generate behaviourally valid, population-specific accelerometer cut-points for sedentary behaviour, moderate, and vigorous physical activity. Twenty-eight children (46% boys) aged 10-11 years wore a hip-mounted uniaxial GT1M ActiGraph and engaged in 6 activities representative of children's play. A validated direct observation protocol was used as the criterion measure of physical activity. Receiver Operating Characteristics (ROC) curve analyses were conducted with four semi-structured activities to determine the accelerometer cut-points. To examine classification differences, cut-points were cross-validated with free-play and DVD viewing activities. Cut-points of ≤372, >2160 and >4806 counts·min(-1) representing sedentary, moderate and vigorous intensity thresholds, respectively, provided the optimal balance between the related needs for sensitivity (accurately detecting activity) and specificity (limiting misclassification of the activity). Cross-validation data demonstrated that these values yielded the best overall kappa scores (0.97; 0.71; 0.62), and a high classification agreement (98.6%; 89.0%; 87.2%), respectively. Specificity values of 96-97% showed that the developed cut-points accurately detected physical activity, and sensitivity values (89-99%) indicated that minutes of activity were seldom incorrectly classified as inactivity. The development of an inexpensive and replicable field-based protocol to generate behaviourally valid and population-specific accelerometer cut-points may improve the classification of physical activity levels in children, which could enhance subsequent intervention and observational studies.
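A minimal sketch of the ROC-based cut-point derivation described above, using synthetic accelerometer counts per minute and a Youden-index criterion; the study's exact criterion for balancing sensitivity and specificity may differ.

```python
# Illustrative ROC-based cut-point derivation, in the spirit of the study's analysis:
# directly observed intensity labels vs accelerometer counts per minute, with the
# cut-point chosen by the Youden index (the study's exact criterion may differ).
# Data are synthetic.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(1)
counts_sedentary = rng.normal(250, 120, 200).clip(0)
counts_active = rng.normal(2800, 900, 200).clip(0)

counts = np.concatenate([counts_sedentary, counts_active])
is_active = np.concatenate([np.zeros(200), np.ones(200)])   # criterion from observation

fpr, tpr, thresholds = roc_curve(is_active, counts)
best = np.argmax(tpr - fpr)                                  # Youden index J = sens + spec - 1
print(f"cut-point ~ {thresholds[best]:.0f} counts/min, "
      f"sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")
```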
Validation of a school-based amblyopia screening protocol in a kindergarten population.
Casas-Llera, Pilar; Ortega, Paula; Rubio, Inmaculada; Santos, Verónica; Prieto, María J; Alio, Jorge L
2016-08-04
To validate a school-based amblyopia screening program model by comparing its outcomes to those of a state-of-the-art conventional ophthalmic clinic examination in a kindergarten population of children between the ages of 4 and 5 years. An amblyopia screening protocol, which consisted of visual acuity measurement using Lea charts, ocular alignment test, ocular motility assessment, and stereoacuity with TNO random-dot test, was performed at school in a pediatric 4- to 5-year-old population by qualified healthcare professionals. The outcomes were validated in a selected group by a conventional ophthalmologic examination performed in a fully equipped ophthalmologic center. The ophthalmologic evaluation was used to confirm whether or not children were correctly classified by the screening protocol. The sensitivity and specificity of the test model to detect amblyopia were established. A total of 18,587 4- to 5-year-old children were subjected to the amblyopia screening program during the 2010-2011 school year. A population of 100 children were selected for the ophthalmologic validation screening. A sensitivity of 89.3%, specificity of 93.1%, positive predictive value of 83.3%, negative predictive value of 95.7%, positive likelihood ratio of 12.86, and negative likelihood ratio of 0.12 was obtained for the amblyopia screening validation model. The amblyopia screening protocol model tested in this investigation shows high sensitivity and specificity in detecting high-risk cases of amblyopia compared to the standard ophthalmologic examination. This screening program may be highly relevant for amblyopia screening at schools.
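The reported screening-accuracy measures follow directly from a 2x2 table. The sketch below reproduces the calculation; the abstract does not give the raw counts, so the table used here is inferred to be consistent with the reported figures and is for illustration only.

```python
# Illustrative calculation of the screening-accuracy measures reported above
# (sensitivity, specificity, predictive values, likelihood ratios) from a 2x2 table.
# The counts are inferred for the example, not taken from the study.
def screening_metrics(tp, fp, fn, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "lr_positive": sens / (1 - spec),
        "lr_negative": (1 - sens) / spec,
    }

for name, value in screening_metrics(tp=25, fp=5, fn=3, tn=67).items():
    print(f"{name}: {value:.2f}")
```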
A Synopsis of Technical Issues of Concern for Monitoring Trace Elements in Highway and Urban Runoff
Breault, Robert F.; Granato, Gregory E.
2000-01-01
Trace elements, which are regulated for aquatic life protection, are a primary concern in highway- and urban-runoff studies because stormwater runoff may transport these constituents from the land surface to receiving waters. Many of these trace elements are essential for biological activity and become detrimental only when geologic or anthropogenic sources exceed concentrations beyond ranges typical of the natural environment. The Federal Highway Administration and State Transportation Agencies are concerned about the potential effects of highway runoff on the watershed scale and for the management and protection of watersheds. Transportation agencies need information that is documented as valid, current, and scientifically defensible to support planning and management decisions. There are many technical issues of concern for monitoring trace elements; therefore, trace-element data commonly are considered suspect, and the responsibility to provide data-quality information to support the validity of reported results rests with the data-collection agency. Paved surfaces are fundamentally different physically, hydraulically, and chemically from the natural surfaces typical of most freshwater systems that have been the focus of many trace-element-monitoring studies. Existing scientific conceptions of the behavior of trace elements in the environment are based largely upon research on natural systems, rather than on systems typical of pavement runoff. Additionally, the logistics of stormwater sampling are difficult because of the great uncertainty in the occurrence and magnitude of storm events. Therefore, trace-element monitoring programs may be enhanced if monitoring and sampling programs are automated. Automation would standardize the process and provide a continuous record of the variations in flow and water-quality characteristics. Great care is required to collect and process samples in a manner that will minimize potential contamination or attenuation of trace elements and other sources of bias and variability in the sampling process. Trace elements have both natural and anthropogenic sources that may affect the sampling process, including the sample-collection and handling materials used in many trace-element monitoring studies. Trace elements also react with these materials within the timescales typical for collection, processing and analysis of runoff samples. To study the characteristics and potential effects of trace elements in highway and urban runoff, investigators typically sample one or more operationally defined matrixes including: whole water, dissolved (filtered water), suspended sediment, bottom sediment, biological tissue, and contaminant sources. The sampling and analysis of each of these sample matrixes can provide specific information about the occurrence and distribution of trace elements in runoff and receiving waters. There are, however, technical concerns specific to each matrix that must be understood and addressed through use of proper collection and processing protocols. Valid protocols are designed to minimize inherent problems and to maximize the accuracy, precision, comparability, and representativeness of data collected. Documentation, including information about monitoring protocols, quality assurance and quality control efforts, and ancillary data also is necessary to establish data quality.
This documentation is especially important for evaluation of historical trace-element monitoring data, because trace-element monitoring protocols and analysis methods have been constantly changing over the past 30 years.
Manualized treatment programs for FSD: research challenges and recommendations.
Hucker, Alice; McCabe, Marita P
2012-02-01
The use of manualized treatment programs offers a useful research framework for assessing psychotherapeutic interventions for female sexual dysfunctions (FSDs), but it does not address all issues related to methodological rigor and replication, and raises new research issues in need of discussion. The goals of this manuscript are to review the literature on treatment trials utilizing manualized psychotherapy treatments for FSD and to explore the benefits and research issues associated with the flexible use of treatment manuals. The method used was the review of the relevant literature. While the use of manualized treatments for FSDs can address certain methodological issues inherent in psychotherapy research, flexibility in manual administration is necessary in order to allow tailoring for individual needs that can be beneficial to both the participant and the research. The flexible use of manuals, as opposed to strict manual adherence, may also be more relevant for clinical utility. In order to administer manualized treatments for FSDs with appropriate flexibility, while also maximizing internal validity and replicability, the authors recommend that predetermined decision rules be utilized to guide individual tailoring, that potential gaps in the manual be identified and addressed, and that differing levels of motivation and readiness for treatment be taken into consideration in the treatment protocol. © 2011 International Society for Sexual Medicine.
Systems Engineering and Integration for Advanced Life Support System and HST
NASA Technical Reports Server (NTRS)
Kamarani, Ali K.
2005-01-01
The systems engineering (SE) discipline has revolutionized the way engineers and managers think about solving issues related to the design of complex systems. With the continued development of state-of-the-art technologies, systems are becoming more complex and, therefore, a systematic approach is essential to control and manage their integrated design and development. This complexity is driven by integration issues. In this case, subsystems must interact with one another in order to achieve integration objectives, and also achieve the overall system's required performance. The systems engineering process addresses these issues at multiple levels. It is a technology and management process dedicated to controlling all aspects of the system life cycle to assure integration at all levels. The Advanced Integration Matrix (AIM) project serves as the systems engineering and integration function for the Human Support Technology (HST) program. AIM provides means for integrated test facilities and personnel for performance trade studies, analyses, integrated models, test results, and validated requirements of the integration of HST. The goal of AIM is to address systems-level integration issues for exploration missions. It will use an incremental systems integration approach to yield technologies, baselines for further development, and possible breakthrough concepts in the areas of technological and organizational interfaces, total information flow, system-wide controls, technical synergism, mission operations protocols and procedures, and human-machine interfaces.
Garcia Hejl, Carine; Ramirez, Jose Manuel; Vest, Philippe; Chianea, Denis; Renard, Christophe
2014-09-01
Laboratories working towards accreditation by the International Standards Organization (ISO) 15189 standard are required to demonstrate the validity of their analytical methods. The different guidelines set by various accreditation organizations make it difficult to provide objective evidence that an in-house method is fit for the intended purpose. Moreover, the required performance characteristics tests and acceptance criteria are not always detailed. The laboratory must choose the most suitable validation protocol and set the acceptance criteria. Therefore, we propose a validation protocol to evaluate the performance of an in-house method. As an example, we validated the process for the detection and quantification of lead in whole blood by electrothermal absorption spectrometry. The fundamental parameters tested were selectivity, calibration model, precision, accuracy (and uncertainty of measurement), contamination, stability of the sample, reference interval, and analytical interference. We have developed a protocol that has been applied successfully to quantify lead in whole blood by electrothermal atomic absorption spectrometry (ETAAS). In particular, our method is selective, linear, accurate, and precise, making it suitable for use in routine diagnostics.
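Three of the listed validation parameters (calibration model, precision, accuracy) lend themselves to a short worked sketch; the concentrations, absorbances, and reference value below are invented, not data from the study.

```python
# Illustrative sketch of three of the validation parameters listed above
# (calibration model, precision, accuracy) on synthetic ETAAS-style data;
# concentrations, absorbances and the reference value are invented.
import numpy as np

# Calibration model: linear fit of absorbance vs lead concentration (ug/L).
conc = np.array([0, 50, 100, 200, 400])
absorbance = np.array([0.002, 0.051, 0.099, 0.201, 0.405])
slope, intercept = np.polyfit(conc, absorbance, 1)

def predict_conc(a):
    return (a - intercept) / slope

# Precision: repeatability expressed as coefficient of variation of replicates.
replicates = np.array([0.151, 0.148, 0.153, 0.150, 0.149])
measured = predict_conc(replicates)
cv_percent = 100 * measured.std(ddof=1) / measured.mean()

# Accuracy: recovery against an assumed certified reference value of 150 ug/L.
recovery_percent = 100 * measured.mean() / 150.0

print(f"slope={slope:.5f}, intercept={intercept:.5f}")
print(f"repeatability CV = {cv_percent:.1f}%, recovery = {recovery_percent:.1f}%")
```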
Sathar, Aslam; Dhai, Amaboo; van der Linde, Stephan
2014-12-01
Human Biological Materials (HBMs) are an invaluable resource in biomedical research. To determine if researchers and a Research Ethics Committee (REC) at a South African institution addressed ethical issues pertaining to HBMs in collaborative research with developed countries. Ethically approved retrospective cross-sectional descriptive audit. Of the 1305 protocols audited, 151 (11.57%) fulfilled the study's inclusion criteria. Compared to other developed countries, a majority of sponsors (90) were from the USA (p = 0.0001). The principal investigators (PIs) in all 151 protocols informed the REC of their intent to store HBMs. Only 132 protocols informed research participants (P < 0.0001). In 148 protocols, informed consent (IC) was obtained from research participants; 116 protocols (76.8%) solicited broad consent compared to specific consent (32; 21.2%) [p < 0.0001]. In 105 cases a code was used to maintain confidentiality. HBMs were anonymised in 14 protocols [p < 0.0001]. More protocols informed the REC (90) than the research participants (67) that HBMs would be exported (p = 0.011). Export permits (EPs) and Material Transfer Agreements (MTAs) were not available in 109 and 143 protocols, respectively. Researchers and the REC did not adequately address the inter-related ethical and regulatory issues pertaining to HBMs. There was a lack of congruence between the ethical guidelines of developed countries and their actions, which are central to access to HBMs in collaborative research. HBMs may be leaving South Africa without EPs and MTAs during the process of international collaborative research. © 2013 John Wiley & Sons Ltd.
Jumhawan, Udi; Yamashita, Toshiyuki; Ishida, Kazuya; Fukusaki, Eiichiro; Bamba, Takeshi
2017-01-01
There is an urgent need to develop a new protocol for evaluating chemical substances that may interact with the endocrine system and induce numerous pathological issues. The recently validated in vitro screening assay is limited to monitoring two steroid hormones. Methodology & results: The H295R model cell was exposed to seven endocrine disrupting chemicals (EDCs). The levels of 17 steroid hormones in cell extracts were subsequently determined by a quantitative targeted GC/MS/MS method. Through wide coverage, this system managed to capture the effects of exposure to increasing EDC concentrations throughout the steroidogenic pathways. The developed approach could be beneficial for the mechanistic investigation of EDCs.
Community detection in networks: A user guide
NASA Astrophysics Data System (ADS)
Fortunato, Santo; Hric, Darko
2016-11-01
Community detection in networks is one of the most popular topics of modern network science. Communities, or clusters, are usually groups of vertices having higher probability of being connected to each other than to members of other groups, though other patterns are possible. Identifying communities is an ill-defined problem. There are no universal protocols on the fundamental ingredients, like the definition of community itself, nor on other crucial issues, like the validation of algorithms and the comparison of their performances. This has generated a number of confusions and misconceptions, which undermine the progress in the field. We offer a guided tour through the main aspects of the problem. We also point out strengths and weaknesses of popular methods, and give directions to their use.
Designing typefaces for maps. A protocol of tests.
NASA Astrophysics Data System (ADS)
Biniek, Sébastien; Touya, Guillaume; Rouffineau, Gilles; Huot-Marchand, Thomas
2018-05-01
Text management in map design is a topic generally linked to placement and composition issues, whereas the issue of type design is rarely addressed, or at best only partially. Moreover, typefaces designed especially for maps are rare. This paper presents a protocol of tests to evaluate characters for digital topographic maps, and applies this protocol, through the use of geographical information systems, to fonts that were designed for the screen. The work was launched by the Atelier National de Recherche Typographique (ANRT, located in Nancy, France) and took place during its 'post-master' course in 2013. The purpose is to isolate different issues inherent to text in a topographic map: map background, nonlinear text placement and toponymic hierarchies. Further research is necessary to improve this kind of approach.
Tortosa, Jean-Christophe; Rodríguez-Arias Vailhen, David; Moutel, Grégoire
2010-02-01
France, Spain and the US are three leading countries in organ procurement and transplantation activity. Donation after cardiac death is one of the strategies they have implemented in order to address organ shortage. Donation after cardiac death is internationally considered to be an encouraging source of organs for transplantation, both because of its capacity to significantly increase the donor pool and because of the quality of the organs obtained from non-heart-beating organ donors. These protocols give rise to important ethical issues that have been widely discussed in the international literature. The aim of this paper is to identify and discuss the ethical issues that these protocols raise in these three countries.
Paes, Thaís; Machado, Felipe Vilaça Cavallari; Cavalheri, Vinícius; Pitta, Fabio; Hernandes, Nidia Aparecida
2017-07-01
People with chronic obstructive pulmonary disease (COPD) present symptoms such as dyspnea and fatigue, which hinder their performance in activities of daily living (ADL). A few multitask protocols have been developed to assess ADL performance in this population, although the measurement properties of such protocols have not yet been systematically reviewed. Areas covered: Studies were included if an assessment of the ability to perform ADL was conducted in people with COPD using an (objective) performance-based protocol. The search was conducted in the following databases: Pubmed, EMBASE, Cochrane Library, PEDro, CINAHL and LILACS. Furthermore, hand searches were conducted. Expert commentary: To date, only three protocols have had their measurement properties described: the Glittre ADL Test, the Monitored Functional Task Evaluation and the Londrina ADL Protocol were shown to be valid and reliable, whereas only the Glittre ADL Test was shown to be responsive to change after pulmonary rehabilitation. These protocols can be used in laboratory settings and clinical practice to evaluate ADL performance in people with COPD, although there is a need for more in-depth information on their validity, reliability and especially responsiveness, given the growing interest in the accurate assessment of ADL performance in this population.
Huang, Jinhua; Wang, Yun; Liu, Zhaoying; Wang, Yuling
2017-02-01
The aim of this study was to determine the accuracy of the Grandway MD2301 digital automatic blood pressure monitor by the British Hypertension Society (BHS) and the Association for the Advancement of Medical Instrumentation (AAMI)/the International Organization for Standardization (ISO) protocols. A total of 85 participants were included for evaluation based on the requirements of the BHS and the AAMI/ISO protocols. The validation procedure and data analysis followed the protocols precisely. The device achieved A/A grading for the BHS protocol and maintained A/A grading throughout the low, medium and high blood pressure ranges. The device also fulfilled the requirement of the AAMI/ISO protocol with device-observer differences of -0.9±5.6 and 0.8±5.2 mmHg for systolic and diastolic blood pressure, respectively, for criterion 1, and -0.9±4.7 and 0.8±4.2 mmHg, respectively, for criterion 2. The Grandway MD2301 digital automatic blood pressure monitor achieved A/A grade of the BHS protocol and passed the requirements of the AAMI/ISO protocol in adults.
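A criterion-1-style check reduces to the mean and standard deviation of device-minus-observer differences. A minimal sketch follows; the 5 and 8 mmHg limits reflect the commonly cited AAMI/ISO criterion 1, but the standard itself should be consulted, and the paired readings are synthetic.

```python
# Illustrative criterion-1-style check: mean and SD of device-minus-observer
# differences compared against limits. The limits of 5 and 8 mmHg reflect the
# commonly cited AAMI/ISO criterion 1, but the standard itself should be
# consulted; the paired readings here are synthetic.
import statistics

def criterion_1(device, observer, mean_limit=5.0, sd_limit=8.0):
    diffs = [d - o for d, o in zip(device, observer)]
    mean_diff = statistics.mean(diffs)
    sd_diff = statistics.stdev(diffs)
    return mean_diff, sd_diff, abs(mean_diff) <= mean_limit and sd_diff <= sd_limit

device_dbp   = [78, 82, 91, 74, 88, 69, 95, 80]
observer_dbp = [80, 81, 93, 75, 86, 72, 94, 82]

mean_diff, sd_diff, passed = criterion_1(device_dbp, observer_dbp)
print(f"mean = {mean_diff:.1f} mmHg, SD = {sd_diff:.1f} mmHg, pass = {passed}")
```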
Berger, Cezar; Freitas, Renato; Malafaia, Osvaldo; Pinto, José Simão de Paula; Mocellin, Marcos; Macedo, Evaldo; Fagundes, Marina Serrato Coelho
2012-01-01
Introduction: In the health field, computerization has become increasingly necessary in professional practice, since it facilitates data recovery and assists in the development of research with greater scientific rigor. Objective: The present work aimed to develop, apply, and validate specific electronic protocols for patients referred for rhinoplasty. Methods: The prospective research had 3 stages: (1) preparation of theoretical databases; (2) creation of a master protocol using the Integrated System of Electronic Protocol (SINPE©); and (3) elaboration, application, and validation of a specific protocol for the nose and sinuses regarding rhinoplasty. Results: After the preparation of the master protocol, which dealt with the entire field of otorhinolaryngology, we devised a specific protocol containing all matters related to the patient; in particular, the aesthetic and functional nasal complaints referred for surgical treatment (i.e., rhinoplasty) were organized into 6 main hierarchical categories: anamnesis, physical examination, complementary exams, diagnosis, treatment, and outcome. This protocol utilized these categories and their sub-items: purpose; access; surgical maneuvers on the nasal dorsum, tip, and base; clinical evolution after 3, 6, and 12 months; revisional surgery; and quantitative and qualitative evaluations. Conclusion: The developed specific electronic protocol is feasible and important for registering information from patients referred for rhinoplasty. PMID:25991979
de Paula, Jonas Jardim; Bertola, Laiss; Ávila, Rafaela Teixeira; Moreira, Lafaiete; Coutinho, Gabriel; de Moraes, Edgar Nunes; Bicalho, Maria Aparecida Camargos; Nicolato, Rodrigo; Diniz, Breno Satler; Malloy-Diniz, Leandro Fernandes
2013-01-01
Background and Objectives: The neuropsychological exam plays a central role in the assessment of elderly patients with cognitive complaints. It is particularly relevant to differentiate patients with mild dementia from those subjects with mild cognitive impairment. Formal education is a critical factor in neuropsychological performance; however, few studies have evaluated the psychometric properties, especially criterion-related validity, of neuropsychological tests for patients with low formal education. The present study aims to investigate the validity of an unstructured neuropsychological assessment protocol for this population and develop cutoff values for clinical use. Methods and Results: A protocol composed of the Rey Auditory Verbal Learning Test, Frontal Assessment Battery, Category and Letter Fluency, Stick Design Test, Clock Drawing Test, Digit Span, Token Test and TN-LIN was administered to 274 older adults (96 normal aging, 85 mild cognitive impairment and 93 mild Alzheimer's disease) with predominantly low formal education. Factor analysis showed a four-factor structure related to Executive Functions, Language/Semantic Memory, Episodic Memory and Visuospatial Abilities, accounting for 65% of explained variance. Most of the tests showed good sensitivity and specificity to differentiate the diagnostic groups. The neuropsychological protocol showed significant ecological validity, as 3 of the cognitive factors explained 31% of the variance on Instrumental Activities of Daily Living. Conclusion: The study presents evidence of the construct, criterion and ecological validity of this protocol. The neuropsychological tests and the proposed cutoff values might be used for the clinical assessment of older adults with low formal education. PMID:24066031
Combining accuracy assessment of land-cover maps with environmental monitoring programs
Stehman, S.V.; Czaplewski, R.L.; Nusser, S.M.; Yang, L.; Zhu, Z.
2000-01-01
A scientifically valid accuracy assessment of a large-area, land-cover map is expensive. Environmental monitoring programs offer a potential source of data to partially defray the cost of accuracy assessment while still maintaining the statistical validity. In this article, three general strategies for combining accuracy assessment and environmental monitoring protocols are described. These strategies range from a fully integrated accuracy assessment and environmental monitoring protocol, to one in which the protocols operate nearly independently. For all three strategies, features critical to using monitoring data for accuracy assessment include compatibility of the land-cover classification schemes, precisely co-registered sample data, and spatial and temporal compatibility of the map and reference data. Two monitoring programs, the National Resources Inventory (NRI) and the Forest Inventory and Monitoring (FIM), are used to illustrate important features for implementing a combined protocol.
How to write a surgical clinical research protocol: literature review and practical guide.
Rosenthal, Rachel; Schäfer, Juliane; Briel, Matthias; Bucher, Heiner C; Oertli, Daniel; Dell-Kuster, Salome
2014-02-01
The study protocol is the core document of every clinical research project. Clinical research in studies involving surgical interventions presents some specific challenges, which need to be accounted for and described in the study protocol. The aim of this review is to provide a practical guide for developing a clinical study protocol for surgical interventions with a focus on methodologic issues. On the basis of an in-depth search of the methodologic literature and of some cardinal published surgical trials and observational studies, the authors provide a 10-step guide for developing a clinical study protocol in surgery. This practical guide outlines key methodologic issues important when planning an ethically and scientifically sound research project involving surgical interventions, with the ultimate goal of providing high-level evidence relevant for health care decision making in surgery. Copyright © 2014 Elsevier Inc. All rights reserved.
A survey on temperature-aware routing protocols in wireless body sensor networks.
Oey, Christian Henry Wijaya; Moh, Sangman
2013-08-02
The rapid growth of the elderly population in the world and the rising cost of healthcare pose major challenges for healthcare and medical monitoring. A Wireless Body Sensor Network (WBSN) comprises small sensor nodes attached inside, on or around a human body, the main purpose of which is to monitor the functions and surroundings of the human body. However, the heat generated by the node's circuitry and antenna could cause damage to the human tissue. Therefore, in designing a routing protocol for WBSNs, it is important to reduce the heat by incorporating temperature into the routing metric. The main contribution of this paper is to survey existing temperature-aware routing protocols that have been proposed for WBSNs. In this paper, we present a brief overview of WBSNs, review the existing routing protocols comparatively and discuss challenging open issues in the design of routing protocols.
Crary, Michael A.; Carnaby, Giselle D.; Sia, Isaac
2017-01-01
Background The aim of this study was to compare spontaneous swallow frequency analysis (SFA) with clinical screening protocols for identification of dysphagia in acute stroke. Methods In all, 62 patients with acute stroke were evaluated for spontaneous swallow frequency rates using a validated acoustic analysis technique. Independent of SFA, these same patients received a routine nurse-administered clinical dysphagia screening as part of standard stroke care. Both screening tools were compared against a validated clinical assessment of dysphagia for acute stroke. In addition, psychometric properties of SFA were compared against published, validated clinical screening protocols. Results Spontaneous SFA differentiates patients with versus without dysphagia after acute stroke. Using a previously identified cut point based on swallows per minute, spontaneous SFA demonstrated superior ability to identify dysphagia cases compared with a nurse-administered clinical screening tool. In addition, spontaneous SFA demonstrated equal or superior psychometric properties to 4 validated, published clinical dysphagia screening tools. Conclusions Spontaneous SFA has high potential to identify dysphagia in acute stroke with psychometric properties equal or superior to clinical screening protocols. PMID:25088166
Cross-Layer Service Discovery Mechanism for OLSRv2 Mobile Ad Hoc Networks.
Vara, M Isabel; Campo, Celeste
2015-07-20
Service discovery plays an important role in mobile ad hoc networks (MANETs). The lack of central infrastructure, limited resources and high mobility make service discovery a challenging issue for this kind of network. This article proposes a new service discovery mechanism for discovering and advertising services integrated into the Optimized Link State Routing Protocol Version 2 (OLSRv2). In previous studies, we demonstrated the validity of a similar service discovery mechanism integrated into the previous version of OLSR (OLSRv1). In order to advertise services, we have added a new type-length-value structure (TLV) to the OLSRv2 protocol, called service discovery message (SDM), according to the Generalized MANET Packet/Message Format defined in Request For Comments (RFC) 5444. Each node in the ad hoc network only advertises its own services. The advertisement frequency is a user-configurable parameter, so that it can be modified depending on the user requirements. Each node maintains two service tables, one to store information about its own services and another one to store information about the services it discovers in the network. We present simulation results that compare our service discovery integrated into OLSRv2 with the one defined for OLSRv1 and with the integration of service discovery in the Ad hoc On-demand Distance Vector (AODV) protocol, in terms of service discovery ratio, service latency and network overhead.
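To make the type-length-value idea concrete, here is a toy serialization of a service entry as a (type, length, value) triple; it is not the RFC 5444 wire format or the authors' actual SDM encoding, and the type code and field layout are arbitrary choices for the sketch.

```python
# Toy illustration of the type-length-value idea behind the proposed SDM:
# a service entry serialized as (type, length, value). This is NOT the RFC 5444
# message format or the authors' actual encoding; field sizes and the type code
# are arbitrary choices for the sketch.
import struct

SDM_TLV_TYPE = 224          # arbitrary placeholder type code

def encode_service_tlv(service_name: str, port: int) -> bytes:
    value = f"{service_name}:{port}".encode("utf-8")
    return struct.pack("!BB", SDM_TLV_TYPE, len(value)) + value

def decode_service_tlv(data: bytes):
    tlv_type, length = struct.unpack("!BB", data[:2])
    service_name, port = data[2:2 + length].decode("utf-8").rsplit(":", 1)
    return tlv_type, service_name, int(port)

packet = encode_service_tlv("printer", 631)
print(decode_service_tlv(packet))   # (224, 'printer', 631)
```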
NASA Astrophysics Data System (ADS)
Amyay, Omar
A method defined in terms of synthesis and verification steps is presented. The specification of the services and protocols of communication within a multilayered architecture of the Open Systems Interconnection (OSI) type is an essential issue for the design of computer networks. The aim is to obtain an operational specification of the protocol service couple of a given layer. Planning synthesis and verification steps constitute a specification trajectory. The latter is based on the progressive integration of the 'initial data' constraints and verification of the specification originating from each synthesis step, through validity constraints that characterize an admissible solution. Two types of trajectories are proposed according to the style of the initial specification of the service protocol couple: operational type and service supplier viewpoint; knowledge property oriented type and service viewpoint. Synthesis and verification activities were developed and formalized in terms of labeled transition systems, temporal logic and epistemic logic. The originality of the second specification trajectory and the use of the epistemic logic are shown. An 'artificial intelligence' approach enables a conceptual model to be defined for a knowledge base system for implementing the method proposed. It is structured in three levels of representation of the knowledge relating to the domain, the reasoning characterizing synthesis and verification activities and the planning of the steps of a specification trajectory.
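Since the synthesis and verification activities are formalized over labeled transition systems, a minimal sketch of such a system and a reachability check (one simple kind of validity constraint) may help; the protocol states and labels are invented, and this is not the paper's method.

```python
# Minimal sketch of a labeled transition system (LTS) and a reachability check,
# the kind of object the abstract's verification steps are formalized over.
# The example protocol states and labels are invented; this is not the paper's method.
from collections import deque

# transitions: state -> list of (label, next_state)
lts = {
    "idle":      [("connect_req", "waiting")],
    "waiting":   [("connect_ack", "connected"), ("timeout", "idle")],
    "connected": [("data", "connected"), ("disconnect", "idle")],
}

def reachable(lts, start):
    """Return the set of states reachable from `start`."""
    seen, frontier = {start}, deque([start])
    while frontier:
        state = frontier.popleft()
        for _label, nxt in lts.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

# A simple validity constraint: an (invented) error state must not be reachable.
print("error" not in reachable(lts, "idle"))   # True
```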
An optimized 13C-urea breath test for the diagnosis of H pylori infection
Campuzano-Maya, Germán
2007-01-01
AIM: To validate an optimized 13C-urea breath test (13C-UBT) protocol for the diagnosis of H pylori infection that is cost-efficient and maintains excellent diagnostic accuracy. METHODS: 70 healthy volunteers were tested with two simplified 13C-UBT protocols, with test meal (Protocol 2) and without test meal (Protocol 1). Breath samples were collected at 10, 20 and 30 min after ingestion of 50 mg 13C-urea dissolved in 10 mL of water, taken as a single swallow, followed by 200 mL of water (pH 6.0) and a circular motion around the waistline to homogenize the urea solution. Performance of both protocols was analyzed at various cut-off values. Results were validated against the European protocol. RESULTS: According to the reference protocol, 65.7% individuals were positive for H pylori infection and 34.3% were negative. There were no significant differences in the ability of both protocols to correctly identify positive and negative H pylori individuals. However, only Protocol 1 with no test meal achieved accuracy, sensitivity, specificity, positive and negative predictive values of 100%. The highest values achieved by Protocol 2 were 98.57%, 97.83%, 100%, 100% and 100%, respectively. CONCLUSION: A 10 min, 50 mg 13C-UBT with no test meal using a cut-off value of 2-2.5 is a highly accurate test for the diagnosis of H pylori infection at a reduced cost. PMID:17907288
PNNI Performance Validation Test Report
NASA Technical Reports Server (NTRS)
Dimond, Robert P.
1999-01-01
Two Private Network-Network Interface (PNNI) neighboring peers were monitored with a protocol analyzer to understand and document how PNNI works with regard to initialization and recovery processes. With the processes documented, pertinent events were found and measured to determine the protocol's behavior in several environments, which consisted of congestion and/or delay. Subsequent testing of the protocol in these environments was conducted to determine the protocol's suitability for use in satellite-terrestrial network architectures.
Validation and Comprehension: An Integrated Overview
ERIC Educational Resources Information Center
Kendeou, Panayiota
2014-01-01
In this article, I review and discuss the work presented in this special issue while focusing on a number of issues that warrant further investigation in validation research. These issues pertain to the nature of the validation processes, the processes and mechanisms that support validation during comprehension, the factors that influence…
Akpolat, Tekin; Erdem, Emre; Aydogdu, Türkan
2012-01-01
Encouragement of home blood pressure (BP) monitoring has a great potential to improve hypertension control rates. The purpose of this study was to test validation of the Omron M3 Intellisense (HEM-7051-E) upper arm BP measuring monitor for self-measurement according to the European Society of Hypertension International Protocol revision 2010 (ESH-IP2) in stage 3-5 chronic kidney disease (CKD) patients. 66 patients having CKD stage 3-5 were included in the study. Nine consecutive measurements were made according to the ESH-IP2 protocol. The Omron M3 Intellisense device fulfills the validation criteria of the ESH-IP2 for stage 3-5 CKD patients. Although arterial stiffness can affect accurate BP measurement, there are limited data regarding the use of automated oscillometric devices in CKD. To our knowledge, this is the first study investigating validation of an oscillometric device in stage 3-5 predialysis CKD patients. This study validates the Omron M3 Intellisense upper arm device for stage 3-5 CKD patients. New validation studies investigating other oscillometric sphygmomanometers for CKD patients and involvement of nephrologists in these studies have great potential to increase patient care in CKD. Copyright © 2011 S. Karger AG, Basel.
Validity of an Exercise Test Based on Habitual Gait Speed in Mobility-Limited Older Adults
Li, Xin; Forman, Daniel E.; Kiely, Dan K.; LaRose, Sharon; Hirschberg, Ronald; Frontera, Walter R.; Bean, Jonathan F.
2013-01-01
Objective: To evaluate whether a customized exercise tolerance testing (ETT) protocol based on an individual’s habitual gait speed (HGS) on level ground would be a valid mode of exercise testing older adults. Although ETT provides a useful means to risk-stratify adults, age-related declines in gait speed paradoxically limit the utility of standard ETT protocols for evaluating older adults. A customized ETT protocol may be a useful alternative to these standard methods, and this study hypothesized that this alternative approach would be valid. Design: We performed a cross-sectional analysis of baseline data from a randomized controlled trial of older adults with observed mobility problems. Screening was performed using a treadmill-based ETT protocol customized for each individual’s HGS. We determined the content validity by assessing the results of the ETTs, and we evaluated the construct validity of treadmill time in relation to the Physical Activity Scale for the Elderly (PASE) and the Late Life Function and Disability Instrument (LLFDI). Setting: Outpatient rehabilitation center. Participants: Community-dwelling, mobility-limited older adults (N = 141). Interventions: Not applicable. Main Outcome Measures: Cardiac instability, ETT duration, peak heart rate, peak systolic blood pressure, PASE, and LLFDI. Results: Acute cardiac instability was identified in 4 of the participants who underwent ETT. The remaining participants (n = 137, 68% female; mean age, 75.3 y) were included in the subsequent analyses. Mean exercise duration was 9.39 minutes, with no significant differences in durations being observed after evaluating among tertiles by HGS status. Mean peak heart rate and mean peak systolic blood pressure were 126.6 beats/min and 175.0 mmHg, respectively. Within separate multivariate models, ETT duration in each of the 3 gait speed groups was significantly associated (P<.05) with PASE and LLFDI. Conclusions: Mobility-limited older adults can complete this customized ETT protocol, allowing for the identification of acute cardiac instability and the achievement of optimal exercise parameters. PMID:22289248
Review and publication of protocol submissions to Trials - what have we learned in 10 years?
Li, Tianjing; Boutron, Isabelle; Al-Shahi Salman, Rustam; Cobo, Erik; Flemyng, Ella; Grimshaw, Jeremy M; Altman, Douglas G
2016-12-16
Trials has 10 years of experience in providing open access publication of protocols for randomised controlled trials. In this editorial, the senior editors and editors-in-chief of Trials discuss editorial issues regarding managing trial protocol submissions, including the content and format of the protocol, timing of submission, approaches to tracking protocol amendments, and the purpose of peer reviewing a protocol submission. With the clarification and guidance provided, we hope we can make the process of publishing trial protocols more efficient and useful to trial investigators and readers.
Montedori, Alessandro; Abraha, Iosief; Chiatti, Carlos; Cozzolino, Francesco; Orso, Massimiliano; Luchetta, Maria Laura; Rimland, Joseph M; Ambrosio, Giuseppe
2016-09-15
Administrative healthcare databases are useful to investigate the epidemiology, health outcomes, quality indicators and healthcare utilisation concerning peptic ulcers and gastrointestinal bleeding, but the databases need to be validated in order to be a reliable source for research. The aim of this protocol is to perform the first systematic review of studies reporting the validation of International Classification of Diseases, 9th Revision and 10th version (ICD-9 and ICD-10) codes for peptic ulcer and upper gastrointestinal bleeding diagnoses. MEDLINE, EMBASE, Web of Science and the Cochrane Library databases will be searched, using appropriate search strategies. We will include validation studies that used administrative data to identify peptic ulcer disease and upper gastrointestinal bleeding diagnoses or studies that evaluated the validity of peptic ulcer and upper gastrointestinal bleeding codes in administrative data. The following inclusion criteria will be used: (a) the presence of a reference standard case definition for the diseases of interest; (b) the presence of at least one test measure (eg, sensitivity, etc) and (c) the use of an administrative database as a source of data. Pairs of reviewers will independently abstract data using standardised forms and will evaluate quality using the checklist of the Standards for Reporting of Diagnostic Accuracy (STARD) criteria. This systematic review protocol has been produced in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analysis Protocol (PRISMA-P) 2015 statement. Ethics approval is not required given that this is a protocol for a systematic review. We will submit results of this study to a peer-reviewed journal for publication. The results will serve as a guide for researchers validating administrative healthcare databases to determine appropriate case definitions for peptic ulcer disease and upper gastrointestinal bleeding, as well as to perform outcome research using administrative healthcare databases of these conditions. CRD42015029216. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Learned helplessness: validity and reliability of depressive-like states in mice.
Chourbaji, S; Zacher, C; Sanchis-Segura, C; Dormann, C; Vollmayr, B; Gass, P
2005-12-01
The learned helplessness paradigm is a depression model in which animals are exposed to unpredictable and uncontrollable stress, e.g. electroshocks, and subsequently develop coping deficits for aversive but escapable situations (J.B. Overmier, M.E. Seligman, Effects of inescapable shock upon subsequent escape and avoidance responding, J. Comp. Physiol. Psychol. 63 (1967) 28-33). It represents a model with good face validity (similarity to the symptoms of depression), construct validity, and predictive validity in rats. Despite an increased need to investigate emotional, in particular depression-like, behaviors in transgenic mice, so far only a few studies have been published using the learned helplessness paradigm. One reason may be the fact that, in contrast to rats (B. Vollmayr, F.A. Henn, Learned helplessness in the rat: improvements in validity and reliability, Brain Res. Brain Res. Protoc. 8 (2001) 1-7), there is no generally accepted learned helplessness protocol available for mice. This prompted us to develop a reliable helplessness procedure in C57BL/6N mice, to exclude possible artifacts, and to establish a protocol that yields a consistent fraction of helpless mice following the shock exposure. Furthermore, we validated this protocol pharmacologically using the tricyclic antidepressant imipramine. Here, we present a mouse model with good face and predictive validity that can be used for transgenic, behavioral, and pharmacological studies.
Developing an evidence-based practice protocol: implications for midwifery practice.
Carr, K C
2000-01-01
Evidence-based practice is defined and its importance to midwifery practice is presented. Guidelines are provided for the development of an evidence-based practice protocol. These include: identifying the clinical question, obtaining the evidence, evaluating the validity and importance of the evidence, synthesizing the evidence and applying it to the development of a protocol or clinical algorithm, and, finally, developing an evaluation plan or measurement strategy to see if the new protocol is effective.
Advanced orbiting systems test-bedding and protocol verification
NASA Technical Reports Server (NTRS)
Noles, James; De Gree, Melvin
1989-01-01
The Consultative Committee for Space Data Systems (CCSDS) has begun the development of a set of protocol recommendations for Advanced Orbiting Systems (AOS). The AOS validation program and formal definition of AOS protocols are reviewed, and the configuration control of the AOS formal specifications is summarized. Independent implementations of the AOS protocols by NASA and ESA are discussed, and cross-support/interoperability tests which will allow the space agencies of various countries to share AOS communication facilities are addressed.
Public health and terrorism preparedness: cross-border issues.
Olson, Debra; Leitheiser, Aggie; Atchison, Christopher; Larson, Susan; Homzik, Cassandra
2005-01-01
On December 15, 2003, the Centers for Public Health Preparedness at the University of Minnesota and the University of Iowa convened the "Public Health and Terrorism Preparedness: Cross-Border Issues Roundtable." The purpose of the roundtable was to gather public health professionals and government agency representatives at the state, provincial, and local levels to identify unmet cross-border emergency preparedness and response needs and develop strategies for addressing these needs. Representatives from six state and local public health departments and three provincial governments were invited to identify cross-border needs and issues using a nominal group process. The result of the roundtable was identification of the needs considered most important and most doable across all the focus groups. The need to collaborate on and exchange plans and protocols among agencies was identified as most important and most doable across all groups. Development of contact protocols and creation and maintenance of a contact database was also considered important and doable for a majority of groups. Other needs ranked important across the majority of groups included specific isolation and quarantine protocols for multi-state responses; a system for rapid and secure exchange of information; specific protocols for sharing human resources across borders, including emergency credentials for physicians and health care workers; and a specific protocol to coordinate Strategic National Stockpile mechanisms across border communities.
Assessment of an improved bone washing protocol for deceased donor human bone.
Eagle, M J; Man, J; Rooney, P; Hogg, P; Kearney, J N
2015-03-01
NHSBT Tissue Services issues bone to surgeons in the UK in two formats, fresh-frozen unprocessed bone from living donors and processed bone from deceased donors. Processed bone may be frozen or freeze dried and all processed bone is currently subjected to a washing protocol to remove blood and bone marrow. In this study we have improved the current bone washing protocol for cancellous bone and assessed the success of the protocol by measuring the removal of the bone marrow components: soluble protein, DNA and haemoglobin at each step in the process, and residual components in the bone at the end of the process. The bone washing protocol is a combination of sonication, warm water washes, centrifugation and chemical (ethanol and hydrogen peroxide) treatments. We report that the bone washing protocol is capable of removing up to 99.85 % soluble protein, 99.95 % DNA and 100 % of haemoglobin from bone. The new bone washing protocol does not render any bone cytotoxic as shown by contact cytotoxicity assays. No microbiological cell growth was detected in any of the wash steps. This process is now in use for processed cancellous bone issued by NHSBT.
Development of a protocol for the ecological assessment of a special species
David Burton
2004-01-01
Developing consistent inventory and assessment protocols is important to people working on aspen issues in California and Nevada. Efforts have focused on identifying key indicators of ecological condition within aspen stands. The protocols have incorporated a range of factors that create or affect those indicators. Resulting ecological assessments conducted through the...
Recommended protocols for sampling macrofungi
Gregory M. Mueller; John Paul Schmit; Sabine M. Hubndorf; Leif Ryvarden; Thomas E. O'Dell; D. Jean Lodge; Patrick R. Leacock; Milagro Mata; Loengrin Umania; Qiuxin (Florence) Wu; Daniel L. Czederpiltz
2004-01-01
This chapter discusses several issues regarding recommended protocols for sampling macrofungi: opportunistic sampling of macrofungi, sampling conspicuous macrofungi using fixed-size plots, sampling small Ascomycetes using microplots, and sampling a fixed number of downed logs.
Campbell, Stephen M; Kontopantelis, Evangelos; Hannon, Kerin; Burke, Martyn; Barber, Annette; Lester, Helen E
2011-08-10
Quality measures should be subjected to a testing protocol before being used in practice, assessing key attributes such as acceptability, feasibility and reliability, and identifying issues arising from actual implementation as well as unintended consequences. We describe the methodologies and results of an indicator testing protocol (ITP) using data from proposed quality indicators for the United Kingdom Quality and Outcomes Framework (QOF). The indicator testing protocol involved a multi-step methodological process: 1) the RAND/UCLA Appropriateness Method, to test clarity and necessity, 2) data extraction from patients' medical records, to test technical feasibility and reliability, 3) diaries, to test workload, 4) cost-effectiveness modelling, and 5) semi-structured interviews, to test acceptability, implementation issues and unintended consequences. Testing was conducted in a sample of representative family practices in England. These methods were combined into an overall recommendation for each tested indicator. Using an indicator testing protocol as part of piloting was seen as a valuable way of testing potential indicators in 'real world' settings. Pilot 1 (October 2009-March 2010) involved thirteen indicators across six clinical domains, and twelve indicators passed the indicator testing protocol. However, the indicator testing protocol identified a number of implementation issues and unintended consequences that can be rectified or removed prior to national roll-out. A palliative care indicator is used as an exemplar of the value of piloting using a multiple-attribute indicator testing protocol: while technically feasible and reliable, it was unacceptable to practice staff and raised concerns about potentially causing actual patient harm. This indicator testing protocol is one example of a protocol that may be useful in assessing potential quality indicators when adapted to specific country health care settings and may be of use to policy-makers and researchers worldwide to test the likely effect of implementing indicators prior to roll-out. It builds on and codifies existing literature and other testing protocols to create a field testing methodology that can be used to produce country-specific quality indicators for pay-for-performance or quality improvement schemes.
A protocol for validating Land Surface Temperature from Sentinel-3
NASA Astrophysics Data System (ADS)
Ghent, D.
2015-12-01
One of the main objectives of the Sentinel-3 mission is to measure sea- and land-surface temperature with high-end accuracy and reliability in support of environmental and climate monitoring in an operational context. Calibration and validation are thus key criteria for operationalization within the framework of the Sentinel-3 Mission Performance Centre (S3MPC). Land surface temperature (LST) has a long heritage of satellite observations which have facilitated our understanding of land surface and climate change processes, such as desertification, urbanization, deforestation and land/atmosphere coupling. These observations have been acquired from a variety of satellite instruments on platforms in both low-earth orbit and in geostationary orbit. Retrieval accuracy can be a challenge though; surface emissivities can be highly variable owing to the heterogeneity of the land, and atmospheric effects caused by the presence of aerosols and by water vapour absorption can give a bias to the underlying LST. As such, a rigorous validation is critical in order to assess the quality of the data and the associated uncertainties. The Sentinel-3 Cal-Val Plan for evaluating the level-2 SL_2_LST product builds on an established validation protocol for satellite-based LST. This set of guidelines provides a standardized framework for structuring LST validation activities, and is rapidly gaining international recognition. The protocol introduces a four-pronged approach which can be summarised thus: i) in situ validation where ground-based observations are available; ii) radiance-based validation over sites that are homogeneous in emissivity; iii) intercomparison with retrievals from other satellite sensors; iv) time-series analysis to identify artefacts on an interannual time-scale. This multi-dimensional approach is a necessary requirement for assessing the performance of the LST algorithm for SLSTR which is designed around biome-based coefficients, thus emphasizing the importance of non-traditional forms of validation such as radiance-based techniques. Here we present examples of the application of the protocol to data produced within the ESA DUE GlobTemperature Project. The lessons learnt here are helping to fine-tune the methodology in preparation for Sentinel-3 commissioning.
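A minimal sketch of the in situ strand of such a validation protocol, assuming matched satellite and ground-station LST pairs are already available; the arrays below are illustrative values, not GlobTemperature or Sentinel-3 data.

import numpy as np

sat_lst = np.array([301.2, 298.7, 305.4, 296.1])     # satellite LST retrievals (K), illustrative
insitu_lst = np.array([300.5, 299.3, 304.0, 296.6])  # matched ground-based observations (K), illustrative

diff = sat_lst - insitu_lst
bias = diff.mean()                      # systematic offset; robust statistics are often preferred in practice
rmse = np.sqrt(np.mean(diff ** 2))      # overall accuracy of the matchups
print(f"bias = {bias:.2f} K, RMSE = {rmse:.2f} K")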
Issues in designing transport layer multicast facilities
NASA Technical Reports Server (NTRS)
Dempsey, Bert J.; Weaver, Alfred C.
1990-01-01
Multicasting denotes a facility in a communications system for providing efficient delivery from a message's source to some well-defined set of locations using a single logical address. While modern network hardware supports multidestination delivery, first generation Transport Layer protocols (e.g., the DoD Transmission Control Protocol (TCP) (15) and ISO TP-4 (41)) did not anticipate the changes over the past decade in underlying network hardware, transmission speeds, and communication patterns that have enabled and driven the interest in reliable multicast. Much recent research has focused on integrating the underlying hardware multicast capability with the reliable services of Transport Layer protocols. Here, we explore the communication issues surrounding the design of such a reliable multicast mechanism. Approaches and solutions from the literature are discussed, and four experimental Transport Layer protocols that incorporate reliable multicast are examined.
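To make the underlying mechanism concrete, here is a minimal sketch of sequence-numbered datagrams over IP multicast using Python's standard socket API. The group address, port, and loss check are illustrative assumptions; a reliable multicast protocol of the kind discussed would layer acknowledgements or repair requests on top of this.

import socket
import struct

GROUP, PORT = "239.1.1.1", 5007   # hypothetical multicast group and port

def send(payloads):
    """Multicast each payload with a 32-bit sequence number prefix."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    for seq, payload in enumerate(payloads):
        s.sendto(struct.pack("!I", seq) + payload, (GROUP, PORT))

def receive():
    """Join the group and flag sequence gaps (where a reliable protocol would request repair)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("", PORT))
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    expected = 0
    while True:
        data, _ = s.recvfrom(65535)
        seq = struct.unpack("!I", data[:4])[0]
        if seq != expected:
            print("loss detected: expected", expected, "received", seq)
        expected = seq + 1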
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ekechukwu, A
Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.
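As a worked illustration of the validation elements listed above (linearity, detection limit, quantitation limit), the sketch below fits a calibration line and derives ICH-style limits from the residual standard deviation; the calibration data are invented for the example and are not from any referenced method.

import numpy as np

conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0])          # spiked beryllium, ug/L (illustrative)
signal = np.array([0.02, 0.26, 0.51, 1.01, 2.48])   # instrument response (illustrative)

slope, intercept = np.polyfit(conc, signal, 1)        # linearity check via least squares
residual_sd = np.std(signal - (slope * conc + intercept), ddof=2)

lod = 3.3 * residual_sd / slope    # detection limit (ICH Q2 convention)
loq = 10.0 * residual_sd / slope   # quantitation limit
print(f"slope={slope:.3f}, LOD={lod:.3f} ug/L, LOQ={loq:.3f} ug/L")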
The Wiley Protocol: an analysis of ethical issues.
Rosenthal, M Sara
2008-01-01
This review explores the ethical issues surrounding an unregulated protocol that is advertised to women through consumer books, the popular press, and the Internet, known as the Wiley Protocol. A content analysis of relevant documents was conducted, followed by telephone interviews with investigators and former participants to verify facts. The Wiley Protocol is an example of unregulated research involving potentially unsafe doses of bioidentical hormones applied to an unselected population of women. This protocol fails to use research ethics guidelines such as informed consent, investigator expertise, sound methodology, standardized data collection, and data safety monitoring. Clinical ethics breaches include lack of full disclosure of risks, coercive influences, as well as misinformation about the study goals and safety. Breaches of professional ethics include conflicts of interest with respect to financial incentives, patient accrual, and inadequate standards of awareness and proficiency among participating investigators. It appears evident that the failure to regulate nutriceuticals and products of compounding pharmacy has provided the opportunity for these ethical violations.
Glaister, Mark; Stone, Michael H; Stewart, Andrew M; Hughes, Michael; Moir, Gavin L
2004-08-01
The purpose of the present study was to assess the reliability and validity of fatigue measures, as derived from 4 separate formulae, during tests of repeat sprint ability. On separate days over a 3-week period, 2 groups of 7 recreationally active men completed 6 trials of 1 of 2 maximal (20 x 5 seconds) intermittent cycling tests with contrasting recovery periods (10 or 30 seconds). All trials were conducted on a friction-braked cycle ergometer, and fatigue scores were derived from measures of mean power output for each sprint. Apart from formula 1, which calculated fatigue from the percentage difference in mean power output between the first and last sprint, all remaining formulae produced fatigue scores that showed a reasonably good level of test-retest reliability in both intermittent test protocols (intraclass correlation range: 0.78-0.86; 95% likely range of true values: 0.54-0.97). Although between-protocol differences in the magnitude of the fatigue scores suggested good construct validity, within-protocol differences highlighted limitations with each formula. Overall, the results support the use of the percentage decrement score as the most valid and reliable measure of fatigue during brief maximal intermittent work.
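The percentage decrement score favoured here is straightforward to compute; the sketch below follows the usual formulation (total output expressed relative to an "ideal" total in which every sprint matched the best sprint), with made-up power values rather than data from the study.

def percentage_decrement(mean_powers):
    """Fatigue (%) = 100 * (1 - total output / ideal output), where ideal = n sprints at the best sprint's output."""
    ideal = len(mean_powers) * max(mean_powers)
    return 100.0 * (1.0 - sum(mean_powers) / ideal)

# Illustrative mean power outputs (W); a 20 x 5 s protocol would be handled the same way.
print(percentage_decrement([820, 805, 790, 770, 755, 740]))   # ~4.9% decrement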
2013-01-01
Background The volume of influenza pandemic modelling studies has increased dramatically in the last decade. Many models now incorporate sophisticated parameterization and validation techniques, economic analyses and the behaviour of individuals. Methods We reviewed trends in these aspects in models for influenza pandemic preparedness that aimed to generate policy insights for epidemic management and were published from 2000 to September 2011, i.e. before and after the 2009 pandemic. Results We find that many influenza pandemic models rely on parameters from previous modelling studies, are rarely validated using observed data, and are seldom applied to low-income countries. Mechanisms for international data sharing would be necessary to facilitate a wider adoption of model validation. The variety of modelling decisions makes it difficult to compare and evaluate models systematically. Conclusions We propose a model Characteristics, Construction, Parameterization and Validation aspects protocol (CCPV protocol) to contribute to the systematisation of the reporting of models with an emphasis on the incorporation of economic aspects and host behaviour. Model reporting, as already exists in many other fields of modelling, would increase confidence in model results, and transparency in their assessment and comparison. PMID:23651557
Erdem, Emre; Aydogdu, Türkan; Akpolat, Tekin
2011-02-01
Standard validation protocols are objective guides for healthcare providers, physicians, and patients. The purpose of this study was to test validation of the Medisana MTP Plus upper arm blood pressure (BP) measuring monitor for self-measurement according to the European Society of Hypertension International Protocol (ESH-IP2) in adults. The Medisana MTP Plus monitor is an automated and oscillometric upper arm device for home BP monitoring. Nine consecutive measurements were made according to the ESH-IP2. Overseen by an independent supervisor, measurements were recorded by two observers blinded from both each other's readings and from the device readings. The Medisana MTP Plus device fulfills the validation criteria of the ESH-IP2 for the general population. The mean (standard deviation) of the difference between the observers and the device measurements was 0.6 mmHg (5.1 mmHg) for systolic and 2.7 mmHg (3.4 mmHg) for diastolic pressures, respectively. As the Medisana MTP Plus device has achieved the required standards, it is recommended for home BP monitoring in an adult population.
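A small sketch of the core accuracy tabulation in such a validation, assuming the ESH-IP2 convention of counting absolute device-observer differences falling within 5, 10 and 15 mmHg (stated here as an assumption); the readings are invented.

def esh_difference_bands(device, observer, thresholds=(5, 10, 15)):
    """Count absolute device-minus-observer differences within each mmHg threshold."""
    diffs = [abs(d - o) for d, o in zip(device, observer)]
    return {t: sum(x <= t for x in diffs) for t in thresholds}

device_sys = [128, 142, 119, 150]     # illustrative systolic readings from the test device (mmHg)
observer_sys = [125, 139, 126, 151]   # corresponding mean observer readings (mmHg)
print(esh_difference_bands(device_sys, observer_sys))   # {5: 3, 10: 4, 15: 4}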
Ethical Issues Surrounding the Use of Modern Human Remains for Research in South Africa.
Briers, N; Dempers, J J
2017-02-01
Chapter 8 of the South African National Health Act 61 of 2003 (NHA) that deals with the donation of human tissue was promulgated in 2012. The new Act is perceived to impose restrictions on low-risk research involving human remains. This study aimed to identify the issues raised by a research ethics committee (REC) when reviewing protocols where human remains are used as data source. REC minutes from 2009 to 2014 were reviewed, and issues raised by the committee were categorized. In total, 127 protocols submitted to the committee over 6 years involved human remains. Queries relating to science (22.2%) and administration (18.9%) were the most common, whereas queries relating to legal issues constituted only 10.2%. Ethical issues centered on informed consent regarding sensitive topics such as HIV, DNA, and deceased children. The change in legislation did not change the number or type of legal issues identified by the REC.
A framework for the design and development of physical employment tests and standards.
Payne, W; Harvey, J
2010-07-01
Because operational tasks in the uniformed services (military, police, fire and emergency services) are physically demanding and incur the risk of injury, employment policy in these services is usually competency based and predicated on objective physical employment standards (PESs) based on physical employment tests (PETs). In this paper, a comprehensive framework for the design of PETs and PESs is presented. Three broad approaches to physical employment testing are described and compared: generic predictive testing; task-related predictive testing; task simulation testing. Techniques for the selection of a set of tests with good coverage of job requirements, including job task analysis, physical demands analysis and correlation analysis, are discussed. Regarding individual PETs, theoretical considerations including measurability, discriminating power, reliability and validity, and practical considerations, including development of protocols, resource requirements, administrative issues and safety, are considered. With regard to the setting of PESs, criterion referencing and norm referencing are discussed. STATEMENT OF RELEVANCE: This paper presents an integrated and coherent framework for the development of PESs and hence provides a much needed theoretically based but practically oriented guide for organisations seeking to establish valid and defensible PESs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spirito, Flavia; Capt, Annabelle; Rio, Marcela Del
2006-01-20
Gene transfer represents the only therapeutic option for a number of inherited skin disorders, including junctional epidermolysis bullosa (JEB), an untreatable genodermatosis caused by mutations in the adhesion ligand laminin 5 (α3β3γ2), which is secreted in the extracellular matrix by the epidermal basal keratinocytes. Because gene therapy protocols require validation in animal models, we have phenotypically reverted, by oncoretroviral transfer of the curative gene, keratinocytes isolated from dogs with a spontaneous form of JEB associated with a genetic mutation in the α3 chain of laminin 5. We show that the transduced dog JEB keratinocytes: (1) display a sustained secretion of laminin 5 in the extracellular matrix; (2) recover the adhesion, proliferation, and clonogenic capacity of wild-type keratinocytes; (3) generate fully differentiated stratified epithelia that, after grafting on immunocompromised mice, produce phenotypically normal skin and sustain permanent expression of the transgene. We validate an animal model that appears particularly suitable to demonstrate feasibility, efficacy, and safety of genetic therapeutic strategies for cutaneous disorders before undertaking human clinical trials.
Blanchard, Philippe; Regnault, Julie; Schurr, Frank; Dubois, Eric; Ribière, Magali
2012-03-01
Chronic bee paralysis virus (CBPV) is responsible for chronic bee paralysis, an infectious and contagious disease in adult honey bees (Apis mellifera L.). A real-time RT-PCR assay to quantitate the CBPV load is now available. To propose this assay as a reference method, it was characterised further in an intra-laboratory study during which the reliability and the repeatability of results and the performance of the assay were confirmed. The qPCR assay alone and the whole quantitation method (from sample RNA extraction to analysis) were both assessed following the ISO/IEC 17025 standard and the recent XP U47-600 standard issued by the French Standards Institute. The performance of the qPCR assay and of the overall CBPV quantitation method were validated over a 6 log range from 10(2) to 10(8) with a detection limit of 50 and 100 CBPV RNA copies, respectively, and the protocol of the real-time RT-qPCR assay for CBPV quantitation was approved by the French Accreditation Committee. Copyright © 2011 Elsevier B.V. All rights reserved.
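To illustrate how a validated quantitation range such as the 10^2-10^8 copies reported here is typically exploited, the sketch below converts Cq values to copy numbers through a log-linear standard curve; the Cq values and the function names are illustrative assumptions, not taken from the CBPV assay.

import numpy as np

log10_copies = np.arange(2, 9, dtype=float)                   # standards spanning 10^2 to 10^8 copies
cq = np.array([35.1, 31.7, 28.4, 25.0, 21.6, 18.3, 14.9])     # illustrative Cq values for those standards

slope, intercept = np.polyfit(log10_copies, cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0                       # ~1.0 corresponds to 100% amplification efficiency

def copies_from_cq(sample_cq):
    """Interpolate a sample's viral load from the standard curve."""
    return 10 ** ((sample_cq - intercept) / slope)

print(f"efficiency = {efficiency:.2f}, load at Cq 23 = {copies_from_cq(23.0):.2e} copies")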
Development of an Expressed Sequence Tag (EST) Resource for Wheat (Triticum aestivum L.)
Lazo, G. R.; Chao, S.; Hummel, D. D.; Edwards, H.; Crossman, C. C.; Lui, N.; Matthews, D. E.; Carollo, V. L.; Hane, D. L.; You, F. M.; Butler, G. E.; Miller, R. E.; Close, T. J.; Peng, J. H.; Lapitan, N. L. V.; Gustafson, J. P.; Qi, L. L.; Echalier, B.; Gill, B. S.; Dilbirligi, M.; Randhawa, H. S.; Gill, K. S.; Greene, R. A.; Sorrells, M. E.; Akhunov, E. D.; Dvořák, J.; Linkiewicz, A. M.; Dubcovsky, J.; Hossain, K. G.; Kalavacharla, V.; Kianian, S. F.; Mahmoud, A. A.; Miftahudin; Ma, X.-F.; Conley, E. J.; Anderson, J. A.; Pathan, M. S.; Nguyen, H. T.; McGuire, P. E.; Qualset, C. O.; Anderson, O. D.
2004-01-01
This report describes the rationale, approaches, organization, and resource development leading to a large-scale deletion bin map of the hexaploid (2n = 6x = 42) wheat genome (Triticum aestivum L.). Accompanying reports in this issue detail results from chromosome bin-mapping of expressed sequence tags (ESTs) representing genes onto the seven homoeologous chromosome groups and a global analysis of the entire mapped wheat EST data set. Among the resources developed were the first extensive public wheat EST collection (113,220 ESTs). Described are protocols for sequencing, sequence processing, EST nomenclature, and the assembly of ESTs into contigs. These contigs plus singletons (unassembled ESTs) were used for selection of distinct sequence motif unigenes. Selected ESTs were rearrayed, validated by 5′ and 3′ sequencing, and amplified for probing a series of wheat aneuploid and deletion stocks. Images and data for all Southern hybridizations were deposited in databases and were used by the coordinators for each of the seven homoeologous chromosome groups to validate the mapping results. Results from this project have established the foundation for future developments in wheat genomics. PMID:15514037
Fania, Claudio; Albertini, Federica; Palatini, Paolo
2017-10-01
The aim of this study was to define the accuracy of UM-211, an automated oscillometric device for office use coupled to several cuffs for different arm sizes, according to the International Protocol of the European Society of Hypertension. The validation was performed in 33 individuals. Their mean age was 59.6±12.9 years, systolic blood pressure (BP) was 144.3±21.5 mmHg (range: 96-184 mmHg), diastolic BP was 86.8±18.5 mmHg (range: 48-124 mmHg), and arm circumference was 30.2±4.3 cm (range: 23-39 cm). Four sequential readings were taken by observers 1 and 2 using a double-headed stethoscope and a mercury sphygmomanometer, whereas three BP readings were taken by the supervisor using the test instrument. The differences between the readings provided by the device and the mean observer measurements were calculated. Therefore, each device measurement was compared with the previous and the next mean observer measurement. The validation results fulfilled all the 2010 European Society of Hypertension revision Protocol criteria for the general population and passed all validation grades. On average, the device overestimated systolic BP by 1.7±2.4 mmHg and diastolic BP by 1.7±2.5 mmHg. These data show that the UM-211 device coupled to several cuffs for different ranges of arm circumference met the requirements for validation according to the International Protocol and can be recommended for clinical use in the adult population. However, these results mainly apply to the use of the 22-32 and the 31-45 cm cuffs.
Kaluzhny, Yulia; Kandárová, Helena; Handa, Yuki; DeLuca, Jane; Truong, Thoa; Hunter, Amy; Kearney, Paul; d'Argembeau-Thornton, Laurence; Klausner, Mitchell
2015-05-01
The 7th Amendment to the EU Cosmetics Directive and the EU REACH Regulation have reinforced the need for in vitro ocular test methods. Validated in vitro ocular toxicity tests that can predict the human response to chemicals, cosmetics and other consumer products are required for the safety assessment of materials that intentionally, or inadvertently, come into contact with the eye. The EpiOcular Eye Irritation Test (EIT), which uses the normal human cell-based EpiOcular™ tissue model, was developed to address this need. The EpiOcular-EIT is able to discriminate, with high sensitivity and accuracy, between ocular irritant/corrosive materials and those that require no labelling. Although the original EpiOcular-EIT protocol was successfully pre-validated in an international, multicentre study sponsored by COLIPA (the predecessor to Cosmetics Europe), data from two larger studies (the EURL ECVAM-COLIPA validation study and an independent in-house validation at BASF SE) resulted in a sensitivity for the protocol for solids that was below the acceptance criteria set by the Validation Management Group (VMG) for eye irritation, and indicated the need for improvement of the assay's sensitivity for solids. By increasing the exposure time for solid materials from 90 minutes to 6 hours, the optimised EpiOcular-EIT protocol achieved 100% sensitivity, 68.4% specificity and 84.6% accuracy, thereby meeting all the acceptance criteria set by the VMG. In addition, to satisfy the needs of Japan and the Pacific region, the EpiOcular-EIT method was evaluated for its performance after extended shipment and storage of the tissues (4-5 days), and it was confirmed that the assay performs with similar levels of sensitivity, specificity and reproducibility in these circumstances. 2015 FRAME.
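The prediction model behind such an assay reduces to a tissue-viability cut-off; the sketch below assumes the commonly cited 60% mean viability threshold (an assumption for illustration, not quoted from the study) and shows how accuracy against reference classifications would then be tallied.

def classify(viability_percent, cutoff=60.0):
    """Predicted irritant if mean tissue viability is at or below the cut-off (cut-off assumed)."""
    return "irritant" if viability_percent <= cutoff else "no category"

def accuracy(viabilities, reference_labels, cutoff=60.0):
    """Fraction of test materials whose prediction matches the reference (in vivo) classification."""
    predictions = [classify(v, cutoff) for v in viabilities]
    return sum(p == r for p, r in zip(predictions, reference_labels)) / len(reference_labels)

# Illustrative tissues: measured viability (%) and their reference classification.
print(accuracy([12.0, 55.0, 78.0, 95.0], ["irritant", "irritant", "no category", "no category"]))  # 1.0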
Ethical and legal issues in non-heart-beating organ donation.
Bos, Michael A
2005-05-15
Procurement of kidneys and livers from non-heart-beating donors (NHBD) raises ethical and legal issues that need to be considered before wider use of these donors is undertaken. Although NHBDs were used in kidney transplantation as early as the 1960s, retrieval of these organs is not universally accepted today. From a medical point of view, these organs were considered "marginal" because the majority showed delayed or impaired function early after implantation. Legal problems relate to determination of death on cardiopulmonary criteria, the issue of valid consent, and the use of preservation measures. Among ethical issues involved are observance of the dead-donor rule, decisions with respect to resuscitation and withdrawal of life-sustaining treatment, respect for the dying patient and the dead body, and proper guidance of the family. In The Netherlands NHB donation was pioneered by the Maastricht Centre as early as 1981. Today, all seven transplant centers procure and transplant these organs, and NHBDs have become an important source of transplantable kidneys and livers. Recent legislation in The Netherlands also supports NHB donation by allowing the use of organ-preserving measures, even in the absence of family consent. As a result, one of every three kidneys transplanted in The Netherlands in 2004 derives from a NHBD. This article explores Dutch NHBD practice, protocols, and results and compares these data internationally.
NASA Astrophysics Data System (ADS)
Parilla, Philip A.; Gross, Karl; Hurst, Katherine; Gennett, Thomas
2016-03-01
The ultimate goal of the hydrogen economy is the development of hydrogen storage systems that meet or exceed the US DOE's goals for onboard storage in hydrogen-powered vehicles. In order to develop new materials to meet these goals, it is extremely critical to accurately, uniformly and precisely measure materials' properties relevant to the specific goals. Without this assurance, such measurements are not reliable and, therefore, do not provide a benefit toward the work at hand. In particular, capacity measurements for hydrogen storage materials must be based on valid and accurate results to ensure proper identification of promising materials for further development. Volumetric capacity determinations are becoming increasingly important for identifying promising materials, yet there exists controversy on how such determinations are made and whether such determinations are valid due to differing methodologies to count the hydrogen content. These issues are discussed herein, and we show mathematically that capacity determinations can be made rigorously and unambiguously if the constituent volumes are well defined and measurable in practice. It is widely accepted that this occurs for excess capacity determinations, and we show here that this can happen for the total capacity determination. Because the adsorption volume is undefined, the absolute capacity determination remains imprecise. Furthermore, we show that there is a direct relationship between determining the respective capacities and the calibration constants used for the manometric and gravimetric techniques. Several suggested volumetric capacity figures of merit are defined and discussed, and reporting requirements are recommended. Finally, an example is provided to illustrate these protocols and concepts.
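The distinction drawn here between excess and total capacity can be written as a one-line relation: the total stored amount equals the measured excess plus the bulk gas that would occupy the (well-defined) pore volume at the same density. A minimal sketch under that assumption, with illustrative numbers that are not from the paper:

def total_capacity(excess_g_per_g, gas_density_g_per_ml, pore_volume_ml_per_g):
    """Total stored H2 per gram of sorbent = excess amount + bulk gas filling the pore volume."""
    return excess_g_per_g + gas_density_g_per_ml * pore_volume_ml_per_g

# Illustrative only: 1.0 wt% excess, H2 density of roughly 0.0078 g/mL near 100 bar and 298 K, 1.2 mL/g pore volume.
print(total_capacity(0.010, 0.0078, 1.2))   # ~0.019 g H2 per g sorbent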
Magasi, Susan; Harniss, Mark; Heinemann, Allen W
2018-01-01
Principles of fairness in testing require that all test takers, including people with disabilities, have an equal opportunity to demonstrate their capacity on the construct being measured. Measurement design features and assessment protocols can pose barriers for people with disabilities. Fairness in testing is a fundamental validity issue at all phases in the design, administration, and interpretation of measurement instruments in clinical practice and research. There is limited guidance for instrument developers on how to develop and evaluate the accessibility and usability of measurement instruments. This article describes a 6-stage iterative process for developing accessible computer-administered measurement instruments grounded in the procedures implemented across several major measurement initiatives. A key component of this process is interdisciplinary teams of accessibility experts, content and measurement experts, information technology experts, and people with disabilities working together to ensure that measurement instruments are accessible and usable by a wide range of users. The development of accessible measurement instruments is not only an ethical requirement, it also ensures better science by minimizing measurement bias, missing data, and attrition due to mismatches between the target population and test administration platform and protocols. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
A compressive sensing based secure watermark detection and privacy preserving storage framework.
Qia Wang; Wenjun Zeng; Jun Tian
2014-03-01
Privacy is a critical issue when the data owners outsource data storage or processing to a third party computing service, such as the cloud. In this paper, we identify a cloud computing application scenario that requires simultaneously performing secure watermark detection and privacy preserving multimedia data storage. We then propose a compressive sensing (CS)-based framework using secure multiparty computation (MPC) protocols to address such a requirement. In our framework, the multimedia data and secret watermark pattern are presented to the cloud for secure watermark detection in a CS domain to protect the privacy. During CS transformation, the privacy of the CS matrix and the watermark pattern is protected by the MPC protocols under the semi-honest security model. We derive the expected watermark detection performance in the CS domain, given the target image, watermark pattern, and the size of the CS matrix (but without the CS matrix itself). The correctness of the derived performance has been validated by our experiments. Our theoretical analysis and experimental results show that secure watermark detection in the CS domain is feasible. Our framework can also be extended to other collaborative secure signal processing and data-mining applications in the cloud.
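Stripped of the multiparty-computation machinery, the detection step amounts to correlating compressive measurements of the test image with compressive measurements of the watermark, relying on random projections approximately preserving inner products. The sketch below illustrates only that idea with made-up dimensions and signals; it is not the paper's protocol.

import numpy as np

rng = np.random.default_rng(0)
n, m = 4096, 1024                                  # signal length and number of CS measurements (illustrative)
phi = rng.standard_normal((m, n)) / np.sqrt(m)     # CS measurement matrix (kept secret in the MPC setting)

watermark = rng.standard_normal(n)
image = rng.standard_normal(n)
marked = image + 0.1 * watermark                   # embed a weak additive watermark

def detection_statistic(signal):
    """Normalised correlation between CS measurements of the signal and of the watermark."""
    y, yw = phi @ signal, phi @ watermark
    return (y @ yw) / (np.linalg.norm(yw) * np.linalg.norm(y))

print(detection_statistic(marked), detection_statistic(image))   # the marked copy typically scores higher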
Topouchian, Jirar A; El Assaad, Mohamed A; Orobinskaia, Ludmila V; El Feghali, Ramzi N; Asmar, Roland G
2006-06-01
Two electronic devices for self-measurement of blood pressure - a brachial monitor, the Omron M6, and a wrist monitor, the Omron R7 - were evaluated in two separate studies according to the International Protocol of the European Society of Hypertension. The International Validation Protocol is divided into two phases: the first phase is performed on 15 selected participants (45 pairs of blood pressure measurements); if the device passes this phase, 18 supplementary participants are included (54 pairs of blood pressure measurements) making a total number of 33 participants (99 pairs of blood pressure measurements) on whom the final validation is performed. The same methodology recommended by the European Society of Hypertension protocol was applied for both studies. In each study and for each participant, four blood pressure measurements were taken simultaneously by two trained observers using mercury sphygmomanometers alternately with three measurements taken by the tested device. The difference between the blood pressure value given by the device and that obtained by the two observers (mean of the two observers) was calculated for each measure. The 99 pairs of blood pressure differences were classified into three categories (
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-01
... Effectiveness of a Proposed Rule Change Relating to FINRA Trade Reporting Notice on Price Validation and Price... ("Notice") that explains the price validation protocol of the FINRA trade reporting facilities and sets... trades by comparing the submitted price against price validation parameters established by FINRA...
Abraha, Iosief; Giovannini, Gianni; Serraino, Diego; Fusco, Mario; Montedori, Alessandro
2016-03-18
Breast, lung and colorectal cancers constitute the most common cancers worldwide and their epidemiology, related health outcomes and quality indicators can be studied using administrative healthcare databases. To constitute a reliable source for research, administrative healthcare databases need to be validated. The aim of this protocol is to perform the first systematic review of studies reporting the validation of International Classification of Diseases 9th and 10th revision codes to identify breast, lung and colorectal cancer diagnoses in administrative healthcare databases. This review protocol has been developed according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Protocol (PRISMA-P) 2015 statement. We will search the following databases: MEDLINE, EMBASE, Web of Science and the Cochrane Library, using appropriate search strategies. We will include validation studies that used administrative data to identify breast, lung and colorectal cancer diagnoses or studies that evaluated the validity of breast, lung and colorectal cancer codes in administrative data. The following inclusion criteria will be used: (1) the presence of a reference standard case definition for the disease of interest; (2) the presence of at least one test measure (eg, sensitivity, positive predictive values, etc) and (3) the use of data source from an administrative database. Pairs of reviewers will independently abstract data using standardised forms and will assess quality using a checklist based on the Standards for Reporting of Diagnostic accuracy (STARD) criteria. Ethics approval is not required. We will submit results of this study to a peer-reviewed journal for publication. The results will serve as a guide to identify appropriate case definitions and algorithms of breast, lung and colorectal cancers for researchers involved in validating administrative healthcare databases as well as for outcome research on these conditions that used administrative healthcare databases. CRD42015026881. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Initial outcomes of a harmonized approach to collect welfare data in sport and leisure horses.
Dalla Costa, E; Dai, F; Lebelt, D; Scholz, P; Barbieri, S; Canali, E; Minero, M
2017-02-01
A truthful snapshot of horse welfare conditions is a prerequisite for predicting the impact of any actions intended to improve the quality of life of horses. This can be achieved when welfare information, gathered by different assessors in diverse geographical areas, is valid, comparable and collected in a harmonized way. This paper aims to present the first outcomes of the Animal Welfare Indicators (AWIN) approach: the results of on-farm assessment and a reliable and harmonized data collection system. A total of 355 sport and leisure horses, stabled in 40 facilities in Italy and in Germany, were evaluated by three trained assessors using the AWIN welfare assessment protocol for horses. The AWINHorse app was used to collect, store and send data to a common server. Identified welfare issues were obesity, unsatisfactory box dimensions, long periods of confinement and lack of social interaction. The digitalized data collection was feasible in an on-farm environment, and our results suggest that this approach could prove useful in identifying the most relevant welfare issues of horses in Europe or worldwide.
Law, Gloria C; Apfelbacher, Christian; Posadzki, Pawel P; Kemp, Sandra; Tudor Car, Lorainne
2018-05-17
There is projected to be a shortage of 18 million healthcare workers by 2030. Multiplying the number of well-trained healthcare workers through innovative approaches such as eLearning is highly recommended to help address this shortage. However, high heterogeneity of learning outcomes in eLearning systematic reviews reveals a lack of consistency and agreement on core learning outcomes in eLearning for medical education. In addition, there seems to be a lack of validity evidence for measurement instruments used in these trials. This undermines the credibility of these outcome measures and affects the ability to draw accurate and meaningful conclusions. The aim of this research is to address this issue by determining the choice of outcomes, measurement instruments and the prevalence of measurement instruments with validity evidence in randomised trials on eLearning for pre-registration medical education. We will conduct a systematic mapping and review to identify the types of outcomes, the kinds of measurement instruments and the prevalence of validity evidence among measurement instruments in eLearning randomised controlled trials (RCTs) in pre-registration medical education. The search period will be from January 1990 until August 2017. We will consider studies on eLearning for health professionals' education. Two reviewers will extract and manage data independently from the included studies. Data will be analysed and synthesised according to the aim of the review. Appropriate choice of outcomes and measurement tools is essential for ensuring high-quality research in the field of eLearning and eHealth. The results of this study could have positive implications for other eHealth interventions, including (1) improving quality and credibility of eLearning research, (2) enhancing the quality of digital medical education and (3) informing researchers, academics and curriculum developers about the types of outcomes and validity evidence for measurement instruments used in eLearning studies. The protocol aspires to assist in the advancement of the eLearning research field as well as in the development of high-quality healthcare professionals' digital education. PROSPERO CRD42017068427.
El Feghali, Ramzi N; Topouchian, Jirar A; Pannier, Bruno M; El Assaad, Hiba A; Asmar, Roland G
2007-06-01
A high percentage of hypertensive patients present an arm circumference of over 32 cm; the use of a large cuff is therefore recommended. Validation studies are usually performed in the general population using a standard-size cuff. The aim of this study was to assess the accuracy of the Omron M7 device in a population with an arm circumference ranging from 32 to 42 cm. A validation study was performed according to the International Protocol of the European Society of Hypertension. This protocol is divided into two phases: the first phase is performed on 15 selected participants (45 pairs of blood-pressure measurements); if the device passes this phase, 18 supplementary participants are included (54 pairs of blood-pressure measurements), making a total number of 33 participants (99 pairs of blood-pressure measurements), on whom the analysis is performed. For each participant, four blood-pressure measurements were performed simultaneously by two trained observers, using mercury sphygmomanometers fitted with a Y tube; the measurements alternated with three by the test device. The difference between the blood-pressure value given by the device and that obtained by the two observers (mean of the two observations) was calculated for each measure. The 99 pairs of blood-pressure differences were classified into three categories (
Savage, Trevor Nicholas; McIntosh, Andrew Stuart
2017-03-01
It is important to understand factors contributing to and directly causing sports injuries to improve the effectiveness and safety of sports skills. The characteristics of injury events must be evaluated and described meaningfully and reliably. However, many complex skills cannot be effectively investigated quantitatively because of ethical, technological and validity considerations. Increasingly, qualitative methods are being used to investigate human movement for research purposes, but there are concerns about reliability and measurement bias of such methods. Using the tackle in Rugby union as an example, we outline a systematic approach for developing a skill analysis protocol with a focus on improving objectivity, validity and reliability. Characteristics for analysis were selected using qualitative analysis and biomechanical theoretical models and epidemiological and coaching literature. An expert panel comprising subject matter experts provided feedback and the inter-rater reliability of the protocol was assessed using ten trained raters. The inter-rater reliability results were reviewed by the expert panel and the protocol was revised and assessed in a second inter-rater reliability study. Mean agreement in the second study improved and was comparable (52-90% agreement and ICC between 0.6 and 0.9) with other studies that have reported inter-rater reliability of qualitative analysis of human movement.
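Of the two reliability statistics reported (percentage agreement and the intraclass correlation coefficient), the simpler one is sketched below for two raters coding the same tackle characteristic; the category labels are invented for illustration.

def percent_agreement(rater_a, rater_b):
    """Percentage of events coded identically by two raters."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * matches / len(rater_a)

rater_a = ["front-on", "side-on", "front-on", "behind", "side-on"]   # illustrative tackle codings
rater_b = ["front-on", "side-on", "behind", "behind", "side-on"]
print(percent_agreement(rater_a, rater_b))   # 80.0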
Issues in developing valid assessments of speech pathology students' performance in the workplace.
McAllister, Sue; Lincoln, Michelle; Ferguson, Alison; McAllister, Lindy
2010-01-01
Workplace-based learning is a critical component of professional preparation in speech pathology. A validated assessment of this learning is seen to be 'the gold standard', but it is difficult to develop because of design and validation issues. These issues include the role and nature of judgement in assessment, challenges in measuring quality, and the relationship between assessment and learning. Valid assessment of workplace-based performance needs to capture the development of competence over time and account for both occupation-specific and generic competencies. This paper reviews important conceptual issues in the design of valid and reliable workplace-based assessments of competence, including assessment content, process, impact on learning, measurement issues, and validation strategies. It then goes on to share what has been learned about quality assessment and validation of a workplace-based performance assessment using competency-based ratings. The outcomes of a four-year national development and validation of an assessment tool are described. A literature review of issues in conceptualizing, designing, and validating workplace-based assessments was conducted. Key factors to consider in the design of a new tool were identified and built into the cycle of design, trialling, and data analysis in the validation stages of the development process. This paper provides an accessible overview of factors to consider in the design and validation of workplace-based assessment tools. It presents strategies used in the development and national validation of a tool, COMPASS, used in every speech pathology programme in Australia, New Zealand, and Singapore. The paper also describes Rasch analysis, a model-based statistical approach which is useful for establishing validity and reliability of assessment tools. Through careful attention to conceptual and design issues in the development and trialling of workplace-based assessments, it has been possible to develop the world's first valid and reliable national assessment tool for the assessment of performance in speech pathology.
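Rasch analysis, mentioned above as the validation approach, models the probability of success on an item as a logistic function of the gap between person ability and item difficulty. The dichotomous form is sketched below as a simplification; COMPASS itself uses competency-based rating scales, so this is illustrative only.

import math

def rasch_probability(ability, difficulty):
    """P(success) under the dichotomous Rasch model, with both parameters expressed in logits."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

print(rasch_probability(1.5, 0.5))   # more able person, moderate item -> ~0.73
print(rasch_probability(0.0, 1.0))   # less able person, harder item  -> ~0.27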
Lewandowska, Dagmara W; Zagordi, Osvaldo; Geissberger, Fabienne-Desirée; Kufner, Verena; Schmutz, Stefan; Böni, Jürg; Metzner, Karin J; Trkola, Alexandra; Huber, Michael
2017-08-08
Sequence-specific PCR is the most common approach for virus identification in diagnostic laboratories. However, as specific PCR only detects pre-defined targets, novel virus strains or viruses not included in routine test panels will be missed. Recently, advances in high-throughput sequencing allow for virus-sequence-independent identification of entire virus populations in clinical samples, yet standardized protocols are needed to allow broad application in clinical diagnostics. Here, we describe a comprehensive sample preparation protocol for high-throughput metagenomic virus sequencing using random amplification of total nucleic acids from clinical samples. In order to optimize metagenomic sequencing for application in virus diagnostics, we tested different enrichment and amplification procedures on plasma samples spiked with RNA and DNA viruses. A protocol including filtration, nuclease digestion, and random amplification of RNA and DNA in separate reactions provided the best results, allowing reliable recovery of viral genomes and a good correlation of the relative number of sequencing reads with the virus input. We further validated our method by sequencing a multiplexed viral pathogen reagent containing a range of human viruses from different virus families. Our method proved successful in detecting the majority of the included viruses with high read numbers and compared well to other protocols in the field validated against the same reference reagent. Our sequencing protocol works not only with plasma but also with other clinical samples such as urine and throat swabs. The workflow for virus metagenomic sequencing that we established proved successful in detecting a variety of viruses in different clinical samples. Our protocol supplements existing virus-specific detection strategies, providing opportunities to identify atypical and novel viruses commonly not accounted for in routine diagnostic panels.
Multiprofessional electronic protocol in ophthalmology with emphasis on strabismus.
Ribeiro, Christie Graf; Moreira, Ana Tereza Ramos; Pinto, José Simão DE Paula; Malafaia, Osvaldo
2016-01-01
To create and validate an electronic database in ophthalmology focused on strabismus, to computerize this database in the form of a systematic data collection software named Electronic Protocol, and to incorporate this protocol into the Integrated System of Electronic Protocols (SINPE(c)). This is a descriptive study, with the methodology divided into three phases: (1) development of a theoretical ophthalmologic database with emphasis on strabismus; (2) computerization of this theoretical ophthalmologic database using SINPE(c); and (3) interpretation of the information with demonstration of results to validate the protocol. We input data from the charts of fifty patients with known strabismus through the Electronic Protocol for testing and validation. The new electronic protocol was able to store information regarding patient history, physical examination, laboratory exams, imaging results, diagnosis and treatment of patients with ophthalmologic diseases, with emphasis on strabismus. We included 2,141 items in this master protocol and created 20 new specific electronic protocols for strabismus, each with its own specifics. Validation was achieved through correlation and corroboration of the symptoms and confirmed diagnoses of the fifty included patients with the diagnostic criteria for the twenty new strabismus protocols. A new, validated electronic database focusing on ophthalmology, with emphasis on strabismus, was successfully created through the standardized collection of information, and computerization of the database using proprietary software. This protocol is ready for deployment to facilitate data collection, sorting and application for practitioners and researchers in numerous specialties.
NASA Technical Reports Server (NTRS)
Lewis, Pattie
2007-01-01
Headquarters National Aeronautics and Space Administration (NASA) chartered the NASA Acquisition Pollution Prevention (AP2) Office to coordinate agency activities affecting pollution prevention issues identified during system and component acquisition and sustainment processes. The primary objectives of the AP2 Office are to: (1) Reduce or eliminate the use of hazardous materials or hazardous processes at manufacturing, remanufacturing, and sustainment locations. (2) Avoid duplication of effort in actions required to reduce or eliminate hazardous materials through joint center cooperation and technology sharing. The objective of this project was to qualify candidate alternative Low-Emission Surface Preparation/Depainting Technologies for Structural Steel applications at NASA facilities. This project compares the surface preparation/depainting performance of the proposed alternatives to existing surface preparation/depainting systems or standards. This Joint Test Report (JTR) contains the results of testing as per the outlines of the Joint Test Protocol (JTP), Joint Test Protocol for Validation of Alternative Low-Emission Surface Preparation/Depainting Technologies for Structural Steel, and the Field Test Plan (FTP), Field Evaluations Test Plan for Validation of Alternative Low-Emission Surface Preparation/Depainting Technologies for Structural Steel, for critical requirements and tests necessary to qualify alternatives for coating removal systems. These tests were derived from engineering, performance, and operational impact (supportability) requirements defined by a consensus of government and industry participants. This JTR documents the results of the testing as well as any test modifications made during the execution of the project. This JTR is made available as a reference for future pollution prevention endeavors by other NASA Centers, the Department of Defense and commercial users to minimize duplication of effort. The current coating removal processes identified herein are for polyurethane, epoxy and other paint systems applied by conventional wet-spray processes. A table summarizes the target hazardous materials, processes and materials, applications, affected programs, and candidate substrates.
Artifact-based reflective interviews for identifying pragmatic epistemological resources
NASA Astrophysics Data System (ADS)
Shubert, Christopher Walden
Physics Education Research studies the science of teaching and learning physics. The process of student learning is complex, and the factors that affect it are numerous. Describing students' understanding of physics knowledge and reasoning is the basis for much productive research; however, such research fails to account for certain types of student learning difficulties. In this dissertation, I explore one source of student difficulty: personal epistemology, students' ideas about knowledge and knowing. Epistemology traditionally answers three questions: What is knowledge? How is knowledge created? And, how do we know what we know? An individual's responses to these questions can affect learning in terms of how they approach tasks involving the construction and application of knowledge. The key issue addressed in this dissertation is the effect of methodological choices on the validity and reliability of claims concerning personal epistemology. My central concern is contextual validity, how what is said about one's epistemology is not identical to how one behaves epistemologically. In response to these issues, I present here a new methodology for research on student epistemology: video artifact-based reflective interview protocols. These protocols begin with video taping students in their natural classroom activities, and then asking the participants epistemological questions immediately after watching selected scenes from their activity, contextually anchoring them in their actual learning experience. The data from these interviews is viewed in the framework of Epistemological Resource Theory, a framework of small bits of knowledge whose coordination in a given context is used to describe personal epistemology. I claim that the privileged data from these interviews allows detailed epistemological resources to be identified, and that these resources can provide greater insight into how student epistemologies are applied in learning activities. This research, situated within an algebra-based physics for life scientists course reform project, focuses on student work in Modeling Informed Instruction (MII) laboratory activities, which are an adaptation of Modeling Instruction. The development of these activities is based on the epistemological foundations of Modeling Instruction, and these foundations are used to describe a potential assessment for the epistemological effectiveness of a curriculum.
Standardized protocols for quality control of MRM-based plasma proteomic workflows.
Percy, Andrew J; Chambers, Andrew G; Smith, Derek S; Borchers, Christoph H
2013-01-04
Mass spectrometry (MS)-based proteomics is rapidly emerging as a viable technology for the identification and quantitation of biological samples, such as human plasma--the most complex yet commonly employed biofluid in clinical analyses. The transition from a qualitative to quantitative science is required if proteomics is going to successfully make the transition to a clinically useful technique. MS, however, has been criticized for a lack of reproducibility and interlaboratory transferability. Currently, the MS and plasma proteomics communities lack standardized protocols and reagents to ensure that high-quality quantitative data can be accurately and precisely reproduced by laboratories across the world using different MS technologies. Toward addressing this issue, we have developed standard protocols for multiple reaction monitoring (MRM)-based assays with customized isotopically labeled internal standards for quality control of the sample preparation workflow and the MS platform in quantitative plasma proteomic analyses. The development of reference standards and their application to a single MS platform is discussed herein, along with the results from intralaboratory tests. The tests highlighted the importance of the reference standards in assessing the efficiency and reproducibility of the entire bottom-up proteomic workflow and revealed errors related to the sample preparation and performance quality and deficits of the MS and LC systems. Such evaluations are necessary if MRM-based quantitative plasma proteomics is to be used in verifying and validating putative disease biomarkers across different research laboratories and eventually in clinical laboratories.
Quantification of trace elements and speciation of iron in atmospheric particulate matter
NASA Astrophysics Data System (ADS)
Upadhyay, Nabin
Trace metal species play important roles in atmospheric redox processes and in the generation of oxidants in cloud systems. The chemical impact of these elements on atmospheric and cloud chemistry is dependent on their occurrence, solubility and speciation. First, analytical protocols were developed to determine trace elements in particulate matter samples collected for carbonaceous analysis. The validated novel protocols were applied to the determination of trace elements in particulate samples collected in the remote marine atmosphere and in urban areas in Arizona to study air pollution issues. The second part of this work investigates the solubility and speciation of iron in environmental samples. A detailed study on the impact of the nature and strength of buffer solutions on the solubility and speciation of iron led to a robust protocol, allowing for comparative measurements in matrices representative of cloud water conditions. Application of this protocol to samples from different environments showed low iron solubility (less than 1%) in dust-impacted events and higher solubility (5%) in anthropogenically impacted urban samples. In most cases, Fe(II) was the dominant oxidation state in the soluble fraction of iron. The analytical protocol was then applied to investigate iron processing by fogs. Field observations showed that only a small fraction (1%) of iron was scavenged by fog droplets, in which the soluble and insoluble fractions were similar. A coarse time resolution limited detailed insights into redox cycling within the fog system. Overall, results suggested that the major iron species in the droplets was Fe(II) (80% of soluble iron). Finally, the occurrence and sources of emerging organic pollutants in the urban atmosphere were investigated. Synthetic musk species are ubiquitous in the urban environment (less than 5 ng m-3), and investigations at wastewater treatment plants showed that wastewater aeration basins emit a substantial amount of these species to the atmosphere.
Development of UV Testing Protocol and Recommendations
The goal of this effort is to develop and present new protocols for UV validation testing and analysis that leverage advances and may help to improve implementation and operation at PWSs. This document also provides for updated clarifications to the UVDGM based on evolving practi...
Wittink, Harriet; Verschuren, Olaf; Terwee, Caroline; de Groot, Janke; Kwakkel, Gert; van de Port, Ingrid
2017-11-21
To systematically review and critically appraise the literature on measurement properties of cardiopulmonary exercise test protocols for measuring aerobic capacity, VO2max, in persons after stroke. PubMed, Embase and Cinahl were searched from inception up to 15 June 2016. A total of 9 studies were identified reporting on 9 different cardiopulmonary exercise test protocols. VO2max measured with cardiopulmonary exercise test and open spirometry was the construct of interest. The target population was adult persons after stroke. We included all studies that evaluated reliability, measurement error, criterion validity, content validity, hypothesis testing and/or responsiveness of cardiopulmonary exercise test protocols. Two researchers independently screened the literature, assessed methodological quality using the COnsensus-based Standards for the selection of health Measurement INstruments checklist and extracted data on measurement properties of cardiopulmonary exercise test protocols. Most studies reported on only one measurement property. Best-evidence synthesis was derived taking into account the methodological quality of the studies, the results and the consistency of the results. No judgement could be made on which protocol is "best" for measuring VO2max in persons after stroke due to lack of high-quality studies on the measurement properties of the cardiopulmonary exercise test.
Yamanaka, Ashley; Fialkowski, Marie Kainoa; Wilkens, Lynne; Li, Fenfang; Ettienne, Reynolette; Fleming, Travis; Power, Julianne; Deenik, Jonathan; Coleman, Patricia; Leon Guerrero, Rachael; Novotny, Rachel
2016-09-02
Quality assurance plays an important role in research by assuring data integrity, and thus, valid study results. We aim to describe and share the results of the quality assurance process used to guide the data collection process in a multi-site childhood obesity prevalence study and intervention trial across the US Affiliated Pacific Region. Quality assurance assessments following a standardized protocol were conducted by one assessor in every participating site. Results were summarized to examine and align the implementation of protocol procedures across diverse settings. Data collection protocols focused on food and physical activity were adhered to closely; however, protocols for handling completed forms and ensuring data security showed more variability. Quality assurance protocols are common in the clinical literature but are limited in multi-site community-based studies, especially in underserved populations. The reduction in the number of QA problems found in the second data collection period, as compared with the first, for the intervention study attests to the value of this assessment. This paper can serve as a reference for similar studies wishing to implement quality assurance protocols for the data collection process to preserve data integrity and enhance the validity of study findings. NIH clinical trial #NCT01881373.
Rosenkrantz, Andrew B; Johnson, Evan; Sanger, Joseph J
2015-10-01
This article presents our local experience in the implementation of a real-time web-based system for reporting and tracking quality issues relating to abdominal imaging examinations. This system allows radiologists to electronically submit examination quality issues during clinical readouts. The submitted information is e-mailed to a designate for the given modality for further follow-up; the designate may subsequently enter text describing their response or action taken, which is e-mailed back to the radiologist. Review of 558 entries over a 6-year period demonstrated documentation of a broad range of examination quality issues, including specific issues relating to protocol deviation, post-processing errors, positioning errors, artifacts, and IT concerns. The most common issues varied among US, CT, MRI, radiography, and fluoroscopy. In addition, the most common issues resulting in a patient recall for repeat imaging (generally related to protocol deviation in MRI and US) were identified. In addition to submitting quality problems, radiologists also commonly used the tool to provide recognition of a well-performed examination. An electronic log of actions taken in response to radiologists' submissions indicated that both positive and negative feedback were commonly communicated to the performing technologist. Information generated using the tool can be used to guide subsequent quality improvement initiatives within a practice, including continued protocol standardization as well as education of technologists in the optimization of abdominal imaging examinations.
Direct-to-consumer genetic testing in Slovenia: availability, ethical dilemmas and legislation.
Vrecar, Irena; Peterlin, Borut; Teran, Natasa; Lovrecic, Luca
2015-01-01
Over the last few years, many private companies have been advertising direct-to-consumer genetic testing (DTC GT), mostly offering tests with little or no clinical utility and validity, and without genetic counselling. The international professional community does not approve of the provision of DTC GT, and the situation in some EU countries has already been analysed. The aim of our study was to analyse the current situation in the field of DTC GT in Slovenia and the related legal and ethical issues. Information was retrieved through an internet search, performed independently by two authors, and structured according to individual private company and the types of genetic testing offered. Five private companies and three health insurance companies offer DTC GT, and it is provided without genetic counselling. Available tests include testing for breast cancer, tests providing other health-related information (complex diseases, drug responses) and other tests (nutrigenetic, ancestry, paternity). National legislation is currently being developed, and the Council of Experts in Medical Genetics has issued an opinion on Genetic Testing and the Commercialization of Genetic Tests in Slovenia. Despite the fact that Slovenia has signed the Additional Protocol to the Convention on Human Rights and Biomedicine concerning Genetic Testing for Health Purposes, DTC GT is present in Slovenia, contrary to all international recommendations. There is little or no medical supervision, the clinical validity and utility of the tests are lacking, and inappropriate genetic testing of minors is available. There is an urgent need for regulation of the ethical, legal, and social aspects. National legislation on DTC GT is being prepared.
A semi-automatic method for left ventricle volume estimate: an in vivo validation study
NASA Technical Reports Server (NTRS)
Corsi, C.; Lamberti, C.; Sarti, A.; Saracino, G.; Shiota, T.; Thomas, J. D.
2001-01-01
This study aims to validate the left ventricular (LV) volume estimates obtained by processing volumetric data with a segmentation model based on the level set technique. The validation was performed by comparing real-time volumetric echo data (RT3DE) and magnetic resonance imaging (MRI) data. A validation protocol was defined and applied to twenty-four estimates (range 61-467 ml) obtained from normal and pathologic subjects who underwent both RT3DE and MRI. A statistical analysis was performed on each estimate and on clinical parameters such as stroke volume (SV) and ejection fraction (EF). Assuming the MRI estimates (x) as a reference, an excellent correlation was found with the volumes measured using the segmentation procedure (y) (y=0.89x + 13.78, r=0.98). The mean error on SV was 8 ml and the mean error on EF was 2%. This study demonstrated that the segmentation technique is reliably applicable to human hearts in clinical practice.
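As an illustration of the agreement analysis summarized above, the following sketch fits an ordinary least-squares line relating segmentation-derived volumes to MRI reference volumes and reports the correlation; the paired volumes are hypothetical and SciPy is an assumed tool, not the software used in the study.

```python
import numpy as np
from scipy import stats

# Hypothetical paired LV volume estimates (ml): MRI reference (x) and level-set segmentation (y).
mri_ml = np.array([61.0, 95.0, 140.0, 210.0, 320.0, 467.0])
seg_ml = np.array([68.0, 99.0, 138.0, 205.0, 301.0, 430.0])

# Least-squares fit y = slope * x + intercept and Pearson correlation,
# analogous to the reported y = 0.89x + 13.78, r = 0.98.
slope, intercept, r, p, stderr = stats.linregress(mri_ml, seg_ml)
print(f"y = {slope:.2f}x + {intercept:.2f}, r = {r:.2f}")

# Mean absolute volume error between the two methods.
print(f"mean absolute error: {np.mean(np.abs(seg_ml - mri_ml)):.1f} ml")
```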
USING CFD TO ANALYZE NUCLEAR SYSTEMS BEHAVIOR: DEFINING THE VALIDATION REQUIREMENTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richard Schultz
2012-09-01
A recommended protocol to formulate numeric tool specifications and validation needs in concert with practices accepted by regulatory agencies for advanced reactors is described. The protocol is based on the plant type and the perceived transient and accident envelopes, which translate into boundary conditions for a process that yields: (a) the key phenomena and figures-of-merit that must be analyzed to ensure that the advanced plant can be licensed, (b) the specification of the numeric tool capabilities necessary to perform the required analyses, including bounding calculational uncertainties, and (c) the specification of the validation matrices and experiments, including the desired validation data. The result of applying the process is a complete program, including costs, for creating and benchmarking transient and accident analysis methods for advanced reactors. By following a process that is in concert with regulatory agency licensing requirements from start to finish, based on historical acceptance of past licensing submittals, the methods derived and validated have a high probability of regulatory agency acceptance.
Mixing TCP and Satellites: A View from Above
NASA Technical Reports Server (NTRS)
Travis, Eric
1998-01-01
Various issues associated with "Mixing TCP and Satellites: A View from Above" are presented in viewgraph form. Specific topics include: 1) Why are open protocol standards important?; and 2) Protocols are like galoshes: One size does not fit all.
Beime, Beate; Deutsch, Cornelia; Krüger, Ralf; Wolf, Andreas; Müller, Peter; Hammel, Gertrud; Bramlage, Peter
2017-05-01
The purpose of the study was to validate the ambulatory blood pressure monitoring (ABPM) device custo screen pediatric in children aged 3 to 12 years according to the International Protocol of the European Society of Hypertension (ESH-IP revision 2010). Thirty-three children were included and systolic and diastolic blood pressure measurements were performed according to the ESH-IP. The protocol was modified for children considering data from the German Health Interview and Examination Survey for Children and Adolescents (KIGGS). The custo screen pediatric met all the requirements of the ESH-IP. The mean difference between the test device and the reference was -1.4 ± 3.0 mmHg for systolic blood pressure (SBP) and -0.7 ± 3.2 mmHg for diastolic blood pressure (DBP). For SBP and DBP, all 99 measurements were within the absolute difference of 10 mmHg between the test device and the reference. As to part 2 of the protocol, for DBP in all subjects, two out of three measurements were within 5 mmHg between the device and the standard, whereas for SBP in 32 of 33 subjects, two out of three measurements were within this range. The custo screen pediatric met all criteria of the ESH-IP revision 2010, modified for children from 3 to about 12 years, and can be recommended for ABPM in children. What is Known: • Validation of blood pressure measuring devices is essential to provide patients with an accurate blood pressure measuring device. • The majority of devices have not been validated in children. What is New: • Prior to the present validation study, protocol adjustments of the ESH-IP revision 2010 for children were defined according to the German Health Interview and Examination Survey for Children and Adolescents 2013 (KIGGS). • The custo screen pediatric test device met all criteria of the ESH-IP revision 2010, modified for children, and can be recommended for ABPM in children aged 3 to about 12 years.
Mena, Marisa; Lloveras, Belen; Tous, Sara; Bogers, Johannes; Maffini, Fausto; Gangane, Nitin; Kumar, Rekha Vijay; Somanathan, Thara; Lucas, Eric; Anantharaman, Devasena; Gheit, Tarik; Castellsagué, Xavier; Pawlita, Michael; de Sanjosé, Silvia; Alemany, Laia; Tommasino, Massimo
2017-01-01
Worldwide use of formalin-fixed paraffin-embedded blocks (FFPE) is extensive in diagnosis and research. Yet, there is a lack of optimized/standardized protocols to process the blocks and verify the quality and presence of the targeted tissue. In the context of an international study on head and neck cancer (HNC), HPV-AHEAD, a standardized protocol for optimizing the use of FFPEs in molecular epidemiology was developed and validated. First, a protocol for sectioning the FFPE was developed to prevent cross-contamination and distributed to the participating centers. Before processing blocks, all sectioning centers underwent a quality control to guarantee a satisfactory training process. The first and last sections of the FFPEs were used for histopathological assessment. A consensus histopathology evaluation form was developed by an international panel of pathologists and evaluated in a pilot analysis in order to validate it, with indicators covering: 1) presence/type of tumor tissue, 2) identification of other tissue components that could affect the molecular diagnosis and 3) quality of the tissue. No HPV DNA was found in sections from empty FFPE blocks generated in any of the histology laboratories of the HPV-AHEAD consortium, and all centers passed quality assurance for processing after quality control. The pilot analysis to validate the histopathology form included 355 HNC cases. The form was filled in by six pathologists and each case was randomly assigned to two of them. Most samples (86%) were considered satisfactory. Presence of >50% of invasive carcinoma was observed in all sections of 66% of cases. Substantial necrosis (>50%) was present in <2% of samples. The concordance for the indicators targeted to validate the histopathology form was very high (kappa > 0.85) between first and last sections and fair to high between pathologists (kappa/pabak 0.21-0.72). The protocol allowed all FFPE blocks in the study to be processed correctly, without signs of contamination. The histopathology evaluation of the cases assured the presence of the targeted tissue, identified the presence of other tissues that could disturb the molecular diagnosis and allowed the assessment of tissue quality.
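For the concordance figures quoted above (e.g., kappa > 0.85 between first and last sections), the sketch below shows a minimal Cohen's kappa computation on hypothetical categorical histopathology calls; scikit-learn is an assumed tool here, not the consortium's actual software.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical presence-of-tumor calls for the first and last section of six blocks;
# the real evaluation form covered several indicators per case.
first_section = ["tumor", "tumor", "no_tumor", "tumor", "no_tumor", "tumor"]
last_section  = ["tumor", "tumor", "no_tumor", "tumor", "tumor",    "tumor"]

kappa = cohen_kappa_score(first_section, last_section)
print(f"Cohen's kappa between first and last sections: {kappa:.2f}")
```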
Validating visual disturbance types and classes used for forest soil monitoring protocols
D. S. Page-Dumroese; A. M. Abbott; M. P. Curran; M. F. Jurgensen
2012-01-01
We describe several methods for validating visual soil disturbance classes used during forest soil monitoring after specific management operations. Site-specific vegetative, soil, and hydrologic responses to soil disturbance are needed to identify sensitive and resilient soil properties and processes; therefore, validation of ecosystem responses can provide information...
NASA Technical Reports Server (NTRS)
Hooker, Stanford B.; McClain, Charles R.; Mannino, Antonio
2007-01-01
The primary objective of this planning document is to establish a long-term capability for calibrating and validating oceanic biogeochemical satellite data. It is a pragmatic solution to a practical problem based primarily on the lessons learned from prior satellite missions. All of the plan's elements are seen to be interdependent, so a horizontal organizational scheme is anticipated wherein the overall leadership comes from the NASA Ocean Biology and Biogeochemistry (OBB) Program Manager and the entire enterprise is split into two components of equal stature: calibration and validation plus satellite data processing. The detailed elements of the activity are based on the basic tasks of the two main components plus the current objectives of the Carbon Cycle and Ecosystems Roadmap. The former is distinguished by an internal core set of responsibilities and the latter is facilitated through an external connecting-core ring of competed or contracted activities. The core elements for the calibration and validation component include a) publish protocols and performance metrics; b) verify uncertainty budgets; c) manage the development and evaluation of instrumentation; and d) coordinate international partnerships. The core elements for the satellite data processing component are e) process and reprocess multisensor data; f) acquire, distribute, and archive data products; and g) implement new data products. Both components have shared responsibilities for initializing and temporally monitoring satellite calibration. Connecting-core elements include (but are not restricted to) atmospheric correction and characterization, standards and traceability, instrument and analysis round robins, field campaigns and vicarious calibration sites, in situ database, bio-optical algorithm (and product) validation, satellite characterization and vicarious calibration, and image processing software. The plan also includes an accountability process, the creation of a Calibration and Validation Team (to help manage the activity), and a discussion of issues associated with the plan's scientific focus.
Current federal regulations require monitoring for fecal coliforms or Salmonella in biosolids destined for land application. Methods used for analysis of fecal coliforms and Salmonella were reviewed and a standard protocol was developed. The protocols were then evaluated by testi...
A Scheme to Share Information via Employing Discrete Algorithm to Quantum States
NASA Astrophysics Data System (ADS)
Kang, Guo-Dong; Fang, Mao-Fa
2011-02-01
We propose a protocol for information sharing between two legitimate parties (Bob and Alice) via public-key cryptography. In particular, we specialize the protocol by employing a discrete algorithm under modular arithmetic that maps integers to quantum states via photon rotations. Based on this algorithm, we find that the protocol is secure under various classes of attacks. Specifically, owing to the algorithm, the security of the classical private information contained in the quantum public key and the corresponding ciphertext is guaranteed. The protocol is also robust against the impersonation attack and the active wiretapping attack by virtue of a dedicated checking procedure; thus, the protocol is valid.
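The abstract does not spell out the construction, so the sketch below only illustrates the general idea of a modular ("discrete algorithm under mod") map from integers to photon rotation angles; the generator, modulus, and angle convention are assumptions, not the authors' scheme.

```python
import math

def rotation_angle(m: int, g: int = 5, N: int = 97) -> float:
    """Map an integer message m to a photon rotation angle via g**m mod N (illustrative only)."""
    residue = pow(g, m, N)              # discrete, modular map of the integer
    return 2.0 * math.pi * residue / N  # residue encoded as a rotation angle in radians

for m in (3, 7, 11):
    print(f"m = {m} -> angle = {rotation_angle(m):.4f} rad")
```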
Boubouchairopoulou, N; Kollias, A; Chiu, B; Chen, B; Lagou, S; Anestis, P; Stergiou, G S
2017-07-01
A pocket-size cuffless electronic device for self-measurement of blood pressure (BP) has been developed (Freescan, Maisense Inc., Zhubei, Taiwan). The device estimates BP within 10 s using three embedded electrodes and one force sensor that is applied over the radial pulse to evaluate the pulse wave. Before use, basic anthropometric characteristics are recorded on the device, and individualized initial calibration is required based on a standard BP measurement performed using an upper-arm BP monitor. The device performance in providing valid BP readings was evaluated in 313 normotensive and hypertensive adults in three study phases during which the device sensor was upgraded. A formal validation study of a prototype device against mercury sphygmomanometer was performed according to the American National Standards Institute/Association for the Advancement of Medical Instrumentation/International Organization for Standardization (ANSI/AAMI/ISO) 2013 protocol. The test device succeeded in obtaining a valid BP measurement (three successful readings within up to five attempts) in 55-72% of the participants, which reached 87% with device sensor upgrade. For the validation study, 125 adults were recruited and 85 met the protocol requirements for inclusion. The mean device-observers BP difference was 3.2±6.7 (s.d.) mm Hg for systolic and 2.6±4.6 mm Hg for diastolic BP (criterion 1). The estimated s.d. (inter-subject variability) were 5.83 and 4.17 mm Hg respectively (criterion 2). These data suggest that this prototype cuffless BP monitor provides valid self-measurements in the vast majority of adults, and satisfies the BP measurement accuracy criteria of the ANSI/AAMI/ISO 2013 validation protocol.
O'Neil, Margaret E; Fragala-Pinkham, Maria; Lennon, Nancy; George, Ameeka; Forman, Jeffrey; Trost, Stewart G
2016-01-01
Physical therapy for youth with cerebral palsy (CP) who are ambulatory includes interventions to increase functional mobility and participation in physical activity (PA). Thus, reliable and valid measures are needed to document PA in youth with CP. The purpose of this study was to evaluate the inter-instrument reliability and concurrent validity of 3 accelerometer-based motion sensors with indirect calorimetry as the criterion for measuring PA intensity in youth with CP. Fifty-seven youth with CP (mean age=12.5 years, SD=3.3; 51% female; 49.1% with spastic hemiplegia) participated. Inclusion criteria were: aged 6 to 20 years, ambulatory, Gross Motor Function Classification System (GMFCS) levels I through III, able to follow directions, and able to complete the full PA protocol. Protocol activities included standardized activity trials with increasing PA intensity (resting, writing, household chores, active video games, and walking at 3 self-selected speeds), as measured by weight-relative oxygen uptake (in mL/kg/min). During each trial, participants wore bilateral accelerometers on the upper arms, waist/hip, and ankle and a portable indirect calorimeter. Intraclass correlation coefficients (ICCs) were calculated to evaluate inter-instrument reliability (left-to-right accelerometer placement). Spearman correlations were used to examine concurrent validity between accelerometer output (activity and step counts) and indirect calorimetry. Friedman analyses of variance with post hoc pair-wise analyses were conducted to examine the validity of accelerometers to discriminate PA intensity across activity trials. All accelerometers exhibited excellent inter-instrument reliability (ICC=.94-.99) and good concurrent validity (rho=.70-.85). All accelerometers discriminated PA intensity across most activity trials. This PA protocol consisted of controlled activity trials. Accelerometers provide valid and reliable measures of PA intensity among youth with CP. © 2016 American Physical Therapy Association.
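A minimal sketch of the concurrent-validity step described above, correlating hypothetical accelerometer activity counts with indirect-calorimetry oxygen uptake across activity trials of increasing intensity; SciPy's spearmanr is an assumed tool rather than the study's software.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical per-trial values for one participant (resting through fast walking).
vo2_ml_kg_min   = np.array([3.5, 4.2, 8.1, 10.5, 12.0, 15.3, 18.9])   # indirect calorimetry
activity_counts = np.array([12, 40, 310, 520, 780, 1150, 1620])        # accelerometer output

rho, p_value = spearmanr(activity_counts, vo2_ml_kg_min)
print(f"Spearman rho = {rho:.2f} (study reported rho = .70-.85)")
```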
Genheimer, Hannah; Andreatta, Marta; Asan, Esther; Pauli, Paul
2017-12-20
Since exposure therapy for anxiety disorders incorporates extinction of contextual anxiety, relapses may be due to reinstatement processes. Animal research has demonstrated more stable extinction memory and less anxiety relapse due to vagus nerve stimulation (VNS). We report a valid human three-day context conditioning, extinction and return-of-anxiety protocol, which we used to examine effects of transcutaneous VNS (tVNS). Seventy-five healthy participants received electric stimuli (unconditioned stimuli, US) during acquisition (Day 1) when guided through one virtual office (anxiety context, CTX+) but never in another (safety context, CTX-). During extinction (Day 2), participants received tVNS, sham, or no stimulation and revisited both contexts without US delivery. On Day 3, participants received three USs for reinstatement, followed by a test phase. Successful acquisition (i.e., startle potentiation, lower valence, and higher arousal, anxiety and contingency ratings in CTX+ versus CTX-), the disappearance of these effects during extinction, and successful reinstatement indicate the validity of this paradigm. Interestingly, we found generalized reinstatement in startle responses and differential reinstatement in valence ratings. Altogether, our protocol serves as a valid conditioning paradigm. The reinstatement effects indicate different anxiety networks underlying physiological versus verbal responses. However, tVNS affected neither extinction nor reinstatement, which calls for validation and improvement of the stimulation protocol.
Konstan, Joseph; Iantaffi, Alex; Wilkerson, J. Michael; Galos, Dylan; Simon Rosser, B. R.
2017-01-01
Researchers use protocols to screen for suspicious survey submissions in online studies. We evaluated how well a de-duplication and cross-validation process detected invalid entries. Data were from the Sexually Explicit Media Study, an Internet-based HIV prevention survey of men who have sex with men. Using our protocol, 146 (11.6 %) of 1254 entries were identified as invalid. Most indicated changes to the screening questionnaire to gain entry (n = 109, 74.7 %), matched other submissions’ payment profiles (n = 56, 41.8 %), or featured an IP address that was recorded previously (n = 43, 29.5 %). We found few demographic or behavioral differences between valid and invalid samples, however. Invalid submissions had lower odds of reporting HIV testing in the past year (OR 0.63), and higher odds of requesting no payment compared to check payments (OR 2.75). Thus, rates of HIV testing would have been underestimated if invalid submissions had not been removed, and payment may not be the only incentive for invalid participation. PMID:25805443
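A minimal sketch of the kind of de-duplication checks described above (repeated IP addresses and matching payment profiles); the field names and example records are hypothetical and do not reproduce the study's actual screening protocol.

```python
from collections import Counter

# Hypothetical submissions: each record carries an IP address and a payment profile key.
submissions = [
    {"id": 1, "ip": "10.0.0.5", "payment": "check:ACME-1"},
    {"id": 2, "ip": "10.0.0.5", "payment": "check:ACME-1"},   # repeated IP and payment profile
    {"id": 3, "ip": "10.0.0.9", "payment": "none"},
]

ip_counts = Counter(s["ip"] for s in submissions)
payment_counts = Counter(s["payment"] for s in submissions)

for s in submissions:
    flags = []
    if ip_counts[s["ip"]] > 1:
        flags.append("IP address recorded previously")
    if s["payment"] != "none" and payment_counts[s["payment"]] > 1:
        flags.append("matching payment profile")
    print(s["id"], "flags:", flags or ["none"])
```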
Benetti, Elisabetta; Fania, Claudio; Palatini, Paolo
2014-02-01
The objective of this study was to determine the accuracy of the A&D BP UA-651 device for home blood pressure (BP) measurement according to the International Protocol of the European Society of Hypertension. Device evaluation was carried out in 33 patients. The mean age of the patients was 48.3±15.5 years, the mean systolic BP was 138.3±24.9 mmHg (range 90-180), the mean diastolic BP was 88.3±13.8 mmHg (range 60-108), and the mean arm circumference was 28.6±3.4 cm (range 23-36). The protocol requirements were followed precisely. The device passed all requirements, fulfilling the standards of the protocol. On average, the device underestimated the systolic BP by 0.4±4.4 mmHg and diastolic BP by 1.3±3.5 mmHg. The device-observer discrepancies were unrelated to patients' clinical characteristics. These data show that the A&D BP UA-651 device fulfilled the requirements for validation by the International Protocol and can be recommended for clinical use in the adult population.
Test Protocols for Advanced Inverter Interoperability Functions – Main Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Gonzalez, Sigifredo; Ralph, Mark E.
2013-11-01
Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of influencing significantly the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not currently required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already apparent as some of these inverter capabilities are being incorporated in large demonstration and commercial projects. The test protocols are intended to be used to verify acceptable performance of inverters within the standard framework described in IEC TR 61850-90-7. These test protocols, as they are refined and validated over time, can become precursors for future certification test procedures for DER advanced grid support functions.
Test Protocols for Advanced Inverter Interoperability Functions - Appendices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Gonzalez, Sigifredo; Ralph, Mark E.
2013-11-01
Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of influencing significantly the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not now required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already apparent as some of these inverter capabilities are being incorporated in large demonstration and commercial projects. The test protocols are intended to be used to verify acceptable performance of inverters within the standard framework described in IEC TR 61850-90-7. These test protocols, as they are refined and validated over time, can become precursors for future certification test procedures for DER advanced grid support functions.
Refolo, P; Sacchini, D; Minacori, R; Daloiso, V; Spagnolo, A G
2015-01-01
Patient recruitment is a critical point of today's clinical research. Several proposals have been made for improving it, but the effectiveness of these measures is actually uncertain. The use of Internet (e-recruitment) could represent a great chance to improve patient enrolment, even though the effectiveness of this implementation is not so evident. E-recruitment could bring some advantages, such as better interaction between clinical research demand and clinical research supply, time and resources optimization, and reduction of data entry errors. It raises some issues too, such as sampling errors, validity of informed consent, and protection of privacy. Research Ethics Committees/Institutional Review Boards should consider these critical points. The paper deals with Internet recruitment for clinical research. It also attempts to provide Research Ethics Committees/Institutional Review Boards with notes for assessing e-recruitment based clinical protocols.
DiFranza, J.; Savageau, J.; Bouchard, J.
2001-01-01
OBJECTIVE—To determine if the standard compliance check protocol is a valid measure of the experience of underage smokers when purchasing tobacco in unfamiliar communities. SETTING—160 tobacco outlets in eight Massachusetts communities where underage tobacco sales laws are vigorously enforced. PROCEDURE—Completed purchase rates were compared between underage smokers who behaved normally and inexperienced non-smoking youths who were not allowed to lie or present proof of age (ID). RESULTS—The "smoker protocol" increased the likelihood of a sale nearly sixfold over that for the non-smokers (odds ratio (OR) 5.7, 95% confidence interval (CI) 1.5 to 22). When the youths presented an ID with an underage birth date, the odds of a completed sale increased dramatically (OR 27, 95% CI 3.4 to 212). Clerks judged to be under 21 years of age were seven times more likely to make an illegal sale (OR 7.6, 95% CI 2.4 to 24.0). CONCLUSIONS—Commonly used compliance check protocols are too artificial to reflect accurately the experience of underage smokers. The validity of compliance checks might be improved by having youths present ID, and by employing either tobacco users, or non-tobacco users who are sufficiently experienced to mimic the self confidence exhibited by tobacco users in this situation. Consideration should be given to prohibiting the sale of tobacco by individuals under 21 years of age. Keywords: compliance check protocol; underage smokers PMID:11544386
Arnholdt-Schmitt, Birgit
2017-01-01
Respiration traits allow calculating temperature-dependent carbon use efficiency and prediction of growth rates. This protocol aims (1) to enable validation of respiration traits as non-DNA biomarkers for breeding on robust plants in support of sustainable and healthy plant production; (2) to provide an efficient, novel way to identify and predict functionality of DNA-based markers (genes, polymorphisms, edited genes, transgenes, genomes, and hologenomes), and (3) to directly help farmers select robust material appropriate for a specified region. The protocol is based on applying isothermal calorespirometry and consists of four steps: plant tissue preparation, calorespirometry measurements, data processing, and final validation through massive field-based data.The methodology can serve selection and improvement for a wide range of crops. Several of them are currently being tested in the author's lab. Among them are important cereals, such as wheat, barley, and rye, and diverse vegetables. However, it is critical that the protocol for measuring respiration traits be well adjusted to the plant species by considering deep knowledge on the specific physiology and functional cell biology behind the final target trait for production. Here, Daucus carota L. is chosen as an advanced example to demonstrate critical species-specific steps for protocol development. Carrot is an important global vegetable that is grown worldwide and in all climate regions (moderate, subtropical, and tropical). Recently, this species is also used in my lab as a model for studies on alternative oxidase (AOX) gene diversity and evolutionary dynamics in interaction with endophytes.
Evaluation of Hydroxyatrazine in the Endocrine Disruptor Screening and Testing Program’s Male and Female Pubertal Protocols. ABSTRACT Two critical components of the validation of any in vivo screening assay are to demonstrate sensitivity (ability to detect weak endocrine ...
The goal of this chapter is to provide an overview of MST marker characteristics, to describe performance criteria of detection protocols used and to offer guidelines for the effective interpretation of the results. Since the trend in the research community has shifted towards (q...
Connecting Levels of Representation: Emergent versus Submergent Perspective
ERIC Educational Resources Information Center
Rappoport, Lana T.; Ashkenazi, Guy
2008-01-01
Chemical phenomena can be described using three representation modes: macro, submicro, and symbolic. The way students use and connect these modes when solving conceptual problems was studied, using a think-aloud interview protocol. The protocol was validated through interviews with six faculty members, and then applied to four graduate and six…
31 CFR 352.3 - Registration and issue.
Code of Federal Regulations, 2010 CFR
2010-07-01
31 Money and Finance: Treasury 2 2010-07-01 Registration and issue. § 352.3 Registration and issue. (a) Registration. Series HH bonds may be registered as set forth in ... 3-80. (b) Validity of issue. A bond is validly issued when it is registered as provided in 31 CFR part ...
Handwriting assessment of Franco-Quebec primary school-age students
Couture, Mélanie; Morin, Marie-France; Coallier, Mélissa; Lavigne, Audrey; Archambault, Patricia; Bolduc, Émilie; Chartier, Émilie; Liard, Karolane; Jasmin, Emmanuelle
2016-12-01
Reasons for referring school-age children to occupational therapy mainly relate to handwriting problems. However, there are no validated tools or reference values for assessing handwriting in francophone children in Canada. This study aimed to adapt and validate the writing tasks described in an English Canadian handwriting assessment protocol and to develop reference values for handwriting speed for francophone children. Three writing tasks from the Handwriting Assessment Protocol-2nd Edition (near-point and far-point copying and dictation) were adapted for Québec French children and administered to 141 Grade 1 (n = 73) and Grade 2 (n = 68) students. Reference values for handwriting speed were obtained for near-point and far-point copying tasks. This adapted protocol and these reference values for speed will improve occupational therapy handwriting assessments for the target population.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Nan; Carmona, Ruben; Sirak, Igor
Purpose: To demonstrate an efficient method for training and validation of a knowledge-based planning (KBP) system as a radiation therapy clinical trial plan quality-control system. Methods and Materials: We analyzed 86 patients with stage IB through IVA cervical cancer treated with intensity modulated radiation therapy at 2 institutions according to the standards of the INTERTECC (International Evaluation of Radiotherapy Technology Effectiveness in Cervical Cancer, National Clinical Trials Network identifier: 01554397) protocol. The protocol used a planning target volume and 2 primary organs at risk: pelvic bone marrow (PBM) and bowel. Secondary organs at risk were rectum and bladder. Initial unfiltered dose-volume histogram (DVH) estimation models were trained using all 86 plans. Refined training sets were created by removing sub-optimal plans from the unfiltered sample, and DVH estimation models were constructed by identifying 30 of 86 plans emphasizing PBM sparing (comparing protocol-specified dosimetric cutpoints V10 (percentage volume of PBM receiving at least 10 Gy) and V20 (percentage volume of PBM receiving at least 20 Gy) with unfiltered predictions) and another 30 of 86 plans emphasizing bowel sparing (comparing V40 (absolute volume of bowel receiving at least 40 Gy) and V45 (absolute volume of bowel receiving at least 45 Gy), 9 in common with the PBM set). To obtain deliverable KBP plans, refined models must inform patient-specific optimization objectives and/or priorities (an auto-planning "routine"). Four candidate routines emphasizing different tradeoffs were composed, and a script was developed to automatically re-plan multiple patients with each routine. After selection of the routine that best met protocol objectives in the 51-patient training sample (KBP-FINAL), protocol-specific DVH metrics and normal tissue complication probability were compared for original versus KBP-FINAL plans across the 35-patient validation set. Paired t tests were used to test differences between planning sets. Results: KBP-FINAL plans outperformed manual planning across the validation set in all protocol-specific DVH cutpoints. The mean normal tissue complication probability for gastrointestinal toxicity was lower for KBP-FINAL versus validation-set plans (48.7% vs 53.8%, P<.001). Similarly, the estimated mean white blood cell count nadir was higher (2.77 vs 2.49 k/mL, P<.001) with KBP-FINAL plans, indicating a lowered probability of hematologic toxicity. Conclusions: This work demonstrates that a KBP system can be efficiently trained and refined for use in radiation therapy clinical trials with minimal effort. This patient-specific plan quality control resulted in improvements on protocol-specific dosimetric endpoints.
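For the dosimetric cutpoints named above, a minimal sketch of how V10/V20 for pelvic bone marrow (percentage of the structure receiving at least 10/20 Gy) and an absolute-volume bowel metric could be computed from per-voxel doses; the dose arrays and voxel size are hypothetical.

```python
import numpy as np

def v_percent(dose_gy: np.ndarray, threshold_gy: float) -> float:
    """Percentage of structure voxels receiving at least threshold_gy."""
    return 100.0 * np.mean(dose_gy >= threshold_gy)

rng = np.random.default_rng(0)
pbm_dose = rng.uniform(0, 50, size=10_000)    # hypothetical per-voxel PBM doses (Gy)
bowel_dose = rng.uniform(0, 50, size=8_000)   # hypothetical per-voxel bowel doses (Gy)
voxel_volume_cc = 0.125                       # assumed (5 mm)^3 voxels

print(f"PBM V10 = {v_percent(pbm_dose, 10):.1f}%, V20 = {v_percent(pbm_dose, 20):.1f}%")
# Bowel V40 is an absolute volume: voxel count above threshold times voxel volume.
v40_cc = np.count_nonzero(bowel_dose >= 40) * voxel_volume_cc
print(f"Bowel V40 = {v40_cc:.1f} cc")
```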
Zhao, Hairong; Qiao, Weichang; Zhang, Rui; Cui, Peng; Hou, Fanglin; Zhang, Wenli
2018-02-01
The aim of this study was to validate the PG-800A36 automatic wrist blood pressure monitor according to the European Society of Hypertension International Protocol (ESH-IP) revision 2010 and the British Hypertension Society (BHS) protocols. A total of 33 participants were initially included on the basis of the ESH-IP, followed by examination of 85 participants according to the BHS protocol. The procedures and analysis methods of the protocols were followed precisely with left arm/wrist sequential measurements by two trained observers using a mercury sphygmomanometer and one supervisor using the device. The device passed the ESH-IP with an average difference of 1.45±6.46 mmHg for systolic blood pressure and 1.25±5.10 mmHg for diastolic blood pressure. Furthermore, the A/A grade of the BHS protocol was achieved with an average difference of 1.84±6.94 mmHg for systolic blood pressure and 1.15±6.49 mmHg for diastolic blood pressure, and thus, the device also fulfilled the requirements of the Association for the Advancement of Medical Instrumentation. The Pangao PG-800A36 passed the requirements of the ESH-IP revision 2010 and achieved the A/A grade of the BHS protocol, which can be recommended for self-measurement in the general population.
Ng, S S W; Lak, D C C; Lee, S C K; Ng, P P K
2015-03-01
Occupational therapists play a major role in the assessment and referral of clients with severe mental illness for supported employment. Nonetheless, there is scarce literature about the content and predictive validity of the process. In addition, the criteria of successful job matching have not been analysed and job supervisors have relied on experience rather than objective standards in recruitment. This study aimed to explore the profile of successful clients working in 'shop sales' in a supportive environment using a neurocognitive assessment protocol, and to validate the protocol against the 'internal standards' of the job supervisors. This was a concurrent validation study of criterion-related scales for a single job type. The subjective ratings from the supervisors were concurrently validated against the results of neurocognitive assessment of intellectual function and work-related cognitive behaviour. A regression model was established for clients who succeeded or failed in employment using supervisors' ratings and a cutoff value of 10.5 for the Performance Fitness Rating Scale (R(2) = 0.918, F[41] = 3.794, p = 0.003). A classification and regression tree (CART) was also plotted to identify the profile of cases, with an overall accuracy of 0.861 (relative error, 0.26). The use of both inferential statistics and data-mining techniques enables the decision tree of neurocognitive assessments to be more readily applied by therapists in vocational rehabilitation, and thus directly improves the efficiency and efficacy of the process.
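A minimal sketch of a classification-and-regression-tree step like the one reported (overall accuracy 0.861); the feature set and client data below are hypothetical, and scikit-learn is an assumed tool rather than the software used in the study.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical features per client: [intellectual-function score,
# work-related cognitive-behaviour score, performance fitness rating].
X = np.array([[95, 70, 12], [88, 55, 9], [110, 80, 14], [72, 40, 7],
              [101, 66, 11], [85, 60, 10], [120, 85, 15], [78, 45, 8]])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])   # 1 = succeeded in shop-sales employment

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print("training accuracy:", tree.score(X, y))
```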
Ovarian Damage in Young Premenopausal Women Undergoing Chemotherapy for Cancer
2012-03-16
Leukemia; Long-term Effects Secondary to Cancer Therapy in Adults; Long-term Effects Secondary to Cancer Therapy in Children; Lymphoma; Sexual Dysfunction and Infertility; Sexuality and Reproductive Issues; Unspecified Adult Solid Tumor, Protocol Specific; Unspecified Childhood Solid Tumor, Protocol Specific
The validity of the ActiPed for physical activity monitoring.
Brown, D K; Grimwade, D; Martinez-Bussion, D; Taylor, M J D; Gladwell, V F
2013-05-01
The ActiPed (FitLinxx) is a uniaxial accelerometer, which objectively measures physical activity and uploads the data wirelessly to a website, allowing participants and researchers to view activity levels remotely. The aim was to validate the ActiPed's step count, distance travelled and activity time against direct observation, and further to compare it against a pedometer (YAMAX), an accelerometer (ActiGraph) and the manufacturer's guidelines. Twenty-two participants, aged 28±7 years, undertook 4 protocols, including walking on different surfaces and an incremental running protocol (from 2 mph to 8 mph). Bland-Altman plots allowed comparison of direct observation against ActiPed estimates. For step count, the ActiPed showed a low % bias in all protocols: walking on a treadmill (-1.30%), incremental treadmill protocol (-1.98%), walking over grass (-1.67%), and walking over concrete (-0.93%). When differentiating between walking and running step counts, the ActiPed showed a % bias of 4.10% and -6.30%, respectively. The ActiPed showed >95% accuracy for distance and duration estimations overall, although it underestimated distance (p<0.01) for walking over grass and concrete. Overall, the ActiPed showed acceptable levels of accuracy comparable to previously validated pedometers and accelerometers. This accuracy, combined with the simple and informative remote gathering of data, suggests that the ActiPed could be a useful tool in objective physical activity monitoring. © Georg Thieme Verlag KG Stuttgart · New York.
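A minimal sketch of the percentage-bias and Bland-Altman quantities used to compare ActiPed step counts with direct observation; the paired step counts below are hypothetical, not the study data.

```python
import numpy as np

# Hypothetical paired step counts, one pair per trial.
observed = np.array([520, 610, 480, 700, 655])   # direct observation
actiped  = np.array([512, 600, 478, 688, 640])   # device estimate

diff = actiped - observed
pct_bias = 100.0 * np.mean(diff / observed)      # negative values indicate underestimation
bias = np.mean(diff)                             # Bland-Altman mean difference
sd = np.std(diff, ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)       # 95% limits of agreement

print(f"% bias = {pct_bias:.2f}%, limits of agreement = {loa[0]:.1f} to {loa[1]:.1f} steps")
```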
Engkvist, I L; Hagberg, M; Wigaeus-Hjelm, E; Menckel, E; Ekenvall, L
1995-06-01
No documented strategy, including preventive strategies, for the systematic investigation of overexertion back accidents among nursing personnel has yet been published. One aim of the present study was to develop standardized instruments for the systematic investigation of back accidents among nursing personnel in order to develop preventive strategies. Another aim was to produce a screening tool that could easily be used for identifying potential overexertion back accident hazards. Two structured interview protocols were developed, one for the injured person and one for the supervisor. An ergonomics checklist was designed for the most important spaces according to accident statistics: patient's room, corridor, and toilet, with an additional checklist for 'other space', e.g. X-ray and treatment rooms. The instruments were developed through frequent discussions and adjustments in a task force of researchers and occupational health personnel. The protocols were tested in two steps before a final version was established. The construct validity and interobserver reliability of the checklist were tested by ten ergonomists, who checked a patient's room, a toilet and a corridor with some known hazards. The construct validity agreement was 90% for 19 of the 26 items in the checklist. The interobserver reliability figures were the same as those for validity for all items in the checklist. The interview protocols and checklist appear to be suitable for the systematic investigation of overexertion back accidents.
Guo, Wan-Gang; Li, Bing-Ling; He, Yong; Xue, Yu-Sheng; Wang, Hai-Yan; Zheng, Qiang-Sun; Xiang, Ding-Cheng
2014-08-01
To validate the Andon KD-5917 automatic upper arm blood pressure monitor according to the European Society of Hypertension International Protocol revision 2010. Sequential same-left-arm measurements of systolic blood pressure (SBP) and diastolic blood pressure (DBP) were obtained in 33 participants using the mercury sphygmomanometer and the test device. According to the validation protocol, 99 pairs of test device and reference blood pressure measurements (three pairs for each of the 33 participants) were obtained in the study. The device produced 73, 98, and 99 measurements within 5, 10, and 15 mmHg for SBP and 86, 98, and 99 for DBP, respectively. The mean ± SD device-observer difference was 3.07 ± 3.68 mmHg for SBP and -0.89 ± 3.72 mmHg for DBP. The number of participants with at least two of their three device-observer differences within 5 mmHg was 26 for SBP and 29 for DBP, and no participant had all three differences outside 5 mmHg. The Andon KD-5917 automatic upper arm blood pressure monitor can be recommended for clinical use and self-measurement in an adult population on the basis of the European Society of Hypertension International Protocol revision 2010.
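A minimal sketch of the ESH-IP 2010 counting step reported above (device-observer differences within 5, 10 and 15 mmHg across 99 paired readings); the simulated differences are hypothetical, not the study data.

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical device-minus-observer differences (mmHg) for 99 paired SBP readings.
sbp_diff = rng.normal(loc=3.1, scale=3.7, size=99)

within = {t: int(np.count_nonzero(np.abs(sbp_diff) <= t)) for t in (5, 10, 15)}
print("readings within 5/10/15 mmHg:", within[5], within[10], within[15])
print(f"mean ± SD difference: {sbp_diff.mean():.2f} ± {sbp_diff.std(ddof=1):.2f} mmHg")
```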
Niknafs, Pedram; Norouzi, Elahe; Bahman Bijari, Bahareh; Baneshi, Mohammad Reza
2015-01-01
Neonates with respiratory distress syndrome (RDS) who are treated according to the INSURE protocol require arterial blood gas (ABG) analysis to decide on appropriate management. We conducted this study to investigate the validity of pulse oximetry instead of frequent ABG analysis in the evaluation of these patients. Of a total of 193 blood samples obtained from 30 neonates <1500 grams with RDS, 7.2% were found to have one or more of the following: acidosis, hypercapnia, or hypoxemia. We found that pulse oximetry had good validity for detecting hyperoxemia, allowing patients to be managed appropriately without blood gas analysis. However, the validity of pulse oximetry was not good enough to detect acidosis, hypercapnia, and hypoxemia. PMID:25999627
Latency correction of event-related potentials between different experimental protocols
NASA Astrophysics Data System (ADS)
Iturrate, I.; Chavarriaga, R.; Montesano, L.; Minguez, J.; Millán, JdR
2014-06-01
Objective. A fundamental issue in EEG event-related potentials (ERPs) studies is the amount of data required to have an accurate ERP model. This also impacts the time required to train a classifier for a brain-computer interface (BCI). This issue is mainly due to the poor signal-to-noise ratio and the large fluctuations of the EEG caused by several sources of variability. One of these sources is directly related to the experimental protocol or application designed, and may affect the amplitude or latency of ERPs. This usually prevents BCI classifiers from generalizing among different experimental protocols. In this paper, we analyze the effect of the amplitude and the latency variations among different experimental protocols based on the same type of ERP. Approach. We present a method to analyze and compensate for the latency variations in BCI applications. The algorithm has been tested on two widely used ERPs (P300 and observation error potentials), in three experimental protocols in each case. We report the ERP analysis and single-trial classification. Main results. The results obtained show that the designed experimental protocols significantly affect the latency of the recorded potentials but not the amplitudes. Significance. These results show how the use of latency-corrected data can be used to generalize the BCIs, reducing the calibration time when facing a new experimental protocol.
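The paper's latency-correction idea can be illustrated with a simple template-matching scheme: estimate a protocol's latency shift by cross-correlating its average ERP against a reference template, then shift the epochs back before classification. This is only a sketch of that general approach under stated assumptions, not the authors' algorithm; the array names and the allowed shift range are invented for illustration.

```python
import numpy as np

def estimate_shift(erp, template, max_shift):
    """Return the lag (in samples) that best aligns `erp` with `template`,
    searched over [-max_shift, +max_shift] via cross-correlation."""
    lags = range(-max_shift, max_shift + 1)
    scores = [np.dot(np.roll(erp, -lag), template) for lag in lags]
    return lags[int(np.argmax(scores))]

def latency_correct(epochs, template, max_shift=25):
    """Shift every single-trial epoch (trials x samples) so the protocol's
    average response aligns with the reference template."""
    lag = estimate_shift(epochs.mean(axis=0), template, max_shift)
    return np.roll(epochs, -lag, axis=1), lag
```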
Distance-Based and Low Energy Adaptive Clustering Protocol for Wireless Sensor Networks
Gani, Abdullah; Anisi, Mohammad Hossein; Ab Hamid, Siti Hafizah; Akhunzada, Adnan; Khan, Muhammad Khurram
2016-01-01
A wireless sensor network (WSN) comprises small sensor nodes with limited energy capabilities. The power constraints of WSNs necessitate efficient energy utilization to extend the overall network lifetime of these networks. We propose a distance-based and low-energy adaptive clustering (DISCPLN) protocol to streamline the green issue of efficient energy utilization in WSNs. We also enhance our proposed protocol into the multi-hop-DISCPLN protocol to increase the lifetime of the network in terms of high throughput with minimum delay time and packet loss. We also propose the mobile-DISCPLN protocol to maintain the stability of the network. The modelling and comparison of these protocols with their corresponding benchmarks exhibit promising results. PMID:27658194
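As a rough illustration of distance- and energy-aware cluster-head selection (not the DISCPLN algorithm itself, whose details are given in the paper), the sketch below scores candidate nodes by residual energy and proximity to the base station and picks the best-scoring node per cluster; the weighting scheme is an assumption made for illustration.

```python
import math

def pick_cluster_head(nodes, base_station, w_energy=0.7, w_dist=0.3):
    """nodes: list of dicts {'id', 'x', 'y', 'energy'}; returns the id of the
    node with the best energy/distance trade-off (illustrative weighting)."""
    def score(n):
        dist = math.hypot(n['x'] - base_station[0], n['y'] - base_station[1])
        return w_energy * n['energy'] - w_dist * dist
    return max(nodes, key=score)['id']

nodes = [{'id': 1, 'x': 10, 'y': 5, 'energy': 0.9},
         {'id': 2, 'x': 2, 'y': 1, 'energy': 0.4}]
print(pick_cluster_head(nodes, base_station=(0, 0)))
```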
STANDARDIZATION AND VALIDATION OF MICROBIOLOGICAL METHODS FOR EXAMINATION OF BIOSOLIDS
The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within a complex matrix. Implications of ...
MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION
The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...
Validity Issues in Clinical Assessment.
ERIC Educational Resources Information Center
Foster, Sharon L.; Cone, John D.
1995-01-01
Validation issues that arise with measures of constructs and behavior are addressed with reference to general reasons for using assessment procedures in clinical psychology. A distinction is made between the representational phase of validity assessment and the elaborative validity phase in which the meaning and utility of scores are examined.…
Harrison, C H; Laussen, P C
2008-05-01
Donation after cardiac death (DCD) remains controversial in some pediatric institutions. An evidence-based, consensus-building approach to setting institutional policy about DCD can address the controversy openly and identify common ground. To resolve an extended internal debate regarding DCD policy at Children's Hospital Boston, a multidisciplinary task force was commissioned to engage in fact finding and deliberations about clinical and ethical issues in pediatric DCD, and attempt to reach consensus regarding the development of a protocol for pediatric DCD. Issues examined included values and attitudes of staff, families, and the public; number of possible candidates for DCD at the hospital; risks and benefits for child donors and their families; and research needs. Consensus was reached on a set of foundational ethical principles for pediatric DCD. With assistance from the local organ procurement organization (OPO), the task force developed a protocol for pediatric kidney DCD which most members believed could meet all the requirements of the foundational ethical principles. Complete consensus on the use of the protocol was not reached; however, almost all members supported initiation of kidney DCD for older pediatric patients who had wished to be organ donors. The hospital has implemented the protocol on this limited basis and established a process for considering proposals to expand the eligible donor population and include other organs.
Beuscart, Jean-Baptiste; Dalleur, Olivia; Boland, Benoit; Thevelin, Stefanie; Knol, Wilma; Cullinan, Shane; Schneider, Claudio; O'Mahony, Denis; Rodondi, Nicolas; Spinewine, Anne
2017-01-01
Medication review has been advocated to address the challenge of polypharmacy in older patients, yet there is no consensus on how best to evaluate its efficacy. Heterogeneity of outcomes reported in clinical trials can hinder the comparison of clinical trial findings in systematic reviews. Moreover, the outcomes that matter most to older patients might be under-reported or disregarded altogether. A core outcome set can address this issue as it defines a minimum set of outcomes that should be reported in all clinical trials in any particular field of research. As part of the European Commission-funded project, called OPtimising thERapy to prevent Avoidable hospital admissions in the Multimorbid elderly, this paper describes the methods used to develop a core outcome set for clinical trials of medication review in older patients with multimorbidity. The study was designed in several steps. First, a systematic review established which outcomes were measured in published and ongoing clinical trials of medication review in older patients. Second, we undertook semistructured interviews with older patients and carers aimed at identifying additional relevant outcomes. Then, a multilanguage European Delphi survey adapted to older patients was designed. The international Delphi survey was conducted with older patients, health care professionals, researchers, and clinical experts in geriatric pharmacotherapy to validate outcomes to be included in the core outcome set. Consensus meetings were conducted to validate the results. We present the method for developing a core outcome set for medication review in older patients with multimorbidity. This study protocol could be used as a basis to develop core outcome sets in other fields of geriatric research.
The basophil activation test in the diagnosis of allergy: technical issues and critical factors.
Sturm, G J; Kranzelbinder, B; Sturm, E M; Heinemann, A; Groselj-Strele, A; Aberer, W
2009-09-01
The basophil activation test (BAT) is a widely validated and reliable tool, especially for the diagnosis of Hymenoptera venom allergy. Nevertheless, several pitfalls have to be considered, and outcomes may differ because of diverse in-house protocols and commercially available kits. We aimed to identify the factors that may influence results of the CD63-based BAT. Basophil responses to monoclonal anti-IgE (clone E124.2.8) and bee and wasp venom were determined by BAT based on CD63. The effect of stimulating factors such as IL-3, cytochalasin B, and prewarming of the samples was investigated. Furthermore, we compared two different flow cytometer systems and evaluated the influence of storage time, different staining protocols and anti-allergic drugs on the test results. Interleukin-3 enhanced the reactivity of basophils at 300 pM, but not at 75 and 150 pM. Prewarming of samples and reagents did not affect basophil reactivity. CD63 expression assayed after storage times of up to 48 h showed that basophil reactivity already started to decline after 4 h. Basophils stained with HLA-DR-PC5 and CD123-PE antibodies and gated as HLA-DR(neg)/CD123(pos) cells showed the highest reactivity. No effect on test outcomes was observed at therapeutic doses of dimetindene and desloratadine. Finally, slight differences in the percentage of activated basophils, depending on the cytometer system used, were found. The basophil activation test should be performed as early as possible after taking the blood sample, preferably within 4 h. In contrast to the skin test, BAT can be performed in patients undergoing treatment with antihistamines. Because of the multiple influencing factors, BAT should be performed only at validated laboratories.
2014-01-01
Draft Joint Test Protocol – Validation of Pretreatments for Steel Armor (report excerpt; only table-of-contents fragments survive). Listed sections include Rising Step Load (Stress Corrosion Cracking), Cost Assessment, Schedule of Activities, Management and Staffing, References, and Appendix A (Joint Test Protocol). Testing is performed in accordance with the tests delineated in the joint test protocol (JTP) provided in Appendix A against stated functional performance objectives.
ERIC Educational Resources Information Center
Laben, Joyce
2012-01-01
With the implementation of RTI, educators are attempting to find models that are the best fit for their schools. The problem solving and standard protocol models are the two most common. This study of 65 students examines a new model, the dynamic skills protocol implemented in an elementary school starting in their fourth quarter of kindergarten…
Rimland, Joseph M; Abraha, Iosief; Luchetta, Maria Laura; Cozzolino, Francesco; Orso, Massimiliano; Cherubini, Antonio; Dell'Aquila, Giuseppina; Chiatti, Carlos; Ambrosio, Giuseppe; Montedori, Alessandro
2016-06-01
Healthcare databases are useful sources to investigate the epidemiology of chronic obstructive pulmonary disease (COPD), to assess longitudinal outcomes in patients with COPD, and to develop disease management strategies. However, in order to constitute a reliable source for research, healthcare databases need to be validated. The aim of this protocol is to perform the first systematic review of studies reporting the validation of codes related to COPD diagnoses in healthcare databases. MEDLINE, EMBASE, Web of Science and the Cochrane Library databases will be searched using appropriate search strategies. Studies that evaluated the validity of COPD codes (such as the International Classification of Diseases 9th Revision and 10th Revision system, the Read codes system or the International Classification of Primary Care) in healthcare databases will be included. Inclusion criteria will be: (1) the presence of a reference standard case definition for COPD; (2) the presence of at least one test measure (eg, sensitivity, positive predictive values, etc); and (3) the use of a healthcare database (including administrative claims databases, electronic healthcare databases or COPD registries) as a data source. Pairs of reviewers will independently abstract data using standardised forms and will assess quality using a checklist based on the Standards for Reporting of Diagnostic Accuracy (STARD) criteria. This systematic review protocol has been produced in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Protocol (PRISMA-P) 2015 statement. Ethics approval is not required. Results of this study will be submitted to a peer-reviewed journal for publication. The results from this systematic review will be used for outcome research on COPD and will serve as a guide to identify appropriate case definitions of COPD, and reference standards, for researchers involved in validating healthcare databases. PROSPERO registration number: CRD42015029204.
A Weak Value Based QKD Protocol Robust Against Detector Attacks
NASA Astrophysics Data System (ADS)
Troupe, James
2015-03-01
We propose a variation of the BB84 quantum key distribution protocol that utilizes the properties of weak values to ensure the validity of the quantum bit error rate estimates used to detect an eavesdropper. The protocol is shown theoretically to be secure against recently demonstrated attacks utilizing detector blinding and control and should also be robust against all detector-based hacking. Importantly, the new protocol promises to achieve this additional security without negatively impacting the secure key generation rate as compared to that originally promised by the standard BB84 scheme. Implementation of the weak measurements needed by the protocol should be very feasible using standard quantum optical techniques.
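For readers unfamiliar with the baseline, the quantum bit error rate (QBER) check that the weak-value variant is designed to protect works roughly as follows in standard BB84: after basis sifting, a random subset of key bits is compared publicly and the observed error fraction decides whether an eavesdropper may be present. The sketch below simulates only this classical sifting-and-estimation step (no weak values); the noise level and the ~11% abort threshold mentioned in the comment are commonly quoted figures used here as assumptions.

```python
import random

def bb84_sift_and_qber(n_raw=10000, sample_fraction=0.2, error_prob=0.02):
    """Simulate BB84 basis sifting and QBER estimation over a noisy channel."""
    alice_bits  = [random.randint(0, 1) for _ in range(n_raw)]
    alice_bases = [random.randint(0, 1) for _ in range(n_raw)]
    bob_bases   = [random.randint(0, 1) for _ in range(n_raw)]
    # When bases agree, Bob's bit matches Alice's apart from channel errors;
    # mismatched-basis results are discarded during sifting anyway.
    bob_bits = [b if random.random() > error_prob else 1 - b for b in alice_bits]
    sifted = [(a, o) for a, o, ba, bb
              in zip(alice_bits, bob_bits, alice_bases, bob_bases) if ba == bb]
    sample = random.sample(range(len(sifted)), int(sample_fraction * len(sifted)))
    qber = sum(sifted[i][0] != sifted[i][1] for i in sample) / len(sample)
    return qber, len(sifted)

qber, key_len = bb84_sift_and_qber()
print(f"sifted key length={key_len}, estimated QBER={qber:.3f}")  # abort if QBER exceeds ~0.11
```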
A secure medical data exchange protocol based on cloud environment.
Chen, Chin-Ling; Yang, Tsai-Tung; Shih, Tzay-Farn
2014-09-01
In recent years, health care technologies such as electronic medical records have matured and can be stored easily. However, making medical resources more convenient to access remains a concern. Although many studies have discussed medical systems, these systems face many security challenges, the most important being patients' privacy. Therefore, we propose a secure medical data exchange protocol based on a cloud environment. Our scheme exploits the characteristics of mobile devices, allowing people to use medical resources in the cloud environment and seek medical advice conveniently.
Strategy for Developing Expert-System-Based Internet Protocols (TCP/IP)
NASA Technical Reports Server (NTRS)
Ivancic, William D.
1997-01-01
The Satellite Networks and Architectures Branch of NASA's Lewis Research Center is addressing the issue of seamless interoperability of satellite networks with terrestrial networks. One of the major issues is improving reliable transmission protocols such as TCP over long-latency and error-prone links. Many tuning parameters are available to enhance the performance of TCP, including segment size, timers, and window sizes. There are also numerous congestion avoidance algorithms such as slow start, selective retransmission, and selective acknowledgment that are utilized to improve performance. This paper provides a strategy to characterize the performance of TCP relative to various parameter settings in a variety of network environments (i.e., LAN, WAN, wireless, satellite, and IP over ATM). This information can then be utilized to develop expert-system-based Internet protocols.
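Several of the tuning knobs mentioned (window/buffer sizes, delayed small-segment behavior) are exposed directly through the standard sockets API. The fragment below is a generic illustration of adjusting them in Python, not part of the strategy described in the paper; the buffer size shown is an arbitrary example, and real satellite-link tuning would be driven by the bandwidth-delay product of the link.

```python
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# Enlarge send/receive buffers (the effective window) for a long-latency path;
# 1 MiB here is an arbitrary example, not a recommendation.
sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 1 << 20)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 1 << 20)
# Disable Nagle's algorithm so small segments are not held back.
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
print(sock.getsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF))
```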
Ethics of social media research: common concerns and practical considerations.
Moreno, Megan A; Goniu, Natalie; Moreno, Peter S; Diekema, Douglas
2013-09-01
Social media Websites (SMWs) are increasingly popular research tools. These sites provide new opportunities for researchers, but raise new challenges for Institutional Review Boards (IRBs) that review these research protocols. As of yet, there is little-to-no guidance regarding how an IRB should review the studies involving SMWs. The purpose of this article was to review the common risks inherent in social media research and consider how researchers can consider these risks when writing research protocols. We focused this article on three common research approaches: observational research, interactive research, and survey/interview research. Concomitant with these research approaches, we gave particular attention to the issues pertinent to SMW research, including privacy, consent, and confidentiality. After considering these challenges, we outlined key considerations for both researchers and reviewers when creating or reviewing SMW IRB protocols. Our goal in this article was to provide a detailed examination of relevant ethics and regulatory issues for both researchers and those who review their protocols.
A report on FY06 IPv6 deployment activities and issues at Sandia National Laboratories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tolendino, Lawrence F.; Eldridge, John M.; Hu, Tan Chang
2006-06-01
Internet Protocol version 4 (IPv4) has been a mainstay of both the Internet and corporate networks for delivering network packets to the desired destination. However, rapid proliferation of network appliances, evolution of corporate networks, and the expanding Internet has begun to stress the limitations of the protocol. Internet Protocol version 6 (IPv6) is the replacement protocol that overcomes the constraints of IPv4. IPv6 deployment in government network backbones has been mandated to occur by 2008. This paper explores the readiness of the Sandia National Laboratories' network backbone to support IPv6, the issues that must be addressed before a deployment begins, and recommends the next steps to take to comply with government mandates. The paper describes a joint work effort of the Sandia National Laboratories ASC WAN project team and members of the System Analysis & Trouble Resolution and Network System Design & Implementation Departments.
Issues Validation: A New Environmental Scanning Technique for Family Life Educators.
ERIC Educational Resources Information Center
Weigel, Randy R.; And Others
1992-01-01
A three-state study used Issues Validation, an environmental scanning process for family life educators that combines literature reviews, professional and public opinion, and survey research to identify issues facing families and youth. Samples of residents, local advisory committees, and community professionals ranked 30 issues facing families and…
Carvalho, Humberto M; Gonçalves, Carlos E; Grosgeorge, Bernard; Paes, Roberto R
2017-01-01
The study examined the validity of the Line Drill test (LD) in male adolescent basketball players (10-15 years). Sensitiveness of the LD to changes in performance across a training and competition season (4 months) was also considered. Age, maturation, body size and LD were measured (n = 57). Sensitiveness of the LD was examined pre- and post-competitive season in a sub-sample (n = 44). The time at each of the four shuttle sprints of the LD (i.e. four stages) was modelled with Bayesian multilevel models. We observed very large correlation of performance at stage 4 (full LD protocol) with stage 3, but lower correlations with the early LD stages. Players' performance by somatic maturity differed substantially only when considering full LD protocol performance. Substantial improvements in all stages of the protocol were observed across the 4-month competitive season. The LD protocol should be shortened by the last full court shuttle sprint, remaining sensitive to training exposure, and independent of maturity status and body size.
Toro A, Richard; Campos, Claudia; Molina, Carolina; Morales S, Raul G E; Leiva-Guzmán, Manuel A
2015-09-01
A critical analysis of Chile's National Air Quality Information System (NAQIS) is presented, focusing on particulate matter (PM) measurement. This paper examines the complexity, availability and reliability of monitoring station information, the implementation of control systems, the quality assurance protocols of the monitoring station data and the reliability of the measurement systems in areas highly polluted by particulate matter. From information available on the NAQIS website, it is possible to confirm that the PM2.5 (PM10) data available on the site correspond to 30.8% (69.2%) of the total information available from the monitoring stations. There is a lack of information regarding the measurement systems used to quantify air pollutants, most of the available data registers contain gaps, almost all of the information is categorized as "preliminary information" and neither standard operating procedures (operational and validation) nor assurance audits or quality control of the measurements are reported. In contrast, events that cause saturation of the monitoring detectors located in northern and southern Chile have been observed using beta attenuation monitoring. In these cases, it can only be concluded that the PM content is equal to or greater than the saturation concentration registered by the monitors and that the air quality indexes obtained from these measurements are underestimated. This occurrence has been observed in 12 (20) public and private stations where PM2.5 (PM10) is measured. The shortcomings of the NAQIS data have important repercussions for the conclusions obtained from the data and for how the data are used. However, these issues represent opportunities for improving the system to widen its use, incorporate comparison protocols between equipment, install new stations and standardize the control system and quality assurance.
Use of Flowchart for Automation of Clinical Protocols in mHealth.
Dias, Karine Nóra; Welfer, Daniel; Cordeiro d'Ornellas, Marcos; Pereira Haygert, Carlos Jesus; Dotto, Gustavo Nogara
2017-01-01
Before healthcare professionals can use mobile applications, someone with software development expertise must build them. In healthcare institutions, health professionals use clinical protocols to guide care, and these documents are sometimes computerized as mobile applications to assist them. This work proposes using flowcharts to describe clinical protocols so that mobile applications to assist health professionals can be generated automatically. The purpose of this research is to enable health professionals to develop applications from descriptions of their own clinical protocols. We developed a web system that automates clinical protocols for the Android platform and validated it with two clinical protocols used in a Brazilian hospital. Preliminary results of the developed architecture demonstrate the feasibility of this approach.
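The core idea — a clinical protocol expressed as a flowchart that a generic application can execute — can be sketched independently of the authors' web system. Below is a minimal, hypothetical representation of a protocol as a graph of question and action nodes, walked interactively; the node names and clinical content are invented purely for illustration.

```python
# Hypothetical flowchart: each node is either a question with yes/no branches
# or a terminal action. A generic app could render such a graph as screens.
protocol = {
    "start":        {"ask": "Fever above 38 C?", "yes": "check_rash", "no": "routine_care"},
    "check_rash":   {"ask": "Rash present?", "yes": "refer", "no": "routine_care"},
    "refer":        {"action": "Refer to physician immediately."},
    "routine_care": {"action": "Follow routine care guideline."},
}

def run(flow, node="start"):
    """Walk the flowchart from `node`, asking questions until an action is reached."""
    while "ask" in flow[node]:
        answer = input(flow[node]["ask"] + " (y/n) ").strip().lower()
        node = flow[node]["yes" if answer.startswith("y") else "no"]
    print(flow[node]["action"])

# run(protocol)  # interactive walk-through
```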
Chabirand, Aude; Loiseau, Marianne; Renaudin, Isabelle; Poliakoff, Françoise
2017-01-01
A working group established in the framework of the EUPHRESCO European collaborative project aimed to compare and validate diagnostic protocols for the detection of "Flavescence dorée" (FD) phytoplasma in grapevines. Seven molecular protocols were compared in an interlaboratory test performance study where each laboratory had to analyze the same panel of samples consisting of DNA extracts prepared by the organizing laboratory. The tested molecular methods consisted of universal and group-specific real-time and end-point nested PCR tests. Different statistical approaches were applied to this collaborative study. Firstly, there was the standard statistical approach consisting in analyzing samples which are known to be positive and samples which are known to be negative and reporting the proportion of false-positive and false-negative results to respectively calculate diagnostic specificity and sensitivity. This approach was supplemented by the calculation of repeatability and reproducibility for qualitative methods based on the notions of accordance and concordance. Other new approaches were also implemented, based, on the one hand, on the probability of detection model, and, on the other hand, on Bayes' theorem. These various statistical approaches are complementary and give consistent results. Their combination, and in particular, the introduction of new statistical approaches give overall information on the performance and limitations of the different methods, and are particularly useful for selecting the most appropriate detection scheme with regards to the prevalence of the pathogen. Three real-time PCR protocols (methods M4, M5 and M6 respectively developed by Hren (2007), Pelletier (2009) and under patent oligonucleotides) achieved the highest levels of performance for FD phytoplasma detection. This paper also addresses the issue of indeterminate results and the identification of outlier results. The statistical tools presented in this paper and their combination can be applied to many other studies concerning plant pathogens and other disciplines that use qualitative detection methods.
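The statistical quantities combined in the study — diagnostic sensitivity and specificity estimated from known positives and negatives, and Bayes' theorem to turn them into post-test probabilities at a given prevalence — can be computed in a few lines. The sketch below is a generic illustration of those calculations, not the study's analysis code; the counts and prevalence used in the example are invented.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Diagnostic sensitivity and specificity from counts on known samples."""
    return tp / (tp + fn), tn / (tn + fp)

def predictive_values(sens, spec, prevalence):
    """Bayes' theorem: probability of true infection given a positive result (PPV)
    and of absence given a negative result (NPV), at the stated prevalence."""
    ppv = sens * prevalence / (sens * prevalence + (1 - spec) * (1 - prevalence))
    npv = spec * (1 - prevalence) / (spec * (1 - prevalence) + (1 - sens) * prevalence)
    return ppv, npv

sens, spec = sensitivity_specificity(tp=48, fn=2, tn=45, fp=5)   # illustrative counts
print(predictive_values(sens, spec, prevalence=0.05))
```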
Predicting implementation from organizational readiness for change: a study protocol
2011-01-01
Background There is widespread interest in measuring organizational readiness to implement evidence-based practices in clinical care. However, there are a number of challenges to validating organizational measures, including inferential bias arising from the halo effect and method bias - two threats to validity that, while well-documented by organizational scholars, are often ignored in health services research. We describe a protocol to comprehensively assess the psychometric properties of a previously developed survey, the Organizational Readiness to Change Assessment. Objectives Our objective is to conduct a comprehensive assessment of the psychometric properties of the Organizational Readiness to Change Assessment incorporating methods specifically to address threats from halo effect and method bias. Methods and Design We will conduct three sets of analyses using longitudinal, secondary data from four partner projects, each testing interventions to improve the implementation of an evidence-based clinical practice. Partner projects field the Organizational Readiness to Change Assessment at baseline (n = 208 respondents; 53 facilities), and prospectively assesses the degree to which the evidence-based practice is implemented. We will conduct predictive and concurrent validities using hierarchical linear modeling and multivariate regression, respectively. For predictive validity, the outcome is the change from baseline to follow-up in the use of the evidence-based practice. We will use intra-class correlations derived from hierarchical linear models to assess inter-rater reliability. Two partner projects will also field measures of job satisfaction for convergent and discriminant validity analyses, and will field Organizational Readiness to Change Assessment measures at follow-up for concurrent validity (n = 158 respondents; 33 facilities). Convergent and discriminant validities will test associations between organizational readiness and different aspects of job satisfaction: satisfaction with leadership, which should be highly correlated with readiness, versus satisfaction with salary, which should be less correlated with readiness. Content validity will be assessed using an expert panel and modified Delphi technique. Discussion We propose a comprehensive protocol for validating a survey instrument for assessing organizational readiness to change that specifically addresses key threats of bias related to halo effect, method bias and questions of construct validity that often go unexplored in research using measures of organizational constructs. PMID:21777479
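Inter-rater reliability of the kind described — intra-class correlations for respondents nested in facilities — can be illustrated with the classic one-way ANOVA estimator, ICC(1) = (MSB − MSW) / (MSB + (k − 1)·MSW). The snippet below is a generic sketch of that calculation for balanced groups, not the project's hierarchical-linear-model code; the ratings in the example are invented.

```python
import numpy as np

def icc1(groups):
    """ICC(1) from a list of equal-sized groups (e.g., respondents per facility)."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    k = len(groups[0])                      # respondents per facility (balanced design)
    grand = np.mean(np.concatenate(groups))
    msb = k * sum((g.mean() - grand) ** 2 for g in groups) / (len(groups) - 1)
    msw = sum(((g - g.mean()) ** 2).sum() for g in groups) / (len(groups) * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

print(icc1([[3.2, 3.5, 3.1], [4.0, 4.2, 3.9], [2.8, 3.0, 2.7]]))  # illustrative ratings
```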
8 CFR 212.1 - Documentary requirements for nonimmigrants.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Cards and a valid Taiwan passport with a valid re-entry permit issued by the Taiwan Ministry of Foreign... Taiwan National Identity Card and a valid Taiwan passport with a valid re-entry permit issued by the... the United States from contiguous territory or adjacent islands at a land or sea port-of-entry. A...
8 CFR 212.1 - Documentary requirements for nonimmigrants.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Cards and a valid Taiwan passport with a valid re-entry permit issued by the Taiwan Ministry of Foreign... Taiwan National Identity Card and a valid Taiwan passport with a valid re-entry permit issued by the... the United States from contiguous territory or adjacent islands at a land or sea port-of-entry. A...
8 CFR 212.1 - Documentary requirements for nonimmigrants.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Cards and a valid Taiwan passport with a valid re-entry permit issued by the Taiwan Ministry of Foreign... Taiwan National Identity Card and a valid Taiwan passport with a valid re-entry permit issued by the... the United States from contiguous territory or adjacent islands at a land or sea port-of-entry. A...
33 CFR 173.35 - Coast Guard validation sticker.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 33 Navigation and Navigable Waters 2 2011-07-01 2011-07-01 false Coast Guard validation sticker... Guard validation sticker. No person may use a vessel except a vessel exempted in § 173.13 that has a number issued by the Coast Guard unless it has the validation sticker issued with the certificate of...
33 CFR 173.35 - Coast Guard validation sticker.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 33 Navigation and Navigable Waters 2 2014-07-01 2014-07-01 false Coast Guard validation sticker... Guard validation sticker. No person may operate a vessel except a vessel exempted in § 173.13 that has a number issued by the Coast Guard unless it has the validation sticker issued with the certificate of...
33 CFR 173.35 - Coast Guard validation sticker.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Coast Guard validation sticker... Guard validation sticker. No person may use a vessel except a vessel exempted in § 173.13 that has a number issued by the Coast Guard unless it has the validation sticker issued with the certificate of...
33 CFR 173.35 - Coast Guard validation sticker.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 33 Navigation and Navigable Waters 2 2013-07-01 2013-07-01 false Coast Guard validation sticker... Guard validation sticker. No person may operate a vessel except a vessel exempted in § 173.13 that has a number issued by the Coast Guard unless it has the validation sticker issued with the certificate of...
33 CFR 173.35 - Coast Guard validation sticker.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 33 Navigation and Navigable Waters 2 2012-07-01 2012-07-01 false Coast Guard validation sticker... Guard validation sticker. No person may operate a vessel except a vessel exempted in § 173.13 that has a number issued by the Coast Guard unless it has the validation sticker issued with the certificate of...
DESCQA: Synthetic Sky Catalog Validation Framework
NASA Astrophysics Data System (ADS)
Mao, Yao-Yuan; Uram, Thomas D.; Zhou, Rongpu; Kovacs, Eve; Ricker, Paul M.; Kalmbach, J. Bryce; Padilla, Nelson; Lanusse, François; Zu, Ying; Tenneti, Ananth; Vikraman, Vinu; DeRose, Joseph
2018-04-01
The DESCQA framework provides rigorous validation protocols for assessing the quality of simulated sky catalogs in a straightforward and comprehensive way. DESCQA enables the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. An interactive web interface is also available at portal.nersc.gov/project/lsst/descqa.
31 CFR 351.69 - When is a book-entry Series EE savings bond validly issued?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 31 Money and Finance:Treasury 2 2011-07-01 2011-07-01 false When is a book-entry Series EE savings... OF UNITED STATES SAVINGS BONDS, SERIES EE Book-Entry Series EE Savings Bonds § 351.69 When is a book-entry Series EE savings bond validly issued? A book-entry bond is validly issued when it is posted to...
31 CFR 359.54 - When is a book-entry Series I savings bonds validly issued?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 31 Money and Finance:Treasury 2 2011-07-01 2011-07-01 false When is a book-entry Series I savings... OF UNITED STATES SAVINGS BONDS, SERIES I Book-Entry Series I Savings Bonds § 359.54 When is a book-entry Series I savings bonds validly issued? A book-entry bond is validly issued when it is posted to...
31 CFR 351.69 - When is a book-entry Series EE savings bond validly issued?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false When is a book-entry Series EE... OFFERING OF UNITED STATES SAVINGS BONDS, SERIES EE Book-Entry Series EE Savings Bonds § 351.69 When is a book-entry Series EE savings bond validly issued? A book-entry bond is validly issued when it is posted...
Mass-casualty triage: time for an evidence-based approach.
Jenkins, Jennifer Lee; McCarthy, Melissa L; Sauer, Lauren M; Green, Gary B; Stuart, Stephanie; Thomas, Tamara L; Hsu, Edbert B
2008-01-01
Mass-casualty triage has developed from a wartime necessity into a civilian tool to ensure that constrained medical resources are directed at achieving the greatest good for the greatest number of people. Several primary and secondary triage tools have been developed, including Simple Triage and Rapid Treatment (START), JumpSTART, Care Flight Triage, Triage Sieve, Sacco Triage Method, Secondary Assessment of Victim Endpoint (SAVE), and Pediatric Triage Tape. Evidence to support the use of one triage algorithm over another is limited, and the development of effective triage protocols is an important research priority. The most widely recognized mass-casualty triage algorithms in use today are not evidence-based, and no studies directly address these issues in the mass-casualty setting. Furthermore, no studies have evaluated existing mass-casualty triage algorithms regarding ease of use, reliability, and validity when biological, chemical, or radiological agents are introduced. Currently, the lack of a standardized mass-casualty triage system that is well validated, reliable, and uniformly accepted remains an important gap. Future research directed at triage is recognized as a necessity, and the development of a practical, universal triage algorithm that incorporates requirements for decontamination or special precautions for infectious agents would facilitate a more organized mass-casualty medical response.
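To make the comparison concrete, the decision logic of START — the most widely cited of the algorithms above — is usually described as a short sequence of checks on ambulation, respiration, perfusion, and mental status. The sketch below encodes that commonly published textbook logic for illustration only; it is not a validated clinical tool, and the exact cut-offs and category names are assumptions drawn from the usual description rather than from this article.

```python
def start_triage(walks, breathing, breathes_after_repositioning,
                 resp_rate, cap_refill_sec, obeys_commands):
    """Illustrative encoding of the commonly described START decision sequence."""
    if walks:
        return "MINOR (green)"
    if not breathing:
        # Airway repositioned: resumed breathing is immediate, otherwise expectant.
        return "IMMEDIATE (red)" if breathes_after_repositioning else "EXPECTANT (black)"
    if resp_rate > 30:
        return "IMMEDIATE (red)"
    if cap_refill_sec > 2:          # some versions use absence of a radial pulse instead
        return "IMMEDIATE (red)"
    if not obeys_commands:
        return "IMMEDIATE (red)"
    return "DELAYED (yellow)"

print(start_triage(False, True, True, 24, 1.5, True))  # -> DELAYED (yellow)
```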
NASA Technical Reports Server (NTRS)
Imhoff, Marc L.; Rosenquist, A.; Milne, A. K.; Dobson, M. C.; Qi, J.
2000-01-01
An International workshop was held to address how remote sensing technology could be used to support the environmental monitoring requirements of the Kyoto Protocol. An overview of the issues addressed and the findings of the workshop are discussed.
Comparison of two instruments for assessing risk of postoperative nausea and vomiting.
Kapoor, Rachna; Hola, Eric T; Adamson, Robert T; Mathis, A Scott
2008-03-01
Two instruments for assessing patients' risk of postoperative nausea and vomiting (PONV) were compared. The existing protocol (protocol 1) assessed PONV risk using 16 weighted risk factors and was used for both adults and pediatric patients. The new protocol (protocol 2) included a form for adults and a pediatric-specific form. The form for adults utilized the simplified risk score, calculated using a validated, nonweighted, 4-point scale, and categorized patients' risk of PONV as low, moderate, or high. The form for pediatric patients used a 7-point, nonweighted scale and categorized patients' risk of PONV as moderate or high. A list was generated of all patients who had surgery during August 2005, for whom protocol 1 was used, and during April 2006, for whom protocol 2 was used. Fifty patients from each time period were randomly selected for data analysis. Data collected included the percentage of the form completed, the development of PONV, the number of PONV risk factors, patient demographics, and the appropriateness of prophylaxis. The mean ± S.D. number of PONV risk factors was significantly lower in the group treated according to protocol 2 (p = 0.001), but fewer patients in this group were categorized as low or moderate risk and more patients were identified as high risk (p < 0.001). More patients assessed by protocol 2 received fewer interventions than recommended (p < 0.001); however, the frequency of PONV did not significantly differ between groups. Implementation of a validated and simplified PONV risk-assessment tool appeared to improve form completion rates and appropriate risk assessment; however, the rates of PONV remained similar and fewer patients received appropriate prophylaxis compared with patients assessed by the existing risk-assessment tool.
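The "simplified risk score" referred to for adults is usually described as a count of four equally weighted factors (female sex, non-smoking status, history of PONV or motion sickness, and expected postoperative opioids). A minimal sketch of that count, with an illustrative low/moderate/high mapping, is shown below; the category cut-offs are an assumption for illustration and may differ from the hospital's form.

```python
def simplified_ponv_score(female, non_smoker, history_ponv_or_motion_sickness, postop_opioids):
    """Count of the four classic, equally weighted risk factors (0-4)."""
    return sum([female, non_smoker, history_ponv_or_motion_sickness, postop_opioids])

def risk_category(score):
    """Illustrative mapping; the protocol described above may use different cut-offs."""
    if score <= 1:
        return "low"
    if score == 2:
        return "moderate"
    return "high"

score = simplified_ponv_score(True, True, False, True)
print(score, risk_category(score))  # 3 high
```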
Steigler, A; Mameghan, H; Lamb, D; Joseph, D; Matthews, J; Franklin, I; Turner, S; Spry, N; Poulsen, M; North, J; Kovacev, O; Denham, J
2000-02-01
In 1997 the Trans-Tasman Radiation Oncology Group (TROG) performed a quality assurance (QA) audit of its phase III randomized clinical trial investigating the effectiveness of different durations of maximal androgen deprivation prior to and during definitive radiation therapy for locally advanced carcinoma of the prostate (TROG 96.01). The audit reviewed a total of 60 cases from 15 centres across Australia and New Zealand. In addition to verification of technical adherence to the protocol, the audit also incorporated a survey of centre planning techniques and a QA time/cost analysis. The present report builds on TROG's first technical audit conducted in 1996 for the phase III accelerated head and neck trial (TROG 91.01) and highlights the significant progress TROG has made in the interim period. The audit provides a strong validation of the results of the 96.01 trial, as well as valuable budgeting and treatment planning information for future trials. Overall improvements were detected in data quality and quantity, and in protocol compliance, with a reduction in the rate of unacceptable protocol violations from 10 to 4%. Audit design, staff education and increased data management resources were identified as the main contributing factors to these improvements. In addition, a budget estimate of $100 per patient has been proposed for conducting similar technical audits. The next major QA project to be undertaken by TROG during the period 1998-1999 is an intercentre dosimetry study. Trial funding and staff education have been targeted as the key major issues essential to the continued success and expansion of TROG's QA programme.
Pallett, Edward J; Rentowl, Patricia; Johnson, Mark I; Watson, Paul J
2014-03-01
The efficacy of transcutaneous electrical nerve stimulation (TENS) for pain relief has not been reliably established. Inconclusive findings could be due to inadequate TENS delivery and inappropriate outcome assessment. Electronic monitoring devices were used to determine patient compliance with a TENS intervention and outcome assessment protocol, to record pain scores before, during, and after TENS, and measure electrical output settings. Patients with chronic back pain consented to use TENS daily for 2 weeks and to report pain scores before, during, and after 1-hour treatments. A ≥ 30% reduction in pain scores was used to classify participants as TENS responders. Electronic monitoring devices "TLOG" and "TSCORE" recorded time and duration of TENS use, electrical settings, and pain scores. Forty-two patients consented to participate. One of 35 (3%) patients adhered completely to the TENS use and pain score reporting protocol. Fourteen of 33 (42%) were TENS responders according to electronic pain score data. Analgesia onset occurred within 30 to 60 minutes for 13/14 (93%) responders. It was not possible to correlate TENS amplitude, frequency, or pulse width measurements with therapeutic response. Findings from TENS research studies depend on the timing of outcome assessment; pain should be recorded during stimulation. TENS device sophistication might be an issue and parameter restriction should be considered. Careful protocol design is required to improve adherence and monitoring is necessary to evaluate the validity of findings. This observational study provides objective evidence to support concerns about poor implementation fidelity in TENS research.
Alemi, Farrokh; Haack, Mary R; Nemes, Susanna; Aughburns, Renita; Sinkule, Jennifer; Neuhauser, Duncan
2007-01-01
Background In this paper, we show how counselors and psychologists can use emails for online management of substance abusers, including the anatomy and content of emails that clinicians should send substance abusers. Some investigators have attempted to determine if providing mental health services online is an efficacious delivery of treatment. The question of efficacy is an empirical issue that cannot be settled unless we are explicitly clear about the content and nature of online treatment. We believe that it is not communication via the internet that matters, but the content of these communications. The purpose of this paper is to provide the content of our online counseling services so others can duplicate the work and investigate its efficacy. Results We have managed nearly 300 clients online for recovery from substance abuse. Treatment included individual counseling (motivational interviewing, cognitive-behavior therapy, relapse prevention assignments), participation in an electronic support group and the development of a recovery team. Our findings of success with these interventions are reported elsewhere. Our experience has led to the development of a protocol of care that is described more fully in this paper. This protocol is based on stages of change and relapse prevention theories and follows a Motivational Interviewing method of counseling. Conclusion The use of electronic media in providing mental health treatment remains controversial due to concerns about confidentiality, security and legal considerations. More research is needed to validate and generalize the use of online treatment for mental health problems. If researchers are to build on each other's work, it is paramount that we share our protocols of care, as we have done in this paper. PMID:17302991
Research in DRM architecture based on watermarking and PKI
NASA Astrophysics Data System (ADS)
Liu, Ligang; Chen, Xiaosu; Xiao, Dao-ju; Yi, Miao
2005-02-01
We analyze the strengths and weaknesses of existing digital copyright protection systems and design a security protocol model for digital copyright protection that balances the validity of digital media use, integrity, transmission security, and trading fairness. We give a detailed formal description of the protocol model and analyze the relationships among the entities involved in protecting the copyright of digital works. Analysis of the security and capability of the protocol model shows that it offers good security and practicality.
Secure and Fair Cluster Head Selection Protocol for Enhancing Security in Mobile Ad Hoc Networks
Paramasivan, B.; Kaliappan, M.
2014-01-01
Mobile ad hoc networks (MANETs) are wireless networks consisting of number of autonomous mobile devices temporarily interconnected into a network by wireless media. MANETs become one of the most prevalent areas of research in the recent years. Resource limitations, energy efficiency, scalability, and security are the great challenging issues in MANETs. Due to its deployment nature, MANETs are more vulnerable to malicious attack. The secure routing protocols perform very basic security related functions which are not sufficient to protect the network. In this paper, a secure and fair cluster head selection protocol (SFCP) is proposed which integrates security factors into the clustering approach for achieving attacker identification and classification. Byzantine agreement based cooperative technique is used for attacker identification and classification to make the network more attack resistant. SFCP used to solve this issue by making the nodes that are totally surrounded by malicious neighbors adjust dynamically their belief and disbelief thresholds. The proposed protocol selects the secure and energy efficient cluster head which acts as a local detector without imposing overhead to the clustering performance. SFCP is simulated in network simulator 2 and compared with two protocols including AODV and CBRP. PMID:25143986
A report on IPv6 deployment activities and issues at Sandia National Laboratories:FY2007.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tolendino, Lawrence F.; Eldridge, John M.; Hu, Tan Chang
2007-06-01
Internet Protocol version 4 (IPv4) has been a mainstay of both the Internet and corporate networks for delivering network packets to the desired destination. However, rapid proliferation of network appliances, evolution of corporate networks, and the expanding Internet has begun to stress the limitations of the protocol. Internet Protocol version 6 (IPv6) is the replacement protocol that overcomes the constraints of IPv4. As IPv6 emerges as the Internet network protocol, SNL needs to prepare for its eventual deployment in international, national, customer, and local networks. Additionally, the United States Office of Management and Budget has mandated that IPv6 deployment in government network backbones occur by 2008. This paper explores the readiness of the Sandia National Laboratories network backbone to support IPv6, the issues that must be addressed before a deployment begins, and recommends the next steps to take to comply with government mandates. The paper describes a joint work effort of the Sandia National Laboratories ASC WAN project team and members of the System Analysis & Trouble Resolution, the Communication & Network Systems, and Network System Design & Implementation Departments.
Schwertner, Debora Soccal; Oliveira, Raul; Mazo, Giovana Zarpellon; Gioda, Fabiane Rosa; Kelber, Christian Roberto; Swarowsky, Alessandra
2016-05-04
Several posture evaluation devices have been used to detect deviations of the vertebral column. However, it has been observed that these instruments present measurement errors related to the equipment, the environment or the measurement protocol. This study aimed to build, validate, analyze the reliability of, and describe a measurement protocol for the Posture Evaluation Rotating Platform System (SPGAP, Brazilian abbreviation). The posture evaluation system comprises a Posture Evaluation Rotating Platform, video camera, calibration support and measurement software. Two pilot studies were carried out with 102 elderly individuals (average age 69 years, SD = 7.3) to establish a protocol for SPGAP, controlling the measurement errors related to the environment, equipment and the person under evaluation. Content validation was completed with input from judges with expertise in posture measurement. The coefficient-of-variation method was used to validate the instrument's measurement of an object with known dimensions. Finally, reliability was established using repeated measurements of the known object. Expert content judges gave the system excellent ratings for content validity (mean 9.4 out of 10; SD 1.13). The measurement of an object with known dimensions indicated excellent validity (all measurement errors <1%) and test-retest reliability. A total of 26 images were needed to stabilize the system. Participants in the pilot studies indicated that they felt comfortable throughout the assessment. Using only one image can yield measurements that underestimate or overestimate reality. For the images of the object with known dimensions, the width and height values were, respectively, CV 0.88 (width) and 2.33 (height), SD 0.22 (width) and 0.35 (height), and minimum and maximum values 24.83-25.2 (width) and 14.56-15.75 (height). In the analysis of different (similar) images of an individual, greater discrepancies were observed in the values found. The cervical index, for example, presented minimum and maximum values of 15.38 and 37.5, a coefficient of variation of 0.29 and a standard deviation of 6.78. The SPGAP was shown to be a valid and reliable instrument for the quantitative analysis of body posture with applicability and clinical use, since it managed to reduce several measurement errors, among them parallax distortion.
Fania, Claudio; Albertini, Federica; Palatini, Paolo
2017-08-01
The aim of this study was to determine the accuracy of the A&D UM-201 device coupled to several cuffs for different arm sizes for office blood pressure (BP) measurement according to the International Protocol of the European Society of Hypertension. Evaluation was carried out in 33 individuals. The mean age of the individuals was 59.3±13.2 years, systolic BP was 145.4±20.6 mmHg (range: 109-186 mmHg), diastolic BP was 87.3±18.0 mmHg (range: 50-124 mmHg), and arm circumference was 30.4±4.2 cm (range: 23-39 cm). The protocol requirements were followed precisely. The UM-201 monitor passed all requirements, fulfilling the standards of the protocol. On average, the device overestimated systolic BP by 3.0±2.1 mmHg and diastolic BP by 2.6±2.0 mmHg. These data show that the A&D UM-201 device coupled to several cuffs for different ranges of arm circumference fulfilled the requirements for validation by the International Protocol and can be recommended for clinical use in the adult population.
Saladini, Francesca; Benetti, Elisabetta; Fania, Claudio; Palatini, Paolo
2013-08-01
The objective of this study was to determine the accuracy of the A&D BP UB-542 wrist device for home blood pressure (BP) measurement according to the International Protocol of the European Society of Hypertension (ESH). Device evaluation was carried out in 33 patients. The mean age was 50.9±10.1 years, the mean systolic BP was 141.6±22.8 mmHg (range: 92-189), the mean diastolic BP was 89.2±11.4 mmHg (range: 62-120), the mean arm circumference was 28.8±3.2 cm (range: 23-35), and the mean wrist circumference was 17.1±1.4 cm (range: 14-19.5). The protocol requirements were followed precisely. The device passed all requirements, fulfilling the standards of the protocol. On average, the device overestimated the systolic BP by 1.8±7.2 mmHg and diastolic BP by 1.6±5.7 mmHg. These data show that the A&D BP UB-542 wrist device met the requirements for validation by the International Protocol and can be recommended for clinical use in the adult population.
[Can the degree of renal artery stenosis be automatically quantified?].
Cherrak, I; Jaulent, M C; Azizi, M; Plouin, P F; Degoulet, P; Chatellier, G
2000-08-01
The objective of the reported study is to validate a computer system, QUASAR, dedicated to the quantification of renal artery stenoses. This system automatically estimates the reference diameter and calculates the minimum diameter to compute a degree of stenosis. One hundred and eighty images of atheromatous stenoses between 10% and 80% were collected from two independent French protocols. For the 49 images of the EMMA protocol, the results from QUASAR were compared with the visual estimation of an initial investigator and with the results from a reference method based on a panel of five experienced experts. For the 131 images of the ASTARTE protocol, the results from QUASAR were compared with those from a semi-automatic quantification system and with those from a system based on densitometric analysis. The present work validates QUASAR in a population of tight atheromatous stenoses (>50%). In the context of the EMMA protocol, QUASAR is not significantly different from the mean of the five experts. It is unbiased and more precise than the estimation of a single investigator. In the context of the ASTARTE protocol, there is no significant difference between the three methods for stenoses greater than 50%; however, globally, QUASAR significantly overestimates the degree of stenosis (by up to 10%).
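The degree of stenosis computed by such systems follows directly from the two diameters mentioned: percent stenosis = (1 − minimum diameter / reference diameter) × 100. The snippet below is a trivial illustration of that formula, not the QUASAR implementation; the diameters in the example are invented.

```python
def percent_stenosis(min_diameter_mm, reference_diameter_mm):
    """Degree of stenosis from the minimum and reference lumen diameters."""
    return (1.0 - min_diameter_mm / reference_diameter_mm) * 100.0

print(percent_stenosis(1.8, 5.0))  # 64.0 % stenosis
```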
Advances Made in the Next Generation of Satellite Networks
NASA Technical Reports Server (NTRS)
Bhasin, Kul B.
1999-01-01
Because of the unique networking characteristics of communications satellites, global satellite networks are moving to the forefront in enhancing national and global information infrastructures. Simultaneously, broadband data services, which are emerging as the major market driver for future satellite and terrestrial networks, are being widely acknowledged as the foundation for an efficient global information infrastructure. In the past 2 years, various task forces and working groups around the globe have identified pivotal topics and key issues to address if we are to realize such networks in a timely fashion. In response, industry, government, and academia undertook efforts to address these topics and issues. A workshop was organized to provide a forum to assess the current state-of-the-art, identify key issues, and highlight the emerging trends in the next-generation architectures, data protocol development, communication interoperability, and applications. The Satellite Networks: Architectures, Applications, and Technologies Workshop was hosted by the Space Communication Program at the NASA Lewis Research Center in Cleveland, Ohio. Nearly 300 executives and technical experts from academia, industry, and government, representing the United States and eight other countries, attended the event (June 2 to 4, 1998). The program included seven panels and invited sessions and nine breakout sessions in which 42 speakers presented on technical topics. The proceedings covers a wide range of topics: access technology and protocols, architectures and network simulations, asynchronous transfer mode (ATM) over satellite networks, Internet over satellite networks, interoperability experiments and applications, multicasting, NASA interoperability experiment programs, NASA mission applications, and Transmission Control Protocol/Internet Protocol (TCP/IP) over satellite: issues, relevance, and experience.
Broadening and Simplifying the First SETI Protocol
NASA Astrophysics Data System (ADS)
Michaud, M. A. G.
The Declaration of Principles Concerning Activities Following the Detection of Extraterrestrial Intelligence, known informally as the First SETI Protocol, is the primary existing international guidance on this subject. During the fifteen years since the document was issued, several people have suggested revisions or additional protocols. This article proposes a broadened and simplified text that would apply to the detection of alien technology in our solar system as well as to electromagnetic signals from more remote sources.
A Secured Authentication Protocol for SIP Using Elliptic Curves Cryptography
NASA Astrophysics Data System (ADS)
Chen, Tien-Ho; Yeh, Hsiu-Lien; Liu, Pin-Chuan; Hsiang, Han-Chen; Shih, Wei-Kuan
Session initiation protocol (SIP) is a technology regularly used in Internet telephony, and Hypertext Transfer Protocol (HTTP) digest authentication is one of the major methods for the SIP authentication mechanism. In 2005, Yang et al. pointed out that HTTP digest authentication could not resist server spoofing and off-line guessing attacks and proposed a secure authentication scheme based on the Diffie-Hellman concept. In 2009, Tsai proposed a nonce-based authentication protocol for SIP. In this paper, we demonstrate that their protocol cannot resist password guessing and insider attacks. Furthermore, we propose an ECC-based authentication mechanism to solve these issues and present a security analysis of our protocol to show that it is suitable for applications with higher security requirements.
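The elliptic-curve primitive underlying such schemes is typically an ECDH-style key agreement; the following generic sketch (using the third-party Python cryptography package, and not reproducing the authors' actual protocol messages) shows how two SIP endpoints could derive a shared session key.

# Generic elliptic-curve Diffie-Hellman key agreement of the kind ECC-based
# SIP authentication schemes build on; not the authors' protocol itself.
# Requires the third-party "cryptography" package.
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

client_key = ec.generate_private_key(ec.SECP256R1())
server_key = ec.generate_private_key(ec.SECP256R1())

# Each side combines its own private key with the peer's public key.
client_secret = client_key.exchange(ec.ECDH(), server_key.public_key())
server_secret = server_key.exchange(ec.ECDH(), client_key.public_key())
assert client_secret == server_secret

# Derive a symmetric session key from the shared secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"sip-session-demo").derive(client_secret)
print(session_key.hex())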
Validation of an Instrument and Testing Protocol for Measuring the Combinatorial Analysis Schema.
ERIC Educational Resources Information Center
Staver, John R.; Harty, Harold
1979-01-01
Designs a testing situation to examine the presence of combinatorial analysis, to establish construct validity in the use of an instrument, Combinatorial Analysis Behavior Observation Scheme (CABOS), and to investigate the presence of the schema in young adolescents. (Author/GA)
Joint Test Protocol for Validation of Alternatives to Aliphatic Isocyanate Polyurethanes
NASA Technical Reports Server (NTRS)
Lewis, Pattie
2005-01-01
The primary objective of this effort is to demonstrate and validate alternatives to aliphatic isocyanate polyurethanes. Successful completion of this project will result in one or more isocyanate-free coatings qualified for use at AFSPC and NASA installations participating in this project.
Suggestions for Rethinking Validation
ERIC Educational Resources Information Center
Fisher, William P., Jr.
2017-01-01
In this commentary on "Rethinking Traditional Methods of Survey Validation," found in this issue of "Measurement: Interdisciplinary Research and Perspectives," William Fisher writes that Maul's paper raises issues of validity in survey-based measurement that deserve far wider consideration and scrutiny than they typically…
Petersen, James C.; Justus, B.G.; Dodd, H.R.; Bowles, D.E.; Morrison, L.W.; Williams, M.H.; Rowell, G.A.
2008-01-01
Buffalo National River, located in north-central Arkansas, and Ozark National Scenic Riverways, located in southeastern Missouri, are the two largest units of the National Park Service in the Ozark Plateaus physiographic province. The purpose of this report is to provide a protocol that will be used by the National Park Service to sample fish communities and collect related water-quality, habitat, and stream discharge data of Buffalo National River and Ozark National Scenic Riverways to meet inventory and long-term monitoring objectives. The protocol includes (1) a protocol narrative, (2) several standard operating procedures, and (3) supplemental information helpful for implementation of the protocol. The protocol narrative provides background information about the protocol such as the rationale of why a particular resource or resource issue was selected for monitoring, information concerning the resource or resource issue of interest, a description of how monitoring results will inform management decisions, and a discussion of the linkages between this and other monitoring projects. The standard operating procedures cover preparation, training, reach selection, water-quality sampling, fish community sampling, physical habitat collection, measuring stream discharge, equipment maintenance and storage, data management and analysis, reporting, and protocol revision procedures. Much of the information in the standard operating procedures was gathered from existing protocols of the U.S. Geological Survey National Water Quality Assessment program or other sources. Supplemental information helpful for implementing the protocol is also included, covering fish species known or suspected to occur in the parks, sample sites, sample design, fish species traits, index of biotic integrity metrics, sampling equipment, and field forms.
Benson, Sarah J; Lennard, Christopher J; Hill, David M; Maynard, Philip; Roux, Claude
2010-01-01
A significant amount of research has been conducted into the use of stable isotopes to assist in determining the origin of various materials. The research conducted in the forensic field shows the potential of isotope ratio mass spectrometry (IRMS) to provide a level of discrimination not achievable utilizing traditional forensic techniques. Despite the research there have been few, if any, publications addressing the validation and measurement uncertainty of the technique for forensic applications. This study, the first in a planned series, presents validation data for the measurement of bulk nitrogen isotope ratios in ammonium nitrate (AN) using the DELTA(plus)XP (Thermo Finnigan) IRMS instrument equipped with a ConFlo III interface and FlashEA 1112 elemental analyzer (EA). Appropriate laboratory standards, analytical methods and correction calculations were developed and evaluated. A validation protocol was developed in line with the guidelines provided by the National Association of Testing Authorities, Australia (NATA). Performance characteristics including: accuracy, precision/repeatability, reproducibility/ruggedness, robustness, linear range, and measurement uncertainty were evaluated for the measurement of nitrogen isotope ratios in AN. AN (99.5%) and ammonium thiocyanate (99.99+%) were determined to be the most suitable laboratory standards and were calibrated against international standards (certified reference materials). All performance characteristics were within an acceptable range when potential uncertainties, including the manufacturer's uncertainty of the technique and standards, were taken into account. The experiments described in this article could be used as a model for validation of other instruments for similar purposes. Later studies in this series will address the more general issue of demonstrating that the IRMS technique is scientifically sound and fit-for-purpose in the forensic explosives analysis field.
Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young
2016-01-01
Biometric based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. A careful investigation in this paper shows that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We proved that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols.
Laurin, Nancy; Frégeau, Chantal
2012-01-01
The goal of this work was to optimize and validate a fast amplification protocol for the multiplex amplification of the STR loci included in AmpFlSTR(®) Profiler Plus(®) to expedite human DNA identification. By modifying the cycling conditions and by combining the use of a DNA polymerase optimized for high speed PCR (SpeedSTAR™ HS) and a more efficient thermal cycler instrument (Bio-RAD C1000™), we were able to reduce the amplification process from 4h to 26 min. No modification to the commercial AmpFlSTR(®) Profiler Plus(®) primer mix was required. When compared to the current Royal Canadian Mounted Police (RCMP) amplification protocol, no differences with regards to specificity, sensitivity, heterozygote peak height ratios and overall profile balance were noted. Moreover, complete concordance was obtained with profiles previously generated with the standard amplification protocol and minor alleles in mixture samples were reliably typed. An increase in n-4 stutter ratios (2.2% on average for all loci) was observed for profiles amplified with the fast protocol compared to the current procedure. Our results document the robustness of this rapid amplification protocol for STR profiling using the AmpFlSTR(®) Profiler Plus(®) primer set and demonstrate that comparable data can be obtained in substantially less time. This new approach could provide an alternative option to current multiplex STR typing amplification protocols in order to increase throughput or expedite time-sensitive cases. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
The DaNa2.0 Knowledge Base Nanomaterials—An Important Measure Accompanying Nanomaterials Development
Bohmer, Nils; Marquardt, Clarissa; Nau, Katja; Steinbach, Christoph
2018-01-01
Nanotechnology is closely related to the tailored manufacturing of nanomaterials for a huge variety of applications. However, such applications with newly developed materials are also a reason for concern. The DaNa2.0 project provides information and support for these issues on the web in condensed and easy-to-understand wording. Thus, a key challenge in the field of advanced materials safety research is access to correct and reliable studies and validated results. For nanomaterials, there is currently a continuously increasing amount of publications on toxicological issues, but criteria to evaluate the quality of these studies are necessary to use them e.g., for regulatory purposes. DaNa2.0 discusses scientific results regarding 26 nanomaterials based on actual literature that has been selected after careful evaluation following a literature criteria checklist. This checklist is publicly available, along with a selection of standardized operating protocols (SOPs) established by different projects. The spectrum of information is rounded off by further articles concerning basics or crosscutting topics in nanosafety research. This article is intended to give an overview on DaNa2.0 activities to support reliable toxicity testing and science communication alike. PMID:29596351
Difficulties in fumonisin determination: the issue of hidden fumonisins.
Dall'Asta, Chiara; Mangia, Mattia; Berthiller, Franz; Molinelli, Alexandra; Sulyok, Michael; Schuhmacher, Rainer; Krska, Rudolf; Galaverna, Gianni; Dossena, Arnaldo; Marchelli, Rosangela
2009-11-01
In this paper, the results obtained by five independent methods for the quantification of fumonisins B(1), B(2), and B(3) in raw maize are reported. Five naturally contaminated maize samples and a reference material were analyzed in three different laboratories. Although each method was validated and common calibrants were used, poor agreement on fumonisin contamination levels was obtained. In order to investigate the analyte-matrix interactions leading to this lack of consistency, the occurrence of fumonisin derivatives was checked. Significant amounts of hidden fumonisins were detected in all the samples considered. Furthermore, the application of an in vitro digestion protocol to raw maize allowed for a higher recovery of native fumonisins, suggesting that the interaction occurring between analytes and matrix macromolecules is associative rather than covalent. Depending on the analytical method as well as the maize sample, only 37-68% of the total fumonisin concentrations were found to be extractable from the samples. These results are particularly striking in the case of the certified reference material, underlining the actual difficulties in ascertaining the trueness of a method for fumonisin determination and thus opening an important issue for risk assessment.
Code of Federal Regulations, 2014 CFR
2014-10-01
... issuing an access authorization to mining claims or other valid occupancies wholly surrounded by... permit the reasonable use of the non-Federal land, valid mining claim, or other valid occupancy; and (3) The location, construction, maintenance, and use of the access route that BLM approves will be as...
Code of Federal Regulations, 2011 CFR
2011-10-01
... issuing an access authorization to mining claims or other valid occupancies wholly surrounded by... permit the reasonable use of the non-Federal land, valid mining claim, or other valid occupancy; and (3) The location, construction, maintenance, and use of the access route that BLM approves will be as...
Code of Federal Regulations, 2013 CFR
2013-10-01
... issuing an access authorization to mining claims or other valid occupancies wholly surrounded by... permit the reasonable use of the non-Federal land, valid mining claim, or other valid occupancy; and (3) The location, construction, maintenance, and use of the access route that BLM approves will be as...
Code of Federal Regulations, 2012 CFR
2012-10-01
... issuing an access authorization to mining claims or other valid occupancies wholly surrounded by... permit the reasonable use of the non-Federal land, valid mining claim, or other valid occupancy; and (3) The location, construction, maintenance, and use of the access route that BLM approves will be as...
Thananchai, Thiwaphorn; Junkuy, Anongphan; Kittirattanapaiboon, Phunnapa; Sribanditmongkol, Pongruk
2016-06-01
Hair analysis for chronic excessive alcohol (ethanol) use has focused on ethyl glucuronide (EtG), a minor metabolite of ethanol. Preferred methods have involved high-performance liquid chromatography (HPLC) combined with tandem mass spectrometry (MS/MS) in line with an electrospray ionization (ESI) source. EtG analysis in hair has not yet been introduced to Thailand. The objectives were to validate an in-house HPLC-ESI-MS/MS hair analysis protocol for EtG and to apply it to a field sample of alcohol drinkers to assess different risk levels of alcohol consumption as measured by the Alcohol Use Disorders Identification Test (AUDIT). Validation procedures followed guidelines of the US Food and Drug Administration, the European Medicines Agency, and the Scientific Working Group for Forensic Toxicology. One hundred twenty subjects reported consuming alcohol during a 3-month period prior to enrollment. After taking the Thai-language version of AUDIT, subjects were divided on the basis of test scores into low, medium, and high-risk groups for chronic excessive alcohol use. The protocol satisfied the international standards for selectivity, specificity, accuracy, precision, and calibration curve. There was no significant matrix effect. Limits of detection and quantification (LOD/LOQ) were set at 15 pg of EtG per mg of hair. The protocol did not detect EtG in low-risk subjects (n = 38). Detection rates for medium-risk (n = 42) and high-risk subjects (n = 40) were 14.3% and 85%, respectively. The median EtG concentrations of these two groups were significantly different. Sensitivity and specificity were both more than 90% when EtG concentrations of high-risk subjects were compared with the 30 pg/mg cutoff recommended by the Society of Hair Testing (SoHT) for diagnosing chronic excessive alcohol consumption, based on an average daily ethanol intake greater than 60 g. The in-house protocol for EtG analysis in hair was validated according to international standards. The protocol is a useful tool for evaluating risk for chronic excessive drinking as defined by AUDIT scores. It strongly predicted the highest level of risk, although it was inadequate for assessing lower levels of risk.
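Classifying subjects against a concentration cutoff and reporting sensitivity and specificity is simple counting; the sketch below illustrates it with invented EtG concentrations against the 30 pg/mg SoHT cutoff (the data are not the study's).

# Sensitivity and specificity of a concentration cutoff (here 30 pg/mg EtG)
# against a reference classification. Concentrations and labels are invented.
def sens_spec(concentrations, reference_positive, cutoff=30.0):
    pairs = list(zip(concentrations, reference_positive))
    tp = sum(c >= cutoff and y for c, y in pairs)
    fn = sum(c < cutoff and y for c, y in pairs)
    tn = sum(c < cutoff and not y for c, y in pairs)
    fp = sum(c >= cutoff and not y for c, y in pairs)
    return tp / (tp + fn), tn / (tn + fp)

etg_pg_per_mg = [5, 12, 45, 80, 28, 150, 9, 60]
high_risk     = [False, False, False, True, True, True, False, True]
sensitivity, specificity = sens_spec(etg_pg_per_mg, high_risk)
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")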
Vahabzadeh-Hagh, Andrew M.; Muller, Paul A.; Gersner, Roman; Zangen, Abraham; Rotenberg, Alexander
2015-01-01
Objective: Transcranial magnetic stimulation (TMS) is a well-established clinical protocol with numerous potential therapeutic and diagnostic applications. Yet, much work remains in the elucidation of TMS mechanisms, optimization of protocols, and in development of novel therapeutic applications. As with many technologies, the key to these issues lies in the proper experimentation and translation of TMS methods to animal models, among which rat models have proven popular. A significant increase in the number of rat TMS publications has necessitated analysis of their relevance to human work. We therefore review the essential principles necessary for the approximation of human TMS protocols in rats as well as specific methods that addressed these issues in published studies. Materials and Methods: We performed an English language literature search combined with our own experience and data. We address issues that we see as important in the translation of human TMS methods to rat models and provide a summary of key accomplishments in these areas. Results: An extensive literature review illustrated the growth of rodent TMS studies in recent years. Current advances in the translation of single, paired-pulse, and repetitive stimulation paradigms to rodent models are presented. The importance of TMS in the generation of data for preclinical trials is also highlighted. Conclusions: Rat TMS has several limitations when considering parallels between animal and human stimulation. However, it has proven to be a useful tool in the field of translational brain stimulation and will likely continue to aid in the design and implementation of stimulation protocols for therapeutic and diagnostic applications. PMID:22780329
Performance Analysis of the HTTP Protocol on Geostationary Satellite Links
NASA Technical Reports Server (NTRS)
Krus, Hans; Allman, Mark; Griner, Jim; Tran, Diepchi
1998-01-01
Various issues associated with HTTP protocol on geostationary satellite links are presented in viewgraph form. Specific topics include: 1) Network reference points; 2) The HTTP 1.0 and 1.1 mechanisms; 3) Experimental setup; 4) TCP and HTTP configuration; 5) Modelling slow start and 6) Results and future work.
Development and Use of an Eating Disorder Assessment and Treatment Protocol
ERIC Educational Resources Information Center
Huebner, Lois A.; Weitzman, Lauren M.; Mountain, Lisa M.; Nelson, Kris L.; Oakley, Danielle R.; Smith, Michael L.
2006-01-01
Counseling centers have been challenged to effectively treat the growing number of college students who struggle with disordered eating. In response to this critical issue, an Eating Disorder Assessment and Treatment Protocol (EDATP) was developed to assist clinical disposition in the counseling center setting and identify treatment guidelines…
Reliability and validity of the Safe Routes to school parent and student surveys
2011-01-01
Background: The purpose of this study is to assess the reliability and validity of the U.S. National Center for Safe Routes to School's in-class student travel tallies and written parent surveys. Over 65,000 tallies and 374,000 parent surveys have been completed, but no published studies have examined their measurement properties. Methods: Students and parents from two Charlotte, NC (USA) elementary schools participated. Tallies were conducted on two consecutive days using a hand-raising protocol; on day two students were also asked to recall the previous day's travel. The recall from day two was compared with day one to assess 24-hour test-retest reliability. Convergent validity was assessed by comparing parent reports of students' travel mode with student reports of travel mode. Two-week test-retest reliability of the parent survey was assessed by comparing within-parent responses. Reliability and validity were assessed using kappa statistics. Results: A total of 542 students participated in the in-class student travel tally reliability assessment and 262 parent-student dyads participated in the validity assessment. Reliability was high for travel to and from school (kappa > 0.8); convergent validity was lower but still high (kappa > 0.75). There were no differences by student grade level. Two-week test-retest reliability of the parent survey (n = 112) ranged from moderate to very high for objective questions on travel mode and travel times (kappa range: 0.62 - 0.97) but was substantially lower for subjective assessments of barriers to walking to school (kappa range: 0.31 - 0.76). Conclusions: The in-class student travel tally exhibited high reliability and validity at all elementary grades. The parent survey had high reliability on questions related to student travel mode, but lower reliability for attitudinal questions identifying barriers to walking to school. Parent survey design should be improved so that responses clearly indicate issues that influence parental decision making with regard to their children's mode of travel to school. PMID:21651794
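Cohen's kappa, the agreement statistic used throughout the study, corrects observed agreement for the agreement expected by chance; a minimal sketch with invented travel-mode labels follows.

# Cohen's kappa for two categorical ratings, e.g., travel mode reported on
# day one versus recalled on day two. The labels below are invented.
from collections import Counter

def cohens_kappa(rating_a, rating_b):
    n = len(rating_a)
    observed = sum(a == b for a, b in zip(rating_a, rating_b)) / n
    freq_a, freq_b = Counter(rating_a), Counter(rating_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a.keys() | freq_b.keys()) / n ** 2
    return (observed - expected) / (1 - expected)

day_one = ["walk", "bus", "car", "walk", "bike", "car", "walk", "bus"]
day_two = ["walk", "bus", "car", "walk", "bike", "walk", "walk", "bus"]
print(round(cohens_kappa(day_one, day_two), 2))  # about 0.82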
Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conover, David R.; Crawford, Aladsair J.; Viswanathan, Vilayanur V.
2014-06-01
The Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage Systems (PNNL-22010) was first issued in November 2012 as a first step toward providing a foundational basis for developing an initial standard for the uniform measurement and expression of energy storage system (ESS) performance. Its subsequent use in the field and review by the protocol working group and most importantly the users' subgroup and the thermal subgroup has led to the fundamental modifications reflected in this update of the 2012 Protocol. As an update of the 2012 Protocol, this document (the June 2014 Protocol) is intended to supersede its predecessor and be used as the basis for measuring and expressing ESS performance. The foreword provides general and specific details about what additions, revisions, and enhancements have been made to the 2012 Protocol and the rationale for them in arriving at the June 2014 Protocol.
Technical Analysis of SSP-21 Protocol
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bromberger, S.
As part of the California Energy Systems for the Twenty-First Century (CES-21) program, in December 2016 San Diego Gas and Electric (SDG&E) contracted with Lawrence Livermore National Laboratory (LLNL) to perform an independent verification and validation (IV&V) of a white paper describing their Secure SCADA Protocol for the Twenty-First Century (SSP-21) in order to analyze the effectiveness and propriety of cryptographic protocol use within the SSP-21 specification. SSP-21 is designed to use cryptographic protocols to provide (optional) encryption, authentication, and nonrepudiation, among other capabilities. The cryptographic protocols to be used reflect current industry standards; future versions of SSP-21 will use other advanced technologies to provide a subset of security services.
Routing architecture and security for airborne networks
NASA Astrophysics Data System (ADS)
Deng, Hongmei; Xie, Peng; Li, Jason; Xu, Roger; Levy, Renato
2009-05-01
Airborne networks are envisioned to provide interconnectivity for terrestrial and space networks by interconnecting highly mobile airborne platforms. A number of military applications are expected to be used by the operator, and all these applications require proper routing security support to establish correct routes between communicating platforms in a timely manner. As airborne networks differ somewhat from traditional wired and wireless networks (e.g., Internet, LAN, WLAN, MANET, etc.), security mechanisms valid in those networks are not fully applicable to airborne networks. Designing an efficient security scheme to protect airborne networks is confronted with new requirements. In this paper, we first identify a candidate routing architecture, which works as an underlying structure for our proposed security scheme. We then investigate the vulnerabilities and attack models against routing protocols in airborne networks. Based on these studies, we propose an integrated security solution to address routing security issues in airborne networks.
Thermodynamic description of non-Markovian information flux of nonequilibrium open quantum systems
NASA Astrophysics Data System (ADS)
Chen, Hong-Bin; Chen, Guang-Yin; Chen, Yueh-Nan
2017-12-01
One of the fundamental issues in the field of open quantum systems is the classification and quantification of non-Markovianity. In the context of quantity-based measures of non-Markovianity, the intuition of non-Markovianity in terms of information backflow is widely discussed. However, it is not easy to characterize the information flux for a given system state and show its connection to non-Markovianity. Here, by using concepts from thermodynamics and information theory, we discuss a potential definition of the information flux of an open quantum system, valid for static environments. We present a simple protocol to show how a system attempts to share information with its environment and how it builds up system-environment correlations. We also show that the information returned from the correlations characterizes the non-Markovianity and a hierarchy of indivisibility of the system dynamics.
Pereira, Samantha Storer Pesani; Oliveira, Hadelândia Milon de; Turrini, Ruth Natalia Teresa; Lacerda, Rúbia Aparecida
2015-08-01
To search for evidence of the efficacy of sodium hypochlorite on environmental surfaces in reducing contamination and preventing healthcare-associated infections (HAIs). Systematic review in accordance with the Cochrane Collaboration. We analyzed 14 studies, all controlled trials, published between 1989 and 2013. Most studies reported inhibition of microorganism growth. Some reported decreased infection, microbial resistance and colonization, and loss of efficacy in the presence of dirt and of surface-dried viruses. Hypochlorite is an effective disinfectant; however, the question of its direct relation to the reduction of HAIs remains open. The absence of control for confounding variables in the analyzed studies made performing a meta-analysis inappropriate. The evaluation of internal validity using CONSORT and TREND was not possible because their content was not appropriate for laboratory and microbiological studies. As a result, there is an urgent need to develop a specific protocol for evaluating such studies.
Method for a dummy CD mirror server based on NAS
NASA Astrophysics Data System (ADS)
Tang, Muna; Pei, Jing
2002-09-01
With the development of computer networks, information sharing has become a necessity of daily life. The rapid development of CD-ROM and CD-ROM drive technology makes it possible to publish large databases online. After comparing many designs for dummy CD mirror databases, which are now, and will remain in the near future, the embodiment of a main product in the CD-ROM database market, we propose and implement a new PC-based scheme. Our system has the following merits: support for all common CD formats; support for many network protocols; independence of the mirror server from the main server; and low price and very large capacity without the need for any special hardware. Preliminary experiments have verified the validity of the proposed scheme. Encouraged by its promising application prospects, we are now preparing to bring it to market. This paper discusses the design and implementation of the CD-ROM server in detail.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mein, S; Rankine, L; Department of Radiation Oncology, Washington University School of Medicine
Purpose: To develop, evaluate and apply a novel high-resolution 3D remote dosimetry protocol for validation of MRI guided radiation therapy treatments (MRIdian by ViewRay™). We demonstrate the first application of the protocol (including two small but required new correction terms) utilizing radiochromic 3D plastic PRESAGE™ with optical-CT readout. Methods: A detailed study of PRESAGE™ dosimeters (2 kg) was conducted to investigate the temporal and spatial stability of radiation induced optical density change (ΔOD) over 8 days. Temporal stability was investigated on 3 dosimeters irradiated with four equally-spaced square 6 MV fields delivering doses between 10 cGy and 300 cGy. Doses were imaged (read out) by optical-CT at multiple intervals. Spatial stability of ΔOD response was investigated on 3 other dosimeters irradiated uniformly with 15 MV extended-SSD fields with doses of 15 cGy, 30 cGy and 60 cGy. Temporal and spatial (radial) changes were investigated using CERR and MATLAB's Curve Fitting Toolbox. A protocol was developed to extrapolate measured ΔOD readings at t=48 hr (the typical shipment time in remote dosimetry) to time t=1 hr. Results: All dosimeters were observed to gradually darken with time (<5% per day). Consistent intra-batch sensitivity (0.0930±0.002 ΔOD/cm/Gy) and linearity (R2=0.9996) was observed at t=1 hr. A small radial effect (<3%) was observed, attributed to curing thermodynamics during manufacture. The refined remote dosimetry protocol (including polynomial correction terms for the temporal and spatial effects, CT and CR) was then applied to independent dosimeters irradiated with MR-IGRT treatments. Excellent line profile agreement and 3D gamma results for 3%/3mm, 10% threshold were observed, with an average passing rate of 96.5%±3.43%. Conclusion: A novel 3D remote dosimetry protocol is presented capable of validation of advanced radiation treatments (including MR-IGRT). The protocol uses 2 kg radiochromic plastic dosimeters read out by optical-CT within a week of treatment. The protocol requires small corrections for temporal and spatially-dependent behaviors observed between irradiation and readout.
ERIC Educational Resources Information Center
Walpole, Sharon; McKenna, Michael C.; Uribe-Zarain, Ximena; Lamitina, David
2010-01-01
In this study of 116 high-poverty schools, we explored teaching and coaching in grades K-3. We developed and validated observation protocols for both coaching and teaching. Exploratory and confirmatory factor analyses were computed to identify and confirm factors that explained the protocol data. Three coaching factors were identified in both…
ERIC Educational Resources Information Center
Pinsoneault, Terry B.
2007-01-01
The ability of the Minnesota Multiphasic Personality Inventory-2 (MMPI-2; J. N. Butcher et al., 2001) validity scales to detect random, partially random, and nonrandom MMPI-2 protocols was investigated. Investigations included the Variable Response Inconsistency scale (VRIN), F, several potentially useful new F and VRIN subscales, and F-sub(b) - F…
This test/QA plan for evaluating the generic test protocol for high speed wind tunnel, representing aerial application, pesticide spray drift reduction technologies (DRT) for row and field crops is in conformance with EPA Requirements for Quality Assurance Project Plans (EPA QA/R...
Validity of a Protocol for Adult Self-Report of Dyslexia and Related Difficulties
ERIC Educational Resources Information Center
Snowling, Margaret; Dawes, Piers; Nash, Hannah; Hulme, Charles
2012-01-01
Background: There is an increased prevalence of reading and related difficulties in children of dyslexic parents. In order to understand the causes of these difficulties, it is important to quantify the risk factors passed from parents to their offspring. Method: 417 adults completed a protocol comprising a 15-item questionnaire rating reading and…
Reliability and Validity of the Standing Heel-Rise Test
ERIC Educational Resources Information Center
Yocum, Allison; McCoy, Sarah Westcott; Bjornson, Kristie F.; Mullens, Pamela; Burton, Gay Naganuma
2010-01-01
A standardized protocol for a pediatric heel-rise test was developed and reliability and validity are reported. Fifty-seven children developing typically (CDT) and 34 children with plantar flexion weakness performed three tests: unilateral heel rise, vertical jump, and force measurement using handheld dynamometry. Intraclass correlation…
Validation of Land Surface Temperature from Sentinel-3
NASA Astrophysics Data System (ADS)
Ghent, D.
2017-12-01
One of the main objectives of the Sentinel-3 mission is to measure sea- and land-surface temperature with high-end accuracy and reliability in support of environmental and climate monitoring in an operational context. Calibration and validation are thus key criteria for operationalization within the framework of the Sentinel-3 Mission Performance Centre (S3MPC). Land surface temperature (LST) has a long heritage of satellite observations which have facilitated our understanding of land surface and climate change processes, such as desertification, urbanization, deforestation and land/atmosphere coupling. These observations have been acquired from a variety of satellite instruments on platforms in both low-earth orbit and in geostationary orbit. Retrieval accuracy can be a challenge though; surface emissivities can be highly variable owing to the heterogeneity of the land, and atmospheric effects caused by the presence of aerosols and by water vapour absorption can give a bias to the underlying LST. As such, a rigorous validation is critical in order to assess the quality of the data and the associated uncertainties. Validation of the level-2 SL_2_LST product, which became freely available on an operational basis from 5th July 2017, builds on an established validation protocol for satellite-based LST. This set of guidelines provides a standardized framework for structuring LST validation activities. The protocol introduces a four-pronged approach which can be summarised thus: i) in situ validation where ground-based observations are available; ii) radiance-based validation over sites that are homogeneous in emissivity; iii) intercomparison with retrievals from other satellite sensors; iv) time-series analysis to identify artefacts on an interannual time-scale. This multi-dimensional approach is a necessary requirement for assessing the performance of the LST algorithm for the Sea and Land Surface Temperature Radiometer (SLSTR), which is designed around biome-based coefficients, thus emphasizing the importance of non-traditional forms of validation such as radiance-based techniques. Here we present examples of the ongoing routine application of the protocol to operational Sentinel-3 LST data.
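For the in situ strand of such a protocol, the headline statistics are typically the bias and spread of matched satellite-minus-ground temperature differences; the sketch below shows that comparison with hypothetical match-ups, not SLSTR data.

# Bias, standard deviation and RMSE of satellite-minus-in-situ LST differences
# over matched observations. The match-up values (kelvin) are hypothetical.
import math

satellite_lst = [301.2, 295.8, 310.4, 288.9, 305.0]
in_situ_lst   = [300.5, 296.6, 309.1, 289.4, 304.2]

diffs = [s - g for s, g in zip(satellite_lst, in_situ_lst)]
bias = sum(diffs) / len(diffs)
std = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (len(diffs) - 1))
rmse = math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))
print(f"bias={bias:+.2f} K, std={std:.2f} K, rmse={rmse:.2f} K")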
Fernandez-Calle, Pilar; Pelaz, Sandra; Oliver, Paloma; Alcaide, Maria Jose; Gomez-Rioja, Ruben; Buno, Antonio; Iturzaeta, Jose Manuel
2013-01-01
Technological innovation requires laboratories to ensure that modifications or incorporations of new techniques do not alter the quality of their results. In an ISO 15189 accredited laboratory, flexible scope accreditation facilitates the inclusion of these changes prior to accreditation body evaluation. A strategy to perform the validation of a biochemistry analyzer in an accredited laboratory holding a flexible scope is shown. A validation procedure including the evaluation of imprecision and bias of two Dimension Vista 1500 analysers was conducted. Comparability of patient results between one of them and the recently replaced Dimension RxL Max was evaluated. All studies followed the respective Clinical and Laboratory Standards Institute (CLSI) protocols. Thirty chemistry assays were studied. Coefficients of variation, percent bias and total error were calculated for all tests, and biological variation was used as the acceptance criterion. Quality control material and patient samples were used as test materials. Interchangeability of the results was established by processing forty patients' samples on both devices. Twenty-seven of the 30 studied parameters met allowable performance criteria. Sodium, chloride and magnesium did not fulfil the acceptance criteria. Evidence of interchangeability of patient results was obtained for all parameters except magnesium, NT-proBNP, cTroponin I and C-reactive protein. A laboratory having a well structured and documented validation procedure can opt for a flexible scope of accreditation. In addition, performing these activities prior to use on patient samples may reveal technical issues which must be corrected to minimize their impact on patient results.
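The acceptance decision in such a verification typically combines the observed imprecision and bias into a total error and compares it with limits derived from biological variation; the sketch below uses the commonly cited desirable-specification formulas and invented numbers, which need not match the laboratory's exact criteria.

# Compare observed imprecision (CV, %) and bias (%) against desirable
# specifications derived from biological variation (within-subject CVi and
# between-subject CVg, both %). All numbers here are invented for illustration.
import math

def desirable_specs(cvi, cvg):
    allowable_cv = 0.5 * cvi
    allowable_bias = 0.25 * math.sqrt(cvi ** 2 + cvg ** 2)
    allowable_te = 1.65 * allowable_cv + allowable_bias
    return allowable_cv, allowable_bias, allowable_te

def observed_total_error(cv_obs, bias_obs):
    return 1.65 * cv_obs + abs(bias_obs)

_, _, te_allowable = desirable_specs(cvi=4.9, cvg=7.6)
te_observed = observed_total_error(cv_obs=1.8, bias_obs=-0.9)
verdict = "pass" if te_observed <= te_allowable else "fail"
print(f"observed TE {te_observed:.2f}% vs allowable {te_allowable:.2f}% -> {verdict}")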
Probst, Alexander; Facius, Rainer; Wirth, Reinhard; Moissl-Eichinger, Christine
2010-01-01
In order to meet planetary-protection requirements, culturable bacterial spore loads are measured representatively for the total microbial contamination of spacecraft. However, the National Aeronautics and Space Administration's (NASA's) cotton swab protocols for spore load determination have not changed for decades. To determine whether a more efficient alternative was available, a novel swab was evaluated for recovery of different Bacillus atrophaeus spore concentrations on stainless steel and other surfaces. Two protocols for the nylon-flocked swab (NFS) were validated and compared to the present NASA standard protocol. The results indicate that the novel swab protocols recover 3- to 4-fold more (45.4% and 49.0% recovery efficiency) B. atrophaeus spores than the NASA standard method (13.2%). Moreover, the nylon-flocked-swab protocols were superior in recovery efficiency for spores of seven different Bacillus species, including Bacillus anthracis Sterne (recovery efficiency, 20%). The recovery efficiencies for B. atrophaeus spores from different surfaces showed a variation from 5.9 to 62.0%, depending on the roughness of the surface analyzed. Direct inoculation of the swab resulted in a recovery rate of about 80%, consistent with the results of scanning electron micrographs that allowed detailed comparisons of the two swab types. The results of this investigation will significantly contribute to the cleanliness control of future life detection missions and will provide significant improvement in detection of B. anthracis contamination for law enforcement and security efforts. PMID:20543054
Ogurtsova, Katherine; Heise, Thomas L; Linnenkamp, Ute; Dintsios, Charalabos-Markos; Lhachimi, Stefan K; Icks, Andrea
2017-12-29
Type 2 diabetes mellitus (T2DM), a highly prevalent chronic disease, puts a large burden on individual health and health care systems. Computer simulation models, used to evaluate the clinical and economic effectiveness of various interventions to handle T2DM, have become a well-established tool in diabetes research. Despite the broad consensus about the general importance of validation, especially external validation, as a crucial instrument of assessing and controlling for the quality of these models, there are no systematic reviews comparing such validation of diabetes models. As a result, the main objectives of this systematic review are to identify and appraise the different approaches used for the external validation of existing models covering the development and progression of T2DM. We will perform adapted searches by applying respective search strategies to identify suitable studies from 14 electronic databases. Retrieved study records will be included or excluded based on predefined eligibility criteria as defined in this protocol. Among others, a publication filter will exclude studies published before 1995. We will run abstract and full text screenings and then extract data from all selected studies by filling in a predefined data extraction spreadsheet. We will undertake a descriptive, narrative synthesis of findings to address the study objectives. We will pay special attention to aspects of quality of these models in regard to the external validation based upon ISPOR and ADA recommendations as well as Mount Hood Challenge reports. All critical stages within the screening, data extraction and synthesis processes will be conducted by at least two authors. This protocol adheres to PRISMA and PRISMA-P standards. The proposed systematic review will provide a broad overview of the current practice in the external validation of models with respect to T2DM incidence and progression in humans built on simulation techniques. PROSPERO CRD42017069983 .
Störmer, M; Radojska, S; Hos, N J; Gathof, B S
2015-04-01
In order to generate standardized conditions for the microbiological control of HPCs, the PEI has recommended defined steps for validation; following them leads to the extensive validation shown in this study, which presents a possible validation principle for the microbiological control of allogeneic SCPs. Although it could be demonstrated that automated culture improves the microbial safety of cellular products, the requirement for extensive validation studies needs to be considered. © 2014 International Society of Blood Transfusion.
ERIC Educational Resources Information Center
Gervais, Matthew M.
2017-01-01
Experimental economic games reveal significant population variation in human social behavior. However, most protocols involve anonymous recipients, limiting their validity to fleeting interactions. Understanding human relationship dynamics will require methods with the virtues of economic games that also tap recipient identity-conditioned…
Many PCR-based methods for microbial source tracking (MST) have been developed and validated within individual research laboratories. Inter-laboratory validation of these methods, however, has been minimal, and the effects of protocol standardization regimes have not been thor...
Validating an Observation Protocol to Measure Special Education Teacher Effectiveness
ERIC Educational Resources Information Center
Johnson, Evelyn S.; Semmelroth, Carrie L.
2015-01-01
This study used Kane's (2013) Interpretation/Use Argument (IUA) to measure validity on the Recognizing Effective Special Education Teachers (RESET) observation tool. The RESET observation tool is designed to evaluate special education teacher effectiveness using evidence-based instructional practices as the basis for evaluation. In alignment with…
O'Brien, M J; Takahashi, M; Brugal, G; Christen, H; Gahm, T; Goodell, R M; Karakitsos, P; Knesel, E A; Kobler, T; Kyrkou, K A; Labbe, S; Long, E L; Mango, L J; McGoogan, E; Oberholzer, M; Reith, A; Winkler, C
1998-01-01
Optical digital imaging and its related technologies have applications in cytopathology that encompass training and education, image analysis, diagnosis, report documentation and archiving, and telecommunications. Telecytology involves the use of telecommunications to transmit cytology images for the purposes of diagnosis, consultation or education. This working paper provides a mainly informational overview of optical digital imaging and summarizes current technologic resources and applications and some of the ethical and legal implications of the use of these new technologies in cytopathology. Computer hardware standards for optical digital imagery will continue to be driven mainly by commercial interests and nonmedical imperatives, but professional organizations can play a valuable role in developing recommendations or standards for digital image sampling, documentation, archiving, authenticity safeguards and teleconsultation protocols; in addressing patient confidentiality and ethical, legal and informed consent issues; and in providing support for quality assurance and standardization of digital image-based testing. There is some evidence that high levels of accuracy for telepathology diagnosis can be achieved using existing dynamic systems, which may also be applicable to telecytology consultation. Static systems for both telepathology and telecytology, which have the advantage of considerably lower cost, appear to have lower levels of accuracy. Laboratories that maintain digital image databases should adopt practices and protocols that ensure patient confidentiality. Individuals participating in telecommunication of digital images for diagnosis should be properly qualified, meet licensing requirements and use procedures that protect patient confidentiality. Such individuals should be cognizant of the limitations of the technology and employ quality assurance practices that ensure the validity and accuracy of each consultation. Even in an informal teleconsultation setting one should define the extent of participation and be mindful of potential malpractice liability. Digital imagery applications will continue to present new opportunities and challenges. Position papers such as this are directed toward assisting the profession to stay informed and in control of these applications in the laboratory. Telecytology is an area in particular need of studies of good quality to provide data on factors affecting accuracy. New technologic approaches to addressing the issue of selective sampling in static image consultation are needed. The use of artificial intelligence software as an adjunct to enhance the accuracy and reproducibility of cytologic diagnosis of digital images in routine and consultation settings deserves to be pursued. Other telecytology-related issues that require clarification and the adoption of workable guidelines include interstate licensure and protocols to define malpractice liability.
Ball, Samuel A.; Nich, Charla; Rounsaville, Bruce J.; Eagan, Dorothy; Carroll, Kathleen M.
2013-01-01
The concurrent and predictive validity of 2 different methods of Millon Clinical Multiaxial Inventory–III subtyping (protocol sorting, cluster analysis) was evaluated in 125 recently detoxified opioid-dependent outpatients in a 12-week randomized clinical trial. Participants received naltrexone and relapse prevention group counseling and were assigned to 1 of 3 intervention conditions: (a) no-incentive vouchers, (b) incentive vouchers alone, or (c) incentive vouchers plus relationship counseling. Affective disturbance was the most common Axis I protocol-sorted subtype (66%), antisocial–narcissistic was the most common Axis II subtype (46%), and cluster analysis suggested that a 2-cluster solution (high vs. low psychiatric severity) was optimal. Predictive validity analyses indicated less symptom improvement for the higher problem subtypes, and patient treatment matching analyses indicated that some subtypes had better outcomes in the no-incentive voucher conditions. PMID:15301655
On the Analysis of Two-Person Problem Solving Protocols.
ERIC Educational Resources Information Center
Schoenfeld, Alan H.
Methodological issues in the use of protocol analysis for research into human problem solving processes are examined through a case study in which two students were videotaped as they worked together to solve mathematical problems "out loud." The students' chosen strategic or executive behavior in examining and solving a problem was…
Corporations' Resistance to Innovation: The Adoption of the Internet Protocol Version 6
ERIC Educational Resources Information Center
Pazdrowski, Tomasz
2013-01-01
Computer networks that brought unprecedented growth in global communication have been using Internet Protocol version 4 (IPv4) as a standard for routing. The exponential increase in the use of the networks caused an acute shortage of available identification numbers (IP addresses). The shortage and other network communication issues are…
ERIC Educational Resources Information Center
Polat, Nihat; Cepik, Saban
2016-01-01
To narrow the achievement gap between English language learners (ELLs) and their native-speaking peers in K-12 settings in the United States, effective instructional models must be identified. However, identifying valid observation protocols that can measure the effectiveness of specially designed instructional practices is not an easy task. This…
The Aging of lignin rich papers upon exposure to light : its quantification and prediction
James S. Bond; Rajai H. Atalla; Agarwal Umesh P.; Chris G. Hunt
1999-01-01
A program was undertaken at the Forest Products Laboratory in conjunction with the American Society for Testing and Materials (ASTM) to develop guidelines for a credible accelerated photoaging protocol for printing and writing papers. In support of this, in-depth studies of photodegradation were undertaken in sufficient detail to establish the validity of the protocol....
Lund, Kirrin E; Maloney, Shane K; Milton, John T B; Blache, Dominique
2012-01-01
Confinement in metabolism pens may provoke a stress response in alpacas that will reduce the welfare of the animal and jeopardize the validity of scientific results obtained in such pens. In this study, we tested a protocol designed to successfully train alpacas to be held in a specially designed metabolism pen so that the animals' confinement would not jeopardize their welfare. We hypothesized that the alpacas would show fewer behaviors associated with a response to stress as training gradually progressed, and that they would adapt to being in the confinement of the metabolism pen. The training protocol was successful at introducing alpacas to the metabolism pens, and it did reduce the incidence of behavioral responses to stress as the training progressed. The success of the training protocol may be attributed to the progressive nature of the training, the tailoring of the protocol to suit alpacas, and the use of positive reinforcement. This study demonstrated that both animal welfare and the validity of the scientific outcomes could be maximized by the gradual training of experimental animals, thereby minimizing the stress imposed on the animals during experimental procedures.
Grover-Páez, Fernando; Cardona-Muñoz, Ernesto G; Cardona-Müller, David; Guzmán-Saldívar, Víctor H; Rodríguez-De la Cerda, Mariana; Jiménez-Cázarez, Mayra B; Totsuka-Sutto, Sylvia E; Alanis-Sánchez, Guillermo A; Ramos-Becerra, Carlos G
2017-12-01
The aim of this study was to determine the accuracy of the Omron HEM-7320-LA with the Intelli Wrap technology cuff HEM-FL1 for self-measurement and clinic blood pressure (BP) measurement according to the European Society of Hypertension International Protocol revision 2010. The evaluation was performed in 39 individuals. The mean age of the participants was 47.9±14 years; systolic BP was 145.2±24.3 mmHg (range: 97-190), diastolic BP was 90.9±12.9 mmHg (range: 68-120), and arm circumference was 30.8±4 cm (range: 25-38.5). The device successfully fulfilled the established criteria of the validation protocol. The device overestimated systolic BP by 0.6±5.7 mmHg and diastolic BP by 2.2±5.1 mmHg. The HEM-FL1 cuff, specially designed to cover a broad range of arm circumferences and to facilitate self-placement, fulfilled the requirements of the International Protocol.
Distributed synchronization of networked drive-response systems: A nonlinear fixed-time protocol.
Zhao, Wen; Liu, Gang; Ma, Xi; He, Bing; Dong, Yunfeng
2017-11-01
The distributed synchronization of networked drive-response systems is investigated in this paper. A novel nonlinear protocol is proposed to ensure that the tracking errors converge to zero in a fixed time. Compared with previous synchronization methods, the present method considers more practical conditions, and the synchronization time does not depend on arbitrary initial conditions but can be pre-assigned offline according to the task requirements. Finally, the feasibility and validity of the presented protocol are illustrated by a numerical simulation. Copyright © 2017. Published by Elsevier Ltd.
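A common way to obtain fixed-time (as opposed to merely finite-time) convergence is to combine a sub-linear and a super-linear feedback term, so that the settling time admits a bound independent of the initial error. The scalar simulation below is a generic sketch of that idea with assumed gains and exponents, not the paper's exact protocol.

# Scalar illustration of fixed-time error dynamics
#   e_dot = -k1*sign(e)*|e|**a - k2*sign(e)*|e|**b, with 0 < a < 1 < b,
# whose settling time is bounded regardless of the initial error.
# Gains, exponents and tolerances are assumed values.
import math

def settling_time(e0, k1=2.0, k2=2.0, a=0.5, b=1.5, dt=1e-4, tol=1e-6, t_max=5.0):
    e, t = e0, 0.0
    while abs(e) > tol and t < t_max:
        de = -k1 * math.copysign(abs(e) ** a, e) - k2 * math.copysign(abs(e) ** b, e)
        e += de * dt
        t += dt
    return t

for e0 in (0.1, 10.0, 1000.0):
    print(f"initial error {e0:>7}: converged in about {settling_time(e0):.2f} s")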
IPV6 Mobile Network Protocol Weaknesses and a Cryptosystem Approach
NASA Astrophysics Data System (ADS)
Balitanas, Maricel; Kim, Tai-Hoon
This paper reviews some of the improvements associated with the new Internet protocol version 6, with an emphasis on its security-related functionality, particularly authentication, and concludes with a hybrid cryptosystem addressing the authentication issue. The new generation of the Internet protocol is intended to solve the depletion of the IP address space, but its deployment is a process that may take several years to complete. This review is therefore offered as a step toward an effective solution and an efficient implementation.
Mobile Virtual Private Networking
NASA Astrophysics Data System (ADS)
Pulkkis, Göran; Grahn, Kaj; Mårtens, Mathias; Mattsson, Jonny
Mobile Virtual Private Networking (VPN) solutions based on the Internet Security Protocol (IPSec), Transport Layer Security/Secure Socket Layer (SSL/TLS), Secure Shell (SSH), 3G/GPRS cellular networks, Mobile IP, and the presently experimental Host Identity Protocol (HIP) are described, compared and evaluated. Mobile VPN solutions based on HIP are recommended for future networking because of superior processing efficiency and network capacity demand features. Mobile VPN implementation issues associated with the IP protocol versions IPv4 and IPv6 are also evaluated. Mobile VPN implementation experiences are presented and discussed.
Design and Evaluation of Complex Moving HIFU Treatment Protocols
NASA Astrophysics Data System (ADS)
Kargl, Steven G.; Andrew, Marilee A.; Kaczkowski, Peter J.; Brayman, Andrew A.; Crum, Lawrence A.
2005-03-01
The use of moving high-intensity focused ultrasound (HIFU) treatment protocols is of interest in achieving efficient formation of large-volume thermal lesions in tissue. Judicious protocol design is critical in order to avoid collateral damage to healthy tissues outside the treatment zone. A KZK-BHTE model, extended to simulate multiple, moving scans in tissue, is used to investigate protocol design considerations. Predictions and experimental observations are presented which 1) validate the model, 2) illustrate how to assess the effects of acoustic nonlinearity, and 3) demonstrate how to assess and control collateral damage such as prefocal lesion formation and lesion formation resulting from thermal conduction without direct HIFU exposure. Experimental data consist of linear and circular scan protocols delivered over a range of exposure regimes in ex vivo bovine liver.
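Thermal lesion formation in such simulations is often judged with a cumulative-equivalent-minutes thermal dose (CEM43); the sketch below shows that standard accumulation rule over a sampled temperature history. The temperature trace and the often-quoted 240 CEM43 lesioning threshold are illustrative assumptions, not necessarily the damage criterion used by the authors.

# Cumulative equivalent minutes at 43 degC (Sapareto-Dewey thermal dose) for a
# sampled temperature-time history; ~240 CEM43 is a commonly quoted lesioning
# threshold. The one-second-sampled temperature trace below is invented.
def cem43(temps_c, dt_minutes):
    dose = 0.0
    for t in temps_c:
        r = 0.5 if t >= 43.0 else 0.25   # each degree above 43 degC doubles dose accrual
        dose += dt_minutes * r ** (43.0 - t)
    return dose

temps = [37 + 20 * min(i / 5.0, 1.0) for i in range(12)]  # ramp to 57 degC, then hold
dose = cem43(temps, dt_minutes=1 / 60)
print(f"thermal dose = {dose:.0f} CEM43 ({'lesion likely' if dose >= 240 else 'sub-lesional'})")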
Martinez-Solorio, Dionicio; Melillo, Bruno; Sanchez, Luis; Liang, Yong; Lam, Erwin; Houk, K. N.; Smith, Amos B.
2016-01-01
A reusable silicon-based transfer agent (1) has been designed, synthesized, and validated for effective room-temperature palladium-catalyzed cross-coupling reactions (CCRs) of aryl and heteroaryl chlorides with readily accessible aryl lithium reagents. The crystalline, bench-stable siloxane transfer agent (1) is easily prepared via a one-step protocol. Importantly, this “green” CCR protocol circumvents prefunctionalization, isolation of organometallic cross-coupling partners, and/or stoichiometric waste aside from LiCl. DFT calculations support a σ-bond metathesis mechanism during transmetalation and lead to insights on the importance of the CF3 groups. PMID:26835838
Furrer, F; Franz, T; Berta, M; Leverrier, A; Scholz, V B; Tomamichel, M; Werner, R F
2012-09-07
We provide a security analysis for continuous variable quantum key distribution protocols based on the transmission of two-mode squeezed vacuum states measured via homodyne detection. We employ a version of the entropic uncertainty relation for smooth entropies to give a lower bound on the number of secret bits which can be extracted from a finite number of runs of the protocol. This bound is valid under general coherent attacks, and gives rise to keys which are composably secure. For comparison, we also give a lower bound valid under the assumption of collective attacks. For both scenarios, we find positive key rates using experimental parameters reachable today.
Lüders, Stephan; Krüger, Ralf; Zemmrich, Claudia; Forstner, Klaus; Sturm, Claus-Dieter; Bramlage, Peter
2012-12-01
The present study aimed to validate the automated upper arm blood pressure (BP) measuring device BM 44 for home BP monitoring according to the 2002 Protocol of the European Society of Hypertension. The most important feature of the new device was an integrated 'WHO indicator', which categorizes the patient's individual result against the WHO recommendations for target BP on a coloured scale. Systolic and diastolic BPs were measured sequentially in 35 adult participants (16 men, 19 women) using a standard mercury Y-tubed reference sphygmomanometer (two observers) and the BM 44 device (one supervisor). Ninety-nine pairs of comparisons were obtained from 15 participants in phase 1 and a further 18 participants in phase 2 of the validation study. The BM 44 device passed phase 1 of the validation study, with the number of absolute differences between device and observers within 5, 10 and 15 mmHg reaching 28, 35 and 40 measurements against required minima of 25, 35 and 40, respectively. The device also achieved the targets for phases 2.1 and 2.2, with 23 and 26 participants having at least two of three device-observer differences within 5 mmHg for systolic and diastolic BP, respectively. The Beurer BM 44 upper arm BP monitor has passed the International Protocol requirements and can hence be recommended for home use in adults. © 2012 Wolters Kluwer Health | Lippincott Williams & Wilkins.
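For readers unfamiliar with how the protocol's accuracy zones are tallied, the short sketch below (with made-up comparison values) counts device-observer differences falling within 5, 10 and 15 mmHg, which is the bookkeeping behind the phase results reported above.

```python
# Illustrative sketch (made-up readings): tallying absolute device-observer
# differences into the 5/10/15 mmHg zones used by ESH-style validation protocols.
import numpy as np

observer_sbp = np.array([124, 131, 118, 142, 150, 136, 128, 122, 139, 145])
device_sbp   = np.array([126, 128, 121, 149, 147, 139, 131, 120, 136, 151])

diff = device_sbp - observer_sbp
for zone in (5, 10, 15):
    print(f"within {zone} mmHg: {np.sum(np.abs(diff) <= zone)} of {diff.size}")
print(f"mean difference: {diff.mean():+.1f} mmHg, SD: {diff.std(ddof=1):.1f} mmHg")
```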
Enhanced Security and Pairing-free Handover Authentication Scheme for Mobile Wireless Networks
NASA Astrophysics Data System (ADS)
Chen, Rui; Shu, Guangqiang; Chen, Peng; Zhang, Lijun
2017-10-01
With the wide deployment of mobile wireless networks, we aim to propose a secure and seamless handover authentication scheme that allows users to roam freely in wireless networks without worrying about security and privacy issues. Given the open characteristic of wireless networks, safety and efficiency should be considered seriously. Several previous protocols are designed around a bilinear pairing mapping, which is time-consuming, inefficient, and unsuitable for practical situations. To address these issues, we designed a new pairing-free handover authentication scheme for mobile wireless networks. This scheme is an effective improvement of the protocol by Xu et al., which suffers from a mobile node impersonation attack. Security analysis and simulation experiments indicate that the proposed protocol has many excellent security properties when compared with other recent similar handover schemes, such as mutual authentication and resistance to known network threats, as well as requiring lower computation and communication costs.
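As a hedged illustration of what "pairing-free" means in practice, the sketch below uses ordinary ECDSA signatures over the P-256 curve for a one-shot challenge-response between a mobile node and a target access point. It is not the scheme of Chen et al., only the class of lightweight primitive such schemes rely on instead of bilinear pairings.

```python
# Minimal sketch (not the authors' scheme): a pairing-free challenge-response
# using ECDSA on P-256, the kind of primitive that pairing-free handover
# authentication schemes substitute for expensive bilinear pairings.
import os
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

mobile_key = ec.generate_private_key(ec.SECP256R1())   # mobile node's key pair
mobile_pub = mobile_key.public_key()                    # assumed known to the network

challenge = os.urandom(32)                              # sent by the target access point
response = mobile_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

try:
    mobile_pub.verify(response, challenge, ec.ECDSA(hashes.SHA256()))
    print("handover authentication succeeded")
except InvalidSignature:
    print("handover authentication failed")
```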
The Use of In-Situ Simulation to Improve Safety in the Plastic Surgery Office: A Feasibility Study
Shapiro, Fred E.; Pawlowski, John B.; Rosenberg, Noah M.; Liu, Xiaoxia; Feinstein, David M.; Urman, Richard D.
2014-01-01
Objective: Simulation-based interventions and education can potentially contribute to safer and more effective systems of care. We utilized in-situ simulation to highlight safety issues, regulatory requirements, and assess perceptions of safety processes by the plastic surgery office staff. Methods: A high-fidelity human patient simulator was brought to an office-based plastic surgery setting to enact a half-day full-scale, multidisciplinary medical emergency. Facilitated group debriefings were conducted after each scenario with special consideration of the principles of team training, communication, crisis management, and adherence to evidence-based protocols and regulatory standards. Abbreviated AHRQ Medical Office Safety Culture Survey was completed by the participants before and after the session. Results: The in-situ simulations had a high degree of acceptance and face validity according to the participants. Areas highlighted by the simulation sessions included rapid communication, delegation of tasks, location of emergency materials, scope of practice, and logistics of transport. The participant survey indicated greater awareness of patient safety issues following participation in simulation and debriefing exercises in 3 areas (P < 0.05): the need to change processes if there is a recognized patient safety issue (100% vs 75%), openness to ideas about improving office processes (100% vs 88%), and the need to discuss ways to prevent errors from recurring (88% vs 62%). Conclusions: Issues of safety and regulatory compliance can be assessed in an office-based setting through the short-term (half-day) use of in-situ simulation with facilitated debriefing and the review of audiovisual recordings by trained facilities inspectors. PMID:24501616
NASA Astrophysics Data System (ADS)
Huang, C. Y.; Wu, C. H.
2016-06-01
The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily life. However, IoT devices created by different manufacturers follow different proprietary protocols and cannot communicate with each other. This heterogeneity issue causes different products to be locked in multiple closed ecosystems that we call IoT silos. In order to address this issue, a common industrial solution is the hub approach, which implements connectors to communicate with IoT devices following different protocols. However, with the growing number of proprietary protocols proposed by device manufacturers, IoT hubs need to support and maintain a large number of customized connectors. Hence, we believe the ultimate solution to the heterogeneity issue is to follow open and interoperable standards. Among the existing IoT standards, the Open Geospatial Consortium (OGC) SensorThings API standard supports a comprehensive conceptual model and query functionalities. The first version of SensorThings API mainly focuses on connecting to IoT devices and sharing sensor observations online, which is the sensing capability. Besides the sensing capability, IoT devices can also be controlled via the Internet, which is the tasking capability. Since the tasking capability was not included in the first version of the SensorThings API standard, this research aims at defining the tasking capability profile and integrating it with the SensorThings API standard, which we call the extended-SensorThings API in this paper. In general, this research proposes a lightweight JSON-based web service description, the "Tasking Capability Description", allowing device owners and manufacturers to describe different IoT device protocols. Through the extended-SensorThings API, users and applications can follow a coherent protocol to control IoT devices that use different communication protocols, which could consequently achieve an interoperable Internet of Things infrastructure.
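The Tasking Capability Description itself is defined in the paper; the sketch below only illustrates the general pattern of issuing a tasking request to a SensorThings-style REST service with Python's requests library. The service URL, entity name and JSON fields are assumptions made for illustration, not the paper's schema.

```python
# Hypothetical sketch: posting a tasking request to a SensorThings-style
# service. The URL, entity name and JSON fields are assumed placeholders;
# the actual description format is defined by the paper and the OGC standard.
import requests

SERVICE = "http://example.org/SensorThingsService/v1.0"   # placeholder endpoint

task = {
    "TaskingCapability": {"@iot.id": 1},    # assumed: capability advertised by the device
    "taskingParameters": {"switch": "on"},  # assumed: parameters the device understands
}

resp = requests.post(f"{SERVICE}/Tasks", json=task, timeout=10)
resp.raise_for_status()   # illustrative only; the placeholder service does not exist
print("task accepted:", resp.status_code)
```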
Lund, Travis J.; Pilarz, Matthew; Velasco, Jonathan B.; Chakraverty, Devasmita; Rosploch, Kaitlyn; Undersander, Molly; Stains, Marilyne
2015-01-01
Researchers, university administrators, and faculty members are increasingly interested in measuring and describing instructional practices provided in science, technology, engineering, and mathematics (STEM) courses at the college level. Specifically, there is keen interest in comparing instructional practices between courses, monitoring changes over time, and mapping observed practices to research-based teaching. While increasingly common observation protocols (Reformed Teaching Observation Protocol [RTOP] and Classroom Observation Protocol in Undergraduate STEM [COPUS]) at the postsecondary level help achieve some of these goals, they also suffer from weaknesses that limit their applicability. In this study, we leverage the strengths of these protocols to provide an easy method that enables the reliable and valid characterization of instructional practices. This method was developed empirically via a cluster analysis using observations of 269 individual class periods, corresponding to 73 different faculty members, 28 different research-intensive institutions, and various STEM disciplines. Ten clusters, called COPUS profiles, emerged from this analysis; they represent the most common types of instructional practices enacted in the classrooms observed for this study. RTOP scores were used to validate the alignment of the 10 COPUS profiles with reformed teaching. Herein, we present a detailed description of the cluster analysis method, the COPUS profiles, and the distribution of the COPUS profiles across various STEM courses at research-intensive universities. PMID:25976654
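As a rough sketch of the kind of cluster analysis described above (not the authors' exact procedure, which used the full COPUS code set and yielded ten profiles), the code below groups class periods by the fraction of intervals spent on a few COPUS-style codes using k-means from scikit-learn; the code names and synthetic data are assumptions for illustration.

```python
# Illustrative sketch with synthetic data: clustering class periods by the
# fraction of intervals spent on a few COPUS-style behaviour codes.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# rows = class periods, columns = fraction of intervals coded as
# [Lecturing, GroupWork, ClickerQuestion] (made-up codes and values)
lecture_heavy = rng.dirichlet([8, 1, 1], size=40)
active_heavy  = rng.dirichlet([2, 6, 2], size=40)
X = np.vstack([lecture_heavy, active_heavy])

profiles = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("cluster centres (code fractions):")
print(profiles.cluster_centers_.round(2))
```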
NASA Technical Reports Server (NTRS)
James, Jeffrey M.; Sanderson, Penelope M.; Seidler, Karen S.
1990-01-01
As modern transport environments become increasingly complex, issues such as crew communication, interaction with automation, and workload management have become crucial. Much research is being focused on holistic aspects of social and cognitive behavior, such as the strategies used to handle workload, the flow of information, the scheduling of tasks, and the verbal and non-verbal interactions between crew members. Traditional laboratory performance measures no longer sufficiently meet the needs of researchers addressing these issues. However, observational techniques are better equipped to capture the type of data needed and to build models of the requisite level of sophistication. Presented here is SHAPA, an interactive software tool for performing both verbal and non-verbal protocol analysis. It has been developed with the idea of affording researchers the closest possible degree of engagement with protocol data. The researcher can configure SHAPA to encode protocols using any theoretical framework or encoding vocabulary that is desired. SHAPA allows protocol analysis to be performed at any level of analysis, and it supplies a wide variety of tools for data aggregation and manipulation. The output generated by SHAPA can be used alone or in combination with other performance variables to get a rich picture of the influences on sequences of verbal or nonverbal behavior.
A review of polymer electrolyte membrane fuel cell durability test protocols
NASA Astrophysics Data System (ADS)
Yuan, Xiao-Zi; Li, Hui; Zhang, Shengsheng; Martin, Jonathan; Wang, Haijiang
Durability is one of the major barriers to polymer electrolyte membrane fuel cells (PEMFCs) being accepted as a commercially viable product. It is therefore important to understand their degradation phenomena and analyze degradation mechanisms from the component level to the cell and stack level so that novel component materials can be developed and novel designs for cells/stacks can be achieved to mitigate insufficient fuel cell durability. It is generally impractical and costly to operate a fuel cell under its normal conditions for several thousand hours, so accelerated test methods are preferred to facilitate rapid learning about key durability issues. Based on the US Department of Energy (DOE) and US Fuel Cell Council (USFCC) accelerated test protocols, as well as degradation tests performed by researchers and published in the literature, we review degradation test protocols at both component and cell/stack levels (driving cycles), aiming to gather the available information on accelerated test methods and degradation test protocols for PEMFCs, and thereby provide practitioners with a useful toolbox to study durability issues. These protocols help prevent the prolonged test periods and high costs associated with real lifetime tests, assess the performance and durability of PEMFC components, and ensure that the generated data can be compared.
The PRECIS-2 tool has good interrater reliability and modest discriminant validity.
Loudon, Kirsty; Zwarenstein, Merrick; Sullivan, Frank M; Donnan, Peter T; Gágyor, Ildikó; Hobbelen, Hans J S M; Althabe, Fernando; Krishnan, Jerry A; Treweek, Shaun
2017-08-01
PRagmatic Explanatory Continuum Indicator Summary (PRECIS)-2 is a tool that could improve design insight for trialists. Our aim was to validate the PRECIS-2 tool by testing its discriminant validity and interrater reliability, something not done for its predecessor. Over 80 international trialists, methodologists, clinicians, and policymakers created PRECIS-2, helping to ensure face validity and content validity. The interrater reliability of PRECIS-2 was measured using 19 experienced trialists who used PRECIS-2 to score a diverse sample of 15 randomized controlled trial protocols. Discriminant validity was tested with two raters who independently determined whether the trial protocols were more pragmatic or more explanatory, with scores from the 19 raters for the 15 trials as predictors of pragmatism. Interrater reliability was generally good, with seven of nine domains having an intraclass correlation coefficient over 0.65. Flexibility (adherence) and recruitment had wide confidence intervals, but raters found these difficult to rate and wanted more information. Each of the nine PRECIS-2 domains could be used to differentiate between trials taking more pragmatic or more explanatory approaches, with better than chance discrimination for all domains. We have assessed the validity and reliability of PRECIS-2. An elaboration study and web site provide guidance to help future users of the tool, which is continuing to be tested by trial teams, systematic reviewers, and funders. Copyright © 2017 Elsevier Inc. All rights reserved.
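For readers who want to reproduce this style of interrater analysis, a minimal sketch follows, assuming the pandas and pingouin packages and toy data; it computes intraclass correlation coefficients for one PRECIS-2-style domain scored by several raters across several protocols.

```python
# Toy sketch (made-up scores): intraclass correlation for one domain scored
# by three raters across four trial protocols, using the pingouin package.
import pandas as pd
import pingouin as pg

scores = pd.DataFrame({
    "protocol": [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "rater":    ["A", "B", "C"] * 4,
    "score":    [4, 5, 4, 2, 2, 3, 5, 5, 5, 3, 4, 3],   # 1-5 PRECIS-2 style scale
})

icc = pg.intraclass_corr(data=scores, targets="protocol",
                         raters="rater", ratings="score")
print(icc[["Type", "ICC", "CI95%"]])
```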
Chaimani, Anna; Caldwell, Deborah M; Li, Tianjing; Higgins, Julian P T; Salanti, Georgia
2017-03-01
The number of systematic reviews that aim to compare multiple interventions using network meta-analysis is increasing. In this study, we highlight aspects of a standard systematic review protocol that may need modification when multiple interventions are to be compared. We take the protocol format suggested by Cochrane for a standard systematic review as our reference and compare the considerations for a pairwise review with those required for a valid comparison of multiple interventions. We suggest new sections for protocols of systematic reviews including network meta-analyses with a focus on how to evaluate their assumptions. We provide example text from published protocols to exemplify the considerations. Standard systematic review protocols for pairwise meta-analyses need extensions to accommodate the increased complexity of network meta-analysis. Our suggested modifications are widely applicable to both Cochrane and non-Cochrane systematic reviews involving network meta-analyses. Copyright © 2017 Elsevier Inc. All rights reserved.
Smith, Michelle K.; Jones, Francis H. M.; Gilbert, Sarah L.; Wieman, Carl E.
2013-01-01
Instructors and the teaching practices they employ play a critical role in improving student learning in college science, technology, engineering, and mathematics (STEM) courses. Consequently, there is increasing interest in collecting information on the range and frequency of teaching practices at department-wide and institution-wide scales. To help facilitate this process, we present a new classroom observation protocol known as the Classroom Observation Protocol for Undergraduate STEM or COPUS. This protocol allows STEM faculty, after a short 1.5-hour training period, to reliably characterize how faculty and students are spending their time in the classroom. We present the protocol, discuss how it differs from existing classroom observation protocols, and describe the process by which it was developed and validated. We also discuss how the observation data can be used to guide individual and institutional change. PMID:24297289
Maturo, Donna; Powell, Alexis; Major-Wilson, Hannah; Sanchez, Kenia; De Santis, Joseph P; Friedman, Lawrence B
2015-01-01
Advances in care and treatment of adolescents/young adults with HIV infection have made survival into adulthood possible, requiring transition to adult care. Researchers have documented that the transition process is challenging for adolescents/young adults. To ensure successful transition, a formal transition protocol is needed. Despite existing research, little quantitative evaluation of the transition process has been conducted. The purpose of the study was to pilot test the "Movin' Out" Transitioning Protocol, a formalized protocol developed to assist transition to adult care. A retrospective medical/nursing record review was conducted with 38 clients enrolled in the "Movin' Out" Transitioning Protocol at a university-based adolescent medicine clinic providing care to adolescents/young adults with HIV infection. Almost half of the participants were able to successfully transition to adult care. Reasons for failure to transition included relocation, attrition, lost to follow-up, and transfer to another adult service. Failure to transition to adult care was not related to adherence issues, χ2(1, N=38)=2.49, p=.288; substance use, χ2(1, N=38)=1.71, p=.474; mental health issues, χ2(1, N=38)=2.23, p=.322; or pregnancy/childrearing, χ2(1, N=38)=0.00, p=.627. Despite the small sample size, the "Movin' Out" Transitioning Protocol appears to be useful in guiding the transition process of adolescents/young adults with HIV infection to adult care. More research is needed with a larger sample to fully evaluate the "Movin' Out" Transitioning Protocol. Copyright © 2015 Elsevier Inc. All rights reserved.
Pereira, Telmo; Maldonado, João
2005-11-01
To evaluate the performance of the Colson MAM BP 3AA1-2 oscillometric automatic blood pressure monitor according to the validation protocol of the European Society of Hypertension, testing its suitability for self-measurement of blood pressure. The performance of the device was assessed in relation to various clinical variables, including age, gender, body mass index, arm circumference and arterial stiffness. 33 subjects (15 men and 18 women), with a mean age of 47 +/- 10 years, were studied according to the procedures laid down in the European Society of Hypertension validation protocol. Sequential same-arm blood pressure measurements were made, alternating between a mercury standard and the automatic device. The differences among the test-control measurements were assessed and divided into categorization zones of 5, 10 and 15 mmHg discrepancy. Aortic pulse wave velocity was assessed in all subjects with a Complior device (Colson, Paris). The Colson MAM BP 3AA1-2 passed all three phases of the protocol for both systolic and diastolic blood pressure. The mean differences between the test and control measurements were -1.0 +/- 5.0 mmHg for systolic blood pressure and -1.1 +/- 4.1 mmHg for diastolic blood pressure. Both standard deviations are well below the 8 mmHg limit proposed by the Association for the Advancement of Medical Instrumentation. The predictive value of various clinical variables for the discrepancies was assessed by a regression model analysis, with no variable being found that independently undermined the performance of the monitor. In another regression analysis, we found a similar relation between test and control blood pressures and aortic pulse wave velocity, a widely recognized and validated index of target organ damage. These data show that the Colson MAM BP 3AA1-2 satisfies the quality requirements proposed by the European Society of Hypertension, demonstrating its suitability for inclusion in integrated programs of clinical surveillance based on self-measurement of blood pressure. The uniformity of its performance over a wide spectrum of clinical characteristics and the relation found with pulse wave velocity further reinforce its clinical validity.
Editing wild points in isolation - Fast agreement for reliable systems (Preliminary version)
NASA Technical Reports Server (NTRS)
Kearns, Phil; Evans, Carol
1989-01-01
Consideration is given to the intuitively appealing notion of discarding sensor values which are strongly suspected of being erroneous in a modified approximate agreement protocol. Approximate agreement with editing imposes a time bound upon the convergence of the protocol - no such bound was possible for the original approximate agreement protocol. This new approach is potentially useful in the construction of asynchronous fault tolerant systems. The main result is that a wild-point replacement technique called t-worst editing can be shown to guarantee convergence of the approximate agreement protocol to a valid agreement value. Results are presented for a four-processor synchronous system in which a single processor may be faulty.
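A minimal sketch of the general idea (a plain trimmed-mean editing round, not necessarily the exact t-worst editing rule of the paper): each processor discards the t smallest and t largest received values as suspected wild points before averaging, which is the standard fault-tolerant averaging step in approximate agreement protocols.

```python
# Sketch of one round of approximate agreement with wild-point editing:
# discard the t most extreme values at each end, then average the rest.
# With at most t faulty processors, the edited average stays within the
# range of the correct processors' readings.
def edited_average(values, t):
    s = sorted(values)
    trimmed = s[t:len(s) - t]          # drop the t lowest and t highest readings
    return sum(trimmed) / len(trimmed)

# Four-processor example with one suspected wild point, echoing the paper's setup.
readings = [10.1, 10.3, 9.9, 250.0]    # 250.0 is the wild point
print(edited_average(readings, t=1))   # -> 10.2
```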
Leonardi, Matilde; Chatterji, Somnath; Koskinen, Seppo; Ayuso-Mateos, Jose Luis; Haro, Josep Maria; Frisoni, Giovanni; Frattura, Lucilla; Martinuzzi, Andrea; Tobiasz-Adamczyk, Beata; Gmurek, Michal; Serrano, Ramon; Finocchiaro, Carla
2014-01-01
COURAGE in Europe was a 3-year project involving 12 partners from four European countries and the World Health Organization. It was inspired by the pressing need to integrate international studies on disability and ageing in light of an innovative perspective based on a validated data-collection protocol. The COURAGE in Europe Project collected data on the determinants of health and disability in an ageing population, with specific tools for the evaluation of the role of the built environment and social networks on health, disability, quality of life and well-being. The main survey was conducted by partners in Finland, Poland and Spain, where it was administered to a sample of 10,800 persons and completed in March 2012. The newly developed and validated COURAGE Protocol for Ageing Studies has proven to be a valid tool for collecting comparable data in an ageing population, and the COURAGE in Europe Project has created valid and reliable scientific evidence, demonstrating cross-country comparability, for disability and ageing research and policy development. It is therefore recommended that future studies exploring determinants of health and disability in ageing use the COURAGE-derived methodology. Copyright © 2013 John Wiley & Sons, Ltd.
Raichle, Christina J; Eckstein, Jens; Lapaire, Olav; Leonardi, Licia; Brasier, Noé; Vischer, Annina S; Burkard, Thilo
2018-06-01
Hypertensive disorders are one of the leading causes of maternal death worldwide. Several smartphone apps claim to measure blood pressure (BP) using photoplethysmographic signals recorded by smartphone cameras. However, no single app has been validated for this use to date. We aimed to validate a new, promising smartphone algorithm. In this subgroup analysis of the iPARR trial (iPhone App Compared With Standard RR Measurement), we tested the Preventicus BP smartphone algorithm on 32 pregnant women. The trial was conducted based on the European Society of Hypertension International Protocol revision 2010 for validation of BP measuring devices in adults. Each individual received 7 sequential BP measurements starting with the reference device (Omron-HBP-1300) and followed by the smartphone measurement, resulting in 96 BP comparisons. Validation requirements of the European Society of Hypertension International Protocol revision 2010 were not fulfilled. Mean (±SD) systolic BP disagreement between the test and reference devices was 5.0 (±14.5) mm Hg. The number of absolute differences between test and reference device within 5, 10, and 15 mm Hg was 31, 53, and 64 of 96, respectively. A Bland-Altman plot showed an overestimation of smartphone-determined systolic BP in comparison with reference systolic BP in low range but an underestimation in medium-range BP. The Preventicus BP smartphone algorithm failed the accuracy criteria for estimating BP in pregnant women and was thus not commercialized. Pregnant women should be discouraged from using BP smartphone apps, unless there are algorithms specifically validated according to common protocols. URL: https://www.clinicaltrials.gov. Unique identifier: NCT02552030. © 2018 American Heart Association, Inc.
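The Bland-Altman comparison mentioned above can be reproduced in a few lines; the sketch below (with made-up paired readings) computes the bias and limits of agreement that such plots display.

```python
# Illustrative Bland-Altman summary (made-up paired readings): bias and
# 95% limits of agreement between a test device and a reference device.
import numpy as np

reference = np.array([108, 115, 124, 132, 141, 150, 118, 127, 136, 145])
test_dev  = np.array([114, 118, 122, 128, 138, 143, 123, 126, 131, 139])

diff = test_dev - reference
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"bias: {bias:+.1f} mmHg, limits of agreement: "
      f"{bias - loa:+.1f} to {bias + loa:+.1f} mmHg")
```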
A proportional integral estimator-based clock synchronization protocol for wireless sensor networks.
Yang, Wenlun; Fu, Minyue
2017-11-01
Clock synchronization is an issue of vital importance in applications of wireless sensor networks (WSNs). This paper proposes a proportional integral estimator-based protocol (EBP) to achieve clock synchronization for wireless sensor networks. As each local clock skew gradually drifts, synchronization accuracy will decline over time. Compared with existing consensus-based approaches, the proposed synchronization protocol improves synchronization accuracy under time-varying clock skews. Moreover, by restricting the synchronization error of the clock skew to a relatively small quantity, it can reduce the frequency of periodic re-synchronization. Finally, a pseudo-synchronous implementation for skew compensation is introduced, since a fully synchronous protocol is unrealistic in practice. Numerical simulations illustrate the performance of the proposed protocol. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
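As a hedged illustration of the proportional-integral idea (a generic controller on a toy clock, not the authors' EBP protocol), the sketch below steers a drifting local clock toward a reference clock with a PI update on the measured offset, which damps both a constant skew and slow drift.

```python
# Illustrative sketch (not the EBP protocol): a proportional-integral update
# that steers a drifting local clock toward a reference clock.
KP, KI = 0.5, 0.1            # PI gains, chosen only for illustration
true_skew = 1.0005           # local clock runs 0.05% fast per tick
local, reference = 0.0, 0.0
rate_correction, integral = 0.0, 0.0

for step in range(200):
    reference += 1.0
    local += true_skew + rate_correction        # local tick with current correction
    offset = reference - local                  # measured synchronization error
    integral += offset
    rate_correction = KP * offset + KI * integral   # PI update of the clock rate

print(f"residual offset after 200 steps: {reference - local:+.6f}")
```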
A covert authentication and security solution for GMOs.
Mueller, Siguna; Jafari, Farhad; Roth, Don
2016-09-21
Proliferation and expansion of security risks necessitate new measures to ensure authenticity and validation of GMOs. Watermarking and other cryptographic methods are available which conceal and recover the original signature, but in the process reveal the authentication information. In many scenarios watermarking and standard cryptographic methods are necessary but not sufficient, and new, more advanced cryptographic protocols are needed. Herein, we present a new cryptographic protocol that is applicable in broader settings: it embeds the authentication string indistinguishably from a random element in the signature space, and the string is verified or denied without disclosing the actual signature. Results show that in a nucleotide string of 1000, the algorithm gives a correlation of 0.98 or higher between the codon distribution of the string and that of E. coli, making the signature virtually invisible. This algorithm may be used to securely authenticate and validate GMOs without disclosing the actual signature. While this protocol uses watermarking, its novelty is in the use of more complex cryptographic techniques based on zero-knowledge proofs to encode information.
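The correlation figure quoted above is the kind of check the sketch below performs: given two codon-usage frequency tables (toy values covering only a few codons, not the study's data), it computes the Pearson correlation between the engineered sequence's codon distribution and the host reference distribution.

```python
# Toy sketch: Pearson correlation between the codon-usage frequencies of a
# signature-carrying sequence and a host reference table. Only a handful of
# codons with made-up frequencies are shown for brevity.
import numpy as np

codons = ["GCT", "GCC", "GCA", "GCG", "CTG", "CTC"]
host_freq     = np.array([0.18, 0.26, 0.21, 0.35, 0.50, 0.10])  # reference table (toy)
sequence_freq = np.array([0.17, 0.27, 0.22, 0.34, 0.48, 0.12])  # observed (toy)

r = np.corrcoef(host_freq, sequence_freq)[0, 1]
print(f"codon-usage correlation: {r:.3f}")   # values near 1 mean the signature blends in
```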
Issues of diagnostic review in brain tumor studies: from the Brain Tumor Epidemiology Consortium.
Davis, Faith G; Malmer, Beatrice S; Aldape, Ken; Barnholtz-Sloan, Jill S; Bondy, Melissa L; Brännström, Thomas; Bruner, Janet M; Burger, Peter C; Collins, V Peter; Inskip, Peter D; Kruchko, Carol; McCarthy, Bridget J; McLendon, Roger E; Sadetzki, Siegal; Tihan, Tarik; Wrensch, Margaret R; Buffler, Patricia A
2008-03-01
Epidemiologists routinely conduct centralized single pathology reviews to minimize interobserver diagnostic variability, but this practice does not facilitate the combination of studies across geographic regions and institutions where diagnostic practices differ. A meeting of neuropathologists and epidemiologists focused on brain tumor classification issues in the context of protocol needs for consortial studies (http://epi.grants.cancer.gov/btec/). It resulted in recommendations relevant to brain tumors and possibly other rare disease studies. Two categories of brain tumors have enough general agreement over time, across regions, and between individual pathologists that one can consider using existing diagnostic data without further review: glioblastomas and meningiomas (as long as uniform guidelines such as those provided by the WHO are used). Prospective studies of these tumors benefit from collection of pathology reports, at a minimum recording the pathology department and classification system used in the diagnosis. Other brain tumors, such as oligodendroglioma, are less distinct and require careful histopathologic review for consistent classification across study centers. Epidemiologic study protocols must consider the study specific aims, diagnostic changes that have taken place over time, and other issues unique to the type(s) of tumor being studied. As diagnostic changes are being made rapidly, there are no readily available answers on disease classification issues. It is essential that epidemiologists and neuropathologists collaborate to develop appropriate study designs and protocols for specific hypothesis and populations.
Mars Sample Handling Protocol Workshop Series: Workshop 4
NASA Technical Reports Server (NTRS)
Race Margaret S. (Editor); DeVincenzi, Donald L. (Editor); Rummel, John D. (Editor); Acevedo, Sara E. (Editor)
2001-01-01
In preparation for missions to Mars that will involve the return of samples to Earth, it will be necessary to prepare for the receiving, handling, testing, distributing, and archiving of martian materials here on Earth. Previous groups and committees have studied selected aspects of sample return activities, but specific detailed protocols for the handling and testing of returned samples must still be developed. To further refine the requirements for sample hazard testing and to develop the criteria for subsequent release of sample materials from quarantine, the NASA Planetary Protection Officer convened a series of workshops in 2000-2001. The overall objective of the Workshop Series was to produce a Draft Protocol by which returned martian sample materials can be assessed for biological hazards and examined for evidence of life (extant or extinct) while safeguarding the purity of the samples from possible terrestrial contamination. This report also provides a record of the proceedings of Workshop 4, the final Workshop of the Series, which was held in Arlington, Virginia, June 5-7, 2001. During Workshop 4, the sub-groups were provided with a draft of the protocol compiled in May 2001 from the work done at prior Workshops in the Series. Then eight sub-groups were formed to discuss the following assigned topics: (1) Review and Assess the Draft Protocol for Physical/Chemical Testing; (2) Review and Assess the Draft Protocol for Life Detection Testing; (3) Review and Assess the Draft Protocol for Biohazard Testing; (4) Environmental and Health/Monitoring and Safety Issues; (5) Requirements of the Draft Protocol for Facilities and Equipment; (6) Contingency Planning for Different Outcomes of the Draft Protocol; (7) Personnel Management Considerations in Implementation of the Draft Protocol; and (8) Draft Protocol Implementation Process and Update Concepts. This report provides the first complete presentation of the Draft Protocol for Mars Sample Handling to meet planetary protection needs. This Draft Protocol, which was compiled from deliberations and recommendations from earlier Workshops in the Series, represents a consensus that emerged from the discussions of all the sub-groups assembled over the course of the five Workshops of the Series. These discussions converged on a conceptual approach to sample handling, as well as on specific analytical requirements. Discussions also identified important issues requiring attention, as well as research and development needed for protocol implementation.
Kanno, J; Onyon, L; Haseman, J; Fenner-Crisp, P; Ashby, J; Owens, W
2001-01-01
The Organisation for Economic Co-operation and Development has completed the first phase of an international validation program for the rodent uterotrophic bioassay. This uterotrophic bioassay is intended to identify the in vivo activity of compounds that are suspected agonists or antagonists of estrogen. This information could, for example, be used to help prioritize positive compounds for further testing. Using draft protocols, we tested and compared two model systems, the immature female rat and the adult ovariectomized rat. Data from 19 participating laboratories using a high-potency reference agonist, ethinyl estradiol (EE), and an antagonist, ZM 189,154, indicate no substantive performance differences between models. All laboratories and all protocols successfully detected increases in uterine weights using EE in phase 1. These significant uterine weight increases were achieved under a variety of experimental conditions (e.g., strain, diet, housing protocol, bedding, vehicle). For each protocol, there was generally good agreement among laboratories with regard to the actual EE doses both in producing the first significant increase in uterine weights and achieving the maximum uterine response. Furthermore, the Hill equation appears to model the dose response satisfactorily and indicates general agreement based on calculated effective dose (ED)(10) and ED(50) within and among laboratories. The feasibility of an antagonist assay was also successfully demonstrated. Therefore, both models appear robust, reproducible, and transferable across laboratories for high-potency estrogen agonists such as EE. For the next phase of the OECD validation program, both models will be tested against a battery of weak, partial estrogen agonists. PMID:11564613
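As a sketch of the dose-response modelling referred to above (illustrative data, not the study's), the code below fits a Hill equation to uterine-weight responses with SciPy and reads off the ED50 directly as a fitted parameter.

```python
# Illustrative sketch (made-up data): fitting a Hill equation
#   y = E0 + (Emax - E0) * d**n / (ED50**n + d**n)
# to dose-response data, as used to summarize uterotrophic assay results.
import numpy as np
from scipy.optimize import curve_fit

def hill(d, e0, emax, ed50, n):
    return e0 + (emax - e0) * d**n / (ed50**n + d**n)

dose = np.array([0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])   # dose levels (made up)
uterine_weight = np.array([29, 31, 44, 75, 105, 118, 121])  # response in mg (made up)

popt, _ = curve_fit(hill, dose, uterine_weight,
                    p0=[28, 120, 1.0, 1.3], maxfev=10000)
e0, emax, ed50, n = popt
print(f"ED50 ~ {ed50:.2f}, Hill slope ~ {n:.2f}")
```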
PA.NET International Quality Certification Protocol for blood pressure monitors.
Omboni, Stefano; Costantini, Carlo; Pini, Claudio; Bulegato, Roberto; Manfellotto, Dario; Rizzoni, Damiano; Palatini, Paolo; O'Brien, Eoin; Parati, Gianfranco
2008-10-01
Although standard validation protocols provide assurance of the accuracy of blood pressure monitors (BPMs), there is no guidance for the consumer as to the overall quality of a device. The PA.NET International Quality Certification Protocol, developed by the Association for Research and Development of Biomedical Technologies and for Continuing Medical Education (ARSMED), a nonprofit organization, with the support of the Italian Society of Hypertension-Italian Hypertension League and the dabl Educational Trust, denotes additional quality criteria for BPMs that have fulfilled basic validation criteria published in full in peer-reviewed medical journals. The certification is characterized by three phases: (i) to determine that the device fulfilled standard validation criteria; (ii) to determine the technical and functional characteristics of the device (e.g. operativity, display dimension, accessory functions, memory availability, etc.); and (iii) to determine the commercial characteristics (e.g. price-quality ratio, after-sale service, guarantee, etc.). At the end of the certification process, ARSMED attributes a quality index to the device, based on a scale ranging from 1 to 100, and a quality seal with four different grades (bronze, silver, gold and diamond) according to the achieved score. The seal is identified by a unique alphanumeric code. The quality seal may be used on the packaging of the appliance or in advertising. A quality certification is released to the manufacturer and published on www.pressionearteriosa.net and www.dableducational.org. The PA.NET International Quality Certification Protocol represents the first attempt to provide health care personnel and consumers with an independent and objective assessment of BPMs based on their quality.
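The grading step can be pictured with a small mapping from the 1-100 quality index to the four seals; the cut-off values in the sketch below are purely hypothetical, since the protocol description above does not state them.

```python
# Hypothetical sketch: mapping a PA.NET-style quality index (1-100) to the
# four seal grades. The cut-offs used here are invented for illustration;
# the actual thresholds are defined by ARSMED and are not given in the abstract.
def quality_seal(index: int) -> str:
    if not 1 <= index <= 100:
        raise ValueError("quality index must be between 1 and 100")
    if index >= 90:
        return "diamond"
    if index >= 75:
        return "gold"
    if index >= 60:
        return "silver"
    return "bronze"

print(quality_seal(82))   # -> gold (under these invented thresholds)
```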
8 CFR 212.1 - Documentary requirements for nonimmigrants.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Cards and a valid Taiwan passport with a valid re-entry permit issued by the Taiwan Ministry of Foreign... valid re-entry permit issued by the Taiwan Ministry of Foreign Affairs. (2) Program Countries and... the United States from contiguous territory or adjacent islands at a land or sea port-of-entry. A...
Code of Federal Regulations, 2010 CFR
2010-10-01
... laboratories or laboratories requesting or issued a certificate of accreditation. (a) Validation inspection. CMS or a CMS agent may conduct a validation inspection of any accredited or CLIA-exempt laboratory at... requirements of this part. (c) Noncompliance determination. If a validation or complaint inspection results in...
The Nuclear Energy Knowledge and Validation Center Summary of Activities Conducted in FY16
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gougar, Hans David
The Nuclear Energy Knowledge and Validation Center (NEKVaC) is a new initiative by the Department of Energy (DOE) and Idaho National Laboratory (INL) to coordinate and focus the resources and expertise that exist within the DOE toward solving issues in modern nuclear code validation and knowledge management. In time, code owners, users, and developers will view the NEKVaC as a partner and essential resource for acquiring the best practices and latest techniques for validating codes, providing guidance in planning and executing experiments, facilitating access to and maximizing the usefulness of existing data, and preserving knowledge for continual use by nuclear professionals and organizations for their own validation needs. The scope of the NEKVaC covers many interrelated activities that will need to be cultivated carefully in the near term and managed properly once the NEKVaC is fully functional. Three areas comprise the principal mission: (1) identify and prioritize projects that extend the field of validation science and its application to modern codes, (2) develop and disseminate best practices and guidelines for high-fidelity multiphysics/multiscale analysis code development and associated experiment design, and (3) define protocols for data acquisition and knowledge preservation and provide a portal for access to databases currently scattered among numerous organizations. These mission areas, while each having a unique focus, are interdependent and complementary. Likewise, all activities supported by the NEKVaC, both near term and long term, must possess elements supporting all three areas. This cross-cutting nature is essential to ensuring that activities and supporting personnel do not become "stove-piped" (i.e., so focused on a specific function that the activity itself becomes the objective rather than achieving the larger vision). This report begins with a description of the mission areas; specifically, the role played by each major committee and the types of activities for which they are responsible. It then lists and describes the proposed near-term tasks upon which future efforts can build.
ERIC Educational Resources Information Center
Ball, Samuel A.; Nich, Charla; Rounsaville, Bruce J.; Eagan, Dorothy; Carroll, Kathleen M.
2004-01-01
The concurrent and predictive validity of 2 different methods of Millon Clinical Multiaxial Inventory-III subtyping (protocol sorting, cluster analysis) was evaluated in 125 recently detoxified opioid-dependent outpatients in a 12-week randomized clinical trial. Participants received naltrexone and relapse prevention group counseling and were…
Establishing Validity of the Consensus Auditory-Perceptual Evaluation of Voice (CAPE-V)
ERIC Educational Resources Information Center
Zraick, Richard I.; Kempster, Gail B.; Connor, Nadine P.; Thibeault, Susan; Klaben, Bernice K.; Bursac, Zoran; Thrush, Carol R.; Glaze, Leslie E.
2011-01-01
Purpose: The Consensus Auditory-Perceptual Evaluation of Voice (CAPE-V) was developed to provide a protocol and form for clinicians to use when assessing the voice quality of adults with voice disorders (Kempster, Gerratt, Verdolini Abbott, Barkmeier-Kramer, & Hillman, 2009). This study examined the reliability and the empirical validity of the…
ERIC Educational Resources Information Center
Pepper, David; Hodgen, Jeremy; Lamesoo, Katri; Kõiv, Pille; Tolboom, Jos
2018-01-01
Cognitive interviewing (CI) provides a method of systematically collecting validity evidence of response processes for questionnaire items. CI involves a range of techniques for prompting individuals to verbalise their responses to items. One such technique is concurrent verbalisation, as developed in Think Aloud Protocol (TAP). This article…
Validating Cognitive Models of Task Performance in Algebra on the SAT®. Research Report No. 2009-3
ERIC Educational Resources Information Center
Gierl, Mark J.; Leighton, Jacqueline P.; Wang, Changjiang; Zhou, Jiawen; Gokiert, Rebecca; Tan, Adele
2009-01-01
The purpose of the study is to present research focused on validating the four algebra cognitive models in Gierl, Wang, et al., using student response data collected with protocol analysis methods to evaluate the knowledge structures and processing skills used by a sample of SAT test takers.
Opportunistic Mobility Support for Resource Constrained Sensor Devices in Smart Cities
Granlund, Daniel; Holmlund, Patrik; Åhlund, Christer
2015-01-01
A multitude of wireless sensor devices and technologies are being developed and deployed in cities all over the world. Sensor applications in city environments may include highly mobile installations that span large areas which necessitates sensor mobility support. This paper presents and validates two mechanisms for supporting sensor mobility between different administrative domains. Firstly, EAP-Swift, an Extensible Authentication Protocol (EAP)-based sensor authentication protocol is proposed that enables light-weight sensor authentication and key generation. Secondly, a mechanism for handoffs between wireless sensor gateways is proposed. We validate both mechanisms in a real-life study that was conducted in a smart city environment with several fixed sensors and moving gateways. We conduct similar experiments in an industry-based anechoic Long Term Evolution (LTE) chamber with an ideal radio environment. Further, we validate our results collected from the smart city environment against the results produced under ideal conditions to establish best and real-life case scenarios. Our results clearly validate that our proposed mechanisms can facilitate efficient sensor authentication and handoffs while sensors are roaming in a smart city environment. PMID:25738767
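EAP-Swift itself is specified in the paper; as a generic illustration of light-weight sensor authentication with key generation, the sketch below runs an HMAC-based challenge-response from a pre-shared key and derives a fresh session key from the exchanged nonces. The message layout and labels are assumptions, not the EAP-Swift wire format.

```python
# Generic sketch (not the EAP-Swift wire format): HMAC challenge-response
# from a pre-shared key, followed by session-key derivation from the nonces.
import hmac, hashlib, os

psk = os.urandom(16)                      # pre-shared key provisioned on the sensor

# Gateway challenges the sensor; the sensor proves knowledge of the PSK.
gateway_nonce = os.urandom(16)
sensor_nonce = os.urandom(16)
sensor_proof = hmac.new(psk, gateway_nonce + sensor_nonce, hashlib.sha256).digest()

# Gateway recomputes and verifies the proof with a constant-time comparison.
expected = hmac.new(psk, gateway_nonce + sensor_nonce, hashlib.sha256).digest()
assert hmac.compare_digest(sensor_proof, expected)

# Both sides derive a fresh session key bound to this exchange.
session_key = hmac.new(psk, b"session" + gateway_nonce + sensor_nonce,
                       hashlib.sha256).digest()
print("authenticated, session key:", session_key.hex()[:16], "...")
```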
Guidelines for randomized clinical trial protocol content: a systematic review
Tetzlaff, Jennifer M; Chan, An-Wen; Kitchen, Jessica; Sampson, Margaret; Tricco, Andrea C; Moher, David
2012-09-24
Background All randomized clinical trials (RCTs) require a protocol; however, numerous studies have highlighted protocol deficiencies. Reporting guidelines may improve the content of research reports and, if developed using robust methods, may increase the utility of reports to stakeholders. The objective of this study was to systematically identify and review RCT protocol guidelines, to assess their characteristics and methods of development, and to compare recommendations. Methods We conducted a systematic review of indexed literature (MEDLINE, EMBASE and the Cochrane Methodology Register from inception to September 2010; reference lists; related article features; forward citation searching) and a targeted search of supplementary sources, including a survey of major trial funding agencies in six countries. Records were eligible if they described a content guideline in English or French relevant to RCT protocols. Guidelines were excluded if they specified content for protocols for trials of specific procedures or conditions or were intended to assess trial quality. We extracted guideline characteristics and methods. Content was mapped for a subset of guidelines that described development methods or had institutional endorsement. Results Forty guidelines published in journals, books and institutional reports were included in the review; seven were specific to RCT protocols. Only eight (20%) described development methods which included informal consensus methods, pilot testing and formal validation; no guideline described all of these methods. No guideline described formal consensus methods or a systematic retrieval of empirical evidence to inform its development. The guidelines included a median of 23 concepts per guideline (interquartile range (IQR) = 14 to 34; range = 7 to 109). Among the subset of guidelines (n = 23) for which content was mapped, approximately 380 concepts were explicitly addressed (median concepts per guideline IQR = 31 (24,80); range = 16 to 150); most concepts were addressed in a minority of guidelines. Conclusions Existing guidelines for RCT protocol content varied substantially in their recommendations. Few reports described the methods of guideline development, limiting comparisons of guideline validity. Given the importance of protocols to diverse stakeholders, we believe a systematically developed, evidence-informed guideline for clinical trial protocols is needed. PMID:23006870
Issues in providing a reliable multicast facility
NASA Technical Reports Server (NTRS)
Dempsey, Bert J.; Strayer, W. Timothy; Weaver, Alfred C.
1990-01-01
Issues involved in point-to-multipoint communication are presented and the literature for proposed solutions and approaches surveyed. Particular attention is focused on the ideas and implementations that align with the requirements of the environment of interest. The attributes of multicast receiver groups that might lead to useful classifications, what the functionality of a management scheme should be, and how the group management module can be implemented are examined. The services that multicasting facilities can offer are presented, followed by mechanisms within the communications protocol that implements these services. The metrics of interest when evaluating a reliable multicast facility are identified and applied to four transport layer protocols that incorporate reliable multicast.
NASA/SPAN and DOE/ESnet-DECnet transition strategy for DECnet OSI/phase 5
NASA Technical Reports Server (NTRS)
Porter, Linda; Demar, Phil
1991-01-01
The technical issues involved in the transition of very large DECnet networks from DECnet Phase IV protocols to DECnet OSI/Phase V protocols are examined. The networks involved are NASA's Science Internet (NSI-DECnet) and the DOE's Energy Science network (ESnet-DECnet). These networks, along with the many universities and research institutions connected to them, combine to form a single DECnet network containing more than 20,000 transitions and crossing numerous organizational boundaries. Discussion of transition planning, including decisions about Phase V naming, addressing, and routing, is presented. Also discussed are transition issues related to the use of non-DEC routers in the network.
Optical protocols for terabit networks
NASA Technical Reports Server (NTRS)
Chua, P. L.; Lambert, J. L.; Morookian, J. M.; Bergman, L. A.
1991-01-01
This paper describes a new fiber-optic local area network technology that provides a 100X improvement over current technology, full crossbar functionality, and inherent data security. Based on optical code-division multiple access (CDMA), using spectral phase encoding/decoding of optical pulses, networking protocols are implemented entirely in the optical domain and thus conventional networking bottlenecks are avoided. Component and system issues for a proof-of-concept demonstration are discussed, as well as issues for a more practical and commercially exploitable system. Possible terrestrial and aerospace applications of this technology, and its impact on other technologies, are explored. Some initial results toward realization of this concept are also included.
A Novel Addressing Scheme for PMIPv6 Based Global IP-WSNs
Islam, Md. Motaharul; Huh, Eui-Nam
2011-01-01
IP based Wireless Sensor Networks (IP-WSNs) are being used in healthcare, home automation, industrial control and agricultural monitoring. In most of these applications global addressing of individual IP-WSN nodes and layer-three routing for mobility enabled IP-WSN with special attention to reliability, energy efficiency and end to end delay minimization are a few of the major issues to be addressed. Most of the routing protocols in WSN are based on layer-two approaches. For reliability and end to end communication enhancement the necessity of layer-three routing for IP-WSNs is generating significant attention among the research community, but due to the hurdle of maintaining routing state and other communication overhead, it was not possible to introduce a layer-three routing protocol for IP-WSNs. To address this issue we propose in this paper a global addressing scheme and layer-three based hierarchical routing protocol. The proposed addressing and routing approach focuses on all the above mentioned issues. Simulation results show that the proposed addressing and routing approach significantly enhances the reliability, energy efficiency and end to end delay minimization. We also present architecture, message formats and different routing scenarios in this paper. PMID:22164084
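The paper's exact address format is not given in the abstract, but the global-addressing idea can be illustrated with a minimal sketch: compose a routable IPv6 address for a sensor node from the gateway's delegated prefix plus an interface identifier built from local identifiers. The prefix value and the PAN-id/node-id layout below are assumptions for illustration only, not the authors' scheme.

```python
import ipaddress

def global_sensor_address(prefix: str, pan_id: int, node_id: int) -> ipaddress.IPv6Address:
    """Compose a global IPv6 address for an IP-WSN node.

    Hypothetical layout (not taken from the paper): the upper 64 bits are the
    network prefix advertised by the WSN gateway, and the interface identifier
    embeds the 16-bit PAN id and the 16-bit short node address.
    """
    net = ipaddress.IPv6Network(prefix)       # e.g. a /64 delegated to the gateway
    if net.prefixlen != 64:
        raise ValueError("expected a /64 prefix for this sketch")
    iid = (pan_id << 16) | node_id            # low 32 bits of the interface identifier
    return net[iid]                           # offset into the /64 yields the full address

if __name__ == "__main__":
    addr = global_sensor_address("2001:db8:cafe:1::/64", pan_id=0x00A1, node_id=0x0042)
    print(addr)                               # 2001:db8:cafe:1::a1:42
```

Because the prefix is globally routable, any node addressed this way is reachable end to end without per-node translation at the gateway, which is the property the authors exploit for layer-three routing.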
Reliable Freestanding Position-Based Routing in Highway Scenarios
Galaviz-Mosqueda, Gabriel A.; Aquino-Santos, Raúl; Villarreal-Reyes, Salvador; Rivera-Rodríguez, Raúl; Villaseñor-González, Luis; Edwards, Arthur
2012-01-01
Vehicular Ad Hoc Networks (VANETs) are considered by car manufacturers and the research community as the enabling technology to radically improve the safety, efficiency and comfort of everyday driving. However, before VANET technology can fulfill all its expected potential, several difficulties must be addressed. One key issue arising when working with VANETs is the complexity of the networking protocols compared to those used by traditional infrastructure networks. Therefore, proper design of the routing strategy becomes a main issue for the effective deployment of VANETs. In this paper, a reliable freestanding position-based routing algorithm (FPBR) for highway scenarios is proposed. For this scenario, several important issues such as the high mobility of vehicles and the propagation conditions may affect the performance of the routing strategy. These constraints have only been partially addressed in previous proposals. In contrast, the design approach used for developing FPBR considered the constraints imposed by a highway scenario and implements mechanisms to overcome them. FPBR performance is compared to one of the leading protocols for highway scenarios. Performance metrics show that FPBR yields similar results when considering freespace propagation conditions, and outperforms the leading protocol when considering a realistic highway path loss model. PMID:23202159
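FPBR's actual relay-selection rules are not reproduced in the abstract; the sketch below only illustrates the generic greedy position-based forwarding idea it builds on, with a hypothetical link-quality predicate standing in for the highway path-loss considerations the authors address.

```python
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def greedy_next_hop(current, destination, neighbours, link_ok):
    """Pick the neighbour making the most forward progress towards the destination,
    skipping links judged unusable by link_ok().

    neighbours: dict node_id -> (x, y) position.
    link_ok: hypothetical predicate that would encapsulate path-loss/reliability
    estimation in a real protocol.
    """
    best, best_d = None, distance(current, destination)
    for node, pos in neighbours.items():
        d = distance(pos, destination)
        if d < best_d and link_ok(current, pos):   # forward progress + usable link
            best, best_d = node, d
    return best                                     # None -> local maximum, recovery needed

if __name__ == "__main__":
    neigh = {"v1": (120.0, 5.0), "v2": (260.0, 2.0), "v3": (40.0, 0.0)}
    ok = lambda src, dst: distance(src, dst) < 200.0   # toy reliability rule
    print(greedy_next_hop((0.0, 0.0), (1000.0, 0.0), neigh, ok))   # -> "v1"
```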
Must School Districts Provide Test Protocols to Parents?
ERIC Educational Resources Information Center
Rosenfeld, S. James
2010-01-01
Despite being well-settled as a matter of law, the issue of whether test protocols must be disclosed to parents continues to be a source of dispute between schools, school psychologists, and parents. To be sure, one of the reasons for this vampire-like existence is the imprecision of the questioners and questions. Moreover, professional guidance…
ERIC Educational Resources Information Center
Wahid, Wazira Ali Abdul; Ahmed, Eqbal Sulaiman; Wahid, Muntaha Ali Abdul
2015-01-01
This article reports a research study of online interactions in English teaching, particularly conversation, using VoIP (Voice over Internet Protocol) and a cosmopolitan online theme. Data were obtained through interviews. Simplifiers indicate how oral tasks need to be planned in order to facilitate engagement models propitious to…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-13
... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 75 [EPA-HQ-OAR-2009-0837; FRL-9280-9] RIN 2060-AQ06 Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing Correction In rule document 2011-6216 appearing on pages 17288-17325 in the issue of Monday, March 28, 2011...
School-Based Methylphenidate Placebo Protocols: Methodological and Practical Issues.
ERIC Educational Resources Information Center
Hyman, Irwin A.; Wojtowicz, Alexandra; Lee, Kee Duk; Haffner, Mary Elizabeth; Fiorello, Catherine A.; And Others
1998-01-01
Focuses on methodological issues involved in choosing instruments to monitor behavior once a comprehensive evaluation has suggested trials on Ritalin. Case examples illustrate problems of teacher compliance in filling out measures, supplying adequate placebos, and obtaining physician cooperation. Emerging school-based methodologies are discussed…
McNamee, J P; Bellier, P V
2015-07-01
As part of the Japanese Center for the Validation of Alternative Methods (JaCVAM)-initiative international validation study of the in vivo rat alkaline comet assay (comet assay), our laboratory examined ampicillin trihydrate (AMP), 1,2-dimethylhydrazine dihydrochloride (DMH), and N-nitrosodimethylamine (NDA) using a standard comet assay validation protocol (v14.2) developed by the JaCVAM validation management team (VMT). Coded samples were received by our laboratory along with basic MSDS information. Solubility analysis and range-finding experiments of the coded test compounds were conducted for dose selection. Animal dosing schedules, the comet assay processing and analysis, and statistical analysis were conducted in accordance with the standard protocol. Based upon our blinded evaluation, AMP was not found to exhibit evidence of genotoxicity in either the rat liver or stomach. However, both NDA and DMH were observed to cause a significant increase in % tail DNA in the rat liver at all dose levels tested. While acute hepatotoxicity was observed for these compounds in the high dose group, in the investigators' opinion there were a sufficient number of consistently damaged/measurable cells at the medium and low dose groups to judge these compounds as genotoxic. There was no evidence of genotoxicity from either NDA or DMH in the rat stomach. In conclusion, our laboratory observed increased DNA damage from two blinded test compounds in rat liver (later identified as genotoxic carcinogens), while no evidence of genotoxicity was observed for the third blinded test compound (later identified as a non-genotoxic, non-carcinogen). These data support the use of a standardized protocol of the in vivo comet assay as a cost-effective alternative genotoxicity assay for regulatory testing purposes. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
Applications of the pipeline environment for visual informatics and genomics computations
2011-01-01
Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102
Smith, Zachary L; Dua, Arshish; Saeian, Kia; Ledeboer, Nathan A; Graham, Mary Beth; Aburajab, Murad; Ballard, Darren D; Khan, Abdul H; Dua, Kulwinder S
2017-11-01
Numerous published outbreaks, including one from our institution, have described endoscope-associated transmission of multidrug-resistant organisms (MDROs). Individual centers have adopted their own protocols to address this issue, including endoscope culture and sequestration. Endoscope culturing has drawbacks and may allow residual bacteria, including MDROs, to go undetected after high-level disinfection. To report the outcome of our novel protocol, which does not utilize endoscope culturing, to address our outbreak. All patients undergoing procedures with elevator-containing endoscopes were asked to permit performance of a rectal swab. All endoscopes underwent high-level disinfection according to updated manufacturer's guidance. Additionally, ethylene oxide (EtO) sterilization was done in the high-risk settings of (1) positive response to a pre-procedure risk stratification questionnaire, (2) positive or indeterminate CRE polymerase chain reaction (PCR) from rectal swab, (3) refusal to consent for PCR or questionnaire, (4) purulent cholangitis or infected pancreatic fluid collections. Two endoscopes per weekend were sterilized on a rotational basis. From September 1, 2015 to April 30, 2016, 556 endoscopy sessions were performed using elevator-containing endoscopes. Prompted EtO sterilization was done on 46 (8.3%) instances, 3 from positive/indeterminate PCR tests out of 530 samples (0.6%). No CRE transmission was observed during the study period. Damage or altered performance of endoscopes related to EtO was not observed. In this pilot study, prompted EtO sterilization in high-risk patients has thus far eliminated endoscope-associated MDRO transmission, although no CRE infections were noted throughout the institution during the study period. Further studies and a larger patient sample will be required to validate these findings.
NASA Astrophysics Data System (ADS)
Watford, M.; DeCusatis, C.
2005-09-01
With the advent of new regulations governing the protection and recovery of sensitive business data, including the Sarbanes-Oxley Act, there has been a renewed interest in business continuity and disaster recovery applications for metropolitan area networks. Specifically, there has been a need for more efficient bandwidth utilization and lower cost per channel to facilitate mirroring of multi-terabit data bases. These applications have further blurred the boundary between metropolitan and wide area networks, with synchronous disaster recovery applications running up to 100 km and asynchronous solutions extending to 300 km or more. In this paper, we discuss recent enhancements in the Nortel Optical Metro 5200 Dense Wavelength Division Multiplexing (DWDM) platform, including features recently qualified for data communication applications such as Metro Mirror, Global Mirror, and Geographically Distributed Parallel Sysplex (GDPS). Using a 10 Gigabit/second (Gbit/s) backbone, this solution transports significantly more Fibre Channel protocol traffic with up to five times greater hardware density in the same physical package. This is also among the first platforms to utilize forward error correction (FEC) on the aggregate signals to improve bit error rate (BER) performance beyond industry standards. When combined with encapsulation into wide area network protocols, the use of FEC can compensate for impairments in BER across a service provider infrastructure without impacting application level performance. Design and implementation of these features will be discussed, including results from experimental test beds which validate these solutions for a number of applications. Future extensions of this environment will also be considered, including ways to provide configurable bandwidth on demand, mitigate Fibre Channel buffer credit management issues, and support for other GDPS protocols.
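The abstract does not specify which FEC code the platform applies; purely as an illustration of why FEC improves BER beyond industry limits, the sketch below estimates the probability of an uncorrectable block for a code that corrects up to t symbol errors per block. The parameters n = 255, t = 8 mimic an RS(255,239)-style code common in optical transport; this is an assumption, not a claim about the Nortel Optical Metro 5200.

```python
from math import comb

def uncorrectable_block_prob(p_symbol: float, n: int = 255, t: int = 8) -> float:
    """Probability that more than t of the n symbols in a block are in error,
    i.e. the block cannot be corrected.  n=255, t=8 resembles an RS(255,239)-like
    code (assumed for illustration only)."""
    p_correctable = sum(comb(n, k) * p_symbol**k * (1 - p_symbol)**(n - k)
                        for k in range(t + 1))
    return 1.0 - p_correctable

if __name__ == "__main__":
    for raw in (1e-3, 1e-4, 1e-5):
        print(f"raw symbol error rate {raw:.0e} -> uncorrectable block probability "
              f"{uncorrectable_block_prob(raw):.3e}")
```

The steep drop in output error probability as the raw error rate falls is what lets FEC absorb impairments across a service-provider span without the application noticing.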
Development and Validation of a Porcine (Sus scrofa) Sepsis Model
2018-03-01
Materials and methods: Animals were anesthetized and instrumented for cardiovascular monitoring. Lipopolysaccharide (LPS, a large molecule present on the…
2011-11-01
…assessment to quality of localization/characterization estimates. This protocol includes four critical components: (1) a procedure to identify the critical factors impacting SHM system performance; (2) a multistage or hierarchical approach to SHM system validation; (3) a model-assisted evaluation… (Lindgren, E. A., Buynak, C. F., Steffes, G., Derriso, M., "Model-assisted Probabilistic Reliability Assessment for Structural Health Monitoring".)
Seaworthy Quantum Key Distribution Design and Validation (SEAKEY)
2014-10-30
…to single photon detection, at comparable detection efficiencies. On the other hand, error-correction codes are better developed for small-alphabet… protocol is several orders of magnitude better than the Shapiro protocol, which needs entangled states. The bits/mode performance achieved by our… putting together a software tool implemented in MATLAB, which talks to the MODTRAN database via an intermediate numerical dump of transmission data…
Cohen, Emmanuel; Bernard, Jonathan Y.; Ponty, Amandine; Ndao, Amadou; Amougou, Norbert; Saïd-Mohamed, Rihlat; Pasquet, Patrick
2015-01-01
Background The social valorisation of overweight in African populations could promote high-risk eating behaviours and therefore become a risk factor of obesity. However, existing scales to assess body image are usually not accurate enough to allow comparative studies of body weight perception in different African populations. This study aimed to develop and validate the Body Size Scale (BSS) to estimate African body weight perception. Methods Anthropometric measures of 80 Cameroonians and 81 Senegalese were used to evaluate three criteria of adiposity: body mass index (BMI), overall percentage of fat, and endomorphy (fat component of the somatotype). To develop the BSS, the participants were photographed in full face and profile positions. Models were selected for their representativeness of the wide variability in adiposity with a progressive increase along the scale. Then, for the validation protocol, participants self-administered the BSS to assess self-perceived current body size (CBS), desired body size (DBS) and provide a “body self-satisfaction index.” This protocol included construct validity, test-retest reliability and convergent validity and was carried out with three independent samples of respectively 201, 103 and 1115 Cameroonians. Results The BSS comprises two sex-specific scales of photos of 9 models each, and ordered by increasing adiposity. Most participants were able to correctly order the BSS by increasing adiposity, using three different words to define body size. Test-retest reliability was consistent in estimating CBS, DBS and the “body self-satisfaction index.” The CBS was highly correlated to the objective BMI, and two different indexes assessed with the BSS were consistent with declarations obtained in interviews. Conclusion The BSS is the first scale with photos of real African models taken in both full face and profile and representing a wide and representative variability in adiposity. The validation protocol proved its reliability for estimating body weight perception in Africans. PMID:26536030
Jeddi, Fakhri; Yapo-Kouadio, Gisèle Cha; Normand, Anne-Cécile; Cassagne, Carole; Marty, Pierre; Piarroux, Renaud
2017-02-01
In cases of fungal infection of the bloodstream, rapid species identification is crucial to provide adapted therapy and thereby ameliorate patient outcome. Currently, the commercial Sepsityper kit and the sodium-dodecyl sulfate (SDS) method coupled with MALDI-TOF mass spectrometry are the most commonly reported lysis protocols for direct identification of fungi from positive blood culture vials. However, the performance of these two protocols has never been compared on clinical samples. Accordingly, we performed a two-step survey on two distinct panels of clinical positive blood culture vials to identify the most efficient protocol, establish an appropriate log score (LS) cut-off, and validate the best method. We first compared the performance of the Sepsityper and the SDS protocols on 71 clinical samples. For 69 monomicrobial samples, mass spectrometry LS values were significantly higher with the SDS protocol than with the Sepsityper method (P < .0001), especially when the best score of four deposited spots was considered. Next, we established the LS cut-off for accurate identification at 1.7, based on specimen DNA sequence data. Using this LS cut-off, 66 (95.6%) and 46 (66.6%) isolates were correctly identified at the species level with the SDS and the Sepsityper protocols, respectively. In the second arm of the survey, we validated the SDS protocol on an additional panel of 94 clinical samples. Ninety-two (98.9%) of 93 monomicrobial samples were correctly identified at the species level (median LS = 2.061). Overall, our data suggest that the SDS method yields more accurate species identification of yeasts, than the Sepsityper protocol. © The Author 2016. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
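The decision rule described (best of four deposited spots, acceptance cut-off LS ≥ 1.7 established against DNA sequencing) reduces to a small piece of call logic; the sketch below is only a restatement of that rule, and the helper names are illustrative rather than from the paper.

```python
LS_CUTOFF = 1.7   # identification cut-off established against sequence data in the study

def call_identification(spot_scores, spot_species):
    """Return (species, log_score) for a positive blood-culture sample.

    spot_scores / spot_species: MALDI-TOF log scores and top-ranked species for
    the four deposited spots.  The best-scoring spot decides; below the cut-off
    the sample is reported as unidentified.
    """
    best = max(range(len(spot_scores)), key=lambda i: spot_scores[i])
    if spot_scores[best] >= LS_CUTOFF:
        return spot_species[best], spot_scores[best]
    return "no reliable identification", spot_scores[best]

if __name__ == "__main__":
    scores = [1.45, 2.06, 1.92, 1.38]
    species = ["Candida albicans"] * 4
    print(call_identification(scores, species))   # ('Candida albicans', 2.06)
```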
An Event Driven Hybrid Identity Management Approach to Privacy Enhanced e-Health
Sánchez-Guerrero, Rosa; Almenárez, Florina; Díaz-Sánchez, Daniel; Marín, Andrés; Arias, Patricia; Sanvido, Fabio
2012-01-01
Credential-based authorization offers interesting advantages for ubiquitous scenarios involving limited devices such as sensors and personal mobile equipment: the verification can be done locally; it offers a more reduced computational cost than its competitors for issuing, storing, and verification; and it naturally supports rights delegation. The main drawback is the revocation of rights. Revocation requires handling potentially large revocation lists, or using protocols to check the revocation status, bringing extra communication costs not acceptable for sensors and other limited devices. Moreover, the effective revocation consent—considered as a privacy rule in sensitive scenarios—has not been fully addressed. This paper proposes an event-based mechanism empowering a new concept, the sleepyhead credentials, which allows time constraints and explicit revocation to be replaced by activating and deactivating authorization rights according to events. Our approach is to integrate this concept in IdM systems in a hybrid model supporting delegation, which can be an interesting alternative for scenarios where revocation of consent and user privacy are critical. The delegation includes a SAML compliant protocol, which we have validated through a proof-of-concept implementation. This article also explains the mathematical model describing the event-based model and offers estimations of the overhead introduced by the system. The paper focuses on health care scenarios, where we show the flexibility of the proposed event-based user consent revocation mechanism. PMID:22778634
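The SAML-based protocol itself is not reproduced in the abstract; the fragment below is only a toy state machine showing the core "sleepyhead" idea of activating and deactivating an authorization right in response to events rather than expiry times or revocation lists. All event names are hypothetical.

```python
class SleepyheadCredential:
    """Toy credential whose rights stay dormant until an activating event occurs
    and go back to sleep on a deactivating event (no revocation list needed)."""

    def __init__(self, subject, right, activate_on, deactivate_on):
        self.subject = subject
        self.right = right
        self.activate_on = set(activate_on)       # e.g. {"patient_admitted"}
        self.deactivate_on = set(deactivate_on)   # e.g. {"patient_discharged", "consent_withdrawn"}
        self.active = False

    def on_event(self, event):
        if event in self.activate_on:
            self.active = True
        elif event in self.deactivate_on:
            self.active = False

    def authorize(self, requested_right):
        return self.active and requested_right == self.right

if __name__ == "__main__":
    cred = SleepyheadCredential("dr_lopez", "read_ehr",
                                activate_on={"patient_admitted"},
                                deactivate_on={"patient_discharged", "consent_withdrawn"})
    for ev in ["patient_admitted", "consent_withdrawn"]:
        cred.on_event(ev)
        print(ev, "->", cred.authorize("read_ehr"))
    # patient_admitted -> True ; consent_withdrawn -> False
```

Withdrawal of consent here simply puts the credential back to sleep, which is how the event-driven model avoids distributing revocation state to constrained devices.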
Probiotics for fibromyalgia: study design for a pilot double-blind, randomized controlled trial.
Roman, Pablo; Estévez, Ángeles F; Sánchez-Labraca, Nuria; Cañadas, Fernando; Miras, Alonso; Cardona, Diana
2017-10-24
Fibromyalgia syndrome (FMS) is a chronic, generalized and diffuse pain disorder accompanied by other symptoms such as emotional and cognitive deficits. FMS patients show a high prevalence of gastrointestinal symptoms. Recently it has been found that microbes in the gut may regulate brain processes through the gut-microbiota-brain axis, thus modulating affect, motivation and higher cognitive functions. Therefore, the use of probiotics might be a new treatment that could improve the physical, psychological and cognitive state in FMS; however, no evidence on this issue is available. This paper describes the design and protocol of a double-blind, placebo-controlled and randomized pilot study. We use validated questionnaires, cognitive tasks delivered through E-Prime, and biological measures such as urine cortisol and stool samples. The trial aims to explore the effects of eight weeks of probiotics therapy on physical (pain, impact of the FMS and quality of life), emotional (depression and anxiety) and cognitive symptoms (attention, memory, and impulsivity) in FMS patients as compared to placebo. This pilot study is, to our knowledge, the first to evaluate the effects of probiotics in FMS. The primary hypothesis is that FMS patients will show a better performance on cognitive tasks and an improvement in emotional and physical symptoms. These results will contribute to a better understanding of the gut-brain axis. Here we present the design and protocol of the study.
NASA Technical Reports Server (NTRS)
Beaton, Kara H.; Chappell, Steven P.; Bekdash, Omar S.; Gernhardt, Michael L.
2018-01-01
The NASA Next Space Technologies for Exploration Partnerships (NextSTEP) program is a public-private partnership model that seeks commercial development of deep space exploration capabilities to support extensive human spaceflight missions around and beyond cislunar space. NASA first issued the Phase 1 NextSTEP Broad Agency Announcement to U.S. industries in 2014, which called for innovative cislunar habitation concepts that leveraged commercialization plans for low Earth orbit. These habitats will be part of the Deep Space Gateway (DSG), the cislunar space station planned by NASA for construction in the 2020s. In 2016, Phase 2 of the NextSTEP program selected five commercial partners to develop ground prototypes. A team of NASA research engineers and subject matter experts have been tasked with developing the ground test protocol that will serve as the primary means by which these Phase 2 prototype habitats will be evaluated. Since 2008, this core test team has successfully conducted multiple spaceflight analog mission evaluations utilizing a consistent set of operational products, tools, methods, and metrics to enable the iterative development, testing, analysis, and validation of evolving exploration architectures, operations concepts, and vehicle designs. The purpose of implementing a similar evaluation process for the NextSTEP Phase 2 Habitation Concepts is to consistently evaluate the different commercial partner ground prototypes to provide data-driven, actionable recommendations for Phase 3.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caine, Hannah; Whalley, Deborah; Kneebone, Andrew
If a prostate intensity-modulated radiation therapy (IMRT) or volumetric-modulated arc therapy (VMAT) plan has protocol violations, it is often a challenge knowing whether this is due to unfavorable anatomy or suboptimal planning. This study aimed to create a model to predict protocol violations based on patient anatomical variables and their potential relationship to target and organ at risk (OAR) end points in the setting of definitive, dose-escalated IMRT/VMAT prostate planning. Radiotherapy plans from 200 consecutive patients treated with definitive radiation for prostate cancer using IMRT or VMAT were analyzed. The first 100 patient plans (hypothesis-generating cohort) were examined to identify anatomical variables that predict for dosimetric outcome, in particular OAR end points. Variables that scored significance were further assessed for their ability to predict protocol violations using a Classification and Regression Tree (CART) analysis. These results were then validated in a second group of 100 patients (validation cohort). In the initial analysis of the hypothesis-generating cohort, percentage of rectum overlap in the planning target volume (PTV) (%OR) and percentage of bladder overlap in the PTV (%OB) were highlighted as significant predictors of rectal and bladder dosimetry. Lymph node treatment was also significant for bladder outcomes. For the validation cohort, CART analysis showed that %OR of < 6%, 6% to 9% and > 9% predicted a 13%, 63%, and 100% rate of rectal protocol violations respectively. For the bladder, %OB of < 9% vs > 9% is associated with 13% vs 88% rate of bladder constraint violations when lymph nodes were not treated. If nodal irradiation was delivered, plans with a %OB of < 9% had a 59% risk of violations. Percentage of rectum and bladder within the PTV can be used to identify individual plan potential to achieve dose-volume histogram (DVH) constraints. A model based on these factors could be used to reduce planning time, improve work flow, and strengthen plan quality and consistency.
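The CART thresholds reported for the validation cohort can be written directly as a small lookup; the percentages below are those quoted in the abstract, and the function is only a restatement of them, not the authors' software.

```python
def predicted_violation_risk(rectum_overlap_pct, bladder_overlap_pct, nodes_treated):
    """Return (rectal_risk, bladder_risk) as fractions, using the CART thresholds
    and violation rates reported for the validation cohort."""
    # Rectal constraint violations by % rectum inside the PTV (%OR)
    if rectum_overlap_pct < 6:
        rectal = 0.13
    elif rectum_overlap_pct <= 9:
        rectal = 0.63
    else:
        rectal = 1.00

    # Bladder constraint violations by % bladder inside the PTV (%OB) and nodal treatment
    if nodes_treated:
        # The abstract reports 59% for <9% with nodal irradiation; the rate for >9%
        # with nodes is not stated, so it is left undefined here.
        bladder = 0.59 if bladder_overlap_pct < 9 else None
    else:
        bladder = 0.13 if bladder_overlap_pct < 9 else 0.88

    return rectal, bladder

if __name__ == "__main__":
    print(predicted_violation_risk(7.5, 4.0, nodes_treated=False))   # (0.63, 0.13)
```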
Zerbini, Francesca; Zanella, Ilaria; Fraccascia, Davide; König, Enrico; Irene, Carmela; Frattini, Luca F; Tomasi, Michele; Fantappiè, Laura; Ganfini, Luisa; Caproni, Elena; Parri, Matteo; Grandi, Alberto; Grandi, Guido
2017-04-24
The exploitation of the CRISPR/Cas9 machinery coupled to lambda (λ) recombinase-mediated homologous recombination (recombineering) is becoming the method of choice for genome editing in E. coli. First proposed by Jiang and co-workers, the strategy has been subsequently fine-tuned by several authors who demonstrated, by using few selected loci, that the efficiency of mutagenesis (number of mutant colonies over total number of colonies analyzed) can be extremely high (up to 100%). However, from published data it is difficult to appreciate the robustness of the technology, defined as the number of successfully mutated loci over the total number of targeted loci. This information is particularly relevant in high-throughput genome editing, where repetition of experiments to rescue missing mutants would be impractical. This work describes a "brute force" validation activity, which culminated in the definition of a robust, simple and rapid protocol for single or multiple gene deletions. We first set up our own version of the CRISPR/Cas9 protocol and then we evaluated the mutagenesis efficiency by changing different parameters including sequence of guide RNAs, length and concentration of donor DNAs, and use of single stranded and double stranded donor DNAs. We then validated the optimized conditions targeting 78 "dispensable" genes. This work led to the definition of a protocol, featuring the use of double stranded synthetic donor DNAs, which guarantees mutagenesis efficiencies consistently higher than 10% and a robustness of 100%. The procedure can be applied also for simultaneous gene deletions. This work defines for the first time the robustness of a CRISPR/Cas9-based protocol based on a large sample size. Since the technical solutions here proposed can be applied to other similar procedures, the data could be of general interest for the scientific community working on bacterial genome editing and, in particular, for those involved in synthetic biology projects requiring high throughput procedures.
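The two metrics the authors distinguish, per-locus mutagenesis efficiency and protocol robustness across loci, reduce to simple ratios; the sketch below just makes the definitions explicit on made-up colony counts, using the 10% efficiency floor quoted for the optimized protocol.

```python
def efficiency(mutant_colonies: int, colonies_screened: int) -> float:
    """Mutagenesis efficiency at one locus: mutant colonies / colonies analyzed."""
    return mutant_colonies / colonies_screened

def robustness(per_locus_efficiencies, min_efficiency: float = 0.10) -> float:
    """Robustness: fraction of targeted loci for which the protocol succeeded,
    here counted as loci whose efficiency reaches the 10% threshold reported
    for the optimized protocol."""
    successes = sum(e >= min_efficiency for e in per_locus_efficiencies)
    return successes / len(per_locus_efficiencies)

if __name__ == "__main__":
    # Hypothetical screen of 4 loci, 20 colonies analyzed per locus
    counts = [(20, 20), (6, 20), (3, 20), (15, 20)]
    eff = [efficiency(m, n) for m, n in counts]
    print([round(e, 2) for e in eff])   # [1.0, 0.3, 0.15, 0.75]
    print(robustness(eff))              # 1.0 -> every locus cleared the 10% threshold
```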
Veneziano, Domenico; Ahmed, Kamran; Van Cleynenbreugel, Ben S E P; Gözen, Ali Serdar; Palou, Joan; Sarica, Kemal; Liatsikos, Evangelos N; Sanguedolce, Francesco; Honeck, Patrick; Alvarez-Maestro, Mario; Papatsoris, Athanasios; Kallidonis, Panagiotis; Greco, Francesco; Breda, Alberto; Somani, Bhaskar
2017-07-10
Background Simulation based technical-skill assessment is a core topic of debate, especially in high-risk environments. After the introduction of the E-BLUS exam for basic laparoscopy, no more technical training/assessment urological protocols have been developed in Europe. Objective We describe the methodology used in the development of the novel Endoscopic Stone Treatment step 1 (EST s1) assessment curriculum. Materials and Methods The "full life cycle curriculum development" template was followed for curriculum development. A CTA was run to define the most important steps and details of RIRS, in accordance with EAU Urolithiasis guidelines. Training tasks were created between April 2015 and September 2015. Tasks and metrics were further analyzed by a consensus meeting with the EULIS board in February 2016. A review, aimed to study available simulators and their accordance with task requirements, was subsequently run in London on March 2016. After initial feedback and further tests, content validity of this protocol was achieved during EUREP 2016. Results The EST s1 curriculum development, took 23 months. 72 participants tested the 5 preliminary tasks during EUREP 2015, with sessions of 45 minutes each. Likert-scale questionnaires were filled-out to score the quality of training. The protocol was modified accordingly and 25 participants tested the 4 tasks during the hands-on training sessions of the ESUT 2016 congress. 134 participants finally participated in the validation study in EUREP 2016. During the same event 10 experts confirmed content validity by filling-out a Likert-scale questionnaire. Conclusion We described a reliable and replicable methodology that can be followed to develop training/assessment protocols for surgical procedures. The expert consensus meetings, strict adherence to guidelines and updated literature search towards an Endourology curriculum allowed correct training and assessment protocol development. It is the first step towards standardized simulation training in Endourology with a potential for worldwide adoption.
Changes and Issues in the Validation of Experience
ERIC Educational Resources Information Center
Triby, Emmanuel
2005-01-01
This article analyses the main changes in the rules for validating experience in France and of what they mean for society. It goes on to consider university validation practices. The way in which this system is evolving offers a chance to identify the issues involved for the economy and for society, with particular attention to the expected…
Schiffman, Eric; Ohrbach, Richard; Truelove, Edmond; Look, John; Anderson, Gary; Goulet, Jean-Paul; List, Thomas; Svensson, Peter; Gonzalez, Yoly; Lobbezoo, Frank; Michelotti, Ambra; Brooks, Sharon L.; Ceusters, Werner; Drangsholt, Mark; Ettlin, Dominik; Gaul, Charly; Goldberg, Louis J.; Haythornthwaite, Jennifer A.; Hollender, Lars; Jensen, Rigmor; John, Mike T.; De Laat, Antoon; de Leeuw, Reny; Maixner, William; van der Meulen, Marylee; Murray, Greg M.; Nixdorf, Donald R.; Palla, Sandro; Petersson, Arne; Pionchon, Paul; Smith, Barry; Visscher, Corine M.; Zakrzewska, Joanna; Dworkin, Samuel F.
2015-01-01
Aims The original Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) Axis I diagnostic algorithms have been demonstrated to be reliable. However, the Validation Project determined that the RDC/TMD Axis I validity was below the target sensitivity of ≥ 0.70 and specificity of ≥ 0.95. Consequently, these empirical results supported the development of revised RDC/TMD Axis I diagnostic algorithms that were subsequently demonstrated to be valid for the most common pain-related TMD and for one temporomandibular joint (TMJ) intra-articular disorder. The original RDC/TMD Axis II instruments were shown to be both reliable and valid. Working from these findings and revisions, two international consensus workshops were convened, from which recommendations were obtained for the finalization of new Axis I diagnostic algorithms and new Axis II instruments. Methods Through a series of workshops and symposia, a panel of clinical and basic science pain experts modified the revised RDC/TMD Axis I algorithms by using comprehensive searches of published TMD diagnostic literature followed by review and consensus via a formal structured process. The panel's recommendations for further revision of the Axis I diagnostic algorithms were assessed for validity by using the Validation Project's data set, and for reliability by using newly collected data from the ongoing TMJ Impact Project—the follow-up study to the Validation Project. New Axis II instruments were identified through a comprehensive search of the literature providing valid instruments that, relative to the RDC/TMD, are shorter in length, are available in the public domain, and currently are being used in medical settings. Results The newly recommended Diagnostic Criteria for TMD (DC/TMD) Axis I protocol includes both a valid screener for detecting any pain-related TMD as well as valid diagnostic criteria for differentiating the most common pain-related TMD (sensitivity ≥ 0.86, specificity ≥ 0.98) and for one intra-articular disorder (sensitivity of 0.80 and specificity of 0.97). Diagnostic criteria for other common intra-articular disorders lack adequate validity for clinical diagnoses but can be used for screening purposes. Inter-examiner reliability for the clinical assessment associated with the validated DC/TMD criteria for pain-related TMD is excellent (kappa ≥ 0.85). Finally, a comprehensive classification system that includes both the common and less common TMD is also presented. The Axis II protocol retains selected original RDC/TMD screening instruments augmented with new instruments to assess jaw function as well as behavioral and additional psychosocial factors. The Axis II protocol is divided into screening and comprehensive self-report instrument sets. The screening instruments’ 41 questions assess pain intensity, pain-related disability, psychological distress, jaw functional limitations, and parafunctional behaviors, and a pain drawing is used to assess locations of pain. The comprehensive instruments, composed of 81 questions, assess in further detail jaw functional limitations and psychological distress as well as additional constructs of anxiety and presence of comorbid pain conditions. Conclusion The recommended evidence-based new DC/TMD protocol is appropriate for use in both clinical and research settings. More comprehensive instruments augment short and simple screening instruments for Axis I and Axis II. 
These validated instruments allow for identification of patients with a range of simple to complex TMD presentations. PMID:24482784
Lardy-Fontan, Sophie; Le Diouron, Véronique; Drouin, Catherine; Lalere, Béatrice; Vaslin-Reimann, Sophie; Dauchy, Xavier; Rosin, Christophe
2017-06-01
Research on emerging substances in drinking water presents major interest, and the possibility of trace contamination has raised increasing concern from the scientific community and the public authorities. More particularly, residues of pharmaceuticals and personal care products (PPCPs) in bottled water are a very important issue due to societal concerns and potential media impact. In this context, it has become necessary to carry out reliable monitoring. This requires measurements of high quality with demonstration of accuracy and well-defined uncertainty. In this study, 20 pharmaceutical compounds were targeted for the first time in 167 bottled waters from France and other European countries. An isotope dilution-solid phase extraction-liquid chromatography mass spectrometry method, together with stringent quality control and quality assurance protocols, was developed and validated according to French mandatory standards. Recoveries between 87% and 112% were obtained, with coefficients of variation below 20%. Operational limits of quantification (LOQ) were between 5 and 30 ng L-1. Expanded uncertainties (k=2) ranged between 16% and 43% and were below 35% for half of the compounds. The survey showed only four positive quantifications, thereby highlighting the rarity of contamination. Copyright © 2017 Elsevier B.V. All rights reserved.
Routing in Mobile Wireless Sensor Networks: A Leader-Based Approach.
Burgos, Unai; Amozarrain, Ugaitz; Gómez-Calzado, Carlos; Lafuente, Alberto
2017-07-07
This paper presents a leader-based approach to routing in Mobile Wireless Sensor Networks (MWSN). Using local information from neighbour nodes, a leader election mechanism maintains a spanning tree in order to provide the necessary adaptations for efficient routing upon the connectivity changes resulting from the mobility of sensors or sink nodes. We present two protocols following the leader election approach, which have been implemented using Castalia and OMNeT++. The protocols have been evaluated, besides other reference MWSN routing protocols, to analyse the impact of network size and node velocity on performance, which has demonstrated the validity of our approach.
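The two protocols themselves are not detailed in the abstract; as a rough illustration of the underlying idea (a leader election whose result anchors a spanning tree usable for routing towards the sink), here is a toy synchronous tree construction from an elected leader, with all message handling abstracted away. The election rule shown is a placeholder, not the authors' mechanism.

```python
from collections import deque

def build_spanning_tree(adjacency, leader):
    """Breadth-first construction of parent pointers from an elected leader.

    adjacency: dict node -> iterable of neighbour nodes (the current, possibly
    mobile, connectivity snapshot).  Returns {node: parent}; routing towards the
    leader/sink simply follows parent pointers.  A real MWSN protocol would
    rebuild or repair this tree as hello messages reveal topology changes.
    """
    parent = {leader: None}
    frontier = deque([leader])
    while frontier:
        u = frontier.popleft()
        for v in adjacency.get(u, ()):
            if v not in parent:          # first announcement wins -> tree edge
                parent[v] = u
                frontier.append(v)
    return parent

if __name__ == "__main__":
    adj = {"sink": ["a", "b"], "a": ["sink", "c"], "b": ["sink", "c"], "c": ["a", "b"]}
    leader = max(adj)                     # toy election rule: lexicographically largest id
    print(leader, build_spanning_tree(adj, leader))
```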
Challenges in verification and validation of autonomous systems for space exploration
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Jonsson, Ari
2005-01-01
Space exploration applications offer a unique opportunity for the development and deployment of autonomous systems, due to limited communications, large distances, and great expense of direct operation. At the same time, the risk and cost of space missions leads to reluctance to taking on new, complex and difficult-to-understand technology. A key issue in addressing these concerns is the validation of autonomous systems. In recent years, higher-level autonomous systems have been applied in space applications. In this presentation, we will highlight those autonomous systems, and discuss issues in validating these systems. We will then look to future demands on validating autonomous systems for space, identify promising technologies and open issues.
Biomedical surveillance: rights conflict with rights.
Atherley, G; Johnston, N; Tennassee, M
1986-10-01
Medical screening and biomedical monitoring violate individual rights. Such conflicts of right with right are acted upon synergistically by uncertainty which, in some important respects, increases rather than decreases as a result of research. Issues of rightness and wrongness, ethical issues, arise because the human beings who are subjects of medical screening and biological monitoring often have little or no option whether to be subjected to them. We identify issues of rightness and wrongness of biomedical surveillance for various purposes of occupational health and safety. We distinguish between social validity and scientific validity. We observe that principles are well established for scientific validity, but not for social validity. We support guidelines as a way forward.
Vehicle Density Based Forwarding Protocol for Safety Message Broadcast in VANET
Huang, Jiawei; Wang, Jianxin
2014-01-01
In vehicular ad hoc networks (VANETs), the medium access control (MAC) protocol is of great importance to provide time-critical safety applications. Contemporary multihop broadcast protocols in VANETs usually choose the farthest node in broadcast range as the forwarder to reduce the number of forwarding hops. However, in this paper, we demonstrate that the farthest forwarder may experience large contention delay in case of high vehicle density. We propose an IEEE 802.11-based multihop broadcast protocol VDF to address the issue of emergency message dissemination. To achieve the tradeoff between contention delay and forwarding hops, VDF adaptably chooses the forwarder according to the vehicle density. Simulation results show that, due to its ability to decrease the transmission collisions, the proposed protocol can provide significantly lower broadcast delay. PMID:25121125
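VDF's exact selection rule is not given in the abstract; the sketch below only captures the stated trade-off: under low density pick the farthest in-range vehicle (fewer hops), under high density back off to a closer forwarder to limit MAC contention delay. The density threshold and back-off fraction are hypothetical values chosen for illustration.

```python
def choose_forwarder(neighbours, comm_range, vehicle_density,
                     dense_threshold=0.05, backoff_fraction=0.6):
    """neighbours: dict id -> distance ahead of the sender (metres).
    vehicle_density: vehicles per metre of road (hypothetical units).
    Returns the id of the chosen relay, or None if nobody is in range."""
    in_range = {v: d for v, d in neighbours.items() if 0 < d <= comm_range}
    if not in_range:
        return None
    if vehicle_density <= dense_threshold:
        # Sparse traffic: the farthest node minimizes the hop count.
        return max(in_range, key=in_range.get)
    # Dense traffic: aim part-way into the range to reduce contention delay.
    target = backoff_fraction * comm_range
    return min(in_range, key=lambda v: abs(in_range[v] - target))

if __name__ == "__main__":
    neigh = {"v1": 80.0, "v2": 190.0, "v3": 240.0}
    print(choose_forwarder(neigh, comm_range=250.0, vehicle_density=0.01))  # sparse -> "v3"
    print(choose_forwarder(neigh, comm_range=250.0, vehicle_density=0.12))  # dense  -> "v2"
```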
Fernandez-Calle, Pilar; Pelaz, Sandra; Oliver, Paloma; Alcaide, Maria Jose; Gomez-Rioja, Ruben; Buno, Antonio; Iturzaeta, Jose Manuel
2013-01-01
Introduction: Technological innovation requires the laboratories to ensure that modifications or incorporations of new techniques do not alter the quality of their results. In an ISO 15189 accredited laboratory, flexible scope accreditation facilitates the inclusion of these changes prior to accreditation body evaluation. A strategy to perform the validation of a biochemistry analyzer in an accredited laboratory having a flexible scope is shown. Materials and methods: A validation procedure including the evaluation of imprecision and bias of two Dimension Vista 1500 analysers was conducted. Comparability of patient results between one of them and the recently replaced Dimension RxL Max was evaluated. All studies followed the respective Clinical and Laboratory Standards Institute (CLSI) protocols. Thirty chemistry assays were studied. Coefficients of variation, percent bias and total error were calculated for all tests, and biological variation was considered as the acceptance criterion. Quality control material and patient samples were used as test materials. Interchangeability of the results was established by processing forty patients' samples in both devices. Results: 27 of the 30 studied parameters met allowable performance criteria. Sodium, chloride and magnesium did not fulfil acceptance criteria. Evidence of interchangeability of patient results was obtained for all parameters except magnesium, NT-proBNP, cTroponin I and C-reactive protein. Conclusions: A laboratory having a well structured and documented validation procedure can opt to get a flexible scope of accreditation. In addition, performing these activities prior to use on patient samples may evidence technical issues which must be corrected to minimize their impact on patient results. PMID:23457769
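The abstract states only that biological variation was the acceptance criterion; the sketch below applies the commonly used desirable specifications derived from within- and between-subject biological variation (Fraser-style formulas, an assumption on our part, since the paper does not spell out which limits were used) to imprecision, bias and total error estimates. The sodium coefficients in the example are illustrative.

```python
from math import sqrt

def allowable_limits(cv_within, cv_between):
    """Desirable specifications from biological variation (assumed formulas):
    imprecision <= 0.5*CVi, bias <= 0.25*sqrt(CVi^2 + CVg^2),
    TEa = allowable bias + 1.65 * allowable imprecision."""
    imp = 0.5 * cv_within
    bias = 0.25 * sqrt(cv_within**2 + cv_between**2)
    return imp, bias, bias + 1.65 * imp

def verify(observed_cv, observed_bias, cv_within, cv_between):
    imp_a, bias_a, tea = allowable_limits(cv_within, cv_between)
    total_error = abs(observed_bias) + 1.65 * observed_cv
    return {
        "imprecision_ok": observed_cv <= imp_a,
        "bias_ok": abs(observed_bias) <= bias_a,
        "total_error_ok": total_error <= tea,
    }

if __name__ == "__main__":
    # Hypothetical sodium figures: tiny biological variation makes sodium a very
    # demanding analyte, consistent with it failing the criteria in the study.
    print(verify(observed_cv=0.9, observed_bias=0.5, cv_within=0.6, cv_between=0.7))
```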
Actions of the fall prevention protocol: mapping with the classification of nursing interventions.
Alves, Vanessa Cristina; Freitas, Weslen Carlos Junior de; Ramos, Jeferson Silva; Chagas, Samantha Rodrigues Garbis; Azevedo, Cissa; Mata, Luciana Regina Ferreira da
2017-12-21
To analyze the correspondence between the actions contained in the fall prevention protocol of the Ministry of Health and the Nursing Interventions Classification (NIC) by a cross-mapping. This is a descriptive study carried out in four stages: protocol survey, identification of NIC interventions related to the nursing diagnosis "risk of falls", cross-mapping, and validation of the mapping using the Delphi technique. There were 51 actions identified in the protocol and 42 interventions in the NIC. Two rounds of mapping evaluation were carried out by the experts. There were 47 protocol actions corresponding to 25 NIC interventions. The NIC interventions that presented the highest correspondence with protocol actions were: fall prevention, environmental-safety control, and risk identification. Regarding the classification of similarity and comprehensiveness of the 47 actions of the protocol mapped, 44.7% were considered more detailed and specific than the NIC, 29.8% less specific than the NIC and 25.5% were classified as similar in significance to the NIC. Most of the actions contained in the protocol are more specific and detailed; however, the NIC contemplates a greater diversity of interventions and may support a review of the protocol to expand actions related to fall prevention.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W.; Romberger, Jeff
The HVAC Controls Evaluation Protocol is designed to address evaluation issues for direct digital controls/energy management systems/building automation systems (DDC/EMS/BAS) that are installed to control heating, ventilation, and air-conditioning (HVAC) equipment in commercial and institutional buildings. (This chapter refers to the DDC/EMS/BAS measure as HVAC controls.) This protocol may also be applicable to industrial facilities such as clean rooms and labs, which have either significant HVAC equipment or spaces requiring special environmental conditions.
Privacy Preserving Quantum Anonymous Transmission via Entanglement Relay
NASA Astrophysics Data System (ADS)
Yang, Wei; Huang, Liusheng; Song, Fang
2016-06-01
Anonymous transmission is an interesting and crucial issue in computer communication area, which plays a supplementary role to data privacy. In this paper, we put forward a privacy preserving quantum anonymous transmission protocol based on entanglement relay, which constructs anonymous entanglement from EPR pairs instead of multi-particle entangled state, e.g. GHZ state. Our protocol achieves both sender anonymity and receiver anonymity against an active adversary and tolerates any number of corrupt participants. Meanwhile, our protocol obtains an improvement in efficiency compared to quantum schemes in previous literature.
A two-hop based adaptive routing protocol for real-time wireless sensor networks.
Rachamalla, Sandhya; Kancherla, Anitha Sheela
2016-01-01
One of the most important and challenging issues in wireless sensor networks (WSNs) is to optimally manage the limited energy of nodes without degrading the routing efficiency. In this paper, we propose an energy-efficient adaptive routing mechanism for WSNs, which saves energy of nodes by removing the much delayed packets without degrading the real-time performance of the used routing protocol. It uses the adaptive transmission power algorithm which is based on the attenuation of the wireless link to improve the energy efficiency. The proposed routing mechanism can be associated with any geographic routing protocol and its performance is evaluated by integrating with the well known two-hop based real-time routing protocol, PATH and the resulting protocol is energy-efficient adaptive routing protocol (EE-ARP). The EE-ARP performs well in terms of energy consumption, deadline miss ratio, packet drop and end-to-end delay.
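Neither the power-adaptation formula nor the drop rule is given in the abstract; the sketch below shows one plausible reading of both mechanisms: estimate the transmit power needed to overcome the measured link attenuation, and drop a packet once the time already spent plus the expected remaining delay exceeds its deadline. All constants are hypothetical.

```python
def required_tx_power_dbm(attenuation_db, rx_sensitivity_dbm=-94.0, margin_db=3.0,
                          max_power_dbm=0.0):
    """Smallest transmit power (dBm) that still reaches the receiver over a link
    with the given attenuation, capped at the radio's maximum output power."""
    needed = rx_sensitivity_dbm + attenuation_db + margin_db
    return min(needed, max_power_dbm)

def should_drop(elapsed_ms, per_hop_delay_ms, hops_remaining, deadline_ms):
    """Drop 'much delayed' packets: if a packet can no longer plausibly meet its
    end-to-end deadline, forwarding it further only wastes node energy."""
    return elapsed_ms + per_hop_delay_ms * hops_remaining > deadline_ms

if __name__ == "__main__":
    print(required_tx_power_dbm(attenuation_db=78.0))                     # -13.0 dBm suffices
    print(should_drop(elapsed_ms=120, per_hop_delay_ms=15,
                      hops_remaining=4, deadline_ms=150))                 # True -> drop
```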
Reliability and Validity of SERVQUAL Scores Used To Evaluate Perceptions of Library Service Quality.
ERIC Educational Resources Information Center
Thompson, Bruce; Cook, Colleen
Research libraries are increasingly supplementing collection counts with perceptions of service quality as indices of status and productivity. The present study was undertaken to explore the reliability and validity of scores from the SERVQUAL measurement protocol (A. Parasuraman and others, 1991), which has previously been used in this type of…
Measuring Students' Physical Activity Levels: Validating SOFIT for Use with High-School Students
ERIC Educational Resources Information Center
van der Mars, Hans; Rowe, Paul J.; Schuldheisz, Joel M.; Fox, Susan
2004-01-01
This study was conducted to validate the System for Observing Fitness Instruction Time (SOFIT) for measuring physical activity levels of high-school students. Thirty-five students (21 girls and 14 boys from grades 9-12) completed a standardized protocol including lying, sitting, standing, walking, running, curl-ups, and push-ups. Heart rates and…
ERIC Educational Resources Information Center
Gardner, Deborah L.; Huber, Charles H.; Steiner, Robert; Vazquez, Luis A.; Savage, Todd A.
2008-01-01
This article describes the three-stage protocol employed in development and validation of the Inventory of Family Protective Factors (IFPF), a brief-form formal instrument intended to assess the primary protective factors that contribute to family resilience. Following construction of the instrument, data collections and analyses of a total sample…
Cross-Validation of a PACER Prediction Equation for Assessing Aerobic Capacity in Hungarian Youth
ERIC Educational Resources Information Center
Saint-Maurice, Pedro F.; Welk, Gregory J.; Finn, Kevin J.; Kaj, Mónika
2015-01-01
Purpose: The purpose of this article was to evaluate the validity of the Progressive Aerobic Cardiovascular and Endurance Run (PACER) test in a sample of Hungarian youth. Method: Approximately 500 participants (aged 10-18 years old) were randomly selected across Hungary to complete both laboratory (maximal treadmill protocol) and field assessments…
This product is an LC/MS/MS single laboratory validated method for the determination of cylindrospermopsin and anatoxin-a in ambient waters. The product contains step-by-step instructions for sample preparation, analyses, preservation, sample holding time and QC protocols to ensu...
Computational Enzyme Design: Advances, hurdles and possible ways forward
Linder, Mats
2012-01-01
This mini review addresses recent developments in computational enzyme design. Successful protocols as well as known issues and limitations are discussed from an energetic perspective. It will be argued that improved results can be obtained by including a dynamic treatment in the design protocol. Finally, a molecular dynamics-based approach for evaluating and refining computational designs is presented. PMID:24688650
ERIC Educational Resources Information Center
Chen, Charlie C.; Vannoy, Sandra
2013-01-01
Voice over Internet Protocol (VoIP)-enabled online learning service providers struggle with high attrition rates and low customer loyalty despite VoIP's high degree of system fit for online global learning applications. Effective solutions to this prevalent problem rely on an understanding of system quality, information quality, and…
Of taps and toilets: quasi-experimental protocol for evaluating community-demand-driven projects.
Pattanayak, Subhrendu K; Poulos, Christine; Yang, Jui-Chen; Patil, Sumeet R; Wendland, Kelly J
2009-09-01
Sustainable and equitable access to safe water and adequate sanitation are widely acknowledged as vital, yet neglected, development goals. Water supply and sanitation (WSS) policies are justified not only on the usual efficiency criteria but also on major equity concerns. Yet, to date there are few scientific impact evaluations showing that WSS policies are effective in delivering social welfare outcomes. This lack of an evaluation culture is partly because WSS policies are characterized by diverse mechanisms, broad goals and the increasing importance of decentralized delivery, and partly because programme administrators are unaware of appropriate methods. We describe a protocol for a quasi-experimental evaluation of a community-demand-driven programme for water and sanitation in rural India, which addresses several evaluation challenges. After briefly reviewing policy and implementation issues in the sector, we describe key features of our protocol, including control group identification, pre-post measurement, programme theory, sample sufficiency and robust indicators. At its core, our protocol proposes to combine propensity score matching and difference-in-difference estimation. We conclude by briefly summarizing how quasi-experimental impact evaluations can address key issues in WSS policy design and when such evaluations are needed.
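The matching-plus-difference-in-differences core of this protocol can be sketched compactly. The following Python fragment is a minimal illustration under assumed column names (`treated`, `y_pre`, `y_post`, plus covariates), not the authors' actual estimation code.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def psm_did(df: pd.DataFrame, covariates: list) -> float:
    """Propensity-score matching followed by a difference-in-differences estimate.

    Assumes df has a binary 'treated' flag, pre/post outcomes 'y_pre'/'y_post',
    and the listed covariate columns (column names are illustrative assumptions).
    """
    # 1. Propensity scores: probability of programme participation given covariates.
    ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
    df = df.assign(ps=ps_model.predict_proba(df[covariates])[:, 1])

    treated = df[df["treated"] == 1]
    control = df[df["treated"] == 0]

    # 2. Nearest-neighbour matching on the propensity score (1:1, with replacement).
    nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
    _, idx = nn.kneighbors(treated[["ps"]])
    matched_control = control.iloc[idx.ravel()]

    # 3. Difference-in-differences on the matched sample: the treated pre-post
    #    change minus the matched-control pre-post change.
    delta_treated = (treated["y_post"] - treated["y_pre"]).mean()
    delta_control = (matched_control["y_post"] - matched_control["y_pre"]).mean()
    return delta_treated - delta_control
```

In practice one would also check covariate balance after matching and bootstrap the estimate to obtain a confidence interval.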
Galaviz-Mosqueda, Alejandro; Villarreal-Reyes, Salvador; Galeana-Zapién, Hiram; Rubio-Loyola, Javier; Covarrubias-Rosales, David H.
2014-01-01
Vehicular ad hoc networks (VANETs) have been identified as a key technology to enable intelligent transport systems (ITS), which are aimed to radically improve the safety, comfort, and greenness of the vehicles in the road. However, in order to fully exploit VANETs potential, several issues must be addressed. Because of the high dynamic of VANETs and the impairments in the wireless channel, one key issue arising when working with VANETs is the multihop dissemination of broadcast packets for safety and infotainment applications. In this paper a reliable low-overhead multihop broadcast (RLMB) protocol is proposed to address the well-known broadcast storm problem. The proposed RLMB takes advantage of the hello messages exchanged between the vehicles and it processes such information to intelligently select a relay set and reduce the redundant broadcast. Additionally, to reduce the hello messages rate dependency, RLMB uses a point-to-zone link evaluation approach. RLMB performance is compared with one of the leading multihop broadcast protocols existing to date. Performance metrics show that our RLMB solution outperforms the leading protocol in terms of important metrics such as packet dissemination ratio, overhead, and delay. PMID:25133224
Lightweight and scalable secure communication in VANET
NASA Astrophysics Data System (ADS)
Zhu, Xiaoling; Lu, Yang; Zhu, Xiaojuan; Qiu, Shuwei
2015-05-01
To prevent messages from being tampered with or forged in vehicular ad hoc networks (VANET), the digital signature method is adopted by IEEE 1609.2. However, the costs of this method are excessively high for large-scale networks. This paper addresses the issue with a secure communication framework built on lightweight cryptographic primitives. In our framework, point-to-point and broadcast communications for vehicle-to-infrastructure (V2I) and vehicle-to-vehicle (V2V) are studied, based mainly on symmetric cryptography. A new issue that arises is symmetric key management. Thus, we develop key distribution and agreement protocols for two-party keys and group keys under different environments, whether a road side unit (RSU) is deployed or not. The analysis shows that our protocols provide confidentiality, authentication, perfect forward secrecy, forward secrecy and backward secrecy. In particular, the proposed group key agreement protocol solves the key leak problem caused by members joining or leaving in existing key agreement protocols. Owing to aggregated signatures and the substitution of XOR for point addition, the average computation and communication costs do not increase significantly with the number of vehicles; hence, our framework provides good scalability.
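As a loose illustration of why rekeying on membership change yields forward and backward secrecy, the toy sketch below regenerates the group key whenever a vehicle joins or leaves and wraps it under each member's pairwise key. The construction (HMAC-SHA256 keystream, RSU-held key table) is an assumption made for the example and is not the paper's actual protocol.

```python
import hashlib
import hmac
import os

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with an HMAC-SHA256 keystream (nonce must be fresh)."""
    stream, counter = bytearray(), 0
    while len(stream) < len(data):
        stream += hmac.new(key, nonce + counter.to_bytes(4, "big"), hashlib.sha256).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

class GroupKeyManager:
    """RSU-style manager: any join or leave triggers a fresh group key, so a departed
    vehicle cannot read future traffic and a new one cannot read past traffic
    (forward/backward secrecy at the level of group messages)."""

    def __init__(self):
        self.pairwise = {}              # vehicle id -> pairwise key shared with the RSU
        self.group_key = os.urandom(32)

    def _rekey(self):
        self.group_key = os.urandom(32)
        nonce = os.urandom(16)
        # New group key wrapped under each current member's pairwise key.
        return {vid: (nonce, keystream_xor(k, nonce, self.group_key))
                for vid, k in self.pairwise.items()}

    def join(self, vid, pairwise_key):
        self.pairwise[vid] = pairwise_key
        return self._rekey()

    def leave(self, vid):
        self.pairwise.pop(vid, None)
        return self._rekey()
```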
Discriminant Validity Assessment: Use of Fornell & Larcker criterion versus HTMT Criterion
NASA Astrophysics Data System (ADS)
Hamid, M. R. Ab; Sami, W.; Mohmad Sidek, M. H.
2017-09-01
Assessment of discriminant validity is a must in any research that involves latent variables, in order to prevent multicollinearity issues. The Fornell and Larcker criterion is the most widely used method for this purpose. However, a new method for establishing discriminant validity has emerged: the heterotrait-monotrait (HTMT) ratio of correlations. Therefore, this article presents the results of discriminant validity assessment using both methods. Data from a previous study involving 429 respondents were used for empirical validation of a value-based excellence model in higher education institutions (HEI) in Malaysia. From the analysis, convergent, divergent and discriminant validity were established and admissible using the Fornell and Larcker criterion. However, discriminant validity is an issue when the HTMT criterion is employed. This shows that the latent variables under study face multicollinearity issues that should be examined in further detail. It also implies that the HTMT criterion is a stringent measure that can detect possible lack of discriminant validity among the latent variables. In conclusion, the instrument, which consisted of six latent variables, was still lacking in terms of discriminant validity and should be explored further.
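Both criteria are simple functions of the measurement model. The sketch below computes them from standardized loadings and an item correlation matrix using the standard definitions (AVE as the mean squared loading; HTMT as the mean heterotrait correlation over the geometric mean of the monotrait correlations); the data structures are assumed for illustration and are not taken from the article.

```python
import numpy as np
from itertools import combinations

def fornell_larcker_ok(loadings, construct_corr):
    """Fornell-Larcker check: sqrt(AVE) of every construct must exceed its
    correlation with each other construct.
    loadings: dict construct -> list of standardized indicator loadings.
    construct_corr: dict (c1, c2) -> latent correlation."""
    sqrt_ave = {c: np.sqrt(np.mean(np.square(l))) for c, l in loadings.items()}
    return sqrt_ave, all(abs(r) < min(sqrt_ave[c1], sqrt_ave[c2])
                         for (c1, c2), r in construct_corr.items())

def htmt(item_corr, items_of, c1, c2):
    """Heterotrait-monotrait ratio for constructs c1 and c2.
    item_corr: square array of item correlations; items_of: dict construct -> item indices."""
    hetero = np.mean([abs(item_corr[i][j]) for i in items_of[c1] for j in items_of[c2]])
    mono = lambda c: np.mean([abs(item_corr[i][j]) for i, j in combinations(items_of[c], 2)])
    return hetero / np.sqrt(mono(c1) * mono(c2))
```

HTMT values above the commonly used cut-offs of 0.85 or 0.90 are usually read as a discriminant validity problem, which is the kind of issue the article reports the Fornell and Larcker criterion missed.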
Huang, Jinhua; Li, Zhijie; Li, Guimei; Liu, Zhaoying
2015-10-01
This study aimed to evaluate the accuracy of the Andon KD-5965 upper-arm blood pressure monitor according to the European Society of Hypertension International Protocol revision 2010. Systolic and diastolic blood pressures were sequentially measured in 33 adults (20 women) using a mercury sphygmomanometer (two observers) and the Andon KD-5965 device (one supervisor). A total of 99 pairs of comparisons were obtained from 33 participants for judgments in two parts with three grading phases. The device achieved the targets in part 1 of the validation study. The number of absolute differences between the device and observers within 5, 10, and 15 mmHg was 70/99, 91/99, and 98/99, respectively, for systolic blood pressure and 81/99, 99/99, and 99/99, respectively, for diastolic blood pressure. The device also fulfilled the criteria in part 2 of the validation study. Twenty-five and 29 participants, for systolic and diastolic blood pressure, respectively, had at least two of the three device-observer differences within 5 mmHg (required: ≥24). Two participants for systolic and one participant for diastolic blood pressure had all three device-observer comparisons greater than 5 mmHg. According to the validation results, with better performance for diastolic than for systolic blood pressure, the Andon automated oscillometric upper-arm blood pressure monitor KD-5965 fulfilled the requirements of the European Society of Hypertension International Protocol revision 2010, and hence can be recommended for blood pressure measurement in adults.
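The bookkeeping behind reports like this one (and the similar validations that follow) is straightforward to sketch: count absolute device-observer differences within 5, 10 and 15 mmHg, then apply the per-participant part 2 rules. The numeric pass thresholds below are assumed values for ESH-IP revision 2010 and should be checked against the published protocol before relying on them.

```python
def esh_ip_2010(device, observer):
    """Tally ESH-IP 2010 style statistics from 99 paired readings (3 per participant).

    device, observer: lists of 99 readings each, ordered participant by participant.
    Pass thresholds are assumed values for the 2010 revision (verify against the protocol).
    """
    diffs = [abs(d - o) for d, o in zip(device, observer)]
    within = {t: sum(x <= t for x in diffs) for t in (5, 10, 15)}

    # Part 2: per-participant counts of differences within 5 mmHg (3 readings each).
    per_subject = [sum(x <= 5 for x in diffs[i:i + 3]) for i in range(0, len(diffs), 3)]
    at_least_two_within5 = sum(c >= 2 for c in per_subject)
    none_within5 = sum(c == 0 for c in per_subject)

    all_of = {5: 65, 10: 81, 15: 93}   # assumed "All of" band
    two_of = {5: 73, 10: 87, 15: 96}   # assumed "Two of" band
    part1 = (all(within[t] >= all_of[t] for t in (5, 10, 15))
             and sum(within[t] >= two_of[t] for t in (5, 10, 15)) >= 2)
    part2 = at_least_two_within5 >= 24 and none_within5 <= 3
    return within, at_least_two_within5, none_within5, part1 and part2
```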
Bilo, Grzegorz; Zorzi, Cristina; Ochoa Munera, Juan E; Torlasco, Camilla; Giuli, Valentina; Parati, Gianfranco
2015-10-01
The present study aimed to evaluate the accuracy of the Somnotouch-NIBP noninvasive continuous blood pressure monitor according to the European Society of Hypertension International Protocol revision 2010. Systolic and diastolic blood pressures were sequentially measured in 33 adults (11 women, mean age 63.5±11.9 years) using a mercury sphygmomanometer (two observers) and the Somnotouch-NIBP device (one supervisor). A total of 99 pairs of comparisons were obtained from 33 participants for judgments in two parts with three grading phases. All the validation requirements were fulfilled. The Somnotouch-NIBP device fulfilled the requirements of part 1 of the validation study. The number of absolute differences between device and observers within 5, 10, and 15 mmHg was 75/99, 90/99, and 96/99, respectively, for systolic blood pressure and 90/99, 99/99, and 99/99, respectively, for diastolic blood pressure. The device also fulfilled the criteria in part 2 of the validation study. Twenty-seven and 31 participants had at least two of the three device-observer differences less than or equal to 5 mmHg for systolic and diastolic blood pressure, respectively. All three device-observer differences were greater than 5 mmHg in two participants for systolic and in one participant for diastolic blood pressure. The Somnotouch-NIBP noninvasive continuous blood pressure monitor has passed the requirements of the International Protocol revision 2010, and hence can be recommended for blood pressure monitoring in adults, at least under conditions corresponding to those investigated in our study.
Simulating Global Climate Summits
ERIC Educational Resources Information Center
Vesperman, Dean P.; Haste, Turtle; Alrivy, Stéphane
2014-01-01
One of the most persistent and controversial issues facing the global community is climate change. With the creation of the UN Framework Convention on Climate Change (UNFCCC) in 1992 and the Kyoto Protocol (1997), the global community established some common ground on how to address this issue. However, the last several climate summits have failed…
E-novo: an automated workflow for efficient structure-based lead optimization.
Pearce, Bradley C; Langley, David R; Kang, Jia; Huang, Hongwei; Kulkarni, Amit
2009-07-01
An automated E-Novo protocol designed as a structure-based lead optimization tool was prepared through Pipeline Pilot with existing CHARMm components in Discovery Studio. A scaffold core having 3D binding coordinates of interest is generated from a ligand-bound protein structural model. Ligands of interest are generated from the scaffold using an R-group fragmentation/enumeration tool within E-Novo, with their cores aligned. The ligand side chains are conformationally sampled and are subjected to core-constrained protein docking, using a modified CHARMm-based CDOCKER method to generate top poses along with CDOCKER energies. In the final stage of E-Novo, a physics-based binding energy scoring function ranks the top ligand CDOCKER poses using a more accurate Molecular Mechanics-Generalized Born with Surface Area method. Correlation of the calculated ligand binding energies with experimental binding affinities was used to validate protocol performance. Inhibitors of Src tyrosine kinase, CDK2 kinase, beta-secretase, factor Xa, HIV protease, and thrombin were used to test the protocol using published ligand crystal structure data within reasonably defined binding sites. In-house Respiratory Syncytial Virus inhibitor data were used as a more challenging test set using a hand-built binding model. Least squares fits for all data sets suggested reasonable validation of the protocol within the context of observed ligand binding poses. The E-Novo protocol provides a convenient all-in-one structure-based design process for rapid assessment and scoring of lead optimization libraries.
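The final validation step, correlating calculated binding energies with experimental affinities by least squares, amounts to a simple linear fit; the arrays below are invented placeholder values, not data from the study.

```python
import numpy as np

# Hypothetical data: MM-GBSA binding energies (kcal/mol) and experimental pIC50 values.
calc_energy = np.array([-45.2, -51.8, -39.7, -48.3, -55.1])
exp_affinity = np.array([6.1, 7.3, 5.4, 6.8, 8.0])

# Least-squares fit and coefficient of determination.
slope, intercept = np.polyfit(calc_energy, exp_affinity, 1)
pred = slope * calc_energy + intercept
r2 = 1 - np.sum((exp_affinity - pred) ** 2) / np.sum((exp_affinity - exp_affinity.mean()) ** 2)
print(f"fit: pIC50 = {slope:.3f} * dG + {intercept:.2f}, R^2 = {r2:.2f}")
```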
Tian, Huiyong; Zeng, Sijian; Zhong, Xiaoyan; Gong, Wei; Liu, Wenjun
2015-10-01
Transtek blood pressure monitor TMB-1491 is an automatic upper arm device designed for self/home measurement in adult populations. This study aimed to evaluate its accuracy according to the European Society of Hypertension International Protocol revision 2010. The protocol requirements were followed precisely, with the recruitment of 33 adult individuals on whom same-left-arm sequential systolic blood pressure (SBP) and diastolic blood pressure (DBP) measurements were taken. According to the validation protocol, 99 pairs of test device and reference blood pressure measurements were obtained in this study (three pairs for each of the 33 participants). The device produced 74, 95, and 99 measurements within 5, 10, and 15 mmHg for SBP and 85, 97, and 99 for DBP, respectively. The mean±SD device-observer difference was -0.6±4.4 mmHg for SBP and -0.6±3.4 mmHg for DBP. The number of participants with two or three device-observer differences within 5 mmHg was 24 for SBP and 29 for DBP. In addition, none of the participants had all three device-observer differences greater than 5 mmHg for SBP, and three of the participants did for DBP. The Transtek TMB-1491 has passed all phases of the European Society of Hypertension International Protocol revision 2010 and can be recommended for self/home measurement in adult populations.
Verifiable Secret Redistribution
2001-10-01
but they are not trusted with the secret. Thus, we require a protocol for redistribution without reconstruction of the secret. We also require verification that the new shareholders have valid shares (ones that can be used to reconstruct the secret). We present a new protocol to perform non... secret to shareholders in Shamir's (m,n) threshold scheme (one in which we require m of n shares to reconstruct the secret), and wish to redistribute the
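For context, Shamir's (m, n) threshold scheme referenced here splits a secret into n shares such that any m of them reconstruct it. A minimal finite-field sketch follows; it shows only the basic sharing and reconstruction, not the redistribution or verification protocol described in the report.

```python
import random

P = 2**127 - 1  # a Mersenne prime used as the field modulus (illustrative choice)

def make_shares(secret, m, n):
    """Split `secret` into n shares, any m of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(m - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(secret=123456789, m=3, n=5)
assert reconstruct(shares[:3]) == 123456789   # any 3 of the 5 shares suffice
```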
Functional and nonfunctional testing of ATM networks
NASA Astrophysics Data System (ADS)
Ricardo, Manuel; Ferreira, M. E. P.; Guimaraes, Francisco E.; Mamede, J.; Henriques, M.; da Silva, Jorge A.; Carrapatoso, E.
1995-02-01
ATM networks will support new multimedia services that will require new protocols; those services and protocols will need different test strategies and tools. In this paper, the concepts of functional and non-functional testers of ATM networks are discussed, a multimedia service and its requirements are presented, and finally a summary description of an ATM network and of the test tool that will be used to validate it is given.
In silico simulations of experimental protocols for cardiac modeling.
Carro, Jesus; Rodriguez, Jose Felix; Pueyo, Esther
2014-01-01
A mathematical model of the action potential (AP) involves the sum of different transmembrane ionic currents and the balance of intracellular ionic concentrations. To each ionic current corresponds an equation involving several effects. A number of model parameters must be identified using specific experimental protocols in which the effects are considered independent. However, as model complexity grows, the interaction between effects becomes increasingly important. Therefore, model parameters identified by considering the different effects as independent might be misleading. In this work, a novel methodology is proposed for model parameter identification and validation, consisting of performing in silico simulations of the experimental protocol and then comparing experimental and simulated outcomes. The potential of the methodology is demonstrated by validating voltage-dependent L-type calcium current (ICaL) inactivation in recently proposed human ventricular AP models with different formulations. Our results show large differences between ICaL inactivation as calculated from the model equation and ICaL inactivation obtained from the in silico simulations, due to the interaction between effects and/or to the experimental protocol. Our results suggest that, when proposing any new model formulation, consistency between that formulation and the corresponding experimental data it aims to reproduce needs to be verified first, considering all involved factors.
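The paper's central point, that inactivation read off the model equation can differ from inactivation measured by simulating the voltage-clamp protocol, can be illustrated with a toy gate. The functional forms, time constants and prepulse duration below are invented for illustration and do not correspond to any published human ventricular model.

```python
import numpy as np

# Toy voltage-dependent inactivation gate; forms and constants are illustrative only.
h_inf = lambda v: 1.0 / (1.0 + np.exp((v + 30.0) / 6.0))            # steady-state inactivation
tau_h = lambda v: 20.0 + 200.0 * np.exp(-((v + 25.0) / 20.0) ** 2)  # time constant, ms

def simulated_availability(v_pre, pre_ms=500.0, v_hold=-80.0, dt=0.05):
    """Two-pulse protocol: start at steady state at v_hold, apply a conditioning
    prepulse of pre_ms at v_pre, and report the gate value h at the test pulse."""
    h = h_inf(v_hold)
    for _ in range(int(pre_ms / dt)):   # forward-Euler relaxation during the prepulse
        h += dt * (h_inf(v_pre) - h) / tau_h(v_pre)
    return h

for v in range(-60, 11, 10):
    print(f"V_pre = {v:4d} mV   equation h_inf = {h_inf(v):.3f}   "
          f"simulated protocol = {simulated_availability(v):.3f}")
# With a 500 ms prepulse the two curves diverge wherever tau_h is comparable to the
# prepulse duration, which is the kind of equation-versus-protocol mismatch reported for ICaL.
```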
Chen, Wan; Zeng, Zhaolin; Li, Lizhi; Wan, Xiaofen; Wan, Yi
2014-10-01
This study aimed to validate the Pangao PG-800B5 upper arm blood pressure monitor according to the European Society of Hypertension International Protocol revision 2010. A total of 33 participants, 16 men and 17 women, were included in the device evaluation. The protocol requirements were followed precisely. The mean age of the participants was 56.4±21.0 years (range 22-84 years). The mean systolic blood pressure was 143.6±25.5 mmHg (range 98-188 mmHg), the mean diastolic blood pressure was 85.7±17.2 mmHg (range 49-125 mmHg), and the mean arm circumference was 26.1±2.2 cm (range 23-32 cm). On average, the device overestimated the systolic blood pressure by 0.9±4.2 mmHg and diastolic blood pressure by 0.7±4.5 mmHg. The device passed all requirements, fulfilling the standards of the protocol. Therefore, the Pangao PG-800B5 upper arm blood pressure monitor can be recommended for clinical use and self-measurement in an adult population.
Yang, Qi; Franco, Christopher M M; Sorokin, Shirley J; Zhang, Wei
2017-02-02
For sponges (phylum Porifera), there is no reliable molecular protocol available for species identification. To address this gap, we developed a multilocus-based Sponge Identification Protocol (SIP) validated by a sample of 37 sponge species belonging to 10 orders from South Australia. The universal barcode COI mtDNA, 28S rRNA gene (D3-D5), and the nuclear ITS1-5.8S-ITS2 region were evaluated for their suitability and capacity for sponge identification. The highest Bit Score was applied to infer the identity. The reliability of SIP was validated by phylogenetic analysis. The 28S rRNA gene and COI mtDNA performed better than the ITS region in classifying sponges at various taxonomic levels. A major limitation is that the databases are not well populated and possess low diversity, making it difficult to conduct the molecular identification protocol. The identification is also impacted by the accuracy of the morphological classification of the sponges whose sequences have been submitted to the database. Re-examination of the morphological identification further demonstrated and improved the reliability of sponge identification by SIP. Integrated with morphological identification, the multilocus-based SIP offers an improved protocol for more reliable and effective sponge identification, by coupling the accuracy of different DNA markers.
Pedata, Paola; Corvino, Anna Rita; Napolitano, Raffaele Carmine; Garzillo, Elpidio Maria; Furfaro, Ciro; Lamberti, Monica
2016-01-20
For many years now, thanks to the development of modern diving techniques, there has been a rapid spread of diving activities everywhere. In fact, divers are ever more numerous, both among the Armed Forces and among civilians who dive for work, such as fishing, biological research and archeology. The aim of the study was to propose a health protocol for assessing the fitness for work of professional divers, keeping in mind the peculiar nature of the work, existing Italian legislation that is largely out of date, and the technical and scientific evolution in this occupational field. We performed an analysis of the diseases occurring most frequently among professional divers and of the clinical investigations and imaging techniques used for their fitness-for-work assessment. From analysis of the health protocol recommended by D.M. 13 January 1979 (Ministerial Decree), which is the one most used by occupational health physicians, several critical issues emerged. Very often the clinical investigations and imaging techniques still in use are almost obsolete, while simple and inexpensive investigations that are more useful for fitness-for-work assessment are neglected. Considering the outdated legislation concerning diving disciplines, it is necessary to draw up a common health protocol that takes into account the clinical and scientific knowledge and skills acquired in this area. The aim of this protocol is to offer a useful tool for occupational health physicians who work in this sector.
Berman, D Wayne
2011-08-01
Given that new protocols for assessing asbestos-related cancer risk have recently been published, questions arise concerning how they compare to the "IRIS" protocol currently used by regulators. The newest protocols incorporate findings from 20 additional years of literature. Thus, differences between the IRIS and newer Berman and Crump protocols are examined to evaluate whether these protocols can be reconciled. Risks estimated by applying these protocols to real exposure data from both laboratory and field studies are also compared to assess the relative health protectiveness of each protocol. The reliability of risks estimated using the two protocols is compared by evaluating the degree to which each potentially reproduces the known epidemiology study risks. Results indicate that the IRIS and Berman and Crump protocols can be reconciled; while environment-specific variation within fiber type is apparently due primarily to size effects (not addressed by IRIS), the 10-fold (average) difference between amphibole asbestos risks estimated using each protocol is attributable to an arbitrary selection of the lowest of available mesothelioma potency factors in the IRIS protocol. Thus, the IRIS protocol may substantially underestimate risk when exposure is primarily to amphibole asbestos. Moreover, while the Berman and Crump protocol is more reliable than the IRIS protocol overall (especially for predicting amphibole risk), evidence is presented suggesting a new fiber-size-related adjustment to the Berman and Crump protocol may ultimately succeed in reconciling the entire epidemiology database. However, additional data need to be developed before the performance of the adjusted protocol can be fully validated. © 2011 Society for Risk Analysis.
Advertisement-Based Energy Efficient Medium Access Protocols for Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Ray, Surjya Sarathi
One of the main challenges that prevent the large-scale deployment of Wireless Sensor Networks (WSNs) is providing the applications with the required quality of service (QoS) given the sensor nodes' limited energy supplies. WSNs are an important tool in supporting applications ranging from environmental and industrial monitoring, to battlefield surveillance and traffic control, among others. Most of these applications require sensors to function for long periods of time without human intervention and without battery replacement. Therefore, energy conservation is one of the main goals for protocols for WSNs. Energy conservation can be performed in different layers of the protocol stack. In particular, as the medium access control (MAC) layer can access and control the radio directly, large energy savings are possible through intelligent MAC protocol design. To maximize the network lifetime, MAC protocols for WSNs aim to minimize idle listening of the sensor nodes, packet collisions, and overhearing. Several approaches such as duty cycling and low power listening have been proposed at the MAC layer to achieve energy efficiency. In this thesis, I explore the possibility of further energy savings through the advertisement of data packets in the MAC layer. In the first part of my research, I propose Advertisement-MAC or ADV-MAC, a new MAC protocol for WSNs that utilizes the concept of advertising for data contention. This technique lets nodes listen dynamically to any desired transmission and sleep during transmissions not of interest. This minimizes the energy lost in idle listening and overhearing while maintaining an adaptive duty cycle to handle variable loads. Additionally, ADV-MAC enables energy-efficient MAC-level multicasting. An analytical model for the packet delivery ratio and the energy consumption of the protocol is also proposed. The analytical model is verified with simulations and is used to choose an optimal value of the advertisement period. Simulations show that the optimized ADV-MAC provides substantial energy gains (50% to 70% less than other MAC protocols for WSNs such as T-MAC and S-MAC for the scenarios investigated) while faring as well as T-MAC in terms of packet delivery ratio and latency. Although ADV-MAC provides substantial energy gains over S-MAC and T-MAC, it is not optimal in terms of energy savings because contention is done twice: once in the Advertisement Period and once in the Data Period. In the next part of my research, the second contention in the Data Period is eliminated and the advantages of contention-based and TDMA-based protocols are combined to form Advertisement-based Time-Division Multiple Access (ATMA), a distributed TDMA-based MAC protocol for WSNs. ATMA utilizes the bursty nature of the traffic to prevent energy waste through advertisements and reservations for data slots. Extensive simulations and qualitative analysis show that with bursty traffic, ATMA outperforms contention-based protocols (S-MAC, T-MAC and ADV-MAC), a TDMA-based protocol (TRAMA) and hybrid protocols (Z-MAC and IEEE 802.15.4). ATMA provides energy reductions of up to 80%, while providing the best packet delivery ratio (close to 100%) and latency among all the investigated protocols. Simulations alone cannot reflect many of the challenges faced by real implementations of MAC protocols, such as clock-drift, synchronization, imperfect physical layers, and irregular interference from other transmissions.
Such issues may cripple a protocol that otherwise performs very well in software simulations. Hence, to validate my research, I conclude with a hardware implementation of the ATMA protocol on SORA (Software Radio), developed by Microsoft Research Asia. SORA is a reprogrammable Software Defined Radio (SDR) platform that satisfies the throughput and timing requirements of modern wireless protocols while utilizing the rich general purpose PC development environment. Experimental results obtained from the hardware implementation of ATMA closely mirror the simulation results obtained for a single hop network with 4 nodes.
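As a rough illustration of the advertise-then-reserve idea (not the published ATMA design), the toy round below hands out data slots to nodes that advertised a pending burst and lets every uninvolved node sleep through the data period, which is where the idle-listening savings come from; all identifiers are made up for the example.

```python
def atma_like_round(nodes_with_data, n_slots, listeners_of):
    """One toy frame of an advertise-then-reserve MAC.

    nodes_with_data: node ids with a burst to send this frame.
    n_slots: number of data slots in the frame.
    listeners_of: dict sender -> set of intended receivers.
    Returns (slot assignment, nodes that must stay awake in the data period).
    """
    # Advertisement period: intending senders announce; slots are granted in order.
    advertisers = sorted(nodes_with_data)[:n_slots]
    schedule = {slot: sender for slot, sender in enumerate(advertisers)}

    # Only scheduled senders and their intended receivers stay awake.
    awake = set(schedule.values())
    for sender in schedule.values():
        awake |= listeners_of.get(sender, set())
    return schedule, awake

schedule, awake = atma_like_round({"a", "c"}, n_slots=4, listeners_of={"a": {"b"}, "c": {"d"}})
print(schedule, awake)   # nodes outside `awake` can sleep for the whole data period
```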
Novel Multi-Party Quantum Key Agreement Protocol with G-Like States and Bell States
NASA Astrophysics Data System (ADS)
Min, Shi-Qi; Chen, Hua-Ying; Gong, Li-Hua
2018-03-01
A significant aspect of quantum cryptography is quantum key agreement (QKA), which ensures the security of key agreement protocols by quantum information theory. The fairness of an absolute security multi-party quantum key agreement (MQKA) protocol demands that all participants can affect the protocol result equally so as to establish a shared key and that nobody can determine the shared key by himself/herself. We found that it is difficult for existing multi-party quantum key agreement protocols to withstand collusion attacks. Put differently, it is possible for several colluding and untruthful participants to determine the final key without being detected. To address this issue, based on the entanglement swapping between G-like state and Bell states, a new multi-party quantum key agreement protocol is put forward. The proposed protocol makes full use of EPR pairs as quantum resources, and adopts Bell measurement and unitary operation to share a secret key. Besides, the proposed protocol is fair, secure and efficient without involving a third-party quantum center. It is demonstrated that the protocol is capable of protecting users' privacy and meeting the requirement of fairness. Moreover, it is feasible to carry out the protocol with existing technologies.
Trivedi, Hari; Mesterhazy, Joseph; Laguna, Benjamin; Vu, Thienkhai; Sohn, Jae Ho
2018-04-01
Magnetic resonance imaging (MRI) protocoling can be time- and resource-intensive, and protocols can often be suboptimal depending on the expertise or preferences of the protocoling radiologist. Providing a best-practice recommendation for an MRI protocol has the potential to improve efficiency and decrease the likelihood of a suboptimal or erroneous study. The goal of this study was to develop and validate a machine learning-based natural language classifier that can automatically assign the use of intravenous contrast for musculoskeletal MRI protocols based upon the free-text clinical indication of the study, thereby improving the efficiency of the protocoling radiologist and potentially decreasing errors. We utilized a deep learning-based natural language classification system from IBM Watson, a question-answering supercomputer that gained fame after challenging the best human players on Jeopardy! in 2011. We compared this solution to a series of traditional machine learning-based natural language processing techniques that utilize a term-document frequency matrix. Each classifier was trained with 1240 MRI protocols plus their respective clinical indications and validated with a test set of 280. Ground truth of contrast assignment was obtained from the clinical record. For evaluation of inter-reader agreement, a blinded second reader radiologist analyzed all cases and determined contrast assignment based on only the free-text clinical indication. In the test set, Watson demonstrated overall accuracy of 83.2% when compared to the original protocol. This was similar to the overall accuracy of 80.2% achieved by an ensemble of eight traditional machine learning algorithms based on a term-document matrix. When compared to the second reader's contrast assignment, Watson achieved 88.6% agreement. When evaluating only the subset of cases where the original protocol and second reader were concordant (n = 251), agreement climbed further to 90.0%. The classifier was relatively robust to spelling and grammatical errors, which were frequent. Implementation of this automated MR contrast determination system as a clinical decision support tool may save considerable time and effort of the radiologist while potentially decreasing error rates, and require no change in order entry or workflow.
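The "traditional" baseline described here, a term-document frequency matrix feeding standard classifiers, is easy to sketch with scikit-learn; the indication strings, labels and model choice below are invented for illustration and are not the study's data or ensemble.

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: free-text clinical indications and contrast decisions.
indications = [
    "evaluate rotator cuff tear, no history of surgery",
    "rule out osteomyelitis of the foot in diabetic patient",
    "follow-up of known soft tissue sarcoma after resection",
    "acute knee injury, suspect meniscal tear",
]
contrast = ["no", "yes", "yes", "no"]  # ground truth would come from the clinical record

# Term-document frequency features plus a linear classifier.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=1),
                    LogisticRegression(max_iter=1000))
clf.fit(indications, contrast)
print(clf.predict(["suspected occult infection of the tibia"]))
```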
Uranium mining industry views on ICRP statement on radon.
Takala, J
2012-01-01
In 2009, the International Commission on Radiological Protection issued a statement on radon which stated that the dose conversion factor for radon progeny would likely double, and the calculation of risk from radon should move to a dosimetric approach, rather than the longstanding epidemiological approach. Through the World Nuclear Association, whose members represent over 90% of the world's uranium production, industry has been examining this issue with a goal of offering expertise and knowledge to assist with the practical implementation of these evolutionary changes to evaluating the risk from radon progeny. Industry supports the continuing use of the most current epidemiological data as a basis for risk calculation, but believes that further examination of these results is needed to better understand the level of conservatism in the potential epidemiological-based risk models. With regard to adoption of the dosimetric approach, industry believes that further work is needed before this is a practical option. In particular, this work should include a clear demonstration of the validation of the dosimetric model which includes how smoking is handled, the establishment of a practical measurement protocol, and the collection of relevant data for modern workplaces. Industry is actively working to address the latter two items. Copyright © 2012. Published by Elsevier Ltd.
YARNsim: Simulating Hadoop YARN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Ning; Yang, Xi; Sun, Xian-He
Despite the popularity of the Apache Hadoop system, its success has been limited by issues such as single points of failure, centralized job/task management, and lack of support for programming models other than MapReduce. The next generation of Hadoop, Apache Hadoop YARN, is designed to address these issues. In this paper, we propose YARNsim, a simulation system for Hadoop YARN. YARNsim is based on parallel discrete event simulation and provides protocol-level accuracy in simulating key components of YARN. YARNsim provides a virtual platform on which system architects can evaluate the design and implementation of Hadoop YARN systems. Also, application developers can tune job performance and understand the tradeoffs between different configurations, and Hadoop YARN system vendors can evaluate system efficiency under limited budgets. To demonstrate the validity of YARNsim, we use it to model two real systems and compare the experimental results from YARNsim and the real systems. The experiments include standard Hadoop benchmarks, synthetic workloads, and a bioinformatics application. The results show that the error rate is within 10% for the majority of test cases. The experiments prove that YARNsim can provide what-if analysis for system designers in a timely manner and at minimal cost compared with testing and evaluating on a real system.
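YARNsim is built on discrete event simulation; a minimal sequential event loop of that general kind (not YARNsim's actual parallel engine) looks like the sketch below, with the container-request workload invented for illustration.

```python
import heapq

class DES:
    """Minimal discrete event simulator: events are (time, seq, callback) tuples."""

    def __init__(self):
        self.now, self._seq, self._queue = 0.0, 0, []

    def schedule(self, delay, callback):
        heapq.heappush(self._queue, (self.now + delay, self._seq, callback))
        self._seq += 1  # unique tie-breaker keeps heap comparisons away from callbacks

    def run(self, until=float("inf")):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, callback = heapq.heappop(self._queue)
            callback(self)

# Toy workload: a container request that is granted after a scheduling delay.
sim = DES()
def request_container(sim):
    print(f"t={sim.now:.1f}: container requested")
    sim.schedule(2.5, lambda s: print(f"t={s.now:.1f}: container granted"))
sim.schedule(1.0, request_container)
sim.run()
```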
Automated monitoring of medical protocols: a secure and distributed architecture.
Alsinet, T; Ansótegui, C; Béjar, R; Fernández, C; Manyà, F
2003-03-01
Controlling the correct application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully or semi-autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, and so specialized domain agents are independent of negotiation processes and autonomous system agents perform monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, which is responsible for obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity and authentication during the process of exchanging information between agents.
Owens, J William; Ashby, John
2002-01-01
A current issue for regulatory agencies is endocrine-related modes of action such as those mediated by the estrogen, androgen, and thyroid nuclear receptors. At the national and international levels, the consensus recommendation for the assessment of such modes of action is a tiered series of in vitro and in vivo protocols. The tiered framework begins with screens for structural alerts and then moves to rapid, mechanistic in vitro screening assays, and then to in vivo screening bioassays. The objective of these screens is to identify substances that may warrant testing for endocrine-mediated adverse effects. The final framework tier, as needed, is to test these substances in long-term bioassays for adverse endocrine-mediated reproductive and/or developmental effects. The subject of this review, the rodent uterotrophic bioassay, is intended to be a rapid in vivo screening bioassay for possible estrogen agonists, based on the response of the estrogen-sensitive uterus. The central metric of the bioassay is a statistically significant increase in the weight of the uterus after 3 consecutive days of test substance administration. The extensive background literature is summarized in this review on the mode of action underlying the bioassay and the uterine response to estrogens. The review includes the bioassay's history of development and how its employment has changed and evolved over time. The review describes two major uterotrophic bioassay versions, the intact, immature female and the mature, ovariectomized female, and the protocol factors likely to influence relevance, reproducibility, and reliability of the bioassay. The emphasis of the review is the ability of the uterotrophic bioassay to identify the substances of current interest: weak estrogen agonists with binding affinities relative to natural 17beta-estradiol (relative binding affinities, RBAs) in the log 0 to log -3 range. Using selected model substances having RBAs in this target range, the bioassay's performance in a hierarchical, tiered approach is evaluated, including the predictive capability of the uterotrophic bioassay based on available reproductive and developmental testing data. The review concludes that the uterotrophic bioassay is reliable and can identify substances that may act via an estrogen mode of action, supporting the validity of the uterotrophic bioassay and its regulatory use as an in vivo mechanistic screening bioassay for estrogen agonists and antagonists.
A quantitative telomeric chromatin isolation protocol identifies different telomeric states
NASA Astrophysics Data System (ADS)
Grolimund, Larissa; Aeby, Eric; Hamelin, Romain; Armand, Florence; Chiappe, Diego; Moniatte, Marc; Lingner, Joachim
2013-11-01
Telomere composition changes during tumourigenesis, aging and in telomere syndromes in a poorly defined manner. Here we develop a quantitative telomeric chromatin isolation protocol (QTIP) for human cells, in which chromatin is cross-linked, immunopurified and analysed by mass spectrometry. QTIP involves stable isotope labelling by amino acids in cell culture (SILAC) to compare and identify quantitative differences in telomere protein composition of cells from various states. With QTIP, we specifically enrich telomeric DNA and all shelterin components. We validate the method by characterizing changes at dysfunctional telomeres, and identify and validate known as well as novel telomere-associated polypeptides, including all THO subunits, SMCHD1 and LRIF1. We apply QTIP to long and short telomeres and detect increased density of SMCHD1 and LRIF1 and increased association of the shelterins TRF1, TIN2, TPP1 and POT1 with long telomeres. Our results validate QTIP to study telomeric states during normal development and in disease.
Herson, M R; Hamilton, K; White, J; Alexander, D; Poniatowski, S; O'Connor, A J; Werkmeister, J A
2018-04-25
Current regulatory requirements demand an in-depth understanding and validation of protocols used in tissue banking. The aim of this work was to characterize the quality of split-thickness skin allografts cryopreserved or manufactured using highly concentrated solutions of glycerol (50, 85 or 98%), with tissue water activity (aw), histology and birefringence changes chosen as parameters. Consistent aw outcomes validated the proposed processing protocols. While no significant changes in tissue quality were observed under bright-field microscopy or in collagen birefringence, in-process findings can be harnessed to fine-tune and optimize manufacturing outcomes, in particular when further radiation sterilization is considered. Furthermore, exposing the tissues to 85% glycerol seems to deliver the most efficient outcomes as far as aw and control of microbiological growth are concerned.