Science.gov

Sample records for standardized assessment protocol

  1. Assessing impacts of roads: application of a standard assessment protocol

    USGS Publications Warehouse

    Duniway, Michael C.; Herrick, Jeffrey E.

    2013-01-01

    Adaptive management of road networks depends on timely data that accurately reflect the impacts those systems are having on ecosystem processes and associated services. In the absence of reliable data, land managers are left with little more than observations and perceptions to support management decisions about road-associated disturbances. Roads can negatively impact the soil, hydrologic, plant, and animal processes on which virtually all ecosystem services depend. The Interpreting Indicators of Rangeland Health (IIRH) protocol is a qualitative method that has been demonstrated to be effective in characterizing impacts of roads. The goals of this study were to develop, describe, and test an approach for using IIRH to systematically evaluate road impacts across large, diverse arid and semiarid landscapes. We developed a stratified random sampling approach to plot selection based on ecological potential, road inventory data, and image interpretation of road impacts. The test application on a semiarid landscape in southern New Mexico, United States, demonstrates that the approach developed is sensitive to road impacts across a broad range of ecological sites but that not all the types of stratification were useful. Ecological site and road inventory strata accounted for significant variability in the functioning of ecological processes, but stratification based on apparent impact did not. Analysis of the repeatability of IIRH applied to road plots indicates that the method is repeatable, but consensus evaluations based on multiple observers should be used to minimize the risk of bias. Landscape-scale analysis of impacts by roads of contrasting designs (maintained dirt or gravel roads vs. non- or infrequently maintained roads) suggests that future travel management plans for the study area should consider concentrating traffic on fewer roads that are well designed and maintained. Application of the approach by land managers will likely provide important insights into
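
    A stratified random plot selection of the kind described above can be sketched as follows. The strata names, plot pool, and sample size per stratum are illustrative, not taken from the study:

    ```python
    import random

    def stratified_sample(plots, strata_keys, n_per_stratum, seed=0):
        """Draw up to n_per_stratum plots at random from each stratum.

        plots: list of dicts, each carrying the keys named in strata_keys.
        """
        rng = random.Random(seed)
        groups = {}
        for plot in plots:
            key = tuple(plot[k] for k in strata_keys)
            groups.setdefault(key, []).append(plot)
        sample = []
        for key, members in sorted(groups.items()):
            k = min(n_per_stratum, len(members))
            sample.extend(rng.sample(members, k))
        return sample

    # Hypothetical plot pool stratified by ecological site and road class.
    pool = [{"id": i,
             "eco_site": random.Random(i).choice(["sandy", "loamy"]),
             "road_class": random.Random(i + 100).choice(["maintained", "unmaintained"])}
            for i in range(200)]
    chosen = stratified_sample(pool, ["eco_site", "road_class"], n_per_stratum=4)
    ```

    Sampling per stratum, rather than from the pool at large, guarantees that rare stratum combinations are represented in the field visit list.
    
    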

  2. Toxicity testing of dispersed oil requires adherence to standardized protocols to assess potential real world effects.

    PubMed

    Coelho, Gina; Clark, James; Aurand, Don

    2013-06-01

    Recently, several researchers have attempted to address Deepwater Horizon incident environmental fate and effects issues using laboratory testing and extrapolation procedures that are not fully reliable measures for environmental assessments. The 2013 Rico-Martínez et al. publication utilized laboratory testing approaches that severely limit our ability to reliably extrapolate such results to meaningful real-world assessments. The authors did not adopt key methodological elements of oil and dispersed oil toxicity standards. Further, they drew real-world conclusions from static exposure tests without reporting actual exposure concentrations. Without this information, it is not possible to compare their results to other research or real spill events that measured and reported exposure concentrations. The 1990s' Chemical Response to Oil Spills: Ecological Effects Research Forum program was established to standardize and conduct exposure characterization in oil and dispersed oil aquatic toxicity testing (Aurand and Coelho, 2005). This commentary raises awareness regarding the necessity of standardized test protocols.

  3. A model standardized risk assessment protocol for use with hazardous waste sites.

    PubMed Central

    Marsh, G M; Day, R

    1991-01-01

    This paper presents a model standardized risk assessment protocol (SRAP) for use with hazardous waste sites. The proposed SRAP focuses on the degree and patterns of evidence that exist for a significant risk to human populations from exposure to a hazardous waste site. The SRAP was designed with at least four specific goals in mind: to organize the available scientific data on a specific site and to highlight important gaps in this knowledge; to facilitate rational, cost-effective decision making about the best distribution of available manpower and resources; to systematically classify sites roughly according to the level of risk they pose to surrounding human populations; and to promote an improved level of communication among professionals working in the area of waste site management and between decision makers and the local population. PMID:2050062

  4. Assessing transportation infrastructure impacts on rangelands: test of a standard rangeland assessment protocol

    USGS Publications Warehouse

    Duniway, Michael C.; Herrick, Jeffrey E.; Pyke, David A.; Toledo, David

    2010-01-01

    Linear disturbances associated with on- and off-road vehicle use on rangelands have increased dramatically throughout the world in recent decades. This increase is due to a variety of factors including increased availability of all-terrain vehicles, infrastructure development (oil, gas, renewable energy, and ex-urban), and recreational activities. In addition to the direct impacts of road development, the presence and use of roads may alter the resilience of adjoining areas through indirect effects such as altered site hydrologic and eolian processes, invasive seed dispersal, and sediment transport. There are few standardized methods for assessing impacts of transportation-related land-use activities on soils and vegetation in arid and semi-arid rangelands. Interpreting Indicators of Rangeland Health (IIRH) is an internationally accepted qualitative assessment that is applied widely to rangelands. We tested the sensitivity of IIRH to impacts of roads, trails, and pipelines on adjacent lands by surveying plots at three distances from these linear disturbances. We performed tests at 16 randomly selected sites in each of three ecosystems (Northern High Plains, Colorado Plateau, and Chihuahuan Desert) for a total of 208 evaluation plots. We also evaluated the repeatability of IIRH when applied to road-related disturbance gradients. Finally, we tested the extent of correlations between IIRH plot attribute departure classes and trends in a suite of quantitative indicators. Results indicated that the IIRH technique is sensitive to direct and indirect impacts of transportation activities, with greater departure from reference condition near disturbances than far from disturbances. Trends in degradation of ecological processes detected with qualitative assessments were highly correlated with quantitative data. Qualitative and quantitative assessments employed in this study can be used to assess impacts of transportation features at the plot scale. Through integration with remote
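
    The correlation between ordinal IIRH departure classes and quantitative indicators can be checked with a rank correlation. A self-contained sketch with invented data (departure class coded 1-5 against a hypothetical bare-ground indicator) is:

    ```python
    def rank(values):
        """Assign average ranks (1-based), handling ties."""
        order = sorted(range(len(values)), key=lambda i: values[i])
        ranks = [0.0] * len(values)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1
            for k in range(i, j + 1):
                ranks[order[k]] = avg
            i = j + 1
        return ranks

    def spearman(x, y):
        """Spearman rank correlation: Pearson correlation of the ranks."""
        rx, ry = rank(x), rank(y)
        mx = sum(rx) / len(rx)
        my = sum(ry) / len(ry)
        cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        sx = sum((a - mx) ** 2 for a in rx) ** 0.5
        sy = sum((b - my) ** 2 for b in ry) ** 0.5
        return cov / (sx * sy)

    # Hypothetical plots: IIRH departure class (1 = none, 5 = extreme)
    # against measured bare ground (%).
    departure = [1, 2, 2, 3, 4, 5, 3, 1]
    bare_ground = [5, 12, 10, 25, 40, 60, 30, 8]
    rho = spearman(departure, bare_ground)
    ```

    A rank correlation is the natural choice here because the qualitative departure classes are ordinal, not interval-scaled.
    
    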

  5. The Impact of National Standards Assessment in New Zealand, and National Testing Protocols in Norway on Indigenous Schooling

    ERIC Educational Resources Information Center

    Özerk, Kamil; Whitehead, David

    2012-01-01

    This paper first provides a critique of the implementation of compulsory national assessment protocols internationally, and then nationally through a review of the implementation process used for the introduction of National Standards in New Zealand, and National Testing in Norwegian mainstream schools. It then reviews the impact of these two…

  6. National Sample Assessment Protocols

    ERIC Educational Resources Information Center

    Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2012

    2012-01-01

    These protocols represent a working guide for planning and implementing national sample assessments in connection with the national Key Performance Measures (KPMs). The protocols are intended for agencies involved in planning or conducting national sample assessments and personnel responsible for administering associated tenders or contracts,…

  7. Plasma and urinary GH following a standardized exercise protocol to assess GH production in short children.

    PubMed

    Sartorio, A; Palmieri, E; Vangeli, V; Conte, G; Narici, M; Faglia, G

    2001-01-01

    Plasma and urinary GH responses following acute physical exercise were evaluated in 19 short-statured children (12 males, 7 females, median age: 11.4 yr, age range: 6.1-14.5 yr, Tanner stage I-III, height < or = 3rd centile for age; 7 with familial short stature, FSS; 8 with constitutional growth delay, CGD; 4 with GH deficiency, GHD) and 7 normally growing, age- and sex-matched control children (4 males, 3 females, median age 11.0 yr, range: 7.2-13.1 yr, Tanner stage I-III). All patients and controls underwent a standardized exercise protocol (consisting of jogging up and down a corridor for 15 min, strongly encouraged to produce the maximum possible effort, corresponding to 70-80% of the maximal heart rate) after an overnight fast. Samples for plasma GH determinations were drawn at 0 time (baseline), at 20 min (5 min after the end of exercise) and at 35 min (after 20 min of rest); urine samples were collected before (0 time) and at 40, 80 and 120 min after exercise. The distance covered by children with GHD during the test was significantly lower (p<0.05) than in the other groups of patients and controls. No differences in the pattern of plasma GH responses after physical exercise were found between children with FSS, CGD and healthy controls, the maximum percent increase (vs baseline) being evident at 20 min (median, FSS: +1125%; CGD: +1271%; controls: +571%). Children with GHD showed a smaller percent increase (+94%) of plasma GH, significantly lower (p<0.01) than those recorded in the other groups. A significant percent increase (p<0.01) of baseline urinary GH following exercise was found in children with FSS (median: +34%), CGD (+18%) and controls (+44%). Children with FSS and CGD showed a gradual increase of urinary GH, reaching the maximum at 80 min, while healthy controls had a more pronounced and earlier increase (maximum at 40 min). Urinary median GH levels did not change following physical exercise in children with GHD (-5%, not significant). A
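
    The percent increases quoted above are all computed relative to the baseline sample; a trivial helper makes the arithmetic explicit (the GH values below are illustrative, not taken from the study):

    ```python
    def percent_increase(baseline, value):
        """Percent change of `value` relative to `baseline`."""
        if baseline == 0:
            raise ValueError("baseline must be non-zero")
        return (value - baseline) / baseline * 100

    # Illustrative only: a plasma GH rise from 0.8 to 9.8 ng/mL.
    rise = percent_increase(0.8, 9.8)
    ```
    
    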

  8. Standardizing the Protocol for Hemispherical Photographs: Accuracy Assessment of Binarization Algorithms

    PubMed Central

    Glatthorn, Jonas; Beckschäfer, Philip

    2014-01-01

    Hemispherical photography is a well-established method to optically assess ecological parameters related to plant canopies; e.g. ground-level light regimes and the distribution of foliage within the crown space. Interpreting hemispherical photographs involves classifying pixels as either sky or vegetation. A wide range of automatic thresholding or binarization algorithms exists to classify the photographs. The variety in methodology hampers the ability to compare results across studies. To identify an optimal threshold selection method, this study assessed the accuracy of seven binarization methods implemented in software currently available for the processing of hemispherical photographs. Therefore, binarizations obtained by the algorithms were compared to reference data generated through a manual binarization of a stratified random selection of pixels. This approach was adopted from the accuracy assessment of map classifications known from remote sensing studies. Percentage correct (Pc) and kappa-statistics (K) were calculated. The accuracy of the algorithms was assessed for photographs taken with automatic exposure settings (auto-exposure) and photographs taken with settings which avoid overexposure (histogram-exposure). In addition, gap fraction values derived from hemispherical photographs were compared with estimates derived from the manually classified reference pixels. All tested algorithms were shown to be sensitive to overexposure. Three of the algorithms showed an accuracy which was high enough to be recommended for the processing of histogram-exposed hemispherical photographs: “Minimum” (Pc 98.8%; K 0.952), “Edge Detection” (Pc 98.1%; K 0.950), and “Minimum Histogram” (Pc 98.1%; K 0.947). The Minimum algorithm overestimated gap fraction least of all (11%). The overestimation by the algorithms Edge Detection (63%) and Minimum Histogram (67%) were considerably larger. For the remaining four evaluated algorithms (IsoData, Maximum Entropy, MinError, and Otsu) an

  9. Standardizing the protocol for hemispherical photographs: accuracy assessment of binarization algorithms.

    PubMed

    Glatthorn, Jonas; Beckschäfer, Philip

    2014-01-01

    Hemispherical photography is a well-established method to optically assess ecological parameters related to plant canopies; e.g. ground-level light regimes and the distribution of foliage within the crown space. Interpreting hemispherical photographs involves classifying pixels as either sky or vegetation. A wide range of automatic thresholding or binarization algorithms exists to classify the photographs. The variety in methodology hampers the ability to compare results across studies. To identify an optimal threshold selection method, this study assessed the accuracy of seven binarization methods implemented in software currently available for the processing of hemispherical photographs. Therefore, binarizations obtained by the algorithms were compared to reference data generated through a manual binarization of a stratified random selection of pixels. This approach was adopted from the accuracy assessment of map classifications known from remote sensing studies. Percentage correct (Pc) and kappa-statistics (K) were calculated. The accuracy of the algorithms was assessed for photographs taken with automatic exposure settings (auto-exposure) and photographs taken with settings which avoid overexposure (histogram-exposure). In addition, gap fraction values derived from hemispherical photographs were compared with estimates derived from the manually classified reference pixels. All tested algorithms were shown to be sensitive to overexposure. Three of the algorithms showed an accuracy which was high enough to be recommended for the processing of histogram-exposed hemispherical photographs: "Minimum" (Pc 98.8%; K 0.952), "Edge Detection" (Pc 98.1%; K 0.950), and "Minimum Histogram" (Pc 98.1%; K 0.947). The Minimum algorithm overestimated gap fraction least of all (11%). The overestimation by the algorithms Edge Detection (63%) and Minimum Histogram (67%) were considerably larger. For the remaining four evaluated algorithms (IsoData, Maximum Entropy, MinError, and Otsu
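
    The two accuracy measures used in this study, percentage correct and Cohen's kappa, can be computed directly from reference and predicted pixel labels. A sketch with invented labels (1 = sky, 0 = vegetation) is:

    ```python
    def confusion(reference, predicted):
        """2x2 counts for binary labels."""
        tp = sum(1 for r, p in zip(reference, predicted) if r == 1 and p == 1)
        tn = sum(1 for r, p in zip(reference, predicted) if r == 0 and p == 0)
        fp = sum(1 for r, p in zip(reference, predicted) if r == 0 and p == 1)
        fn = sum(1 for r, p in zip(reference, predicted) if r == 1 and p == 0)
        return tp, tn, fp, fn

    def percentage_correct(reference, predicted):
        tp, tn, fp, fn = confusion(reference, predicted)
        return 100.0 * (tp + tn) / (tp + tn + fp + fn)

    def cohens_kappa(reference, predicted):
        tp, tn, fp, fn = confusion(reference, predicted)
        n = tp + tn + fp + fn
        po = (tp + tn) / n  # observed agreement
        # Agreement expected by chance, from the marginal label frequencies.
        pe = ((tp + fp) * (tp + fn) + (tn + fn) * (tn + fp)) / n ** 2
        return (po - pe) / (1 - pe)

    # Invented reference (manual) and predicted (algorithm) pixel labels.
    ref = [1, 1, 0, 0, 1, 0, 1, 0, 1, 0]
    pred = [1, 1, 0, 0, 1, 0, 0, 0, 1, 1]
    pc = percentage_correct(ref, pred)
    kappa = cohens_kappa(ref, pred)
    ```

    Kappa is the stricter of the two because it discounts the agreement that two classifiers would reach by chance alone.
    
    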

  10. Performance Assessment of Water Insight SpectroPhotometer with Three Channels (WISP-3) Against the Standard of Ocean Optics Protocols

    NASA Astrophysics Data System (ADS)

    Ghezehegn, Semhar; Ansko, Ilmar; Kuusk, Joel; Hommersom, Annelies; Laanen, Marnix

    2015-12-01

    In the FP7 project GLaSS, seven groups from different European countries co-operate on preparing for the uptake of Sentinel data, including use cases to demonstrate the applicability of this new high-resolution data on lakes with a large range of optical properties. Within GLaSS there are work packages on validation, algorithm comparisons and atmospheric correction that require comparable and high quality in situ measurements of the lakes. Unfortunately, the types of radiometric instruments and lab techniques used by the partners differ with regard to specification, performance and sensitivity. Hence, it is very important to use standard protocols to make sure the results are comparable and requirements are fulfilled before validating results. The Ocean Optics Protocols for SeaWiFS Validation, later upgraded to “Ocean Optics Protocols For Satellite Ocean Colour Sensor Validation” [3], are set up to allow such harmonization. GLaSS made a start with the development of dedicated protocols for optical measurements and satellite validation for inland waters of different types. Because several GLaSS partners use the WISP-3 radiometer for reflectance measurements, extra attention was paid to checking the performance of this instrument with regard to the protocols.

  11. STANDARD OPERATING PROTOCOLS FOR DECOMMISSIONING

    SciTech Connect

    Foss, D. L.; Stevens, J. L.; Gerdeman, F. W.

    2002-02-25

    Decommissioning projects at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites are conducted under project-specific decision documents, which involve extensive preparation time, public comment periods, and regulatory approvals. Often, the decision documents must be initiated at least one year before commencing the decommissioning project, and they are expensive and time consuming to prepare. The Rocky Flats Environmental Technology Site (RFETS) is a former nuclear weapons production plant at which hazardous substances and wastes were released or disposed of during operations. As a result of the releases, RFETS was placed on the National Priorities List in 1989, and is conducting cleanup activities under a federal facilities compliance agreement. Working closely with interested stakeholders and state and federal regulatory agencies, RFETS has developed and implemented an improved process for obtaining the approvals. The key to streamlining the approval process has been the development of sitewide decision documents called Rocky Flats Cleanup Agreement Standard Operating Protocols or ''RSOPs.'' RSOPs have broad applicability, and could be used instead of project-specific documents. Although no two decommissioning projects are exactly the same and they may vary widely in contamination and other hazards, the basic steps taken for cleanup are usually similar. Because of this, using RSOPs is more efficient than preparing separate project-specific decision documents for each cleanup action. Over the Rocky Flats cleanup life cycle, using RSOPs has the potential to: (1) Save over 5 million dollars and 6 months on the site closure schedule; (2) Eliminate preparing one hundred and twenty project-specific decision documents; and (3) Eliminate writing seventy-five closure description documents for hazardous waste unit closure and corrective actions.

  12. Standardizing data exchange for clinical research protocols and case report forms: An assessment of the suitability of the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM)

    PubMed Central

    Huser, Vojtech; Sastry, Chandan; Breymaier, Matthew; Idriss, Asma; Cimino, James J.

    2015-01-01

    Efficient communication of a clinical study protocol and case report forms during all stages of a human clinical study is important for many stakeholders. An electronic and structured study representation format that can be used throughout the whole study life-span can improve such communication and potentially lower total study costs. The most relevant standard for representing clinical study data, applicable to unregulated as well as regulated studies, is the Operational Data Model (ODM) in development since 1999 by the Clinical Data Interchange Standards Consortium (CDISC). ODM's initial objective was exchange of case report forms data but it is increasingly utilized in other contexts. An ODM extension called Study Design Model, introduced in 2011, provides additional protocol representation elements. Using a case study approach, we evaluated ODM's ability to capture all necessary protocol elements during a complete clinical study lifecycle in the Intramural Research Program of the National Institutes of Health. ODM offers the advantage of a single format for institutions that deal with hundreds or thousands of concurrent clinical studies and maintain a data warehouse for these studies. For each study stage, we present a list of gaps in the ODM standard and identify necessary vendor or institutional extensions that can compensate for such gaps. The current version of ODM (1.3.2) has only partial support for study protocol and study registration data mainly because it is outside the original development goal. ODM provides comprehensive support for representation of case report forms (in both the design stage and with patient level data). Inclusion of requirements of observational, non-regulated or investigator-initiated studies (outside Food and Drug Administration (FDA) regulation) can further improve future revisions of the standard. PMID:26188274
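
    The ODM format discussed above is XML-based; a minimal sketch of building an ODM-like skeleton with Python's standard library follows. The element names are modeled on CDISC ODM, but the OIDs and form names are invented and the output is a simplified illustration, not a schema-validated ODM instance:

    ```python
    import xml.etree.ElementTree as ET

    # Simplified ODM-like skeleton: a study with one metadata version
    # containing a single form definition.
    odm = ET.Element("ODM", FileOID="example.odm", ODMVersion="1.3.2")
    study = ET.SubElement(odm, "Study", OID="ST.001")
    meta = ET.SubElement(study, "MetaDataVersion", OID="MDV.1", Name="Draft 1")
    form = ET.SubElement(meta, "FormDef", OID="F.VITALS", Name="Vital Signs")
    ET.SubElement(form, "ItemGroupRef", ItemGroupOID="IG.VS")

    xml_text = ET.tostring(odm, encoding="unicode")
    ```

    Keeping protocol metadata in one structured document of this kind is what lets a study warehouse ingest hundreds of concurrent studies through a single parser.
    
    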

  13. Standardizing data exchange for clinical research protocols and case report forms: An assessment of the suitability of the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM).

    PubMed

    Huser, Vojtech; Sastry, Chandan; Breymaier, Matthew; Idriss, Asma; Cimino, James J

    2015-10-01

    Efficient communication of a clinical study protocol and case report forms during all stages of a human clinical study is important for many stakeholders. An electronic and structured study representation format that can be used throughout the whole study life-span can improve such communication and potentially lower total study costs. The most relevant standard for representing clinical study data, applicable to unregulated as well as regulated studies, is the Operational Data Model (ODM) in development since 1999 by the Clinical Data Interchange Standards Consortium (CDISC). ODM's initial objective was exchange of case report forms data but it is increasingly utilized in other contexts. An ODM extension called Study Design Model, introduced in 2011, provides additional protocol representation elements. Using a case study approach, we evaluated ODM's ability to capture all necessary protocol elements during a complete clinical study lifecycle in the Intramural Research Program of the National Institutes of Health. ODM offers the advantage of a single format for institutions that deal with hundreds or thousands of concurrent clinical studies and maintain a data warehouse for these studies. For each study stage, we present a list of gaps in the ODM standard and identify necessary vendor or institutional extensions that can compensate for such gaps. The current version of ODM (1.3.2) has only partial support for study protocol and study registration data mainly because it is outside the original development goal. ODM provides comprehensive support for representation of case report forms (in both the design stage and with patient level data). Inclusion of requirements of observational, non-regulated or investigator-initiated studies (outside Food and Drug Administration (FDA) regulation) can further improve future revisions of the standard. PMID:26188274

  15. STANDARD MEASUREMENT PROTOCOLS - FLORIDA RADON RESEARCH PROGRAM

    EPA Science Inventory

    The manual, in support of the Florida Radon Research Program, contains standard protocols for key measurements where data quality is vital to the program. It contains two sections. The first section, soil measurements, contains field sampling protocols for soil gas permeability and...

  16. Communication protocol standards for space data systems

    NASA Technical Reports Server (NTRS)

    Hooke, Adrian J.; Desjardins, Richard

    1990-01-01

    The main elements and requirements of advanced space data networks are identified. The communication protocol standards for use on space missions during the coming decades are described. In particular, the blending of high-performance space-unique data transmission techniques with off-the-shelf open systems interconnection (OSI) protocols is described.

  17. Standardized North American marsh bird monitoring protocol

    USGS Publications Warehouse

    Conway, Courtney J.

    2011-01-01

    Little is known about the population status of many marsh-dependent birds in North America but recent efforts have focused on collecting more reliable information and estimates of population trends. As part of that effort, a standardized survey protocol was developed in 1999 that provided guidance for conducting marsh bird surveys throughout North America such that data would be consistent among locations. The original survey protocol has been revised to provide greater clarification on many issues as the number of individuals using the protocol has grown. The Standardized North American Marsh Bird Monitoring Protocol instructs surveyors to conduct an initial 5-minute passive point-count survey followed by a series of 1-minute segments during which marsh bird calls are broadcast into the marsh following a standardized approach. Surveyors are instructed to record each individual bird from the suite of 26 focal species that are present in their local area on separate lines of a datasheet and estimate the distance to each bird. Also, surveyors are required to record whether each individual bird was detected within each 1-minute subsegment of the survey. These data allow analysts to use several different approaches for estimating detection probability. The Standardized North American Marsh Bird Monitoring Protocol provides detailed instructions that explain the field methods used to monitor marsh birds in North America.
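
    The per-subsegment detection records described above are what make detection-probability estimation possible. A deliberately naive sketch follows (the detection histories are invented, and real analyses use removal or time-of-detection models rather than this simple summary):

    ```python
    # Each row is one bird's detection history across the survey's 1-minute
    # subsegments (1 = detected calling in that minute). Data are invented.
    histories = [
        [0, 1, 0, 0, 1],
        [1, 1, 0, 1, 0],
        [0, 0, 0, 1, 0],
        [1, 0, 1, 1, 1],
    ]

    def per_minute_detection_rate(histories):
        """Naive estimate: fraction of bird-minutes with a detection,
        among birds known to be present. A summary statistic only, not
        a formal detection-probability model."""
        minutes = len(histories[0])
        birds = len(histories)
        detections = sum(sum(h) for h in histories)
        return detections / (birds * minutes)

    p = per_minute_detection_rate(histories)
    ```
    
    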

  18. An international computer protocol standard is essential

    SciTech Connect

    Marks, J.

    1994-02-01

    This article examines the need for the development of an international communication protocol to avoid building or buying customized interfaces or gateways to connect two separate vendors' devices to the same computer. The article discusses the need for standards and details one electric cooperative's experience in converting its automated mapping and facilities management system to the EPRI-sponsored Utility Communications Architecture.

  19. The Space Communications Protocol Standards Program

    NASA Astrophysics Data System (ADS)

    Jeffries, Alan; Hooke, Adrian J.

    1994-11-01

    In the fall of 1992 NASA and the Department of Defense chartered a technical team to explore the possibility of developing a common set of space data communications standards for potential dual-use across the U.S. national space mission support infrastructure. The team focused on the data communications needs of those activities associated with on-line control of civil and military aircraft. A two-pronged approach was adopted: a top-down survey of representative civil and military space data communications requirements was conducted; and a bottom-up analysis of available standard data communications protocols was performed. A striking intersection of civil and military space mission requirements emerged, and an equally striking consensus on the approach towards joint civil and military space protocol development was reached. The team concluded that wide segments of the U.S. civil and military space communities have common needs for: (1) an efficient file transfer protocol; (2) various flavors of underlying data transport service; (3) an optional data protection mechanism to assure end-to-end security of message exchange; and (4) an efficient internetworking protocol. These recommendations led to the initiation of a program to develop a suite of protocols based on these findings. This paper describes the current status of this program.

  20. Satellite-Friendly Protocols and Standards

    NASA Astrophysics Data System (ADS)

    Koudelka, O.; Schmidt, M.; Ebert, J.; Schlemmer, H.; Kastner, S.; Riedler, W.

    2002-01-01

    We are currently observing a development unprecedented among other services: the enormous growth of the Internet. Video, voice and data applications can be supported via this network in high quality. Multi-media applications require high bandwidth which may not be available in many areas. When the broadcast feature of a communications satellite is used properly, the performance of a satellite-based system can compare favourably to terrestrial solutions. Internet applications are in many cases highly asymmetric, making them very well suited to applications using small and inexpensive terminals. Data from one source may be used simultaneously by a large number of users. The Internet protocol suite has become the de-facto standard. But this protocol family in its original form has not been designed to support guaranteed quality of service, a prerequisite for real-time, high quality traffic. The Internet Protocol has to be adapted for the satellite environment, because long roundtrip delays and the error behaviour of the channel could make it inefficient over a GEO satellite. Another requirement is to utilise the satellite bandwidth as efficiently as possible. This can be achieved by adapting the access system to the nature of IP frames, which are variable in length. In the framework of ESA's ARTES project a novel satellite multimedia system was developed which utilises Multi-Frequency TDMA in a meshed network topology. The system supports Quality of Service (QoS) by reserving capacity with different QoS requirements. The system is centrally controlled by a master station with the implementation of a demand assignment (DAMA) system. A lean internal signalling system has been adopted. Network management is based on the SNMP protocol and industry-standard network management platforms, making interfaces to standard accounting and billing systems easy. Modern communication systems will have to be compliant with different standards in a very flexible manner. The

  1. Melanins and melanogenesis: methods, standards, protocols.

    PubMed

    d'Ischia, Marco; Wakamatsu, Kazumasa; Napolitano, Alessandra; Briganti, Stefania; Garcia-Borron, José-Carlos; Kovacs, Daniela; Meredith, Paul; Pezzella, Alessandro; Picardo, Mauro; Sarna, Tadeusz; Simon, John D; Ito, Shosuke

    2013-09-01

    Despite considerable advances in the past decade, melanin research still suffers from the lack of universally accepted and shared nomenclature, methodologies, and structural models. This paper stems from the joint efforts of chemists, biochemists, physicists, biologists, and physicians with recognized and consolidated expertise in the field of melanins and melanogenesis, who critically reviewed and experimentally revisited methods, standards, and protocols to provide for the first time a consensus set of recommended procedures to be adopted and shared by researchers involved in pigment cell research. The aim of the paper was to define an unprecedented frame of reference built on cutting-edge knowledge and state-of-the-art methodology, to enable reliable comparison of results among laboratories and new progress in the field based on standardized methods and shared information.

  2. Toward a Standard Protocol for Micelle Simulation.

    PubMed

    Johnston, Michael A; Swope, William C; Jordan, Kirk E; Warren, Patrick B; Noro, Massimo G; Bray, David J; Anderson, Richard L

    2016-07-01

    In this paper, we present protocols for simulating micelles using dissipative particle dynamics (and in principle molecular dynamics) that we expect to be appropriate for computing micelle properties for a wide range of surfactant molecules. The protocols address challenges in equilibrating and sampling, specifically when kinetics can be very different with changes in surfactant concentration, and with minor changes in molecular size and structure, even using the same force field parameters. We demonstrate that detection of equilibrium can be automated and is robust, for the molecules in this study and others we have considered. In order to quantify the degree of sampling obtained during simulations, metrics to assess the degree of molecular exchange among micellar material are presented, and the use of correlation times is prescribed to assess sampling and to obtain statistical uncertainty estimates on the relevant simulation observables. We show that the computational challenges facing the measurement of the critical micelle concentration (CMC) are somewhat different for high- and low-CMC materials. While a specific choice is not recommended here, we demonstrate that various methods give values that are consistent in terms of trends, even if not numerically equivalent. PMID:27096611
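
The sampling prescription above leans on correlation times: successive frames of a simulation are correlated, so the effective number of independent samples is the chain length divided by the integrated autocorrelation time. A minimal pure-Python sketch (a simple windowed estimator that truncates at the first non-positive autocorrelation; production analyses typically use more careful windowing):

```python
def autocorr(x, lag):
    """Sample autocorrelation of series x at a given lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    if var == 0:
        return 1.0 if lag == 0 else 0.0
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag)) / n
    return cov / var

def integrated_autocorr_time(x, max_lag=None):
    """tau_int = 1 + 2 * sum of autocorrelations, truncated at the
    first non-positive lag (a crude but common windowing choice)."""
    if max_lag is None:
        max_lag = len(x) // 2
    tau = 1.0
    for lag in range(1, max_lag):
        rho = autocorr(x, lag)
        if rho <= 0:
            break
        tau += 2.0 * rho
    return tau

def effective_samples(x):
    """Number of effectively independent samples in a correlated series."""
    return len(x) / integrated_autocorr_time(x)

# An alternating series is anticorrelated at lag 1, so tau_int = 1
# and every frame counts as an independent sample.
print(effective_samples([0, 1] * 50))  # → 100.0
```

For a strongly correlated series (e.g. long runs of identical values) the estimated tau_int grows and the effective sample size shrinks accordingly, which is what widens the uncertainty estimates on the observables.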

  4. Protocol Standards for Reporting Video Data in Academic Journals.

    PubMed

    Rowland, Pamela A; Ignacio, Romeo C; de Moya, Marc A

    2016-04-01

    Editors of biomedical journals have estimated that a majority (40%-90%) of studies published in scientific journals cannot be replicated, even though an inherent principle of publication is that others should be able to replicate and build on published claims. Each journal sets its own protocols for establishing "quality" in articles, yet over the past 50 years, few journals in any field--especially medical education--have specified protocols for reporting the use of video data in research. The authors found that technical and industry-driven aspects of video recording, as well as a lack of standardization and reporting requirements by research journals, have led to major limitations in the ability to assess or reproduce video data used in research. Specific variables in the videotaping process (e.g., camera angle), which can be changed or modified, affect the quality of recorded data, leading to major reporting errors and, in turn, unreliable conclusions. As more data are now in the form of digital videos, the historical lack of reporting standards makes it increasingly difficult to accurately replicate medical educational studies. Reproducibility is especially important as the medical education community considers setting national high-stakes standards in medicine and surgery based on video data. The authors of this Perspective provide basic protocol standards for investigators and journals using video data in research publications so as to allow for reproducibility.

  5. Comparison of the Reach Scale Habitat Characteristics of Historic and Current Ozark Hellbender (Cryptobranchus alleganiensis bishopi) Localities Using Standardized Assessment Protocols

    NASA Astrophysics Data System (ADS)

    Wheeler, B. A.; Hiler, W. R.; Trauth, S. E.; Christian, A. D.

    2005-05-01

    Habitat degradation is typically cited as a reason for declines in Ozark hellbender populations. While habitat degradation is evident, many sites appear to contain suitable microhabitat, but do not support hellbender populations. We used three standardized protocols, US EPA Rapid Bioassessment Protocol (RBP), Ohio Qualitative Habitat Evaluation Index (QHEI), and Basin Area Stream Survey (BASS), to compare reach-scale habitat at nine locations within each of the Eleven Point (EP) and Spring (SR) rivers in the eastern Ozark Mountains. Sites were divided into Historically Present (HP), Currently Present (CP), and Reference Reaches (RR). Although EP sites scored consistently higher than SR sites for the RBP and QHEI, all sites scored close to the optimal levels. The BASS data were analyzed using PCA, and three resulting axes explained 52.8% of the variation. ANOVA of the PCA loading scores indicated significant differences between the rivers and between SRCP sites and both EPCP and EPHP sites. Parameters most associated with SR sites were rooted vegetation and embeddedness, whereas woody debris and bank cover were associated with EP stations. Our results suggest the Spring River is suffering from loss of riparian zones, thus, resulting in the degradation of in-stream habitat.

  6. SPIRIT 2013 Statement: defining standard protocol items for clinical trials.

    PubMed

    Chan, An-Wen; Tetzlaff, Jennifer M; Altman, Douglas G; Laupacis, Andreas; Gøtzsche, Peter C; Krleža-Jerić, Karmela; Hróbjartsson, Asbjørn; Mann, Howard; Dickersin, Kay; Berlin, Jesse A; Dore, Caroline J; Parulekar, Wendy R; Summerskill, William S M; Groves, Trish; Schulz, Kenneth F; Sox, Harold C; Rockhold, Frank W; Rennie, Drummond; Moher, David

    2015-12-01

    The protocol of a clinical trial serves as the foundation for study planning, conduct, reporting, and appraisal. However, trial protocols and existing protocol guidelines vary greatly in content and quality. This article describes the systematic development and scope of SPIRIT (Standard Protocol Items: Recommendations for Interventional Trials) 2013, a guideline for the minimum content of a clinical trial protocol. The 33-item SPIRIT checklist applies to protocols for all clinical trials and focuses on content rather than format. The checklist recommends a full description of what is planned; it does not prescribe how to design or conduct a trial. By providing guidance for key content, the SPIRIT recommendations aim to facilitate the drafting of high-quality protocols. Adherence to SPIRIT would also enhance the transparency and completeness of trial protocols for the benefit of investigators, trial participants, patients, sponsors, funders, research ethics committees or institutional review boards, peer reviewers, journals, trial registries, policymakers, regulators, and other key stakeholders. PMID:27440100
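
The SPIRIT checklist is content-oriented, so a first-pass completeness check can be mechanical: compare a protocol's section headings against the required items. The toy sketch below uses a small hypothetical subset of item names (the real checklist has 33 items with precise definitions):

```python
# Hypothetical subset of SPIRIT 2013 checklist items (illustration only;
# the actual checklist contains 33 precisely defined items).
SPIRIT_ITEMS = [
    "title", "trial registration", "funding", "eligibility criteria",
    "interventions", "outcomes", "sample size", "allocation", "blinding",
]

def missing_items(protocol_sections, required=SPIRIT_ITEMS):
    """Return checklist items not covered by the protocol's section headings."""
    covered = {s.strip().lower() for s in protocol_sections}
    return [item for item in required if item not in covered]

draft = ["Title", "Funding", "Outcomes", "Sample size"]
print(missing_items(draft))
# → ['trial registration', 'eligibility criteria', 'interventions',
#    'allocation', 'blinding']
```

A real checker would need semantic matching (the checklist prescribes content, not exact headings), but even this naive heading comparison catches wholesale omissions.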

  7. Pelvic Muscle Rehabilitation: A Standardized Protocol for Pelvic Floor Dysfunction

    PubMed Central

    Pedraza, Rodrigo; Nieto, Javier; Ibarra, Sergio; Haas, Eric M.

    2014-01-01

    Introduction. Pelvic floor dysfunction syndromes present with voiding, sexual, and anorectal disturbances, which may be associated with one another, resulting in complex presentation. Thus, an integrated diagnosis and management approach may be required. Pelvic muscle rehabilitation (PMR) is a noninvasive modality involving cognitive reeducation, modification, and retraining of the pelvic floor and associated musculature. We describe our standardized PMR protocol for the management of pelvic floor dysfunction syndromes. Pelvic Muscle Rehabilitation Program. The diagnostic assessment includes electromyography and manometry analyzed in 4 phases: (1) initial baseline phase; (2) rapid contraction phase; (3) tonic contraction and endurance phase; and (4) late baseline phase. This evaluation is performed at the onset of every session. PMR management consists of 6 possible therapeutic modalities, employed depending on the diagnostic evaluation: (1) down-training; (2) accessory muscle isolation; (3) discrimination training; (4) muscle strengthening; (5) endurance training; and (6) electrical stimulation. Eight to ten sessions are performed at one-week intervals with integration of home exercises and lifestyle modifications. Conclusions. The PMR protocol offers a standardized approach to diagnose and manage pelvic floor dysfunction syndromes with potential advantages over traditional biofeedback, involving additional interventions and a continuous pelvic floor assessment with management modifications over the clinical course. PMID:25006337

  8. Modifications to the Standard Sit-and-Reach Flexibility Protocol

    PubMed Central

    Holt, Laurence E.; Pelham, Thomas W.; Burke, Darren G.

    1999-01-01

    Objective: To present several modifications of the standard sit-and-reach protocol. Background: Many exercises designed to increase strength and aerobic capacity tend to decrease the flexibility of the erector spinae and hamstrings musculature. Less-than-ideal flexibility in these soft tissues may increase the risk of injury during training, competition, or activities of daily living. The most widely used measures of flexibility have been either the stand-and-reach or the sit-and-reach, but both are limited to a single measure. Description: Using the new multitest flexometer, we were able to take 6 flexibility measures beyond the stand-and-reach test: standard active sit-and-reach, standard passive sit-and-reach, modified active sit-and-reach with external rotators slackened, modified passive sit-and-reach with external rotators slackened, modified active sit-and-reach with the hamstrings, gastrocnemii, and external rotators slackened, and modified passive sit-and-reach with the hamstrings, gastrocnemii, and external rotators slackened. Clinical Advantages: This modified sit-and-reach protocol allows the indirect assessment of the influence of the 4 major muscle groups that affect sit-and-reach scores: erector spinae, hip rotators, hamstrings, and gastrocnemii. PMID:16558547

  9. Standard protocol stack for mission control

    NASA Technical Reports Server (NTRS)

    Hooke, Adrian J.

    1994-01-01

    It is proposed to create a fully 'open' architectural specification for standardized space mission command and control. By being open, i.e., independent of any particular implementation, the specification will encourage diversity and competition among future commercial suppliers of space equipment and systems. Customers of the new standard capability are expected to include: (1) the civil space community (e.g., NASA, NOAA, international agencies); (2) the military space community (e.g., Air Force, Navy, intelligence); and (3) the emerging commercial space community (e.g., mobile satellite service providers).

  10. The Vocational Assessment Protocol: Development and Validation.

    ERIC Educational Resources Information Center

    Thomas, Dale F.; Menz, Fredrick E.

    This report describes a 48-month project which developed, field tested, and evaluated the utility of the Vocational Assessment Protocol (VAP) for use with persons with traumatic brain injury resulting in a severe and persistent disability. The VAP is intended to assist in the community-based vocational rehabilitation of these individuals. The VAP…

  11. Wildlife road traffic accidents: a standardized protocol for counting flattened fauna

    PubMed Central

    Collinson, Wendy J; Parker, Daniel M; Bernard, Ric T F; Reilly, Brian K; Davies-Mostert, Harriet T

    2014-01-01

    Previous assessments of wildlife road mortality have not used directly comparable methods and, at present, there is no standardized protocol for the collection of such data. Consequently, there are no internationally comparative statistics documenting roadkill rates. In this study, we used a combination of experimental trials and road transects to design a standardized protocol to assess roadkill rates on both paved and unpaved roads. Simulated roadkill were positioned over a 1 km distance, and trials were conducted at eight different speeds (20–100 km·h−1). The recommended protocol was then tested on a 100-km transect, driven daily over a 40-day period. This recorded 413 vertebrate roadkill, comprising 106 species. We recommend the protocol be adopted for future road ecology studies to enable robust statistical comparisons between studies. PMID:25247063
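
The experimental trials above ask, in effect, how fast one can drive before carcass detection degrades. A small sketch of the bookkeeping, using made-up trial counts (the speeds mirror the 20-100 km/h range tested; the detection numbers and the 0.8 threshold are illustrative only):

```python
from collections import defaultdict

# Hypothetical trial records: (survey speed in km/h, carcasses detected,
# carcasses present). Real trials would repeat each speed many times.
trials = [
    (20, 10, 10), (20, 9, 10),
    (40, 9, 10), (40, 8, 10),
    (60, 7, 10), (60, 6, 10),
    (80, 4, 10), (100, 2, 10),
]

def detection_rates(records):
    """Pool trials per speed and return detected/present ratios."""
    hits, totals = defaultdict(int), defaultdict(int)
    for speed, detected, present in records:
        hits[speed] += detected
        totals[speed] += present
    return {s: hits[s] / totals[s] for s in sorted(hits)}

def fastest_acceptable_speed(records, min_rate=0.8):
    """Highest survey speed whose pooled detection rate meets the threshold."""
    ok = [s for s, r in detection_rates(records).items() if r >= min_rate]
    return max(ok) if ok else None

print(fastest_acceptable_speed(trials))  # → 40 under these made-up data
```

The standardized protocol would then fix the survey speed at (or below) this value so that roadkill counts from different studies are comparable.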

  12. Joint Architecture Standard (JAS) Reliable Data Delivery Protocol (RDDP) specification.

    SciTech Connect

    Enderle, Justin Wayne; Daniels, James W.; Gardner, Michael T.; Eldridge, John M.; Hunt, Richard D.; Gallegos, Daniel E.

    2011-05-01

    The Joint Architecture Standard (JAS) program at Sandia National Laboratories requires the use of a reliable data delivery protocol over SpaceWire. The National Aeronautics and Space Administration at the Goddard Spaceflight Center in Greenbelt, Maryland, developed and specified a reliable protocol for its Geostationary Operational Environment Satellite known as GOES-R Reliable Data Delivery Protocol (GRDDP). The JAS program implemented and tested GRDDP and then suggested a number of modifications to the original specification to meet its program specific requirements. This document details the full RDDP specification as modified for JAS. The JAS Reliable Data Delivery Protocol uses the lower-level SpaceWire data link layer to provide reliable packet delivery services to one or more higher-level host application processes. This document specifies the functional requirements for JRDDP but does not specify the interfaces to the lower- or higher-level processes, which may be implementation-dependent.
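
RDDP-style protocols add acknowledgement and retransmission on top of a link layer such as SpaceWire. The sketch below is not the GRDDP/JRDDP specification; it is the simplest member of that protocol family, a stop-and-wait scheme with an alternating sequence bit and bounded retries, shown only to illustrate the mechanism:

```python
def deliver(packets, send, max_retries=3):
    """Stop-and-wait reliable delivery: transmit each packet with a
    one-bit sequence number until it is acknowledged.
    `send(seq, payload)` models the link and returns the acknowledged
    sequence number, or None if the transmission was lost."""
    seq = 0
    for payload in packets:
        for _attempt in range(max_retries + 1):
            if send(seq, payload) == seq:
                break  # acknowledged; move to the next packet
        else:
            raise RuntimeError("delivery failed after retries")
        seq ^= 1  # alternate the sequence bit
    return True

class FlakyLink:
    """Toy link model that silently drops every third transmission."""
    def __init__(self):
        self.count = 0
        self.received = []
    def send(self, seq, payload):
        self.count += 1
        if self.count % 3 == 0:
            return None  # simulated loss: no ack comes back
        self.received.append(payload)
        return seq

link = FlakyLink()
deliver(["a", "b", "c", "d"], link.send)
print(link.received)  # → ['a', 'b', 'c', 'd'] despite the dropped frame
```

Real spaceflight protocols use wider sequence-number windows and timeouts rather than a single alternating bit, but the retransmit-until-acknowledged core is the same.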

  13. Standardized CT protocols and nomenclature: better, but not yet there.

    PubMed

    Singh, Sarabjeet; Kalra, Mannudeep K

    2014-10-01

    Radiation dose associated with CT is an important safety concern in patient care, especially in children. Technical advancements in multidetector-row CT scanner technology offer several advantages for clinical applications; these advancements have considerably increased CT utilization and enhanced the complexity of CT scanning protocols. Furthermore, several scanner manufacturers are spearheading these technical advancements, leading to different commercial names for similar techniques and causing confusion among users, especially at imaging sites with scanners from different vendors. Several scientific studies and the National Council on Radiation Protection and Measurements (NCRP) have shown variation in CT radiation doses for the same body region and similar scanning protocols. Therefore, there is a need for standardization of scanning protocols and of the nomenclature of scan parameters. The following material reviews the status of and challenges in the standardization of CT scanning and nomenclature. PMID:25304702

  14. SUPPLEMENT TO: STANDARD MEASUREMENT PROTOCOLS - FLORIDA RADON RESEARCH PROGRAM

    EPA Science Inventory

    The report supplements earlier published standard protocols for key measurements where data quality is vital to the Florida Radon Research Program. The report adds measurements of small canister radon flux and soil water potential to the section on soil measurements. It adds indo...

  15. Biocoder: A programming language for standardizing and automating biology protocols

    PubMed Central

    2010-01-01

    Background Published descriptions of biology protocols are often ambiguous and incomplete, making them difficult to replicate in other laboratories. However, there is increasing benefit to formalizing the descriptions of protocols, as laboratory automation systems (such as microfluidic chips) are becoming increasingly capable of executing them. Our goal in this paper is to improve both the reproducibility and automation of biology experiments by using a programming language to express the precise series of steps taken. Results We have developed BioCoder, a C++ library that enables biologists to express the exact steps needed to execute a protocol. In addition to being suitable for automation, BioCoder converts the code into a readable, English-language description for use by biologists. We have implemented over 65 protocols in BioCoder; the most complex of these was successfully executed by a biologist in the laboratory using BioCoder as the only reference. We argue that BioCoder exposes and resolves ambiguities in existing protocols, and could provide the software foundations for future automation platforms. BioCoder is freely available for download at http://research.microsoft.com/en-us/um/india/projects/biocoder/. Conclusions BioCoder represents the first practical programming system for standardizing and automating biology protocols. Our vision is to change the way that experimental methods are communicated: rather than publishing a written account of the protocols used, researchers will simply publish the code. Our experience suggests that this practice is tractable and offers many benefits. We invite other researchers to leverage BioCoder to improve the precision and completeness of their protocols, and also to adapt and extend BioCoder to new domains. PMID:21059251
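
BioCoder itself is a C++ library; the core idea of encoding protocol steps as code that can also render a human-readable description can be sketched in a few lines of Python (all class and method names here are invented for illustration, not the BioCoder API):

```python
class Protocol:
    """Toy protocol-as-code container: steps are recorded programmatically
    and can be rendered back as numbered English instructions."""
    def __init__(self, name):
        self.name = name
        self.steps = []
    def add(self, text):
        self.steps.append(text)
        return self  # allow chained calls
    def render(self):
        lines = [f"Protocol: {self.name}"]
        lines += [f"  Step {i}. {s}" for i, s in enumerate(self.steps, 1)]
        return "\n".join(lines)

p = (Protocol("DNA extraction (toy)")
     .add("Add 500 uL lysis buffer to the sample.")
     .add("Incubate at 56 C for 10 min.")
     .add("Centrifuge at 13000 g for 1 min."))
print(p.render())
```

The point of the formal encoding is that the same step objects could also drive an automation platform, whereas a free-text write-up can only be read by a human.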

  16. Standardization of clinical protocols in oral malodor research.

    PubMed

    Yaegaki, Ken; Brunette, Donald M; Tangerman, Albert; Choe, Yong-Sahm; Winkel, Edwin G; Ito, Sayaka; Kitano, Tomohiro; Ii, Hisataka; Calenic, Bogdan; Ishkitiev, Nikolay; Imai, Toshio

    2012-03-01

    The objective of this study is to standardize protocols for clinical research into oral malodor caused by volatile sulfur compounds (VSCs). To detect VSCs, a gas chromatograph (GC) using a flame photometric detector equipped with a bandpass filter (at 393 nm) is the gold standard (sensitivity: 5 × 10^-11 g S s^-1). The baselines of VSC concentrations in mouth air varied considerably over a week. When the subjects refrained from eating, drinking and oral hygiene including mouth rinsing, the VSC concentrations remained constant until eating. Over a 6 h period after a meal, VSC concentrations decreased dramatically (p < 0.01). These results point to optimal times and conditions for sampling subjects. Several portable devices were compared with the measurements by the GCs. Portable GCs demonstrated capabilities similar to those of the GCs. We also applied the recommended protocols described below to clinical research testing the efficacy of ZnCl2 products, and confirmed that using the recommended protocols in a randomized crossover design would provide very clear results. Proposed protocols include: (a) a short-term study rather than a long-term study is strongly recommended, since the VSC concentrations are constant in the short term; (b) a crossover study would be the best design to avoid the effects of individual specificities on each clinical intervention; (c) measurements of VSCs should preferably be carried out using either a GC or portable GCs. PMID:22368249

  17. Automatic quality assessment protocol for MRI equipment.

    PubMed

    Bourel, P; Gibon, D; Coste, E; Daanen, V; Rousseau, J

    1999-12-01

    The authors have developed a protocol and software for the quality assessment of MRI equipment with a commercial test object. Automatic image analysis consists of detecting surfaces and objects, defining regions of interest, acquiring reference point coordinates and establishing gray level profiles. Signal-to-noise ratio, image uniformity, geometrical distortion, slice thickness, slice profile, and spatial resolution are checked. The results are periodically analyzed to evaluate possible drifts with time. The measurements are performed weekly on three MRI scanners made by the Siemens Company (VISION 1.5T, EXPERT 1.0T, and OPEN 0.2T). The results obtained for the three scanners over approximately 3.5 years are presented, analyzed, and compared.
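
Two of the checked quantities have simple, widely used definitions: signal-to-noise ratio (mean signal over the standard deviation of a background region) and integral uniformity. A sketch with hypothetical pixel values follows; the NEMA-style percent integral uniformity formula shown here may differ in detail from the exact procedure the authors used:

```python
import statistics

def snr(signal_roi, background_roi):
    """Signal-to-noise ratio: mean signal over background standard deviation."""
    return statistics.mean(signal_roi) / statistics.stdev(background_roi)

def percent_integral_uniformity(roi):
    """NEMA-style PIU = 100 * (1 - (max - min) / (max + min));
    100% means a perfectly flat region of interest."""
    hi, lo = max(roi), min(roi)
    return 100.0 * (1 - (hi - lo) / (hi + lo))

# Hypothetical pixel values from a phantom image.
signal = [980, 1000, 1010, 995, 1005]      # ROI inside the test object
background = [4, 6, 5, 7, 3]               # ROI in air, outside the object
print(round(snr(signal, background), 1))                    # → 631.2
print(round(percent_integral_uniformity(signal), 1))        # → 98.5
```

Tracking these numbers weekly, as the authors do, turns one-off acceptance tests into a drift monitor: a slow decline in SNR or uniformity flags hardware degradation before it is visible in clinical images.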

  18. Multilevel Assessments of Science Standards

    ERIC Educational Resources Information Center

    Quellmalz, Edys S.; Timms, Michael J.; Silberglitt, Matt D.

    2011-01-01

    The Multilevel Assessment of Science Standards (MASS) project is creating a new generation of technology-enhanced formative assessments that bring the best formative assessment practices into classrooms to transform what, how, when, and where science learning is assessed. The project is investigating the feasibility, utility, technical quality,…

  19. Standardized food images: A photographing protocol and image database.

    PubMed

    Charbonnier, Lisette; van Meer, Floor; van der Laan, Laura N; Viergever, Max A; Smeets, Paul A M

    2016-01-01

    The regulation of food intake has gained much research interest because of the current obesity epidemic. For research purposes, food images are a good and convenient alternative for real food because many dietary decisions are made based on the sight of foods. Food pictures are assumed to elicit anticipatory responses similar to real foods because of learned associations between visual food characteristics and post-ingestive consequences. In contemporary food science, a wide variety of images are used which introduces between-study variability and hampers comparison and meta-analysis of results. Therefore, we created an easy-to-use photographing protocol which enables researchers to generate high resolution food images appropriate for their study objective and population. In addition, we provide a high quality standardized picture set which was characterized in seven European countries. With the use of this photographing protocol a large number of food images were created. Of these images, 80 were selected based on their recognizability in Scotland, Greece and The Netherlands. We collected image characteristics such as liking, perceived calories and/or perceived healthiness ratings from 449 adults and 191 children. The majority of the foods were recognized and liked at all sites. The differences in liking ratings, perceived calories and perceived healthiness between sites were minimal. Furthermore, perceived caloric content and healthiness ratings correlated strongly (r ≥ 0.8) with actual caloric content in both adults and children. The photographing protocol as well as the images and the data are freely available for research use on http://nutritionalneuroscience.eu/. By providing the research community with standardized images and the tools to create their own, comparability between studies will be improved and a head-start is made for a world-wide standardized food image database.

  20. Japanese Virtual Observatory (JVO): implementation of VO standard protocols

    NASA Astrophysics Data System (ADS)

    Shirasaki, Y.; Tanaka, M.; Honda, S.; Kawanomoto, S.; Yasuda, N.; Masunaga, Y.; Ishihara, Y.; Tsutsumi, J.; Nakamoto, H.; Kobayashi, Y.

    2006-07-01

    We developed the third prototype towards a Japanese Virtual Observatory (JVO). IVOA standards, such as Simple Image Access and ADQL, were adopted in the system for the first time. We also constructed an OAI-PMH publishing registry, a web-service-based searchable registry, and VO data services based on the SIA and SkyNode protocols. Most of the components were built on open-source software, except for an XML database used for the searchable registry. This paper describes the JVO prototype 3 system and the results of a performance measurement.

  1. Standards-Based Wireless Sensor Networking Protocols for Spaceflight Applications

    NASA Technical Reports Server (NTRS)

    Wagner, Raymond S.

    2010-01-01

    Wireless sensor networks (WSNs) have the capacity to revolutionize data gathering in both spaceflight and terrestrial applications. WSNs provide a huge advantage over traditional, wired instrumentation since they do not require wiring trunks to connect sensors to a central hub. This allows for easy sensor installation in hard to reach locations, easy expansion of the number of sensors or sensing modalities, and reduction in both system cost and weight. While this technology offers unprecedented flexibility and adaptability, implementing it in practice is not without its difficulties. Recent advances in standards-based WSN protocols for industrial control applications have come a long way to solving many of the challenges facing practical WSN deployments. In this paper, we will overview two of the more promising candidates - WirelessHART from the HART Communication Foundation and ISA100.11a from the International Society of Automation - and present the architecture for a new standards-based sensor node for networking and applications research.

  2. Establishing a standardized therapeutic testing protocol for spinal muscular atrophy.

    PubMed

    Tsai, Li-Kai; Tsai, Ming-Shung; Lin, Tzer-Bin; Hwu, Wuh-Liang; Li, Hung

    2006-11-01

    Several mouse models have been created for spinal muscular atrophy (SMA); however, there is still no standard preclinical testing system for the disease. We previously generated type III-specific SMA model mice, which might be suitable for use as a preclinical therapeutic testing system for SMA. To establish such a system and test its applicability, we first created a testing protocol and then applied it to investigate the use of valproic acid (VPA) as a possible treatment for SMA. These SMA mice showed tail/ear/foot deformity, muscle atrophy, poorer motor performance, smaller compound muscle action potentials and lower spinal motoneuron density at the age of 9 to 12 months in comparison with age-matched wild-type littermates. In addition, VPA attenuates motoneuron death, increases the spinal SMN protein level and partially normalizes motor function in SMA mice. These results suggest that the testing protocol developed here is well suited for use as a standardized preclinical therapeutic testing system for SMA.

  3. Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology

    NASA Technical Reports Server (NTRS)

    Woods, Stephen

    2009-01-01

    This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. Included in the presentation are the reasons to perform hazard assessments, the types of hazard assessment that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP), which is specifically tailored to hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations and associated factors to facilitate decision making and achieve best practice.

  4. Towards Automatic Diabetes Case Detection and ABCS Protocol Compliance Assessment

    PubMed Central

    Mishra, Ninad K.; Son, Roderick Y.; Arnzen, James J.

    2012-01-01

    Objective According to the American Diabetes Association, the implementation of the standards of care for diabetes has been suboptimal in most clinical settings. Diabetes is a disease that had a total estimated cost of $174 billion in 2007 for an estimated diabetes-affected population of 17.5 million in the United States. With the advent of electronic medical records (EMR), tools to analyze data residing in the EMR for healthcare surveillance can help reduce the burdens experienced today. This study was primarily designed to evaluate the efficacy of employing clinical natural language processing to analyze discharge summaries for evidence indicating a presence of diabetes, as well as to assess diabetes protocol compliance and high risk factors. Methods Three sets of algorithms were developed to analyze discharge summaries for: (1) identification of diabetes, (2) protocol compliance, and (3) identification of high risk factors. The algorithms utilize a common natural language processing framework that extracts relevant discourse evidence from the medical text. Evidence utilized in one or more of the algorithms include assertion of the disease and associated findings in medical text, as well as numerical clinical measurements and prescribed medications. Results The diabetes classifier was successful at classifying reports for the presence and absence of diabetes. Evaluated against 444 discharge summaries, the classifier’s performance included macro and micro F-scores of 0.9698 and 0.9865, respectively. Furthermore, the protocol compliance and high risk factor classifiers showed promising results, with most F-measures exceeding 0.9. Conclusions The presented approach accurately identified diabetes in medical discharge summaries and showed promise with regards to assessment of protocol compliance and high risk factors. Utilizing free-text analytic techniques on medical text can complement clinical-public health decision support by identifying cases and high risk
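
The reported macro and micro F-scores aggregate per-class precision and recall in two different ways: macro averages the per-class F1 values (treating classes equally), while micro pools the raw counts first (weighting by class size). A small sketch with hypothetical counts:

```python
def f1(tp, fp, fn):
    """F1 score from true-positive, false-positive and false-negative counts."""
    p = tp / (tp + fp) if tp + fp else 0.0
    r = tp / (tp + fn) if tp + fn else 0.0
    return 2 * p * r / (p + r) if p + r else 0.0

def macro_micro_f1(per_class_counts):
    """per_class_counts: list of (tp, fp, fn) tuples, one per class."""
    macro = sum(f1(*c) for c in per_class_counts) / len(per_class_counts)
    tp = sum(c[0] for c in per_class_counts)
    fp = sum(c[1] for c in per_class_counts)
    fn = sum(c[2] for c in per_class_counts)
    return macro, f1(tp, fp, fn)

# Hypothetical counts for a 'diabetes' and a 'no diabetes' class.
macro, micro = macro_micro_f1([(90, 5, 5), (300, 5, 5)])
print(round(macro, 3), round(micro, 3))  # → 0.965 0.975
```

Reporting both, as the study does, guards against a classifier that looks good only because it does well on the majority class.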

  5. DOE limited standard: Operations assessments

    SciTech Connect

    1996-05-01

    The purpose of this standard is to provide DOE Field Element assessors with a guide for conducting operations assessments and to provide DOE Field Element managers with the criteria of the EM Operations Assessment Program. Sections 6.1 to 6.21 provide examples of how to assess specific areas; the general techniques of operations assessments (Section 5) may be applied to other areas of health and safety (e.g., fire protection, criticality safety, quality assurance, occupational safety, etc.).

  6. Workshop on laboratory protocol standards for the Molecular Methods Database.

    PubMed

    Klingström, Tomas; Soldatova, Larissa; Stevens, Robert; Roos, T Erik; Swertz, Morris A; Müller, Kristian M; Kalaš, Matúš; Lambrix, Patrick; Taussig, Michael J; Litton, Jan-Eric; Landegren, Ulf; Bongcam-Rudloff, Erik

    2013-01-25

    Management of data to produce scientific knowledge is a key challenge for biological research in the 21st century. Emerging high-throughput technologies allow life science researchers to produce big data at speeds and in amounts that were unthinkable just a few years ago. This places high demands on all aspects of the workflow: from data capture (including the experimental constraints of the experiment), analysis and preservation, to peer-reviewed publication of results. Failure to recognise the issues at each level can lead to serious conflicts and mistakes; research may then be compromised as a result of the publication of non-coherent protocols, or the misinterpretation of published data. In this report, we present the results from a workshop that was organised to create an ontological data-modelling framework for Laboratory Protocol Standards for the Molecular Methods Database (MolMeth). The workshop provided a set of short- and long-term goals for the MolMeth database, the most important being the decision to use the established EXACT description of biomedical ontologies as a starting point. PMID:22687389

  7. Two RFID standard-based security protocols for healthcare environments.

    PubMed

    Picazo-Sanchez, Pablo; Bagheri, Nasour; Peris-Lopez, Pedro; Tapiador, Juan E

    2013-10-01

    Radio Frequency Identification (RFID) systems are widely used in access control, transportation, real-time inventory and asset management, automated payment systems, etc. Nevertheless, the use of this technology is almost unexplored in healthcare environments, where potential applications include patient monitoring, asset traceability and drug administration systems, to mention just a few. RFID technology can enable more intelligent systems and applications, but privacy and security issues have to be addressed before its adoption. This is even more critical in healthcare applications, where very sensitive information is at stake and patient safety is paramount. Wu et al. (J. Med. Syst. 37:19, 43) recently proposed a new RFID authentication protocol for healthcare environments. In this paper we show that this protocol puts the location privacy of tag holders at risk, which is a matter of grave concern and undermines the security of this proposal. To facilitate the implementation of secure RFID-based solutions in the medical sector, we suggest two new applications (authentication and secure messaging) and propose solutions that, in contrast to previous proposals in this field, are fully based on ISO standards and NIST security recommendations.

  8. Effects of tailored neck-shoulder pain treatment based on a decision model guided by clinical assessments and standardized functional tests. A study protocol of a randomized controlled trial

    PubMed Central

    2012-01-01

    Background A major problem with rehabilitation interventions for neck pain is that the condition may have multiple causes, so a single treatment approach is seldom efficient. The present study protocol outlines a single-blinded randomised controlled trial evaluating the effect of tailored treatment for neck-shoulder pain. The treatment is based on a decision model guided by standardized clinical assessment and functional tests with cut-off values. Our main hypothesis is that the tailored treatment has better short-, intermediate- and long-term effects on pain and function than either non-tailored treatment or treatment-as-usual (TAU). We additionally hypothesize that tailored and non-tailored treatment both have better effects than TAU. Methods/Design 120 working women aged 20–65 with a minimum of six weeks of nonspecific neck-shoulder pain are allocated by minimisation, with the factors age, duration of pain, pain intensity and disability, into the groups tailored treatment (T), non-tailored treatment (NT) or treatment-as-usual (TAU). Treatment is given to the T and NT groups for 11 weeks (27 sessions evenly distributed). An extensive presentation of the tests and the treatment decision model is provided. The main treatment components are manual therapy, cranio-cervical flexion exercise and strength training, EMG biofeedback training, treatment for cervicogenic headache, and neck motor control training. A decision algorithm based on the baseline assessment determines the treatment components given to each participant of the T and NT groups. Primary outcome measures are physical functioning (Neck Disability Index) and average pain intensity over the last week (Numeric Rating Scale). Secondary outcomes are general improvement (Patient Global Impression of Change scale), symptoms (Profile Fitness Mapping neck questionnaire), capacity to work in the last 6 weeks (quality and quantity) and pressure pain threshold of m. trapezius. Primary and secondary outcomes will be reported for

  9. Use of a standardized JaCVAM in vivo rat comet assay protocol to assess the genotoxicity of three coded test compounds; ampicillin trihydrate, 1,2-dimethylhydrazine dihydrochloride, and N-nitrosodimethylamine.

    PubMed

    McNamee, J P; Bellier, P V

    2015-07-01

    As part of the Japanese Center for the Validation of Alternative Methods (JaCVAM)-initiated international validation study of the in vivo rat alkaline comet assay (comet assay), our laboratory examined ampicillin trihydrate (AMP), 1,2-dimethylhydrazine dihydrochloride (DMH), and N-nitrosodimethylamine (NDA) using a standard comet assay validation protocol (v14.2) developed by the JaCVAM validation management team (VMT). Coded samples were received by our laboratory along with basic MSDS information. Solubility analysis and range-finding experiments on the coded test compounds were conducted for dose selection. Animal dosing schedules, comet assay processing and analysis, and statistical analysis were conducted in accordance with the standard protocol. Based upon our blinded evaluation, AMP did not exhibit evidence of genotoxicity in either the rat liver or stomach. However, both NDA and DMH caused a significant increase in % tail DNA in the rat liver at all dose levels tested. While acute hepatotoxicity was observed for these compounds in the high-dose group, in the investigators' opinion there were a sufficient number of consistently damaged/measurable cells in the medium- and low-dose groups to judge these compounds as genotoxic. There was no evidence of genotoxicity from either NDA or DMH in the rat stomach. In conclusion, our laboratory observed increased DNA damage from two blinded test compounds in rat liver (later identified as genotoxic carcinogens), while no evidence of genotoxicity was observed for the third blinded test compound (later identified as a non-genotoxic non-carcinogen). These data support the use of a standardized protocol for the in vivo comet assay as a cost-effective alternative genotoxicity assay for regulatory testing purposes. PMID:26212307

  11. Methods to Standardize a Multicenter Acupuncture Trial Protocol to Reduce Aromatase Inhibitor-related Joint Symptoms in Breast Cancer Patients

    PubMed Central

    Greenlee, Heather; Crew, Katherine D.; Capodice, Jillian; Awad, Danielle; Jeffres, Anne; Unger, Joseph M.; Lew, Danika L.; Hansen, Lisa K.; Meyskens, Frank L.; Wade, James L.; Hershman, Dawn L.

    2015-01-01

    Robust methods are needed to efficiently conduct large, multi-site, randomized controlled clinical trials of acupuncture protocols. SWOG S1200 is a randomized sham- and waitlist-controlled trial of a standardized acupuncture protocol treating aromatase inhibitor (AI)-associated arthralgias in early stage breast cancer patients (n=228). The primary objective is to determine whether true acupuncture administered twice weekly for 6 weeks, compared to sham acupuncture or a waitlist control, causes a reduction in AI-associated joint pain at 6 weeks as assessed by patient report. The study is conducted at 11 institutions across the US. The true acupuncture protocol was developed using a consensus-based process. Both the true acupuncture and sham acupuncture protocols consist of 12 sessions administered over 6 weeks, followed by 1 session per week for the remaining 6 weeks. The true acupuncture protocol uses standardized protocol points in addition to standardized acupoints tailored to a patient’s joint symptoms. The similarly standardized sham acupuncture protocol utilizes superficial needling of non-acupoints. Standardized methods were developed to train and monitor acupuncturists, including online and in-person training, study manuals, monthly phone calls, and remote quality assurance monitoring throughout the study period. Research staff were similarly trained using online and in-person training, and monthly phone calls. PMID:26100070

  12. Whole Body Vibration Exercise Protocol versus a Standard Exercise Protocol after ACL Reconstruction: A Clinical Randomized Controlled Trial with Short Term Follow-Up

    PubMed Central

    Berschin, Gereon; Sommer, Björn; Behrens, Antje; Sommer, Hans-Martin

    2014-01-01

    The suitability and effectiveness of whole body vibration (WBV) exercise in rehabilitation after injury of the anterior cruciate ligament (ACL) was studied using a specially designed WBV protocol. We wanted to test the hypothesis that WBV leads to superior short-term results regarding neuromuscular performance (strength and coordination) and is less time-consuming than a current standard muscle-strengthening protocol. In this prospective randomized controlled clinical trial, forty patients who tore their ACL and underwent subsequent ligament reconstruction were enrolled. Patients were randomized to the whole body vibration (n=20) or standard rehabilitation exercise protocol (n=20). Both protocols started in the 2nd week after surgery. Isometric and isokinetic strength measurements, clinical assessment, Lysholm score, and neuromuscular performance were assessed at weeks 2, 5, 8 and 11 after surgery. Time spent on rehabilitation exercise was reduced to less than half in the WBV group. There were no statistically significant differences in terms of clinical assessment, Lysholm score, or isokinetic and isometric strength. The WBV group displayed significantly better results in the stability test. In conclusion, preliminary data indicate that our whole body vibration muscle exercise protocol seems to be a good alternative to a standard exercise program in ACL rehabilitation. Despite its significantly reduced time requirement, it is at least equally effective compared to a standard rehabilitation protocol. Key points In this prospective randomized controlled clinical trial, we tested the hypothesis that WBV leads to superior short-term results regarding neuromuscular performance (strength and coordination) and is less time-consuming than a current standard muscle-strengthening protocol in forty patients who underwent ACL reconstruction. Time spent on rehabilitation exercise was reduced to less than half in the WBV group as compared to the standard exercise group. 
Both

  13. Development of a standard documentation protocol for communicating exposure models.

    PubMed

    Ciffroy, P; Altenpohl, A; Fait, G; Fransman, W; Paini, A; Radovnikovic, A; Simon-Cornu, M; Suciu, N; Verdonck, F

    2016-10-15

    An important step in building a computational model is its documentation; comprehensive and structured documentation can improve a model's applicability and transparency in science/research and for regulatory purposes. This is particularly crucial and challenging for environmental and/or human exposure models that aim to establish quantitative relationships between personal exposure levels and their determinants. Exposure models simulate the transport and fate of a contaminant from the source to the receptor and may involve a large set of entities (e.g. all the media the contaminant may pass through). Such complex models are difficult to describe in a comprehensive, unambiguous and accessible way. Poor communication of assumptions, theory, structure and/or parameterization can lead to a lack of confidence by the user and may be a source of errors. The goal of this paper is to propose a standard documentation protocol (SDP) for exposure models, i.e. a generic format and a standard structure by which all exposure models could be documented. For this purpose, a CEN (European Committee for Standardisation) workshop was set up with the objective of agreeing on minimum requirements for the amount and type of information to be provided in exposure model documentation, along with guidelines for the structure and presentation of the information. The resulting CEN workshop agreement (CWA) was expected to facilitate a more rigorous formulation of exposure model descriptions and better understanding by users. This paper describes the process followed for defining the SDP, the standardisation approach, and the main components of the SDP resulting from a wide consultation of interested stakeholders. 
The main outcome is a CEN CWA which establishes terms and definitions for exposure models and their elements, specifies minimum requirements for the amount and type of information to be documented, and proposes a structure for communicating the documentation to different

  14. FALLS-protocol: lung ultrasound in hemodynamic assessment of shock.

    PubMed

    Lichtenstein, D

    2013-01-01

    The assessment of acute circulatory failure is a challenge in the absence of a solid gold standard. It is suggested that artifacts generated by lung ultrasound can be of help. The FALLS-protocol (Fluid Administration Limited by Lung Sonography) follows Weil's classification of shock. First, it searches for pericardial fluid, then right heart enlargement, and lastly abolished lung sliding. In this setting, the diagnoses of pericardial tamponade, pulmonary embolism and tension pneumothorax, i.e. obstructive shock, can be schematically ruled out. Next, a search for diffuse lung rockets (i.e. multiple B-lines, a comet-tail artifact) is performed. Their absence excludes pulmonary edema, which in clinical practice usually means left cardiogenic shock. At this step, the patient (defined as a FALLS-responder) receives fluid therapy. He or she usually has a normal sonographic lung surface, an A-profile. Any clinical improvement suggests hypovolemic shock. The absence of improvement leads to continued fluid therapy, eventually yielding fluid overload. This condition results in a change from the A-profile to the B-profile. Lung ultrasound has the advantage of demonstrating this interstitial syndrome at an early and infraclinical stage (the FALLS-endpoint). The change from horizontal A-lines to vertical B-lines can be considered a direct marker of volemia in this use. By elimination, this change schematically indicates distributive shock, in current practice usually septic shock. The major limitation is a B-profile on admission generated by an initial lung disorder. The FALLS-protocol, which can be combined without drawback with traditional hemodynamic tools, uses a simple machine (without Doppler) and a suitable microconvex probe allowing for heart, lung and vein assessment. PMID:24364005
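    The sequential elimination described in the abstract can be sketched as a simple rule-based function. This is only an illustrative sketch of the published decision flow, not clinical software; the function and finding names are hypothetical.

    ```python
    def falls_protocol(findings):
        """Suggest a shock category from sequential ultrasound findings,
        following the elimination order of the FALLS-protocol sketch.
        `findings` is a dict of booleans; all key names are illustrative."""
        # Step 1: rule out obstructive shock (tamponade, PE, pneumothorax).
        if findings.get("pericardial_fluid"):
            return "obstructive (pericardial tamponade)"
        if findings.get("right_heart_enlargement"):
            return "obstructive (pulmonary embolism)"
        if findings.get("abolished_lung_sliding"):
            return "obstructive (tension pneumothorax)"
        # Step 2: diffuse B-lines (lung rockets) indicate pulmonary edema,
        # which in most clinical practice means left cardiogenic shock.
        if findings.get("diffuse_b_lines"):
            return "cardiogenic"
        # Step 3: A-profile -> fluid therapy; improvement defines the
        # FALLS-responder and suggests hypovolemic shock.
        if findings.get("improves_with_fluids"):
            return "hypovolemic"
        # Step 4: no improvement despite fluids; the eventual A-line to
        # B-line change (FALLS-endpoint) points, by elimination, to
        # distributive (usually septic) shock.
        return "distributive (septic)"
    ```

    As in the protocol itself, the ordering of the checks matters: each branch is only meaningful once the preceding causes have been ruled out.
    
    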

  15. Assessment and classification of protocol deviations

    PubMed Central

    Ghooi, Ravindra Bhaskar; Bhosale, Neelambari; Wadhwani, Reena; Divate, Pathik; Divate, Uma

    2016-01-01

    Introduction: Deviations from the approved trial protocol are common during clinical trials. They have conventionally been classified as deviations or violations, depending on their impact on the trial. Methods: A new method is proposed by which deviations are classified in five grades, from 1 to 5. A Grade 1 deviation has no impact on the subject's well-being or on the quality of data. At the other extreme, a Grade 5 deviation leads to the death of the subject. This method of classification was applied to deviations noted at the center over the last 3 years. Results: It was observed that most deviations were of Grades 1 and 2, with fewer falling in Grades 3 and 4. There were no deviations that led to the death of a subject (Grade 5). Discussion: This method of classification would help trial managers decide on the action to be taken when deviations occur, based on their impact. PMID:27453830
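    A five-grade scheme like this lends itself to a simple lookup that ties each grade to an escalation action. Only Grades 1 and 5 are defined in the abstract, so the intermediate descriptions and all suggested actions below are illustrative assumptions, not the authors' scheme.

    ```python
    # Grades 1 and 5 paraphrase the abstract; Grades 2-4 and every
    # suggested action are illustrative assumptions.
    DEVIATION_GRADES = {
        1: ("No impact on subject well-being or data quality", "log only"),
        2: ("Minor impact on data, no risk to subject", "log and correct"),
        3: ("Moderate impact on data or subject", "notify sponsor"),
        4: ("Serious impact or harm to subject", "report to ethics committee"),
        5: ("Deviation led to the death of the subject",
            "immediate reporting and trial review"),
    }

    def triage(grade):
        """Return (description, suggested action) for a deviation grade."""
        if grade not in DEVIATION_GRADES:
            raise ValueError("grade must be an integer from 1 to 5")
        return DEVIATION_GRADES[grade]
    ```

    A trial manager could then drive follow-up actions directly from the recorded grade rather than from ad hoc judgment, which is the point the Discussion section makes.
    
    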

  16. MASSIVE TRANSFUSION PROTOCOL: STANDARDIZING CARE TO IMPROVE PATIENT OUTCOMES.

    PubMed

    Porteous, Joan

    2015-06-01

    Providing rapid response is a primary goal when caring for surgical patients with injuries involving massive blood loss. Massive transfusion protocols have been developed in some tertiary care health care facilities to ensure a rapid and efficient response in the provision of care to patients with a massive and uncontrolled hemorrhage. The purpose of this article is to discuss a massive transfusion protocol and to describe the process used to implement a massive transfusion protocol at Winnipeg's Health Sciences Centre (the site) as well as to describe its impact in the operating room department. PMID:26310036

  18. Developing and implementing computerized protocols for standardization of clinical decisions.

    PubMed

    Morris, A H

    2000-03-01

    Humans have only a limited ability to incorporate information in decision making. In certain situations, the mismatch between this limitation and the availability of extensive information contributes to the varying performance and high error rate of clinical decision makers. Variation in clinical practice is due in part to clinicians' poor compliance with guidelines and recommended therapies. The use of decision-support tools is a response to both the information revolution and poor compliance. Computerized protocols used to deliver decision support can be configured to contain much more detail than textual guidelines or paper-based flow diagrams. Such protocols can generate patient-specific instructions for therapy that can be carried out with little interclinician variability; however, clinicians must be willing to modify personal styles of clinical management. Protocols need not be perfect. Several defensible and reasonable approaches are available for clinical problems. However, one of these reasonable approaches must be chosen and incorporated into the protocol to promote consistent clinical decisions. This reasoning is the basis of an explicit method of decision support that allows the rigorous evaluation of interventions, including use of the protocols themselves. Computerized protocols for mechanical ventilation and management of intravenous fluid and hemodynamic factors in patients with the acute respiratory distress syndrome provide case studies for this discussion. PMID:10691588

  20. Why standard brain-computer interface (BCI) training protocols should be changed: an experimental study

    NASA Astrophysics Data System (ADS)

    Jeunet, Camille; Jahanpour, Emilie; Lotte, Fabien

    2016-06-01

    Objective. While promising, electroencephalography-based brain-computer interfaces (BCIs) are barely used due to their lack of reliability: 15% to 30% of users are unable to control a BCI. Standard training protocols may be partly responsible, as they do not satisfy recommendations from psychology. Our main objective was to determine in practice to what extent standard training protocols impact users’ motor imagery-based BCI (MI-BCI) control performance. Approach. We performed two experiments. The first consisted of evaluating the efficiency of a standard BCI training protocol for the acquisition of non-BCI-related skills in a BCI-free context, which enabled us to rule out the possible impact of BCIs on the training outcome. Thus, participants (N = 54) were asked to perform simple motor tasks. The second experiment was aimed at measuring the correlations between motor tasks and MI-BCI performance. The ten best and ten worst performers of the first study were recruited for an MI-BCI experiment during which they had to learn to perform two MI tasks. We also assessed users’ spatial ability and pre-training μ rhythm amplitude, as both have been related to MI-BCI performance in the literature. Main results. Around 17% of the participants were unable to learn to perform the motor tasks, which is close to the BCI illiteracy rate. This suggests that standard training protocols are suboptimal for skill teaching. No correlation was found between motor tasks and MI-BCI performance. However, spatial ability played an important role in MI-BCI performance. In addition, once the spatial ability covariable had been controlled for using an ANCOVA, it appeared that participants who faced difficulty during the first experiment improved during the second while the others did not. Significance. These studies suggest that (1) standard MI-BCI training protocols are suboptimal for skill teaching, (2) spatial ability is confirmed as impacting MI-BCI performance, and (3) when faced

  2. Standard Hydrogen Test Protocols for the NREL Sensor Testing Laboratory (Brochure)

    SciTech Connect

    Not Available

    2011-12-01

    This brochure summarizes the test protocols used in the NREL Hydrogen Sensor Test Laboratory for the quantitative assessment of critical analytical performance specifications for hydrogen sensors. The protocols are similar to, but typically more rigorous than, the test procedures mandated by ISO Standard 26142 (Hydrogen Detector for Stationary Applications). Specific protocols were developed for linear range, short-term stability, and the impact of fluctuations in temperature (T), pressure (P), relative humidity (RH), and chemical environment. Specialized tests (e.g., oxygen requirement) may also be performed. Hydrogen safety sensors selected for evaluation are subjected to a thorough regimen of test protocols, as described. Sensor testing is performed at NREL on custom-built sensor test fixtures. Environmental parameters such as T, P, RH, and gas composition are rigorously controlled and monitored. The NREL evaluations are performed on commercial hydrogen detectors, on emerging sensing technologies, and for end users to validate sensor performance for specific application needs. Test results and data are shared with the manufacturer or client via summary reports, teleconference calls, and, when appropriate, site visits to manufacturer facilities. Client representatives may also monitor NREL's operations while their technologies are being tested. Manufacturers may use test data to illustrate the analytical capability of their technologies and, more importantly, to guide future developments. NREL uses the data to assess technology gaps and deployment considerations. Per NREL Sensor Testing Laboratory policy, test results are treated as proprietary and are not shared with other manufacturers or other entities without permission. 
The data may be used by NREL in open publications

  3. Using generalizability theory to develop clinical assessment protocols.

    PubMed

    Preuss, Richard A

    2013-04-01

    Clinical assessment protocols must produce data that are reliable, with a clinically attainable minimal detectable change (MDC). In a reliability study, generalizability theory has 2 advantages over classical test theory. These advantages provide information that allows assessment protocols to be adjusted to match individual patient profiles. First, generalizability theory allows the user to simultaneously consider multiple sources of measurement error variance (facets). Second, it allows the user to generalize the findings of the main study across the different study facets and to recalculate the reliability and MDC based on different combinations of facet conditions. In doing so, clinical assessment protocols can be chosen based on minimizing the number of measures that must be taken to achieve a realistic MDC, using repeated measures to minimize the MDC, or simply based on the combination that best allows the clinician to monitor an individual patient's progress over a specified period of time. PMID:23258312
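    The recalculation the abstract describes, where reliability and the MDC are recomputed for different numbers of repeated measures, can be illustrated for a single random facet: averaging over n trials shrinks the error variance, which raises the generalizability coefficient and lowers the MDC. The variance components below are made-up numbers, and the formulas are the standard single-facet expressions rather than anything taken from this paper.

    ```python
    import math

    def g_coefficient(var_person, var_error, n_trials):
        """Generalizability (reliability-like) coefficient when a score
        is the mean of n trials of a single random facet."""
        return var_person / (var_person + var_error / n_trials)

    def mdc95(var_person, var_error, n_trials):
        """Minimal detectable change at 95% confidence for the mean of
        n trials: MDC95 = 1.96 * sqrt(2) * SEM."""
        total = var_person + var_error / n_trials
        g = g_coefficient(var_person, var_error, n_trials)
        sem = math.sqrt(total * (1 - g))  # equals sqrt(var_error / n_trials)
        return 1.96 * math.sqrt(2) * sem

    # Hypothetical variance components (squared measurement units):
    vp, ve = 4.0, 2.0
    for n in (1, 2, 4):
        print(n, round(g_coefficient(vp, ve, n), 3), round(mdc95(vp, ve, n), 2))
    ```

    Tabulating the coefficient and MDC over candidate n is exactly the kind of trade-off the abstract suggests: choose the fewest repeated measures that still yield a clinically attainable MDC.
    
    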

  4. A monitoring protocol to assess tidal restoration of salt marshes on local and regional scales

    USGS Publications Warehouse

    Neckles, H.A.; Dionne, M.D.; Burdick, D.M.; Roman, C.T.; Buchsbaum, R.; Hutchins, E.

    2002-01-01

    Assessing the response of salt marshes to tidal restoration relies on comparisons of ecosystem attributes between restored and reference marshes. Although this approach provides an objective basis for judging project success, inferences can be constrained if the high variability of natural marshes masks differences in sampled attributes between restored and reference sites. Furthermore, such assessments are usually focused on a small number of restoration projects in a local area, limiting the ability to address questions regarding the effectiveness of restoration within a broad region. We developed a hierarchical approach to evaluate the performance of tidal restorations at local and regional scales throughout the Gulf of Maine. The cornerstone of the approach is a standard protocol for monitoring restored and reference salt marshes throughout the region. The monitoring protocol was developed by consensus among nearly 50 restoration scientists and practitioners. The protocol is based on a suite of core structural measures that can be applied to any tidal restoration project. The protocol also includes additional functional measures for application to specific projects. Consistent use of the standard protocol to monitor local projects will enable pooling information for regional assessments. Ultimately, it will be possible to establish a range of reference conditions characterizing natural tidal wetlands in the region and to compare performance curves between populations of restored and reference marshes for assessing regional restoration effectiveness.

  5. Thermal cycling for restorative materials: does a standardized protocol exist in laboratory testing? A literature review.

    PubMed

    Morresi, Anna Lucia; D'Amario, Maurizio; Capogreco, Mario; Gatto, Roberto; Marzo, Giuseppe; D'Arcangelo, Camillo; Monaco, Annalisa

    2014-01-01

    In vitro tests continue to be an indispensable method for the initial screening of dental materials. Thermal cycling is one of the most widely used procedures to simulate the physiological aging experienced by biomaterials in clinical practice. Consequently, it is routinely employed in experimental studies to evaluate materials' performance. A literature review was performed to elucidate test parameters for in vitro aging of adhesive restorations. This study aims to assess, from a review of the literature, whether a standardized thermal cycling protocol has been established. An exhaustive literature search examining the effect of thermal cycling on restorative dental materials was performed with electronic databases and by hand. The search was restricted to studies published from 1998 to August 2013. No language restrictions were applied. The search identified 193 relevant experimental studies. Only twenty-three studies had faithfully applied the ISO standard. The majority of studies used their own procedures, showing only a certain consistency in the temperature parameter (5-55°C) and great variability in the number of cycles and dwell time chosen. A wide variation in the thermal cycling parameters applied in experimental studies was identified. In most cases the parameters appear to have been chosen for the authors' convenience. A comparison of results between studies would appear to be impossible. The available data suggest that further investigations will be required to ultimately develop a standardized thermal cycling protocol.

  6. Testing warning messages on smokers’ cigarette packages: A standardized protocol

    PubMed Central

    Brewer, Noel T.; Hall, Marissa G.; Lee, Joseph G. L.; Peebles, Kathryn; Noar, Seth M.; Ribisl, Kurt M.

    2015-01-01

    Purpose Lab experiments on cigarette warnings typically use a brief one-time exposure that is not paired with the cigarette packs smokers use every day, leaving open the question of how repeated warning exposure over several weeks may affect smokers. This proof of principle study sought to develop a new protocol for testing cigarette warnings that better reflects real-world exposure by presenting them on cigarette smokers’ own packs. Methods We tested a cigarette pack labeling protocol with 76 US smokers ages 18 and older. We applied graphic warnings to the front and back of smokers’ cigarette packs. Results Most smokers reported that at least 75% of the packs of cigarettes they smoked during the study had our warnings. Nearly all said they would participate in the study again. Using cigarette packs with the study warnings increased quit intentions (p<.05). Conclusion Our findings suggest a feasible pack labeling protocol with six steps: (1) schedule appointments at brief intervals; (2) determine typical cigarette consumption; (3) ask smokers to bring a supply of cigarette packs to study appointments; (4) apply labels to smokers’ cigarette packs; (5) provide participation incentives at the end of appointments; and (6) refer smokers to cessation services at end of the study. When used in randomized controlled trials in settings with real-world message exposure over time, this protocol may help identify the true impact of warnings and thus better inform tobacco product labeling policy. PMID:25564282

  7. Comparison of a standard and a detailed postmortem protocol for detecting Mycobacterium bovis in badgers.

    PubMed

    Crawshaw, T R; Griffiths, I B; Clifton-Hadley, R S

    2008-10-18

    A standard postmortem protocol, consisting of gross pathology, culture for mycobacteria and limited selective histopathology, was used in the randomised badger culling trial in Great Britain to detect Mycobacterium bovis infection. This standard protocol was compared with a more detailed protocol in which more tissues were examined grossly, more tissues were cultured, more culture slopes were seeded, the culture period was extended and tissues were examined routinely by histopathology. The standard protocol was more sensitive in badgers with gross visible lesions than in badgers with no gross visible lesions. When applied to the study population of badgers, the overall sensitivity of the standard protocol relative to the more detailed protocol was estimated to be 54.6 per cent (95 per cent confidence interval 44.9 to 69.8 per cent). Badgers with tuberculosis (TB) detected by the standard protocol had a mean of 7.6 tissues with microscopic lesions suspicious of TB. The additional badgers detected by the detailed protocol had a mean of 4.4 such tissues.

  8. A standardized protocol for repeated social defeat stress in mice

    PubMed Central

    Golden, Sam A; Covington, Herbert E; Berton, Olivier; Russo, Scott J

    2011-01-01

    A major impediment to novel drug development has been the paucity of animal models that accurately reflect symptoms of affective disorders. In animal models, prolonged social stress has proven useful for understanding the molecular mechanisms underlying affective-like disorders. When considering experimental approaches for studying depression, social defeat stress in particular has been shown to have excellent etiological, predictive, discriminative and face validity. Described here is a protocol in which C57BL/6J mice repeatedly subjected to bouts of social defeat by a larger, aggressive CD-1 mouse develop a clear depressive-like syndrome, characterized by enduring deficits in social interaction. Specifically, the protocol consists of three important stages, beginning with the selection of aggressive CD-1 mice, followed by agonistic social confrontations between the CD-1 and C57BL/6J mice, and concluding with the confirmation of social avoidance in subordinate C57BL/6J mice. The automated detection of social avoidance allows a marked increase in throughput, reproducibility and quantitative analysis. This protocol is highly adaptable, but in its most common form it requires 3–4 weeks for completion. PMID:21799487

  9. The Vocational Assessment Protocol. User's Manual [and] Profiles. Revised.

    ERIC Educational Resources Information Center

    Thomas, Dale F.

    This packet contains the user's manual and profile sheets for the Vocational Assessment Protocol (VAP), a functional skills profile of vocational-related factors intended for use in the vocational rehabilitation of persons who have acquired a traumatic brain injury. The VAP consists of nine structured rating instruments and a structural summary…

  10. EPA Traceability Protocol for Assay and Certification of Gaseous Calibration Standards (2012 Revision)

    EPA Science Inventory

    In 1997, the U.S. Environmental Protection Agency (EPA) in Research Triangle Park, North Carolina, revised its 1993 version of its traceability protocol for the assay and certification of compressed gas and permeation-device calibration standards. The protocol allows producers o...

  11. Assessing impacts of roads: Application of a standard assessment protocol

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Adaptive management of road networks depends on timely data that accurately reflect the impacts those networks are having on ecosystem processes and associated services. In the absence of reliable data, land managers are left with little more than observations and perceptions to support adaptive management...

  12. Standardized Patient Walkthroughs in the National Drug Abuse Treatment Clinical Trials Network: Common Challenges to Protocol Implementation

    PubMed Central

    Fussell, Holly; Kunkel, Lynn E.; McCarty, Dennis; Lewy, Colleen S.

    2012-01-01

    Background Training research staff to implement clinical trials occurring in community-based addiction treatment programs presents unique challenges. Standardized patient walkthroughs of study procedures may enhance training and protocol implementation. Objectives Examine and discuss cross-site and cross-study challenges of participant screening and data collection procedures identified during standardized patient walkthroughs of multi-site clinical trials. Method Actors portrayed clients and “walked through” study procedures with protocol research staff. The study completed 57 walkthroughs during implementation of 4 clinical trials. Results Observers and walkthrough participants identified three areas of concern (consent procedures, screening and assessment processes, and protocol implementation) and made suggestions for resolving the concerns. Conclusions and Scientific Significance Standardized patient walkthroughs capture issues with study procedures previously unidentified with didactic training or unscripted rehearsals. Clinical trials within the National Drug Abuse Treatment Clinical Trials Network are conducted in addiction treatment centers that vary on multiple dimensions. Based on walkthrough observations, the national protocol team and local site leadership modify standardized operating procedures and resolve cross-site problems prior to recruiting study participants. The standardized patient walkthrough improves consistency across study sites and reduces potential site variation in study outcomes. PMID:21854287

  13. 45 CFR 155.270 - Use of standards and protocols for electronic transactions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... RELATING TO HEALTH CARE ACCESS EXCHANGE ESTABLISHMENT STANDARDS AND OTHER RELATED STANDARDS UNDER THE AFFORDABLE CARE ACT General Functions of an Exchange § 155.270 Use of standards and protocols for electronic... rules, and code sets adopted by the Secretary in 45 CFR parts 160 and 162. (b) HIT enrollment...

  14. 45 CFR 155.270 - Use of standards and protocols for electronic transactions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... RELATING TO HEALTH CARE ACCESS EXCHANGE ESTABLISHMENT STANDARDS AND OTHER RELATED STANDARDS UNDER THE AFFORDABLE CARE ACT General Functions of an Exchange § 155.270 Use of standards and protocols for electronic... rules, and code sets that are adopted by the Secretary in 45 CFR parts 160 and 162 or that are...

  15. Serial link protocol design: a critique of the X.25 standard, Level 2

    SciTech Connect

    Fletcher, J.G.

    1983-10-25

    There are certain technical design principles for link communication protocols which, if followed, result in a protocol that is less complex in both concept and implementation, but at the same time provides better service, than if the principles are not followed. These principles include modularization into subprotocols, symmetry between the nodes on the link, and use of the state-exchange model of a conversation rather than the command-response model. The principles are described, the extent to which they are followed by the standard protocol X.25, level 2, is examined, and a protocol adhering to them is presented.

  16. Standards and Assessment. IDRA Focus.

    ERIC Educational Resources Information Center

    IDRA Newsletter, 1997

    1997-01-01

    This newsletter includes three articles, two of which focus on standards for student evaluation and for admission to higher education. "A Measuring Stick for Standards and TEKS: Meeting the Needs of Second Language Learners" (Laura Chris Green, Adela Solis) examines beliefs embodied in the notion of standards; defines content, performance, and…

  17. A protocol for lifetime energy and environmental impact assessment of building insulation materials

    SciTech Connect

    Shrestha, Som S. Biswas, Kaushik; Desjarlais, Andre O.

    2014-04-01

    This article describes a proposed protocol that is intended to provide a comprehensive list of factors to be considered in evaluating the direct and indirect environmental impacts of building insulation materials, as well as detailed descriptions of standardized calculation methodologies to determine those impacts. The energy and environmental impacts of insulation materials can generally be divided into two categories: (1) direct impacts due to the embodied energy of the insulation materials and other factors and (2) indirect or environmental impacts avoided as a result of reduced building energy use due to the addition of insulation. Standards and product category rules exist that provide guidelines for the life cycle assessment (LCA) of materials, including building insulation products. However, critical reviews have suggested that these standards fail to provide complete guidance for LCA studies and suffer from ambiguities regarding the determination of the environmental impacts of building insulation and other products. The focus of the assessment protocol described here is to identify all factors that contribute to the total energy and environmental impacts of different building insulation products and, more importantly, to provide standardized determination methods that will allow comparison of different insulation material types. Further, the intent is not to replace current LCA standards but to provide a well-defined, easy-to-use comparison method for insulation materials using existing LCA guidelines. - Highlights: • We proposed a protocol to evaluate the environmental impacts of insulation materials. • The protocol considers all life cycle stages of an insulation material. • Both the direct environmental impacts and the indirect impacts are defined. • Standardized calculation methods for the ‘avoided operational energy’ are defined. • Standardized calculation methods for the ‘avoided environmental impact’ are defined.
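The protocol's two impact categories combine into a simple net balance. The sketch below, with made-up numbers, illustrates only the direct-versus-indirect accounting, not the protocol's full methodology.

```python
def net_lifetime_energy_mj(embodied_mj, annual_avoided_mj, service_life_yr):
    """Direct (embodied) energy minus indirect (avoided operational) energy
    over the service life. A negative result means the insulation avoids
    more energy than it embodies."""
    return embodied_mj - annual_avoided_mj * service_life_yr

def energy_payback_yr(embodied_mj, annual_avoided_mj):
    """Years of operation needed to recoup the embodied energy."""
    return embodied_mj / annual_avoided_mj
```

For example, with a hypothetical 500 MJ embodied and 100 MJ/yr avoided over a 30-year life, the net balance is -2500 MJ and the energy payback is 5 years; the standardized methods in the protocol define how those two inputs are determined.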

  18. Standard Care versus Protocol Based Therapy for New Onset Pseudomonas aeruginosa in Cystic Fibrosis

    PubMed Central

    Mayer-Hamblett, Nicole; Rosenfeld, Margaret; Treggiari, Miriam M.; Konstan, Michael W.; Retsch-Bogart, George; Morgan, Wayne; Wagener, Jeff; Gibson, Ronald L.; Khan, Umer; Emerson, Julia; Thompson, Valeria; Elkin, Eric P.; Ramsey, Bonnie W.

    2014-01-01

    Rationale The Early Pseudomonal Infection Control (EPIC) randomized trial rigorously evaluated the efficacy of different antibiotic regimens for eradication of newly identified Pseudomonas (Pa) in children with cystic fibrosis (CF). Protocol based therapy in the trial was provided based on culture positivity independent of symptoms. It is unclear whether outcomes observed in the clinical trial were different than those that would have been observed with historical standard of care driven more heavily by respiratory symptoms than culture positivity alone. We hypothesized that the incidence of Pa recurrence and hospitalizations would be significantly reduced among trial participants as compared to historical controls whose standard of care preceded the widespread adoption of tobramycin inhalation solution (TIS) as initial eradication therapy at the time of new isolation of Pa. Methods Eligibility criteria from the trial were used to derive historical controls from the Epidemiologic Study of CF (ESCF) who received standard of care treatment from 1995 to 1998, before widespread availability of TIS. Pa recurrence and hospitalization outcomes were assessed over a 15-month time period. Results As compared to 100% of the 304 trial participants, only 296/608 (49%) historical controls received antibiotics within an average of 20 weeks after new onset Pa. Pa recurrence occurred among 104/298 (35%) of the trial participants as compared to 295/549 (54%) of historical controls (19% difference, 95% CI: 12%, 26%, p<0.001). No significant differences in the incidence of hospitalization were observed between cohorts. Conclusions Protocol-based antimicrobial therapy for newly acquired Pa resulted in a lower rate of Pa recurrence but comparable hospitalization rates as compared to a historical control cohort less aggressively treated with antibiotics for new onset Pa. PMID:23818295
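The reported 19% recurrence difference (95% CI 12% to 26%) can be reproduced from the published counts with a standard Wald interval for a difference in proportions; this is a sketch, since the paper does not state which interval method was used.

```python
from math import sqrt

def risk_difference_ci(e1, n1, e2, n2, z=1.96):
    """Difference in event proportions (p2 - p1) with a Wald 95% CI."""
    p1, p2 = e1 / n1, e2 / n2
    d = p2 - p1
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return d, d - z * se, d + z * se

# Pa recurrence: 104/298 trial participants vs 295/549 historical controls
d, lo, hi = risk_difference_ci(104, 298, 295, 549)
```

Rounding the three results to whole percentages recovers the abstract's 19% (12%, 26%).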

  19. A Chlorhexidine-Agar Plate Culture Medium Protocol to Complement Standard Broth Culture of Mycobacterium tuberculosis

    PubMed Central

    Asmar, Shady; Chatellier, Sonia; Mirande, Caroline; van Belkum, Alex; Canard, Isabelle; Raoult, Didier; Drancourt, Michel

    2016-01-01

    The culture of Mycobacterium tuberculosis using parallel inoculation of a solid culture medium and a liquid broth provides the gold standard for the diagnosis of tuberculosis. Here, we evaluated a chlorhexidine decontamination-MOD9 solid medium protocol versus the standard NALC-NaOH-Bactec 960 MGIT protocol for the diagnosis of pulmonary tuberculosis by culture. Three hundred clinical specimens comprising 193 sputa, 30 bronchial aspirates, 10 broncho-alveolar lavages, 47 stools, and 20 urines were prospectively submitted for the routine diagnosis of tuberculosis. The contamination rates were 5/300 (1.7%) using the MOD9 protocol and 17/300 (5.7%) using the Bactec protocol (P < 0.05, Fisher exact test). A total of 50 Mycobacterium isolates (48 M. tuberculosis and two Mycobacterium abscessus) were cultured. Of these 50, 48 (96%) isolates were recovered using the MOD9 protocol versus 35 (70%) using the Bactec protocol (P < 0.05, Fisher exact test). The time to positivity was 10.1 ± 3.9 days versus 14.7 ± 7.3 days, respectively (P < 0.05, Student’s t-test). These data confirmed the usefulness of parallel inoculation of a solid culture medium with broth for the recovery of M. tuberculosis, in agreement with current recommendations. More specifically, chlorhexidine decontamination and inoculation of the MOD9 solid medium could be proposed to complement the standard Bactec 960 MGIT broth protocol. PMID:26834733
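Both reported comparisons (contamination 5/300 vs 17/300; recovery 48/50 vs 35/50) can be checked against the stated significance level with a two-sided Fisher exact test; the stdlib-only implementation below is a sketch of the standard hypergeometric computation, not the authors' software.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for a 2x2 table [[a, b], [c, d]],
    summing hypergeometric probabilities no larger than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    denom = comb(n, row1)
    def prob(x):
        return comb(col1, x) * comb(n - col1, row1 - x) / denom
    p_obs = prob(a)
    lo = max(0, row1 - (n - col1))
    hi = min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1)
               if prob(x) <= p_obs * (1 + 1e-9))

# Contamination rates: MOD9 5/300 vs Bactec 17/300
p_contamination = fisher_exact_two_sided(5, 295, 17, 283)
# Isolate recovery: MOD9 48/50 vs Bactec 35/50
p_recovery = fisher_exact_two_sided(48, 2, 35, 15)
```

Both p-values come out below 0.05, consistent with the abstract.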

  20. A Chlorhexidine-Agar Plate Culture Medium Protocol to Complement Standard Broth Culture of Mycobacterium tuberculosis.

    PubMed

    Asmar, Shady; Chatellier, Sonia; Mirande, Caroline; van Belkum, Alex; Canard, Isabelle; Raoult, Didier; Drancourt, Michel

    2016-01-01

    The culture of Mycobacterium tuberculosis using parallel inoculation of a solid culture medium and a liquid broth provides the gold standard for the diagnosis of tuberculosis. Here, we evaluated a chlorhexidine decontamination-MOD9 solid medium protocol versus the standard NALC-NaOH-Bactec 960 MGIT protocol for the diagnosis of pulmonary tuberculosis by culture. Three hundred clinical specimens comprising 193 sputa, 30 bronchial aspirates, 10 broncho-alveolar lavages, 47 stools, and 20 urines were prospectively submitted for the routine diagnosis of tuberculosis. The contamination rates were 5/300 (1.7%) using the MOD9 protocol and 17/300 (5.7%) using the Bactec protocol (P < 0.05, Fisher exact test). A total of 50 Mycobacterium isolates (48 M. tuberculosis and two Mycobacterium abscessus) were cultured. Of these 50, 48 (96%) isolates were recovered using the MOD9 protocol versus 35 (70%) using the Bactec protocol (P < 0.05, Fisher exact test). The time to positivity was 10.1 ± 3.9 days versus 14.7 ± 7.3 days, respectively (P < 0.05, Student's t-test). These data confirmed the usefulness of parallel inoculation of a solid culture medium with broth for the recovery of M. tuberculosis, in agreement with current recommendations. More specifically, chlorhexidine decontamination and inoculation of the MOD9 solid medium could be proposed to complement the standard Bactec 960 MGIT broth protocol. PMID:26834733

  1. Surgeons' Evaluation of Colorectal Cancer Resections Against a Standard HPE Protocol: Auditing the Surgeons.

    PubMed

    Sagap, Ismail; Elnaim, Abdel Latif K; Hamid, Imtiaz; Rose, Isa M

    2011-06-01

    The survival of colorectal cancer patients depends heavily on complete tumor resection and multimodality adjuvant treatment. The main determinants of the management plan for these patients, however, rely on accurate staging through histopathological examination (HPE). A reliable standard HPE protocol can therefore have a significant impact on achieving the best surgical outcome. We evaluated surgeons' intra-operative judgment and the quality of resected specimens in the treatment of colorectal cancers, aiming to quantify the quality of surgery by applying a standard HPE protocol to colorectal cancer specimens and to assess a new format for pathological reporting in colorectal cancer using a formulated standard proforma. We performed a prospective observation of all colorectal cancer patients who underwent surgical resection over an 8-month period. Surgeons were required to make a self-assessment of the completeness of tumor excision and of possible lymph node or adjacent organ involvement, while all pathologists followed a standard reporting protocol for examination of the specimens. We evaluated the accuracy of the surgeons' judgment against the HPE findings. The study involved 44 colorectal cancers in 23 male and 21 female patients. The majority of these patients were Malay (50%), followed by Chinese (43%) and Indian (7%). The main presenting symptoms were bleeding (32%), intestinal obstruction (29%) and perforation (7%). Sixteen (36%) patients underwent emergency surgery. Rectal tumor was the commonest (53%), followed by sigmoid colon (22.7%). Neoadjuvant chemoradiation was given to 8 patients, and a complete pathological response was observed in 1 (12.5%) of these. The final TNM classifications were: stage I (22.7%), stage IIa (18.2%), stage IIb (11.4%), stage IIIa (2.3%), stage IIIb (25%), stage IIIc (13.6%) and stage IV (6.8%). The commonest surgery performed was anterior resection with mesorectal excision (43.2%). Ten patients (22.7%) had laparoscopic surgery with 3 (30

  2. Inter-laboratory variation in DNA damage using a standard comet assay protocol.

    PubMed

    Forchhammer, Lykke; Ersson, Clara; Loft, Steffen; Möller, Lennart; Godschalk, Roger W L; van Schooten, Frederik J; Jones, George D D; Higgins, Jennifer A; Cooke, Marcus; Mistry, Vilas; Karbaschi, Mahsa; Collins, Andrew R; Azqueta, Amaya; Phillips, David H; Sozeri, Osman; Routledge, Michael N; Nelson-Smith, Kirsty; Riso, Patrizia; Porrini, Marisa; Matullo, Giuseppe; Allione, Alessandra; Stępnik, Maciej; Komorowska, Magdalena; Teixeira, João Paulo; Costa, Solange; Corcuera, Laura-Ana; López de Cerain, Adela; Laffon, Blanca; Valdiglesias, Vanessa; Møller, Peter

    2012-11-01

    There are substantial inter-laboratory variations in the levels of DNA damage measured by the comet assay. The aim of this study was to investigate whether adherence to a standard comet assay protocol would reduce inter-laboratory variation in reported values of DNA damage. Fourteen laboratories determined the baseline level of DNA strand breaks (SBs)/alkaline labile sites and formamidopyrimidine DNA glycosylase (FPG)-sensitive sites in coded samples of mononuclear blood cells (MNBCs) from healthy volunteers. There were technical problems in seven laboratories in adopting the standard protocol, which were not related to the level of experience. Therefore, the inter-laboratory variation in DNA damage was only analysed using the results from laboratories that had obtained complete data with the standard comet assay protocol. This analysis showed that the differences between reported levels of DNA SBs/alkaline labile sites in MNBCs were not reduced by applying the standard assay protocol as compared with the laboratory's own protocol. There was large inter-laboratory variation in FPG-sensitive sites by the laboratory-specific protocol and the variation was reduced when the samples were analysed by the standard protocol. The SBs and FPG-sensitive sites were measured in the same experiment, indicating that the large spread in the latter lesions was the main reason for the reduced inter-laboratory variation. However, it remains worrying that half of the participating laboratories obtained poor results using the standard procedure. This study indicates that future comet assay validation trials should take steps to evaluate the implementation of standard procedures in participating laboratories.
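Inter-laboratory variation of the kind described is commonly summarized as a coefficient of variation (CV) across laboratories. The sketch below uses made-up lesion levels for illustration only; the numbers are not from the study.

```python
import statistics

def interlab_cv(values):
    """Coefficient of variation (%) of one endpoint reported across labs."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical FPG-sensitive-site levels from six labs, illustrating a
# spread that narrows when the shared protocol is used.
own_protocol = [0.4, 1.1, 2.3, 0.8, 3.0, 1.6]
standard_protocol = [0.9, 1.2, 1.5, 1.0, 1.7, 1.3]
```

A lower CV under the standard protocol would correspond to the study's finding for FPG-sensitive sites, while an unchanged CV would correspond to its finding for strand breaks.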

  3. Setting standards: Risk assessment issues

    SciTech Connect

    Pontius, F.W.

    1995-07-01

    How drinking water standards are set and which contaminants should be regulated are central issues in reauthorization of the Safe Drinking Water Act (SDWA). Suggested amendments to the standard-setting provisions of the SDWA cover a broad spectrum. In general, environmental groups argue that standards are not strict enough and that greater consideration should be given to sensitive subpopulations. Others note that the high cost associated with meeting increasingly strict standards is unjustified in light of the uncertain and sometimes nonexistent incremental benefits. This article takes a look at the issues involved in developing a rational approach for establishing drinking water standards. It reviews the current approach used by the US Environmental Protection Agency (USEPA) to set standards.

  4. Constantly evolving safety assessment protocols for GM foods.

    PubMed

    Sesikeran, B; Vasanthi, Siruguri

    2008-01-01

    The introduction of GM foods has led to the evolution of a food safety assessment paradigm that establishes the safety of a GM food relative to its conventional counterpart. The GM foods currently approved and marketed in several countries have undergone extensive safety testing under a structured safety assessment framework evolved by international organizations such as FAO, WHO, Codex and OECD. The major elements of safety assessment include molecular characterization of the inserted genes and stability of the trait, the toxicity and allergenicity potential of the expressed substances, compositional analysis, the potential for gene transfer to gut microflora, and unintentional effects of the genetic modification. As a greater number and variety of food crops are brought under the genetic modification regime, the adequacy of existing safety assessment protocols for establishing the safety of these foods has been questioned. Such crops include GM crops with higher agronomic vigour, crops with nutritional or health benefits achieved by modification of plant metabolic pathways, and crops expressing bioactive substances and pharmaceuticals. The safety assessment challenge for these foods is whether the methods can detect unintentional effects with sufficient sensitivity and rigor. Development of databases on food compositions, toxicants and allergens is currently seen as an important aid to the development of safety protocols. With the changing global trends in genetic modification technology, a future challenge will be to develop GM crops with a minimum amount of inserted foreign DNA, so as to reduce the burden of complex safety assessments while ensuring the safety and utility of the technology.

  5. A standardized protocol to reduce cerebrospinal fluid shunt infection: The Hydrocephalus Clinical Research Network Quality Improvement Initiative

    PubMed Central

    Kestle, John R. W.; Riva-Cambrin, Jay; Wellons, John C.; Kulkarni, Abhaya V.; Whitehead, William E.; Walker, Marion L.; Oakes, W. Jerry; Drake, James M.; Luerssen, Thomas G.; Simon, Tamara D.; Holubkov, Richard

    2011-01-01

    Object Quality improvement techniques are being implemented in many areas of medicine. In an effort to reduce the ventriculoperitoneal shunt infection rate, a standardized protocol was developed and implemented at 4 centers of the Hydrocephalus Clinical Research Network (HCRN). Methods The protocol was developed sequentially by HCRN members using the current literature and prior institutional experience until consensus was obtained. The protocol was prospectively applied at each HCRN center to all children undergoing a shunt insertion or revision procedure. Infections were defined on the basis of CSF, wound, or pseudocyst cultures; wound breakdown; abdominal pseudocyst; or positive blood cultures in the presence of a ventriculoatrial shunt. Procedures and infections were measured before and after protocol implementation. Results Twenty-one surgeons at 4 centers performed 1571 procedures between June 1, 2007, and February 28, 2009. The minimum follow-up was 6 months. The Network infection rate decreased from 8.8% prior to the protocol to 5.7% while using the protocol (p = 0.0028, absolute risk reduction 3.15%, relative risk reduction 36%). Three of 4 centers lowered their infection rate. Shunt surgery after external ventricular drainage (with or without prior infection) had the highest infection rate. Overall protocol compliance was 74.5% and improved over the course of the observation period. Based on logistic regression analysis, the use of BioGlide catheters (odds ratio [OR] 1.91, 95% CI 1.19–3.05; p = 0.007) and the use of antiseptic cream by any members of the surgical team (instead of a formal surgical scrub by all members of the surgical team; OR 4.53, 95% CI 1.43–14.41; p = 0.01) were associated with an increased risk of infection. Conclusions The standardized protocol for shunt surgery significantly reduced shunt infection across the HCRN. Overall protocol compliance was good. The protocol has established a common baseline within the Network, which will
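The trial's headline effect sizes follow directly from the two infection rates. Using the rounded published rates (8.8% and 5.7%) approximately reproduces them; the abstract's 3.15% and 36% were evidently computed from unrounded counts.

```python
def risk_reduction(rate_before, rate_after):
    """Absolute risk reduction, relative risk reduction, and number
    needed to treat from before/after event rates."""
    arr = rate_before - rate_after
    rrr = arr / rate_before
    nnt = 1 / arr
    return arr, rrr, nnt

# Rounded published shunt-infection rates: 8.8% pre-protocol, 5.7% on protocol
arr, rrr, nnt = risk_reduction(0.088, 0.057)
```

With these rounded inputs, ARR is 3.1% and RRR about 35%, and the implied number needed to treat is roughly 32 shunt procedures per infection averted.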

  6. Development of a new welfare assessment protocol for practical application in long-term dog shelters.

    PubMed

    Barnard, S; Pedernera, C; Candeloro, L; Ferri, N; Velarde, A; Dalla Villa, P

    2016-01-01

    In many European shelters, dogs may spend many years confined. A poor environment and inappropriate management may lead to a low quality of life. The absence of harmonised European regulatory frameworks defining the minimum requirements for shelter facilities makes the definition of welfare standards for kennelled dogs challenging. Here, a new protocol was developed and tested to help identify the main welfare issues for shelter dogs. Twenty-six indicators were identified including management, resource and animal based measures. Accuracy and interobserver reliability were checked between four assessors. The protocol was applied in 29 shelters (n=1308 dogs) in six European countries. Overall prevalence of poor health conditions was below 10%. Test-retest reliability and validity of the protocol were investigated with encouraging results. A logistic regression was carried out to assess the potential of the protocol as a tool to identify welfare hazards in shelter environments. Inappropriate space allowance, for example, was found to be a risk factor potentially affecting the animal's cleanliness, skin condition and body condition. The protocol was designed to be concise and easy to implement. Systematic data collection could help identify welfare problems that are likely to arise in certain shelter designs and thus determine improvement in animal care standards. PMID:26612859

  7. 78 FR 45096 - Standards for Business Practices and Communication Protocols for Public Utilities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-26

    ... No. 676-G, 78 FR 14654 (Mar. 7, 2013), FERC Stats. & Regs. ¶ 31,343 (Feb. 21, 2013). In this rule... protocols and related standards designed to improve the efficiency of communication within each industry... designed to ensure that all industry members can have input into the development of a standard, whether...

  8. A standardized sampling protocol for channel catfish in prairie streams

    USGS Publications Warehouse

    Vokoun, Jason C.; Rabeni, Charles F.

    2001-01-01

    Three alternative gears—an AC electrofishing raft, bankpoles, and a 15-hoop-net set—were used in a standardized manner to sample channel catfish Ictalurus punctatus in three prairie streams of varying size in three seasons. We compared these gears as to time required per sample, size selectivity, mean catch per unit effort (CPUE) among months, mean CPUE within months, effect of fluctuating stream stage, and sensitivity to population size. According to these comparisons, the 15-hoop-net set used during stable water levels in October had the most desirable characteristics. Using our catch data, we estimated the precision of CPUE and size structure by varying sample sizes for the 15-hoop-net set. We recommend that 11–15 repetitions of the 15-hoop-net set be used for most management activities. This standardized basic unit of effort will increase the precision of estimates and allow better comparisons among samples as well as increased confidence in management decisions.
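The link between replicate count and precision can be sketched by computing mean CPUE and its relative standard error across repeated net sets; the catch numbers below are hypothetical, chosen only to illustrate the recommended 11-15 repetitions.

```python
import statistics
from math import sqrt

def cpue_summary(catches):
    """Mean CPUE and its relative standard error (%) across replicate sets."""
    mean = statistics.mean(catches)
    se = statistics.stdev(catches) / sqrt(len(catches))
    return mean, 100 * se / mean

# Hypothetical channel catfish counts per 15-hoop-net set, 12 replicates
catches = [8, 12, 5, 9, 14, 7, 10, 11, 6, 9, 13, 8]
mean_cpue, rse_pct = cpue_summary(catches)
```

Because the standard error shrinks with the square root of the number of replicates, each added net set buys progressively less precision, which is the usual rationale for a fixed recommended range such as 11-15.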

  9. Using the CAS Standards in Assessment Projects

    ERIC Educational Resources Information Center

    Dean, Laura A.

    2013-01-01

    This chapter provides an overview of the use of professional standards of practice in assessment and of the Council for the Advancement of Standards in Higher Education (CAS). It outlines a model for conducting program self-studies and discusses the importance of implementing change based on assessment results.

  10. Performance assessment of time-domain optical brain imagers, part 2: nEUROPt protocol.

    PubMed

    Wabnitz, Heidrun; Jelzow, Alexander; Mazurenka, Mikhail; Steinkellner, Oliver; Macdonald, Rainer; Milej, Daniel; Żołek, Norbert; Kacprzak, Michal; Sawosz, Piotr; Maniewski, Roman; Liebert, Adam; Magazov, Salavat; Hebden, Jeremy; Martelli, Fabrizio; Di Ninni, Paola; Zaccanti, Giovanni; Torricelli, Alessandro; Contini, Davide; Re, Rebecca; Zucchelli, Lucia; Spinelli, Lorenzo; Cubeddu, Rinaldo; Pifferi, Antonio

    2014-08-01

    The nEUROPt protocol is one of two new protocols developed within the European project nEUROPt to characterize the performances of time-domain systems for optical imaging of the brain. It was applied in joint measurement campaigns to compare the various instruments and to assess the impact of technical improvements. This protocol addresses the characteristic of optical brain imaging to detect, localize, and quantify absorption changes in the brain. It was implemented with two types of inhomogeneous liquid phantoms based on Intralipid and India ink with well-defined optical properties. First, small black inclusions were used to mimic localized changes of the absorption coefficient. The position of the inclusions was varied in depth and lateral direction to investigate contrast and spatial resolution. Second, two-layered liquid phantoms with variable absorption coefficients were employed to study the quantification of layer-wide changes and, in particular, to determine depth selectivity, i.e., the ratio of sensitivities for deep and superficial absorption changes. We introduce the tests of the nEUROPt protocol and present examples of results obtained with different instruments and methods of data analysis. This protocol could be a useful step toward performance tests for future standards in diffuse optical imaging.
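The protocol's central quantities reduce to two ratios: the contrast produced by an absorption change, and depth selectivity as the ratio of deep to superficial sensitivity. The sketch below uses hypothetical photon counts from a two-layered phantom; the numbers are illustrative, not measured values from the campaigns.

```python
def contrast(n_baseline, n_perturbed):
    """Relative signal change caused by an absorption perturbation."""
    return (n_baseline - n_perturbed) / n_baseline

def depth_selectivity(c_deep, c_superficial):
    """Ratio of sensitivities to deep vs superficial absorption changes."""
    return c_deep / c_superficial

# Hypothetical photon counts: same absorption increase applied to the
# deep layer and then to the superficial layer of the phantom
c_deep = contrast(1.00e6, 0.97e6)
c_sup = contrast(1.00e6, 0.88e6)
```

A selectivity below 1, as in this toy example, reflects the generic difficulty of diffuse optical imaging: superficial changes produce larger signal changes than equally sized deep ones, and time-domain analysis methods aim to push this ratio upward.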

  12. A Protocol for Lifetime Energy and Environmental Impact Assessment of Building Insulation Materials

    SciTech Connect

    Shrestha, Som S; Biswas, Kaushik; Desjarlais, Andre Omer

    2014-01-01

    This article describes a proposed protocol that is intended to provide a comprehensive list of factors to be considered in evaluating the direct and indirect environmental impacts of building insulation materials, as well as detailed descriptions of standardized calculation methodologies to determine those impacts. The energy and environmental impacts of insulation materials can generally be divided into two categories: (1) direct impact due to the embodied energy of the insulation materials and other factors, and (2) indirect or environmental impacts avoided as a result of reduced building energy use due to addition of insulation. Standards and product category rules exist that provide guidelines about the life cycle assessment (LCA) of materials, including building insulation products. However, critical reviews have suggested that these standards fail to provide complete guidance to LCA studies and suffer from ambiguities regarding the determination of the environmental impacts of building insulation and other products. The focus of the assessment protocol described here is to identify all factors that contribute to the total energy and environmental impacts of different insulation products and, more importantly, provide standardized determination methods that will allow comparison of different insulation material types. Further, the intent is not to replace current LCA standards but to provide a well-defined, easy-to-use comparison method for insulation materials using existing LCA guidelines.
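    The two impact categories named above, direct embodied energy versus avoided operational energy, combine into a simple lifetime balance. A sketch with hypothetical numbers (not values from the protocol):

```python
def net_lifetime_energy(embodied_mj, annual_savings_mj, service_life_yr):
    """Net lifetime energy impact of an insulation product (MJ).

    Negative values mean the avoided operational energy (indirect impact)
    exceeds the embodied energy (direct impact). Illustrative accounting only;
    the protocol itself enumerates many more factors.
    """
    return embodied_mj - annual_savings_mj * service_life_yr

# Hypothetical figures for 1 m^2 of added insulation (invented for illustration):
net = net_lifetime_energy(embodied_mj=150.0, annual_savings_mj=40.0,
                          service_life_yr=30)
payback_yr = 150.0 / 40.0  # years until avoided energy offsets embodied energy
print(net, payback_yr)
```

    A standardized calculation of this kind is what allows different insulation material types to be compared on equal terms.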

  13. ACR/NEMA Digital Image Interface Standard (An Illustrated Protocol Overview)

    NASA Astrophysics Data System (ADS)

    Lawrence, G. Robert

    1985-09-01

    The American College of Radiologists (ACR) and the National Electrical Manufacturers Association (NEMA) have sponsored a joint standards committee mandated to develop a universal interface standard for the transfer of radiology images among a variety of PACS imaging devices. The resulting standard interface conforms to the ISO/OSI standard reference model for network protocol layering. The standard interface specifies the lower layers of the reference model (Physical, Data Link, Transport, and Session) and implies a requirement for the Network layer should a network be needed. The message content has been considered and a flexible message and image format specified. The following imaging equipment modalities are supported by the standard interface: CT (computed tomography), DS (digital subtraction), NM (nuclear medicine), US (ultrasound), MR (magnetic resonance), and DR (digital radiology). The following data types are standardized over the transmission interface media: image data, digitized voice, header data, raw data, text reports, graphics, and others. This paper consists of text supporting the illustrated protocol data flow. Each layer is treated individually, with particular emphasis on the Data Link layer (frames) and the Transport layer (packets). The discussion utilizes a finite-state sequential machine model for the protocol layers.

  14. [Oral rehydration in acute gastroenteritis in infants and children--advantages of a standardized protocol].

    PubMed

    Weizman, Z; Weizman, A; Alsheikh, A; Herzog, L; Tal, A; Gorodischer, R

    2000-11-01

    Oral rehydration (OR) for acute gastroenteritis in infants and children has been shown to be as effective as IV therapy, with less discomfort and lower costs. In this retrospective study we compared 2 pediatric wards, in 1 of which a standardized, simplified, bedside protocol based on American Academy of Pediatrics guidelines was used. There were no significant differences in clinical characteristics among the 208 patients. In the ward which used the above protocol, OR utilization was significantly more frequent than in the other ward (48% versus 15%), saving equipment costs of nearly $1,000 over 3 months. There were no significant differences in outcome between the wards. We conclude that introducing a standardized management protocol may increase OR utilization in hospitalized children with acute diarrhea. PMID:11341212

  15. Nutrition assessment outcomes: a protocol for Native American hospitals.

    PubMed

    Bickford, G R; Brugler, L J; Gannon, C

    2000-12-01

    The incorporation of visceral protein testing into nutrition assessment protocols can help all hospitals rapidly and accurately identify patients in need of restorative nutrition therapy. Reilly showed that a 3- to 5-day delay in identifying malnutrition has a direct variable cost of $1,500 per case. Studies by Brugler, Mears, and Reilly have demonstrated longer lengths of stay and increased care costs because of nosocomial complications (i.e., infections, pressure ulcers, wound dehiscence, dyspnea, system failures) related to malnutrition. Brugler showed that functionality--a measure of a patient's independence and ability to perform daily activities--both at admission and discharge, the number of care interventions, the occurrence of complications, the level of nutrition treatment needed, and the patient's discharge disposition were strongly associated with their admission albumin value. Conversely, nutrition restoration leads to improved patient outcomes, reduced costs, maximization of care reimbursement, and fulfillment of regulatory requirements. Adoption of this protocol by other hospitals should allow them to demonstrate comparable results, thereby justifying the incorporation of visceral protein testing into their nutrition assessment methods.

  16. 75 FR 4323 - Additional Quantitative Fit-testing Protocols for the Respiratory Protection Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-27

    ... performed particle counts on samples collected during the Study. Table 1 provides the exercise and sampling... revised PortaCount quantitative fit-testing protocols are not sufficiently accurate or reliable to include...) to Appendix A of its Respiratory Protection Standard (see 69 FR 46986). OSHA also published...

  17. Behavior Intervention for Students with Externalizing Behavior Problems: Primary-Level Standard Protocol

    ERIC Educational Resources Information Center

    Benner, Gregory J.; Nelson, J. Ron; Sanders, Elizabeth A.; Ralston, Nicole C.

    2012-01-01

    This article examined the efficacy of a primary-level, standard-protocol behavior intervention for students with externalizing behavioral disorders. Elementary schools were randomly assigned to treatment (behavior intervention) or control (business as usual) conditions, and K-3 students were screened for externalizing behavior risk status. The…

  18. Dosimetric and image quality assessment of different acquisition protocols of a novel 64-slice CT scanner

    NASA Astrophysics Data System (ADS)

    Vite, Cristina; Mangini, Monica; Strocchi, Sabina; Novario, Raffaele; Tanzi, Fabio; Carrafiello, Gianpaolo; Conte, Leopoldo; Fugazzola, Carlo

    2006-03-01

    Dose and image quality assessment in computed tomography (CT) are strongly affected by the vast variety of CT scanners (axial CT, spiral CT, low-multislice CT (2-16), high-multislice CT (32-64)) and imaging protocols in use. Little information is currently available on 64-slice CT scanners. The aim of this work is to assess image quality in relation to patient dose indexes and to investigate the achievable dose reduction for a commercially available 64-slice CT scanner. CT dose indexes (weighted computed tomography dose index, CTDIw, and dose-length product, DLP) were measured with a standard CT phantom for the main protocols in use (head, chest, abdomen, and pelvis) and compared with the values displayed by the scanner itself. The differences were always below 7%. All the indexes were below the diagnostic reference levels defined by the European Council Directive 97/43. Effective doses were measured for each protocol with thermoluminescent dosimeters inserted in an anthropomorphic Alderson Rando phantom and compared with the same values computed by the ImPACT CT Patient Dosimetry Calculator software and corrected by a factor taking into account the number of slices (from 16 to 64). The differences were always below 25%. The effective doses ranged from 1.5 mSv (head) to 21.8 mSv (abdomen). The dose reduction system of the scanner was assessed by comparing the effective dose measured for a standard phantom-man (a cylindrical phantom, 32 cm in diameter) to the mean dose evaluated on 46 patients; the standard phantom was considered the no-dose-reduction reference. The dose reduction factor ranged from 16% to 78% (mean of 46%) for all protocols, from 29% to 78% (mean of 55%) for the chest protocol, and from 16% to 76% (mean of 42%) for the abdomen protocol. The possibility of a further dose reduction was investigated by measuring image quality (spatial resolution, contrast, and noise) as a function of CTDIw. This curve shows a quite flat trend when decreasing the dose to approximately 90%, and a
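    The two dose comparisons reported in this abstract, measured versus scanner-displayed dose indexes and dose reduction relative to the standard phantom, are both simple relative differences. A sketch with hypothetical values (not the study's measurements):

```python
def percent_difference(measured, displayed):
    """Relative deviation of the scanner-displayed value from the measured one (%)."""
    return abs(displayed - measured) / measured * 100.0

def dose_reduction(reference_dose, patient_dose):
    """Dose reduction relative to the fixed standard-phantom reference (%)."""
    return (reference_dose - patient_dose) / reference_dose * 100.0

# Hypothetical values (invented, not the study's data):
dev = percent_difference(measured=15.0, displayed=14.2)        # CTDIw check, mGy
red = dose_reduction(reference_dose=20.0, patient_dose=11.0)   # modulation, mSv
print(f"{dev:.1f}% deviation, {red:.0f}% dose reduction")
```

    The study's acceptance criteria (differences below 7% for dose indexes, below 25% for effective dose) would be applied to exactly this kind of quantity.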

  19. A standard protocol for describing individual-based and agent-based models

    USGS Publications Warehouse

    Grimm, Volker; Berger, Uta; Bastiansen, Finn; Eliassen, Sigrunn; Ginot, Vincent; Giske, Jarl; Goss-Custard, John; Grand, Tamara; Heinz, Simone K.; Huse, Geir; Huth, Andreas; Jepsen, Jane U.; Jorgensen, Christian; Mooij, Wolf M.; Muller, Birgit; Pe'er, Guy; Piou, Cyril; Railsback, Steven F.; Robbins, Andrew M.; Robbins, Martha M.; Rossmanith, Eva; Ruger, Nadja; Strand, Espen; Souissi, Sami; Stillman, Richard A.; Vabo, Rune; Visser, Ute; DeAngelis, Donald L.

    2006-01-01

    Simulation models that describe autonomous individual organisms (individual based models, IBM) or agents (agent-based models, ABM) have become a widely used tool, not only in ecology, but also in many other disciplines dealing with complex systems made up of autonomous entities. However, there is no standard protocol for describing such simulation models, which can make them difficult to understand and to duplicate. This paper presents a proposed standard protocol, ODD, for describing IBMs and ABMs, developed and tested by 28 modellers who cover a wide range of fields within ecology. This protocol consists of three blocks (Overview, Design concepts, and Details), which are subdivided into seven elements: Purpose, State variables and scales, Process overview and scheduling, Design concepts, Initialization, Input, and Submodels. We explain which aspects of a model should be described in each element, and we present an example to illustrate the protocol in use. In addition, 19 examples are available in an Online Appendix. We consider ODD as a first step for establishing a more detailed common format of the description of IBMs and ABMs. Once initiated, the protocol will hopefully evolve as it becomes used by a sufficiently large proportion of modellers.
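    The three-block, seven-element ODD structure described above can be treated as a checklist when writing or reviewing a model description. A hypothetical skeleton (element names are from the protocol; all content is a placeholder):

```python
# ODD blocks and their elements, as named in the protocol description above.
ODD_ELEMENTS = {
    "Overview": ["Purpose", "State variables and scales",
                 "Process overview and scheduling"],
    "Design concepts": ["Design concepts"],
    "Details": ["Initialization", "Input", "Submodels"],
}

def odd_outline(descriptions):
    """Render an ODD-ordered outline, flagging elements left undescribed."""
    lines = []
    for block, elements in ODD_ELEMENTS.items():
        lines.append(block)
        for el in elements:
            lines.append(f"  {el}: {descriptions.get(el, '<missing>')}")
    return "\n".join(lines)

# Placeholder model description with only one element filled in:
print(odd_outline({"Purpose": "Explore flocking under predation (placeholder)"}))
```

    Checking a draft model description against the seven elements in order is the intended use of the protocol; the code above just mechanizes that check.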

  20. Delivered dose estimate to standardize airway hyperresponsiveness assessment in mice.

    PubMed

    Robichaud, Annette; Fereydoonzad, Liah; Schuessler, Thomas F

    2015-04-15

    Airway hyperresponsiveness often constitutes a primary outcome in respiratory studies in mice. The procedure commonly employs aerosolized challenges, and results are typically reported in terms of bronchoconstrictor concentrations loaded into the nebulizer. Yet, because protocols frequently differ across studies, especially in terms of aerosol generation and delivery, direct study comparisons are difficult. We hypothesized that protocol variations could lead to differences in aerosol delivery efficiency and, consequently, in the dose delivered to the subject, as well as in the response. Thirteen nebulization patterns containing common protocol variations (nebulization time, duty cycle, particle size spectrum, air humidity, and/or ventilation profile) and using increasing concentrations of methacholine and broadband forced oscillations (flexiVent, SCIREQ, Montreal, Qc, Canada) were created, characterized, and studied in anesthetized naïve A/J mice. A delivered dose estimate calculated from nebulizer-, ventilator-, and subject-specific characteristics was introduced and used to account for protocol variations. Results showed that nebulization protocol variations significantly affected the fraction of aerosol reaching the subject site and the delivered dose, as well as methacholine reactivity and sensitivity in mice. From the protocol variants studied, addition of a slow deep ventilation profile during nebulization was identified as a key factor for optimization of the technique. The study also highlighted sensitivity differences within the lung, as well as the possibility that airway responses could be selectively enhanced by adequate control of nebulizer and ventilator settings. Reporting results in terms of delivered doses represents an important standardizing element for assessment of airway hyperresponsiveness in mice. PMID:25637610
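    The delivered dose estimate described above combines nebulizer-, ventilator-, and subject-specific characteristics. The following sketch shows that style of calculation; the factor names and values are invented for illustration, since the study's actual formula is not given in the abstract:

```python
def delivered_dose_ug(conc_mg_ml, neb_output_ul_min, neb_time_s, duty_cycle,
                      inhaled_fraction):
    """Estimate of drug mass reaching the subject (micrograms).

    conc_mg_ml        : methacholine concentration loaded into the nebulizer
    neb_output_ul_min : aerosol output rate of the nebulizer
    duty_cycle        : fraction of time the nebulizer actively aerosolizes
    inhaled_fraction  : fraction of emitted aerosol reaching the subject site
    All names and values are illustrative, not from the study.
    """
    aerosol_ul = neb_output_ul_min * (neb_time_s / 60.0) * duty_cycle
    return conc_mg_ml * aerosol_ul * inhaled_fraction  # mg/mL * uL = ug

dose = delivered_dose_ug(conc_mg_ml=25.0, neb_output_ul_min=8.0,
                         neb_time_s=10.0, duty_cycle=0.5, inhaled_fraction=0.1)
print(f"{dose:.3f} ug delivered")
```

    Reporting the product of these factors, rather than the loaded concentration alone, is what makes results comparable across protocols that differ in aerosol generation and delivery.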

  1. Improving post-stroke dysphagia outcomes through a standardized and multidisciplinary protocol: an exploratory cohort study.

    PubMed

    Gandolfi, Marialuisa; Smania, Nicola; Bisoffi, Giulia; Squaquara, Teresa; Zuccher, Paola; Mazzucco, Sara

    2014-12-01

    Stroke is a major cause of dysphagia. Few studies to date have reported on standardized multidisciplinary protocolized approaches to the management of post-stroke dysphagia. The aim of this retrospective cohort study was to evaluate the impact of a standardized multidisciplinary protocol on clinical outcomes in patients with post-stroke dysphagia. We performed retrospective chart reviews of patients with post-stroke dysphagia admitted to the neurological ward of Verona University Hospital from 2004 to 2008. Outcomes after usual treatment for dysphagia (T- group) were compared versus outcomes after treatment under a standardized diagnostic and rehabilitative multidisciplinary protocol (T+ group). Outcome measures were death, pneumonia on X-ray, need for respiratory support, and proportion of patients on tube feeding at discharge. Of the 378 patients admitted with stroke, 84 had dysphagia and were enrolled in the study. A significantly lower risk of in-hospital death (odds ratio [OR] 0.20 [0.53-0.78]), pneumonia (OR 0.33 [0.10-1.03]), need for respiratory support (OR 0.48 [0.14-1.66]), and tube feeding at discharge (OR 0.30 [0.09-0.91]) was recorded for the T+ group (N = 39) as compared to the T- group (N = 45). The adjusted OR showed no difference between the two groups for in-hospital death and tube feeding at discharge. Use of a standardized multidisciplinary protocolized approach to the management of post-stroke dysphagia may significantly reduce rates of aspiration pneumonia, in-hospital mortality, and tube feeding in dysphagic stroke survivors. Consistent with the study's exploratory purposes, our findings suggest that the multidisciplinary protocol applied in this study offers an effective model of management of post-stroke dysphagia.
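    The odds ratios quoted above compare event rates between the protocolized (T+) and usual-care (T-) groups; an OR and its 95% confidence interval follow directly from a 2x2 table. A sketch with invented counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI (Woolf/logit method) for a 2x2 table:
    a = events in exposed, b = non-events in exposed,
    c = events in unexposed, d = non-events in unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts (events / non-events in T+ vs T-), for illustration only:
or_, lo, hi = odds_ratio_ci(a=4, b=35, c=14, d=31)
print(f"OR = {or_:.2f} [{lo:.2f}-{hi:.2f}]")
```

    An OR below 1 with an upper confidence limit below 1 indicates a significantly lower risk in the protocolized group.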

  2. State Standards and State Assessment Systems: A Guide to Alignment. Series on Standards and Assessments.

    ERIC Educational Resources Information Center

    La Marca, Paul M.; Redfield, Doris; Winter, Phoebe C.

    Alignment of content standards, performance standards, and assessments is crucial. This guide contains information to assist states and districts in aligning their assessment systems to their content and performance standards. It includes a review of current literature, both published and fugitive. The research is woven together with a few basic…

  3. Protocol for exercise hemodynamic assessment: performing an invasive cardiopulmonary exercise test in clinical practice

    PubMed Central

    Berry, Natalia C.; Manyoo, Agarwal; Oldham, William M.; Stephens, Thomas E.; Goldstein, Ronald H.; Waxman, Aaron B.; Tracy, Julie A.; Leary, Peter J.; Leopold, Jane A.; Kinlay, Scott; Opotowsky, Alexander R.; Systrom, David M.

    2015-01-01

    Abstract Invasive cardiopulmonary exercise testing (iCPET) combines full central hemodynamic assessment with continuous measurements of pulmonary gas exchange and ventilation to help in understanding the pathophysiology underpinning unexplained exertional intolerance. There is increasing evidence to support the use of iCPET as a key methodology for diagnosing heart failure with preserved ejection fraction and exercise-induced pulmonary hypertension as occult causes of exercise limitation, but there is little information available outlining the methodology to use this diagnostic test in clinical practice. To bridge this knowledge gap, the operational protocol for iCPET at our institution is discussed in detail. In turn, a standardized iCPET protocol may provide a common framework to describe the evolving understanding of mechanism(s) that limit exercise capacity and to facilitate research efforts to define novel treatments in these patients. PMID:26697168

  4. The Missing Link: Standards, Assessment, "and" Instruction

    ERIC Educational Resources Information Center

    Fisher, Douglas

    2005-01-01

    Over a two-year period, the teachers at John Adams Middle School wrote and administered eight common assessments across content areas and met to discuss the results of each. By linking standards, assessments, and instruction, teachers were able to identify areas of need for specific students and address those needs. Educators across the country…

  5. Individual versus Standardized Running Protocols in the Determination of VO2max.

    PubMed

    Sperlich, Paula F; Holmberg, Hans-Christer; Reed, Jennifer L; Zinner, Christoph; Mester, Joachim; Sperlich, Billy

    2015-06-01

    The purpose of this study was to determine whether an individually designed incremental exercise protocol results in greater rates of oxygen uptake (VO2max) than standardized testing. Fourteen well-trained, male runners performed five incremental protocols in randomized order to measure their VO2max: i) an incremental test (INCS+I) with pre-defined increases in speed (2 min at 8.64 km·h(-1), then a rise of 1.44 km·h(-1) every 30 s up to 14.4 km·h(-1)) and thereafter inclination (0.5° every 30 s); ii) an incremental test (INCI) at constant speed (14.4 km·h(-1)) and increasing inclination (2° every 2 min from the initial 0°); iii) an incremental test (INCS) at constant inclination (0°) and increasing speed (0.5 km·h(-1) every 30 s from the initial 12.0 km·h(-1)); iv) a graded exercise protocol (GXP) at a 1° incline with increasing speed (initially 8.64 km·h(-1) + 1.44 km·h(-1) every 5 min); v) an individual exercise protocol (INDXP) in which the runner chose the inclination and speed. VO2max was lowest (-4.2%) during the GXP (p = 0.01; d = 0.06-0.61) compared to all other tests. The highest rating of perceived exertion, heart rate, ventilation and end-exercise blood lactate concentration were similar between the different protocols (p < 0.05). The time to exhaustion ranged from 7 min 18 sec (INCS) to 25 min 30 sec (GXP) (p = 0.01). The VO2max attained by employing an individual treadmill protocol does not differ from the values derived from various standardized incremental protocols. Key points: The mean maximum oxygen uptake during the GXP was lower than for all other tests. Differences in the maximum rate of oxygen uptake between the various protocols exhibited considerable inter-individual variation. From the current findings, it can be concluded that well-trained athletes are able to perform an individually designed treadmill running protocol.
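    The INCS+I staging quoted above (a 2-min warm-up, speed steps every 30 s up to 14.4 km/h, then inclination steps) can be laid out as an explicit stage schedule. A sketch using the parameters from the abstract; the open-ended incline phase (which in practice runs to exhaustion) is capped here for illustration:

```python
def incs_i_schedule(max_incline_stages=4):
    """Stage list (start_time_s, speed_kmh, incline_deg) for the INCS+I
    protocol as described in the abstract. The number of incline stages is
    open-ended (test runs to exhaustion), so it is capped for illustration."""
    stages = [(0, 8.64, 0.0)]          # 2-min warm-up at 8.64 km/h
    t, speed = 120, 8.64
    while speed < 14.4:                # +1.44 km/h every 30 s up to 14.4 km/h
        speed = round(speed + 1.44, 2)
        stages.append((t, speed, 0.0))
        t += 30
    incline = 0.0
    for _ in range(max_incline_stages):  # thereafter +0.5 deg every 30 s
        incline += 0.5
        stages.append((t, 14.4, incline))
        t += 30
    return stages

sched = incs_i_schedule()
print(sched[:3], "...", sched[-1])
```

    Generating the schedule programmatically makes it easy to feed the same stage table to treadmill control and to data analysis, which is one practical argument for standardized protocols.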

  7. Report from the NOAA workshops to standardize protocols for monitoring toxic Pfiesteria species and associated environmental conditions.

    PubMed

    Luttenberg, D; Turgeon, D; Higgins, J

    2001-10-01

    Long-term monitoring of water quality, fish health, and plankton communities in susceptible bodies of water is crucial to identify the environmental factors that contribute to outbreaks of toxic Pfiesteria complex (TPC) species. In the aftermath of the 1997 toxic Pfiesteria outbreaks in North Carolina and Maryland, federal and several state agencies agreed that there was a need to standardize monitoring protocols. The National Oceanic & Atmospheric Administration convened two workshops that brought together state, federal, and academic resource managers and scientific experts to a) seek consensus on responding to and monitoring potential toxic Pfiesteria outbreaks; b) recommend standard parameters and protocols to characterize water quality, fish health, and plankton at historical event sites and potentially susceptible sites; and c) discuss options for integrating monitoring data sets from different states into regional and national assessments. Workshop recommendations included the development of a three-tiered TPC monitoring strategy: Tier 1, rapid event response; Tier 2, comprehensive assessment; and Tier 3, routine monitoring. These tiers correspond to varying levels of water quality, fish health, and plankton monitoring frequency and intensity. Under the strategy, sites are prioritized, depending upon their history and susceptibility to TPC events, and assigned an appropriate level of monitoring activity. Participants also agreed upon a suite of water quality parameters that should be monitored. These recommendations provide guidance to state and federal agencies conducting rapid-response and assessment activities at sites of suspected toxic Pfiesteria outbreaks, as well as to states that are developing such monitoring programs for the first time.

  8. Reef Fish Survey Techniques: Assessing the Potential for Standardizing Methodologies.

    PubMed

    Caldwell, Zachary R; Zgliczynski, Brian J; Williams, Gareth J; Sandin, Stuart A

    2016-01-01

    Dramatic changes in populations of fishes living on coral reefs have been documented globally and, in response, the research community has initiated efforts to assess and monitor reef fish assemblages. A variety of visual census techniques are employed; however, results are often incomparable due to differential methodological performance. Although comparability of data may promote improved assessment of fish populations, and thus management of often critically important nearshore fisheries, to date no standardized and agreed-upon survey method has emerged. This study describes the use of methods across the research community and identifies potential drivers of method selection. An online survey was distributed to researchers from academic, governmental, and non-governmental organizations internationally. Although many methods were identified, 89% of survey-based projects employed one of three methods: belt transect, stationary point count, and some variation of the timed swim method. The selection of survey method was independent of the research design (i.e., assessment goal) and region of study, but was related to the researcher's home institution. While some researchers expressed willingness to modify their current survey protocols to more standardized protocols (76%), their willingness decreased when methodologies were tied to long-term datasets spanning five or more years. Willingness to modify current methodologies was also less common among academic researchers than resource managers. By understanding both the current application of methods and the reported motivations for method selection, we hope to focus discussions towards increasing the comparability of quantitative reef fish survey data. PMID:27111085

  10. Evaluation of Dogs with Border Collie Collapse, Including Response to Two Standardized Strenuous Exercise Protocols.

    PubMed

    Taylor, Susan; Shmon, Cindy; Su, Lillian; Epp, Tasha; Minor, Katie; Mickelson, James; Patterson, Edward; Shelton, G Diane

    2016-01-01

    Clinical and metabolic variables were evaluated in 13 dogs with border collie collapse (BCC) before, during, and following completion of standardized strenuous exercise protocols. Six dogs participated in a ball-retrieving protocol, and seven dogs participated in a sheep-herding protocol. Findings were compared with 16 normal border collies participating in the same exercise protocols (11 retrieving, five herding). Twelve dogs with BCC developed abnormal mentation and/or an abnormal gait during evaluation. All dogs had post-exercise elevations in rectal temperature, pulse rate, arterial blood pH, PaO2, and lactate, and decreased PaCO2 and bicarbonate, as expected with strenuous exercise, but there were no significant differences between BCC dogs and normal dogs. Electrocardiography demonstrated sinus tachycardia in all dogs following exercise. Needle electromyography was normal, and evaluation of muscle biopsy cryosections using a standard panel of histochemical stains and reactions did not reveal a reason for collapse in 10 dogs with BCC in which these tests were performed. Genetic testing excluded the dynamin-1 related exercise-induced collapse mutation and the V547A malignant hyperthermia mutation as the cause of BCC. Common reasons for exercise intolerance were eliminated. Although a genetic basis is suspected, the cause of collapse in BCC was not determined. PMID:27487345

  11. Standardization of infrared breast thermogram acquisition protocols and abnormality analysis of breast thermograms

    NASA Astrophysics Data System (ADS)

    Bhowmik, Mrinal Kanti; Gogoi, Usha Rani; Das, Kakali; Ghosh, Anjan Kumar; Bhattacharjee, Debotosh; Majumdar, Gautam

    2016-05-01

    The non-invasive, painless, radiation-free, and cost-effective infrared breast thermography (IBT) can make a significant contribution to improving the survival rate of breast cancer patients by detecting the disease early. This paper presents a set of standard breast thermogram acquisition protocols to improve the potential and accuracy of infrared breast thermograms in early breast cancer detection. By maintaining all these protocols, an infrared breast thermogram acquisition setup has been established at the Regional Cancer Centre (RCC) of Government Medical College (AGMC), Tripura, India. The acquisition of breast thermograms is followed by interpretation to identify the presence of any abnormality. However, due to the presence of complex vascular patterns, accurate interpretation of breast thermograms is a very challenging task. The bilateral symmetry of the thermal patterns in each breast thermogram is quantitatively computed by statistical feature analysis. A series of statistical features is extracted from a set of 20 thermograms of both healthy and unhealthy subjects. Finally, the extracted features are analyzed for breast abnormality detection. The key contributions of this paper are: a) the design of a standard protocol suite for accurate acquisition of breast thermograms, b) the creation of a new breast thermogram dataset maintained under the protocol suite, and c) statistical analysis of the thermograms for abnormality detection. This proposed work can thereby minimize the rate of false findings in breast thermograms and thus increase the utilization potential of breast thermograms in early breast cancer detection.
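    The bilateral-symmetry analysis described above compares statistical features between contralateral breast regions. A minimal sketch with invented pixel temperatures and a simple first-order feature set (the paper's actual features are not listed in the abstract):

```python
import numpy as np

def thermal_features(region):
    """Simple first-order statistical features of a thermogram region (degC)."""
    r = np.asarray(region, dtype=float)
    centered = r - r.mean()
    return {"mean": r.mean(),
            "std": r.std(),
            "skew": (centered ** 3).mean() / r.std() ** 3}

def asymmetry(left, right):
    """Absolute feature differences between contralateral regions; large
    values would flag potential abnormality (illustrative rule only)."""
    fl, fr = thermal_features(left), thermal_features(right)
    return {k: abs(fl[k] - fr[k]) for k in fl}

# Synthetic 64x64 temperature maps: one side uniformly 0.8 degC warmer.
rng = np.random.default_rng(0)
left = 32.0 + 0.3 * rng.standard_normal((64, 64))
right = left + 0.8
diff = asymmetry(left, right)
print(diff["mean"])
```

    For real thermograms the feature set would be richer and the decision threshold would come from the healthy/unhealthy training data mentioned in the abstract.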

  12. Evaluation of Dogs with Border Collie Collapse, Including Response to Two Standardized Strenuous Exercise Protocols.

    PubMed

    Taylor, Susan; Shmon, Cindy; Su, Lillian; Epp, Tasha; Minor, Katie; Mickelson, James; Patterson, Edward; Shelton, G Diane

    2016-01-01

    Clinical and metabolic variables were evaluated in 13 dogs with border collie collapse (BCC) before, during, and following completion of standardized strenuous exercise protocols. Six dogs participated in a ball-retrieving protocol, and seven dogs participated in a sheep-herding protocol. Findings were compared with 16 normal border collies participating in the same exercise protocols (11 retrieving, five herding). Twelve dogs with BCC developed abnormal mentation and/or an abnormal gait during evaluation. All dogs had post-exercise elevations in rectal temperature, pulse rate, arterial blood pH, PaO2, and lactate, and decreased PaCO2 and bicarbonate, as expected with strenuous exercise, but there were no significant differences between BCC dogs and normal dogs. Electrocardiography demonstrated sinus tachycardia in all dogs following exercise. Needle electromyography was normal, and evaluation of muscle biopsy cryosections using a standard panel of histochemical stains and reactions did not reveal a reason for collapse in 10 dogs with BCC in which these tests were performed. Genetic testing excluded the dynamin-1 related exercise-induced collapse mutation and the V547A malignant hyperthermia mutation as the cause of BCC. Common reasons for exercise intolerance were eliminated. Although a genetic basis is suspected, the cause of collapse in BCC was not determined.

  13. Definition of standardized nutritional assessment and interventional pathways in oncology.

    PubMed

    Ottery, F D

    1996-01-01

    Weight loss and nutritional deterioration are associated with adverse outcomes in terms of cancer prognosis (response rate and survival) as well as increased complications, prolonged hospitalizations, increased risk of unplanned hospitalization, increased disability, and increased overall cost of care. The nutritional oncology service at Fox Chase Cancer Center defined a proactive, standardized assessment and interventional approach from 1987-1994. In 186 consecutive patients referred to the nutrition clinic and managed solely by oral intervention and aggressive symptom management, the team demonstrated a 50%-80% success rate in getting patients to maintain or gain weight during therapy, with a similar success in maintaining or improving visceral protein status as determined by serum transferrin and/or albumin. Evaluation of the home parenteral nutrition program (n = 65, from 1987-1993) demonstrated similar success when appropriate triaging was carried out, with 58% of patients able to be tapered off parenteral nutrition (PN) entirely or with transition to enteral tube feeding. The assessment of success for a nutritional intervention (e.g., a disease-specific nutritional supplement) requires the standardization of definitions, assessment tools, criteria for nutritional intervention, and appropriate end points for the assessment of outcomes. The Patient-Generated Subjective Global Assessment of nutritional status is used in conjunction with the nutritional risk of planned cancer therapy to define a standardized interventional approach in oncology patients, which can be used in clinical practice, cooperative oncology group protocols, and clinical trials of nutritional intervention regimens. PMID:8850213

  14. Characterizing College Science Assessments: The Three-Dimensional Learning Assessment Protocol

    PubMed Central

    Underwood, Sonia M.; Matz, Rebecca L.; Posey, Lynmarie A.; Carmel, Justin H.; Caballero, Marcos D.; Fata-Hartley, Cori L.; Ebert-May, Diane; Jardeleza, Sarah E.; Cooper, Melanie M.

    2016-01-01

    Many calls to improve science education in college and university settings have focused on improving instructor pedagogy. Meanwhile, science education at the K-12 level is undergoing significant changes as a result of the emphasis on scientific and engineering practices, crosscutting concepts, and disciplinary core ideas. This framework of “three-dimensional learning” is based on the literature about how people learn science and how we can help students put their knowledge to use. Recently, similar changes are underway in higher education by incorporating three-dimensional learning into college science courses. As these transformations move forward, it will become important to assess three-dimensional learning both to align assessments with the learning environment, and to assess the extent of the transformations. In this paper we introduce the Three-Dimensional Learning Assessment Protocol (3D-LAP), which is designed to characterize and support the development of assessment tasks in biology, chemistry, and physics that align with transformation efforts. We describe the development process used by our interdisciplinary team, discuss the validity and reliability of the protocol, and provide evidence that the protocol can distinguish between assessments that have the potential to elicit evidence of three-dimensional learning and those that do not. PMID:27606671

  15. Characterizing College Science Assessments: The Three-Dimensional Learning Assessment Protocol.

    PubMed

    Laverty, James T; Underwood, Sonia M; Matz, Rebecca L; Posey, Lynmarie A; Carmel, Justin H; Caballero, Marcos D; Fata-Hartley, Cori L; Ebert-May, Diane; Jardeleza, Sarah E; Cooper, Melanie M

    2016-01-01

    Many calls to improve science education in college and university settings have focused on improving instructor pedagogy. Meanwhile, science education at the K-12 level is undergoing significant changes as a result of the emphasis on scientific and engineering practices, crosscutting concepts, and disciplinary core ideas. This framework of "three-dimensional learning" is based on the literature about how people learn science and how we can help students put their knowledge to use. Recently, similar changes are underway in higher education by incorporating three-dimensional learning into college science courses. As these transformations move forward, it will become important to assess three-dimensional learning both to align assessments with the learning environment, and to assess the extent of the transformations. In this paper we introduce the Three-Dimensional Learning Assessment Protocol (3D-LAP), which is designed to characterize and support the development of assessment tasks in biology, chemistry, and physics that align with transformation efforts. We describe the development process used by our interdisciplinary team, discuss the validity and reliability of the protocol, and provide evidence that the protocol can distinguish between assessments that have the potential to elicit evidence of three-dimensional learning and those that do not. PMID:27606671
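    Reliability of a coding protocol like the 3D-LAP is commonly summarized with an inter-rater agreement statistic. The abstract does not name the statistic used, so the Cohen's kappa sketch below is illustrative only (`cohens_kappa` is a hypothetical helper):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters coding the same assessment items
    (e.g., 'has potential to elicit 3D learning' vs 'does not').
    Chance-corrects the observed agreement."""
    assert len(rater1) == len(rater2) and rater1
    n = len(rater1)
    observed = sum(1 for a, b in zip(rater1, rater2) if a == b) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[lab] * c2[lab] for lab in set(c1) | set(c2)) / (n * n)
    return 1.0 if expected == 1 else (observed - expected) / (1 - expected)
```

    Kappa is 1 for perfect agreement and 0 when agreement is no better than chance.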

  17. Assessment of Service Protocols Adaptability Using a Novel Path Computation Technique

    NASA Astrophysics Data System (ADS)

    Zhou, Zhangbing; Bhiri, Sami; Haller, Armin; Zhuge, Hai; Hauswirth, Manfred

    In this paper we propose a new kind of adaptability assessment that determines whether the service protocols of a requestor and a provider are adaptable, computes their degree of adaptation, and identifies the conditions under which they can be adapted. We also propose a technique that implements this adaptability assessment: (1) we construct a complete adaptation graph that captures all service interactions adaptable between the two service protocols; the graph is non-empty exactly when the protocols are adaptable. (2) We propose a novel path computation technique to generate all instance sub-protocols, which reflect valid executions of a particular service protocol, and to derive all instance sub-protocol pairs captured by the complete adaptation graph. An adaptation degree is then computed as the ratio between the number of a service protocol's instance sub-protocols that are captured by these pairs and the total number of its instance sub-protocols. (3) Finally, we identify a set of conditions based on these instance sub-protocol pairs; each condition is the conjunction of all conditions specified on the transitions of a given pair of instance sub-protocols. This assessment is a comprehensive means of selecting a suitable service protocol among functionally equivalent candidates according to the requestor's business requirements.
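    Steps (2) and (3) above can be sketched for a protocol modeled as a finite-state machine. The representation below (a transition dict, path enumeration by depth-first search, the degree as a simple ratio) is a deliberate simplification of the paper's technique, with hypothetical function names:

```python
def instance_subprotocols(transitions, start, finals):
    """Enumerate all complete execution paths (instance sub-protocols)
    of a service protocol modeled as a finite-state machine.

    transitions: dict mapping state -> list of (message, next_state).
    Returns message sequences leading from `start` to a final state.
    Assumes an acyclic protocol graph (simplification for this sketch).
    """
    paths = []

    def dfs(state, acc):
        if state in finals:
            paths.append(tuple(acc))
        for msg, nxt in transitions.get(state, []):
            dfs(nxt, acc + [msg])

    dfs(start, [])
    return paths

def adaptation_degree(protocol_paths, adaptable_paths):
    """Ratio of a protocol's instance sub-protocols that are captured
    by the complete adaptation graph, abstracted here as the set of
    adaptable message sequences."""
    if not protocol_paths:
        return 0.0
    covered = sum(1 for p in protocol_paths if p in adaptable_paths)
    return covered / len(protocol_paths)
```

    For a protocol with two executions of which one is adaptable, the degree is 0.5.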

  18. The California Multimedia Risk Assessment Protocol for Alternative Fuels

    NASA Astrophysics Data System (ADS)

    Hatch, T.; Ginn, T. R.; McKone, T. E.; Rice, D. W.

    2013-12-01

    Any new fuel in California requires approval by the state agencies overseeing human and environmental health. In order to provide a systematic evaluation of new fuel impacts, California now requires a multimedia risk assessment (MMRA) for fuel approval. The fuel MMRA involves all relevant state agencies, including the California Air Resources Board (CARB), the State Water Resources Control Board (SWRCB), the Office of Environmental Health Hazard Assessment (OEHHA), and the Department of Toxic Substances Control (DTSC), all overseen by the California Environmental Protection Agency (CalEPA). The lead agency for MMRAs is CARB. The original law requiring a multimedia assessment is California Health and Safety Code 43830.8. In addition, the low carbon fuel standard (LCFS), the Global Warming Solutions Act (AB32), and the Verified Diesel Emission Control Strategy (VDECS) have provisions that can require a multimedia assessment. In this presentation, I give an overview of the California MMRA for new fuels, which has recently been developed and applied to several alternative fuels. The objective of the California MMRA is to assess the risk of potential impacts of new fuels on multiple environmental media: air, water, and soil. Attaining this objective involves many challenges, including varying levels of uncertainty, relative comparison of incommensurate risk factors, and differing levels of priority assigned to risk factors. The MMRA is based on a strategy of relative risk assessment and flexible accommodation of distinct and diverse fuel formulations. The approach is tiered by design, allowing sequentially more sophisticated investigations as knowledge gaps are identified and re-prioritized by the ongoing research. The assessment also involves peer review, which couples risk assessment with stakeholder investment and provides constructive or confrontational feedback. The multimedia assessment

  19. Color gamut assessment standard: construction, characterization and interlaboratory measurement comparison

    NASA Astrophysics Data System (ADS)

    Libert, John M.; Kelley, Edward F.; Boynton, Paul A.; Brown, Steven W.; Wall, Christine F.; Campbell, Colin

    2003-07-01

    In earlier papers, NIST proposed a standard illumination source and optical filter targets with which to assess the state-of-the-art of display measurement. The Display Measurement Assessment Transfer Standard (DMATS) was designed to present the display metrologist with a rectangular array of targets such as color filters, polarizers, and grilles, back-lighted by uniform illumination, to be measured using methods and instruments typically used in display performance measurement. A "round robin" interlaboratory measurement exercise using the "standard" artifact suite would enable a first order assessment of display measurement reproducibility, i.e., measurement variability within the electronic display community. The rectangular array design of the DMATS was anticipated to present stray light and color contamination challenges to facilitate identification of error sources deriving from measurement protocols, laboratory environment, and equipment. However, complications in dealing with heating problems threatened to delay the planned laboratory intercomparison. The Gamut Assessment Standard (GAS) was thus designed as an interim solution to enable the NIST scientists and participating measurement laboratories to begin collecting data. The GAS consists of a 150 mm diameter integrating sphere standard illumination source with a stray light elimination tube (SLET) mounted at the exit port. A dual six-position filter wheel is mounted at the SLET exit port. One wheel holds a series of neutral density filters and a second interchangeable wheel holds various color filters. This paper describes the design and construction of the GAS, its initial performance characterization by NIST, and comparison measurements made at NPL. Possible design changes suggested by the results of the preliminary intercomparison are discussed, as are plans for future interlaboratory comparisons and potential use of the GAS as a transfer standard for laboratory self-certification.

  20. Assessing Juvenile Salmonid Passage Through Culverts: Field Research in Support of Protocol Development

    SciTech Connect

    Williams, Greg D.; Evans, Nathan R.; Pearson, Walter H.; Southard, John A.

    2001-10-30

    The primary goal of our research this spring/summer was to refine techniques and examine scenarios under which a standardized protocol could be applied to assess juvenile coho salmon (O. kisutch) passage through road culverts. Field evaluations focused on capture-mark-recapture methods that allowed analysis of fish movement patterns, estimates of culvert passability, and potential identification of the cues inducing these movements. At this stage, 0+ age coho salmon fry 30 mm to 65 mm long (fork length) were the species and age class of interest. Ultimately, the protocol will provide rapid, statistically rigorous methods for trained personnel to perform standardized biological assessments of culvert passability for a number of juvenile salmon species. Questions to be addressed by the research include the following: Do hydraulic structures such as culverts restrict habitat for juvenile salmonids? How do existing culverts and retrofits perform relative to juvenile salmonid passage? Do some culvert characteristics and hydraulic conditions provide better passage than others? Does the culvert represent a barrier to certain size classes of fish? Recommendations addressed issues of study site selection, initial capture, marking, recapture/observations, and estimating movement.
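    A passability estimate from such a mark-recapture design might be sketched as a binomial proportion with a confidence interval. The Wilson score interval below is a common choice, not necessarily the estimator the final protocol will adopt, and `passage_efficiency` is a hypothetical name:

```python
import math

def passage_efficiency(marked_released, recaptured_above, z=1.96):
    """Estimate culvert passage efficiency from a mark-recapture trial.

    marked_released: marked fish released downstream of the culvert.
    recaptured_above: of those, the number later detected upstream.
    Returns (point estimate, Wilson score interval), a standard
    binomial confidence interval at ~95% for z = 1.96.
    """
    n, k = marked_released, recaptured_above
    p = k / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return p, (max(0.0, centre - half), min(1.0, centre + half))
```

    With 100 marked fish released and 40 recaptured upstream, the point estimate is 0.40 with an interval of roughly 0.31 to 0.50.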

  1. A proposed protocol for the standardized preparation of PRF membranes for clinical use.

    PubMed

    Kobayashi, Mito; Kawase, Tomoyuki; Horimizu, Makoto; Okuda, Kazuhiro; Wolff, Larry F; Yoshie, Hiromasa

    2012-09-01

    Upon clinical application, thick platelet-rich fibrin (PRF) is usually compressed to fit the implantation site. However, it is speculated that the preservation of platelets and plasma content depends on the compression methods used. To accurately evaluate the clinical outcome of PRF, the preparation protocol should be standardized. Freshly prepared PRF clots were compressed into a thin membrane by our novel PRF compression device. The localization of platelets was examined by SEM and immunostaining. Growth factor levels were evaluated by bioassays and cytokine-antibody array techniques. The angiogenic activity was examined by the chick chorioallantoic membrane assay and the scratch assay using HUVEC cultures. Platelets were concentrated on the surface of the region adjacent to the red thrombus and this region was subjected to the experiments. Compared to the PRF membrane compressed by dry gauze (G-PRF), the preservation of the plasma content, 3D-fibrin meshwork, and platelets was more intact in the compressor-prepared PRF membrane (C-PRF). Among the growth factors tested, C-PRF contained PDGF isoforms at higher levels, and significantly stimulated cell proliferation and neovascularization. C-PRF may be useful for grafting while minimizing the loss of bioactive factors. This C-PRF preparation protocol is proposed as a standardized protocol for PRF membrane preparation.

  2. Addressing Standards and Assessments on the IEP.

    ERIC Educational Resources Information Center

    Thompson, Sandra J.; Thurlow, Martha L.; Esler, Amy; Whetstone, Patti J.

    2001-01-01

    A study that examined state Individualized Education Program (IEP) forms found that out of the 41 with IEP forms, only 5 specifically addressed educational standards on their forms. Thirty-one states addressed the general curriculum on their IEP forms and 30 states listed three or more options for assessment participation. (Contains nine…

  3. Assessing Clinical Judgment Using Standardized Oral Examinations.

    ERIC Educational Resources Information Center

    Bashook, Philip

    This paper describes the use of oral examinations to assess the clinical judgment of aspiring physicians. Oral examinations have been used in U.S. medicine since 1917. Currently, 15 member boards of the American Board of Medical Specialties administer 17 different standardized oral examinations to approximately 10,000 physician candidates…

  4. System Assessment Standards: Defining the Market for Industrial Energy Assessments

    SciTech Connect

    Sheaffer, Paul; McKane, Aimee; Tutterow, Vestal; Crane, Ryan

    2009-08-01

    Improved efficiency of industrial systems (e.g., compressed air or steam) contributes to a manufacturing facility's bottom line, improves reliability, and better utilizes assets. Despite these advantages, many industrial facilities continue to have unrealized system optimization potential. A barrier to realizing this potential is the lack of market definition for system energy efficiency assessment services, creating problems both for service providers in establishing the market value of their services and for consumers in determining the relative quality of these services. On August 19, 2008, the American Society of Mechanical Engineers (ASME) issued four new draft Standards for trial use that are designed to raise the bar and define the market for these services. These draft Standards set the requirements for conducting an energy assessment at an industrial facility for four different system types: compressed air, process heating, pumping, and steam. The Standards address topics such as organizing and conducting assessments; analyzing the data collected; and reporting and documentation. This paper addresses the issues and challenges in developing the Standards and the accompanying Guidance Documents, as well as the results of field testing by industrial facilities, consultants, and utilities during the trial-use period that ended in January 2009. These Standards will be revised and released by ASME for public review, and subsequently submitted for approval as American National Standards for publication in late 2009. Plans for a related activity to establish a professional-level program to certify practitioners in the area of system assessments, opportunities to integrate the ASME Standards with related work on industrial energy efficiency, and plans to expand the system assessment Standard portfolio are also discussed.

  5. Systematic Review Checklist: A Standardized Technique for Assessing and Reporting Reviews of Life Cycle Assessment Data

    PubMed Central

    Zumsteg, Jennifer M.; Cooper, Joyce S.; Noon, Michael S.

    2015-01-01

    Summary Systematic review, including meta-analysis, is increasingly utilized in life cycle assessment (LCA). There are currently no widely recognized guidelines for designing, conducting, or reporting systematic reviews in LCA. Other disciplines such as medicine, ecology, and software engineering have both recognized the utility of systematic reviews and created standardized protocols for conducting and reporting systematic reviews. Based largely on the 2009 Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement, which updated the preferred format for reporting of such reviews in biomedical research, we provide an introduction to the topic and a checklist to guide the reporting of future LCA reviews in a standardized format. The standardized technique for assessing and reporting reviews of LCA (STARR-LCA) checklist is a starting point for improving the utility of systematic reviews in LCA. PMID:26069437

  6. EPA Traceability Protocol for Assay and Certification of Gaseous Calibration Standards (EPA/600/R-12/531, May 2012)

    EPA Science Inventory

    In 1997, the U.S. Environmental Protection Agency (EPA) in Research Triangle Park, North Carolina, revised its 1993 version of its traceability protocol for the assay and certification of compressed gas and permeation-device calibration standards. The protocol allows producers of...

  7. Developing Korean Standard for Nanomaterial Exposure Assessment

    PubMed Central

    Lee, Ji Hyun; Lee, Jun Yeob

    2011-01-01

    Nanotechnology is now applied to many industries, resulting in wide range of nanomaterial-containing products, such as electronic components, cosmetic, medicines, vehicles, and home appliances. Nanoparticles can be released throughout the life cycle of nanoproducts, including the manufacture, consumer use, and disposal, thereby involving workers, consumers, and the environment in potential exposure. However, there is no current consensus on the best sampling method for characterizing manufactured-nanoparticle exposure. Therefore, this report aims to provide a standard method for assessing nanoparticle exposure, including the identification of nanoparticle emission, the assessment of worker exposure, and the evaluation of exposure mitigation actions in nanomaterial-handling workplaces or research institutes. PMID:24278552

  8. Standardized mental status assessment of sports concussion.

    PubMed

    McCrea, M

    2001-07-01

    Neurocognitive status is often considered the domain of neurologic functioning most sensitive to change following concussion, but the effects are often subtle and difficult to detect on routine clinical examination. Recent efforts have focused on the development of brief, standardized methods of mental status assessment for use by sports medicine clinicians to quantify the acute neurocognitive effects of concussion and objectively track postinjury recovery. Research has demonstrated the reliability, validity, and sensitivity of these measures in detecting concussion in athletes and providing empirical data for consideration in the context of other examination findings, neuropsychologic test data, and neuroimaging results. Standardized measures of mental status and other postconcussive symptoms are valuable tools to assist clinicians in the assessment and management of concussion, but should not be used as a replacement for medical evaluation or viewed as a stand-alone means for determining readiness to return to competition after injury.

  9. Histopathological alterations of the heart in fish: proposal for a standardized assessment.

    PubMed

    Steinbach, Christoph; Kroupová, Hana Kocour; Wahli, Thomas; Klicnarová, Jana; Schmidt-Posthaus, Heike

    2016-03-30

    Histopathological alterations in the heart are often reported in fish as a result of exposure to a variety of chemical compounds. However, researchers presently lack a standardized method for the evaluation of histopathological alterations in the cardiovascular system of fish and the calculation of an 'organ index'. Therefore, we designed a method for a standardized assessment and evaluation of histopathological alterations in the heart of fish. As a model species, we used rainbow trout Oncorhynchus mykiss, but the protocol was also successfully applied to other fish species belonging to different taxonomic orders. To test the protocol, we re-evaluated sections of atenolol-exposed and unexposed rainbow trout obtained in a previous study. The results were in accordance with those previously published, demonstrating the applicability of the protocol. The protocol provides a universal method for the comparative evaluation of histopathological changes in the heart of fish. PMID:27025306
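    An 'organ index' of the kind mentioned above is often computed as a weighted sum of lesion scores, as in the widely used Bernet et al. scheme for fish histopathology. The sketch below assumes that scheme, which may differ in detail from the weighting this paper's protocol uses:

```python
def organ_index(findings):
    """Heart (organ) index as a weighted sum of lesion scores,
    assuming a Bernet-style scheme (an assumption for this sketch).

    findings: dict mapping alteration name -> (score, importance),
    where score grades the extent of the alteration (0-6) and
    importance grades its pathological significance (1-3).
    """
    return sum(score * importance for score, importance in findings.values())
```

    For example, a moderate myocarditis (score 2, importance 2) plus a minimal haemorrhage (score 1, importance 1) gives an index of 5.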

  10. Development of a Canadian Standardized Protocol for Subtyping Methicillin-Resistant Staphylococcus aureus Using Pulsed-Field Gel Electrophoresis

    PubMed Central

    Mulvey, M. R.; Chui, L.; Ismail, J.; Louie, L.; Murphy, C.; Chang, N.; Alfa, M.

    2001-01-01

    A panel of 24 methicillin-resistant Staphylococcus aureus strains was distributed to 15 laboratories in Canada to evaluate their in-house pulsed-field gel electrophoresis (PFGE) protocols and interpretation criteria. Attempts to compare fingerprint images using computer-aided analysis were not successful due to variability in individual laboratory PFGE protocols. In addition, individual site interpretation of the fingerprint patterns was inadequate, as 7 of 13 sites (54%) made at least one error in interpreting the fingerprints from the panel. A 2-day standardized PFGE protocol (culture to gel image) was developed and distributed to all of the sites. Each site was requested to use the standardized protocol on five strains from the original panel. Thirteen sites submitted gel images for comparisons. The protocol demonstrated excellent reproducibility and allowed interlaboratory comparisons with Molecular Analyst DST software (Bio-Rad) and 1.5% band tolerance. PMID:11574559
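    A band-matching comparison with a 1.5% position tolerance, as used in the study, can be sketched as a Dice-coefficient calculation over band positions. The greedy matcher below is illustrative only; commercial packages such as Molecular Analyst may pair bands differently:

```python
def dice_similarity(bands_a, bands_b, tolerance=0.015):
    """Compare two PFGE fingerprints given as lists of band positions
    (normalized migration distances). Bands whose positions differ by
    at most `tolerance` (1.5% by default) are counted as matching.
    Returns the Dice coefficient: 2 * matches / (len(a) + len(b)).
    """
    unmatched_b = sorted(bands_b)
    matches = 0
    for a in sorted(bands_a):
        for i, b in enumerate(unmatched_b):
            if abs(a - b) <= tolerance:
                matches += 1
                del unmatched_b[i]  # each band pairs at most once
                break
    total = len(bands_a) + len(bands_b)
    return 2 * matches / total if total else 1.0
```

    Identical fingerprints score 1.0; fingerprints with no bands within tolerance score 0.0.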

  11. 76 FR 16250 - Planning Resource Adequacy Assessment Reliability Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-23

    ... Energy Regulatory Commission 18 CFR Part 40 Planning Resource Adequacy Assessment Reliability Standard... Standard, BAL-502-RFC-02 (Planning Resource Adequacy Analysis, Assessment and ] Documentation), developed... Reliability Corporation. The approved regional Reliability Standard requires planning coordinators within...

  12. Consensus recommendations for a standardized Brain Tumor Imaging Protocol in clinical trials

    PubMed Central

    Ellingson, Benjamin M.; Bendszus, Martin; Boxerman, Jerrold; Barboriak, Daniel; Erickson, Bradley J.; Smits, Marion; Nelson, Sarah J.; Gerstner, Elizabeth; Alexander, Brian; Goldmacher, Gregory; Wick, Wolfgang; Vogelbaum, Michael; Weller, Michael; Galanis, Evanthia; Kalpathy-Cramer, Jayashree; Shankar, Lalitha; Jacobs, Paula; Pope, Whitney B.; Yang, Dewen; Chung, Caroline; Knopp, Michael V.; Cha, Soonme; van den Bent, Martin J.; Chang, Susan; Al Yung, W.K.; Cloughesy, Timothy F.; Wen, Patrick Y.; Gilbert, Mark R.

    2015-01-01

    A recent joint meeting was held on January 30, 2014, with the US Food and Drug Administration (FDA), National Cancer Institute (NCI), clinical scientists, imaging experts, pharmaceutical and biotech companies, clinical trials cooperative groups, and patient advocate groups to discuss imaging endpoints for clinical trials in glioblastoma. This workshop developed a set of priorities and action items including the creation of a standardized MRI protocol for multicenter studies. The current document outlines consensus recommendations for a standardized Brain Tumor Imaging Protocol (BTIP), along with the scientific and practical justifications for these recommendations, resulting from a series of discussions between various experts involved in aspects of neuro-oncology neuroimaging for clinical trials. The minimum recommended sequences include: (i) parameter-matched precontrast and postcontrast inversion recovery-prepared, isotropic 3D T1-weighted gradient-recalled echo; (ii) axial 2D T2-weighted turbo spin-echo acquired after contrast injection and before postcontrast 3D T1-weighted images to control timing of images after contrast administration; (iii) precontrast, axial 2D T2-weighted fluid-attenuated inversion recovery; and (iv) precontrast, axial 2D, 3-directional diffusion-weighted images. Recommended ranges of sequence parameters are provided for both 1.5 T and 3 T MR systems. PMID:26250565

  13. An assessment of cleaning regimes and standards in butchers' shops.

    PubMed

    Worsfold, D; Griffith, C J

    2001-09-01

    Cleaning regimes and standards in retail butchers' shops taking part in the Accelerated HACCP project initiative were assessed by means of visual inspection, examination of cleaning schedules, and ATP bioluminescence assays of selected food and hand contact sites. There was a wide variation in surface ATP results, both within and between butchers' shops, but overall they indicated that food and hand contact surfaces were heavily soiled during food production and service. Although separate preparation equipment/utensils were provided, staff undertook raw and cooked product handling throughout the day, with the concomitant danger of contaminating hand and food contact surfaces. The extent of soiling was generally underestimated when assessed visually, the technique used most commonly by the food retail trade and inspection authorities. Periodic or interim cleaning practices produced a significant improvement in cleanliness as assessed visually and with the ATP assay; however, these results were generally less satisfactory than those obtained by the use of best-practice protocols. A lack of written cleaning schedules and records, of training in the correct use of cleaning products, and of awareness of the importance of cleaning hand contact sites were identified as common defects. The results are discussed in relation to the establishment of an effective HACCP system, and recommendations for improving cleaning standards are given.

  14. PROFILE: Environmental Impact Assessment Under the National Environmental Policy Act and the Protocol on Environmental Protection to the Antarctic Treaty.

    PubMed

    Ensminger; McCold; Webb

    1999-07-01

    for activities undertaken by all Parties in Antarctica. The Protocol gives clear and strong guidance for protection of specific, valued antarctic environmental resources including intrinsic wilderness and aesthetic values, and the value of Antarctica as an area for scientific research. That guidance requires a higher standard of environmental protection for Antarctica than is required in other parts of the world. This paper shows that, taken together, NEPA and the Protocol call for closer examination of proposed actions and a more rigorous consideration of environmental impacts than either would alone. Three areas are identified where the EIA provisions of the Protocol could be strengthened to improve its effectiveness. First, the thresholds defined by the Protocol need to be clarified. Specifically, the meanings of the terms "minor" and "transitory" are not clear in the context of the Protocol. The use of "or" in the phrase "minor or transitory" further confuses the meaning. Second, cumulative impact assessment is called for by the Protocol but is not defined. A clear definition could reduce the chance that cumulative impacts would be given inadequate consideration. Finally, the public has limited opportunities to comment on or influence the preparation of initial or comprehensive environmental evaluations. Experience has shown that public input to environmental documents has a considerable influence on agency decision making and the quality of EIA that agencies perform. KEY WORDS: Environment; Impact assessment; Antarctica; NEPA; Protocol; Antarctic Treaty. http://link.springer-ny.com/link/service/journals/00267/bibs/24n1p13.html

  15. A comparison of single and multiple stressor protocols to assess acute stress in a coastal shark species, Rhizoprionodon terraenovae.

    PubMed

    Hoffmayer, Eric R; Hendon, Jill M; Parsons, Glenn R; Driggers, William B; Campbell, Matthew D

    2015-10-01

    Elasmobranch stress responses are traditionally measured in the field by either singly or serially sampling an animal after a physiologically stressful event. Although capture and handling techniques are effective at inducing a stress response, differences in protocols could affect the degree of stress experienced by an individual, making meaningful comparisons between the protocols difficult, if not impossible. This study acutely stressed Atlantic sharpnose sharks, Rhizoprionodon terraenovae, by standardized capture (rod and reel) and handling methods and implemented either a single or serial blood sampling protocol to monitor four indicators of the secondary stress response. Single-sampled sharks were hooked and allowed to swim around the boat until retrieved for a blood sample at either 0, 15, 30, 45, or 60 min post-hooking. Serially sampled sharks were retrieved, phlebotomized, released while still hooked, and subsequently resampled at 15, 30, 45, and 60 min intervals post-hooking. Blood was analyzed for hematocrit, and plasma glucose, lactate, and osmolality levels. Although both single and serial sampling protocols resulted in an increase in glucose, no significant difference in glucose level was found between protocols. Serially sampled sharks exhibited cumulatively heightened levels for lactate and osmolality at all time intervals when compared to single-sampled animals at the same time. Maximal concentration differences of 217.5, 9.8, and 41.6 % were reported for lactate, osmolality, and glucose levels, respectively. Hematocrit increased significantly over time for the single sampling protocol but did not change significantly during the serial sampling protocol. The differences in resultant blood chemistry levels between implemented stress protocols and durations are significant and need to be considered when assessing stress in elasmobranchs.
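The protocol comparison above reduces to simple arithmetic on matched time points. A minimal sketch, with invented lactate values standing in for the study's data:

```python
# Sketch of the comparison above: the largest percent difference between
# serially and singly sampled stress markers at matched time points.
# The lactate values below are invented, not the study's data.

def max_percent_difference(serial, single):
    """Largest pairwise percent difference of serial vs single values."""
    return max(100.0 * (s - g) / g for s, g in zip(serial, single))

# Hypothetical plasma lactate (mmol/L) at 15, 30, 45, 60 min post-hooking
serial_lactate = [2.0, 6.5, 9.2, 11.4]
single_lactate = [2.0, 3.1, 4.0, 4.3]

print(round(max_percent_difference(serial_lactate, single_lactate), 1))
```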

  16. Enrichment-induced exercise to quantify the effect of different housing conditions: a tool to standardize enriched environment protocols.

    PubMed

    Xie, Hongyu; Wu, Yi; Jia, Jie; Liu, Gang; Zhang, Qi; Yu, Kewei; Guo, Zhenzhen; Shen, Li; Hu, Ruiping

    2013-07-15

    Enriched environments (EE) have been used for a long time to promote recovery in many neurological disorders; however, a growing body of inconsistent results strongly calls for a rigorous standardization of experimental EE paradigms. Although some core principles are well accepted as standards, a method to quantitatively assess the complexity of EE in various experimental designs is still lacking. In this study, we tracked and recorded the physical exercise of rats in four housing conditions, namely isolated condition, social condition, novel condition and EE. Then, we analyzed whether, and to what extent, enrichment-induced exercise reflected the degree of enrichment. We next examined rat exercise in a conventional environment condition and under different light intensities, to explore whether environment-related exercise could be considered a parameter to quantify the degree of enrichment. The results obtained showed that (1) both inanimate and social stimulations enhanced the exercise level and (2) EE combined the effects of the two stimulations. Furthermore, exercise durability, which correlated positively with degree of enrichment, was an objective measure of different housing conditions. Exercise-related parameters also sensitively reflected the impacts of light intensity even in the same enrichment arrangements. Our results indicate that there is a direct and measurable correlation between degree of environmental enrichment and enrichment-induced exercise, and therefore enrichment-induced exercise could be used as a helpful tool to evaluate the degree of housing conditions and to standardize EE protocols.

  17. Improved (and Singular) Disinfectant Protocol for Indirectly Assessing Organic Precursor Concentrations of Trihalomethanes and Dihaloacetonitriles.

    PubMed

    Do, Thien D; Chimka, Justin R; Fairey, Julian L

    2015-08-18

    Measurements of disinfection byproduct (DBP) organic precursor concentrations (OPCs) are crucial to assess and improve DBP control processes. Typically, formation potential tests, specified in Standard Methods (SM) 5710-B/D, are used to measure OPCs. Here, we highlight several limitations of this protocol for dihaloacetonitriles (DHANs) and trihalomethanes and validate a novel Alternative Method (AM). The effects of pH, disinfectant type (free chlorine and monochloramine), and chlor(am)ine residual (CR) were examined on DBP formation in a suite of waters. Using the SM, DHAN formation decreased 43-47% as the CR increased from 3 to 5 mg L(-1) as Cl2, compromising OPC assessments. In contrast, a high monochloramine dose (250 mg L(-1) as Cl2) at pH 7.0 (the AM) accurately reflected OPCs. The two methods were compared for assessing DBP precursor removal through three granular activated carbon (GAC) columns in series. Only the breakthrough profiles assessed using the AM showed DBP precursor sorption in each column that decreased over time (p = 0.0001). Similarly, the AM facilitated ranking of three types of GAC compared in parallel columns, whereas the SM produced ambiguous results. Fluorescence intensity of a humic-like fluorophore (i.e., I345/425) correlated strongly with precursor removal in the GAC columns. The practical implications of the results are discussed.

  19. Clinical skills assessment with standardized patients.

    PubMed

    Gómez, J M; Prieto, L; Pujol, R; Arbizu, T; Vilar, L; Pi, F; Borrell, F; Roma, J; Martínez-Carretero, J M

    1997-03-01

    Previous projects (Combell I & II) to assess clinical skills were conducted in medical schools in Catalonia, in order to introduce a model of such an assessment using standardized patients (SP). The aim of this study (Combell III) was to measure selected characteristics of our model. Seventy-three medical students in the final year at the Bellvitge teaching unit of the University of Barcelona participated in a clinical skills assessment (CSA) project that used 10 SP cases. The mean group scores for the four components of clinical skills for each day of testing were studied, and ratings for each student in the 10 sequential encounters were checked. The study also compared the clinical skills scores with their academic grades. The total case mean score (mean score of history-taking, physical examination and patient notes scores) was 51.9%, and the mean score for communication skills was 63.6%. The clinical skills scores over the 8 testing days showed no day-to-day differences. The study did not find differences among the sequential encounters for each student (training effect). There was a lack of correlation between clinical skills scores and academic grades. The project demonstrated the feasibility of the method for assessing clinical skills, confirmed its reliability, and showed that there is no correlation between scores with this method and academic examinations that mainly reflect knowledge.

  20. Standardized accuracy assessment of the calypso wireless transponder tracking system.

    PubMed

    Franz, A M; Schmitt, D; Seitel, A; Chatrasingh, M; Echner, G; Oelfke, U; Nill, S; Birkfellner, W; Maier-Hein, L

    2014-11-21

    Electromagnetic (EM) tracking allows localization of small EM sensors in a magnetic field of known geometry without line-of-sight. However, this technique requires a cable connection to the tracked object. A wireless alternative based on magnetic fields, referred to as transponder tracking, has been proposed by several authors. Although most of the transponder tracking systems are still in an early stage of development and not ready for clinical use yet, Varian Medical Systems Inc. (Palo Alto, California, USA) presented the Calypso system for tumor tracking in radiation therapy which includes transponder technology. But it has not been used for computer-assisted interventions (CAI) in general or been assessed for accuracy in a standardized manner, so far. In this study, we apply a standardized assessment protocol presented by Hummel et al (2005 Med. Phys. 32 2371-9) to the Calypso system for the first time. The results show that transponder tracking with the Calypso system provides a precision and accuracy below 1 mm in ideal clinical environments, which is comparable with other EM tracking systems. Similar to other systems the tracking accuracy was affected by metallic distortion, which led to errors of up to 3.2 mm. The potential of the wireless transponder tracking technology for use in many future CAI applications can be regarded as extremely high.
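A Hummel-style assessment reports two separate quantities: precision (the jitter of repeated measurements of a static sensor) and accuracy (the error in a measured distance between known positions on a reference board, 50 mm apart in the original protocol). A minimal sketch with invented readings, not Calypso measurements:

```python
import statistics

# Sketch of the two quantities a Hummel-style tracking assessment reports.
# All readings below are invented, not Calypso measurements.

def precision_mm(samples):
    """RMS deviation of repeated 1-D position samples from their mean."""
    mu = statistics.fmean(samples)
    return (sum((s - mu) ** 2 for s in samples) / len(samples)) ** 0.5

def distance_error_mm(measured, true_distance=50.0):
    """Absolute error of a measured inter-position distance."""
    return abs(measured - true_distance)

jitter = precision_mm([10.02, 9.98, 10.01, 9.99, 10.00])  # mm readings
err = distance_error_mm(50.4)  # measured 50.4 mm for a 50 mm hop
print(round(jitter, 3), round(err, 3))
```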

  1. Standardized accuracy assessment of the calypso wireless transponder tracking system

    NASA Astrophysics Data System (ADS)

    Franz, A. M.; Schmitt, D.; Seitel, A.; Chatrasingh, M.; Echner, G.; Oelfke, U.; Nill, S.; Birkfellner, W.; Maier-Hein, L.

    2014-11-01

    Electromagnetic (EM) tracking allows localization of small EM sensors in a magnetic field of known geometry without line-of-sight. However, this technique requires a cable connection to the tracked object. A wireless alternative based on magnetic fields, referred to as transponder tracking, has been proposed by several authors. Although most of the transponder tracking systems are still in an early stage of development and not ready for clinical use yet, Varian Medical Systems Inc. (Palo Alto, California, USA) presented the Calypso system for tumor tracking in radiation therapy which includes transponder technology. But it has not been used for computer-assisted interventions (CAI) in general or been assessed for accuracy in a standardized manner, so far. In this study, we apply a standardized assessment protocol presented by Hummel et al (2005 Med. Phys. 32 2371-9) to the Calypso system for the first time. The results show that transponder tracking with the Calypso system provides a precision and accuracy below 1 mm in ideal clinical environments, which is comparable with other EM tracking systems. Similar to other systems the tracking accuracy was affected by metallic distortion, which led to errors of up to 3.2 mm. The potential of the wireless transponder tracking technology for use in many future CAI applications can be regarded as extremely high.

  3. Audit of a 5-year radiographic protocol for assessment of mandibular third molars before surgical intervention

    PubMed Central

    Schou, S; Christensen, J; Hintze, H; Wenzel, A

    2014-01-01

    Objectives: To perform an audit of a three-step protocol for radiographic examination of mandibular third molars before surgery. Methods: 1769 teeth underwent surgery. A standardized three-step radiographic protocol was followed: (1) panoramic imaging (PAN), (2) stereoscanography (SCAN) and (3) CBCT. If there was overprojection between the tooth and the canal in PAN, SCAN was performed. If the tooth was determined to be in close contact with the canal in SCAN, CBCT was performed. Close contact between the tooth and the canal was assessed in all images, and patient-reported sensory disturbances from the alveolar inferior nerve were recorded after surgery. The relation between the final radiographic examination and sensory disturbances was determined. Logistic regression analysis tested whether signs for a close contact in PAN/SCAN could predict no bony separation between the tooth and canal in CBCT. Results: 46% of teeth underwent PAN, 31% underwent SCAN and 23% underwent CBCT as the final examination. 21% underwent all three radiographic examinations. 53/76% of teeth with close relation to the canal in PAN/SCAN showed no bony separation in CBCT; if there was close relation in PAN/SCAN, there was 1.6/4.3 times higher probability that no bony separation existed in CBCT. 16 cases of sensory disturbances were recorded: 4 operations were based on PAN, 8 on SCAN and 4 on CBCT. Conclusions: The radiographic protocol was in general followed. SCAN was superior to PAN in predicting no bony separation between the tooth and the canal in CBCT, and there was no relation between sensory disturbances and radiographic method. PMID:25216077
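The "1.6/4.3 times higher probability" figures quoted above are risk ratios. The computation can be sketched with hypothetical counts (these are not the study's data):

```python
# Sketch of a risk ratio: how many times more likely "no bony separation
# in CBCT" is when PAN/SCAN shows a close relation. Counts are invented.

def risk_ratio(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Probability ratio of the outcome in exposed vs unexposed groups."""
    return (events_exposed / n_exposed) / (events_unexposed / n_unexposed)

# Hypothetical: 76 of 100 teeth with a close relation in SCAN had no bony
# separation in CBCT, vs 18 of 100 teeth without a close relation
rr = risk_ratio(76, 100, 18, 100)
print(round(rr, 1))
```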

  4. Competitive PCR-ELISA protocols for the quantitative and the standardized detection of viral genomes.

    PubMed

    Musiani, Monica; Gallinella, Giorgio; Venturoli, Simona; Zerbini, Marialuisa

    2007-01-01

    Competitive PCR-ELISA combines competitive PCR with an ELISA to allow quantitative detection of PCR products. It is based on the inclusion of an internal standard competitor molecule that is designed to differ from the target by a short sequence of nucleotides. Once such a competitor molecule has been designed and constructed, target and competitor sequences are concurrently PCR-amplified, before hybridization to two different specific probes and determination of their respective OD values by ELISA. The target can be quantified in relation to a titration curve of different dilutions of the competitor. The competitor can alternatively be used at a unique optimal concentration to allow for standardized detection of the target sequence. PCR-ELISA can be performed in 1 d in laboratories without access to a real-time PCR thermocycler. This technique is applied in diagnostics to monitor the course of infections and drug efficacy. Competitive PCR-ELISA protocols for the quantitative and for the standardized detection of parvovirus B19 are detailed here as an example of the technique.
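One way to picture the quantification step above: the target concentration is read off as the competitor input at which target and competitor probe signals are equal. A sketch under assumed, illustrative OD readings (the function name, dilution series, and OD values are invented for this example):

```python
import math

# Sketch of the competitive PCR-ELISA readout: estimate the target as the
# competitor input where the two probe ODs are equal (log-ratio crosses
# zero). All numbers are illustrative.

def equivalence_point(competitor_copies, od_target, od_competitor):
    """Interpolate, in log space, the competitor input at which
    OD_target equals OD_competitor."""
    ratios = [math.log10(t / c) for t, c in zip(od_target, od_competitor)]
    for i in range(len(ratios) - 1):
        if ratios[i] >= 0 >= ratios[i + 1]:
            x0 = math.log10(competitor_copies[i])
            x1 = math.log10(competitor_copies[i + 1])
            f = ratios[i] / (ratios[i] - ratios[i + 1])
            return 10 ** (x0 + f * (x1 - x0))
    raise ValueError("no crossing in titration range")

# Ten-fold competitor dilution series (copies per reaction) and paired ODs
copies = [1e2, 1e3, 1e4, 1e5]
od_t = [1.20, 0.90, 0.30, 0.05]  # target-probe OD falls as competitor rises
od_c = [0.10, 0.30, 0.90, 1.10]  # competitor-probe OD rises

print(equivalence_point(copies, od_t, od_c))
```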

  5. A Systematic Review of Protocols for the Three-Dimensional Morphologic Assessment of Abdominal Aortic Aneurysms Using Computed Tomographic Angiography

    SciTech Connect

    Ghatwary, Tamer M. H.; Patterson, Benjamin O.; Karthikesalingam, Alan; Hinchliffe, Robert J.; Loftus, Ian M.; Morgan, Robert; Thompson, Matt M.; Holt, Peter J. E.

    2013-02-15

    The morphology of infrarenal abdominal aortic aneurysms (AAAs) directly influences the perioperative outcome and long-term durability of endovascular aneurysm repair. A variety of methods have been proposed for the characterization of AAA morphology using reconstructed three-dimensional (3D) computed tomography (CT) images. At present, there is lack of consensus as to which of these methods is most applicable to clinical practice or research. The purpose of this review was to evaluate existing protocols that used 3D CT images in the assessment of various aspects of AAA morphology. An electronic search was performed, from January 1996 to the end of October 2010, using the Embase and Medline databases. The literature review conformed to PRISMA statement standards. The literature search identified 604 articles, of which 31 studies met inclusion criteria. Only 15 of 31 studies objectively assessed reproducibility. Existing published protocols were insufficient to define a single evidence-based methodology for preoperative assessment of AAA morphology. Further development and expert consensus are required to establish a standardized and validated protocol to determine precisely how morphology relates to outcomes after endovascular aneurysm repair.

  6. Quantitative sensory testing in the German Research Network on Neuropathic Pain (DFNS): standardized protocol and reference values.

    PubMed

    Rolke, R; Baron, R; Maier, C; Tölle, T R; Treede, R-D; Beyer, A; Binder, A; Birbaumer, N; Birklein, F; Bötefür, I C; Braune, S; Flor, H; Huge, V; Klug, R; Landwehrmeyer, G B; Magerl, W; Maihöfner, C; Rolko, C; Schaub, C; Scherens, A; Sprenger, T; Valet, M; Wasserka, B

    2006-08-01

    The nationwide multicenter trials of the German Research Network on Neuropathic Pain (DFNS) aim to characterize the somatosensory phenotype of patients with neuropathic pain. For this purpose, we have implemented a standardized quantitative sensory testing (QST) protocol giving a complete profile for one region within 30 min. To judge plus or minus signs in patients, we have now established age- and gender-matched absolute and relative QST reference values from 180 healthy subjects, assessed bilaterally over face, hand and foot. We determined thermal detection and pain thresholds including a test for paradoxical heat sensations, mechanical detection thresholds to von Frey filaments and a 64 Hz tuning fork, mechanical pain thresholds to pinprick stimuli and blunt pressure, stimulus/response functions for pinprick and dynamic mechanical allodynia, and pain summation (wind-up ratio). QST parameters were region specific and age dependent. Pain thresholds were significantly lower in women than men. Detection thresholds were generally independent of gender. Reference data were normalized to the specific group means and variances (region, age, gender) by calculating z-scores. Due to confidence limits close to the respective limits of the possible data range, heat hypoalgesia, cold hypoalgesia, and mechanical hyperesthesia can hardly be diagnosed. Nevertheless, these parameters can be used for group comparisons. Sensitivity is enhanced by side-to-side comparisons by a factor ranging from 1.1 to 2.5. Relative comparisons across body regions do not offer advantages over absolute reference values. Application of this standardized QST protocol in patients and human surrogate models will allow underlying mechanisms to be inferred from somatosensory phenotypes.
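The z-score normalization described above can be sketched as follows; the reference mean and SD are invented for illustration, not DFNS values (most thresholds are log-transformed before z-scoring):

```python
import math

# Sketch of a DFNS-style normalization: a patient's QST value is expressed
# as a z-score against the matched reference group (region, age, gender).
# Reference mean/SD below are invented, not DFNS values.

def qst_z(value, ref_mean, ref_sd, log_transform=True):
    """z-score of a QST parameter relative to its reference group."""
    if log_transform:
        value = math.log10(value)
    return (value - ref_mean) / ref_sd

# Hypothetical mechanical detection threshold of 8 mN against a reference
# group with log10-threshold mean 0.50 and SD 0.25
z = qst_z(8.0, ref_mean=0.50, ref_sd=0.25)
print(round(z, 2))  # positive z: higher threshold than the reference mean
```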

  7. Non-Intrusive Load Monitoring Assessment: Literature Review and Laboratory Protocol

    SciTech Connect

    Butner, R. Scott; Reid, Douglas J.; Hoffman, Michael G.; Sullivan, Greg; Blanchard, Jeremy

    2013-07-01

    To evaluate the accuracy of NILM technologies, a literature review was conducted to identify any test protocols or standardized testing approaches currently in use. The literature review indicated that no consistent conventions were currently in place for measuring the accuracy of these technologies. Consequently, PNNL developed a testing protocol and metrics to provide the basis for quantifying and analyzing the accuracy of commercially available NILM technologies. This report discusses the results of the literature review and the proposed test protocol and metrics in more detail.
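As one concrete example of the kind of metric such a protocol can standardize, per-appliance F1 of ON/OFF state detection against ground-truth submetering is common in the NILM literature (this is a generic illustrative metric, not necessarily the PNNL protocol's exact definition):

```python
# Sketch of a common NILM accuracy metric: F1 of detected ON states per
# time interval against ground-truth submetering (1 = ON). Illustrative
# only; not necessarily the PNNL protocol's metric.

def f1_state_detection(truth, detected):
    """F1 score of ON/OFF state detection over matched intervals."""
    tp = sum(1 for t, d in zip(truth, detected) if t == 1 and d == 1)
    fp = sum(1 for t, d in zip(truth, detected) if t == 0 and d == 1)
    fn = sum(1 for t, d in zip(truth, detected) if t == 1 and d == 0)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

truth    = [1, 1, 1, 0, 0, 1, 0, 0]
detected = [1, 1, 0, 0, 1, 1, 0, 0]
print(f1_state_detection(truth, detected))  # 0.75
```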

  8. Standardization and Optimization of Computed Tomography Protocols to Achieve Low-Dose

    PubMed Central

    Chin, Cynthia; Cody, Dianna D.; Gupta, Rajiv; Hess, Christopher P.; Kalra, Mannudeep K.; Kofler, James M.; Krishnam, Mayil S.; Einstein, Andrew J.

    2014-01-01

    The increase in radiation exposure due to CT scans has been of growing concern in recent years. CT scanners differ in their capabilities and various indications require unique protocols, but there remains room for standardization and optimization. In this paper we summarize approaches to reduce dose, as discussed in lectures comprising the first session of the 2013 UCSF Virtual Symposium on Radiation Safety in Computed Tomography. The experience of scanning at low dose in different body regions, for both diagnostic and interventional CT procedures, is addressed. An essential primary step is justifying the medical need for each scan. General guiding principles for reducing dose include tailoring a scan to a patient, minimizing scan length, use of tube current modulation, minimizing tube current, minimizing tube potential, iterative reconstruction, and periodic review of CT studies. Organized efforts for standardization have been spearheaded by professional societies such as the American Association of Physicists in Medicine. Finally, all team members should demonstrate an awareness of the importance of minimizing dose. PMID:24589403

  9. Using standard treatment protocols to manage costs and quality of hospital services.

    PubMed

    Meyer, J W; Feingold, M G

    1993-06-01

    The current health care environment has made it critically important that hospital costs and quality be managed in an integrated fashion. Promised health care reforms are expected to make cost reduction and quality enhancement only more important. Traditional methods of hospital cost and quality control have largely been replaced by such approaches as practice parameters, outcomes measurement, clinical indicators, clinical paths, benchmarking, patient-centered care, and a focus on patient selection criteria. This Special Report describes an integrated process for strategically managing costs and quality simultaneously, incorporating key elements of many important new quality and cost control tools. By using a multidisciplinary group process to develop standard treatment protocols, hospitals and their medical staffs address the most important services provided within major product lines. Using both clinical and financial data, groups of physicians, nurses, department managers, financial analysts, and administrators redesign key patterns of care within their hospital, incorporating the best practices of their own and other institutions. The outcome of this process is a new, standardized set of clinical guidelines that reduce unnecessary variation in care, eliminate redundant interventions, establish clear lines of communication for all caregivers, and reduce the cost of each stay. The hospital, medical staff, and patients benefit from the improved opportunities for managed care contracting, more efficient hospital systems, consensus-based quality measures, and reductions in the cost of care. STPs offer a workable and worthwhile approach to positioning the hospital of the 1990s for operational efficiency and cost and quality competitiveness.

  10. Using GLOBE Plant Phenology Protocols To Meet the "National Science Education Standards."

    ERIC Educational Resources Information Center

    Bombaugh, Ruth; Sparrow, Elena; Mal, Tarun

    2003-01-01

    Describes how high school biology teachers can use the Global Learning and Observations to Benefit the Environment (GLOBE) program protocols and data in their classrooms. Includes background information on plant phenology, an overview of GLOBE phenology protocols and materials, and implications for protocols with both deciduous trees and grasses…

  11. A Protocol for the Global Sensitivity Analysis of Impact Assessment Models in Life Cycle Assessment.

    PubMed

    Cucurachi, S; Borgonovo, E; Heijungs, R

    2016-02-01

    The life cycle assessment (LCA) framework has established itself as the leading tool for the assessment of the environmental impact of products. Several works have established the need for integrating the LCA and risk analysis methodologies, due to their several common aspects. One way to reach such integration is to guarantee that uncertainties in LCA modeling are carefully treated. It has been claimed that more attention should be paid to quantifying the uncertainties present in the various phases of LCA. Though the topic has been attracting increasing attention of practitioners and experts in LCA, there is still a lack of understanding and a limited use of the available statistical tools. In this work, we introduce a protocol to conduct global sensitivity analysis in LCA. The article focuses on the life cycle impact assessment (LCIA), and particularly on the relevance of global techniques for the development of trustworthy impact assessment models. We use a novel characterization model developed for the quantification of the impacts of noise on humans as a test case. We show that global SA is fundamental to guarantee that the modeler has a complete understanding of (i) the structure of the model and (ii) the importance of uncertain model inputs and the interaction among them.
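The core computation of a variance-based global sensitivity analysis can be sketched on a toy model; the model Y = 3*X1 + X2, the sample size, and the seed are all illustrative (this is not the paper's noise characterization model). For this linear model the analytic first-order index of X1 is 9/10:

```python
import random

# Sketch of a first-order Sobol' index estimated with the Saltelli
# pick-freeze scheme, on a toy impact model Y = 3*X1 + X2 with independent
# standard-normal inputs (analytic index for X1: 9/10).

random.seed(1)

def model(x1, x2):
    return 3.0 * x1 + x2

N = 50_000
A = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]
B = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]

y_a = [model(x1, x2) for x1, x2 in A]
y_b = [model(x1, x2) for x1, x2 in B]
# AB1 = matrix A with its first column taken from B ("freeze" all but X1)
y_ab1 = [model(b[0], a[1]) for a, b in zip(A, B)]

mean_a = sum(y_a) / N
var_y = sum((y - mean_a) ** 2 for y in y_a) / N
# Saltelli (2010) estimator: S_i = E[f(B) * (f(AB_i) - f(A))] / Var(Y)
s1 = sum(yb * (yab - ya) for yb, yab, ya in zip(y_b, y_ab1, y_a)) / N / var_y
print(round(s1, 2))  # close to the analytic 0.9
```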

  12. Age and gender leucocytes variances and references values generated using the standardized ONE-Study protocol.

    PubMed

    Kverneland, Anders H; Streitz, Mathias; Geissler, Edward; Hutchinson, James; Vogt, Katrin; Boës, David; Niemann, Nadja; Pedersen, Anders Elm; Schlickeiser, Stephan; Sawitzki, Birgit

    2016-06-01

    Flow cytometry is now accepted as an ideal technology to reveal changes in immune cell composition and function. However, it is also an error-prone and variable technology, which makes it difficult to reproduce findings across laboratories. We have recently developed a strategy to standardize whole blood flow cytometry. The performance of our protocols was challenged here by profiling samples from healthy volunteers to reveal age- and gender-dependent differences and to establish a standardized reference cohort for use in clinical trials. Whole blood samples from two different cohorts were analyzed (first cohort: n = 52, second cohort: n = 46, both 20-84 years with equal gender distribution). The second cohort was run as a validation cohort by a different operator. The "ONE Study" panels were applied to analyze expression of >30 different surface markers to enumerate proportional and absolute numbers of >50 leucocyte subsets. Indeed, analysis of the first cohort revealed significant age-dependent changes in subsets e.g. increased activated and differentiated CD4(+) and CD8(+) T cell subsets, acquisition of a memory phenotype for Tregs as well as decreased MDC2 and Marginal Zone B cells. Males and females showed different dynamics in age-dependent T cell activation and differentiation, indicating faster immunosenescence in males. Importantly, although both cohorts consisted of a small sample size, our standardized approach enabled validation of age-dependent changes with the second cohort. Thus, we have proven the utility of our strategy and generated reproducible reference ranges accounting for age- and gender-dependent differences, which are crucial for a better patient monitoring and individualized therapy. © 2016 International Society for Advancement of Cytometry. PMID:27144459
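Given standardized cohort data, a reference range for one leucocyte subset is typically reported as the central 95% interval. A sketch with a simulated cohort (the counts are invented, not ONE-Study values):

```python
import random
import statistics

# Sketch: nonparametric reference range (central 95% interval) for one
# leucocyte subset from a standardized healthy cohort. The cohort below
# is simulated (cells/uL); these are not ONE-Study values.

random.seed(7)
cohort = [random.gauss(800, 150) for _ in range(500)]

def reference_range(values, lower=2.5, upper=97.5):
    """Percentile-based reference interval from cohort measurements."""
    qs = statistics.quantiles(values, n=1000, method="inclusive")
    return qs[int(lower * 10) - 1], qs[int(upper * 10) - 1]

lo, hi = reference_range(cohort)
print(round(lo), round(hi))  # roughly mean +/- 1.96 SD for normal data
```

In practice the cohort would first be stratified by age band and gender, with one interval per stratum, matching the age- and gender-dependent differences the abstract reports.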

  13. Assessing the Effect of a Contouring Protocol on Postprostatectomy Radiotherapy Clinical Target Volumes and Interphysician Variation

    SciTech Connect

    Mitchell, Darren M.; Perry, Lesley; Smith, Steve; Elliott, Tony; Wylie, James P.; Cowan, Richard A.; Livsey, Jacqueline E.; Logue, John P.

    2009-11-15

    Purpose: To compare postprostatectomy clinical target volume (CTV) delineation before and after the introduction of a contouring protocol and to investigate its effect on interphysician variability. Methods and Materials: Six site-specialized radiation oncologists independently delineated a CTV on the computed tomography (CT) scans of 3 patients who had received postprostatectomy radiotherapy. At least 3 weeks later this was repeated, but with the physicians adhering to the contouring protocol from the Medical Research Council's Radiotherapy and Androgen Deprivation In Combination After Local Surgery (RADICALS) trial. The volumes obtained before and after the protocol were compared and the effect of the protocol on interphysician variability assessed. Results: An increase in mean CTV for all patients from 40.7 to 53.9 cm(3) was noted as a result of observing the protocol, with individual increases in the mean CTV of 65%, 15%, and 24% for Patients 1, 2, and 3, respectively. A reduction in interphysician variability was noted when the protocol was used. Conclusions: Substantial interphysician variation in target volume delineation for postprostatectomy radiotherapy exists, which can be reduced by the use of a contouring protocol. The RADICALS contouring protocol increases the target volumes when compared with those volumes typically applied at our center. The effect of treating larger volumes on the therapeutic ratio and resultant toxicity should be carefully monitored, particularly if the same dose-response as documented in radical prostate radiotherapy applies to the adjuvant and salvage setting.

    Keywords: Prostate cancer; Postprostatectomy; Radiotherapy; Target volume.

  14. USE OF BROMOERGOCRYPTINE IN THE VALIDATION OF PROTOCOLS FOR THE ASSESSMENT OF MECHANISMS OF EARLY PREGNANCY LOSS IN THE RAT

    EPA Science Inventory

    Validated protocols for evaluating maternally mediated mechanisms of early pregnancy failure in rodents are needed for use in the risk assessment process. To supplement previous efforts in the validation of a panel of protocols assembled for this purpose, bromoergocryptine (BEC) ...

  15. Defining standardized protocols for determining the efficacy of a postmilking teat disinfectant following experimental exposure of teats to mastitis pathogens.

    PubMed

    Schukken, Y H; Rauch, B J; Morelli, J

    2013-04-01

    The objective of this paper was to define standardized protocols for determining the efficacy of a postmilking teat disinfectant following experimental exposure of teats to both Staphylococcus aureus and Streptococcus agalactiae. The standardized protocols describe the selection of cows and herds and define the critical points in performing experimental exposure, performing bacterial culture, evaluating the culture results, and finally performing statistical analyses and reporting of the results. The protocols define both negative control and positive control trials. For negative control trials, the protocol states that an efficacy of reducing new intramammary infections (IMI) of at least 40% is required for a teat disinfectant to be considered effective. For positive control trials, noninferiority to a control disinfectant with a published efficacy of reducing new IMI of at least 70% is required. Sample sizes for both negative and positive control trials are calculated. Positive control trials are expected to require a large trial size. Statistical analysis methods are defined and, in the proposed methods, the rate of IMI may be analyzed using generalized linear mixed models. The efficacy of the test product can be evaluated while controlling for important covariates and confounders in the trial. Finally, standards for reporting are defined and reporting considerations are discussed. The use of the defined protocol is shown through presentation of the results of a recent trial of a test product against a negative control. PMID:23415529
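The 40% threshold above refers to the percent reduction in the new-IMI rate relative to the negative control; the arithmetic, with invented counts:

```python
# Sketch of the efficacy figure the protocol defines: percent reduction in
# the rate of new intramammary infections (IMI) for the test disinfectant
# relative to a negative (undipped) control. Counts are invented.

def efficacy_percent(new_imi_test, n_test, new_imi_control, n_control):
    """Percent reduction in new-IMI rate of the test product vs control."""
    rate_test = new_imi_test / n_test
    rate_control = new_imi_control / n_control
    return 100.0 * (1.0 - rate_test / rate_control)

# Hypothetical: 12 new IMI in 400 exposed quarters (test) vs 30 in 400
# exposed quarters (negative control)
e = efficacy_percent(12, 400, 30, 400)
print(round(e, 1))  # 60.0, clearing the 40% bar for a negative control trial
```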

  16. Students' Attitudes toward a Group Coursework Protocol and Peer Assessment System

    ERIC Educational Resources Information Center

    Moraes, Caroline; Michaelidou, Nina; Canning, Louise

    2016-01-01

    This paper addresses a knowledge gap by presenting an empirical investigation of a group coursework protocol and peer assessment system (GCP&PAS) used in a UK university to support postgraduate marketing students in their assessed group activities. The aim of the research was to examine students' understanding of the GCP&PAS and their…

  17. On-Farm Welfare Assessment Protocol for Adult Dairy Goats in Intensive Production Systems

    PubMed Central

    Battini, Monica; Stilwell, George; Vieira, Ana; Barbieri, Sara; Canali, Elisabetta; Mattiello, Silvana

    2015-01-01

    Simple Summary The Animal Welfare Indicators (AWIN) project developed a practical welfare assessment protocol for lactating dairy goats in intensive husbandry systems, using animal-based indicators that cover the whole multidimensional concept of animal welfare. The strict collaboration between scientists and stakeholders resulted in an easy-to-use protocol that provides farmers or veterinarians with comprehensive but clear feedback on the welfare status of the herd in less than three hours. The protocol, which highlights key points and motivates farmers to achieve improvements, has received much attention from interested parties. Abstract Within the European AWIN project, a protocol for assessing dairy goats’ welfare on the farm was developed. Starting from a literature review, a prototype including animal-based indicators covering four welfare principles and 12 welfare criteria was set up. The prototype was tested in 60 farms for validity, reliability, and feasibility. After testing the prototype, a two-level assessment protocol was proposed in order to increase acceptability among stakeholders. The first level offers a more general overview of the welfare status, based on group assessment of a few indicators (e.g., hair coat condition, latency to the first contact test, severe lameness, Qualitative Behavior Assessment), with no or minimal handling of goats and short assessment time required. The second level starts if welfare problems are encountered in the first level and adds a comprehensive and detailed individual evaluation (e.g., Body Condition Score, udder asymmetry, overgrown claws), supported by an effective sampling strategy. The assessment can be carried out using the AWIN Goat app. The app results in a clear visual output, which provides positive feedback on welfare conditions in comparison with a benchmark of a reference population. The protocol may be a valuable tool for both veterinarians and technicians and a self-assessment instrument for

  18. Comparison of ventilation threshold and heart rate deflection point in fast and standard treadmill test protocols.

    PubMed

    Vucetić, Vlatko; Sentija, Davor; Sporis, Goran; Trajković, Nebojsa; Milanović, Zoran

    2014-06-01

    The purpose of this study was to compare two methods for determination of anaerobic threshold from two different treadmill protocols. Forty-eight Croatian runners of national rank (ten sprinters, fifteen 400-m runners, ten middle distance runners and thirteen long distance runners), mean age 21.7 +/- 5.1 years, participated in the study. They performed two graded maximal exercise tests on a treadmill, a standard ramp treadmill test (T(SR), speed increments of 1 km x h(-1) every 60 seconds) and a fast ramp treadmill test (T(FR), speed increments of 1 km x h(-1) every 30 seconds) to determine and compare the parameters at peak values and at heart rate at the deflection point (HR(DP)) and ventilation threshold (VT). There were no significant differences between protocols (p > 0.05) for peak values of oxygen uptake (VO(2max), 4.48 +/- 0.43 and 4.44 +/- 0.45 L x min(-1)), weight-related VO(2max) (62.5 +/- 6.2 and 62.0 +/- 6.0 mL x kg(-1) x min(-1)), pulmonary ventilation (VE(max), 163.1 +/- 18.7 and 161.3 +/- 19.9 L x min(-1)) and heart rate (HR(max), 192.3 +/- 8.5 and 194.4 +/- 8.7 bpm) (T(FR) and T(SR), respectively). Moreover, no significant differences between T(FR) and T(SR) were found for VT and HR(DP) when expressed as VO2 and HR. However, there was a significant effect of ramp slope on running speed at VO(2max) and at the anaerobic threshold (AnT), independent of the method used (VT: 16.0 +/- 2.2 vs 14.9 +/- 2.2 km x h(-1); HR(DP): 16.5 +/- 1.9 vs 14.9 +/- 2.0 km x h(-1) for T(FR) and T(SR), respectively). Linear regression analysis revealed high between-test and between-method correlations for VO2, HR and running speed parameters (r = 0.78-0.89, p < 0.01). The present study has indicated that the VT and HR(DP) for running (VO2, ventilation, and heart rate at VT/HR(DP)) are independent of test protocol, while there is a significant effect of ramp slope on VT and HR(DP) when expressed as running speed.
Moreover, this study demonstrates that the point of deflection

  19. Development of a Protocol for Simultaneous Assessment of Cognitive Functioning Under Psychosocial Stress.

    PubMed

    Marko, Martin

    2016-10-01

    This study presents a protocol for the induction of moderate psychosocial stress and investigates its impact on psychological and physiological responses. The proposed procedure was designed to enable researchers to assess cognitive performance under the effect of various classes of stressors. The protocol's structure contains three main periods: baseline, assessment, and recovery. The assessment stage starts with task anticipation, during which the audience (a three-member commission) is introduced and the apparatus (cameras, microphones, lights, and physiological measuring devices) is stationed. Subsequently, cognitive performance is tested. The protocol was evaluated on 56 university students who were randomly assigned to a control or stress (protocol) condition and administered three cognitive tests (working memory operation span, remote associates test, and semantic fluency). Compared to control sessions, the protocol induced state anxiety, interfering worry thoughts, and disturbance during the recovery period. In addition, the stress group also showed elevated levels of skin conductance, higher average heart rates, and larger drops in peripheral temperature. Even though more research is needed, these results suggest that the protocol effectively induces both psychological and physiological stress responses and therefore supports its use in cognitive-affective and cognitive-biological fields of research. PMID:27467447

  20. Standardized protocols for characterizing women's fertility: A data-driven approach.

    PubMed

    Blake, Khandis R; Dixson, Barnaby J W; O'Dean, Siobhan M; Denson, Thomas F

    2016-05-01

    Experts are divided on whether women's cognition and behavior differs between fertile and non-fertile phases of the menstrual cycle. One of the biggest criticisms of this literature concerns the use of indirect, imprecise, and flexible methodologies between studies to characterize women's fertility. To resolve this problem, we provide a data-driven method of best practices for characterizing women's fertile phase. We compared the accuracy of self-reported methods and counting procedures (i.e., the forward- and backward-counting methods) in estimating ovulation using data from 140 women whose fertility was verified with luteinizing hormone tests. Results revealed that no counting method was associated with ovulation with >30% accuracy. A minimum of 39.5% of the days in the six-day fertile window predicted by the counting methods were non-fertile, and correlations between counting method conception probabilities and actual conception probability were weak to moderate, rs=0.11-0.30. Poor results persisted when using a lenient window for predicting ovulation, across alternative estimators of the onset of the next cycle, and when removing outliers to increase the homogeneity of the sample. By contrast, combining counting methods with a relatively inexpensive test of luteinizing hormone predicted fertility with accuracy >95%, but only when specific guidelines were followed. To this end, herein we provide a cost-effective, pragmatic, and standardized protocol that will allow researchers to test whether fertility effects exist or not. PMID:27072982
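    The backward-counting method evaluated above is conventionally sketched as follows. Note the 14-day luteal-phase assumption and the six-day fertile window ending on ovulation day are the common textbook convention, not necessarily the exact parameters the authors tested:

```python
# Generic sketch of the backward-counting method (assumed convention:
# ovulation ~14 days before the next cycle onset; six-day fertile window
# comprising the five days before ovulation plus ovulation day).
from datetime import date, timedelta

def backward_count_fertile_window(next_cycle_onset: date):
    ovulation = next_cycle_onset - timedelta(days=14)
    return [ovulation - timedelta(days=d) for d in range(5, -1, -1)]

window = backward_count_fertile_window(date(2016, 5, 20))
print(window[0], window[-1])  # first and last day of the estimated window
```

    The study's point is that windows estimated this way frequently miss hormonally verified ovulation, which is why the authors recommend pairing counting with a luteinizing hormone test.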

  1. Implementing Istanbul Protocol standards for forensic evidence of torture in Kyrgyzstan.

    PubMed

    Moreno, Alejandro; Crosby, Sondra; Xenakis, Stephen; Iacopino, Vincent

    2015-02-01

    The Kyrgyz government declared a policy of "zero tolerance" for torture and began reforms to stop such practice, a regular occurrence in the country's daily life. This study presents the results of 10 forensic evaluations of individuals alleging torture; they represent 35% of all criminal investigations into torture for the January 2011-July 2012 period. All individuals evaluated were male with an average age of 34 years. Police officers were implicated as perpetrators in all cases. All individuals reported being subjected to threats and blunt force trauma from punches, kicks, and blows with objects such as police batons. The most common conditions documented during the evaluations were traumatic brain injury and chronic seizures. Psychological sequelae included post-traumatic stress disorder and major depressive disorder, which was diagnosed in seven individuals. In all cases, the physical and psychological evidence was highly consistent with individual allegations of abuse. These forensic evaluations, which represent the first ever to be conducted in Kyrgyzstan in accordance with Istanbul Protocol standards, provide critical insight into torture practices in the country. The evaluations indicate a pattern of brutal torture practices and inadequate governmental and nongovernmental forensic evaluations.

  2. Validity and responsiveness of the Clubfoot Assessment Protocol (CAP). A methodological study

    PubMed Central

    Andriesse, Hanneke; Roos, Ewa M; Hägglund, Gunnar; Jarnlo, Gun-Britt

    2006-01-01

    Background The Clubfoot Assessment Protocol (CAP) is a multidimensional instrument designed for longitudinal follow-up of the clubfoot deformity during growth. Item reliability has been shown to be sufficient. In this article, the CAP's validity and responsiveness are studied using the Dimeglio classification score as a gold standard. Methods Thirty-two children with 45 congenital clubfeet were assessed prospectively and consecutively as newborns, at one, two, and four months, and at two years of age. For convergent/divergent construct validity, Spearman's correlation coefficients were calculated. Discriminant validity was evaluated by studying the scores in bilateral clubfeet. Floor and ceiling effects at baseline (untreated clubfeet) and at two years of age (treated clubfeet) were evaluated. Responsiveness was evaluated by using effect sizes (ES) and by calculating whether significant changes (Wilcoxon's signed-rank test) had occurred between the different measurement occasions. Results High to moderate significant correlations were found between CAP mobility I and morphology and the Dimeglio scores (rs = 0.77 and 0.44, respectively). Low correlations were found between CAP muscle function, mobility II, and motion quality and the Dimeglio scoring system (rs = 0.20, 0.09 and 0.06, respectively). Of 13 children with bilateral clubfeet, 11 showed different CAP mobility I scores between right and left foot at baseline (untreated), compared with 5 with the Dimeglio score. At the other assessment occasions, the CAP mobility I continued to show higher discrimination ability than the Dimeglio. No floor effects and low ceiling effects were found in the untreated clubfeet for both instruments. High ceiling effects were found in the CAP for the treated children and low for the Dimeglio. Responsiveness was good. ES from untreated to treated ranged from 0.80 to 4.35 for the CAP subgroups and was 4.68 for the Dimeglio. During the first four treatment months, the CAP mobility I had generally higher ES
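    The effect sizes (ES) reported above can be illustrated with one common responsiveness convention, mean change scaled by the baseline standard deviation; the scores below are invented for illustration, not taken from the study:

```python
# Hedged sketch of an effect-size (ES) calculation for responsiveness:
# mean change divided by the baseline SD (one common convention).
from statistics import mean, stdev

def effect_size(baseline, follow_up):
    return (mean(follow_up) - mean(baseline)) / stdev(baseline)

# Hypothetical mobility scores before and after treatment
es = effect_size([10, 12, 14, 16], [30, 32, 34, 36])
print(round(es, 2))  # → 7.75
```

    An ES near or above 0.8 is usually read as a large, clinically meaningful change, which puts the 0.80-4.35 range reported for the CAP subgroups in context.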

  3. A Linguistic Feature Analysis of Verbal Protocols Associated with Pupil Responses to Standardized Measures of Reading Comprehension.

    ERIC Educational Resources Information Center

    Jacobson, M. Victoria

    This study was designed to collect and analyze the verbal protocols of students involved in introspection as they responded to standardized measures of reading comprehension, for the purpose of learning more about the reading process. Eleven seventh grade students from an urban public school were randomly chosen from 61 subjects who met the…

  4. A Linguistic Feature Analysis of Verbal Protocols Associated with Pupil Responses to Standardized Measures of Reading Comprehension.

    ERIC Educational Resources Information Center

    Jacobson, M. Victoria

    The major purpose of this study was to provide insights into some of the reasoning strategies that may be used by students in obtaining meaning from the printed page. The study was designed to collect and analyze the verbal protocols of 11 seventh grade students involved in introspection as they responded to standardized measures of reading…

  5. Divided Timed and Continuous Timed Assessment Protocols and Academic Performance

    ERIC Educational Resources Information Center

    Perucca, David

    2013-01-01

    Children from a low socioeconomic status (SES) are exposed to numerous stress factors that are negatively associated with sustained attention and academic performance. This association suggests that the timed component of lengthy assessments may be unfair for students from such backgrounds, as they may have an inability to sustain attention during…

  6. Toward an HRD Auditing Protocol: Assessing HRD Risk Management Practices

    ERIC Educational Resources Information Center

    Clardy, Alan

    2004-01-01

    Even though HRD-related programs and activities carry risks that should be monitored and assessed, there is little literature on how auditing applies to the HRD function; the existing literature on the topic defines HRD auditing in widely different ways. The nature of risk for organizational process is discussed, followed by a review of the…

  7. Satellite Communications Using Commercial Protocols

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.; Griner, James H.; Dimond, Robert; Frantz, Brian D.; Kachmar, Brian; Shell, Dan

    2000-01-01

    NASA Glenn Research Center has been working with industry, academia, and other government agencies in assessing commercial communications protocols for satellite and space-based applications. In addition, NASA Glenn has been developing and advocating new satellite-friendly modifications to existing communications protocol standards. This paper summarizes recent research into the applicability of various commercial standard protocols for use over satellite and space-based communications networks as well as expectations for future protocol development. It serves as a reference point from which the detailed work can be readily accessed. Areas that will be addressed include asynchronous-transfer-mode quality of service; completed and ongoing work of the Internet Engineering Task Force; data-link-layer protocol development for unidirectional link routing; and protocols for aeronautical applications, including mobile Internet protocol routing for wireless/mobile hosts and the aeronautical telecommunications network protocol.

  8. Common Core State Standards Assessments: Challenges and Opportunities

    ERIC Educational Resources Information Center

    Polikoff, Morgan S.

    2014-01-01

    The Common Core State Standards (CCSS) were created in response to the shortcomings of No Child Left Behind era standards and assessments. Among those failings were the poor quality of content standards and assessments and the variability in content expectations and proficiency targets across states, as well as concerns related to the economic…

  9. 42 CFR 493.1299 - Standard: Postanalytic systems quality assessment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Postanalytic systems quality assessment... AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing Postanalytic Systems § 493.1299 Standard: Postanalytic systems quality assessment. (a)...

  10. 42 CFR 493.1249 - Standard: Preanalytic systems quality assessment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Preanalytic systems quality assessment... AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing Preanalytic Systems § 493.1249 Standard: Preanalytic systems quality assessment. (a)...

  11. Utility of a Standardized Protocol for Submitting Clinically Suspected Endometrial Polyps to the Pathology Laboratory

    PubMed Central

    Safdar, Nida S.; Giannico, Giovanna; Desouki, Mohamed Mokhtar

    2016-01-01

    The purpose of the study is to assess whether a protocol for submitting clinically suspected endometrial polyps will improve the detection rate of polyps and evaluation of the background endometrium. A retrospective review from 1999–2015 was performed. Cases were divided into: 1) polyps and curettings placed in 2 containers (separate, n=61) and 2) polyps and curettings placed in one container (combined, n=80). Polyps were identified in 100% of cases in the separate group compared to 95% in the combined group (p=0.62). The background endometrium was evaluable in 79% in the combined group compared to 90% in the separate group (p=0.07). The frequency of hyperplasia without atypia, atypical hyperplasia, and carcinoma was 4.4%, 3.6%, and 1.5%, respectively. In conclusion, the enhancement in the rate of polyp detection and evaluation of the background endometrium in the separate group is minimal. This supports the recommendation of submitting endometrial polyps and curettings combined in one container. PMID:27402220

  12. A Field-Based Testing Protocol for Assessing Gross Motor Skills in Preschool Children: The CHAMPS Motor Skills Protocol (CMSP)

    PubMed Central

    Williams, Harriet G.; Pfeiffer, Karin A.; Dowda, Marsha; Jeter, Chevy; Jones, Shaverra; Pate, Russell R.

    2010-01-01

    The purpose of the study was to develop a valid and reliable tool for use in assessing motor skills in preschool children in field-based settings. The development of the CHAMPS (Children’s Activity and Movement in Preschool Study) Motor Skills Protocol (CMSP) included evidence of its reliability and validity for use in field-based environments as part of large epidemiological studies. Following pilot work, 297 children (3-5 years old) from 22 preschools were tested using the final version of the CMSP and the TGMD-2. Reliability of the CMSP and interobserver reliability were determined using intraclass correlation procedures (ICC; ANOVA). Concurrent validity was assessed using Pearson correlation coefficients to compare the CMSP to the original Test of Gross Motor Development, 2nd Edition (TGMD-2). Results indicated that test reliability, interobserver reliability, and validity coefficients were all high, generally above R/r = 0.90. Significant age differences were found. Outcomes indicate that the CMSP is an appropriate tool for assessing motor development of 3-, 4-, and 5-year-old children in field-based settings that are consistent with large-scale trials. PMID:21532999
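    The ANOVA-based intraclass correlation used here for reliability can be sketched as a one-way ICC(1,1); this is a generic illustration with made-up ratings, not the authors' analysis code:

```python
# Minimal one-way ICC(1,1) sketch: between-subject vs. within-subject
# mean squares from a one-way ANOVA, combined into a reliability index.
def icc_oneway(ratings):
    """ratings: list of per-subject lists, one score per rater/occasion."""
    n = len(ratings)                       # subjects
    k = len(ratings[0])                    # raters or occasions per subject
    grand = sum(sum(r) for r in ratings) / (n * k)
    means = [sum(r) / k for r in ratings]  # per-subject means
    ms_between = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    ms_within = sum((x - m) ** 2 for r, m in zip(ratings, means) for x in r) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical scores for four children, each rated twice
print(round(icc_oneway([[9, 8], [5, 6], [2, 2], [7, 8]]), 2))  # → 0.96
```

    Values above 0.90, as reported for the CMSP, indicate that nearly all score variance reflects differences between children rather than between observers or occasions.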

  13. Water-quality sampling by the U.S. Geological Survey-Standard protocols and procedures

    USGS Publications Warehouse

    Wilde, Franceska D.

    2010-01-01

    The U.S. Geological Survey (USGS) develops the sampling procedures and collects the data necessary for the accurate assessment and wise management of our Nation's surface-water and groundwater resources. Federal and State agencies, water-resource regulators and managers, and many organizations and interested parties in the public and private sectors depend on the reliability, timeliness, and integrity of the data we collect and the scientific soundness and impartiality of our data assessments and analysis. The standard data-collection methods uniformly used by USGS water-quality personnel are peer reviewed, kept up-to-date, and published in the National Field Manual for the Collection of Water-Quality Data (http://pubs.water.usgs.gov/twri9A/).

  14. Hyperthermic tissue sealing devices: a proposed histopathologic protocol for standardizing the evaluation of thermally sealed vessels

    NASA Astrophysics Data System (ADS)

    Livengood, Ryan H.; Vos, Jeffrey A.; Coad, James E.

    2011-03-01

    Hyperthermic tissue sealing devices are advancing modern laparoscopy and other minimally invasive surgical approaches. Histopathologic evaluation of thermally sealed vessels can provide important information on their associated tissue effects and reactions. However, a standardized systematic approach has not been historically used in the literature. This paper proposes a histologic approach for the analysis of thermally sealed vessels and their basis of hemostasis, including thermal tissue changes, healing, and thrombosis. Histologic evaluation during the first week (Days 3-7) can assess the seal's primary tissue properties. These parameters include the thermal seal's length, architecture, tissue layers involved, adventitial collagen denaturation length, entrapped vapor or blood pockets, tissue homogenization and thermal tissue injury zones. While the architectural features can be assessed in Day 0-3 specimens, the latter thermal injury zones are essentially not assessable in Day 0-3 specimens. Day 14 specimens can provide information on the early healing response to the sealed vessel. Day 30 and longer specimens can be used to evaluate the seal's healing reactions. Assessment of the healing response should include seal site inflammation, granulation tissue, necrosis resorption, fibroproliferative scar healing, and thrombus organization. In order to accurately evaluate these parameters, careful specimen orientation, embedding and multiple histologic sections across the entire seal width are required. When appropriate in vivo post-treatment times are used, thermal vessel seals can be evaluated with routine light microscopy and common histologic staining methods.

  15. Symposium: Language Assessment in Standards-Based Education Reform

    ERIC Educational Resources Information Center

    Menken, Kate; Hudson, Thom; Leung, Constant

    2014-01-01

    This symposium article, to which three authors contribute distinct parts, presents the rationale for standards-based language assessment and examines both the uses and misuses of language assessments in English-speaking countries that are engaged in standards-based education reform. Specifically, they focus on the assessment of emergent bilinguals…

  16. Answers to Essential Questions about Standards, Assessments, Grading, & Reporting

    ERIC Educational Resources Information Center

    Guskey, Thomas R.; Jung, Lee Ann

    2013-01-01

    How do assessments for learning differ from assessments of learning? What is the purpose of grading? After nearly two decades of immersion in standards-based curriculua and instruction, our nation's educators are often still confounded by the (admittedly complex) landscape of standards, assessment, and reporting. In "Answers to Essential…

  17. 42 CFR 493.1235 - Standard: Personnel competency assessment policies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Personnel competency assessment policies... Nonwaived Testing General Laboratory Systems § 493.1235 Standard: Personnel competency assessment policies... written policies and procedures to assess employee and, if applicable, consultant competency....

  18. 42 CFR 493.1289 - Standard: Analytic systems quality assessment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Analytic systems quality assessment. 493... Nonwaived Testing Analytic Systems § 493.1289 Standard: Analytic systems quality assessment. (a) The..., assess, and when indicated, correct problems identified in the analytic systems specified in §§...

  19. False-positive diatom test: a real challenge? A post-mortem study using standardized protocols.

    PubMed

    Lunetta, Philippe; Miettinen, Arto; Spilling, Kristian; Sajantila, Antti

    2013-09-01

    The main criticism of the validity of the diatom test for the diagnosis of drowning is based on the potential ante- and post-mortem penetration of diatoms and the finding of diatoms in bodies of non-drowned human beings. However, qualitative and quantitative studies on diatoms in organs of the non-drowned have yielded both conflicting and contradictory results. In the present study, we have analysed using standardised methods the diatom content in several organs of 14 non-drowned human bodies. Overall, only 9 diatoms (6 entire, 3 fragmented) were disclosed in 6 of the 14 non-drowned bodies. Each of these 6 cadavers had only a single "positive" organ. Six diatoms were found in the bone marrow, 2 in the lung, and one in the pleural liquid. No diatoms were recovered from the brain, liver, kidney, or blood samples of any of these 14 bodies. Moreover, in five additional cadavers, whose lungs were injected, prior to autopsy, with a 3.5 L solution containing a bi-cellulate diatom culture (Thalassiosira baltica, Thalassiosira levanderi) via tracheostomy, a few diatoms appeared in the pleural cavity and in the blood from the left heart chamber, but none in any other internal organs investigated. The results of the presented study demonstrate that the issue of the false-positive diatom test should not be a logical impediment to the performance of the diatom method. However, strict and standardized protocols aimed at avoiding contamination during sample preparation must be used, appropriate separation values set and taxonomic analysis of all diatoms performed. PMID:23701706

  20. The edaphic quantitative protargol stain: a sampling protocol for assessing soil ciliate abundance and diversity.

    PubMed

    Acosta-Mercado, Dimaris; Lynn, Denis H

    2003-06-01

    It has been suggested that species loss from microbial groups low in diversity that occupy trophic positions close to the base of the detrital food web could be critical for terrestrial ecosystem functioning. Among the protozoans within the soil microbial loop, ciliates are presumably the least abundant and of low diversity. However, the lack of a standardized method to quantitatively enumerate and identify them has hampered our knowledge about the magnitude of their active and potential diversity, and about the interactions in which they are involved. Thus, the Edaphic Quantitative Protargol Staining (EQPS) method is provided to simultaneously account for ciliate species richness and abundance in a quantitative and qualitative way. This direct method allows this rapid and simultaneous assessment by merging the Non-flooded Petri Dish (NFPD) method [Prog. Protistol. 2 (1987) 69] and the Quantitative Protargol Stain (QPS) method [Montagnes, D.J.S., Lynn, D.H., 1993. A quantitative protargol stain (QPS) for ciliates and other protists. In: Kemp, P.F., Sherr, B.F., Sherr, E.B., Cole, J.J. (Eds.), Handbook of Methods in Aquatic Microbial Ecology. Lewis Publishers, Boca Raton, FL, pp. 229-240]. The abovementioned protocols were refined by experiments examining the spatial distribution of ciliates under natural field conditions, sampling intensity, the effect of storage, and the use of cytological preparations versus live observations. The EQPS could be useful in ecological studies since it provides both a "snapshot" of the active and effective diversity and a robust estimate of the potential diversity.

  1. Whose Standards? (B)Reaching the Assessment Puzzle

    ERIC Educational Resources Information Center

    Polimeni, John M.; Iorgulescu, Raluca I.

    2009-01-01

    Love it or hate it, assessment has become the new reality on college and university campuses. Although measuring student achievement of course outcomes is not an easy task, assessment does not need to be a complex or painful experience. This paper describes the methods used to assess student achievement of the stated course outcomes in…

  2. Laparoscopic colorectal surgery in learning curve: Role of implementation of a standardized technique and recovery protocol. A cohort study

    PubMed Central

    Luglio, Gaetano; De Palma, Giovanni Domenico; Tarquini, Rachele; Giglio, Mariano Cesare; Sollazzo, Viviana; Esposito, Emanuela; Spadarella, Emanuela; Peltrini, Roberto; Liccardo, Filomena; Bucci, Luigi

    2015-01-01

    Background Despite its proven benefits, laparoscopic colorectal surgery is still underutilized among surgeons. A steep learning curve is one of the causes of its limited adoption. The aim of the study is to determine the feasibility and morbidity rate of laparoscopic colorectal surgery in a single-institution “learning curve” experience, implementing a well-standardized operative technique and recovery protocol. Methods The first 50 patients treated laparoscopically were included. All the procedures were performed by a trainee surgeon, supervised by a consultant surgeon, according to the principle of complete mesocolic excision with central vascular ligation or TME. Patients underwent a fast-track recovery programme. Recovery parameters, short-term outcomes, morbidity, and mortality were assessed. Results Types of resection: 20 left-side resections, 8 right-side resections, 14 low anterior resections/TME, 5 total colectomies with IRA, 3 total panproctocolectomies with pouch. Mean operative time: 227 min; mean number of lymph nodes: 18.7. Conversion rate: 8%. Mean time to flatus: 1.3 days; mean time to solid stool: 2.3 days. Mean length of hospital stay: 7.2 days. Overall morbidity: 24%; major morbidity (Dindo–Clavien III): 4%. No anastomotic leaks, no mortality, no 30-day readmissions. Conclusion Properly performed laparoscopic colorectal surgery is safe and leads to excellent results in terms of recovery and short-term outcomes, even in a learning curve setting. Key factors for better outcomes and for shortening the learning curve appear to be the adoption of a standardized technique and training model, along with the strict supervision of an expert colorectal surgeon. PMID:25859386

  3. An outcomes evaluation of an emergency department early pregnancy assessment service and early pregnancy assessment protocol

    PubMed Central

    Wendt, Kim; Crilly, Julia; May, Chris; Bates, Kym; Saxena, Rakhee

    2014-01-01

    Background Complications in early pregnancy, such as threatened or actual miscarriage, are a common occurrence, resulting in many women presenting to the emergency department (ED). Early pregnancy service delivery models described in the literature vary in terms of approach, setting, and outcomes. Our objective was to determine the outcomes of women who presented to an Australian regional ED with diagnoses consistent with early pregnancy complications following the implementation of an early pregnancy assessment service (EPAS) and early pregnancy assessment protocol (EPAP) in July 2011. Methods A descriptive, comparative (6 months before and after) study was undertaken. Data were extracted from the hospital ED information system and medical healthcare records. Outcome measures included: time to see a clinician, ED length of stay, admission rate, re-presentation rate, hospital admission, and types of pathology tests ordered. Results Over the 12-month period, 584 presentations with complications of early pregnancy were made to the ED (268 PRE and 316 POST EPAS–EPAP). Outcomes that improved statistically and clinically following implementation included: time to see a clinician (decreased by 6 min, from 35 to 29 min), admission rate (decreased by 6%, from 14.5% to 8.5%), β-human chorionic gonadotrophin ordering (increased by 10%, up to 80% POST), ultrasound (USS) performance (increased by 10%, up to 73% POST), and pain score documentation (increased by 23%, up to 36% POST). Conclusions The results indicate that patient and service delivery improvements can be achieved following the implementation of targeted service delivery models such as EPAS and EPAP in the ED. PMID:24136123

  4. ASSESSMENT OF DE-71, A COMMERCIAL POLYBROMINATED DIPHENYLETHER (PBDE) MIXTURE, IN THE EDSP MALE PUBERTAL PROTOCOL

    EPA Science Inventory

    ASSESSMENT OF DE-71, A COMMERCIAL POLYBROMINATED DIPHENYL ETHER (PBDE) MIXTURE, IN THE EDSP MALE PUBERTAL PROTOCOL. T.E. Stoker1, J. Ferrell1, J.M. Hedge2, K. M. Crofton2, R.L. Cooper1 and S.C. Laws1. 1 Reprod. Tox. Div., 2 Neurotox. Div., NHEERL, ORD, USEPA, RTP, NC.

    P...

  5. Development and Use of an Eating Disorder Assessment and Treatment Protocol

    ERIC Educational Resources Information Center

    Huebner, Lois A.; Weitzman, Lauren M.; Mountain, Lisa M.; Nelson, Kris L.; Oakley, Danielle R.; Smith, Michael L.

    2006-01-01

    Counseling centers have been challenged to effectively treat the growing number of college students who struggle with disordered eating. In response to this critical issue, an Eating Disorder Assessment and Treatment Protocol (EDATP) was developed to assist clinical disposition in the counseling center setting and identify treatment guidelines…

  6. Training Organizations in Use of a Modified Stream Visual Assessment Protocol

    ERIC Educational Resources Information Center

    Obropta, Christopher C.; Yergeau, Steven E.

    2011-01-01

    The Stream Visual Assessment Protocol (SVAP) was evaluated as a means to increase watershed surveys in New Jersey. Groups were trained in an SVAP modified for New Jersey streams. Participants in three training workshops were surveyed to determine the usefulness of SVAP as a cost-effective method to evaluate watershed health. Many respondents found…

  7. Using Simple Linear Regression to Assess the Success of the Montreal Protocol in Reducing Atmospheric Chlorofluorocarbons

    ERIC Educational Resources Information Center

    Nelson, Dean

    2009-01-01

    Following the Guidelines for Assessment and Instruction in Statistics Education (GAISE) recommendation to use real data, an example is presented in which simple linear regression is used to evaluate the effect of the Montreal Protocol on atmospheric concentration of chlorofluorocarbons. This simple set of data, obtained from a public archive, can…
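
    The classroom exercise described above amounts to an ordinary least-squares fit of concentration against time. A stdlib-only sketch; the CFC-11 values below are illustrative stand-ins, not the public-archive data the article uses:

```python
def linfit(xs, ys):
    """Ordinary least-squares fit y = a + b*x; returns (intercept a, slope b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Illustrative annual mean CFC-11 mixing ratios (ppt) for years after the
# Montreal Protocol took effect -- hypothetical values, not archive data
years = [1995, 1998, 2001, 2004, 2007]
cfc11 = [267, 263, 258, 252, 246]
a, b = linfit(years, cfc11)
print(f"slope = {b:.2f} ppt/year")  # a negative slope indicates decline
```

A negative, statistically significant slope is the evidence students would cite that atmospheric concentrations fell after the Protocol.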

  8. Treatment Integrity Assessment in the Schools: An Evaluation of the Treatment Integrity Planning Protocol

    ERIC Educational Resources Information Center

    Sanetti, Lisa M. Hagermoser; Kratochwill, Thomas R.

    2009-01-01

    The Treatment Integrity Planning Protocol (TIPP) provides a structured process for collaboratively creating a treatment integrity assessment within a consultation framework. The authors evaluated the effect of the TIPP on the implementation of an intervention designed to improve the consistency of students' mathematics performance. Treatment…

  9. A protocol for the health and fitness assessment of NBA players.

    PubMed

    Scheller, A; Rask, B

    1993-04-01

    The assessment of the health and fitness of elite basketball players should be a multidisciplinary process. We have described an organized, efficient, and comprehensive protocol for preseason physical evaluations that could be used at the university as well as the professional level.

  10. Teacher Assessment Literacy: A Review of International Standards and Measures

    ERIC Educational Resources Information Center

    DeLuca, Christopher; LaPointe-McEwan, Danielle; Luhanga, Ulemu

    2016-01-01

    Assessment literacy is a core professional requirement across educational systems. Hence, measuring and supporting teachers' assessment literacy have been a primary focus over the past two decades. At present, there are a multitude of assessment standards across the world and numerous assessment literacy measures that represent different…

  11. Biorepository standards and protocols for collecting, processing, and storing human tissues.

    PubMed

    Troyer, Dean

    2008-01-01

    Recent advances in high-throughput assays for gene expression (genomics), proteins (proteomics), and metabolites (metabolomics) have engendered a parallel need for well-annotated human biological samples. Samples from both diseased and unaffected normal tissues are often required. Biorepositories consist of a specimen bank linked to a database of information. Assuring chain of custody and annotation of samples with relevant clinical information is required. The value of samples to end users is generally commensurate with the quality and extent of relevant clinical data included with the samples. Procurement of tissues is often done with parallel pre- and/or post-treatment venipuncture to obtain blood and tissue samples from the same subject. Biorepositories must also process, preserve, and distribute samples to end users. Like traditional libraries, biorepositories are meant to be used, and they are most useful when the needs of end users (researchers) are considered in the planning and development process. Ethics review and an awareness of regulatory requirements for storage, transport, and distribution are required. In the USA, Institutional Review Boards are the local regulatory entities that review protocols for the banking of human biological tissues. Governmental and professional agencies and organizations provide some guidelines for standard operating procedures. The Food and Drug Administration (FDA), the Centers for Disease Control and Prevention (CDC), and professional organizations such as the American Association of Tissue Banks (AATB), the American Association of Blood Banks, the International Red Cross, and the International Society for Biological and Environmental Repositories (ISBER) provide guidelines for biorepositories and the banking of human tissues (see Table 1). To date, these guidelines are directed largely toward procurement, banking, and distribution of human tissues for therapeutic uses. In the international setting, the World Health Organization provides ethical…

  13. B-Mode Sonographic Assessment of the Posterior Circumflex Humeral Artery: The SPI-US Protocol-A Technical Procedure in 4 Steps.

    PubMed

    van de Pol, Daan; Maas, Mario; Terpstra, Aart; Pannekoek-Hekman, Marja; Kuijer, P Paul F M; Planken, R Nils

    2016-05-01

    Elite overhead athletes are at risk of vascular injury due to repetitive abduction and external rotation of the dominant arm. The posterior circumflex humeral artery (PCHA) is prone to degeneration, aneurysm formation, and thrombosis in elite volleyball players and baseball pitchers. The prevalence of PCHA-related thromboembolic complications is unknown in this population. However, the prevalence of symptoms associated with digital ischemia is 31% in elite volleyball players. A standardized noninvasive imaging tool will aid in early detection of PCHA injury, prevention of thromboembolic complications, and measurement reproducibility. A standardized vascular sonographic protocol for assessment of the proximal PCHA (SPI-US protocol [Shoulder PCHA Pathology and Digital Ischemia-Ultrasound protocol]) is presented. PMID:27072158

  14. Assessing Medication Adherence as a Standard of Care in Pediatric Oncology.

    PubMed

    Pai, Ahna L H; McGrady, Meghan E

    2015-12-01

    Poor adherence to pediatric cancer treatment protocols may prevent children and adolescents from realizing the potential benefits of therapy. This paper presents the evidence for a standard of care for supporting medication adherence. Databases were reviewed for articles examining adherence and including children and/or adolescents with cancer. Fourteen articles (i.e., qualitative, quantitative, review, and randomized clinical trials) were evaluated for rigor. There is moderate-quality evidence to support a strong recommendation for adherence to be assessed routinely and monitored throughout the treatment. Integrating the proposed clinical procedures into standard clinical care may improve outcomes for children and adolescents with cancer.

  15. Development and application of cognitive-pragmatic language ability assessment protocol for traumatic brain injury.

    PubMed

    Lee, Mi Sook; Kim, HyangHee

    2016-01-01

    The study aim was to introduce a newly developed, multifaceted cognitive-pragmatic language assessment protocol. This study was also designed to assess the reliability and validity of the assessment protocol in discriminating between mild traumatic brain injury (mTBI) and normal controls. Individuals in this study were 25 to 64 years old. Ten individuals with mTBI and 22 controls were recruited for the preliminary study; their mean ages were 45.20 and 41.23 years, respectively. For the main study, we recruited 39 individuals with mTBI and 100 healthy individuals, whose mean ages were 44.67 and 40.84 years, respectively. The newly developed protocol was finalized through a systematic review based on an item analysis. We administered the CAPTBI, comprising nine domains, 22 subcategories, and 57 items. All nine domains of the CAPTBI were found to be significant variables by which individuals with mTBI can be distinguished from normal individuals (p < .001). We also present cut-off points by education level to maximize the validity of differentiating the two groups. This study is the first attempt to evaluate mTBI by means of a cognitive-linguistic protocol with multiple domains. The CAPTBI is an appropriate tool for differentiating the cognitive-pragmatic language abilities of the mTBI and control groups.

  16. Aligning Assessments with State Curriculum Standards and Teaching Strategies

    ERIC Educational Resources Information Center

    Pemberton, Jane B.; Rademacher, Joyce A.; Tyler-Wood, Tandra; Perez Cereijo, Maria Victoria

    2006-01-01

    This article describes the steps of moving from state curriculum standards for writing to selecting and teaching a writing strategy to designing curriculum-based assessments in writing. The relationship between assessment and instruction is strengthened as educators monitor student progress in the state curriculum standards, make sound…

  17. Biosafety assessment protocols for new organisms in New Zealand: Can they apply internationally to emerging technologies?

    SciTech Connect

    Barratt, B.I.P. . E-mail: barbara.barratt@agresearch.co.nz; Moeed, A.; Malone, L.A.

    2006-05-15

    An analysis of established biosafety protocols for release into the environment of exotic plants and biological control agents for weeds and arthropod pests has been carried out to determine whether such protocols can be applied to relatively new and emerging technologies intended for the primary production industries, such as transgenic plants. Example case studies are described to indicate the scope of issues considered by regulators who make decisions on new organism releases. No transgenic plants have been released to date in New Zealand, but two field test approvals are described as examples. An analysis of the biosafety protocols has shown that, while many of the risk criteria considered for decision-making by regulators are similar for all new organisms, a case-by-case examination of risks and potential impacts is required in order to fully assess risk. The value of post-release monitoring and validation of decisions made by regulators is emphasised.

  18. Assessment and risk classification protocol for patients in emergency units

    PubMed Central

    Silva, Michele de Freitas Neves; Oliveira, Gabriela Novelli; Pergola-Marconato, Aline Maino; Marconato, Rafael Silva; Bargas, Eliete Boaventura; Araujo, Izilda Esmenia Muglia

    2014-01-01

    Objective: to develop, validate the content of, and verify the reliability of a risk classification protocol for an emergency unit. Method: the content validation was developed in a university hospital in a city in the interior of the state of São Paulo and was carried out in two stages: the first with the individual assessment of specialists, and the second with a meeting between the researchers and the specialists. The use of the protocol followed a specific guide. Concerning reliability, concordance (equivalence) among observers was used. Results: the protocol developed showed content validity and, after the suggested changes were made, there were excellent results concerning reliability. Conclusion: the assistance flow chart was shown to be easy to use and to facilitate the search for the complaint in each assistance priority. PMID:26107828
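
    Interobserver reliability of a triage classification is often summarized with Cohen's kappa, a standard concordance statistic (the abstract does not name its exact method, so this is an assumption). A stdlib-only sketch with hypothetical triage ratings:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters classifying the same items."""
    assert len(r1) == len(r2)
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n     # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    cats = set(c1) | set(c2)
    pe = sum(c1[c] * c2[c] for c in cats) / n ** 2   # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical risk-classification priorities assigned by two nurses
nurse_a = ["red", "yellow", "yellow", "green", "green", "green", "red", "yellow"]
nurse_b = ["red", "yellow", "green", "green", "green", "green", "red", "yellow"]
print(f"kappa = {cohens_kappa(nurse_a, nurse_b):.2f}")
```

Kappa near 1 indicates agreement well beyond chance; values above roughly 0.8 are conventionally read as excellent, which is the kind of result the abstract reports.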

  19. Chesapeake Bay regions of concern: Geographical targeting protocol for remediation, reduction, prevention and assessment actions

    SciTech Connect

    Batiuk, R.A.

    1994-12-31

    As a result of a two-year reevaluation of a Basinwide Toxics Reduction Strategy, the Chesapeake Bay Program identified the need to more effectively direct reduction and prevention actions toward regional areas with known toxic problems, as well as areas where significant potential exists for toxic impacts on living resources and habitats. Building upon the geographical targeting efforts in the Great Lakes and Puget Sound, a protocol was established for identifying and categorizing areas ranging from known toxic problems, to areas with low probability of adverse effects, to areas with insufficient data. The identification protocol is based on a series of criteria that include evaluation of sediment contaminant concentrations and ambient sediment toxicity. The process for development and application of the Regions of Concern protocol, along with a focus on the sediment assessment criteria and how they influenced the overall categorization of regions, will be presented.

  20. Physical Activity Stories: Assessing the "Meaning Standard" in Physical Education

    ERIC Educational Resources Information Center

    Johnson, Tyler G.

    2016-01-01

    The presence of the "meaning standard" in both national and state content standards suggests that professionals consider it an important outcome of a quality physical education program. However, only 10 percent of states require an assessment to examine whether students achieve this standard. The purpose of this article is to introduce…

  1. 75 FR 66038 - Planning Resource Adequacy Assessment Reliability Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-27

    ... Standard that applies to planning coordinators. E. Provision of Data 26. Proposed Reliability Standard BAL... Federal Energy Regulatory Commission 18 CFR Part 40 Planning Resource Adequacy Assessment Reliability... Regulatory Commission proposes to approve a regional Reliability Standard, BAL-502-RFC-02, Planning...

  2. Under the Cartagena Protocol on Biosafety - Where is the Roadmap for Risk Assessment Taking Us?

    PubMed

    Gaugitsch, Helmut

    2015-01-01

    The paper summarizes the history of the development of the guidance on risk assessment, including the roadmap under the Cartagena Protocol on Biosafety, from 2008 to the present. The aim and the contents of the roadmap for risk assessment of living modified organisms (LMOs) are described, in particular the five steps in the risk assessment process. After several rounds of discussions at the expert and political levels, the guidance, including the roadmap, is currently being revised to take into account the results of an in-depth practical testing process by the Parties, non-Parties, and relevant organizations. The aim is to provide an improved version of the guidance for endorsement and broad support at the next meeting of the Parties to the Cartagena Protocol in December 2016. PMID:26835448

  5. The standardization of nonsterile compounding: a study in quality control and assessment for hormone compounding.

    PubMed

    Wiley, T S; Odegard, R D; Raden, J; Haraldsen, J T

    2014-01-01

    Sterile and nonsterile compounding of medication has attracted much attention over the last few years due to the onset of various infections and negative compounding practices. This paper reports on the standardization of compounded hormones utilizing the Wiley Protocol, which provides nonsynthetic bioidentical estradiol, progesterone, dehydroepiandrosterone, and testosterone in a transdermal topical cream base for women and men in a standardized dosing regimen. Here, we present data from 2008 through 2012 detailing the process of standardization and quality testing of the hormones through submission of random compounded samples for quality control and assessment. Pharmacies delivering the Wiley Protocol were required to follow the same compounding formulation, as well as submit random samples for quarterly testing. Sample concentrations were tested using high-performance liquid chromatography. We found that pharmacies that submitted samples had a 91% passing rating, with a percent of target of 98.6% ± 8.4%. It was also determined that pharmacies that prepared more compounded cream had a higher passing rating than those that prepared limited quantities. We found that standardization across multiple pharmacies could be achieved through quarterly testing of submitted samples by a third-party laboratory when following the necessary procedures as defined by the Wiley Protocol. It was also determined that experience and training were critical factors in the mixing of compounded prescriptions, with high consistency and accuracy supporting patient safety. PMID:24881121
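
    The two quality metrics reported above, mean percent of target and pass rate, can be sketched in a few lines of Python. The potency values and the ±10% acceptance tolerance below are hypothetical illustrations, not the Wiley Protocol's actual limits or data:

```python
def percent_of_target(measured, target):
    """Measured concentration as a percentage of the label-claim target."""
    return 100.0 * measured / target

def qc_summary(samples, target, tolerance=10.0):
    """Mean percent-of-target and pass rate; a sample passes if its
    percent-of-target is within +/- tolerance of 100% (assumed limit)."""
    pots = [percent_of_target(m, target) for m in samples]
    passed = [p for p in pots if abs(p - 100.0) <= tolerance]
    return sum(pots) / len(pots), 100.0 * len(passed) / len(pots)

# Hypothetical estradiol potencies (mg/mL) against a 2.0 mg/mL label claim
samples = [2.02, 1.95, 2.10, 1.88, 2.25, 1.99]
mean_pot, pass_rate = qc_summary(samples, target=2.0)
print(f"mean percent of target = {mean_pot:.1f}%, pass rate = {pass_rate:.1f}%")
```

A third-party laboratory would run this kind of summary each quarter and compare each pharmacy's pass rate against the program-wide figure (91% in the study).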

  6. A Web Resource for Standardized Benchmark Datasets, Metrics, and Rosetta Protocols for Macromolecular Modeling and Design

    PubMed Central

    Pache, Roland A.; Ollikainen, Noah; Kundert, Kale; O'Meara, Matthew J.; Smith, Colin A.; Kortemme, Tanja

    2015-01-01

    The development and validation of computational macromolecular modeling and design methods depend on suitable benchmark datasets and informative metrics for comparing protocols. In addition, if a method is intended to be adopted broadly in diverse biological applications, there needs to be information on appropriate parameters for each protocol, as well as metrics describing the expected accuracy compared to experimental data. In certain disciplines, there exist established benchmarks and public resources where experts in a particular methodology are encouraged to supply their most efficient implementation of each particular benchmark. We aim to provide such a resource for protocols in macromolecular modeling and design. We present a freely accessible web resource (https://kortemmelab.ucsf.edu/benchmarks) to guide the development of protocols for protein modeling and design. The site provides benchmark datasets and metrics to compare the performance of a variety of modeling protocols using different computational sampling methods and energy functions, providing a “best practice” set of parameters for each method. Each benchmark has an associated downloadable benchmark capture archive containing the input files, analysis scripts, and tutorials for running the benchmark. The captures may be run with any suitable modeling method; we supply command lines for running the benchmarks using the Rosetta software suite. We have compiled initial benchmarks for the resource spanning three key areas: prediction of energetic effects of mutations, protein design, and protein structure prediction, each with associated state-of-the-art modeling protocols. With the help of the wider macromolecular modeling community, we hope to expand the variety of benchmarks included on the website and continue to evaluate new iterations of current methods as they become available. PMID:26335248

  7. A Web Resource for Standardized Benchmark Datasets, Metrics, and Rosetta Protocols for Macromolecular Modeling and Design.

    PubMed

    Ó Conchúir, Shane; Barlow, Kyle A; Pache, Roland A; Ollikainen, Noah; Kundert, Kale; O'Meara, Matthew J; Smith, Colin A; Kortemme, Tanja

    2015-01-01

    The development and validation of computational macromolecular modeling and design methods depend on suitable benchmark datasets and informative metrics for comparing protocols. In addition, if a method is intended to be adopted broadly in diverse biological applications, there needs to be information on appropriate parameters for each protocol, as well as metrics describing the expected accuracy compared to experimental data. In certain disciplines, there exist established benchmarks and public resources where experts in a particular methodology are encouraged to supply their most efficient implementation of each particular benchmark. We aim to provide such a resource for protocols in macromolecular modeling and design. We present a freely accessible web resource (https://kortemmelab.ucsf.edu/benchmarks) to guide the development of protocols for protein modeling and design. The site provides benchmark datasets and metrics to compare the performance of a variety of modeling protocols using different computational sampling methods and energy functions, providing a "best practice" set of parameters for each method. Each benchmark has an associated downloadable benchmark capture archive containing the input files, analysis scripts, and tutorials for running the benchmark. The captures may be run with any suitable modeling method; we supply command lines for running the benchmarks using the Rosetta software suite. We have compiled initial benchmarks for the resource spanning three key areas: prediction of energetic effects of mutations, protein design, and protein structure prediction, each with associated state-of-the-art modeling protocols. With the help of the wider macromolecular modeling community, we hope to expand the variety of benchmarks included on the website and continue to evaluate new iterations of current methods as they become available.

  8. Development and application of an assessment protocol for monitoring water quality using benthic macroinvertebrate communities

    SciTech Connect

    Schwartz, J.H.; Dickson, K.L.; Kennedy, J.H.; Waller, W.T.

    1995-12-31

    The inability to accurately assess water quality using benthic macroinvertebrate communities, due to invalid sampling regimes and tenuous assessment endpoints, has led to confusion among the scientific community and the public as to the condition of the nation's surface waters. Identifying a suite of reliable indicators (metrics) and a statistically valid sampling strategy should be a priority. In 1990, the Environmental Protection Agency (EPA) established the Rapid Bioassessment Protocol (RBP) to provide guidance in this area. However, several of these metrics have come under scrutiny of late, with excessive variability and redundancy of information being the major criticisms. This study statistically evaluates the RBP metrics for their overall usefulness as indicators of water quality, using previously compiled data from a reference stream in the ecoregion in which the study site is contained. Endpoints with a high degree of variability and/or an inability to generate unique and pertinent information were not included in the assessment protocol. In addition, power analysis was conducted on these metrics to determine the number of samples necessary to detect differences at ecologically relevant values. The metrics that met the criteria of low variability and the ability to provide unique and pertinent information were then applied to three small urban streams to assess the condition of these systems. We contend that only when a proven assessment protocol is employed, like the one presented here, can benthic macroinvertebrates reliably be used to evaluate water quality.
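
    The power analysis step, determining how many samples are needed to detect an ecologically relevant difference, can be approximated with the standard normal formula n ≈ 2((z_{α/2} + z_β)/d)² per group, where d is the standardized effect size. A sketch assuming a two-sided α of 0.05 and 80% power (the study's actual settings are not stated):

```python
from math import ceil

Z_ALPHA = 1.96  # two-sided alpha = 0.05
Z_BETA = 0.84   # power = 0.80

def n_per_group(effect_size):
    """Normal-approximation sample size per group for a two-sample comparison."""
    return ceil(2 * ((Z_ALPHA + Z_BETA) / effect_size) ** 2)

# e.g. detecting small-to-large standardized shifts in a metric such as
# taxa richness between a reference stream and an impacted stream
for d in (0.5, 0.8, 1.0):
    print(f"d = {d}: n = {n_per_group(d)} samples per group")
```

The inverse relationship is the practical point: a highly variable metric (small d for the same raw difference) demands far more samples, which is exactly why high-variability endpoints were dropped from the protocol.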

  9. Summary Report Panel 1: The Need for Protocols and Standards in Research on Underwater Noise Impacts on Marine Life.

    PubMed

    Erbe, Christine; Ainslie, Michael A; de Jong, Christ A F; Racca, Roberto; Stocker, Michael

    2016-01-01

    As concern about anthropogenic noise and its impacts on marine fauna is increasing around the globe, data are being compared across populations, species, noise sources, geographic regions, and time. However, much of the raw and processed data are not comparable due to differences in measurement methodology, analysis and reporting, and a lack of metadata. Common protocols and more formal, international standards are needed to ensure the effectiveness of research, conservation, regulation and practice, and unambiguous communication of information and ideas. Developing standards takes time and effort, is largely driven by a few expert volunteers, and would benefit from stakeholders' contribution and support. PMID:26611096

  11. 78 FR 14654 - Standards for Business Practices and Communication Protocols for Public Utilities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-07

    .... B. NAESB Phase II Demand Response M&V 14. Standards. 1. Comments 15. a. Adoption of Phase II 15. Demand Response M&V Standards. b. Level of Detail, 19. Standardization, and Best Practices. c. Other Matters 26. 2. Commission Determination...... 33. C. NAESB Energy Efficiency M&V 40. Standards....

  12. Optimization and standardization of an enzyme-linked immunosorbent assay protocol for serodiagnosis of Actinobacillus pleuropneumoniae serotype 5.

    PubMed Central

    Trottier, Y L; Wright, P F; Larivière, S

    1992-01-01

    An indirect enzyme-linked immunosorbent assay protocol has been optimized with special emphasis given to assay standardization and quality control. Technical aspects such as choice of microplate, antigen immobilization, buffer composition, optimal screening dilution of sera, and kinetics of the enzymatic reaction were studied and evaluated in order to design a standard protocol offering maximal analytical sensitivity and specificity, as well as to obtain minimal within- and between-plate variability. Among the 27 plates tested, the Nunc 475-094 and 269-620 immunoplates were found to be the best in terms of high positive-to-negative ratio and low variability. No significant differences in antigen immobilization were found by using buffers of various compositions or pHs; however, the presence of magnesium ions (Mg2+; 0.02 M) resulted in a twofold increase in nonspecific background. An optimal screening dilution of sera was established at 1:200. A 1-h incubation period for the test serum was found to be optimal. Maximum enzymatic activity for peroxidase was obtained by adjusting the substrate (H2O2) and hydrogen donor [2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid)] concentrations to 4 and 1 mM, respectively. To control between-plate variability, a timing protocol was adopted. Within-plate variability was also controlled by using a sample placement configuration pattern. Sliding scales were determined by repeated testing of a cross section of samples to set acceptance limits for both within- and between-plate variability. These limits were used in a quality control program to monitor assay performance. The results obtained suggest that this standardized protocol might be useful in the serodiagnosis of Actinobacillus pleuropneumoniae serotype 5. PMID:1734068
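
    Two of the plate-selection criteria named above, a high positive-to-negative ratio and low between-plate variability, are simple computations over control-well readings. A stdlib-only sketch with hypothetical optical densities (not the study's data):

```python
from statistics import mean, stdev

def p_to_n(pos_od, neg_od):
    """Positive-to-negative optical-density ratio for a plate's control wells."""
    return mean(pos_od) / mean(neg_od)

def cv(values):
    """Coefficient of variation, in percent."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical optical densities for control wells on three plates
plates = {
    "plate1": {"pos": [1.21, 1.18, 1.25], "neg": [0.10, 0.12, 0.11]},
    "plate2": {"pos": [1.30, 1.27, 1.33], "neg": [0.11, 0.10, 0.12]},
    "plate3": {"pos": [1.15, 1.19, 1.17], "neg": [0.12, 0.11, 0.13]},
}
ratios = [p_to_n(w["pos"], w["neg"]) for w in plates.values()]
print(f"P/N ratios: {[round(r, 1) for r in ratios]}")
print(f"between-plate CV of positive controls: "
      f"{cv([mean(w['pos']) for w in plates.values()]):.1f}%")
```

In a running assay, the same CV calculation against a sliding acceptance scale is what flags a plate run for repetition.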

  15. Developing Assessments for the Next Generation Science Standards

    ERIC Educational Resources Information Center

    Pellegrino, James W., Ed.; Wilson, Mark R., Ed.; Koenig, Judith A., Ed.; Beatty, Alexandra S., Ed.

    2014-01-01

    Assessments, understood as tools for tracking what and how well students have learned, play a critical role in the classroom. "Developing Assessments for the Next Generation Science Standards" develops an approach to science assessment to meet the vision of science education for the future as it has been elaborated in "A Framework…

  16. Web-Based Assessment of Physical Education Standards

    ERIC Educational Resources Information Center

    Avery, Marybell

    2012-01-01

    Why would a school district consider implementing a district-wide, web-based assessment of student achievement of physical education standards? Why should any school or school district assume the expense, both in terms of time and money, of adopting an online assessment tool for physical education to assess students' cognitive and motor skills?…

  17. Evaluation, Assessment and Hardware Prototyping of the SpaceWire-D Protocol

    NASA Astrophysics Data System (ADS)

    Tavoularis, Antonis; Pogkas, Nikos; Kollias, Vangelis; Marinis, Kostas

    2013-08-01

    The SpaceWire-D (SpW-D) protocol has been proposed as a method of sending information over a SpaceWire network in a deterministic manner. The need to reduce spacecraft cabling, and subsequently development, test and verification costs by integrating control and data delivery in a single network, and the requirement for deterministic delivery imposed by real time data and control traffic, resulted in the proposal for the SpW-D protocol, which is based on scheduling bounded latency RMAP transactions in time-slots. The main objective of this study, which was carried out in the frame of ESTEC Contract no. 22256/09 (titled “Protocol Validation System (PVS) activity”) was to assess and analyse the SpaceWire-D protocol specification, identify possible problem areas, provide comments and support for its improvement, develop a prototype hardware implementation of it, and perform extensive testing and evaluation in order to identify problems that were not apparent during the theoretical analysis. This paper presents the simulation activities that were performed towards the evaluation of the performance of the SpW-D protocol using a simple network topology as well as a realistic network topology, based on information provided by Thales Alenia Space. It also presents the architecture of the prototype SpW-D IP Core, as well as feedback on the SpW-D draft specification derived from tests performed in a prototype SpW-D implementation on an FPGA platform. The document concludes with proposals for further standardisation of the SpW-D protocol.
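
    The core idea of SpW-D described above, bounding latency by restricting when each initiator may start RMAP transactions, can be sketched as follows. This is a toy model, not the SpW-D specification: the 64-slot count matches the draft standard, but the class and method names are our own and the RMAP transactions are left opaque.

```python
NUM_SLOTS = 64  # SpW-D divides time into 64 slots, signalled by SpaceWire time-codes

class Initiator:
    """Toy model of an SpW-D initiator: RMAP transactions may only be
    started in the time-slots allocated to this node, which is what
    bounds their worst-case latency."""
    def __init__(self, name, allocated_slots):
        self.name = name
        self.allocated = set(allocated_slots)
        self.queue = []            # pending RMAP transactions (opaque here)

    def on_time_code(self, slot):
        # A time-code announces the start of `slot`; start the next
        # queued transaction only if this slot belongs to us.
        if slot % NUM_SLOTS in self.allocated and self.queue:
            return self.queue.pop(0)
        return None

ctrl = Initiator("AOCS", allocated_slots={0, 32})
ctrl.queue += ["read wheel speeds", "write torque command"]
print(ctrl.on_time_code(1))   # None: slot 1 is not allocated to this node
print(ctrl.on_time_code(32))  # "read wheel speeds"
```

    Because each initiator only transmits in its own slots, the worst-case wait for a queued transaction is bounded by the slot schedule, which is the deterministic-delivery property the protocol targets.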

  18. Assessing the Genetics Content in the Next Generation Science Standards.

    PubMed

    Lontok, Katherine S; Zhang, Hubert; Dougherty, Michael J

    2015-01-01

    Science standards have a long history in the United States and currently form the backbone of efforts to improve primary and secondary education in science, technology, engineering, and math (STEM). Although there has been much political controversy over the influence of standards on teacher autonomy and student performance, little light has been shed on how well standards cover science content. We assessed the coverage of genetics content in the Next Generation Science Standards (NGSS) using a consensus list of American Society of Human Genetics (ASHG) core concepts. We also compared the NGSS against state science standards. Our goals were to assess the potential of the new standards to support genetic literacy and to determine if they improve the coverage of genetics concepts relative to state standards. We found that expert reviewers cannot identify ASHG core concepts within the new standards with high reliability, suggesting that the scope of content addressed by the standards may be inconsistently interpreted. Given results that indicate that the disciplinary core ideas (DCIs) included in the NGSS documents produced by Achieve, Inc. clarify the content covered by the standards statements themselves, we recommend that the NGSS standards statements always be viewed alongside their supporting disciplinary core ideas. In addition, gaps exist in the coverage of essential genetics concepts, most worryingly concepts dealing with patterns of inheritance, both Mendelian and complex. Finally, state standards vary widely in their coverage of genetics concepts when compared with the NGSS. On average, however, the NGSS support genetic literacy better than extant state standards.

  20. Field Test Protocol. Standard Internal Load Generation for Unoccupied Test Homes

    SciTech Connect

    Fang, X.; Christensen, D.; Barker, G.; Hancock, E.

    2011-06-01

    This document describes a simple and general way to generate House Simulation Protocol (HSP)-consistent internal sensible and latent loads in unoccupied homes. It is newly updated based on recent experience, and provides instructions on how to calculate and set up the operational profiles in unoccupied homes. The document is split into two sections: how to calculate the internal load magnitude and schedule, and then what tools and methods should be used to generate those internal loads to achieve research goals.
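
    The two steps the document describes, computing the internal load magnitude and schedule, then commanding load-generation equipment to produce it, can be sketched as below. The daily totals and the hourly profile weights are placeholders for illustration, not the actual House Simulation Protocol values.

```python
# Distribute a daily sensible/latent internal-load total over a
# normalized 24-hour profile, as done for unoccupied test homes.
DAILY_SENSIBLE_KWH = 15.0   # hypothetical daily sensible internal load
DAILY_LATENT_KWH = 3.0      # hypothetical daily latent internal load

# Relative hourly weights (night low, morning and evening peaks);
# these are illustrative, not HSP's occupancy-driven profile.
raw = [1, 1, 1, 1, 1, 1, 2, 4, 4, 2, 2, 2,
       2, 2, 2, 3, 5, 7, 7, 6, 5, 4, 3, 2]
profile = [w / sum(raw) for w in raw]   # normalized: sums to 1.0

# Average power (W) to command from the load-generation equipment each hour
hourly_sensible_w = [DAILY_SENSIBLE_KWH * 1000 * p for p in profile]
hourly_latent_w = [DAILY_LATENT_KWH * 1000 * p for p in profile]
```

    The hourly wattages would then drive, e.g., resistance heaters (sensible) and a humidifier (latent) on a timer to emulate occupancy.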

  2. A Field-Based Testing Protocol for Assessing Gross Motor Skills in Preschool Children: The Children's Activity and Movement in Preschool Study Motor Skills Protocol

    ERIC Educational Resources Information Center

    Williams, Harriet G.; Pfeiffer, Karin A.; Dowda, Marsha; Jeter, Chevy; Jones, Shaverra; Pate, Russell R.

    2009-01-01

    The purpose of this study was to develop a valid and reliable tool for use in assessing motor skills in preschool children in field-based settings. The development of the Children's Activity and Movement in Preschool Study Motor Skills Protocol included evidence of its reliability and validity for use in field-based environments as part of large…

  3. Establishing eutrophication assessment standards for four lake regions, China.

    PubMed

    Huo, Shouliang; Ma, Chunzi; Xi, Beidou; Su, Jing; Zan, Fengyu; Ji, Danfeng; He, Zhuoshi

    2013-10-01

    The trophic status assessment of lakes in different lake regions may provide important and fundamental information for lake trophic state classification and eutrophication control. In this study, region-specific lake eutrophication assessment standards were established through a frequency distribution method based on chlorophyll-a concentration. The assessment standards for the oligotrophic state for lakes in the Eastern Plain, Yungui Plateau, Northeast Plain and Mountain, and Mongolia-Xinjiang regions are total phosphorus of 0.068, 0.005, 0.011, and 0.005 mg/L; total nitrogen of 1.00, 0.16, 0.37, and 0.60 mg/L; Secchi depth of 0.60, 8.00, 1.55, and 3.00 m; and COD(Mn) of 2.24, 1.00, 5.11, and 4.00 mg/L, respectively. Moreover, a region-specific comprehensive trophic level index was developed to provide an understandable assessment method for the public. The results indicated that frequency distribution analysis based on chlorophyll-a, combined with the trophic level index, provides a useful metric for assessing lake trophic status. In addition, differences in eutrophication assessment standards among lake regions were analyzed, suggesting that both the sensitivity of algae to nutrients and the standards for assessing trophic status differ significantly across the four lake ecoregions. Lake eutrophication assessment standards should contribute to maximizing the effectiveness of future management strategies to control and minimize lake eutrophication problems.
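
    The frequency-distribution method can be sketched as follows: trophic class boundaries for a region are read off as percentiles of that region's chlorophyll-a frequency distribution. The percentile cut-points and the synthetic data below are illustrative only, not the paper's values.

```python
import numpy as np

def trophic_thresholds(chla_samples, cuts=(10, 30, 60, 90)):
    # Class boundaries as percentiles of the regional chlorophyll-a
    # frequency distribution; the cut-points here are hypothetical.
    return np.percentile(np.asarray(chla_samples, dtype=float), cuts)

rng = np.random.default_rng(0)
chla = rng.lognormal(mean=1.0, sigma=0.8, size=500)   # mg/m^3, right-skewed
oligo, meso, eutro, hyper = trophic_thresholds(chla)
```

    Because the boundaries come from each region's own distribution, lakes in regions with naturally higher algal productivity are not penalized by a single nationwide threshold, which is the motivation for region-specific standards.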

  4. Assessment of welfare of Brazilian and Belgian broiler flocks using the Welfare Quality protocol.

    PubMed

    Tuyttens, F A M; Federici, J F; Vanderhasselt, R F; Goethals, K; Duchateau, L; Sans, E C O; Molento, C F M

    2015-08-01

    The Welfare Quality consortium has proposed a science-based protocol for assessing broiler chicken welfare on farms. Innovative features, such as the focus on animal-based welfare measures and an integration procedure for calculating an overall welfare status, make the protocols particularly suited for comparative studies. These protocols reflect the scientific status up to 2009 but are meant to be updated on the basis of, inter alia, implementation studies. Because few such studies have been done, we applied the Welfare Quality protocol to compare the welfare of broiler flocks in Belgium (representing a typical European Union (EU) country with stringent animal welfare legislation) versus Brazil (the major broiler meat exporter to the EU, with minimal animal welfare legislation). Two trained observers performed broiler Welfare Quality assessments on a total of 22 farms in Belgium and southern Brazil. All of the farms produced for the EU market. Although overall welfare was categorized as 'acceptable' on all farms, many country differences were observed at the level of the welfare principles, criteria, and measures. Brazilian farms obtained higher scores for 3 of the 4 welfare principles: 'good feeding' (P = 0.007), 'good housing' (P < 0.001), and 'good health' (P = 0.005). Four of the 10 welfare criteria scores were, or tended to be, higher on Brazilian than Belgian farms: 'absence of prolonged thirst' (P < 0.001), 'ease of movement' (P < 0.001), 'absence of injuries' (P = 0.002), and 'positive emotional state' (P = 0.055). The only criteria with higher scores for the Belgian farms than their Brazilian counterparts were 'absence of prolonged hunger' (P = 0.048) and 'good human-animal relationship' (P = 0.002). Application of the Welfare Quality protocol has raised several concerns about the validity, reliability, and discriminatory potential of the protocol. The results also call for more research into the effect of animal welfare

  6. TDA Assessment of Recommendations for Space Data System Standards

    NASA Technical Reports Server (NTRS)

    Posner, E. C.; Stevens, R.

    1984-01-01

    NASA is participating in the development of international standards for space data systems. Recommendations for standards thus far developed are assessed. The proposed standards for telemetry coding and packet telemetry provide worthwhile benefit to the DSN; their cost impact to the DSN should be small. Because of their advantage to the NASA space exploration program, their adoption should be supported by TDA, JPL, and OSTDS.

  7. A standardized staining protocol for L1CAM on formalin-fixed, paraffin-embedded tissues using automated platforms.

    PubMed

    Fogel, Mina; Harari, Ayelet; Müller-Holzner, Elisabeth; Zeimet, Alain G; Moldenhauer, Gerhard; Altevogt, Peter

    2014-01-01

    The L1 cell adhesion molecule (L1CAM) is overexpressed in many human cancers and can serve as a biomarker for prognosis in most of them (including type I endometrial carcinomas). Here we provide an optimized immunohistochemical staining procedure for a widely used automated platform (VENTANA™), using commercially available primary antibody and detection reagents. In parallel, we optimized the staining on a semi-automated BioGenex (i6000) immunostainer. These protocols yield good staining results and should provide the basis for reliable, standardized immunohistochemical detection of L1CAM in a variety of malignancies across different laboratories. PMID:24242293

  8. Measuring Up: A Standards and Assessment Benchmarking Report for Massachusetts.

    ERIC Educational Resources Information Center

    Achieve, Inc., Washington, DC.

    At the request of the Massachusetts Department of Education, Achieve, Inc., conducted an evaluation of the state's K-12 mathematics standards and grade 10 Massachusetts Comprehensive Assessment System (MCAS) tests in English language arts and mathematics during the spring and summer of 2001. The state's English Language Arts standards were not…

  9. Assessing the Assessors: JMC Administrators Critique the Nine ACEJMC Standards

    ERIC Educational Resources Information Center

    Reinardy, Scott; Crawford, Jerry, II.

    2013-01-01

    For nearly ninety years, journalism professionals and academics have attempted to develop standards by which to prepare college students for the media industry. For nearly seventy years, the Accrediting Council on Education in Journalism and Mass Communications (ACEJMC) has assessed programs based on its standards. This study surveyed administrators of…

  10. Lexical Analysis of Words on Commonly Used Standardized Spelling Assessments

    ERIC Educational Resources Information Center

    Calhoon, Mary Beth; Masterson, Julie J.

    2011-01-01

    The purpose of this study was to examine the morphological characteristics (i.e., number of morphemes in each word, degree of transparency between a derived morpheme and its root word) and frequency data (i.e., the standard frequency index; SFI) of six commonly used standardized spelling assessments and their alternate forms (when available).…

  11. Peer Review of Assessment Network: Supporting Comparability of Standards

    ERIC Educational Resources Information Center

    Booth, Sara; Beckett, Jeff; Saunders, Cassandra

    2016-01-01

    Purpose: This paper aims to test the need in the Australian higher education (HE) sector for a national network for the peer review of assessment in response to the proposed HE standards framework and propose a sector-wide framework for calibrating and assuring achievement standards, both within and across disciplines, through the establishment of…

  12. A cross-platform survey of CT image quality and dose from routine abdomen protocols and a method to systematically standardize image quality

    NASA Astrophysics Data System (ADS)

    Favazza, Christopher P.; Duan, Xinhui; Zhang, Yi; Yu, Lifeng; Leng, Shuai; Kofler, James M.; Bruesewitz, Michael R.; McCollough, Cynthia H.

    2015-11-01

    Through this investigation we developed a methodology to evaluate and standardize CT image quality from routine abdomen protocols across different manufacturers and models. The influence of manufacturer-specific automated exposure control systems on image quality was directly assessed to standardize performance across a range of patient sizes. We evaluated 16 CT scanners across our health system, including Siemens, GE, and Toshiba models. Using each practice’s routine abdomen protocol, we measured spatial resolution, image noise, and scanner radiation output (CTDIvol). Axial and in-plane spatial resolutions were assessed through slice sensitivity profile (SSP) and modulation transfer function (MTF) measurements, respectively. Image noise and CTDIvol values were obtained for three different phantom sizes. SSP measurements demonstrated a bimodal distribution in slice widths: an average of 6.2 ± 0.2 mm using GE’s ‘Plus’ mode reconstruction setting and 5.0 ± 0.1 mm for all other scanners. MTF curves were similar for all scanners. Average spatial frequencies at the 50%, 10%, and 2% MTF values were 3.24 ± 0.37, 6.20 ± 0.34, and 7.84 ± 0.70 lp/cm, respectively. For all phantom sizes, image noise and CTDIvol varied considerably: 6.5-13.3 HU (noise) and 4.8-13.3 mGy (CTDIvol) for the smallest phantom; 9.1-18.4 HU and 9.3-28.8 mGy for the medium phantom; and 7.8-23.4 HU and 16.0-48.1 mGy for the largest phantom. Using these measurements and benchmark SSP, MTF, and image noise targets, CT image quality can be standardized across a range of patient sizes.
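
    One of the MTF summary metrics reported above, the spatial frequency at which the normalized MTF falls to a given level (50%, 10%, 2%), can be sketched by linear interpolation on a measured curve. The Gaussian MTF below is synthetic and the function name is our own; this is an illustration of the metric, not the paper's analysis code.

```python
import numpy as np

def mtf_frequency(freqs, mtf, level):
    # Normalize the curve to 1 at f = 0, find the first sample at or
    # below `level`, and linearly interpolate the crossing frequency.
    mtf = np.asarray(mtf, dtype=float) / mtf[0]
    idx = int(np.argmax(mtf <= level))
    f0, f1, m0, m1 = freqs[idx - 1], freqs[idx], mtf[idx - 1], mtf[idx]
    return f0 + (m0 - level) * (f1 - f0) / (m0 - m1)

freqs = np.linspace(0, 10, 101)        # spatial frequency axis, lp/cm
mtf = np.exp(-(freqs / 4.5) ** 2)      # synthetic MTF curve
f50 = mtf_frequency(freqs, mtf, 0.50)
f10 = mtf_frequency(freqs, mtf, 0.10)
```

    Comparing f50/f10/f2 across scanners, as the study does, condenses each resolution curve into a few numbers that can be benchmarked against targets.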

  13. 77 FR 24427 - Standards for Business Practices and Communication Protocols for Public Utilities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-24

    ... Paragraph Nos.: I. Background (2); II. Discussion (10); A. NAESB Phase II Demand Response M&V Standards (11): 1. Description (12), 2. Discussion (15); B. NAESB Wholesale Energy Efficiency M&V Standards (20): 1. ...; IV. Information Collection Statement (26); V. Environmental Analysis (32); VI. Regulatory Flexibility Act ...

  14. Design and Evaluation of a Protocol to Assess Electronic Travel Aids for Persons Who Are Visually Impaired

    ERIC Educational Resources Information Center

    Havik, Else M.; Steyvers, Frank J. J. M.; van der Velde, Hanneke; Pinkster, J. Christiaan; Kooijman, Aart C.

    2010-01-01

    This study evaluated a protocol developed to assess how beneficial electronic travel aids are for persons who are visually impaired. Twenty persons with visual impairments used an electronic travel device (Trekker) for six weeks according to the protocol, which proved useful in identifying successful users of the device. (Contains 2…

  15. Leading the Transition from the Alternate Assessment Based on Modified Achievement Standards to the General Assessment

    ERIC Educational Resources Information Center

    Lazarus, Sheryl S.; Rieke, Rebekah

    2013-01-01

    Schools are facing many changes in the ways that teaching, learning, and assessment take place. Most states are moving from individual state standards to the new Common Core State Standards, which will be fewer, higher, and more rigorous than most current state standards. As the next generation of assessments used for accountability are rolled…

  16. [Food Security in Europe: comparison between the "Hygiene Package" and the British Retail Consortium (BRC) & International Food Standard (IFS) protocols].

    PubMed

    Stilo, A; Parisi, S; Delia, S; Anastasi, F; Bruno, G; Laganà, P

    2009-01-01

    The birth of the Hygiene Package and of Reg. CE No 2073/2005 in the food production field signalled a change in Italy. The process started in Italy in 1997 with Legislative Decree No 155 on self-control, but in reality it had already been implemented in the UK in 1990 with the promulgation of the Food Safety Act. This legal act was influenced by basic rules corresponding to the application of HACCP standards. Since 1990 the British chains of distribution (Retailers) have extended this type of responsibility to all aspects of the food line. Out of this growing awareness of the need for greater regulation, a protocol edited by the British Retail Consortium was created in 1998. This protocol acted as a "stamp" of approval for food products and is now known as the BRC Global Food Standard; its fifth version became effective in July 2008. After the birth of the BRC, French and German Retailers also established a practically equivalent standard, perhaps more pertinent to food safety: the International Food Standard (IFS). The new approach is specific to the food field and strictly applies criteria to ensure the "safety, quality and legality" of food products, similarly to ISO 22000:2005 (largely based on past BRC and IFS experience). The new standards aim to create a sort of green list of fully "proper and fit" Suppliers only, given the understandable requirements of Retailers. As we have shown, auditing authorities responsible for carrying out inspections under the Hygiene Package are expected to find these new standards useful. The advantage of streamlining this system is that it allows enterprises to diligently enforce food safety practices without fear of upset or legal consequence, to improve the quality (HACCP) of management and traceability systems, and to reduce waste, reprocessing and withdrawal of products. However, some discordances remain about the interpretation of certain sub-field norms (e.g., water

  17. The spinning task: a new protocol to easily assess motor coordination and resistance in zebrafish.

    PubMed

    Blazina, Ana R; Vianna, Mônica R; Lara, Diogo R

    2013-12-01

    The increasing use of adult zebrafish in behavioral studies has created the need for new and improved protocols. Our investigation sought to evaluate the swimming behavior of zebrafish against a water current using the newly developed Spinning Task. Zebrafish were individually placed in a beaker containing a spinning magnetic stirrer, and their latency to be swept into the whirlpool was recorded. We found that larger fish (>4 cm) and lower rpm decreased the swimming time in the Spinning Task. There was also a dose-related reduction in swimming after acute treatment with haloperidol, valproic acid, clonazepam, and ethanol, drugs which alter coordination. Importantly, at doses that reduced swimming time in the Spinning Task, these drugs influenced absolute turn angle (ethanol increased it and the other drugs decreased it) but had no effect on distance travelled in a regular water tank. These results suggest that the Spinning Task is a useful protocol for adding information to the assessment of zebrafish motor behavior. PMID:24044654

  18. Enhancing the interpretation of stated choice analysis through the application of a verbal protocol assessment

    USGS Publications Warehouse

    Cahill, K.L.; Marion, J.L.; Lawson, S.R.

    2007-01-01

    A stated choice survey was employed to evaluate the relative importance of resource, social, and management attributes by asking visitors to select preferred configurations of these attributes. A verbal protocol assessment was added to examine how respondents interpret and respond to stated choice questions, applied to hikers on a popular trail at Acadia National Park. Results suggest that visitors are sensitive to changes in public access to the trail and its ecological conditions, with level of encounters least important. Verbal protocol results identified considerations made by respondents that provide insight into their evaluations of alternative recreation setting configurations. These insights help clarify issues important to visitors that stated choice results on their own do not reveal.

  19. Spanish Translations of a Standard Assessment Battery for Marital Distress.

    ERIC Educational Resources Information Center

    Mead, D. Eugene; Thurber, Shawn L.; Crane, Brent E.

    2003-01-01

    To better serve the growing number of Spanish-speaking couples and families in the U.S., it is useful to have a battery of instruments to assess the nature of their marital distress. This article presents the standard assessment battery that Brigham Young University uses to evaluate marital distress. (Contains 11 references and 1 table.) (GCP)

  20. Standardizing serum 25-hydroxyvitamin D data from four Nordic population samples using the Vitamin D Standardization Program protocols: Shedding new light on vitamin D status in Nordic individuals.

    PubMed

    Cashman, Kevin D; Dowling, Kirsten G; Škrabáková, Zuzana; Kiely, Mairead; Lamberg-Allardt, Christel; Durazo-Arvizu, Ramon A; Sempos, Christopher T; Koskinen, Seppo; Lundqvist, Annamari; Sundvall, Jouko; Linneberg, Allan; Thuesen, Betina; Husemoen, Lise Lotte N; Meyer, Haakon E; Holvik, Kristin; Grønborg, Ida M; Tetens, Inge; Andersen, Rikke

    2015-11-01

    Knowledge about the distributions of serum 25-hydroxyvitamin D (25(OH)D) concentrations in representative population samples is critical for the quantification of vitamin D deficiency as well as for setting dietary reference values and food-based strategies for its prevention. Such data for the European Union are of variable quality, making it difficult to estimate the prevalence of vitamin D deficiency across member states. As a consequence of the widespread, method-related differences in measurements of serum 25(OH)D concentrations, the Vitamin D Standardization Program (VDSP) developed protocols for standardizing existing serum 25(OH)D data from national surveys around the world. The objective of the present work was to apply the VDSP protocols to existing serum 25(OH)D data from a Danish, a Norwegian, and a Finnish population-based health survey and from a Danish randomized controlled trial (RCT). A specifically selected subset (n = 100-150) of bio-banked serum samples from each of the studies was reanalyzed for 25(OH)D by LC-MS/MS, a calibration equation was developed between the old and new 25(OH)D data, and this equation was applied to the entire dataset from each study. Compared to estimates based on the original serum 25(OH)D data, the percentage with vitamin D deficiency (<30 nmol/L) decreased by 21.5% in the Danish health survey but by only 1.4% in the Norwegian health survey, and was essentially unchanged (0% and 0.2%, respectively) in the Finnish survey and the Danish RCT following VDSP standardization. In conclusion, standardization of serum 25(OH)D concentrations is absolutely necessary in order to compare serum 25(OH)D concentrations across different study populations, which is needed to quantify and prevent vitamin D deficiency.
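
    The re-standardization step described above can be sketched as follows: re-assay a subset of ~100-150 banked samples by LC-MS/MS, fit a calibration line between the original and re-assayed values, and apply it to the full survey. All numbers below are simulated, and a simple OLS fit stands in for whatever regression the VDSP protocols specify.

```python
import numpy as np

def fit_calibration(old_subset, new_subset):
    # Ordinary least-squares line new = slope * old + intercept, fit on
    # the re-assayed subset (a simplification for illustration).
    slope, intercept = np.polyfit(old_subset, new_subset, 1)
    return slope, intercept

rng = np.random.default_rng(1)
true_25ohd = rng.normal(50, 20, 2000).clip(5)          # nmol/L, simulated
old = 1.15 * true_25ohd + 5 + rng.normal(0, 3, 2000)   # biased original assay
subset = rng.choice(2000, size=120, replace=False)     # re-assayed samples
slope, intercept = fit_calibration(old[subset], true_25ohd[subset])
standardized = slope * old + intercept

pct_deficient_before = 100 * np.mean(old < 30)
pct_deficient_after = 100 * np.mean(standardized < 30)
```

    In this simulated survey the original assay reads high, so standardization raises the apparent deficiency prevalence; the direction and size of the shift depend entirely on the original assay's bias, which is why the Danish and Norwegian surveys moved by such different amounts.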

  1. A minimum dataset for a standard adult transthoracic echocardiogram: a guideline protocol from the British Society of Echocardiography

    PubMed Central

    Wharton, Gill; Steeds, Richard; Allen, Jane; Phillips, Hollie; Jones, Richard; Kanagala, Prathap; Lloyd, Guy; Masani, Navroz; Mathew, Thomas; Oxborough, David; Rana, Bushra; Sandoval, Julie; Wheeler, Richard; O'Gallagher, Kevin

    2015-01-01

    There have been significant advances in the field of echocardiography with the introduction of a number of new techniques into standard clinical practice. Consequently, a ‘standard’ echocardiographic examination has evolved to become a more detailed and time-consuming examination that requires a high level of expertise. This Guideline produced by the British Society of Echocardiography (BSE) Education Committee aims to provide a minimum dataset that should be obtained in a comprehensive standard echocardiogram. In addition, the layout proposes a recommended sequence in which to acquire the images. If abnormal pathology is detected, additional views and measurements should be obtained with reference to other BSE protocols when appropriate. Adherence to these recommendations will promote an increased quality of echocardiography and facilitate accurate comparison of studies performed either by different operators or at different departments. PMID:26693316

  2. ASK Standards: Assessment, Skills, and Knowledge Content Standards for Student Affairs Practitioners and Scholars

    ERIC Educational Resources Information Center

    ACPA College Student Educators International, 2011

    2011-01-01

    The Assessment Skills and Knowledge (ASK) standards seek to articulate the areas of content knowledge, skill and dispositions that student affairs professionals need in order to perform as practitioner-scholars to assess the degree to which students are mastering the learning and development outcomes the professionals intend. Consistent with…

  3. Metabolomics Workbench: An international repository for metabolomics data and metadata, metabolite standards, protocols, tutorials and training, and analysis tools.

    PubMed

    Sud, Manish; Fahy, Eoin; Cotter, Dawn; Azam, Kenan; Vadivelu, Ilango; Burant, Charles; Edison, Arthur; Fiehn, Oliver; Higashi, Richard; Nair, K Sreekumaran; Sumner, Susan; Subramaniam, Shankar

    2016-01-01

    The Metabolomics Workbench, available at www.metabolomicsworkbench.org, is a public repository for metabolomics metadata and experimental data spanning various species and experimental platforms, metabolite standards, metabolite structures, protocols, tutorials, and training material and other educational resources. It provides a computational platform to integrate, analyze, track, deposit and disseminate large volumes of heterogeneous data from a wide variety of metabolomics studies including mass spectrometry (MS) and nuclear magnetic resonance spectrometry (NMR) data spanning over 20 different species covering all the major taxonomic categories including humans and other mammals, plants, insects, invertebrates and microorganisms. Additionally, a number of protocols are provided for a range of metabolite classes, sample types, and both MS and NMR-based studies, along with a metabolite structure database. The metabolites characterized in the studies available on the Metabolomics Workbench are linked to chemical structures in the metabolite structure database to facilitate comparative analysis across studies. The Metabolomics Workbench, part of the data coordinating effort of the National Institutes of Health (NIH) Common Fund's Metabolomics Program, provides data from the Common Fund's Metabolomics Resource Cores, metabolite standards, and analysis tools to the wider metabolomics community and seeks data depositions from metabolomics researchers across the world.

  4. Metabolomics Workbench: An international repository for metabolomics data and metadata, metabolite standards, protocols, tutorials and training, and analysis tools

    PubMed Central

    Sud, Manish; Fahy, Eoin; Cotter, Dawn; Azam, Kenan; Vadivelu, Ilango; Burant, Charles; Edison, Arthur; Fiehn, Oliver; Higashi, Richard; Nair, K. Sreekumaran; Sumner, Susan; Subramaniam, Shankar

    2016-01-01

    The Metabolomics Workbench, available at www.metabolomicsworkbench.org, is a public repository for metabolomics metadata and experimental data spanning various species and experimental platforms, metabolite standards, metabolite structures, protocols, tutorials, and training material and other educational resources. It provides a computational platform to integrate, analyze, track, deposit and disseminate large volumes of heterogeneous data from a wide variety of metabolomics studies including mass spectrometry (MS) and nuclear magnetic resonance spectrometry (NMR) data spanning over 20 different species covering all the major taxonomic categories including humans and other mammals, plants, insects, invertebrates and microorganisms. Additionally, a number of protocols are provided for a range of metabolite classes, sample types, and both MS and NMR-based studies, along with a metabolite structure database. The metabolites characterized in the studies available on the Metabolomics Workbench are linked to chemical structures in the metabolite structure database to facilitate comparative analysis across studies. The Metabolomics Workbench, part of the data coordinating effort of the National Institutes of Health (NIH) Common Fund's Metabolomics Program, provides data from the Common Fund's Metabolomics Resource Cores, metabolite standards, and analysis tools to the wider metabolomics community and seeks data depositions from metabolomics researchers across the world. PMID:26467476

  6. Standardized mastery content assessments for predicting NCLEX-RN outcomes.

    PubMed

    Emory, Jan

    2013-01-01

    Nurse educators need predictors of failure for early intervention. This study investigated the predictability of fundamentals, mental health, and pharmacology standardized assessment scores to identify the risk of baccalaureate students' failure on the NCLEX-RN. Using logistic regression, the pharmacology assessment score was predictive with 73.7% accuracy. Use of the pharmacology assessment can assist in early identification of at-risk students in efforts to better prepare for the NCLEX-RN examination.
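    A fitted logistic-regression model of the kind the study describes turns an assessment score into a failure probability via the logistic function, and flags students above a risk threshold. A minimal sketch with hypothetical coefficients (the study's fitted model is not given in the abstract):

    ```python
    import math

    # Hypothetical fitted parameters for illustration only; a real model would
    # be estimated from student cohort data as in the study above.
    INTERCEPT = 6.0      # hypothetical intercept
    COEF_PHARM = -0.1    # hypothetical coefficient per pharmacology assessment point

    def p_failure(pharm_score):
        """Predicted probability of NCLEX-RN failure from a pharmacology score."""
        z = INTERCEPT + COEF_PHARM * pharm_score
        return 1.0 / (1.0 + math.exp(-z))

    def at_risk(pharm_score, threshold=0.5):
        """Flag a student for early intervention if predicted risk exceeds threshold."""
        return p_failure(pharm_score) > threshold
    ```

    Lower assessment scores raise the predicted failure probability, which is the mechanism behind early identification of at-risk students.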

  7. Assessment of an extraction protocol to detect the major mastitis-causing pathogens in bovine milk.

    PubMed

    Cressier, B; Bissonnette, N

    2011-05-01

    Despite all efforts to control its spread, mastitis remains the most costly disease for dairy farmers worldwide. One key component of better control of this disease is identification of the causative bacterial agent during udder infections in cows. Mastitis is complex, however, given the diversity of pathogens that must be identified. Development of a rapid and efficient bacterial species identification tool is thus necessary. This study was conducted to demonstrate the feasibility of bacterial DNA extraction for the automated molecular detection of major mastitis-causing pathogens directly in milk samples to complement traditional microbiological identification. Extraction and detection procedures were designed and optimized to achieve detection in a respectable time frame, at a reasonable cost, and with a high throughput capacity. The following species were identified: Staphylococcus aureus, Escherichia coli, Streptococcus uberis, Streptococcus agalactiae, Streptococcus dysgalactiae, and Klebsiella spp. (including Klebsiella oxytoca and Klebsiella pneumoniae). The detection procedure includes specific genomic DNA amplification by multiplex PCR for each species, separation by capillary electrophoresis, and laser-assisted automated detection. The specificity of the primers was assessed with a panel of bacteria representing mastitis-negative control species. The extraction protocol comprised multiple steps, starting with centrifugation for fat removal, followed by heating in the presence of a cation exchange resin to trap divalent ions. The analytical sensitivity was 100 cfu/mL for milk samples spiked with Staph. aureus, Strep. dysgalactiae, and E. coli, with a tendency for K. pneumoniae. The detection limit was 500 cfu/mL for Strep. uberis and Strep. agalactiae. The overall diagnostic sensitivity (95.4%) and specificity (97.3%) were determined in a double-blind randomized assay by processing 172 clinical milk samples with microbiological characterization as the

  8. Mercury Assessment and Monitoring Protocol for the Bear Creek Watershed, Colusa County, California

    USGS Publications Warehouse

    Suchanek, Thomas H.; Hothem, Roger L.; Rytuba, James J.; Yee, Julie L.

    2010-01-01

    This report summarizes the known information on the occurrence and distribution of mercury (Hg) in physical/chemical and biological matrices within the Bear Creek watershed. Based on these data, a matrix-specific monitoring protocol for the evaluation of the effectiveness of activities designed to remediate Hg contamination in the Bear Creek watershed is presented. The monitoring protocol documents procedures for collecting and processing water, sediment, and biota for estimation of total Hg (TotHg) and monomethyl mercury (MMeHg) in the Bear Creek watershed. The concurrent sampling of TotHg and MMeHg in biota as well as water and sediment from 10 monitoring sites is designed to assess the relative bioavailability of Hg released from Hg sources in the watershed and identify environments conducive to Hg methylation. These protocols are designed to assist landowners, land managers, water quality regulators, and scientists in determining whether specific restoration/mitigation actions lead to significant progress toward achieving water quality goals to reduce Hg in Bear and Sulphur Creeks.

  9. Using broadband spatially resolved NIRS to assess muscle oxygenation during altered running protocols

    NASA Astrophysics Data System (ADS)

    Koukourakis, Georg; Vafiadou, Maria; Steimers, André; Geraskin, Dmitri; Neary, Patrick; Kohl-Bareis, Matthias

    2009-07-01

    We used spatially resolved near-infrared spectroscopy (SRS-NIRS) to assess calf and thigh muscle oxygenation during running on a motor-driven treadmill. Two protocols were used: an incremental speed protocol (velocity = 6-12 km/h, Δv = 2 km/h) was performed in 3-minute stages, while a pacing paradigm alternately modulated step frequency (2.3 Hz [SLow]; 3.3 Hz [SHigh]) at constant velocity for 2 minutes each. An SRS-NIRS broadband system (600-1000 nm) was used to measure total haemoglobin concentration and oxygen saturation (SO2). An accelerometer was placed on the hip joints to measure limb acceleration through the experiment. The data showed that the calf (SO2 58 to 42%) desaturated to a significantly lower level than the thigh (61 to 54%). During the pacing protocol, SO2 was significantly different between the SLow and SHigh trials. Additionally, physiological data as measured by spirometry differed between the SLow and SHigh pacing trials (VO2: 2563 ± 586 vs. 2503 ± 605 mL/min). Significant differences in VO2 at the same workload (speed) indicate alterations in mechanical efficiency. These data suggest that SRS broadband NIRS can be used to discern small changes in muscle oxygenation, making this device useful for metabolic exercise studies in addition to spirometry and movement monitoring by accelerometers.
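    The saturation values reported above follow from the oxy- and deoxyhaemoglobin concentrations that SRS-NIRS resolves: SO2 is the oxygenated fraction of total haemoglobin. A minimal sketch of that relation (function names and numbers are hypothetical illustrations, not the study's processing pipeline):

    ```python
    def tissue_so2(hbo2, hhb):
        """Tissue oxygen saturation (%) from oxy- and deoxyhaemoglobin
        concentrations (same units, e.g. micromolar)."""
        total = hbo2 + hhb
        if total <= 0:
            raise ValueError("total haemoglobin must be positive")
        return 100.0 * hbo2 / total

    def desaturation(so2_rest, so2_exercise):
        """Drop in saturation from rest to exercise, in percentage points."""
        return so2_rest - so2_exercise
    ```

    With hypothetical concentrations summing to 100 µM, `tissue_so2(58, 42)` yields the 58% resting calf value quoted in the abstract, and the drop to 42% is a 16-point desaturation.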

  10. Protocol standards and implementation within the digital engineering laboratory computer network (DELNET) using the universal network interface device (UNID). Part 2

    NASA Astrophysics Data System (ADS)

    Phister, P. W., Jr.

    1983-12-01

    Development of the Air Force Institute of Technology's Digital Engineering Laboratory Network (DELNET) was continued with the development of an initial draft of a protocol standard for all seven layers as specified by the International Standards Organization's (ISO) Reference Model for Open Systems Interconnection. This effort centered on the restructuring of the Network Layer to perform Datagram routing and to conform to the developed protocol standards, and on actual software module development of the upper four protocol layers residing within the DELNET Monitor (Zilog MCZ 1/25 Computer System). Within the guidelines of the ISO Reference Model, the Transport Layer was developed utilizing the Internet Header Format (IHF) combined with the Transmission Control Protocol (TCP) to create a 128-byte Datagram. Also, a limited Application Layer was created to pass the Gettysburg Address through the DELNET. This study formulated a first draft for the DELNET Protocol Standard and designed, implemented, and tested the Network, Transport, and Application Layers to conform to these protocol standards.
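    The 128-byte datagram described above pairs an Internet-style header with TCP-style transport fields. The thesis's actual field layout is not given in the abstract, so the sketch below uses a hypothetical 16-byte header (source, destination, sequence number, payload length, toy checksum) followed by a zero-padded payload:

    ```python
    import struct

    DATAGRAM_SIZE = 128
    HEADER_FMT = ">HHIIL"                        # hypothetical: src, dst, seq, length, checksum
    HEADER_SIZE = struct.calcsize(HEADER_FMT)    # 16 bytes in standard big-endian sizes
    MAX_PAYLOAD = DATAGRAM_SIZE - HEADER_SIZE    # 112 bytes of payload per datagram

    def pack_datagram(src, dst, seq, payload: bytes) -> bytes:
        """Build a fixed-size 128-byte datagram; the payload is zero-padded."""
        if len(payload) > MAX_PAYLOAD:
            raise ValueError("payload exceeds datagram capacity")
        checksum = sum(payload) & 0xFFFFFFFF     # toy checksum, not TCP's ones'-complement sum
        header = struct.pack(HEADER_FMT, src, dst, seq, len(payload), checksum)
        return header + payload.ljust(MAX_PAYLOAD, b"\x00")

    def unpack_datagram(datagram: bytes):
        """Recover addressing fields and the original (unpadded) payload."""
        src, dst, seq, length, _checksum = struct.unpack(HEADER_FMT, datagram[:HEADER_SIZE])
        return src, dst, seq, datagram[HEADER_SIZE:HEADER_SIZE + length]
    ```

    A longer message, such as the Gettysburg Address mentioned in the abstract, would be split across multiple such datagrams and reassembled by sequence number.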

  11. A clinical protocol to increase chewing and assess mastication in children with feeding disorders.

    PubMed

    Volkert, Valerie M; Peterson, Kathryn M; Zeleny, Jason R; Piazza, Cathleen C

    2014-09-01

    Children with feeding disorders often cannot or do not chew when presented with table food. Children with chewing deficits also often swallow the bite before masticating it appropriately, which we will refer to as early swallowing. In the current study, we evaluated a clinical protocol to increase chews per bite, assess mastication, and eliminate early swallowing with three children with feeding disorders. The current study adds to a small body of literature on chewing and mastication of children with feeding disorders. Suggestions for future research are also discussed. PMID:24902589

  12. 75 FR 20901 - Standards for Business Practices and Communication Protocols for Public Utilities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-22

    ... Electric Quadrant of the North American Energy Standards Board (NAESB) to categorize various demand... services in wholesale electric energy markets. This rule ensures that participants in wholesale energy... wholesale electric energy markets. We also take this opportunity to update 18 CFR 38.2(b) to reflect...

  13. Recommended volumetric capacity definitions and protocols for accurate, standardized and unambiguous metrics for hydrogen storage materials

    NASA Astrophysics Data System (ADS)

    Parilla, Philip A.; Gross, Karl; Hurst, Katherine; Gennett, Thomas

    2016-03-01

    The ultimate goal of the hydrogen economy is the development of hydrogen storage systems that meet or exceed the US DOE's goals for onboard storage in hydrogen-powered vehicles. In order to develop new materials to meet these goals, it is extremely critical to accurately, uniformly and precisely measure materials' properties relevant to the specific goals. Without this assurance, such measurements are not reliable and, therefore, do not provide a benefit toward the work at hand. In particular, capacity measurements for hydrogen storage materials must be based on valid and accurate results to ensure proper identification of promising materials for further development. Volumetric capacity determinations are becoming increasingly important for identifying promising materials, yet there exists controversy on how such determinations are made and whether such determinations are valid due to differing methodologies to count the hydrogen content. These issues are discussed herein, and we show mathematically that capacity determinations can be made rigorously and unambiguously if the constituent volumes are well defined and measurable in practice. It is widely accepted that this occurs for excess capacity determinations, and we show here that this can happen for the total capacity determination. Because the adsorption volume is undefined, the absolute capacity determination remains imprecise. Furthermore, we show that there is a direct relationship between determining the respective capacities and the calibration constants used for the manometric and gravimetric techniques. Several suggested volumetric capacity figures of merit are defined and discussed, and reporting requirements are recommended. Finally, an example is provided to illustrate these protocols and concepts.
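    The distinction drawn above between excess and total capacity hinges on well-defined constituent volumes: the total amount stored is the excess adsorbed amount plus the gas-phase hydrogen occupying the measurable void volume. A minimal numeric sketch of that accounting (function names and values are illustrative, not the paper's):

    ```python
    def total_capacity(excess_g, gas_density_g_per_L, void_volume_L):
        """Total stored hydrogen (g): excess adsorbed amount plus gas-phase
        hydrogen at the prevailing density filling the void volume."""
        return excess_g + gas_density_g_per_L * void_volume_L

    def volumetric_capacity(total_g, system_volume_L):
        """Volumetric figure of merit: grams of H2 per litre of system volume."""
        return total_g / system_volume_L
    ```

    For example, a hypothetical sample holding 5.0 g excess with 0.5 L of void volume at a gas density of 8.0 g/L stores 9.0 g in total; dividing by the system volume gives the volumetric figure of merit the abstract recommends reporting.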

  14. Diagnostic accuracy of refractometer and Brix refractometer to assess failure of passive transfer in calves: protocol for a systematic review and meta-analysis.

    PubMed

    Buczinski, S; Fecteau, G; Chigerwe, M; Vandeweerd, J M

    2016-06-01

    Calves are highly dependent on colostrum (and antibody) intake because they are born agammaglobulinemic. The transfer of passive immunity in calves can be assessed directly by measuring immunoglobulin G (IgG) concentrations or indirectly by refractometry or Brix refractometry; the latter are easier to perform routinely in the field. This paper presents a protocol for a systematic review and meta-analysis to assess the diagnostic accuracy of refractometry and Brix refractometry versus IgG measurement as the reference standard test. With this review protocol, we aim to report refractometer and Brix refractometer accuracy in terms of sensitivity and specificity, and to quantify the impact of study characteristics on test accuracy. PMID:27427188
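    Each study in such a meta-analysis contributes a 2x2 table of refractometer results against the IgG reference standard, from which the per-study sensitivity and specificity are computed. A minimal sketch with hypothetical counts:

    ```python
    def sensitivity(tp, fn):
        """Proportion of true failure-of-passive-transfer cases the test detects."""
        return tp / (tp + fn)

    def specificity(tn, fp):
        """Proportion of adequate-transfer calves the test correctly rules out."""
        return tn / (tn + fp)
    ```

    With hypothetical counts of 90 true positives, 10 false negatives, 80 true negatives, and 20 false positives, a study would contribute 90% sensitivity and 80% specificity; the meta-analysis then pools such pairs across studies.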

  15. Revised Recommendations of the Consortium of MS Centers Task Force for a Standardized MRI Protocol and Clinical Guidelines for the Diagnosis and Follow-Up of Multiple Sclerosis.

    PubMed

    Traboulsee, A; Simon, J H; Stone, L; Fisher, E; Jones, D E; Malhotra, A; Newsome, S D; Oh, J; Reich, D S; Richert, N; Rammohan, K; Khan, O; Radue, E-W; Ford, C; Halper, J; Li, D

    2016-03-01

    An international group of neurologists and radiologists developed revised guidelines for standardized brain and spinal cord MR imaging for the diagnosis and follow-up of MS. A brain MR imaging with gadolinium is recommended for the diagnosis of MS. A spinal cord MR imaging is recommended if the brain MR imaging is nondiagnostic or if the presenting symptoms are at the level of the spinal cord. A follow-up brain MR imaging with gadolinium is recommended to demonstrate dissemination in time and ongoing clinically silent disease activity while on treatment, to evaluate unexpected clinical worsening, to re-assess the original diagnosis, and as a new baseline before starting or modifying therapy. A routine brain MR imaging should be considered every 6 months to 2 years for all patients with relapsing MS. The brain MR imaging protocol includes 3D T1-weighted, 3D T2-FLAIR, 3D T2-weighted, post-single-dose gadolinium-enhanced T1-weighted sequences, and a DWI sequence. The progressive multifocal leukoencephalopathy surveillance protocol includes FLAIR and DWI sequences only. The spinal cord MR imaging protocol includes sagittal T1-weighted and proton attenuation, STIR or phase-sensitive inversion recovery, axial T2- or T2*-weighted imaging through suspicious lesions, and, in some cases, postcontrast gadolinium-enhanced T1-weighted imaging. The clinical question being addressed should be provided in the requisition for the MR imaging. The radiology report should be descriptive, with results referenced to previous studies. MR imaging studies should be permanently retained and available. The current revision incorporates new clinical information and imaging techniques that have become more available. PMID:26564433

  17. Revised Recommendations of the Consortium of MS Centers Task Force for a Standardized MRI Protocol and Clinical Guidelines for the Diagnosis and Follow-Up of Multiple Sclerosis

    PubMed Central

    Traboulsee, A.; Simon, J.H.; Stone, L.; Fisher, E.; Jones, D.E.; Malhotra, A.; Newsome, S.D.; Oh, J.; Reich, D.S.; Richert, N.; Rammohan, K.; Khan, O.; Radue, E.-W.; Ford, C.; Halper, J.; Li, D.

    2016-01-01

    SUMMARY An international group of neurologists and radiologists developed revised guidelines for standardized brain and spinal cord MR imaging for the diagnosis and follow-up of MS. A brain MR imaging with gadolinium is recommended for the diagnosis of MS. A spinal cord MR imaging is recommended if the brain MR imaging is nondiagnostic or if the presenting symptoms are at the level of the spinal cord. A follow-up brain MR imaging with gadolinium is recommended to demonstrate dissemination in time and ongoing clinically silent disease activity while on treatment, to evaluate unexpected clinical worsening, to re-assess the original diagnosis, and as a new baseline before starting or modifying therapy. A routine brain MR imaging should be considered every 6 months to 2 years for all patients with relapsing MS. The brain MR imaging protocol includes 3D T1-weighted, 3D T2-FLAIR, 3D T2-weighted, post-single-dose gadolinium-enhanced T1-weighted sequences, and a DWI sequence. The progressive multifocal leukoencephalopathy surveillance protocol includes FLAIR and DWI sequences only. The spinal cord MR imaging protocol includes sagittal T1-weighted and proton attenuation, STIR or phase-sensitive inversion recovery, axial T2- or T2*-weighted imaging through suspicious lesions, and, in some cases, postcontrast gadolinium-enhanced T1-weighted imaging. The clinical question being addressed should be provided in the requisition for the MR imaging. The radiology report should be descriptive, with results referenced to previous studies. MR imaging studies should be permanently retained and available. The current revision incorporates new clinical information and imaging techniques that have become more available. PMID:26564433

  18. Environmental assessment. Energy efficiency standards for consumer products

    SciTech Connect

    McSwain, Berah

    1980-06-01

    The Energy Policy and Conservation Act of 1975 requires DOE to prescribe energy efficiency standards for 13 consumer products. The Consumer Products Efficiency Standards (CPES) program covers: refrigerators and refrigerator-freezers, freezers, clothes dryers, water heaters, room air conditioners, home heating equipment, kitchen ranges and ovens, central air conditioners (cooling and heat pumps), furnaces, dishwashers, television sets, clothes washers, and humidifiers and dehumidifiers. This Environmental Assessment evaluates the potential environmental and socioeconomic impacts expected as a result of setting efficiency standards for all of the consumer products covered by the CPES program. DOE has proposed standards for eight of the products covered by the Program in a Notice of Proposed Rulemaking (NOPR). DOE expects to propose standards for home heating equipment, central air conditioners (heat pumps only), dishwashers, television sets, clothes washers, and humidifiers and dehumidifiers in 1981. No significant adverse environmental or socioeconomic impacts have been found to result from instituting the CPES.

  19. Towards a scalable, open-standards service for brokering cross-protocol data transfers across multiple sources and sinks.

    PubMed

    Meredith, David; Crouch, Stephen; Galang, Gerson; Jiang, Ming; Nguyen, Hung; Turner, Peter

    2010-09-13

    Data Transfer Service (DTS) is an open-source project that is developing a document-centric message model for describing a bulk data transfer activity, with an accompanying set of loosely coupled and platform-independent components for brokering the transfer of data between a wide range of (potentially incompatible) storage resources as scheduled, fault-tolerant batch jobs. The architecture scales from small embedded deployments on a single computer to large distributed deployments through an expandable 'worker-node pool' controlled through message-orientated middleware. Data access and transfer efficiency are maximized through the strategic placement of worker nodes at or between particular data sources/sinks. The design is inherently asynchronous, and, when third-party transfer is not available, it side-steps the bandwidth, concurrency and scalability limitations associated with buffering bytes directly through intermediary client applications. It aims to address geographical-topological deployment concerns by allowing service hosting to be either centralized (as part of a shared service) or confined to a single institution or domain. Established design patterns and open-source components are coupled with a proposal for a document-centric and open-standards-based messaging protocol. As part of the development of the message protocol, a bulk data copy activity document is proposed for the first time.
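    A document-centric "bulk data copy activity" message of the kind proposed would name a source, a sink, and job metadata in a single self-describing document handed to the broker via the message-oriented middleware. The field names below are hypothetical illustrations, not the DTS project's actual schema:

    ```python
    import json

    def make_copy_activity(source_uri, sink_uri, job_id):
        """Build a minimal data-copy activity document for broker submission.
        Field names are hypothetical; a real DTS document would follow the
        project's own open-standards schema."""
        return {
            "jobId": job_id,
            "dataTransfer": {
                "source": {"uri": source_uri},
                "sink": {"uri": sink_uri},
            },
            "retryOnFailure": True,  # transfers run as fault-tolerant batch jobs
        }

    # Hypothetical endpoints: a worker node placed near either resource would
    # pick this job up from the middleware queue and perform the transfer.
    doc = make_copy_activity("gridftp://siteA/data/run1.tar",
                             "sftp://siteB/archive/", "job-001")
    message = json.dumps(doc)
    ```

    Because the document fully describes the activity, any worker node in the pool can execute it, which is what lets the architecture scale from a single machine to a distributed deployment.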

  20. Syntheses of isoxazoline-carbocyclic nucleosides and their antiviral evaluation: a standard protocol.

    PubMed

    Quadrelli, Paolo; Vazquez Martinez, Naiara; Scrocchi, Roberto; Corsaro, Antonino; Pistarà, Venerando

    2014-01-01

    The current synthesis of racemic purine and pyrimidine isoxazoline-carbocyclic nucleosides is reported, detailing the key steps for standard and reliable preparations. Improved yields were obtained by the proper tuning of the single synthetic steps, opening the way for the preparation of a variety of novel compounds. Some of the obtained compounds were also evaluated against a wide variety of DNA and RNA viruses including HIV. No specific antiviral activity was observed in the cases at hand. Novel compounds were prepared for future biological tests.

  1. Syntheses of Isoxazoline-Carbocyclic Nucleosides and Their Antiviral Evaluation: A Standard Protocol

    PubMed Central

    Quadrelli, Paolo; Vazquez Martinez, Naiara; Scrocchi, Roberto; Corsaro, Antonino; Pistarà, Venerando

    2014-01-01

    The current synthesis of racemic purine and pyrimidine isoxazoline-carbocyclic nucleosides is reported, detailing the key steps for standard and reliable preparations. Improved yields were obtained by the proper tuning of the single synthetic steps, opening the way for the preparation of a variety of novel compounds. Some of the obtained compounds were also evaluated against a wide variety of DNA and RNA viruses including HIV. No specific antiviral activity was observed in the cases at hand. Novel compounds were prepared for future biological tests. PMID:25544956

  2. The Acute Asthma Severity Assessment Protocol (AASAP) study: objectives and methods of a study to develop an acute asthma clinical prediction rule.

    PubMed

    Arnold, Donald H; Gebretsadik, Tebeb; Abramo, Thomas J; Sheller, James R; Resha, Donald J; Hartert, Tina V

    2012-06-01

    Acute asthma exacerbations are one of the most common reasons for paediatric emergency department visits and hospitalisations, and a relapse frequently necessitates repeat urgent care. While care plans exist, there are no acute asthma prediction rules (APRs) to assess severity and predict outcome. The primary objective of the Acute Asthma Severity Assessment Protocol study is to develop a multivariable APR for acute asthma exacerbations in paediatric patients. A prospective, convenience sample of paediatric patients aged 5-17 years with acute asthma exacerbations who present to an urban, academic, tertiary paediatric emergency department was enrolled. The study protocol and data analysis plan conform to accepted biostatistical and clinical standards for clinical prediction rule development. Modelling of the APR will be performed once the entire sample size of 1500 has accrued. It is anticipated that the APR will improve resource utilisation in the emergency department, aid in standardisation of disease assessment and allow physician and non-physician providers to participate in earlier objective decision making. The objective of this report is to describe the study objectives and detailed methodology of the Acute Asthma Severity Assessment Protocol study.

  3. NASA Standard for Models and Simulations: Credibility Assessment Scale

    NASA Technical Reports Server (NTRS)

    Babula, Maria; Bertch, William J.; Green, Lawrence L.; Hale, Joseph P.; Mosier, Gary E.; Steele, Martin J.; Woods, Jody

    2009-01-01

    As one of its many responses to the 2003 Space Shuttle Columbia accident, NASA decided to develop a formal standard for models and simulations (M&S). Work commenced in May 2005. An interim version was issued in late 2006. This interim version underwent considerable revision following an extensive Agency-wide review in 2007, along with some additional revisions as a result of the review by the NASA Engineering Management Board (EMB) in the first half of 2008. Issuance of the revised, permanent version, hereafter referred to as the M&S Standard or just the Standard, occurred in July 2008. Bertch, Zang, and Steele provided a summary review of the development process of this standard up through the start of the review by the EMB. A thorough recount of the entire development process, major issues, key decisions, and all review processes is available in Ref. v. This is the second of a pair of papers providing a summary of the final version of the Standard. Its focus is the Credibility Assessment Scale, a key feature of the Standard, including an example of its application to a real-world M&S problem for the James Webb Space Telescope. The companion paper summarizes the overall philosophy of the Standard and an overview of the requirements. Verbatim quotes from the Standard are integrated into the text of this paper, and are indicated by quotation marks.

  4. Addressing Standards and Assessments on State IEP Forms. Synthesis Report.

    ERIC Educational Resources Information Center

    Thompson, Sandra J.; Thurlow, Martha L.; Quenemoen, Rachel F.; Esler, Amy; Whetstone, Patti

    A study examined state Individualized Education Program (IEP) forms to determine the extent to which they include documentation of standards and assessments. All 50 states were asked to send their IEP Forms and to indicate whether they were required, recommended, or simply sample forms. Out of the 41 states with IEP forms, only 5 states…

  5. Assessment and Maintenance of Standards in Vocational Education Down Under.

    ERIC Educational Resources Information Center

    Navaratnam, K. K.

    A study examined assessment of students in the Technical and Further Education (TAFE) systems in Australia with reference to moderation processes and the maintenance of standards. Data were obtained through a multiple case-study technique, using visits to six of the eight major TAFE systems; focus group interviews were held with selected…

  6. Examination of Curricula, Teaching Practices, and Assessment through National Standards

    ERIC Educational Resources Information Center

    Chen, Weiyun

    2005-01-01

    This study examined to what degree the existing curricula, teaching practices, and assessments in 15 elementary physical education programs were aligned with the National Standards for Physical Education (NASPE, 1995) in the USA. Fifteen elementary physical education teachers voluntarily participated in this study. Data were gathered through…

  7. An agenda for assessing and improving conservation impacts of sustainability standards in tropical agriculture.

    PubMed

    Milder, Jeffrey C; Arbuthnot, Margaret; Blackman, Allen; Brooks, Sharon E; Giovannucci, Daniele; Gross, Lee; Kennedy, Elizabeth T; Komives, Kristin; Lambin, Eric F; Lee, Audrey; Meyer, Daniel; Newton, Peter; Phalan, Ben; Schroth, Götz; Semroc, Bambi; Van Rikxoort, Henk; Zrust, Michal

    2015-04-01

    Sustainability standards and certification serve to differentiate and provide market recognition to goods produced in accordance with social and environmental good practices, typically including practices to protect biodiversity. Such standards have seen rapid growth, including in tropical agricultural commodities such as cocoa, coffee, palm oil, soybeans, and tea. Given the role of sustainability standards in influencing land use in hotspots of biodiversity, deforestation, and agricultural intensification, much could be gained from efforts to evaluate and increase the conservation payoff of these schemes. To this end, we devised a systematic approach for monitoring and evaluating the conservation impacts of agricultural sustainability standards and for using the resulting evidence to improve the effectiveness of such standards over time. The approach is oriented around a set of hypotheses and corresponding research questions about how sustainability standards are predicted to deliver conservation benefits. These questions are addressed through data from multiple sources, including basic common information from certification audits; field monitoring of environmental outcomes at a sample of certified sites; and rigorous impact assessment research based on experimental or quasi-experimental methods. Integration of these sources can generate time-series data that are comparable across sites and regions and provide detailed portraits of the effects of sustainability standards. To implement this approach, we propose new collaborations between the conservation research community and the sustainability standards community to develop common indicators and monitoring protocols, foster data sharing and synthesis, and link research and practice more effectively. As the role of sustainability standards in tropical land-use governance continues to evolve, robust evidence on the factors contributing to effectiveness can help to ensure that such standards are designed and

  9. Using a clinical protocol for orthognathic surgery and assessing a 3-dimensional virtual approach: current therapy.

    PubMed

    Quevedo, Luis A; Ruiz, Jessica V; Quevedo, Cristobal A

    2011-03-01

    Oral and maxillofacial surgeons who perform orthognathic surgery face major changes in their practices, and these challenges will increase in the near future, because the extraordinary advances in technology applied to our profession are not only amazing but are becoming the standard of care as they promote improved outcomes for our patients. Orthognathic surgery is one of the favorite areas within the scope of practice of an oral and maxillofacial surgeon. Our own practice has completed over 1,000 orthognathic surgeries. Success is directly related to the consistency and capability of the surgical-orthodontic team to achieve predictable, stable results, and our hypothesis is that a successful result is directly related to the way we take our records and perform diagnosis and treatment planning following basic general principles. Now that we have the opportunity to plan and treat 3-dimensional (3D) problems with 3D technology, we should enter into this new era with appropriate standards to ensure better results, instead of simply enjoying these new tools, which will clearly show not only us but everyone what we do when we perform orthognathic surgery. Appropriate principles need to be taken into account when implementing this new technology. In other words, new technology is welcome, but we do not have to reinvent the wheel. The purpose of this article is to review the current protocol that we use for orthognathic surgery and compare it with published protocols that incorporate new 3D and virtual technology. This report also describes our approach to this new technology.

  10. Feasibility of an intracranial EEG-fMRI protocol at 3T: risk assessment and image quality.

    PubMed

    Boucousis, Shannon M; Beers, Craig A; Cunningham, Cameron J B; Gaxiola-Valdez, Ismael; Pittman, Daniel J; Goodyear, Bradley G; Federico, Paolo

    2012-11-15

    Integrating intracranial EEG (iEEG) with functional MRI (iEEG-fMRI) may help elucidate mechanisms underlying the generation of seizures. However, the introduction of iEEG electrodes in the MR environment has inherent risk and data quality implications that require consideration prior to clinical use. Previous studies of subdural and depth electrodes have confirmed low risk under specific circumstances at 1.5T and 3T. However, no studies have assessed risk and image quality related to the feasibility of a full iEEG-fMRI protocol. To this end, commercially available platinum subdural grid/strip electrodes (4×5 grid or 1×8 strip) and 4 or 6-contact depth electrodes were secured to the surface of a custom-made phantom mimicking the conductivity of the human brain. Electrode displacement, temperature increase of electrodes and surrounding phantom material, and voltage fluctuations in electrode contacts were measured in a GE Discovery MR750 3T MR scanner during a variety of imaging sequences, typical of an iEEG-fMRI protocol. An electrode grid was also used to quantify the spatial extent of susceptibility artifact. The spatial extent of susceptibility artifact in the presence of an electrode was also assessed for typical imaging parameters that maximize BOLD sensitivity at 3T (TR=1500 ms; TE=30 ms; slice thickness=4 mm; matrix=64×64; field-of-view=24 cm). Under standard conditions, all electrodes exhibited no measurable displacement and no clinically significant temperature increase (<1°C) during scans employed in a typical iEEG-fMRI experiment, including 60 min of continuous fMRI. However, high SAR sequences, such as fast spin-echo (FSE), produced significant heating in almost all scenarios (>2.0°C) that in some cases exceeded 10°C. Induced voltages in the frequency range that could elicit neuronal stimulation (<10 kHz) were well below the threshold of 100 mV. fMRI signal intensity was significantly reduced within 20 mm of the electrodes for the imaging parameters

  11. Psychosocial Assessment as a Standard of Care in Pediatric Cancer.

    PubMed

    Kazak, Anne E; Abrams, Annah N; Banks, Jaime; Christofferson, Jennifer; DiDonato, Stephen; Grootenhuis, Martha A; Kabour, Marianne; Madan-Swain, Avi; Patel, Sunita K; Zadeh, Sima; Kupst, Mary Jo

    2015-12-01

    This paper presents the evidence for a standard of care for psychosocial assessment in pediatric cancer. An interdisciplinary group of investigators utilized EBSCO, PubMed, PsycINFO, Ovid, and Google Scholar search databases, focusing on five areas: youth/family psychosocial adjustment, family resources, family/social support, previous history/premorbid functioning, and family structure/function. Descriptive quantitative studies, systematic reviews, and meta-analyses (n = 149) were reviewed and evaluated using grading of recommendations, assessment development, and evaluation (GRADE) criteria. There is high quality evidence to support a strong recommendation for multifaceted, systematic assessments of psychosocial health care needs of youth with cancer and their families as a standard of care in pediatric oncology.

  12. A Video Recording and Viewing Protocol for Student Group Presentations: Assisting Self-Assessment through a Wiki Environment

    ERIC Educational Resources Information Center

    Barry, Shane

    2012-01-01

    The purpose of this research was to firstly develop a protocol for video recording student group oral presentations, for later viewing and self-assessment by student group members. Secondly, evaluations of students' experiences of this process were undertaken to determine if this self-assessment method was a positive experience for them in gaining…

  13. Assessing Overwater Structure-Related Predation on Juvenile Salmon: A Field Study and Protocol for Weighing the Evidence

    SciTech Connect

    Williams, Greg D.; Thom, Ronald M.; Southard, John A.; Sargeant, Susan L.; Shreffler, David K.; Stamey, Mark T.

    2004-02-03

    Large overwater structures have often been cited as potential migratory barriers and areas of increased predation for juvenile salmon migrating along shallow shoreline habitats, although conclusive evidence has not been demonstrated to date in situ. To help resolve this issue, Washington State Ferries (WSF) sponsored directed research to determine whether WSF terminals affect predation on juvenile salmon. We used a combination of standardized surveys, stomach content analyses, and new observational technologies to assess fish, avian, and mammal predation on salmon fry at ferry terminals and paired reference sites during periods of pre- (early April) and peak (May) outmigration. We observed no significant aggregation of potential bird or mammal predators at six ferry terminal study sites. Few potential fish predators were documented in SCUBA surveys, beach seines, or with a Dual frequency IDentification SONar (DIDSON) camera at Mukilteo, our single underwater study location. Only one instance of salmon predation by fish (staghorn sculpin, Leptocottus armatus) was confirmed, and this was at the corresponding reference site. A tiered protocol (Minimum/Recommended/Preferred actions) was developed for assessing potential predation at other overwater structures. Likewise, recommendations were developed for incorporating design features into WSF terminal improvement projects that could minimize future impacts.

  14. A new protocol for ecotoxicological assessment of seawater using nauplii of Tisbe biminiensis (Copepoda:Harpacticoida).

    PubMed

    Lavorante, Beatriz R B O; Oliveira, Deloar D; Costa, Bruno V M; Souza-Santos, Lília P

    2013-09-01

    Copepods are largely used in toxicity tests. The nauplii of these organisms are more sensitive to contaminants than the adult stage. The aim of the present study was to test a protocol for the use of nauplii of the copepod Tisbe biminiensis in the ecotoxicological assessment of seawater. The sensitivity of these organisms to zinc sulphate (ZnSO4·7H2O) was also determined. The following conditions were established for the protocol based on the best development of nauplii to copepodites: 72-h duration, the microalga Chaetoceros gracilis at 2.5×10⁵ cells mL⁻¹ as feed and incubation temperature of 28°C. In the zinc sulphate sensitivity tests, the EC50 and LC50 at 72 h were 3.25±0.59 mg L⁻¹ and 3.46±0.72 mg L⁻¹, respectively, as estimated by the final number of copepodites and total number of live animals in relation to the mean number of inoculated nauplii. The estimated NOEC was 2.0 mg L⁻¹. The test developed is fast and not labour intensive. T. biminiensis nauplii exhibit sensitivity to zinc sulphate similar to that of other species of copepods employed in water toxicity tests, demonstrating the usefulness of these organisms in ecotoxicological studies involving samples of environmental seawater. PMID:23769123
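
An EC50 of this kind is the concentration at which the measured response falls to half the control value. A minimal sketch of one common estimation approach, linear interpolation between bracketing observations, using entirely hypothetical concentration-response data (not the study's raw data):

```python
# Hypothetical concentration-response data (mg/L zinc sulphate vs. fraction
# of nauplii reaching the copepodite stage, normalized to controls).
doses = [0.0, 1.0, 2.0, 3.0, 4.0, 6.0]
response = [1.00, 0.95, 0.88, 0.55, 0.35, 0.10]

def ec50_by_interpolation(doses, response):
    """Linearly interpolate the dose at which the response falls to 50%
    of the control value (a simple alternative to probit/logit fitting)."""
    pairs = list(zip(doses, response))
    for (d0, r0), (d1, r1) in zip(pairs, pairs[1:]):
        if r0 >= 0.5 >= r1:
            return d0 + (r0 - 0.5) / (r0 - r1) * (d1 - d0)
    raise ValueError("response never crosses 50% of control")

ec50 = ec50_by_interpolation(doses, response)  # 3.25 mg/L for this data
```

Regulatory-grade analyses usually fit a parametric dose-response curve instead, but interpolation makes the definition of the endpoint explicit.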

  15. Performance assessment of time-domain optical brain imagers, part 1: basic instrumental performance protocol.

    PubMed

    Wabnitz, Heidrun; Taubert, Dieter Richard; Mazurenka, Mikhail; Steinkellner, Oliver; Jelzow, Alexander; Macdonald, Rainer; Milej, Daniel; Sawosz, Piotr; Kacprzak, Michał; Liebert, Adam; Cooper, Robert; Hebden, Jeremy; Pifferi, Antonio; Farina, Andrea; Bargigia, Ilaria; Contini, Davide; Caffini, Matteo; Zucchelli, Lucia; Spinelli, Lorenzo; Cubeddu, Rinaldo; Torricelli, Alessandro

    2014-08-01

    Performance assessment of instruments devised for clinical applications is of key importance for validation and quality assurance. Two new protocols were developed and applied to facilitate the design and optimization of instruments for time-domain optical brain imaging within the European project nEUROPt. Here, we present the "Basic Instrumental Performance" protocol for direct measurement of relevant characteristics. Two tests are discussed in detail. First, the responsivity of the detection system is a measure of the overall efficiency to detect light emerging from tissue. For the related test, dedicated solid slab phantoms were developed and quantitatively spectrally characterized to provide sources of known radiance with nearly Lambertian angular characteristics. The responsivity of four time-domain optical brain imagers was found to be of the order of 0.1 m² sr. The relevance of the responsivity measure is demonstrated by simulations of diffuse reflectance as a function of source-detector separation and optical properties. Second, the temporal instrument response function (IRF) is a critically important factor in determining the performance of time-domain systems. Measurements of the IRF for various instruments were combined with simulations to illustrate the impact of the width and shape of the IRF on contrast for a deep absorption change mimicking brain activation. PMID:25121479

  16. A real-time, quantitative PCR protocol for assessing the relative parasitemia of Leucocytozoon in waterfowl

    USGS Publications Warehouse

    Smith, Matthew M.; Schmutz, Joel A.; Apelgren, Chloe; Ramey, Andy M.

    2015-01-01

    Microscopic examination of blood smears can be effective at diagnosing and quantifying hematozoa infections. However, this method requires highly trained observers, is time consuming, and may be inaccurate for detection of infections at low levels of parasitemia. To develop a molecular methodology for identifying and quantifying Leucocytozoon parasite infection in wild waterfowl (Anseriformes), we designed a real-time, quantitative PCR protocol to amplify Leucocytozoon mitochondrial DNA using TaqMan fluorogenic probes and validated our methodology using blood samples collected from waterfowl in interior Alaska during late summer and autumn (n = 105). By comparing our qPCR results to those derived from a widely used nested PCR protocol, we determined that our assay showed high levels of sensitivity (91%) and specificity (100%) in detecting Leucocytozoon DNA from host blood samples. Additionally, results of a linear regression revealed significant correlation between the raw measure of parasitemia produced by our qPCR assay (Ct values) and numbers of parasites observed on blood smears (R² = 0.694, P = 0.003), indicating that our assay can reliably determine the relative parasitemia levels among samples. This methodology provides a powerful new tool for studies assessing effects of haemosporidian infection in wild avian species.
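
The two validation statistics in this abstract are straightforward to compute from a confusion table and paired measurements. A minimal sketch, with hypothetical counts chosen only to mirror the reported 91%/100% (not the study's actual data):

```python
# Hypothetical validation counts against the nested-PCR reference standard.
tp, fn = 50, 5   # qPCR-positive / qPCR-negative among reference positives
tn, fp = 50, 0   # qPCR-negative / qPCR-positive among reference negatives

sensitivity = tp / (tp + fn)   # fraction of true infections detected (~0.91)
specificity = tn / (tn + fp)   # fraction of uninfected correctly called (1.0)

def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)
```

`r_squared` applied to Ct values versus smear counts yields the R² reported above; for simple linear regression it equals the squared Pearson correlation.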

  17. Pantak Therapax SXT 150: performance assessment and dose determination using IAEA TRS-398 protocol.

    PubMed

    Jurado, D; Eudaldo, T; Carrasco, P; Jornet, N; Ruiz, A; Ribas, M

    2005-08-01

    The performance assessment and beam characteristics of the Therapax SXT 150 unit, which encompass both low and medium-energy beams, were evaluated. Dose determination was carried out by implementing the International Atomic Energy Agency (IAEA) TRS-398 protocol and measuring all the dosimetric parameters in order to have a solid, consistent and reliable data set for the unit. Mechanical movements, interlocks and applicator characteristics agreed with specifications. The timer exhibited good accuracy and linearity. The output was very stable, with good repeatability, long-term reproducibility and no dependence on tube head orientation. The measured dosimetric parameters included beam first and second half-value layers (HVLs), absorbed dose rate to water under reference conditions, central axis depth dose distributions, output factors and beam profiles. Measured first HVLs agreed with comparable published data, but the homogeneity coefficients were low in comparison with typical values found in the literature. The timer error was significant for all filters and should be taken into consideration for the absorbed dose rate determination under reference conditions as well as for the calculation of treatment times. Percentage depth-dose (PDD) measurements are strongly recommended for each filter-applicator combination. The output factor definition of the IAEA TRS-398 protocol for medium-energy X-ray qualities involves the use of data that is difficult to measure. Beam profiles had small penumbras and good symmetry and flatness except for the lowest energy beam, for which a heel effect was observed. PMID:16046424

  19. Fluid composition impacts standardized testing protocols in ultrahigh molecular weight polyethylene knee wear testing.

    PubMed

    Schwenke, T; Kaddick, C; Schneider, E; Wimmer, M A

    2005-11-01

    Wear of total knee replacements is determined gravimetrically in simulator studies. A mix of bovine serum, distilled water, and additives is intended to replicate the lubrication conditions in vivo. Weight gain due to fluid absorption during testing is corrected using a load soak station. In this study, three sets of ultrahigh molecular weight polyethylene tibial plateaus were tested against highly polished titanium condyles. Test 1 was performed in two different institutions on the same simulator according to the standard ISO 14243-1, using two testing lubricants. Test 2 and test 3 repeated both previous test sections. The wear and load soak rates changed significantly with the lubricant. The wear rate decreased from 16.9 to 7.9 mg weight loss per million cycles when switching from fluid A to fluid B. The weight gain of the load soak specimen submersed in fluid A was 6.1 mg after 5 × 10⁶ cycles, compared with 31.6 mg for the implant in fluid B after the same time period. Both lubricants were mixed in accordance with ISO 14243 (Implants for surgery - wear of total knee-joint prostheses), suggesting that calf serum should be diluted to 25 ± 2 per cent with deionized water and a protein mass concentration of not less than 17 g/l. The main differences were the type and amount of additives that chemically stabilize the lubricant throughout the test. The results suggest that wear rates can only be compared if exactly the same testing conditions are applied. An agreement on detailed lubricant specifications is desirable.
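
The load-soak correction mentioned above is simple arithmetic: the soak specimen's fluid uptake estimates how much absorption masked the test specimen's true material loss. A minimal sketch, with hypothetical milligram values for illustration only:

```python
def corrected_wear_mg(measured_loss_mg, soak_gain_mg):
    """Gravimetric wear with load-soak correction: add back the fluid
    absorption (estimated from the unloaded soak control's weight gain)
    that partially offsets the test specimen's measured weight loss."""
    return measured_loss_mg + soak_gain_mg

# Hypothetical illustration: a raw 15.7 mg loss plus a 1.2 mg soak-station
# gain gives a corrected material loss of 16.9 mg.
wear = corrected_wear_mg(15.7, 1.2)
```

Because the correction term depends on the lubricant (6.1 mg vs. 31.6 mg soak gain in this study), comparing wear rates across labs requires identical fluid formulations.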

  20. Low kV settings CT angiography (CTA) with low dose contrast medium volume protocol in the assessment of thoracic and abdominal aorta disease: a feasibility study

    PubMed Central

    Talei Franzesi, C; Fior, D; Bonaffini, P A; Minutolo, O; Sironi, S

    2015-01-01

    Objective: To assess the diagnostic quality of low dose (100 kV) CT angiography (CTA), by using ultra-low contrast medium volume (30 ml), for thoracic and abdominal aorta evaluation. Methods: 67 patients with thoracic or abdominal vascular disease underwent multidetector CT study using a 256 slice scanner, with low dose radiation protocol (automated tube current modulation, 100 kV) and low contrast medium volume (30 ml; 4 ml s−1). Density measurements were performed on ascending, arch, descending thoracic aorta, anonymous branch, abdominal aorta, and renal and common iliac arteries. Radiation dose exposure [dose–length product (DLP)] was calculated. A control group of 35 patients with thoracic or abdominal vascular disease were evaluated with standard CTA protocol (automated tube current modulation, 120 kV; contrast medium, 80 ml). Results: In all patients, we correctly visualized and evaluated main branches of the thoracic and abdominal aorta. No difference in density measurements was observed between the low tube voltage protocol (mean attenuation value of thoracic aorta, 304 HU; abdominal, 343 HU; renal arteries, 331 HU) and the control group (mean attenuation value of thoracic aorta, 320 HU; abdominal, 339 HU; renal arteries, 303 HU). Radiation dose exposure was significantly lower in the low tube voltage studies (thoracic DLP, 490; abdominal, 324) than in the control group (thoracic DLP, 1032; abdominal, 1078). Conclusion: Low-tube-voltage protocol may provide a diagnostic performance comparable with that of the standard protocol, decreasing radiation dose exposure and contrast material volume amount. Advances in knowledge: Low-tube-voltage-setting protocol combined with ultra-low contrast agent volume (30 ml), by using new multidetector-row CT scanners, represents a feasible diagnostic tool to significantly reduce the radiation dose delivered to patients and to preserve renal function
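
The dose saving implied by the reported DLP values is a one-line calculation. A short sketch using the study's own numbers (the function name is illustrative):

```python
def dose_reduction_pct(standard_dlp, low_kv_dlp):
    """Relative radiation-dose saving of the low-kV protocol vs. standard CTA."""
    return 100.0 * (standard_dlp - low_kv_dlp) / standard_dlp

# DLP values reported in the study: thoracic 1032 -> 490, abdominal 1078 -> 324.
thoracic = dose_reduction_pct(1032, 490)    # roughly half the standard dose
abdominal = dose_reduction_pct(1078, 324)   # roughly a 70% reduction
```

So the 100 kV protocol cut DLP by about 52% for thoracic and about 70% for abdominal studies while attenuation values stayed comparable.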

  1. Proposal for the standardization of flow cytometry protocols to detect minimal residual disease in acute lymphoblastic leukemia.

    PubMed

    Ikoma, Maura Rosane Valério; Beltrame, Miriam Perlingeiro; Ferreira, Silvia Inês Alejandra Cordoba Pires; Souto, Elizabeth Xisto; Malvezzi, Mariester; Yamamoto, Mihoko

    2015-01-01

    Minimal residual disease is the most powerful predictor of outcome in acute leukemia and is useful in therapeutic stratification for acute lymphoblastic leukemia protocols. Nowadays, the most reliable methods for studying minimal residual disease in acute lymphoblastic leukemia are multiparametric flow cytometry and polymerase chain reaction. Both provide similar results at a minimal residual disease level of 0.01% of normal cells, that is, detection of one leukemic cell in up to 10,000 normal nucleated cells. Currently, therapeutic protocols establish the minimal residual disease threshold value at the most informative time points according to the appropriate methodology employed. The expertise of the laboratory in a cancer center or a cooperative group could be the most important factor in determining which method should be used. In Brazil, multiparametric flow cytometry laboratories are available in most leukemia treatment centers, but multiparametric flow cytometry processes must be standardized for minimal residual disease investigations in order to offer reliable and reproducible results that ensure quality in the clinical application of the method. The Minimal Residual Disease Working Group of the Brazilian Society of Bone Marrow Transplantation (SBTMO) was created with that aim. This paper presents recommendations for the detection of minimal residual disease in acute lymphoblastic leukemia based on the literature and expertise of the laboratories who participated in this consensus, including pre-analytical and analytical methods. This paper also notes that multiparametric flow cytometry and polymerase chain reaction are complementary methods, and thus more laboratories with expertise in immunoglobulin/T cell receptor (Ig/TCR) gene assays are necessary in Brazil. PMID:26670404
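
The 0.01% threshold translates directly into acquisition requirements for flow cytometry: detecting one cell in 10,000 with a resolvable cluster means acquiring far more total events. A minimal sketch; the cluster size of 100 events is a commonly cited rule of thumb, used here as an assumption:

```python
def mrd_fraction(leukemic_events, total_events):
    """MRD level expressed as a fraction of total nucleated-cell events."""
    return leukemic_events / total_events

def events_needed(target_fraction=1e-4, min_cluster=100):
    """Total events to acquire so a population at the detection threshold
    still forms a resolvable cluster (min_cluster is an assumed rule of
    thumb, not a value from this consensus paper)."""
    return int(round(min_cluster / target_fraction))

threshold = mrd_fraction(1, 10_000)  # the 0.01% sensitivity level
```

At the 0.01% level this implies acquiring on the order of a million events per tube, which is why pre-analytical standardization (sample volume, cell counts) matters for reproducibility.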

  2. Perioperative Standard Oral Nutrition Supplements Versus Immunonutrition in Patients Undergoing Colorectal Resection in an Enhanced Recovery (ERAS) Protocol

    PubMed Central

    Moya, Pedro; Soriano-Irigaray, Leticia; Ramirez, Jose Manuel; Garcea, Alessandro; Blasco, Olga; Blanco, Francisco Javier; Brugiotti, Carlo; Miranda, Elena; Arroyo, Antonio

    2016-01-01

    Abstract To compare immunonutrition versus standard high calorie nutrition in patients undergoing elective colorectal resection within an Enhanced Recovery After Surgery (ERAS) program. Despite progress in recent years in the surgical management of patients with colorectal cancer (ERAS programs), postoperative complications are frequent. Nutritional supplements enriched with immunonutrients have recently been introduced into clinical practice. However, the extent to which the combination of ERAS protocols and immunonutrition benefits patients undergoing colorectal cancer surgery is unknown. The SONVI study is a prospective, multicenter, randomized trial with 2 parallel treatment groups receiving either the study product (an immune-enhancing feed) or the control supplement (a hypercaloric hypernitrogenous supplement) for 7 days before colorectal resection and 5 days postoperatively. A total of 264 patients were randomized. At baseline, both groups were comparable with regard to age, sex, surgical risk, comorbidity, and analytical and nutritional parameters. The median length of the postoperative hospital stay was 5 days with no differences between the groups. A decrease in the total number of complications was observed in the immunonutrition group compared with the control group, primarily due to a significant decrease in infectious complications (23.8% vs. 10.7%, P = 0.0007). Of the infectious complications, wound infection differed significantly between the groups (16.4% vs. 5.7%, P = 0.0008). Other infectious complications were lower in the immunonutrition group but were not statistically significantly different. The implementation of ERAS protocols including immunonutrient-enriched supplements reduces the complications of patients undergoing colorectal resection. This study is registered with ClinicalTrial.gov: NCT02393976. PMID:27227930

  4. UAF radiorespirometric protocol for assessing hydrocarbon mineralization potential in environmental samples.

    PubMed

    Brown, E J; Resnick, S M; Rebstock, C; Luong, H V; Lindstrom, J

    1991-01-01

    Following the Exxon Valdez oil spill, a radiorespirometric protocol was developed at the University of Alaska Fairbanks (UAF) to assess the potential for microorganisms in coastal waters and sediments to degrade hydrocarbons. The use of bioremediation to assist in oil spill cleanup operations required microbial bioassays to establish that addition of nitrogen and phosphorus would enhance biodegradation. A technique assessing 1-14C-n-hexadecane mineralization in seawater or nutrient-rich sediment suspensions was used for both of these measurements. Hydrocarbon-degradation potentials were determined by measuring mineralization associated with sediment microorganisms in sediment suspended in sterilized seawater and/or marine Bushnell-Haas broth. Production of 14CO2 and CO2 was easily detectable during the first 48 hours with added hexadecane levels ranging from 10 to 500 mg/l of suspension, and was dependent on the biomass of hydrocarbon degraders, the hydrocarbon-oxidation potential of the biomass and nutrient availability. In addition to assessing the hydrocarbon-degrading potential of environmental samples, the radiorespirometric procedure, with concomitant measurement of microbial biomass, has utility as an indicator of hydrocarbon contamination of soils, aqueous sediments and water, and can also be used to evaluate the effectiveness of bioremediation treatments. PMID:1368153
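
    The core radiorespirometric readout described above is the fraction of added 14C label recovered as 14CO2. A hedged sketch of that arithmetic, with an assumed killed-control correction and invented disintegration counts (not the UAF protocol itself):

```python
# Hypothetical sketch of the radiorespirometric calculation: percent
# mineralization of 1-14C-n-hexadecane estimated from trapped 14CO2 counts.
# Function name, the killed-control correction, and all numbers are
# illustrative assumptions, not taken from the UAF protocol.
def percent_mineralized(dpm_co2_trapped: float, dpm_added: float,
                        dpm_killed_control: float = 0.0) -> float:
    """14C recovered as 14CO2, corrected for an abiotic (killed) control."""
    return 100.0 * (dpm_co2_trapped - dpm_killed_control) / dpm_added

print(percent_mineralized(dpm_co2_trapped=12_400, dpm_added=100_000,
                          dpm_killed_control=400))  # 12.0
```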

  5. The OECD Program to Validate the Rat Hershberger Bioassay to Screen Compounds for in Vivo Androgen and Antiandrogen Responses. Phase 1: Use of a Potent Agonist and a Potent Antagonist to Test the Standardized Protocol

    PubMed Central

    Owens, William; Zeiger, Errol; Walker, Michael; Ashby, John; Onyon, Lesley; Gray, L. Earl

    2006-01-01

    The Organisation for Economic Co-operation and Development (OECD) has completed phase 1 of the Hershberger validation intended to identify in vivo activity of suspected androgens and anti-androgens. Seventeen laboratories from 7 countries participated in phase 1, and results were collated and evaluated by the OECD with the support of an international committee of experts. Five androgen-responsive tissues (ventral prostate, paired seminal vesicles and coagulating glands, levator ani and bulbocavernosus muscles, glans penis, and paired Cowper’s or bulbourethral glands) were evaluated. The standardized protocols used selected doses of a reference androgen, testosterone propionate (TP), and an antiandrogen, flutamide (FLU). All laboratories successfully detected TP-stimulated increases in androgen-responsive tissue weight and decreases in TP-stimulated tissue weights when FLU was co-administered. The standardized protocols performed well under a variety of conditions (e.g., strain, diet, housing protocol, bedding). There was good agreement among laboratories with regard to the TP doses inducing significant increases in tissue weights and the FLU doses decreasing TP-stimulated tissue weights. Several additional procedures (e.g., weighing of the dorsolateral prostate and fixation of tissues before weighing) and serum component measurements (e.g., luteinizing hormone) were also included by some laboratories to assess their potential utility. The results indicated that the OECD Hershberger protocol was robust, reproducible, and transferable across laboratories. Based on this phase 1 validation study, the protocols have been refined, and the next phase of the OECD validation program will test the protocol with selected doses of weak androgen agonists, androgen antagonists, a 5α-reductase inhibitor, and chemicals having no androgenic activity. PMID:16882536
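
    The bioassay's basic readout, change in androgen-responsive tissue weight relative to control, can be sketched as follows. The tissue name matches the abstract, but the function and every number are invented for illustration, not OECD phase 1 data:

```python
import statistics as st

# Illustrative only: mean percent change in one androgen-responsive tissue,
# treated group vs. control group. All weights are made up.
def pct_change(treated_mg, control_mg):
    """Mean tissue weight of the treated group vs. control, as percent change."""
    return 100.0 * (st.mean(treated_mg) - st.mean(control_mg)) / st.mean(control_mg)

ventral_prostate_control = [98.0, 105.0, 101.0]   # mg, castrate + vehicle
ventral_prostate_tp      = [250.0, 240.0, 260.0]  # mg, castrate + TP
print(f"{pct_change(ventral_prostate_tp, ventral_prostate_control):.1f}%")
```

    A TP-stimulated increase like this, and its reduction when FLU is co-administered, is what each participating laboratory had to detect.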

  6. Application of industrial hygiene techniques for work-place exposure assessment protocols related to petro-chemical exploration and production field activities

    SciTech Connect

    Koehn, J.

    1995-12-31

    Standard industrial hygiene techniques for recognition, evaluation, and control can be directly applied to development of technical protocols for workplace exposure assessment activities for a variety of field site locations. Categories of occupational hazards include chemical and physical agents. Examples of these types of hazards directly related to oil and gas exploration and production workplaces include hydrocarbons, benzene, oil mist, hydrogen sulfide, Naturally Occurring Radioactive Materials (NORM), asbestos-containing materials, and noise. Specific components of well process chemicals include potential hazardous chemical substances such as methanol, acrolein, chlorine dioxide, and hydrochloric acid. Other types of exposure hazards may result from non-routine conduct of sandblasting and painting operations.

  7. Interobserver reliability of the 'Welfare Quality(®) Animal Welfare Assessment Protocol for Growing Pigs'.

    PubMed

    Czycholl, I; Kniese, C; Büttner, K; Beilage, E Grosse; Schrader, L; Krieter, J

    2016-01-01

    The present paper focuses on evaluating the interobserver reliability of the 'Welfare Quality(®) Animal Welfare Assessment Protocol for Growing Pigs'. The protocol for growing pigs mainly consists of a Qualitative Behaviour Assessment (QBA), direct behaviour observations (BO) carried out by instantaneous scan sampling and checks for different individual parameters (IP), e.g. presence of tail biting, wounds and bursitis. Three trained observers collected the data by performing 29 combined assessments, which were done at the same time and on the same animals, but carried out completely independently of each other. The findings were compared by calculation of Spearman Rank Correlation Coefficients (RS), Intraclass Correlation Coefficients (ICC), Smallest Detectable Changes (SDC) and Limits of Agreement (LoA). No agreement was found concerning the adjectives belonging to the QBA (e.g. active: RS: 0.50, ICC: 0.30, SDC: 0.38, LoA: -0.05 to 0.45; fearful: RS: 0.06, ICC: 0.0, SDC: 0.26, LoA: -0.20 to 0.30). In contrast, the BO showed good agreement (e.g. social behaviour: RS: 0.45, ICC: 0.50, SDC: 0.09, LoA: -0.09 to 0.03; use of enrichment material: RS: 0.75, ICC: 0.68, SDC: 0.06, LoA: -0.03 to 0.03). Overall, observers agreed well in the IP, e.g. tail biting (RS: 0.52, ICC: 0.88; SDC: 0.05, LoA: -0.01 to 0.02) and wounds (RS: 0.43, ICC: 0.59, SDC: 0.10, LoA: -0.09 to 0.10). The parameter bursitis showed great differences (RS: 0.10, ICC: 0.0, SDC: 0.35, LoA: -0.37 to 0.40), which can be explained by difficulties in the assessment when the animals moved around quickly or their legs were soiled. In conclusion, the interobserver reliability was good for the BO and most IP, but not for the parameter bursitis and the QBA. PMID:27478731
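
    The agreement statistics reported above (RS, SDC, LoA) can all be computed from paired observer scores. A simplified sketch, not the authors' code: the data are invented, the ranking uses no tie correction, and the SDC formula (1.96 × √2 × SEM, with SEM from the SD of paired differences) is one common convention:

```python
import statistics as st

# Illustrative agreement statistics for two observers scoring the same
# animals. All data and helper names are invented for this sketch.
def _ranks(x):
    order = sorted(range(len(x)), key=lambda i: x[i])
    r = [0.0] * len(x)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r  # no tie correction; a simplification

def _pearson(x, y):
    mx, my = st.mean(x), st.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def spearman_rs(a, b):
    """Spearman rank correlation = Pearson correlation of the ranks."""
    return _pearson(_ranks(a), _ranks(b))

def bland_altman_loa(a, b):
    """95% limits of agreement (LoA) between paired observations."""
    d = [x - y for x, y in zip(a, b)]
    m, s = st.mean(d), st.stdev(d)
    return m - 1.96 * s, m + 1.96 * s

def sdc(a, b):
    """Smallest detectable change: 1.96 * sqrt(2) * SEM, with SEM taken
    from the standard deviation of the paired differences."""
    d = [x - y for x, y in zip(a, b)]
    sem = st.stdev(d) / 2 ** 0.5
    return 1.96 * 2 ** 0.5 * sem

obs1 = [0.10, 0.20, 0.15, 0.30, 0.25]  # e.g., fraction of pigs with wounds
obs2 = [0.12, 0.18, 0.17, 0.28, 0.27]
print(spearman_rs(obs1, obs2), sdc(obs1, obs2), bland_altman_loa(obs1, obs2))
```

    ICC is omitted here because it requires a variance-components (ANOVA) decomposition; the three statistics shown capture the rank-, magnitude- and bias-based views of agreement used in the paper.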

  9. QUANTITATIVE PLANAR AND VOLUMETRIC CARDIAC MEASUREMENTS USING 64 MDCT AND 3T MRI VS. STANDARD 2D AND M-MODE ECHOCARDIOGRAPHY: DOES ANESTHETIC PROTOCOL MATTER?

    PubMed

    Drees, Randi; Johnson, Rebecca A; Stepien, Rebecca L; Munoz Del Rio, Alejandro; Saunders, Jimmy H; François, Christopher J

    2015-01-01

    Cross-sectional imaging of the heart utilizing computed tomography and magnetic resonance imaging (MRI) has been shown to be superior for the evaluation of cardiac morphology and systolic function in humans compared to echocardiography. The purpose of this prospective study was to test the effects of two different anesthetic protocols on cardiac measurements in 10 healthy beagle dogs using 64-multidetector row computed tomographic angiography (64-MDCTA), 3T magnetic resonance imaging (MRI) and standard awake echocardiography. Both anesthetic protocols used propofol for induction and isoflurane for anesthetic maintenance. In addition, protocol A used midazolam/fentanyl and protocol B used dexmedetomidine as premedication and constant rate infusion during the procedure. Significant elevations in systolic and mean blood pressure were present when using protocol B. There was overall good agreement between the variables of cardiac size and systolic function generated from the MDCTA and MRI exams and no significant difference was found when comparing the variables acquired using either anesthetic protocol within each modality. Systolic function variables generated using 64-MDCTA and 3T MRI were only able to predict the left ventricular end diastolic volume as measured during awake echocardiogram when using protocol B and 64-MDCTA. For all other systolic function variables, prediction of awake echocardiographic results was not possible (P = 1). Planar variables acquired using MDCTA or MRI did not allow prediction of the corresponding measurements generated using echocardiography in the awake patients (P = 1). Future studies are needed to validate this approach in a more varied population and clinically affected dogs.

  10. Savannah River Site peer evaluator standards: Operator assessment for restart

    SciTech Connect

    Not Available

    1990-06-01

    Savannah River Site has implemented a Peer Evaluator program for the assessment of certified Central Control Room Operators, Central Control Room Supervisors and Shift Technical Engineers prior to restart. This program is modeled after the Nuclear Regulatory Commission's (NRC's) Examiner Standard, ES-601, for the requalification of licensed operators in the commercial utility industry. It has been tailored to reflect the unique differences between Savannah River production reactors and commercial power reactors.

  12. Getting Started on Assessment: Developing a Voluntary System of Assessment and Certification Based on Skill Standards.

    ERIC Educational Resources Information Center

    National Skill Standards Board (DOL/ETA), Washington, DC.

    This manual provides practical advice for voluntary partnerships that, since 1994, are part of the effort to build a voluntary national system of skill standards, assessment, and certification. Intended to be used with guidance from the National Skill Standards Board, it is designed for the voluntary partnerships that have completed the standards…

  13. Development of a first-contact protocol to guide assessment of adult patients in rehabilitation services networks

    PubMed Central

    Souza, Mariana A. P.; Ferreira, Fabiane R.; César, Cibele C.; Furtado, Sheyla R. C.; Coster, Wendy J.; Mancini, Marisa C.; Sampaio, Rosana F.

    2016-01-01

    Objective: This paper describes the development of the Protocol for Identification of Problems for Rehabilitation (PLPR), a tool to standardize collection of functional information based on the International Classification of Functioning, Disability and Health (ICF). Development of the protocol: The PLPR was developed for use during the initial contact with adult patients within a public network of rehabilitation services. Steps to develop the protocol included: survey of the ICF codes most used by clinical professionals; compilation of data from functional instruments; development and pilot testing of a preliminary version in the service settings; discussion with professionals and development of the final version. The final version includes: user identification; social and health information; brief functional description (BFD); summary of the BFD; and PLPR results. Further testing of the final version will be conducted. Conclusions: The protocol standardizes the first contact between the user and the rehabilitation service. Systematic use of the protocol could also help to create a functional database that would allow comparisons between rehabilitation services and countries over time. PMID:26786075

  14. Comparison of Standard 1.5 T vs. 3 T Optimized Protocols in Patients Treated with Glatiramer Acetate. A Serial MRI Pilot Study

    PubMed Central

    Zivadinov, Robert; Hojnacki, David; Hussein, Sara; Bergsland, Niels; Carl, Ellen; Durfee, Jacqueline; Dwyer, Michael G.; Kennedy, Cheryl; Weinstock-Guttman, Bianca

    2012-01-01

    This study explored the effect of glatiramer acetate (GA, 20 mg) on lesion activity using the 1.5 T standard MRI protocol (single dose gadolinium [Gd] and 5-min delay) or optimized 3 T protocol (triple dose of Gd, 20-min delay and application of an off-resonance saturated magnetization transfer pulse). A 15-month, phase IV, open-label, single-blinded, prospective, observational study included 12 patients with relapsing-remitting multiple sclerosis who underwent serial MRI scans (Days −45, −20, 0; the minus sign indicates the number of days before GA treatment; and on Days 30, 60, 90, 120, 150, 180, 270 and 360 during GA treatment) on 1.5 T and 3 T protocols. Cumulative number and volume of Gd enhancing (Gd-E) and T2 lesions were calculated. At Days −45 and 0, there were higher number (p < 0.01) and volume (p < 0.05) of Gd-E lesions on 3 T optimized compared to 1.5 T standard protocol. However, at 180 and 360 days of the study, no significant differences in total and cumulative number of new Gd-E and T2 lesions were found between the two protocols. Compared to pre-treatment period, at Days 180 and 360 a significantly greater decrease in the cumulative number of Gd-E lesions (p = 0.03 and 0.021, respectively) was found using the 3 T vs. the 1.5 T protocol (p = NS for both time points). This MRI mechanistic study suggests that GA may exert a greater effect on decreasing lesion activity as measured on 3 T optimized compared to 1.5 T standard protocol. PMID:22754322

  16. Use of Standardized Visual Assessments of Riparian and Stream Condition to Manage Riparian Bird Habitat in Eastern Oregon

    NASA Astrophysics Data System (ADS)

    Cooke, Hilary A.; Zack, Steve

    2009-07-01

    The importance of riparian vegetation to support stream function and provide riparian bird habitat in semiarid landscapes suggests that standardized assessment tools that include vegetation criteria to evaluate stream health could also be used to assess habitat conditions for riparian-dependent birds. We first evaluated the ability of two visual assessments of woody vegetation in the riparian zone (corridor width and height) to describe variation in the obligate riparian bird ensemble along 19 streams in eastern Oregon. Overall species richness and the abundances of three species all correlated significantly with both, but width was more important than height. We then examined the utility of the riparian zone criteria in three standardized and commonly used rapid visual riparian assessment protocols—the USDI BLM Proper Functioning Condition (PFC) assessment, the USDA NRCS Stream Visual Assessment Protocol (SVAP), and the U.S. EPA Habitat Assessment Field Data Sheet (HAFDS)—to assess potential riparian bird habitat. Based on the degree of correlation of bird species richness with assessment ratings, we found that PFC does not assess obligate riparian bird habitat condition, SVAP provides a coarse estimate, and HAFDS provides the best assessment. We recommend quantitative measures of woody vegetation for all assessments and that all protocols incorporate woody vegetation height. Given that rapid assessments may be the only source of information for thousands of kilometers of streams in the western United States, incorporating simple vegetation measurements is a critical step in evaluating the status of riparian bird habitat and provides a tool for tracking changes in vegetation condition resulting from management decisions.

  17. The Promises and Challenges of Ecological Momentary Assessment in Schizophrenia: Development of an Initial Experimental Protocol

    PubMed Central

    Gaudiano, Brandon A.; Moitra, Ethan; Ellenberg, Stacy; Armey, Michael F.

    2015-01-01

    Severe mental illnesses, including schizophrenia and other psychotic-spectrum disorders, are a major cause of disability worldwide. Although efficacious pharmacological and psychosocial interventions have been developed for treating patients with schizophrenia, relapse rates are high and long-term recovery remains elusive for many individuals. Furthermore, little is still known about the underlying mechanisms of these illnesses. Thus, there is an urgent need to better understand the contextual factors that contribute to psychosis so that they can be better targeted in future interventions. Ecological Momentary Assessment (EMA) is a dynamic procedure that permits the measurement of variables in natural settings in real-time through the use of brief assessments delivered via mobile electronic devices (i.e., smartphones). One advantage of EMA is that it is less subject to retrospective memory biases and highly sensitive to fluctuating environmental factors. In the current article, we describe the research to date using EMA to better understand fluctuating symptoms and functioning in patients with schizophrenia and other psychotic disorders and potential applications to treatment. In addition, we describe a novel EMA protocol that we have been employing to study the outcomes of patients with schizophrenia following a hospital discharge. We also report the lessons we have learned thus far using EMA methods in this challenging clinical population. PMID:26689969

  19. An Engulfment Assay: A Protocol to Assess Interactions Between CNS Phagocytes and Neurons

    PubMed Central

    Schafer, Dorothy P.; Lehrman, Emily K.; Heller, Christopher T.; Stevens, Beth

    2014-01-01

    Phagocytosis is a process in which a cell engulfs material (entire cell, parts of a cell, debris, etc.) in its surrounding extracellular environment and subsequently digests this material, commonly through lysosomal degradation. Microglia are the resident immune cells of the central nervous system (CNS) whose phagocytic function has been described in a broad range of conditions from neurodegenerative disease (e.g., beta-amyloid clearance in Alzheimer’s disease) to development of the healthy brain (e.g., synaptic pruning) [1-6]. The following protocol is an engulfment assay developed to visualize and quantify microglia-mediated engulfment of presynaptic inputs in the developing mouse retinogeniculate system [7]. While this assay was used to assess microglia function in this particular context, a similar approach may be used to assess other phagocytes throughout the brain (e.g., astrocytes) and the rest of the body (e.g., peripheral macrophages) as well as other contexts in which synaptic remodeling occurs (e.g., brain injury/disease). PMID:24962472

  20. What standardized tests ignore when assessing individuals with neurodevelopmental disorders

    PubMed Central

    Tenorio, Marcela; Campos, Ruth; Karmiloff-Smith, Annette

    2016-01-01

    In this article we critique the use of traditional standardized tests for the cognitive assessment of children with neurodevelopmental disorders. Limitations stem from the lack of integrating (a) results from research into the psychological functioning of these populations, and (b) the main arguments underlying models of human development. We identify four secondary issues in this discussion: (1) these instruments cannot be used with children who have particularly low cognitive functioning; (2) little or no variance in the scores obtained by individuals with neurodevelopmental disorders, because all are at floor, prevents adequate interpretation; (3) measurements do not provide information useful for the design of intervention strategies; and (4) different cognitive and/or neural processes may underlie behavioural scores ‘in the normal range’. Rethinking traditional assessment methods in favour of technologically-mediated games yields new cognitive assessment possibilities. PMID:26778874

  1. The recent findings of the "Scientific Assessment of Ozone Depletion: 2010" and the World Avoided by the Montreal Protocol

    NASA Astrophysics Data System (ADS)

    Newman, P. A.; Scientific Assessment Panel to the Montreal Protocol

    2011-12-01

    The ozone layer is the Earth's natural sunscreen, blocking harmful solar ultraviolet radiation. In 1974, Mario Molina and F. Sherwood Rowland proposed that the ozone layer could be depleted by chlorine released from human-produced chlorofluorocarbons (CFCs). Follow-up science investigations supported this hypothesis, leading to the landmark 1987 Montreal Protocol on Substances That Deplete the Ozone Layer (a protocol to the Vienna Convention for the Protection of the Ozone Layer). One of the Montreal Protocol provisions is that science assessments on ozone depletion be written and submitted to the signatory Parties every 4 years. In this talk, I will primarily focus on the science findings from the recently published "Scientific Assessment of Ozone Depletion: 2010". This assessment is written and reviewed (multiple times) by the international science community. The 2010 assessment is the latest in a long series of reports that provide the science foundation for the Montreal Protocol. This assessment demonstrates that the Montreal Protocol is working, and that there are early signs that ozone is beginning to respond to decreasing CFC levels. There are now state-of-the-art simulations that show that the ozone layer would have been largely destroyed if CFCs had not been regulated, and therefore extreme levels of UV radiation have been avoided. The 2010 assessment also spotlights new insights into the impact of ozone depletion on surface climate, and climate impacts on ozone. However, the assessment also reveals that greenhouse gases are modifying the stratosphere and that the ozone layer will evolve into a different state than its pre-industrial values - you can't go home again.

  2. Intragenomic heterogeneity in the 16S rRNA genes of Flavobacterium columnare and standard protocol for genomovar assignment.

    PubMed

    LaFrentz, B R; Waldbieser, G C; Welch, T J; Shoemaker, C A

    2014-07-01

    Genetic variability in 16S rRNA gene sequences has been demonstrated among isolates of Flavobacterium columnare, and a restriction fragment length polymorphism (RFLP) assay is available for genetic typing of this important fish pathogen. Interpretation of restriction patterns can be difficult due to the lack of a formal description of the expected number and sizes of DNA fragments generated for each of the described genomovars. In this study, partial 16S rRNA gene sequences (ca. 1250-bp fragment) from isolates representing each described genomovar and isolates generating unique restriction patterns were cloned and sequenced. The results demonstrated that some isolates contained up to three different 16S rRNA genes whose sequences generate different RFLP patterns due to intragenomic heterogeneity within HaeIII restriction sites. The occurrence of HaeIII restriction sites within the portion of the 16S rRNA gene used for typing the F. columnare isolates and intragenomic heterogeneity within these sites explained the restriction patterns observed following RFLP analyses. This research provides a standard protocol for typing isolates of F. columnare by RFLP and a formal description of the expected restriction patterns for the previously described genomovars I, II, II-B and III. Additionally, we describe a new genomovar, I/II.
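
    The RFLP patterns discussed above follow mechanically from where HaeIII (recognition site GGCC) cuts each 16S gene copy, so intragenomic heterogeneity at a single restriction site changes the fragment pattern. An illustrative in-silico digest with toy sequences, not actual F. columnare data:

```python
import re

# Hypothetical illustration of RFLP typing: an in-silico HaeIII digest of a
# short DNA fragment. HaeIII recognizes GGCC and cuts bluntly between GG|CC.
def haeiii_fragments(seq: str) -> list[int]:
    """Return fragment lengths after cutting at every GGCC site."""
    seq = seq.upper()
    cuts = [m.start() + 2 for m in re.finditer("GGCC", seq)]  # cut GG|CC
    bounds = [0] + cuts + [len(seq)]
    return [b - a for a, b in zip(bounds, bounds[1:])]

# Two toy gene copies differing at one restriction site yield different
# patterns, mirroring the intragenomic heterogeneity described above.
copy_a = "ATGGCCTTAGGCCAT"   # two GGCC sites
copy_b = "ATGGACTTAGGCCAT"   # first site lost by a point substitution
print(haeiii_fragments(copy_a), haeiii_fragments(copy_b))  # [4, 7, 4] [11, 4]
```

    On a gel, the two copies would therefore superimpose their bands, which is why a mixed pattern from one isolate need not indicate contamination.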

  3. [Development of telepathology systems between different types of terminals based on the standard for image collaboration command protocol].

    PubMed

    Tofukuji, Ikuo; Nakagawa, Shuji; Suzuki, Akitoshi; Saito, Makoto; Hara, Shigeji; Tsuchihashi, Yasunari; Shiraishi, Taizo; Ooshiro, Mariko; Sawai, Takashi; Kaihara, Shigekoto

    2003-01-01

    In Japan, telepathology systems have been developed in response to medical and pathological circumstances such as a shortage and uneven geographic distribution of pathologists. More than 100 telepathology terminals are in operation, mainly for intraoperative quick diagnosis, but terminals of different types cannot communicate with one another. In March 2000 the Medical Information System Development Center (MEDIS-DC) successfully demonstrated interconnection between different types of telepathology terminals based on the Standard for Image Collaboration Command Protocol (SICCP). Nikon, NTTdata and Olympus participated in the development. In February 2002 MEDIS-DC tested these systems for pathological consultations between Okinawa and Kyoto, Kyoto and Mie, and Mie and Okinawa. These successful trials showed that telepathology systems need new observation methodologies for telecytology and teleconsultation in addition to the workflow for intraoperative quick diagnosis, new GUI guidelines for telepathology terminal design, and education and support to ensure smooth operation by users. The outcomes of the MEDIS-DC activities encouraged us to take on next-generation telepathology. We also noted new trends in telepathology and pathology informatics, such as virtual slide technologies and internet applications, in the US and Europe. To maintain Japan's lead, the MEDIS-DC telepathology committee has started investigations to construct a strategy for the development of Japan's next-generation telepathology.

  4. An umbrella protocol for standardized data collection (SDC) in rectal cancer: a prospective uniform naming and procedure convention to support personalized medicine.

    PubMed

    Meldolesi, Elisa; van Soest, Johan; Dinapoli, Nicola; Dekker, Andre; Damiani, Andrea; Gambacorta, Maria Antonietta; Valentini, Vincenzo

    2014-07-01

    Predictive models allow treating physicians to deliver tailored treatment moving from prescription by consensus to prescription by numbers. The main features of an umbrella protocol for standardizing data and procedures to create a consistent dataset useful to obtain a trustful analysis for a Decision Support System for rectal cancer are reported.

  5. Assessing Outside the Bubble: Performance Assessment for Common Core State Standards

    ERIC Educational Resources Information Center

    Bishop, Jesica M.; Bristow, Lora J.; Coriell, Bryn P.; Jensen, Mark E.; Johnson, Leif E.; Luring, Sara R.; Lyons-Tinsley, Mary Ann; Mefford, Megan M.; Neu, Gwen L.; Samulski, Emerson T.; Warner, Timothy D.; White, Mathew F.

    2011-01-01

    The adoption of Common Core State Standards has increased the need for assessments capable of measuring more performance-based outcomes. This monograph brings together the current literature and resources for the development and implementation of performance assessment. The text was written as part of a project-based graduate course and has…

  6. A Self-Paced Intermittent Protocol on a Non-Motorised Treadmill: A Reliable Alternative to Assessing Team-Sport Running Performance

    PubMed Central

    Tofari, Paul J.; McLean, Blake D.; Kemp, Justin; Cormack, Stuart

    2015-01-01

    This study assessed the reliability of a ‘self-paced’ 30-min, team-sport running protocol on a Woodway Curve 3.0 non-motorised treadmill (NMT). Ten male team-sport athletes (20.3 ± 1.2 y, 74.4 ± 9.7 kg, VO2peak 57.1 ± 4.5 ml·kg-1·min-1) attended five sessions (VO2peak testing + familiarisation; four reliability trials). The 30-min protocol consisted of three identical 10-min activity blocks, with visual and audible commands directing locomotor activity; however, actual speeds were self-selected by participants. Reliability of variables was estimated using typical error ± 90% confidence limits expressed as a percentage [coefficient of variation (CV)] and intraclass correlation coefficient. The smallest worthwhile change (SWC) was calculated as 0.2 × between participant standard deviation. Peak/mean speed and distance variables assessed across the 30-min protocol exhibited a CV < 5%, and < 6% for each 10-min activity block. All power variables exhibited a CV < 7.5%, except walking (CV 8.3-10.1%). The most reliable variables were maximum and mean sprint speed (CV < 2%). All variables produced a CV% greater than the SWC. A self-paced, team-sport running protocol performed on a NMT produces reliable speed/distance and power data. Importantly, a single familiarisation session allowed for adequate test-retest reliability. The self-paced design provides an ecologically-valid alternative to externally-paced team-sport running simulations. Key points Self-paced team-sport running protocols on a curved NMT that closely match the locomotor demands of competition deliver reliable test-retest measures of speed, distance and power. Such protocols may be sensitive to changes in running profile following an intervention that may not be detectable during externally-paced protocols. One familiarisation session is adequate to ensure test-retest reliability. PMID:25729291
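The reliability statistics described above can be reproduced in outline. Below is a minimal sketch of the typical-error and smallest-worthwhile-change calculations, using invented mean-speed data for ten athletes over two trials (the study's raw data are not reproduced here, so all numbers are illustrative):

```python
import numpy as np

# Hypothetical mean-speed data (km/h) for ten athletes over two reliability
# trials; values are invented, as the study's raw data are not available here.
trial1 = np.array([9.8, 10.4, 11.1, 9.5, 10.0, 10.9, 9.9, 10.6, 10.2, 11.0])
trial2 = np.array([10.0, 10.2, 11.3, 9.7, 10.1, 10.7, 10.1, 10.4, 10.5, 10.8])

# Typical error = SD of the between-trial differences divided by sqrt(2).
diff = trial2 - trial1
typical_error = diff.std(ddof=1) / np.sqrt(2)

# Expressed as a coefficient of variation (%) relative to the grand mean.
cv_percent = 100 * typical_error / np.concatenate([trial1, trial2]).mean()

# Smallest worthwhile change = 0.2 x between-participant SD (as in the study).
swc = 0.2 * trial1.std(ddof=1)

print(cv_percent, swc)
```

With these invented data the CV is well under the 5% threshold reported for speed variables; a variable is considered reliably sensitive only when its CV approaches the SWC, which is the comparison the abstract describes.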

  7. First in vivo assessment of "Outwalk": a novel protocol for clinical gait analysis based on inertial and magnetic sensors.

    PubMed

    Ferrari, Alberto; Cutti, Andrea Giovanni; Garofalo, Pietro; Raggi, Michele; Heijboer, Monique; Cappello, Angelo; Davalli, Angelo

    2010-01-01

A protocol named "Outwalk" was recently proposed to measure the thorax-pelvis and lower-limb kinematics during gait in free-living conditions, by means of an inertial and magnetic measurement system (IMMS). The aim of this study was to validate Outwalk on four healthy subjects when it is used in combination with a specific IMMS (Xsens Technologies, NL), against a reference protocol (CAST) and measurement system (optoelectronic system; Vicon, Oxford Metrics Group, UK). For this purpose, we developed an original approach based on three tests, which allowed us to separately investigate: (1) the consequences on joint kinematics of the differences between protocols (Outwalk vs. CAST), (2) the accuracy of the hardware (Xsens vs. Vicon), and (3) the summation of protocols' differences and hardware accuracy (Outwalk + Xsens vs. CAST + Vicon). In order to assess joint-angle similarity, the coefficient of multiple correlation (CMC) was used. For test 3, the CMC showed that Outwalk + Xsens and CAST + Vicon kinematics can be interchanged, offset included, for hip, knee and ankle flexion-extension, and hip ab-adduction (CMC > 0.88). The other joint angles can be interchanged with offset excluded (CMC > 0.85). Tests 1 and 2 also showed that differences in offset between joint angles were predominantly induced by differences in the protocols; differences in correlation by both hardware and protocols; and differences in range of motion by the Xsens accuracy. These results support the commencement of a clinical trial of Outwalk on transtibial amputees. PMID:19911215
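The coefficient of multiple correlation used above has a standard waveform formulation (the Kadaba-style form for assessing similarity of repeated joint-angle curves; the study may use a related multi-factor variant, so this is illustrative only). A sketch with synthetic curves:

```python
import numpy as np

def cmc(waveforms):
    """Coefficient of multiple correlation (Kadaba-style formulation) for
    G joint-angle waveforms sampled at T common time points.
    `waveforms` is a (G, T) array; values near 1 indicate similar curves."""
    y = np.asarray(waveforms, dtype=float)
    g, t = y.shape
    grand_mean = y.mean()
    time_mean = y.mean(axis=0)  # mean curve across waveforms
    within = ((y - time_mean) ** 2).sum() / (t * (g - 1))
    total = ((y - grand_mean) ** 2).sum() / (g * t - 1)
    return np.sqrt(1 - within / total)

# Two synthetic knee-flexion-like curves differing only by small noise:
t = np.linspace(0, 1, 101)
a = 60 * np.sin(np.pi * t) ** 2
b = a + np.random.default_rng(0).normal(0, 1.0, t.size)
print(cmc(np.vstack([a, b])))  # close to 1 for near-identical waveforms
```

When two curves differ mainly by a constant offset, the CMC drops even though their shapes agree, which is why the abstract distinguishes "offset included" from "offset excluded" interchangeability.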

  9. Clinical Applicability and Cutoff Values for an Unstructured Neuropsychological Assessment Protocol for Older Adults with Low Formal Education

    PubMed Central

    de Paula, Jonas Jardim; Bertola, Laiss; Ávila, Rafaela Teixeira; Moreira, Lafaiete; Coutinho, Gabriel; de Moraes, Edgar Nunes; Bicalho, Maria Aparecida Camargos; Nicolato, Rodrigo; Diniz, Breno Satler; Malloy-Diniz, Leandro Fernandes

    2013-01-01

Background and Objectives The neuropsychological exam plays a central role in the assessment of elderly patients with cognitive complaints. It is particularly relevant for differentiating patients with mild dementia from subjects with mild cognitive impairment. Formal education is a critical factor in neuropsychological performance; however, there are few studies that have evaluated the psychometric properties, especially criterion-related validity, of neuropsychological tests for patients with low formal education. The present study aims to investigate the validity of an unstructured neuropsychological assessment protocol for this population and develop cutoff values for clinical use. Methods and Results A protocol composed of the Rey Auditory Verbal Learning Test, Frontal Assessment Battery, Category and Letter Fluency, Stick Design Test, Clock Drawing Test, Digit Span, Token Test and TN-LIN was administered to 274 older adults (96 normal aging, 85 mild cognitive impairment and 93 mild Alzheimer's disease) with predominantly low formal education. Factor analysis showed a four-factor structure related to Executive Functions, Language/Semantic Memory, Episodic Memory and Visuospatial Abilities, accounting for 65% of explained variance. Most of the tests showed good sensitivity and specificity in differentiating the diagnostic groups. The neuropsychological protocol showed significant ecological validity, as 3 of the cognitive factors explained 31% of the variance in Instrumental Activities of Daily Living. Conclusion The study presents evidence of the construct, criterion and ecological validity of this protocol. The neuropsychological tests and the proposed cutoff values might be used for the clinical assessment of older adults with low formal education. PMID:24066031

  10. Initial Reliability of The Standardized Orthopedic Assessment Tool (SOAT)

    PubMed Central

    Lafave, Mark R; Katz, Larry; Donnon, Tyrone; Butterwick, Dale J

    2008-01-01

Context: Orthopaedic assessment skills are critical to the success of athletic therapists and trainers. The Standardized Orthopedic Assessment Tool (SOAT) has been content validated. Objective: To establish interrater reliability of the SOAT. Patients or Other Participants: Thirty-two college students, 10 raters, and 2 standardized patients (SPs) from Calgary, Alberta, Canada. Design: Randomized observational study. Intervention(s): Students were allowed 30 minutes to complete a mock orthopaedic assessment of an SP with an injury specific to a region of the body (shoulder, knee, or ankle). Using the region-specific SOAT, raters and SPs evaluated students' orthopaedic assessment skills. Main Outcome Measure(s): The sum totals of the SOAT for 2 raters and 1 SP were used to calculate each student's performance scores for respective scenarios. Scale reliability analysis (Cronbach α) was completed on the SOAT for each of the 3 body-region examinations. Results: The mean overall reliability of the 3 SOATs (ie, ankle, knee, and shoulder) was positive: α = .85 with the SP scores factored into the equation and α = .86 without the SP scores factored into the equation. Reliability for the ankle region was highest (α = .91), followed by the knee (α = .83) and the shoulder (α = .82). Conclusions: The study sample size was small, but the results will enable further study with generalization to a broader audience of athletic therapists and athletic trainers. Because a baseline measure of reliability was established using a robust statistical analysis, future researchers can employ more stringent statistical analysis and focus on the effects of various pedagogical techniques to teach and learn the underlying construct of clinical competence in orthopaedic assessment. PMID:18833311

  11. Applying International Standards for Hydrokinetic Energy Resource Assessments

    NASA Astrophysics Data System (ADS)

    Haas, K. A.

    2015-12-01

The extraction of hydrokinetic energy is the conversion of the kinetic energy of moving water into another, more useful form of energy, frequently electricity. This water motion may be in the form of waves, tides, ocean currents or river flows. In addition to the development of the technology, the successful extraction of hydrokinetic energy requires a better understanding of the physical, environmental and social aspects of the resource and their interactions with the technology. To assist the development of the hydrokinetic industry as a whole, much work has been completed over the past decade on developing international technical standards that can be used by the full range of stakeholders in the hydrokinetic industry. To support the design of projects for tidal energy extraction, a new International Electrotechnical Commission (IEC) Technical Specification (TS) has recently been published outlining a standardized methodology for performing resource assessments. Work is presently ongoing on another TS for performing resource assessments of in-stream river projects. While the technologies for extracting energy from tidal and river flows may be similar, the methodologies for performing the respective resource assessments are quite different due to the differing nature of the physical processes involved. This presentation will discuss both the tidal and in-stream river methodologies, highlighting their respective key aspects. In addition, a case study illustrating the use of the published tidal TS will be presented.

  12. Implications of Stein's Paradox for Environmental Standard Compliance Assessment.

    PubMed

    Qian, Song S; Stow, Craig A; Cha, YoonKyung

    2015-05-19

The implications of Stein's paradox stirred considerable debate in statistical circles when the concept was first introduced in the 1950s. The paradox arises when we are interested in estimating the means of several variables simultaneously. In this situation, the best estimator for an individual mean, the sample average, is no longer the best overall. Rather, a shrinkage estimator, which shrinks individual sample averages toward the overall average, is shown to have improved overall accuracy. Although controversial at the time, the concept of shrinking toward the overall average is now widely accepted as good practice for improving statistical stability and reducing error, not only in simple estimation problems, but also in complicated modeling problems. However, the utility of Stein's insights is not widely recognized in the environmental management community, where mean pollutant concentrations of multiple waters are routinely estimated for management decision-making. In this essay, we introduce Stein's paradox and its modern generalization, the Bayesian hierarchical model, in the context of environmental standard compliance assessment. Using simulated data and nutrient monitoring data from wadeable streams around the Great Lakes, we show that a Bayesian hierarchical model can improve overall estimation accuracy, thereby improving our confidence in the assessment results, especially for standard compliance assessment of waters with small sample sizes.
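Stein's shrinkage idea can be illustrated directly. The sketch below applies the positive-part James-Stein estimator to invented mean-concentration data for several streams; the setup is a simplified illustration, not the paper's actual Bayesian hierarchical model:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup: true mean nutrient concentrations for 20 streams, with
# n noisy samples per stream (all values invented for illustration).
true_means = rng.uniform(0.5, 2.0, size=20)
n, sigma = 4, 0.8
samples = rng.normal(true_means, sigma, size=(n, 20))

sample_avg = samples.mean(axis=0)   # the classic per-stream estimate
se2 = sigma ** 2 / n                # variance of each sample average
k = sample_avg.size

# Positive-part James-Stein estimator: shrink each stream's average toward
# the overall (grand) average, more strongly when averages are noisy.
grand = sample_avg.mean()
shrink = max(0.0, 1 - (k - 3) * se2 / ((sample_avg - grand) ** 2).sum())
js = grand + shrink * (sample_avg - grand)

mse_avg = ((sample_avg - true_means) ** 2).mean()
mse_js = ((js - true_means) ** 2).mean()
print(mse_avg, mse_js)  # shrinkage typically lowers the total squared error
```

The shrinkage factor falls between 0 and 1, so every shrunken estimate lies between its own sample average and the grand mean; Stein's result guarantees the improvement in expectation when three or more means are estimated jointly.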

  13. Public School Finance Assessment Project Aligned with ELCC Standards

    ERIC Educational Resources Information Center

    Risen, D. Michael

    2008-01-01

    This is a detailed description of an assessment that can be used in a graduate level of study in the area of public school finance. This has been approved by NCATE as meeting all of the stipulated ELCC standards for which it is designed (1.1, 1.2, 1.3, 1.4, 1.5, 2.1, 2.2, 2.3, 2.4, 3.1, 3.2, 3.3, 4.1, 4.2, 4.3, 5.1, 5.2, 5.3.). This course of…

  14. Modifying the Bank Erosion Hazard Index (BEHI) protocol for rapid assessment of streambank erosion in northeastern Ohio.

    PubMed

    Newton, Sara E; Drenten, Deanna M

    2015-01-01

Understanding the source of pollution in a stream is vital to preserving, restoring, and maintaining the stream's function and the habitat it provides. Sediment from highly eroding streambanks is a major source of pollution in a stream system and has the potential to jeopardize habitat, infrastructure, and stream function. Watershed management practices throughout the Cleveland Metroparks attempt to locate and inventory the source, and rate the risk, of potential streambank erosion to assist in formulating effective stream, riparian, and habitat management recommendations. The Bank Erosion Hazard Index (BEHI), developed by David Rosgen of Wildland Hydrology, is a fluvial geomorphic assessment procedure used to evaluate susceptibility to streambank erosion based on a combination of several variables that are sensitive to various processes of erosion. This protocol can be time consuming, difficult for non-professionals, and confined to specific geomorphic regions. To address these constraints and assist in maintaining consistency and reducing user bias, modifications to this protocol include a "Pre-Screening Questionnaire", elimination of the Study Bank-Height Ratio metric (including the bankfull determination), and an adjusted scoring system. This modified protocol was used to assess several high-priority streams within the Cleveland Metroparks. The original BEHI protocol was also used to confirm the results of the modified BEHI protocol. After using the modified assessment in the field and comparing it to the original BEHI method, the two were found to produce comparable BEHI ratings of the streambanks, while the modified protocol significantly reduced the amount of time and resources needed. PMID:25742064

  15. State trends in ecological risk assessment and standard setting

    SciTech Connect

    Siegel, M R; Fowler, K M; Bilyard, G R

    1993-02-01

    The purposes of this paper are (1) to identify key states' activities and plans related to setting cleanup standards using the ecological risk assessment process, and (2) to discuss the impacts these actions may have on the US Department of Energy's (DOE's) environmental restoration program. This report is prepared as part of a larger task, the purpose of which is to identify and assess state regulatory trends and legal developments that may impact DOE's environmental restoration program. Results of this task are intended to provide DOE with advance notice of potentially significant regulatory developments so as to enhance DOE's ability to influence these developments and to incorporate possible regulatory and policy changes into its planning process.

  16. Comparative assessment of various lipid extraction protocols and optimization of transesterification process for microalgal biodiesel production.

    PubMed

    Mandal, Shovon; Patnaik, Reeza; Singh, Amit Kumar; Mallick, Nirupama

    2013-01-01

Biodiesel, using microalgae as feedstocks, is being explored as the most potent form of alternative diesel fuel for sustainable economic development. A comparative assessment of various protocols for microalgal lipid extraction was carried out using five green algae, six blue-green algae and two diatom species treated with different single and binary solvents, both at room temperature and in a soxhlet extractor. Lipid recovery was highest with chloroform-methanol in the soxhlet extractor. Pretreatments of biomass, such as sonication, homogenization, bead-beating, lyophilization, autoclaving, microwave treatment and osmotic shock, did not register any significant rise in lipid recovery. As lipid recovery using chloroform-methanol at room temperature demonstrated a marginally lower value than that obtained with the soxhlet extractor, from an economic point of view the former is recommended for microalgal total lipid extraction. The transesterification process enhances the quality of biodiesel. Experiments were designed to determine the effects of catalyst type and quantity, methanol-to-oil ratio, reaction temperature and time on the transesterification process using response surface methodology. Fatty acid methyl ester yield reached up to 91% with a methanol:HCl:oil molar ratio of 82:4:1 at 65 degrees C for a 6.4 h reaction time. The biodiesel yield relative to the weight of the oil was found to be 69%.
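The response-surface step described above can be sketched numerically. The example below fits a second-order model to synthetic yield data over two coded factors and locates the stationary point of the fitted surface; the data, factor coding, and two-factor reduction are invented for illustration (the study optimized more factors):

```python
import numpy as np

# Synthetic yield measurements over a 5x5 grid of two coded factors
# (e.g., temperature and reaction time); the true surface is invented.
rng = np.random.default_rng(1)
x1, x2 = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
x1, x2 = x1.ravel(), x2.ravel()
true_yield = 90 - 8 * (x1 - 0.3) ** 2 - 5 * (x2 - 0.2) ** 2
y = true_yield + rng.normal(0, 0.5, x1.size)

# Design matrix for y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point of the fitted quadratic (the candidate optimum).
b0, b1, b2, b11, b22, b12 = beta
A = np.array([[2 * b11, b12], [b12, 2 * b22]])
opt = np.linalg.solve(A, -np.array([b1, b2]))
print(opt)  # should land near the true optimum (0.3, 0.2)
```

In practice the coded optimum is mapped back to physical units (temperature, time, molar ratio) and verified with confirmation runs, which is how values such as the 82:4:1 ratio and 6.4 h time are typically obtained.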

  17. Positive animal welfare states and reference standards for welfare assessment.

    PubMed

    Mellor, D J

    2015-01-01

Developments in affective neuroscience and behavioural science during the last 10-15 years have together made it increasingly apparent that sentient animals are potentially much more sensitive to their environmental and social circumstances than was previously thought to be the case. It therefore seems likely that both the range and magnitude of welfare trade-offs that occur when animals are managed for human purposes have been underestimated, even when minimalistic but arguably well-intentioned attempts have been made to maintain high levels of welfare. In light of these neuroscience-supported, behaviour-based insights, the present review considers the extent to which the use of currently available reference standards might draw attention to these previously neglected areas of concern. It is concluded that the natural living orientation cannot provide an all-embracing or definitive welfare benchmark because of its primary focus on behavioural freedom. However, assessments of this type, supported by neuroscience insights into behavioural motivation, may now carry greater weight when used to identify management practices that should be avoided, discontinued or substantially modified. Using currently accepted baseline standards as welfare reference points may result in small changes being accorded greater significance than would be the case if they were compared with higher standards, and this could slow progress towards better levels of welfare. On the other hand, using "what animals want" as a reference standard has the appeal of focusing on the specific resources or conditions the animals would choose themselves, and can potentially improve their welfare more quickly than the approach of making small increments above baseline standards. It is concluded that the cautious use of these approaches in different combinations could lead to recommendations that would more effectively promote positive welfare states in hitherto neglected areas of concern.

  18. Standard versus accelerated initiation of renal replacement therapy in acute kidney injury (STARRT-AKI): study protocol for a randomized controlled trial

    PubMed Central

    2013-01-01

    Background Acute kidney injury is a common and devastating complication of critical illness, for which renal replacement therapy is frequently needed to manage severe cases. While a recent systematic review suggested that “earlier” initiation of renal replacement therapy improves survival, completed trials are limited due to small size, single-centre status, and use of variable definitions to define “early” renal replacement therapy initiation. Methods/design This is an open-label pilot randomized controlled trial. One hundred critically ill patients with severe acute kidney injury will be randomly allocated 1:1 to receive “accelerated” initiation of renal replacement therapy or “standard” initiation at 12 centers across Canada. In the accelerated arm, participants will have a venous catheter placed and renal replacement therapy will be initiated within 12 hours of fulfilling eligibility. In the standard initiation arm, participants will be monitored over 7 days to identify indications for renal replacement therapy. For participants in the standard arm with persistent acute kidney injury, defined as a serum creatinine not declining >50% from the value at the time of eligibility, the initiation of RRT will be discouraged unless one or more of the following criteria are fulfilled: serum potassium ≥6.0 mmol/L; serum bicarbonate ≤10 mmol/L; severe respiratory failure (PaO2/FiO2<200) or persisting acute kidney injury for ≥72 hours after fulfilling eligibility. The inclusion criteria are designed to identify a population of critically ill adults with severe acute kidney injury who are likely to need renal replacement therapy during their hospitalization, but not immediately. The primary outcome is protocol adherence (>90%). Secondary outcomes include measures of feasibility (proportion of eligible patients enrolled in the trial, proportion of enrolled patients followed to 90 days for assessment of vital status and the need for renal replacement
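The standard-arm initiation criteria listed above amount to a simple decision rule. A hypothetical encoding follows; the parameter names and this function are illustrative, not part of the trial protocol:

```python
# Hypothetical encoding of the standard-arm criteria described above; the
# parameter names and thresholds-as-code are illustrative, not the trial's.
def rrt_indicated(potassium_mmol_l, bicarbonate_mmol_l,
                  pao2_fio2_ratio, aki_duration_hours):
    """True if any listed criterion for initiating renal replacement
    therapy is met in the standard arm."""
    return (potassium_mmol_l >= 6.0          # serum potassium >= 6.0 mmol/L
            or bicarbonate_mmol_l <= 10.0    # serum bicarbonate <= 10 mmol/L
            or pao2_fio2_ratio < 200         # severe respiratory failure
            or aki_duration_hours >= 72)     # persisting AKI >= 72 hours

print(rrt_indicated(6.2, 18, 300, 24))  # hyperkalemia alone meets a criterion
print(rrt_indicated(4.5, 22, 350, 48))  # no criterion met
```

Note that the criteria are disjunctive: any single threshold crossing is sufficient, which matches the "one or more of the following" wording in the protocol summary.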

  19. Assessing the Fecal Microbiota: An Optimized Ion Torrent 16S rRNA Gene-Based Analysis Protocol

    PubMed Central

    Foroni, Elena; Duranti, Sabrina; Turroni, Francesca; Lugli, Gabriele Andrea; Sanchez, Borja; Martín, Rebeca; Gueimonde, Miguel; van Sinderen, Douwe; Margolles, Abelardo; Ventura, Marco

    2013-01-01

    Assessing the distribution of 16S rRNA gene sequences within a biological sample represents the current state-of-the-art for determination of human gut microbiota composition. Advances in dissecting the microbial biodiversity of this ecosystem have very much been dependent on the development of novel high-throughput DNA sequencing technologies, like the Ion Torrent. However, the precise representation of this bacterial community may be affected by the protocols used for DNA extraction as well as by the PCR primers employed in the amplification reaction. Here, we describe an optimized protocol for 16S rRNA gene-based profiling of the fecal microbiota. PMID:23869230

  20. Assessment of a sequential extraction protocol by examining solution chemistry and mineralogical evolution

    NASA Astrophysics Data System (ADS)

    Maubec, Nicolas; Pauwels, Hélène; Noël, Hervé; Bourrat, Xavier

    2015-04-01

Knowledge of the behavior of heavy metals such as copper and zinc in sediments is a key factor in improving the management of rivers. The mobility of these metals, which may be harmful to the environment, depends directly on their concentration and speciation, which in turn depend on physico-chemical parameters such as the mineralogy of the sediment fraction, pH, redox potential and salinity (Anderson et al., 2000; Sterckeman et al., 2004; Van Oort et al., 2008). Several methods based on chemical extractions are currently applied to assess the behavior of heavy metals in soils and sediments. Among them, the sequential extraction procedure is widely used in soil and sediment science and provides details about the origin, biological and physicochemical availability, mobilization and transport of trace metal elements. It is based on the use of a series of extracting reagents to selectively extract heavy metals according to their association within the solid phase (Cornu and Clozel, 2000), targeting the following fractions: exchangeable, bound to carbonates, associated with oxides (reducible fraction), linked to organic matter and sulfides (oxidizable fraction), and silicate minerals (the so-called residual fraction) (Hickey and Kittrick, 1984; Tessier et al., 1979). Consequently, the sequential extraction method is expected to simulate many potential natural and anthropogenic modifications of environmental conditions (Arey et al., 1999; Brannon and Patrick, 1987; Hickey and Kittrick, 1984; La Force et al., 1999; Tessier et al., 1979).
For three decades, a large number of protocols have been proposed, characterized by specific reagents and experimental conditions (concentrations, number of steps, extraction order and solid/solution ratio) (Das et al., 1995; Gomez Ariza et al., 2000; Quevauviller et al., 1994; Rauret, 1998; Tack and Verloo, 1995), but it appeared that several of them suffer from a lack of selectivity of the applied reagents: besides target ones, some

  1. Soil genotoxicity assessment--results of an interlaboratory study on the Vicia micronucleus assay in the context of ISO standardization.

    PubMed

    Cotelle, Sylvie; Dhyèvre, Adrien; Muller, Serge; Chenon, Pascale; Manier, Nicolas; Pandard, Pascal; Echairi, Abdelwahad; Silvestre, Jérôme; Guiresse, Maritxu; Pinelli, Eric; Giorgetti, Lucia; Barbafieri, Meri; Silva, Valéria C; Engel, Fernanda; Radetski, Claudemir M

    2015-01-01

The Vicia micronucleus assay was standardized in an international protocol, ISO 29200, "Assessment of genotoxic effects on higher plants-Vicia faba micronucleus test," for soil or soil materials (e.g., compost, sludge, sediment, waste, and fertilizing materials). The aim of this interlaboratory study on the Vicia micronucleus assay was to investigate the robustness of this in vivo assay in terms of its applicability in different countries, where each participant was asked to use their own seeds and reference soil, in agreement with the ISO 29200 standard. The ISO 29200 standard protocol was adopted for this study, and seven laboratories from three countries (France, Italy, and Brazil) participated. Negative and positive controls were correctly evaluated by 100% of the participants. In the solid-phase test, the micronucleus frequency (number of micronuclei/1,000 cells) varied from 0.0 to 1.8 for the negative control (i.e., Hoagland's solution) and from 5.8 to 85.7 for the positive control (i.e., maleic hydrazide), while these values varied from 0.0 to 1.7 for the negative control and from 14.3 to 97.7 for the positive control in the liquid-phase test. The variability in the data obtained does not adversely affect the robustness of the protocol assessed, on the condition that the methodology described in the ISO 29200 standard is strictly respected. Thus, the Vicia micronucleus test (ISO 29200) is appropriate for complementing prokaryotic or in vitro tests cited in legislation related to the risk assessment of genotoxic potential.

  2. Low incidence of clonality in cold water corals revealed through the novel use of standardized protocol adapted to deep sea sampling

    USGS Publications Warehouse

    Becheler, Ronan; Cassone, Anne-Laure; Noel, Philippe; Mouchel, Olivier; Morrison, Cheryl; Arnaud-Haond, Sophie

    2016-01-01

    Sampling in the deep sea is a technical challenge, which has hindered the acquisition of robust datasets that are necessary to determine the fine-grained biological patterns and processes that may shape genetic diversity. Estimates of the extent of clonality in deep-sea species, despite the importance of clonality in shaping the local dynamics and evolutionary trajectories, have been largely obscured by such limitations. Cold-water coral reefs along European margins are formed mainly by two reef-building species, Lophelia pertusa and Madrepora oculata. Here we present a fine-grained analysis of the genotypic and genetic composition of reefs occurring in the Bay of Biscay, based on an innovative deep-sea sampling protocol. This strategy was designed to be standardized, random, and allowed the georeferencing of all sampled colonies. Clonal lineages discriminated through their Multi-Locus Genotypes (MLG) at 6–7 microsatellite markers could thus be mapped to assess the level of clonality and the spatial spread of clonal lineages. High values of clonal richness were observed for both species across all sites suggesting a limited occurrence of clonality, which likely originated through fragmentation. Additionally, spatial autocorrelation analysis underlined the possible occurrence of fine-grained genetic structure in several populations of both L. pertusa and M. oculata. The two cold-water coral species examined had contrasting patterns of connectivity among canyons, with among-canyon genetic structuring detected in M. oculata, whereas L. pertusa was panmictic at the canyon scale. This study exemplifies that a standardized, random and georeferenced sampling strategy, while challenging, can be applied in the deep sea, and associated benefits outlined here include improved estimates of fine grained patterns of clonality and dispersal that are comparable across sites and among species.
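Clonal richness from multi-locus genotypes is commonly quantified as R = (G - 1)/(N - 1), where G is the number of distinct MLGs among N sampled colonies (the Dorken-Eckert index; the abstract does not specify which index was used, so this is an assumption). A small illustrative sketch with invented genotypes:

```python
# Illustrative sketch of clonal richness, R = (G - 1) / (N - 1), where G is
# the number of distinct multi-locus genotypes (MLGs) among N colonies.
# This is the common Dorken-Eckert index; the genotypes below are invented
# tuples of allele sizes at hypothetical microsatellite loci.
def clonal_richness(mlgs):
    n = len(mlgs)
    g = len(set(mlgs))
    return (g - 1) / (n - 1)

colonies = [
    (142, 198, 305), (142, 198, 305),   # two ramets of the same clone
    (144, 200, 305), (146, 198, 307),
    (142, 202, 309), (148, 198, 311),
]
print(clonal_richness(colonies))  # 5 distinct MLGs among 6 colonies -> 0.8
```

R approaches 1 when nearly every colony is a distinct genotype (limited clonality, as reported for both coral species) and 0 when all colonies belong to a single clone.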

  3. Cervical dystonia: effectiveness of a standardized physical therapy program; study design and protocol of a single blind randomized controlled trial

    PubMed Central

    2013-01-01

Background Cervical dystonia is characterized by involuntary muscle contractions of the neck and abnormal head positions that affect the daily activities and social life of patients. Patients are usually treated with botulinum toxin injections into affected neck muscles to relieve pain and improve control of head postures. In addition, many patients are referred for physical therapy to improve their ability to perform activities of daily living. A recent review on allied health interventions in cervical dystonia showed a lack of randomized controlled intervention studies regarding the effectiveness of physical therapy interventions. Methods/design The (cost-) effectiveness of a standardized physical therapy program compared to regular physical therapy, both as add-on treatment to botulinum toxin injections, will be determined in a multi-centre, single-blinded randomized controlled trial with 100 cervical dystonia patients. The primary outcome is disability in daily functioning, assessed with the disability subscale of the Toronto Western Spasmodic Torticollis Rating Scale. Secondary outcomes are pain, severity of dystonia, active range of motion of the head, quality of life, anxiety and depression. Data will be collected at baseline, after six months and at one year by an independent blinded assessor, just prior to botulinum toxin injections. For the cost effectiveness, an additional economic evaluation will be performed with the costs per quality-adjusted life-year as the primary outcome parameter. Discussion Our study will provide new evidence regarding the (cost-) effectiveness of a standardized, tailored physical therapy program for patients with cervical dystonia. It is widely felt that allied health interventions, including physical therapy, may offer a valuable supplement to the current therapeutic options. A positive outcome will lead to a greater use of the standardized physical therapy program. For the Dutch situation a positive outcome implies that the standardized

  4. Assessing change in patient-reported quality of life after elective surgery: protocol for an observational comparison study

    PubMed Central

    Kronzer, Vanessa L.; Jerry, Michelle R.; Avidan, Michael S.

    2016-01-01

    Despite their widespread use, the two main methods of assessing quality of life after surgery have never been directly compared. To support patient decision-making and study design, we aim to compare these two methods. The first of these methods is to assess quality of life before surgery and again after surgery using the same validated scale. The second is simply to ask patients whether or not they think their post-operative quality of life is better, worse, or the same. Our primary objective is to assess agreement between the two measures. Secondary objectives are to calculate the minimum clinically important difference (MCID) and to describe the variation across surgical specialties. To accomplish these aims, we will administer surveys to patients undergoing elective surgery, both before surgery and again 30 days after surgery. This protocol follows detailed guidelines for observational study protocols. PMID:27635222
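    The two quantities this protocol compares can be computed directly once paired pre/post scores and the retrospective global rating are collected. Below is a minimal sketch of an anchor-based MCID (mean change among patients who rate themselves "better") alongside the common half-SD distribution-based rule; the function names and scores are illustrative, not the study's actual analysis plan.

```python
from statistics import mean, stdev

def anchor_based_mcid(pre, post, global_rating):
    """Mean pre-to-post change among patients whose retrospective
    global rating is "better" (the anchor group)."""
    changes = [b - a for a, b, g in zip(pre, post, global_rating)
               if g == "better"]
    return mean(changes)

def distribution_based_mcid(pre):
    """A common distribution-based convention: half the baseline
    (sample) standard deviation."""
    return 0.5 * stdev(pre)

# Hypothetical quality-of-life scores for four patients
pre = [50, 60, 70, 80]
post = [60, 65, 75, 80]
rating = ["better", "better", "same", "same"]

print(anchor_based_mcid(pre, post, rating))   # mean of [10, 5]
print(distribution_based_mcid(pre))
```

Disagreement between the two change measures then reduces to comparing whether each patient's prospective change score exceeds the MCID against what they reported retrospectively.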

  5. Assessing change in patient-reported quality of life after elective surgery: protocol for an observational comparison study.

    PubMed

    Kronzer, Vanessa L; Jerry, Michelle R; Avidan, Michael S

    2016-01-01

    Despite their widespread use, the two main methods of assessing quality of life after surgery have never been directly compared. To support patient decision-making and study design, we aim to compare these two methods. The first of these methods is to assess quality of life before surgery and again after surgery using the same validated scale. The second is simply to ask patients whether or not they think their post-operative quality of life is better, worse, or the same. Our primary objective is to assess agreement between the two measures. Secondary objectives are to calculate the minimum clinically important difference (MCID) and to describe the variation across surgical specialties. To accomplish these aims, we will administer surveys to patients undergoing elective surgery, both before surgery and again 30 days after surgery. This protocol follows detailed guidelines for observational study protocols. PMID:27635222

  6. Navigating the iceberg: reducing the number of parameters within the Welfare Quality(®) assessment protocol for dairy cows.

    PubMed

    Heath, C A E; Browne, W J; Mullan, S; Main, D C J

    2014-12-01

    The Welfare Quality(®) protocols provide a multidimensional assessment of welfare, which is lengthy, and hence limited in terms of practicality. The aim of this study was to investigate potential 'iceberg indicators' which could reliably predict the overall classification as a means of reducing the length of time for an assessment and so increase the feasibility of the Welfare Quality(®) protocol as a multidimensional assessment of welfare. Full Welfare Quality(®) assessments were carried out on 92 dairy farms in England and Wales. The farms were all classified as Acceptable or Enhanced. Logistic regression models with cross validation were used to compare model fit for the overall classification on farms. 'Absence of prolonged thirst', on its own, was found to correctly classify farms 88% of the time. More generally, the inclusion of more measures in the models was not associated with greater predictive ability for the overall classification. Absence of prolonged thirst could thus, in theory, be considered to be an iceberg indicator for the Welfare Quality(®) protocol, and could reduce the length of time for a farm assessment to 15 min. Previous work has shown that the parameters within the Welfare Quality(®) protocol are important and relevant for welfare assessment. However, it is argued that the credibility of the published aggregation system is compromised by the finding that one resource measure (Absence of prolonged thirst) is a major driver for the overall classification. It is therefore suggested that the prominence of Absence of prolonged thirst in this role may be better understood as an unintended consequence of the published measure aggregation system rather than as reflecting a realistic iceberg indicator.
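    The core computation described above, fitting a logistic model on a single candidate indicator and checking how well it predicts the overall farm classification under cross-validation, can be sketched as follows. The data are hypothetical toy values and the model is a plain gradient-descent fit, not the authors' statistical software; it only illustrates the shape of the "iceberg indicator" screen.

```python
import math

def fit_logistic(x, y, lr=0.1, epochs=2000):
    """Fit a one-feature logistic model P(y=1) = sigmoid(w*x + b)
    by batch gradient descent on the log-loss."""
    w, b = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        gw = gb = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(w * xi + b)))
            gw += (p - yi) * xi
            gb += (p - yi)
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

def loo_accuracy(x, y):
    """Leave-one-out cross-validated accuracy of the
    single-indicator classifier (1 = Enhanced, 0 = Acceptable)."""
    correct = 0
    for i in range(len(x)):
        w, b = fit_logistic(x[:i] + x[i + 1:], y[:i] + y[i + 1:])
        p = 1.0 / (1.0 + math.exp(-(w * x[i] + b)))
        correct += int((p >= 0.5) == (y[i] == 1))
    return correct / len(x)

# Hypothetical per-farm scores on one candidate indicator
indicator = [0, 1, 2, 3, 20, 21, 22, 23]
enhanced = [0, 0, 0, 0, 1, 1, 1, 1]
print(loo_accuracy(indicator, enhanced))
```

Running the same loop over each measure in turn, and over growing subsets of measures, reproduces the comparison the authors describe: if one measure alone matches the predictive ability of larger models, it behaves as an iceberg indicator.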

  7. An Instrument to Assess Beliefs about Standardized Testing: Measuring the Influence of Epistemology on the Endorsement of Standardized Testing

    ERIC Educational Resources Information Center

    Magee, Robert G.; Jones, Brett D.

    2012-01-01

    This article describes the development of an instrument to assess beliefs about standardized testing in schools, a topic of much heated debate. The Beliefs About Standardized Testing scale was developed to measure the extent to which individuals support high-stakes standardized testing. The 9-item scale comprises three subscales which measure…

  8. Development and evaluation of multispecies test protocols for assessing chemical toxicity

    SciTech Connect

    Garten, C.T. Jr.; Suter, G.W. II; Blaylock, B.G.

    1985-06-01

    Toxicity testing is a well-recognized tool to assist in evaluating the hazards of chemicals to individual biological species. Multispecies toxicity tests, however, are not yet well developed. Three test systems were examined: the legume-Rhizobium symbiosis for N-fixation, soil microbial populations, and algal multispecies interactions. Test protocols were to be developed and tested using several different chemicals. Test protocols for the legume-Rhizobium and soil microorganism systems were developed and are presented. The algal multispecies system will require more research, and thus no protocol was recommended at this time. Separate abstracts were prepared for each test system. (ACR)

  9. Quantitative planar and volumetric cardiac measurements using 64 MDCT and 3T MRI versus standard 2D and M-mode echocardiography: Does anesthetic protocol matter?

    PubMed Central

    Drees, Randi; Johnson, Rebecca A; Stepien, Rebecca L; Rio, Alejandro Munoz Del; Saunders, Jimmy H; François, Christopher J

    2016-01-01

    Cross-sectional imaging of the heart utilizing computed tomography (CT) and magnetic resonance imaging (MRI) has been shown to be superior for the evaluation of cardiac morphology and systolic function in humans compared to echocardiography. The purpose of this prospective study was to test the effects of two different anesthetic protocols on cardiac measurements in 10 healthy beagle dogs using 64-multidetector row computed tomographic angiography (64-MDCTA), 3T magnetic resonance (MRI) and standard awake echocardiography. Both anesthetic protocols used propofol for induction and isoflurane for anesthetic maintenance. In addition, protocol A used midazolam/fentanyl and protocol B used dexmedetomidine as premedication and constant rate infusion during the procedure. Significant elevations in systolic and mean blood pressure were present when using protocol B. There was overall good agreement between the variables of cardiac size and systolic function generated from the MDCTA and MRI exams and no significant difference was found when comparing the variables acquired using either anesthetic protocol within each modality. Systolic function variables generated using 64-MDCTA and 3T MRI were only able to predict the left ventricular end diastolic volume as measured during awake echocardiogram when using protocol B and 64-MDCTA. For all other systolic function variables, prediction of awake echocardiographic results was not possible (P = 1). Planar variables acquired using MDCTA or MRI did not allow prediction of the corresponding measurements generated using echocardiography in the awake patients (P = 1). Future studies are needed to validate this approach in a more varied population and clinically affected dogs. PMID:26082285

  10. QUANTITATIVE PLANAR AND VOLUMETRIC CARDIAC MEASUREMENTS USING 64 MDCT AND 3T MRI VS. STANDARD 2D AND M-MODE ECHOCARDIOGRAPHY: DOES ANESTHETIC PROTOCOL MATTER?

    PubMed

    Drees, Randi; Johnson, Rebecca A; Stepien, Rebecca L; Munoz Del Rio, Alejandro; Saunders, Jimmy H; François, Christopher J

    2015-01-01

    Cross-sectional imaging of the heart utilizing computed tomography and magnetic resonance imaging (MRI) has been shown to be superior for the evaluation of cardiac morphology and systolic function in humans compared to echocardiography. The purpose of this prospective study was to test the effects of two different anesthetic protocols on cardiac measurements in 10 healthy beagle dogs using 64-multidetector row computed tomographic angiography (64-MDCTA), 3T magnetic resonance (MRI) and standard awake echocardiography. Both anesthetic protocols used propofol for induction and isoflurane for anesthetic maintenance. In addition, protocol A used midazolam/fentanyl and protocol B used dexmedetomidine as premedication and constant rate infusion during the procedure. Significant elevations in systolic and mean blood pressure were present when using protocol B. There was overall good agreement between the variables of cardiac size and systolic function generated from the MDCTA and MRI exams and no significant difference was found when comparing the variables acquired using either anesthetic protocol within each modality. Systolic function variables generated using 64-MDCTA and 3T MRI were only able to predict the left ventricular end diastolic volume as measured during awake echocardiogram when using protocol B and 64-MDCTA. For all other systolic function variables, prediction of awake echocardiographic results was not possible (P = 1). Planar variables acquired using MDCTA or MRI did not allow prediction of the corresponding measurements generated using echocardiography in the awake patients (P = 1). Future studies are needed to validate this approach in a more varied population and clinically affected dogs. PMID:26082285

  11. Performance Comparison of Wireless Sensor Network Standard Protocols in an Aerospace Environment: ISA100.11a and ZigBee Pro

    NASA Technical Reports Server (NTRS)

    Wagner, Raymond S.; Barton, Richard J.

    2011-01-01

    Standards-based wireless sensor network (WSN) protocols are promising candidates for spacecraft avionics systems, offering unprecedented instrumentation flexibility and expandability. Ensuring reliable data transport is key, however, when migrating from wired to wireless data gathering systems. In this paper, we conduct a rigorous laboratory analysis of the relative performances of the ZigBee Pro and ISA100.11a protocols in a representative crewed aerospace environment. Since both operate in the 2.4 GHz radio frequency (RF) band shared by systems such as Wi-Fi, they are subject at times to potentially debilitating RF interference. We compare goodput (application-level throughput) achievable by both under varying levels of 802.11g Wi-Fi traffic. We conclude that while the simpler, more inexpensive ZigBee Pro protocol performs well under moderate levels of interference, the more complex and costly ISA100.11a protocol is needed to ensure reliable data delivery under heavier interference. This paper represents the first published, rigorous analysis of WSN protocols in an aerospace environment that we are aware of and the first published head-to-head comparison of ZigBee Pro and ISA100.11a.
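    Goodput, as used in this comparison, counts only application payload successfully delivered, excluding protocol headers and retransmissions. A minimal sketch of the measurement itself, assuming a hypothetical reception log of (timestamp, payload-bytes) tuples rather than the authors' actual test harness:

```python
def goodput_bps(receptions):
    """Application-level throughput in bits/s from a log of
    (timestamp_seconds, payload_bytes) tuples for successfully
    delivered, de-duplicated packets. The interval is measured
    from first to last reception (one common convention)."""
    if len(receptions) < 2:
        return 0.0
    duration = receptions[-1][0] - receptions[0][0]
    total_bits = 8 * sum(nbytes for _, nbytes in receptions)
    return total_bits / duration

# Three 100-byte payloads delivered over one second
log = [(0.0, 100), (0.5, 100), (1.0, 100)]
print(goodput_bps(log))  # 2400.0 bits/s
```

Repeating this measurement under increasing levels of competing 802.11g traffic yields the goodput-vs-interference curves on which the ZigBee Pro / ISA100.11a comparison rests.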

  12. Assessment of and standardization for quantitative nondestructive test

    NASA Technical Reports Server (NTRS)

    Neuschaefer, R. W.; Beal, J. B.

    1972-01-01

    Present capabilities and limitations of nondestructive testing (NDT) as applied to aerospace structures during design, development, production, and operational phases are assessed. This assessment will help determine what useful structural quantitative and qualitative data may be provided from raw materials to vehicle refurbishment. It considers metal alloy systems and bonded composites presently applied in active NASA programs or strong contenders for future use. Quantitative and qualitative data have been summarized from recent literature and in-house information and are presented along with a description of those structures or standards from which the information was obtained. Examples, in tabular form, of NDT technique capabilities and limitations have been provided. The NDT techniques discussed and assessed were radiography, ultrasonics, penetrants, thermal, acoustic, and electromagnetic. Quantitative data are sparse; therefore, obtaining statistically reliable flaw detection data must be strongly emphasized. The new requirements for reusable space vehicles have resulted in highly efficient design concepts operating in severe environments. This increases the need for quantitative NDT evaluation of selected structural components, the end item structure, and during refurbishment operations.

  13. Assessing Spectral Simulation Protocols for the Amide I Band of Proteins.

    PubMed

    Cunha, Ana V; Bondarenko, Anna S; Jansen, Thomas L C

    2016-08-01

    We present a benchmark study of spectral simulation protocols for the amide I band of proteins. The amide I band is widely used in infrared spectroscopy of proteins due to the large signal intensity, high sensitivity to hydrogen bonding, and secondary structural motifs. This band has, thus, proven valuable in many studies of protein structure-function relationships. We benchmark spectral simulation protocols using two common force fields in combination with several electrostatic mappings and coupling models. The results are validated against experimental linear absorption and two-dimensional infrared spectroscopy for three well-studied proteins. We find two-dimensional infrared spectroscopy to be much more sensitive to the simulation protocol than linear absorption and report on the best simulation protocols. The findings demonstrate that there is still room for ideas to improve the existing models for the amide I band of proteins. PMID:27348022

  14. Characterization of Eyjafjallajokull volcanic ash particles and a protocol for rapid risk assessment.

    PubMed

    Gislason, S R; Hassenkam, T; Nedel, S; Bovet, N; Eiriksdottir, E S; Alfredsson, H A; Hem, C P; Balogh, Z I; Dideriksen, K; Oskarsson, N; Sigfusson, B; Larsen, G; Stipp, S L S

    2011-05-01

    On April 14, 2010, when meltwaters from the Eyjafjallajökull glacier mixed with hot magma, an explosive eruption sent unusually fine-grained ash into the jet stream. It quickly dispersed over Europe. Previous airplane encounters with ash resulted in sandblasted windows and particles melted inside jet engines, causing them to fail. Therefore, air traffic was grounded for several days. Concerns also arose about health risks from fallout, because ash can transport acids as well as toxic compounds, such as fluoride, aluminum, and arsenic. Studies on ash are usually made on material collected far from the source, where it could have mixed with other atmospheric particles, or after exposure to water as rain or fog, which would alter surface composition. For this study, a unique set of dry ash samples was collected immediately after the explosive event and compared with fresh ash from a later, more typical eruption. Using nanotechniques, custom-designed for studying natural materials, we explored the physical and chemical nature of the ash to determine if fears about health and safety were justified and we developed a protocol that will serve for assessing risks during a future event. On single particles, we identified the composition of nanometer scale salt coatings and measured the mass of adsorbed salts with picogram resolution. The particles of explosive ash that reached Europe in the jet stream were especially sharp and abrasive over their entire size range, from submillimeter to tens of nanometers. Edges remained sharp even after a couple of weeks of abrasion in stirred water suspensions. PMID:21518890

  15. Characterization of Eyjafjallajokull volcanic ash particles and a protocol for rapid risk assessment.

    PubMed

    Gislason, S R; Hassenkam, T; Nedel, S; Bovet, N; Eiriksdottir, E S; Alfredsson, H A; Hem, C P; Balogh, Z I; Dideriksen, K; Oskarsson, N; Sigfusson, B; Larsen, G; Stipp, S L S

    2011-05-01

    On April 14, 2010, when meltwaters from the Eyjafjallajökull glacier mixed with hot magma, an explosive eruption sent unusually fine-grained ash into the jet stream. It quickly dispersed over Europe. Previous airplane encounters with ash resulted in sandblasted windows and particles melted inside jet engines, causing them to fail. Therefore, air traffic was grounded for several days. Concerns also arose about health risks from fallout, because ash can transport acids as well as toxic compounds, such as fluoride, aluminum, and arsenic. Studies on ash are usually made on material collected far from the source, where it could have mixed with other atmospheric particles, or after exposure to water as rain or fog, which would alter surface composition. For this study, a unique set of dry ash samples was collected immediately after the explosive event and compared with fresh ash from a later, more typical eruption. Using nanotechniques, custom-designed for studying natural materials, we explored the physical and chemical nature of the ash to determine if fears about health and safety were justified and we developed a protocol that will serve for assessing risks during a future event. On single particles, we identified the composition of nanometer scale salt coatings and measured the mass of adsorbed salts with picogram resolution. The particles of explosive ash that reached Europe in the jet stream were especially sharp and abrasive over their entire size range, from submillimeter to tens of nanometers. Edges remained sharp even after a couple of weeks of abrasion in stirred water suspensions.

  16. Characterization of Eyjafjallajökull volcanic ash particles and a protocol for rapid risk assessment

    PubMed Central

    Gislason, S. R.; Hassenkam, T.; Nedel, S.; Bovet, N.; Eiriksdottir, E. S.; Alfredsson, H. A.; Hem, C. P.; Balogh, Z. I.; Dideriksen, K.; Oskarsson, N.; Sigfusson, B.; Larsen, G.; Stipp, S. L. S.

    2011-01-01

    On April 14, 2010, when meltwaters from the Eyjafjallajökull glacier mixed with hot magma, an explosive eruption sent unusually fine-grained ash into the jet stream. It quickly dispersed over Europe. Previous airplane encounters with ash resulted in sandblasted windows and particles melted inside jet engines, causing them to fail. Therefore, air traffic was grounded for several days. Concerns also arose about health risks from fallout, because ash can transport acids as well as toxic compounds, such as fluoride, aluminum, and arsenic. Studies on ash are usually made on material collected far from the source, where it could have mixed with other atmospheric particles, or after exposure to water as rain or fog, which would alter surface composition. For this study, a unique set of dry ash samples was collected immediately after the explosive event and compared with fresh ash from a later, more typical eruption. Using nanotechniques, custom-designed for studying natural materials, we explored the physical and chemical nature of the ash to determine if fears about health and safety were justified and we developed a protocol that will serve for assessing risks during a future event. On single particles, we identified the composition of nanometer scale salt coatings and measured the mass of adsorbed salts with picogram resolution. The particles of explosive ash that reached Europe in the jet stream were especially sharp and abrasive over their entire size range, from submillimeter to tens of nanometers. Edges remained sharp even after a couple of weeks of abrasion in stirred water suspensions. PMID:21518890

  17. IVF/ICSI outcomes between cycles with luteal estradiol (E2) pre-treatment before GnRH antagonist protocol and standard long GnRH agonist protocol: a prospective and randomized study

    PubMed Central

    Huang, Guo-ning; Zeng, Ping-hong; Pei, Li

    2009-01-01

    Objective To study if luteal E2 pre-treatment before GnRH antagonist protocol improves IVF/ICSI outcomes compared with standard long GnRH agonist protocol. Design A prospective, randomized and controlled study. Setting ART center of a state public hospital. Patient(s) Two hundred twenty infertile women underwent IVF/ICSI treatments. Intervention(s) Participants received oral Estradiol Valerate 4 mg/day preceding the IVF cycle from day 21 until day 2 of the next cycle before GnRH antagonist protocol (E2 pre-treatment group, n = 109) or received standard long GnRH agonist protocol as control group (n = 111). Main outcome measure(s) Number of oocytes collected, MII oocytes, fertilization, implantation, live birth and early pregnancy rate, and hormone profiles. Result(s) E2 pre-treatment exerted a significant suppressive effect on FSH but not LH secretion compared with basal FSH and LH levels. In the E2 pre-treatment group serum LH level was significantly higher during COH and serum P was also significantly higher on the day of HCG injection compared with the control group. Five patients from the E2 pre-treatment group had elevated LH at all times (≥10 IU/L) and also a concomitantly high P (>1 ng/mL). Two of the five women achieved pregnancy but had early pregnancy loss. Overall, IVF/ICSI outcomes such as implantation, clinical pregnancy and live birth rates were similar between the E2 pre-treatment and control groups. Conclusion(s) Luteal E2 pre-treatment before GnRH antagonist protocol significantly increases serum LH level and the incidence of premature LH rise but no significant effect is observed on implantation, clinical pregnancy, live birth and early pregnancy loss rates compared with long GnRH agonist protocol. However, more studies in large numbers of cycles are needed to confirm that increased serum LH level by E2 pre-treatment during COH has no negative effect on the IVF/ICSI outcomes. PMID:19225876

  18. Using Think Aloud Protocols to Assess E-Prescribing in Community Pharmacies

    PubMed Central

    Chui, Michelle A.

    2013-01-01

    Introduction Think aloud protocol has rarely been used as a method of data collection in community pharmacies. Purpose The aim of the report is to describe how think aloud protocols were used to identify issues that arise when using e-prescribing technology in pharmacies. In this paper, we report on the benefits and challenges of using think aloud protocols in pharmacies to examine the use of e-prescribing systems. Methods Sixteen pharmacists and pharmacy technicians were recruited from seven community pharmacies in Wisconsin. Data were collected using direct observation alongside think aloud protocol. Direct observations and think aloud protocols took place between January and February 2011. Participants were asked to verbalize their thoughts as they processed electronic prescriptions. Results Participants identified weaknesses in e-prescribing of which they had previously been unaware. This created heightened awareness for vigilance when processing e-prescriptions. The main challenge with using think aloud protocols was interruptions in the pharmacy. Some participants found it difficult to remember to continue verbalizing during think aloud sessions. Conclusion The use of think aloud protocols as a method of data collection is a new way of understanding the issues related to technology use in community pharmacy practice. Think aloud protocol was beneficial in providing objective information on e-prescribing not based on a pharmacist's or technician's opinion of the technology. This method provided detailed information and also a wide variety of real-time challenges with e-prescribing technology in community pharmacies. Using this data collection method can help identify potential patient safety issues when using e-prescribing and suggestions for redesign. PMID:24860689

  19. Assessing the HIPAA standard in practice: PHR privacy policies.

    PubMed

    Carrión, Inmaculada; Alemán, José Luis Fernández; Toval, Ambrosio

    2011-01-01

    Health service providers are starting to become interested in providing PHRs (Personal Health Records). With PHRs, access to data is controlled by the patient, and not by the health care provider. Companies such as Google and Microsoft are establishing a leadership position in this emerging market. A number of benefits can be achieved with PHRs, but important challenges related to security and privacy must be addressed. This paper presents a review of the privacy policies of 20 free web-based PHRs. Security and privacy characteristics were extracted and assessed according to the HIPAA standard. The results show a number of important differences in the characteristics analyzed. Some improvements can be made to current PHR privacy policies to enhance the audit and management of access to users' PHRs. A questionnaire has been defined to assist PHR designers in this task.

  20. High Temperature Gas Reactors: Assessment of Applicable Codes and Standards

    SciTech Connect

    McDowell, Bruce K.; Nickolaus, James R.; Mitchell, Mark R.; Swearingen, Gary L.; Pugh, Ray

    2011-10-31

    Current interest expressed by industry in HTGR plants, particularly modular plants with power up to about 600 MW(e) per unit, has prompted NRC to task PNNL with assessing the currently available literature related to codes and standards applicable to HTGR plants, the operating history of past and present HTGR plants, and with evaluating the proposed designs of RPV and associated piping for future plants. Considering these topics in the order they are arranged in the text, first the operational histories of five shut-down and two currently operating HTGR plants are reviewed, leading the authors to conclude that while small, simple prototype HTGR plants operated reliably, some of the larger plants, particularly Fort St. Vrain, had poor availability. Safety and radiological performance of these plants has been considerably better than LWR plants. Petroleum processing plants provide some applicable experience with materials similar to those proposed for HTGR piping and vessels. At least one currently operating plant, HTR-10, has performed and documented a leak-before-break analysis that appears to be applicable to proposed future US HTGR designs. Current codes and standards cover some HTGR materials, but not all materials are covered to the high temperatures envisioned for HTGR use. Codes and standards, particularly ASME Codes, are under development for proposed future US HTGR designs. A 'roadmap' document has been prepared for ASME Code development; a new subsection to section III of the ASME Code, ASME BPVC III-5, is scheduled to be published in October 2011. The question of terminology for the cross-duct structure between the RPV and power conversion vessel is discussed, considering the differences in regulatory requirements that apply depending on whether this structure is designated as a 'vessel' or as a 'pipe'. We conclude that designing this component as a 'pipe' is the more appropriate choice, but that the ASME BPVC allows the owner of the facility to select

  1. Evaluating a national assessment strategy for urinary incontinence in nursing home residents: reliability of the minimum data set and validity of the resident assessment protocol.

    PubMed

    Resnick, N M; Brandeis, G H; Baumann, M M; Morris, J N

    1996-01-01

    Evaluation of 1 million incontinent American nursing home residents is hampered by both failure to detect incontinence and logistical barriers to diagnostic testing. The nationally mandated Minimum Data Set (MDS) and Resident Assessment Protocol (RAP) were devised to address these deficiencies. Although both instruments are also used in at least 18 other countries, neither has been evaluated. Our goal was to determine the reliability of the MDS and the accuracy of the RAP in predicting the lower urinary tract cause of incontinence. We determined interrater reliability for the 13 MDS items related to urinary incontinence in 123 randomly selected residents of 13 nursing homes in 5 states; forms were completed blindly by 2 nurses from each facility who were trained for a day. The RAP was assessed in 102 representative institutionalized women by blinded evaluation of its diagnostic accuracy compared with the multichannel videourodynamic criterion standard. For the MDS, interrater reliability for incontinence of all grades was excellent (weighted kappa correlation coefficient = 0.90), although reliability was greater at the extremes of measurement than for incontinence of intermediate severity. With the exception of delirium, correlations for the 11 MDS items related to incontinence were 0.65-0.96; for 6 items, correlations were > or = 0.8. The diagnostic accuracy of the RAP, successfully administered to 80% of women, was 70%. The accuracy of the nearly identical algorithm that formed the basis for the RAP was 84%. Importantly, serious misclassifications were not observed for either the RAP or the algorithm. Although its definitions should be modified slightly, the MDS appears to be feasible and reliable when administered by trained staff. In women, the diagnostic accuracy and safety of the RAP are good, particularly when administered as instructed, but the original, sex-specific algorithm is preferable. Together, the MDS and modified RAP provide a useful, stepwise, and
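    The interrater statistic reported above, weighted kappa, penalizes disagreements between raters in proportion to their distance on the ordinal incontinence scale. A small self-contained sketch (illustrative ratings and category labels, linear or quadratic disagreement weights):

```python
def weighted_kappa(r1, r2, categories, weights="linear"):
    """Weighted Cohen's kappa for two raters over ordered categories.
    weights: "linear" -> |i-j|/(k-1); "quadratic" -> squared."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(r1)
    # Observed joint proportions of (rater1, rater2) category pairs
    O = [[0.0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        O[idx[a]][idx[b]] += 1.0 / n
    # Marginal proportions for each rater
    p1 = [sum(O[i]) for i in range(k)]
    p2 = [sum(O[i][j] for i in range(k)) for j in range(k)]

    def w(i, j):
        d = abs(i - j) / (k - 1)
        return d if weights == "linear" else d * d

    observed = sum(w(i, j) * O[i][j] for i in range(k) for j in range(k))
    expected = sum(w(i, j) * p1[i] * p2[j] for i in range(k) for j in range(k))
    return 1.0 - observed / expected

# Two nurses grading incontinence severity on a 0-2 ordinal scale
nurse_a = [0, 1, 2, 2, 0, 1]
nurse_b = [0, 1, 2, 1, 0, 1]
print(weighted_kappa(nurse_a, nurse_b, [0, 1, 2]))
```

Perfect agreement yields 1.0, chance-level agreement 0.0; values around 0.9, as reported for the MDS incontinence items, indicate excellent interrater reliability.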

  2. Assessment of a sequential extraction protocol by examining solution chemistry and mineralogical evolution

    NASA Astrophysics Data System (ADS)

    Maubec, Nicolas; Pauwels, Hélène; Noël, Hervé; Bourrat, Xavier

    2015-04-01

    Knowledge of the behavior of heavy metals, such as copper and zinc in sediments, is a key factor to improve the management of rivers. The mobility of these metals, which may be harmful to the environment, depends directly on their concentration and speciation, which in turn depend on physico-chemical parameters such as mineralogy of the sediment fraction, pH, redox potential, salinity, etc. (Anderson et al., 2000; Sterckeman et al., 2004; Van Oort et al., 2008). Several methods based on chemical extractions are currently applied to assess the behavior of heavy metals in soils and sediments. Among them, sequential extraction procedure is widely used in soil and sediment science and provides details about the origin, biological and physicochemical availability, mobilization and transport of trace metal elements. It is based on the use of a series of extracting reagents to selectively extract heavy metals according to their association within the solid phase (Cornu and Clozel, 2000), including the following fractions: exchangeable, bound to carbonates, associated with oxides (reducible fraction), linked to organic matter and sulfides (oxidizable fraction), as well as silicate minerals, the so-called residual fraction (Hickey and Kittrick, 1984; Tessier et al., 1979). Consequently, the sequential extraction method is expected to simulate many potential natural and anthropogenic modifications of environmental conditions (Arey et al., 1999; Brannon and Patrick, 1987; Hickey and Kittrick, 1984; La Force et al., 1999; Tessier et al., 1979). For three decades, a large number of protocols have been proposed, characterized by specific reagents and experimental conditions (concentrations, number of steps, extraction orders and solid/solution ratio) (Das et al., 1995; Gomez Ariza et al., 2000; Quevauviller et al., 1994; Rauret, 1998; Tack and Verloo, 1995), but it appeared that several of them suffer from a lack of selectivity of applied reagents: besides target ones, some

  3. Understanding barriers to evidence-based assessment: clinician attitudes toward standardized assessment tools.

    PubMed

    Jensen-Doss, Amanda; Hawley, Kristin M

    2010-01-01

    In an era of evidence-based practice, why are clinicians not typically engaged in evidence-based assessment? To begin to understand this issue, a national multidisciplinary survey was conducted to examine clinician attitudes toward standardized assessment tools. There were 1,442 child clinicians who provided opinions about the psychometric qualities of these tools, their benefit over clinical judgment alone, and their practicality. Doctoral-level clinicians and psychologists expressed more positive ratings in all three domains than master's-level clinicians and nonpsychologists, respectively, although only the disciplinary differences remained significant when predictors were examined simultaneously. All three attitude scales were predictive of standardized assessment tool use, although practical concerns were the strongest and only independent predictor of use.

  4. Alignment of Standards and Assessment: A Theoretical and Empirical Study of Methods for Alignment

    ERIC Educational Resources Information Center

    Nasstrom, Gunilla; Henriksson, Widar

    2008-01-01

    Introduction: In a standards-based school system, alignment of policy documents with standards and assessment is important. To be able to evaluate whether schools and students have reached the standards, the assessment should focus on the standards. Different models and methods can be used for measuring alignment, i.e. the correspondence between…

  5. Standards for the assessment of salivary glands – an update

    PubMed Central

    Ochal-Choińska, Aleksandra

    2016-01-01

    The paper is an update of the 2011 Standards for Ultrasound Assessment of Salivary Glands, which were developed by the Polish Ultrasound Society. We have described current ultrasound technical requirements, assessment and measurement techniques as well as guidelines for ultrasound description. We have also discussed an ultrasound image of normal salivary glands as well as the most important pathologies, such as inflammation, sialosis, collagenosis, injuries and proliferative processes, with particular emphasis on lesions indicating high risk of malignancy. In acute bacterial inflammation, the salivary glands appear as hypoechoic, enlarged or normal-sized, with increased parenchymal flow. The echogenicity is significantly increased in viral infections. Degenerative lesions may be seen in chronic inflammations. Hyperechoic deposits with acoustic shadowing can be visualized in lithiasis. Parenchymal fibrosis is a dominant feature of sialosis. Sjögren syndrome produces different pictures of salivary gland parenchymal lesions at different stages of the disease. Pleomorphic adenomas are usually hypoechoic, well-defined and polycyclic. Warthin tumor usually presents as a hypoechoic, oval-shaped lesion with anechoic cystic spaces. Malignancies are characterized by blurred outlines, irregular shape, usually heterogeneous echogenicity and pathological neovascularization. The accompanying metastatic lesions are another indicator of malignancy; however, final diagnosis should be based on biopsy findings. PMID:27446602

  6. Standards and Assessment Resource Bank, Version 2.5 [CD-ROM].

    ERIC Educational Resources Information Center

    Colorado State Dept. of Education, Denver.

    The Colorado "Standards and Assessment Resource Bank" on CD-ROM contains updated information about the Colorado Student Assessment Program, the text of the "Standards-Based Classroom Operator's Manual," and a bank of standards-based units, assessments, and staff development materials submitted by Colorado teachers and school district…

  7. Standardized Assessment of Biodiversity Trends in Tropical Forest Protected Areas: The End Is Not in Sight.

    PubMed

    Beaudrot, Lydia; Ahumada, Jorge A; O'Brien, Timothy; Alvarez-Loayza, Patricia; Boekee, Kelly; Campos-Arceiz, Ahimsa; Eichberg, David; Espinosa, Santiago; Fegraus, Eric; Fletcher, Christine; Gajapersad, Krisna; Hallam, Chris; Hurtado, Johanna; Jansen, Patrick A; Kumar, Amit; Larney, Eileen; Lima, Marcela Guimarães Moreira; Mahony, Colin; Martin, Emanuel H; McWilliam, Alex; Mugerwa, Badru; Ndoundou-Hockemba, Mireille; Razafimahaimodison, Jean Claude; Romero-Saltos, Hugo; Rovero, Francesco; Salvador, Julia; Santos, Fernanda; Sheil, Douglas; Spironello, Wilson R; Willig, Michael R; Winarni, Nurul L; Zvoleff, Alex; Andelman, Sandy J

    2016-01-01

    Extinction rates in the Anthropocene are three orders of magnitude higher than background and disproportionately occur in the tropics, home of half the world's species. Despite global efforts to combat tropical species extinctions, lack of high-quality, objective information on tropical biodiversity has hampered quantitative evaluation of conservation strategies. In particular, the scarcity of population-level monitoring in tropical forests has stymied assessment of biodiversity outcomes, such as the status and trends of animal populations in protected areas. Here, we evaluate occupancy trends for 511 populations of terrestrial mammals and birds, representing 244 species from 15 tropical forest protected areas on three continents. For the first time to our knowledge, we use annual surveys from tropical forests worldwide that employ a standardized camera trapping protocol, and we compute data analytics that correct for imperfect detection. We found that occupancy declined in 22%, increased in 17%, and exhibited no change in 22% of populations during the last 3-8 years, while 39% of populations were detected too infrequently to assess occupancy changes. Despite extensive variability in occupancy trends, these 15 tropical protected areas have not exhibited systematic declines in biodiversity (i.e., occupancy, richness, or evenness) at the community level. Our results differ from reports of widespread biodiversity declines based on aggregated secondary data and expert opinion and suggest less extreme deterioration in tropical forest protected areas. We simultaneously fill an important conservation data gap and demonstrate the value of large-scale monitoring infrastructure and powerful analytics, which can be scaled to incorporate additional sites, ecosystems, and monitoring methods. In an era of catastrophic biodiversity loss, robust indicators produced from standardized monitoring infrastructure are critical to accurately assess population outcomes and identify
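
The correction for imperfect detection described above can be illustrated with a minimal single-season occupancy model (MacKenzie-style), in which occupancy probability psi and per-survey detection probability p are estimated jointly from repeated camera-trap survey histories. This grid-search fit and the toy data are a simplified sketch, not the authors' analytical pipeline.

```python
import math
from itertools import product

def occupancy_mle(histories):
    """Grid-search maximum likelihood for a single-season occupancy model:
    psi = probability a site is occupied, p = per-survey detection probability."""
    def loglik(psi, p):
        ll = 0.0
        for h in histories:
            k, d = len(h), sum(h)
            if d > 0:  # occupied for certain; detected d of k surveys
                ll += math.log(psi) + d * math.log(p) + (k - d) * math.log(1 - p)
            else:      # never detected: occupied-but-missed, or truly empty
                ll += math.log(psi * (1 - p) ** k + (1 - psi))
        return ll
    grid = [i / 100 for i in range(1, 100)]
    return max(product(grid, grid), key=lambda t: loglik(*t))

# Toy detection histories: 8 sites x 3 surveys (1 = species detected)
histories = [(1, 0, 1), (0, 0, 0), (1, 1, 0), (0, 0, 0),
             (0, 1, 0), (0, 0, 0), (1, 0, 0), (0, 0, 1)]
psi_hat, p_hat = occupancy_mle(histories)
naive = sum(1 for h in histories if any(h)) / len(histories)  # naive occupancy
```

Because detection is imperfect, the estimated psi exceeds the naive proportion of sites with at least one detection; this is exactly why occupancy trends require model-based correction rather than raw detection rates.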

  10. Evaluating holistic needs assessment in outpatient cancer care—a randomised controlled trial: the study protocol

    PubMed Central

    Snowden, Austyn; Young, Jenny; White, Craig; Murray, Esther; Richard, Claude; Lussier, Marie-Therese; MacArthur, Ewan; Storey, Dawn; Schipani, Stefano; Wheatley, Duncan; McMahon, Jeremy; Ross, Elaine

    2015-01-01

    Introduction People living with and beyond cancer are vulnerable to a number of physical, functional and psychological issues. Undertaking a holistic needs assessment (HNA) is one way to support a structured discussion of patients’ needs within a clinical consultation. However, there is little evidence on how HNA impacts on the dynamics of the clinical consultation. This study aims to establish (1) how HNA affects the type of conversation that goes on during a clinical consultation and (2) how these putative changes impact on shared decision-making and self-efficacy. Methods and analysis The study is hosted by 10 outpatient oncology clinics in the West of Scotland and South West England. Participants are patients with a diagnosis of head and neck, breast, urological, gynaecological and colorectal cancer who have received treatment for their cancer. Patients are randomised to an intervention or control group. The control group entails standard care: routine consultation between the patient and clinician. In the intervention group, the patient completes a holistic needs assessment prior to consultation. The completed assessment is then given to the clinician, where it informs a discussion based on the patient's needs and concerns as identified by them. The primary outcome measure is patient participation, as determined by dialogue ratio (DR) and preponderance of initiative (PI) within the consultation. The secondary outcome measures are shared decision-making and self-efficacy. It is hypothesised that HNA will be associated with greater patient participation within the consultation, and that shared decision-making and feelings of self-efficacy will increase as a function of the intervention. Ethics and dissemination This study has been given a favourable opinion by the West of Scotland Research Ethics Committee and NHS Research & Development. Study findings will be disseminated through peer-reviewed publications and conference attendance. Trial registration number

  11. Development of a standard protocol for monitoring trace elements in continental waters with moss bags: inter- and intraspecific differences.

    PubMed

    Cesa, Mattia; Bertossi, Alberto; Cherubini, Giovanni; Gava, Emanuele; Mazzilis, Denis; Piccoli, Elisa; Verardo, Pierluigi; Nimis, Pier Luigi

    2015-04-01

    This paper contributes to the validation of a standard method for trace element monitoring based on transplants and analysis of aquatic bryophytes, in the framework of the EC Directive 2000/60. It presents the results of an experiment carried out to assess significant differences in the amount and variability of As, Cd, Co, Cr, Cu, Fe, Mn, Ni, Pb and Zn in three moss species (Cinclidotus aquaticus, Fontinalis antipyretica, Platyhypnidium riparioides) and two different parts of the moss (whole plant vs apical tips). Mosses were caged in bags made of a plastic net and transplanted for 2 weeks to an irrigation canal impacted by a waste water treatment plant. Trace element concentrations were measured by inductively coupled plasma optical emission spectrometry (ICP-OES) and inductively coupled plasma mass spectrometry (ICP-MS) before and after exposure to the experimental and control sites in five samples. Enrichment factors >2 were found for Cu, Ni, Mn, Pb and Zn in all moss species, lower in C. aquaticus, intermediate in F. antipyretica and higher in P. riparioides (the species we recommend using). The analysis of apical tips after exposure instead of the whole plant led to (I) lower concentrations of As, Co, Cr, Fe and Zn in C. aquaticus (-7 to -30%) and of Fe and Pb (-13, -18%) in P. riparioides, (II) higher concentrations of Cu, Ni and Zn (+14 to +18%) in P. riparioides, while (III) no significant difference (p > 0.05) in F. antipyretica. Data variability after exposure was generally lower in apical tips, especially in C. aquaticus and F. antipyretica, and less so in P. riparioides. With the aim of standardizing the moss-bag technique, the analysis of apical tips is recommended.
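
The enrichment factors reported above compare element concentrations in the moss after exposure with the pre-exposure baseline, with EF > 2 taken as evidence of enrichment. A minimal sketch with hypothetical concentrations:

```python
def enrichment_factors(baseline, exposed, threshold=2.0):
    """EF = concentration after exposure / concentration before exposure.
    Returns the EF per element plus the elements exceeding `threshold`,
    the cut-off used in the moss-bag study to call an element enriched."""
    ef = {el: exposed[el] / baseline[el] for el in baseline}
    enriched = sorted(el for el, v in ef.items() if v > threshold)
    return ef, enriched

# Hypothetical moss-tissue concentrations (mg/kg dry weight) before and
# after a 2-week transplant; values are illustrative only
baseline = {"Cu": 10.0, "Zn": 40.0, "Cr": 5.0}
exposed = {"Cu": 35.0, "Zn": 90.0, "Cr": 6.0}
ef, enriched = enrichment_factors(baseline, exposed)
```

Here Cu (EF 3.5) and Zn (EF 2.25) would be flagged as enriched, while Cr (EF 1.2) would not.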

  13. Assessment of neuromuscular function after different strength training protocols using tensiomyography.

    PubMed

    de Paula Simola, Rauno Á; Harms, Nico; Raeder, Christian; Kellmann, Michael; Meyer, Tim; Pfeiffer, Mark; Ferrauti, Alexander

    2015-05-01

    The purpose of the study was to analyze tensiomyography (TMG) sensitivity to changes in muscle force and neuromuscular function of the muscle rectus femoris (RF) using TMG muscle properties after 5 different lower-limb strength training protocols (multiple sets; DS = drop sets; eccentric overload; FW = flywheel; PL = plyometrics). After baseline measurements, 14 male strength trained athletes completed 1 squat training protocol per week over a 5-week period in a randomized controlled order. Maximal voluntary isometric contraction (MVIC), TMG measurements of maximal radial displacement of the muscle belly (Dm), contraction time between 10 and 90% of Dm (Tc), and mean muscle contraction velocities from the beginning until 10% (V10) and 90% of Dm (V90) were analyzed up to 0.5 (post-train), 24 (post-24), and 48 hours (post-48) after the training interventions. Significant analysis of variance main effects for measurement points were found for all TMG contractile properties and MVIC (p < 0.01). Dm and V10 post-train values were significantly lower after protocols DS and FW compared with protocol PL (p = 0.032 and 0.012, respectively). Dm, V10, and V90 decrements correlated significantly to the decreases in MVIC (r = 0.64-0.67, p ≤ 0.05). Some TMG muscle properties are sensitive to changes in muscle force, and different lower-limb strength training protocols lead to changes in neuromuscular function of RF. In addition, those protocols involving high and eccentric load and a high total time under tension may induce higher changes in TMG muscle properties. PMID:25474337

  14. Definition of a Standard Protocol to Determine the Growth Potential of Listeria monocytogenes and Yersinia enterocolitica in Pork Sausage Produced in Abruzzo Region, Italy

    PubMed Central

    Neri, Diana; Romantini, Romina; Santarelli, Gino Angelo; Prencipe, Vincenza

    2014-01-01

    Pork meat products consumed raw or after a short period of fermentation can be considered at risk for food safety. Sausages (fresh sausages made from pork meat) are produced in several Italian regions, with variation in ingredients. In some Italian regions, including Abruzzo, these products are frequently consumed raw or undercooked after a variable period of fermentation. European Community food regulation promotes the use of challenge tests to determine safety levels. This study aims to assess the safety of Abruzzo’s sausages by determining the growth potential (δ) of Listeria monocytogenes and Yersinia enterocolitica, and to define a standard experimental protocol document for carrying out challenge tests. Guidelines classify ready-to-eat foods into categories able to support (δ>0.5 log10 CFU/g) or not support (δ≤0.5 log10 CFU/g) the growth of Listeria monocytogenes. The products were manufactured according to traditional recipes and contaminated in the laboratory. Results from the experiment yielded information useful for assessing the ability of these products to support the growth of pathogenic microorganisms. Batches of sausages were stored at 8, 12, 18 and 20°C for statistical evaluation. The results showed that, despite the storage temperature and water activity level, both organisms persisted in the product at concentrations similar to the inoculum or were able to increase. In particular, the period of greatest consumption of this product (7-8 days after preparation) corresponds to the period of greatest growth of the pathogenic microorganisms studied, except for products stored at a temperature of 8°C, which are safer for the consumer. PMID:27800415
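
The growth-potential criterion above (a ready-to-eat food supports growth when δ exceeds 0.5 log10 CFU/g) reduces to a log difference between final and initial counts. A minimal sketch with hypothetical counts:

```python
import math

def growth_potential(initial_cfu_g, final_cfu_g):
    """delta = log10(final count) - log10(initial count), in log10 CFU/g."""
    return math.log10(final_cfu_g) - math.log10(initial_cfu_g)

def supports_growth(delta, cutoff=0.5):
    """A ready-to-eat food supports growth when delta exceeds the cutoff."""
    return delta > cutoff

# Hypothetical L. monocytogenes counts (CFU/g) in a sausage stored at 20 deg C:
# inoculated at 1e3 CFU/g, reaching 5e4 CFU/g at the end of shelf life
delta_warm = growth_potential(1.0e3, 5.0e4)  # about +1.7 log10
delta_cold = growth_potential(1.0e3, 2.0e3)  # about +0.3 log10 at 8 deg C
```

Under these hypothetical numbers, the 20°C scenario supports growth (δ ≈ 1.7 > 0.5) while the refrigerated scenario does not (δ ≈ 0.3 ≤ 0.5), mirroring the study's finding that storage at 8°C is safer for the consumer.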

  15. [Standardizing a protocol of magnetic resonance imaging of temporomandibular joints. Part 2. Unification of analysis of obtained data].

    PubMed

    Bulanova, T V

    2004-01-01

    The paper presents a unified protocol for analyzing the data obtained by magnetic resonance tomography, which has been used to examine 350 patients. It characterizes the MR semiotics of different pathological conditions of articular structures, which are illustrated by MR images. An optimal terminology is proposed for the evaluation of bone and soft tissue changes.

  16. Can DNA-Based Ecosystem Assessments Quantify Species Abundance? Testing Primer Bias and Biomass--Sequence Relationships with an Innovative Metabarcoding Protocol.

    PubMed

    Elbrecht, Vasco; Leese, Florian

    2015-01-01

    Metabarcoding is an emerging genetic tool to rapidly assess biodiversity in ecosystems. It involves high-throughput sequencing of a standard gene from an environmental sample and comparison to a reference database. However, no consensus has emerged regarding laboratory pipelines to screen species diversity and infer species abundances from environmental samples. In particular, the effect of primer bias and the detection limit for specimens with a low biomass has not been systematically examined when processing samples in bulk. We developed and tested a DNA metabarcoding protocol that utilises the standard cytochrome c oxidase subunit I (COI) barcoding fragment to detect freshwater macroinvertebrate taxa. DNA was extracted in bulk, amplified in a single PCR step, and purified, and the libraries were directly sequenced in two independent MiSeq runs (300-bp paired-end reads). Specifically, we assessed the influence of specimen biomass on sequence read abundance by sequencing 31 specimens of a stonefly species with known haplotypes spanning three orders of magnitude in biomass (experiment I). Then, we tested the recovery of 52 different freshwater invertebrate taxa of similar biomass using the same standard barcoding primers (experiment II). Each experiment was replicated ten times to maximise statistical power. The results of both experiments were consistent across replicates. We found a distinct positive correlation between species biomass and resulting numbers of MiSeq reads. Furthermore, we reliably recovered 83% of the 52 taxa used to test primer bias. However, sequence abundance varied by four orders of magnitude between taxa despite the use of similar amounts of biomass. Our metabarcoding approach yielded reliable results for high-throughput assessments. However, the results indicated that primer efficiency is highly species-specific, which would prevent straightforward assessments of species abundance and biomass in a sample. Thus, PCR-based metabarcoding
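
The positive biomass-to-reads relationship described above is typically examined on log-transformed values, since both biomass and read counts span orders of magnitude. This sketch computes a Pearson correlation on hypothetical specimen data; the numbers are illustrative, not the study's measurements.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical specimen biomass (mg) and MiSeq read counts; both are
# log10-transformed before correlating because each spans several orders
# of magnitude
biomass = [0.1, 0.5, 2.0, 10.0, 50.0, 120.0]
reads = [30, 90, 500, 1800, 9000, 21000]
r = pearson([math.log10(b) for b in biomass],
            [math.log10(c) for c in reads])
```

A strong log-log correlation is what lets read counts act as a rough biomass proxy within one species, while the species-specific primer efficiencies reported above break that proxy across species.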

  19. Standardization of tumor markers - priorities identified through external quality assessment.

    PubMed

    Sturgeon, Catharine

    2016-01-01

    Tumor markers are often heterogeneous substances that may be present in elevated concentrations in the serum of cancer patients. Typically measured by immunoassay, they contribute to clinical management, particularly in screening, case-finding, prognostic assessment, and post-treatment monitoring. Data both from external quality assessment (EQA) schemes and clinical studies demonstrate significant variation in tumor marker results obtained for the same specimen using different methods. Between-method, between-laboratory coefficients of variation (CV) reported by EQA schemes generally reflect the complexity of the measurand, ranging from <5% for the structurally relatively simple α-fetoprotein (AFP) to >25% for the complex mucinous cancer antigen 19-9 (CA19-9). Improving the standardization of tumor marker measurements is particularly important for three reasons. The primary use of tumor markers is in monitoring cancer patients over long periods of time. Clinical interpretation of trends may consequently be affected if results are obtained in different laboratories using different methods or if a laboratory has to change method. Differences in results may have major implications for adoption of area-wide decision cut-offs and make implementation of these difficult. Method-related differences also make it difficult to compare clinical studies. Improving comparability of tumor marker results requires broad international agreement about which molecular forms of the measurand have clinical utility, identifying and adopting pure molecular forms as calibrants, and defining antibody specificities for their optimal detection. These aims have been achieved to varying extents for the most frequently measured serum tumor markers as described in this paper. PMID:27542005
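
The between-method, between-laboratory CVs cited above are computed as the standard deviation divided by the mean of the results different methods report for the same EQA specimen. A minimal sketch with hypothetical results, chosen to echo the tight AFP and wide CA19-9 agreement described in the abstract:

```python
import statistics

def percent_cv(results):
    """Coefficient of variation (%) across method/laboratory results
    reported for a single EQA specimen."""
    return 100 * statistics.stdev(results) / statistics.mean(results)

# Hypothetical results for one EQA specimen from five different methods;
# units kU/L for CA19-9 and kU/L for AFP, values illustrative only
ca19_9 = [31.0, 42.0, 55.0, 28.0, 47.0]  # complex mucinous analyte
afp = [10.1, 10.4, 9.8, 10.2, 10.0]      # simpler analyte, tighter agreement
cv_ca, cv_afp = percent_cv(ca19_9), percent_cv(afp)
```

With these illustrative numbers the CA19-9 CV is roughly 28% and the AFP CV roughly 2%, consistent with the >25% and <5% ranges the EQA schemes report.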

  1. The Reliability and Validity of Protocols for the Assessment of Endurance Sports Performance: An Updated Review

    ERIC Educational Resources Information Center

    Stevens, Christopher John; Dascombe, Ben James

    2015-01-01

    Sports performance testing is one of the most common and important measures used in sport science. Performance testing protocols must have high reliability to ensure any changes are not due to measurement error or inter-individual differences. High validity is also important to ensure test performance reflects true performance. Time-trial…

  2. The use of a Stream Visual Assessment Protocol to determine ecosystem integrity in an urban watershed in Puerto Rico

    NASA Astrophysics Data System (ADS)

    de Jesús-Crespo, Rebeca; Ramirez, Alonso

    The growing need to protect stream ecosystems in Puerto Rico requires the development of monitoring procedures that help determine management priorities. Physical habitat assessments have been used to make quick evaluations that are cost efficient and easy to conduct, yet they need to be studied further to understand their accuracy at predicting stream health. This study evaluated the efficiency of the Hawaii Stream Visual Assessment Protocol (HSVAP) at determining the integrity of streams within the highly urbanized Rio Piedras watershed in Puerto Rico. To validate the protocol, we compared results from HSVAP assessments conducted at 16 reaches with water quality and macroinvertebrate data collected at the same sites. Linear regressions between the water quality measures and HSVAP scores showed no significant relationship (R2 = 0.48; p = 0.08), implying that the protocol is not supported by the water quality data. However, regressions of macroinvertebrate diversity and the number of families per site against HSVAP scores showed significant positive relationships (R2 = 0.30, p = 0.02; R2 = 0.24, p = 0.05). In addition, a significant negative relationship was observed between HSVAP scores and the Family Biotic Index (FBI) (R2 = 0.32; p = 0.02). Comparisons between ratings obtained from the FBI and HSVAP scores suggest that the HSVAP classified sites as having higher quality than the biological metric did. Based on these results, it can be concluded that the HSVAP is a good tool for a general assessment of the physical characteristics of a stream, but it needs modifications to accurately assess the ecological quality of streams in Puerto Rico.
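The validation step in this kind of study reduces to simple linear regression of a biological metric against protocol scores, summarized by R². A pure-Python sketch; the (HSVAP score, macroinvertebrate family richness) pairs below are hypothetical:

```python
# Ordinary least-squares fit of family richness against habitat-protocol scores,
# reporting the slope and coefficient of determination (R^2).

def ols_r2(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    slope = sxy / sxx
    r2 = sxy ** 2 / (sxx * syy)  # R^2 for simple OLS
    return slope, r2

hsvap = [52, 61, 48, 70, 65, 55, 58, 74]      # hypothetical protocol scores
families = [6, 9, 5, 12, 10, 7, 8, 13]        # hypothetical richness at the same sites
slope, r2 = ols_r2(hsvap, families)
print(f"slope={slope:.2f}, R2={r2:.2f}")
```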

  3. Assessing the Effectiveness of the NICHD Investigative Interview Protocol when Interviewing French-Speaking Alleged Victims of Child Sexual Abuse in Quebec

    ERIC Educational Resources Information Center

    Cyr, Mireille; Lamb, Michael E.

    2009-01-01

    Objectives: The study was designed to assess the effectiveness of the flexibly structured NICHD Investigative Interview Protocol for child sexual abuse (CSA) investigative interviews by police officers and mental health workers in Quebec. The NICHD Protocol was designed to operationalize "best practice" guidelines and to help forensic interviewers…

  4. A Protocol for Functional Assessment of Whole-Protein Saturation Mutagenesis Libraries Utilizing High-Throughput Sequencing.

    PubMed

    Stiffler, Michael A; Subramanian, Subu K; Salinas, Victor H; Ranganathan, Rama

    2016-01-01

    Site-directed mutagenesis has long been used as a method to interrogate protein structure, function and evolution. Recent advances in massively-parallel sequencing technology have opened up the possibility of assessing the functional or fitness effects of large numbers of mutations simultaneously. Here, we present a protocol for experimentally determining the effects of all possible single amino acid mutations in a protein of interest utilizing high-throughput sequencing technology, using the 263 amino acid antibiotic resistance enzyme TEM-1 β-lactamase as an example. In this approach, a whole-protein saturation mutagenesis library is constructed by site-directed mutagenic PCR, randomizing each position individually to all possible amino acids. The library is then transformed into bacteria, and selected for the ability to confer resistance to β-lactam antibiotics. The fitness effect of each mutation is then determined by deep sequencing of the library before and after selection. Importantly, this protocol introduces methods which maximize sequencing read depth and permit the simultaneous selection of the entire mutation library, by mixing adjacent positions into groups of length accommodated by high-throughput sequencing read length and utilizing orthogonal primers to barcode each group. Representative results using this protocol are provided by assessing the fitness effects of all single amino acid mutations in TEM-1 at a clinically relevant dosage of ampicillin. The method should be easily extendable to other proteins for which a high-throughput selection assay is in place. PMID:27403811
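The fitness effect of each mutation in such a protocol is typically estimated as the log enrichment of mutant read counts relative to wild type across selection. A sketch with hypothetical counts; the pseudocount is an assumption added here to guard against zero counts for strongly depleted mutants:

```python
# Relative fitness from deep-sequencing read counts before and after selection.
from math import log2

def relative_fitness(mut_before, mut_after, wt_before, wt_after, pseudo=0.5):
    """log2 enrichment of a mutant relative to wild type across selection."""
    before = (mut_before + pseudo) / (wt_before + pseudo)
    after = (mut_after + pseudo) / (wt_after + pseudo)
    return log2(after / before)

wt_before, wt_after = 10_000, 12_000                     # hypothetical wild-type counts
print(relative_fitness(500, 650, wt_before, wt_after))   # roughly neutral mutation
print(relative_fitness(500, 20, wt_before, wt_after))    # deleterious: strongly negative
```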

  5. Large-scale hydrological simulations using the Soil Water Assessment Tool, protocol development, and application in the Danube Basin.

    PubMed

    Pagliero, Liliana; Bouraoui, Fayçal; Willems, Patrick; Diels, Jan

    2014-01-01

    The Water Framework Directive of the European Union requires member states to achieve good ecological status of all water bodies. A harmonized pan-European assessment of water resources availability and quality, as affected by various management options, is necessary for a successful implementation of European environmental legislation. In this context, we developed a methodology to predict surface water flow at the pan-European scale using available datasets. Among the hydrological models available, the Soil Water Assessment Tool was selected because its characteristics make it suitable for large-scale applications with limited data requirements. This paper presents the results for the Danube pilot basin. The Danube Basin is one of the largest European watersheds, covering approximately 803,000 km² and portions of 14 countries. The modeling data used included land use and management information, a detailed soil parameters map, and high-resolution climate data. The Danube Basin was divided into 4663 subwatersheds of an average size of 179 km². A modeling protocol is proposed to cope with the problems of hydrological regionalization from gauged to ungauged watersheds and overparameterization and identifiability, which are usually present during calibration. The protocol involves a cluster analysis for the determination of hydrological regions and multiobjective calibration using a combination of manual and automated calibration. The proposed protocol was successfully implemented, with the modeled discharges capturing well the overall hydrological behavior of the basin.
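A widely used objective for judging how well modeled discharges capture observed behavior in this kind of calibration is the Nash-Sutcliffe efficiency (NSE), which scores 1 for a perfect fit and at or below 0 when the model is no better than the observed mean. A sketch with hypothetical discharge series (m³/s); the specific objective functions used in the paper are not stated here:

```python
# Nash-Sutcliffe efficiency of a modeled discharge series against observations.

def nse(observed, modeled):
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - m) ** 2 for o, m in zip(observed, modeled))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - ss_res / ss_tot

obs = [120, 340, 510, 280, 150, 90]   # hypothetical observed discharge
mod = [110, 360, 480, 300, 140, 100]  # hypothetical modeled discharge
print(f"NSE = {nse(obs, mod):.3f}")
```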

  6. Assessing the standard Molybdenum projector augmented wave VASP potentials

    SciTech Connect

    Mattsson, Ann E.

    2014-07-01

    Density Functional Theory (DFT) based Equation of State (EOS) construction is a prominent part of Sandia’s capabilities to support engineering sciences. This capability is based on augmenting experimental data with information gained from computational investigations, especially in those parts of the phase space where experimental data is hard, dangerous, or expensive to obtain. A key part of the success of the Sandia approach is the fundamental science work supporting the computational capability. Not only does this work enhance the capability to perform highly accurate calculations but it also provides crucial insight into the limitations of the computational tools, providing high confidence in the results even where results cannot be, or have not yet been, validated by experimental data. This report concerns the key ingredient of projector augmented-wave (PAW) potentials for use in pseudo-potential computational codes. Using the tools discussed in SAND2012-7389 we assess the standard Vienna Ab-initio Simulation Package (VASP) PAWs for Molybdenum.

  7. Implementing the CAS Standards: The Implementation of the CAS Standards in Student Affairs as a Comprehensive Assessment Approach

    ERIC Educational Resources Information Center

    Dorman, Jesse A.

    2012-01-01

    The increasing use of the CAS standards as a comprehensive assessment approach in divisions of student affairs necessitates a more in-depth understanding of how the CAS standards are being implemented in these settings. In response to increasing calls for improvement, accountability and professionalism in student affairs (Bresciani, 2006; Cooper…

  8. Assessment and Next Generation Standards: An Interview with Olivia Gude

    ERIC Educational Resources Information Center

    Sweeny, Robert

    2014-01-01

    This article provides a transcript of an interview with Olivia Gude, member of the National Coalition for Core Arts Standards Writing Team. In the interview, Gude provides an overview of the process for writing the new visual arts standards.

  9. Formative Assessment for the Common Core Literacy Standards

    ERIC Educational Resources Information Center

    Calfee, Robert; Wilson, Kathleen M.; Flannery, Brian; Kapinus, Barbara A.

    2014-01-01

    Background/Context: As implementation of the Common Core Literacy Standards moves ahead, teachers, students, and schools are discovering that the standards demand a great deal of them in order to achieve the vision of college, career, and citizenship in the global-digital world outlined in the standards. To accomplish the goals and high…

  10. Standardization of Test for Assessment and Comparing of Students' Measurement

    ERIC Educational Resources Information Center

    Osadebe, Patrick U.

    2014-01-01

    The study standardized an Economics Achievement Test for senior secondary school students in Nigeria. Three research questions guided the study. The test in Economics was first constructed by an expert as a valid and reliable instrument, and was then used for standardization in this study. That is, ensuring that the Economics…

  11. A standard method for measuring benzene and formaldehyde emissions from candles in emission test chambers for human health risk assessment purposes.

    PubMed

    Petry, Thomas; Cazelle, Elodie; Lloyd, Paul; Mascarenhas, Reuben; Stijntjes, Gerard

    2013-07-01

    Burning candles release a number of volatile or semi-volatile organic compounds (VOC; SVOC) and particulate matters into indoor air. Publicly available candle emission studies vary in protocols and factors known to have a great influence on combustion processes, making it difficult to determine potential implications of candle emissions for human health. The main objective of this investigation was to establish and standardize as far as possible a candle VOC emission testing protocol in small- to mid-scale test chambers on the basis of existing standards as well as to verify its suitability for human health risk assessment purposes. Two pilot studies were conducted to define the boundaries of permissible variations in chamber parameters without significantly impacting the quality of the candle burn. A four-centre ring trial assessed the standardised protocol. The ring trial revealed that when the laboratories were able to control the chamber parameters within the defined boundaries, reproducible formaldehyde and benzene emissions, considered as VOC markers, are determined. It was therefore concluded that the protocol developed in this investigation is suitable for generating candle VOC emission data for human health risk assessment purposes. PMID:23695106
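Chamber studies of this kind derive source strengths from the concentrations measured in the test chamber. A minimal well-mixed, steady-state mass-balance sketch: at steady state C = E / (N · V), so the emission rate is E = C · N · V. All numeric values below are hypothetical, not taken from the ring trial:

```python
# Emission rate implied by a steady-state concentration in a ventilated,
# well-mixed emission test chamber.

def emission_rate_ug_per_h(conc_ug_m3, air_changes_per_h, volume_m3):
    """E = C * N * V for a well-mixed chamber at steady state."""
    return conc_ug_m3 * air_changes_per_h * volume_m3

# e.g. 30 ug/m3 of a marker VOC in a 1 m3 chamber at 1.0 air changes per hour
print(emission_rate_ug_per_h(30.0, 1.0, 1.0))  # 30.0 ug/h
```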

  12. Protocol for the Assessment of Common Core Teaching: The Impact of Instructional Inclusion on Students with Special Needs

    ERIC Educational Resources Information Center

    Gallagher, Kathleen L.; Odozi, Anthony

    2015-01-01

    The quality of instruction in the classroom is the most powerful leverage point for school improvement because it is the only thing over which educators have a significant degree of control. As student assessments change to reflect the higher expectations of Common Core State Standards (CCSS), it is important that the assessment and development of…

  13. In-Suit Light Exercise (ISLE) Prebreathe Protocol Peer Review Assessment. Volume 1

    NASA Technical Reports Server (NTRS)

    Brady, Timothy K.; Polk, James D.

    2011-01-01

    The performance of extravehicular activity (EVA) by National Aeronautics and Space Administration astronauts involves the risk of decompression sickness. This risk has been mitigated by the use of oxygen "prebreathe" to effectively wash out tissue nitrogen prior to each EVA. Now that the Space Shuttle Program (SSP) is being retired, high-pressure oxygen will become a limited resource. The In-Suit Light Exercise (ISLE) Prebreathe Protocol offers several potential benefits including its potential to save 6 pounds of oxygen per EVA. At the request of the NASA Engineering and Safety Center, the peer review convened on October 14, 2010. The major recommendation of the Review Committee was that the ISLE protocol was acceptable for operational use as a prebreathe option prior to EVA. The results from the peer review are contained in this document.

  14. In-Suit Light Exercise (ISLE) Prebreathe Protocol Peer Review Assessment. Part 2; Appendices

    NASA Technical Reports Server (NTRS)

    Brady, Timothy K.; Polk, James D.

    2011-01-01

    The performance of extravehicular activity (EVA) by National Aeronautics and Space Administration astronauts involves the risk of decompression sickness. This risk has been mitigated by the use of oxygen "prebreathe" to effectively wash out tissue nitrogen prior to each EVA. Now that the Space Shuttle Program (SSP) is being retired, high-pressure oxygen will become a limited resource. The In-Suit Light Exercise (ISLE) Prebreathe Protocol offers several potential benefits including its potential to save 6 pounds of oxygen per EVA. At the request of the NASA Engineering and Safety Center, the peer review convened on October 14, 2010. The major recommendation of the Review Committee was that the ISLE protocol was acceptable for operational use as a prebreathe option prior to EVA. The appendices to Volume I of the report are contained in this document.

  15. Critical assessment of OmpF channel selectivity: merging information from different experimental protocols

    NASA Astrophysics Data System (ADS)

    López, M. L.; García-Giménez, E.; Aguilella, V. M.; Alcaraz, A.

    2010-11-01

    The ion selectivity of a channel can be quantified in several ways by using different experimental protocols. A wide, mesoscopic channel, the OmpF porin of the outer membrane of E. coli, serves as a case study for comparing and analysing several measures of the channel cation-anion permeability in chlorides of alkali metals (LiCl, NaCl, KCl, CsCl). We show how different insights can be gained and integrated to rationalize the global image of channel selectivity. To this end, reversal potential, channel conductance and bi-ionic potential (two different salts with a common anion on each side of the channel but with the same concentration) experiments are discussed in light of an electrodiffusion model based on the Poisson-Nernst-Planck formalism. Measurements and calculations based on the atomic crystal structure of the channel show that each protocol displays a particular balance between the different sources of selectivity.

  16. Two Models for Evaluating Alignment of State Standards and Assessments: Competing or Complementary Perspectives?

    ERIC Educational Resources Information Center

    Newton, Jill A.; Kasten, Sarah E.

    2013-01-01

    The release of the Common Core State Standards for Mathematics and their adoption across the United States calls for careful attention to the alignment between mathematics standards and assessments. This study investigates 2 models that measure alignment between standards and assessments, the Surveys of Enacted Curriculum (SEC) and the Webb…

  17. Using Microsoft Excel to Assess Standards: A "Techtorial". Article #2 in a 6-Part Series

    ERIC Educational Resources Information Center

    Mears, Derrick

    2009-01-01

    Standards-based assessment is a term currently being used quite often in educational reform discussions. The philosophy behind this initiative is to utilize "standards" or "benchmarks" to focus instruction and assessments of student learning. The National Standards for Physical Education (NASPE, 2004) provide a framework to guide this process for…

  18. Assessing and Managing Risk with Suicidal Individuals

    ERIC Educational Resources Information Center

    Linehan, Marsh M.; Comtois, Katherine A.; Ward-Ciesielski, Erin F.

    2012-01-01

    The University of Washington Risk Assessment Protocol (UWRAP) and Risk Assessment and Management Protocol (UWRAMP) have been used in numerous clinical trials treating high-risk suicidal individuals over several years. These protocols structure assessors and treatment providers to provide a thorough suicide risk assessment, review standards of care…

  19. Bird biodiversity assessments in temperate forest: the value of point count versus acoustic monitoring protocols

    PubMed Central

    Willig, Michael R.

    2015-01-01

    Effective monitoring programs for biodiversity are needed to assess trends in biodiversity and evaluate the consequences of management. This is particularly true for birds and faunas that occupy interior forest and other areas of low human population density, as these are frequently under-sampled compared to other habitats. For birds, Autonomous Recording Units (ARUs) have been proposed as a supplement or alternative to point counts made by human observers to enhance monitoring efforts. We employed two strategies (i.e., simultaneous-collection and same-season) to compare point count and ARU methods for quantifying species richness and composition of birds in temperate interior forests. The simultaneous-collection strategy compares surveys by ARUs and point counts, with methods matched in time, location, and survey duration such that the person and machine simultaneously collect data. The same-season strategy compares surveys from ARUs and point counts conducted at the same locations throughout the breeding season, but methods differ in the number, duration, and frequency of surveys. This second strategy more closely follows the ways in which monitoring programs are likely to be implemented. Site-specific estimates of richness (but not species composition) differed between methods; however, the nature of the relationship was dependent on the assessment strategy. Estimates of richness from point counts were greater than estimates from ARUs in the simultaneous-collection strategy. Woodpeckers in particular, were less frequently identified from ARUs than point counts with this strategy. Conversely, estimates of richness were lower from point counts than ARUs in the same-season strategy. Moreover, in the same-season strategy, ARUs detected the occurrence of passerines at a higher frequency than did point counts. Differences between ARU and point count methods were only detected in site-level comparisons. Importantly, both methods provide similar estimates of species

  20. Bird biodiversity assessments in temperate forest: the value of point count versus acoustic monitoring protocols.

    PubMed

    Klingbeil, Brian T; Willig, Michael R

    2015-01-01

    Effective monitoring programs for biodiversity are needed to assess trends in biodiversity and evaluate the consequences of management. This is particularly true for birds and faunas that occupy interior forest and other areas of low human population density, as these are frequently under-sampled compared to other habitats. For birds, Autonomous Recording Units (ARUs) have been proposed as a supplement or alternative to point counts made by human observers to enhance monitoring efforts. We employed two strategies (i.e., simultaneous-collection and same-season) to compare point count and ARU methods for quantifying species richness and composition of birds in temperate interior forests. The simultaneous-collection strategy compares surveys by ARUs and point counts, with methods matched in time, location, and survey duration such that the person and machine simultaneously collect data. The same-season strategy compares surveys from ARUs and point counts conducted at the same locations throughout the breeding season, but methods differ in the number, duration, and frequency of surveys. This second strategy more closely follows the ways in which monitoring programs are likely to be implemented. Site-specific estimates of richness (but not species composition) differed between methods; however, the nature of the relationship was dependent on the assessment strategy. Estimates of richness from point counts were greater than estimates from ARUs in the simultaneous-collection strategy. Woodpeckers in particular, were less frequently identified from ARUs than point counts with this strategy. Conversely, estimates of richness were lower from point counts than ARUs in the same-season strategy. Moreover, in the same-season strategy, ARUs detected the occurrence of passerines at a higher frequency than did point counts. Differences between ARU and point count methods were only detected in site-level comparisons. Importantly, both methods provide similar estimates of species
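At the site level, both comparison strategies reduce to counting the species each method detected at least once across its surveys. A minimal sketch; the detection records and species below are hypothetical:

```python
# Site-level species richness from per-survey detection records for one method.

def site_richness(detections):
    """detections: iterable of (survey_id, species) tuples for one site/method."""
    return len({species for _, species in detections})

point_count = [(1, "Ovenbird"), (1, "Wood Thrush"), (2, "Ovenbird"),
               (2, "Hairy Woodpecker"), (3, "Red-eyed Vireo")]
aru = [(1, "Ovenbird"), (2, "Wood Thrush"), (3, "Red-eyed Vireo"),
       (3, "Scarlet Tanager")]

print(site_richness(point_count))  # 4
print(site_richness(aru))          # 4 (same richness, different composition)
```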

  1. Effects of standard training in the use of closed-circuit televisions in visually impaired adults: design of a training protocol and a randomized controlled trial

    PubMed Central

    2010-01-01

    Background Reading problems are frequently reported by visually impaired persons. A closed-circuit television (CCTV) can be helpful to maintain reading ability; however, it is difficult to learn how to use this device. In the Netherlands, an evidence-based rehabilitation program in the use of CCTVs was lacking. Therefore, a standard training protocol needed to be developed and tested in a randomized controlled trial (RCT) to provide an evidence-based training program in the use of this device. Methods/Design To develop a standard training program, information was collected by studying literature, observing training in the use of CCTVs, discussing the content of the training program with professionals and organizing focus and discussion groups. The effectiveness of the program was evaluated in an RCT, to obtain an evidence-based training program. Dutch patients (n = 122) were randomized into a treatment group: normal instructions from the supplier combined with training in the use of CCTVs, or into a control group: instructions from the supplier only. The effect of the training program was evaluated in terms of: change in reading ability (reading speed and reading comprehension), patients' skills to operate the CCTV, perceived (vision-related) quality of life and tasks performed in daily living. Discussion The development of the CCTV training protocol and the design of the RCT in the present study may serve as an example to obtain an evidence-based training program. The training program was adjusted to the needs and learning abilities of individual patients; however, for scientific reasons it might have been preferable to standardize the protocol further, in order to gain more comparable results. Trial registration http://www.trialregister.nl, identifier: NTR1031 PMID:20219120

  2. Validation of the automatic image analyser to assess retinal vessel calibre (ALTAIR): a prospective study protocol

    PubMed Central

    Garcia-Ortiz, Luis; Gómez-Marcos, Manuel A; Recio-Rodríguez, Jose I; Maderuelo-Fernández, Jose A; Chamoso-Santos, Pablo; Rodríguez-González, Sara; de Paz-Santana, Juan F; Merchan-Cifuentes, Miguel A; Corchado-Rodríguez, Juan M

    2014-01-01

    Introduction The fundus examination is a non-invasive evaluation of the microcirculation of the retina. The aim of the present study is to develop and validate (reliability and validity) the ALTAIR software platform (Automatic image analyser to assess retinal vessel calibre) in order to analyse its utility in different clinical environments. Methods and analysis A cross-sectional study in the first phase and a prospective observational study in the second with 4 years of follow-up. The study will be performed in a primary care centre and will include 386 participants. The main measurements will include carotid intima-media thickness, pulse wave velocity by Sphygmocor, cardio-ankle vascular index through the VASERA VS-1500, cardiac evaluation by a digital ECG and renal injury by microalbuminuria and glomerular filtration. The retinal vascular evaluation will be performed using a TOPCON TRCNW200 non-mydriatic retinal camera to obtain digital images of the retina, and the developed software (ALTAIR) will be used to automatically calculate the calibre of the retinal vessels, the vascularised area and the branching pattern. For software validation, the intraobserver and interobserver reliability, the concurrent validity of the vascular structure and function, as well as the association between the estimated retinal parameters and the evolution or onset of new lesions in the target organs or cardiovascular diseases will be examined. Ethics and dissemination The study has been approved by the clinical research ethics committee of the healthcare area of Salamanca. All study participants will sign an informed consent to agree to participate in the study in compliance with the Declaration of Helsinki and the WHO standards for observational studies. Validation of this tool will provide greater reliability to the analysis of retinal vessels by decreasing the intervention of the observer and will result in increased validity through the use of additional information, especially

  3. Assessment of equity in healthcare financing in Fiji and Timor-Leste: a study protocol

    PubMed Central

    Asante, Augustine D; Price, Jennifer; Hayen, Andrew; Irava, Wayne; Martins, Joao; Guinness, Lorna; Ataguba, John E; Limwattananon, Supon; Mills, Anne; Jan, Stephen; Wiseman, Virginia

    2014-01-01

    Introduction Equitable health financing remains a key health policy objective worldwide. In low and middle-income countries (LMICs), there is evidence that many people are unable to access the health services they need due to financial and other barriers. There are growing calls for fairer health financing systems that will protect people from catastrophic and impoverishing health payments in times of illness. This study aims to assess equity in healthcare financing in Fiji and Timor-Leste in order to support government efforts to improve access to healthcare and move towards universal health coverage in the two countries. Methods and analysis The study employs two standard measures of equity in health financing increasingly being applied in LMICs—benefit incidence analysis (BIA) and financing incidence analysis (FIA). In Fiji, we will use a combination of secondary and primary data including a Household Income and Expenditure Survey, National Health Accounts, and data from a cross-sectional household survey on healthcare utilisation. In Timor-Leste, the World Bank recently completed a health equity and financial protection analysis that incorporates BIA and FIA, and found that the distribution of benefits from healthcare financing is pro-rich. Building on this work, we will explore the factors that influence the pro-rich distribution. Ethics and dissemination The study is approved by the Human Research Ethics Committee of University of New South Wales, Australia (Approval number: HC13269); the Fiji National Health Research Committee (Approval # 201371); and the Timor-Leste Ministry of Health (Ref MS/UNSW/VI/218). Results Study outcomes will be disseminated through stakeholder meetings, targeted multidisciplinary seminars, peer-reviewed journal publications, policy briefs and the use of other web-based technologies including social media. A user-friendly toolkit on how to analyse healthcare financing equity will be developed for use by policymakers and
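Financing incidence analysis of the kind described here typically summarizes progressivity with a concentration index, C = 2 · cov(payment, fractional income rank) / mean payment: positive values indicate the better-off contribute a larger share. A pure-Python sketch with hypothetical household payments ordered from poorest to richest:

```python
# Concentration index for health payments across households ranked by income.

def concentration_index(payments_by_income_order):
    """Payments must be ordered from poorest to richest household."""
    n = len(payments_by_income_order)
    mean_pay = sum(payments_by_income_order) / n
    ranks = [(i + 0.5) / n for i in range(n)]  # fractional income ranks
    mean_rank = sum(ranks) / n                 # always 0.5
    cov = sum((p - mean_pay) * (r - mean_rank)
              for p, r in zip(payments_by_income_order, ranks)) / n
    return 2 * cov / mean_pay

# payments rising with income -> progressive financing (C > 0)
print(concentration_index([10, 20, 30, 40, 50]))
```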

  4. A Novel Staining Protocol for Multiparameter Assessment of Cell Heterogeneity in Phormidium Populations (Cyanobacteria) Employing Fluorescent Dyes

    PubMed Central

    Tashyreva, Daria; Elster, Josef; Billi, Daniela

    2013-01-01

    Bacterial populations display high heterogeneity in viability and physiological activity at the single-cell level, especially under stressful conditions. We demonstrate a novel staining protocol for multiparameter assessment of individual cells in physiologically heterogeneous populations of cyanobacteria. The protocol employs fluorescent probes, i.e., redox dye 5-cyano-2,3-ditolyl tetrazolium chloride, ‘dead cell’ nucleic acid stain SYTOX Green, and DNA-specific fluorochrome 4′,6-diamidino-2-phenylindole, combined with microscopy image analysis. Our method allows simultaneous estimates of cellular respiration activity, membrane and nucleoid integrity, and allows the detection of photosynthetic pigments fluorescence along with morphological observations. The staining protocol has been adjusted for, both, laboratory and natural populations of the genus Phormidium (Oscillatoriales), and tested on 4 field-collected samples and 12 laboratory strains of cyanobacteria. Based on the mentioned cellular functions we suggest classification of cells in cyanobacterial populations into four categories: (i) active and intact; (ii) injured but active; (iii) metabolically inactive but intact; (iv) inactive and injured, or dead. PMID:23437052

  5. A protocol to assess the enzymatic release of dissolved organic phosphorus species in waters under environmentally relevant conditions.

    PubMed

    Monbet, Phil; McKelvie, Ian D; Saefumillah, Asep; Worsfold, Paul J

    2007-11-01

    A protocol to assess the potential release of dissolved reactive phosphorus (DRP) by enzymatic hydrolysis of dissolved organic phosphorus (DOP) in waters (sediment porewater and sewage liquors in this study) under environmental conditions is presented. This protocol enables the quantification of different classes of DOP compounds using a variety of phosphatase enzymes, i.e., alkaline phosphatase, phosphodiesterase, and phytase. All experiments were carried out within the pH range of most natural waters, i.e., at neutral (pH 7) or slightly alkaline pH (pH 9). Tri-sodium citrate and sodium dodecyl sulfate (SDS) were used in the assays to prevent interferences due to adsorption processes in the presence of multivalent metallic cations and to minimize protein binding. Applying this protocol revealed that labile phosphate monoesters always represented the largest fraction of enzymatically hydrolyzed P in sewage liquors and sediment porewater. Total enzymatically hydrolyzable P (EHP) represented only 16% of the TDP in the sediment porewater but up to 43% in sewage liquors. Because most of the enzymes used in this study are likely to exist in aquatic ecosystems, the EHP fraction might represent a source of potentially bioavailable P of similar magnitude to DRP. PMID:18044529

  6. Transformative Shifts in Art History Teaching: The Impact of Standards-Based Assessment

    ERIC Educational Resources Information Center

    Ormond, Barbara

    2011-01-01

    This article examines pedagogical shifts in art history teaching that have developed as a response to the implementation of a standards-based assessment regime. The specific characteristics of art history standards-based assessment in the context of New Zealand secondary schools are explained to demonstrate how an exacting form of assessment has…

  7. Comparing Panelists' Understanding of Standard Setting across Multiple Levels of an Alternate Science Assessment

    ERIC Educational Resources Information Center

    Hansen, Mary A.; Lyon, Steven R.; Heh, Peter; Zigmond, Naomi

    2013-01-01

    Large-scale assessment programs, including alternate assessments based on alternate achievement standards (AA-AAS), must provide evidence of technical quality and validity. This study provides information about the technical quality of one AA-AAS by evaluating the standard setting for the science component. The assessment was designed to have…

  8. Assessing the Quality of the Common Core State Standards for Mathematics

    ERIC Educational Resources Information Center

    Cobb, Paul; Jackson, Kara

    2011-01-01

    The authors comment on Porter, McMaken, Hwang, and Yang's recent analysis of the Common Core State Standards for Mathematics by critiquing their measures of the focus of the standards and the absence of an assessment of coherence. The authors then consider whether the standards are an improvement over most state mathematics standards by discussing…

  9. Lower limbs kinematic assessment of the effect of a gym and hydrotherapy rehabilitation protocol after knee megaprosthesis: a case report

    PubMed Central

    Lovecchio, Nicola; Sciumè, Luciana; Zago, Matteo; Panella, Lorenzo; Lopresti, Maurizio; Sforza, Chiarella

    2016-01-01

    [Purpose] To quantitatively assess the effect of a personalized rehabilitation protocol after knee megaprosthesis. [Subject and Methods] The gait patterns of a 33-year-old male patient with knee synovial sarcoma were assessed by a computerized analysis before and after 40 rehabilitation sessions. [Results] The rehabilitation protocol improved the gait pattern. After rehabilitation, hip flexion was nearly symmetric, with normalized affected limb hip flexion, and improved ankle flexion. Ankle in/eversion was asymmetric and did not improve after physiotherapy. Before physiotherapy, the hip flexion on the affected side anticipated the movement but nearly normalized in the follow-up assessment. Hip abduction range of motion increased, with wider movements and good balance. Knee range of motion nearly symmetrized, but maintained an anticipated behavior, without shock absorption at heel-strike. [Conclusion] Instrumental gait analysis allowed us to gain evidence about the training and how to expand rehabilitative interventions to improve efficacy. In particular, we recommend quadriceps and gastrocnemius eccentric contraction training (to improve the shock absorption phase, preventing early failures of the prosthesis); one-leg standing performance (to improve the support phase of the affected limb); adductor strength training (to aid in hip control during the swing phase); and peroneus strength training (to increase ankle joint stabilization). PMID:27134413
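
Instrumented gait reports like this one commonly condense side-to-side comparisons into a symmetry index. A minimal sketch using one common formulation (difference over mean); the range-of-motion values are hypothetical, not the patient's data:

```python
# Sketch: a simple gait symmetry index per joint angle.
# 0% = perfect symmetry; the sign shows which side moves more.

def symmetry_index(affected, unaffected):
    """Side-to-side difference normalized by the mean, in percent."""
    return 100.0 * (affected - unaffected) / (0.5 * (affected + unaffected))

pre  = symmetry_index(affected=30.0, unaffected=40.0)   # hypothetical pre-rehab ROM (deg)
post = symmetry_index(affected=38.0, unaffected=40.0)   # hypothetical post-rehab ROM (deg)
print(f"hip flexion SI: {pre:.1f}% -> {post:.1f}%")
```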

  10. Lower limbs kinematic assessment of the effect of a gym and hydrotherapy rehabilitation protocol after knee megaprosthesis: a case report.

    PubMed

    Lovecchio, Nicola; Sciumè, Luciana; Zago, Matteo; Panella, Lorenzo; Lopresti, Maurizio; Sforza, Chiarella

    2016-03-01

    [Purpose] To quantitatively assess the effect of a personalized rehabilitation protocol after knee megaprosthesis. [Subject and Methods] The gait patterns of a 33-year-old male patient with knee synovial sarcoma were assessed by a computerized analysis before and after 40 rehabilitation sessions. [Results] The rehabilitation protocol improved the gait pattern. After rehabilitation, hip flexion was nearly symmetric, with normalized affected limb hip flexion, and improved ankle flexion. Ankle in/eversion was asymmetric and did not improve after physiotherapy. Before physiotherapy, the hip flexion on the affected side anticipated the movement but nearly normalized in the follow-up assessment. Hip abduction range of motion increased, with wider movements and good balance. Knee range of motion nearly symmetrized, but maintained an anticipated behavior, without shock absorption at heel-strike. [Conclusion] Instrumental gait analysis allowed us to gain evidence about the training and how to expand rehabilitative interventions to improve efficacy. In particular, we recommend quadriceps and gastrocnemius eccentric contraction training (to improve the shock absorption phase, preventing early failures of the prosthesis); one-leg standing performance (to improve the support phase of the affected limb); adductor strength training (to aid in hip control during the swing phase); and peroneus strength training (to increase ankle joint stabilization). PMID:27134413

  12. Population Dynamics P System (PDP) Models: A Standardized Protocol for Describing and Applying Novel Bio-Inspired Computing Tools

    PubMed Central

    Colomer, Maria Àngels; Margalida, Antoni; Pérez-Jiménez, Mario J.

    2013-01-01

    Today, the volume of data and our knowledge of the underlying processes call for more complex models that integrate all available information. Technological advances in both software and hardware have made such models feasible, giving rise to a new family known as computational models. Because these models cannot be expressed analytically, they are difficult to describe, and protocols are needed to serve as guidelines for future users. Population Dynamics P system (PDP) models are a novel and effective computational tool for modeling complex problems: they work in parallel (simultaneously interrelating different processes), are modular, and are computationally efficient. The difficulty of describing them, however, calls for a protocol that unifies their presentation and the steps to follow. We use two case studies to demonstrate the use and implementation of these computational models for studies of population dynamics and ecological processes, briefly discussing their potential applicability to simulating complex ecosystem dynamics. PMID:23593284
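
The core update mechanism behind P-system models is a maximally parallel multiset-rewriting step. The sketch below is a deterministic toy illustration of that idea only, not the PDP framework itself; the objects and rules are invented for illustration:

```python
from collections import Counter

# Minimal sketch of one maximally parallel evolution step in a
# membrane (P) system: every rule consumes as many copies of its
# left-hand side as the current multiset allows, and all products
# are added afterwards. Deterministic toy, not the full PDP model.

def step(multiset, rules):
    """rules: list of (lhs, rhs) Counters, applied maximally in order."""
    avail, produced = Counter(multiset), Counter()
    for lhs, rhs in rules:
        n = min(avail[obj] // k for obj, k in lhs.items())  # max applications
        for obj, k in lhs.items():
            avail[obj] -= n * k                             # consume reactants
        for obj, k in rhs.items():
            produced[obj] += n * k                          # queue products
    return avail + produced

# toy predator-prey rules: a + b -> b (predation), then a -> 2a (reproduction)
rules = [(Counter({"a": 1, "b": 1}), Counter({"b": 1})),
         (Counter({"a": 1}), Counter({"a": 2}))]
print(step(Counter({"a": 5, "b": 2}), rules))
```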

  13. Population Dynamics P system (PDP) models: a standardized protocol for describing and applying novel bio-inspired computing tools.

    PubMed

    Colomer, Maria Àngels; Margalida, Antoni; Pérez-Jiménez, Mario J

    2013-01-01

    Today, the volume of data and our knowledge of the underlying processes call for more complex models that integrate all available information. Technological advances in both software and hardware have made such models feasible, giving rise to a new family known as computational models. Because these models cannot be expressed analytically, they are difficult to describe, and protocols are needed to serve as guidelines for future users. Population Dynamics P system (PDP) models are a novel and effective computational tool for modeling complex problems: they work in parallel (simultaneously interrelating different processes), are modular, and are computationally efficient. The difficulty of describing them, however, calls for a protocol that unifies their presentation and the steps to follow. We use two case studies to demonstrate the use and implementation of these computational models for studies of population dynamics and ecological processes, briefly discussing their potential applicability to simulating complex ecosystem dynamics.

  14. Social Moderation, Assessment and Assuring Standards for Accounting Graduates

    ERIC Educational Resources Information Center

    Watty, Kim; Freeman, Mark; Howieson, Bryan; Hancock, Phil; O'Connell, Brendan; de Lange, Paul; Abraham, Anne

    2014-01-01

    Evidencing student achievement of standards is a growing imperative worldwide. Key stakeholders (including current and prospective students, government, regulators and employers) want confidence that threshold learning standards in an accounting degree have been assured. Australia's new higher education regulatory environment requires that…

  15. Tying Together the Common Core of Standards, Instruction, and Assessments

    ERIC Educational Resources Information Center

    Phillips, Vicki; Wong, Carina

    2010-01-01

    Clear, high standards will enable us to develop an education system that ensures that high school graduates are ready for college. The Bill & Melinda Gates Foundation has been working with other organizations to develop a Common Core of Standards. The partners working with the foundation are developing tools that will show teachers what is…

  16. Standards, Assessments, and Accountability. Education Policy White Paper

    ERIC Educational Resources Information Center

    Shepard, Lorrie, Ed.; Hannaway, Jane, Ed.; Baker, Eva, Ed.

    2009-01-01

    Standards-based education reform has a more than 20-year history. A standards-based vision was enacted in federal law under the Clinton administration with the 1994 reauthorization of the Elementary and Secondary Education Act (ESEA) and carried forward under the Bush administration with the No Child Left Behind Act (NCLB) of 2001. In a recent…

  17. Initial recommendations for higher-tier risk assessment protocols for bumble bees, Bombus spp. (Hymenoptera: Apidae).

    PubMed

    Cabrera, Ana R; Almanza, Maria Teresa; Cutler, G Christopher; Fischer, David L; Hinarejos, Silvia; Lewis, Gavin; Nigro, Daniel; Olmstead, Allen; Overmyer, Jay; Potter, Daniel A; Raine, Nigel E; Stanley-Stahr, Cory; Thompson, Helen; van der Steen, Jozef

    2016-04-01

    Global declines of bumble bees and other pollinator populations are of concern because of their critical role in crop production and the maintenance of wild plant biodiversity. Although the consensus among scientists is that the interaction of many factors, including habitat loss, forage scarcity, diseases, parasites, and pesticides, potentially plays a role in causing these declines, pesticides have received considerable attention and scrutiny. In response, regulatory agencies have introduced more stringent pollinator testing requirements for registration and reregistration of pesticides, to ensure that the risks to pollinators are minimized. In this context, guidelines for testing bumble bees (Bombus spp.) in regulatory studies are not yet available, and a pressing need exists to develop suitable protocols for routine higher-tier studies with these non-Apis social bees. To meet this need, Bayer CropScience LP, Syngenta Crop Protection LLC US, and Valent USA Corporation organized a workshop bringing together a group of global experts on bumble bee behavior, ecology, and ecotoxicology to discuss and develop draft protocols for both semi-field (Tier II) and field (Tier III) studies. The workshop was held May 8-9, 2014, at the Bayer Bee Care Center, North Carolina, USA. The participants represented academic, consulting, and industry scientists from Europe, Canada, the United States, and Brazil. The workshop identified a clear protection goal and generated proposals for basic experimental designs, relevant measurements, and endpoints for both semi-field (tunnel) and field tests. These initial recommendations are intended to form the basis of discussions to help advance the development of appropriate protocol guidelines.

  18. SMARTER Balanced Assessment Consortium Common Core State Standards Analysis: Eligible Content for the Summative Assessment. Final Report

    ERIC Educational Resources Information Center

    Sato, Edynn; Lagunoff, Rachel; Worth, Peter

    2011-01-01

    This report is a descriptive analysis of the Common Core State Standards (CCSS), intended to determine which content is eligible for the Smarter Balanced Assessment Consortium's end-of-year summative assessment for English language arts (ELA) and mathematics in grades 3-8 and high school. The high school standards analyzed were those in grades…

  19. Calibration of the Delaware Rapid Assessment Protocol to a Comprehensive Measure of Wetland Condition

    EPA Science Inventory

    The importance of monitoring and assessment to the management and protection of wetlands has been recognized, and research in recent years has made progress in the development of wetland monitoring and assessment tools. Wetland assessments are made at multiple levels of intensit...

  20. The Utility of a High-intensity Exercise Protocol to Prospectively Assess ACL Injury Risk.

    PubMed

    Bossuyt, F M; García-Pinillos, F; Raja Azidin, R M F; Vanrenterghem, J; Robinson, M A

    2016-02-01

    This study investigated the utility of a 5-min high-intensity exercise protocol (SAFT(5)) for inclusion in prospective cohort studies investigating ACL injury risk. 15 active females were tested on 2 occasions during which their non-dominant leg was analysed before SAFT(5) (PRE), immediately after (POST0), 15 min after (POST15), and 30 min after (POST30). On the first occasion, testing included 5 maximum isokinetic contractions for the eccentric and concentric hamstrings and concentric quadriceps; on the second occasion, 3 trials of 2 landing tasks (i.e., single-leg hop and drop vertical jump) were conducted. Results showed a reduced eccentric hamstring peak torque at POST0, POST15 and POST30 (p<0.05) and a reduced functional HQ ratio (Hecc/Qcon) at POST15 and POST30 (p<0.05). Additionally, a more extended knee angle at POST30 (p<0.05) and an increased knee internal rotation angle at POST0 and POST15 (p<0.05) were found in the single-leg hop. SAFT(5) altered landing strategies in ways associated with increased ACL injury risk and similar to observations from match simulations. Our findings therefore support the utility of a high-intensity exercise protocol such as SAFT(5) to strengthen injury screening tests and for inclusion in prospective cohort studies where time constraints apply. PMID:26509378
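
The functional HQ ratio reported here is conventionally computed as eccentric hamstring peak torque divided by concentric quadriceps peak torque. A minimal sketch with hypothetical torque values, not the study's measurements:

```python
# Sketch: functional hamstring-to-quadriceps ratio (Hecc/Qcon).
# Torques are in N*m and purely illustrative.

def functional_hq_ratio(hamstring_ecc_peak, quadriceps_con_peak):
    """Eccentric hamstring peak torque over concentric quadriceps peak torque."""
    return hamstring_ecc_peak / quadriceps_con_peak

pre_fatigue  = functional_hq_ratio(120.0, 150.0)   # hypothetical PRE values
post_fatigue = functional_hq_ratio(96.0, 150.0)    # reduced Hecc after exercise
print(f"Hecc/Qcon: {pre_fatigue:.2f} -> {post_fatigue:.2f}")
```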

  1. [Assessment of validity of the archived standard curve in endotoxin assay, produced in other facilities].

    PubMed

    Waki, Atsuo; Mori, Tetsuya; Nishijima, Ken-ichi; Honjyo, Kazuyoshi; Kayano, Yuichiro; Yano, Ryoichi; Shiraishi, Hiromi; Takaoka, Aya; Kiyono, Yasushi; Fujibayashi, Yasuhisa

    2014-11-01

    We have previously reported, from the viewpoint of accuracy and precision, on the possible use of an archived standard curve for the endotoxin assay when the curve is prepared in the same facility. In this study, the use of archived standard curves prepared in different facilities was investigated with the same data set as in the previous paper. The evaluation was performed with the recovery rate of the concentrations of the standard solutions, using the same method as in the previous study. The clotting times of the standard solutions were substituted into standard curves prepared in facilities different from those in which the standard solutions were prepared. The recovery rates were 86.1-125.0%, almost the same range as when the facility preparing the standard solutions was the same as that preparing the standard curve. These data indicate that if the protocols for preparing the standard solutions (such as mixing and the interval before loading into the apparatus) are kept the same between the endotoxin test and the preparation of the archived standard curves, endotoxin concentrations calculated with archived standard curves prepared in other facilities do not vary much from the true values or from the values obtained using archived standard curves prepared in the same facility.
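
Kinetic endotoxin assays of this kind typically fit a standard curve that is linear in log(clotting time) versus log(concentration), and the recovery rate compares the back-calculated concentration to the nominal one. A sketch under that assumption; the curve coefficients, clotting time, and nominal concentration are all hypothetical, not values from the paper:

```python
import math

# Sketch: apply an archived standard curve, assumed linear in
# log10(clotting time) vs log10(endotoxin concentration), then
# compute the recovery rate against the nominal concentration.

def concentration_from_curve(clot_time_s, slope, intercept):
    """Invert log10(t) = slope * log10(C) + intercept for C (EU/mL)."""
    return 10 ** ((math.log10(clot_time_s) - intercept) / slope)

def recovery_rate(measured, nominal):
    """Measured concentration as a percentage of the nominal one."""
    return 100.0 * measured / nominal

slope, intercept = -0.25, 2.75          # hypothetical archived-curve coefficients
c = concentration_from_curve(clot_time_s=1000.0, slope=slope, intercept=intercept)
print(f"measured {c:.3f} EU/mL, recovery {recovery_rate(c, nominal=0.1):.1f}%")
```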

  2. Environmental assessment for the Consumer Products Efficiency Standards program

    SciTech Connect

    Not Available

    1980-05-23

    The Energy Policy and Conservation Act of 1975, as amended by the National Energy Conservation Policy Act of 1978, requires the DOE to prescribe energy efficiency standards for thirteen consumer products. The Consumer Products Efficiency Standards (CPES) program covers the following products: refrigerators and refrigerator-freezers; freezers; clothes dryers; water heaters; room air conditioners; home heating equipment (not including furnaces); kitchen ranges and ovens; central air conditioners (cooling and heat pumps); furnaces; dishwashers; television sets; clothes washers; and humidifiers and dehumidifiers. DOE is proposing two sets of standards for all thirteen consumer products: intermediate standards to become effective in 1981 for the first nine products and in 1982 for the remaining four, and final standards to become effective in 1986 and 1987, respectively. The final standards are more restrictive than the intermediate standards and will provide manufacturers with the maximum time permitted under the Act to plan and develop extensive new lines of efficient consumer products. The final standards proposed by DOE require the maximum improvements in efficiency which are technologically feasible and economically justified, as required by Section 325(c) of EPCA. The thirteen consumer products account for approximately 90% of all the energy consumed in the nation's residences, or more than 20% of the nation's energy needs. Increases in the energy efficiency of these consumer products can help to narrow the gap between the nation's increasing demand for energy and decreasing supplies of domestic oil and natural gas. Improvements in the efficiency of consumer products can thus help to solve the nation's energy crisis.

  3. Evaluation of a low-dose CT protocol with oral contrast for assessment of acute appendicitis.

    PubMed

    Platon, Alexandra; Jlassi, Helmi; Rutschmann, Olivier T; Becker, Christoph D; Verdun, Francis R; Gervaz, Pascal; Poletti, Pierre-Alexandre

    2009-02-01

    The aim of this study was to evaluate a low-dose CT with oral contrast medium (LDCT) for the diagnosis of acute appendicitis and compare its performance with standard-dose i.v. contrast-enhanced CT (standard CT) according to patients' BMIs. Eighty-six consecutive patients admitted with suspicion of acute appendicitis underwent LDCT (30 mAs), followed by standard CT (180 mAs). Both examinations were reviewed by two experienced radiologists for direct and indirect signs of appendicitis. Clinical and surgical follow-up was considered the reference standard. Appendicitis was confirmed by surgery in 37 (43%) of the 86 patients. Twenty-nine (34%) patients eventually had an alternative discharge diagnosis to explain their abdominal pain. Clinical and biological follow-up was uneventful in 20 (23%) patients. LDCT and standard CT had the same sensitivity (100%, 33/33) and specificity (98%, 45/46) for diagnosing appendicitis in patients with a body mass index (BMI) ≥ 18.5. In slim patients (BMI < 18.5), sensitivity was 50% (2/4) for LDCT and 100% (4/4) for standard CT, while specificity was identical for both techniques (67%, 2/3). LDCT may play a role in the diagnostic workup of patients with a BMI ≥ 18.5.
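
The reported performance figures follow from the standard 2x2 definitions. A minimal sketch using the counts given in the abstract for patients with BMI at or above 18.5 (33 confirmed cases, 46 without appendicitis):

```python
# Sketch: sensitivity and specificity from 2x2 contingency counts.

def sensitivity(tp, fn):
    """True positive rate: detected cases over all actual cases."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True negative rate: correct rule-outs over all non-cases."""
    return tn / (tn + fp)

# LDCT in non-slim patients: 33/33 cases detected, 45/46 correctly ruled out
print(f"sensitivity = {sensitivity(tp=33, fn=0):.0%}")
print(f"specificity = {specificity(tn=45, fp=1):.1%}")
```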

  4. Energy efficiency and pollution prevention assessment protocol in the polymer processing industries. Final report

    SciTech Connect

    Nardone, John; Sansone, Leonard; Kenney, William; Christodoulatos, Christos; Koutsospyros, Agamemnon

    1998-03-31

    This report was developed from experiences with three New Jersey firms and is intended as a guide for conducting analyses of resource (energy and raw materials) utilization and pollution (solid waste, air, and water emissions) prevention in plastics processing plants. The protocol is written on the assumption that the analysis is to be done by an outside agency such as a consulting firm, but it can also be used for internal audits by plant teams. Key concepts in this analysis were adapted from life cycle analysis. Because of the small sample of companies studied, the results have to be considered highly preliminary, but some of the conclusions will probably be confirmed by further work.

  5. Portfolios for Prior Learning Assessment: Caught between Diversity and Standardization

    ERIC Educational Resources Information Center

    Sweygers, Annelies; Soetewey, Kim; Meeus, Wil; Struyf, Elke; Pieters, Bert

    2009-01-01

    In recent years, procedures have been established in Flanders for "Prior Learning Assessment" (PLA) outside the formal learning circuit, of which the portfolio is a regular component. In order to maximize the possibilities of acknowledgement of prior learning assessment, the Flemish government is looking for a set of common criteria and principles…

  6. Location Knowledge: Assessment, Spatial Thinking, and New National Geography Standards

    ERIC Educational Resources Information Center

    Dunn, James M.

    2011-01-01

    Location knowledge is typically assessed using outline maps. A new set of questions reflect spatial thinking research and helps to assess student location knowledge. A small group (145) of first-year college students helped to refine the items. Question styles include: open-response, multiple-choice, listing, labeling, and sketching. Topics…

  7. Screening and Assessing Adolescents for Substance Use Disorders. Treatment Improvement Protocol (TIP) Series 31.

    ERIC Educational Resources Information Center

    Substance Abuse and Mental Health Services Administration (DHHS/PHS), Rockville, MD. Center for Substance Abuse Treatment.

    This TIP is designed to teach juvenile justice, health services, education, and substance abuse treatment personnel about how to identify, screen, and assess people 11-to-21 years old who may be experiencing substance-related problems. It details warning signs of substance use disorders, when to screen, when to assess, what domains besides…

  8. Assessment of technologies to meet a low carbon fuel standard.

    PubMed

    Yeh, Sonia; Lutsey, Nicholas P; Parker, Nathan C

    2009-09-15

    California's low carbon fuel standard (LCFS) was designed to incentivize a diverse array of available strategies for reducing transportation greenhouse gas (GHG) emissions. It provides strong incentives for fuels with lower GHG emissions, while explicitly requiring a 10% reduction in California's transportation fuel GHG intensity by 2020. This paper investigates the potential for cost-effective GHG reductions from electrification and expanded use of biofuels. The analysis indicates that fuel providers could meet the standard using a portfolio approach that employs both biofuels and electricity, which would reduce the risks and uncertainties associated with the progress of cellulosic and battery technologies, feedstock prices, land availability, and the sustainability of the various compliance approaches. Our analysis is based on the details of California's development of an LCFS; however, this research approach could be generalizable to a national U.S. standard and to similar programs in Europe and Canada. PMID:19806719
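
An LCFS-style compliance check reduces to comparing the energy-weighted average carbon intensity (CI) of a fuel portfolio against a target 10% below the baseline. A sketch with illustrative CI values and energy shares, not the paper's scenario data:

```python
# Sketch: energy-weighted average carbon intensity of a fuel portfolio
# versus an LCFS-style target. All numbers below are illustrative.

def portfolio_ci(fuels):
    """fuels: list of (energy share, CI in gCO2e/MJ); shares sum to 1."""
    return sum(share * ci for share, ci in fuels)

baseline = 95.0                      # hypothetical gasoline-like baseline CI
target = 0.90 * baseline             # 10% reduction target
portfolio = [(0.80, 95.0),           # petroleum fuel
             (0.15, 40.0),           # cellulosic biofuel
             (0.05, 30.0)]           # electricity (efficiency-adjusted)
ci = portfolio_ci(portfolio)
print(f"portfolio CI = {ci:.1f} gCO2e/MJ, target = {target:.1f}, "
      f"compliant = {ci <= target}")
```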

  9. [Norms and standards for radiofrequency electromagnetic fields in Latin America: guidelines for exposure limits and measurement protocols].

    PubMed

    Skvarca, Jorge; Aguirre, Aníbal

    2006-01-01

    New technologies that use electromagnetic fields (EMF) have proved greatly beneficial to humankind. EMF are used in a variety of ways in the transmission of electrical energy and in telecommunications, industry, and medicine. However, some studies have shown that EMF could be detrimental to one's health, having found an association between exposure to EMF on the one hand, and the incidence of some types of cancer as well as behavioral changes on the other. Although so far there is no concrete proof that exposure to low-intensity EMF is hazardous, researchers continue to study the issue in an attempt to reach a consensus opinion and to establish safety standards. While developing and establishing such norms and standards have traditionally been the responsibility of international specialized agencies, national health authorities should take an active part in this process. Currently the Pan American Health Organization is promoting scientific research, often in the form of epidemiologic studies, in order to propose uniform norms and standards. Some Latin American countries, including Argentina, Brazil, Chile, Colombia, Costa Rica, Ecuador, Mexico, Peru, and Venezuela, have already enacted incomplete or partial legislation based on recommended international standards. This article describes the norms established in Latin America and the particular approach taken by each country.

  10. Assessment of Adverse Events in Protocols, Clinical Study Reports, and Published Papers of Trials of Orlistat: A Document Analysis

    PubMed Central

    Schroll, Jeppe Bennekou; Penninga, Elisabeth I.; Gøtzsche, Peter C.

    2016-01-01

    Background Little is known about how adverse events are summarised and reported in trials, as detailed information is usually considered confidential. We have acquired clinical study reports (CSRs) from the European Medicines Agency through the Freedom of Information Act. The CSRs describe the results of studies conducted as part of the application for marketing authorisation for the slimming pill orlistat. The purpose of this study was to study how adverse events were summarised and reported in study protocols, CSRs, and published papers of orlistat trials. Methods and Findings We received the CSRs from seven randomised placebo controlled orlistat trials (4,225 participants) submitted by Roche. The CSRs consisted of 8,716 pages and included protocols. Two researchers independently extracted data on adverse events from protocols and CSRs. Corresponding published papers were identified on PubMed and adverse event data were extracted from this source as well. All three sources were compared. Individual adverse events from one trial were summed and compared to the totals in the summary report. None of the protocols or CSRs contained instructions for investigators on how to question participants about adverse events. In CSRs, gastrointestinal adverse events were only coded if the participant reported that they were “bothersome,” a condition that was not specified in the protocol for two of the trials. Serious adverse events were assessed for relationship to the drug by the sponsor, and all adverse events were coded by the sponsor using a glossary that could be updated by the sponsor. The criteria for withdrawal due to adverse events were in one case related to efficacy (high fasting glucose led to withdrawal), which meant that one trial had more withdrawals due to adverse events in the placebo group. Finally, only between 3% and 33% of the total number of investigator-reported adverse events from the trials were reported in the publications because of post hoc

  11. Performance Comparison of Wireless Sensor Network Standard Protocols in an Aerospace Environment: ISA100.11a and ZigBee

    NASA Technical Reports Server (NTRS)

    Wagner, Raymond S.; Barton, Richard J.

    2011-01-01

    Wireless Sensor Networks (WSNs) can provide a substantial benefit in spacecraft systems, reducing launch weight and providing unprecedented flexibility by allowing instrumentation capabilities to grow and change over time. Achieving data transport reliability on par with that of wired systems, however, can prove extremely challenging in practice. Fortunately, much progress has been made in developing standard WSN radio protocols for applications from non-critical home automation to mission-critical industrial process control. The relative performances of candidate protocols must be compared in representative aerospace environments, however, to determine their suitability for spaceflight applications. In this paper, we will present the results of a rigorous laboratory analysis of the performance of two standards-based, low power, low data rate WSN protocols: ZigBee Pro and ISA100.11a. Both are based on IEEE 802.15.4 and augment that standard's specifications to build complete, multi-hop networking stacks. ZigBee Pro targets primarily the home and office automation markets, providing an ad-hoc protocol that is computationally lightweight and easy to implement in inexpensive system-on-a-chip components. As a result of this simplicity, however, ZigBee Pro can be susceptible to radio frequency (RF) interference. ISA100.11a, on the other hand, targets the industrial process control market, providing a robust, centrally-managed protocol capable of tolerating a significant amount of RF interference. To achieve these gains, a coordinated channel hopping mechanism is employed, which entails greater computational complexity than ZigBee and requires more sophisticated and costly hardware. To guide future aerospace deployments, we must understand how well these standards perform relative to one another in analog environments under expected operating conditions. Specifically, we are interested in evaluating goodput -- application-level throughput -- in a representative crewed environment.
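
Goodput counts only the application payload bytes delivered intact per unit of wall-clock time, excluding headers, duplicates, and retransmissions. A minimal sketch with hypothetical frame counts and sizes:

```python
# Sketch: application-level throughput ("goodput") over a test window.
# Frame counts and payload size are illustrative, not measured values.

def goodput_bps(delivered_payload_bytes, elapsed_s):
    """Delivered application payload converted to bits per second."""
    return 8.0 * delivered_payload_bytes / elapsed_s

# e.g., 900 of 1000 sensor reports delivered, 80-byte payload each,
# over a 60 s window (protocol overhead excluded by definition)
gp = goodput_bps(delivered_payload_bytes=900 * 80, elapsed_s=60.0)
print(f"goodput = {gp:.0f} bit/s")   # far below the 250 kbit/s 802.15.4 PHY rate
```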

  12. Ground-Water Data-Collection Protocols and Procedures for the National Water-Quality Assessment Program: Selection, Installation, and Documentation of Wells, and Collection of Related Data

    USGS Publications Warehouse

    Lapham, Wayne W.; Wilde, Franceska D.; Koterba, Michael T.

    1995-01-01

    Protocols for well installation and documentation are included in a 1989 report written for the National Water-Quality Assessment (NAWQA) Pilot Program of the U.S. Geological Survey (USGS). These protocols were reviewed and revised to address the needs of the full-scale implementation of the NAWQA Program that began in 1991. This report, which is a collaborative effort between the National Water-Quality Assessment Program and the Office of Water Quality, is the result of that review and revision. This report describes protocols and recommended procedures for the collection of data from wells for the NAWQA Program. Protocols and procedures discussed are well selection, installation of monitoring wells, documentation, and the collection of water level and additional hydrogeologic and geologic data.

  13. Technical note: enumeration of mesophilic aerobes in milk: evaluation of standard official protocols and Petrifilm aerobic count plates.

    PubMed

    Freitas, R; Nero, L A; Carvalho, A F

    2009-07-01

    Enumeration of mesophilic aerobes (MA) is the main quality and hygiene parameter for raw and pasteurized milk. High levels of these microorganisms indicate poor conditions in the production, storage, and processing of milk, and also the presence of pathogens. Fifteen raw and 15 pasteurized milk samples were submitted for MA enumeration by a conventional plating method (using plate count agar) and Petrifilm Aerobic Count plates (3M, St. Paul, MN), followed by incubation according to 3 official protocols: IDF/ISO (incubation at 30°C for 72 h), American Public Health Association (32°C for 48 h), and Brazilian Ministry of Agriculture (36°C for 48 h). The results were compared by linear regression and ANOVA. Considering the results from the conventional methodology, good correlation indices and an absence of significant differences between mean counts were observed, independent of the type of milk sample (raw or pasteurized) and incubation conditions (IDF/ISO, American Public Health Association, or Ministry of Agriculture). Considering the results from Petrifilm Aerobic Count plates, good correlation indices and an absence of significant differences were only observed for raw milk samples. The microbiota of pasteurized milk interfered negatively with the performance of Petrifilm Aerobic Count plates, probably because of the presence of microorganisms that poorly reduce the dye indicator of this system.
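
The comparison described (linear regression between methods) can be sketched as a least-squares fit between paired log10 counts. The milk counts below are hypothetical, not the study's data:

```python
import math

# Sketch: method comparison by least-squares regression between log10
# CFU/mL counts from the conventional method (x) and Petrifilm (y).

def linreg(x, y):
    """Return (slope, intercept, r) for the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    slope = sxy / sxx
    return slope, my - slope * mx, sxy / math.sqrt(sxx * syy)

plate_counts     = [3.1, 4.0, 4.8, 5.5, 6.2]   # hypothetical log10 CFU/mL
petrifilm_counts = [3.0, 4.1, 4.7, 5.6, 6.1]   # hypothetical log10 CFU/mL
slope, intercept, r = linreg(plate_counts, petrifilm_counts)
print(f"slope={slope:.2f}, intercept={intercept:.2f}, r={r:.3f}")
```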

  14. Changing the Practice of Teacher Education. Standards and Assessment as a Lever for Change.

    ERIC Educational Resources Information Center

    Diez, Mary E., Ed.

    This volume presents a collection of papers on teacher education reform, discussing the impact of standards and assessment on teacher education. Targeting policymakers, researchers, and teacher educators, the volume describes seven teacher education institutions that have used standards and assessment to guide their reform. Part 1, "The Role of…

  15. Alignment of Standards, Assessment and Instruction: Implications for English Language Learners in Ohio

    ERIC Educational Resources Information Center

    Mohamud, Abdinur; Fleck, Dan

    2010-01-01

    The purpose of this article is to describe the process and development of English Language Proficiency (ELP) standards and assessment in Ohio and to discuss issues related to alignment. The article addresses the importance of alignment among standards, instruction, and assessment, as well as the effect of alignment on students' academic…

  16. Virginia Standards of Learning Assessments. Grade 5 Released Test Items, 1998.

    ERIC Educational Resources Information Center

    Virginia State Dept. of Education, Richmond. Div. of Assessment and Reporting.

    Beginning in Spring 1998, Virginia students participated in the Standards of Learning (SOL) assessments designed to test student knowledge of the content and skills specified in the state's standards. This document contains questions that approximately 80,000 students in grade 5 were required to answer as part of the SOL assessments. These…

  17. Virginia Standards of Learning Assessments. Grade 3 Released Test Items, 1998.

    ERIC Educational Resources Information Center

    Virginia State Dept. of Education, Richmond. Div. of Assessment and Reporting.

    Beginning in Spring 1998, Virginia students participated in the Standards of Learning (SOL) Assessments designed to test student knowledge of the content and skills specified in the state's standards. This document contains questions that approximately 83,000 students in grade 3 were required to answer as part of the SOL assessments. These…

  18. Virginia Standards of Learning Assessments. Grade 8 Released Test Items, 1998.

    ERIC Educational Resources Information Center

    Virginia State Dept. of Education, Richmond. Div. of Assessment and Reporting.

    Beginning in Spring 1998, Virginia students participated in the Standards of Learning (SOL) assessments designed to test student knowledge of the content and skills specified in the state's standards. This document contains questions that approximately 79,000 students in grade 8 were required to answer as part of the SOL assessments. These…

  19. Virginia Standards of Learning Assessments. End of Course Released Test Items, 1998.

    ERIC Educational Resources Information Center

    Virginia State Dept. of Education, Richmond. Div. of Assessment and Reporting.

    Beginning in Spring 1998, Virginia students participated in the Standards of Learning (SOL) assessments designed to test student knowledge of the content and skills specified in the state's standards. This document contains questions that students were required to answer as part of the SOL End-of-Course assessments. These questions are…

  20. Assessment of Codes and Standards Applicable to a Hydrogen Production Plant Coupled to a Nuclear Reactor

    SciTech Connect

    M. J. Russell

    2006-06-01

    This is an assessment of codes and standards applicable to a hydrogen production plant to be coupled to a nuclear reactor. The result of the assessment is a list of codes and standards that are expected to be applicable to the plant during its design and construction.

  1. Alignment of World Language Standards and Assessments: A Multiple Case Study

    ERIC Educational Resources Information Center

    Kaplan, Carolyn Shemwell

    2016-01-01

    Previous research has examined world language classroom-based assessment practices as well as the impact of the Standards for Foreign Language Learning in the 21st Century (National Standards in Foreign Language Education Project, 1999) on practice. However, the extent to which K-12 teachers' assessment practices reflect national and state…

  2. Towards Improving Public Understanding of Judgement Practice in Standards-Referenced Assessment: An Australian Perspective

    ERIC Educational Resources Information Center

    Klenowski, Val

    2013-01-01

    Curriculum and standards-referenced assessment reform in accountability contexts are increasingly dominated by the use of testing, evidence, comparative analyses of achievement data and policy as numbers all of which have given rise to a set of related developments. Internationally these developments towards the use of standards for assessment and…

  3. Customized versus Standardized Exams for Learning Outcomes Assessment in an Undergraduate Business Program

    ERIC Educational Resources Information Center

    Phelps, Amy L.; Spangler, William E.

    2013-01-01

    A standardized exam for program-level assessment can take the form of 1) a customized exam developed in-house by faculty and linked explicitly to program-level learning goals; or 2) a standardized exam developed externally by assessment experts and linked to a set of somewhat broader and more generalizable learning goals. This article discusses…

  4. The Role of Assessment in Meeting the NASPE Physical Education Content Standards.

    ERIC Educational Resources Information Center

    DeJong, Glenna; Kokinakis, C. Lee; Kuntzleman, Charles

    2002-01-01

    Describes the essential link among content standards, curriculum, instruction, professional development, and assessment in elementary physical education, discussing the National Association for Sport and Physical Education standards for physical education and emphasizing the importance of assessment in monitoring student progress, improving…

  5. Comparing Yes/No Angoff and Bookmark Standard Setting Methods in the Context of English Assessment

    ERIC Educational Resources Information Center

    Hsieh, Mingchuan

    2013-01-01

    The Yes/No Angoff and Bookmark method for setting standards on educational assessment are currently two of the most popular standard-setting methods. However, there is no research into the comparability of these two methods in the context of language assessment. This study compared results from the Yes/No Angoff and Bookmark methods as applied to…

  6. Application of State Biological Assessment Protocols to the Lower Missouri River: are any Lotic Systems Non-Wadeable?

    NASA Astrophysics Data System (ADS)

    Poulton, B. C.

    2005-05-01

    Benthic ecologists have suggested that the biological condition of large rivers could be successfully evaluated with modifications of small stream bioassessments. The lower Missouri system contains a unique fauna with rare, large-river macroinvertebrate species that are difficult to sample and more poorly known ecologically. As part of an evaluation of 27 indicator metrics and their ability to detect cumulative stressors, an 812 km section of the lower Missouri was sampled with three methods in two key habitats. Among these, rock revetments were sampled with a D-frame kick net using the state of Missouri's biological assessment protocol developed for coarse substrate in riffles of wadeable streams. A total of 62 species of macroinvertebrates were collected, as compared to 72 species from artificial substrates deployed at the same locations. Ranges in values for the four core metrics included in the Missouri protocol (total taxa richness 24-42, EPT taxa richness 9-18, Missouri Biotic Index 5.89-6.33, Shannon-Wiener Index 1.60-2.77) were more comparable to data from perennial 4th and 5th order streams than the values from artificial substrate data. However, proposed biocriteria for wadeable streams in Missouri could not identify impairment levels among sites, and no longitudinal response gradient was apparent until an additional 8 metrics were included in the site scores. Utilization of semi-quantitative kick net protocols in large rivers can yield data comparable to wadeable streams if sampling is performed under specific conditions related to hydrology and insect colonization dynamics.
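Two of the four core metrics named in the abstract are straightforward to compute from per-taxon counts. A short sketch, using invented counts rather than the study's data:

```python
import math

# Illustrative sketch (hypothetical counts, not the study's data): two of the
# Missouri protocol's core metrics computed from macroinvertebrate counts.

def taxa_richness(counts):
    """Number of taxa with at least one individual collected."""
    return sum(1 for c in counts if c > 0)

def shannon_wiener(counts):
    """Shannon-Wiener diversity H' = -sum(p_i * ln p_i) over non-empty taxa."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

counts = [120, 45, 30, 18, 9, 6, 3]  # individuals per taxon (hypothetical)
print(taxa_richness(counts))          # 7
print(round(shannon_wiener(counts), 2))
```

The other two core metrics (EPT taxa richness and the Missouri Biotic Index) additionally require taxon identities and tolerance values, so they are omitted here.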

  7. Liver safety assessment: required data elements and best practices for data collection and standardization in clinical trials.

    PubMed

    Avigan, Mark I; Bjornsson, Einar S; Pasanen, Markku; Cooper, Charles; Andrade, Raul J; Watkins, Paul B; Lewis, James H; Merz, Michael

    2014-11-01

    A workshop was convened to discuss best practices for the assessment of drug-induced liver injury (DILI) in clinical trials. In a breakout session, workshop attendees discussed necessary data elements and standards for the accurate measurement of DILI risk associated with new therapeutic agents in clinical trials. There was agreement that in order to achieve this goal the systematic acquisition of protocol-specified clinical measures and lab specimens from all study subjects is crucial. In addition, standard DILI terms that address the diverse clinical and pathologic signatures of DILI were considered essential. There was a strong consensus that clinical and lab analyses necessary for the evaluation of cases of acute liver injury should be consistent with the US Food and Drug Administration (FDA) guidance on pre-marketing risk assessment of DILI in clinical trials issued in 2009. A recommendation that liver injury case review and management be guided by clinicians with hepatologic expertise was made. Of note, there was agreement that emerging DILI signals should prompt the systematic collection of candidate pharmacogenomic, proteomic and/or metabonomic biomarkers from all study subjects. The use of emerging standardized clinical terminology, CRFs and graphic tools for data review to enable harmonization across clinical trials was strongly encouraged. Many of the recommendations made in the breakout session are in alignment with those made in the other parallel sessions on methodology to assess clinical liver safety data, causality assessment for suspected DILI, and liver safety assessment in special populations (hepatitis B, C, and oncology trials). Nonetheless, a few outstanding issues remain for future consideration.

  9. Controversies of Standardized Assessment in School Accountability Reform: A Critical Synthesis of Multidisciplinary Research Evidence

    ERIC Educational Resources Information Center

    Wang, Lihshing; Beckett, Gulbahar H.; Brown, Lionel

    2006-01-01

    Standardized assessment in school systems has been the center of debate for decades. Although the voices of opponents of standardized tests have dominated the public forum, only a handful of scholars and practitioners have argued in defense of standardized tests. This article provides a critical synthesis of the controversial issues on…

  10. Personal Exposure Monitoring Wearing Protocol Compliance: An Initial Assessment of Quantitative Measurements

    EPA Science Inventory

    Personal exposure sampling provides the most accurate and representative assessment of exposure to a pollutant, but only if measures are implemented to minimize exposure misclassification and reduce confounders that may cause misinterpretation of the collected data. Poor complian...

  11. SU-C-17A-02: Sirius MRI Markers for Prostate Post-Implant Assessment: MR Protocol Development

    SciTech Connect

    Lim, T; Wang, J; Kudchadker, R; Stafford, R; Bathala, T; Pugh, T; Ibbott, G; Frank, S

    2014-06-15

    Purpose: Currently, CT is used to visualize prostate brachytherapy sources, at the expense of accurate structure contouring. MRI is superior to CT for anatomical delineation, but the sources appear as voids on MRI images. Previously we developed Sirius MRI markers (C4 Imaging) to replace spacers and assist source localization on MRI images. Here we develop an MRI pulse sequence protocol that enhances the signal of these markers to enable MRI-only post-implant prostate dosimetric analysis. Methods: To simulate a clinical scenario, a CIRS multi-modality prostate phantom was implanted with 66 markers and 86 sources. The implanted phantom was imaged on both 1.5T and 3.0T GE scanners under various conditions: different pulse sequences (2D fast spin echo [FSE], 3D balanced steady-state free precession [bSSFP], and 3D fast spoiled gradient echo [FSPGR]), as well as varying amounts of padding to simulate various patient sizes and the associated signal fall-off from the surface coil elements. Standard FSE sequences from the current clinical protocols were also evaluated. Marker visibility, marker size, intra-marker distance, total scan time, and artifacts were evaluated for various combinations of echo time, repetition time, flip angle, number of excitations, bandwidth, slice thickness and spacing, field-of-view, frequency/phase encoding steps, and frequency direction. Results: We have developed a 3D FSPGR pulse sequence that enhances marker signal and ensures the integrity of the marker shape while maintaining reasonable scan time. For patients contraindicated for 3.0T, we have also developed a similar sequence for 1.5T scanners. Signal fall-off with distance from prostate to coil can be compensated mainly by decreasing bandwidth. The markers are not visible using standard FSE sequences. FSPGR sequences are more robust for consistent marker visualization as compared to bSSFP sequences. 
Conclusion: The developed MRI pulse sequence protocol for Sirius MRI markers assists source

  12. Outcomes Following Low-Energy Civilian Gunshot Wound Trauma to the Lower Extremities: Results of a Standard Protocol at an Urban Trauma Center

    PubMed Central

    Abghari, Michelle; Monroy, Alexa; Schubl, Sebastian; Davidovitch, Roy; Egol, Kenneth

    2015-01-01

    Background Lower extremity injuries secondary to low-energy gunshot wounds are frequently seen in the civilian populations of urban areas. Although these wounds have fewer complications than high-energy gunshot injuries, the functional and psychological damage is still significant, making appropriate, timely orthopaedic treatment and follow-up imperative. Purpose The purpose of this study is to present our outcomes in the treatment of low-energy gunshot wounds in a civilian population at an urban, level one trauma center in patients treated by a standard protocol. Methods One hundred and thirty-three patients who sustained 148 gunshot wound injuries were treated at our level one trauma center between January 1st, 2009 and October 1st, 2011. Following IRB approval, we extracted information from medical records regarding hospital course, length of stay, and type of operative or non-operative treatment. If available, injury and post-operative radiographs were also reviewed. Patients were contacted by telephone to obtain Short Musculoskeletal Function Assessment (SMFA) surveys and pain on a scale of 0–10, and for the determination of any adverse events related to their shooting. Results There were 125 men (94.0%) and 8 women (6.0%) with an average age of 27.1 years (range 15.2–56.3). Seventy-six patients (57.1%) did not have any health insurance upon admission. The average length of stay in the hospital was 4.5 days (range 0.0–88.0). Fifty-one gunshots (34.5%) resulted in fractures of the lower extremities. Patients underwent a total of 95 lower extremity-related procedures during their hospitalization. Twenty-two patients (16.5%) experienced a complication related to their gunshot wounds. 38% of the cohort was available for long-term functional assessment. At a mean 23.5 months (range 8–48) of follow up, patients reported mean Functional and Bothersome SMFA scores of 19.6 (SD 15.9) and 10.9 (SD 15.6) suggesting that these patients have poorer function scores than the

  13. A protocol for assessing the effectiveness of oil spill dispersants in stimulating the biodegradation of oil.

    PubMed

    Prince, Roger C; Butler, Josh D

    2014-01-01

    Dispersants are important tools in oil spill response. Taking advantage of the energy in even small waves, they disperse floating oil slicks into tiny droplets (<70 μm) that entrain in the water column and drift apart so that they do not re-agglomerate to re-form a floating slick. The dramatically increased surface area allows microbial access to much more of the oil, and diffusion and dilution lead to oil concentrations where natural background levels of biologically available oxygen, nitrogen, and phosphorus are sufficient for microbial growth and oil consumption. Dispersants are only used on substantial spills in relatively deep water (usually >10 m), conditions that are impossible to replicate in the laboratory. To date, laboratory experiments aimed at following the biodegradation of dispersed oil usually show only minimal stimulation of the rate of biodegradation, but principally because the oil in these experiments disperses fairly effectively without dispersant. What is needed is a test protocol that allows comparison between an untreated slick that remains on the water surface during the entire biodegradation study and dispersant-treated oil that remains in the water column as small dispersed oil droplets. We show here that when this is accomplished, the rate of biodegradation is dramatically stimulated by an effective dispersant, Corexit 9500. Further development of this approach might result in a useful tool for comparing the full benefits of different dispersants. PMID:23943003

  14. Multi-level assessment protocol (MAP) for adoption in multi-site clinical trials

    PubMed Central

    Guydish, J.; Manser, S.T.; Jessup, M.; Tajima, B.; Sears, C.; Montini, T.

    2010-01-01

    The National Institute on Drug Abuse (NIDA) Clinical Trials Network (CTN) is intended to test promising drug abuse treatment models in multi-site clinical trials, and to support adoption of new interventions into clinical practice. Using qualitative research methods we asked: How might the technology of multi-site clinical trials be modified to better support adoption of tested interventions? A total of 42 participants, representing 8 organizational levels ranging from clinic staff to clinical trial leaders, were interviewed about their role in the clinical trial, its interactions with clinics, and intervention adoption. Among eight clinics participating in the clinical trial, we found adoption of the tested intervention in one clinic only. In analysis of interview data we identified four conceptual themes which are likely to affect adoption and may be informative in future multi-site clinical trials. We offer the conclusion that planning for adoption in the early stages of protocol development will better serve the aim of integrating new interventions into practice. PMID:20890376

  15. Assessing the Dynamics of Bittorrent Swarms Topologies Using the Peer Exchange Protocol

    NASA Astrophysics Data System (ADS)

    Fauzie, Mohamad Dikshie; Thamrin, Achmad Husni; van Meter, Rodney; Murai, Jun

    Bittorrent is one of the most popular and successful applications in the current Internet. However, we still have little knowledge about the topology of real Bittorrent swarms, how dynamic the topology is, and how it affects overall behavior. This paper describes an experimental study of the overlay topologies of real-world Bittorrent networks, focusing on the activity of the nodes of its P2P topology and especially their dynamic relationships. Peer Exchange Protocol (PEX) messages are analyzed to infer topologies and their properties, capturing the variations of their behavior. Our measurements, verified using the Kolmogorov-Smirnov goodness of fit test and the likelihood ratio test and confirmed via simulation, show that a power-law with exponential cutoff is a more plausible model than a pure power-law distribution. We also found that the average clustering coefficient is very low, supporting this observation. Bittorrent swarms are far more dynamic than has been recognized previously, potentially impacting attempts to optimize the performance of the system as well as the accuracy of simulations and analyses.
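The abstract's observation that the average clustering coefficient is "very low" can be made concrete with a small sketch. The graph below is a hypothetical peer overlay, not one inferred from PEX traces:

```python
# Sketch of the average clustering coefficient computed over an undirected
# overlay graph given as adjacency sets. The peer graph here is hypothetical,
# invented for illustration; real swarm topologies come from PEX inference.

def clustering(adj, v):
    """Fraction of a node's neighbour pairs that are themselves connected."""
    nbrs = adj[v]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for u in nbrs for w in nbrs if u < w and w in adj[u])
    return 2.0 * links / (k * (k - 1))

def average_clustering(adj):
    """Mean of the per-node clustering coefficients."""
    return sum(clustering(adj, v) for v in adj) / len(adj)

# node -> set of neighbours (undirected, so each edge appears in both sets)
adj = {
    0: {1, 2},
    1: {0, 2, 3},
    2: {0, 1},
    3: {1, 4},
    4: {3},
}
print(round(average_clustering(adj), 3))  # 0.467
```

In a real swarm the same computation would run over thousands of peers; a value near zero indicates that a peer's neighbours are rarely connected to each other, consistent with the paper's finding.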

  16. Detailed protocol to assess in vivo and ex vivo myeloperoxidase activity in mouse models of vascular inflammation and disease using hydroethidine.

    PubMed

    Talib, Jihan; Maghzal, Ghassan J; Cheng, David; Stocker, Roland

    2016-08-01

    Myeloperoxidase (MPO) activity contributes to arterial inflammation, vascular dysfunction and disease, including atherosclerosis. Current assessment of MPO activity in biological systems in vivo utilizes 3-chlorotyrosine (3-Cl-Tyr) as a biomarker of hypochlorous acid (HOCl) and other chlorinating species. However, 3-Cl-Tyr is formed in low yield and is subject to further metabolism. Recently, we reported a method to selectively assess MPO-activity in vivo by measuring the conversion of hydroethidine to 2-chloroethidium (2-Cl-E(+)) by liquid chromatography with tandem mass spectrometry (LC-MS/MS) (J. Biol. Chem., 289, 2014, pp. 5580-5595). The hydroethidine-based method has greater sensitivity for MPO activity than measurement of 3-Cl-Tyr. The current methods paper provides a detailed protocol to determine in vivo and ex vivo MPO activity in arteries from mouse models of vascular inflammation and disease by utilizing the conversion of hydroethidine to 2-Cl-E(+). Procedures for the synthesis of standards, preparation of tissue homogenates and the generation of 2-Cl-E(+) are also provided in detail, as are the conditions for LC-MS/MS detection of 2-Cl-E(+). PMID:27184954

  17. Assessing cultural validity in standardized tests in stem education

    NASA Astrophysics Data System (ADS)

    Gassant, Lunes

    This quantitative ex post facto study examined how race and gender, as elements of culture, influence the development of common misconceptions among STEM students. Primary data came from a standardized test: the Digital Logic Concept Inventory (DLCI) developed by Drs. Geoffrey L. Herman, Michael C. Louis, and Craig Zilles from the University of Illinois at Urbana-Champaign. The sample consisted of a cohort of 82 STEM students recruited from three universities in Northern Louisiana. Microsoft Excel and the Statistical Package for the Social Sciences (SPSS) were used for data computation. Two key concepts, several subconcepts, and 19 misconceptions were tested through 11 items in the DLCI. Statistical analyses based on both Classical Test Theory (Spearman, 1904) and Item Response Theory (Lord, 1952) yielded similar results: some misconceptions in the DLCI can reliably be predicted by the race or gender of the test taker. The research is significant because it has shown that some misconceptions in a STEM discipline affected students from similar ethnic backgrounds differently, indicating some cultural bias in the standardized test. The study therefore encourages further research into cultural validity in standardized tests. With culturally valid tests, it will be possible to increase the effectiveness of targeted teaching and learning strategies for STEM students from diverse ethnic backgrounds. To some extent, this dissertation has contributed to a better understanding of the gap between high enrollment rates and low graduation rates among African American students and also among other minority students in STEM disciplines.
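The Classical Test Theory side of an analysis like this starts from item difficulty, the proportion of examinees answering an item correctly, compared across demographic groups. A minimal sketch with invented response vectors (not the study's data):

```python
# Hypothetical sketch of a CTT-style item analysis: item difficulty
# (proportion correct) computed per demographic group, to flag items
# answered very differently across groups. All responses are invented.

def item_difficulty(responses):
    """CTT item difficulty p = proportion of examinees answering correctly."""
    return sum(responses) / len(responses)

# 1 = correct, 0 = incorrect on a single DLCI-style item, split by group
group_a = [1, 1, 0, 1, 1, 0, 1, 1]
group_b = [0, 1, 0, 0, 1, 0, 0, 1]

p_a = item_difficulty(group_a)
p_b = item_difficulty(group_b)
print(p_a, p_b, abs(p_a - p_b))  # a large gap suggests the item merits review
```

A persistent gap in difficulty between groups on the same item is only a flag; confirming bias requires the kind of follow-up modeling (e.g., IRT-based analysis) the dissertation describes.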

  18. Phase III Technology for All Americans Project: Creating Assessment, Professional Development, and Program Standards for Technological Literacy.

    ERIC Educational Resources Information Center

    Dugger, William E., Jr.

    2001-01-01

    The goals of Phase III of the Technology for All Americans Project are to develop student assessment standards, professional development standards, program standards, and effective leaders. The project is based on the Standards for Technological Literacy, a NASA initiative. (JOW)

  19. Beyond Criteria and Definitions: Outcome of a Standardized Antibody-Mediated Rejection Protocol with a Diagnostic Schema Different from the Banff 2009 Criteria.

    PubMed

    Rendulic, TrisAnn; Ramon, Daniel S; Killen, Paul D; Samaniego-Picota, Milagros; Park, Jeong M

    2014-01-01

    A new clinical diagnostic schema is needed for the diagnosis of antibody-mediated rejection (AMR) in kidney transplant recipients due to the limited utility of C4d staining, lack of standardized quantitative tests for donor specific antibodies, and potential new diagnostic markers. The treatment of AMR remains controversial because previous studies included heterogeneous treatment modalities, small sample sizes, and short follow-up time. At the University of Michigan Transplant Center, 26 patients were diagnosed with AMR based on our diagnostic protocol including C4d-negative AMR in the setting of graft dysfunction and Banff tissue injury type II (capillaritis) or type III (arteritis). After diagnosis, these patients received six sessions of plasmapheresis (PP) and IVIG (100 mg/kg after the first to fifth PP and 500 mg/kg with the last PP). Our novel finding in this analysis was the association between persistent C1q detection and graft loss. We confirmed that C4d positivity at diagnosis is associated with worse outcomes. Also, we found that response to our treatment protocol is dependent on C4d staining and Banff tissue injury type.

  20. Comparability of the expanded WMS-III standardization protocol to the published WMS-III among right and left temporal lobectomy patients.

    PubMed

    Doss, R C; Chelune, G J; Naugle, R I

    2000-11-01

    We examined whether differences between the expanded standardization protocol (SP), used to derive norms, and the final published version (PB) of the Wechsler Memory Scale - Third Edition (WMS-III; Wechsler, 1997a) would result in differences on the Primary Indexes in a neurologic sample. Specifically, we examined the comparability of the performances of 63 patients with temporal lobectomy (TL) who were administered either the expanded SP protocol (n = 33: 22 left TL and 11 right TL) or the PB battery (n = 30: 11 left TL and 19 right TL). Patients who were administered the SP or PB were comparable in terms of age, sex, education, seizure duration, postsurgical seizure status, and Full Scale IQ. Postoperative intervals were significantly longer for the SP group, although correlational analyses demonstrated no significant relationship between postoperative follow-up interval and WMS-III performance. A series of t tests revealed no significant differences on any of the eight Primary Index scores between patients taking the two versions of the WMS-III for either left or right TL groups. Furthermore, repeated measures analyses of variance failed to show significant differences on modality-specific memory scores between the SP and PB for the left and right TL groups. The current study indicates that temporal lobectomy patients obtained comparable scores on the two versions of the WMS-III.

  1. Alternate Assessments Based on Alternate Achievement Standards: Principals' Perceptions

    ERIC Educational Resources Information Center

    Towles-Reeves, Elizabeth; Kleinert, Harold; Anderman, Lynley

    2008-01-01

    To improve teaching and assessment for students with the most significant cognitive disabilities and to truly ensure that state accountability systems address the performance of these students, a better understanding of the instructional effects of accountability systems as perceived by school leaders (i.e., principals) is critically important.…

  2. Alignment of Standards and Assessments as an Accountability Criterion.

    ERIC Educational Resources Information Center

    La Marca, Paul M.

    2001-01-01

    Provides an overview of the concept of alignment and the role it plays in assessment and accountability systems. Discusses some methodological issues affecting the study of alignment and explores the relationship between alignment and test score interpretation. Alignment is not only a methodological requirement but also an ethical requirement.…

  3. Assessing function in patients undergoing joint replacement: a study protocol for a cohort study

    PubMed Central

    2012-01-01

    Background Joint replacement is an effective intervention for people with advanced arthritis, although there is an important minority of patients who do not improve post-operatively. There is a need for robust evidence on outcomes after surgery, but there are a number of measures that assess function after joint replacement, many of which lack any clear theoretical basis. The World Health Organisation has introduced the International Classification of Functioning, Disability and Health (ICF), which divides function into three separate domains: Impairment, activity limitations and participation restrictions. The aim of this study is to compare the properties and responsiveness of a selection of commonly used outcome tools that assess function, examine how well they relate to the ICF concepts, and to explore the changes in the measures over time. Methods/design Two hundred and sixty three patients listed for lower limb joint replacement at an elective orthopaedic centre have been recruited into this study. Participants attend the hospital for a research appointment prior to surgery and then at 3-months and 1-year after surgery. At each assessment time, function is assessed using a range of measures. Self-report function is assessed using the WOMAC, Aberdeen Impairment, Activity Limitation and Participation Restriction Measure, SF-12 and Measure Yourself Medical Outcome Profile 2. Clinician-administered measures of function include the American Knee Society Score for knee patients and the Harris Hip Score for hip patients. Performance tests include the timed 20-metre walk, timed get up and go, sit-to-stand-to-sit, step tests and single stance balance test. During the performance tests, participants wear an inertial sensor and data from motion analysis are collected. Statistical analysis will include exploring the relationship between measures describing the same ICF concepts, assessing responsiveness, and studying changes in measures over time. 
Discussion There are a
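The responsiveness analysis mentioned in this record can be illustrated with the standardized response mean (SRM), a common responsiveness statistic; the abstract does not specify which statistic the study uses, and the scores below are hypothetical:

```python
# Standardized response mean (SRM): mean change score divided by the
# standard deviation of the change scores. Illustrative only; the study
# does not state which responsiveness statistic it uses.
import statistics

def srm(baseline, follow_up):
    changes = [f - b for b, f in zip(baseline, follow_up)]
    return statistics.mean(changes) / statistics.stdev(changes)

# Hypothetical WOMAC-style scores before and 3 months after surgery
pre = [60, 55, 70, 65, 58, 62]
post = [35, 45, 50, 35, 53, 44]
print(round(srm(pre, post), 2))  # a large negative value indicates improvement
```

A larger |SRM| indicates a measure more sensitive to change between the pre- and post-surgery assessments.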

  4. ASSESSMENT PROTOCOLS - DURABILITY OF PERFORMANCE OF A HOME RADON REDUCTION SYSTEM FOR SUB-SLAB DEPRESSURIZATION SYSTEMS

    EPA Science Inventory

    This handbook contains protocols that compare the immediate performance of subslab depressurization (SSD) mitigation system with performance months or years later. These protocols provide a methodology to test SSD radon mitigation systems in situ to determine long-term performanc...

  5. Toward a Best-Practice Protocol for Assessment of Sensory Features in ASD

    ERIC Educational Resources Information Center

    Schaaf, Roseann C.; Lane, Alison E.

    2015-01-01

    Sensory difficulties are a commonly occurring feature of autism spectrum disorders and are now included as one manifestation of the "restricted, repetitive patterns of behavior, interests, or activities" diagnostic criteria of the DSM-5, necessitating guidelines for comprehensive assessment of these features. To facilitate the development…

  6. Combining Quality and Curriculum-Based Measurement: A Suggested Assessment Protocol in Writing

    ERIC Educational Resources Information Center

    Ganzeveld, Paula

    2015-01-01

    Curriculum-Based Measures in writing (CBM-W) assess a variety of fluency-based components of writing. While support exists for the use of CBM measures in the area of writing, there is a need to conduct further validation studies to investigate the utility of these measures within elementary and secondary classrooms. Since only countable indices…

  7. ASSESSING THE USE OF A STANDARDIZED DENTAL DIAGNOSTIC TERMINOLOGY

    PubMed Central

    Tokede, Oluwabunmi; White, Joel M.; Stark, Paul C.; Vaderhobli, Ram; Walji, Muhammad F.; Ramoni, Rachel B.; Schoonheim-Klein, Meta E.; Kimmes, Nicole S.; Tavares, Anamaria

    2012-01-01

    Although standardized terminologies, such as the International Classification of Diseases (ICD), have been in use in medicine for over a century, efforts in the dental profession to standardize dental diagnostic terms have not achieved widespread acceptance. To address this gap, a standardized dental diagnostic terminology, the ‘EZcodes’, was developed in 2009. Fifteen dental practices and schools in the United States and Europe have implemented the EZcodes. In this paper, we report on the utilization and valid entry of the EZcodes at three of the dental schools that have adopted this standardized dental diagnostic terminology. Electronic data on the use of procedure codes with diagnostic terms from the three schools over a one-year period between July 2010 and June 2011 were aggregated. The diagnostic term and procedure code pairs were adjudicated by three calibrated dentists. Analyses were conducted to gain insight into the utilization and valid entry of the EZcodes diagnostic terminology over the one-year period from 7/1/2010 through 6/30/2011. Error proportions in the entry of diagnostic terms (and by diagnostic category) were also computed. Within the twelve-month period included in the analysis, a total of 29,965 diagnostic terms and 249,411 procedure codes were entered at the three institutions, resulting in a utilization proportion of 12%. Caries and periodontics were the most frequently used categories. More than 1,000 of the available 1,321 diagnostic terms were never used at the three institutions. Overall, 60.5% of the EZcodes entries were found to be valid. In summary, our results demonstrate low utilization of EZcodes in an electronic dental record and underscore the need for specific training of dental providers on the importance of using dental diagnostic terminology and specifically how to use the terms within the EHR. This will serve to increase the use/correct use of the EZcodes diagnostic terminology and ultimately create a
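The utilization proportion reported in this abstract is simply diagnostic-term entries divided by procedure-code entries, using the counts given above:

```python
# Utilization proportion from the counts reported in the abstract:
# diagnostic terms entered per procedure code entered.
diagnostic_terms = 29_965
procedure_codes = 249_411
proportion = diagnostic_terms / procedure_codes
print(f"{proportion:.0%}")  # -> 12%
```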

  8. Assessment of Offshore Wind System Design, Safety, and Operation Standards

    SciTech Connect

    Sirnivas, S.; Musial, W.; Bailey, B.; Filippelli, M.

    2014-01-01

    This report is a deliverable for a project sponsored by the U.S. Department of Energy (DOE) entitled National Offshore Wind Energy Resource and Design Data Campaign -- Analysis and Collaboration (contract number DE-EE0005372; prime contractor -- AWS Truepower). The project objective is to supplement, facilitate, and enhance ongoing multiagency efforts to develop an integrated national offshore wind energy data network. The results of this initiative are intended to 1) produce a comprehensive definition of relevant met-ocean resource assets and needs and design standards, and 2) provide a basis for recommendations for meeting offshore wind energy industry data and design certification requirements.

  9. Treatment protocol based on assessment of clot quality during endovascular thrombectomy for acute ischemic stroke using the Trevo stent retriever.

    PubMed

    Ishikawa, Kojiro; Ohshima, Tomotaka; Nishihori, Masahiro; Imai, Tasuku; Goto, Shunsaku; Yamamoto, Taiki; Nishizawa, Toshihisa; Shimato, Shinji; Kato, Kyozo

    2016-08-01

    The optimal endovascular approach for acute ischemic stroke is unclear. The Trevo stent retriever can be used as first-line treatment for fast mechanical recanalization. The authors developed a treatment protocol for acute ischemic stroke based on the assessment of clot quality during clot removal with the Trevo. This prospective single-center study included all patients admitted for acute ischemic stroke between July 2014 and February 2015 who underwent emergency endovascular treatment. According to the protocol, the Trevo was used for first-line treatment. Immediately after the Trevo was deployed, the stent delivery wire was pushed to open the stent by force (ACAPT technique). Clot quality was assessed on the basis of the perfusion status after deployment of the Trevo: continued occlusion, or immediate reopening that either reoccluded or was maintained after the stent retriever had been in place for 5 min. If there was no obvious clot removal after the first pass with the Trevo, then, according to the quality of the clot, either a second pass was performed or another endovascular device was selected. Twelve consecutive patients with acute major cerebral artery occlusion were analyzed. A thrombolysis in cerebral infarction score of 2b or 3 was achieved in 11 patients (91.7%), and 9 (75%) had a good clinical outcome after 90 days based on a modified Rankin scale score ≤ 2. Symptomatic intracranial hemorrhage occurred in 1 patient (8.3%). The overall mortality rate was 8.3%. Endovascular thrombectomy using the Trevo stent retriever for first-line treatment is feasible and effective. PMID:27578909
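The decision flow this protocol describes can be sketched as follows; the clot-quality labels and their mapping to actions are assumptions for illustration, not taken from the paper:

```python
# Hypothetical sketch of the post-first-pass decision described in the
# abstract. The 'soft'/'hard' labels and their mapping to actions are
# assumed for illustration; the paper does not spell them out here.

def next_step(clot_removed: bool, clot_quality: str) -> str:
    if clot_removed:
        return "finish"
    # No obvious clot removal after the first pass: decide by clot quality
    if clot_quality == "soft":
        return "second Trevo pass"
    return "switch to another endovascular device"

print(next_step(False, "soft"))  # -> second Trevo pass
```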

  10. Treatment protocol based on assessment of clot quality during endovascular thrombectomy for acute ischemic stroke using the Trevo stent retriever

    PubMed Central

    Ishikawa, Kojiro; Ohshima, Tomotaka; Nishihori, Masahiro; Imai, Tasuku; Goto, Shunsaku; Yamamoto, Taiki; Nishizawa, Toshihisa; Shimato, Shinji; Kato, Kyozo

    2016-01-01

    The optimal endovascular approach for acute ischemic stroke is unclear. The Trevo stent retriever can be used as first-line treatment for fast mechanical recanalization. The authors developed a treatment protocol for acute ischemic stroke based on the assessment of clot quality during clot removal with the Trevo. This prospective single-center study included all patients admitted for acute ischemic stroke between July 2014 and February 2015 who underwent emergency endovascular treatment. According to the protocol, the Trevo was used for first-line treatment. Immediately after the Trevo was deployed, the stent delivery wire was pushed to open the stent by force (ACAPT technique). Clot quality was assessed on the basis of the perfusion status after deployment of the Trevo: continued occlusion, or immediate reopening that either reoccluded or was maintained after the stent retriever had been in place for 5 min. If there was no obvious clot removal after the first pass with the Trevo, then, according to the quality of the clot, either a second pass was performed or another endovascular device was selected. Twelve consecutive patients with acute major cerebral artery occlusion were analyzed. A thrombolysis in cerebral infarction score of 2b or 3 was achieved in 11 patients (91.7%), and 9 (75%) had a good clinical outcome after 90 days based on a modified Rankin scale score ≤ 2. Symptomatic intracranial hemorrhage occurred in 1 patient (8.3%). The overall mortality rate was 8.3%. Endovascular thrombectomy using the Trevo stent retriever for first-line treatment is feasible and effective. PMID:27578909

  11. Teacher Evaluation Standards in Practice: A Standards-Based Assessment Tool for Diversity-Responsive Teaching.

    ERIC Educational Resources Information Center

    Sobel, Donna M.; Taylor, Sheryl V.; Anderson, Ruth E.

    2003-01-01

    Describes joint effects by urban university faculty and a large public school district to create an observation tool for assessing and mentoring preservice and inservice teachers' abilities to meaningfully address diversity issues in their classroom, examining missions and background information on the university and school district and the…

  12. Standards for chemical quality of drinking water: a critical assessment.

    PubMed

    Zielhuis, R L

    1982-01-01

    The author critically reviews present standards for the chemical quality of drinking water, particularly the limits proposed by the Commission of the European Communities (CEC) in 1979. In particular, the general principles of standard setting are discussed. There is a surprisingly high similarity among drinking water limits issued by various national and international authorities, although important discrepancies exist for other environmental compartments. Drinking water limits usually lack adequate documentation and often appear to be copied from other existing lists. There is an apparent lack of logical consistency in the limits set for food, ambient or workroom air, and drinking water, probably due to a lack of communication between health experts and decision-making authorities. Moreover, there is a lack of toxicologic studies explicitly aimed at setting limits. Extrapolation from the acceptable daily intakes (ADI) for food or the Threshold Limit Value (TLV)-Maximum Acceptable Concentration (MAC) for workroom air could be undertaken to derive tentative drinking water limits, as long as studies explicitly designed for drinking water are not yet available. PMID:6749691

  13. CAATS--Comprehensive Assessments Aligned with Teacher Standards: A Five Step Design Model for Assessing Teachers Validly and Reliably

    ERIC Educational Resources Information Center

    Wilkerson, Judy R.; Lang, William Steve

    2005-01-01

    NCATE (2002) requires the measurement of knowledge, skills, and dispositions as part of its accreditation requirements for teacher education programs (Standard 1) and the use of unit assessment systems to aggregate and analyse data with a view toward program improvement (Standard 2). Data must indicate that candidates meet professional, state, and…

  14. A minimum dataset for a standard transoesophageal echocardiogram: a guideline protocol from the British Society of Echocardiography

    PubMed Central

    Wheeler, Richard; Steeds, Richard; Rana, Bushra; Wharton, Gill; Smith, Nicola; Allen, Jane; Chambers, John; Jones, Richard; Lloyd, Guy; O'Gallagher, Kevin

    2015-01-01

    A systematic approach to transoesophageal echocardiography (TOE) is essential to ensure that no pathology is missed during a study. In addition, a standardised approach facilitates the education and training of operators and is helpful when reviewing studies performed in other departments or by different operators. This document, produced by the British Society of Echocardiography, aims to provide a framework for a standard TOE study. In addition to a minimum dataset, the layout proposes a recommended sequence in which to perform a comprehensive study. It is recommended that this standardised approach be followed when performing TOE in all clinical settings, including intraoperative TOE, to ensure important pathology is not missed. Consequently, this document has been prepared with the direct involvement of the Association of Cardiothoracic Anaesthetists (ACTA). PMID:26798487

  15. Comparison of size modulation and conventional standard automated perimetry with the 24-2 test protocol in glaucoma patients

    PubMed Central

    Hirasawa, Kazunori; Shoji, Nobuyuki; Kasahara, Masayuki; Matsumura, Kazuhiro; Shimizu, Kimiya

    2016-01-01

    This prospective randomized study compared test results of size modulation standard automated perimetry (SM-SAP) performed with the Octopus 600 and conventional SAP (C-SAP) performed with the Humphrey Field Analyzer (HFA) in glaucoma patients. Eighty-eight eyes of 88 glaucoma patients underwent SM-SAP and C-SAP tests with the Octopus 600 24-2 Dynamic and HFA 24-2 SITA-Standard, respectively. Fovea threshold, mean defect, and square loss variance of SM-SAP were significantly correlated with the corresponding C-SAP indices (P < 0.001). The false-positive rate was slightly lower, and the false-negative rate slightly higher, with SM-SAP than with C-SAP (P = 0.002). Point-wise threshold values obtained with SM-SAP were moderately to strongly correlated with those obtained with C-SAP (P < 0.001). The correlation coefficients of the central zone were significantly lower than those of the middle to peripheral zone (P = 0.031). The size and depth of the visual field (VF) defect were smaller (P = 0.039) and greater (P = 0.043), respectively, on SM-SAP than on C-SAP. Although small differences were observed between SM-SAP and C-SAP in central-zone VF sensitivity, defect size and depth, and the reliability indices, the global indices of the two testing modalities were well correlated. PMID:27149561
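The point-wise correlation reported in this record is a plain Pearson coefficient between the two tests' threshold values; a minimal sketch with hypothetical threshold data (dB), not study data:

```python
# Pearson correlation between point-wise thresholds from two perimetry
# tests. Values are hypothetical, for illustration only.
from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

sm_sap = [28, 30, 25, 31, 27, 29]   # dB thresholds, hypothetical
c_sap = [27, 31, 24, 32, 26, 30]
print(round(pearson(sm_sap, c_sap), 3))  # -> 0.982
```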

  16. Comparison of size modulation and conventional standard automated perimetry with the 24-2 test protocol in glaucoma patients.

    PubMed

    Hirasawa, Kazunori; Shoji, Nobuyuki; Kasahara, Masayuki; Matsumura, Kazuhiro; Shimizu, Kimiya

    2016-01-01

    This prospective randomized study compared test results of size modulation standard automated perimetry (SM-SAP) performed with the Octopus 600 and conventional SAP (C-SAP) performed with the Humphrey Field Analyzer (HFA) in glaucoma patients. Eighty-eight eyes of 88 glaucoma patients underwent SM-SAP and C-SAP tests with the Octopus 600 24-2 Dynamic and HFA 24-2 SITA-Standard, respectively. Fovea threshold, mean defect, and square loss variance of SM-SAP were significantly correlated with the corresponding C-SAP indices (P < 0.001). The false-positive rate was slightly lower, and the false-negative rate slightly higher, with SM-SAP than with C-SAP (P = 0.002). Point-wise threshold values obtained with SM-SAP were moderately to strongly correlated with those obtained with C-SAP (P < 0.001). The correlation coefficients of the central zone were significantly lower than those of the middle to peripheral zone (P = 0.031). The size and depth of the visual field (VF) defect were smaller (P = 0.039) and greater (P = 0.043), respectively, on SM-SAP than on C-SAP. Although small differences were observed between SM-SAP and C-SAP in central-zone VF sensitivity, defect size and depth, and the reliability indices, the global indices of the two testing modalities were well correlated. PMID:27149561

  17. Distance Education Assessment Infrastructure and Process Design Based on International Standard 23988

    ERIC Educational Resources Information Center

    Shaffer, Steven C.

    2012-01-01

    Assessment is an important part of distance education (DE). As class sizes get larger and workloads increase, the IT infrastructure and processes used for DE assessments become more of an issue. Using the BS ISO/IEC 23988:2007 Standard for the use of technology in the delivery of assessments as a guide, this paper describes a rational approach to…

  18. Characteristics of States' Alternate Assessments Based on Modified Academic Achievement Standards in 2008. Synthesis Report 72

    ERIC Educational Resources Information Center

    Albus, Deb; Lazarus, Sheryl S.; Thurlow, Martha L.; Cormier, Damien

    2009-01-01

    In April 2007, Federal No Child Left Behind regulations were finalized that provided states with additional flexibility for assessing some students with disabilities. The regulations allowed states to offer another assessment option, alternate assessments based on modified academic achievement standards (AA-MAS). States are not required to have…

  19. Setting Performance Standards for the VAL-ED: Assessment of Principal Leadership

    ERIC Educational Resources Information Center

    Porter, Andrew; Goldring, Ellen; Elliott, Stephen; Murphy, Joseph; Polikoff, Morgan; Cravens, Xiu

    2008-01-01

    The Vanderbilt Assessment of Leadership in Education is a 360-degree assessment of principals' learning-centered leadership behaviors. The instrument was designed to provide formative and summative assessment to principals on the leadership behaviors most important to student learning. The purpose of this report is to describe the standard-setting…

  20. Online colour training system for dental students: a comprehensive assessment of different training protocols.

    PubMed

    Liu, M; Chen, L; Liu, X; Yang, Y; Zheng, M; Tan, J

    2015-04-01

    The purpose of this study was to evaluate the training effect and to determine the optimal training protocol for a recently developed online colour training system. Seventy students participated in the evaluation. They first completed a baseline test with shade guides (SGT) and with the training system (TST), and then trained with one of the three system training methods (Basic colour training for group E1, Vitapan Classical for E2, and Vitapan 3D-Master for E3) or with shade guides (group C1) for 4 days. The control group (C2) received no training. The same test was performed after training, and participants finally completed a questionnaire. The number of correct matches after training increased in all three experimental groups and in group C1. Among the experimental groups, the greatest improvement in the number of correct matches was achieved by group E3 (4·00 ± 1·88 in SGT, 4·29 ± 2·73 in TST), followed by E2 (2·29 ± 2·73 in SGT, 3·50 ± 3·03 in TST) and E1 (2·00 ± 2·60 in SGT, 1·93 ± 2·96 in TST). The difference between E3 and E1 was statistically significant (P = 0·036 in SGT, 0·026 in TST). The total average training time was shorter in groups E2 (15·39 ± 4·22 min) and E3 (17·63 ± 5·22 min), with no significant difference between them. Subjective evaluations revealed that self-confidence in colour matching improved more in groups C1 and E3. In conclusion, all tested sections of the system effectively improved students' colour-matching ability. Among the system training methods, Vitapan 3D-Master showed the best performance: it enabled greater shade-matching improvement, saved time, and was superior in subjective evaluations.

  1. Validity of Partial Protocols to Assess the Prevalence of Periodontal Outcomes and Associated Sociodemographic and Behavior Factors in Adolescents and Young Adults

    PubMed Central

    Peres, Marco A.; Peres, Karen G.; Cascaes, Andreia M.; Correa, Marcos B.; Demarco, Flávio F.; Hallal, Pedro C.; Horta, Bernardo L.; Gigante, Denise P.; Menezes, Ana B.

    2012-01-01

    Background Most studies comparing the prevalence of periodontal disease and risk factors by using partial protocols were performed in adult populations, with several studies being conducted in clinical settings. The aim of this study is to assess the accuracy of partial protocols in estimating the prevalence of periodontal outcomes in adolescents and young adults from two population-based birth cohorts from Pelotas, Brazil, and to assess differences in the estimation and strength of the effect measures when partial protocols are adopted compared to full-mouth examination. Methods Gingival bleeding at probing among adolescents (n = 339) and young adults (n = 720) and dental calculus and periodontal probing depth among young adults were assessed using full-mouth examinations and four partial protocols: Ramfjord teeth (RT), community periodontal index (CPI), and two random diagonal quadrants (1 and 3, 2 and 4). Socioeconomic, demographic, and periodontal health-related variables were also collected. Sensitivity, absolute and relative bias, and inflation factors were calculated. The prevalence ratio of each periodontal outcome for the risk factors was estimated. Results The two diagonal-quadrant protocols showed the best accuracy; RT had the worst, whereas CPI presented an intermediate pattern when compared to full-mouth examination. For bleeding assessment in adolescence, RT and CPI underestimated the true outcome prevalence by 18.4% and 16.2%, respectively, whereas among young adults, all partial protocols underestimated the prevalence. All partial protocols presented similar magnitudes of association measures for all investigated periodontal potential risk factors. Conclusion The two-diagonal-quadrant protocol may be effective in identifying the risk factors for the most relevant periodontal outcomes in adolescence and in young adulthood. PMID:21859320
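The sensitivity and relative-bias comparison described in this record can be sketched as follows; the per-subject indicators are hypothetical, not study data:

```python
# Sketch of scoring a partial periodontal protocol against full-mouth
# examination. Per-subject indicators below are hypothetical.

def sensitivity(full, partial):
    """Share of full-mouth cases that the partial protocol also flags."""
    cases = [i for i, d in enumerate(full) if d]
    detected = sum(1 for i in cases if partial[i])
    return detected / len(cases)

def relative_bias(full, partial):
    """Relative difference in prevalence estimated by each protocol."""
    p_full = sum(full) / len(full)
    p_part = sum(partial) / len(partial)
    return (p_part - p_full) / p_full

# 1 = gingival bleeding detected on the teeth each protocol examines
full_mouth = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
two_quadrants = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]

print(sensitivity(full_mouth, two_quadrants))    # one case missed
print(relative_bias(full_mouth, two_quadrants))  # negative: underestimate
```

A negative relative bias corresponds to the underestimation of prevalence the study reports for the partial protocols.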

  2. Moving beyond standard procedures to assess spontaneous recognition memory.

    PubMed

    Ameen-Ali, K E; Easton, A; Eacott, M J

    2015-06-01

    This review will consider how spontaneous tasks have been applied alongside neuroscientific techniques to test complex forms of recognition memory for objects and their environmental features, e.g. the spatial location of an object or the context in which it is presented. We discuss studies that investigate the roles of the perirhinal cortex and the hippocampus in recognition memory using standard testing paradigms, and consider how these findings contribute to the ongoing debate about whether recognition memory is a single unitary process or multiple processes that can be dissociated anatomically and functionally. Due to the wide use of spontaneous tasks, the need for improved procedures that reduce animal use is acknowledged, with multiple trial paradigms discussed as a novel way of reducing variability and animal numbers in these tasks. The importance of improving translation of animal models to humans is highlighted, with emphasis on a shift away from relying on the phenomenological experience of human subjects.

  3. Ground-Water Data-Collection Protocols and Procedures for the National Water-Quality Assessment Program: Collection and Documentation of Water-Quality Samples and Related Data

    USGS Publications Warehouse

    Koterba, Michael T.; Wilde, Franceska D.; Lapham, Wayne W.

    1995-01-01

    Protocols for ground-water sampling are described in a report written in 1989 as part of the pilot program for the National Water-Quality Assessment (NAWQA) Program of the U.S. Geological Survey (USGS). These protocols have been reviewed and revised to address the needs of the full-scale implementation of the NAWQA Program that began in 1991. This report, which is a collaborative effort between the NAWQA Program and the USGS Office of Water Quality, is the result of that review and revision. This report describes protocols and recommended procedures for the collection of water-quality samples and related data from wells for the NAWQA Program. Protocols and recommended procedures discussed include (1) equipment setup and other preparations for data collection; (2) well purging and field measurements; (3) collecting and processing ground-water-quality samples; (4) equipment decontamination; (5) quality-control sampling; and (6) sample handling and shipping.

  4. Factors Associated With the Use of Standardized Power Mobility Skills Assessments Among Assistive Technology Practitioners.

    PubMed

    Jenkins, Gavin R; Vogtle, Laura K; Yuen, Hon K

    2015-01-01

    This study investigated the self-reported prevalence of and factors affecting clinicians' use of standardized assessments when evaluating clients for power mobility devices (PMDs), and explored the assessments clinicians typically use when carrying out PMD evaluations. An e-mail survey was sent to assistive technology professionals listed in the Rehabilitation Engineering and Assistive Technology Society of North America directory. Three hundred fifty-four respondents, qualified to conduct formal power mobility skills assessments, completed the online survey. Of those, 122 (34.5%) respondents reported that they were aware of the existence of standardized performance-based power mobility skills assessments, but only 28 (7.9%) used these assessments in their practice. Multivariate analysis revealed that the odds of using the standardized assessments were 18 times higher for respondents who were aware of their existence than for those who were not (adjusted odds ratio [OR] = 17.85, P < 0.0001). The odds of using the standardized assessments were five times higher for respondents who did not identify themselves as occupational or physical therapists than for those who did (adjusted OR = 0.20, P < 0.0001). This survey revealed that the assistive technology practitioners who recommend PMDs mainly use non-standardized mobility skills assessments.
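An odds ratio like those reported here can be illustrated with the cross-product of a 2x2 table; the counts below are hypothetical, and the survey's *adjusted* ORs come from a multivariate model that this simple calculation does not replicate:

```python
# Unadjusted odds ratio from a 2x2 table (cross-product ratio).
# Counts are hypothetical; the study reports adjusted ORs from a
# multivariate model, which this calculation does not replicate.

def odds_ratio(a, b, c, d):
    """a = aware & user, b = aware & non-user,
    c = unaware & user, d = unaware & non-user."""
    return (a * d) / (b * c)

print(round(odds_ratio(24, 98, 4, 228), 2))  # -> 13.96
```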

  5. Aligning Mathematics Assessment Standards: New Mexico and the 2009 National Assessment of Educational Progress (NAEP). REL Technical Brief. REL 2008-No. 011

    ERIC Educational Resources Information Center

    Shapley, Kathy L.; Brite, Jessica

    2008-01-01

    This technical brief examines the current alignment between the New Mexico Standards Based Assessment (NMSBA) standards and the 2009 National Assessment of Educational Progress (NAEP) mathematics framework. It looks at the extent to which current state assessment standards cover the content on which 2009 NAEP assessments will be based. Applying…

  6. A Standardized Tool for Assessing the Quality of Classroom-Based Shared Reading: Systematic Assessment of Book Reading (SABR)

    ERIC Educational Resources Information Center

    Pentimonti, Jill M.; Zucker, Tricia A.; Justice, Laura M.; Petscher, Yaacov; Piasta, Shayne B.; Kaderavek, Joan N.

    2012-01-01

    Participation in shared-reading experiences is associated with children's language and literacy outcomes, yet few standardized assessments of shared-reading quality exist. The purpose of this study was to describe the psychometric characteristics of the Systematic Assessment of Book Reading (SABR), an observational tool designed to characterize…

  7. Mimics: a symbolic conflict/cooperation simulation program, with embedded protocol recording and automatic psychometric assessment.

    PubMed

    Aidman, Eugene V; Shmelyov, Alexander G

    2002-02-01

    This paper describes an interactive software environment designed as a social interaction simulator with embedded comprehensive recording and flexible assessment facilities. Using schematized visual sketches similar to cross-cultural facial universals (Ekman, 1999), Mimics (Shmelyov & Aidman, 1997) employs a computer-game-like scenario that requires the subject to identify with an avatar and navigate it through a playing field inhabited by hosts who display a range of facial expressions. From these expressions (which are highly consequential), the player has to anticipate the hosts' reactions to the avatar (which may vary from friendly to obstructing or aggressive) and choose between negotiating with a host (by altering the avatar's facial expression), attacking it, or searching for an escape route. Comprehensive recording of player moves and interactions has enabled computation of several fine-grained indices of interactive behavior, such as aggressive response styles, efficiency, and motivation in conflict/cooperation contexts. Initial validation data and potential applications of the method in the assessment of personality and social behavior are discussed.

  8. Rats assess costs and benefits according to an internal standard.

    PubMed

    van den Bos, Ruud; van der Harst, Johanneke; Jonkman, Sietse; Schilders, Mariska; Spruijt, Berry

    2006-08-10

    Variation in effort to obtain rewards is a fact of mammalian everyday life. In this study, we assess how rats scale variable costs and benefits. Different groups of rats were trained in a T-maze to discriminate a high-reward (three or five sugar pellets) from a low-reward (one sugar pellet) arm. Subsequently, barriers were introduced at the high- and low-reward sides such that the overall long-term pay-off of the high-reward arm finally became lower than that of the low-reward arm. The data show that, under different regimes of costs (climbing barriers) and benefits (number of rewards) in the two arms, rats appear to shift their behaviour towards the better side according to a constant relative cost-benefit ratio between the arms. Such a ratio allows them to deal with variation in the (physical appearance of) costs and benefits and choose the best long-term option. PMID:16697474
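The arm comparison described here can be illustrated with a simple net pay-off per arm; the specific cost-benefit metric and the numbers are assumed for illustration, not taken from the paper:

```python
# Illustration of comparing two maze arms by long-term pay-off; the
# cost-benefit metric and values are assumed, not from the paper.

def net_payoff(rewards, barrier_cost):
    # Net benefit per visit in arbitrary units: rewards minus effort cost
    return rewards - barrier_cost

high_arm = net_payoff(rewards=5, barrier_cost=3.0)  # high reward, tall barrier
low_arm = net_payoff(rewards=1, barrier_cost=0.5)   # low reward, low barrier
print("high" if high_arm > low_arm else "low")  # -> high
```

Raising the barrier cost of the high-reward arm far enough flips the comparison, which is the manipulation the study uses to test whether rats track the better long-term option.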

  9. Independent donor ethical assessment: aiming to standardize donor advocacy.

    PubMed

    Choudhury, Devasmita; Jotterand, Fabrice; Casenave, Gerald; Smith-Morris, Carolyn

    2014-06-01

    Living organ donation has become more common across the world. To ensure an informed consent process, given the complex issues involved with organ donation, independent donor advocacy is required. The choice of how donor advocacy is administered is left up to each transplant center. This article presents the experience and process of donor advocacy at University of Texas Southwestern Medical Center administered by a multidisciplinary team consisting of physicians, surgeons, psychologists, medical ethicists and anthropologists, lawyers, a chaplain, a living kidney donor, and a kidney transplant recipient. To ensure that advocacy remains fair and consistent for all donors being considered, the donor advocacy team at University of Texas Southwestern Medical Center developed the Independent Donor Ethical Assessment, a tool that may be useful to others in rendering donor advocacy. In addition, the tool may be modified as circumstances arise to improve donor advocacy and maintain uniformity in decision making.

  10. Independent donor ethical assessment: aiming to standardize donor advocacy.

    PubMed

    Choudhury, Devasmita; Jotterand, Fabrice; Casenave, Gerald; Smith-Morris, Carolyn

    2014-06-01

    Living organ donation has become more common across the world. To ensure an informed consent process, given the complex issues involved with organ donation, independent donor advocacy is required. The choice of how donor advocacy is administered is left up to each transplant center. This article presents the experience and process of donor advocacy at University of Texas Southwestern Medical Center administered by a multidisciplinary team consisting of physicians, surgeons, psychologists, medical ethicists and anthropologists, lawyers, a chaplain, a living kidney donor, and a kidney transplant recipient. To ensure that advocacy remains fair and consistent for all donors being considered, the donor advocacy team at University of Texas Southwestern Medical Center developed the Independent Donor Ethical Assessment, a tool that may be useful to others in rendering donor advocacy. In addition, the tool may be modified as circumstances arise to improve donor advocacy and maintain uniformity in decision making. PMID:24919733

  11. The renewables portfolio standard in Texas: An early assessment

    SciTech Connect

    Wiser, Ryan H.; Langniss, Ole

    2001-11-01

    Texas has rapidly emerged as one of the leading wind power markets in the United States. This development can be largely traced to a well-designed and carefully implemented renewables portfolio standard (RPS). The RPS is a new policy mechanism that has received increasing attention as an attractive approach to support renewable power generation. Though replacing existing renewable energy policies with an as-of-yet largely untested approach in the RPS is risky, early experience from Texas suggests that an RPS can effectively spur renewables development and encourage competition among renewable energy producers. Initial RPS targets in Texas will be far exceeded by the end of 2001, with as much as 930 MW of wind slated for installation this year. RPS compliance costs appear negligible, with new wind projects reportedly contracted for under 3¢ (US)/kWh, in part as a result of a 1.7¢ (US)/kWh production tax credit, an outstanding wind resource, and an RPS that is sizable enough to drive project economies of scale. Obliged retail suppliers have been willing to enter into long-term contracts with renewable generators, reducing important risks for both the developer and the retail supplier. Finally, the country's first comprehensive renewable energy certificate program has been put into place to monitor and track RPS compliance.

  12. Prostate motion during standard radiotherapy as assessed by fiducial markers.

    PubMed

    Crook, J M; Raymond, Y; Salhani, D; Yang, H; Esche, B

    1995-10-01

From November 1993 to August 1994, 55 patients with localized prostate carcinoma had three gold seeds placed in the prostate under transrectal ultrasound guidance prior to the start of radiotherapy in order to track prostate motion. Patients had a planning CT scan before initial simulation and again at about 40 Gy, just prior to simulation of a field reduction. Seed position relative to fixed bony landmarks (pubic symphysis and both ischial tuberosities) was digitized from each pair of orthogonal films from the initial and boost simulation using the Nucletron brachytherapy planning system. Vector analysis was performed to rule out the possibility of independent seed migration within the prostate between the time of initial and boost simulation. Prostate motion was seen in the posterior (mean: 0.56 cm; SD: 0.41 cm) and inferior directions (mean: 0.59 cm; SD: 0.45 cm). The base of the prostate was displaced more than 1 cm posteriorly in 30% of patients and more than 1 cm inferiorly in 11%. Prostate position is related to rectal and bladder filling. Distension of these organs displaces the prostate in an anterosuperior direction, with lesser degrees of filling allowing the prostate to move posteriorly and inferiorly. Conformal therapy planning must take this motion into consideration. Changes in prostate position of this magnitude preclude the use of standard margins. PMID:8539455
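    The vector analysis described above can be sketched numerically. The seed coordinates below are invented for illustration, not data from the study; the idea is that per-seed displacement vectors between the two simulations should be nearly identical if the prostate moved as a rigid body (ruling out independent seed migration), and their mean gives the posterior and inferior shift.

```python
import statistics

# Hypothetical seed coordinates (cm) relative to a fixed bony landmark,
# digitized at initial simulation and at boost simulation (~40 Gy).
# Axes: x = anterior(+)/posterior(-), y = superior(+)/inferior(-).
initial = [(0.2, 1.1), (0.5, 0.4), (0.1, -0.3)]
boost = [(-0.4, 0.5), (-0.1, -0.2), (-0.5, -0.9)]

# Per-seed displacement vectors between the two simulations.
deltas = [(bx - ix, by - iy) for (ix, iy), (bx, by) in zip(initial, boost)]

dx = [d[0] for d in deltas]
dy = [d[1] for d in deltas]
posterior_shift = -statistics.mean(dx)  # positive = net posterior motion
inferior_shift = -statistics.mean(dy)   # positive = net inferior motion

# If the seeds moved rigidly (no independent migration), the per-seed
# vectors agree and their spread is near zero.
migration_spread = max(statistics.pstdev(dx), statistics.pstdev(dy))

print(f"posterior shift: {posterior_shift:.2f} cm")
print(f"inferior shift:  {inferior_shift:.2f} cm")
print(f"seed spread:     {migration_spread:.2f} cm (small => rigid motion)")
```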

  13. Study of Optimal Replacement of Thyroxine in the ElDerly (SORTED): protocol for a mixed methods feasibility study to assess the clinical utility of lower dose thyroxine in elderly hypothyroid patients: study protocol for a randomized controlled trial

    PubMed Central

    2013-01-01

    Background The population of the UK is ageing. There is compelling evidence that thyroid stimulating hormone distribution levels increase with age. Currently, in UK clinical practice elderly hypothyroid patients are treated with levothyroxine to lower their thyroid stimulating hormone levels to a standard non-age-related range. Evidence suggests that mortality is negatively associated with thyroid stimulating hormone levels. We report the protocol of a feasibility study working towards a full-scale randomized controlled trial to test whether lower dose levothyroxine has beneficial cardiovascular outcomes in the oldest old. Methods/design SORTED is a mixed methods study with three components: SORTED A: A feasibility study of a dual-center single-blinded randomized controlled trial of elderly hypothyroid patients currently treated with levothyroxine. Setting: Patients will be recruited from 20 general practices and two hospital trust endocrine units in Northumberland, Tyne and Wear. Participants: Target recruitment of 50 elderly hypothyroid patients currently treated with levothyroxine, identified in both primary and secondary care settings. Intervention: Reduced dose of levothyroxine to achieve an elevated serum thyroid stimulating hormone (target range 4.1 to 8.0 mU/L) versus standard levothyroxine replacement (target range 0.4 to 4.0 mU/L). Randomization: Using random permuted blocks, in a ratio of 1:1, randomization will be carried out by Newcastle Clinical Trials Unit. Outcomes: Study feasibility (recruitment and retention rates and medication compliance), acceptability of the trial design, assessment of mobility and falls risk, and change in cardiovascular risk factors. SORTED B: Qualitative study using in-depth interviews to understand patients’ willingness to take part in a randomized controlled trial and participants’ experience of the intervention. SORTED C: Retrospective cohort study of 400 treated hypothyroid patients aged 80 years or over
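    The 1:1 allocation by random permuted blocks mentioned above can be sketched as follows. The block sizes, arm labels, and random seed are assumptions for illustration, not details taken from the SORTED protocol.

```python
import random

def permuted_block_randomization(n_participants,
                                 arms=("reduced dose", "standard dose"),
                                 block_sizes=(2, 4), seed=42):
    """Allocate participants 1:1 using random permuted blocks (sketch)."""
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_participants:
        size = rng.choice(block_sizes)            # randomly sized block
        block = list(arms) * (size // len(arms))  # equal count per arm
        rng.shuffle(block)                        # permute within the block
        allocation.extend(block)
    return allocation[:n_participants]

alloc = permuted_block_randomization(50)
print(alloc.count("reduced dose"), alloc.count("standard dose"))
```

    Truncating mid-block at the target sample size can leave the arms unequal by at most half a block, which is why trials often make the final block size match the remaining slots.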

  14. Exploring a New Simulation Approach to Improve Clinical Reasoning Teaching and Assessment: Randomized Trial Protocol

    PubMed Central

    Moussa, Ahmed; Loye, Nathalie; Charlin, Bernard; Audétat, Marie-Claude

    2016-01-01

Background Helping trainees develop appropriate clinical reasoning abilities is a challenging goal in an environment where clinical situations are marked by high levels of complexity and unpredictability. The benefit of simulation-based education to assess clinical reasoning skills has rarely been reported. More specifically, it is unclear if clinical reasoning is better acquired if the instructor's input occurs entirely after or is integrated during the scenario. Based on educational principles of the dual-process theory of clinical reasoning, a new simulation approach called simulation with iterative discussions (SID) is introduced. The instructor interrupts the flow of the scenario at three key moments of the reasoning process (data gathering, integration, and confirmation). After each stop, the scenario is continued where it was interrupted. Finally, a brief general debriefing ends the session. The System-1 process of clinical reasoning is assessed by verbalization during management of the case, and System-2 during the iterative discussions, without providing feedback. Objective The aim of this study is to evaluate the effectiveness of simulation with iterative discussions versus the classical approach of simulation in developing the reasoning skills of General Pediatrics and Neonatal-Perinatal Medicine residents. Methods This will be a prospective exploratory, randomized study conducted at Sainte-Justine hospital in Montreal, QC, between January and March 2016. All post-graduate year (PGY) 1 to 6 residents will be invited to complete one 30-minute, audio-video-recorded, complex high-fidelity simulation (SID or classical) covering a similar neonatology topic. Pre- and post-simulation questionnaires will be completed and a semistructured interview will be conducted after each simulation. Data analyses will use SPSS and NVivo software. Results This study is in its preliminary stages and the results are expected to be made available by April, 2016. Conclusions

  15. Assessing Cellulase Performance on Pretreated Lignocellulosic Biomass Using Saccharification and Fermentation-Based Protocols

    NASA Astrophysics Data System (ADS)

    Dowe, Nancy

Cellulase enzyme is a key cost component in the production of fuels and chemicals from lignocellulosic biomass. The cellulolytic ability of an enzyme preparation is often measured by activity assays using model substrates such as filter paper. Using lignocellulosic biomass as the substrate to assess enzyme performance has the potential of being more process relevant. We describe two procedures that use washed pretreated cellulosic material to measure the efficacy of cellulase enzymes. The first is a saccharification assay that measures glucose yield as a function of the amount of cellulase used in the process; the second, a simultaneous saccharification and fermentation (SSF) assay, measures cellulase performance by the amount of ethanol produced from enzymatic hydrolysis of the cellulosic material. Both assays can be used to screen cellulases under a variety of substrate types, loadings, and process conditions.
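    A minimal sketch of the glucose-yield calculation such a saccharification assay implies. The enzyme loadings and glucose readings below are invented for illustration; the 1.111 factor is the standard conversion from cellulose (anhydroglucose, 162 g/mol) to glucose (180 g/mol) used to compute the theoretical maximum.

```python
def glucose_yield_percent(glucose_released_g, cellulose_loaded_g):
    """Percent of theoretical glucose yield from enzymatic hydrolysis."""
    theoretical_glucose = cellulose_loaded_g * 1.111  # anhydro correction
    return 100.0 * glucose_released_g / theoretical_glucose

# Hypothetical yield as a function of enzyme loading
# (filter paper units per g cellulose -> g glucose released per g cellulose).
measurements = {5: 0.42, 10: 0.67, 20: 0.89, 40: 1.02}
for fpu, glucose in measurements.items():
    pct = glucose_yield_percent(glucose, 1.0)
    print(f"{fpu:>3} FPU/g: {pct:5.1f}% of theoretical")
```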

  16. Assessment of fully automated antibody homology modeling protocols in molecular operating environment.

    PubMed

    Maier, Johannes K X; Labute, Paul

    2014-08-01

The success of antibody-based drugs has led to an increased demand for predictive computational tools to assist antibody engineering efforts surrounding the six hypervariable loop regions making up the antigen binding site. Accurate computational modeling of isolated protein loop regions can be quite difficult; consequently, modeling an antigen binding site that includes six loops is particularly challenging. In this work, we present a method for automatic modeling of the FV region of an immunoglobulin based upon the use of a precompiled antibody X-ray structure database, which serves as a source of framework and hypervariable region structural templates that are grafted together. We applied this method (on common desktop hardware) to the Second Antibody Modeling Assessment (AMA-II) target structures, along with an experimental specialized CDR-H3 loop modeling method. The results of the computational structure predictions are presented and discussed. PMID:24715627

  17. A multi-parametric assessment of decontamination protocols for the subglacial Lake Ellsworth probe.

    PubMed

    Magiopoulos, I; McQuillan, J S; Burd, C L; Mowlem, M; Tsaloglou, M-N

    2016-04-01

    Direct measurement and sampling of pristine environments, such as subglacial lakes, without introducing contaminating microorganisms and biomolecules from the surface, represents a significant engineering and microbiological challenge. In this study, we compare methods for decontamination of titanium grade 5 surfaces, the material extensively used to construct a custom-made probe for reaching, measuring and sampling subglacial Lake Ellsworth in West Antarctica. Coupons of titanium were artificially contaminated with Pseudomonas fluorescens bacteria and then exposed to a number of decontamination procedures. The most effective sterilants were (i) hydrogen peroxide vapour, and (ii) Biocleanse™, a commercially available, detergent-based biocidal solution. After each decontamination procedure the bacteria were incapable of proliferation, and showed no evidence of metabolic activity based on the generation of adenosine triphosphate (ATP). The use of ultraviolet irradiation or ethyl alcohol solution was comparatively ineffective for sterilisation. Hydrogen peroxide vapour and ultraviolet irradiation, which directly damage nucleic acids, were the most effective methods for removing detectable DNA, which was measured using 16S rRNA gene copy number and fluorescence-based total DNA quantification. Our results have not only been used to tailor the Ellsworth probe decontamination process, but also hold value for subsequent engineering projects, where high standards of decontamination are required. PMID:26892386

  18. Assessing the Immunogenic Response of a Single Center's Pneumococcal Vaccination Protocol in Sickle Cell Disease.

    PubMed

    Santoro, Jonathan D; Myers, Leann; Kanter, Julie

    2016-04-01

    Sickle cell disease (SCD) is the most common inherited hematologic disorder in the United States. Patients with SCD are at increased risk of invasive pneumococcal disease and are reliant on both early penicillin prophylaxis and antipneumococcal vaccination for prevention of infection. Although studies examining vaccine response have demonstrated a drop-off of titer response after 3 years, an optimal vaccination regimen has not been identified. Our study sought to assess the immunogenicity of our center's pneumococcal vaccination strategy, which included Prevnar (PCV-7) (before the introduction of PCV-13) followed by Pneumovax (PPV-23) given routinely at 2 and 5 years of age and then every 5 years thereafter. Our goal was to assess vaccine response in a population of patients with SCD who had received vaccines according to this regimen using multiplex bead analysis. Our study demonstrated a significant percentage of persons with SCD do not maintain a sufficient vaccination response to PPV-23 for 5 years. Our study revealed that only 36% of patients had protective levels of antipneumococcal antibody titers at an average of 37 months after vaccination. Most alarmingly, within the group of patients with subtherapeutic titers, 64% demonstrated vaccine response to <25% of the tested serotypes. These findings were significantly associated with duration of time since last vaccine administration, but the mean age of lack of response was below the 3-year window where vaccine response was previously reported to wane. Our results indicate antipneumococcal immunity may not be optimally maintained using this vaccination strategy in patients with SCD leaving them vulnerable to invasive pneumococcal disease. Many pediatric hematologists stop prophylactic penicillin at 5 years of age making these results alarming. We recommend further investigation into an optimal vaccine schedule and monitoring of antipneumococcal titers in at-risk patients. PMID:26886376

  20. Comparative life cycle assessment of standard and green roofs.

    PubMed

    Saiz, Susana; Kennedy, Christopher; Bass, Brad; Pressnail, Kim

    2006-07-01

    Life cycle assessment (LCA) is used to evaluate the benefits, primarily from reduced energy consumption, resulting from the addition of a green roof to an eight story residential building in Madrid. Building energy use is simulated and a bottom-up LCA is conducted assuming a 50 year building life. The key property of a green roof is its low solar absorptance, which causes lower surface temperature, thereby reducing the heat flux through the roof. Savings in annual energy use are just over 1%, but summer cooling load is reduced by over 6% and reductions in peak hour cooling load in the upper floors reach 25%. By replacing the common flat roof with a green roof, environmental impacts are reduced by between 1.0 and 5.3%. Similar reductions might be achieved by using a white roof with additional insulation for winter, but more substantial reductions are achieved if common use of green roofs leads to reductions in the urban heat island.

  1. 7 CFR 319.40-11 - Plant pest risk assessment standards.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Wood Articles § 319.40-11 Plant pest risk assessment standards. When evaluating a request to import a... found on the bark; (ii) Plant pests found under the bark; and (iii) Plant pests found in the wood....

  2. 7 CFR 319.40-11 - Plant pest risk assessment standards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Unmanufactured Wood Articles § 319.40-11 Plant pest risk assessment standards. When evaluating a request to... found on the bark; (ii) Plant pests found under the bark; and (iii) Plant pests found in the wood....

  3. 7 CFR 319.40-11 - Plant pest risk assessment standards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Unmanufactured Wood Articles § 319.40-11 Plant pest risk assessment standards. When evaluating a request to... found on the bark; (ii) Plant pests found under the bark; and (iii) Plant pests found in the wood....

  4. Environmental assessment in support of proposed voluntary energy conservation standard for new residential buildings

    SciTech Connect

    Hadley, D.L.; Parker, G.B.; Callaway, J.W.; Marsh, S.J.; Roop, J.M.; Taylor, Z.T.

    1989-06-01

    The objective of this environmental assessment (EA) is to identify the potential environmental impacts that could result from the proposed voluntary residential standard (VOLRES) on private sector construction of new residential buildings. 49 refs., 15 tabs.

  5. Perioperative Standard Oral Nutrition Supplements Versus Immunonutrition in Patients Undergoing Colorectal Resection in an Enhanced Recovery (ERAS) Protocol: A Multicenter Randomized Clinical Trial (SONVI Study).

    PubMed

    Moya, Pedro; Soriano-Irigaray, Leticia; Ramirez, Jose Manuel; Garcea, Alessandro; Blasco, Olga; Blanco, Francisco Javier; Brugiotti, Carlo; Miranda, Elena; Arroyo, Antonio

    2016-05-01

    To compare immunonutrition versus standard high calorie nutrition in patients undergoing elective colorectal resection within an Enhanced Recovery After Surgery (ERAS) program.Despite progress in recent years in the surgical management of patients with colorectal cancer (ERAS programs), postoperative complications are frequent. Nutritional supplements enriched with immunonutrients have recently been introduced into clinical practice. However, the extent to which the combination of ERAS protocols and immunonutrition benefits patients undergoing colorectal cancer surgery is unknown.The SONVI study is a prospective, multicenter, randomized trial with 2 parallel treatment groups receiving either the study product (an immune-enhancing feed) or the control supplement (a hypercaloric hypernitrogenous supplement) for 7 days before colorectal resection and 5 days postoperatively.A total of 264 patients were randomized. At baseline, both groups were comparable in regards to age, sex, surgical risk, comorbidity, and analytical and nutritional parameters. The median length of the postoperative hospital stay was 5 days with no differences between the groups. A decrease in the total number of complications was observed in the immunonutrition group compared with the control group, primarily due to a significant decrease in infectious complications (23.8% vs. 10.7%, P = 0.0007). Of the infectious complications, wound infection differed significantly between the groups (16.4% vs. 5.7%, P = 0.0008). Other infectious complications were lower in the immunonutrition group but were not statistically significantly different.The implementation of ERAS protocols including immunonutrient-enriched supplements reduces the complications of patients undergoing colorectal resection.This study is registered with ClinicalTrial.gov: NCT02393976. PMID:27227930

  7. Safety and efficacy assessment of standardized herbal formula PM012

    PubMed Central

    2012-01-01

Background This study was conducted to evaluate the efficacy of the herbal formula PM012 in an Alzheimer's disease model, human presenilin 2 mutant transgenic mice (hPS2m), and also to evaluate the toxicity of PM012 in Sprague-Dawley rats after 4 or 26 weeks of repeated oral administration. Methods Spatial learning and memory capacities of hPS2m transgenic mice were evaluated using the Morris Water Maze. Simultaneously, PM012 was repeatedly administered orally to male and female SD rats (15/sex/group) at doses of 0 (vehicle control), 500, 1,000 and 2,000 mg/kg/day for 4 or 26 weeks. To evaluate the recovery potential, 5 animals of each sex were assigned to the vehicle control and 2,000 mg/kg/day groups during the 4-week recovery period. Results The results showed that PM012-treated hPS2m transgenic mice had significantly reduced escape latency when compared with untreated hPS2m transgenic mice. The repeated oral administration of PM012 over 26 weeks in male and female rats induced an increase, and an increasing trend, in thymus weight in the female treatment groups (main and recovery groups), but the change was judged to be toxicologically insignificant. In addition, the oral administration of the herbal medicine PM012 did not cause adverse effects as assessed by clinical signs, mortality, body weight, food and water consumption, ophthalmology, urinalysis, hematology, serum biochemistry, blood clotting time, organ weights and histopathology. The No Observed Adverse Effect Level of PM012 was determined to be 2,000 mg/kg/day for both sexes, and no target organ was identified. Conclusion These results suggest that PM012 has potential for use in the treatment of Alzheimer's disease without serious adverse effects. PMID:22458507

  8. Standardizing Assessment of Competences and Competencies of Oncology Nurses Working in Ambulatory Care.

    PubMed

    Beaver, Clara; Magnan, Morris A; Henderson, Denise; DeRose, Patricia; Carolin, Kathleen; Bepler, Gerold

    2016-01-01

    A nursing quality consortium standardized nursing practice across 17 independently functioning ambulatory oncology sites. Programs were developed to validate both competences and competencies. One program assessed nine competences needed to develop systems of care to detect and treat treatment-related side effects. A second program was developed to assess competencies needed to prevent harm to oncology patients. This manuscript describes a successful approach to standardizing nursing practice across geographically distant academic and community sites. PMID:26985750

  9. A protocol to assess insect resistance to heat waves, applied to bumblebees (Bombus Latreille, 1802).

    PubMed

    Martinet, Baptiste; Lecocq, Thomas; Smet, Jérémy; Rasmont, Pierre

    2015-01-01

Insect decline results from numerous interacting factors, including climate change. One of the major phenomena related to climate change is the increasing frequency of extreme events such as heat waves. Since heat waves are suspected to dramatically increase insect mortality, there is an urgent need to assess their potential impact. Here, we determined and compared the resistance of insects to heat waves under hyperthermic stress through their time before heat stupor (THS) when exposed to an extreme temperature (40°C). For this, we used a new experimental standardised device usable in the field or at locations close to the field collection sites. We applied this approach to different Arctic, Boreo-Alpine and Widespread bumblebee species in order to predict the consequences of heat waves. Our results show a heat resistance gradient: species with a centred Arctic distribution are less resistant to heat stress than the more widely distributed Boreo-Alpine species, which are themselves less resistant than the ubiquitous species.
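    The gradient reported above amounts to ranking species groups by their mean time before heat stupor (THS) at 40°C. The readings below are invented for illustration, not data from the study.

```python
import statistics

# Hypothetical THS readings (minutes at 40 °C) per distribution group.
ths_minutes = {
    "arctic": [8, 10, 9, 11],
    "boreo-alpine": [18, 22, 20, 19],
    "widespread": [35, 40, 38, 37],
}

# Mean THS per group, then rank from least to most heat-resistant.
means = {group: statistics.mean(v) for group, v in ths_minutes.items()}
ranked = sorted(means, key=means.get)

print(" < ".join(f"{g} ({means[g]:.1f} min)" for g in ranked))
```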

  10. Study Protocol on Intentional Distortion in Personality Assessment: Relationship with Test Format, Culture, and Cognitive Ability.

    PubMed

    Van Geert, Eline; Orhon, Altan; Cioca, Iulia A; Mamede, Rui; Golušin, Slobodan; Hubená, Barbora; Morillo, Daniel

    2016-01-01

Self-report personality questionnaires, traditionally offered in a graded-scale format, are widely used in high-stakes contexts such as job selection. However, job applicants may intentionally distort their answers when filling in these questionnaires, undermining the validity of the test results. Forced-choice questionnaires are allegedly more resistant to intentional distortion compared to graded-scale questionnaires, but they generate ipsative data. Ipsativity violates the assumptions of classical test theory, distorting the reliability and construct validity of the scales, and producing interdependencies among the scores. This limitation is overcome in the current study by using the recently developed Thurstonian item response theory model. As online testing in job selection contexts is increasing, the focus will be on the impact of intentional distortion on personality questionnaire data collected online. The present study intends to examine the effect of three different variables on intentional distortion: (a) test format (graded-scale versus forced-choice); (b) culture, as data will be collected in three countries differing in their attitudes toward intentional distortion (the United Kingdom, Serbia, and Turkey); and (c) cognitive ability, as a possible predictor of the ability to choose the more desirable responses. Furthermore, we aim to integrate the findings using a comprehensive model of intentional distortion. In the Anticipated Results section, three main aspects are considered: (a) the limitations of the manipulation, theoretical approach, and analyses employed; (b) practical implications for job selection and for personality assessment in a broader sense; and (c) suggestions for further research. PMID:27445902

  11. Study Protocol on Intentional Distortion in Personality Assessment: Relationship with Test Format, Culture, and Cognitive Ability

    PubMed Central

    Van Geert, Eline; Orhon, Altan; Cioca, Iulia A.; Mamede, Rui; Golušin, Slobodan; Hubená, Barbora; Morillo, Daniel

    2016-01-01

    Self-report personality questionnaires, traditionally offered in a graded-scale format, are widely used in high-stakes contexts such as job selection. However, job applicants may intentionally distort their answers when filling in these questionnaires, undermining the validity of the test results. Forced-choice questionnaires are allegedly more resistant to intentional distortion compared to graded-scale questionnaires, but they generate ipsative data. Ipsativity violates the assumptions of classical test theory, distorting the reliability and construct validity of the scales, and producing interdependencies among the scores. This limitation is overcome in the current study by using the recently developed Thurstonian item response theory model. As online testing in job selection contexts is increasing, the focus will be on the impact of intentional distortion on personality questionnaire data collected online. The present study intends to examine the effect of three different variables on intentional distortion: (a) test format (graded-scale versus forced-choice); (b) culture, as data will be collected in three countries differing in their attitudes toward intentional distortion (the United Kingdom, Serbia, and Turkey); and (c) cognitive ability, as a possible predictor of the ability to choose the more desirable responses. Furthermore, we aim to integrate the findings using a comprehensive model of intentional distortion. In the Anticipated Results section, three main aspects are considered: (a) the limitations of the manipulation, theoretical approach, and analyses employed; (b) practical implications for job selection and for personality assessment in a broader sense; and (c) suggestions for further research. PMID:27445902

  13. A Protocol to Assess Insect Resistance to Heat Waves, Applied to Bumblebees (Bombus Latreille, 1802)

    PubMed Central

    Martinet, Baptiste; Lecocq, Thomas; Smet, Jérémy; Rasmont, Pierre

    2015-01-01

Insect decline results from numerous interacting factors, including climate change. One of the major phenomena related to climate change is the increasing frequency of extreme events such as heat waves. Since heat waves are suspected to dramatically increase insect mortality, there is an urgent need to assess their potential impact. Here, we determined and compared the resistance of insects to heat waves under hyperthermic stress through their time before heat stupor (THS) when exposed to an extreme temperature (40°C). For this, we used a new experimental standardised device usable in the field or at locations close to the field collection sites. We applied this approach to different Arctic, Boreo-Alpine and Widespread bumblebee species in order to predict the consequences of heat waves. Our results show a heat resistance gradient: species with a centred Arctic distribution are less resistant to heat stress than the more widely distributed Boreo-Alpine species, which are themselves less resistant than the ubiquitous species. PMID:25738862

  14. Standards-Based Curriculum, Differentiated Instruction, and End of Course Assessments

    ERIC Educational Resources Information Center

    Hartnell, Benjamin Jeffry

    2011-01-01

    Differentiated instruction, standards-based curriculum, and end of course assessments (ECAs) are not mandated in most high schools across the United States. As such, classroom grades do not accurately reflect district report cards. In particular, grades at the study site, a suburban high school, do not show the specific standards and benchmarks…

  15. Convergence or Divergence: Alignment of Standards, Assessment, and Issues of Diversity.

    ERIC Educational Resources Information Center

    Carter, Norvella, Ed.

    In this report, teacher educators scrutinize the relationships between the standards and assessment movement in education and the United States' increasingly multicultural population. The papers include: "Foreword" (Jacqueline Jordan Irvine); (1) "Diversity and Standards: Defining the Issues" (Norvella P. Carter); (2) "Accountability and…

  16. Performance Assessment and Renewing Teacher Education: The Possibilities of the NBPTS Standards

    ERIC Educational Resources Information Center

    Galluzzo, Gary R.

    2005-01-01

    …simple wedge with its tip made of durable, clear, and appropriate teaching standards forged in the furnace of science and theory widens to bring the materials and processes of art and practice to bear on the work of strengthening and renewing all of teaching by applying the national board's standards and assessment processes. With this model,…

  17. INEE Minimum Standards: A Tool for Education Quality Assessment in Afghan Refugee Schools in Pakistan

    ERIC Educational Resources Information Center

    Qahir, Katayon

    2007-01-01

    This article details a pilot Minimum Standards assessment in Afghan refugee schools supported by the International Rescue Committee's Female Education Program in the North West Frontier Province of Pakistan. A set of specifically selected, contextualized indicators, based on the global INEE Minimum Standards, served as a tool for teachers and…

  18. Child Development Functionality Assessment Guide: Standards and Requirements for Developing Most Efficient Organizations.

    ERIC Educational Resources Information Center

    Department of the Army, Washington, DC.

    As part of its cost containment efforts, the U.S. Navy continues to evaluate its child development program to expand availability without compromising the high quality standards required by the 1989 Military Child Care Act. This manual provides guidelines for conducting Functionality Assessments (FA) and delineates the standards and requirements…

  19. Smoothed Standardization Assessment of Testlet Level DIF on a Math Free-Response Item Type.

    ERIC Educational Resources Information Center

    Lyu, C. Felicia; And Others

    A smoothed version of standardization, which merges kernel smoothing with the traditional standardization differential item functioning (DIF) approach, was used to examine DIF for student-produced response (SPR) items on the Scholastic Assessment Test (SAT) I mathematics test at both the item and testlet levels. This nonparametric technique avoids…
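    The smoothed standardization approach the abstract names can be sketched in a few lines (assumptions: Gaussian-kernel Nadaraya-Watson smoothing of the per-score proportions correct, focal-group score distribution as weights, in the spirit of the Dorans-Kulick standardization index; the data are simulated, not SAT responses).

    ```python
    # Sketch of kernel-smoothed standardization DIF on simulated data.
    import numpy as np

    rng = np.random.default_rng(1)

    def kernel_smooth(scores, correct, grid, bandwidth):
        """Nadaraya-Watson estimate of P(correct | matching score) on a score grid."""
        w = np.exp(-0.5 * ((grid[:, None] - scores[None, :]) / bandwidth) ** 2)
        return (w * correct[None, :]).sum(axis=1) / w.sum(axis=1)

    # Simulated matching scores (10-60) and studied-item responses for two groups;
    # the item is made relatively harder for the focal group (shift of -3).
    true_p = lambda s, shift: 1 / (1 + np.exp(-(s - 35 + shift) / 6))
    ref_scores = rng.integers(10, 61, size=2000)
    foc_scores = rng.integers(10, 61, size=800)
    ref_correct = (rng.random(2000) < true_p(ref_scores, 0)).astype(float)
    foc_correct = (rng.random(800) < true_p(foc_scores, -3)).astype(float)

    grid = np.arange(10, 61)
    p_ref = kernel_smooth(ref_scores, ref_correct, grid, bandwidth=3.0)
    p_foc = kernel_smooth(foc_scores, foc_correct, grid, bandwidth=3.0)

    # Standardization index: weight the smoothed focal-reference gap by the
    # focal group's score distribution.
    weights = np.array([(foc_scores == s).mean() for s in grid])
    std_p_dif = float((weights * (p_foc - p_ref)).sum())
    print(f"STD-P-DIF = {std_p_dif:.3f}")  # negative here: item disadvantages the focal group
    ```

    Smoothing the conditional proportions before differencing is what stabilises the estimate at sparsely populated score levels, which is the nonparametric advantage the abstract alludes to.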

  20. Interpretation of Standards with Bloom's Revised Taxonomy: A Comparison of Teachers and Assessment Experts

    ERIC Educational Resources Information Center

    Nasstrom, Gunilla

    2009-01-01

    In education, standards have to be interpreted, for planning of teaching, for development of assessments and for alignment analysis. In most cases, it is important that there is an agreement between individuals and organizations about how to interpret standards. However, there is a lack of studies of how consistent different groups of judges are…