25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2012 CFR
2012-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2014 CFR
2014-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2013 CFR
2013-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2011 CFR
2011-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2010 CFR
2010-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
A review of the latest guidelines for NIBP device validation.
Alpert, Bruce S; Quinn, David E; Friedman, Bruce A
2013-12-01
The current ISO Standard is accepted as the National Standard in almost every industrialized nation. An overview of the most recently adopted standards is provided. Standards writing groups including the Advancement of Medical Instrumentation Sphygmomanometer Committee and ISO JWG7 are working to expand standardized evaluation methods to include the evaluation of devices intended for use in environments where motion artifact is common. An Association for the Advancement of Medical Instrumentation task group on noninvasive blood pressure measurement in the presence of motion artifact has published a technical information report containing research and standardized methods for the evaluation of blood pressure device performance in the presence of motion artifact.
Methodology issues in implementation science.
Newhouse, Robin; Bobay, Kathleen; Dykes, Patricia C; Stevens, Kathleen R; Titler, Marita
2013-04-01
Putting evidence into practice at the point of care delivery requires an understanding of implementation strategies that work, in what context and how. To identify methodological issues in implementation science using 4 studies as cases and make recommendations for further methods development. Four cases are presented and methodological issues identified. For each issue raised, evidence on the state of the science is described. Issues in implementation science identified include diverse conceptual frameworks, potential weaknesses in pragmatic study designs, and the paucity of standard concepts and measurement. Recommendations to advance methods in implementation include developing a core set of implementation concepts and metrics, generating standards for implementation methods including pragmatic trials, mixed methods designs, complex interventions and measurement, and endorsing reporting standards for implementation studies.
Test methods for optical disk media characteristics (for 356 mm ruggedized magneto-optic media)
NASA Technical Reports Server (NTRS)
Podio, Fernando L.
1991-01-01
Standard test methods for computer storage media characteristics are essential and allow for conformance to media interchange standards. The test methods were developed for 356 mm two-sided laminated glass substrate with a magneto-optic active layer media technology. These test methods may be used for testing other media types, but in each case their applicability must be evaluated. Test methods are included for a series of different media characteristics, including operational, nonoperational, and storage environments; mechanical and physical characteristics; and substrate, recording layer, and preformat characteristics. Tests for environmental qualification and media lifetimes are also included. The best methods include testing conditions, testing procedures, a description of the testing setup, and the required calibration procedures.
Instructional Basics: Oppelt Standard Method of Therapeutic and Recreational Ice Skating.
ERIC Educational Resources Information Center
Oppelt, Kurt
Detailed in the booklet is the standard ice skating method and considered are the benefits of therapeutic ice skating for the handicapped and aged. Values for the mentally retarded and physically handicapped are seen to include physiological (such as increased flexibility and improved posture), psychological (including satifaction and enhanced…
Setting Standards for Minimum Competency Tests.
ERIC Educational Resources Information Center
Mehrens, William A.
Some general questions about minimum competency tests are discussed, and various methods of setting standards are reviewed with major attention devoted to those methods used for dichotomizing a continuum. Methods reviewed under the heading of Absolute Judgments of Test Content include Nedelsky's, Angoff's, Ebel's, and Jaeger's. These methods are…
ERIC Educational Resources Information Center
Coester, Lee Anne
2010-01-01
This study was designed to gather input from early career elementary teachers with the goal of finding ways to improve elementary mathematics methods courses. Multiple areas were explored including the degree to which respondents' elementary mathematics methods course focused on the NCTM Process Standards, the teachers' current standards-based…
ERIC Educational Resources Information Center
Wood, Timothy J.; Humphrey-Murto, Susan M.; Norman, Geoffrey R.
2006-01-01
When setting standards, administrators of small-scale OSCEs often face several challenges, including a lack of resources, a lack of available expertise in statistics, and difficulty in recruiting judges. The Modified Borderline-Group Method is a standard setting procedure that compensates for these challenges by using physician examiners and is…
Mobile Robot and Mobile Manipulator Research Towards ASTM Standards Development.
Bostelman, Roger; Hong, Tsai; Legowik, Steven
2016-01-01
Performance standards for industrial mobile robots and mobile manipulators (robot arms onboard mobile robots) have only recently begun development. Low cost and standardized measurement techniques are needed to characterize system performance, compare different systems, and to determine if recalibration is required. This paper discusses work at the National Institute of Standards and Technology (NIST) and within the ASTM Committee F45 on Driverless Automatic Guided Industrial Vehicles. This includes standards for both terminology, F45.91, and for navigation performance test methods, F45.02. The paper defines terms that are being considered. Additionally, the paper describes navigation test methods that are near ballot and docking test methods being designed for consideration within F45.02. This includes the use of low cost artifacts that can provide alternatives to using relatively expensive measurement systems.
Mobile Robot and Mobile Manipulator Research Towards ASTM Standards Development
Bostelman, Roger; Hong, Tsai; Legowik, Steven
2017-01-01
Performance standards for industrial mobile robots and mobile manipulators (robot arms onboard mobile robots) have only recently begun development. Low cost and standardized measurement techniques are needed to characterize system performance, compare different systems, and to determine if recalibration is required. This paper discusses work at the National Institute of Standards and Technology (NIST) and within the ASTM Committee F45 on Driverless Automatic Guided Industrial Vehicles. This includes standards for both terminology, F45.91, and for navigation performance test methods, F45.02. The paper defines terms that are being considered. Additionally, the paper describes navigation test methods that are near ballot and docking test methods being designed for consideration within F45.02. This includes the use of low cost artifacts that can provide alternatives to using relatively expensive measurement systems. PMID:28690359
40 CFR 600.512-12 - Model year report.
Code of Federal Regulations, 2013 CFR
2013-07-01
... CFR parts 531 or 533 as applicable, and the applicable fleet average CO2 emission standards. Model... standards. Model year reports shall include a statement that the method of measuring vehicle track width... models and the applicable in-use CREE emission standard. The list of models shall include the applicable...
40 CFR 600.512-12 - Model year report.
Code of Federal Regulations, 2012 CFR
2012-07-01
... CFR parts 531 or 533 as applicable, and the applicable fleet average CO2 emission standards. Model... standards. Model year reports shall include a statement that the method of measuring vehicle track width... models and the applicable in-use CREE emission standard. The list of models shall include the applicable...
40 CFR 600.512-12 - Model year report.
Code of Federal Regulations, 2014 CFR
2014-07-01
... CFR parts 531 or 533 as applicable, and the applicable fleet average CO2 emission standards. Model... standards. Model year reports shall include a statement that the method of measuring vehicle track width... models and the applicable in-use CREE emission standard. The list of models shall include the applicable...
Will the "Real" Proficiency Standard Please Stand Up?
ERIC Educational Resources Information Center
Baron, Joan Boykoff; And Others
Connecticut's experience with four different standard-setting methods regarding multiple choice proficiency tests is described. The methods include Angoff, Nedelsky, Borderline Group, and Contrasting Groups Methods. All Connecticut ninth graders were administered proficiency tests in reading, language arts, and mathematics. As soon as final test…
Optimal Multicomponent Analysis Using the Generalized Standard Addition Method.
ERIC Educational Resources Information Center
Raymond, Margaret; And Others
1983-01-01
Describes an experiment on the simultaneous determination of chromium and magnesium by spectophotometry modified to include the Generalized Standard Addition Method computer program, a multivariate calibration method that provides optimal multicomponent analysis in the presence of interference and matrix effects. Provides instructions for…
Improved lossless intra coding for H.264/MPEG-4 AVC.
Lee, Yung-Lyul; Han, Ki-Hun; Sullivan, Gary J
2006-09-01
A new lossless intra coding method based on sample-by-sample differential pulse code modulation (DPCM) is presented as an enhancement of the H.264/MPEG-4 AVC standard. The H.264/AVC design includes a multidirectional spatial prediction method to reduce spatial redundancy by using neighboring samples as a prediction for the samples in a block of data to be encoded. In the new lossless intra coding method, the spatial prediction is performed based on samplewise DPCM instead of in the block-based manner used in the current H.264/AVC standard, while the block structure is retained for the residual difference entropy coding process. We show that the new method, based on samplewise DPCM, does not have a major complexity penalty, despite its apparent pipeline dependencies. Experiments show that the new lossless intra coding method reduces the bit rate by approximately 12% in comparison with the lossless intra coding method previously included in the H.264/AVC standard. As a result, the new method is currently being adopted into the H.264/AVC standard in a new enhancement project.
Data Friction Meets Social Friction: Challenges for standardization in emerging fields of geoscience
NASA Astrophysics Data System (ADS)
Darch, P. T.
2017-12-01
Many interdisciplinary endeavors in the geosciences occur in emergent scientific fields. These fields are often characterized by heterogeneity of methods for production and collection of data, and by data scarcity. This paper presents findings about processes of methods standardization from a long-term case study of an emergent, data-scarce field, the deep subseafloor biosphere. Researchers come from many physical and life science backgrounds to study interactions between microbial life in the seafloor and the physical environment they inhabit. Standardization of methods for collecting data promises multiple benefits to this field, including: Addressing data scarcity through enabling greater data reuse and promoting better interoperability with large scale infrastructures; Fostering stronger collaborative links between researchers distributed across institutions and backgrounds. Ongoing standardization efforts in the field do not only involve scientific judgments about which among a range of methods is most efficient, least biased, or most reliable. Instead, these efforts also encounter multiple difficult social challenges, including: Lack of agreed upon criteria about how to judge competing methods: should efficiency, bias, or reliability take priority?; Lack of resources to carry out the work necessary to determine standards, particularly acute in emergent fields; Concerns that standardization is premature in such a new field, foreclosing the possibility of better methods being developed in the future; Concerns that standardization could prematurely shut down important scientific debates; Concerns among some researchers that their own work may become obsolete should the methods chosen as standard be different from their own. The success of these standardization efforts will depend on addressing both scientific and social dimensions, to ensure widespread acceptance among researchers in the field.
[Sampling methods for PM2.5 from stationary sources: a review].
Jiang, Jing-Kun; Deng, Jian-Guo; Li, Zhen; Li, Xing-Hua; Duan, Lei; Hao, Ji-Ming
2014-05-01
The new China national ambient air quality standard has been published in 2012 and will be implemented in 2016. To meet the requirements in this new standard, monitoring and controlling PM2,,5 emission from stationary sources are very important. However, so far there is no national standard method on sampling PM2.5 from stationary sources. Different sampling methods for PM2.5 from stationary sources and relevant international standards were reviewed in this study. It includes the methods for PM2.5 sampling in flue gas and the methods for PM2.5 sampling after dilution. Both advantages and disadvantages of these sampling methods were discussed. For environmental management, the method for PM2.5 sampling in flue gas such as impactor and virtual impactor was suggested as a standard to determine filterable PM2.5. To evaluate environmental and health effects of PM2.5 from stationary sources, standard dilution method for sampling of total PM2.5 should be established.
[Overview and prospect of syndrome differentiation of hypertension in traditional Chinese medicine].
Yang, Xiao-Chen; Xiong, Xing-Jiang; Wang, Jie
2014-01-01
This article is to overview the literature of syndrome differentiation of traditional Chinese medicine on hypertension. According to the theory of disease in combination with syndrome, we concluded syndrome types of hypertension in four aspects, including national standards, industry standards, teaching standards and personal experience. Meanwhile, in order to provide new methods and approaches for normalized research, we integrated modern testing methods and statistical methods to analyze syndrome differentiation for the treatment of hypertension.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-06-01
The bibliography contains citations concerning standards and standard tests for water quality in drinking water sources, reservoirs, and distribution systems. Standards from domestic and international sources are presented. Glossaries and vocabularies that concern water quality analysis, testing, and evaluation are included. Standard test methods for individual elements, selected chemicals, sensory properties, radioactivity, and other chemical and physical properties are described. Discussions for proposed standards on new pollutant materials are briefly considered. (Contains a minimum of 203 citations and includes a subject term index and title list.)
Automated installation methods for photovoltaic arrays
NASA Astrophysics Data System (ADS)
Briggs, R.; Daniels, A.; Greenaway, R.; Oster, J., Jr.; Racki, D.; Stoeltzing, R.
1982-11-01
Since installation expenses constitute a substantial portion of the cost of a large photovoltaic power system, methods for reduction of these costs were investigated. The installation of the photovoltaic arrays includes all areas, starting with site preparation (i.e., trenching, wiring, drainage, foundation installation, lightning protection, grounding and installation of the panel) and concluding with the termination of the bus at the power conditioner building. To identify the optimum combination of standard installation procedures and automated/mechanized techniques, the installation process was investigated including the equipment and hardware available, the photovoltaic array structure systems and interfaces, and the array field and site characteristics. Preliminary designs of hardware for both the standard installation method, the automated/mechanized method, and a mix of standard installation procedures and mechanized procedures were identified to determine which process effectively reduced installation costs. In addition, costs associated with each type of installation method and with the design, development and fabrication of new installation hardware were generated.
Hypothesis Testing Using Factor Score Regression
Devlieger, Ines; Mayer, Axel; Rosseel, Yves
2015-01-01
In this article, an overview is given of four methods to perform factor score regression (FSR), namely regression FSR, Bartlett FSR, the bias avoiding method of Skrondal and Laake, and the bias correcting method of Croon. The bias correcting method is extended to include a reliable standard error. The four methods are compared with each other and with structural equation modeling (SEM) by using analytic calculations and two Monte Carlo simulation studies to examine their finite sample characteristics. Several performance criteria are used, such as the bias using the unstandardized and standardized parameterization, efficiency, mean square error, standard error bias, type I error rate, and power. The results show that the bias correcting method, with the newly developed standard error, is the only suitable alternative for SEM. While it has a higher standard error bias than SEM, it has a comparable bias, efficiency, mean square error, power, and type I error rate. PMID:29795886
Issues concerning international comparison of free-field calibrations of acoustical standards
NASA Astrophysics Data System (ADS)
Nedzelnitsky, Victor
2002-11-01
Primary free-field calibrations of laboratory standard microphones by the reciprocity method establish these microphones as reference standard devices for calibrating working standard microphones, other measuring microphones, and practical instruments such as sound level meters and personal sound exposure meters (noise dosimeters). These primary, secondary, and other calibrations are indispensable to the support of regulatory requirements, standards, and product characterization and quality control procedures important for industry, commerce, health, and safety. International Electrotechnical Commission (IEC) Technical Committee 29 Electroacoustics produces international documentary standards, including standards for primary and secondary free-field calibration and measurement procedures and their critically important application to practical instruments. This paper addresses some issues concerning calibrations, standards activities, and the international key comparison of primary free-field calibrations of IEC-type LS2 laboratory standard microphones that is being planned by the Consultative Committee for Acoustics, Ultrasound, and Vibration (CCAUV) of the International Committee for Weights and Measures (CIPM). This comparison will include free-field calibrations by the reciprocity method at participating major national metrology laboratories throughout the world.
48 CFR 315.7000 - Section 508 accessibility standards.
Code of Federal Regulations, 2010 CFR
2010-10-01
... CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Acquisition of Electronic Information Technology 315.7000 Section 508 accessibility standards. EIT products and services, including EIT...
Liu, Shu-Yu; Hu, Chang-Qin
2007-10-17
This study introduces the general method of quantitative nuclear magnetic resonance (qNMR) for the calibration of reference standards of macrolide antibiotics. Several qNMR experimental conditions were optimized including delay, which is an important parameter of quantification. Three kinds of macrolide antibiotics were used to validate the accuracy of the qNMR method by comparison with the results obtained by the high performance liquid chromatography (HPLC) method. The purities of five common reference standards of macrolide antibiotics were measured by the 1H qNMR method and the mass balance method, respectively. The analysis results of the two methods were compared. The qNMR is quick and simple to use. In a new medicine research and development process, qNMR provides a new and reliable method for purity analysis of the reference standard.
Calibrated permeation standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dameron, Arrelaine A.; Reese, Matthew O.; Kempe, Michael D.
2017-11-21
A permeation standard is provided. The permeation standard may include a substrate that is impermeable to an analyte, an orifice disposed in the substrate, and a permeable material filling the orifice. The orifice and the permeable material are configured to provide a predetermined transmission rate of the analyte through the permeation standard. Also provided herein are methods for forming the permeation standard.
Qualitative Analysis on Stage: Making the Research Process More Public.
ERIC Educational Resources Information Center
Anfara, Vincent A., Jr.; Brown, Kathleen M.
The increased use of qualitative research methods has spurred interest in developing formal standards for assessing its validity. These standards, however, fall short if they do not include public disclosure of methods as a criterion. The researcher must be accountable in documenting the actions associated with establishing internal validity…
Simplified Laboratory Procedures for Wastewater Examination. Second Edition.
ERIC Educational Resources Information Center
Water Pollution Control Federation, Washington, DC.
This booklet is for wastewater treatment plant operators who find it difficult to follow the detailed discussions and procedures found in "Standard Methods for the Examination of Water and Wastewater." It is intended to be used with "Standard Methods" available for reference. Included in this publication are chapters on…
Zietze, Stefan; Müller, Rainer H; Brecht, René
2008-03-01
In order to set up a batch-to-batch-consistency analytical scheme for N-glycosylation analysis, several sample preparation steps including enzyme digestions and fluorophore labelling and two HPLC-methods were established. The whole method scheme was standardized, evaluated and validated according to the requirements on analytical testing in early clinical drug development by usage of a recombinant produced reference glycoprotein (RGP). The standardization of the methods was performed by clearly defined standard operation procedures. During evaluation of the methods, the major interest was in the loss determination of oligosaccharides within the analytical scheme. Validation of the methods was performed with respect to specificity, linearity, repeatability, LOD and LOQ. Due to the fact that reference N-glycan standards were not available, a statistical approach was chosen to derive accuracy from the linearity data. After finishing the validation procedure, defined limits for method variability could be calculated and differences observed in consistency analysis could be separated into significant and incidental ones.
Huang, Yande; Su, Bao-Ning; Ye, Qingmei; Palaniswamy, Venkatapuram A; Bolgar, Mark S; Raglione, Thomas V
2014-01-01
The classical internal standard quantitative NMR (qNMR) method determines the purity of an analyte by the determination of a solution containing the analyte and a standard. Therefore, the standard must meet the requirements of chemical compatibility and lack of resonance interference with the analyte as well as a known purity. The identification of such a standard can be time consuming and must be repeated for each analyte. In contrast, the external standard qNMR method utilizes a standard with a known purity to calibrate the NMR instrument. The external standard and the analyte are measured separately, thereby eliminating the matter of chemical compatibility and resonance interference between the standard and the analyte. However, the instrumental factors, including the quality of NMR tubes, must be kept the same. Any deviations will compromise the accuracy of the results. An innovative qNMR method reported herein utilizes an internal reference substance along with an external standard to assume the role of the standard used in the traditional internal standard qNMR method. In this new method, the internal reference substance must only be chemically compatible and be free of resonance-interference with the analyte or external standard whereas the external standard must only be of a known purity. The exact purity or concentration of the internal reference substance is not required as long as the same quantity is added to the external standard and the analyte. The new method reduces the burden of searching for an appropriate standard for each analyte significantly. Therefore the efficiency of the qNMR purity assay increases while the precision of the internal standard method is retained. Copyright © 2013 Elsevier B.V. All rights reserved.
HPLC analysis and standardization of Brahmi vati – An Ayurvedic poly-herbal formulation
Mishra, Amrita; Mishra, Arun K.; Tiwari, Om Prakash; Jha, Shivesh
2013-01-01
Objectives The aim of the present study was to standardize Brahmi vati (BV) by simultaneous quantitative estimation of Bacoside A3 and Piperine adopting HPLC–UV method. BV very important Ayurvedic polyherbo formulation used to treat epilepsy and mental disorders containing thirty eight ingredients including Bacopa monnieri L. and Piper longum L. Materials and methods An HPLC–UV method was developed for the standardization of BV in light of simultaneous quantitative estimation of Bacoside A3 and Piperine, the major constituents of B. monnieri L. and P. longum L. respectively. The developed method was validated on parameters including linearity, precision, accuracy and robustness. Results The HPLC analysis showed significant increase in amount of Bacoside A3 and Piperine in the in-house sample of BV when compared with all three different marketed samples of the same. Results showed variations in the amount of Bacoside A3 and Piperine in different samples which indicate non-uniformity in their quality which will lead to difference in their therapeutic effects. Conclusion The outcome of the present investigation underlines the importance of standardization of Ayurvedic formulations. The developed method may be further used to standardize other samples of BV or other formulations containing Bacoside A3 and Piperine. PMID:24396246
A thioacidolysis method tailored for higher‐throughput quantitative analysis of lignin monomers
Foster, Cliff; Happs, Renee M.; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F.
2016-01-01
Abstract Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β‐O‐4 linkages. Current thioacidolysis methods are low‐throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non‐chlorinated organic solvent and is tailored for higher‐throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1–2 mg of biomass per assay and has been quantified using fast‐GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day‐to‐day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses. PMID:27534715
HPLC analysis and standardization of Brahmi vati - An Ayurvedic poly-herbal formulation.
Mishra, Amrita; Mishra, Arun K; Tiwari, Om Prakash; Jha, Shivesh
2013-09-01
The aim of the present study was to standardize Brahmi vati (BV) by simultaneous quantitative estimation of Bacoside A3 and Piperine adopting HPLC-UV method. BV very important Ayurvedic polyherbo formulation used to treat epilepsy and mental disorders containing thirty eight ingredients including Bacopa monnieri L. and Piper longum L. An HPLC-UV method was developed for the standardization of BV in light of simultaneous quantitative estimation of Bacoside A3 and Piperine, the major constituents of B. monnieri L. and P. longum L. respectively. The developed method was validated on parameters including linearity, precision, accuracy and robustness. The HPLC analysis showed significant increase in amount of Bacoside A3 and Piperine in the in-house sample of BV when compared with all three different marketed samples of the same. Results showed variations in the amount of Bacoside A3 and Piperine in different samples which indicate non-uniformity in their quality which will lead to difference in their therapeutic effects. The outcome of the present investigation underlines the importance of standardization of Ayurvedic formulations. The developed method may be further used to standardize other samples of BV or other formulations containing Bacoside A3 and Piperine.
Modified Confidence Intervals for the Mean of an Autoregressive Process.
1985-08-01
Validity of the method 45 3.6 Theorem 47 4 Derivation of corrections 48 Introduction 48 The zero order pivot 50 4.1 Algorithm 50 CONTENTS The first...of standard confidence intervals. There are several standard methods of setting confidence intervals in simulations, including the regener- ative... method , batch means, and time series methods . We-will focus-s on improved confidence intervals for the mean of an autoregressive process, and as such our
Comparison of Web-Based and Face-to-Face Standard Setting Using the Angoff Method
ERIC Educational Resources Information Center
Katz, Irvin R.; Tannenbaum, Richard J.
2014-01-01
Web-based standard setting holds promise for reducing the travel and logistical inconveniences of traditional, face-to-face standard setting meetings. However, because there are few published reports of setting standards via remote meeting technology, little is known about the practical potential of the approach, including technical feasibility of…
Modified Drop Tower Impact Tests for American Football Helmets.
Rush, G Alston; Prabhu, R; Rush, Gus A; Williams, Lakiesha N; Horstemeyer, M F
2017-02-19
A modified National Operating Committee on Standards for Athletic Equipment (NOCSAE) test method for American football helmet drop impact test standards is presented that would provide better assessment of a helmet's on-field impact performance by including a faceguard on the helmet. In this study, a merger of faceguard and helmet test standards is proposed. The need for a more robust systematic approach to football helmet testing procedures is emphasized by comparing representative results of the Head Injury Criterion (HIC), Severity Index (SI), and peak acceleration values for different helmets at different helmet locations under modified NOCSAE standard drop tower tests. Essentially, these comparative drop test results revealed that the faceguard adds a stiffening kinematic constraint to the shell that lessens total energy absorption. The current NOCSAE standard test methods can be improved to represent on-field helmet hits by attaching the faceguards to helmets and by including two new helmet impact locations (Front Top and Front Top Boss). The reported football helmet test method gives a more accurate representation of a helmet's performance and its ability to mitigate on-field impacts while promoting safer football helmets.
Quality assurance, training, and certification in ozone air pollution studies
Susan Schilling; Paul Miller; Brent Takemoto
1996-01-01
Uniform, or standard, measurement methods of data are critical to projects monitoring change to forest systems. Standardized methods, with known or estimable errors, contribute greatly to the confidence associated with decisions on the basis of field data collections (Zedaker and Nicholas 1990). Quality assurance (QA) for the measurement process includes operations and...
40 CFR 80.582 - What are the sampling and testing methods for the fuel marker?
Code of Federal Regulations, 2010 CFR
2010-07-01
... developed by a Voluntary Consensus-Based Standards Body, such as the American Society for Testing and... test method documentation, including a description of the technology and/or instrumentation that makes... this standard from the American Society for Testing and Materials, 100 Barr Harbor Dr., West...
40 CFR 80.582 - What are the sampling and testing methods for the fuel marker?
Code of Federal Regulations, 2011 CFR
2011-07-01
... developed by a Voluntary Consensus-Based Standards Body, such as the American Society for Testing and... test method documentation, including a description of the technology and/or instrumentation that makes... this standard from the American Society for Testing and Materials, 100 Barr Harbor Dr., West...
The Weakest Link: Library Catalogs.
ERIC Educational Resources Information Center
Young, Terrence E., Jr.
2002-01-01
Describes methods of correcting MARC records in online public access catalogs in school libraries. Highlights include in-house methods; professional resources; conforming to library cataloging standards; vendor services, including Web-based services; software specifically developed for record cleanup; and outsourcing. (LRW)
Antianaerobic Antimicrobials: Spectrum and Susceptibility Testing
Wexler, Hannah M.; Goldstein, Ellie J. C.
2013-01-01
SUMMARY Susceptibility testing of anaerobic bacteria recovered from selected cases can influence the choice of antimicrobial therapy. The Clinical and Laboratory Standards Institute (CLSI) has standardized many laboratory procedures, including anaerobic susceptibility testing (AST), and has published documents for AST. The standardization of testing methods by the CLSI allows comparisons of resistance trends among various laboratories. Susceptibility testing should be performed on organisms recovered from sterile body sites, those that are isolated in pure culture, or those that are clinically important and have variable or unique susceptibility patterns. Organisms that should be considered for individual isolate testing include highly virulent pathogens for which susceptibility cannot be predicted, such as Bacteroides, Prevotella, Fusobacterium, and Clostridium spp.; Bilophila wadsworthia; and Sutterella wadsworthensis. This review describes the current methods for AST in research and reference laboratories. These methods include the use of agar dilution, broth microdilution, Etest, and the spiral gradient endpoint system. The antimicrobials potentially effective against anaerobic bacteria include beta-lactams, combinations of beta-lactams and beta-lactamase inhibitors, metronidazole, chloramphenicol, clindamycin, macrolides, tetracyclines, and fluoroquinolones. The spectrum of efficacy, antimicrobial resistance mechanisms, and resistance patterns against these agents are described. PMID:23824372
Appelbaum, Mark; Cooper, Harris; Kline, Rex B; Mayo-Wilson, Evan; Nezu, Arthur M; Rao, Stephen M
2018-01-01
Following a review of extant reporting standards for scientific publication, and reviewing 10 years of experience since publication of the first set of reporting standards by the American Psychological Association (APA; APA Publications and Communications Board Working Group on Journal Article Reporting Standards, 2008), the APA Working Group on Quantitative Research Reporting Standards recommended some modifications to the original standards. Examples of modifications include division of hypotheses, analyses, and conclusions into 3 groupings (primary, secondary, and exploratory) and some changes to the section on meta-analysis. Several new modules are included that report standards for observational studies, clinical trials, longitudinal studies, replication studies, and N-of-1 studies. In addition, standards for analytic methods with unique characteristics and output (structural equation modeling and Bayesian analysis) are included. These proposals were accepted by the Publications and Communications Board of APA and supersede the standards included in the 6th edition of the Publication Manual of the American Psychological Association (APA, 2010). (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Neuss, Michael N; Gilmore, Terry R; Belderson, Kristin M; Billett, Amy L; Conti-Kalchik, Tara; Harvey, Brittany E; Hendricks, Carolyn; LeFebvre, Kristine B; Mangu, Pamela B; McNiff, Kristen; Olsen, MiKaela; Schulmeister, Lisa; Von Gehr, Ann; Polovich, Martha
2016-12-01
Purpose To update the ASCO/Oncology Nursing Society (ONS) Chemotherapy Administration Safety Standards and to highlight standards for pediatric oncology. Methods The ASCO/ONS Chemotherapy Administration Safety Standards were first published in 2009 and updated in 2011 to include inpatient settings. A subsequent 2013 revision expanded the standards to include the safe administration and management of oral chemotherapy. A joint ASCO/ONS workshop with stakeholder participation, including that of the Association of Pediatric Hematology Oncology Nurses and American Society of Pediatric Hematology/Oncology, was held on May 12, 2015, to review the 2013 standards. An extensive literature search was subsequently conducted, and public comments on the revised draft standards were solicited. Results The updated 2016 standards presented here include clarification and expansion of existing standards to include pediatric oncology and to introduce new standards: most notably, two-person verification of chemotherapy preparation processes, administration of vinca alkaloids via minibags in facilities in which intrathecal medications are administered, and labeling of medications dispensed from the health care setting to be taken by the patient at home. The standards were reordered and renumbered to align with the sequential processes of chemotherapy prescription, preparation, and administration. Several standards were separated into their respective components for clarity and to facilitate measurement of adherence to a standard. Conclusion As oncology practice has changed, so have chemotherapy administration safety standards. Advances in technology, cancer treatment, and education and training have prompted the need for periodic review and revision of the standards. Additional information is available at http://www.asco.org/chemo-standards .
Humphries, Romney M; Kircher, Susan; Ferrell, Andrea; Krause, Kevin M; Malherbe, Rianna; Hsiung, Andre; Burnham, C A
2018-05-09
Expedited pathways to antimicrobial agent approval by the United States Food and Drug Administration (FDA) have led to increased delays between drug approval and the availability of FDA-cleared antimicrobial susceptibility testing (AST) devices. Antimicrobial disks for use with disk diffusion testing are among the first AST devices available to clinical laboratories. However, many laboratories are reluctant to implement a disk diffusion method for a variety of reasons, including dwindling proficiency with this method, interruptions to laboratory workflow, uncertainty surrounding the quality and reliability of a disk diffusion test, and perceived need to report an MIC to clinicians. This mini-review provides a report from the Clinical and Laboratory Standards Institute Working Group on Methods Development and Standardization on the current standards and clinical utility of disk diffusion testing. Copyright © 2018 American Society for Microbiology.
A Critical Analysis of the Body of Work Method for Setting Cut-Scores
ERIC Educational Resources Information Center
Radwan, Nizam; Rogers, W. Todd
2006-01-01
The recent increase in the use of constructed-response items in educational assessment and the dissatisfaction with the nature of the decision that the judges must make using traditional standard-setting methods created a need to develop new and effective standard-setting procedures for tests that include both multiple-choice and…
Wohlsen, T; Bates, J; Vesey, G; Robinson, W A; Katouli, M
2006-04-01
To use BioBall cultures as a precise reference standard to evaluate methods for enumeration of Escherichia coli and other coliform bacteria in water samples. Eight methods were evaluated including membrane filtration, standard plate count (pour and spread plate methods), defined substrate technology methods (Colilert and Colisure), the most probable number method and the Petrifilm disposable plate method. Escherichia coli and Enterobacter aerogenes BioBall cultures containing 30 organisms each were used. All tests were performed using 10 replicates. The mean recovery of both bacteria varied with the different methods employed. The best and most consistent results were obtained with Petrifilm and the pour plate method. Other methods either yielded a low recovery or showed significantly high variability between replicates. The BioBall is a very suitable quality control tool for evaluating the efficiency of methods for bacterial enumeration in water samples.
Airside HVAC BESTEST: HVAC Air-Distribution System Model Test Cases for ASHRAE Standard 140
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judkoff, Ronald; Neymark, Joel; Kennedy, Mike D.
This paper summarizes recent work to develop new airside HVAC equipment model analytical verification test cases for ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. The analytical verification test method allows comparison of simulation results from a wide variety of building energy simulation programs with quasi-analytical solutions, further described below. Standard 140 is widely cited for evaluating software for use with performance-path energy efficiency analysis, in conjunction with well-known energy-efficiency standards including ASHRAE Standard 90.1, the International Energy Conservation Code, and other international standards. Airside HVAC Equipment is a common area ofmore » modelling not previously explicitly tested by Standard 140. Integration of the completed test suite into Standard 140 is in progress.« less
Performing skin microbiome research: A method to the madness
Kong, Heidi H.; Andersson, Björn; Clavel, Thomas; Common, John E.; Jackson, Scott A.; Olson, Nathan D.; Segre, Julia A.; Traidl-Hoffmann, Claudia
2017-01-01
Growing interest in microbial contributions to human health and disease has increasingly led investigators to examine the microbiome in both healthy skin and cutaneous disorders, including acne, psoriasis and atopic dermatitis. The need for common language, effective study design, and validated methods are critical for high-quality, standardized research. Features, unique to skin, pose particular challenges when conducting microbiome research. This review discusses microbiome research standards and highlights important factors to consider, including clinical study design, skin sampling, sample processing, DNA sequencing, control inclusion, and data analysis. PMID:28063650
Coordination and standardization of federal sedimentation activities
Glysson, G. Douglas; Gray, John R.
1997-01-01
- precipitation information critical to water resources management. Memorandum M-92-01 covers primarily freshwater bodies and includes activities, such as "development and distribution of consensus standards, field-data collection and laboratory analytical methods, data processing and interpretation, data-base management, quality control and quality assurance, and water- resources appraisals, assessments, and investigations." Research activities are not included.
A method for the geometric and densitometric standardization of intraoral radiographs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duckworth, J.E.; Judy, P.F.; Goodson, J.M.
1983-07-01
The interpretation of dental radiographs for the diagnosis of periodontal disease conditions poses several difficulties. These include the inability to adequately reproduce the projection geometry and optical density of the exposures. In order to improve the ability to extract accurate quantitative information from a radiographic survey of periodontal status, a method was developed which provided for consistent reproduction of both geometric and densitometric exposure parameters. This technique employed vertical bitewing projections in holders customized to individual segments of the dentition. A copper stepwedge was designed to provide densitometric standardization, and wire markers were included to permit measurement of angular variation.more » In a series of 53 paired radiographs, measurement of alveolar crest heights was found to be reproducible within approximately 0.1 mm. This method provided a full mouth radiographic survey using seven films, each complete with internal standards suitable for computer-based image processing.« less
Development of the Nurse Practitioner Standards for Practice Australia
Buckley, Thomas; Donoghue, Judith; Heartfield, Marie; Bryce, Julianne; Cox, Darlene; Waters, Donna; Gosby, Helen; Kelly, John; Dunn, Sandra V.
2015-01-01
This article describes the context and development of the new Nurse Practitioner Standards for Practice in Australia, which went into effect in January 2014. The researchers used a mixed-methods design to engage a broad range of stakeholders who brought both political and practice knowledge to the development of the new standards. Methods included interviews, focus groups, surveys, and work-based observation of nurse practitioner practice. Stakeholders varied in terms of their need for detail in the standards. Nonetheless, they invariably agreed that the standards should be clinically focussed attributes. The pillars common in many advanced practice nursing standards, such as practice, research, education, and leadership, were combined and expressed in a new and unique clinical attribute. PMID:26162455
Bedner, Mary; Schantz, Michele M; Sander, Lane C; Sharpless, Katherine E
2008-05-23
Liquid chromatographic (LC) methods using atmospheric pressure chemical ionization/mass spectrometric (APCI-MS) detection were developed for the separation and analysis of the phytosterols campesterol, cycloartenol, lupenone, lupeol, beta-sitosterol, and stigmasterol. Brassicasterol and cholesterol were also included for investigation as internal standards. The methods were used to identify and quantify the phytosterols in each of two Serenoa repens (saw palmetto) Standard Reference Materials (SRMs) developed by the National Institute of Standards and Technology (NIST). Values obtained by LC-MS were compared to those obtained using the more traditional approach of gas chromatography with flame ionization detection. This is the first reported use of LC-MS to determine phytosterols in saw palmetto dietary supplement materials.
Double row equivalent for rotator cuff repair: A biomechanical analysis of a new technique.
Robinson, Sean; Krigbaum, Henry; Kramer, Jon; Purviance, Connor; Parrish, Robin; Donahue, Joseph
2018-06-01
There are numerous configurations of double row fixation for rotator cuff tears however, there remains to be a consensus on the best method. In this study, we evaluated three different double-row configurations, including a new method. Our primary question is whether the new anchor and technique compares in biomechanical strength to standard double row techniques. Eighteen prepared fresh frozen bovine infraspinatus tendons were randomized to one of three groups including the New Double Row Equivalent, Arthrex Speedbridge and a transosseous equivalent using standard Stabilynx anchors. Biomechanical testing was performed on humeri sawbones and ultimate load, strain, yield strength, contact area, contact pressure, and a survival plots were evaluated. The new double row equivalent method demonstrated increased survival as well as ultimate strength at 415N compared to the remainder testing groups as well as equivalent contact area and pressure to standard double row techniques. This new anchor system and technique demonstrated higher survival rates and loads to failure than standard double row techniques. This data provides us with a new method of rotator cuff fixation which should be further evaluated in the clinical setting. Basic science biomechanical study.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, D.; Wohlgemuth, J.; Gu, X.
2013-11-01
The curing of cross-linkable encapsulation is a critical consideration for photovoltaic (PV) modules manufactured using a lamination process. Concerns related to ethylene-co-vinyl acetate (EVA) include the quality (e.g., expiration and uniformity) of the films or completion (duration) of the cross-linking of the EVA within a laminator. Because these issues are important to both EVA and module manufacturers, an international standard has recently been proposed by the Encapsulation Task-Group within the Working Group 2 (WG2) of the International Electrotechnical Commission (IEC) Technical Committee 82 (TC82) for the quantification of the degree of cure for EVA encapsulation. The present draft of themore » standard calls for the use of differential scanning calorimetry (DSC) as the rapid, enabling secondary (test) method. Both the residual enthalpy- and melt/freeze-DSC methods are identified. The DSC methods are calibrated against the gel content test, the primary (reference) method. Aspects of other established methods, including indentation and rotor cure metering, were considered by the group. Key details of the test procedure will be described.« less
Evaluating Public Libraries Using Standard Scores: The Library Quotient.
ERIC Educational Resources Information Center
O'Connor, Daniel O.
1982-01-01
Describes a method for assessing the performance of public libraries using a standardized scoring system and provides an analysis of public library data from New Jersey as an example. Library standards and the derivation of measurement ratios are also discussed. A 33-item bibliography and three data tables are included. (JL)
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The first part covers standards for gaseous fuels. The standard part covers standards on coal and coke including the classification of coals, determination of major elements in coal ash and trace elements in coal, metallurgical properties of coal and coke, methods of analysis of coal and coke, petrographic analysis of coal and coke, physical characteristics of coal, quality assurance and sampling.
Both US Environmental Protection Agency (EPA) SW-846 Methods 8260C/5035 and 8261A include mixing soil with water and addition of internal standards prior to analyses but the equilibration of internal standards with the soil is not required. With increasing total organic carbon (...
New Standards Require Teaching More Statistics: Are Preservice Secondary Mathematics Teachers Ready?
ERIC Educational Resources Information Center
Lovett, Jennifer N.; Lee, Hollylynne S.
2017-01-01
Mathematics teacher education programs often need to respond to changing expectations and standards for K-12 curriculum and accreditation. New standards for high school mathematics in the United States include a strong emphasis in statistics. This article reports results from a mixed methods cross-institutional study examining the preparedness of…
ERIC Educational Resources Information Center
Lindle, Jane Clark; Stalion, Nancy; Young, Lu
2005-01-01
Kentucky's accountability system includes a school-processes audit known as Standards and Indicators for School Improvement (SISI), which is in a nascent stage of validation. Content validity methods include comparison to instruments measuring similar constructs as well as other techniques such as job analysis. This study used a two-phase process…
Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...
2016-07-05
Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validation of analytical techniques highlighted. The majority of analytical standardization studies to-date has tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination ofmore » hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to -determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.
Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validation of analytical techniques highlighted. The majority of analytical standardization studies to-date has tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination ofmore » hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to -determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.« less
Endoscope field of view measurement.
Wang, Quanzeng; Khanicheh, Azadeh; Leiner, Dennis; Shafer, David; Zobel, Jurgen
2017-03-01
The current International Organization for Standardization (ISO) standard (ISO 8600-3: 1997 including Amendment 1: 2003) for determining endoscope field of view (FOV) does not accurately characterize some novel endoscopic technologies such as endoscopes with a close focus distance and capsule endoscopes. We evaluated the endoscope FOV measurement method (the FOV WS method) in the current ISO 8600-3 standard and proposed a new method (the FOV EP method). We compared the two methods by measuring the FOV of 18 models of endoscopes (one device for each model) from seven key international manufacturers. We also estimated the device to device variation of two models of colonoscopes by measuring several hundreds of devices. Our results showed that the FOV EP method was more accurate than the FOV WS method, and could be used for all endoscopes. We also found that the labelled FOV values of many commercial endoscopes are significantly overstated. Our study can help endoscope users understand endoscope FOV and identify a proper method for FOV measurement. This paper can be used as a reference to revise the current endoscope FOV measurement standard.
Endoscope field of view measurement
Wang, Quanzeng; Khanicheh, Azadeh; Leiner, Dennis; Shafer, David; Zobel, Jurgen
2017-01-01
The current International Organization for Standardization (ISO) standard (ISO 8600-3: 1997 including Amendment 1: 2003) for determining endoscope field of view (FOV) does not accurately characterize some novel endoscopic technologies such as endoscopes with a close focus distance and capsule endoscopes. We evaluated the endoscope FOV measurement method (the FOVWS method) in the current ISO 8600-3 standard and proposed a new method (the FOVEP method). We compared the two methods by measuring the FOV of 18 models of endoscopes (one device for each model) from seven key international manufacturers. We also estimated the device to device variation of two models of colonoscopes by measuring several hundreds of devices. Our results showed that the FOVEP method was more accurate than the FOVWS method, and could be used for all endoscopes. We also found that the labelled FOV values of many commercial endoscopes are significantly overstated. Our study can help endoscope users understand endoscope FOV and identify a proper method for FOV measurement. This paper can be used as a reference to revise the current endoscope FOV measurement standard. PMID:28663840
Bootstrap Methods: A Very Leisurely Look.
ERIC Educational Resources Information Center
Hinkle, Dennis E.; Winstead, Wayland H.
The Bootstrap method, a computer-intensive statistical method of estimation, is illustrated using a simple and efficient Statistical Analysis System (SAS) routine. The utility of the method for generating unknown parameters, including standard errors for simple statistics, regression coefficients, discriminant function coefficients, and factor…
Standardized Methods for Electronic Shearography
NASA Technical Reports Server (NTRS)
Lansing, Matthew D.
1997-01-01
Research was conducted in development of operating procedures and standard methods to evaluate fiber reinforced composite materials, bonded or sprayed insulation, coatings, and laminated structures with MSFC electronic shearography systems. Optimal operating procedures were developed for the Pratt and Whitney Electronic Holography/Shearography Inspection System (EH/SIS) operating in shearography mode, as well as the Laser Technology, Inc. (LTI) SC-4000 and Ettemeyer SHS-94 ISTRA shearography systems. Operating practices for exciting the components being inspected were studied, including optimal methods for transient heating with heat lamps and other methods as appropriate to enhance inspection capability.
40 CFR 63.93 - Approval of State requirements that substitute for a section 112 rule.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., board and administrative orders, permits issued pursuant to permit templates, or State operating permits... respective Federal rule; (2) Levels of control (including associated performance test methods) and compliance... must include monitoring or another method for determining compliance. (ii) If a standard in the...
Preliminary evaluation of a gel tube agglutination major cross-match method in dogs.
Villarnovo, Dania; Burton, Shelley A; Horney, Barbara S; MacKenzie, Allan L; Vanderstichel, Raphaël
2016-09-01
A major cross-match gel tube test is available for use in dogs yet has not been clinically evaluated. This study compared cross-match results obtained using the gel tube and the standard tube methods for canine samples. Study 1 included 107 canine sample donor-recipient pairings cross-match tested with the RapidVet-H method gel tube test and compared results with the standard tube method. Additionally, 120 pairings using pooled sera containing anti-canine erythrocyte antibody at various concentrations were tested with leftover blood from a hospital population to assess sensitivity and specificity of the gel tube method in comparison with the standard method. The gel tube method had a good relative specificity of 96.1% in detecting lack of agglutination (compatibility) compared to the standard tube method. Agreement between the 2 methods was moderate. Nine of 107 pairings showed agglutination/incompatibility on either test, too few to allow reliable calculation of relative sensitivity. Fifty percent of the gel tube method results were difficult to interpret due to sample spreading in the reaction and/or negative control tubes. The RapidVet-H method agreed with the standard cross-match method on compatible samples, but detected incompatibility in some sample pairs that were compatible with the standard method. Evaluation using larger numbers of incompatible pairings is needed to assess diagnostic utility. The gel tube method results were difficult to categorize due to sample spreading. Weak agglutination reactions or other factors such as centrifuge model may be responsible. © 2016 American Society for Veterinary Clinical Pathology.
Wang, Wenguang; Ma, Xiaoli; Guo, Xiaoyu; Zhao, Mingbo; Tu, Pengfei; Jiang, Yong
2015-09-18
In order to solve the bottleneck of reference standards shortage for comprehensive quality control of traditional Chinese medicines (TCMs), a series of strategies, including one single reference standard to determine multi-compounds (SSDMC), quantitative analysis by standardized reference extract (QASRE), and quantitative nuclear magnetic resonance spectroscopy (qNMR) were proposed, and Mahoniae Caulis was selected as an example to develop and validate these methods for simultaneous determination of four alkaloids, columbamine, jatrorrhizine, palmatine, and berberine. Comprehensive comparisons among these methods and with the conventional external standard method (ESM) were carried out. The relative expanded uncertainty of measurement was firstly used to compare their credibility. The results showed that all these three new developed methods can accurately accomplish the quantification by using only one purified reference standard, but each of them has its own advantages and disadvantages as well as the specific application scope, which were also discussed in detail in this paper. Copyright © 2015 Elsevier B.V. All rights reserved.
Bringing Standardized Processes in Atom-Probe Tomography: I Establishing Standardized Terminology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Ian M; Danoix, F; Forbes, Richard
2011-01-01
Defining standardized methods requires careful consideration of the entire field and its applications. The International Field Emission Society (IFES) has elected a Standards Committee, whose task is to determine the needed steps to establish atom-probe tomography as an accepted metrology technique. Specific tasks include developing protocols or standards for: terminology and nomenclature; metrology and instrumentation, including specifications for reference materials; test methodologies; modeling and simulations; and science-based health, safety, and environmental practices. The Committee is currently working on defining terminology related to atom-probe tomography with the goal to include terms into a document published by the International Organization for Standardsmore » (ISO). A lot of terms also used in other disciplines have already been defined) and will be discussed for adoption in the context of atom-probe tomography.« less
[Modified Delphi method in the constitution of school sanitation standard].
Yin, Xunqiang; Liang, Ying; Tan, Hongzhuan; Gong, Wenjie; Deng, Jing; Luo, Jiayou; Di, Xiaokang; Wu, Yue
2012-11-01
To constitute school sanitation standard using modified Delphi method, and to explore the feasibility and the predominance of Delphi method in the constitution of school sanitation standard. Two rounds of expert consultations were adopted in this study. The data were analyzed with SPSS15.0 to screen indices of school sanitation standard. Thirty-two experts accomplished the 2 rounds of consultations. The average length of expert service was (24.69 ±8.53) years. The authority coefficient was 0.729 ±0.172. The expert positive coefficient was 94.12% (32/34) in the first round and 100% (32/32) in the second round. The harmonious coefficients of importance, feasibility and rationality in the second round were 0.493 (P<0.05), 0.527 (P<0.01), and 0.535 (P<0.01), respectively, suggesting unanimous expert opinions. According to the second round of consultation, 38 indices were included in the framework. Theoretical analysis, literature review, investigation and so on are generally used in health standard constitution currently. Delphi method is a rapid, effective and feasible method in this field.
Rollier, Patricia; Lombard, Bertrand; Guillier, Laurent; François, Danièle; Romero, Karol; Pierru, Sylvie; Bouhier, Laurence; Gnanou Besse, Nathalie
2018-05-01
The reference method for the detection and enumeration of L. monocytogenes in food (Standards EN ISO 11290-1&2) have been validated by inter-laboratory studies in the frame of the Mandate M381 from European Commission to CEN. In this paper, the inter-laboratory studies led in 2013 on 5 matrices (cold-smoked salmon, milk powdered infant food formula, vegetables, environment, and cheese) to validate Standard EN ISO 11290-2 are reported. According to the results obtained, the method of the revised Standard EN ISO 11290-2 can be considered as a good method for the enumeration of L. monocytogenes in foods and food processing environment, in particular for the matrices included in the study. Values of repeatability and reproducibility standard deviations can be considered satisfactory for this type of method with a confirmation stage, since most of them were below 0.3 log 10 , also at low levels, close to the regulatory limit of 100 CFU/g. Copyright © 2018 Elsevier B.V. All rights reserved.
Sinigalliano, Christopher D.; Ervin, Jared S.; Van De Werfhorst, Laurie C.; Badgley, Brian D.; Ballestée, Elisenda; Bartkowiaka, Jakob; Boehm, Alexandria B.; Byappanahalli, Muruleedhara N.; Goodwin, Kelly D.; Gourmelon, Michèle; Griffith, John; Holden, Patricia A.; Jay, Jenny; Layton, Blythe; Lee, Cheonghoon; Lee, Jiyoung; Meijer, Wim G.; Noble, Rachel; Raith, Meredith; Ryu, Hodon; Sadowsky, Michael J.; Schriewer, Alexander; Wang, Dan; Wanless, David; Whitman, Richard; Wuertz, Stefan; Santo Domingo, Jorge W.
2013-01-01
Here we report results from a multi-laboratory (n = 11) evaluation of four different PCR methods targeting the 16S rRNA gene of Catellicoccus marimammalium originally developed to detect gull fecal contamination in coastal environments. The methods included a conventional end-point PCR method, a SYBR® Green qPCR method, and two TaqMan® qPCR methods. Different techniques for data normalization and analysis were tested. Data analysis methods had a pronounced impact on assay sensitivity and specificity calculations. Across-laboratory standardization of metrics including the lower limit of quantification (LLOQ), target detected but not quantifiable (DNQ), and target not detected (ND) significantly improved results compared to results submitted by individual laboratories prior to definition standardization. The unit of measure used for data normalization also had a pronounced effect on measured assay performance. Data normalization to DNA mass improved quantitative method performance as compared to enterococcus normalization. The MST methods tested here were originally designed for gulls but were found in this study to also detect feces from other birds, particularly feces composited from pigeons. Sequencing efforts showed that some pigeon feces from California contained sequences similar to C. marimammalium found in gull feces. These data suggest that the prevalence, geographic scope, and ecology of C. marimammalium in host birds other than gulls require further investigation. This study represents an important first step in the multi-laboratory assessment of these methods and highlights the need to broaden and standardize additional evaluations, including environmentally relevant target concentrations in ambient waters from diverse geographic regions.
77 FR 15605 - Mobile Commerce and Personalization Promotion
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-16
... automation mailings. Postage Payment Methods Postage payment methods will be restricted to permit imprint.... 2. Standard Mail (including Nonprofit) letters or flats. d. Postage must be paid by permit imprint...
Dahlquist, Robert T.; Reyner, Karina; Robinson, Richard D.; Farzad, Ali; Laureano-Phillips, Jessica; Garrett, John S.; Young, Joseph M.; Zenarosa, Nestor R.; Wang, Hao
2018-01-01
Background Emergency department (ED) shift handoffs are potential sources of delay in care. We aimed to determine the impact that using standardized reporting tool and process may have on throughput metrics for patients undergoing a transition of care at shift change. Methods We performed a prospective, pre- and post-intervention quality improvement study from September 1 to November 30, 2015. A handoff procedure intervention, including a mandatory workshop and personnel training on a standard reporting system template, was implemented. The primary endpoint was patient length of stay (LOS). A comparative analysis of differences between patient LOS and various handoff communication methods were assessed pre- and post-intervention. Communication methods were entered a multivariable logistic regression model independently as risk factors for patient LOS. Results The final analysis included 1,006 patients, with 327 comprising the pre-intervention and 679 comprising the post-intervention populations. Bedside rounding occurred 45% of the time without a standard reporting during pre-intervention and increased to 85% of the time with the use of a standard reporting system in the post-intervention period (P < 0.001). Provider time (provider-initiated care to patient care completed) in the pre-intervention period averaged 297 min, but decreased to 265 min in the post-intervention period (P < 0.001). After adjusting for other communication methods, the use of a standard reporting system during handoff was associated with shortened ED LOS (OR = 0.60, 95% CI 0.40 - 0.90, P < 0.05). Conclusions Standard reporting system use during emergency physician handoffs at shift change improves ED throughput efficiency and is associated with shorter ED LOS. PMID:29581808
Environmental Chemicals in Urine and Blood: Improving Methods for Creatinine and Lipid Adjustment.
O'Brien, Katie M; Upson, Kristen; Cook, Nancy R; Weinberg, Clarice R
2016-02-01
Investigators measuring exposure biomarkers in urine typically adjust for creatinine to account for dilution-dependent sample variation in urine concentrations. Similarly, it is standard to adjust for serum lipids when measuring lipophilic chemicals in serum. However, there is controversy regarding the best approach, and existing methods may not effectively correct for measurement error. We compared adjustment methods, including novel approaches, using simulated case-control data. Using a directed acyclic graph framework, we defined six causal scenarios for epidemiologic studies of environmental chemicals measured in urine or serum. The scenarios include variables known to influence creatinine (e.g., age and hydration) or serum lipid levels (e.g., body mass index and recent fat intake). Over a range of true effect sizes, we analyzed each scenario using seven adjustment approaches and estimated the corresponding bias and confidence interval coverage across 1,000 simulated studies. For urinary biomarker measurements, our novel method, which incorporates both covariate-adjusted standardization and the inclusion of creatinine as a covariate in the regression model, had low bias and possessed 95% confidence interval coverage of nearly 95% for most simulated scenarios. For serum biomarker measurements, a similar approach involving standardization plus serum lipid level adjustment generally performed well. To control measurement error bias caused by variations in serum lipids or by urinary diluteness, we recommend improved methods for standardizing exposure concentrations across individuals.
Development Of Methodologies Using PhabrOmeter For Fabric Drape Evaluation
NASA Astrophysics Data System (ADS)
Lin, Chengwei
Evaluation of fabric drape is important for textile industry as it reveals the aesthetic and functionality of the cloth and apparel. Although many fabric drape measuring methods have been developed for several decades, they are falling behind the need for fast product development by the industry. To meet the requirement of industries, it is necessary to develop an effective and reliable method to evaluate fabric drape. The purpose of the present study is to determine if PhabrOmeter can be applied to fabric drape evaluation. PhabrOmeter is a fabric sensory performance evaluating instrument which is developed to provide fast and reliable quality testing results. This study was sought to determine the relationship between fabric drape and other fabric attributes. In addition, a series of conventional methods including AATCC standards, ASTM standards and ISO standards were used to characterize the fabric samples. All the data were compared and analyzed with linear correlation method. The results indicate that PhabrOmeter is reliable and effective instrument for fabric drape evaluation. Besides, some effects including fabric structure, testing directions were considered to examine their impact on fabric drape.
Li, Tan; Zhang, Qingguo; Zhang, Ying
2018-01-01
The assessment of forest ecosystem services can quantify the impact of these services on human life and is the main basis for formulating a standard of compensation for these services. Moreover, the calculation of the indirect value of forest ecosystem services should not be ignored, as has been the case in some previous publications. A low compensation standard and the lack of a dynamic coordination mechanism are the main problems existing in compensation implementation. Using comparison and analysis, this paper employed accounting for both the costs and benefits of various alternatives. The analytic hierarchy process (AHP) method and the Pearl growth-curve method were used to adjust the results. This research analyzed the contribution of each service value from the aspects of forest produce services, ecology services, and society services. We also conducted separate accounting for cost and benefit, made a comparison of accounting and evaluation methods, and estimated the implementation period of the compensation standard. The main conclusions of this research include the fact that any compensation standard should be determined from the points of view of both benefit and cost in a region. The results presented here allow the range between the benefit and cost compensation to be laid out more reasonably. The practical implications of this research include the proposal that regional decision-makers should consider a dynamic compensation method to meet with the local economic level by using diversified ways to raise the compensation standard, and that compensation channels should offer a mixed mode involving both the market and government. PMID:29561789
Schneller, Mikkel B; Pedersen, Mogens T; Gupta, Nidhi; Aadahl, Mette; Holtermann, Andreas
2015-03-13
We compared the accuracy of five objective methods, including two newly developed methods combining accelerometry and activity type recognition (Acti4), against indirect calorimetry, to estimate total energy expenditure (EE) of different activities in semi-standardized settings. Fourteen participants performed a standardized and semi-standardized protocol including seven daily life activity types, while having their EE measured by indirect calorimetry. Simultaneously, physical activity was quantified by an ActivPAL3, two ActiGraph GT3X+'s and an Actiheart. EE was estimated by the standard ActivPAL3 software (ActivPAL), ActiGraph GT3X+ (ActiGraph) and Actiheart (Actiheart), and by a combination of activity type recognition via Acti4 software and activity counts per minute (CPM) of either a hip- or thigh-worn ActiGraph GT3X+ (AGhip + Acti4 and AGthigh + Acti4). At group level, estimated physical activities EE by Actiheart (MSE = 2.05) and AGthigh + Acti4 (MSE = 0.25) were not significantly different from measured EE by indirect calorimetry, while significantly underestimated by ActiGraph, ActivPAL and AGhip + Acti4. AGthigh + Acti4 and Actiheart explained 77% and 45%, of the individual variations in measured physical activity EE by indirect calorimetry, respectively. This study concludes that combining accelerometer data from a thigh-worn ActiGraph GT3X+ with activity type recognition improved the accuracy of activity specific EE estimation against indirect calorimetry in semi-standardized settings compared to previously validated methods using CPM only.
Li, Tan; Zhang, Qingguo; Zhang, Ying
2018-03-21
The assessment of forest ecosystem services can quantify the impact of these services on human life and is the main basis for formulating a standard of compensation for these services. Moreover, the calculation of the indirect value of forest ecosystem services should not be ignored, as has been the case in some previous publications. A low compensation standard and the lack of a dynamic coordination mechanism are the main problems existing in compensation implementation. Using comparison and analysis, this paper employed accounting for both the costs and benefits of various alternatives. The analytic hierarchy process (AHP) method and the Pearl growth-curve method were used to adjust the results. This research analyzed the contribution of each service value from the aspects of forest produce services, ecology services, and society services. We also conducted separate accounting for cost and benefit, made a comparison of accounting and evaluation methods, and estimated the implementation period of the compensation standard. The main conclusions of this research include the fact that any compensation standard should be determined from the points of view of both benefit and cost in a region. The results presented here allow the range between the benefit and cost compensation to be laid out more reasonably. The practical implications of this research include the proposal that regional decision-makers should consider a dynamic compensation method to meet with the local economic level by using diversified ways to raise the compensation standard, and that compensation channels should offer a mixed mode involving both the market and government.
40 CFR 131.6 - Minimum requirements for water quality standards submission.
Code of Federal Regulations, 2014 CFR
2014-07-01
...)(2) and 303(c)(2) of the Act. (b) Methods used and analyses conducted to support water quality... scientific basis of the standards which do not include the uses specified in section 101(a)(2) of the Act as...
40 CFR 131.6 - Minimum requirements for water quality standards submission.
Code of Federal Regulations, 2012 CFR
2012-07-01
...)(2) and 303(c)(2) of the Act. (b) Methods used and analyses conducted to support water quality... scientific basis of the standards which do not include the uses specified in section 101(a)(2) of the Act as...
40 CFR 131.6 - Minimum requirements for water quality standards submission.
Code of Federal Regulations, 2013 CFR
2013-07-01
...)(2) and 303(c)(2) of the Act. (b) Methods used and analyses conducted to support water quality... scientific basis of the standards which do not include the uses specified in section 101(a)(2) of the Act as...
40 CFR 86.094-13 - Light-duty exhaust durability programs.
Code of Federal Regulations, 2010 CFR
2010-07-01
... requirements. (5) In-use verification. The Standard Self-Approval Durability Program includes no requirement... selection methods, durability data vehicle compliance requirements, in-use verification requirements... Accumulation Carryover. Light-duty Trucks Tier 1 & Tier 0 Standard Self-Approval Carryover. Alternative Service...
Bodiwala, Kunjan; Shah, Shailesh; Patel, Yogini; Prajapati, Pintu; Marolia, Bhavin; Kalyankar, Gajanan
2017-01-01
Two sensitive, accurate, and precise spectrophotometric methods have been developed and validated for the simultaneous estimation of ofloxacin (OFX), clotrimazole (CLZ), and lignocaine hydrochloride (LGN) in their combined dosage form (ear drops) without prior separation. The derivative ratio spectra method (method 1) includes the measurement of OFX and CLZ at zero-crossing points (ZCPs) of each other obtained from the ratio derivative spectra using standard LGN as a divisor, whereas the measurement of LGN at the ZCP of CLZ is obtained from the ratio derivative spectra using standard OFX as a divisor. The double divisor-ratio derivative method (method 2) includes the measurement of each drug at its amplitude in the double divisor-ratio spectra obtained using a standard mixture of the other two drugs as the divisor. Both methods were found to be linear (correlation coefficients of >0.996) over the ranges of 3-15, 10-50, and 20-100 μg/mL for OFX, CLZ, and LGN, respectively; precise (RSD of <2%); and accurate (recovery of >98%) for the estimation of each drug. The developed methods were successfully applied for the estimation of these drugs in a marketed ear-drop formulation. Excipients and other ingredients did not interfere with the estimation of these drugs. Both methods were statistically compared using the t-test.
Implementation of a standard format for GPS common view data
NASA Technical Reports Server (NTRS)
Weiss, Marc A.; Thomas, Claudine
1995-01-01
A new format for standardizing common view time transfer data, recommended by the Consultative Committee for the Definition of the Second, is being implemented in receivers commonly used for contributing data for the generation of International Atomic Time. We discuss three aspects of this new format that potentially improve GPS common-view time transfer: (1) the standard specifies the method for treating short term data, (2) it presents data in consistent formats including needed terms not previously available, and (3) the standard includes a header of parameters important for the GPS common-view process. In coordination with the release of firmware conforming to this new format the Bureau International des Poids et Mesures will release future international track schedules consistent with the new standard.
Gross Motor Development in Children Aged 3-5 Years, United States 2012.
Kit, Brian K; Akinbami, Lara J; Isfahani, Neda Sarafrazi; Ulrich, Dale A
2017-07-01
Objective Gross motor development in early childhood is important in fostering greater interaction with the environment. The purpose of this study is to describe gross motor skills among US children aged 3-5 years using the Test of Gross Motor Development (TGMD-2). Methods We used 2012 NHANES National Youth Fitness Survey (NNYFS) data, which included TGMD-2 scores obtained according to an established protocol. Outcome measures included locomotor and object control raw and age-standardized scores. Means and standard errors were calculated for demographic and weight status with SUDAAN using sample weights to calculate nationally representative estimates, and survey design variables to account for the complex sampling methods. Results The sample included 339 children aged 3-5 years. As expected, locomotor and object control raw scores increased with age. Overall mean standardized scores for locomotor and object control were similar to the mean value previously determined using a normative sample. Girls had a higher mean locomotor, but not mean object control, standardized score than boys (p < 0.05). However, the mean locomotor standardized scores for both boys and girls fell into the range categorized as "average." There were no other differences by age, race/Hispanic origin, weight status, or income in either of the subtest standardized scores (p > 0.05). Conclusions In a nationally representative sample of US children aged 3-5 years, TGMD-2 mean locomotor and object control standardized scores were similar to the established mean. These results suggest that standardized gross motor development among young children generally did not differ by demographic or weight status.
A need for standardization in visual acuity measurement.
Patel, Hina; Congdon, Nathan; Strauss, Glenn; Lansingh, Charles
2017-01-01
Standardization of terminologies and methods is increasingly important in all fields including ophthalmology, especially currently when research and new technology are rapidly driving improvements in medicine. This review highlights the range of notations used by vision care professionals around the world for vision measurement, and the challenges resulting from this practice. The global community is urged to move toward a uniform standard.
Manifesting Destiny: Re/Presentations of Indigenous Peoples in K-12 U.S. History Standards
ERIC Educational Resources Information Center
Shear, Sarah B.; Knowles, Ryan T.; Soden, Gregory J.; Castro, Antonio J.
2015-01-01
In this mixed-methods study, we use a postcolonial framework to investigate how state standards represent Indigenous histories and cultures. The research questions that guided this study include: (a) What is the frequency of Indigenous content (histories, cultures, current issues) covered in state-level U.S. history standards for K-12? (b) What is…
Treating Depression during Pregnancy and the Postpartum: A Preliminary Meta-Analysis
ERIC Educational Resources Information Center
Bledsoe, Sarah E.; Grote, Nancy K.
2006-01-01
Objectives: This meta-analysis evaluates treatment effects for nonpsychotic major depression during pregnancy and postpartum comparing interventions by type and timing. Methods: Studies for decreasing depressive severity during pregnancy and postpartum applying treatment trials and standardized measures were included. Standardized mean differences…
International Standards for Genomes, Transcriptomes, and Metagenomes
Mason, Christopher E.; Afshinnekoo, Ebrahim; Tighe, Scott; Wu, Shixiu; Levy, Shawn
2017-01-01
Challenges and biases in preparing, characterizing, and sequencing DNA and RNA can have significant impacts on research in genomics across all kingdoms of life, including experiments in single-cells, RNA profiling, and metagenomics (across multiple genomes). Technical artifacts and contamination can arise at each point of sample manipulation, extraction, sequencing, and analysis. Thus, the measurement and benchmarking of these potential sources of error are of paramount importance as next-generation sequencing (NGS) projects become more global and ubiquitous. Fortunately, a variety of methods, standards, and technologies have recently emerged that improve measurements in genomics and sequencing, from the initial input material to the computational pipelines that process and annotate the data. Here we review current standards and their applications in genomics, including whole genomes, transcriptomes, mixed genomic samples (metagenomes), and the modified bases within each (epigenomes and epitranscriptomes). These standards, tools, and metrics are critical for quantifying the accuracy of NGS methods, which will be essential for robust approaches in clinical genomics and precision medicine. PMID:28337071
NASA Technical Reports Server (NTRS)
Hoisington, C. M.
1984-01-01
A position estimation algorithm was developed to track a humpback whale tagged with an ARGOS platform after a transmitter deployment failure and the whale's diving behavior precluded standard methods. The algorithm is especially useful where a transmitter location program exists; it determines the classical keplarian elements from the ARGOS spacecraft position vectors included with the probationary file messages. A minimum of three distinct messages are required. Once the spacecraft orbit is determined, the whale is located using standard least squares regression techniques. Experience suggests that in instances where circumstances inherent in the experiment yield message data unsuitable for the standard ARGOS reduction, (message data may be too sparse, span an insufficient period, or include variable-length messages). System ARGOS can still provide much valuable location information if the user is willing to accept the increased location uncertainties.
Wang, Rong; Xu, Xin
2015-12-01
To compare the effect of 2 methods of occlusion adjustment on occlusal balance and muscles of mastication in patients with dental implant restoration. Twenty patients, each with a single edentulous posterior dentition with no distal dentition were selected, and divided into 2 groups. Patients in group A underwent original occlusion adjustment method and patients in group B underwent occlusal plane reduction technique. Ankylos implants were implanted in the edentulous space in each patient and restored with fixed prosthodontics single unit crown. Occlusion was adjusted in each restoration accordingly. Electromyograms were conducted to determine the effect of adjustment methods on occlusion and muscles of mastication 3 months and 6 months after initial restoration and adjustment. Data was collected and measurements for balanced occlusal measuring standards were obtained, including central occlusion force (COF), asymmetry index of molar occlusal force(AMOF). Balanced muscles of mastication measuring standards were also obtained including measurements from electromyogram for the muscles of mastication and the anterior bundle of the temporalis muscle at the mandibular rest position, average electromyogram measurements of the anterior bundle of the temporalis muscle at the intercuspal position(ICP), Astot, masseter muscle asymmetry index, and anterior temporalis asymmetry index (ASTA). Statistical analysis was performed using Student 's t test with SPSS 18.0 software package. Three months after occlusion adjustment, parameters of the original occlusion adjustment method were significantly different between group A and group B in balanced occlusal measuring standards and balanced muscles of mastication measuring standards. Six months after occlusion adjustment, parameters of the original occlusion adjustment methods were significantly different between group A and group B in balanced muscles of mastication measuring standards, but was no significant difference in balanced occlusal measuring standards. Using occlusion plane reduction adjustment technique, it is possible to obtain occlusion index and muscles of mastication's electromyogram index similar to the opposite side's natural dentition in patients with single unit fix prosthodontics crown and single posterior edentulous dentition without distal dentitions.
Evaluation of a new automated instrument for pretransfusion testing.
Morelati, F; Revelli, N; Maffei, L M; Poretti, M; Santoro, C; Parravicini, A; Rebulla, P; Cole, R; Sirchia, G
1998-10-01
A number of automated devices for pretransfusion testing have recently become available. This study evaluated a fully automated device based on column agglutination technology (AutoVue System, Ortho, Raritan, NJ). Some 6747 tests including forward and reverse ABO group, Rh type and phenotype, antibody screen, autocontrol, and crossmatch were performed on random samples from 1069 blood donors, 2063 patients, and 98 newborns and cord blood. Also tested were samples from 168 immunized patients and 53 donors expressing weak or variant A and D antigens. Test results and technician times required for their performance were compared with those obtained by standard methods (manual column agglutination technology, slide, semiautomatic handler). No erroneous conclusions were found in regard to the 5028 ABO group and Rh type or phenotype determinations carried out with the device. The device rejected 1.53 percent of tests for sample inadequacy. Of the remaining 18 tests with discrepant results found with the device and not confirmed with the standard methods, 6 gave such results because of mixed-field reactions, 10 gave negative results with A2 RBCs in reverse ABO grouping, and 2 gave very weak positive reactions in antibody screening and crossmatching. In the samples from immunized patients, the device missed one weak anti-K, whereas standard methods missed five weak antibodies. In addition, 48, 34, and 31 of the 53 weak or variant antigens were detected by the device, the slide method, and the semiautomated handler, respectively. Technician time with the standard methods was 1.6 to 7 times higher than that with the device. The technical performance of the device compared favorably with that of standard methods, with a number of advantages, including in particular the saving of technician time. Sample inadequacy was the most common cause of discrepancy, which suggests that standardization of sample collection can further improve the performance of the device.
Estimating airline operating costs
NASA Technical Reports Server (NTRS)
Maddalon, D. V.
1978-01-01
A review was made of the factors affecting commercial aircraft operating and delay costs. From this work, an airline operating cost model was developed which includes a method for estimating the labor and material costs of individual airframe maintenance systems. The model, similar in some respects to the standard Air Transport Association of America (ATA) Direct Operating Cost Model, permits estimates of aircraft-related costs not now included in the standard ATA model (e.g., aircraft service, landing fees, flight attendants, and control fees). A study of the cost of aircraft delay was also made and a method for estimating the cost of certain types of airline delay is described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kane, V.E.
1982-01-01
A class of goodness-of-fit estimators is found to provide a useful alternative in certain situations to the standard maximum likelihood method which has some undesirable estimation characteristics for estimation from the three-parameter lognormal distribution. The class of goodness-of-fit tests considered include the Shapiro-Wilk and Filliben tests which reduce to a weighted linear combination of the order statistics that can be maximized in estimation problems. The weighted order statistic estimators are compared to the standard procedures in Monte Carlo simulations. Robustness of the procedures are examined and example data sets analyzed.
Corbel, Michael J; Das, Rose Gaines; Lei, Dianliang; Xing, Dorothy K L; Horiuchi, Yoshinobu; Dobbelaer, Roland
2008-04-07
This report reflects the discussion and conclusions of a WHO group of experts from National Regulatory Authorities (NRAs), National Control Laboratories (NCLs), vaccine industries and other relevant institutions involved in standardization and control of diphtheria, tetanus and pertussis vaccines (DTP), held on 20-21 July 2006 and 28-30 March 2007, in Geneva Switzerland for the revision of WHO Manual for quality control of DTP vaccines. Taking into account recent developments and standardization in quality control methods and the revision of WHO recommendations for D, T, P vaccines, and a need for updating the manual has been recognized. In these two meetings the current situation of quality control methods in terms of potency, safety and identity tests for DTP vaccines and statistical analysis of data were reviewed. Based on the WHO recommendations and recent validation of testing methods, the content of current manual were reviewed and discussed. The group agreed that the principles to be observed in selecting methods included identifying those critical for assuring safety, efficacy and quality and which were consistent with WHO recommendations/requirements. Methods that were well recognized but not yet included in current Recommendations should be taken into account. These would include in vivo and/or in vitro methods for determining potency, safety testing and identity. The statistical analysis of the data should be revised and updated. It was noted that the mouse based assays for toxoid potency were still quite widely used and it was desirable to establish appropriate standards for these to enable the results to be related to the standard guinea pig assays. The working group was met again to review the first drafts and to input further suggestions or amendments to the contributions of the drafting groups. The revised manual was to be finalized and published by WHO.
VIDAS Listeria species Xpress (LSX).
Johnson, Ronald; Mills, John
2013-01-01
The AOAC GovVal study compared the VIDAS Listeria species Xpress (LSX) to the Health Products and Food Branch MFHPB-30 reference method for detection of Listeria on stainless steel. The LSX method utilizes a novel and proprietary enrichment media, Listeria Xpress broth, enabling detection of Listeria species in environmental samples with the automated VIDAS in a minimum of 26 h. The LSX method also includes the use of the chromogenic media, chromID Ottaviani Agosti Agar (OAA) and chromID Lmono for confirmation of LSX presumptive results. In previous AOAC validation studies comparing VIDAS LSX to the U.S. Food and Drug Administration's Bacteriological Analytical Manual (FDA-BAM) and the U.S. Department of Agriculture-Food Safety and Inspection Service (USDA-FSIS) reference methods, the LSX method was approved as AOAC Official Method 2010.02 for the detection of Listeria species in dairy products, vegetables, seafood, raw meats and poultry, and processed meats and poultry, and as AOAC Performance Tested Method 100501 in a variety of foods and on environmental surfaces. The GovVal comparative study included 20 replicate test portions each at two contamination levels for stainless steel where fractionally positive results (5-15 positive results/20 replicate portions tested) were obtained by at least one method at one level. Five uncontaminated controls were included. In the stainless steel artificially contaminated surface study, there were 25 confirmed positives by the VIDAS LSX assay and 22 confirmed positives by the standard culture methods. Chi-square analysis indicated no statistical differences between the VIDAS LSX method and the MFHPB-30 standard methods at the 5% level of significance. Confirmation of presumptive LSX results with the chromogenic OAA and Lmono media was shown to be equivalent to the appropriate reference method agars. The data in this study demonstrate that the VIDAS LSX method is an acceptable alternative method to the MFHPB-30 standard culture method for the detection of Listeria species on stainless steel.
Study Methods to Standardize Thermography NDE
NASA Technical Reports Server (NTRS)
Walker, James L.; Workman, Gary L.
1998-01-01
The purpose of this work is to develop thermographic inspection methods and standards for use in evaluating structural composites and aerospace hardware. Qualification techniques and calibration methods are investigated to standardize the thermographic method for use in the field. Along with the inspections of test standards structural hardware, support hardware is designed and fabricated to aid in the thermographic process. Also, a standard operating procedure is developed for performing inspections with the Bales Thermal Image Processor (TIP). Inspections are performed on a broad range of structural composites. These materials include various graphite/epoxies, graphite/cyanide-ester, graphite/silicon-carbide, graphite phenolic and Keviar/epoxy. Also metal honeycomb (titanium and aluminum faceplates over an aluminum honeycomb core) structures are investigated. Various structural shapes are investigated and the thickness of the structures vary from as few as 3 plies to as many as 80 plies. Special emphasis is placed on characterizing defects in attachment holes and bondlines, in addition to those resulting from impact damage and the inclusion of foreign matter. Image processing through statistical analysis and digital filtering is investigated to enhance the quality and quantify the NDE thermal images when necessary.
Study Methods to Standardize Thermography NDE
NASA Technical Reports Server (NTRS)
Walker, James L.; Workman, Gary L.
1998-01-01
The purpose of this work is to develop thermographic inspection methods and standards for use in evaluating structural composites and aerospace hardware. Qualification techniques and calibration methods are investigated to standardize the thermographic method for use in the field. Along with the inspections of test standards structural hardware, support hardware is designed and fabricated to aid in the thermographic process. Also, a standard operating procedure is developed for performing inspections with the Bales Thermal Image Processor (TIP). Inspections are performed on a broad range of structural composites. These materials include graphite/epoxies, graphite/cyanide-ester, graphite/silicon-carbide, graphite phenolic and Kevlar/epoxy. Also metal honeycomb (titanium and aluminum faceplates over an aluminum honeycomb core) structures are investigated. Various structural shapes are investigated and the thickness of the structures vary from as few as 3 plies to as many as 80 plies. Special emphasis is placed on characterizing defects in attachment holes and bondlines, in addition to those resulting from impact damage and the inclusion of foreign matter. Image processing through statistical analysis and digital filtering is investigated to enhance the quality and quantify the NDE thermal images when necessary.
Levitt, Heidi M; Bamberg, Michael; Creswell, John W; Frost, David M; Josselson, Ruthellen; Suárez-Orozco, Carola
2018-01-01
The American Psychological Association Publications and Communications Board Working Group on Journal Article Reporting Standards for Qualitative Research (JARS-Qual Working Group) was charged with examining the state of journal article reporting standards as they applied to qualitative research and with generating recommendations for standards that would be appropriate for a wide range of methods within the discipline of psychology. These standards describe what should be included in a research report to enable and facilitate the review process. This publication marks a historical moment-the first inclusion of qualitative research in APA Style, which is the basis of both the Publication Manual of the American Psychological Association (APA, 2010) and APA Style CENTRAL, an online program to support APA Style. In addition to the general JARS-Qual guidelines, the Working Group has developed standards for both qualitative meta-analysis and mixed methods research. The reporting standards were developed for psychological qualitative research but may hold utility for a broad range of social sciences. They honor a range of qualitative traditions, methods, and reporting styles. The Working Group was composed of a group of researchers with backgrounds in varying methods, research topics, and approaches to inquiry. In this article, they present these standards and their rationale, and they detail the ways that the standards differ from the quantitative research reporting standards. They describe how the standards can be used by authors in the process of writing qualitative research for submission as well as by reviewers and editors in the process of reviewing research. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
This data set contains the method performance results. This includes field blanks, method blanks, duplicate samples, analytical duplicates, matrix spikes, and surrogate recovery standards.
The Children’s Total Exposure to Persistent Pesticides and Other Persistent Pollutant (...
[Extraction method suitable for detection of unheated crustaceans including cephalothorax by ELISA].
Shibahara, Yusuke; Yamada, Itta; Uesaka, Yoshihiko; Uneo, Noriko; Abe, Akihisa; Ohashi, Eiji; Shiomi, Kazuo
2009-08-01
When unheated whole samples of crustaceans (shrimp, prawn and crab) were analyzed with our ELISA kit (FA test EIA-Crustacean 'Nissui') using anti-tropomyosin antibodies, a remarkable reduction in reactivity was recognized. This reduction in activity was found to be due to the digestion of tropomyosin during the extraction process by proteases contained in cephalothorax. To avoid the digestion of tropomyosin by proteases, we developed an extraction method (heating method) suitable for the detection of tropomyosin in unheated crustaceans including cephalothorax. Experiments with unheated whole samples of various species of crustaceans confirmed that the heating method greatly improved the low reactivity in the standard method; the heating method gave extraction efficiencies of as high as 93-107%. Various processed crustaceans with cephalothorax, such as dry products (unheated or weakly heated products) and pickles in soy sauce (unheated products), that showed low reactivity with the standard method were confirmed to give superior results with the heating method. These results indicated that the developed heating method is suitable for detecting unheated crustaceans with cephalothorax by means of the ELISA kit.
Cordeiro, Fernando; Robouch, Piotr; de la Calle, Maria Beatriz; Emteborg, Håkan; Charoud-Got, Jean; Schmitz, Franz
2011-01-01
A collaborative study, International Evaluation Measurement Programme-25a, was conducted in accordance with international protocols to determine the performance characteristics of an analytical method for the determination of dissolved bromate in drinking water. The method should fulfill the analytical requirements of Council Directive 98/83/EC (referred to in this work as the Drinking Water Directive; DWD). The new draft standard method under investigation is based on ion chromatography followed by post-column reaction and UV detection. The collaborating laboratories used the Draft International Organization for Standardization (ISO)/Draft International Standard (DIS) 11206 document. The existing standard method (ISO 15061:2001) is based on ion chromatography using suppressed conductivity detection, in which a preconcentration step may be required for the determination of bromate concentrations as low as 3 to 5 microg/L. The new method includes a dilution step that reduces the matrix effects, thus allowing the determination of bromate concentrations down to 0.5 microg/L. Furthermore, the method aims to minimize any potential interference of chlorite ions. The collaborative study investigated different types of drinking water, such as soft, hard, and mineral water. Other types of water, such as raw water (untreated), swimming pool water, a blank (named river water), and a bromate standard solution, were included as test samples. All test matrixes except the swimming pool water were spiked with high-purity potassium bromate to obtain bromate concentrations ranging from 1.67 to 10.0 microg/L. Swimming pool water was not spiked, as this water was incurred with bromate. Test samples were dispatched to 17 laboratories from nine different countries. Sixteen participants reported results. The repeatability RSD (RSD(r)) ranged from 1.2 to 4.1%, while the reproducibility RSD (RSDR) ranged from 2.3 to 5.9%. These precision characteristics compare favorably with those of ISO 15601. A thorough comparison of the performance characteristics is presented in this report. All method performance characteristics obtained in the frame of this collaborative study indicate that the draft ISO/DIS 11206 standard method meets the requirements set down by the DWD. It can, therefore, be considered to fit its intended analytical purpose.
78 FR 43066 - Magnuson-Stevens Act Provisions; National Standard 2-Scientific Information
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-19
... situations where simpler tools and assessment methods are warranted, scientific advice should be accompanied... higher data category and improve assessment methods.'' One commenter also suggested adding... consistent with appropriate scientific methods, undergo scientific review, and peer review, which may include...
The Large Introductory Class as an Exercise in Organization Design.
ERIC Educational Resources Information Center
Wagner, John A., III; Van Dyne, Linn
1999-01-01
Four methods for large group instruction differ in control and coordination dimensions: (1) centralization with mutual adjustment; (2) centralization with standardization; (3) decentralization with standardization; and (4) decentralization with mutual adjustment. Other factors to consider include class size and interests of various constituencies:…
Variations on an Historical Case Study
ERIC Educational Resources Information Center
Field, Patrick
2006-01-01
The National Inquiry Standard for Science Education Preparation requires science teachers to introduce students to scientific inquiry to solve problems by various methods, including active learning in a collaborative environment. In order for science teachers to comply with this inquiry standard, activities must be designed for students to…
ESTABLISH AND STANDARDIZE METHODOLOGY FOR DETECTION OF WATERBORNE VIRUSES FROM HUMAN SOURCES
Research is conducted to develop and standardize methods to detect and measure occurrence of human enteric viruses that cause waterborne disease. The viruses of concern include the emerging pathogens--hepatitis E virus and group B rotaviruses. Also of concern are the coxsackiev...
Dahlquist, Robert T; Reyner, Karina; Robinson, Richard D; Farzad, Ali; Laureano-Phillips, Jessica; Garrett, John S; Young, Joseph M; Zenarosa, Nestor R; Wang, Hao
2018-05-01
Emergency department (ED) shift handoffs are potential sources of delay in care. We aimed to determine the impact that using standardized reporting tool and process may have on throughput metrics for patients undergoing a transition of care at shift change. We performed a prospective, pre- and post-intervention quality improvement study from September 1 to November 30, 2015. A handoff procedure intervention, including a mandatory workshop and personnel training on a standard reporting system template, was implemented. The primary endpoint was patient length of stay (LOS). A comparative analysis of differences between patient LOS and various handoff communication methods were assessed pre- and post-intervention. Communication methods were entered a multivariable logistic regression model independently as risk factors for patient LOS. The final analysis included 1,006 patients, with 327 comprising the pre-intervention and 679 comprising the post-intervention populations. Bedside rounding occurred 45% of the time without a standard reporting during pre-intervention and increased to 85% of the time with the use of a standard reporting system in the post-intervention period (P < 0.001). Provider time (provider-initiated care to patient care completed) in the pre-intervention period averaged 297 min, but decreased to 265 min in the post-intervention period (P < 0.001). After adjusting for other communication methods, the use of a standard reporting system during handoff was associated with shortened ED LOS (OR = 0.60, 95% CI 0.40 - 0.90, P < 0.05). Standard reporting system use during emergency physician handoffs at shift change improves ED throughput efficiency and is associated with shorter ED LOS.
Grahn, Anna; Bråve, Andreas; Tolfvenstam, Thomas; Studahl, Marie
2018-06-01
Nosocomial transmission of Lassa virus (LASV) is reported to be low when care for the index patient includes proper barrier nursing methods. We investigated whether asymptomatic LASV infection occurred in healthcare workers who used standard barrier nursing methods during the first 15 days of caring for a patient with Lassa fever in Sweden. Of 76 persons who were defined as having been potentially exposed to LASV, 53 provided blood samples for detection of LASV IgG. These persons also responded to a detailed questionnaire to evaluate exposure to different body fluids from the index patient. LASV-specific IgG was not detected in any of the 53 persons. Five of 53 persons had not been using proper barrier nursing methods. Our results strengthen the argument for a low risk of secondary transmission of LASV in humans when standard barrier nursing methods are used and the patient has only mild symptoms.
40 CFR 60.103a - Work practice standards.
Code of Federal Regulations, 2010 CFR
2010-07-01
... becomes an affected flare subject to this subpart. The plan must include: (1) A diagram illustrating all connections to the flare; (2) Methods for monitoring flow rate to the flare, including a detailed description...
40 CFR 60.103a - Work practice standards.
Code of Federal Regulations, 2012 CFR
2012-07-01
... becomes an affected flare subject to this subpart. The plan must include: (1) A diagram illustrating all connections to the flare; (2) Methods for monitoring flow rate to the flare, including a detailed description...
System and methods of resource usage using an interoperable management framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.
Generic rights expression language allowing interoperability across different computing environments including resource usage of different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework are standardized, such as the operational semantics, including areas free of standards that necessitate choice and innovation to achieve a balance of flexibility and usability for interoperability in usage management systems.
Phillips, Melissa M; Bedner, Mary; Reitz, Manuela; Burdette, Carolyn Q; Nelson, Michael A; Yen, James H; Sander, Lane C; Rimmer, Catherine A
2017-02-01
Two independent analytical approaches, based on liquid chromatography with absorbance detection and liquid chromatography with mass spectrometric detection, have been developed for determination of isoflavones in soy materials. These two methods yield comparable results for a variety of soy-based foods and dietary supplements. Four Standard Reference Materials (SRMs) have been produced by the National Institute of Standards and Technology to assist the food and dietary supplement community in method validation and have been assigned values for isoflavone content using both methods. These SRMs include SRM 3234 Soy Flour, SRM 3236 Soy Protein Isolate, SRM 3237 Soy Protein Concentrate, and SRM 3238 Soy-Containing Solid Oral Dosage Form. A fifth material, SRM 3235 Soy Milk, was evaluated using the methods and found to be inhomogeneous for isoflavones and unsuitable for value assignment. Graphical Abstract Separation of six isoflavone aglycones and glycosides found in Standard Reference Material (SRM) 3236 Soy Protein Isolate.
Li, Dan; Jiang, Jia; Han, Dandan; Yu, Xinyu; Wang, Kun; Zang, Shuang; Lu, Dayong; Yu, Aimin; Zhang, Ziwei
2016-04-05
A new method is proposed for measuring the antioxidant capacity by electron spin resonance spectroscopy based on the loss of electron spin resonance signal after Cu(2+) is reduced to Cu(+) with antioxidant. Cu(+) was removed by precipitation in the presence of SCN(-). The remaining Cu(2+) was coordinated with diethyldithiocarbamate, extracted into n-butanol and determined by electron spin resonance spectrometry. Eight standards widely used in antioxidant capacity determination, including Trolox, ascorbic acid, ferulic acid, rutin, caffeic acid, quercetin, chlorogenic acid, and gallic acid were investigated. The standard curves for determining the eight standards were plotted, and results showed that the linear regression correlation coefficients were all high enough (r > 0.99). Trolox equivalent antioxidant capacity values for the antioxidant standards were calculated, and a good correlation (r > 0.94) between the values obtained by the present method and cupric reducing antioxidant capacity method was observed. The present method was applied to the analysis of real fruit samples and the evaluation of the antioxidant capacity of these fruits.
A Stable Whole Building Performance Method for Standard 90.1-Part II
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenberg, Michael I.; Eley, Charles
2016-06-01
In May of 2013 we introduced a new approach for compliance with Standard 90.1 that was under development based on the Performance Rating Method of Appendix G to Standard 90.11. Since then, the approach has been finalized through Addendum BM to Standard 90.1-2013 and will be published in the 2016 edition of the Standard. In the meantime, ASHRAE has published an advanced copy of Appendix G including Addendum BM and several other addenda so that software developers and energy program administrators can get a preview of what is coming in the 2016 edition of the Standard2. This article is anmore » update on Addendum BM, summarizes changes made to the original concept as introduced in May of 2013, and provides an approach for developing performance targets for code compliance and beyond code programs.« less
This data set contains the method performance results for CTEPP-OH. This includes field blanks, method blanks, duplicate samples, analytical duplicates, matrix spikes, and surrogate recovery standards.
The Children’s Total Exposure to Persistent Pesticides and Other Persisten...
Flood-frequency prediction methods for unregulated streams of Tennessee, 2000
Law, George S.; Tasker, Gary D.
2003-01-01
Up-to-date flood-frequency prediction methods for unregulated, ungaged rivers and streams of Tennessee have been developed. Prediction methods include the regional-regression method and the newer region-of-influence method. The prediction methods were developed using stream-gage records from unregulated streams draining basins having from 1 percent to about 30 percent total impervious area. These methods, however, should not be used in heavily developed or storm-sewered basins with impervious areas greater than 10 percent. The methods can be used to estimate 2-, 5-, 10-, 25-, 50-, 100-, and 500-year recurrence-interval floods of most unregulated rural streams in Tennessee. A computer application was developed that automates the calculation of flood frequency for unregulated, ungaged rivers and streams of Tennessee. Regional-regression equations were derived by using both single-variable and multivariable regional-regression analysis. Contributing drainage area is the explanatory variable used in the single-variable equations. Contributing drainage area, main-channel slope, and a climate factor are the explanatory variables used in the multivariable equations. Deleted-residual standard error for the single-variable equations ranged from 32 to 65 percent. Deleted-residual standard error for the multivariable equations ranged from 31 to 63 percent. These equations are included in the computer application to allow easy comparison of results produced by the different methods. The region-of-influence method calculates multivariable regression equations for each ungaged site and recurrence interval using basin characteristics from 60 similar sites selected from the study area. Explanatory variables that may be used in regression equations computed by the region-of-influence method include contributing drainage area, main-channel slope, a climate factor, and a physiographic-region factor. Deleted-residual standard error for the region-of-influence method tended to be only slightly smaller than those for the regional-regression method and ranged from 27 to 62 percent.
78 FR 20695 - Walk-Through Metal Detectors and Hand-Held Metal Detectors Test Method Validation
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-05
... Detectors and Hand-Held Metal Detectors Test Method Validation AGENCY: National Institute of Justice, DOJ... ensure that the test methods in the standards are properly documented, NIJ is requesting proposals (including price quotes) for test method validation efforts from testing laboratories. NIJ is also seeking...
This compendium includes descriptions of methods for analyzing metals, pesticides and volatile organic compounds (VOCs) in water. The individual methods covered are these: (1) Method 200.8: determination of trace elements in waters and wastes by inductively coupled plasma-mass s...
Endo, Yasushi
2018-01-01
Edible fats and oils are among the basic components of the human diet, along with carbohydrates and proteins, and they are the source of high energy and essential fatty acids such as linoleic and linolenic acids. Edible fats and oils are used in for pan- and deep-frying, and in salad dressing, mayonnaise and processed foods such as chocolates and cream. The physical and chemical properties of edible fats and oils can affect the quality of oil foods and hence must be evaluated in detail. The physical characteristics of edible fats and oils include color, specific gravity, refractive index, melting point, congeal point, smoke point, flash point, fire point, and viscosity, while the chemical characteristics include acid value, saponification value, iodine value, fatty acid composition, trans isomers, triacylglycerol composition, unsaponifiable matters (sterols, tocopherols) and minor components (phospholipids, chlorophyll pigments, glycidyl fatty acid esters). Peroxide value, p-anisidine value, carbonyl value, polar compounds and polymerized triacylglycerols are indexes of the deterioration of edible fats and oils. This review describes the analytical methods to evaluate the quality of edible fats and oils, especially the Standard Methods for Analysis of Fats, Oils and Related Materials edited by Japan Oil Chemists' Society (the JOCS standard methods) and advanced methods.
Standard model of knowledge representation
NASA Astrophysics Data System (ADS)
Yin, Wensheng
2016-09-01
Knowledge representation is the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic network, computer programming language, database, mathematical model, graphics language, natural language, etc. To establish the intrinsic link between various knowledge representation methods, a unified knowledge representation model is necessary. According to ontology, system theory, and control theory, a standard model of knowledge representation that reflects the change of the objective world is proposed. The model is composed of input, processing, and output. This knowledge representation method is not a contradiction to the traditional knowledge representation method. It can express knowledge in terms of multivariate and multidimensional. It can also express process knowledge, and at the same time, it has a strong ability to solve problems. In addition, the standard model of knowledge representation provides a way to solve problems of non-precision and inconsistent knowledge.
Hypothesis Testing Using Factor Score Regression: A Comparison of Four Methods
ERIC Educational Resources Information Center
Devlieger, Ines; Mayer, Axel; Rosseel, Yves
2016-01-01
In this article, an overview is given of four methods to perform factor score regression (FSR), namely regression FSR, Bartlett FSR, the bias avoiding method of Skrondal and Laake, and the bias correcting method of Croon. The bias correcting method is extended to include a reliable standard error. The four methods are compared with each other and…
This report is a standardized methodology description for the determination of strong acidity of fine particles (less than 2.5 microns) in ambient air using annular denuder technology. his methodology description includes two parts: art A - Standard Method and Part B - Enhanced M...
A Dialogic Construction of Ethical Standards for the Teaching Profession
ERIC Educational Resources Information Center
Smith, Deirdre Mary
2013-01-01
In Ontario, Canada, both the educational community and the public, which is understood to include parents, students and citizens of the province, participated in a multi-phased, longitudinal, dialogic inquiry to develop a set of ethical standards for the teaching profession. Collective discovery methods, syntheses, and validation of ethical…
48 CFR 9904.406-61 - Interpretation.
Code of Federal Regulations, 2010 CFR
2010-10-01
... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.406-61 Interpretation. (a) Questions have arisen as to... categories of costs that have been included in the past and may be considered in the future as restructuring... restructuring costs shall not exceed five years. The straight-line method of amortization should normally be...
Science Education in the Boy Scouts of America
ERIC Educational Resources Information Center
Hintz, Rachel Sterneman
2009-01-01
This study of science education in the Boy Scouts of America focused on males with Boy Scout experience. The mixed-methods study topics included: merit badge standards compared with National Science Education Standards, Scout responses to open-ended survey questions, the learning styles of Scouts, a quantitative assessment of science content…
School Occupational Exposure to Bloodborne Pathogens. A Report To Provide Information.
ERIC Educational Resources Information Center
Iowa State Dept. of Education, Des Moines. Bureau of Special Education.
This report discusses a federally mandated standard concerning establishment of a program to reduce exposure to blood and other potentially infectious materials (OPIM) in Iowa schools and education agencies. The standard includes the following components: introduction, scope and application, definitions, exposure control, methods of compliance,…
Missing portion sizes in FFQ--alternatives to use of standard portions.
Køster-Rasmussen, Rasmus; Siersma, Volkert; Halldorsson, Thorhallur I; de Fine Olivarius, Niels; Henriksen, Jan E; Heitmann, Berit L
2015-08-01
Standard portions or substitution of missing portion sizes with medians may generate bias when quantifying the dietary intake from FFQ. The present study compared four different methods to include portion sizes in FFQ. We evaluated three stochastic methods for imputation of portion sizes based on information about anthropometry, sex, physical activity and age. Energy intakes computed with standard portion sizes, defined as sex-specific medians (median), or with portion sizes estimated with multinomial logistic regression (MLR), 'comparable categories' (Coca) or k-nearest neighbours (KNN) were compared with a reference based on self-reported portion sizes (quantified by a photographic food atlas embedded in the FFQ). The Danish Health Examination Survey 2007-2008. The study included 3728 adults with complete portion size data. Compared with the reference, the root-mean-square errors of the mean daily total energy intake (in kJ) computed with portion sizes estimated by the four methods were (men; women): median (1118; 1061), MLR (1060; 1051), Coca (1230; 1146), KNN (1281; 1181). The equivalent biases (mean error) were (in kJ): median (579; 469), MLR (248; 178), Coca (234; 188), KNN (-340; 218). The methods MLR and Coca provided the best agreement with the reference. The stochastic methods allowed for estimation of meaningful portion sizes by conditioning on information about physiology and they were suitable for multiple imputation. We propose to use MLR or Coca to substitute missing portion size values or when portion sizes needs to be included in FFQ without portion size data.
Algorithms for the explicit computation of Penrose diagrams
NASA Astrophysics Data System (ADS)
Schindler, J. C.; Aguirre, A.
2018-05-01
An algorithm is given for explicitly computing Penrose diagrams for spacetimes of the form . The resulting diagram coordinates are shown to extend the metric continuously and nondegenerately across an arbitrary number of horizons. The method is extended to include piecewise approximations to dynamically evolving spacetimes using a standard hypersurface junction procedure. Examples generated by an implementation of the algorithm are shown for standard and new cases. In the appendix, this algorithm is compared to existing methods.
1990-10-04
methods Category 6: Cryptographic methods (hard/ software ) - Tested countermeasures and standard means - Acknowledgements As the number of antivirus ...Skulason), only our own antiviruses have been mentioned in the catalog. We hope to include the major antivirus packages in the future. The current...Center GTE SRI International Trusted Information Systems, Inc. Grumann Data Systems SRI International Software Engineering Institute Trusted
Ghannoum, M. A.; Arthington-Skaggs, B.; Chaturvedi, V.; Espinel-Ingroff, A.; Pfaller, M. A.; Rennie, R.; Rinaldi, M. G.; Walsh, T. J.
2006-01-01
The Clinical and Laboratory Standards Institute (CLSI; formerly National Committee for Clinical Laboratory Standards, or NCCLS) M38-A standard for the susceptibility testing of filamentous fungi does not specifically address the testing of dermatophytes. In 2003, a multicenter study investigated the reproducibility of the microdilution method developed at the Center for Medical Mycology, Cleveland, Ohio, for testing the susceptibility of dermatophytes. Data from that study supported the introduction of this method for testing dermatophytes in the future version of the CLSI M38-A standard. In order for the method to be accepted by CLSI, appropriate quality control isolates needed to be identified. To that end, an interlaboratory study, involving the original six laboratories plus two additional sites, was conducted to evaluate potential candidates for quality control isolates. These candidate strains included five Trichophyton rubrum strains known to have elevated MICs to terbinafine and five Trichophyton mentagrophytes strains. Antifungal agents tested included ciclopirox, fluconazole, griseofulvin, itraconazole, posaconazole, terbinafine, and voriconazole. Based on the data generated, two quality control isolates, one T. rubrum isolate and one T. mentagrophytes isolate, were identified and submitted to the American Type Culture Collection (ATCC) for inclusion as reference strains. Ranges encompassing 95.2 to 97.9% of all data points for all seven drugs were established. PMID:17050812
Matteson, Brent S; Hanson, Susan K; Miller, Jeffrey L; Oldham, Warren J
2015-04-01
An optimized method was developed to analyze environmental soil and sediment samples for (237)Np, (239)Pu, and (240)Pu by ICP-MS using a (242)Pu isotope dilution standard. The high yield, short time frame required for analysis, and the commercial availability of the (242)Pu tracer are significant advantages of the method. Control experiments designed to assess method uncertainty, including variation in inter-element fractionation that occurs during the purification protocol, suggest that the overall precision for measurements of (237)Np is typically on the order of ± 5%. Measurements of the (237)Np concentration in a Peruvian Soil blank (NIST SRM 4355) spiked with a known concentration of (237)Np tracer confirmed the accuracy of the method, agreeing well with the expected value. The method has been used to determine neptunium and plutonium concentrations in several environmental matrix standard reference materials available from NIST: SRM 4357 (Radioactivity Standard), SRM 1646a (Estuarine Sediment) and SRM 2702 (Inorganics in Marine Sediment). Copyright © 2015 Elsevier Ltd. All rights reserved.
New patient-centered care standards from the commission on cancer: opportunities and challenges.
Fashoyin-Aje, Lola A; Martinez, Kathryn A; Dy, Sydney M
2012-01-01
The Commission on Cancer of the American College of Surgeons publishes accreditation standards that hospitals and cancer treatment centers implement to ensure quality care to cancer patients. These standards address the full spectrum of cancer care, from cancer prevention to survivorship and end-of-life care. The most recent revisions of these standards included new standards in "patient-centered areas," including the provision of palliative care services, treatment and survivorship plans, psychological distress screening, and patient navigation programs. Unified by their emphasis on the early identification of patients at risk of receiving suboptimal care and the importance of ensuring that issues arising during and after completion of cancer treatment are addressed, they are a welcome expansion of the standards guiding cancer care. As with all standards, however, the next steps will be to further define how they will be implemented and to determine how success will be assessed. This will require ongoing critical evaluation of the standards and their implementation, including the need for member institutions to define successful implementation methods and measurable outcomes and identification of areas most in need of further research. Copyright © 2012 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1969-07-01
The Fifth International Conference on Nondestructive Testing was held in Montreal, Canada, for the purpose of promoting international collaboration in all matters related to the development and use of nondestructive test methods. A total of 82 papers were selected for presentation. Session titles included: evaluation of material quality; ultrasonics - identification and measurements; thermal methods; testing of welds; visual aids in nondestructive testing; measurements of stress and elastic properties; magnetic and eddy-current methods; surface methods and neutron radiography; standardization - general; ultrasonics at elevated temperatures; applications; x-ray techniques; radiography; ultrasonic standardization; training and qualification; and, correlation of weld defects.
Interagency comparison of iodometric methods for ozone determination
NASA Technical Reports Server (NTRS)
Demore, W. B.; Romanovsky, J. C.; Feldstein, M.; Mueller, P. K.; Hamming, W. J.
1976-01-01
The California Air Resources Board appointed an Oxidant Calibration Committee for the purpose of evaluating the accuracy of the different agency calibration procedures. The committee chose UV absorption photometry as the reference method for ozone measurement. Interagency comparisons of the various iodometric methods were conducted relative to the ultraviolet standard. The tests included versions of the iodometric methods as employed by the Air Resources Board, the Los Angeles Air Pollution Control District, and the EPA. An alternative candidate reference method for ozone measurement, gas phase titration, was also included in the test series.
Complete 3D kinematics of upper extremity functional tasks.
van Andel, Carolien J; Wolterbeek, Nienke; Doorenbosch, Caroline A M; Veeger, DirkJan H E J; Harlaar, Jaap
2008-01-01
Upper extremity (UX) movement analysis by means of 3D kinematics has the potential to become an important clinical evaluation method. However, no standardized protocol for clinical application has yet been developed, that includes the whole upper limb. Standardization problems include the lack of a single representative function, the wide range of motion of joints and the complexity of the anatomical structures. A useful protocol would focus on the functional status of the arm and particularly the orientation of the hand. The aim of this work was to develop a standardized measurement method for unconstrained movement analysis of the UX that includes hand orientation, for a set of functional tasks for the UX and obtain normative values. Ten healthy subjects performed four representative activities of daily living (ADL). In addition, six standard active range of motion (ROM) tasks were executed. Joint angles of the wrist, elbow, shoulder and scapula were analyzed throughout each ADL task and minimum/maximum angles were determined from the ROM tasks. Characteristic trajectories were found for the ADL tasks, standard deviations were generally small and ROM results were consistent with the literature. The results of this study could form the normative basis for the development of a 'UX analysis report' equivalent to the 'gait analysis report' and would allow for future comparisons with pediatric and/or pathologic movement patterns.
Trichinella diagnostics and control: mandatory and best practices for ensuring food safety.
Gajadhar, Alvin A; Pozio, Edoardo; Gamble, H Ray; Nöckler, Karsten; Maddox-Hyttel, Charlotte; Forbes, Lorry B; Vallée, Isabelle; Rossi, Patrizia; Marinculić, Albert; Boireau, Pascal
2009-02-23
Because of its role in human disease, there are increasing global requirements for reliable diagnostic and control methods for Trichinella in food animals to ensure meat safety and to facilitate trade. Consequently, there is a need for standardization of methods, programs, and best practices used in the control of Trichinella and trichinellosis. This review article describes the biology and epidemiology of Trichinella, and describes recommended test methods as well as modified and optimized procedures that are used in meat inspection programs. The use of ELISA for monitoring animals for infection in various porcine and equine pre- and post-slaughter programs, including farm or herd certification programs is also discussed. A brief review of the effectiveness of meat processing methods, such as freezing, cooking and preserving is provided. The importance of proper quality assurance and its application in all aspects of a Trichinella diagnostic system is emphasized. It includes the use of international quality standards, test validation and standardization, critical control points, laboratory accreditation, certification of analysts and proficiency testing. Also described, are the roles and locations of international and regional reference laboratories for trichinellosis where expert advice and support on research and diagnostics are available.
Teaching Prevention in Pediatrics.
ERIC Educational Resources Information Center
Cheng, Tina L.; Greenberg, Larrie; Loeser, Helen; Keller, David
2000-01-01
Reviews methods of teaching preventive medicine in pediatrics and highlights innovative programs. Methods of teaching prevention in pediatrics include patient interactions, self-directed learning, case-based learning, small-group learning, standardized patients, computer-assisted instruction, the Internet, student-centered learning, and lectures.…
In Situ Catalytic Groundwater Treatment Using Pd-Catalysts and Horizontal Flow Treatment Wells
2007-02-01
1,2,4-Trichlorobenzene Hexachlorobutadiene Naphthalene 1,2,3-Trichlorobenzene Internal Standards Fluorobenzene 2-Bromo- 1 - chloropropane a Retention...internal standard method using a purge-and-trap. Internal standards were: Fluorobenzene for PID, 2-Bromo- 1 - chloropropane for HECD. b Detector does not...information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and
Environmental Chemicals in Urine and Blood: Improving Methods for Creatinine and Lipid Adjustment
O’Brien, Katie M.; Upson, Kristen; Cook, Nancy R.; Weinberg, Clarice R.
2015-01-01
Background Investigators measuring exposure biomarkers in urine typically adjust for creatinine to account for dilution-dependent sample variation in urine concentrations. Similarly, it is standard to adjust for serum lipids when measuring lipophilic chemicals in serum. However, there is controversy regarding the best approach, and existing methods may not effectively correct for measurement error. Objectives We compared adjustment methods, including novel approaches, using simulated case–control data. Methods Using a directed acyclic graph framework, we defined six causal scenarios for epidemiologic studies of environmental chemicals measured in urine or serum. The scenarios include variables known to influence creatinine (e.g., age and hydration) or serum lipid levels (e.g., body mass index and recent fat intake). Over a range of true effect sizes, we analyzed each scenario using seven adjustment approaches and estimated the corresponding bias and confidence interval coverage across 1,000 simulated studies. Results For urinary biomarker measurements, our novel method, which incorporates both covariate-adjusted standardization and the inclusion of creatinine as a covariate in the regression model, had low bias and possessed 95% confidence interval coverage of nearly 95% for most simulated scenarios. For serum biomarker measurements, a similar approach involving standardization plus serum lipid level adjustment generally performed well. Conclusions To control measurement error bias caused by variations in serum lipids or by urinary diluteness, we recommend improved methods for standardizing exposure concentrations across individuals. Citation O’Brien KM, Upson K, Cook NR, Weinberg CR. 2016. Environmental chemicals in urine and blood: improving methods for creatinine and lipid adjustment. Environ Health Perspect 124:220–227; http://dx.doi.org/10.1289/ehp.1509693 PMID:26219104
Fuels characterization studies. [jet fuels
NASA Technical Reports Server (NTRS)
Seng, G. T.; Antoine, A. C.; Flores, F. J.
1980-01-01
Current analytical techniques used in the characterization of broadened properties fuels are briefly described. Included are liquid chromatography, gas chromatography, and nuclear magnetic resonance spectroscopy. High performance liquid chromatographic ground-type methods development is being approached from several directions, including aromatic fraction standards development and the elimination of standards through removal or partial removal of the alkene and aromatic fractions or through the use of whole fuel refractive index values. More sensitive methods for alkene determinations using an ultraviolet-visible detector are also being pursued. Some of the more successful gas chromatographic physical property determinations for petroleum derived fuels are the distillation curve (simulated distillation), heat of combustion, hydrogen content, API gravity, viscosity, flash point, and (to a lesser extent) freezing point.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, D. C.; Gu, X.; Haldenman, S.
The curing of cross-linkable encapsulation is a critical consideration for photovoltaic (PV) modules manufactured using a lamination process. Concerns related to ethylene-co-vinyl acetate (EVA) include the quality (e.g., expiration and uniformity) of the films or completion (duration) of the cross-linking of the EVA within a laminator. Because these issues are important to both EVA and module manufacturers, an international standard has recently been proposed by the Encapsulation Task-Group within the Working Group 2 (WG2) of the International Electrotechnical Commission (IEC) Technical Committee 82 (TC82) for the quantification of the degree of cure for EVA encapsulation. The present draft of themore » standard calls for the use of differential scanning calorimetry (DSC) as the rapid, enabling secondary (test) method. Both the residual enthalpy- and melt/freeze-DSC methods are identified. The DSC methods are calibrated against the gel content test, the primary (reference) method. Aspects of other established methods, including indentation and rotor cure metering, were considered by the group. Key details of the test procedure will be described.« less
The method of expected number of deaths, 1786-1886-1986.
Keiding, N
1987-04-01
"The method of expected number of deaths is an integral part of standardization of vital rates, which is one of the oldest statistical techniques. The expected number of deaths was calculated in 18th century actuarial mathematics...but the method seems to have been forgotten, and was reinvented in connection with 19th century studies of geographical and occupational variations of mortality.... It is noted that standardization of rates is intimately connected to the study of relative mortality, and a short description of very recent developments in the methodology of that area is included." (SUMMARY IN FRE) excerpt
Wan, Xiang; Wang, Wenqian; Liu, Jiming; Tong, Tiejun
2014-12-19
In systematic reviews and meta-analysis, researchers often pool the results of the sample mean and standard deviation from a set of similar clinical trials. A number of the trials, however, reported the study using the median, the minimum and maximum values, and/or the first and third quartiles. Hence, in order to combine results, one may have to estimate the sample mean and standard deviation for such trials. In this paper, we propose to improve the existing literature in several directions. First, we show that the sample standard deviation estimation in Hozo et al.'s method (BMC Med Res Methodol 5:13, 2005) has some serious limitations and is always less satisfactory in practice. Inspired by this, we propose a new estimation method by incorporating the sample size. Second, we systematically study the sample mean and standard deviation estimation problem under several other interesting settings where the interquartile range is also available for the trials. We demonstrate the performance of the proposed methods through simulation studies for the three frequently encountered scenarios, respectively. For the first two scenarios, our method greatly improves existing methods and provides a nearly unbiased estimate of the true sample standard deviation for normal data and a slightly biased estimate for skewed data. For the third scenario, our method still performs very well for both normal data and skewed data. Furthermore, we compare the estimators of the sample mean and standard deviation under all three scenarios and present some suggestions on which scenario is preferred in real-world applications. In this paper, we discuss different approximation methods in the estimation of the sample mean and standard deviation and propose some new estimation methods to improve the existing literature. We conclude our work with a summary table (an Excel spread sheet including all formulas) that serves as a comprehensive guidance for performing meta-analysis in different situations.
A simple web-based tool to compare freshwater fish data collected using AFS standard methods
Bonar, Scott A.; Mercado-Silva, Norman; Rahr, Matt; Torrey, Yuta T.; Cate, Averill
2016-01-01
The American Fisheries Society (AFS) recently published Standard Methods for Sampling North American Freshwater Fishes. Enlisting the expertise of 284 scientists from 107 organizations throughout Canada, Mexico, and the United States, this text was developed to facilitate comparisons of fish data across regions or time. Here we describe a user-friendly web tool that automates among-sample comparisons in individual fish condition, population length-frequency distributions, and catch per unit effort (CPUE) data collected using AFS standard methods. Currently, the web tool (1) provides instantaneous summaries of almost 4,000 data sets of condition, length frequency, and CPUE of common freshwater fishes collected using standard gears in 43 states and provinces; (2) is easily appended with new standardized field data to update subsequent queries and summaries; (3) compares fish data from a particular water body with continent, ecoregion, and state data summaries; and (4) provides additional information about AFS standard fish sampling including benefits, ongoing validation studies, and opportunities to comment on specific methods. The web tool—programmed in a PHP-based Drupal framework—was supported by several AFS Sections, agencies, and universities and is freely available from the AFS website and fisheriesstandardsampling.org. With widespread use, the online tool could become an important resource for fisheries biologists.
Probability of Failure Analysis Standards and Guidelines for Expendable Launch Vehicles
NASA Astrophysics Data System (ADS)
Wilde, Paul D.; Morse, Elisabeth L.; Rosati, Paul; Cather, Corey
2013-09-01
Recognizing the central importance of probability of failure estimates to ensuring public safety for launches, the Federal Aviation Administration (FAA), Office of Commercial Space Transportation (AST), the National Aeronautics and Space Administration (NASA), and U.S. Air Force (USAF), through the Common Standards Working Group (CSWG), developed a guide for conducting valid probability of failure (POF) analyses for expendable launch vehicles (ELV), with an emphasis on POF analysis for new ELVs. A probability of failure analysis for an ELV produces estimates of the likelihood of occurrence of potentially hazardous events, which are critical inputs to launch risk analysis of debris, toxic, or explosive hazards. This guide is intended to document a framework for POF analyses commonly accepted in the US, and should be useful to anyone who performs or evaluates launch risk analyses for new ELVs. The CSWG guidelines provide performance standards and definitions of key terms, and are being revised to address allocation to flight times and vehicle response modes. The POF performance standard allows a launch operator to employ alternative, potentially innovative methodologies so long as the results satisfy the performance standard. Current POF analysis practice at US ranges includes multiple methodologies described in the guidelines as accepted methods, but not necessarily the only methods available to demonstrate compliance with the performance standard. The guidelines include illustrative examples for each POF analysis method, which are intended to illustrate an acceptable level of fidelity for ELV POF analyses used to ensure public safety. The focus is on providing guiding principles rather than "recipe lists." Independent reviews of these guidelines were performed to assess their logic, completeness, accuracy, self- consistency, consistency with risk analysis practices, use of available information, and ease of applicability. The independent reviews confirmed the general validity of the performance standard approach and suggested potential updates to improve the accuracy each of the example methods, especially to address reliability growth.
Nuutinen, Mikko; Virtanen, Toni; Rummukainen, Olli; Häkkinen, Jukka
2016-03-01
This article presents VQone, a graphical experiment builder, written as a MATLAB toolbox, developed for image and video quality ratings. VQone contains the main elements needed for the subjective image and video quality rating process. This includes building and conducting experiments and data analysis. All functions can be controlled through graphical user interfaces. The experiment builder includes many standardized image and video quality rating methods. Moreover, it enables the creation of new methods or modified versions from standard methods. VQone is distributed free of charge under the terms of the GNU general public license and allows code modifications to be made so that the program's functions can be adjusted according to a user's requirements. VQone is available for download from the project page (http://www.helsinki.fi/psychology/groups/visualcognition/).
Selecting, training and assessing new general practice community teachers in UK medical schools.
Hydes, Ciaran; Ajjawi, Rola
2015-09-01
Standards for undergraduate medical education in the UK, published in Tomorrow's Doctors, include the criterion 'everyone involved in educating medical students will be appropriately selected, trained, supported and appraised'. To establish how new general practice (GP) community teachers of medical students are selected, initially trained and assessed by UK medical schools and establish the extent to which Tomorrow's Doctors standards are being met. A mixed-methods study with questionnaire data collected from 24 lead GPs at UK medical schools, 23 new GP teachers from two medical schools plus a semi-structured telephone interview with two GP leads. Quantitative data were analysed descriptively and qualitative data were analysed informed by framework analysis. GP teachers' selection is non-standardised. One hundred per cent of GP leads provide initial training courses for new GP teachers; 50% are mandatory. The content and length of courses varies. All GP leads use student feedback to assess teaching, but other required methods (peer review and patient feedback) are not universally used. To meet General Medical Council standards, medical schools need to include equality and diversity in initial training and use more than one method to assess new GP teachers. Wider debate about the selection, training and assessment of new GP teachers is needed to agree minimum standards.
Zhao, Gai; Bian, Yang; Li, Ming
2013-12-18
To analyze the impact of passing items above the roof level in the gross motor subtest of Peabody development motor scales (PDMS-2) on its assessment results. In the subtests of PDMS-2, 124 children from 1.2 to 71 months were administered. Except for the original scoring method, a new scoring method which includes passing items above the ceiling were developed. The standard scores and quotients of the two scoring methods were compared using the independent-samples t test. Only one child could pass the items above the ceiling in the stationary subtest, 19 children in the locomotion subtest, and 17 children in the visual-motor integration subtest. When the scores of these passing items were included in the raw scores, the total raw scores got the added points of 1-12, the standard scores added 0-1 points and the motor quotients added 0-3 points. The diagnostic classification was changed only in two children. There was no significant difference between those two methods about motor quotients or standard scores in the specific subtest (P>0.05). The passing items above a ceiling of PDMS-2 isn't a rare situation. It usually takes place in the locomotion subtest and visual-motor integration subtest. Including these passing items into the scoring system will not make significant difference in the standard scores of the subtests or the developmental motor quotients (DMQ), which supports the original setting of a ceiling established by upassing 3 items in a row. However, putting the passing items above the ceiling into the raw score will improve tracking of children's developmental trajectory and intervention effects.
A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.
Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, includingmore » standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.« less
A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers
Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.; ...
2016-09-14
Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, includingmore » standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.« less
The U.S. Environmental Protection Agency (EPA) held a workshop in January 2003 on the detection of viruses in water using polymerase chain reaction (PCR)-based methods. Speakers were asked to address a series of specific questions, including whether a single standard method coul...
Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang
2018-01-01
Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We presented a validated 1 H-qNMR method for assay of octreotide acetate, a kind of cyclic octopeptide. Deuterium oxide was used to remove the undesired exchangeable peaks, which was referred to as proton exchange, in order to make the quantitative signals isolated in the crowded spectrum of the peptide and ensure precise quantitative analysis. Gemcitabine hydrochloride was chosen as the suitable internal standard. Experimental conditions, including relaxation delay time, the numbers of scans, and pulse angle, were optimized first. Then method validation was carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that by means of high performance liquid chromatography, which is provided by Chinese Pharmacopoeia. The statistical F test, Student's t test, and nonparametric test at 95% confidence level indicate that there was no significant difference between these two methods. qNMR is a simple and accurate quantitative tool with no need for specific corresponding reference standards. It has the potential of the quantitative analysis of other peptide drugs and standardization of the corresponding reference standards. Copyright © 2017 John Wiley & Sons, Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The first part covers standards for gaseous fuels. The second part covers standards on coal and coke including the classification of coals, determination of major elements in coal ash and trace elements in coal, metallurgical properties of coal and coke, methods of analysis of coal and coke, petrogrpahic analysis of coal and coke, physical characteristics of coal, quality assurance and sampling.
Instructional Practices: A Qualitative Study on the Response to Common Core Standardized Testing
ERIC Educational Resources Information Center
Hightower, Gabrielle
2017-01-01
The purpose of this qualitative study was to examine the instructional practices implemented by Tennessee elementary teachers in response to Common Core Standardized Testing. This research study utilized a basic qualitative method that included a purposive and convenient sampling. This qualitative study focused on face-to-face interviews, phone…
From Taylor to Tyler to "No Child Left Behind": Legitimating Educational Standards
ERIC Educational Resources Information Center
Waldow, Florian
2015-01-01
In the early 20th century, proponents of the so-called "social efficiency movement" in the United States tried to apply methods and concepts for enhancing efficiency in industrial production to the organization of teaching and learning processes. This included the formulation of "educational standards" analogous to industrial…
NASA Technical Reports Server (NTRS)
2009-01-01
This Interim Standard establishes requirements for evaluation, testing, and selection of materials that are intended for use in space vehicles, associated Ground Support Equipment (GSE), and facilities used during assembly, test, and flight operations. Included are requirements, criteria, and test methods for evaluating the flammability, offgassing, and compatibility of materials.
Readability Levels of Health-Based Websites: From Content to Comprehension
ERIC Educational Resources Information Center
Schutten, Mary; McFarland, Allison
2009-01-01
Three of the national health education standards include decision-making, accessing information and analyzing influences. WebQuests are a popular inquiry-oriented method used by secondary teachers to help students achieve these content standards. While WebQuests support higher level thinking skills, the readability level of the information on the…
Standardized Tests of Handwriting Readiness: A Systematic Review of the Literature
ERIC Educational Resources Information Center
van Hartingsveldt, Margo J.; de Groot, Imelda J. M.; Aarts, Pauline B. M.; Nijhuis-van der Sanden, Maria W. G.
2011-01-01
Aim: To establish if there are psychometrically sound standardized tests or test items to assess handwriting readiness in 5- and 6-year-old children on the levels of occupations activities/tasks and performance. Method: Electronic databases were searched to identify measurement instruments. Tests were included in a systematic review if: (1)…
Perceptions of Peer Sexual Behavior: Do Adolescents Believe in a Sexual Double Standard?
ERIC Educational Resources Information Center
Young, Michael; Cardenas, Susan; Donnelly, Joseph; Kittleson, Mark J.
2016-01-01
Background: The purpose of the study was to (1) examine attitudes of adolescents toward peer models having sex or choosing abstinence, and (2) determine whether a "double standard" in perception existed concerning adolescent abstinence and sexual behavior. Methods: Adolescents (N = 173) completed questionnaires that included 1 of 6…
Oromucosal film preparations: classification and characterization methods.
Preis, Maren; Woertz, Christina; Kleinebudde, Peter; Breitkreutz, Jörg
2013-09-01
Recently, the regulatory authorities have enlarged the variety of 'oromucosal preparations' by buccal films and orodispersible films. Various film preparations have entered the market and pharmacopoeias. Due to the novelty of the official monographs, no standardized characterization methods and quality specifications are included. This review reports the methods of choice to characterize oromucosal film preparations with respect to biorelevant characterization and quality control. Commonly used dissolution tests for other dosage forms are not transferable for films in all cases. Alternatives and guidance on decision, which methods are favorable for film preparations are discussed. Furthermore, issues about requirements for film dosage forms are reflected. Oromucosal film preparations offer a wide spectrum of opportunities. There are a lot of suggestions in the literature on how to control the quality of these innovative products, but no standardized tests are available. Regulatory authorities need to define the standards and quality requirements more precisely.
Phillips, Melissa M.; Bedner, Mary; Gradl, Manuela; Burdette, Carolyn Q.; Nelson, Michael A.; Yen, James H.; Sander, Lane C.; Rimmer, Catherine A.
2017-01-01
Two independent analytical approaches, based on liquid chromatography with absorbance detection and liquid chromatography with mass spectrometric detection, have been developed for determination of isoflavones in soy materials. These two methods yield comparable results for a variety of soy-based foods and dietary supplements. Four Standard Reference Materials (SRMs) have been produced by the National Institute of Standards and Technology to assist the food and dietary supplement community in method validation and have been assigned values for isoflavone content using both methods. These SRMs include SRM 3234 Soy Flour, SRM 3236 Soy Protein Isolate, SRM 3237 Soy Protein Concentrate, and SRM 3238 Soy-Containing Solid Oral Dosage Form. A fifth material, SRM 3235 Soy Milk, was evaluated using the methods and found to be inhomogeneous for isoflavones and unsuitable for value assignment. PMID:27832301
Hurst, William J; Stanley, Bruce; Glinski, Jan A; Davey, Matthew; Payne, Mark J; Stuart, David A
2009-10-15
This report describes the characterization of a series of commercially available procyanidin standards ranging from dimers DP = 2 to decamers DP = 10 for the determination of procyanidins from cocoa and chocolate. Using a combination of HPLC with fluorescence detection and MALDI-TOF mass spectrometry, the purity of each standard was determined and these data were used to determine relative response factors. These response factors were compared with other response factors obtained from published methods. Data comparing the procyanidin analysis of a commercially available US dark chocolate calculated using each of the calibration methods indicates divergent results and demonstrate that previous methods may significantly underreport the procyanidins in cocoa-containing products. These results have far reaching implications because the previous calibration methods have been used to develop data for a variety of scientific reports, including food databases and clinical studies.
Waites, Ken B; Duffy, Lynn B; Bébéar, Cécile M; Matlow, Anne; Talkington, Deborah F; Kenny, George E; Totten, Patricia A; Bade, Donald J; Zheng, Xiaotian; Davidson, Maureen K; Shortridge, Virginia D; Watts, Jeffrey L; Brown, Steven D
2012-11-01
An international multilaboratory collaborative study was conducted to develop standard media and consensus methods for the performance and quality control of antimicrobial susceptibility testing of Mycoplasma pneumoniae, Mycoplasma hominis, and Ureaplasma urealyticum using broth microdilution and agar dilution techniques. A reference strain from the American Type Culture Collection was designated for each species, which was to be used for quality control purposes. Repeat testing of replicate samples of each reference strain by participating laboratories utilizing both methods and different lots of media enabled a 3- to 4-dilution MIC range to be established for drugs in several different classes, including tetracyclines, macrolides, ketolides, lincosamides, and fluoroquinolones. This represents the first multilaboratory collaboration to standardize susceptibility testing methods and to designate quality control parameters to ensure accurate and reliable assay results for mycoplasmas and ureaplasmas that infect humans.
[Determination of heavy metals in four traditional Chinese medicines by ICP-MS].
Wen, Hui-Min; Chen, Xiao-Hui; Dong, Ting-Xia; Zhan, Hua-Qiang; Bi, Kai-Shun
2006-08-01
To establish a ICP-MS method for the determination of heavy metals, including As, Hg, Pb, Cd, in four traditional Chinese medicines. The samples were digested by closed-versel microwave. The four heavy metals were directly analyzed by ICP-MS. Select internal standard element in for the method by which the analyse signal drife is corrected by the signal of another element (internal standard elements) added to both the standard solution and sample. For all of the analyzed heary methals, the correlative coefficient of the calibration curves was over 0.999 2. The recovery rates of the procedure were 97.5%-108.0%, and its RSD was lower than 11.6%. This method was convenient, quick-acquired, accurate and highly sensitive. The method can be used for the quality control of trace elements in traditional Chinese medicines and for the contents determination of traditional Chinese medicines from different habitats and species.
Wind Tunnel Force Balance Calibration Study - Interim Results
NASA Technical Reports Server (NTRS)
Rhew, Ray D.
2012-01-01
Wind tunnel force balance calibration is preformed utilizing a variety of different methods and does not have a direct traceable standard such as standards used for most calibration practices (weights, and voltmeters). These different calibration methods and practices include, but are not limited to, the loading schedule, the load application hardware, manual and automatic systems, re-leveling and non-re-leveling. A study of the balance calibration techniques used by NASA was undertaken to develop metrics for reviewing and comparing results using sample calibrations. The study also includes balances of different designs, single and multi-piece. The calibration systems include, the manual, and the automatic that are provided by NASA and its vendors. The results to date will be presented along with the techniques for comparing the results. In addition, future planned calibrations and investigations based on the results will be provided.
Olson, Mary C.; Iverson, Jana L.; Furlong, Edward T.; Schroeder, Michael P.
2004-01-01
A method for the determination of 28 polycyclic aromatic hydrocarbons (PAHs) and 25 alkylated PAH homolog groups in sediment samples is described. The compounds are extracted from sediment by solvent extraction, followed by partial isolation using high-performance gel permeation chromatography. The compounds are identified and uantitated using capillary-column gas chromatography/mass spectrometry. The report presents performance data for full-scan ion monitoring. Method detection limits in laboratory reagent matrix samples range from 1.3 to 5.1 micrograms per kilogram for the 28 PAHs. The 25 groups of alkylated PAHs are homologs of five groups of isomeric parent PAHs. Because of the lack of authentic standards, these homologs are reported semiquantitatively using a response factor from a parent PAH or a specific alkylated PAH. Precision data for the alkylated PAH homologs are presented using two different standard reference manuals produced by the National Institute of Standards and Technology: SRM 1941b and SRM 1944. The percent relative standard deviations for identified alkylated PAH homolog groups ranged from 1.55 to 6.98 for SRM 1941b and from 6.11 to 12.0 for SRM 1944. Homolog group concentrations reported under this method include the concentrations of individually identified compounds that are members of the group. Organochlorine (OC) pesticides--including toxaphene, polychlorinated biphenyls (PCBs), and organophosphate (OP) pesticides--can be isolated simultaneously using this method. In brief, sediment samples are centrifuged to remove excess water and extracted overnight with dichloromethan (95 percent) and methanol (5 percent). The extract is concentrated and then filtered through a 0.2-micrometer polytetrafluoroethylene syringe filter. The PAH fraction is isolated by quantitatively injecting an aliquot of sample onto two polystyrene-divinylbenzene gel-permeation chromatographic columns connected in series. The compounds are eluted with dichloromethane, a PAH fraction is collected, and a portion of the coextracted interferences, including elemental sulfur, is separated and discarded. The extract is solvent exchanged, the volume is reduced, and internal standard is added. Sample analysis is completed using a gas chromatograph/mass spectrometer and full-scan acquisition.
Improving Pharmacy Student Communication Outcomes Using Standardized Patients.
Gillette, Chris; Rudolph, Michael; Rockich-Winston, Nicole; Stanton, Robert; Anderson, H Glenn
2017-08-01
Objective. To examine whether standardized patient encounters led to an improvement in a student pharmacist-patient communication assessment compared to traditional active-learning activities within a classroom setting. Methods. A quasi-experimental study was conducted with second-year pharmacy students in a drug information and communication skills course. Student patient communication skills were assessed using high-stakes communication assessment. Results. Two hundred and twenty students' data were included. Students were significantly more likely to have higher scores on the communication assessment when they had higher undergraduate GPAs, were female, and taught using standardized patients. Similarly, students were significantly more likely to pass the assessment on the first attempt when they were female and when they were taught using standardized patients. Conclusion. Incorporating standardized patients within a communication course resulted in improved scores as well as first-time pass rates on a communication assessment than when using different methods of active learning.
Dong, Ming; Fisher, Carolyn; Añez, Germán; Rios, Maria; Nakhasi, Hira L.; Hobson, J. Peyton; Beanan, Maureen; Hockman, Donna; Grigorenko, Elena; Duncan, Robert
2016-01-01
Aims To demonstrate standardized methods for spiking pathogens into human matrices for evaluation and comparison among diagnostic platforms. Methods and Results This study presents detailed methods for spiking bacteria or protozoan parasites into whole blood and virus into plasma. Proper methods must start with a documented, reproducible pathogen source followed by steps that include standardized culture, preparation of cryopreserved aliquots, quantification of the aliquots by molecular methods, production of sufficient numbers of individual specimens and testing of the platform with multiple mock specimens. Results are presented following the described procedures that showed acceptable reproducibility comparing in-house real-time PCR assays to a commercially available multiplex molecular assay. Conclusions A step by step procedure has been described that can be followed by assay developers who are targeting low prevalence pathogens. Significance and Impact of Study The development of diagnostic platforms for detection of low prevalence pathogens such as biothreat or emerging agents is challenged by the lack of clinical specimens for performance evaluation. This deficit can be overcome using mock clinical specimens made by spiking cultured pathogens into human matrices. To facilitate evaluation and comparison among platforms, standardized methods must be followed in the preparation and application of spiked specimens. PMID:26835651
Jenke, Dennis; Sadain, Salma; Nunez, Karen; Byrne, Frances
2007-01-01
The performance of an ion chromatographic method for measuring citrate and phosphate in pharmaceutical solutions is evaluated. Performance characteristics examined include accuracy, precision, specificity, response linearity, robustness, and the ability to meet system suitability criteria. In general, the method is found to be robust within reasonable deviations from its specified operating conditions. Analytical accuracy is typically 100 +/- 3%, and short-term precision is not more than 1.5% relative standard deviation. The instrument response is linear over a range of 50% to 150% of the standard preparation target concentrations (12 mg/L for phosphate and 20 mg/L for citrate), and the results obtained using a single-point standard versus a calibration curve are essentially equivalent. A small analytical bias is observed and ascribed to the relative purity of the differing salts, used as raw materials in tested finished products and as reference standards in the analytical method. The assay is specific in that no phosphate or citrate peaks are observed in a variety of method-related solutions and matrix blanks (with and without autoclaving). The assay with manual preparation of the eluents is sensitive to the composition of the eluent in the sense that the eluent must be effectively degassed and protected from CO(2) ingress during use. In order for the assay to perform effectively, extensive system equilibration and conditioning is required. However, a properly conditioned and equilibrated system can be used to test a number of samples via chromatographic runs that include many (> 50) injections.
NASA Astrophysics Data System (ADS)
Wright, K. E.; Popa, K.; Pöml, P.
2018-01-01
Transmutation nuclear fuels contain weight percentage quantities of actinide elements, including Pu, Am and Np. Because of the complex spectra presented by actinide elements using electron probe microanalysis (EPMA), it is necessary to have relatively pure actinide element standards to facilitate overlap correction and accurate quantitation. Synthesis of actinide oxide standards is complicated by their multiple oxidation states, which can result in inhomogeneous standards or standards that are not stable at atmospheric conditions. Synthesis of PuP4 results in a specimen that exhibits stable oxidation-reduction chemistry and is sufficiently homogenous to serve as an EPMA standard. This approach shows promise as a method for producing viable actinide standards for microanalysis.
Niosh analytical methods for Set G
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1976-12-01
Industrial Hygiene sampling and analytical monitoring methods validated under the joint NIOSH/OSHA Standards Completion Program for Set G are contained herein. Monitoring methods for the following compounds are included: butadiene, heptane, ketene, methyl cyclohexane, octachloronaphthalene, pentachloronaphthalene, petroleum distillates, propylene dichloride, turpentine, dioxane, hexane, LPG, naphtha(coal tar), octane, pentane, propane, and stoddard solvent.
ERIC Educational Resources Information Center
Ramchandani, Dilip
2011-01-01
Background/Objective: The author analyzed and compared various assessment methods for assessment of medical students; these methods included clinical assessment and the standardized National Board of Medical Education (NBME) subject examination. Method: Students were evaluated on their 6-week clerkship in psychiatry by both their clinical…
19 CFR 163.5 - Methods for storage of records.
Code of Federal Regulations, 2012 CFR
2012-04-01
... standard business practice for storage of records include, but are not limited to, machine readable data... 19 Customs Duties 2 2012-04-01 2012-04-01 false Methods for storage of records. 163.5 Section 163... THE TREASURY (CONTINUED) RECORDKEEPING § 163.5 Methods for storage of records. (a) Original records...
19 CFR 163.5 - Methods for storage of records.
Code of Federal Regulations, 2011 CFR
2011-04-01
... standard business practice for storage of records include, but are not limited to, machine readable data... 19 Customs Duties 2 2011-04-01 2011-04-01 false Methods for storage of records. 163.5 Section 163... THE TREASURY (CONTINUED) RECORDKEEPING § 163.5 Methods for storage of records. (a) Original records...
Dobecki, Marek
2012-01-01
This paper reviews the requirements for measurement methods of chemical agents in the air at workstations. European standards, which have a status of Polish standards, comprise some requirements and information on sampling strategy, measuring techniques, type of samplers, sampling pumps and methods of occupational exposure evaluation at a given technological process. Measurement methods, including air sampling and analytical procedure in a laboratory, should be appropriately validated before intended use. In the validation process, selected methods are tested and budget of uncertainty is set up. The validation procedure that should be implemented in the laboratory together with suitable statistical tools and major components of uncertainity to be taken into consideration, were presented in this paper. Methods of quality control, including sampling and laboratory analyses were discussed. Relative expanded uncertainty for each measurement expressed as a percentage, should not exceed the limit of values set depending on the type of occupational exposure (short-term or long-term) and the magnitude of exposure to chemical agents in the work environment.
An evolutionary algorithm that constructs recurrent neural networks.
Angeline, P J; Saunders, G M; Pollack, J B
1994-01-01
Standard methods for simultaneously inducing the structure and weights of recurrent neural networks limit every task to an assumed class of architectures. Such a simplification is necessary since the interactions between network structure and function are not well understood. Evolutionary computations, which include genetic algorithms and evolutionary programming, are population-based search methods that have shown promise in many similarly complex tasks. This paper argues that genetic algorithms are inappropriate for network acquisition and describes an evolutionary program, called GNARL, that simultaneously acquires both the structure and weights for recurrent networks. GNARL's empirical acquisition method allows for the emergence of complex behaviors and topologies that are potentially excluded by the artificial architectural constraints imposed in standard network induction methods.
Harrison, Jesse P; Boardman, Carl; O'Callaghan, Kenneth; Delort, Anne-Marie; Song, Jim
2018-05-01
Plastic litter is encountered in aquatic ecosystems across the globe, including polar environments and the deep sea. To mitigate the adverse societal and ecological impacts of this waste, there has been debate on whether 'biodegradable' materials should be granted exemptions from plastic bag bans and levies. However, great care must be exercised when attempting to define this term, due to the broad and complex range of physical and chemical conditions encountered within natural ecosystems. Here, we review existing international industry standards and regional test methods for evaluating the biodegradability of plastics within aquatic environments (wastewater, unmanaged freshwater and marine habitats). We argue that current standards and test methods are insufficient in their ability to realistically predict the biodegradability of carrier bags in these environments, due to several shortcomings in experimental procedures and a paucity of information in the scientific literature. Moreover, existing biodegradability standards and test methods for aquatic environments do not involve toxicity testing or account for the potentially adverse ecological impacts of carrier bags, plastic additives, polymer degradation products or small (microscopic) plastic particles that can arise via fragmentation. Successfully addressing these knowledge gaps is a key requirement for developing new biodegradability standard(s) for lightweight carrier bags.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prokofiev, I.; Wiencek, T.; McGann, D.
1997-10-07
Powder metallurgy dispersions of uranium alloys and silicides in an aluminum matrix have been developed by the RERTR program as a new generation of proliferation-resistant fuels. Testing is done with miniplate-type fuel plates to simulate standard fuel with cladding and matrix in plate-type configurations. In order to seal the dispersion fuel plates, a diffusion bond must exist between the aluminum coverplates surrounding the fuel meat. Four different variations in the standard method for roll-bonding 6061 aluminum were studied. They included mechanical cleaning, addition of a getter material, modifications to the standard chemical etching, and welding methods. Aluminum test pieces weremore » subjected to a bend test after each rolling pass. Results, based on 400 samples, indicate that at least a 70% reduction in thickness is required to produce a diffusion bond using the standard rollbonding method versus a 60% reduction using the Type II method in which the assembly was welded 100% and contained open 9mm holes at frame corners.« less
Menachery, Philby Babu; Noronha, Judith Angelitta; Fernanades, Sweety
2017-08-01
The 'Standard Days Method' is a fertility awareness-based method of family planning that identifies day 8 through day 19 of the menstrual cycle as fertile days during which a woman is likely to conceive with unprotected intercourse. The study was aimed to determine the effectiveness of a promotional program on the 'Standard Days Method' in terms of improving the knowledge scores and attitude scores. A pre-experimental one-group pretest-posttest research design was adopted. The samples included 365 female postgraduate students from selected colleges of Udupi Taluk, Karnataka. The data was collected using self-administered questionnaires. The plan for the promotional program was also established. The findings of the study were analyzed using the descriptive and inferential statistics. The mean pretest and posttest knowledge scores were computed, and it was found that there was an increase in the mean knowledge score from 8.96 ± 3.84 to 32.64 ± 5.59, respectively. It was observed that the promotional program on 'Standard Days Method' was effective in improving the knowledge ( p < 0.001) and attitude ( p < 0.001) of the postgraduate students. The promotional program on Standard Days Method of family planning was effective in improving the knowledge and attitude of the postgraduate female students. This will enable the women to adopt this method and plan their pregnancies naturally and reduce the side effects of using oral contraceptives.
NASA Technical Reports Server (NTRS)
Gracey, William
1948-01-01
A simplified compound-pendulum method for the experimental determination of the moments of inertia of airplanes about the x and y axes is described. The method is developed as a modification of the standard pendulum method reported previously in NACA report, NACA-467. A brief review of the older method is included to form a basis for discussion of the simplified method. (author)
Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei
2015-01-01
A new method of uniform sampling is evaluated in this paper. The items and indexes were adopted to evaluate the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of sampling site distribution, and accuracy and precision of measured results. The evaluation indexes included operational complexity, occupation rate of sampling site in a row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs with different shape and size by four kinds of sampling methods. Gray correlation analysis was adopted to make the comprehensive evaluation by comparing it with the standard method. The experimental results showed that the convenience of uniform sampling method was 1 (100%), odds ratio of occupation rate in a row and column was infinity, relative accuracy was 99.50-99.89%, reproducibility RSD was 0.45-0.89%, and weighted incidence degree exceeded the standard method. Hence, the uniform sampling method was easy to operate, and the selected samples were distributed uniformly. The experimental results demonstrated that the uniform sampling method has good accuracy and reproducibility, which can be put into use in drugs analysis.
Extraction of organic contaminants from marine sediments and tissues using microwave energy.
Jayaraman, S; Pruell, R J; McKinney, R
2001-07-01
In this study, we compared microwave solvent extraction (MSE) to conventional methods for extracting organic contaminants from marine sediments and tissues with high and varying moisture content. The organic contaminants measured were polychlorinated biphenyl (PCB) congeners, chlorinated pesticides, and polycyclic aromatic hydrocarbons (PAHs). Initial experiments were conducted on dry standard reference materials (SRMs) and field collected marine sediments. Moisture content in samples greatly influenced the recovery of the analytes of interest. When wet sediments were included in a sample batch, low recoveries were often encountered in other samples in the batch, including the dry SRM. Experiments were conducted to test the effect of standardizing the moisture content in all samples in a batch prior to extraction. SRM1941a (marine sediment). SRM1974a (mussel tissue), as well as QA96SED6 (marine sediment), and QA96TIS7 (marine tissue), both from 1996 NIST Intercalibration Exercise were extracted using microwave and conventional methods. Moisture levels were adjusted in SRMs to match those of marine sediment and tissue samples before microwave extraction. The results demonstrated that it is crucial to standardize the moisture content in all samples, including dry reference material to ensure good recovery of organic contaminants. MSE yielded equivalent or superior recoveries compared to conventional methods for the majority of the compounds evaluated. The advantages of MSE over conventional methods are reduced solvent usage, higher sample throughput and the elimination of halogenated solvent usage.
Best practice in forensic entomology--standards and guidelines.
Amendt, Jens; Campobasso, Carlo P; Gaudry, Emmanuel; Reiter, Christian; LeBlanc, Hélène N; Hall, Martin J R
2007-03-01
Forensic entomology, the use of insects and other arthropods in forensic investigations, is becoming increasingly more important in such investigations. To ensure its optimal use by a diverse group of professionals including pathologists, entomologists and police officers, a common frame of guidelines and standards is essential. Therefore, the European Association for Forensic Entomology has developed a protocol document for best practice in forensic entomology, which includes an overview of equipment used for collection of entomological evidence and a detailed description of the methods applied. Together with the definitions of key terms and a short introduction to the most important methods for the estimation of the minimum postmortem interval, the present paper aims to encourage a high level of competency in the field of forensic entomology.
Code of Federal Regulations, 2012 CFR
2012-04-01
... STANDARDS FOR DIAGNOSTIC SUBSTANCES FOR LABORATORY TESTS Hepatitis B Surface Antigen § 660.45 Labeling. In... capable of transmitting hepatitis and should be handled accordingly. (d) The package shall include a... test methods, and (3) warnings as to possible hazards, including hepatitis transmitted in handling the...
Code of Federal Regulations, 2014 CFR
2014-04-01
... STANDARDS FOR DIAGNOSTIC SUBSTANCES FOR LABORATORY TESTS Hepatitis B Surface Antigen § 660.45 Labeling. In... capable of transmitting hepatitis and should be handled accordingly. (d) The package shall include a... test methods, and (3) warnings as to possible hazards, including hepatitis transmitted in handling the...
Code of Federal Regulations, 2013 CFR
2013-04-01
... STANDARDS FOR DIAGNOSTIC SUBSTANCES FOR LABORATORY TESTS Hepatitis B Surface Antigen § 660.45 Labeling. In... capable of transmitting hepatitis and should be handled accordingly. (d) The package shall include a... test methods, and (3) warnings as to possible hazards, including hepatitis transmitted in handling the...
Code of Federal Regulations, 2011 CFR
2011-04-01
... STANDARDS FOR DIAGNOSTIC SUBSTANCES FOR LABORATORY TESTS Hepatitis B Surface Antigen § 660.45 Labeling. In... capable of transmitting hepatitis and should be handled accordingly. (d) The package shall include a... test methods, and (3) warnings as to possible hazards, including hepatitis transmitted in handling the...
Code of Federal Regulations, 2010 CFR
2010-04-01
... STANDARDS FOR DIAGNOSTIC SUBSTANCES FOR LABORATORY TESTS Hepatitis B Surface Antigen § 660.45 Labeling. In... capable of transmitting hepatitis and should be handled accordingly. (d) The package shall include a... test methods, and (3) warnings as to possible hazards, including hepatitis transmitted in handling the...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-04
... comments on what standards MARAD should apply concerning determinations of foreign reconstruction of U.S... arisen with regard to MARAD's method of determination in a foreign rebuild context. That matter was... standards as applied to the CCF program and cargo preference. The questions included: (1) What substantive...
The US Environmental Protection Agency (EPA) published a National Ambient Air Quality Standard (NAAQS) and the accompanying Federal Reference Method (FRM) for PM10 in 1987. The EPA revised the particle standards and FRM in 1997 to include PM2.5. In 2005, EPA...
A method for predicting the noise levels of coannular jets with inverted velocity profiles
NASA Technical Reports Server (NTRS)
Russell, J. W.
1979-01-01
A coannular jet was equated with a single stream equivalent jet with the same mass flow, energy, and thrust. The acoustic characteristics of the coannular jet were then related to the acoustic characteristics of the single jet. Forward flight effects were included by incorporating a forward exponent, a Doppler amplification factor, and a Strouhal frequency shift. Model test data, including 48 static cases and 22 wind tunnel cases, were used to evaluate the prediction method. For the static cases and the low forward velocity wind tunnel cases, the spectral mean square pressure correlation coefficients were generally greater than 90 percent, and the spectral sound pressure level standard deviation were generally less than 3 decibels. The correlation coefficient and the standard deviation were not affected by changes in equivalent jet velocity. Limitations of the prediction method are also presented.
29 CFR 30.4 - Affirmative action plans.
Code of Federal Regulations, 2010 CFR
2010-07-01
... nondiscrimination. It includes procedures, methods, and programs for the identification, positive recruitment... to develop programs for preparing students to meet the standards and criteria required to qualify for... the above requirements. (d) Goals and timetables. (1) A sponsor adopting a selection method under § 30...
Differentiated Instruction in the Classroom
ERIC Educational Resources Information Center
Kelly, Gretchen
2013-01-01
Low achievement on standardized tests may be attributed to many factors, including teaching methods. Differentiated instruction has been identified as a teaching method using different learning modalities that appeal to varied student interests with individualized instruction. The purpose of this quantitative study was to compare whole-group…
7 CFR 226.22 - Procurement standards.
Code of Federal Regulations, 2010 CFR
2010-01-01
... pertinent to the following: rationale for the method of procurement, selection of contract type, contractor... competition. The description may include a statement of the qualitative nature of the material, product or... technical resources. (i) Program procurements shall be made by one of the following methods: (1) Small...
EVALUATION OF BIOSOLID SAMPLE PROCESSING TECHNIQUES TO MAXIMIZE RECOVERY OF BACTERIA
Current federal regulations (40 CFR 503) require enumeration of fecal coliform or Salmoella prior to land application of Class A biosolids. This regulation specifies use of enumeration methods included in "Standard Methods for the Examination of Water and Wastewater 18th Edition,...
Preiksaitis, J.; Tong, Y.; Pang, X.; Sun, Y.; Tang, L.; Cook, L.; Pounds, S.; Fryer, J.; Caliendo, A. M.
2015-01-01
Quantitative detection of cytomegalovirus (CMV) DNA has become a standard part of care for many groups of immunocompromised patients; recent development of the first WHO international standard for human CMV DNA has raised hopes of reducing interlaboratory variability of results. Commutability of reference material has been shown to be necessary if such material is to reduce variability among laboratories. Here we evaluated the commutability of the WHO standard using 10 different real-time quantitative CMV PCR assays run by eight different laboratories. Test panels, including aliquots of 50 patient samples (40 positive samples and 10 negative samples) and lyophilized CMV standard, were run, with each testing center using its own quantitative calibrators, reagents, and nucleic acid extraction methods. Commutability was assessed both on a pairwise basis and over the entire group of assays, using linear regression and correspondence analyses. Commutability of the WHO material differed among the tests that were evaluated, and these differences appeared to vary depending on the method of statistical analysis used and the cohort of assays included in the analysis. Depending on the methodology used, the WHO material showed poor or absent commutability with up to 50% of assays. Determination of commutability may require a multifaceted approach; the lack of commutability seen when using the WHO standard with several of the assays here suggests that further work is needed to bring us toward true consensus. PMID:26269622
Comparison of US EPA and European emission standards for combustion and incineration technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Licata, A.; Hartenstein, H.U.; Terracciano, L.
1997-12-01
There has been considerable debate, misunderstanding, and controversy when comparing emission standards used in Europe and the United States. One of the first questions you hear whenever U.S. EPA publishes a new emission standard is, {open_quotes}Is it as restrictive or is it the same as the German standard{close_quotes}. Although both systems of regulation call for the use of CEMS for compliance, there are substantial differences in how emission standards are structured in Europe and in the U.S. They include reference points, averaging times, sampling methods, and technology. Generally, the European standards tend to be more restrictive, due in part tomore » the fact that the facilities are of necessity sited in close proximity to residential areas. In Germany, for example, regulations in general are comprehensive and include both design standards and emission limits while U.S. EPA`s rules are source specific and, in most cases, limited to numerical emission standards. In some cases, comparisons can be made between emission standards and, in some cases, comparisons can only be made with restrictive caveats. The paper will present a comprehensive overview of the emission standards and how they are applied.« less
Environmental Response Laboratory Network activities include the All Hazard Receipt Facility and Screening Protocol, standardizing chemical methods, Chemical Warfare Agent Fixed Laboratory Pilot Project, microbial efforts, and WLA response plan.
Standardization of shape memory alloy test methods toward certification of aerospace applications
NASA Astrophysics Data System (ADS)
Hartl, D. J.; Mabe, J. H.; Benafan, O.; Coda, A.; Conduit, B.; Padan, R.; Van Doren, B.
2015-08-01
The response of shape memory alloy (SMA) components employed as actuators has enabled a number of adaptable aero-structural solutions. However, there are currently no industry or government-accepted standardized test methods for SMA materials when used as actuators and their transition to commercialization and production has been hindered. This brief fast track communication introduces to the community a recently initiated collaborative and pre-competitive SMA specification and standardization effort that is expected to deliver the first ever regulatory agency-accepted material specification and test standards for SMA as employed as actuators for commercial and military aviation applications. In the first phase of this effort, described herein, the team is working to review past efforts and deliver a set of agreed-upon properties to be included in future material certification specifications as well as the associated experiments needed to obtain them in a consistent manner. Essential for the success of this project is the participation and input from a number of organizations and individuals, including engineers and designers working in materials and processing development, application design, SMA component fabrication, and testing at the material, component, and system level. Going forward, strong consensus among this diverse body of participants and the SMA research community at large is needed to advance standardization concepts for universal adoption by the greater aerospace community and especially regulatory bodies. It is expected that the development and release of public standards will be done in collaboration with an established standards development organization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, Jaromy; Sun Zaijing; Wells, Doug
2009-03-10
Photon activation analysis detected elements in two NIST standards that did not have reported concentration values. A method is currently being developed to infer these concentrations by using scaling parameters and the appropriate known quantities within the NIST standard itself. Scaling parameters include: threshold, peak and endpoint energies; photo-nuclear cross sections for specific isotopes; Bremstrahlung spectrum; target thickness; and photon flux. Photo-nuclear cross sections and energies from the unknown elements must also be known. With these quantities, the same integral was performed for both the known and unknown elements resulting in an inference of the concentration of the un-reported elementmore » based on the reported value. Since Rb and Mn were elements that were reported in the standards, and because they had well-identified peaks, they were used as the standards of inference to determine concentrations of the unreported elements of As, I, Nb, Y, and Zr. This method was tested by choosing other known elements within the standards and inferring a value based on the stated procedure. The reported value of Mn in the first NIST standard was 403{+-}15 ppm and the reported value of Ca in the second NIST standard was 87000 ppm (no reported uncertainty). The inferred concentrations were 370{+-}23 ppm and 80200{+-}8700 ppm respectively.« less
Reproducibility in Computational Neuroscience Models and Simulations
McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.
2016-01-01
Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: 1. standardized neural simulators, 2. shared computational resources, 3. declarative model descriptors, ontologies and standardized annotations; 4. model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845
The use of Delphi and Nominal Group Technique in nursing education: A review.
Foth, Thomas; Efstathiou, Nikolaos; Vanderspank-Wright, Brandi; Ufholz, Lee-Anne; Dütthorn, Nadin; Zimansky, Manuel; Humphrey-Murto, Susan
2016-08-01
Consensus methods are used by healthcare professionals and educators within nursing education because of their presumed capacity to extract the profession's' "collective knowledge" which is often considered tacit knowledge that is difficult to verbalize and to formalize. Since their emergence, consensus methods have been criticized and their rigour has been questioned. Our study focuses on the use of consensus methods in nursing education and seeks to explore how extensively consensus methods are used, the types of consensus methods employed, the purpose of the research and how standardized the application of the methods is. A systematic approach was employed to identify articles reporting the use of consensus methods in nursing education. The search strategy included keyword search in five electronic databases [Medline (Ovid), Embase (Ovid), AMED (Ovid), ERIC (Ovid) and CINAHL (EBSCO)] for the period 2004-2014. We included articles published in English, French, German and Greek discussing the use of consensus methods in nursing education or in the context of identifying competencies. A standardized extraction form was developed using an iterative process with results from the search. General descriptors such as type of journal, nursing speciality, type of educational issue addressed, method used, geographic scope were recorded. Features reflecting methodology such as number, selection and composition of panel participants, number of rounds, response rates, definition of consensus, and feedback were recorded. 1230 articles were screened resulting in 101 included studies. The Delphi was used in 88.2% of studies. Most were reported in nursing journals (63.4%). The most common purpose to use these methods was defining competencies, curriculum development and renewal, and assessment. Remarkably, both standardization and reporting of consensus methods was noted to be generally poor. Areas where the methodology appeared weak included: preparation of the initial questionnaire; the selection and description of participants; number of rounds and number of participants remaining after each round; formal feedback of group ratings; definitions of consensus and a priori definition of numbers of rounds; and modifications to the methodology. The findings of this study are concerning if interpreted within the context of the structural critiques because our findings lend support to these critiques. If consensus methods should continue being used to inform best practices in nursing education, they must be rigorous in design. Copyright © 2016 Elsevier Ltd. All rights reserved.
[Discussion on several contents of textbook Acupuncture and Moxibustion (New Century 4th Edition)].
Tian, Kaiyu
2018-02-12
The textbook Acupuncture and Moxibustion (New Century 4th Edition) was published by China Press of Traditional Chinese Medicine in August of 2016. The author proposed several discussions in the textbook. The information, including the issue date of China national standard Standardized Manipulations of Acupuncture and Moxibustion , the number of foreign countries where China medical teams were assigned, and the number of acupuncture indications recommended by WHO, was not accurate. The content, including several methods of acupoint location, specification of filiform needles, rotating angle of needle, disinfection of needles and skin, locations and indications of scalp acupuncture, etc. should be corrected. Besides, the writing of textbooks should follow national or industry standards.
White, Donald J; Schneiderman, Eva; Colón, Ellen; St John, Samuel
2015-01-01
This paper describes the development and standardization of a profilometry-based method for assessment of dentifrice abrasivity called Radioactive Dentin Abrasivity - Profilometry Equivalent (RDA-PE). Human dentine substrates are mounted in acrylic blocks of precise standardized dimensions, permitting mounting and brushing in V8 brushing machines. Dentin blocks are masked to create an area of "contact brushing." Brushing is carried out in V8 brushing machines and dentifrices are tested as slurries. An abrasive standard is prepared by diluting the ISO 11609 abrasivity reference calcium pyrophosphate abrasive into carboxymethyl cellulose/glycerin, just as in the RDA method. Following brushing, masked areas are removed and profilometric analysis is carried out on treated specimens. Assessments of average abrasion depth (contact or optical profilometry) are made. Inclusion of standard calcium pyrophosphate abrasive permits a direct RDA equivalent assessment of abrasion, which is characterized with profilometry as Depth test/Depth control x 100. Within the test, the maximum abrasivity standard of 250 can be created in situ simply by including a treatment group of standard abrasive with 2.5x number of brushing strokes. RDA-PE is enabled in large part by the availability of easy-to-use and well-standardized modern profilometers, but its use in V8 brushing machines is enabled by the unique specific conditions described herein. RDA-PE permits the evaluation of dentifrice abrasivity to dentin without the requirement of irradiated teeth and infrastructure for handling them. In direct comparisons, the RDA-PE method provides dentifrice abrasivity assessments comparable to the gold industry standard RDA technique.
ANALYTICAL METHODS NECESSARY TO IMPLEMENT RISK-BASED CRITERIA FOR CHEMICALS IN MUNICIPAL SLUDGE
The Ambient Water Quality Criteria that were promulgated by the U.S. Environmental Protection Agency in 1980 included water concentration levels which, for many pollutants, were so low as to be unmeasurable by standard analytical methods. Criteria for controlling toxics in munici...
USE OF A MOLECULAR PROBE ASSAY FOR MONITORING SALMONELLA SPP. IN BIOSOLIDS SAMPLES
Current federal regulations (40 CFR 503) require enumeration of fecal coliform or salmonellae prior to land application of biosolids. This regulation specifies use of enumeration methods included in "Standard methods for the Examination of Water and Wastewater 18th Edition," (SM)...
NASA Technical Reports Server (NTRS)
Parrott, T. L.
1973-01-01
An improved method for the design of expansion-chamber mufflers is described and applied to the task of reducing exhaust noise generated by a helicopter. The method is an improvement of standard transmission-line theory in that it accounts for the effect of the mean exhaust-gas flow on the acoustic-transmission properties of a muffler system, including the termination boundary condition. The method has been computerized, and the computer program includes an optimization procedure that adjusts muffler component lengths to achieve a minimum specified desired transmission loss over a specified frequency range. A printout of the program is included together with a user-oriented description.
Giuliani, N; Beyer, J; Augsburger, M; Varlet, V
2015-03-01
Drug abuse is a widespread problem affecting both teenagers and adults. Nitrous oxide is becoming increasingly popular as an inhalation drug, causing harmful neurological and hematological effects. Some gas chromatography-mass spectrometry (GC-MS) methods for nitrous oxide measurement have been previously described. The main drawbacks of these methods include a lack of sensitivity for forensic applications; including an inability to quantitatively determine the concentration of gas present. The following study provides a validated method using HS-GC-MS which incorporates hydrogen sulfide as a suitable internal standard allowing the quantification of nitrous oxide. Upon analysis, sample and internal standard have similar retention times and are eluted quickly from the molecular sieve 5Å PLOT capillary column and the Porabond Q column therefore providing rapid data collection whilst preserving well defined peaks. After validation, the method has been applied to a real case of N2O intoxication indicating concentrations in a mono-intoxication. Copyright © 2015 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Shannon, Kathleen
2018-01-01
This paper describes, as an alternative to the Moore Method or a purely flipped classroom, a student-driven, textbook-supported method for teaching that allows movement through the standard course material with differing depths, but the same pace. This method, which includes a combination of board work followed by class discussion, on-demand brief…
MASW on the standard seismic prospective scale using full spread recording
NASA Astrophysics Data System (ADS)
Białas, Sebastian; Majdański, Mariusz; Trzeciak, Maciej; Gałczyński, Edward; Maksym, Andrzej
2015-04-01
The Multichannel Analysis of Surface Waves (MASW) is one of seismic survey methods that use the dispersion curve of surface waves in order to describe the stiffness of the surface. Is is used mainly for geotechnical engineering scale with total length of spread between 5 - 450 m and spread offset between 1 - 100 m, the hummer is the seismic source on this surveys. The standard procedure of MASW survey is: data acquisition, dispersion analysis and inversion of extracting dispersion curve to obtain the closest theoretical curve. The final result includes share-wave velocity (Vs) values at different depth along the surveyed lines. The main goal of this work is to expand this engineering method to the bigger scale with the length of standard prospecting spread of 20 km using 4.5 Hz version of vertical component geophones. The standard vibroseis and explosive method are used as the seismic source. The acquisition were conducted on the full spread all the time during each single shoot. The seismic data acquisition used for this analysis were carried out on the Braniewo 2014 project in north of Poland. The results achieved during standard MASW procedure says that this method can be used on much bigger scale as well. The different methodology of this analysis requires only much stronger seismic source.
The surveillance of nursing standards: an organisational case study.
Cooke, Hannah
2006-11-01
Quality assurance has acquired increasing prominence in contemporary healthcare systems and there has been an 'explosion' of audit activity. Some authors have begun to investigate the impact of audit activity on organisational and professional cultures. This paper considers data from a wider study of the management of the 'problem' nurse. Nurses and managers had contrasting perceptions of the value of different methods of assessing ward standards and their views are presented here. The study involved organisational case studies in three healthcare Trusts in the north of England. The fieldwork for this study was funded by the United Kingdom Central Council for Nursing, Midwifery and Health Visiting under their research scholarship programme. Multiple methods were employed including observation, interviewing and documentary analysis. A total of 144 informal interviews were carried out with ward nurses and their managers. The study demonstrated different viewpoints regarding the surveillance of nursing standards at top management, middle management and ward levels. The paper considers the discrepancies between these different viewpoints. None of the participants placed a high value on audit as a method of assessing ward standards. Complaints data and informal methods were more highly valued by managers. Ward nurses stressed the importance of presence and vigilance in assuring high standards of nursing care.
Method and Apparatus for Reducing the Vulnerability of Latches to Single Event Upsets
NASA Technical Reports Server (NTRS)
Shuler, Robert L., Jr. (Inventor)
2002-01-01
A delay circuit includes a first network having an input and an output node, a second network having an input and an output, the input of the second network being coupled to the output node of the first network. The first network and the second network are configured such that: a glitch at the input to the first network having a length of approximately one-half of a standard glitch time or less does not cause the voltage at the output of the second network to cross a threshold, a glitch at the input to the first network having a length of between approximately one-half and two standard glitch times causes the voltage at the output of the second network to cross the threshold for less than the length of the glitch, and a glitch at the input to the first network having a length of greater than approximately two standard glitch times causes the voltage at the output of the second network to cross the threshold for approximately the time of the glitch. The method reduces the vulnerability of a latch to single event upsets. The latch includes a gate having an input and an output and a feedback path from the output to the input of the gate. The method includes inserting a delay into the feedback path and providing a delay in the gate.
Method and Apparatus for Reducing the Vulnerability of Latches to Single Event Upsets
NASA Technical Reports Server (NTRS)
Shuler, Robert L., Jr. (Inventor)
2002-01-01
A delay circuit includes a first network having an input and an output node, a second network having an input and an output, the input of the second network being coupled to the output node of the first network. The first network and the second network are configured such that: a glitch at the input to the first network having a length of approximately one-half of a standard glitch time or less does not cause tile voltage at the output of the second network to cross a threshold, a glitch at the input to the first network having a length of between approximately one-half and two standard glitch times causes the voltage at the output of the second network to cross the threshold for less than the length of the glitch, and a glitch at the input to the first network having a length of greater than approximately two standard glitch times causes the voltage at the output of the second network to cross the threshold for approximately the time of the glitch. A method reduces the vulnerability of a latch to single event upsets. The latch includes a gate having an input and an output and a feedback path from the output to the input of the gate. The method includes inserting a delay into the feedback path and providing a delay in the gate.
International Standards for Properties and Performance of Advanced Ceramics - 30 years of Excellence
NASA Technical Reports Server (NTRS)
Jenkins, Michael G.; Salem, Jonathan A.; Helfinstine, John; Quinn, George D.; Gonczy, Stephen T.
2016-01-01
Mechanical and physical properties/performance of brittle bodies (e.g., advanced ceramics and glasses) can be difficult to measure correctly unless the proper techniques are used. For three decades, ASTM Committee C28 on Advanced Ceramics, has developed numerous full-consensus standards (e.g., test methods, practices, guides, terminology) to measure various properties and performance of a monolithic and composite ceramics and coatings that, in some cases, may be applicable to glasses. These standards give the "what, how, how not, why, why not, etc." for many mechanical, physical, thermal, properties and performance of advanced ceramics. Use of these standards provides accurate, reliable, repeatable and complete data. Involvement in ASTM Committee C28 has included users, producers, researchers, designers, academicians, etc. who write, continually update, and validate through round robin test programmes, more than 45 standards in the 30 years since the Committee's inception in 1986. Included in this poster is a pictogram of the ASTM Committee C28 standards and how to obtain them either as i) individual copies with full details or ii) a complete collection in one volume. A listing of other ASTM committees of interest is included. In addition, some examples of the tangible benefits of standards for advanced ceramics are employed to demonstrate their practical application.
Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin
2017-11-10
The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, expression analysis as well as data standardization and data publication. All particular methods of the workflow which address these tasks are state-of-the-art or cutting edge. As has been shown in previous publications, each of these methods is adequate to solve its specific task and gives competitive results. However, the methods included in the workflow are continuously reviewed, updated and improved to adapt to new scientific developments. All of these particular components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides manifold fast communication channels to get access to all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de) users can easily benefit from this service and get support by experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
For the regulatory process, EPA is required to develop a regulatory impact analysis (RIA). This August 2010 RIA includes an economic impact analysis (EIA) and a small entity impacts analysis and documents the RIA methods and results for the 2010 rules
ERIC Educational Resources Information Center
Hill, James E.; And Others
A study has been made at the National Bureau of Standards of the different techniques that are or could be used for testing solar collectors and thermal storage devices that are used in solar heating and cooling systems. This report reviews the various testing methods and outlines a recommended test procedure, including apparatus and…
Aircraft Power-Plant Instruments
NASA Technical Reports Server (NTRS)
Sontag, Harcourt; Brombacher, W G
1934-01-01
This report supersedes NACA-TR-129 which is now obsolete. Aircraft power-plant instruments include tachometers, engine thermometers, pressure gages, fuel-quantity gages, fuel flow meters and indicators, and manifold pressure gages. The report includes a description of the commonly used types and some others, the underlying principle utilized in the design, and some design data. The inherent errors of the instrument, the methods of making laboratory tests, descriptions of the test apparatus, and data in considerable detail in the performance of commonly used instruments are presented. Standard instruments and, in cases where it appears to be of interest, those used as secondary standards are described. A bibliography of important articles is included.
Evaluation of internal noise methods for Hotelling observers
NASA Astrophysics Data System (ADS)
Zhang, Yani; Pham, Binh T.; Eckstein, Miguel P.
2005-04-01
Including internal noise in computer model observers to degrade model observer performance to human levels is a common method to allow for quantitatively comparisons of human and model performance. In this paper, we studied two different types of methods for injecting internal noise to Hotelling model observers. The first method adds internal noise to the output of the individual channels: a) Independent non-uniform channel noise, b) Independent uniform channel noise. The second method adds internal noise to the decision variable arising from the combination of channel responses: a) internal noise standard deviation proportional to decision variable's standard deviation due to the external noise, b) internal noise standard deviation proportional to decision variable's variance caused by the external noise. We tested the square window Hotelling observer (HO), channelized Hotelling observer (CHO), and Laguerre-Gauss Hotelling observer (LGHO). The studied task was detection of a filling defect of varying size/shape in one of four simulated arterial segment locations with real x-ray angiography backgrounds. Results show that the internal noise method that leads to the best prediction of human performance differs across the studied models observers. The CHO model best predicts human observer performance with the channel internal noise. The HO and LGHO best predict human observer performance with the decision variable internal noise. These results might help explain why previous studies have found different results on the ability of each Hotelling model to predict human performance. Finally, the present results might guide researchers with the choice of method to include internal noise into their Hotelling models.
Defining a standard set of patient-centered outcomes for men with localized prostate cancer.
Martin, Neil E; Massey, Laura; Stowell, Caleb; Bangma, Chris; Briganti, Alberto; Bill-Axelson, Anna; Blute, Michael; Catto, James; Chen, Ronald C; D'Amico, Anthony V; Feick, Günter; Fitzpatrick, John M; Frank, Steven J; Froehner, Michael; Frydenberg, Mark; Glaser, Adam; Graefen, Markus; Hamstra, Daniel; Kibel, Adam; Mendenhall, Nancy; Moretti, Kim; Ramon, Jacob; Roos, Ian; Sandler, Howard; Sullivan, Francis J; Swanson, David; Tewari, Ashutosh; Vickers, Andrew; Wiegel, Thomas; Huland, Hartwig
2015-03-01
Value-based health care has been proposed as a unifying force to drive improved outcomes and cost containment. To develop a standard set of multidimensional patient-centered health outcomes for tracking, comparing, and improving localized prostate cancer (PCa) treatment value. We convened an international working group of patients, registry experts, urologists, and radiation oncologists to review existing data and practices. The group defined a recommended standard set representing who should be tracked, what should be measured and at what time points, and what data are necessary to make meaningful comparisons. Using a modified Delphi method over a series of teleconferences, the group reached consensus for the Standard Set. We recommend that the Standard Set apply to men with newly diagnosed localized PCa treated with active surveillance, surgery, radiation, or other methods. The Standard Set includes acute toxicities occurring within 6 mo of treatment as well as patient-reported outcomes tracked regularly out to 10 yr. Patient-reported domains of urinary incontinence and irritation, bowel symptoms, sexual symptoms, and hormonal symptoms are included, and the recommended measurement tool is the Expanded Prostate Cancer Index Composite Short Form. Disease control outcomes include overall, cause-specific, metastasis-free, and biochemical relapse-free survival. Baseline clinical, pathologic, and comorbidity information is included to improve the interpretability of comparisons. We have defined a simple, easily implemented set of outcomes that we believe should be measured in all men with localized PCa as a crucial first step in improving the value of care. Measuring, reporting, and comparing identical outcomes across treatments and treatment centers will provide patients and providers with information to make informed treatment decisions. We defined a set of outcomes that we recommend being tracked for every man being treated for localized prostate cancer. Copyright © 2014 European Association of Urology. Published by Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Allen, G.
1972-01-01
The use of the theta-operator method and generalized hypergeometric functions in obtaining solutions to nth-order linear ordinary differential equations is explained. For completeness, the analysis of the differential equation to determine whether the point of expansion is an ordinary point or a regular singular point is included. The superiority of the two methods shown over the standard method is demonstrated by using all three of the methods to work out several examples. Also included is a compendium of formulae and properties of the theta operator and generalized hypergeometric functions which is complete enough to make the report self-contained.
Assessing Data Quality in Emergent Domains of Earth Sciences
NASA Astrophysics Data System (ADS)
Darch, P. T.; Borgman, C.
2016-12-01
As earth scientists seek to study known phenomena in new ways, and to study new phenomena, they often develop new technologies and new methods such as embedded network sensing, or reapply extant technologies, such as seafloor drilling. Emergent domains are often highly multidisciplinary as researchers from many backgrounds converge on new research questions. They may adapt existing methods, or develop methods de novo. As a result, emerging domains tend to be methodologically heterogeneous. As these domains mature, pressure to standardize methods increases. Standardization promotes trust, reliability, accuracy, and reproducibility, and simplifies data management. However, for standardization to occur, researchers must be able to assess which of the competing methods produces the highest quality data. The exploratory nature of emerging domains discourages standardization. Because competing methods originate in different disciplinary backgrounds, their scientific credibility is difficult to compare. Instead of direct comparison, researchers attempt to conduct meta-analyses. Scientists compare datasets produced by different methods to assess their consistency and efficiency. This paper presents findings from a long-term qualitative case study of research on the deep subseafloor biosphere, an emergent domain. A diverse community converged on the study of microbes in the seafloor and those microbes' interactions with the physical environments they inhabit. Data on this problem are scarce, leading to calls for standardization as a means to acquire and analyze greater volumes of data. Lacking consistent methods, scientists attempted to conduct meta-analyses to determine the most promising methods on which to standardize. Among the factors that inhibited meta-analyses were disparate approaches to metadata and to curating data. Datasets may be deposited in a variety of databases or kept on individual scientists' servers. Associated metadata may be inconsistent or hard to interpret. Incentive structures, including prospects for journal publication, often favor new data over reanalyzing extant datasets. Assessing data quality in emergent domains is extremely difficult and will require adaptations in infrastructure, culture, and incentives.
USDA-ARS?s Scientific Manuscript database
Biodiesel is usually analyzed by the various methods called for in standards such as ASTM D6751 and EN 14214. Nuclear magnetic resonance (NMR) is not one of these methods. However, NMR, with 1H-NMR commonly applied, can be useful in a variety of applications related to biodiesel. These include monit...
Alternative Fuels Data Center: State Requirements Boost the Transition to
these fleets to choose between one of two compliance methods - Standard Compliance, which requires Laws and Incentives website also includes representative examples of incentives and regulations at the participating in multi-party partnerships are examples of innovative methods that will drive legislation and
29 CFR 2590.701-4 - Rules relating to creditable coverage.
Code of Federal Regulations, 2010 CFR
2010-07-01
... foreign country, or any political subdivision of a State, the U.S. government, or a foreign country that... Children's Health Insurance Program). (2) Excluded coverage. Creditable coverage does not include coverage... standard method described in paragraph (b) of this section. A plan or issuer may use the alternative method...
Health-Related Benefits of Attaining the 8-Hr Ozone Standard
Hubbell, Bryan J.; Hallberg, Aaron; McCubbin, Donald R.; Post, Ellen
2005-01-01
During the 2000–2002 time period, between 36 and 56% of ozone monitors each year in the United States failed to meet the current ozone standard of 80 ppb for the fourth highest maximum 8-hr ozone concentration. We estimated the health benefits of attaining the ozone standard at these monitors using the U.S. Environmental Protection Agency’s Environmental Benefits Mapping and Analysis Program. We used health impact functions based on published epidemiologic studies, and valuation functions derived from the economics literature. The estimated health benefits for 2000 and 2001 are similar in magnitude, whereas the results for 2002 are roughly twice that of each of the prior 2 years. The simple average of health impacts across the 3 years includes reductions of 800 premature deaths, 4,500 hospital and emergency department admissions, 900,000 school absences, and > 1 million minor restricted activity days. The simple average of benefits (including premature mortality) across the 3 years is $5.7 billion [90% confidence interval (CI), 0.6–15.0] for the quadratic rollback simulation method and $4.9 billion (90% CI, 0.5–14.0) for the proportional rollback simulation method. Results are sensitive to the form of the standard and to assumptions about background ozone levels. If the form of the standard is based on the first highest maximum 8-hr concentration, impacts are increased by a factor of 2–3. Increasing the assumed hourly background from zero to 40 ppb reduced impacts by 30 and 60% for the proportional and quadratic attainment simulation methods, respectively. PMID:15626651
Realization of the medium and high vacuum primary standard in CENAM, Mexico
NASA Astrophysics Data System (ADS)
Torres-Guzman, J. C.; Santander, L. A.; Jousten, K.
2005-12-01
A medium and high vacuum primary standard, based on the static expansion method, has been set up at Centro Nacional de Metrología (CENAM), Mexico. This system has four volumes and covers a measuring range of 1 × 10-5 Pa to 1 × 103 Pa of absolute pressure. As part of its realization, a characterization was performed, which included volume calibrations, several tests and a bilateral key comparison. To determine the expansion ratios, two methods were applied: the gravimetric method and the method with a linearized spinning rotor gauge. The outgassing ratios for the whole system were also determined. A comparison was performed with Physikalisch-Technische Bundesanstalt (comparison SIM-Euromet.M.P-BK3). By means of this comparison, a link has been achieved with the Euromet comparison (Euromet.M.P-K1.b). As a result, it is concluded that the value obtained at CENAM is equivalent to the Euromet reference value, and therefore the design, construction and operation of CENAM's SEE-1 vacuum primary standard were successful.
Aldana Marcos, H J; Ferrari, C C; Benitez, I; Affanni, J M
1996-12-01
This paper reports the standardization of methods used for processing and embedding various vertebrate brains of different size in paraffin. Other technical details developed for avoiding frequent difficulties arising during laboratory routine are also reported. Some modifications of the Nissl and Klüver-Barrera staining methods are proposed. These modifications include: 1) a Nissl stain solution with a rapid and efficient action with easier differentiation; 2) the use of a cheap microwave oven for the Klüver-Barrera stain. These procedures have the advantage of permitting Nissl and Klüver-Barrera staining of nervous tissue in about five and fifteen minutes respectively. The proposed procedures have been tested in brains obtained from fish, amphibians, reptiles and mammals of different body sizes. They are the result of our long experience in preparing slides for comparative studies. Serial sections of excellent quality were regularly obtained in all the specimens studied. These standardized methods, being simple and quick, are recommended for routine use in neurobiological laboratories.
Teramura, Hajime; Fukuda, Noriko; Okada, Yumiko; Ogihara, Hirokazu
2018-01-01
The four types of chromogenic selective media that are commercially available in Japan were compared for establishing a Japanese standard method for detecting Cronobacter spp. based on ISO/TS 22964:2006. When assessed using 9 standard Cronobacter spp. strains and 29 non-Cronobacter strains, Enterobacter sakazakii isolation agar, Chromocult TM Enterobacter sakazakii agar, CHROMagar TM E. sakazakii, and XM-sakazakii agar demonstrated excellent inclusivity and exclusivity. Using the ISO/TS 22964:2006 method, the recovered numbers of 38 Cronobacter spp. strains, including 29 C. sakazakii isolates obtained from each medium, were equivalent, indicating that there was no significant difference (p > 0.05) among the four types of chromogenic selective media. Thus, we demonstrated that these four chromogenic selective media are suitable alternatives when using the standard method for detecting Cronobacter spp. in Japan, based on the ISO/TS 22964:2006.
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
Using automated and standardized computer tools to calculate the pertinent test result values has several advantages such as: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form, 2. eliminating errors associated with the interpretation and programing of analysis procedures from the text of test standards, 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling, to achieve sound results, 4. and providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty.Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10- Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program 2. ASTM F 2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0 The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
Torey, Angeline; Sasidharan, Sreenivasan; Yeng, Chen; Latha, Lachimanan Yoga
2010-05-10
Quality control standardizations of the various medicinal plants used in traditional medicine is becoming more important today in view of the commercialization of formulations based on these plants. An attempt at standardization of Cassia spectabilis leaf has been carried out with respect to authenticity, assay and chemical constituent analysis. The authentication involved many parameters, including gross morphology, microscopy of the leaves and functional group analysis by Fourier Transform Infrared (FTIR) spectroscopy. The assay part of standardization involved determination of the minimum inhibitory concentration (MIC) of the extract which could help assess the chemical effects and establish curative values. The MIC of the C. spectabilis leaf extracts was investigated using the Broth Dilution Method. The extracts showed a MIC value of 6.25 mg/mL, independent of the extraction time. The chemical constituent aspect of standardization involves quantification of the main chemical components in C. spectabilis. The GCMS method used for quantification of 2,4-(1H,3H)-pyrimidinedione in the extract was rapid, accurate, precise, linear (R(2) = 0.8685), rugged and robust. Hence this method was suitable for quantification of this component in C. spectabilis. The standardization of C. spectabilis is needed to facilitate marketing of medicinal plants, with a view to promoting the export of valuable Malaysian Traditional Medicinal plants such as C. spectabilis.
Boardman, Carl; O'Callaghan, Kenneth; Delort, Anne-Marie; Song, Jim
2018-01-01
Plastic litter is encountered in aquatic ecosystems across the globe, including polar environments and the deep sea. To mitigate the adverse societal and ecological impacts of this waste, there has been debate on whether ‘biodegradable' materials should be granted exemptions from plastic bag bans and levies. However, great care must be exercised when attempting to define this term, due to the broad and complex range of physical and chemical conditions encountered within natural ecosystems. Here, we review existing international industry standards and regional test methods for evaluating the biodegradability of plastics within aquatic environments (wastewater, unmanaged freshwater and marine habitats). We argue that current standards and test methods are insufficient in their ability to realistically predict the biodegradability of carrier bags in these environments, due to several shortcomings in experimental procedures and a paucity of information in the scientific literature. Moreover, existing biodegradability standards and test methods for aquatic environments do not involve toxicity testing or account for the potentially adverse ecological impacts of carrier bags, plastic additives, polymer degradation products or small (microscopic) plastic particles that can arise via fragmentation. Successfully addressing these knowledge gaps is a key requirement for developing new biodegradability standard(s) for lightweight carrier bags. PMID:29892374
A review of contemporary methods for the presentation of scientific uncertainty.
Makinson, K A; Hamby, D M; Edwards, J A
2012-12-01
Graphic methods for displaying uncertainty are often the most concise and informative way to communicate abstract concepts. Presentation methods currently in use for the display and interpretation of scientific uncertainty are reviewed. Numerous subjective and objective uncertainty display methods are presented, including qualitative assessments, node and arrow diagrams, standard statistical methods, box-and-whisker plots,robustness and opportunity functions, contribution indexes, probability density functions, cumulative distribution functions, and graphical likelihood functions.
40 CFR 63.7142 - What are the requirements for claiming area source status?
Code of Federal Regulations, 2013 CFR
2013-07-01
... test must be repeated. (v) The post-test analyte spike procedure of section 11.2.7 of ASTM Method D6735... (3) ASTM Method D6735-01, Standard Test Method for Measurement of Gaseous Chlorides and Fluorides...)(3)(i) through (vi) of this section are followed. (i) A test must include three or more runs in which...
40 CFR 63.7142 - What are the requirements for claiming area source status?
Code of Federal Regulations, 2011 CFR
2011-07-01
... test must be repeated. (v) The post-test analyte spike procedure of section 11.2.7 of ASTM Method D6735... (3) ASTM Method D6735-01, Standard Test Method for Measurement of Gaseous Chlorides and Fluorides...)(3)(i) through (vi) of this section are followed. (i) A test must include three or more runs in which...
40 CFR 63.7142 - What are the requirements for claiming area source status?
Code of Federal Regulations, 2012 CFR
2012-07-01
... test must be repeated. (v) The post-test analyte spike procedure of section 11.2.7 of ASTM Method D6735... (3) ASTM Method D6735-01, Standard Test Method for Measurement of Gaseous Chlorides and Fluorides...)(3)(i) through (vi) of this section are followed. (i) A test must include three or more runs in which...
40 CFR 63.7142 - What are the requirements for claiming area source status?
Code of Federal Regulations, 2010 CFR
2010-07-01
... test must be repeated. (v) The post-test analyte spike procedure of section 11.2.7 of ASTM Method D6735... (3) ASTM Method D6735-01, Standard Test Method for Measurement of Gaseous Chlorides and Fluorides...)(3)(i) through (vi) of this section are followed. (i) A test must include three or more runs in which...
40 CFR 63.7142 - What are the requirements for claiming area source status?
Code of Federal Regulations, 2014 CFR
2014-07-01
... test must be repeated. (v) The post-test analyte spike procedure of section 11.2.7 of ASTM Method D6735... (3) ASTM Method D6735-01, Standard Test Method for Measurement of Gaseous Chlorides and Fluorides...)(3)(i) through (vi) of this section are followed. (i) A test must include three or more runs in which...
ERIC Educational Resources Information Center
Wootton-Gorges, Sandra L.; Stein-Wexler, Rebecca; Walton, John W.; Rosas, Angela J.; Coulter, Kevin P.; Rogers, Kristen K.
2008-01-01
Purpose: Chest radiographs (CXR) are the standard method for evaluating rib fractures in abused infants. Computed tomography (CT) is a sensitive method to detect rib fractures. The purpose of this study was to compare CT and CXR in the evaluation of rib fractures in abused infants. Methods: This retrospective study included all 12 abused infants…
Gordeev, S A; Voronin, S G
2016-01-01
To analyze the efficacy of modified (passive radiocarpal articulation flexion/extension) and «standard» (passive radiocarpal articulation flexion) methods of kinesthetic evoked potentials for proprioceptive sensitivity assessment in healthy subjects and patients with spondylotic cervical myelopathy. The study included 14 healthy subjects (4 women and 10 men, mean age 54.1±10.5 years) and 8 patients (2 women and 6 men, mean age 55.8±10.9 years) with spondylotic cervical myelopathy. Muscle-joint sensation was examined during the clinical study. A modified method of kinesthetic evoked potentials was developed. This method differed from the "standard" one by the organization of a cycle including several passive movements,where each new movement differed from the preceding one by the direction. The modified method of kinesthetic evoked potentials ensures more reliable kinesthetic sensitivity assessment due to movement variability. Asignificant increaseof the latent periods of the early components of the response was found in patients compared to healthy subjects. The modified method of kinesthetic evoked potentials can be used for objective diagnosis of proprioceptive sensitivity disorders in patients with spondylotic cervical myelopathy.
Garbarino, John R.
1999-01-01
The inductively coupled plasma?mass spectrometric (ICP?MS) methods have been expanded to include the determination of dissolved arsenic, boron, lithium, selenium, strontium, thallium, and vanadium in filtered, acidified natural water. Method detection limits for these elements are now 10 to 200 times lower than by former U.S. Geological Survey (USGS) methods, thus providing lower variability at ambient concentrations. The bias and variability of the method was determined by using results from spike recoveries, standard reference materials, and validation samples. Spike recoveries at 5 to 10 times the method detection limit and 75 micrograms per liter in reagent-water, surface-water, and groundwater matrices averaged 93 percent for seven replicates, although selected elemental recoveries in a ground-water matrix with an extremely high iron sulfate concentration were negatively biased by 30 percent. Results for standard reference materials were within 1 standard deviation of the most probable value. Statistical analysis of the results from about 60 filtered, acidified natural-water samples indicated that there was no significant difference between ICP?MS and former USGS official methods of analysis.
Herrera, Samantha; Enuameh, Yeetey; Adjei, George; Ae-Ngibise, Kenneth Ayuurebobi; Asante, Kwaku Poku; Sankoh, Osman; Owusu-Agyei, Seth; Yé, Yazoume
2017-10-23
Lack of valid and reliable data on malaria deaths continues to be a problem that plagues the global health community. To address this gap, the verbal autopsy (VA) method was developed to ascertain cause of death at the population level. Despite the adoption and wide use of VA, there are many recognized limitations of VA tools and methods, especially for measuring malaria mortality. This study synthesizes the strengths and limitations of existing VA tools and methods for measuring malaria mortality (MM) in low- and middle-income countries through a systematic literature review. The authors searched PubMed, Cochrane Library, Popline, WHOLIS, Google Scholar, and INDEPTH Network Health and Demographic Surveillance System sites' websites from 1 January 1990 to 15 January 2016 for articles and reports on MM measurement through VA. article presented results from a VA study where malaria was a cause of death; article discussed limitations/challenges related to measurement of MM through VA. Two authors independently searched the databases and websites and conducted a synthesis of articles using a standard matrix. The authors identified 828 publications; 88 were included in the final review. Most publications were VA studies; others were systematic reviews discussing VA tools or methods; editorials or commentaries; and studies using VA data to develop MM estimates. The main limitation were low sensitivity and specificity of VA tools for measuring MM. Other limitations included lack of standardized VA tools and methods, lack of a 'true' gold standard to assess accuracy of VA malaria mortality. Existing VA tools and methods for measuring MM have limitations. Given the need for data to measure progress toward the World Health Organization's Global Technical Strategy for Malaria 2016-2030 goals, the malaria community should define strategies for improving MM estimates, including exploring whether VA tools and methods could be further improved. Longer term strategies should focus on improving countries' vital registration systems for more robust and timely cause of death data.
ANSI/ASHRAE/IES Standard 90.1-2010 Performance Rating Method Reference Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goel, Supriya; Rosenberg, Michael I.
This document is intended to be a reference manual for the Appendix G Performance Rating Method (PRM) of ANSI/ASHRAE/IES Standard 90.1- 2010 (Standard 90.1-2010).The PRM is used for rating the energy efficiency of commercial and high-rise residential buildings with designs that exceed the requirements of Standard 90.1. The procedures and processes described in this manual are designed to provide consistency and accuracy by filling in gaps and providing additional details needed by users of the PRM. It should be noted that this document is created independently from ASHRAE and SSPC 90.1 and is not sanctioned nor approved by either ofmore » those entities . Potential users of this manual include energy modelers, software developers and implementers of “beyond code” energy programs. Energy modelers using ASHRAE Standard 90.1-2010 for beyond code programs can use this document as a reference manual for interpreting requirements of the Performance Rating method. Software developers, developing tools for automated creation of the baseline model can use this reference manual as a guideline for developing the rules for the baseline model.« less
Facilitating Stewardship of scientific data through standards based workflows
NASA Astrophysics Data System (ADS)
Bastrakova, I.; Kemp, C.; Potter, A. K.
2013-12-01
There are main suites of standards that can be used to define the fundamental scientific methodology of data, methods and results. These are firstly Metadata standards to enable discovery of the data (ISO 19115), secondly the Sensor Web Enablement (SWE) suite of standards that include the O&M and SensorML standards and thirdly Ontology that provide vocabularies to define the scientific concepts and relationships between these concepts. All three types of standards have to be utilised by the practicing scientist to ensure that those who ultimately have to steward the data stewards to ensure that the data can be preserved curated and reused and repurposed. Additional benefits of this approach include transparency of scientific processes from the data acquisition to creation of scientific concepts and models, and provision of context to inform data use. Collecting and recording metadata is the first step in scientific data flow. The primary role of metadata is to provide details of geographic extent, availability and high-level description of data suitable for its initial discovery through common search engines. The SWE suite provides standardised patterns to describe observations and measurements taken for these data, capture detailed information about observation or analytical methods, used instruments and define quality determinations. This information standardises browsing capability over discrete data types. The standardised patterns of the SWE standards simplify aggregation of observation and measurement data enabling scientists to transfer disintegrated data to scientific concepts. The first two steps provide a necessary basis for the reasoning about concepts of ';pure' science, building relationship between concepts of different domains (linked-data), and identifying domain classification and vocabularies. Geoscience Australia is re-examining its marine data flows, including metadata requirements and business processes, to achieve a clearer link between scientific data acquisition and analysis requirements and effective interoperable data management and delivery. This includes participating in national and international dialogue on development of standards, embedding data management activities in business processes, and developing scientific staff as effective data stewards. Similar approach is applied to the geophysical data. By ensuring the geophysical datasets at GA strictly follow metadata and industry standards we are able to implement a provenance based workflow where the data is easily discoverable, geophysical processing can be applied to it and results can be stored. The provenance based workflow enables metadata records for the results to be produced automatically from the input dataset metadata.
Gao, Wen; Wang, Rui; Li, Dan; Liu, Ke; Chen, Jun; Li, Hui-Jun; Xu, Xiaojun; Li, Ping; Yang, Hua
2016-01-05
The flowers of Lonicera japonica Thunb. were extensively used to treat many diseases. As the demands for L. japonica increased, some related Lonicera plants were often confused or misused. Caffeoylquinic acids were always regarded as chemical markers in the quality control of L. japonica, but they could be found in all Lonicera species. Thus, a simple and reliable method for the evaluation of different Lonicera flowers is necessary to be established. In this work a method based on single standard to determine multi-components (SSDMC) combined with principal component analysis (PCA) for control and distinguish of Lonicera species flowers have been developed. Six components including three caffeoylquinic acids and three iridoid glycosides were assayed simultaneously using chlorogenic acid as the reference standard. The credibility and feasibility of the SSDMC method were carefully validated and the results demonstrated that there were no remarkable differences compared with external standard method. Finally, a total of fifty-one batches covering five Lonicera species were analyzed and PCA was successfully applied to distinguish the Lonicera species. This strategy simplifies the processes in the quality control of multiple-componential herbal medicine which effectively adapted for improving the quality control of those herbs belonging to closely related species. Copyright © 2015 Elsevier B.V. All rights reserved.
Adamski, Mateusz G; Gumann, Patryk; Baird, Alison E
2014-01-01
Over the past decade rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reactions (qPCR) have become the gold standard for quantifying gene expression. Microfluidic next generation, high throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing the sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard polymerase chain reactions (qPCR) and high throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and with the use of a universal reference sample (commercial complementary DNA (cDNA)) permits the normalization of results between different batches and between different instruments--regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency corrected analysis. When compared to other commonly used algorithms the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes where cell counts are readily available.
Standardized Methods for Enhanced Quality and Comparability of Tuberculous Meningitis Studies.
Marais, Ben J; Heemskerk, Anna D; Marais, Suzaan S; van Crevel, Reinout; Rohlwink, Ursula; Caws, Maxine; Meintjes, Graeme; Misra, Usha K; Mai, Nguyen T H; Ruslami, Rovina; Seddon, James A; Solomons, Regan; van Toorn, Ronald; Figaji, Anthony; McIlleron, Helen; Aarnoutse, Robert; Schoeman, Johan F; Wilkinson, Robert J; Thwaites, Guy E
2017-02-15
Tuberculous meningitis (TBM) remains a major cause of death and disability in tuberculosis-endemic areas, especially in young children and immunocompromised adults. Research aimed at improving outcomes is hampered by poor standardization, which limits study comparison and the generalizability of results. We propose standardized methods for the conduct of TBM clinical research that were drafted at an international tuberculous meningitis research meeting organized by the Oxford University Clinical Research Unit in Vietnam. We propose a core dataset including demographic and clinical information to be collected at study enrollment, important aspects related to patient management and monitoring, and standardized reporting of patient outcomes. The criteria proposed for the conduct of observational and intervention TBM studies should improve the quality of future research outputs, can facilitate multicenter studies and meta-analyses of pooled data, and could provide the foundation for a global TBM data repository.
System and method for controlling a combustor assembly
York, William David; Ziminsky, Willy Steve; Johnson, Thomas Edward; Stevenson, Christian Xavier
2013-03-05
A system and method for controlling a combustor assembly are disclosed. The system includes a combustor assembly. The combustor assembly includes a combustor and a fuel nozzle assembly. The combustor includes a casing. The fuel nozzle assembly is positioned at least partially within the casing and includes a fuel nozzle. The fuel nozzle assembly further defines a head end. The system further includes a viewing device configured for capturing an image of at least a portion of the head end, and a processor communicatively coupled to the viewing device, the processor configured to compare the image to a standard image for the head end.
Achieving Innovation and Affordability Through Standardization of Materials Development and Testing
NASA Technical Reports Server (NTRS)
Bray, M. H.; Zook, L. M.; Raley, R. E.; Chapman, C.
2011-01-01
The successful expansion of development, innovation, and production within the aeronautics industry during the 20th century was facilitated by collaboration of government agencies with the commercial aviation companies. One of the initial products conceived from the collaboration was the ANC-5 Bulletin, first published in 1937. The ANC-5 Bulletin had intended to standardize the requirements of various government agencies in the design of aircraft structure. The national space policy shift in priority for NASA with an emphasis on transferring the travel to low earth orbit to commercial space providers highlights an opportunity and a need for the national and global space industries. The same collaboration and standardization that is documented and maintained by the industry within MIL-HDBK-5 (MMPDS-01) and MIL-HBDK-17 (nonmetallic mechanical properties) can also be exploited to standardize the thermal performance properties, processing methods, test methods, and analytical methods for use in aircraft and spacecraft design and associated propulsion systems. In addition to the definition of thermal performance description and standardization, the standardization for test methods and analysis for extreme environments (high temperature, cryogenics, deep space radiation, etc) would also be highly valuable to the industry. Its subsequent revisions and conversion to MIL-HDBK-5 and then MMPDS-01 established and then expanded to contain standardized mechanical property design values and other related design information for metallic materials used in aircraft, missiles, and space vehicles. It also includes guidance on standardization of composition, processing, and analytical methods for presentation and inclusion into the handbook. This standardization enabled an expansion of the technologies to provide efficiency and reliability to the consumers. It can be established that many individual programs within the government agencies have been overcome with development costs generated from these nonstandard requirements. Without industry standardization and acceptance, the programs are driven to shoulder the costs of determining design requirements, performance criteria, and then material qualification and certification. A significant investment that the industry could make to both reduce individual program development costs and schedules while expanding commercial space flight capabilities would be to invest in standardizing material performance properties for high temperature, cryogenic, and deep space environments for both metallic and nonmetallic materials.
How Children Learn Mathematics, Teaching Implications of Piaget's Research.
ERIC Educational Resources Information Center
Copeland, Richard W.
Included are the standard topics presented in the undergraduate and/or graduate course on methods of teaching mathematics in elementary education. Chapter 1 describes the historical development of learning theories, including Piaget's. Chapter 2 contains a biographical sketch of Piaget and an explanation of his theory of cognitive development.…
Microarray Genomic Systems Development
2008-06-01
11 species), Escherichia coli TOP10 (7 strains), and Geobacillus stearothermophilus . Using standard molecular biology methods, we isolated genomic...comparisons. Results: Different species of bacteria, including Escherichia coli, Bacillus bacteria, and Geobacillus stearothermophilus produce qualitatively...oligonucleotides to labelled genomic DNA from a set of test samples, including eleven Bacillus species, Geobacillus stearothermophilus , and seven Escherichia
Cheng, Dongwan; Zheng, Li; Hou, Junjie; Wang, Jifeng; Xue, Peng; Yang, Fuquan; Xu, Tao
2015-01-01
The absolute quantification of target proteins in proteomics involves stable isotope dilution coupled with multiple reactions monitoring mass spectrometry (SID-MRM-MS). The successful preparation of stable isotope-labeled internal standard peptides is an important prerequisite for the SID-MRM absolute quantification methods. Dimethyl labeling has been widely used in relative quantitative proteomics and it is fast, simple, reliable, cost-effective, and applicable to any protein sample, making it an ideal candidate method for the preparation of stable isotope-labeled internal standards. MRM mass spectrometry is of high sensitivity, specificity, and throughput characteristics and can quantify multiple proteins simultaneously, including low-abundance proteins in precious samples such as pancreatic islets. In this study, a new method for the absolute quantification of three proteases involved in insulin maturation, namely PC1/3, PC2 and CPE, was developed by coupling a stable isotope dimethyl labeling strategy for internal standard peptide preparation with SID-MRM-MS quantitative technology. This method offers a new and effective approach for deep understanding of the functional status of pancreatic β cells and pathogenesis in diabetes.
Atiyeh, Bishara S
2007-01-01
Hypertrophic scars, resulting from alterations in the normal processes of cutaneous wound healing, are characterized by proliferation of dermal tissue with excessive deposition of fibroblast-derived extracellular matrix proteins, especially collagen, over long periods, and by persistent inflammation and fibrosis. Hypertrophic scars are among the most common and frustrating problems after injury. As current aesthetic surgical techniques become more standardized and results more predictable, a fine scar may be the demarcating line between acceptable and unacceptable aesthetic results. However, hypertrophic scars remain notoriously difficult to eradicate because of the high recurrence rates and the incidence of side effects associated with available treatment methods. This review explores the various treatment methods for hypertrophic scarring described in the literature including evidence-based therapies, standard practices, and emerging methods, attempting to distinguish those with clearly proven efficiency from anecdotal reports about therapies of doubtful benefits while trying to differentiate between prophylactic measures and actual treatment methods. Unfortunately, the distinction between hypertrophic scar treatments and keloid treatments is not obvious in most reports, making it difficult to assess the efficacy of hypertrophic scar treatment.
Sovio, Ulla; Smith, Gordon C S
2018-02-01
It has been proposed that correction of offspring weight percentiles (customization) might improve the prediction of adverse pregnancy outcome; however, the approach is not accepted universally. A complication in the interpretation of the data is that the main method for calculation of customized percentiles uses a fetal growth standard, and multiple analyses have compared the results with birthweight-based standards. First, we aimed to determine whether women who deliver small-for-gestational-age infants using a customized standard differed from other women. Second, we aimed to compare the association between birthweight percentile and adverse outcome using 3 different methods for percentile calculation: (1) a noncustomized actual birthweight standard, (2) a noncustomized fetal growth standard, and (3) a fully customized fetal growth standard. We analyzed data from the Pregnancy Outcome Prediction study, a prospective cohort study of nulliparous women who delivered in Cambridge, UK, between 2008 and 2013. We used a composite adverse outcome, namely, perinatal morbidity or preeclampsia. Receiver operating characteristic curve analysis was used to compare the 3 methods of calculating birthweight percentiles in relation to the composite adverse outcome. We confirmed previous observations that delivering an infant who was small for gestational age (<10th percentile) with the use of a fully customized fetal growth standard but who was appropriate for gestational age with the use of a noncustomized actual birthweight standard was associated with higher rates of adverse outcomes. However, we also observed that the mothers of these infants were 3-4 times more likely to be obese and to deliver preterm. When we compared the risk of adverse outcome from logistic regression models that were fitted to the birthweight percentiles that were derived by each of the 3 predefined methods, the areas under the receiver operating characteristic curves were similar for all 3 methods: 0.56 (95% confidence interval, 0.54-0.59) fully customized, 0.56 (95% confidence interval, 0.53-0.59) noncustomized fetal weight standard, and 0.55 (95% confidence interval, 0.53-0.58) noncustomized actual birthweight standard. When we classified the top 5% of predicted risk as high risk, the methods that used a fetal growth standard showed attenuation after adjustment for gestational age, whereas the birthweight standard did not. Further adjustment for the maternal characteristics, which included weight, attenuated the association with the customized standard, but not the other 2 methods. The associations after full adjustment were similar when we compared the 3 approaches. The independent association between birthweight percentile and adverse outcome was similar when we compared actual birthweight standards and fetal growth standards and compared customized and noncustomized standards. Use of fetal weight standards and customized percentiles for maternal characteristics could lead to stronger associations with adverse outcome through confounding by preterm birth and maternal obesity. Copyright © 2017 Elsevier Inc. All rights reserved.
The European standard for sun-protective clothing: EN 13758.
Gambichler, T; Laperre, J; Hoffmann, K
2006-02-01
Clothing is considered one of the most important tools for sun protection. Contrary to popular opinion, however, some summer fabrics provide insufficient ultraviolet (UV) protection. The European Committee for Standardization (CEN), has developed a new standard on requirements for test methods and labelling of sun-protective garments. This document has now been completed and is published. Within CEN, a working group, CEN/TC 248 WG14 'UV protective clothing', was set up with the mission to produce standards on the UV-protective properties of textile materials. This working group started its activities in 1998 and included 30 experts (dermatologists, physicists, textile technologists, fabric manufacturers and retailers of apparel textiles) from 11 European member states. Within this working group, all medical, ethical, technical and economical aspects of standardization of UV-protective clothing were discussed on the basis of the expertise of each member and in consideration of the relevant literature in this field. Decisions were made in consensus. The first part of the standard (EN 13758-1) deals with all details of test methods (e.g. spectrophotometric measurements) for textile materials and part 2 (EN 13758-2) covers classification and marking of apparel textiles. UV-protective cloths for which compliance with this standard is claimed must fulfill all stringent instructions of testing, classification and marking, including a UV protection factor (UPF) larger than 40 (UPF 40+), average UVA transmission lower than 5%, and design requirements as specified in part 2 of the standard. A pictogram, which is marked with the number of the standard EN 13758-2 and the UPF of 40+, shall be attached to the garment if it is in compliance with the standard. The dermatology community should take cognizance of this new standard document. Garment manufacturers and retailers may now follow these official guidelines for testing and labelling of UV-protective summer clothes, and the sun-aware consumer can easily recognize garments that definitely provide sufficient UV protection.
NASA Astrophysics Data System (ADS)
Bednarczyk, Z.
2012-04-01
The paper presents landslide monitoring methods used for prediction of landslide activity at locations in the Carpathian Mountains (SE Poland). Different types of monitoring methods included standard and real-time early warning measurement with use of hourly data transfer to the Internet were used. Project financed from the EU funds was carried out for the purpose of public road reconstruction. Landslides with low displacement rates (varying from few mm to over 5cm/year) had size of 0.4-2.2mln m3. Flysch layers involved in mass movements represented mixture of clayey soils and sandstones of high moisture content and plasticity. Core sampling and GPR scanning were used for recognition of landslide size and depths. Laboratory research included index, IL oedometer, triaxial and direct shear laboratory tests. GPS-RTK mapping was employed for actualization of landslide morphology. Instrumentation consisted of standard inclinometers, piezometers and pore pressure transducers. Measurements were carried 2006-2011, every month. In May 2010 the first in Poland real-time monitoring system was installed at landslide complex over the Szymark-Bystra public road. It included in-place uniaxial sensors and 3D continuous inclinometers installed to the depths of 12-16m with tilt sensors every 0.5m. Vibrating wire pore pressure and groundwater level transducers together with automatic meteorological station analyzed groundwater and weather conditions. Obtained monitoring and field investigations data provided parameters for LEM and FEM slope stability analysis. They enabled prediction and control of landslide behaviour before, during and after stabilization or partly stabilization works. In May 2010 after the maximum precipitation (100mm/3hours) the rates of observed displacements accelerated to over 11cm in a few days and damaged few standard inclinometer installations. However permanent control of the road area was possible by continuous inclinometer installations. Comprehensive monitoring and modelling methods before the landslide counteraction stage could lead to a safer and more economical recognition of landslide remediation possibilities.
Neutron cross section standards and instrumentation. Annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wasson, O.A.
The objective of this interagency program is to provide accurate neutron interaction measurements for the US Department of Energy nuclear programs which include waste disposal, fusion, safeguards, defense, fission, and personnel protection. These measurements are also useful to other energy programs which indirectly use the unique properties of the neutron for diagnostic and analytical purposes. The work includes the measurement of reference cross sections and related neutron data employing unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; the preservation of standard reference deposits and the development of improved neutronmore » detectors and measurement methods. A related and essential element of the program is critical evaluation of neutron interaction data including international coordinations. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology. This report from the National Institute of Standards and Technology contains a summary of the accomplishments of the Neutron Cross Section Standards and Instrumentation Project during the third year of this three-year interagency agreement. The proposed program and required budget for the following three years are also presented. The program continues the shifts in priority instituted in order to broaden the program base.« less
Freeman, Karoline; Tsertsvadze, Alexander; Taylor-Phillips, Sian; McCarthy, Noel; Mistry, Hema; Manuel, Rohini; Mason, James
2017-01-01
Multiplex gastrointestinal pathogen panel (GPP) tests simultaneously identify bacterial, viral and parasitic pathogens from the stool samples of patients with suspected infectious gastroenteritis presenting in hospital or the community. We undertook a systematic review to compare the accuracy of GPP tests with standard microbiology techniques. Searches in Medline, Embase, Web of Science and the Cochrane library were undertaken from inception to January 2016. Eligible studies compared GPP tests with standard microbiology techniques in patients with suspected gastroenteritis. Quality assessment of included studies used tailored QUADAS-2. In the absence of a reference standard we analysed test performance taking GPP tests and standard microbiology techniques in turn as the benchmark test, using random effects meta-analysis of proportions. No study provided an adequate reference standard with which to compare the test accuracy of GPP and conventional tests. Ten studies informed a meta-analysis of positive and negative agreement. Positive agreement across all pathogens was 0.93 (95% CI 0.90 to 0.96) when conventional methods were the benchmark and 0.68 (95% CI: 0.58 to 0.77) when GPP provided the benchmark. Negative agreement was high in both instances due to the high proportion of negative cases. GPP testing produced a greater number of pathogen-positive findings than conventional testing. It is unclear whether these additional 'positives' are clinically important. GPP testing has the potential to simplify testing and accelerate reporting when compared to conventional microbiology methods. However the impact of GPP testing upon the management, treatment and outcome of patients is poorly understood and further studies are needed to evaluate the health economic impact of GPP testing compared with standard methods. The review protocol is registered with PROSPERO as CRD42016033320.
Basch, Ethan; Abernethy, Amy P; Mullins, C Daniel; Reeve, Bryce B; Smith, Mary Lou; Coons, Stephen Joel; Sloan, Jeff; Wenzel, Keith; Chauhan, Cynthia; Eppard, Wayland; Frank, Elizabeth S; Lipscomb, Joseph; Raymond, Stephen A; Spencer, Merianne; Tunis, Sean
2012-12-01
Examining the patient's subjective experience in prospective clinical comparative effectiveness research (CER) of oncology treatments or process interventions is essential for informing decision making. Patient-reported outcome (PRO) measures are the standard tools for directly eliciting the patient experience. There are currently no widely accepted standards for developing or implementing PRO measures in CER. Recommendations for the design and implementation of PRO measures in CER were developed via a standardized process including multistakeholder interviews, a technical working group, and public comments. Key recommendations are to include assessment of patient-reported symptoms as well as health-related quality of life in all prospective clinical CER studies in adult oncology; to identify symptoms relevant to a particular study population and context based on literature review and/or qualitative and quantitative methods; to assure that PRO measures used are valid, reliable, and sensitive in a comparable population (measures particularly recommended include EORTC QLQ-C30, FACT, MDASI, PRO-CTCAE, and PROMIS); to collect PRO data electronically whenever possible; to employ methods that minimize missing patient reports and include a plan for analyzing and reporting missing PRO data; to report the proportion of responders and cumulative distribution of responses in addition to mean changes in scores; and to publish results of PRO analyses simultaneously with other clinical outcomes. Twelve core symptoms are recommended for consideration in studies in advanced or metastatic cancers. Adherence to methodologic standards for the selection, implementation, and analysis/reporting of PRO measures will lead to an understanding of the patient experience that informs better decisions by patients, providers, regulators, and payers.
Jabs, Douglas A; Nussenblatt, Robert B; Rosenbaum, James T
2005-09-01
To begin a process of standardizing the methods for reporting clinical data in the field of uveitis. Consensus workshop. Members of an international working group were surveyed about diagnostic terminology, inflammation grading schema, and outcome measures, and the results used to develop a series of proposals to better standardize the use of these entities. Small groups employed nominal group techniques to achieve consensus on several of these issues. The group affirmed that an anatomic classification of uveitis should be used as a framework for subsequent work on diagnostic criteria for specific uveitic syndromes, and that the classification of uveitis entities should be on the basis of the location of the inflammation and not on the presence of structural complications. Issues regarding the use of the terms "intermediate uveitis," "pars planitis," "panuveitis," and descriptors of the onset and course of the uveitis were addressed. The following were adopted: standardized grading schema for anterior chamber cells, anterior chamber flare, and for vitreous haze; standardized methods of recording structural complications of uveitis; standardized definitions of outcomes, including "inactive" inflammation, "improvement'; and "worsening" of the inflammation, and "corticosteroid sparing," and standardized guidelines for reporting visual acuity outcomes. A process of standardizing the approach to reporting clinical data in uveitis research has begun, and several terms have been standardized.
ERIC Educational Resources Information Center
Anderson, Carl B.; Metzger, Scott Alan
2011-01-01
This study is a mixed-methods text analysis of African American representation within K-12 U.S. History content standards treating the revolutionary era, the early U.S. republic, the Civil War era, and Reconstruction. The states included in the analysis are Michigan, New Jersey, South Carolina, and Virginia. The analysis finds that the reviewed…
Colour measurements of pigmented rice grain using flatbed scanning and image analysis
NASA Astrophysics Data System (ADS)
Kaisaat, Khotchakorn; Keawdonree, Nuttapong; Chomkokard, Sakchai; Jinuntuya, Noparit; Pattanasiri, Busara
2017-09-01
Recently, the National Bureau of Agricultural Commodity and Food Standards (ACFS) have drafted a manual of Thai colour rice standards. However, there are no quantitative descriptions of rice colour and its measurement method. These drawbacks might lead to misunderstanding for people who use the manual. In this work, we proposed an inexpensive method, using flatbed scanning together with image analysis, to quantitatively measure rice colour and colour uniformity. To demonstrate its general applicability for colour differentiation of rice, we applied it to different kinds of pigmented rice, including Riceberry rice with and without uniform colour and Chinese black rice.
Recommendations for fluorescence instrument qualification: the new ASTM Standard Guide.
DeRose, Paul C; Resch-Genger, Ute
2010-03-01
Aimed at improving quality assurance and quantitation for modern fluorescence techniques, ASTM International (ASTM) is about to release a Standard Guide for Fluorescence, reviewed here. The guide's main focus is on steady state fluorometry, for which available standards and instrument characterization procedures are discussed along with their purpose, suitability, and general instructions for use. These include the most relevant instrument properties needing qualification, such as linearity and spectral responsivity of the detection system, spectral irradiance reaching the sample, wavelength accuracy, sensitivity or limit of detection for an analyte, and day-to-day performance verification. With proper consideration of method-inherent requirements and limitations, many of these procedures and standards can be adapted to other fluorescence techniques. In addition, procedures for the determination of other relevant fluorometric quantities including fluorescence quantum yields and fluorescence lifetimes are briefly introduced. The guide is a clear and concise reference geared for users of fluorescence instrumentation at all levels of experience and is intended to aid in the ongoing standardization of fluorescence measurements.
Scattering of cylindrical electric field waves from an elliptical dielectric cylindrical shell
NASA Astrophysics Data System (ADS)
Urbanik, E. A.
1982-12-01
This thesis examines the scattering of cylindrical waves by large dielectric scatterers of elliptic cross section. The solution method was the method of moments using a Galerkin approach. Sinusoidal basis and testing functions were used resulting in a higher convergence rate. The higher rate of convergence made it possible for the program to run on the Aeronautical Systems Division's CYBER computers without any special storage methods. This report includes discussion on moment methods, solution of integral equations, and the relationship between the electric field and the source region or self cell singularity. Since the program produced unacceptable run times, no results are contained herein. The importance of this work is the evaluation of the practicality of moment methods using standard techniques. The long run times for a mid-sized scatterer demonstrate the impracticality of moment methods for dielectrics using standard techniques.
Alternative methods to trench backfill.
DOT National Transportation Integrated Search
2005-04-30
Conduit structures dealing with hydraulic drainage needs in the Louisiana highway system include pipe culverts, pipe arch culverts, storm drains, sewers, etc. Although the Louisiana Department of Transportation and Development (LADOTD) has standard s...
An inversion-based self-calibration for SIMS measurements: Application to H, F, and Cl in apatite
NASA Astrophysics Data System (ADS)
Boyce, J. W.; Eiler, J. M.
2011-12-01
Measurements of volatile abundances in igneous apatites can provide information regarding the abundances and evolution of volatiles in magmas, with applications to terrestrial volcanism and planetary evolution. Secondary ion mass spectrometry (SIMS) measurements can produce accurate and precise measurements of H and other volatiles in many materials including apatite. SIMS standardization generally makes use of empirical linear transfer functions that relate measured ion ratios to independently known concentrations. However, this approach is often limited by the lack of compositionally diverse, well-characterized, homogeneous standards. In general, SIMS calibrations are developed for minor and trace elements, and any two are treated as independent of one another. However, in crystalline materials, additional stoichiometric constraints may apply. In the case of apatite, the sum of concentrations of abundant volatile elements (H, Cl, and F) should closely approach 100% occupancy of their collective structural site. Here we propose and document the efficacy of a method for standardizing SIMS analyses of abundant volatiles in apatites that takes advantage of this stoichiometric constraint. The principle advantage of this method is that it is effectively self-standardizing; i.e., it requires no independently known homogeneous reference standards. We define a system of independent linear equations relating measured ion ratios (H/P, Cl/P, F/P) and unknown calibration slopes. Given sufficient range in the concentrations of the different elements among apatites measured in a single analytical session, solving this system of equations allows for the calibration slope for each element to be determined without standards, using only blank-corrected ion ratios. In the case that a data set of this kind lacks sufficient range in measured compositions of one or more of the relevant ion ratios, one can employ measurements of additional apatites of a variety of compositions to increase the statistical range and make the inversion more accurate and precise. These additional non-standard apatites need only be wide-ranging in composition: They need not be homogenous nor have known H, F, or Cl concentrations. Tests utilizing synthetic data and data generated in the laboratory indicate that this method should yield satisfactory results provided apatites meet the criteria of the model. The inversion method is able to reproduce conventional calibrations to within <2.5%, a level of accuracy comparable to or even better than the uncertainty of the conventional calibration, and one that includes both error in the inversion method as well as any true error in the independently determined values of the standards. Uncertainties in the inversion calibrations range from 0.1-1.7% (2σ), typically an order of magnitude smaller than the uncertainties in conventional calibrations (~4-5% for H2O, 1-19% for F and Cl). However, potential systematic errors stem from the model assumption of 100% occupancy of this site by the measured elements. Use of this method simplifies analysis of H, F, and Cl in apatites by SIMS, and may also be amenable to other stoichiometrically limited substitution groups, including P+As+S+Si+C in apatite, and Zr+Hf+U+Th in non-metamict zircon.
Environmental Performance of North American Wood Panel Manufacturing
R. Bergman; D. Kaestner; A. Taylor
2015-01-01
Manufacturing building products such as wood panels has environmental impacts, including contributions to climate change. This paper is a compilation of four studies quantifying these impacts using the life-cycle assessment (LCA) method on five wood-based panel products made in North America during 2012. LCA is an internationally accepted and standardized method for...
Life cycle impacts of North American wood panel Manufacturing
Richard Bergman; D. Kaestner; A. M. Taylor
2016-01-01
Manufacturing building products such as wood panels impacts the environment, including contributing to climate change. This study is a compilation of four studies quantifying these impacts using the life cycle assessment (LCA) method on five wood-based panel products made in North America during 2012. LCA is an internationally accepted and standardized method for...
Standard methods have been established by USEPA, ASTM International, Environment Canada and Organization for Economic Cooperation and Development for conducting sediment toxicity tests with various species of midges including Chironomus dilutus. Short-term 10-day exposures are ty...
The report gives details of a small-chamber test method developed by the EPA for characterizing volatile organic compound (VOC) emissions from interior latex and alkyd paints. Current knowledge about VOC, including hazardous air pollutant, emissions from interior paints generated...
New generation all-silica based optical elements for high power laser systems
NASA Astrophysics Data System (ADS)
Tolenis, T.; GrinevičiÅ«tÄ--, L.; Melninkaitis, A.; Selskis, A.; Buzelis, R.; MažulÄ--, L.; Drazdys, R.
2017-08-01
Laser resistance of optical elements is one of the major topics in photonics. Various routes have been taken to improve optical coatings, including, but not limited by, materials engineering and optimisation of electric field distribution in multilayers. During the decades of research, it was found, that high band-gap materials, such as silica, are highly resistant to laser light. Unfortunately, only the production of anti-reflection coatings of all-silica materials are presented to this day. A novel route will be presented in materials engineering, capable to manufacture high reflection optical elements using only SiO2 material and GLancing Angle Deposition (GLAD) method. The technique involves the deposition of columnar structure and tailoring the refractive index of silica material throughout the coating thickness. A numerous analysis indicate the superior properties of GLAD coatings when compared with standard methods for Bragg mirrors production. Several groups of optical components are presented including anti-reflection coatings and Bragg mirrors. Structural and optical characterisation of the method have been performed and compared with standard methods. All researches indicate the possibility of new generation coatings for high power laser systems.
Credit risk migration rates modeling as open systems: A micro-simulation approach
NASA Astrophysics Data System (ADS)
Landini, S.; Uberti, M.; Casellina, S.
2018-05-01
The last financial crisis of 2008 stimulated the development of new Regulatory Criteria (commonly known as Basel III) that pushed the banking activity to become more prudential, either in the short and the long run. As well known, in 2014 the International Accounting Standards Board (IASB) promulgated the new International Financial Reporting Standard 9 (IFRS 9) for financial instruments that will become effective in January 2018. Since the delayed recognition of credit losses on loans was identified as a weakness in existing accounting standards, the IASB has introduced an Expected Loss model that requires more timely recognition of credit losses. Specifically, new standards require entities to account both for expected losses from when the impairments are recognized for the first time and for full loan lifetime; moreover, a clear preference toward forward looking models is expressed. In this new framework, it is necessary a re-thinking of the widespread standard theoretical approach on which the well known prudential model is founded. The aim of this paper is then to define an original methodological approach to migration rates modeling for credit risk which is innovative respect to the standard method from the point of view of a bank as well as in a regulatory perspective. Accordingly, the proposed not-standard approach considers a portfolio as an open sample allowing for entries, migrations of stayers and exits as well. While being consistent with the empirical observations, this open-sample approach contrasts with the standard closed-sample method. In particular, this paper offers a methodology to integrate the outcomes of the standard closed-sample method within the open-sample perspective while removing some of the assumptions of the standard method. Three main conclusions can be drawn in terms of economic capital provision: (a) based on the Markovian hypothesis with a-priori absorbing state at default, the standard closed-sample method is to be abandoned for not to predict lenders' bankruptcy by construction; (b) to meet more reliable estimates along with the new regulatory standards, the sample to estimate migration rates matrices for credit risk should include either entries and exits; (c) the static eigen-decomposition standard procedure to forecast migration rates should be replaced with a stochastic process dynamics methodology while conditioning forecasts to macroeconomic scenarios.
NASA Astrophysics Data System (ADS)
Mai, W.; Zhang, J.-F.; Zhao, X.-M.; Li, Z.; Xu, Z.-W.
2017-11-01
Wastewater from the dye industry is typically analyzed using a standard method for measurement of chemical oxygen demand (COD) or by a single-wavelength spectroscopic method. To overcome the disadvantages of these methods, ultraviolet-visible (UV-Vis) spectroscopy was combined with principal component regression (PCR) and partial least squares regression (PLSR) in this study. Unlike the standard method, this method does not require digestion of the samples for preparation. Experiments showed that the PLSR model offered high prediction performance for COD, with a mean relative error of about 5% for two dyes. This error is similar to that obtained with the standard method. In this study, the precision of the PLSR model decreased with the number of dye compounds present. It is likely that multiple models will be required in reality, and the complexity of a COD monitoring system would be greatly reduced if the PLSR model is used because it can include several dyes. UV-Vis spectroscopy with PLSR successfully enhanced the performance of COD prediction for dye wastewater and showed good potential for application in on-line water quality monitoring.
Boehm, A B; Griffith, J; McGee, C; Edge, T A; Solo-Gabriele, H M; Whitman, R; Cao, Y; Getrich, M; Jay, J A; Ferguson, D; Goodwin, K D; Lee, C M; Madison, M; Weisberg, S B
2009-11-01
The absence of standardized methods for quantifying faecal indicator bacteria (FIB) in sand hinders comparison of results across studies. The purpose of the study was to compare methods for extraction of faecal bacteria from sands and recommend a standardized extraction technique. Twenty-two methods of extracting enterococci and Escherichia coli from sand were evaluated, including multiple permutations of hand shaking, mechanical shaking, blending, sonication, number of rinses, settling time, eluant-to-sand ratio, eluant composition, prefiltration and type of decantation. Tests were performed on sands from California, Florida and Lake Michigan. Most extraction parameters did not significantly affect bacterial enumeration. anova revealed significant effects of eluant composition and blending; with both sodium metaphosphate buffer and blending producing reduced counts. The simplest extraction method that produced the highest FIB recoveries consisted of 2 min of hand shaking in phosphate-buffered saline or deionized water, a 30-s settling time, one-rinse step and a 10 : 1 eluant volume to sand weight ratio. This result was consistent across the sand compositions tested in this study but could vary for other sand types. Method standardization will improve the understanding of how sands affect surface water quality.
Salvati, Louis M; McClure, Sean C; Reddy, Todime M; Cellar, Nicholas A
2016-05-01
This method provides simultaneous determination of total vitamins B1, B2, B3, and B6 in infant formula and related nutritionals (adult and infant). The method was given First Action for vitamins B1, B2, and B6, but not B3, during the AOAC Annual Meeting in September 2015. The method uses acid phosphatase to dephosphorylate the phosphorylated vitamin forms. It then measures thiamine (vitamin B1); riboflavin (vitamin B2); nicotinamide and nicotinic acid (vitamin B3); and pyridoxine, pyridoxal, and pyridoxamine (vitamin B6) from digested sample extract by liquid chromatography-tandem mass spectrometry. A single-laboratory validation was performed on 14 matrixes provided by the AOAC Stakeholder Panel on Infant Formula and Adult Nutritionals (SPIFAN) to demonstrate method effectiveness. The method met requirements of the AOAC SPIFAN Standard Method Performance Requirement for each of the three vitamins, including average over-spike recovery of 99.6 ± 3.5%, average repeatability of 1.5 ± 0.8% relative standard deviation, and average intermediate precision of 3.9 ± 1.3% relative standard deviation.
Roofing research and standards development: Fourth volume. ASTM special technical publication 1349
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wallace, T.J.; Rossiter, W.J. Jr.
1999-07-01
As the roofing industry has stabilized, a broad variety of roof systems have found general acceptance by the building owners, architects, engineers, contractors, and others who select and install roofs. These roof systems include those based on conventional built-up membranes using glass and synthetic reinforcements, synthetic polymeric membranes using elastomers and thermoplastics, polymer-modified membranes, and sprayed polyurethane foam. ASTM Committee D8 on Roofing, Waterproofing, and Bituminous Materials has contributed significantly in many important ways to the roofing community's stabilization including issuing standard specifications to assist consumers in the selection and use of these systems. This is not surprising, as itmore » has always been among the purpose of D8 to provide standards to assist in the selection and use of low-sloped and steep roofing. The Committee's scope includes development of standards associated with application, inspection, maintenance, and analyses. Some of the issues facing the roofing community today--for example, enhanced system durability, better methods of material characterization, environmental impact, recycling of materials and systems, industry conversation to the S.I. system metric--readily fall within D8's scope. The availability of sound standard can contribute to the resolution of many of these issues.« less
Interoperability in planetary research for geospatial data analysis
NASA Astrophysics Data System (ADS)
Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara
2018-01-01
For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Mapping Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards and astronomy-based standards as defined by International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.
Issues in the Assessment of Social Phobia: A Review
Letamendi, Andrea M.; Chavira, Denise A.; Stein, Murray B.
2010-01-01
Since the emergence of social phobia in DSM nomenclature, the mental health community has witnessed an expansion in standardized methods for the screening, diagnosis, and measurement of the disorder. This article reviews formal assessment methods for social phobia, including diagnostic interview, clinician-administered instruments, and self report questionnaires. Frequently used tools for assessing constructs related to social phobia, such as disability and quality of life, are also briefly presented. This review evaluates each method by highlighting the assessment features recommended in social phobia literature, including method of administration, item content, coverage, length of scale, type of scores generated, and time frame. PMID:19728569
Fission matrix-based Monte Carlo criticality analysis of fuel storage pools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farlotti, M.; Ecole Polytechnique, Palaiseau, F 91128; Larsen, E. W.
2013-07-01
Standard Monte Carlo transport procedures experience difficulties in solving criticality problems in fuel storage pools. Because of the strong neutron absorption between fuel assemblies, source convergence can be very slow, leading to incorrect estimates of the eigenvalue and the eigenfunction. This study examines an alternative fission matrix-based Monte Carlo transport method that takes advantage of the geometry of a storage pool to overcome this difficulty. The method uses Monte Carlo transport to build (essentially) a fission matrix, which is then used to calculate the criticality and the critical flux. This method was tested using a test code on a simplemore » problem containing 8 assemblies in a square pool. The standard Monte Carlo method gave the expected eigenfunction in 5 cases out of 10, while the fission matrix method gave the expected eigenfunction in all 10 cases. In addition, the fission matrix method provides an estimate of the error in the eigenvalue and the eigenfunction, and it allows the user to control this error by running an adequate number of cycles. Because of these advantages, the fission matrix method yields a higher confidence in the results than standard Monte Carlo. We also discuss potential improvements of the method, including the potential for variance reduction techniques. (authors)« less
Conceptual designs for in situ analysis of Mars soil
NASA Technical Reports Server (NTRS)
Mckay, C. P.; Zent, A. P.; Hartman, H.
1991-01-01
A goal of this research is to develop conceptual designs for instrumentation to perform in situ measurements of the Martian soil in order to determine the existence and nature of any reactive chemicals. Our approach involves assessment and critical review of the Viking biology results which indicated the presence of a soil oxidant, an investigation of the possible application of standard soil science techniques to the analysis of Martian soil, and a preliminary consideration of non-standard methods that may be necessary for use in the highly oxidizing Martian soil. Based on our preliminary analysis, we have developed strawman concepts for standard soil analysis on Mars, including pH, suitable for use on a Mars rover mission. In addition, we have devised a method for the determination of the possible strong oxidants on Mars.
A new NIST primary standardization of 18F.
Fitzgerald, R; Zimmerman, B E; Bergeron, D E; Cessna, J C; Pibida, L; Moreira, D S
2014-02-01
A new primary standardization of (18)F by NIST is reported. The standard is based on live-timed beta-gamma anticoincidence counting with confirmatory measurements by three other methods: (i) liquid scintillation (LS) counting using CIEMAT/NIST (3)H efficiency tracing; (ii) triple-to-double coincidence ratio (TDCR) counting; and (iii) NaI integral counting and HPGe γ-ray spectrometry. The results are reported as calibration factors for NIST-maintained ionization chambers (including some "dose calibrators"). The LS-based methods reveal evidence for cocktail instability for one LS cocktail. Using an ionization chamber to link this work with previous NIST results, the new value differs from the previous reports by about 4%, but appears to be in good agreement with the key comparison reference value (KCRV) of 2005. © 2013 Published by Elsevier Ltd.
A framework for the meta-analysis of Bland-Altman studies based on a limits of agreement approach.
Tipton, Elizabeth; Shuster, Jonathan
2017-10-15
Bland-Altman method comparison studies are common in the medical sciences and are used to compare a new measure to a gold-standard (often costlier or more invasive) measure. The distribution of these differences is summarized by two statistics, the 'bias' and standard deviation, and these measures are combined to provide estimates of the limits of agreement (LoA). When these LoA are within the bounds of clinically insignificant differences, the new non-invasive measure is preferred. Very often, multiple Bland-Altman studies have been conducted comparing the same two measures, and random-effects meta-analysis provides a means to pool these estimates. We provide a framework for the meta-analysis of Bland-Altman studies, including methods for estimating the LoA and measures of uncertainty (i.e., confidence intervals). Importantly, these LoA are likely to be wider than those typically reported in Bland-Altman meta-analyses. Frequently, Bland-Altman studies report results based on repeated measures designs but do not properly adjust for this design in the analysis. Meta-analyses of Bland-Altman studies frequently exclude these studies for this reason. We provide a meta-analytic approach that allows inclusion of estimates from these studies. This includes adjustments to the estimate of the standard deviation and a method for pooling the estimates based upon robust variance estimation. An example is included based on a previously published meta-analysis. Copyright © 2017 John Wiley & Sons, Ltd. Copyright © 2017 John Wiley & Sons, Ltd.
Collier, J W; Shah, R B; Bryant, A R; Habib, M J; Khan, M A; Faustino, P J
2011-02-20
A rapid, selective, and sensitive gradient HPLC method was developed for the analysis of dissolution samples of levothyroxine sodium tablets. Current USP methodology for levothyroxine (L-T(4)) was not adequate to resolve co-elutants from a variety of levothyroxine drug product formulations. The USP method for analyzing dissolution samples of the drug product has shown significant intra- and inter-day variability. The sources of method variability include chromatographic interferences introduced by the dissolution media and the formulation excipients. In the present work, chromatographic separation of levothyroxine was achieved on an Agilent 1100 Series HPLC with a Waters Nova-pak column (250 mm × 3.9 mm) using a 0.01 M phosphate buffer (pH 3.0)-methanol (55:45, v/v) in a gradient elution mobile phase at a flow rate of 1.0 mL/min and detection UV wavelength of 225 nm. The injection volume was 800 μL and the column temperature was maintained at 28°C. The method was validated according to USP Category I requirements. The validation characteristics included accuracy, precision, specificity, linearity, and analytical range. The standard curve was found to have a linear relationship (r(2)>0.99) over the analytical range of 0.08-0.8 μg/mL. Accuracy ranged from 90 to 110% for low quality control (QC) standards and 95 to 105% for medium and high QC standards. Precision was <2% at all QC levels. The method was found to be accurate, precise, selective, and linear for L-T(4) over the analytical range. The HPLC method was successfully applied to the analysis of dissolution samples of marketed levothyroxine sodium tablets. Published by Elsevier B.V.
Collier, J.W.; Shah, R.B.; Bryant, A.R.; Habib, M.J.; Khan, M.A.; Faustino, P.J.
2011-01-01
A rapid, selective, and sensitive gradient HPLC method was developed for the analysis of dissolution samples of levothyroxine sodium tablets. Current USP methodology for levothyroxine (l-T4) was not adequate to resolve co-elutants from a variety of levothyroxine drug product formulations. The USP method for analyzing dissolution samples of the drug product has shown significant intra- and inter-day variability. The sources of method variability include chromatographic interferences introduced by the dissolution media and the formulation excipients. In the present work, chromatographic separation of levothyroxine was achieved on an Agilent 1100 Series HPLC with a Waters Nova-pak column (250mm × 3.9mm) using a 0.01 M phosphate buffer (pH 3.0)–methanol (55:45, v/v) in a gradient elution mobile phase at a flow rate of 1.0 mL/min and detection UV wavelength of 225 nm. The injection volume was 800 µL and the column temperature was maintained at 28 °C. The method was validated according to USP Category I requirements. The validation characteristics included accuracy, precision, specificity, linearity, and analytical range. The standard curve was found to have a linear relationship (r2 > 0.99) over the analytical range of 0.08–0.8 µg/mL. Accuracy ranged from 90 to 110% for low quality control (QC) standards and 95 to 105% for medium and high QC standards. Precision was <2% at all QC levels. The method was found to be accurate, precise, selective, and linear for l-T4 over the analytical range. The HPLC method was successfully applied to the analysis of dissolution samples of marketed levothyroxine sodium tablets. PMID:20947276
Towards Formal Implementation of PUS Standard
NASA Astrophysics Data System (ADS)
Ilić, D.
2009-05-01
As an effort to promote the reuse of on-board and ground systems ESA developed a standard for packet telemetry and telecommand - PUS. It defines a set of standard service models with the corresponding structures of the associated telemetry and telecommand packets. Various missions then can choose to implement those standard PUS services that best conform to their specific requirements. In this paper we propose a formal development (based on the Event-B method) of reusable service patterns, which can be instantiated for concrete application. Our formal models allow us to formally express and verify specific service properties including various telecommand and telemetry packet structure validation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neymark, J.; Kennedy, M.; Judkoff, R.
This report documents a set of diagnostic analytical verification cases for testing the ability of whole building simulation software to model the air distribution side of typical heating, ventilating and air conditioning (HVAC) equipment. These cases complement the unitary equipment cases included in American National Standards Institute (ANSI)/American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE) Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs, which test the ability to model the heat-transfer fluid side of HVAC equipment.
Helium Mass Spectrometer Leak Detection: A Method to Quantify Total Measurement Uncertainty
NASA Technical Reports Server (NTRS)
Mather, Janice L.; Taylor, Shawn C.
2015-01-01
In applications where leak rates of components or systems are evaluated against a leak rate requirement, the uncertainty of the measured leak rate must be included in the reported result. However, in the helium mass spectrometer leak detection method, the sensitivity, or resolution, of the instrument is often the only component of the total measurement uncertainty noted when reporting results. To address this shortfall, a measurement uncertainty analysis method was developed that includes the leak detector unit's resolution, repeatability, hysteresis, and drift, along with the uncertainty associated with the calibration standard. In a step-wise process, the method identifies the bias and precision components of the calibration standard, the measurement correction factor (K-factor), and the leak detector unit. Together these individual contributions to error are combined and the total measurement uncertainty is determined using the root-sum-square method. It was found that the precision component contributes more to the total uncertainty than the bias component, but the bias component is not insignificant. For helium mass spectrometer leak rate tests where unit sensitivity alone is not enough, a thorough evaluation of the measurement uncertainty such as the one presented herein should be performed and reported along with the leak rate value.
Unice, Kenneth M; Kreider, Marisa L; Panko, Julie M
2012-11-08
Pyrolysis(pyr)-GC/MS analysis of characteristic thermal decomposition fragments has been previously used for qualitative fingerprinting of organic sources in environmental samples. A quantitative pyr-GC/MS method based on characteristic tire polymer pyrolysis products was developed for tread particle quantification in environmental matrices including soil, sediment, and air. The feasibility of quantitative pyr-GC/MS analysis of tread was confirmed in a method evaluation study using artificial soil spiked with known amounts of cryogenically generated tread. Tread concentration determined by blinded analyses was highly correlated (r2 ≥ 0.88) with the known tread spike concentration. Two critical refinements to the initial pyrolysis protocol were identified including use of an internal standard and quantification by the dimeric markers vinylcyclohexene and dipentene, which have good specificity for rubber polymer with no other appreciable environmental sources. A novel use of deuterated internal standards of similar polymeric structure was developed to correct the variable analyte recovery caused by sample size, matrix effects, and ion source variability. The resultant quantitative pyr-GC/MS protocol is reliable and transferable between laboratories.
Tan, K. E.; Ellis, B. C.; Lee, R.; Stamper, P. D.; Zhang, S. X.
2012-01-01
Matrix-assisted laser desorption ionization–time of flight mass spectrometry (MALDI-TOF MS) has been found to be an accurate, rapid, and inexpensive method for the identification of bacteria and yeasts. Previous evaluations have compared the accuracy, time to identification, and costs of the MALDI-TOF MS method against standard identification systems or commercial panels. In this prospective study, we compared a protocol incorporating MALDI-TOF MS (MALDI protocol) with the current standard identification protocols (standard protocol) to determine the performance in actual practice using a specimen-based, bench-by-bench approach. The potential impact on time to identification (TTI) and costs had MALDI-TOF MS been the first-line identification method was quantitated. The MALDI protocol includes supplementary tests, notably for Streptococcus pneumoniae and Shigella, and indications for repeat MALDI-TOF MS attempts, often not measured in previous studies. A total of 952 isolates (824 bacterial isolates and 128 yeast isolates) recovered from 2,214 specimens were assessed using the MALDI protocol. Compared with standard protocols, the MALDI protocol provided identifications 1.45 days earlier on average (P < 0.001). In our laboratory, we anticipate that the incorporation of the MALDI protocol can reduce reagent and labor costs of identification by $102,424 or 56.9% within 12 months. The model included the fixed annual costs of the MALDI-TOF MS, such as the cost of protein standards and instrument maintenance, and the annual prevalence of organisms encountered in our laboratory. This comprehensive cost analysis model can be generalized to other moderate- to high-volume laboratories. PMID:22855510
Tan, K E; Ellis, B C; Lee, R; Stamper, P D; Zhang, S X; Carroll, K C
2012-10-01
Matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) has been found to be an accurate, rapid, and inexpensive method for the identification of bacteria and yeasts. Previous evaluations have compared the accuracy, time to identification, and costs of the MALDI-TOF MS method against standard identification systems or commercial panels. In this prospective study, we compared a protocol incorporating MALDI-TOF MS (MALDI protocol) with the current standard identification protocols (standard protocol) to determine the performance in actual practice using a specimen-based, bench-by-bench approach. The potential impact on time to identification (TTI) and costs had MALDI-TOF MS been the first-line identification method was quantitated. The MALDI protocol includes supplementary tests, notably for Streptococcus pneumoniae and Shigella, and indications for repeat MALDI-TOF MS attempts, often not measured in previous studies. A total of 952 isolates (824 bacterial isolates and 128 yeast isolates) recovered from 2,214 specimens were assessed using the MALDI protocol. Compared with standard protocols, the MALDI protocol provided identifications 1.45 days earlier on average (P < 0.001). In our laboratory, we anticipate that the incorporation of the MALDI protocol can reduce reagent and labor costs of identification by $102,424 or 56.9% within 12 months. The model included the fixed annual costs of the MALDI-TOF MS, such as the cost of protein standards and instrument maintenance, and the annual prevalence of organisms encountered in our laboratory. This comprehensive cost analysis model can be generalized to other moderate- to high-volume laboratories.
Rosskopf, U; Daas, A; Terao, E; von Hunolstein, C
2017-01-01
Before release onto the market, it must be demonstrated that the total and free polysaccharide (poly ribosyl-ribitol-phosphate, PRP) content of Haemophilus influenzae type b (Hib) vaccine complies with requirements. However, manufacturers use different methods to assay PRP content: a national control laboratory must establish and validate the relevant manufacturer methodology before using it to determine PRP content. An international study was organised by the World Health Organization (WHO), in collaboration with the Biological Standardisation Programme (BSP) of the Council of Europe/European Directorate for the Quality of Medicines & HealthCare (EDQM) and of the European Union Commission, to verify the suitability of a single method for determining PRP content in liquid pentavalent vaccines (DTwP-HepB-Hib) containing a whole-cell pertussis component. It consists of HCl hydrolysis followed by chromatographic separation and quantification of ribitol on a CarboPac MA1 column using high-performance anion exchange chromatography coupled with pulsed amperometric detection (HPAEC-PAD). The unconjugated, free, PRP is separated from the total PRP using C4 solid-phase extraction cartridges (SPE C4). Ten quality control laboratories performed two independent analyses applying the proposed analytical test protocol to five vaccine samples, including a vaccine lot with sub-potent PRP content and very high free PRP content. Both WHO PRP standard and ribitol reference standard were included as calibrating standards. A significant bias between WHO PRP standard and ribitol reference standard was observed. Study results showed that the proposed analytical method is, in principle, suitable for the intended use provided that a validation is performed as usually expected from quality control laboratories.
Coordinate measuring machine test standard apparatus and method
Bieg, L.F.
1994-08-30
A coordinate measuring machine test standard apparatus and method are disclosed which includes a rotary spindle having an upper phase plate and an axis of rotation, a kinematic ball mount attached to the phase plate concentric with the axis of rotation of the phase plate, a groove mounted at the circumference of the phase plate, and an arm assembly which rests in the groove. The arm assembly has a small sphere at one end and a large sphere at the other end. The small sphere may be a coordinate measuring machine probe tip and may have variable diameters. The large sphere is secured in the kinematic ball mount and the arm is held in the groove. The kinematic ball mount includes at least three mounting spheres and the groove is an angular locating groove including at least two locking spheres. The arm may have a hollow inner core and an outer layer. The rotary spindle may be a ratio reducer. The device is used to evaluate the measuring performance of a coordinate measuring machine for periodic recertification, including 2 and 3 dimensional accuracy, squareness, straightness, and angular accuracy. 5 figs.
Milder, Jeffrey C; Arbuthnot, Margaret; Blackman, Allen; Brooks, Sharon E; Giovannucci, Daniele; Gross, Lee; Kennedy, Elizabeth T; Komives, Kristin; Lambin, Eric F; Lee, Audrey; Meyer, Daniel; Newton, Peter; Phalan, Ben; Schroth, Götz; Semroc, Bambi; Van Rikxoort, Henk; Zrust, Michal
2015-04-01
Sustainability standards and certification serve to differentiate and provide market recognition to goods produced in accordance with social and environmental good practices, typically including practices to protect biodiversity. Such standards have seen rapid growth, including in tropical agricultural commodities such as cocoa, coffee, palm oil, soybeans, and tea. Given the role of sustainability standards in influencing land use in hotspots of biodiversity, deforestation, and agricultural intensification, much could be gained from efforts to evaluate and increase the conservation payoff of these schemes. To this end, we devised a systematic approach for monitoring and evaluating the conservation impacts of agricultural sustainability standards and for using the resulting evidence to improve the effectiveness of such standards over time. The approach is oriented around a set of hypotheses and corresponding research questions about how sustainability standards are predicted to deliver conservation benefits. These questions are addressed through data from multiple sources, including basic common information from certification audits; field monitoring of environmental outcomes at a sample of certified sites; and rigorous impact assessment research based on experimental or quasi-experimental methods. Integration of these sources can generate time-series data that are comparable across sites and regions and provide detailed portraits of the effects of sustainability standards. To implement this approach, we propose new collaborations between the conservation research community and the sustainability standards community to develop common indicators and monitoring protocols, foster data sharing and synthesis, and link research and practice more effectively. As the role of sustainability standards in tropical land-use governance continues to evolve, robust evidence on the factors contributing to effectiveness can help to ensure that such standards are designed and implemented to maximize benefits for biodiversity conservation. © 2014 Society for Conservation Biology.
Methods of measurement for semiconductor materials, process control, and devices
NASA Technical Reports Server (NTRS)
Bullis, W. M. (Editor)
1972-01-01
Activities directed toward the development of methods of measurement for semiconductor materials, process control, and devices are described. Topics investigated include: measurements of transistor delay time; application of the infrared response technique to the study of radiation-damaged, lithium-drifted silicon detectors; and identification of a condition that minimizes wire flexure and reduces the failure rate of wire bonds in transistors and integrated circuits under slow thermal cycling conditions. Supplementary data concerning staff, standards committee activities, technical services, and publications are included as appendixes.
Langley Wind Tunnel Data Quality Assurance-Check Standard Results
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.
2000-01-01
A framework for statistical evaluation, control and improvement of wind funnel measurement processes is presented The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check standard testing and a small number of customer repeat-run sets. The results of check standard and customer repeat-run -sets are analyzed using the statistical control chart-methods of Walter A. Shewhart long familiar to the SQC community. Control chart results are presented for. various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.
ERIC Educational Resources Information Center
Reichler-Beguelin, Marie-Jose, Ed.
1993-01-01
Papers from the conference on linguistic anomaly include: "La definition interactive de la deviance en situation exolingue et bilingue" ("The Interactive Definition of Deviation in Exolinguistic and Bilingual Situations") (Bernard Py); "La negociation ratee: pratiques sociales et methodes interactives du traitement de la…
Wavelet-Bayesian inference of cosmic strings embedded in the cosmic microwave background
NASA Astrophysics Data System (ADS)
McEwen, J. D.; Feeney, S. M.; Peiris, H. V.; Wiaux, Y.; Ringeval, C.; Bouchet, F. R.
2017-12-01
Cosmic strings are a well-motivated extension to the standard cosmological model and could induce a subdominant component in the anisotropies of the cosmic microwave background (CMB), in addition to the standard inflationary component. The detection of strings, while observationally challenging, would provide a direct probe of physics at very high-energy scales. We develop a framework for cosmic string inference from observations of the CMB made over the celestial sphere, performing a Bayesian analysis in wavelet space where the string-induced CMB component has distinct statistical properties to the standard inflationary component. Our wavelet-Bayesian framework provides a principled approach to compute the posterior distribution of the string tension Gμ and the Bayesian evidence ratio comparing the string model to the standard inflationary model. Furthermore, we present a technique to recover an estimate of any string-induced CMB map embedded in observational data. Using Planck-like simulations, we demonstrate the application of our framework and evaluate its performance. The method is sensitive to Gμ ∼ 5 × 10-7 for Nambu-Goto string simulations that include an integrated Sachs-Wolfe contribution only and do not include any recombination effects, before any parameters of the analysis are optimized. The sensitivity of the method compares favourably with other techniques applied to the same simulations.
Durão, Solange; Kredo, Tamara; Volmink, Jimmy
2015-06-01
To develop, assess, and maximize the sensitivity of a search strategy to identify diet and nutrition trials in PubMed using relative recall. We developed a search strategy to identify diet and nutrition trials in PubMed. We then constructed a gold standard reference set to validate the identified trials using the relative recall method. Relative recall was calculated by dividing the number of references from the gold standard our search strategy identified by the total number of references in the gold standard. Our gold standard comprised 298 trials, derived from 16 included systematic reviews. The initial search strategy identified 242 of 298 references, with a relative recall of 81.2% [95% confidence interval (CI): 76.3%, 85.5%]. We analyzed titles and abstracts of the 56 missed references for possible additional terms. We then modified the search strategy accordingly. The relative recall of the final search strategy was 88.6% (95% CI: 84.4%, 91.9%). We developed a search strategy to identify diet and nutrition trials in PubMed with a high relative recall (sensitivity). This could be useful for establishing a nutrition trials register to support the conduct of future research, including systematic reviews. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Rachakonda, Prem; Muralikrishnan, Bala; Cournoyer, Luc; Cheok, Geraldine; Lee, Vincent; Shilling, Meghan; Sawyer, Daniel
2017-10-01
The Dimensional Metrology Group at the National Institute of Standards and Technology is performing research to support the development of documentary standards within the ASTM E57 committee. This committee is addressing the point-to-point performance evaluation of a subclass of 3D imaging systems called terrestrial laser scanners (TLSs), which are laser-based and use a spherical coordinate system. This paper discusses the usage of sphere targets for this effort, and methods to minimize the errors due to the determination of their centers. The key contributions of this paper include methods to segment sphere data from a TLS point cloud, and the study of some of the factors that influence the determination of sphere centers.
NASA Handbook for Models and Simulations: An Implementation Guide for NASA-STD-7009
NASA Technical Reports Server (NTRS)
Steele, Martin J.
2013-01-01
The purpose of this Handbook is to provide technical information, clarification, examples, processes, and techniques to help institute good modeling and simulation practices in the National Aeronautics and Space Administration (NASA). As a companion guide to NASA-STD- 7009, Standard for Models and Simulations, this Handbook provides a broader scope of information than may be included in a Standard and promotes good practices in the production, use, and consumption of NASA modeling and simulation products. NASA-STD-7009 specifies what a modeling and simulation activity shall or should do (in the requirements) but does not prescribe how the requirements are to be met, which varies with the specific engineering discipline, or who is responsible for complying with the requirements, which depends on the size and type of project. A guidance document, which is not constrained by the requirements of a Standard, is better suited to address these additional aspects and provide necessary clarification. This Handbook stems from the Space Shuttle Columbia Accident Investigation (2003), which called for Agency-wide improvements in the "development, documentation, and operation of models and simulations"' that subsequently elicited additional guidance from the NASA Office of the Chief Engineer to include "a standard method to assess the credibility of the models and simulations."2 General methods applicable across the broad spectrum of model and simulation (M&S) disciplines were sought to help guide the modeling and simulation processes within NASA and to provide for consistent reporting ofM&S activities and analysis results. From this, the standardized process for the M&S activity was developed. The major contents of this Handbook are the implementation details of the general M&S requirements ofNASA-STD-7009, including explanations, examples, and suggestions for improving the credibility assessment of an M&S-based analysis.
Uncertainty Analysis for Angle Calibrations Using Circle Closure
Estler, W. Tyler
1998-01-01
We analyze two types of full-circle angle calibrations: a simple closure in which a single set of unknown angular segments is sequentially compared with an unknown reference angle, and a dual closure in which two divided circles are simultaneously calibrated by intercomparison. In each case, the constraint of circle closure provides auxiliary information that (1) enables a complete calibration process without reference to separately calibrated reference artifacts, and (2) serves to reduce measurement uncertainty. We derive closed-form expressions for the combined standard uncertainties of angle calibrations, following guidelines published by the International Organization for Standardization (ISO) and NIST. The analysis includes methods for the quantitative evaluation of the standard uncertainty of small angle measurement using electronic autocollimators, including the effects of calibration uncertainty and air turbulence. PMID:28009359
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kane, V.E.
1979-10-01
The standard maximum likelihood and moment estimation procedures are shown to have some undesirable characteristics for estimating the parameters in a three-parameter lognormal distribution. A class of goodness-of-fit estimators is found which provides a useful alternative to the standard methods. The class of goodness-of-fit tests considered include the Shapiro-Wilk and Shapiro-Francia tests which reduce to a weighted linear combination of the order statistics that can be maximized in estimation problems. The weighted-order statistic estimators are compared to the standard procedures in Monte Carlo simulations. Bias and robustness of the procedures are examined and example data sets analyzed including geochemical datamore » from the National Uranium Resource Evaluation Program.« less
Anand, K; Saini, Ks; Chopra, Y; Binod, Sk
2010-07-01
'Medical Devices' include everything from highly sophisticated, computerized, medical equipment, right down to simple wooden tongue depressors. Regulations embody the public expectations for how buildings and facilities are expected to perform and as such represent public policy. Regulators, who develop and enforce regulations, are empowered to act in the public's interest to set this policy and are ultimately responsible to the public in this regard. Standardization contributes to the basic infrastructure that underpins society including health and environment, while promoting sustainability and good regulatory practice. The international organizations that produce International Standards are the International Electrotechnical Commission (IEC), the International Organization for Standardization (ISO), and the International Telecommunication Union (ITU). With the increasing globalization of markets, International Standards (as opposed to regional or national standards) have become critical to the trading process, ensuring a level playing field for exports, and ensuring that imports meet the internationally recognized levels of performance and safety. The development of standards is done in response to sectors and stakeholders that express a clearly established need for them. An industry sector or other stakeholder group typically communicates its requirement for standards to one of the national members. To be accepted for development, a proposed work item must receive a majority support of the participating members, who verify the global relevance of the proposed item. The regulatory authority (RA) should provide a method for the recognition of international voluntary standards and for public notification of such recognition. The process of recognition may vary from country to country. Recognition may occur by periodic publication of lists of standards that a regulatory authority has found will meet the Essential Principles. In conclusion, International standards, such as, basic standards, group standards, and product standards, are a tool for harmonizing regulatory processes, to assure the safety, quality, and performance of medical devices. Standards represent the opinion of experts from all interested parties, including industry, regulators, users, and others.
Murphy, Thomas; Schwedock, Julie; Nguyen, Kham; Mills, Anna; Jones, David
2015-01-01
New recommendations for the validation of rapid microbiological methods have been included in the revised Technical Report 33 release from the PDA. The changes include a more comprehensive review of the statistical methods to be used to analyze data obtained during validation. This case study applies those statistical methods to accuracy, precision, ruggedness, and equivalence data obtained using a rapid microbiological methods system being evaluated for water bioburden testing. Results presented demonstrate that the statistical methods described in the PDA Technical Report 33 chapter can all be successfully applied to the rapid microbiological method data sets and gave the same interpretation for equivalence to the standard method. The rapid microbiological method was in general able to pass the requirements of PDA Technical Report 33, though the study shows that there can be occasional outlying results and that caution should be used when applying statistical methods to low average colony-forming unit values. Prior to use in a quality-controlled environment, any new method or technology has to be shown to work as designed by the manufacturer for the purpose required. For new rapid microbiological methods that detect and enumerate contaminating microorganisms, additional recommendations have been provided in the revised PDA Technical Report No. 33. The changes include a more comprehensive review of the statistical methods to be used to analyze data obtained during validation. This paper applies those statistical methods to analyze accuracy, precision, ruggedness, and equivalence data obtained using a rapid microbiological method system being validated for water bioburden testing. The case study demonstrates that the statistical methods described in the PDA Technical Report No. 33 chapter can be successfully applied to rapid microbiological method data sets and give the same comparability results for similarity or difference as the standard method. © PDA, Inc. 2015.
A protocol for lifetime energy and environmental impact assessment of building insulation materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shrestha, Som S., E-mail: shresthass@ornl.gov; Biswas, Kaushik; Desjarlais, Andre O.
This article describes a proposed protocol that is intended to provide a comprehensive list of factors to be considered in evaluating the direct and indirect environmental impacts of building insulation materials, as well as detailed descriptions of standardized calculation methodologies to determine those impacts. The energy and environmental impacts of insulation materials can generally be divided into two categories: (1) direct impact due to the embodied energy of the insulation materials and other factors and (2) indirect or environmental impacts avoided as a result of reduced building energy use due to addition of insulation. Standards and product category rules exist,more » which provide guidelines about the life cycle assessment (LCA) of materials, including building insulation products. However, critical reviews have suggested that these standards fail to provide complete guidance to LCA studies and suffer from ambiguities regarding the determination of the environmental impacts of building insulation and other products. The focus of the assessment protocol described here is to identify all factors that contribute to the total energy and environmental impacts of different building insulation products and, more importantly, provide standardized determination methods that will allow comparison of different insulation material types. Further, the intent is not to replace current LCA standards but to provide a well-defined, easy-to-use comparison method for insulation materials using existing LCA guidelines. - Highlights: • We proposed a protocol to evaluate the environmental impacts of insulation materials. • The protocol considers all life cycle stages of an insulation material. • Both the direct environmental impacts and the indirect impacts are defined. • Standardized calculation methods for the ‘avoided operational energy’ is defined. • Standardized calculation methods for the ‘avoided environmental impact’ is defined.« less
Miniprimer PCR, a New Lens for Viewing the Microbial World▿ †
Isenbarger, Thomas A.; Finney, Michael; Ríos-Velázquez, Carlos; Handelsman, Jo; Ruvkun, Gary
2008-01-01
Molecular methods based on the 16S rRNA gene sequence are used widely in microbial ecology to reveal the diversity of microbial populations in environmental samples. Here we show that a new PCR method using an engineered polymerase and 10-nucleotide “miniprimers” expands the scope of detectable sequences beyond those detected by standard methods using longer primers and Taq polymerase. After testing the method in silico to identify divergent ribosomal genes in previously cloned environmental sequences, we applied the method to soil and microbial mat samples, which revealed novel 16S rRNA gene sequences that would not have been detected with standard primers. Deeply divergent sequences were discovered with high frequency and included representatives that define two new division-level taxa, designated CR1 and CR2, suggesting that miniprimer PCR may reveal new dimensions of microbial diversity. PMID:18083877
76 FR 54293 - Review of National Ambient Air Quality Standards for Carbon Monoxide
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-31
...This rule is being issued at this time as required by a court order governing the schedule for completion of this review of the air quality criteria and the national ambient air quality standards (NAAQS) for carbon monoxide (CO). Based on its review, the EPA concludes the current primary standards are requisite to protect public health with an adequate margin of safety, and is retaining those standards. After review of the air quality criteria, EPA further concludes that no secondary standard should be set for CO at this time. EPA is also making changes to the ambient air monitoring requirements for CO, including those related to network design, and is updating, without substantive change, aspects of the Federal reference method.
Boehm, A.B.; Griffith, J.; McGee, C.; Edge, T.A.; Solo-Gabriele, H. M.; Whitman, R.; Cao, Y.; Getrich, M.; Jay, J.A.; Ferguson, D.; Goodwin, K.D.; Lee, C.M.; Madison, M.; Weisberg, S.B.
2009-01-01
Aims: The absence of standardized methods for quantifying faecal indicator bacteria (FIB) in sand hinders comparison of results across studies. The purpose of the study was to compare methods for extraction of faecal bacteria from sands and recommend a standardized extraction technique. Methods and Results: Twenty-two methods of extracting enterococci and Escherichia coli from sand were evaluated, including multiple permutations of hand shaking, mechanical shaking, blending, sonication, number of rinses, settling time, eluant-to-sand ratio, eluant composition, prefiltration and type of decantation. Tests were performed on sands from California, Florida and Lake Michigan. Most extraction parameters did not significantly affect bacterial enumeration. anova revealed significant effects of eluant composition and blending; with both sodium metaphosphate buffer and blending producing reduced counts. Conclusions: The simplest extraction method that produced the highest FIB recoveries consisted of 2 min of hand shaking in phosphate-buffered saline or deionized water, a 30-s settling time, one-rinse step and a 10 : 1 eluant volume to sand weight ratio. This result was consistent across the sand compositions tested in this study but could vary for other sand types. Significance and Impact of the Study: Method standardization will improve the understanding of how sands affect surface water quality. ?? 2009 The Society for Applied Microbiology.
Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.
2017-01-01
Abstract. Recently, a class of no-gold-standard (NGS) techniques have been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from F-Fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
Gosselin, Robert C; Adcock, Dorothy M; Bates, Shannon M; Douxfils, Jonathan; Favaloro, Emmanuel J; Gouin-Thibault, Isabelle; Guillermo, Cecilia; Kawai, Yohko; Lindhoff-Last, Edelgard; Kitchen, Steve
2018-03-01
This guidance document was prepared on behalf of the International Council for Standardization in Haematology (ICSH) for providing haemostasis-related guidance documents for clinical laboratories. This inaugural coagulation ICSH document was developed by an ad hoc committee, comprised of international clinical and laboratory direct acting oral anticoagulant (DOAC) experts. The committee developed consensus recommendations for laboratory measurement of DOACs (dabigatran, rivaroxaban, apixaban and edoxaban), which would be germane for laboratories assessing DOAC anticoagulation. This guidance document addresses all phases of laboratory DOAC measurements, including pre-analytical (e.g. preferred time sample collection, preferred sample type, sample stability), analytical (gold standard method, screening and quantifying methods) and post analytical (e.g. reporting units, quality assurance). The committee addressed the use and limitations of screening tests such as prothrombin time, activated partial thromboplastin time as well as viscoelastic measurements of clotting blood and point of care methods. Additionally, the committee provided recommendations for the proper validation or verification of performance of laboratory assays prior to implementation for clinical use, and external quality assurance to provide continuous assessment of testing and reporting method. Schattauer GmbH Stuttgart.
Bekiroglu, Somer; Myrberg, Olle; Ostman, Kristina; Ek, Marianne; Arvidsson, Torbjörn; Rundlöf, Torgny; Hakkarainen, Birgit
2008-08-05
A 1H-nuclear magnetic resonance (NMR) spectroscopy method for quantitative determination of benzethonium chloride (BTC) as a constituent of grapefruit seed extract was developed. The method was validated, assessing its specificity, linearity, range, and precision, as well as accuracy, limit of quantification and robustness. The method includes quantification using an internal reference standard, 1,3,5-trimethoxybenzene, and regarded as simple, rapid, and easy to implement. A commercial grapefruit seed extract was studied and the experiments were performed on spectrometers operating at two different fields, 300 and 600 MHz for proton frequencies, the former with a broad band (BB) probe and the latter equipped with both a BB probe and a CryoProbe. The concentration average for the product sample was 78.0, 77.8 and 78.4 mg/ml using the 300 BB probe, the 600MHz BB probe and CryoProbe, respectively. The standard deviation and relative standard deviation (R.S.D., in parenthesis) for the average concentrations was 0.2 (0.3%), 0.3 (0.4%) and 0.3mg/ml (0.4%), respectively.
Near-infrared fluorescence image quality test methods for standardized performance evaluation
NASA Astrophysics Data System (ADS)
Kanniyappan, Udayakumar; Wang, Bohan; Yang, Charles; Ghassemi, Pejhman; Wang, Quanzeng; Chen, Yu; Pfefer, Joshua
2017-03-01
Near-infrared fluorescence (NIRF) imaging has gained much attention as a clinical method for enhancing visualization of cancers, perfusion and biological structures in surgical applications where a fluorescent dye is monitored by an imaging system. In order to address the emerging need for standardization of this innovative technology, it is necessary to develop and validate test methods suitable for objective, quantitative assessment of device performance. Towards this goal, we develop target-based test methods and investigate best practices for key NIRF imaging system performance characteristics including spatial resolution, depth of field and sensitivity. Characterization of fluorescence properties was performed by generating excitation-emission matrix properties of indocyanine green and quantum dots in biological solutions and matrix materials. A turbid, fluorophore-doped target was used, along with a resolution target for assessing image sharpness. Multi-well plates filled with either liquid or solid targets were generated to explore best practices for evaluating detection sensitivity. Overall, our results demonstrate the utility of objective, quantitative, target-based testing approaches as well as the need to consider a wide range of factors in establishing standardized approaches for NIRF imaging system performance.
Resolution for color photography
NASA Astrophysics Data System (ADS)
Hubel, Paul M.; Bautsch, Markus
2006-02-01
Although it is well known that luminance resolution is most important, the ability to accurately render colored details, color textures, and colored fabrics cannot be overlooked. This includes the ability to accurately render single-pixel color details as well as avoiding color aliasing. All consumer digital cameras on the market today record in color and the scenes people are photographing are usually color. Yet almost all resolution measurements made on color cameras are done using a black and white target. In this paper we present several methods for measuring and quantifying color resolution. The first method, detailed in a previous publication, uses a slanted-edge target of two colored surfaces in place of the standard black and white edge pattern. The second method employs the standard black and white targets recommended in the ISO standard, but records these onto the camera through colored filters thus giving modulation between black and one particular color component; red, green, and blue color separation filters are used in this study. The third method, conducted at Stiftung Warentest, an independent consumer organization of Germany, uses a whitelight interferometer to generate fringe pattern targets of varying color and spatial frequency.
On the Latent Regression Model of Item Response Theory. Research Report. ETS RR-07-12
ERIC Educational Resources Information Center
Antal, Tamás
2007-01-01
Full account of the latent regression model for the National Assessment of Educational Progress is given. The treatment includes derivation of the EM algorithm, Newton-Raphson method, and the asymptotic standard errors. The paper also features the use of the adaptive Gauss-Hermite numerical integration method as a basic tool to evaluate…
ERIC Educational Resources Information Center
Owen-Stone, Deborah S.
2012-01-01
The purpose of this concurrent mixed methods study was to examine the collaborative relationship between scientists and science teachers and to incorporate and advocate scientific literacy based on past and current educational theories such as inquiry based teaching. The scope of this study included archived student standardized test scores,…
Yousuf, Naveed; Violato, Claudio; Zuberi, Rukhsana W
2015-01-01
CONSTRUCT: Authentic standard setting methods will demonstrate high convergent validity evidence of their outcomes, that is, cutoff scores and pass/fail decisions, with most other methods when compared with each other. The objective structured clinical examination (OSCE) was established for valid, reliable, and objective assessment of clinical skills in health professions education. Various standard setting methods have been proposed to identify objective, reliable, and valid cutoff scores on OSCEs. These methods may identify different cutoff scores for the same examinations. Identification of valid and reliable cutoff scores for OSCEs remains an important issue and a challenge. Thirty OSCE stations administered at least twice in the years 2010-2012 to 393 medical students in Years 2 and 3 at Aga Khan University are included. Psychometric properties of the scores are determined. Cutoff scores and pass/fail decisions of Wijnen, Cohen, Mean-1.5SD, Mean-1SD, Angoff, borderline group and borderline regression (BL-R) methods are compared with each other and with three variants of cluster analysis using repeated measures analysis of variance and Cohen's kappa. The mean psychometric indices on the 30 OSCE stations are reliability coefficient = 0.76 (SD = 0.12); standard error of measurement = 5.66 (SD = 1.38); coefficient of determination = 0.47 (SD = 0.19), and intergrade discrimination = 7.19 (SD = 1.89). BL-R and Wijnen methods show the highest convergent validity evidence among other methods on the defined criteria. Angoff and Mean-1.5SD demonstrated least convergent validity evidence. The three cluster variants showed substantial convergent validity with borderline methods. Although there was a high level of convergent validity of Wijnen method, it lacks the theoretical strength to be used for competency-based assessments. The BL-R method is found to show the highest convergent validity evidences for OSCEs with other standard setting methods used in the present study. We also found that cluster analysis using mean method can be used for quality assurance of borderline methods. These findings should be further confirmed by studies in other settings.
Analysis of Indonesian educational system standard with KSIM cross-impact method
NASA Astrophysics Data System (ADS)
Arridjal, F.; Aldila, D.; Bustamam, A.
2017-07-01
The Result of The Programme of International Student Assessment (PISA) on 2012 shows that Indonesia is on 64'th position from 65 countries in Mathematics Mean Score. The 2013 Learning Curve Mapping, Indonesia is included in the 10th category of countries with the lowest performance on cognitive skills aspect, i.e. 37'th position from 40 countries. Competency is built by 3 aspects, one of them is cognitive aspect. The low result of mapping on cognitive aspect, describe the low of graduate competences as an output of Indonesia National Education System (INES). INES adopting a concept Eight Educational System Standards (EESS), one of them is graduate competency standard which connected directly with Indonesia's students. This research aims is to model INES by using KSIM cross-impact. Linear regression models of EESS constructed using the accreditation national data of Senior High Schools in Indonesia. The results then interpreted as impact value on the construction of KSIM cross-impact INES. The construction is used to analyze the interaction of EESS and doing numerical simulation for possible public policy in the education sector, i.e. stimulate the growth of education staff standard, content, process and infrastructure. All simulations of public policy has been done with 2 methods i.e with a multiplier impact method and with constant intervention method. From numerical simulation result, it is shown that stimulate the growth standard of content in the construction KSIM cross-impact EESS is the best option for public policy to maximize the growth of graduate competency standard.
Franc, Jeffrey Michael; Ingrassia, Pier Luigi; Verde, Manuela; Colombo, Davide; Della Corte, Francesco
2015-02-01
Surge capacity, or the ability to manage an extraordinary volume of patients, is fundamental for hospital management of mass-casualty incidents. However, quantification of surge capacity is difficult and no universal standard for its measurement has emerged, nor has a standardized statistical method been advocated. As mass-casualty incidents are rare, simulation may represent a viable alternative to measure surge capacity. Hypothesis/Problem The objective of the current study was to develop a statistical method for the quantification of surge capacity using a combination of computer simulation and simple process-control statistical tools. Length-of-stay (LOS) and patient volume (PV) were used as metrics. The use of this method was then demonstrated on a subsequent computer simulation of an emergency department (ED) response to a mass-casualty incident. In the derivation phase, 357 participants in five countries performed 62 computer simulations of an ED response to a mass-casualty incident. Benchmarks for ED response were derived from these simulations, including LOS and PV metrics for triage, bed assignment, physician assessment, and disposition. In the application phase, 13 students of the European Master in Disaster Medicine (EMDM) program completed the same simulation scenario, and the results were compared to the standards obtained in the derivation phase. Patient-volume metrics included number of patients to be triaged, assigned to rooms, assessed by a physician, and disposed. Length-of-stay metrics included median time to triage, room assignment, physician assessment, and disposition. Simple graphical methods were used to compare the application phase group to the derived benchmarks using process-control statistical tools. The group in the application phase failed to meet the indicated standard for LOS from admission to disposition decision. This study demonstrates how simulation software can be used to derive values for objective benchmarks of ED surge capacity using PV and LOS metrics. These objective metrics can then be applied to other simulation groups using simple graphical process-control tools to provide a numeric measure of surge capacity. Repeated use in simulations of actual EDs may represent a potential means of objectively quantifying disaster management surge capacity. It is hoped that the described statistical method, which is simple and reusable, will be useful for investigators in this field to apply to their own research.
Connor, Brooke F.; Rose, Donna L.; Noriega, Mary C.; Murtaugh, Lucinda K.; Abney, Sonja R.
1998-01-01
This report presents precision and accuracy data for volatile organic compounds (VOCs) in the nanogram-per-liter range, including aromatic hydrocarbons, reformulated fuel components, and halogenated hydrocarbons using purge and trap capillary-column gas chromatography/mass spectrometry. One-hundred-four VOCs were initially tested. Of these, 86 are suitable for determination by this method. Selected data are provided for the 18 VOCs that were not included. This method also allows for the reporting of semiquantitative results for tentatively identified VOCs not included in the list of method compounds. Method detection limits, method performance data, preservation study results, and blank results are presented. The authors describe a procedure for reporting low-concentration detections at less than the reporting limit. The nondetection value (NDV) is introduced as a statistically defined reporting limit designed to limit false positives and false negatives to less than 1 percent. Nondetections of method compounds are reported as ?less than NDV.? Positive detections measured at less than NDV are reported as estimated concentrations to alert the data user to decreased confidence in accurate quantitation. Instructions are provided for analysts to report data at less than the reporting limits. This method can support the use of either method reporting limits that censor detections at lower concentrations or the use of NDVs as reporting limits. The data-reporting strategy for providing analytical results at less than the reporting limit is a result of the increased need to identify the presence or absence of environmental contaminants in water samples at increasingly lower concentrations. Long-term method detection limits (LTMDLs) for 86 selected compounds range from 0.013 to 2.452 micrograms per liter (?g/L) and differ from standard method detection limits (MDLs) in that the LTMDLs include the long-term variance of multiple instruments, multiple operators, and multiple calibrations over a longer time. For these reasons, LTMDLs are expected to be slightly higher than standard MDLs. Recoveries for all of the VOCs tested ranged from 36 (tert-butyl formate) to 155 percent (pentachlorobenzene). The majority of the compounds ranged from 85 to 115 percent recovery and had less than 5 percent relative standard deviation for concentrations spiked between 1 to 500 ?g/L in volatile blank-, surface-, and ground-water samples. Recoveries of 60 set spikes at low concentrations ranged from 70 to 114 percent (1,2,3- trimethylbenzene and acetone). Recovery data were collected over 6 months with multiple instruments, operators, and calibrations. In this method, volatile organic compounds are extracted from a water sample by actively purging with helium. The VOCs are collected onto a sorbent trap, thermally desorbed, separated by a Megabore gas chromatographic capillary column, and finally determined by a full-scan quadrupole mass spectrometer. Compound identification is confirmed by the gas chromatographic retention time and by the resultant mass spectrum, typically identified by three unique ions. An unknown compound detected in a sample can be tentatively identified by comparing the unknown mass spectrum to reference spectra in the mass-spectra computer-data system library compiled by the National Institute of Standards and Technology.
Ileleji, Klein E; Garcia, Arnoldo A; Kingsly, Ambrose R P; Clementson, Clairmont L
2010-01-01
This study quantified the variability among 14 standard moisture loss-on-drying (gravimetric) methods for determination of the moisture content of corn distillers dried grains with solubles (DDGS). The methods were compared with the Karl Fischer (KF) titration method to determine their percent variation from the KF method. Additionally, the thermo-balance method using a halogen moisture analyzer that is routinely used in fuel ethanol plants was included in the methods investigated. Moisture contents by the loss-on-drying methods were significantly different for DDGS samples from three fuel ethanol plants. The percent deviation of the moisture loss-on-drying methods decreased with decrease in drying temperature and, to a lesser extent, drying time. This was attributed to an overestimation of moisture content in DDGS due to the release of volatiles at high temperatures. Our findings indicate that the various methods that have been used for moisture determination by moisture loss-on-drying will not give identical results and therefore, caution should be exercised when selecting a moisture loss-on-drying method for DDGS.
Development of the Korean framework for senior-friendly hospitals: a Delphi study.
Kim, Yoon-Sook; Han, Seol-Heui; Hwang, Jeong-Hae; Park, Jae-Min; Lee, Jongmin; Choi, Jaekyung; Moon, Yeonsil; Kim, Hee Joung; Shin, Grace Jung Eun; Lee, Ji-Sun; Choi, Ye Ji; Uhm, Kyeong Eun; Kim, In Ae; Nam, Ji-Won
2017-08-04
Aging is an inevitable part of life. One can maintain well-being and wellness even after discharge and/or transition if his or her functional decline is minimized, sudden decline is prevented, and functioning is promoted during hospitalization. Caring appropriately for elderly patients requires the systematic application of Senior-Friendly Hospital principles to all operating systems, including medical centres' organization and environment, as well as patient treatment processes. The Senior-Friendly Hospital framework is valid and important for patient safety and quality improvement. This study aimed to make recommendations regarding the development of the Korean Framework for Senior-Friendly Hospitals for older patients' care management, patient safety interventions, and health promotion, via a Delphi survey. Two rounds of Delphi surveying were conducted with 15 participants who had at least 3 years' experience in accreditation surveying and medical accreditation standards, survey methods, and accreditation investigator education. In each round, we calculated statistics describing each standard's validity and feasibility. The Korean Framework for Senior-Friendly Hospitals included 4 Chapters, 11 categories, and 67 standards through consensus of the Senior-Friendly Hospitals task force and experts' peer review. After the two rounds of Delphi surveying, validity evaluation led to no changes in standards of the Senior-Friendly Hospitals; however, the number of standards showing adequate validity decreased from 67 to 58. Regarding feasibility, no changes were necessary in the standards; however, the number of categories showing adequate feasibility decreased from 11 to 8 and from 67 to 30, respectively. The excluded categories were 3.2, 4.2, and 4.3 (service, transportation, and signage and identification). The highest feasibility values were given to standards 2.1.1, 4.1.4, and 4.1.6. The highest feasibility score was given to standard 2.4.2. The Korean Framework for Senior-Friendly Hospitals needs to include 4 Chapters, 8 categories, and 30 standards. The Accreditation Program for Healthcare Organizations should include Senior-Friendly Hospitals -relevant standards considering Korea's medical environment.
Neutron activation analysis of certified samples by the absolute method
NASA Astrophysics Data System (ADS)
Kadem, F.; Belouadah, N.; Idiri, Z.
2015-07-01
The nuclear reactions analysis technique is mainly based on the relative method or the use of activation cross sections. In order to validate nuclear data for the calculated cross section evaluated from systematic studies, we used the neutron activation analysis technique (NAA) to determine the various constituent concentrations of certified samples for animal blood, milk and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently performing a measurement of the activity of the sample. The fundamental equation of the activation connects several physical parameters including the cross section that is essential for the quantitative determination of the different elements composing the sample without resorting to the use of standard sample. Called the absolute method, it allows a measurement as accurate as the relative method. The results obtained by the absolute method showed that the values are as precise as the relative method requiring the use of standard sample for each element to be quantified.
UPLC-MS/MS determination of ptaquiloside and pterosin B in preserved natural water.
Clauson-Kaas, Frederik; Hansen, Hans Christian Bruun; Strobel, Bjarne W
2016-11-01
The naturally occurring carcinogen ptaquiloside and its degradation product pterosin B are found in water leaching from bracken stands. The objective of this work is to present a new sample preservation method and a fast UPLC-MS/MS method for quantification of ptaquiloside and pterosin B in environmental water samples, employing a novel internal standard. A faster, reliable, and efficient method was developed for isolation of high purity ptaquiloside and pterosin B from plant material for use as analytical standards, with purity verified by 1 H-NMR. The chemical analysis was performed by cleanup and preconcentration of samples with solid phase extraction, before analyte quantification with UPLC-MS/MS. By including gradient elution and optimizing the liquid chromatography mobile phase buffer system, a total run cycle of 5 min was achieved, with method detection limits, including preconcentration, of 8 and 4 ng/L for ptaquiloside and pterosin B, respectively. The use of loganin as internal standard improved repeatability of the determination of both analytes, though it could not be employed for sample preparation. Buffering raw water samples in situ with ammonium acetate to pH ∼5.5 decisively increased sample integrity at realistic transportation and storing conditions prior to extraction. Groundwater samples collected in November 2015 at the shallow water table below a Danish bracken stand were preserved and analyzed using the above methods, and PTA concentrations of 3.8 ± 0.24 μg/L (±sd, n = 3) were found, much higher than previously reported. Graphical abstract Workflow overview of ptaquiloside determination.
Ansermot, Nicolas; Rudaz, Serge; Brawand-Amey, Marlyse; Fleury-Souverain, Sandrine; Veuthey, Jean-Luc; Eap, Chin B
2009-08-01
Matrix effects, which represent an important issue in liquid chromatography coupled to mass spectrometry or tandem mass spectrometry detection, should be closely assessed during method development. In the case of quantitative analysis, the use of stable isotope-labelled internal standard with physico-chemical properties and ionization behaviour similar to the analyte is recommended. In this paper, an example of the choice of a co-eluting deuterated internal standard to compensate for short-term and long-term matrix effect in the case of chiral (R,S)-methadone plasma quantification is reported. The method was fully validated over a concentration range of 5-800 ng/mL for each methadone enantiomer with satisfactory relative bias (-1.0 to 1.0%), repeatability (0.9-4.9%) and intermediate precision (1.4-12.0%). From the results obtained during validation, a control chart process during 52 series of routine analysis was established using both intermediate precision standard deviation and FDA acceptance criteria. The results of routine quality control samples were generally included in the +/-15% variability around the target value and mainly in the two standard deviation interval illustrating the long-term stability of the method. The intermediate precision variability estimated in method validation was found to be coherent with the routine use of the method. During this period, 257 trough concentration and 54 peak concentration plasma samples of patients undergoing (R,S)-methadone treatment were successfully analysed for routine therapeutic drug monitoring.
Methods for assessing the quality of mammalian embryos: How far we are from the gold standard?
Rocha, José C; Passalia, Felipe; Matos, Felipe D; Maserati, Marc P; Alves, Mayra F; Almeida, Tamie G de; Cardoso, Bruna L; Basso, Andrea C; Nogueira, Marcelo F G
2016-08-01
Morphological embryo classification is of great importance for many laboratory techniques, from basic research to the ones applied to assisted reproductive technology. However, the standard classification method for both human and cattle embryos, is based on quality parameters that reflect the overall morphological quality of the embryo in cattle, or the quality of the individual embryonic structures, more relevant in human embryo classification. This assessment method is biased by the subjectivity of the evaluator and even though several guidelines exist to standardize the classification, it is not a method capable of giving reliable and trustworthy results. Latest approaches for the improvement of quality assessment include the use of data from cellular metabolism, a new morphological grading system, development kinetics and cleavage symmetry, embryo cell biopsy followed by pre-implantation genetic diagnosis, zona pellucida birefringence, ion release by the embryo cells and so forth. Nowadays there exists a great need for evaluation methods that are practical and non-invasive while being accurate and objective. A method along these lines would be of great importance to embryo evaluation by embryologists, clinicians and other professionals who work with assisted reproductive technology. Several techniques shows promising results in this sense, one being the use of digital images of the embryo as basis for features extraction and classification by means of artificial intelligence techniques (as genetic algorithms and artificial neural networks). This process has the potential to become an accurate and objective standard for embryo quality assessment.
Methods for assessing the quality of mammalian embryos: How far we are from the gold standard?
Rocha, José C.; Passalia, Felipe; Matos, Felipe D.; Maserati Jr, Marc P.; Alves, Mayra F.; de Almeida, Tamie G.; Cardoso, Bruna L.; Basso, Andrea C.; Nogueira, Marcelo F. G.
2016-01-01
Morphological embryo classification is of great importance for many laboratory techniques, from basic research to the ones applied to assisted reproductive technology. However, the standard classification method for both human and cattle embryos, is based on quality parameters that reflect the overall morphological quality of the embryo in cattle, or the quality of the individual embryonic structures, more relevant in human embryo classification. This assessment method is biased by the subjectivity of the evaluator and even though several guidelines exist to standardize the classification, it is not a method capable of giving reliable and trustworthy results. Latest approaches for the improvement of quality assessment include the use of data from cellular metabolism, a new morphological grading system, development kinetics and cleavage symmetry, embryo cell biopsy followed by pre-implantation genetic diagnosis, zona pellucida birefringence, ion release by the embryo cells and so forth. Nowadays there exists a great need for evaluation methods that are practical and non-invasive while being accurate and objective. A method along these lines would be of great importance to embryo evaluation by embryologists, clinicians and other professionals who work with assisted reproductive technology. Several techniques shows promising results in this sense, one being the use of digital images of the embryo as basis for features extraction and classification by means of artificial intelligence techniques (as genetic algorithms and artificial neural networks). This process has the potential to become an accurate and objective standard for embryo quality assessment. PMID:27584609
Flammability of gas mixtures. Part 1: fire potential.
Schröder, Volkmar; Molnarne, Maria
2005-05-20
International and European dangerous substances and dangerous goods regulations refer to the standard ISO 10156 (1996). This standard includes a test method and a calculation procedure for the determination of the flammability of gases and gas mixtures in air. The substance indices for the calculation, the so called "Tci values", which characterise the fire potential, are provided as well. These ISO Tci values are derived from explosion diagrams of older literature sources which do not take into account the test method and the test apparatus. However, since the explosion limits are influenced by apparatus parameters, the Tci values and lower explosion limits, given by the ISO tables, are inconsistent with those measured according to the test method of the same standard. In consequence, applying the ISO Tci values can result in wrong classifications. In this paper internationally accepted explosion limit test methods were evaluated and Tci values were derived from explosion diagrams. Therefore, an "open vessel" method with flame propagation criterion was favoured. These values were compared with the Tci values listed in ISO 10156. In most cases, significant deviations were found. A detailed study about the influence of inert gases on flammability is the objective of Part 2.
Granja, Rodrigo H M M; Salerno, Alessandro G; de Lima, Andreia C; Montalvo, Cynthia; Reche, Karine V G; Giannotti, Fabio M; Wanschel, Amarylis C B A
2014-01-01
Boldenone, an androgenic steroid, is forbidden for use in meat production in most countries worldwide. Residues of this drug in food present a potential risk to consumers. A sensitive LC/MS/MS method for analysis of 17β-boldenone using boldenone-d3 as an internal standard was developed. An enzymatic hydrolysis and extraction using ethyl acetate, methanol, and hexane were performed in the sample preparation. Parameters such as decision limit (CCα), detection capability (CCβ), precision, recovery, and ruggedness were evaluated according to the Brazilian Regulation 24/2009 (equivalent to European Union Decision 2002/657/EC) and International Organization for Standardization/International Electrotechnical Commission 17025:2005. CCα and CCβ were determined to be 0.17 and 0.29 μg/kg, respectively. Average recoveries from bovine liver samples fortified with 1, 1.5, and 2 μg/kg were around 100%. A complete statistical analysis was performed on the results obtained, including an estimation of the method uncertainty. The method is considered robust after being subjected to day-to-day analytical variations and has been used as a standard method in Brazil to report boldenone levels in bovine liver.
Analysis of volatile organic compounds. [trace amounts of organic volatiles in gas samples
NASA Technical Reports Server (NTRS)
Zlatkis, A. (Inventor)
1977-01-01
An apparatus and method are described for reproducibly analyzing trace amounts of a large number of organic volatiles existing in a gas sample. Direct injection of the trapped volatiles into a cryogenic percolum provides a sharply defined plug. Applications of the method include: (1) analyzing the headspace gas of body fluids and comparing a profile of the organic volatiles with standard profiles for the detection and monitoring of disease; (2) analyzing the headspace gas of foods and beverages and comparing the profile with standard profiles to monitor and control flavor and aroma; and (3) analyses for determining the organic pollutants in air or water samples.
Delay correlation analysis and representation for vital complaint VHDL models
Rich, Marvin J.; Misra, Ashutosh
2004-11-09
A method and system unbind a rise/fall tuple of a VHDL generic variable and create rise time and fall time generics of each generic variable that are independent of each other. Then, according to a predetermined correlation policy, the method and system collect delay values in a VHDL standard delay file, sort the delay values, remove duplicate delay values, group the delay values into correlation sets, and output an analysis file. The correlation policy may include collecting all generic variables in a VHDL standard delay file, selecting each generic variable, and performing reductions on the set of delay values associated with each selected generic variable.
Valls-Cantenys, Carme; Scheurer, Marco; Iglesias, Mònica; Sacher, Frank; Brauch, Heinz-Jürgen; Salvadó, Victoria
2016-09-01
A sensitive, multi-residue method using solid-phase extraction followed by liquid chromatography-tandem mass spectrometry (LC-MS/MS) was developed to determine a representative group of 35 analytes, including corrosion inhibitors, pesticides and pharmaceuticals such as analgesic and anti-inflammatory drugs, five iodinated contrast media, β-blockers and some of their metabolites and transformation products in water samples. Few other methods are capable of determining such a broad range of contrast media together with other analytes. We studied the parameters affecting the extraction of the target analytes, including sorbent selection and extraction conditions, their chromatographic separation (mobile phase composition and column) and detection conditions using two ionisation sources: electrospray ionisation (ESI) and atmospheric pressure chemical ionisation (APCI). In order to correct matrix effects, a total of 20 surrogate/internal standards were used. ESI was found to have better sensitivity than APCI. Recoveries ranging from 79 to 134 % for tap water and 66 to 144 % for surface water were obtained. Intra-day precision, calculated as relative standard deviation, was below 34 % for tap water and below 21 % for surface water, groundwater and effluent wastewater. Method quantification limits (MQL) were in the low ng L(-1) range, except for the contrast agents iomeprol, amidotrizoic acid and iohexol (22, 25.5 and 17.9 ng L(-1), respectively). Finally, the method was applied to the analysis of 56 real water samples as part of the validation procedure. All of the compounds were detected in at least some of the water samples analysed. Graphical Abstract Multi-residue method for the determination of micropollutants including pharmaceuticals, iodinated contrast media and pesticides in waters by LC-MS/MS.
Kamiński, Marian; Kartanowicz, Rafał; Przyjazny, Andrzej
2004-03-12
A method of effective application of normal-phase high-performance liquid chromatography (NP-HPLC) with ultraviolet diode array detection (DAD) and refractive index detection (RID) for the determination of class composition of gasoline and its components, i.e. for the determination of content of alkenes, aromatic and saturated hydrocarbons in gasoline meeting modern quality standards, has been developed. An aminopropyl-bonded silica stationary phase was used along with n-hexane or n-heptane as the mobile phase. A DAD signal integrated over the 207-240 nm range was used to determine alkenes. This eliminates the necessity of separating alkenes from saturates, because the latter do not absorb UV radiation above 200 nm. The content of aromatic hydrocarbons is determined by means of a refractive index detector. Calibration was based on hydrocarbon type composition determined by the fluorescent indicator adsorption method, ASTM D1319. The results obtained by the developed method were found to be consistent with those obtained by fluorescent indicator adsorption or by a multidimensional GC method (PIONA) (ASTM D5443). The method can be applied to gasoline meeting recent quality standards, irrespective of refining technology used in the production of gasoline components, including gasoline with various contents of oxygenates. The developed method cannot be used to determine the hydrocarbon type composition of gasoline that contains as a component the so-called pyrocondensate, i.e. the fraction with a boiling point up to 220 degrees C, obtained through thermal pyrolysis of distillation residues of crude oil or coal and, consequently, does not meet the quality standards. The paper includes the procedure for identification of this type of gasoline.
NASA Astrophysics Data System (ADS)
Vogt, William C.; Jia, Congxian; Wear, Keith A.; Garra, Brian S.; Pfefer, T. Joshua
2017-03-01
As Photoacoustic Tomography (PAT) matures and undergoes clinical translation, objective performance test methods are needed to facilitate device development, regulatory clearance and clinical quality assurance. For mature medical imaging modalities such as CT, MRI, and ultrasound, tissue-mimicking phantoms are frequently incorporated into consensus standards for performance testing. A well-validated set of phantom-based test methods is needed for evaluating performance characteristics of PAT systems. To this end, we have constructed phantoms using a custom tissue-mimicking material based on PVC plastisol with tunable, biologically-relevant optical and acoustic properties. Each phantom is designed to enable quantitative assessment of one or more image quality characteristics including 3D spatial resolution, spatial measurement accuracy, ultrasound/PAT co-registration, uniformity, penetration depth, geometric distortion, sensitivity, and linearity. Phantoms contained targets including high-intensity point source targets and dye-filled tubes. This suite of phantoms was used to measure the dependence of performance of a custom PAT system (equipped with four interchangeable linear array transducers of varying design) on design parameters (e.g., center frequency, bandwidth, element geometry). Phantoms also allowed comparison of image artifacts, including surface-generated clutter and bandlimited sensing artifacts. Results showed that transducer design parameters create strong variations in performance including a trade-off between resolution and penetration depth, which could be quantified with our method. This study demonstrates the utility of phantom-based image quality testing in device performance assessment, which may guide development of consensus standards for PAT systems.
Comparison of three commercially available fit-test methods.
Janssen, Larry L; Luinenburg, D Michael; Mullins, Haskell E; Nelson, Thomas J
2002-01-01
American National Standards Institute (ANSI) standard Z88.10, Respirator Fit Testing Methods, includes criteria to evaluate new fit-tests. The standard allows generated aerosol, particle counting, or controlled negative pressure quantitative fit-tests to be used as the reference method to determine acceptability of a new test. This study examined (1) comparability of three Occupational Safety and Health Administration-accepted fit-test methods, all of which were validated using generated aerosol as the reference method; and (2) the effect of the reference method on the apparent performance of a fit-test method under evaluation. Sequential fit-tests were performed using the controlled negative pressure and particle counting quantitative fit-tests and the bitter aerosol qualitative fit-test. Of 75 fit-tests conducted with each method, the controlled negative pressure method identified 24 failures; bitter aerosol identified 22 failures; and the particle counting method identified 15 failures. The sensitivity of each method, that is, agreement with the reference method in identifying unacceptable fits, was calculated using each of the other two methods as the reference. None of the test methods met the ANSI sensitivity criterion of 0.95 or greater when compared with either of the other two methods. These results demonstrate that (1) the apparent performance of any fit-test depends on the reference method used, and (2) the fit-tests evaluated use different criteria to identify inadequately fitting respirators. Although "acceptable fit" cannot be defined in absolute terms at this time, the ability of existing fit-test methods to reject poor fits can be inferred from workplace protection factor studies.
DISINFECTION OF NEW WATER MAINS
The 'AWWA Standard for Disinfecting Water Mains' (AWWA C601-68) has fallen into disuse by a number of water utilities because of repeated bacteriological failures following initial disinfection with the recommended high-dose chlorination. Other methods of disinfection, including ...
29 CFR 779.413 - Methods of compensation of retail store employees.
Code of Federal Regulations, 2010 CFR
2010-07-01
... STANDARDS ACT AS APPLIED TO RETAILERS OF GOODS OR SERVICES Provisions Relating to Certain Employees of... represent commissions “on goods or services,” which would include all types of commissions customarily based...
Carrico, Ruth M; Coty, Mary B; Goss, Linda K; Lajoie, Andrew S
2007-02-01
This pilot study was conducted to determine whether supplementing standard classroom training methods regarding respiratory disease transmission with a visual demonstration could improve the use of personal protective equipment among emergency department nurses. Participants included 20 emergency department registered nurses randomized into 2 groups: control and intervention. The intervention group received supplemental training using the visual demonstration of respiratory particle dispersion. Both groups were then observed throughout their work shifts as they provided care during January-March 2005. Participants who received supplemental visual training correctly utilized personal protective equipment statistically more often than did participants who received only the standard classroom training. Supplementing the standard training methods with a visual demonstration can improve the use of personal protective equipment during care of patients exhibiting respiratory symptoms.
MUSiC - Model-independent search for deviations from Standard Model predictions in CMS
NASA Astrophysics Data System (ADS)
Pieta, Holger
2010-02-01
We present an approach for a model independent search in CMS. Systematically scanning the data for deviations from the standard model Monte Carlo expectations, such an analysis can help to understand the detector and tune event generators. By minimizing the theoretical bias the analysis is furthermore sensitive to a wide range of models for new physics, including the uncounted number of models not-yet-thought-of. After sorting the events into classes defined by their particle content (leptons, photons, jets and missing transverse energy), a minimally prejudiced scan is performed on a number of distributions. Advanced statistical methods are used to determine the significance of the deviating regions, rigorously taking systematic uncertainties into account. A number of benchmark scenarios, including common models of new physics and possible detector effects, have been used to gauge the power of such a method. )
Parasitology: United Kingdom National Quality Assessment Scheme.
Hawthorne, M.; Chiodini, P. L.; Snell, J. J.; Moody, A. H.; Ramsay, A.
1992-01-01
AIMS: To assess the results from parasitology laboratories taking part in a quality assessment scheme between 1986 and 1991; and to compare performance with repeat specimens. METHODS: Quality assessment of blood parasitology, including tissue parasites (n = 444; 358 UK, 86 overseas), and faecal parasitology, including extra-intestinal parasites (n = 205; 141 UK, 64 overseas), was performed. RESULTS: Overall, the standard of performance was poor. A questionnaire distributed to participants showed that a wide range of methods was used, some of which were considered inadequate to achieve reliable results. Teaching material was distributed to participants from time to time in an attempt to improve standards. CONCLUSIONS: Since the closure of the IMLS fellowship course in 1972, fewer opportunities for specialised training in parasitology are available: more training is needed. Poor performance in the detection of malarial parasites is mainly attributable to incorrect speciation, misidentification, and lack of equipment such as an eyepiece graticule. PMID:1452791
SU-E-J-221: A Novel Expansion Method for MRI Based Target Delineation in Prostate Radiotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruiz, B; East Carolina University, Greenville, NC; Feng, Y
Purpose: To compare a novel bladder/rectum carveout expansion method on MRI delineated prostate to standard CT and expansion based methods for maintaining prostate coverage while providing superior bladder and rectal sparing. Methods: Ten prostate cases were planned to include four trials: MRI vs CT delineated prostate/proximal seminal vesicles, and each image modality compared to both standard expansions (8mm 3D expansion and 5mm posterior, i.e. ∼8mm) and carveout method expansions (5mm 3D expansion, 4mm posterior for GTV-CTV excluding expansion into bladder/rectum followed by additional 5mm 3D expansion to PTV, i.e. ∼1cm). All trials were planned to total dose 7920 cGy viamore » IMRT. Evaluation and comparison was made using the following criteria: QUANTEC constraints for bladder/rectum including analysis of low dose regions, changes in PTV volume, total control points, and maximum hot spot. Results: ∼8mm MRI expansion consistently produced the most optimal plan with lowest total control points and best bladder/rectum sparing. However, this scheme had the smallest prostate (average 22.9% reduction) and subsequent PTV volume, consistent with prior literature. ∼1cm MRI had an average PTV volume comparable to ∼8mm CT at 3.79% difference. Bladder QUANTEC constraints were on average less for the ∼1cm MRI as compared to the ∼8mm CT and observed as statistically significant with 2.64% reduction in V65. Rectal constraints appeared to follow the same trend. Case-by-case analysis showed variation in rectal V30 with MRI delineated prostate being most favorable regardless of expansion type. ∼1cm MRI and ∼8mm CT had comparable plan quality. Conclusion: MRI delineated prostate with standard expansions had the smallest PTV leading to margins that may be too tight. Bladder/rectum carveout expansion method on MRI delineated prostate was found to be superior to standard CT based methods in terms of bladder and rectal sparing while maintaining prostate coverage. Continued investigation is warranted for further validation.« less
Papadouka, Vikki; Ternier, Alexandra; Zucker, Jane R.
2016-01-01
Objective We compared the quality of data reported to New York City's immunization information system, the Citywide Immunization Registry (CIR), through its real-time Health Level 7 (HL7) Web service from electronic health records (EHRs), with data submitted through other methods. Methods We stratified immunizations administered and reported to the CIR in 2014 for patients aged 0–18 years by reporting method: (1) sending HL7 messages from EHRs through the Web service, (2) manual data entry, and (3) upload of a non-standard flat file from EHRs. We assessed completeness of reporting by measuring the percentage of immunizations reported with lot number, manufacturer, and Vaccines for Children (VFC) program eligibility. We assessed timeliness of reporting by determining the number of days from date of administration to date entered into the CIR. Results HL7 reporting accounted for the largest percentage (46.3%) of the 3.8 million immunizations reported in 2014. Of immunizations reported using HL7, 97.9% included the lot number and 92.6% included the manufacturer, compared with 50.4% and 48.0% for manual entry, and 65.9% and 48.8% for non-standard flat file, respectively. VFC eligibility was 96.9% complete when reported by manual data entry, 95.3% complete for HL7 reporting, and 87.2% complete for non-standard flat file reporting. Of the three reporting methods, HL7 was the most timely: 77.6% of immunizations were reported by HL7 in <1 day, compared with 53.6% of immunizations reported through manual data entry and 18.1% of immunizations reported through non-standard flat file. Conclusion HL7 reporting from EHRs resulted in more complete and timely data in the CIR compared with other reporting methods. Providing resources to facilitate HL7 reporting from EHRs to immunization information systems to increase data quality should be a priority for public health. PMID:27453603
2012-01-01
Background Healthcare accreditation standards are advocated as an important means of improving clinical practice and organisational performance. Standard development agencies have documented methodologies to promote open, transparent, inclusive development processes where standards are developed by members. They assert that their methodologies are effective and efficient at producing standards appropriate for the health industry. However, the evidence to support these claims requires scrutiny. The study’s purpose was to examine the empirical research that grounds the development methods and application of healthcare accreditation standards. Methods A multi-method strategy was employed over the period March 2010 to August 2011. Five academic health research databases (Medline, Psych INFO, Embase, Social work abstracts, and CINAHL) were interrogated, the websites of 36 agencies associated with the study topic were investigated, and a snowball search was undertaken. Search criteria included accreditation research studies, in English, addressing standards and their impact. Searching in stage 1 initially selected 9386 abstracts. In stage 2, this selection was refined against the inclusion criteria; empirical studies (n = 2111) were identified and refined to a selection of 140 papers with the exclusion of clinical or biomedical and commentary pieces. These were independently reviewed by two researchers and reduced to 13 articles that met the study criteria. Results The 13 articles were analysed according to four categories: overall findings; standards development; implementation issues; and impact of standards. Studies have only occurred in the acute care setting, predominately in 2003 (n = 5) and 2009 (n = 4), and in the United States (n = 8). A multidisciplinary focus (n = 9) and mixed method approach (n = 11) are common characteristics. Three interventional studies were identified, with the remaining 10 studies having research designs to investigate clinical or organisational impacts. No study directly examined standards development or other issues associated with their progression. Only one study noted implementation issues, identifying several enablers and barriers. Standards were reported to improve organisational efficiency and staff circumstances. However, the impact on clinical quality was mixed, with both improvements and a lack of measurable effects recorded. Conclusion Standards are ubiquitous within healthcare and are generally considered to be an important means by which to improve clinical practice and organisational performance. However, there is a lack of robust empirical evidence examining the development, writing, implementation and impacts of healthcare accreditation standards. PMID:22995152
The Next-Generation PCR-Based Quantification Method for Ambient Waters: Digital PCR.
Cao, Yiping; Griffith, John F; Weisberg, Stephen B
2016-01-01
Real-time quantitative PCR (qPCR) is increasingly being used for ambient water monitoring, but development of digital polymerase chain reaction (digital PCR) has the potential to further advance the use of molecular techniques in such applications. Digital PCR refines qPCR by partitioning the sample into thousands to millions of miniature reactions that are examined individually for binary endpoint results, with DNA density calculated from the fraction of positives using Poisson statistics. This direct quantification removes the need for standard curves, eliminating the labor and materials associated with creating and running standards with each batch, and removing biases associated with standard variability and mismatching amplification efficiency between standards and samples. Confining reactions and binary endpoint measurements to small partitions also leads to other performance advantages, including reduced susceptibility to inhibition, increased repeatability and reproducibility, and increased capacity to measure multiple targets in one analysis. As such, digital PCR is well suited for ambient water monitoring applications and is particularly advantageous as molecular methods move toward autonomous field application.
Below-Ambient and Cryogenic Thermal Testing
NASA Technical Reports Server (NTRS)
Fesmire, James E.
2016-01-01
Thermal insulation systems operating in below-ambient temperature conditions are inherently susceptible to moisture intrusion and vapor drive toward the cold side. The subsequent effects may include condensation, icing, cracking, corrosion, and other problems. Methods and apparatus for real-world thermal performance testing of below-ambient systems have been developed based on cryogenic boiloff calorimetry. New ASTM International standards on cryogenic testing and their extension to future standards for below-ambient testing of pipe insulation are reviewed.
ASTM international workshop on standards and measurements for tissue engineering scaffolds.
Simon, Carl G; Yaszemski, Michael J; Ratcliffe, Anthony; Tomlins, Paul; Luginbuehl, Reto; Tesk, John A
2015-07-01
The "Workshop on Standards & Measurements for Tissue Engineering Scaffolds" was held on May 21, 2013 in Indianapolis, IN, and was sponsored by the ASTM International (ASTM). The purpose of the workshop was to identify the highest priority items for future standards work for scaffolds used in the development and manufacture of tissue engineered medical products (TEMPs). Eighteen speakers and 78 attendees met to assess current scaffold standards and to prioritize needs for future standards. A key finding was that the ASTM TEMPs subcommittees (F04.41-46) have many active "guide" documents for educational purposes, but few standard "test methods" or "practices." Overwhelmingly, the most clearly identified need was standards for measuring the structure of scaffolds, followed by standards for biological characterization, including in vitro testing, animal models and cell-material interactions. The third most pressing need was to develop standards for assessing the mechanical properties of scaffolds. Additional needs included standards for assessing scaffold degradation, clinical outcomes with scaffolds, effects of sterilization on scaffolds, scaffold composition, and drug release from scaffolds. Discussions highlighted the need for additional scaffold reference materials and the need to use them for measurement traceability. Workshop participants emphasized the need to promote the use of standards in scaffold fabrication, characterization, and commercialization. Finally, participants noted that standards would be more broadly accepted if their impact in the TEMPs community could be quantified. Many scaffold standard needs have been identified and focus is turning to generating these standards to support the use of scaffolds in TEMPs. © 2014 Wiley Periodicals, Inc.
Montei, Carolyn; McDougal, Susan; Mozola, Mark; Rice, Jennifer
2014-01-01
The Soleris Non-fermenting Total Viable Count method was previously validated for a wide variety of food products, including cocoa powder. A matrix extension study was conducted to validate the method for use with cocoa butter and cocoa liquor. Test samples included naturally contaminated cocoa liquor and cocoa butter inoculated with natural microbial flora derived from cocoa liquor. A probability of detection statistical model was used to compare Soleris results at multiple test thresholds (dilutions) with aerobic plate counts determined using the AOAC Official Method 966.23 dilution plating method. Results of the two methods were not statistically different at any dilution level in any of the three trials conducted. The Soleris method offers the advantage of results within 24 h, compared to the 48 h required by standard dilution plating methods.
NASA Technical Reports Server (NTRS)
Moe, Karen L.; Perkins, Dorothy C.; Szczur, Martha R.
1987-01-01
The user support environment (USE) which is a set of software tools for a flexible standard interactive user interface to the Space Station systems, platforms, and payloads is described in detail. Included in the USE concept are a user interface language, a run time environment and user interface management system, support tools, and standards for human interaction methods. The goals and challenges of the USE are discussed as well as a methodology based on prototype demonstrations for involving users in the process of validating the USE concepts. By prototyping the key concepts and salient features of the proposed user interface standards, the user's ability to respond is greatly enhanced.
Holschneider, Alexander; Hutson, John; Peña, Albert; Beket, Elhamy; Chatterjee, Subir; Coran, Arnold; Davies, Michael; Georgeson, Keith; Grosfeld, Jay; Gupta, Devendra; Iwai, Naomi; Kluth, Dieter; Martucciello, Giuseppe; Moore, Samuel; Rintala, Risto; Smith, E Durham; Sripathi, D V; Stephens, Douglas; Sen, Sudipta; Ure, Benno; Grasshoff, Sabine; Boemers, Thomas; Murphy, Feilin; Söylet, Yunus; Dübbers, Martin; Kunst, Marc
2005-10-01
Anorectal malformations (ARM) are common congenital anomalies seen throughout the world. Comparison of outcome data has been hindered because of confusion related to classification and assessment systems. The goals of the Krinkenbeck Conference on ARM was to develop standards for an International Classification of ARM based on a modification of fistula type and adding rare and regional variants, and design a system for comparable follow up studies. Lesions were classified into major clinical groups based on the fistula location (perineal, recto-urethral, recto-vesical, vestibular), cloacal lesions, those with no fistula and anal stenosis. Rare and regional variants included pouch colon, rectal atresia or stenosis, rectovaginal fistula, H-fistula and others. Groups would be analyzed according to the type of procedure performed stratified for confounding associated conditions such as sacral anomalies and tethered cord. A standard method for postoperative assessment of continence was determined. A new International diagnostic classification system, operative groupings and a method of postoperative assessment of continence was developed by consensus of a large contingent of participants experienced in the management of patients with ARM. These methods should allow for a common standardization of diagnosis and comparing postoperative results.
NASA Astrophysics Data System (ADS)
Denny, Ellen G.; Gerst, Katharine L.; Miller-Rushing, Abraham J.; Tierney, Geraldine L.; Crimmins, Theresa M.; Enquist, Carolyn A. F.; Guertin, Patricia; Rosemartin, Alyssa H.; Schwartz, Mark D.; Thomas, Kathryn A.; Weltzin, Jake F.
2014-05-01
Phenology offers critical insights into the responses of species to climate change; shifts in species' phenologies can result in disruptions to the ecosystem processes and services upon which human livelihood depends. To better detect such shifts, scientists need long-term phenological records covering many taxa and across a broad geographic distribution. To date, phenological observation efforts across the USA have been geographically limited and have used different methods, making comparisons across sites and species difficult. To facilitate coordinated cross-site, cross-species, and geographically extensive phenological monitoring across the nation, the USA National Phenology Network has developed in situ monitoring protocols standardized across taxonomic groups and ecosystem types for terrestrial, freshwater, and marine plant and animal taxa. The protocols include elements that allow enhanced detection and description of phenological responses, including assessment of phenological "status", or the ability to track presence-absence of a particular phenophase, as well as standards for documenting the degree to which phenological activity is expressed in terms of intensity or abundance. Data collected by this method can be integrated with historical phenology data sets, enabling the development of databases for spatial and temporal assessment of changes in status and trends of disparate organisms. To build a common, spatially, and temporally extensive multi-taxa phenological data set available for a variety of research and science applications, we encourage scientists, resources managers, and others conducting ecological monitoring or research to consider utilization of these standardized protocols for tracking the seasonal activity of plants and animals.
Denny, Ellen G.; Gerst, Katharine L.; Miller-Rushing, Abraham J.; Tierney, Geraldine L.; Crimmins, Theresa M.; Enquist, Carolyn A.F.; Guertin, Patricia; Rosemartin, Alyssa H.; Schwartz, Mark D.; Thomas, Kathryn A.; Weltzin, Jake F.
2014-01-01
Phenology offers critical insights into the responses of species to climate change; shifts in species’ phenologies can result in disruptions to the ecosystem processes and services upon which human livelihood depends. To better detect such shifts, scientists need long-term phenological records covering many taxa and across a broad geographic distribution. To date, phenological observation efforts across the USA have been geographically limited and have used different methods, making comparisons across sites and species difficult. To facilitate coordinated cross-site, cross-species, and geographically extensive phenological monitoring across the nation, the USA National Phenology Network has developed in situ monitoring protocols standardized across taxonomic groups and ecosystem types for terrestrial, freshwater, and marine plant and animal taxa. The protocols include elements that allow enhanced detection and description of phenological responses, including assessment of phenological “status”, or the ability to track presence–absence of a particular phenophase, as well as standards for documenting the degree to which phenological activity is expressed in terms of intensity or abundance. Data collected by this method can be integrated with historical phenology data sets, enabling the development of databases for spatial and temporal assessment of changes in status and trends of disparate organisms. To build a common, spatially, and temporally extensive multi-taxa phenological data set available for a variety of research and science applications, we encourage scientists, resources managers, and others conducting ecological monitoring or research to consider utilization of these standardized protocols for tracking the seasonal activity of plants and animals.
Denny, Ellen G; Gerst, Katharine L; Miller-Rushing, Abraham J; Tierney, Geraldine L; Crimmins, Theresa M; Enquist, Carolyn A F; Guertin, Patricia; Rosemartin, Alyssa H; Schwartz, Mark D; Thomas, Kathryn A; Weltzin, Jake F
2014-05-01
Phenology offers critical insights into the responses of species to climate change; shifts in species' phenologies can result in disruptions to the ecosystem processes and services upon which human livelihood depends. To better detect such shifts, scientists need long-term phenological records covering many taxa and across a broad geographic distribution. To date, phenological observation efforts across the USA have been geographically limited and have used different methods, making comparisons across sites and species difficult. To facilitate coordinated cross-site, cross-species, and geographically extensive phenological monitoring across the nation, the USA National Phenology Network has developed in situ monitoring protocols standardized across taxonomic groups and ecosystem types for terrestrial, freshwater, and marine plant and animal taxa. The protocols include elements that allow enhanced detection and description of phenological responses, including assessment of phenological "status", or the ability to track presence-absence of a particular phenophase, as well as standards for documenting the degree to which phenological activity is expressed in terms of intensity or abundance. Data collected by this method can be integrated with historical phenology data sets, enabling the development of databases for spatial and temporal assessment of changes in status and trends of disparate organisms. To build a common, spatially, and temporally extensive multi-taxa phenological data set available for a variety of research and science applications, we encourage scientists, resources managers, and others conducting ecological monitoring or research to consider utilization of these standardized protocols for tracking the seasonal activity of plants and animals.
The effect of environmental initiatives on NASA specifications and standards activities
NASA Technical Reports Server (NTRS)
Griffin, Dennis; Webb, David; Cook, Beth
1995-01-01
The NASA Operational Environment Team (NOET) has conducted a survey of NASA centers specifications and standards that require the use of Ozone Depleting Substances (ODS's) (Chlorofluorocarbons (CFCs), Halons, and chlorinated solvents). The results of this survey are presented here, along with a pathfinder approach utilized at Marshall Space Flight Center (MSFC) to eliminate the use of ODS's in targeted specifications and standards. Presented here are the lessons learned from a pathfinder effort to replace CFC-113 in a significant MSFC specification for cleaning and cleanliness verification methods for oxygen, fuel and pneumatic service, including Shuttle propulsion elements.
Mihailov, Rossen; Stoeva, Dilyana; Pencheva, Blagovesta; Pentchev, Eugeni
2018-03-01
In a number of cases the monitoring of patients with type I diabetes mellitus requires measurement of the exogenous insulin levels. For the purpose of a clinical investigation of the efficacy of a medical device for application of exogenous insulin aspart, a verification of the method for measurement of this synthetic analogue of the hormone was needed. The information in the available medical literature for the measurement of the different exogenous insulin analogs is insufficient. Thus, verification was required to be in compliance with the active standards in Republic of Bulgaria. A manufactured method developed for ADVIA Centaur XP Immunoassay, Siemens Healthcare, was used which we verified using standard solutions and a patient serum pool by adding the appropriate quantity exogenous insulin aspart. The method was verified in accordance with the bioanalytical method verification criteria and regulatory requirements for using a standard method: CLIA chemiluminescence immunoassay ADVIA Centaur® XP. The following parameters are determined and monitored: intra-day precision and accuracy, inter-day precision and accuracy, limit of detection and lower limit of quantification, linearity, analytical recovery. The routine application of the method for measurement of immunoreactive insulin using the analyzer ADVIA Centaur® XP is directed to the measurement of endogenous insulin. The method is applicable for measuring different types of exogenous insulin, including insulin aspart.
2016-09-01
Method Scientific Operating Procedure Series : SOP-C En vi ro nm en ta l L ab or at or y Jonathon Brame and Chris Griggs September 2016...BET) Method Scientific Operating Procedure Series : SOP-C Jonathon Brame and Chris Griggs Environmental Laboratory U.S. Army Engineer Research and...response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing
HIPS: A new hippocampus subfield segmentation method.
Romero, José E; Coupé, Pierrick; Manjón, José V
2017-12-01
The importance of the hippocampus in the study of several neurodegenerative diseases such as Alzheimer's disease makes it a structure of great interest in neuroimaging. However, few segmentation methods have been proposed to measure its subfields due to its complex structure and the lack of high resolution magnetic resonance (MR) data. In this work, we present a new pipeline for automatic hippocampus subfield segmentation using two available hippocampus subfield delineation protocols that can work with both high and standard resolution data. The proposed method is based on multi-atlas label fusion technology that benefits from a novel multi-contrast patch match search process (using high resolution T1-weighted and T2-weighted images). The proposed method also includes as post-processing a new neural network-based error correction step to minimize systematic segmentation errors. The method has been evaluated on both high and standard resolution images and compared to other state-of-the-art methods showing better results in terms of accuracy and execution time. Copyright © 2017 Elsevier Inc. All rights reserved.
Zazzi, Barbara C.; Crepeau, Kathryn L.; Fram, Miranda S.; Bergamaschi, Brian A.
2005-01-01
An analytical method for the determination of haloacetic acid formation potential of water samples has been developed by the U.S. Geological Survey California Water Science Center Sacramento Laboratory. The haloacetic acid formation potential is measured by dosing water samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine. The haloacetic acids formed are bromochloroacetic acid, bromodichloroacetic acid, dibromochloroacetic acid, dibromoacetic acid, dichloroacetic acid, monobromoacetic acid, monochloroacetic acid, tribromoacetic acid, and trichloroacetic acid. They are extracted, methylated, and then analyzed using a gas chromatograph equipped with an electron capture detector. Method validation experiments were performed to determine the method accuracy, precision, and detection limit for each of the compounds. Method detection limits for these nine haloacetic acids ranged from 0.11 to 0.45 microgram per liter. Quality-control practices include the use of blanks, quality-control samples, calibration verification standards, surrogate recovery, internal standard, matrix spikes, and duplicates.
This compendium includes method summaries provided by the Centers for Disease Control and Prevention/National Center for Environmental Health (CDC/NCEH) for collection and shipping of blood and urine samples for analysis of metals and volatile organic compounds (VOCs). It provide...
This compendium includes method summaries provided by the Centers for Disease Control and Prevention/National Center for Environmental Health (CDC/NCEH) for the collection and shipping of blood and urine samples for analysis of metals and volatile organic compounds (VOCs). It pro...
Bruce R. Hartsough; Bryce J. Stokes
1990-01-01
A database of North American harvesting systems was developed. Parameters for each system included site, material and product characteristics, equipment mix and production rate. Onto-truck and delivered costs per green tonne, and breakeven oil prices were developed using standard costing methods. Systems costs were compared over the ranges of piece size, volume per...
This compendium contains seven SOPs developed by Food and Drug Administration (FDA) laboratories for methods of analyzing trace metals in dietary samples collected using Total Diet study procedures. The SOPs include the following: (1) Quality Control for Analysis of NHEXAS Food o...
This compendium includes method summaries provided by the Centers for Disease Control and Prevention/National Center for Environmental Health (CDC/NCEH) for the collection and shipping of blood and urine samples for analysis of metals and volatile organic compounds (VOCs). It pro...
Current federal regulations (40 CFR 503) require enumeration of fecal coliform or Salmoella prior to land application of Class A biosolids. This regulation specifies use of enumeration methods included in "Standard Methods for the Examination of Water and Wastewater 18th Edi...
The prevalence of depressive symptoms in frontotemporal dementia: a meta-analysis.
Chakrabarty, Trisha; Sepehry, Amir A; Jacova, Claudia; Hsiung, Ging-Yuek Robin
2015-01-01
Depression is common in Alzheimer's and vascular dementia and is associated with poorer outcomes; however, less is known about the impact of depression on frontotemporal dementia (FTD). Here, we conducted a meta-analysis of diagnostic methods and the prevalence of depressive symptoms in FTD. PubMed, EMBASE and PsychINFO were queried for 'depression' and/or 'depressive mood' in behavioral- and language-variant FTD. The prevalence and diagnosis of depressive symptoms were extracted from relevant studies and the results pooled using a random-effects model. We included 29 studies in this meta-analysis, with sample sizes ranging from 3 to 73 (n = 870). The omnibus estimated event rate of depressed mood was 0.334 (33%; 95% CI: 0.268-0.407). Symptoms were most commonly assessed via standardized neuropsychiatric rating scales, with other methods including subjective caregiver reports and chart reviews. The study results were heterogeneous due to the variability in diagnostic methods. Depressive symptoms similar to those in other dementias are commonly detected in FTD. However, the diagnostic methods are heterogeneous, and symptoms of depression often overlap with manifestations of FTD. Having a standardized diagnostic approach to depression in FTD will greatly facilitate future research in this area.
Chapman, Jennifer L; Porsch, Lucas; Vidaurre, Rodrigo; Backhaus, Thomas; Sinclair, Chris; Jones, Glyn; Boxall, Alistair B A
2017-12-15
Veterinary medicinal products (VMPs) require, as part of the European Union (EU) authorization process, consideration of both risks and benefits. Uses of VMPs have multiple risks (e.g., risks to the animal being treated, to the person administering the VMP) including risks to the environment. Environmental risks are not directly comparable to therapeutic benefits; there is no standardized approach to compare both environmental risks and therapeutic benefits. We have developed three methods for communicating and comparing therapeutic benefits and environmental risks for the benefit-risk assessment that supports the EU authorization process. Two of these methods support independent product evaluation (i.e., a summative classification and a visual scoring matrix classification); the other supports a comparative evaluation between alternative products (i.e., a comparative classification). The methods and the challenges to implementing a benefit-risk assessment including environmental risk are presented herein; how these concepts would work in current policy is discussed. Adaptability to scientific and policy development is considered. This work is an initial step in the development of a standardized methodology for integrated decision-making for VMPs. Copyright © 2017 Elsevier B.V. All rights reserved.
Keogh, Ruth H; Daniel, Rhian M; VanderWeele, Tyler J; Vansteelandt, Stijn
2018-05-01
Estimation of causal effects of time-varying exposures using longitudinal data is a common problem in epidemiology. When there are time-varying confounders, which may include past outcomes, affected by prior exposure, standard regression methods can lead to bias. Methods such as inverse probability weighted estimation of marginal structural models have been developed to address this problem. However, in this paper we show how standard regression methods can be used, even in the presence of time-dependent confounding, to estimate the total effect of an exposure on a subsequent outcome by controlling appropriately for prior exposures, outcomes, and time-varying covariates. We refer to the resulting estimation approach as sequential conditional mean models (SCMMs), which can be fitted using generalized estimating equations. We outline this approach and describe how including propensity score adjustment is advantageous. We compare the causal effects being estimated using SCMMs and marginal structural models, and we compare the two approaches using simulations. SCMMs enable more precise inferences, with greater robustness against model misspecification via propensity score adjustment, and easily accommodate continuous exposures and interactions. A new test for direct effects of past exposures on a subsequent outcome is described.
Method for measuring recovery of catalytic elements from fuel cells
Shore, Lawrence [Edison, NJ; Matlin, Ramail [Berkeley, NJ
2011-03-08
A method is provided for measuring the concentration of a catalytic clement in a fuel cell powder. The method includes depositing on a porous substrate at least one layer of a powder mixture comprising the fuel cell powder and an internal standard material, ablating a sample of the powder mixture using a laser, and vaporizing the sample using an inductively coupled plasma. A normalized concentration of catalytic element in the sample is determined by quantifying the intensity of a first signal correlated to the amount of catalytic element in the sample, quantifying the intensity of a second signal correlated to the amount of internal standard material in the sample, and using a ratio of the first signal intensity to the second signal intensity to cancel out the effects of sample size.
Gang, Wei-juan; Wang, Xin; Wang, Fang; Dong, Guo-feng; Wu, Xiao-dong
2015-08-01
The national standard of "Regulations of Acupuncture-needle Manipulating Techniques" is one of the national Criteria of Acupuncturology for which a total of 22 items have been already established. In the process of formulation, a series of common and specific problems have been met. In the present paper, the authors expound these problems from 3 aspects, namely principles for formulation, methods for formulating criteria, and considerations about some problems. The formulating principles include selection and regulations of principles for technique classification and technique-related key factors. The main methods for formulating criteria are 1) taking the literature as the theoretical foundation, 2) taking the clinical practice as the supporting evidence, and 3) taking the expounded suggestions or conclusions through peer review.
Grate, Jay W; Gonzalez, Jhanis J; O'Hara, Matthew J; Kellogg, Cynthia M; Morrison, Samuel S; Koppenaal, David W; Chan, George C-Y; Mao, Xianglei; Zorba, Vassilia; Russo, Richard E
2017-09-08
Solid sampling and analysis methods, such as laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), are challenged by matrix effects and calibration difficulties. Matrix-matched standards for external calibration are seldom available and it is difficult to distribute spikes evenly into a solid matrix as internal standards. While isotopic ratios of the same element can be measured to high precision, matrix-dependent effects in the sampling and analysis process frustrate accurate quantification and elemental ratio determinations. Here we introduce a potentially general solid matrix transformation approach entailing chemical reactions in molten ammonium bifluoride (ABF) salt that enables the introduction of spikes as tracers or internal standards. Proof of principle experiments show that the decomposition of uranium ore in sealed PFA fluoropolymer vials at 230 °C yields, after cooling, new solids suitable for direct solid sampling by LA. When spikes are included in the molten salt reaction, subsequent LA-ICP-MS sampling at several spots indicate that the spikes are evenly distributed, and that U-235 tracer dramatically improves reproducibility in U-238 analysis. Precisions improved from 17% relative standard deviation for U-238 signals to 0.1% for the ratio of sample U-238 to spiked U-235, a factor of over two orders of magnitude. These results introduce the concept of solid matrix transformation (SMT) using ABF, and provide proof of principle for a new method of incorporating internal standards into a solid for LA-ICP-MS. This new approach, SMT-LA-ICP-MS, provides opportunities to improve calibration and quantification in solids based analysis. Looking forward, tracer addition to transformed solids opens up LA-based methods to analytical methodologies such as standard addition, isotope dilution, preparation of matrix-matched solid standards, external calibration, and monitoring instrument drift against external calibration standards.
ERIC Educational Resources Information Center
Kjaersgaard, Poul Soren, Ed.
2002-01-01
Papers from the conference in this volume include the following: "Towards Corpus Annotation Standards--The MATE Workbench" (Laila Dybkjaer and Niels Ole Bernsen); "Danish Text-to-Speech Synthesis Based on Stored Acoustic Segments" (Charles Hoequist); "Toward a Method for the Automated Design of Semantic…
Procedures for Constructing and Using Criterion-Referenced Performance Tests.
ERIC Educational Resources Information Center
Campbell, Clifton P.; Allender, Bill R.
1988-01-01
Criterion-referenced performance tests (CRPT) provide a realistic method for objectively measuring task proficiency against predetermined attainment standards. This article explains the procedures of constructing, validating, and scoring CRPTs and includes a checklist for a welding test. (JOW)
Building the United States National Vegetation Classification
Franklin, S.B.; Faber-Langendoen, D.; Jennings, M.; Keeler-Wolf, T.; Loucks, O.; Peet, R.; Roberts, D.; McKerrow, A.
2012-01-01
The Federal Geographic Data Committee (FGDC) Vegetation Subcommittee, the Ecological Society of America Panel on Vegetation Classification, and NatureServe have worked together to develop the United States National Vegetation Classification (USNVC). The current standard was accepted in 2008 and fosters consistency across Federal agencies and non-federal partners for the description of each vegetation concept and its hierarchical classification. The USNVC is structured as a dynamic standard, where changes to types at any level may be proposed at any time as new information comes in. But, because much information already exists from previous work, the NVC partners first established methods for screening existing types to determine their acceptability with respect to the 2008 standard. Current efforts include a screening process to assign confidence to Association and Group level descriptions, and a review of the upper three levels of the classification. For the upper levels especially, the expectation is that the review process includes international scientists. Immediate future efforts include the review of remaining levels and the development of a proposal review process.
Proposed Standards for Medical Education Submissions to the Journal of General Internal Medicine
Bowen, Judith L.; Gerrity, Martha S.; Kalet, Adina L.; Kogan, Jennifer R.; Spickard, Anderson; Wayne, Diane B.
2008-01-01
To help authors design rigorous studies and prepare clear and informative manuscripts, improve the transparency of editorial decisions, and raise the bar on educational scholarship, the Deputy Editors of the Journal of General Internal Medicine articulate standards for medical education submissions to the Journal. General standards include: (1) quality questions, (2) quality methods to match the questions, (3) insightful interpretation of findings, (4) transparent, unbiased reporting, and (5) attention to human subjects’ protection and ethical research conduct. Additional standards for specific study types are described. We hope these proposed standards will generate discussion that will foster their continued evolution. Electronic supplementary material The online version of this article (doi:10.1007/s11606-008-0676-z) contains supplementary material, which is available to authorized users. PMID:18612716
The impact of Life Science Identifier on informatics data.
Martin, Sean; Hohman, Moses M; Liefeld, Ted
2005-11-15
Since the Life Science Identifier (LSID) data identification and access standard made its official debut in late 2004, several organizations have begun to use LSIDs to simplify the methods used to uniquely name, reference and retrieve distributed data objects and concepts. In this review, the authors build on introductory work that describes the LSID standard by documenting how five early adopters have incorporated the standard into their technology infrastructure and by outlining several common misconceptions and difficulties related to LSID use, including the impact of the byte identity requirement for LSID-identified objects and the opacity recommendation for use of the LSID syntax. The review describes several shortcomings of the LSID standard, such as the lack of a specific metadata standard, along with solutions that could be addressed in future revisions of the specification.
Standard deviation and standard error of the mean.
Lee, Dong Kyu; In, Junyong; Lee, Sangseok
2015-06-01
In most clinical and experimental studies, the standard deviation (SD) and the estimated standard error of the mean (SEM) are used to present the characteristics of sample data and to explain statistical analysis results. However, some authors occasionally muddle the distinctive usage between the SD and SEM in medical literature. Because the process of calculating the SD and SEM includes different statistical inferences, each of them has its own meaning. SD is the dispersion of data in a normal distribution. In other words, SD indicates how accurately the mean represents sample data. However the meaning of SEM includes statistical inference based on the sampling distribution. SEM is the SD of the theoretical distribution of the sample means (the sampling distribution). While either SD or SEM can be applied to describe data and statistical results, one should be aware of reasonable methods with which to use SD and SEM. We aim to elucidate the distinctions between SD and SEM and to provide proper usage guidelines for both, which summarize data and describe statistical results.
Standard deviation and standard error of the mean
In, Junyong; Lee, Sangseok
2015-01-01
In most clinical and experimental studies, the standard deviation (SD) and the estimated standard error of the mean (SEM) are used to present the characteristics of sample data and to explain statistical analysis results. However, some authors occasionally muddle the distinctive usage between the SD and SEM in medical literature. Because the process of calculating the SD and SEM includes different statistical inferences, each of them has its own meaning. SD is the dispersion of data in a normal distribution. In other words, SD indicates how accurately the mean represents sample data. However the meaning of SEM includes statistical inference based on the sampling distribution. SEM is the SD of the theoretical distribution of the sample means (the sampling distribution). While either SD or SEM can be applied to describe data and statistical results, one should be aware of reasonable methods with which to use SD and SEM. We aim to elucidate the distinctions between SD and SEM and to provide proper usage guidelines for both, which summarize data and describe statistical results. PMID:26045923
Berthon, Beatrice; Spezi, Emiliano; Galavis, Paulina; Shepherd, Tony; Apte, Aditya; Hatt, Mathieu; Fayad, Hadi; De Bernardi, Elisabetta; Soffientini, Chiara D; Ross Schmidtlein, C; El Naqa, Issam; Jeraj, Robert; Lu, Wei; Das, Shiva; Zaidi, Habib; Mawlawi, Osama R; Visvikis, Dimitris; Lee, John A; Kirov, Assen S
2017-08-01
The aim of this paper is to define the requirements and describe the design and implementation of a standard benchmark tool for evaluation and validation of PET-auto-segmentation (PET-AS) algorithms. This work follows the recommendations of Task Group 211 (TG211) appointed by the American Association of Physicists in Medicine (AAPM). The recommendations published in the AAPM TG211 report were used to derive a set of required features and to guide the design and structure of a benchmarking software tool. These items included the selection of appropriate representative data and reference contours obtained from established approaches and the description of available metrics. The benchmark was designed in a way that it could be extendable by inclusion of bespoke segmentation methods, while maintaining its main purpose of being a standard testing platform for newly developed PET-AS methods. An example of implementation of the proposed framework, named PETASset, was built. In this work, a selection of PET-AS methods representing common approaches to PET image segmentation was evaluated within PETASset for the purpose of testing and demonstrating the capabilities of the software as a benchmark platform. A selection of clinical, physical, and simulated phantom data, including "best estimates" reference contours from macroscopic specimens, simulation template, and CT scans was built into the PETASset application database. Specific metrics such as Dice Similarity Coefficient (DSC), Positive Predictive Value (PPV), and Sensitivity (S), were included to allow the user to compare the results of any given PET-AS algorithm to the reference contours. In addition, a tool to generate structured reports on the evaluation of the performance of PET-AS algorithms against the reference contours was built. The variation of the metric agreement values with the reference contours across the PET-AS methods evaluated for demonstration were between 0.51 and 0.83, 0.44 and 0.86, and 0.61 and 1.00 for DSC, PPV, and the S metric, respectively. Examples of agreement limits were provided to show how the software could be used to evaluate a new algorithm against the existing state-of-the art. PETASset provides a platform that allows standardizing the evaluation and comparison of different PET-AS methods on a wide range of PET datasets. The developed platform will be available to users willing to evaluate their PET-AS methods and contribute with more evaluation datasets. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
Quantification of amine functional groups and their influence on OM/OC in the IMPROVE network
NASA Astrophysics Data System (ADS)
Kamruzzaman, Mohammed; Takahama, Satoshi; Dillner, Ann M.
2018-01-01
Recently, we developed a method using FT-IR spectroscopy coupled with partial least squares (PLS) regression to measure the four most abundant organic functional groups, aliphatic C-H, alcohol OH, carboxylic acid OH and carbonyl C=O, in atmospheric particulate matter. These functional groups are summed to estimate organic matter (OM) while the carbon from the functional groups is summed to estimate organic carbon (OC). With this method, OM and OM/OC can be estimated for each sample rather than relying on one assumed value to convert OC measurements to OM. This study continues the development of the FT-IR and PLS method for estimating OM and OM/OC by including the amine functional group. Amines are ubiquitous in the atmosphere and come from motor vehicle exhaust, animal husbandry, biomass burning, and vegetation among other sources. In this study, calibration standards for amines are produced by aerosolizing individual amine compounds and collecting them on PTFE filters using an IMPROVE sampler, thereby mimicking the filter media and collection geometry of ambient standards. The moles of amine functional group on each standard and a narrow range of amine-specific wavenumbers in the FT-IR spectra (wavenumber range 1 550-1 500 cm-1) are used to develop a PLS calibration model. The PLS model is validated using three methods: prediction of a set of laboratory standards not included in the model, a peak height analysis and a PLS model with a broader wavenumber range. The model is then applied to the ambient samples collected throughout 2013 from 16 IMPROVE sites in the USA. Urban sites have higher amine concentrations than most rural sites, but amine functional groups account for a lower fraction of OM at urban sites. Amine concentrations, contributions to OM and seasonality vary by site and sample. Amine has a small impact on the annual average OM/OC for urban sites, but for some rural sites including amine in the OM/OC calculations increased OM/OC by 0.1 or more.
Gioe, Terence J; Sharma, Amit; Tatman, Penny; Mehle, Susan
2011-01-01
Numerous joint implant options of varying cost are available to the surgeon, but it is unclear whether more costly implants add value in terms of function or longevity. We evaluated registry survival of higher-cost "premium" knee and hip components compared to lower-priced standard components. Premium TKA components were defined as mobile-bearing designs, high-flexion designs, oxidized-zirconium designs, those including moderately crosslinked polyethylene inserts, or some combination. Premium THAs included ceramic-on-ceramic, metal-on-metal, and ceramic-on-highly crosslinked polyethylene designs. We compared 3462 standard TKAs to 2806 premium TKAs and 868 standard THAs to 1311 premium THAs using standard statistical methods. The cost of the premium implants was on average approximately $1000 higher than the standard implants. There was no difference in the cumulative revision rate at 7-8 years between premium and standard TKAs or THAs. In this time frame, premium implants did not demonstrate better survival than standard implants. Revision indications for TKA did not differ, and infection and instability remained contributors. Longer followup is necessary to demonstrate whether premium implants add value in younger patient groups. Level III, therapeutic study. See Guidelines for Authors for a complete description of levels of evidence.
Uncertainty in Vs30-based site response
Thompson, Eric M.; Wald, David J.
2016-01-01
Methods that account for site response range in complexity from simple linear categorical adjustment factors to sophisticated nonlinear constitutive models. Seismic‐hazard analysis usually relies on ground‐motion prediction equations (GMPEs); within this framework site response is modeled statistically with simplified site parameters that include the time‐averaged shear‐wave velocity to 30 m (VS30) and basin depth parameters. Because VS30 is not known in most locations, it must be interpolated or inferred through secondary information such as geology or topography. In this article, we analyze a subset of stations for which VS30 has been measured to address effects of VS30 proxies on the uncertainty in the ground motions as modeled by GMPEs. The stations we analyze also include multiple recordings, which allow us to compute the repeatable site effects (or empirical amplification factors [EAFs]) from the ground motions. Although all methods exhibit similar bias, the proxy methods only reduce the ground‐motion standard deviations at long periods when compared to GMPEs without a site term, whereas measured VS30 values reduce the standard deviations at all periods. The standard deviation of the ground motions are much lower when the EAFs are used, indicating that future refinements of the site term in GMPEs have the potential to substantially reduce the overall uncertainty in the prediction of ground motions by GMPEs.
Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.
2015-10-30
The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small- Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statisticallymore » significant, most of them can be disregarded for comparison purposes to assess potential variability when laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges represent variation about the mean values of the tests of between 26% and 42%. The magnitude of this variation is attributed to differences in operator, method, and environment as well as the use of different instruments that are also of varying age. The results appear to be a good representation of the broader safety testing community based on the range of methods, instruments, and environments included in the IDCA Proficiency Test.« less
NASA Astrophysics Data System (ADS)
Aschonitis, Vassilis G.; Papamichail, Dimitris; Demertzi, Kleoniki; Colombani, Nicolo; Mastrocicco, Micol; Ghirardini, Andrea; Castaldelli, Giuseppe; Fano, Elisa-Anna
2017-08-01
The objective of the study is to provide global grids (0.5°) of revised annual coefficients for the Priestley-Taylor (P-T) and Hargreaves-Samani (H-S) evapotranspiration methods after calibration based on the ASCE (American Society of Civil Engineers)-standardized Penman-Monteith method (the ASCE method includes two reference crops: short-clipped grass and tall alfalfa). The analysis also includes the development of a global grid of revised annual coefficients for solar radiation (Rs) estimations using the respective Rs formula of H-S. The analysis was based on global gridded climatic data of the period 1950-2000. The method for deriving annual coefficients of the P-T and H-S methods was based on partial weighted averages (PWAs) of their mean monthly values. This method estimates the annual values considering the amplitude of the parameter under investigation (ETo and Rs) giving more weight to the monthly coefficients of the months with higher ETo values (or Rs values for the case of the H-S radiation formula). The method also eliminates the effect of unreasonably high or low monthly coefficients that may occur during periods where ETo and Rs fall below a specific threshold. The new coefficients were validated based on data from 140 stations located in various climatic zones of the USA and Australia with expanded observations up to 2016. The validation procedure for ETo estimations of the short reference crop showed that the P-T and H-S methods with the new revised coefficients outperformed the standard methods reducing the estimated root mean square error (RMSE) in ETo values by 40 and 25 %, respectively. The estimations of Rs using the H-S formula with revised coefficients reduced the RMSE by 28 % in comparison to the standard H-S formula. Finally, a raster database was built consisting of (a) global maps for the mean monthly ETo values estimated by ASCE-standardized method for both reference crops, (b) global maps for the revised annual coefficients of the P-T and H-S evapotranspiration methods for both reference crops and a global map for the revised annual coefficient of the H-S radiation formula and (c) global maps that indicate the optimum locations for using the standard P-T and H-S methods and their possible annual errors based on reference values. The database can support estimations of ETo and solar radiation for locations where climatic data are limited and it can support studies which require such estimations on larger scales (e.g. country, continent, world). The datasets produced in this study are archived in the PANGAEA database (https://doi.org/10.1594/PANGAEA.868808) and in the ESRN database (http://www.esrn-database.org or http://esrn-database.weebly.com).
Protocol - realist and meta-narrative evidence synthesis: Evolving Standards (RAMESES)
2011-01-01
Background There is growing interest in theory-driven, qualitative and mixed-method approaches to systematic review as an alternative to (or to extend and supplement) conventional Cochrane-style reviews. These approaches offer the potential to expand the knowledge base in policy-relevant areas - for example by explaining the success, failure or mixed fortunes of complex interventions. However, the quality of such reviews can be difficult to assess. This study aims to produce methodological guidance, publication standards and training resources for those seeking to use the realist and/or meta-narrative approach to systematic review. Methods/design We will: [a] collate and summarise existing literature on the principles of good practice in realist and meta-narrative systematic review; [b] consider the extent to which these principles have been followed by published and in-progress reviews, thereby identifying how rigour may be lost and how existing methods could be improved; [c] using an online Delphi method with an interdisciplinary panel of experts from academia and policy, produce a draft set of methodological steps and publication standards; [d] produce training materials with learning outcomes linked to these steps; [e] pilot these standards and training materials prospectively on real reviews-in-progress, capturing methodological and other challenges as they arise; [f] synthesise expert input, evidence review and real-time problem analysis into more definitive guidance and standards; [g] disseminate outputs to audiences in academia and policy. The outputs of the study will be threefold: 1. Quality standards and methodological guidance for realist and meta-narrative reviews for use by researchers, research sponsors, students and supervisors 2. A 'RAMESES' (Realist and Meta-review Evidence Synthesis: Evolving Standards) statement (comparable to CONSORT or PRISMA) of publication standards for such reviews, published in an open-access academic journal. 3. A training module for researchers, including learning outcomes, outline course materials and assessment criteria. Discussion Realist and meta-narrative review are relatively new approaches to systematic review whose overall place in the secondary research toolkit is not yet fully established. As with all secondary research methods, guidance on quality assurance and uniform reporting is an important step towards improving quality and consistency of studies. PMID:21843376
Robust Mediation Analysis Based on Median Regression
Yuan, Ying; MacKinnon, David P.
2014-01-01
Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925
Chu, S.; Hong, C.-S.; Rattner, B.A.; McGowan, P.C.
2003-01-01
A method for the determination of 146 polychlorinated biphenyls (PCBs), including 4 non-ortho and 8 mono-ortho substituted congeners and 26 chlorinated pesticides is described. The method consists of ultrasonic extraction, Florisilcleanup, HPLC fractionation over porous graphitic carbon (PGC), and final determination with GC/ECD and/or GC/MS. Two PCB congeners (PCB 30 and PCB 161) and two polybromo-biphenyls (2,4?,5-tribromobiphenyl and 3,3?,4,4?-tetrabromobiphenyl) were used as surrogate standards to evaluate the analytical efficiency. Four PCB congeners, PCB 14 and PCB 159 for the first fraction, PCB 61 for the second fraction, and PCB 204 for the third fraction, were used as internal standards to monitor the GC performance. The retention behavior of PCBs and pesticides on porous-graphitic-carbon column were discussed. The method was found to be effective and reliable under the operational conditions proposed and was applied successfully to the analysis of individual PCBs and chlorinated pesticides in heron egg samples.
TOKYO criteria 2014 for transpapillary biliary stenting.
Isayama, Hiroyuki; Hamada, Tsuyoshi; Yasuda, Ichiro; Itoi, Takao; Ryozawa, Shomei; Nakai, Yousuke; Kogure, Hirofumi; Koike, Kazuhiko
2015-01-01
It is difficult to carry out meta-analyses or to compare the results of different studies of biliary stents because there is no uniform evaluation method. Therefore, a standardized reporting system is required. We propose a new standardized system for reporting on biliary stents, the 'TOKYO criteria 2014', based on a consensus among Japanese pancreatobiliary endoscopists. Instead of stent occlusion, we use recurrent biliary obstruction, which includes occlusion and migration. The time to recurrent biliary obstruction was estimated using Kaplan-Meier analysis with the log-rank test. We can evaluate both plastic and self-expandable metallic stents (uncovered and covered). We also propose specification of the cause of recurrent biliary obstruction, identification of complications other than recurrent biliary obstruction, indication of severity, measures of technical and clinical success, and a standard for clinical care. Most importantly, the TOKYO criteria 2014 allow comparison of biliary stent quality across studies. Because blocked stents can be drained not only using transpapillary techniques but also by an endoscopic ultrasonography-guided transmural procedure, we should devise an evaluation method that includes transmural stenting in the near future. © 2014 The Authors. Digestive Endoscopy © 2014 Japan Gastroenterological Endoscopy Society.
2013-01-01
Background Increasing focus is being placed on Clerkship curriculum design and implementation in light of new undergraduate medical education research and accreditation standards. Canadian Otolaryngology-Head and Neck Surgery (OTOHNS) Clerkship programs are continually but independently evolving towards a common goal of improving Clerkship curriculum. Methods An electronic survey was sent to undergraduate OTOHNS directors at all Canadian medical schools (n = 17) examining their Clerkship curricula. Themes included Clerkship format, teaching methods, faculty support and development, program strengths, and barriers. Results Survey response rate was 76%. All responding schools had OTOHNS Clerkship programs ranging in type (mandatory, selective or elective) and length (<1 to 4 weeks). Learning modalities varied. Electronic learning tools were identified as increasingly important to curriculum delivery. Common strengths included wide clinical exposure and one-on-one mentoring. Multiple challenges were identified in curriculum implementation and evaluation. All schools expressed interest in developing national standards, objectives and e-learning resources. Conclusions Significant variation exists in OTOHNS Clerkship experiences between Canadian medical schools. Many schools perceive barriers of insufficient time, space and curriculum standardization. Interested Canadian OTOHNS educators are eager to collaborate to improve the collective OTOHNS Clerkship experience. PMID:23663703
Unice, Kenneth M.; Kreider, Marisa L.; Panko, Julie M.
2012-01-01
Pyrolysis(pyr)-GC/MS analysis of characteristic thermal decomposition fragments has been previously used for qualitative fingerprinting of organic sources in environmental samples. A quantitative pyr-GC/MS method based on characteristic tire polymer pyrolysis products was developed for tread particle quantification in environmental matrices including soil, sediment, and air. The feasibility of quantitative pyr-GC/MS analysis of tread was confirmed in a method evaluation study using artificial soil spiked with known amounts of cryogenically generated tread. Tread concentration determined by blinded analyses was highly correlated (r2 ≥ 0.88) with the known tread spike concentration. Two critical refinements to the initial pyrolysis protocol were identified including use of an internal standard and quantification by the dimeric markers vinylcyclohexene and dipentene, which have good specificity for rubber polymer with no other appreciable environmental sources. A novel use of deuterated internal standards of similar polymeric structure was developed to correct the variable analyte recovery caused by sample size, matrix effects, and ion source variability. The resultant quantitative pyr-GC/MS protocol is reliable and transferable between laboratories. PMID:23202830
Immunochemical Investigations of Cell Surface Antigens of Anaerobic Bacteria
1977-01-15
sterile cecal contents were included in all inocula, our data indicates * that cecal contents from germ free rats can be used in place of sterile cecal...The void volume of the column was estimated with blue dextran. Molecular size of the un- digested Pool 1 material was estimated using a PM-30 membrane...51) using bovine serum albumin as a standard. Total sugars were measured by the phenol-sulfuric acid method (52) using glucose as a standards
[Academic origin of round magnetic needle and standardization operation].
Cheng, Yan-Ting; Zhang, Tian-Sheng; Meng, Li-Qiang; Shi, Rui-Qi; Ji, Lai-Xi
2014-07-01
The origin and development of round magnetic needle was explored, and the structure of round magnetic needle was introduced in detail, including the handle, the body and the tip of the needle. The clinical opera tion of round magnetic needle were standardized from the aspects of the methods of holding needle, manipulation skill, tapping position, strength of manipulation, application scope and matters needing attention, which laid foundation for the popularization and application of round magnetic needle.
Evaluation of a High Throughput Starch Analysis Optimised for Wood
Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco
2014-01-01
Starch is the most important long-term reserve in trees, and the analysis of starch is therefore useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863
Gough, Albert H; Chen, Ning; Shun, Tong Ying; Lezon, Timothy R; Boltz, Robert C; Reese, Celeste E; Wagner, Jacob; Vernetti, Lawrence A; Grandis, Jennifer R; Lee, Adrian V; Stern, Andrew M; Schurdak, Mark E; Taylor, D Lansing
2014-01-01
One of the greatest challenges in biomedical research, drug discovery and diagnostics is understanding how seemingly identical cells can respond differently to perturbagens including drugs for disease treatment. Although heterogeneity has become an accepted characteristic of a population of cells, in drug discovery it is not routinely evaluated or reported. The standard practice for cell-based, high content assays has been to assume a normal distribution and to report a well-to-well average value with a standard deviation. To address this important issue we sought to define a method that could be readily implemented to identify, quantify and characterize heterogeneity in cellular and small organism assays to guide decisions during drug discovery and experimental cell/tissue profiling. Our study revealed that heterogeneity can be effectively identified and quantified with three indices that indicate diversity, non-normality and percent outliers. The indices were evaluated using the induction and inhibition of STAT3 activation in five cell lines where the systems response including sample preparation and instrument performance were well characterized and controlled. These heterogeneity indices provide a standardized method that can easily be integrated into small and large scale screening or profiling projects to guide interpretation of the biology, as well as the development of therapeutics and diagnostics. Understanding the heterogeneity in the response to perturbagens will become a critical factor in designing strategies for the development of therapeutics including targeted polypharmacology.
The skin prick test – European standards
2013-01-01
Skin prick testing is an essential test procedure to confirm sensitization in IgE-mediated allergic disease in subjects with rhinoconjunctivitis, asthma, urticaria, anapylaxis, atopic eczema and food and drug allergy. This manuscript reviews the available evidence including Medline and Embase searches, abstracts of international allergy meetings and position papers from the world allergy literature. The recommended method of prick testing includes the appropriate use of specific allergen extracts, positive and negative controls, interpretation of the tests after 15 – 20 minutes of application, with a positive result defined as a wheal ≥3 mm diameter. A standard prick test panel for Europe for inhalants is proposed and includes hazel (Corylus avellana), alder (Alnus incana), birch (Betula alba), plane (Platanus vulgaris), cypress (Cupressus sempervirens), grass mix (Poa pratensis, Dactilis glomerata, Lolium perenne, Phleum pratense, Festuca pratensis, Helictotrichon pretense), Olive (Olea europaea), mugwort (Artemisia vulgaris), ragweed (Ambrosia artemisiifolia), Alternaria alternata (tenuis), Cladosporium herbarum, Aspergillus fumigatus, Parietaria, cat, dog, Dermatophagoides pteronyssinus, Dermatophagoides farinae, and cockroach (Blatella germanica). Standardization of the skin test procedures and standard panels for different geographic locations are encouraged worldwide to permit better comparisons for diagnostic, clinical and research purposes. PMID:23369181
Coordinate measuring machine test standard apparatus and method
Bieg, Lothar F.
1994-08-30
A coordinate measuring machine test standard apparatus and method which iudes a rotary spindle having an upper phase plate and an axis of rotation, a kinematic ball mount attached to the phase plate concentric with the axis of rotation of the phase plate, a groove mounted at the circumference of the phase plate, and an arm assembly which rests in the groove. The arm assembly has a small sphere at one end and a large sphere at the other end. The small sphere may be a coordinate measuring machine probe tip and may have variable diameters. The large sphere is secured in the kinematic ball mount and the arm is held in the groove. The kinematic ball mount includes at least three mounting spheres and the groove is an angular locating groove including at least two locking spheres. The arm may have a hollow inner core and an outer layer. The rotary spindle may be a ratio reducer. The device is used to evaluate the measuring performance of a coordinate measuring machine for periodic recertification, including 2 and 3 dimensional accuracy, squareness, straightness, and angular accuracy.
NASA Astrophysics Data System (ADS)
Leif, Robert C.; Spidlen, Josef; Brinkman, Ryan R.
2008-02-01
Introduction: The International Society for Analytical Cytology, ISAC, is developing a new combined flow and image Analytical Cytometry Standard (ACS). This standard needs to serve both the research and clinical communities. The clinical medicine and clinical research communities have a need to exchange information with hospital and other clinical information systems. Methods: 1) Prototype the standard by creating CytometryML and a RAW format for binary data. 2) Join the ISAC Data Standards Task Force. 3) Create essential project documentation. 4) Cooperate with other groups by assisting in the preparation of the DICOM Supplement 122: Specimen Module and Pathology Service-Object Pair Classes. Results: CytometryML has been created and serves as a prototype and source of experience for the following: the Analytical Cytometry Standard (ACS) 1.0, the ACS container, Minimum Information about a Flow Cytometry Experiment (MIFlowCyt), and Requirements for a Data File Standard Format to Describe Flow Cytometry and Related Analytical Cytology Data. These requirements provide a means to judge the appropriateness of design elements and to develop tests for the final ACS. The requirements include providing the information required for understanding and reproducing a cytometry experiment or clinical measurement, and for a single standard for both flow and digital microscopic cytometry. Schemas proposed by other members of the ISAC Data Standards Task Force (e.g, Gating-ML) have been independently validated and have been integrated with CytometryML. The use of netCDF as an element of the ACS container has been proposed by others and a suggested method of its use is proposed.
NASA Technical Reports Server (NTRS)
Marley, Mike
2008-01-01
The focus of this paper will be on the thermal balance testing for the Operationally Responsive Space Standard Bus Battery. The Standard Bus thermal design required that the battery be isolated from the bus itself. This required the battery to have its own thermal control, including heaters and a radiator surface. Since the battery was not ready for testing during the overall bus thermal balance testing, a separate test was conducted to verify the thermal design for the battery. This paper will discuss in detail, the test set up, test procedure, and results from this test. Additionally this paper will consider the methods taken to determine the heat dissipation of the battery during charge and discharge. It seems that the heat dissipation for Lithium Ion batteries is relatively unknown and hard to quantify. The methods used during test and the post test analysis to estimate the heat dissipation of the battery will be discussed.
DOT National Transportation Integrated Search
2013-12-01
In 1992, an Applicants Guide and a Reviewers Guide to Traffic Impact Analyses to standardize the methodologies for conducting traffic : impact analyses (TIAs) in Indiana were developed for the Indiana Department of Transportation (INDOT). The m...
16 CFR 1031.7 - Commission support of voluntary standards activities.
Code of Federal Regulations, 2014 CFR
2014-01-01
...) Providing epidemiological and health science information and explanations of hazards for consumer products... or subsidizing technical assistance, including research, health science data, and engineering support.... (5) Providing assistance on methods of disseminating information and education about the voluntary...
16 CFR 1031.7 - Commission support of voluntary standards activities.
Code of Federal Regulations, 2011 CFR
2011-01-01
...) Providing epidemiological and health science information and explanations of hazards for consumer products... or subsidizing technical assistance, including research, health science data, and engineering support.... (5) Providing assistance on methods of disseminating information and education about the voluntary...
16 CFR § 1031.7 - Commission support of voluntary standards activities.
Code of Federal Regulations, 2013 CFR
2013-01-01
... actions: (1) Providing epidemiological and health science information and explanations of hazards for...) Performing or subsidizing technical assistance, including research, health science data, and engineering... participating. (5) Providing assistance on methods of disseminating information and education about the...
16 CFR 1031.7 - Commission support of voluntary standards activities.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Providing epidemiological and health science information and explanations of hazards for consumer products... or subsidizing technical assistance, including research, health science data, and engineering support.... (5) Providing assistance on methods of disseminating information and education about the voluntary...
16 CFR 1031.7 - Commission support of voluntary standards activities.
Code of Federal Regulations, 2012 CFR
2012-01-01
...) Providing epidemiological and health science information and explanations of hazards for consumer products... or subsidizing technical assistance, including research, health science data, and engineering support.... (5) Providing assistance on methods of disseminating information and education about the voluntary...
43 CFR 426.13 - Excess land appraisals.
Code of Federal Regulations, 2011 CFR
2011-10-01
... cost methods, as applicable. Reclamation will consider nonproject water supply factors as provided in... standard appraisal procedures. (c) Appraisals of nonproject water supplies. (1) The appraiser will consider nonproject water supply factors, where appropriate, including: (i) Ground water pumping lift; (ii) Surface...
Reviews the issues and approaches involved in considering and adopting cost-effectiveness tests for energy efficiency, including discussing each perspective represented by the five standard cost-effectiveness tests and clarifying key terms.
21 CFR 820.70 - Production and process controls.
Code of Federal Regulations, 2011 CFR
2011-04-01
... process control procedures that describe any process controls necessary to ensure conformance to specifications. Where process controls are needed they shall include: (1) Documented instructions, standard operating procedures (SOP's), and methods that define and control the manner of production; (2) Monitoring...
21 CFR 820.70 - Production and process controls.
Code of Federal Regulations, 2013 CFR
2013-04-01
... process control procedures that describe any process controls necessary to ensure conformance to specifications. Where process controls are needed they shall include: (1) Documented instructions, standard operating procedures (SOP's), and methods that define and control the manner of production; (2) Monitoring...
21 CFR 820.70 - Production and process controls.
Code of Federal Regulations, 2014 CFR
2014-04-01
... to ensure that a device conforms to its specifications. Where deviations from device specifications... specifications. Where process controls are needed they shall include: (1) Documented instructions, standard... establish and maintain procedures for changes to a specification, method, process, or procedure. Such...
21 CFR 820.70 - Production and process controls.
Code of Federal Regulations, 2012 CFR
2012-04-01
... to ensure that a device conforms to its specifications. Where deviations from device specifications... specifications. Where process controls are needed they shall include: (1) Documented instructions, standard... establish and maintain procedures for changes to a specification, method, process, or procedure. Such...
Koivula, Lauri; Kapanen, Mika; Seppälä, Tiina; Collan, Juhani; Dowling, Jason A; Greer, Peter B; Gustafsson, Christian; Gunnlaugsson, Adalsteinn; Olsson, Lars E; Wee, Leonard; Korhonen, Juha
2017-12-01
Recent studies have shown that it is possible to conduct entire radiotherapy treatment planning (RTP) workflow using only MR images. This study aims to develop a generalized intensity-based method to generate synthetic CT (sCT) images from standard T2-weighted (T2 w ) MR images of the pelvis. This study developed a generalized dual model HU conversion method to convert standard T2 w MR image intensity values to synthetic HU values, separately inside and outside of atlas-segmented bone volume contour. The method was developed and evaluated with 20 and 35 prostate cancer patients, respectively. MR images with scanning sequences in clinical use were acquired with four different MR scanners of three vendors. For the generated synthetic CT (sCT) images of the 35 prostate patients, the mean (and maximal) HU differences in soft and bony tissue volumes were 16 ± 6 HUs (34 HUs) and -46 ± 56 HUs (181 HUs), respectively, against the true CT images. The average of the PTV mean dose difference in sCTs compared to those in true CTs was -0.6 ± 0.4% (-1.3%). The study provides a generalized method for sCT creation from standard T2 w images of the pelvis. The method produced clinically acceptable dose calculation results for all the included scanners and MR sequences. Copyright © 2017 Elsevier B.V. All rights reserved.
Global Asymptotic Behavior of Iterative Implicit Schemes
NASA Technical Reports Server (NTRS)
Yee, H. C.; Sweby, P. K.
1994-01-01
The global asymptotic nonlinear behavior of some standard iterative procedures in solving nonlinear systems of algebraic equations arising from four implicit linear multistep methods (LMMs) in discretizing three models of 2 x 2 systems of first-order autonomous nonlinear ordinary differential equations (ODEs) is analyzed using the theory of dynamical systems. The iterative procedures include simple iteration and full and modified Newton iterations. The results are compared with standard Runge-Kutta explicit methods, a noniterative implicit procedure, and the Newton method of solving the steady part of the ODEs. Studies showed that aside from exhibiting spurious asymptotes, all of the four implicit LMMs can change the type and stability of the steady states of the differential equations (DEs). They also exhibit a drastic distortion but less shrinkage of the basin of attraction of the true solution than standard nonLMM explicit methods. The simple iteration procedure exhibits behavior which is similar to standard nonLMM explicit methods except that spurious steady-state numerical solutions cannot occur. The numerical basins of attraction of the noniterative implicit procedure mimic more closely the basins of attraction of the DEs and are more efficient than the three iterative implicit procedures for the four implicit LMMs. Contrary to popular belief, the initial data using the Newton method of solving the steady part of the DEs may not have to be close to the exact steady state for convergence. These results can be used as an explanation for possible causes and cures of slow convergence and nonconvergence of steady-state numerical solutions when using an implicit LMM time-dependent approach in computational fluid dynamics.
Gortmaker, Steven L.; Kenney, Erica L.; Carter, Jill E.; Howe, M. Caitlin Westfall; Reiner, Jennifer F.; Cradock, Angie L.
2016-01-01
Introduction Competitive beverages are drinks sold outside of the federally reimbursable school meals program and include beverages sold in vending machines, a la carte lines, school stores, and snack bars. Competitive beverages include sugar-sweetened beverages, which are associated with overweight and obesity. We described competitive beverage availability 9 years after the introduction in 2004 of district-wide nutrition standards for competitive beverages sold in Boston Public Schools. Methods In 2013, we documented types of competitive beverages sold in 115 schools. We collected nutrient data to determine compliance with the standards. We evaluated the extent to which schools met the competitive-beverage standards and calculated the percentage of students who had access to beverages that met or did not meet the standards. Results Of 115 schools, 89.6% met the competitive beverage nutrition standards; 88.5% of elementary schools and 61.5% of middle schools did not sell competitive beverages. Nutrition standards were met in 79.2% of high schools; 37.5% did not sell any competitive beverages, and 41.7% sold only beverages meeting the standards. Overall, 85.5% of students attended schools meeting the standards. Only 4.0% of students had access to sugar-sweetened beverages. Conclusion A comprehensive, district-wide competitive beverage policy with implementation support can translate into a sustained healthful environment in public schools. PMID:26940299
Development of job standards for clinical nutrition therapy for dyslipidemia patients.
Kang, Min-Jae; Seo, Jung-Sook; Kim, Eun-Mi; Park, Mi-Sun; Woo, Mi-Hye; Ju, Dal-Lae; Wie, Gyung-Ah; Lee, Song-Mi; Cha, Jin-A; Sohn, Cheong-Min
2015-04-01
Dyslipidemia has significantly contributed to the increase of death and morbidity rates related to cardiovascular diseases. Clinical nutrition service provided by dietitians has been reported to have a positive effect on relief of medical symptoms or reducing the further medical costs. However, there is a lack of researches to identify key competencies and job standard for clinical dietitians to care patients with dyslipidemia. Therefore, the purpose of this study was to analyze the job components of clinical dietitian and develop the standard for professional practice to provide effective nutrition management for dyslipidemia patients. The current status of clinical nutrition therapy for dyslipidemia patients in hospitals with 300 or more beds was studied. After duty tasks and task elements of nutrition care process for dyslipidemia clinical dietitians were developed by developing a curriculum (DACUM) analysis method. The developed job standards were pretested in order to evaluate job performance, difficulty, and job standards. As a result, the job standard included four jobs, 18 tasks, and 53 task elements, and specific job description includes 73 basic services and 26 recommended services. When clinical dietitians managing dyslipidemia patients performed their practice according to this job standard for 30 patients the job performance rate was 68.3%. Therefore, the job standards of clinical dietitians for clinical nutrition service for dyslipidemia patients proposed in this study can be effectively used by hospitals.
Comparison of scoring approaches for the NEI VFQ-25 in low vision.
Dougherty, Bradley E; Bullimore, Mark A
2010-08-01
The aim of this study was to evaluate different approaches to scoring the National Eye Institute Visual Functioning Questionnaire-25 (NEI VFQ-25) in patients with low vision including scoring by the standard method, by Rasch analysis, and by use of an algorithm created by Massof to approximate Rasch person measure. Subscale validity and use of a 7-item short form instrument proposed by Ryan et al. were also investigated. NEI VFQ-25 data from 50 patients with low vision were analyzed using the standard method of summing Likert-type scores and calculating an overall average, Rasch analysis using Winsteps software, and the Massof algorithm in Excel. Correlations between scores were calculated. Rasch person separation reliability and other indicators were calculated to determine the validity of the subscales and of the 7-item instrument. Scores calculated using all three methods were highly correlated, but evidence of floor and ceiling effects was found with the standard scoring method. None of the subscales investigated proved valid. The 7-item instrument showed acceptable person separation reliability and good targeting and item performance. Although standard scores and Rasch scores are highly correlated, Rasch analysis has the advantages of eliminating floor and ceiling effects and producing interval-scaled data. The Massof algorithm for approximation of the Rasch person measure performed well in this group of low-vision patients. The validity of the subscales VFQ-25 should be reconsidered.
Inertial effects on mechanically braked Wingate power calculations.
Reiser, R F; Broker, J P; Peterson, M L
2000-09-01
The standard procedure for determining subject power output from a 30-s Wingate test on a mechanically braked (friction-loaded) ergometer includes only the braking resistance and flywheel velocity in the computations. However, the inertial effects associated with accelerating and decelerating the crank and flywheel also require energy and, therefore, represent a component of the subject's power output. The present study was designed to determine the effects of drive-system inertia on power output calculations. Twenty-eight male recreational cyclists completed Wingate tests on a Monark 324E mechanically braked ergometer (resistance: 8.5% body mass (BM), starting cadence: 60 rpm). Power outputs were then compared using both standard (without inertial contribution) and corrected methods (with inertial contribution) of calculating power output. Relative 5-s peak power and 30-s average power for the corrected method (14.8 +/- 1.2 W x kg(-1) BM; 9.9 +/- 0.7 W x kg(-1) BM) were 20.3% and 3.1% greater than that of the standard method (12.3 +/- 0.7 W x kg(-1) BM; 9.6 +/- 0.7 W x kg(-1) BM), respectively. Relative 5-s minimum power for the corrected method (6.8 +/- 0.7 W x kg(-1) BM) was 6.8% less than that of the standard method (7.3 +/- 0.8 W x kg(-1) BM). The combined differences in the peak power and minimum power produced a fatigue index for the corrected method (54 +/- 5%) that was 31.7% greater than that of the standard method (41 +/- 6%). All parameter differences were significant (P < 0.01). The inertial contribution to power output was dominated by the flywheel; however, the contribution from the crank was evident. These results indicate that the inertial components of the ergometer drive system influence the power output characteristics, requiring care when computing, interpreting, and comparing Wingate results, particularly among different ergometer designs and test protocols.
Quantum chemical approach to estimating the thermodynamics of metabolic reactions.
Jinich, Adrian; Rappoport, Dmitrij; Dunn, Ian; Sanchez-Lengeling, Benjamin; Olivares-Amaya, Roberto; Noor, Elad; Even, Arren Bar; Aspuru-Guzik, Alán
2014-11-12
Thermodynamics plays an increasingly important role in modeling and engineering metabolism. We present the first nonempirical computational method for estimating standard Gibbs reaction energies of metabolic reactions based on quantum chemistry, which can help fill in the gaps in the existing thermodynamic data. When applied to a test set of reactions from core metabolism, the quantum chemical approach is comparable in accuracy to group contribution methods for isomerization and group transfer reactions and for reactions not including multiply charged anions. The errors in standard Gibbs reaction energy estimates are correlated with the charges of the participating molecules. The quantum chemical approach is amenable to systematic improvements and holds potential for providing thermodynamic data for all of metabolism.
The Global Neurological Burden of Tuberculosis.
Thakur, Kiran; Das, Mitashee; Dooley, Kelly E; Gupta, Amita
2018-04-01
Central nervous system (CNS) involvement of tuberculosis (TB) is the most severe manifestation of TB and accounts for approximately 5 to 10% of all extrapulmonary TB (EPTB) cases and approximately 1% of all TB cases. TB meningitis (TBM) is the most common form of CNS TB, though other forms occur, often in conjunction with TBM, including intracranial tuberculomas, tuberculous brain abscesses, and spinal tubercular arachnoiditis. CNS TB often presents with nonspecific clinical features that mimic symptoms of other neurological conditions, often making diagnosis difficult. Defining neuroimaging characteristics of TBM include thick basal meningeal enhancement, hydrocephalus, and parenchymal infarctions most commonly involving the basal ganglia and internal capsule. Traditional cerebrospinal fluid sample analysis frequently requires lengthy times-to-result and have low sensitivity. Given the pitfalls of conventional CNS TB diagnostic methods, various molecular-based methods, including immunoassays and polymerase chain reaction (PCR)-based assays have emerged as alternative diagnostic tools due to their rapidity, sensitivity, and specificity. Expert panels on TBM have recently emphasized the need for standard research procedures with updated case definitions and standardized study methods, which will hopefully pave the way for more robust multicenter international studies. In this article, we review the epidemiology, diagnosis, molecular factors associated with disease presentation and outcome, and treatment of CNS TB. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.
Beyond maximum entropy: Fractal pixon-based image reconstruction
NASA Technical Reports Server (NTRS)
Puetter, R. C.; Pina, R. K.
1994-01-01
We have developed a new Bayesian image reconstruction method that has been shown to be superior to the best implementations of other methods, including Goodness-of-Fit (e.g. Least-Squares and Lucy-Richardson) and Maximum Entropy (ME). Our new method is based on the concept of the pixon, the fundamental, indivisible unit of picture information. Use of the pixon concept provides an improved image model, resulting in an image prior which is superior to that of standard ME.
User-Centred Design Using Gamestorming.
Currie, Leanne
2016-01-01
User-centered design (UX) is becoming a standard in software engineering and has tremendous potential in healthcare. The purpose of this tutorial will be to demonstrate and provide participants with practice in user-centred design methods that involve 'Gamestorming', a form of brainstorming where 'the rules of life are temporarily suspended'. Participants will learn and apply gamestorming methods including persona development via empathy mapping and methods to translate artefacts derived from participatory design sessions into functional and design requirements.
Microbiological methods for the water recovery systems test, revision 1.1
NASA Technical Reports Server (NTRS)
Rhoads, Tim; Kilgore, M. V., Jr.; Mikell, A. T., Jr.
1990-01-01
Current microbiological parameters specified to verify microbiological quality of Space Station Freedom water quality include the enumeration of total bacteria, anaerobes, aerobes, yeasts and molds, enteric bacteria, gram positives, gram negatives, and E. coli. In addition, other parameters have been identified as necessary to support the Water Recovery Test activities to be conducted at the NASA/MSFC later this year. These other parameters include aerotolerant eutrophic mesophiles, legionellae, and an additional method for heterotrophic bacteria. If inter-laboratory data are to be compared to evaluate quality, analytical methods must be eliminated as a variable. Therefore, each participating laboratory must utilize the same analytical methods and procedures. Without this standardization, data can be neither compared nor validated between laboratories. Multiple laboratory participation represents a conservative approach to insure quality and completeness of data. Invariably, sample loss will occur in transport and analyses. Natural variance is a reality on any test of this magnitude and is further enhanced because biological entities, capable of growth and death, are specific parameters of interest. The large variation due to the participation of human test subjects has been noted with previous testing. The resultant data might be dismissed as 'out of control' unless intra-laboratory control is included as part of the method or if participating laboratories are not available for verification. The purpose of this document is to provide standardized laboratory procedures for the enumeration of certain microorganisms in water and wastewater specific to the water recovery systems test. The document consists of ten separate cultural methods and one direct count procedure. It is not intended nor is it implied to be a complete microbiological methods manual.
Zhao, Li-Ting; Xiang, Yu-Hong; Dai, Yin-Mei; Zhang, Zhuo-Yong
2010-04-01
Near infrared spectroscopy was applied to measure the tissue slice of endometrial tissues for collecting the spectra. A total of 154 spectra were obtained from 154 samples. The number of normal, hyperplasia, and malignant samples was 36, 60, and 58, respectively. Original near infrared spectra are composed of many variables, for example, interference information including instrument errors and physical effects such as particle size and light scatter. In order to reduce these influences, original spectra data should be performed with different spectral preprocessing methods to compress variables and extract useful information. So the methods of spectral preprocessing and wavelength selection have played an important role in near infrared spectroscopy technique. In the present paper the raw spectra were processed using various preprocessing methods including first derivative, multiplication scatter correction, Savitzky-Golay first derivative algorithm, standard normal variate, smoothing, and moving-window median. Standard deviation was used to select the optimal spectral region of 4 000-6 000 cm(-1). Then principal component analysis was used for classification. Principal component analysis results showed that three types of samples could be discriminated completely and the accuracy almost achieved 100%. This study demonstrated that near infrared spectroscopy technology and chemometrics method could be a fast, efficient, and novel means to diagnose cancer. The proposed methods would be a promising and significant diagnosis technique of early stage cancer.
Standards for plant synthetic biology: a common syntax for exchange of DNA parts.
Patron, Nicola J; Orzaez, Diego; Marillonnet, Sylvestre; Warzecha, Heribert; Matthewman, Colette; Youles, Mark; Raitskin, Oleg; Leveau, Aymeric; Farré, Gemma; Rogers, Christian; Smith, Alison; Hibberd, Julian; Webb, Alex A R; Locke, James; Schornack, Sebastian; Ajioka, Jim; Baulcombe, David C; Zipfel, Cyril; Kamoun, Sophien; Jones, Jonathan D G; Kuhn, Hannah; Robatzek, Silke; Van Esse, H Peter; Sanders, Dale; Oldroyd, Giles; Martin, Cathie; Field, Rob; O'Connor, Sarah; Fox, Samantha; Wulff, Brande; Miller, Ben; Breakspear, Andy; Radhakrishnan, Guru; Delaux, Pierre-Marc; Loqué, Dominique; Granell, Antonio; Tissier, Alain; Shih, Patrick; Brutnell, Thomas P; Quick, W Paul; Rischer, Heiko; Fraser, Paul D; Aharoni, Asaph; Raines, Christine; South, Paul F; Ané, Jean-Michel; Hamberger, Björn R; Langdale, Jane; Stougaard, Jens; Bouwmeester, Harro; Udvardi, Michael; Murray, James A H; Ntoukakis, Vardis; Schäfer, Patrick; Denby, Katherine; Edwards, Keith J; Osbourn, Anne; Haseloff, Jim
2015-10-01
Inventors in the field of mechanical and electronic engineering can access multitudes of components and, thanks to standardization, parts from different manufacturers can be used in combination with each other. The introduction of BioBrick standards for the assembly of characterized DNA sequences was a landmark in microbial engineering, shaping the field of synthetic biology. Here, we describe a standard for Type IIS restriction endonuclease-mediated assembly, defining a common syntax of 12 fusion sites to enable the facile assembly of eukaryotic transcriptional units. This standard has been developed and agreed by representatives and leaders of the international plant science and synthetic biology communities, including inventors, developers and adopters of Type IIS cloning methods. Our vision is of an extensive catalogue of standardized, characterized DNA parts that will accelerate plant bioengineering. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.
Leveraging standards to support patient-centric interdisciplinary plans of care.
Dykes, Patricia C; DaDamio, Rebecca R; Goldsmith, Denise; Kim, Hyeon-eui; Ohashi, Kumiko; Saba, Virginia K
2011-01-01
As health care systems and providers move towards meaningful use of electronic health records, the once distant vision of collaborative patient-centric, interdisciplinary plans of care, generated and updated across organizations and levels of care, may soon become a reality. Effective care planning is included in the proposed Stages 2-3 Meaningful Use quality measures. To facilitate interoperability, standardization of plan of care messaging, content, information and terminology models are needed. This degree of standardization requires local and national coordination. The purpose of this paper is to review some existing standards that may be leveraged to support development of interdisciplinary patient-centric plans of care. Standards are then applied to a use case to demonstrate one method for achieving patient-centric and interoperable interdisciplinary plan of care documentation. Our pilot work suggests that existing standards provide a foundation for adoption and implementation of patient-centric plans of care that are consistent with federal requirements.
TNSPackage: A Fortran2003 library designed for tensor network state methods
NASA Astrophysics Data System (ADS)
Dong, Shao-Jun; Liu, Wen-Yuan; Wang, Chao; Han, Yongjian; Guo, G.-C.; He, Lixin
2018-07-01
Recently, the tensor network states (TNS) methods have proven to be very powerful tools to investigate the strongly correlated many-particle physics in one and two dimensions. The implementation of TNS methods depends heavily on the operations of tensors, including contraction, permutation, reshaping tensors, SVD and so on. Unfortunately, the most popular computer languages for scientific computation, such as Fortran and C/C++ do not have a standard library for such operations, and therefore make the coding of TNS very tedious. We develop a Fortran2003 package that includes all kinds of basic tensor operations designed for TNS. It is user-friendly and flexible for different forms of TNS, and therefore greatly simplifies the coding work for the TNS methods.
NASA Astrophysics Data System (ADS)
Ortleb, Sigrun; Seidel, Christian
2017-07-01
In this second symposium at the limits of experimental and numerical methods, recent research is presented on practically relevant problems. Presentations discuss experimental investigation as well as numerical methods with a strong focus on application. In addition, problems are identified which require a hybrid experimental-numerical approach. Topics include fast explicit diffusion applied to a geothermal energy storage tank, noise in experimental measurements of electrical quantities, thermal fluid structure interaction, tensegrity structures, experimental and numerical methods for Chladni figures, optimized construction of hydroelectric power stations, experimental and numerical limits in the investigation of rain-wind induced vibrations as well as the application of exponential integrators in a domain-based IMEX setting.
2011-01-01
Background Verbal autopsy methods are critically important for evaluating the leading causes of death in populations without adequate vital registration systems. With a myriad of analytical and data collection approaches, it is essential to create a high quality validation dataset from different populations to evaluate comparative method performance and make recommendations for future verbal autopsy implementation. This study was undertaken to compile a set of strictly defined gold standard deaths for which verbal autopsies were collected to validate the accuracy of different methods of verbal autopsy cause of death assignment. Methods Data collection was implemented in six sites in four countries: Andhra Pradesh, India; Bohol, Philippines; Dar es Salaam, Tanzania; Mexico City, Mexico; Pemba Island, Tanzania; and Uttar Pradesh, India. The Population Health Metrics Research Consortium (PHMRC) developed stringent diagnostic criteria including laboratory, pathology, and medical imaging findings to identify gold standard deaths in health facilities as well as an enhanced verbal autopsy instrument based on World Health Organization (WHO) standards. A cause list was constructed based on the WHO Global Burden of Disease estimates of the leading causes of death, potential to identify unique signs and symptoms, and the likely existence of sufficient medical technology to ascertain gold standard cases. Blinded verbal autopsies were collected on all gold standard deaths. Results Over 12,000 verbal autopsies on deaths with gold standard diagnoses were collected (7,836 adults, 2,075 children, 1,629 neonates, and 1,002 stillbirths). Difficulties in finding sufficient cases to meet gold standard criteria as well as problems with misclassification for certain causes meant that the target list of causes for analysis was reduced to 34 for adults, 21 for children, and 10 for neonates, excluding stillbirths. To ensure strict independence for the validation of methods and assessment of comparative performance, 500 test-train datasets were created from the universe of cases, covering a range of cause-specific compositions. Conclusions This unique, robust validation dataset will allow scholars to evaluate the performance of different verbal autopsy analytic methods as well as instrument design. This dataset can be used to inform the implementation of verbal autopsies to more reliably ascertain cause of death in national health information systems. PMID:21816095
Development of new methodologies for evaluating the energy performance of new commercial buildings
NASA Astrophysics Data System (ADS)
Song, Suwon
The concept of Measurement and Verification (M&V) of a new building continues to become more important because efficient design alone is often not sufficient to deliver an efficient building. Simulation models that are calibrated to measured data can be used to evaluate the energy performance of new buildings if they are compared to energy baselines such as similar buildings, energy codes, and design standards. Unfortunately, there is a lack of detailed M&V methods and analysis methods to measure energy savings from new buildings that would have hypothetical energy baselines. Therefore, this study developed and demonstrated several new methodologies for evaluating the energy performance of new commercial buildings using a case-study building in Austin, Texas. First, three new M&V methods were developed to enhance the previous generic M&V framework for new buildings, including: (1) The development of a method to synthesize weather-normalized cooling energy use from a correlation of Motor Control Center (MCC) electricity use when chilled water use is unavailable, (2) The development of an improved method to analyze measured solar transmittance against incidence angle for sample glazing using different solar sensor types, including Eppley PSP and Li-Cor sensors, and (3) The development of an improved method to analyze chiller efficiency and operation at part-load conditions. Second, three new calibration methods were developed and analyzed, including: (1) A new percentile analysis added to the previous signature method for use with a DOE-2 calibration, (2) A new analysis to account for undocumented exhaust air in DOE-2 calibration, and (3) An analysis of the impact of synthesized direct normal solar radiation using the Erbs correlation on DOE-2 simulation. Third, an analysis of the actual energy savings compared to three different energy baselines was performed, including: (1) Energy Use Index (EUI) comparisons with sub-metered data, (2) New comparisons against Standards 90.1-1989 and 90.1-2001, and (3) A new evaluation of the performance of selected Energy Conservation Design Measures (ECDMs). Finally, potential energy savings were also simulated from selected improvements, including: minimum supply air flow, undocumented exhaust air, and daylighting.
NASA Astrophysics Data System (ADS)
Pfefer, Joshua; Agrawal, Anant
2012-03-01
In recent years there has been increasing interest in development of consensus, tissue-phantom-based approaches for assessment of biophotonic imaging systems, with the primary goal of facilitating clinical translation of novel optical technologies. Well-characterized test methods based on tissue phantoms can provide useful tools for performance assessment, thus enabling standardization and device inter-comparison during preclinical development as well as quality assurance and re-calibration in the clinical setting. In this review, we study the role of phantom-based test methods as described in consensus documents such as international standards for established imaging modalities including X-ray CT, MRI and ultrasound. Specifically, we focus on three image quality characteristics - spatial resolution, spatial measurement accuracy and image uniformity - and summarize the terminology, metrics, phantom design/construction approaches and measurement/analysis procedures used to assess these characteristics. Phantom approaches described are those in routine clinical use and tend to have simplified morphology and biologically-relevant physical parameters. Finally, we discuss the potential for applying knowledge gained from existing consensus documents in the development of standardized, phantom-based test methods for optical coherence tomography.
The FBI compression standard for digitized fingerprint images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brislawn, C.M.; Bradley, J.N.; Onyshczak, R.J.
1996-10-01
The FBI has formulated national standards for digitization and compression of gray-scale fingerprint images. The compression algorithm for the digitized images is based on adaptive uniform scalar quantization of a discrete wavelet transform subband decomposition, a technique referred to as the wavelet/scalar quantization method. The algorithm produces archival-quality images at compression ratios of around 15 to 1 and will allow the current database of paper fingerprint cards to be replaced by digital imagery. A compliance testing program is also being implemented to ensure high standards of image quality and interchangeability of data between different implementations. We will review the currentmore » status of the FBI standard, including the compliance testing process and the details of the first-generation encoder.« less
FBI compression standard for digitized fingerprint images
NASA Astrophysics Data System (ADS)
Brislawn, Christopher M.; Bradley, Jonathan N.; Onyshczak, Remigius J.; Hopper, Thomas
1996-11-01
The FBI has formulated national standards for digitization and compression of gray-scale fingerprint images. The compression algorithm for the digitized images is based on adaptive uniform scalar quantization of a discrete wavelet transform subband decomposition, a technique referred to as the wavelet/scalar quantization method. The algorithm produces archival-quality images at compression ratios of around 15 to 1 and will allow the current database of paper fingerprint cards to be replaced by digital imagery. A compliance testing program is also being implemented to ensure high standards of image quality and interchangeability of data between different implementations. We will review the current status of the FBI standard, including the compliance testing process and the details of the first-generation encoder.
Evaluation of methods for measuring particulate matter emissions from gas turbines.
Petzold, Andreas; Marsh, Richard; Johnson, Mark; Miller, Michael; Sevcenco, Yura; Delhaye, David; Ibrahim, Amir; Williams, Paul; Bauer, Heidi; Crayford, Andrew; Bachalo, William D; Raper, David
2011-04-15
The project SAMPLE evaluated methods for measuring particle properties in the exhaust of aircraft engines with respect to the development of standardized operation procedures for particulate matter measurement in aviation industry. Filter-based off-line mass methods included gravimetry and chemical analysis of carbonaceous species by combustion methods. Online mass methods were based on light absorption measurement or used size distribution measurements obtained from an electrical mobility analyzer approach. Number concentrations were determined using different condensation particle counters (CPC). Total mass from filter-based methods balanced gravimetric mass within 8% error. Carbonaceous matter accounted for 70% of gravimetric mass while the remaining 30% were attributed to hydrated sulfate and noncarbonaceous organic matter fractions. Online methods were closely correlated over the entire range of emission levels studied in the tests. Elemental carbon from combustion methods and black carbon from optical methods deviated by maximum 5% with respect to mass for low to medium emission levels, whereas for high emission levels a systematic deviation between online methods and filter based methods was found which is attributed to sampling effects. CPC based instruments proved highly reproducible for number concentration measurements with a maximum interinstrument standard deviation of 7.5%.
A Novel Field Deployable Point-of-Care Diagnostic Test for Cutaneous Leishmaniasis
2015-10-01
include localized cutaneous leishmaniasis (LCL), and destructive nasal and oropharyngeal lesions of mucosal leishmaniasis (ML). LCL in the New World...the high costs, personnel training and need of sophisticated equipment. Therefore, novel methods to detect leishmaniasis at the POC are urgently needed...To date, there is no field-standardized molecular method based on DNA amplification coupled with Lateral Flow reading to detect leishmaniasis
Soil analysis based on sa,ples withdrawn from different volumes: correlation versus calibration
Lucian Weilopolski; Kurt Johnsen; Yuen Zhang
2010-01-01
Soil, particularly in forests, is replete with spatial variation with respect to soil C. Th e present standard chemical method for soil analysis by dry combustion (DC) is destructive, and comprehensive sampling is labor intensive and time consuming. Th ese, among other factors, are contributing to the development of new methods for soil analysis. Th ese include a near...
Journal of Air Transportation, Volume 12, No. 1
NASA Technical Reports Server (NTRS)
Bowers, Brent D. (Editor); Kabashkin, Igor (Editor)
2007-01-01
Topics discussed include: a) Data Mining Methods Applied to Flight Operations Quality Assurance Data: A Comparison to Standard Statistical Methods; b) Financial Comparisons across Different Business Models in the Canadian Airline Industry; c) Carving a Niche for the "No-Frills" Carrier, Air Arabia, in Oil-Rich Skies; d) Situational Leadership in Air Traffic Control; and e) The Very Light Jet Arrives: Stakeholders and Their Perceptions.
Quantitative determination of atmospheric hydroperoxyl radical
Springston, Stephen R.; Lloyd, Judith; Zheng, Jun
2007-10-23
A method for the quantitative determination of atmospheric hydroperoxyl radical comprising: (a) contacting a liquid phase atmospheric sample with a chemiluminescent compound which luminesces on contact with hydroperoxyl radical; (b) determining luminescence intensity from the liquid phase atmospheric sample; and (c) comparing said luminescence intensity from the liquid phase atmospheric sample to a standard luminescence intensity for hydroperoxyl radical. An apparatus for automating the method is also included.
This compendium includes method summaries provided by the Centers for Disease Control and Prevention/National Center for Environmental Health (CDC/NCEH) for the collection and shipping of blood and urine samples for analysis of metals and volatile organic compounds (VOCs). The pr...
Quantum dynamics of thermalizing systems
NASA Astrophysics Data System (ADS)
White, Christopher David; Zaletel, Michael; Mong, Roger S. K.; Refael, Gil
2018-01-01
We introduce a method "DMT" for approximating density operators of 1D systems that, when combined with a standard framework for time evolution (TEBD), makes possible simulation of the dynamics of strongly thermalizing systems to arbitrary times. We demonstrate that the method performs well for both near-equilibrium initial states (Gibbs states with spatially varying temperatures) and far-from-equilibrium initial states, including quenches across phase transitions and pure states.
An isotope-dilution standard GC/MS/MS method for steroid hormones in water
Foreman, William T.; Gray, James L.; ReVello, Rhiannon C.; Lindley, Chris E.; Losche, Scott A.
2013-01-01
An isotope-dilution quantification method was developed for 20 natural and synthetic steroid hormones and additional compounds in filtered and unfiltered water. Deuterium- or carbon-13-labeled isotope-dilution standards (IDSs) are added to the water sample, which is passed through an octadecylsilyl solid-phase extraction (SPE) disk. Following extract cleanup using Florisil SPE, method compounds are converted to trimethylsilyl derivatives and analyzed by gas chromatography with tandem mass spectrometry. Validation matrices included reagent water, wastewater-affected surface water, and primary (no biological treatment) and secondary wastewater effluent. Overall method recovery for all analytes in these matrices averaged 100%; with overall relative standard deviation of 28%. Mean recoveries of the 20 individual analytes for spiked reagent-water samples prepared along with field samples analyzed in 2009–2010 ranged from 84–104%, with relative standard deviations of 6–36%. Detection levels estimated using ASTM International’s D6091–07 procedure range from 0.4 to 4 ng/L for 17 analytes. Higher censoring levels of 100 ng/L for bisphenol A and 200 ng/L for cholesterol and 3-beta-coprostanol are used to prevent bias and false positives associated with the presence of these analytes in blanks. Absolute method recoveries of the IDSs provide sample-specific performance information and guide data reporting. Careful selection of labeled compounds for use as IDSs is important because both inexact IDS-analyte matches and deuterium label loss affect an IDS’s ability to emulate analyte performance. Six IDS compounds initially tested and applied in this method exhibited deuterium loss and are not used in the final method.
Vo Duy, S; Besteiro, S; Berry, L; Perigaud, C; Bressolle, F; Vial, H J; Lefebvre-Tournier, I
2012-08-20
Plasmodium falciparum is the causative agent of malaria, a deadly infectious disease for which treatments are scarce and drug-resistant parasites are now increasingly found. A comprehensive method of identifying and quantifying metabolites of this intracellular parasite could expand the arsenal of tools to understand its biology, and be used to develop new treatments against the disease. Here, we present two methods based on liquid chromatography tandem mass spectrometry for reliable measurement of water-soluble metabolites involved in phospholipid biosynthesis, as well as several other metabolites that reflect the metabolic status of the parasite including amino acids, carboxylic acids, energy-related carbohydrates, and nucleotides. A total of 35 compounds was quantified. In the first method, polar compounds were retained by hydrophilic interaction chromatography (amino column) and detected in negative mode using succinic acid-(13)C(4) and fluorovaline as internal standards. In the second method, separations were carried out using reverse phase (C18) ion-pair liquid chromatography, with heptafluorobutyric acid as a volatile ion pairing reagent in positive detection mode, using d(9)-choline and 4-aminobutanol as internal standards. Standard curves were performed in P. falciparum-infected and uninfected red blood cells using standard addition method (r(2)>0.99). The intra- and inter-day accuracy and precision as well as the extraction recovery of each compound were determined. The lower limit of quantitation varied from 50pmol to 100fmol/3×10(7)cells. These methods were validated and successfully applied to determine intracellular concentrations of metabolites from uninfected host RBCs and isolated Plasmodium parasites. Copyright © 2012 Elsevier B.V. All rights reserved.
Käser, T; Pasternak, J A; Hamonic, G; Rieder, M; Lai, K; Delgado-Ortega, M; Gerdts, V; Meurens, F
2016-05-01
Chlamydiaceae is a family of intracellular bacteria causing a range of diverse pathological outcomes. The most devastating human diseases are ocular infections with C. trachomatis leading to blindness and genital infections causing pelvic inflammatory disease with long-term sequelae including infertility and chronic pelvic pain. In order to enable the comparison of experiments between laboratories investigating host-chlamydia interactions, the infectious titer has to be determined. Titer determination of chlamydia is most commonly performed via microscopy of host cells infected with a serial dilution of chlamydia. However, other methods including fluorescent ELISpot (Fluorospot) and DNA Chip Scanning Technology have also been proposed to enumerate chlamydia-infected cells. For viruses, flow cytometry has been suggested as a superior alternative to standard titration methods. In this study we compared the use of flow cytometry with microscopy and Fluorospot for the titration of C. suis as a representative of other intracellular bacteria. Titer determination via Fluorospot was unreliable, while titration via microscopy led to a linear read-out range of 16 - 64 dilutions and moderate reproducibility with acceptable standard deviations within and between investigators. In contrast, flow cytometry had a vast linear read-out range of 1,024 dilutions and the lowest standard deviations given a basic training in these methods. In addition, flow cytometry was faster and material costs were lower compared to microscopy. Flow cytometry offers a fast, cheap, precise, and reproducible alternative for the titration of intracellular bacteria like C. suis. © 2016 International Society for Advancement of Cytometry. © 2016 International Society for Advancement of Cytometry.
Barik, Mayadhar; Bajpai, Minu; Patnaik, Santosh; Mishra, Pravash; Behera, Priyamadhaba; Dwivedi, Sada Nanda
2016-01-01
Background: Cryopreservation is basically related to meritorious thin samples or small clumps of cells that are cooled quickly without loss. Our main objective is to establish and formulate an innovative method and protocol development for cryopreservation as a gold standard for clinical uses in laboratory practice and treatment. The knowledge regarding usefulness of cryopreservation in clinical practice is essential to carry forward the clinical practice and research. Materials and Methods: We are trying to compare different methods of cryopreservation (in two dozen of cells) at the same time we compare the embryo and oocyte freezing interms of fertilization rate according to the International standard protocol. Results: The combination of cryoprotectants and regimes of rapid cooling and rinsing during warming often allows successful cryopreservation of biological materials, particularly cell suspensions or thin tissue samples. Examples include semen, blood, tissue samples like tumors, histological cross-sections, human eggs and human embryos. Although presently many studies have reported that the children born from frozen embryos or “frosties,” show consistently positive results with no increase in birth defects or development abnormalities is quite good enough and similar to our study (50–85%). Conclusions: We ensure that cryopreservation technology provided useful cell survivability, tissue and organ preservation in a proper way. Although it varies according to different laboratory conditions, it is certainly beneficial for patient's treatment and research. Further studies are needed for standardization and development of new protocol. PMID:27512686
Sixty-five years since the New York heat wave: advances in sweat testing for cystic fibrosis.
Collie, Jake T B; Massie, R John; Jones, Oliver A H; LeGrys, Vicky A; Greaves, Ronda F
2014-02-01
The sweat test remains important as a diagnostic test for cystic fibrosis (CF) and has contributed greatly to our understanding of CF as a disease of epithelial electrolyte transport. The standardization of the sweat test, by Gibson and Cooke [Gibson and Cooke (1959) Pediatrics 1959;23:5], followed observations of excessive dehydration amongst patients with CF and confirmed the utility as a diagnostic test. Quantitative pilocarpine iontophoresis remains the gold standard for sweat induction, but there are a number of collection and analytical methods. The pathophysiology of electrolyte transport in sweat was described by Quinton [Quinton (1983) Nature 1983;301:421-422], and this complemented the developments in genetics that discovered the cystic fibrosis transmembrane conductance regulator (CFTR), an epithelial-based electrolyte transport protein. Knowledge of CF has since increased rapidly and further developments in sweat testing include: new collection methods, further standardization of the technique with international recommendations and age related reference intervals. More recently, sweat chloride values have been used as proof of effect for the new drugs that activate CFTR. However, there remain issues with adherence to sweat test guidelines in many countries and there are gaps in our knowledge, including reference intervals for some age groups and stability of sweat samples in transport. Furthermore, modern methods of elemental quantification need to be explored as alternatives to the original analytical methods for sweat electrolyte measurement. The purpose of this review is therefore to describe the development of the sweat test and consider future directions. © 2013 Wiley Periodicals, Inc.
Laurin, E; Thakur, K K; Gardner, I A; Hick, P; Moody, N J G; Crane, M S J; Ernst, I
2018-05-01
Design and reporting quality of diagnostic accuracy studies (DAS) are important metrics for assessing utility of tests used in animal and human health. Following standards for designing DAS will assist in appropriate test selection for specific testing purposes and minimize the risk of reporting biased sensitivity and specificity estimates. To examine the benefits of recommending standards, design information from published DAS literature was assessed for 10 finfish, seven mollusc, nine crustacean and two amphibian diseases listed in the 2017 OIE Manual of Diagnostic Tests for Aquatic Animals. Of the 56 DAS identified, 41 were based on field testing, eight on experimental challenge studies and seven on both. Also, we adapted human and terrestrial-animal standards and guidelines for DAS structure for use in aquatic animal diagnostic research. Through this process, we identified and addressed important metrics for consideration at the design phase: study purpose, targeted disease state, selection of appropriate samples and specimens, laboratory analytical methods, statistical methods and data interpretation. These recommended design standards for DAS are presented as a checklist including risk-of-failure points and actions to mitigate bias at each critical step. Adherence to standards when designing DAS will also facilitate future systematic review and meta-analyses of DAS research literature. © 2018 John Wiley & Sons Ltd.
Keller, Nicole S; Stefánsson, Andri; Sigfússon, Bergur
2014-10-01
A method for the analysis of arsenic species in aqueous sulfide samples is presented. The method uses an ion chromatography system connected with a Hydride-Generation Atomic Fluorescence Spectrometer (IC-HG-AFS). With this method inorganic As(III) and As(V) species in water samples can be analyzed, including arsenite (HnAs(III)O3(n-3)), thioarsenite (HnAs(III)S3(n-3)), arsenate (HnAs(V)O4(n-3)), monothioarsenate (HnAs(V)SO3(n-3)), dithioarsenate (HnAs(V)S2O2(n-3)), trithioarsenate (HnAs(V)S3O(n-3)) and tetrathioarsenate (HnAs(V)S4(n-3)). The peak identification and retention times were determined based on standard analysis of the various arsenic compounds. The analytical detection limit was ~1-3 µg L(-1) (LOD), depending on the quality of the baseline. This low detection limit makes this method also applicable to discriminate between waters meeting the drinking water standard of max. 10 µg L(-1) As, and waters that do not meet this standard. The new method was successfully applied for on-site determination of arsenic species in natural sulfidic waters, in which seven species were unambiguously identified. Copyright © 2014 Elsevier B.V. All rights reserved.
Cimetiere, Nicolas; Soutrel, Isabelle; Lemasle, Marguerite; Laplanche, Alain; Crocq, André
2013-01-01
The study of the occurrence and fate of pharmaceutical compounds in drinking or waste water processes has become very popular in recent years. Liquid chromatography with tandem mass spectrometry is a powerful analytical tool often used to determine pharmaceutical residues at trace level in water. However, many steps may disrupt the analytical procedure and bias the results. A list of 27 environmentally relevant molecules, including various therapeutic classes and (cardiovascular, veterinary and human antibiotics, neuroleptics, non-steroidal anti-inflammatory drugs, hormones and other miscellaneous pharmaceutical compounds), was selected. In this work, a method was developed using ultra performance liquid chromatography coupled to tandem mass spectrometry (UPLC-MS/MS) and solid-phase extraction to determine the concentration of the 27 targeted pharmaceutical compounds at the nanogram per litre level. The matrix effect was evaluated from water sampled at different treatment stages. Conventional methods with external calibration and internal standard correction were compared with the standard addition method (SAM). An accurate determination of pharmaceutical compounds in drinking water was obtained by the SAM associated with UPLC-MS/MS. The developed method was used to evaluate the occurrence and fate of pharmaceutical compounds in some drinking water treatment plants in the west of France.
BASIC: A Simple and Accurate Modular DNA Assembly Method.
Storch, Marko; Casini, Arturo; Mackrow, Ben; Ellis, Tom; Baldwin, Geoff S
2017-01-01
Biopart Assembly Standard for Idempotent Cloning (BASIC) is a simple, accurate, and robust DNA assembly method. The method is based on linker-mediated DNA assembly and provides highly accurate DNA assembly with 99 % correct assemblies for four parts and 90 % correct assemblies for seven parts [1]. The BASIC standard defines a single entry vector for all parts flanked by the same prefix and suffix sequences and its idempotent nature means that the assembled construct is returned in the same format. Once a part has been adapted into the BASIC format it can be placed at any position within a BASIC assembly without the need for reformatting. This allows laboratories to grow comprehensive and universal part libraries and to share them efficiently. The modularity within the BASIC framework is further extended by the possibility of encoding ribosomal binding sites (RBS) and peptide linker sequences directly on the linkers used for assembly. This makes BASIC a highly versatile library construction method for combinatorial part assembly including the construction of promoter, RBS, gene variant, and protein-tag libraries. In comparison with other DNA assembly standards and methods, BASIC offers a simple robust protocol; it relies on a single entry vector, provides for easy hierarchical assembly, and is highly accurate for up to seven parts per assembly round [2].
Li, Li; Liu, Dong-Jun
2014-01-01
Since 2012, China has been facing haze-fog weather conditions, and haze-fog pollution and PM2.5 have become hot topics. It is very necessary to evaluate and analyze the ecological status of the air environment of China, which is of great significance for environmental protection measures. In this study the current situation of haze-fog pollution in China was analyzed first, and the new Ambient Air Quality Standards were introduced. For the issue of air quality evaluation, a comprehensive evaluation model based on an entropy weighting method and nearest neighbor method was developed. The entropy weighting method was used to determine the weights of indicators, and the nearest neighbor method was utilized to evaluate the air quality levels. Then the comprehensive evaluation model was applied into the practical evaluation problems of air quality in Beijing to analyze the haze-fog pollution. Two simulation experiments were implemented in this study. One experiment included the indicator of PM2.5 and was carried out based on the new Ambient Air Quality Standards (GB 3095-2012); the other experiment excluded PM2.5 and was carried out based on the old Ambient Air Quality Standards (GB 3095-1996). Their results were compared, and the simulation results showed that PM2.5 was an important indicator for air quality and the evaluation results of the new Air Quality Standards were more scientific than the old ones. The haze-fog pollution situation in Beijing City was also analyzed based on these results, and the corresponding management measures were suggested. PMID:25170682
Analysis of street drugs in seized material without primary reference standards.
Laks, Suvi; Pelander, Anna; Vuori, Erkki; Ali-Tolppa, Elisa; Sippola, Erkki; Ojanperä, Ilkka
2004-12-15
A novel approach was used to analyze street drugs in seized material without primary reference standards. Identification was performed by liquid chromatography/time-of-flight mass spectrometry (LC/TOFMS), essentially based on accurate mass determination using a target library of 735 exact monoisotopic masses. Quantification was carried out by liquid chromatography/chemiluminescence nitrogen detection (LC/CLND) with a single secondary standard (caffeine), utilizing the detector's equimolar response to nitrogen. Sample preparation comprised dilution, first with methanol and further with the LC mobile phase. Altogether 21 seized drug samples were analyzed blind by the present method, and results were compared to accredited reference methods utilizing identification by gas chromatography/mass spectrometry and quantification by gas chromatography or liquid chromatography. The 31 drug findings by LC/TOFMS comprised 19 different drugs-of-abuse, byproducts, and adulterants, including amphetamine and tryptamine designer drugs, with one unresolved pair of compounds having an identical mass. By the reference methods, 27 findings could be confirmed, and among the four unconfirmed findings, only 1 apparent false positive was found. In the quantitative analysis of 11 amphetamine, heroin, and cocaine findings, mean relative difference between the results of LC/CLND and the reference methods was 11% (range 4.2-21%), without any observable bias. Mean relative standard deviation for three parallel LC/CLND results was 6%. Results suggest that the present combination of LC/TOFMS and LC/CLND offers a simple solution for the analysis of scheduled and designer drugs in seized material, independent of the availability of primary reference standards.
The role of ultrasound guidance in pediatric caudal block
Erbüyün, Koray; Açıkgöz, Barış; Ok, Gülay; Yılmaz, Ömer; Temeltaş, Gökhan; Tekin, İdil; Tok, Demet
2016-01-01
Objectives: To compare the time interval of the procedure, possible complications, post-operative pain levels, additional analgesics, and nurse satisfaction in ultrasonography-guided and standard caudal block applications. Methods: This retrospective study was conducted in Celal Bayar University Hospital, Manisa, Turkey, between January and December 2014, included 78 pediatric patients. Caudal block was applied to 2 different groups; one with ultrasound guide, and the other using the standard method. Results: The time interval of the procedure was significantly shorter in the standard application group compared with ultrasound-guided group (p=0.020). Wong-Baker FACES Pain Rating Scale values obtained at the 90th minute was statistically lower in the standard application group compared with ultrasound-guided group (p=0.035). No statistically significant difference was found on the other parameters between the 2 groups. The shorter time interval of the procedure at standard application group should not be considered as a distinctive mark by the pediatric anesthesiologists, because this time difference was as short as seconds. Conclusion: Ultrasound guidance for caudal block applications would neither increase nor decrease the success of the treatment. However, ultrasound guidance should be needed in cases where the detection of sacral anatomy is difficult, especially by palpations. PMID:26837396
Jiang, Ting-Fu; Lv, Zhi-Hua; Wang, Yuan-Hong; Yue, Mei-E
2006-06-01
A new, simple and rapid capillary electrophoresis (CE) method, using hexadimethrine bromide (HDB) as electroosmotic flow (EOF) modifier, was developed for the identification and quantitative determination of four plant hormones, including gibberellin A3 (GA3), indole-3-acetic acid (IAA), alpha-naphthaleneacetic acid (NAA) and 4-chlorophenoxyacetic acid (4-CA). The optimum separation was achieved with 20 mM borate buffer at pH 10.00 containing 0.005% (w/v) of HDB. The applied voltage was -25 kV and the capillary temperature was kept constant at 25 degrees C. Salicylic acid was used as internal standard for quantification. The calibration dependencies exhibited good linearity within the ratios of the concentrations of standard samples and internal standard and the ratios of the peak areas of samples and internal standard. The correlation coefficients were from 0.9952 to 0.9997. The relative standard deviations of migration times and peak areas were < 1.93 and 6.84%, respectively. The effects of buffer pH, the concentration of HDB and the voltage on the resolution were studied systematically. By this method, the contents of plant hormone in biofertilizer were successfully determined within 7 min, with satisfactory repeatability and recovery.
Phinney, Karen W; Sempos, Christopher T; Tai, Susan S-C; Camara, Johanna E; Wise, Stephen A; Eckfeldt, John H; Hoofnagle, Andrew N; Carter, Graham D; Jones, Julia; Myers, Gary L; Durazo-Arvizu, Ramon; Miller, W Greg; Bachmann, Lorin M; Young, Ian S; Pettit, Juanita; Caldwell, Grahame; Liu, Andrew; Brooks, Stephen P J; Sarafin, Kurtis; Thamm, Michael; Mensink, Gert B M; Busch, Markus; Rabenberg, Martina; Cashman, Kevin D; Kiely, Mairead; Galvin, Karen; Zhang, Joy Y; Kinsella, Michael; Oh, Kyungwon; Lee, Sun-Wha; Jung, Chae L; Cox, Lorna; Goldberg, Gail; Guberg, Kate; Meadows, Sarah; Prentice, Ann; Tian, Lu; Brannon, Patsy M; Lucas, Robyn M; Crump, Peter M; Cavalier, Etienne; Merkel, Joyce; Betz, Joseph M
2017-09-01
The Vitamin D Standardization Program (VDSP) coordinated a study in 2012 to assess the commutability of reference materials and proficiency testing/external quality assurance materials for total 25-hydroxyvitamin D [25(OH)D] in human serum, the primary indicator of vitamin D status. A set of 50 single-donor serum samples as well as 17 reference and proficiency testing/external quality assessment materials were analyzed by participating laboratories that used either immunoassay or LC-MS methods for total 25(OH)D. The commutability test materials included National Institute of Standards and Technology Standard Reference Material 972a Vitamin D Metabolites in Human Serum as well as materials from the College of American Pathologists and the Vitamin D External Quality Assessment Scheme. Study protocols and data analysis procedures were in accordance with Clinical and Laboratory Standards Institute guidelines. The majority of the test materials were found to be commutable with the methods used in this commutability study. These results provide guidance for laboratories needing to choose appropriate reference materials and select proficiency or external quality assessment programs and will serve as a foundation for additional VDSP studies.
Lin, Kai; Collins, Jeremy D; Lloyd-Jones, Donald M; Jolly, Marie-Pierre; Li, Debiao; Markl, Michael; Carr, James C
2016-03-01
To assess the performance of automated quantification of left ventricular function and mass based on heart deformation analysis (HDA) in asymptomatic older adults. This study complied with Health Insurance Portability and Accountability Act regulations. Following the approval of the institutional review board, 160 asymptomatic older participants were recruited for cardiac magnetic resonance imaging including two-dimensional cine images covering the entire left ventricle in short-axis view. Data analysis included the calculation of left ventricular ejection fraction (LVEF), left ventricular mass (LVM), and cardiac output (CO) using HDA and standard global cardiac function analysis (delineation of end-systolic and end-diastolic left ventricle epi- and endocardial borders). The agreement between methods was evaluated using intraclass correlation coefficient (ICC) and coefficient of variation (CoV). HDA had a shorter processing time than the standard method (1.5 ± 0.3 min/case vs. 5.8 ± 1.4 min/case, P < 0.001). There was good agreement for LVEF (ICC = 0.552, CoV = 10.5%), CO (ICC = 0.773, CoV = 13.5%), and LVM (ICC = 0.859, CoV = 14.5%) acquired with the standard method and HDA. There was a systemic bias toward lower LVEF (62.8% ± 8.3% vs. 69.3% ± 6.7%, P < 0.001) and CO (4.4 ± 1.0 L/min vs. 4.8 ± 1.3 L/min, P < 0.001) by HDA compared to the standard technique. Conversely, HDA overestimated LVM (114.8 ± 30.1 g vs. 100.2 ± 29.0 g, P < 0.001) as compared to the reference method. HDA has the potential to measure LVEF, CO, and LVM without the need for user interaction based on standard cardiac two-dimensional cine images. Copyright © 2015 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Cedergren, A
1974-06-01
A rapid and sensitive method using true potentiometric end-point detection has been developed and compared with the conventional amperometric method for Karl Fischer determination of water. The effect of the sulphur dioxide concentration on the shape of the titration curve is shown. By using kinetic data it was possible to calculate the course of titrations and make comparisons with those found experimentally. The results prove that the main reaction is the slow step, both in the amperometric and the potentiometric method. Results obtained in the standardization of the Karl Fischer reagent showed that the potentiometric method, including titration to a preselected potential, gave a standard deviation of 0.001(1) mg of water per ml, the amperometric method using extrapolation 0.002(4) mg of water per ml and the amperometric titration to a pre-selected diffusion current 0.004(7) mg of water per ml. Theories and results dealing with dilution effects are presented. The time of analysis was 1-1.5 min for the potentiometric and 4-5 min for the amperometric method using extrapolation.
King, B
2001-11-01
The new laboratory accreditation standard, ISO/IEC 17025, reflects current thinking on good measurement practice by requiring more explicit and more demanding attention to a number of activities. These include client interactions, method validation, traceability, and measurement uncertainty. Since the publication of the standard in 1999 there has been extensive debate about its interpretation. It is the author's view that if good quality practices are already in place and if the new requirements are introduced in a manner that is fit for purpose, the additional work required to comply with the new requirements can be expected to be modest. The paper argues that the rigour required in addressing the issues should be driven by customer requirements and the factors that need to be considered in this regard are discussed. The issues addressed include the benefits, interim arrangements, specifying the analytical requirement, establishing traceability, evaluating the uncertainty and reporting the information.
Jack, Clifford R; Barkhof, Frederik; Bernstein, Matt A; Cantillon, Marc; Cole, Patricia E; DeCarli, Charles; Dubois, Bruno; Duchesne, Simon; Fox, Nick C; Frisoni, Giovanni B; Hampel, Harald; Hill, Derek LG; Johnson, Keith; Mangin, Jean-François; Scheltens, Philip; Schwarz, Adam J; Sperling, Reisa; Suhy, Joyce; Thompson, Paul M; Weiner, Michael; Foster, Norman L
2012-01-01
Background The promise of Alzheimer’s disease (AD) biomarkers has led to their incorporation in new diagnostic criteria and in therapeutic trials; however, significant barriers exist to widespread use. Chief among these is the lack of internationally accepted standards for quantitative metrics. Hippocampal volumetry is the most widely studied quantitative magnetic resonance imaging (MRI) measure in AD and thus represents the most rational target for an initial effort at standardization. Methods and Results The authors of this position paper propose a path toward this goal. The steps include: 1) Establish and empower an oversight board to manage and assess the effort, 2) Adopt the standardized definition of anatomic hippocampal boundaries on MRI arising from the EADC-ADNI hippocampal harmonization effort as a Reference Standard, 3) Establish a scientifically appropriate, publicly available Reference Standard Dataset based on manual delineation of the hippocampus in an appropriate sample of subjects (ADNI), and 4) Define minimum technical and prognostic performance metrics for validation of new measurement techniques using the Reference Standard Dataset as a benchmark. Conclusions Although manual delineation of the hippocampus is the best available reference standard, practical application of hippocampal volumetry will require automated methods. Our intent is to establish a mechanism for credentialing automated software applications to achieve internationally recognized accuracy and prognostic performance standards that lead to the systematic evaluation and then widespread acceptance and use of hippocampal volumetry. The standardization and assay validation process outlined for hippocampal volumetry is envisioned as a template that could be applied to other imaging biomarkers. PMID:21784356
Closed-form confidence intervals for functions of the normal mean and standard deviation.
Donner, Allan; Zou, G Y
2012-08-01
Confidence interval methods for a normal mean and standard deviation are well known and simple to apply. However, the same cannot be said for important functions of these parameters. These functions include the normal distribution percentiles, the Bland-Altman limits of agreement, the coefficient of variation and Cohen's effect size. We present a simple approach to this problem by using variance estimates recovered from confidence limits computed for the mean and standard deviation separately. All resulting confidence intervals have closed forms. Simulation results demonstrate that this approach performs very well for limits of agreement, coefficients of variation and their differences.
Making the connection: the VA-Regenstrief project.
Martin, D K
1992-01-01
The Regenstrief Automated Medical Record System is a well-established clinical information system with powerful facilities for querying and decision support. My colleagues and I introduced this system into the Indianapolis Veterans Affairs (VA) Medical Center by interfacing it to the institution's automated data-processing system, the Decentralized Hospital Computer Program (DHCP), using a recently standardized method for clinical data interchange. This article discusses some of the challenges encountered in that process, including the translation of vocabulary terms and maintenance of the software interface. Efforts such as these demonstrate the importance of standardization in medical informatics and the need for data standards at all levels of information exchange.
Recent Applications of Higher-Order Spectral Analysis to Nonlinear Aeroelastic Phenomena
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Hajj, Muhammad R.; Dunn, Shane; Strganac, Thomas W.; Powers, Edward J.; Stearman, Ronald
2005-01-01
Recent applications of higher-order spectral (HOS) methods to nonlinear aeroelastic phenomena are presented. Applications include the analysis of data from a simulated nonlinear pitch and plunge apparatus and from F-18 flight flutter tests. A MATLAB model of the Texas A&MUniversity s Nonlinear Aeroelastic Testbed Apparatus (NATA) is used to generate aeroelastic transients at various conditions including limit cycle oscillations (LCO). The Gaussian or non-Gaussian nature of the transients is investigated, related to HOS methods, and used to identify levels of increasing nonlinear aeroelastic response. Royal Australian Air Force (RAAF) F/A-18 flight flutter test data is presented and analyzed. The data includes high-quality measurements of forced responses and LCO phenomena. Standard power spectral density (PSD) techniques and HOS methods are applied to the data and presented. The goal of this research is to develop methods that can identify the onset of nonlinear aeroelastic phenomena, such as LCO, during flutter testing.
A Nanocoaxial-Based Electrochemical Sensor for the Detection of Cholera Toxin
NASA Astrophysics Data System (ADS)
Archibald, Michelle M.; Rizal, Binod; Connolly, Timothy; Burns, Michael J.; Naughton, Michael J.; Chiles, Thomas C.
2015-03-01
Sensitive, real-time detection of biomarkers is of critical importance for rapid and accurate diagnosis of disease for point of care (POC) technologies. Current methods do not allow for POC applications due to several limitations, including sophisticated instrumentation, high reagent consumption, limited multiplexing capability, and cost. Here, we report a nanocoaxial-based electrochemical sensor for the detection of bacterial toxins using an electrochemical enzyme-linked immunosorbent assay (ELISA) and differential pulse voltammetry (DPV). Proof-of-concept was demonstrated for the detection of cholera toxin (CT). The linear dynamic range of detection was 10 ng/ml - 1 μg/ml, and the limit of detection (LOD) was found to be 2 ng/ml. This level of sensitivity is comparable to the standard optical ELISA used widely in clinical applications. In addition to matching the detection profile of the standard ELISA, the nanocoaxial array provides a simple electrochemical readout and a miniaturized platform with multiplexing capabilities for the simultaneous detection of multiple biomarkers, giving the nanocoax a desirable advantage over the standard method towards POC applications. Sensitive, real-time detection of biomarkers is of critical importance for rapid and accurate diagnosis of disease for point of care (POC) technologies. Current methods do not allow for POC applications due to several limitations, including sophisticated instrumentation, high reagent consumption, limited multiplexing capability, and cost. Here, we report a nanocoaxial-based electrochemical sensor for the detection of bacterial toxins using an electrochemical enzyme-linked immunosorbent assay (ELISA) and differential pulse voltammetry (DPV). Proof-of-concept was demonstrated for the detection of cholera toxin (CT). The linear dynamic range of detection was 10 ng/ml - 1 μg/ml, and the limit of detection (LOD) was found to be 2 ng/ml. This level of sensitivity is comparable to the standard optical ELISA used widely in clinical applications. In addition to matching the detection profile of the standard ELISA, the nanocoaxial array provides a simple electrochemical readout and a miniaturized platform with multiplexing capabilities for the simultaneous detection of multiple biomarkers, giving the nanocoax a desirable advantage over the standard method towards POC applications. This work was supported by the National Institutes of Health (National Cancer Institute award No. CA137681 and National Institute of Allergy and Infectious Diseases Award No. AI100216).
Related Rates and the Speed of Light.
ERIC Educational Resources Information Center
Althoen, S. C.; Weidner, J. F.
1985-01-01
Standard calculus textbooks often include a related rates problem involving light cast onto a straight line by a revolving light source. Mathematical aspects to these problems (both in the solution and in the method by which that solution is obtained) are examined. (JN)
German 1990: Intensive and Diverse.
ERIC Educational Resources Information Center
Kempf, Franz R.
1990-01-01
Suggests the following methods for improving university-level German language education curricula: reduce skill-development time through immersion programs, adopt Zertifikat Deutsch als Fremdsprache as the proficiency standard for advanced students, and include literature from a variety of liberal arts disciplines in the reading materials for…
IMPROVED METHOD FOR THE STORAGE OF GROUND WATER SAMPLES CONTAINING VOLATILE ORGANIC ANALYTES
The sorption of volatile organic analytes from water samples by the Teflon septum surface used with standard glass 40-ml sample collection vials was investigated. Analytes tested included alkanes, isoalkanes, olefins, cycloalkanes, a cycloalkene, monoaromatics, a polynuclear arom...
Getting Real: Implementing Assessment Alternatives in Mathematics.
ERIC Educational Resources Information Center
Hopkins, Martha H.
1997-01-01
Recounts experiences of a university professor who returned to the elementary classroom and attempted to implement the National Council of Teachers of Mathematics Standards and appropriate assessment methods, including nontraditional paper-and-pencil tasks, journal-like writing assignments, focused observations, and performance-based assessments…
THE ROLE OF OBSERVATION AND FEEDBACK IN ENHANCING PERFORMANCE WITH MEDICATION ADMINISTRATION.
Davies, Karen; Mitchell, Charles; Coombes, Ian
2015-12-01
Legislation in Queensland such as the Health (Drugs and Poisons) Regulation 1996, the national registration competency standards set by the Nursing and Midwifery Board of Australia, and the Continuing Professional Development Registration Standards made pursuant to the Health Practitioner Regulation National Law define expected standards of practice for nurses. The Framework for Assessing Standards for Practice for Registered Nurses, Enrolled Nurses and Midwives, released in July 2015, includes the principles for assessing standards but not the methods. Local policies and procedures offer specific requirements founded on evidence-based practice. Observation of clinical practice with the provision of immediate descriptive feedback to individual practitioners has been associated with improved performance. This column describes the role of regular observation and individual feedback on medication administration as a strategy to enhance performance and patient care.
Flywheel Rotor Safe-Life Technology
NASA Technical Reports Server (NTRS)
Ratner, J. K. H.; Chang, J. B.; Christopher, D. A.; McLallin, Kerry L. (Technical Monitor)
2002-01-01
Since the 1960s, research has been conducted into the use of flywheels as energy storage systems. The-proposed applications include energy storage for hybrid and electric automobiles, attitude control and energy storage for satellites, and uninterruptible power supplies for hospitals and computer centers. For many years, however, the use of flywheels for space applications was restricted by the total weight of a system employing a metal rotor. With recent technological advances in the manufacturing of composite materials, however, lightweight composite rotors have begun to be proposed for such applications. Flywheels with composite rotors provide much higher power and energy storage capabilities than conventional chemical batteries. However, the failure of a high speed flywheel rotor could be a catastrophic event. For this reason, flywheel rotors are classified by the NASA Fracture Control Requirements Standard as fracture critical parts. Currently, there is no industry standard to certify a composite rotor for safe and reliable operation forth( required lifetime of the flywheel. Technical problems hindering the development of this standard include composite manufacturing inconsistencies, insufficient nondestructive evaluation (NDE) techniques for detecting defects and/or impact damage, lack of standard material test methods for characterizing composite rotor design allowables, and no unified proof (over-spin) test for flight rotors. As part of a flywheel rotor safe-life certification pro-ram funded b the government, a review of the state of the art in composite rotors is in progress. The goal of the review is to provide a clear picture of composite flywheel rotor technologies. The literature review has concentrated on the following topics concerning composites and composite rotors: durability (fatigue) and damage tolerance (safe-life) analysis/test methods, in-service NDE and health monitoring techniques, spin test methods/ procedures, and containment options. This report presents the papers selected for their relevance to this topic and summarizes them.
Measurement of Energy Performances for General-Structured Servers
NASA Astrophysics Data System (ADS)
Liu, Ren; Chen, Lili; Li, Pengcheng; Liu, Meng; Chen, Haihong
2017-11-01
Energy consumption of servers in data centers increases rapidly along with the wide application of Internet and connected devices. To improve the energy efficiency of servers, voluntary or mandatory energy efficiency programs for servers, including voluntary label program or mandatory energy performance standards have been adopted or being prepared in the US, EU and China. However, the energy performance of servers and testing methods of servers are not well defined. This paper presents matrices to measure the energy performances of general-structured servers. The impacts of various components of servers on their energy performances are also analyzed. Based on a set of normalized workload, the author proposes a standard method for testing energy efficiency of servers. Pilot tests are conducted to assess the energy performance testing methods of servers. The findings of the tests are discussed in the paper.
Extended Glauert tip correction to include vortex rollup effects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maniaci, David; Schmitz, Sven
Wind turbine loads predictions by blade-element momentum theory using the standard tip-loss correction have been shown to over-predict loading near the blade tip in comparison to experimental data. This over-prediction is theorized to be due to the assumption of light rotor loading, inherent in the standard tip-loss correction model of Glauert. A higher- order free-wake method, WindDVE, is used to compute the rollup process of the trailing vortex sheets downstream of wind turbine blades. Results obtained serve an exact correction function to the Glauert tip correction used in blade-element momentum methods. Lastly, it is found that accounting for the effectsmore » of tip vortex rollup within the Glauert tip correction indeed results in improved prediction of blade tip loads computed by blade-element momentum methods.« less
Extended Glauert tip correction to include vortex rollup effects
Maniaci, David; Schmitz, Sven
2016-10-03
Wind turbine loads predictions by blade-element momentum theory using the standard tip-loss correction have been shown to over-predict loading near the blade tip in comparison to experimental data. This over-prediction is theorized to be due to the assumption of light rotor loading, inherent in the standard tip-loss correction model of Glauert. A higher- order free-wake method, WindDVE, is used to compute the rollup process of the trailing vortex sheets downstream of wind turbine blades. Results obtained serve an exact correction function to the Glauert tip correction used in blade-element momentum methods. Lastly, it is found that accounting for the effectsmore » of tip vortex rollup within the Glauert tip correction indeed results in improved prediction of blade tip loads computed by blade-element momentum methods.« less
Selvi, Emine Kılıçkaya; Şahin, Uğur; Şahan, Serkan
2017-01-01
This method was developed for the determination of trace amounts of aluminum(III) in dialysis concentrates using atomic absorption spectrometry after coprecipitation with lanthanum phosphate. The analytical parameters that influenced the quantitative coprecipitation of analyte including amount of lanthanum, amount of phosfate, pH and duration time were optimized. The % recoveries of the analyte ion were in the range of 95-105 % with limit of detection (3s) of 0.5 µg l -1 . Preconcentration factor was found as 1000 and Relative Standard Deviation (RSD) % value obtained from model solutions was 2.5% for 0.02 mg L -1 . The accuracy of the method was evaluated with standard reference material (CWW-TMD Waste Water). The method was also applied to most concentrated acidic and basic dialysis concentrates with satisfactory results.
Hybrid Differential Dynamic Programming with Stochastic Search
NASA Technical Reports Server (NTRS)
Aziz, Jonathan; Parker, Jeffrey; Englander, Jacob
2016-01-01
Differential dynamic programming (DDP) has been demonstrated as a viable approach to low-thrust trajectory optimization, namely with the recent success of NASAs Dawn mission. The Dawn trajectory was designed with the DDP-based Static Dynamic Optimal Control algorithm used in the Mystic software. Another recently developed method, Hybrid Differential Dynamic Programming (HDDP) is a variant of the standard DDP formulation that leverages both first-order and second-order state transition matrices in addition to nonlinear programming (NLP) techniques. Areas of improvement over standard DDP include constraint handling, convergence properties, continuous dynamics, and multi-phase capability. DDP is a gradient based method and will converge to a solution nearby an initial guess. In this study, monotonic basin hopping (MBH) is employed as a stochastic search method to overcome this limitation, by augmenting the HDDP algorithm for a wider search of the solution space.
Fasihi, Yasser; Fooladi, Saba; Mohammadi, Mohammad Ali; Emaneini, Mohammad; Kalantar-Neyestanaki, Davood
2017-09-06
Molecular typing is an important tool for control and prevention of infection. A suitable molecular typing method for epidemiological investigation must be easy to perform, highly reproducible, inexpensive, rapid and easy to interpret. In this study, two molecular typing methods including the conventional PCR-sequencing method and high resolution melting (HRM) analysis were used for staphylococcal protein A (spa) typing of 30 Methicillin-resistant Staphylococcus aureus (MRSA) isolates recovered from clinical samples. Based on PCR-sequencing method results, 16 different spa types were identified among the 30 MRSA isolates. Among the 16 different spa types, 14 spa types separated by HRM method. Two spa types including t4718 and t2894 were not separated from each other. According to our results, spa typing based on HRM analysis method is very rapid, easy to perform and cost-effective, but this method must be standardized for different regions, spa types, and real-time machinery.
Saito, Maiko; Kurosawa, Yae; Okuyama, Tsuneo
2012-02-01
Antibody purification using proteins A and G has been a standard method for research and industrial processes. The conventional method, however, includes a three-step process, including buffer exchange, before chromatography. In addition, proteins A and G require low pH elution, which causes antibody aggregation and inactivates the antibody's immunity. This report proposes a two-step method using hydroxyapatite chromatography and membrane filtration, without proteins A and G. This novel method shortens the running time to one-third the conventional method for each cycle. Using our two-step method, 90.2% of the monoclonal antibodies purified were recovered in the elution fraction, the purity achieved was >90%, and most of the antigen-specific activity was retained. This report suggests that the two-step method using hydroxyapatite chromatography and membrane filtration should be considered as an alternative to purification using proteins A and G.
2010-08-01
available). It is assumed after this method is formally published that various standard vendors will offer other sources than the current single standard... single isomer. D Alkyl PAHs used to determine the SPME-GC/MS relative response factors including alkyl naphthalenes (1-methyl-, 2-methyl-, 1,2...Flag all compound results in the sample which were estimated above the upper calibration level with an “E” qualifier. 15. Precision and Bias 15.1 Single
Survey of Software Assurance Techniques for Highly Reliable Systems
NASA Technical Reports Server (NTRS)
Nelson, Stacy
2004-01-01
This document provides a survey of software assurance techniques for highly reliable systems including a discussion of relevant safety standards for various industries in the United States and Europe, as well as examples of methods used during software development projects. It contains one section for each industry surveyed: Aerospace, Defense, Nuclear Power, Medical Devices and Transportation. Each section provides an overview of applicable standards and examples of a mission or software development project, software assurance techniques used and reliability achieved.
NASA Astrophysics Data System (ADS)
Tsai, Suh-Jen Jane; Shiue, Chia-Chann; Chang, Shiow-Ing
1997-07-01
The analytical characteristics of copper in nickel-base alloys have been investigated with electrothermal atomic absorption spectrometry. Deuterium background correction was employed. The effects of various chemical modifiers on the analysis of copper were investigated. Organic modifiers which included 2-(5-bromo-2-pyridylazo)-5-(diethylamino-phenol) (Br-PADAP), ammonium citrate, 1-(2-pyridylazo)-naphthol, 4-(2-pyridylazo)resorcinol, ethylenediaminetetraacetic acid and Triton X-100 were studied. Inorganic modifiers palladium nitrate, magnesium nitrate, aluminum chloride, ammonium dihydrogen phosphate, hydrogen peroxide and potassium nitrate were also applied in this work. In addition, zirconium hydroxide and ammonium hydroxide precipitation methods have also been studied. Interference effects were effectively reduced with Br-PADAP modifier. Aqueous standards were used to construct the calibration curves. The detection limit was 1.9 pg. Standard reference materials of nickel-base alloys were used to evaluate the accuracy of the proposed method. The copper contents determined with the proposed method agreed closely with the certified values of the reference materials. The recoveries were within the range 90-100% with relative standard deviation of less than 10%. Good precision was obtained.
Li, S P; Qiao, C F; Chen, Y W; Zhao, J; Cui, X M; Zhang, Q W; Liu, X M; Hu, D J
2013-10-25
Root of Panax notoginseng (Burk.) F.H. Chen (Sanqi in Chinese) is one of traditional Chinese medicines (TCMs) based functional food. Saponins are the major bioactive components. The shortage of reference compounds or chemical standards is one of the main bottlenecks for quality control of TCMs. A novel strategy, i.e. standardized reference extract based qualification and single calibrated components directly quantitative estimation of multiple analytes, was proposed to easily and effectively control the quality of natural functional foods such as Sanqi. The feasibility and credibility of this methodology were also assessed with a developed fast HPLC method. Five saponins, including ginsenoside Rg1, Re, Rb1, Rd and notoginsenoside R1 were rapidly separated using a conventional HPLC in 20 min. The quantification method was also compared with individual calibration curve method. The strategy is feasible and credible, which is easily and effectively adapted for improving the quality control of natural functional foods such as Sanqi. Copyright © 2013 Elsevier B.V. All rights reserved.
The Same or Not the Same: Equivalence as an Issue in Educational Research
NASA Astrophysics Data System (ADS)
Lewis, Scott E.; Lewis, Jennifer E.
2005-09-01
In educational research, particularly in the sciences, a common research design calls for the establishment of a control and experimental group to determine the effectiveness of an intervention. As part of this design, it is often desirable to illustrate that the two groups were equivalent at the start of the intervention, based on measures such as standardized cognitive tests or student grades in prior courses. In this article we use SAT and ACT scores to illustrate a more robust way of testing equivalence. The method incorporates two one-sided t tests evaluating two null hypotheses, providing a stronger claim for equivalence than the standard method, which often does not address the possible problem of low statistical power. The two null hypotheses are based on the construction of an equivalence interval particular to the data, so the article also provides a rationale for and illustration of a procedure for constructing equivalence intervals. Our consideration of equivalence using this method also underscores the need to include sample sizes, standard deviations, and group means in published quantitative studies.
NASA Astrophysics Data System (ADS)
Li, Lu; Xu, Chong-Yu; Engeland, Kolbjørn
2013-04-01
SummaryWith respect to model calibration, parameter estimation and analysis of uncertainty sources, various regression and probabilistic approaches are used in hydrological modeling. A family of Bayesian methods, which incorporates different sources of information into a single analysis through Bayes' theorem, is widely used for uncertainty assessment. However, none of these approaches can well treat the impact of high flows in hydrological modeling. This study proposes a Bayesian modularization uncertainty assessment approach in which the highest streamflow observations are treated as suspect information that should not influence the inference of the main bulk of the model parameters. This study includes a comprehensive comparison and evaluation of uncertainty assessments by our new Bayesian modularization method and standard Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions were used in combination with standard Bayesian method: the AR(1) plus Normal model independent of time (Model 1), the AR(1) plus Normal model dependent on time (Model 2) and the AR(1) plus Multi-normal model (Model 3). The results reveal that the Bayesian modularization method provides the most accurate streamflow estimates measured by the Nash-Sutcliffe efficiency and provide the best in uncertainty estimates for low, medium and entire flows compared to standard Bayesian methods. The study thus provides a new approach for reducing the impact of high flows on the discharge uncertainty assessment of hydrological models via Bayesian method.
Mezouari, S; Liu, W Yun; Pace, G; Hartman, T G
2015-01-01
The objective of this study was to develop an improved analytical method for the determination of 3-chloro-1,2-propanediol (3-MCPD) and 1,3-dichloropropanol (1,3-DCP) in paper-type food packaging. The established method includes aqueous extraction, matrix spiking of a deuterated surrogate internal standard (3-MCPD-d₅), clean-up using Extrelut solid-phase extraction, derivatisation using a silylation reagent, and GC-MS analysis of the chloropropanols as their corresponding trimethyl silyl ethers. The new method is applicable to food-grade packaging samples using European Commission standard aqueous extraction and aqueous food stimulant migration tests. In this improved method, the derivatisation procedure was optimised; the cost and time of the analysis were reduced by using 10 times less sample, solvents and reagents than in previously described methods. Overall the validation data demonstrate that the method is precise and reliable. The limit of detection (LOD) of the aqueous extract was 0.010 mg kg(-1) (w/w) for both 3-MCPD and 1,3-DCP. Analytical precision had a relative standard deviation (RSD) of 3.36% for 3-MCPD and an RSD of 7.65% for 1,3-DCP. The new method was satisfactorily applied to the analysis of over 100 commercial paperboard packaging samples. The data are being used to guide the product development of a next generation of wet-strength resins with reduced chloropropanol content, and also for risk assessments to calculate the virtual safe dose (VSD).
Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C
2018-03-07
Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
Metroka, Amy E; Papadouka, Vikki; Ternier, Alexandra; Zucker, Jane R
2016-01-01
We compared the quality of data reported to New York City's immunization information system, the Citywide Immunization Registry (CIR), through its real-time Health Level 7 (HL7) Web service from electronic health records (EHRs), with data submitted through other methods. We stratified immunizations administered and reported to the CIR in 2014 for patients aged 0-18 years by reporting method: (1) sending HL7 messages from EHRs through the Web service, (2) manual data entry, and (3) upload of a non-standard flat file from EHRs. We assessed completeness of reporting by measuring the percentage of immunizations reported with lot number, manufacturer, and Vaccines for Children (VFC) program eligibility. We assessed timeliness of reporting by determining the number of days from date of administration to date entered into the CIR. HL7 reporting accounted for the largest percentage (46.3%) of the 3.8 million immunizations reported in 2014. Of immunizations reported using HL7, 97.9% included the lot number and 92.6% included the manufacturer, compared with 50.4% and 48.0% for manual entry, and 65.9% and 48.8% for non-standard flat file, respectively. VFC eligibility was 96.9% complete when reported by manual data entry, 95.3% complete for HL7 reporting, and 87.2% complete for non-standard flat file reporting. Of the three reporting methods, HL7 was the most timely: 77.6% of immunizations were reported by HL7 in <1 day, compared with 53.6% of immunizations reported through manual data entry and 18.1% of immunizations reported through non-standard flat file. HL7 reporting from EHRs resulted in more complete and timely data in the CIR compared with other reporting methods. Providing resources to facilitate HL7 reporting from EHRs to immunization information systems to increase data quality should be a priority for public health.
Pre-capture multiplexing improves efficiency and cost-effectiveness of targeted genomic enrichment.
Shearer, A Eliot; Hildebrand, Michael S; Ravi, Harini; Joshi, Swati; Guiffre, Angelica C; Novak, Barbara; Happe, Scott; LeProust, Emily M; Smith, Richard J H
2012-11-14
Targeted genomic enrichment (TGE) is a widely used method for isolating and enriching specific genomic regions prior to massively parallel sequencing. To make effective use of sequencer output, barcoding and sample pooling (multiplexing) after TGE and prior to sequencing (post-capture multiplexing) has become routine. While previous reports have indicated that multiplexing prior to capture (pre-capture multiplexing) is feasible, no thorough examination of the effect of this method has been completed on a large number of samples. Here we compare standard post-capture TGE to two levels of pre-capture multiplexing: 12 or 16 samples per pool. We evaluated these methods using standard TGE metrics and determined the ability to identify several classes of genetic mutations in three sets of 96 samples, including 48 controls. Our overall goal was to maximize cost reduction and minimize experimental time while maintaining a high percentage of reads on target and a high depth of coverage at thresholds required for variant detection. We adapted the standard post-capture TGE method for pre-capture TGE with several protocol modifications, including redesign of blocking oligonucleotides and optimization of enzymatic and amplification steps. Pre-capture multiplexing reduced costs for TGE by at least 38% and significantly reduced hands-on time during the TGE protocol. We found that pre-capture multiplexing reduced capture efficiency by 23 or 31% for pre-capture pools of 12 and 16, respectively. However efficiency losses at this step can be compensated by reducing the number of simultaneously sequenced samples. Pre-capture multiplexing and post-capture TGE performed similarly with respect to variant detection of positive control mutations. In addition, we detected no instances of sample switching due to aberrant barcode identification. Pre-capture multiplexing improves efficiency of TGE experiments with respect to hands-on time and reagent use compared to standard post-capture TGE. A decrease in capture efficiency is observed when using pre-capture multiplexing; however, it does not negatively impact variant detection and can be accommodated by the experimental design.
Guo, How-Ran
2011-10-20
Despite its limitations, ecological study design is widely applied in epidemiology. In most cases, adjustment for age is necessary, but different methods may lead to different conclusions. To compare three methods of age adjustment, a study on the associations between arsenic in drinking water and incidence of bladder cancer in 243 townships in Taiwan was used as an example. A total of 3068 cases of bladder cancer, including 2276 men and 792 women, were identified during a ten-year study period in the study townships. Three methods were applied to analyze the same data set on the ten-year study period. The first (Direct Method) applied direct standardization to obtain standardized incidence rate and then used it as the dependent variable in the regression analysis. The second (Indirect Method) applied indirect standardization to obtain standardized incidence ratio and then used it as the dependent variable in the regression analysis instead. The third (Variable Method) used proportions of residents in different age groups as a part of the independent variables in the multiple regression models. All three methods showed a statistically significant positive association between arsenic exposure above 0.64 mg/L and incidence of bladder cancer in men and women, but different results were observed for the other exposure categories. In addition, the risk estimates obtained by different methods for the same exposure category were all different. Using an empirical example, the current study confirmed the argument made by other researchers previously that whereas the three different methods of age adjustment may lead to different conclusions, only the third approach can obtain unbiased estimates of the risks. The third method can also generate estimates of the risk associated with each age group, but the other two are unable to evaluate the effects of age directly.
40 CFR Appendix A-7 to Part 60 - Test Methods 19 through 25E
Code of Federal Regulations, 2014 CFR
2014-07-01
... %O include the unavailable hydrogen and oxygen in the form of H2O.) 12.3.2.2 Use applicable sampling... are used during the averaging period. 12.5.2.1 Solid Fossil (Including Waste) Fuel/Sampling and... of the standards) on a dry basis for each gross sample. 12.5.2.2 Liquid Fossil Fuel-Sampling and...
Donald J. Kaczmarek; Randall Rousseau; Jeff A. Wright; Brian Wachelka
2014-01-01
Four eastern cottonwood clones, including standard operational clone ST66 and three advanced clonal selections were produced and included in a test utilizing five different plant propagation methods. Despite relatively large first-year growth differences among clones, all clones demonstrated similar responses to the treatments and clone à cutting treatment interactions...
A novel iterative scheme and its application to differential equations.
Khan, Yasir; Naeem, F; Šmarda, Zdeněk
2014-01-01
The purpose of this paper is to employ an alternative approach to reconstruct the standard variational iteration algorithm II proposed by He, including Lagrange multiplier, and to give a simpler formulation of Adomian decomposition and modified Adomian decomposition method in terms of newly proposed variational iteration method-II (VIM). Through careful investigation of the earlier variational iteration algorithm and Adomian decomposition method, we find unnecessary calculations for Lagrange multiplier and also repeated calculations involved in each iteration, respectively. Several examples are given to verify the reliability and efficiency of the method.
Woo, Kang-Lyung
2005-01-01
Low molecular weight alcohols including fusel oil were determined using diethyl ether extraction and capillary gas chromatography. Twelve kinds of alcohols were successfully resolved on the HP-FFAP (polyethylene glycol) capillary column. The diethyl ether extraction method was very useful for the analysis of alcohols in alcoholic beverages and biological samples with excellent cleanliness of the resulting chromatograms and high sensitivity compared to the direct injection method. Calibration graphs for all standard alcohols showed good linearity in the concentration range used, 0.001-2% (w/v) for all alcohols. Salting out effects were significant (p < 0.01) for the low molecular weight alcohols methanol, isopropanol, propanol, 2-butanol, n-butanol and ethanol, but not for the relatively high molecular weight alcohols amyl alcohol, isoamyl alcohol, and heptanol. The coefficients of variation of the relative molar responses were less than 5% for all of the alcohols. The limits of detection and quantitation were 1-5 and 10-60 microg/L for the diethyl ether extraction method, and 10-50 and 100-350 microg/L for the direct injection method, respectively. The retention times and relative retention times of standard alcohols were significantly shifted in the direct injection method when the injection volumes were changed, even with the same analysis conditions, but they were not influenced in the diethyl ether extraction method. The recoveries by the diethyl ether extraction method were greater than 95% for all samples and greater than 97% for biological samples.
Kunioka, Masao
2010-06-01
The biomass carbon ratios of biochemicals related to biomass have been reviewed. Commercial products from biomass were explained. The biomass carbon ratios of biochemical compounds were measured by accelerator mass spectrometry (AMS) based on the (14)C concentration of carbons in the compounds. This measuring method uses the mechanism that biomass carbons include a very low level of (14)C and petroleum carbons do not include (14)C similar to the carbon dating measuring method. It was confirmed that there were some biochemicals synthesized from petroleum-based carbons. This AMS method has a high accuracy with a small standard deviation and can be applied to plastic products.
NASA Astrophysics Data System (ADS)
Brajard, J.; Moulin, C.; Thiria, S.
2008-10-01
This paper presents a comparison of the atmospheric correction accuracy between the standard sea-viewing wide field-of-view sensor (SeaWiFS) algorithm and the NeuroVaria algorithm for the ocean off the Indian coast in March 1999. NeuroVaria is a general method developed to retrieve aerosol optical properties and water-leaving reflectances for all types of aerosols, including absorbing ones. It has been applied to SeaWiFS images of March 1999, during an episode of transport of absorbing aerosols coming from pollutant sources in India. Water-leaving reflectances and aerosol optical thickness estimated by the two methods were extracted along a transect across the aerosol plume for three days. The comparison showed that NeuroVaria allows the retrieval of oceanic properties in the presence of absorbing aerosols with a better spatial and temporal stability than the standard SeaWiFS algorithm. NeuroVaria was then applied to the available SeaWiFS images over a two-week period. NeuroVaria algorithm retrieves ocean products for a larger number of pixels than the standard one and eliminates most of the discontinuities and artifacts associated with the standard algorithm in presence of absorbing aerosols.
Isaman, V; Thelin, R
1995-09-01
Standard Operating Procedures (SOPs) are required in order to comply with the Good Laboratory Practice Standards (GLPS) as promulgated in the Federal Insecticide, Fungicide and Rodenticide Act (FIFRA) 40 CFR Part 160. Paragraph 160.81 (a) states: "A testing facility shall have standard operating procedures in writing setting forth study methods that management is satisfied are adequate to insure the quality and integrity of the data generated in the course of a study." Types of SOPs include administrative and personnel, analyses, substances, quality assurance and records, test system, equipment, and field related. All SOPs must be adequate in scope to describe the function in sufficient detail such that the study data are reproducible. All SOPs must be approved by a management level as described in a corporate organization chart. Signatures for SOP responsibility, authorship, and Quality Assurance review adds strength and accountability to the SOP. In the event a procedure or method is performed differently from what is stated in the SOP, an SOP deviation is necessary. As methods and procedures are improved, SOP revisions are necessary to maintain SOP adequacy and applicability. The replaced SOP is put into a historical SOP file and all copies of the replaced SOPs are destroyed.
Borgeat, François; Stankovic, Miroslava; Khazaal, Yasser; Rouget, Beatrice Weber; Baumann, Marie-Claude; Riquier, Françoise; O'Connor, Kieron; Jermann, Françoise; Zullino, Daniele; Bondolfi, Guido
2009-07-01
Exposure is considered to be an essential ingredient of cognitive-behavioral therapy treatment of social phobia and of most anxiety disorders. To assess the impact of the amount of exposure on outcome, 30 social phobic patients were randomly allocated to 1 of 2 group treatments of 8 weekly sessions: Self-Focused Exposure Therapy which is based essentially on prolonged exposure to public speaking combined with positive feedback or a more standard cognitive and behavioral method encompassing psychoeducation, cognitive work, working through exposure hierarchies of feared situations for exposure within and outside the group. The results show that the 2 methods led to significant and equivalent symptomatic improvements which were maintained at 1-year follow-up. There was a more rapid and initially more pronounced decrease in negative cognitions with the Self-Focused Exposure Therapy, which included no formal cognitive work, than with the more standard approach in which approximately a third of the content was cognitive. In contrast, decrease in social avoidance was more persistent with standard cognitive-behavior therapy which involved less exposure. The results indicate that positive cognitive change can be achieved more rapidly with non cognitive methods while avoidance decreases more reliably with a standard approach rather than an approach with an exclusive focus on exposure.
NASA Technical Reports Server (NTRS)
Otterson, D. A.; Seng, G. T.
1984-01-01
A new high-performance liquid chromatographic (HPLC) method for group-type analysis of middistillate fuels is described. It uses a refractive index detector and standards that are prepared by reacting a portion of the fuel sample with sulfuric acid. A complete analysis of a middistillate fuel for saturates and aromatics (including the preparation of the standard) requires about 15 min if standards for several fuels are prepared simultaneously. From model fuel studies, the method was found to be accurate to within 0.4 vol% saturates or aromatics, and provides a precision of + or - 0.4 vol%. Olefin determinations require an additional 15 min of analysis time. However, this determination is needed only for those fuels displaying a significant olefin response at 200 nm (obtained routinely during the saturated/aromatics analysis procedure). The olefin determination uses the responses of the olefins and the corresponding saturates, as well as the average value of their refractive index sensitivity ratios (1.1). Studied indicated that, although the relative error in the olefins result could reach 10 percent by using this average sensitivity ratio, it was 5 percent for the fuels used in this study. Olefin concentrations as low as 0.1 vol% have been determined using this method.
Saracevic, Andrea; Simundic, Ana-Maria; Celap, Ivana; Luzanic, Valentina
2013-07-01
Rigat and colleagues were the first ones to develop a rapid PCR-based assay for identifying the angiotensin converting enzyme insertion/deletion (I/D) polymorphism. Due to a big difference between the length of the wild-type and mute alleles the PCR method is prone to mistyping because of preferential amplification of the D allele causing depicting I/D heterozygotes as D/D homozygotes. The aim of this study was to investigate whether this preferential amplification can be repressed by amplifying a longer DNA fragment in a so called Long PCR protocol. We also aimed to compare the results of genotyping using five different PCR protocols and to estimate the mistyping rate. The study included 200 samples which were genotyped using standard method used in our laboratory, a stepdown PCR, PCR protocol with the inclusion of 4 % DMSO, PCR with the use of insertion specific primers and new Long PCR method. The results of this study have shown that accurate ACE I/D polymorphism genotyping can be accomplished with the standard and the Long PCR method. Also, as of our results, accurate ACE I/D polymorphism genotyping can be accomplished regardless of the method used. Therefore, if the standard method is optimized more cautiously, accurate results can be obtained by this simple, inexpensive and rapid PCR protocol.
Ramilo, Andrea; Navas, J Ignacio; Villalba, Antonio; Abollo, Elvira
2013-05-27
Bonamia ostreae and B. exitiosa have caused mass mortalities of various oyster species around the world and co-occur in some European areas. The World Organisation for Animal Health (OIE) has included infections with both species in the list of notifiable diseases. However, official methods for species-specific diagnosis of either parasite have certain limitations. In this study, new species-specific conventional PCR (cPCR) and real-time PCR techniques were developed to diagnose each parasite species. Moreover, a multiplex PCR method was designed to detect both parasites in a single assay. The analytical sensitivity and specificity of each new method were evaluated. These new procedures were compared with 2 OIE-recommended methods, viz. standard histology and PCR-RFLP. The new procedures showed higher sensitivity than the OIE recommended ones for the diagnosis of both species. The sensitivity of tests with the new primers was higher using oyster gills and gonad tissue, rather than gills alone. The lack of a 'gold standard' prevented accurate estimation of sensitivity and specificity of the new methods. The implementation of statistical tools (maximum likelihood method) for the comparison of the diagnostic tests showed the possibility of false positives with the new procedures, although the absence of a gold standard precluded certainty. Nevertheless, all procedures showed negative results when used for the analysis of oysters from a Bonamia-free area.
Faassen, Elisabeth J; Antoniou, Maria G; Beekman-Lukassen, Wendy; Blahova, Lucie; Chernova, Ekaterina; Christophoridis, Christophoros; Combes, Audrey; Edwards, Christine; Fastner, Jutta; Harmsen, Joop; Hiskia, Anastasia; Ilag, Leopold L; Kaloudis, Triantafyllos; Lopicic, Srdjan; Lürling, Miquel; Mazur-Marzec, Hanna; Meriluoto, Jussi; Porojan, Cristina; Viner-Mozzini, Yehudit; Zguna, Nadezda
2016-02-29
Exposure to β-N-methylamino-l-alanine (BMAA) might be linked to the incidence of amyotrophic lateral sclerosis, Alzheimer's disease and Parkinson's disease. Analytical chemistry plays a crucial role in determining human BMAA exposure and the associated health risk, but the performance of various analytical methods currently employed is rarely compared. A CYANOCOST initiated workshop was organized aimed at training scientists in BMAA analysis, creating mutual understanding and paving the way towards interlaboratory comparison exercises. During this workshop, we tested different methods (extraction followed by derivatization and liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) analysis, or directly followed by LC-MS/MS analysis) for trueness and intermediate precision. We adapted three workup methods for the underivatized analysis of animal, brain and cyanobacterial samples. Based on recovery of the internal standard D₃BMAA, the underivatized methods were accurate (mean recovery 80%) and precise (mean relative standard deviation 10%), except for the cyanobacterium Leptolyngbya. However, total BMAA concentrations in the positive controls (cycad seeds) showed higher variation (relative standard deviation 21%-32%), implying that D₃BMAA was not a good indicator for the release of BMAA from bound forms. Significant losses occurred during workup for the derivatized method, resulting in low recovery (<10%). Most BMAA was found in a trichloroacetic acid soluble, bound form and we recommend including this fraction during analysis.
White, Helen E; Hedges, John; Bendit, Israel; Branford, Susan; Colomer, Dolors; Hochhaus, Andreas; Hughes, Timothy; Kamel-Reid, Suzanne; Kim, Dong-Wook; Modur, Vijay; Müller, Martin C; Pagnano, Katia B; Pane, Fabrizio; Radich, Jerry; Cross, Nicholas C P; Labourier, Emmanuel
2013-06-01
Current guidelines for managing Philadelphia-positive chronic myeloid leukemia include monitoring the expression of the BCR-ABL1 (breakpoint cluster region/c-abl oncogene 1, non-receptor tyrosine kinase) fusion gene by quantitative reverse-transcription PCR (RT-qPCR). Our goal was to establish and validate reference panels to mitigate the interlaboratory imprecision of quantitative BCR-ABL1 measurements and to facilitate global standardization on the international scale (IS). Four-level secondary reference panels were manufactured under controlled and validated processes with synthetic Armored RNA Quant molecules (Asuragen) calibrated to reference standards from the WHO and the NIST. Performance was evaluated in IS reference laboratories and with non-IS-standardized RT-qPCR methods. For most methods, percent ratios for BCR-ABL1 e13a2 and e14a2 relative to ABL1 or BCR were robust at 4 different levels and linear over 3 logarithms, from 10% to 0.01% on the IS. The intraassay and interassay imprecision was <2-fold overall. Performance was stable across 3 consecutive lots, in multiple laboratories, and over a period of 18 months to date. International field trials demonstrated the commutability of the reagents and their accurate alignment to the IS within the intra- and interlaboratory imprecision of IS-standardized methods. The synthetic calibrator panels are robust, reproducibly manufactured, analytically calibrated to the WHO primary standards, and compatible with most BCR-ABL1 RT-qPCR assay designs. The broad availability of secondary reference reagents will further facilitate interlaboratory comparative studies and independent quality assessment programs, which are of paramount importance for worldwide standardization of BCR-ABL1 monitoring results and the optimization of current and new therapeutic approaches for chronic myeloid leukemia. © 2013 American Association for Clinical Chemistry.
Wise, Stephen A; Tai, Susan S-C; Burdette, Carolyn Q; Camara, Johanna E; Bedner, Mary; Lippa, Katrice A; Nelson, Michael A; Nalin, Federica; Phinney, Karen W; Sander, Lane C; Betz, Joseph M; Sempos, Christopher T; Coates, Paul M
2017-09-01
Since 2005, the National Institute of Standards and Technology (NIST) has collaborated with the National Institutes of Health (NIH), Office of Dietary Supplements (ODS) to improve the quality of measurements related to human nutritional markers of vitamin D status. In support of the NIH-ODS Vitamin D Initiative, including the Vitamin D Standardization Program (VDSP), NIST efforts have focused on (1) development of validated analytical methods, including reference measurement procedures (RMPs); (2) development of Standard Reference Materials (SRMs); (3) value assignment of critical study samples using NIST RMPs; and (4) development and coordination of laboratory measurement QA programs. As a result of this collaboration, NIST has developed RMPs for 25-hydroxyvitamin D2 [25(OH)D2], 25(OH)D3, and 24R,25-dihydroxyvitamin D3 [24R,25(OH)2D3]; disseminated serum-based SRMs with values assigned for 25(OH)D2, 25(OH)D3, 3-epi-25(OH)D3, and 24R,25(OH)2D3; assigned values for critical samples for VDSP studies, including an extensive interlaboratory comparison and reference material commutability study; provided an accuracy basis for the Vitamin D External Quality Assurance Scheme; coordinated the first accuracy-based measurement QA program for the determination of 25(OH)D2, 25(OH)D3, and 3-epi-25(OH)D3 in human serum/plasma; and developed methods and SRMs for the determination of vitamin D and 25(OH)D in food and supplement matrix SRMs. The details of these activities and their benefit and impact to the NIH-ODS Vitamin D Initiative are described.