Endoscope field of view measurement.
Wang, Quanzeng; Khanicheh, Azadeh; Leiner, Dennis; Shafer, David; Zobel, Jurgen
2017-03-01
The current International Organization for Standardization (ISO) standard (ISO 8600-3:1997, including Amendment 1:2003) for determining endoscope field of view (FOV) does not accurately characterize some novel endoscopic technologies, such as endoscopes with a close focus distance and capsule endoscopes. We evaluated the endoscope FOV measurement method (the FOVWS method) in the current ISO 8600-3 standard and proposed a new method (the FOVEP method). We compared the two methods by measuring the FOV of 18 models of endoscopes (one device for each model) from seven key international manufacturers. We also estimated the device-to-device variation of two models of colonoscopes by measuring several hundred devices. Our results showed that the FOVEP method was more accurate than the FOVWS method and could be used for all endoscopes. We also found that the labelled FOV values of many commercial endoscopes are significantly overstated. Our study can help endoscope users understand endoscope FOV and identify a proper method for FOV measurement. This paper can be used as a reference for revising the current endoscope FOV measurement standard. PMID:28663840
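The basic geometry behind any angular FOV label can be sketched with the textbook pinhole relation below; this is only the generic relation, not the specific FOVWS or FOVEP procedure evaluated in the paper, and the function name and numbers are illustrative.

```python
import math

def angular_fov(target_width_mm: float, distance_mm: float) -> float:
    """Full angular field of view, in degrees, for a flat target of the
    given width that exactly fills the image when viewed head-on from
    the given working distance: FOV = 2 * atan(w / (2 * d))."""
    return math.degrees(2 * math.atan(target_width_mm / (2 * distance_mm)))

# A 20 mm target that just fills the image at a 10 mm working distance
# implies a 90-degree FOV; shortening the distance widens the implied FOV.
print(angular_fov(20, 10))  # → 90.0
```

Close-focus endoscopes make this sensitive to where the target plane is placed, which is one reason measurement procedure matters.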
Annual Book of ASTM Standards, Part 23: Water; Atmospheric Analysis.
ERIC Educational Resources Information Center
American Society for Testing and Materials, Philadelphia, PA.
Standards for water and atmospheric analysis are compiled in this segment, Part 23, of the American Society for Testing and Materials (ASTM) annual book of standards. It contains all current formally approved ASTM standard and tentative test methods, definitions, recommended practices, proposed methods, classifications, and specifications. One…
A review of the quantum current standard
NASA Astrophysics Data System (ADS)
Kaneko, Nobu-Hisa; Nakamura, Shuji; Okazaki, Yuma
2016-03-01
The electric current, voltage, and resistance standards are the most important standards related to electricity and magnetism. Of these three standards, only the ampere, which is the unit of electric current, is an International System of Units (SI) base unit. However, even with modern technology, relatively large uncertainty exists regarding the generation and measurement of current. As a result of various innovative techniques based on nanotechnology and novel materials, new types of junctions for quantum current generation and single-electron current sources have recently been proposed. These newly developed methods are also being used to investigate the consistency of the three quantum electrical effects, i.e. the Josephson, quantum Hall, and single-electron tunneling effects, which are also known as ‘the quantum metrology triangle’. This article describes recent research and related developments regarding current standards and quantum-metrology-triangle experiments.
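As a concrete illustration of why accurate current generation is hard, an ideal single-electron pump delivers one elementary charge per drive cycle, I = e·f; the sketch below simply evaluates that relation (values and names are illustrative).

```python
E = 1.602176634e-19  # elementary charge in coulombs (exact since the 2019 SI)

def pump_current(frequency_hz: float) -> float:
    """Ideal single-electron-pump current: one electron transferred per
    drive cycle gives I = e * f."""
    return E * frequency_hz

# Even at a 1 GHz drive frequency the ideal pumped current is only
# about 0.16 nA, which is why such sources are difficult to scale
# and to verify against the Josephson and quantum Hall effects.
```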
Analytical evaluation of current starch methods used in the international sugar industry: Part I.
Cole, Marsha; Eggleston, Gillian; Triplett, Alexa
2017-08-01
Several analytical starch methods exist in the international sugar industry to mitigate starch-related processing challenges and assess the quality of traded end-products. These methods use iodometric chemistry, mostly potato starch standards, and utilize similar solubilization strategies, but had not been comprehensively compared. In this study, industrial starch methods were compared to the USDA Starch Research method using simulated raw sugars. Type of starch standard, solubilization approach, iodometric reagents, and wavelength detection affected total starch determination in simulated raw sugars. Simulated sugars containing potato starch were more accurately detected by the industrial methods, whereas those containing corn starch, a better model for sugarcane starch, were only accurately measured by the USDA Starch Research method. Use of a potato starch standard curve over-estimated starch concentrations. Among the variables studied, starch standard, solubilization approach, and wavelength detection affected the sensitivity, accuracy/precision, and limited the detection/quantification of the current industry starch methods the most. Published by Elsevier Ltd.
Validation studies of Karl Fisher reference method for moisture in cotton
USDA-ARS?s Scientific Manuscript database
With current international standard oven drying (SOD) techniques lacking precision and accuracy statements, a new standard reference method is needed. Volumetric Karl Fischer Titration (KFT) is a widely used measure of moisture content. The method is used in many ASTM methods, 14 NIST SRMs, and te...
Standard methods for sampling freshwater fishes: Opportunities for international collaboration
Bonar, Scott A.; Mercado-Silva, Norman; Hubert, Wayne A.; Beard, Douglas; Dave, Göran; Kubečka, Jan; Graeb, Brian D. S.; Lester, Nigel P.; Porath, Mark T.; Winfield, Ian J.
2017-01-01
With publication of Standard Methods for Sampling North American Freshwater Fishes in 2009, the American Fisheries Society (AFS) recommended standard procedures for North America. To explore interest in standardizing at intercontinental scales, a symposium attended by international specialists in freshwater fish sampling was convened at the 145th Annual AFS Meeting in Portland, Oregon, in August 2015. Participants represented all continents except Australia and Antarctica and were employed by state and federal agencies, universities, nongovernmental organizations, and consulting businesses. Currently, standardization is practiced mostly in North America and Europe. Participants described how standardization has been important for management of long-term data sets, promoting fundamental scientific understanding, and assessing efficacy of large spatial scale management strategies. Academics indicated that standardization has been useful in fisheries education because time previously used to teach how sampling methods are developed is now more devoted to diagnosis and treatment of problem fish communities. Researchers reported that standardization allowed increased sample size for method validation and calibration. Group consensus was to retain continental standards where they currently exist but to further explore international and intercontinental standardization, specifically identifying where synergies and bridges exist, and to identify means to collaborate with scientists where standardization is limited but interest and need occur.
ERIC Educational Resources Information Center
Coester, Lee Anne
2010-01-01
This study was designed to gather input from early career elementary teachers with the goal of finding ways to improve elementary mathematics methods courses. Multiple areas were explored including the degree to which respondents' elementary mathematics methods course focused on the NCTM Process Standards, the teachers' current standards-based…
Hanford Technical Basis for Multiple Dosimetry Effective Dose Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hill, Robin L.; Rathbone, Bruce A.
2010-08-01
The current method at Hanford for dealing with the results from multiple dosimeters worn during non-uniform irradiation is to use a compartmentalization method to calculate the effective dose (E). The method, as documented in the current version of Section 6.9.3 in the 'Hanford External Dosimetry Technical Basis Manual, PNL-MA-842,' is based on the compartmentalization method presented in the 1997 ANSI/HPS N13.41 standard, 'Criteria for Performing Multiple Dosimetry.' With the adoption of the ICRP 60 methodology in the 2007 revision to 10 CFR 835 came changes that have a direct effect on the compartmentalization method described in the 1997 ANSI/HPS N13.41 standard and, thus, on the method used at Hanford. The ANSI/HPS N13.41 standard committee is in the process of updating the standard, but the changes to the standard have not yet been approved, and the drafts of the revision tend to align more with ICRP 60 than with the changes specified in the 2007 revision to 10 CFR 835. Therefore, a revised method for calculating effective dose from non-uniform external irradiation using a compartmental method was developed using the tissue weighting factors and remainder organs specified in 10 CFR 835 (2007).
REPRESENTATIVE SAMPLING AND ANALYSIS OF HETEROGENEOUS SOILS
Standard sampling and analysis methods for hazardous substances in contaminated soils currently are available and routinely employed. Standard methods inherently assume a homogeneous soil matrix and contaminant distribution; therefore only small sample quantities typically are p...
Hao, Zhi-hong; Yao, Jian-zhen; Tang, Rui-ling; Zhang, Xue-mei; Li, Wen-ge; Zhang, Qin
2015-02-01
A method for the determination of trace boron, molybdenum, silver, tin and lead in geochemical samples by direct-current arc full-spectrum direct-reading atomic emission spectroscopy (DC-Arc-AES) was established. The direct-current arc full-spectrum direct-reading atomic emission spectrometer, with a large-area solid-state detector, provides full-spectrum direct reading and real-time background correction. New electrodes and a new buffer recipe were proposed in this paper, and a national patent has been applied for. Suitable analytical line pairs and background correction points were selected for each element, and the internal standard method was adopted, with Ge used as the internal standard. A multistage current program was selected, with a different holding time at each current, to ensure that each element has a good signal-to-noise ratio. A continuously rising current mode was chosen to effectively eliminate splashing of the sample. Argon as a shielding gas eliminates CN band generation, reduces the spectral background, and helps stabilize the arc; an argon flow of 3.5 L x min(-1) was selected. Evaporation curves for each element showed consistent evaporation behavior, and, combined with the effect of different spectrographic times on intensity and background, a spectrographic time of 35 s was selected. National standard substances were selected as the standard series, covering different matrices and different contents to suit the determination of trace boron, molybdenum, silver, tin and lead in geochemical samples. Under the optimum experimental conditions, the detection limits for B, Mo, Ag, Sn and Pb are 1.1, 0.09, 0.01, 0.41, and 0.56 microg x g(-1), respectively, and the precisions (RSD, n=12) for B, Mo, Ag, Sn and Pb are 4.57%-7.63%, 5.14%-7.75%, 5.48%-12.30%, 3.97%-10.46%, and 4.26%-9.21%, respectively.
The analytical accuracy was validated with national standard substances, and the results agree with the certified values. The method is simple, rapid, and practical, and is an advanced analytical method for the determination of trace boron, molybdenum, silver, tin and lead in geochemical samples.
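The precision figures quoted above are relative standard deviations; for reference, a hypothetical helper computing RSD from replicate measurements might look like this (the replicate values are made up, not from the paper):

```python
import statistics

def rsd_percent(replicates):
    """Relative standard deviation in percent:
    100 * sample standard deviation / mean."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical replicate readings (microg/g) for one element:
print(round(rsd_percent([9.8, 10.0, 10.2]), 2))  # → 2.0
```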
Measurement of the depth of narrow slotted sections in eddy current reference standards
NASA Astrophysics Data System (ADS)
Kim, Young-Joo; Kim, Young-gil; Ahn, Bongyoung; Yoon, Dong-Jin
2007-02-01
The dimensions of the slots in eddy current (EC) reference standards are too narrow to be measured by general depth measurement methods such as the optical (laser) or stylus methods. However, measurement of the dimensions of the machined slots is a prerequisite to using the blocks as references. The present paper suggests a measurement method for the slotted section using an ultrasonic test. The width and depth of the slots measured in our study are roughly 0.1 mm and 0.5 mm, respectively. The time of flight (TOF) of the ultrasonic wave was measured precisely. The ultrasonic velocity in the material of the EC reference standard was calculated with the measured values of the TOF and its thickness. Reflected waves from the tip of the slot and the bottom surface of the EC standard were successfully classified. Using this method we have successfully determined the depth of the slotted section.
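The arithmetic described, calibrating velocity from the back-wall echo and converting the tip-echo time to a depth, can be sketched as follows; this assumes pulse-echo insonification from the face opposite the slot, and all names and numbers are illustrative, not the paper's data.

```python
def sound_velocity(thickness_mm: float, tof_backwall_us: float) -> float:
    """Longitudinal velocity (mm/us) from the round-trip time of the
    back-wall echo through the full block thickness: v = 2 * d / t."""
    return 2.0 * thickness_mm / tof_backwall_us

def slot_depth(thickness_mm: float, tof_backwall_us: float,
               tof_tip_us: float) -> float:
    """Depth of a slot cut into the far surface: the slot tip sits at
    (thickness - depth) from the probe, so depth = thickness - v * t_tip / 2."""
    v = sound_velocity(thickness_mm, tof_backwall_us)
    return thickness_mm - v * tof_tip_us / 2.0

# Illustrative numbers: a 10 mm block with a 4.0 us back-wall echo
# (v = 5 mm/us) and a 3.8 us tip echo yields a 0.5 mm slot depth.
print(slot_depth(10.0, 4.0, 3.8))  # → 0.5
```

Distinguishing the tip echo from the back-wall echo, as the paper reports, is the practical prerequisite for this calculation.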
Koyama, Kazuo; Miyazaki, Kinuko; Abe, Kousuke; Egawa, Yoshitsugu; Fukazawa, Toru; Kitta, Tadashi; Miyashita, Takashi; Nezu, Toru; Nohara, Hidenori; Sano, Takashi; Takahashi, Yukinari; Taniguchi, Hideji; Yada, Hiroshi; Yamazaki, Kumiko; Watanabe, Yomi
2017-06-01
An indirect enzymatic analysis method for the quantification of fatty acid esters of 2-/3-monochloro-1,2-propanediol (2/3-MCPD) and glycidol was developed, using a deuterated internal standard of each free-form component. Because 2-MCPD-d5 is difficult to obtain, a statistical method for calibration and quantification was developed in which 3-MCPD-d5, the standard used for the calculation of 3-MCPD, is substituted for it. Using data from a previous collaborative study, the current method for the determination of 2-MCPD content using 2-MCPD-d5 was compared to three alternative new methods using 3-MCPD-d5. The regression analysis showed that the alternative methods were unbiased compared to the current method. The relative standard deviation (RSDR) among the testing laboratories was ≤ 15% and the Horwitz ratio was ≤ 1.0, a satisfactory value.
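The Horwitz ratio cited above compares the observed between-laboratory RSDR with the value predicted by the Horwitz function; a sketch under the usual formulation (PRSD = 2^(1 - 0.5·log10 C), with C the analyte mass fraction) is given below, with illustrative numbers.

```python
import math

def horwitz_prsd(mass_fraction: float) -> float:
    """Horwitz-predicted reproducibility RSD in percent for an analyte
    at mass fraction C (g/g): PRSD = 2**(1 - 0.5 * log10(C))."""
    return 2.0 ** (1.0 - 0.5 * math.log10(mass_fraction))

def horwitz_ratio(observed_rsd_percent: float, mass_fraction: float) -> float:
    """HorRat = observed RSD_R / Horwitz-predicted RSD_R; values near
    or below 1 indicate good inter-laboratory precision."""
    return observed_rsd_percent / horwitz_prsd(mass_fraction)

# At 1 mg/kg (C = 1e-6) the Horwitz function predicts a 16% RSD_R,
# so an observed RSD_R of 15% would give HorRat < 1.
print(horwitz_prsd(1e-6))  # → 16.0
```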
78 FR 45104 - Model Manufactured Home Installation Standards: Ground Anchor Installations
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-26
... test methods for establishing working load design values of ground anchor assemblies used for new... anchor installations and establish standardized test methods to determine ground anchor performance and... currently no national test method for rating and certifying ground anchor assemblies in different soil...
NASA Technical Reports Server (NTRS)
Nguyen, Truong X.; Ely, Jay J.; Koppen, Sandra V.
2001-01-01
This paper describes the implementation of the mode-stirred method for susceptibility testing according to the current DO-160D standard. Test results on an Engine Data Processor using the implemented procedure, and comparisons with standard anechoic test results, are presented. The comparison shows experimentally that the susceptibility thresholds found with the mode-stirred method are consistently higher than the anechoic ones. This is consistent with the recent statistical finding by NIST that the current calibration procedure overstates field strength by a fixed amount. Once the test results are adjusted for this value, the comparisons with the anechoic results are excellent. The results also show that the test method has excellent chamber-to-chamber repeatability. Several improvements to the current procedure are also identified and implemented.
NASA Technical Reports Server (NTRS)
Greene, Ben; McClure, Mark B.; Baker, David L.
2006-01-01
This work presents an overview of the International Organization for Standardization (ISO) 15859 International Standard for Space Systems Fluid Characteristics, Sampling and Test Methods, Parts 1 through 13, issued in June 2004. These standards establish requirements for fluid characteristics, sampling, and test methods for 13 fluids of concern to the propellant community and propellant characterization laboratories: oxygen, hydrogen, nitrogen, helium, nitrogen tetroxide, monomethylhydrazine, hydrazine, kerosene, argon, water, ammonia, carbon dioxide, and breathing air. A comparison of the fluid characteristics, sampling, and test methods required by the ISO standards to the current military and NASA specifications, which are in use at NASA facilities and elsewhere, is presented. Many of the ISO standards' composition limits and other content agree with those found in the applicable parts of NASA SE-S-0073, NASA SSP 30573, military performance standards and details, and Compressed Gas Association (CGA) commodity specifications. The status of a current project managed at NASA Johnson Space Center White Sands Test Facility (WSTF) to rewrite these documents is discussed.
Improved lossless intra coding for H.264/MPEG-4 AVC.
Lee, Yung-Lyul; Han, Ki-Hun; Sullivan, Gary J
2006-09-01
A new lossless intra coding method based on sample-by-sample differential pulse code modulation (DPCM) is presented as an enhancement of the H.264/MPEG-4 AVC standard. The H.264/AVC design includes a multidirectional spatial prediction method to reduce spatial redundancy by using neighboring samples as a prediction for the samples in a block of data to be encoded. In the new lossless intra coding method, the spatial prediction is performed based on samplewise DPCM instead of in the block-based manner used in the current H.264/AVC standard, while the block structure is retained for the residual difference entropy coding process. We show that the new method, based on samplewise DPCM, does not have a major complexity penalty, despite its apparent pipeline dependencies. Experiments show that the new lossless intra coding method reduces the bit rate by approximately 12% in comparison with the lossless intra coding method previously included in the H.264/AVC standard. As a result, the new method is currently being adopted into the H.264/AVC standard in a new enhancement project.
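A minimal sketch of the samplewise DPCM idea, reduced to a single horizontal predictor (a simplification: the actual H.264/AVC method predicts along the selected intra direction and entropy-codes the residuals block-wise):

```python
def dpcm_encode(row):
    """Samplewise DPCM with a horizontal predictor: each sample is
    predicted by its reconstructed left neighbour and only the residual
    is coded (the first sample is predicted from 0)."""
    residuals, pred = [], 0
    for s in row:
        residuals.append(s - pred)
        pred = s
    return residuals

def dpcm_decode(residuals):
    """Invert the prediction loop to reconstruct the samples exactly."""
    row, pred = [], 0
    for r in residuals:
        pred += r
        row.append(pred)
    return row

row = [100, 102, 101, 101, 180, 181]
assert dpcm_encode(row) == [100, 2, -1, 0, 79, 1]
assert dpcm_decode(dpcm_encode(row)) == row  # lossless round trip
```

The small residuals for smooth content illustrate why samplewise prediction compresses better than a block-held prediction in lossless mode.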
Stan T. Lebow; Patricia K. Lebow; Kolby C. Hirth
2017-01-01
Current standardized methods are not well-suited for estimating in-service preservative leaching from treated wood products. This study compared several alternative leaching methods to a commonly used standard method, and to leaching under natural exposure conditions. Small blocks or lumber specimens were pressure treated with a wood preservative containing borax and...
Analytical evaluation of current starch methods used in the international sugar industry: Part I
USDA-ARS?s Scientific Manuscript database
Several analytical starch methods currently exist in the international sugar industry that are used to prevent or mitigate starch-related processing challenges as well as assess the quality of traded end-products. These methods use simple iodometric chemistry, mostly potato starch standards, and uti...
Nonvolatile, semivolatile, or volatile: redefining volatile for volatile organic compounds.
Võ, Uyên-Uyén T; Morris, Michael P
2014-06-01
Although widely used in air quality regulatory frameworks, the term "volatile organic compound" (VOC) is poorly defined. Numerous standardized tests are currently used in regulations to determine VOC content (and thus volatility), but in many cases the tests do not agree with each other, nor do they always accurately represent actual evaporation rates under ambient conditions. The parameters (time, temperature, reference material, column polarity, etc.) used in the definitions and the associated test methods were created without a significant evaluation of volatilization characteristics in real-world settings. Not only do these differences lead to varying VOC content results, but occasionally they conflict with one another. An ambient evaporation study of selected compounds and a few formulated products was conducted and the results were compared to several current VOC test methodologies: SCAQMD Method 313 (M313), ASTM Standard Test Method E 1868-10 (E1868), and U.S. EPA Reference Method 24 (M24). The ambient evaporation study showed a definite distinction between nonvolatile, semivolatile, and volatile compounds. Some low vapor pressure (LVP) solvents, currently considered exempt as VOCs by some methods, volatilize at ambient conditions nearly as rapidly as the traditional high-volatility solvents they are meant to replace. Conversely, bio-based and heavy hydrocarbons did not readily volatilize, though they often are calculated as VOCs in some traditional test methods. The study suggests that regulatory standards should be reevaluated to more accurately reflect real-world emissions from the use of VOC-containing products. The definition of VOC in current test methods may lead to regulations that exclude otherwise viable alternatives or allow substitutions of chemicals that may limit the environmental benefits sought in the regulation.
A study was conducted to examine volatility of several compounds and a few formulated products under several current VOC test methodologies and ambient evaporation. This paper provides ample evidence to warrant a reevaluation of regulatory standards and provides a framework for progressive developments based on reasonable and scientifically justifiable definitions of VOCs.
Current federal regulations require monitoring for fecal coliforms or Salmonella in biosolids destined for land application. Methods used for analysis of fecal coliforms and Salmonella were reviewed and a standard protocol was developed. The protocols were then evaluated by testi...
William E. Fox; Daniel W. McCollum; John E. Mitchell; Louis E. Swanson; Urs P. Kreuter; John A. Tanaka; Gary R. Evans; H. Theodore Heintz; Robert P. Breckenridge; Paul H. Geissler
2009-01-01
Currently, there is no standard method to assess the complex systems in rangeland ecosystems. Decision makers need baselines to create a common language of current rangeland conditions and standards for continued rangeland assessment. The Sustainable Rangeland Roundtable (SRR), a group of private and public organizations and agencies, has created a forum to discuss...
ERIC Educational Resources Information Center
Sam, Daisy
2011-01-01
The purpose of this mixed methods study was to investigate urban middle school teachers' descriptions of their competency in the current National Education Technology Standards for Teachers (NETS-T). The study also investigated how urban middle school teachers currently use technology to support their teaching and student learning. Research…
Strategies of bringing drug product marketing applications to meet current regulatory standards.
Wu, Yan; Freed, Anita; Lavrich, David; Raghavachari, Ramesh; Huynh-Ba, Kim; Shah, Ketan; Alasandro, Mark
2015-08-01
In the past decade, many guidance documents have been issued through collaboration of global organizations and regulatory authorities. Most of these are applicable to new products, but there is a risk that currently marketed products will not meet the new compliance standards during audits and inspections while companies continue to make changes through the product life cycle for continuous improvement or market demands. This discussion presents different strategies to bringing drug product marketing applications to meet current and emerging standards. It also discusses stability and method designs to meet process validation and global development efforts.
Comparison of a novel fixation device with standard suturing methods for spinal cord stimulators.
Bowman, Richard G; Caraway, David; Bentley, Ishmael
2013-01-01
Spinal cord stimulation is a well-established treatment for chronic neuropathic pain of the trunk or limbs. Currently, the standard method of fixation is to affix the leads of the neuromodulation device to soft tissue, fascia or ligament, by manually tying general suture. A novel semiautomated device is proposed that may be advantageous over the current standard. Comparison testing in an excised caprine spine and a simulated benchtop model was performed. Three tests were performed: 1) perpendicular pull from fascia of the caprine spine; 2) axial pull from fascia of the caprine spine; and 3) axial pull from Mylar film. Six samples of each configuration were tested for each scenario. Standard 2-0 Ethibond was compared with a novel semiautomated device (Anulex fiXate). Upon completion of testing, statistical analysis was performed for each scenario. For perpendicular pull in the caprine spine, the failure load for standard suture was 8.95 lbs with a standard deviation of 1.39, whereas for fiXate the load was 15.93 lbs with a standard deviation of 2.09. For axial pull in the caprine spine, the failure load for standard suture was 6.79 lbs with a standard deviation of 1.55, whereas for fiXate the load was 12.31 lbs with a standard deviation of 4.26. For axial pull in Mylar film, the failure load for standard suture was 10.87 lbs with a standard deviation of 1.56, whereas for fiXate the load was 19.54 lbs with a standard deviation of 2.24. These data suggest a novel semiautomated device offers a method of fixation that may be utilized in lieu of standard suturing methods as a means of securing neuromodulation devices. The data suggest the novel semiautomated device may in fact provide more secure fixation than standard suturing methods. © 2012 International Neuromodulation Society.
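The abstract does not state which significance test the authors applied, but with the reported means, standard deviations, and n = 6 per group, a Welch t statistic can be reproduced from the summary data alone; the sketch below is illustrative, not the authors' analysis.

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic from summary statistics (group means, sample
    standard deviations, and group sizes); no raw data required."""
    return (m2 - m1) / math.sqrt(s1**2 / n1 + s2**2 / n2)

# Perpendicular-pull failure loads reported above (suture vs. fiXate,
# n = 6 per group):
t = welch_t(8.95, 1.39, 6, 15.93, 2.09, 6)
print(round(t, 2))  # → 6.81
```

A t statistic this large with roughly 8-9 Welch degrees of freedom is consistent with the abstract's conclusion that the two fixation methods differ.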
Comparison of different measurement methods for transmittance haze
NASA Astrophysics Data System (ADS)
Yu, Hsueh-Ling; Hsaio, Chin-Chai
2009-08-01
Transmittance haze is increasingly important to the LCD and solar cell industry. Most commercial haze measurement instruments are designed according to the method recommended in the documentary standards like ASTM D 1003 (ASTM 2003 Standard Test Method for Haze and Luminous Transmittance of Transparent Plastics), JIS K 7361 (JIS 1997 Plastics—Determination of the Total Luminous Transmittance of Transparent Materials—Part 1: Single Beam Instrument) and ISO 14782 (ISO 1997 Plastics—Determination of Haze of Transparent Materials). To improve the measurement accuracy of the current standards, a new apparatus was designed by the Center for Measurement Standards (Yu et al 2006 Meas. Sci. Technol. 17 N29-36). Besides the methods mentioned above, a double-beam method is used in the design of some instruments. There are discrepancies between the various methods. But no matter which method is used, a white standard is always needed. This paper compares the measurement results from different methods, presents the effect of the white standard, and analyses the measurement uncertainty.
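For reference, the quantity compared across these instruments is defined in the ASTM D1003 family of methods as the percentage of transmitted light scattered by more than 2.5 degrees from the incident beam; numerically it reduces to a ratio of two transmittance readings, as this sketch (with illustrative numbers) shows.

```python
def haze_percent(diffuse_transmittance: float,
                 total_transmittance: float) -> float:
    """Transmittance haze in percent: the diffuse (wide-angle-scattered)
    fraction of the total luminous transmittance, times 100."""
    return 100.0 * diffuse_transmittance / total_transmittance

# A sample transmitting 90% of the light in total, 2% of it diffusely,
# has a haze of about 2.2%.
print(round(haze_percent(0.02, 0.90), 2))  # → 2.22
```

Because both readings are referenced to a white standard, errors in that standard propagate into the ratio, which is the effect the paper analyses.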
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Other methods of protecting offtrack direct-current equipment; approved by an authorized representative of the Secretary. 75.703-4 Section 75.703-4... MANDATORY SAFETY STANDARDS-UNDERGROUND COAL MINES Grounding § 75.703-4 Other methods of protecting offtrack...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Other methods of protecting offtrack direct-current equipment; approved by an authorized representative of the Secretary. 75.703-4 Section 75.703-4... MANDATORY SAFETY STANDARDS-UNDERGROUND COAL MINES Grounding § 75.703-4 Other methods of protecting offtrack...
Standardizing the Delivery of 20 μL of Hapten During Patch Testing.
Selvick, Annika; Stauss, Kari; Strobush, Katrina; Taylor, Lauren; Picard, Alexandra; Doll, Andrea; Reeder, Margo
2016-01-01
The current method for patch test tray assembly requires hand dispensing a small volume of hapten onto chambers. Because of human error, this technique produces inaccurate and inconsistent results. The recommended volume of hapten for patch testing using Finn Chambers is 20 μL. The aims of this study were to create a device that standardizes the delivery of 20 μL and to compare it with the current hand dispensing technique. A device, named the Revolution, was created using the SolidWorks program. Five nurses in our Contact Dermatitis Clinic were asked to load 10 Finn Chambers using the current technique and also using the Revolution. Assembly time, volume of petrolatum, and accuracy of placement were measured. After the 3 trials, the nurses completed a survey on the 2 methods. The amount of petrolatum dispensed using the current technique ranged from 16 to 85 μL, with an average amount of 41.39 μL. The Revolution design dispensed an average of 19.78 μL. The current hand dispensing technique does not allow for accurate and consistent dispensing of 20 μL for patch testing. In contrast, the Revolution is an accurate and consistent device that can help standardize the patch testing method.
The Development of MST Test Information for the Prediction of Test Performances
ERIC Educational Resources Information Center
Park, Ryoungsun; Kim, Jiseon; Chung, Hyewon; Dodd, Barbara G.
2017-01-01
The current study proposes novel methods to predict multistage testing (MST) performance without conducting simulations. This method, called MST test information, is based on analytic derivation of standard errors of ability estimates across theta levels. We compared standard errors derived analytically to the simulation results to demonstrate the…
Developing Carbon Nanotube Standards at NASA
NASA Technical Reports Server (NTRS)
Nikolaev, Pasha; Arepalli, Sivaram; Sosa, Edward; Gorelik, Olga; Yowell, Leonard
2007-01-01
Single wall carbon nanotubes (SWCNTs) are currently being produced and processed by several methods. Many researchers are continuously modifying existing methods and developing new methods to incorporate carbon nanotubes into other materials and utilize the phenomenal properties of SWCNTs. These applications require the availability of SWCNTs with known properties, and there is a need to characterize these materials in a consistent manner. In order to monitor such progress, it is critical to establish a means by which to define the quality of SWCNT material and to develop characterization standards to evaluate nanotube quality across the board. Such characterization standards should be applicable to as-produced materials as well as processed SWCNT materials. In order to address this issue, NASA Johnson Space Center has developed a protocol for purity and dispersion characterization of SWCNTs (Ref.1). The NASA JSC group is currently working with NIST, ANSI and ISO to establish purity and dispersion standards for SWCNT material. A practice guide for nanotube characterization is being developed in cooperation with NIST (Ref.2). Furthermore, work is in progress to incorporate additional characterization methods for electrical, mechanical, thermal, optical and other properties of SWCNTs.
Standard test method for grindability of coal by the Hardgrove-machine method. ASTM standard
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-05-01
This test method is under the jurisdiction of ASTM Committee D-5 on Coal and Coke and is the direct responsibility of Subcommittee D05.07 on Physical Characteristics of Coal. The current edition was approved on November 10, 1997, and published May 1998. It was originally published as D 409-51. The last previous edition was D 409-93a.
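The Hardgrove procedure ultimately reduces to a simple calibration relation between the mass of ground product passing the test sieve and the index. As a hedged sketch (the classical HGI equation; in practice D 409 machines are calibrated against reference coals rather than this formula alone):

```python
def hardgrove_index(mass_passing_75um_g):
    """Classical Hardgrove Grindability Index relation, HGI = 13 + 6.93*W,
    where W is the grams of ground product passing the 75 um (No. 200) sieve.
    Illustrative only; ASTM D 409 calibrates machines with reference coals."""
    return 13.0 + 6.93 * mass_passing_75um_g

hgi = hardgrove_index(5.3)  # ~49.7, a coal of moderate grindability
```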
46 CFR 188.35-1 - Standards to be used.
Code of Federal Regulations, 2010 CFR
2010-10-01
... COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) OCEANOGRAPHIC RESEARCH VESSELS GENERAL... subchapter an item, or method of construction, or testing is required to meet the standards established by the American Bureau of Shipping, the current standards in effect at the time of construction of the...
Comparative study of methods to measure the density of cementitious powders
Helsel, Michelle A.; Bentz, Dale
2016-01-01
The accurate measurement of the density of hydraulic cement has an essential role in the determination of concrete mixture proportions. As more supplementary cementitious materials (SCM), such as fly ash and slag, or cement replacement materials, such as limestone and calcium carbonate, are used in blended cements, knowledge of the density of each powder or of the blended cement would allow a more accurate calculation of the proportions of a concrete mixture by volume instead of by mass. The current ASTM standard for measuring cement density is the “Test Method for Density of Hydraulic Cements” (ASTM C188-14), which utilizes a liquid displacement method to measure the volume of the cement. This paper will examine advantageous modifications of the current ASTM test, such as substituting alcohols for kerosene. In addition, a gas (helium) pycnometry method is evaluated as a possible alternative to the current standard. The described techniques will be compared to determine the most precise and reproducible method for measuring the density of hydraulic cements and other powders. PMID:27099404
Comparative study of methods to measure the density of cementitious powders.
Helsel, Michelle A; Ferraris, Chiara F; Bentz, Dale
2016-11-01
The accurate measurement of the density of hydraulic cement has an essential role in the determination of concrete mixture proportions. As more supplementary cementitious materials (SCM), such as fly ash and slag, or cement replacement materials, such as limestone and calcium carbonate, are used in blended cements, knowledge of the density of each powder or of the blended cement would allow a more accurate calculation of the proportions of a concrete mixture by volume instead of by mass. The current ASTM standard for measuring cement density is the "Test Method for Density of Hydraulic Cements" (ASTM C188-14), which utilizes a liquid displacement method to measure the volume of the cement. This paper will examine advantageous modifications of the current ASTM test, such as substituting alcohols for kerosene. In addition, a gas (helium) pycnometry method is evaluated as a possible alternative to the current standard. The described techniques will be compared to determine the most precise and reproducible method for measuring the density of hydraulic cements and other powders.
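The displacement calculation behind a liquid pycnometer test is simple arithmetic; the sketch below shows the idea (function name and numbers are illustrative, not taken from the ASTM standard):

```python
def powder_density(mass_g, initial_volume_ml, final_volume_ml):
    """Density from liquid displacement: the powder's volume is the rise
    of the liquid level in the flask (illustrative helper, not ASTM text)."""
    displaced_ml = final_volume_ml - initial_volume_ml
    return mass_g / displaced_ml  # g/cm^3, since 1 mL == 1 cm^3

# e.g., 64 g of portland cement raising the liquid level from 0.5 mL to 20.8 mL:
rho = powder_density(64.0, 0.5, 20.8)  # ~3.15 g/cm^3, a typical cement value
```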
Samuel V. Glass; Stanley D. Gatland II; Kohta Ueno; Christopher J. Schumacher
2017-01-01
ASHRAE Standard 160, Criteria for Moisture-Control Design Analysis in Buildings, was published in 2009. The standard sets criteria for moisture design loads, hygrothermal analysis methods, and satisfactory moisture performance of the building envelope. One of the evaluation criteria specifies conditions necessary to avoid mold growth. The current standard requires that...
Reef Fish Survey Techniques: Assessing the Potential for Standardizing Methodologies.
Caldwell, Zachary R; Zgliczynski, Brian J; Williams, Gareth J; Sandin, Stuart A
2016-01-01
Dramatic changes in populations of fishes living on coral reefs have been documented globally and, in response, the research community has initiated efforts to assess and monitor reef fish assemblages. A variety of visual census techniques are employed; however, results are often incomparable due to differential methodological performance. Although comparability of data may promote improved assessment of fish populations, and thus management of often critically important nearshore fisheries, to date no standardized and agreed-upon survey method has emerged. This study describes the use of methods across the research community and identifies potential drivers of method selection. An online survey was distributed to researchers from academic, governmental, and non-governmental organizations internationally. Although many methods were identified, 89% of survey-based projects employed one of three methods: belt transect, stationary point count, and some variation of the timed swim method. The selection of survey method was independent of the research design (i.e., assessment goal) and region of study, but was related to the researcher's home institution. While some researchers expressed willingness to modify their current survey protocols to more standardized protocols (76%), their willingness decreased when methodologies were tied to long-term datasets spanning five or more years. Willingness to modify current methodologies was also less common among academic researchers than resource managers. By understanding both the current application of methods and the reported motivations for method selection, we hope to focus discussions towards increasing the comparability of quantitative reef fish survey data.
A review of the latest guidelines for NIBP device validation.
Alpert, Bruce S; Quinn, David E; Friedman, Bruce A
2013-12-01
The current ISO Standard is accepted as the National Standard in almost every industrialized nation. An overview of the most recently adopted standards is provided. Standards writing groups, including the Association for the Advancement of Medical Instrumentation (AAMI) Sphygmomanometer Committee and ISO JWG7, are working to expand standardized evaluation methods to include the evaluation of devices intended for use in environments where motion artifact is common. An AAMI task group on noninvasive blood pressure measurement in the presence of motion artifact has published a technical information report containing research and standardized methods for the evaluation of blood pressure device performance in the presence of motion artifact.
Evaluation of 3 dental unit waterline contamination testing methods
Porteous, Nuala; Sun, Yuyu; Schoolfield, John
2015-01-01
Previous studies have found inconsistent results from testing methods used to measure heterotrophic plate count (HPC) bacteria in dental unit waterline (DUWL) samples. This study used 63 samples to compare the results obtained from an in-office chairside method and 2 currently used commercial laboratory HPC methods (Standard Methods 9215C and 9215E). The results suggest that the Standard Method 9215E is not suitable for application to DUWL quality monitoring, due to the detection of limited numbers of heterotrophic organisms at the required 35°C incubation temperature. The results also confirm that while the in-office chairside method is useful for DUWL quality monitoring, the Standard Method 9215C provided the most accurate results. PMID:25574718
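Heterotrophic plate counts like those compared above are back-calculated from colony counts, plated volume, and dilution. A minimal sketch of that generic microbiology arithmetic (not the text of Standard Method 9215C; numbers illustrative):

```python
def cfu_per_ml(colonies, plated_volume_ml, dilution):
    """Back-calculate a heterotrophic plate count from a single plate:
    CFU/mL = colonies / (volume plated * dilution factor)."""
    return colonies / (plated_volume_ml * dilution)

# 46 colonies from plating 0.1 mL of a 1:100 dilution of the DUWL sample:
count = cfu_per_ml(46, 0.1, 1e-2)  # ~46,000 CFU/mL
```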
Construct Validation of Content Standards for Teaching
ERIC Educational Resources Information Center
van der Schaaf, Marieke F.; Stokking, Karel M.
2011-01-01
Current international demands to strengthen the teaching profession have led to an increased development and use of professional content standards. The study aims to provide insight in the construct validity of content standards by researching experts' underlying assumptions and preferences when participating in a delphi method. In three rounds 21…
USDA-ARS?s Scientific Manuscript database
The objective of this study was to evaluate the percentage of US producers and milk not currently meeting the proposed bulk tank somatic cell counts (BTSCC) limits. Five different limits of BTSCC were evaluated for compliance: 750K, 600K, 500K, and 400K using the current US methods and 400K using th...
A rapid method for soil cement design : Louisiana slope value method.
DOT National Transportation Integrated Search
1964-03-01
The current procedure used by the Louisiana Department of Highways for laboratory design of cement stabilized soil base and subbase courses is taken from standard AASHO test methods, patterned after Portland Cement Association criteria. These methods...
[Work quota setting and man-hour productivity estimation in pathologists].
Svistunov, V V; Makarov, S V; Makarova, A E
The paper considers the development and current state of the regulation of work quota setting and remuneration for pathologists. Based on the current staff standards for morbid anatomy departments (units), the authors present a method to calculate the workload of pathologists. The essence of the proposed method is demonstrated using a specific example.
McCourt, Clare M; McArt, Darragh G; Mills, Ken; Catherwood, Mark A; Maxwell, Perry; Waugh, David J; Hamilton, Peter; O'Sullivan, Joe M; Salto-Tellez, Manuel
2013-01-01
Next Generation Sequencing (NGS) has the potential of becoming an important tool in clinical diagnosis and therapeutic decision-making in oncology, owing to its enhanced sensitivity in DNA mutation detection, fast turnaround of samples in comparison to current gold standard methods, and the potential to sequence a large number of cancer-driving genes at one time. We aim to test the diagnostic accuracy of current NGS technology in the analysis of mutations that represent current standard-of-care, and its reliability to generate concomitant information on other key genes in human oncogenesis. Thirteen clinical samples (8 lung adenocarcinomas, 3 colon carcinomas and 2 malignant melanomas), already genotyped for EGFR, KRAS and BRAF mutations by current standard-of-care methods (Sanger sequencing and q-PCR), were analysed for detection of mutations in the same three genes using two NGS platforms, and for an additional 43 genes with one of these platforms. The results were analysed using closed platform-specific proprietary bioinformatics software as well as open third-party applications. Our results indicate that the existing format of the NGS technology performed well in detecting the clinically relevant mutations stated above but may not be reliable for a broader unsupervised analysis of the wider genome in its current design. Our study represents a diagnostically led validation of the major strengths and weaknesses of this technology before consideration for diagnostic use.
ERIC Educational Resources Information Center
Li, Deping; Oranje, Andreas
2007-01-01
Two versions of a general method for approximating standard error of regression effect estimates within an IRT-based latent regression model are compared. The general method is based on Binder's (1983) approach, accounting for complex samples and finite populations by Taylor series linearization. In contrast, the current National Assessment of…
ERIC Educational Resources Information Center
Caruthers, Tarchell Peeples
2013-01-01
Current research shows that, despite standards-based mathematics reform, American students lag behind in mathematics achievement when compared to their counterparts in other countries. The purpose of this mixed methods study was to examine if reading level, as measured by the Scholastic Reading Inventory, is related to standards-based mathematics…
REGULATORY METHODS PROGRAM SUPPORT FOR NAAQSS
This task supports attainment determinations of the National Ambient Air Quality Standards (NAAQS) for particulate matter (PM) in the areas of development, testing, and improvement of new and current PM Federal Reference Methods (FRMs) and Federal Equivalent Methods (FEMs). The ...
Code of Federal Regulations, 2010 CFR
2010-07-01
... disposal and sales price or the method used to determine current fair market value where a recipient... following standards. For equipment with a current per unit fair market value of $5,000 or more, the... in the cost of the original project or program to the current fair market value of the equipment. If...
Code of Federal Regulations, 2010 CFR
2010-07-01
... the following standards. For equipment with a current per unit fair market value of $5000 or more, the... cost of the original project or program to the current fair market value of the equipment, plus any... data, including date of disposal and sales price or the method used to determine current fair market...
Humphries, Romney M; Kircher, Susan; Ferrell, Andrea; Krause, Kevin M; Malherbe, Rianna; Hsiung, Andre; Burnham, C A
2018-05-09
Expedited pathways to antimicrobial agent approval by the United States Food and Drug Administration (FDA) have led to increased delays between drug approval and the availability of FDA-cleared antimicrobial susceptibility testing (AST) devices. Antimicrobial disks for use with disk diffusion testing are among the first AST devices available to clinical laboratories. However, many laboratories are reluctant to implement a disk diffusion method for a variety of reasons, including dwindling proficiency with this method, interruptions to laboratory workflow, uncertainty surrounding the quality and reliability of a disk diffusion test, and perceived need to report an MIC to clinicians. This mini-review provides a report from the Clinical and Laboratory Standards Institute Working Group on Methods Development and Standardization on the current standards and clinical utility of disk diffusion testing. Copyright © 2018 American Society for Microbiology.
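Interpreting a disk diffusion test is a comparison of the measured zone of inhibition against published breakpoints. The sketch below uses placeholder numbers; real breakpoints are drug- and organism-specific and come from CLSI/FDA breakpoint tables, not from this example:

```python
# Hypothetical zone-diameter breakpoints in mm (placeholders, not CLSI values)
SUSCEPTIBLE_MM = 17
RESISTANT_MM = 13

def interpret_zone(diameter_mm):
    """Categorize a disk diffusion zone diameter as S / I / R."""
    if diameter_mm >= SUSCEPTIBLE_MM:
        return "S"
    if diameter_mm <= RESISTANT_MM:
        return "R"
    return "I"

result = interpret_zone(18)  # -> "S"
```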
Mission Systems Open Architecture Science and Technology (MOAST) program
NASA Astrophysics Data System (ADS)
Littlejohn, Kenneth; Rajabian-Schwart, Vahid; Kovach, Nicholas; Satterthwaite, Charles P.
2017-04-01
The Mission Systems Open Architecture Science and Technology (MOAST) program is an AFRL effort that is developing and demonstrating Open System Architecture (OSA) component prototypes, along with methods and tools, to strategically evolve current OSA standards and technical approaches, promote affordable capability evolution, reduce integration risk, and address emerging challenges [1]. Within the context of open architectures, the program is conducting advanced research and concept development in the following areas: (1) Evolution of standards; (2) Cyber-Resiliency; (3) Emerging Concepts and Technologies; (4) Risk Reduction Studies and Experimentation; and (5) Advanced Technology Demonstrations. Current research includes the development of methods, tools, and techniques to characterize the performance of OMS data interconnection methods for representative mission system applications; of particular interest are the OMS Critical Abstraction Layer (CAL), the Avionics Service Bus (ASB), and the Bulk Data Transfer interconnects. The program is also developing and demonstrating cybersecurity countermeasure techniques to detect and mitigate cyberattacks against open-architecture-based mission systems and ensure continued mission operations. Focus is on cybersecurity techniques that augment traditional cybersecurity controls and those currently defined within the Open Mission Systems (OMS) and UCI standards. AFRL is also developing code generation tools and simulation tools to support evaluation and experimentation of OSA-compliant implementations.
Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...
2016-07-05
Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validation of analytical techniques highlighted. The majority of analytical standardization studies to-date has tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.
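Of the chemical methods tested, acid number has the simplest arithmetic. A hedged sketch of the usual titration relation (generic acid-number arithmetic, not the round robin's validated procedure; all numbers illustrative):

```python
def acid_number(titrant_ml, koh_normality, sample_g):
    """Total acid number in mg KOH per g of sample from the standard
    titration relation; 56.1 g/mol is the molar mass of KOH."""
    return titrant_ml * koh_normality * 56.1 / sample_g

# 5.0 mL of 0.1 N KOH neutralizing a 0.5 g bio-oil sample:
tan = acid_number(5.0, 0.1, 0.5)  # -> 56.1 mg KOH/g
```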
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.
Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validation of analytical techniques highlighted. The majority of analytical standardization studies to-date has tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.
The impact of heterogeneity in individual frailty on the dynamics of mortality.
Vaupel, J W; Manton, K G; Stallard, E
1979-08-01
Life table methods are developed for populations whose members differ in their endowment for longevity. Unlike standard methods, which ignore such heterogeneity, these methods use different calculations to construct cohort, period, and individual life tables. The results imply that standard methods overestimate current life expectancy and potential gains in life expectancy from health and safety interventions, while underestimating rates of individual aging, past progress in reducing mortality, and mortality differentials between pairs of populations. Calculations based on Swedish mortality data suggest that these errors may be important, especially in old age.
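The abstract's central claim, that ignoring frailty makes the observed population hazard understate individual aging, can be reproduced with the gamma-frailty relation mu_pop(x) = mu(x) / (1 + sigma^2 * H(x)). A sketch with an assumed Gompertz baseline (all parameter values illustrative, not from the paper's Swedish data):

```python
import numpy as np

# Gompertz baseline (individual) hazard mu(x) = a*exp(b*x); values illustrative
a, b = 1e-4, 0.1
sigma2 = 0.5  # variance of gamma-distributed frailty z (mean 1)

x = np.linspace(0.0, 100.0, 1001)
H = (a / b) * (np.exp(b * x) - 1.0)   # cumulative baseline hazard
mu_ind = a * np.exp(b * x)            # hazard of a standard (z = 1) individual
mu_pop = mu_ind / (1.0 + sigma2 * H)  # observed population hazard under gamma frailty

# mu_pop falls below mu_ind at later ages: a standard life table fit to the
# population therefore understates the rate of individual aging.
```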
Hydrogen Infrastructure Testing and Research Facility
... stations, enabling NREL to validate current industry standards and methods for hydrogen fueling. The HITRF is also used to develop, quantify the performance of, and improve renewable hydrogen production methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; David I. Gertman; Jeffrey C. Joe
2005-09-01
An ongoing issue within human-computer interaction (HCI) is the need for simplified or “discount” methods. The current economic slowdown has necessitated innovative methods that are results driven and cost effective. The myriad methods of design and usability are currently being cost-justified, and new techniques are actively being explored that meet current budgets and needs. Recent efforts in human reliability analysis (HRA) are highlighted by the ten-year development of the Standardized Plant Analysis Risk HRA (SPAR-H) method. The SPAR-H method has been used primarily for determining human-centered risk at nuclear power plants. The SPAR-H method, however, shares task analysis underpinnings with HCI. Despite this methodological overlap, there is currently no HRA approach deployed in heuristic usability evaluation. This paper presents an extension of the existing SPAR-H method to be used as part of heuristic usability evaluation in HCI.
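SPAR-H quantifies a human error probability (HEP) by scaling a nominal HEP with performance shaping factor (PSF) multipliers. A simplified sketch of that quantification step (the real method uses eight defined PSFs and worksheet guidance, which this omits):

```python
def spar_h_hep(nominal_hep, psf_multipliers):
    """Sketch of SPAR-H quantification: nominal HEP times the composite of
    the PSF multipliers, with the SPAR-H adjustment factor applied when
    three or more negative (multiplier > 1) PSFs are assigned, which keeps
    the result below 1. Simplified; not the full worksheet method."""
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    negative = sum(1 for m in psf_multipliers if m > 1.0)
    if negative >= 3:
        return nominal_hep * composite / (nominal_hep * (composite - 1.0) + 1.0)
    return min(nominal_hep * composite, 1.0)

# Diagnosis task (nominal HEP 0.01) under high stress (x2) and poor procedures (x5):
hep = spar_h_hep(0.01, [2.0, 5.0])  # -> 0.1
```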
The current benchmark method for detecting Cryptosporidium oocysts in water is the U.S. Environmental Protection Agency (U.S. EPA) Method 1623. Studies evaluating this method report that recoveries are highly variable and dependent upon laboratory, water sample, and analyst. Ther...
ERIC Educational Resources Information Center
Oladele, Babatunde
2017-01-01
The aim of the current study is to analyse the 2014 Post UTME scores of candidates in the university of Ibadan towards the establishment of cut off using two methods of standard settings. Prospective candidates who seek admission to higher institution are often denied admission through the Post UTME exercise. There is no single recommended…
Calibration of High Heat Flux Sensors at NIST
Murthy, A. V.; Tsai, B. K.; Gibson, C. E.
1997-01-01
An ongoing program at the National Institute of Standards and Technology (NIST) is aimed at improving and standardizing heat-flux sensor calibration methods. The current calibration needs of U.S. science and industry exceed the current NIST capability of 40 kW/m2 irradiance. To achieve this goal, as well as to meet lower-level non-radiative heat flux calibration needs of science and industry, three different types of calibration facilities are currently under development at NIST: convection, conduction, and radiation. This paper describes the research activities associated with the NIST Radiation Calibration Facility. Two different techniques, transfer and absolute, are presented. The transfer calibration technique employs a transfer standard calibrated with reference to a radiometric standard for calibrating the sensors using a graphite tube blackbody. Plans for an absolute calibration facility include the use of a spherical blackbody and a cooled aperture and sensor-housing assembly to calibrate the sensors in a low convective environment. PMID:27805156
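A transfer calibration ultimately produces a responsivity for the device under test from paired readings against the transfer standard. A minimal sketch of that reduction with made-up numbers (not NIST data or procedure):

```python
# Paired readings of the device under test against the transfer standard
# at several blackbody setpoints (all numbers illustrative)
reference_kwm2 = [5.0, 10.0, 20.0, 40.0]  # irradiance reported by the standard
sensor_mv = [1.1, 2.2, 4.4, 8.8]          # output of the sensor being calibrated

# Least-squares responsivity through the origin: R = sum(x*y) / sum(x*x)
num = sum(x * y for x, y in zip(reference_kwm2, sensor_mv))
den = sum(x * x for x in reference_kwm2)
responsivity = num / den  # mV per kW/m^2, ~0.22 for these readings
```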
Glucose Measurement: Time for a Gold Standard
Hagvik, Joakim
2007-01-01
There is no internationally recognized reference method for the measurement of blood glucose. The Centers for Disease Control and Prevention (CDC) highlighted the need for standardization some years ago when a project was started. The project objectives were to (1) investigate whether there are significant differences in calibration levels among currently used glucose monitors for home use and (2) develop a reference method for glucose determination. A first study confirmed the assumption that currently used home-use monitors differ significantly and that standardization is necessary in order to minimize variability and to improve patient care. As a reference method, CDC recommended a method based on isotope dilution gas chromatography–mass spectrometry, an assay that has received support from clinical chemists worldwide. CDC initiated a preliminary study to establish the suitability of this method, but then the project came to a halt. It is hoped that CDC, with support from the industry, as well as academic and professional organizations such as the American Association for Clinical Chemistry and International Federation of Clinical Chemistry and Laboratory Medicine, will be able to finalize the project and develop the long-awaited and much needed “gold standard” for glucose measurement. PMID:19888402
ERIC Educational Resources Information Center
Alowaydhi, Wafa Hafez
2016-01-01
The current study aimed at standardizing the program of learning Arabic for non-native speakers in Saudi Electronic University according to certain standards of total quality. To achieve its purpose, the study adopted the descriptive analytical method. The author prepared a measurement tool for evaluating the electronic learning programs in light…
A need for standardization in visual acuity measurement.
Patel, Hina; Congdon, Nathan; Strauss, Glenn; Lansingh, Charles
2017-01-01
Standardization of terminologies and methods is increasingly important in all fields including ophthalmology, especially currently when research and new technology are rapidly driving improvements in medicine. This review highlights the range of notations used by vision care professionals around the world for vision measurement, and the challenges resulting from this practice. The global community is urged to move toward a uniform standard.
Manifesting Destiny: Re/Presentations of Indigenous Peoples in K-12 U.S. History Standards
ERIC Educational Resources Information Center
Shear, Sarah B.; Knowles, Ryan T.; Soden, Gregory J.; Castro, Antonio J.
2015-01-01
In this mixed-methods study, we use a postcolonial framework to investigate how state standards represent Indigenous histories and cultures. The research questions that guided this study include: (a) What is the frequency of Indigenous content (histories, cultures, current issues) covered in state-level U.S. history standards for K-12? (b) What is…
Study of flow over object problems by a nodal discontinuous Galerkin-lattice Boltzmann method
NASA Astrophysics Data System (ADS)
Wu, Jie; Shen, Meng; Liu, Chen
2018-04-01
Flow over object problems are studied by a nodal discontinuous Galerkin-lattice Boltzmann method (NDG-LBM) in this work. Different from the standard lattice Boltzmann method, the current method applies the nodal discontinuous Galerkin method to the streaming process in LBM to solve the resultant pure convection equation, in which the spatial discretization is completed on unstructured grids and the low-storage explicit Runge-Kutta scheme is used for time marching. The present method thus overcomes the standard LBM's dependence on uniform meshes. Moreover, the collision process in the LBM is completed by using the multiple-relaxation-time scheme. After validation of the NDG-LBM by simulating the lid-driven cavity flow, simulations of flows over a fixed circular cylinder, a stationary airfoil and rotating-stationary cylinders are performed. Good agreement of the present results with previous results is achieved, which indicates that the current NDG-LBM is accurate and effective for flow over object problems.
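For contrast with the NDG streaming described above, a minimal standard BGK lattice Boltzmann step on a uniform D2Q9 grid looks like the sketch below; the shift-based streaming is exactly the uniform-mesh dependence that NDG-LBM removes (a generic textbook sketch, not the paper's MRT/NDG implementation):

```python
import numpy as np

# D2Q9 lattice: discrete velocities and weights of the standard (uniform-grid) LBM
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def bgk_step(f, tau=0.8):
    """One collision + streaming step of standard BGK LBM. NDG-LBM replaces
    the shift-based streaming below with a discontinuous Galerkin solve of
    the pure convection equation on unstructured grids."""
    rho = f.sum(axis=0)
    u = np.einsum('qi,qxy->ixy', c, f) / rho           # macroscopic velocity
    cu = np.einsum('qi,ixy->qxy', c, u)                # c_q . u
    usq = (u ** 2).sum(axis=0)
    feq = w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)
    f = f - (f - feq) / tau                            # BGK collision
    for q in range(9):                                 # streaming = lattice shift
        f[q] = np.roll(f[q], shift=c[q], axis=(0, 1))
    return f

f0 = np.tile(w[:, None, None], (1, 16, 16))            # uniform rest state
f1 = bgk_step(f0)                                      # rest state is a fixed point
```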
Standard Test Methods for Textile Composites
NASA Technical Reports Server (NTRS)
Masters, John E.; Portanova, Marc A.
1996-01-01
Standard testing methods for composite laminates reinforced with continuous networks of braided, woven, or stitched fibers have been evaluated. The microstructure of these 'textile' composite materials differs significantly from that of tape laminates. Consequently, specimen dimensions and loading methods developed for tape-type composites may not be applicable to textile composites. To this end, a series of evaluations were made comparing testing practices currently used in the composite industry. Information was gathered from a variety of sources and analyzed to establish a series of recommended test methods for textile composites. The current practices established for laminated composite materials by ASTM and the MIL-HDBK-17 Committee were considered. This document provides recommended test methods for determining both in-plane and out-of-plane properties. Specifically, test methods are suggested for: unnotched tension and compression; open and filled hole tension; open hole compression; bolt bearing; and interlaminar tension. A detailed description of the material architectures evaluated is also provided, as is a recommended instrumentation practice.
Corbel, Michael J; Das, Rose Gaines; Lei, Dianliang; Xing, Dorothy K L; Horiuchi, Yoshinobu; Dobbelaer, Roland
2008-04-07
This report reflects the discussion and conclusions of a WHO group of experts from National Regulatory Authorities (NRAs), National Control Laboratories (NCLs), vaccine industries and other relevant institutions involved in the standardization and control of diphtheria, tetanus and pertussis (DTP) vaccines, held on 20-21 July 2006 and 28-30 March 2007 in Geneva, Switzerland, for the revision of the WHO Manual for quality control of DTP vaccines. Taking into account recent developments in the standardization of quality control methods and the revision of the WHO recommendations for D, T and P vaccines, a need for updating the manual was recognized. In these two meetings the current quality control methods for potency, safety and identity testing of DTP vaccines, and the statistical analysis of data, were reviewed. Based on the WHO recommendations and recent validation of testing methods, the contents of the current manual were reviewed and discussed. The group agreed that the principles to be observed in selecting methods included identifying those critical for assuring safety, efficacy and quality and those consistent with WHO recommendations/requirements. Methods that are well recognized but not yet included in the current Recommendations should also be taken into account, including in vivo and/or in vitro methods for determining potency, safety testing and identity. The statistical analysis of the data should be revised and updated. It was noted that mouse-based assays for toxoid potency were still quite widely used, and it was desirable to establish appropriate standards for these to enable the results to be related to the standard guinea pig assays. The working group met again to review the first drafts and to provide further suggestions and amendments to the contributions of the drafting groups. The revised manual was to be finalized and published by WHO.
Determination of viable legionellae in engineered water systems: Do we find what we are looking for?
Kirschner, Alexander K.T.
2016-01-01
In developed countries, legionellae are among the most important water-based bacterial pathogens, with infections typically caused by management failure of engineered water systems. For routine surveillance of legionellae in engineered water systems and outbreak investigations, cultivation-based standard techniques are currently applied. However, in many cases culture-negative results are obtained despite the presence of viable legionellae, and clinical cases of legionellosis cannot be traced back to their respective contaminated water source. Among the various explanations for these discrepancies, the presence of viable but non-culturable (VBNC) Legionella cells has received increased attention in recent discussions and scientific literature. Alternative culture-independent methods to detect and quantify legionellae have been proposed in order to complement or even substitute the culture method in the future. Such methods should detect VBNC Legionella cells and provide a more comprehensive picture of the presence of legionellae in engineered water systems. However, it is still unclear whether and to what extent these VBNC legionellae are hazardous to human health. Current risk assessment models to predict the risk of legionellosis from Legionella concentrations in the investigated water systems contain many uncertainties and are mainly based on culture-based enumeration. If VBNC legionellae are to be considered in future standard analysis, quantitative risk assessment models including VBNC legionellae must be proven to result in better estimates of human health risk than models based on cultivation alone. This review critically evaluates current methods to determine legionellae in the VBNC state, their potential to complement the standard culture-based method in the near future, and summarizes current knowledge on the threat that VBNC legionellae may pose to human health. PMID:26928563
Determination of viable legionellae in engineered water systems: Do we find what we are looking for?
Kirschner, Alexander K T
2016-04-15
In developed countries, legionellae are among the most important water-based bacterial pathogens, with infections typically caused by management failure of engineered water systems. For routine surveillance of legionellae in engineered water systems and outbreak investigations, cultivation-based standard techniques are currently applied. However, in many cases culture-negative results are obtained despite the presence of viable legionellae, and clinical cases of legionellosis cannot be traced back to their respective contaminated water source. Among the various explanations for these discrepancies, the presence of viable but non-culturable (VBNC) Legionella cells has received increased attention in recent discussions and scientific literature. Alternative culture-independent methods to detect and quantify legionellae have been proposed in order to complement or even substitute the culture method in the future. Such methods should detect VBNC Legionella cells and provide a more comprehensive picture of the presence of legionellae in engineered water systems. However, it is still unclear whether and to what extent these VBNC legionellae are hazardous to human health. Current risk assessment models to predict the risk of legionellosis from Legionella concentrations in the investigated water systems contain many uncertainties and are mainly based on culture-based enumeration. If VBNC legionellae should be considered in future standard analysis, quantitative risk assessment models including VBNC legionellae must be proven to result in better estimates of human health risk than models based on cultivation alone. This review critically evaluates current methods to determine legionellae in the VBNC state, their potential to complement the standard culture-based method in the near future, and summarizes current knowledge on the threat that VBNC legionellae may pose to human health. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.
A Randomized, Controlled Trial of ZMapp for Ebola Virus Infection
2016-01-01
BACKGROUND Data from studies in nonhuman primates suggest that the triple monoclonal antibody cocktail ZMapp is a promising immune-based treatment for Ebola virus disease (EVD). METHODS Beginning in March 2015, we conducted a randomized, controlled trial of ZMapp plus the current standard of care as compared with the current standard of care alone in patients with EVD that was diagnosed in West Africa by polymerase-chain-reaction (PCR) assay. Eligible patients of any age were randomly assigned in a 1:1 ratio to receive either the current standard of care or the current standard of care plus three intravenous infusions of ZMapp (50 mg per kilogram of body weight, administered every third day). Patients were stratified according to baseline PCR cycle-threshold value for the virus (≤22 vs. >22) and country of enrollment. Oral favipiravir was part of the current standard of care in Guinea. The primary end point was mortality at 28 days. RESULTS A total of 72 patients were enrolled at sites in Liberia, Sierra Leone, Guinea, and the United States. Of the 71 patients who could be evaluated, 21 died, representing an overall case fatality rate of 30%. Death occurred in 13 of 35 patients (37%) who received the current standard of care alone and in 8 of 36 patients (22%) who received the current standard of care plus ZMapp. The observed posterior probability that ZMapp plus the current standard of care was superior to the current standard of care alone was 91.2%, falling short of the prespecified threshold of 97.5%. Frequentist analyses yielded similar results (absolute difference in mortality with ZMapp, −15 percentage points; 95% confidence interval, −36 to 7). Baseline viral load was strongly predictive of both mortality and duration of hospitalization in all age groups. 
CONCLUSIONS In this randomized, controlled trial of a putative therapeutic agent for EVD, although the estimated effect of ZMapp appeared to be beneficial, the result did not meet the prespecified statistical threshold for efficacy. (Funded by the National Institute of Allergy and Infectious Diseases and others; PREVAIL II ClinicalTrials.gov number, NCT02363322.) PMID:27732819
NASA Technical Reports Server (NTRS)
Larkin, Paul; Goldstein, Bob
2008-01-01
This paper presents an update to the methods and procedures used in Direct Field Acoustic Testing (DFAT). The paper will discuss some of the recent techniques and developments that are currently being used and the future publication of a reference standard. Acoustic testing using commercial sound system components is becoming a popular and cost effective way of generating a required acoustic test environment both in and out of a reverberant chamber. This paper will present the DFAT test method, the usual setup and procedure and the development and use of a closed-loop, narrow-band control system. Narrow-band control of the acoustic PSD allows all standard techniques and procedures currently used in random control to be applied to acoustics and some examples are given. The paper will conclude with a summary of the development of a standard practice guideline that is hoped to be available in the first quarter of next year.
FBI compression standard for digitized fingerprint images
NASA Astrophysics Data System (ADS)
Brislawn, Christopher M.; Bradley, Jonathan N.; Onyshczak, Remigius J.; Hopper, Thomas
1996-11-01
The FBI has formulated national standards for digitization and compression of gray-scale fingerprint images. The compression algorithm for the digitized images is based on adaptive uniform scalar quantization of a discrete wavelet transform subband decomposition, a technique referred to as the wavelet/scalar quantization method. The algorithm produces archival-quality images at compression ratios of around 15 to 1 and will allow the current database of paper fingerprint cards to be replaced by digital imagery. A compliance testing program is also being implemented to ensure high standards of image quality and interchangeability of data between different implementations. We will review the current status of the FBI standard, including the compliance testing process and the details of the first-generation encoder.
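The wavelet/scalar quantization idea behind the FBI standard can be illustrated with a toy sketch: one level of a Haar decomposition followed by uniform scalar quantization of the detail coefficients. This conveys only the flavor of the technique; the specification's actual filter bank, 64-subband layout, adaptive step sizes, and entropy coder are all omitted, and the function names are illustrative.

```python
# Toy flavor of wavelet/scalar quantization (WSQ): one Haar decomposition
# step, then uniform scalar quantization of the detail coefficients. The FBI
# specification's actual filter bank, subband layout, adaptive step sizes,
# and entropy coding are all omitted; names here are illustrative.

def haar_step(x):
    """One level of a 1-D Haar transform: (averages, details)."""
    avg = [(x[2 * i] + x[2 * i + 1]) / 2.0 for i in range(len(x) // 2)]
    det = [(x[2 * i] - x[2 * i + 1]) / 2.0 for i in range(len(x) // 2)]
    return avg, det

def quantize(coeffs, step):
    """Uniform scalar quantization: coefficient -> integer bin index."""
    return [round(c / step) for c in coeffs]

def dequantize(indices, step):
    return [i * step for i in indices]

row = [100, 102, 98, 96, 120, 124, 50, 52]   # one row of pixel values
avg, det = haar_step(row)
q = quantize(det, step=2.0)                  # most details quantize to 0
rec = dequantize(q, step=2.0)                # error <= step/2 per coefficient
```

Compression comes from most quantized detail indices being zero or small, which an entropy coder can then exploit; reconstruction error per coefficient is bounded by half the quantization step.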
A convenient method for X-ray analysis in TEM that measures mass thickness and composition
NASA Astrophysics Data System (ADS)
Statham, P.; Sagar, J.; Holland, J.; Pinard, P.; Lozano-Perez, S.
2018-01-01
We consider a new approach for quantitative analysis in transmission electron microscopy (TEM) that offers the same convenience as single-standard quantitative analysis in scanning electron microscopy (SEM). Instead of a bulk standard, a thin film with known mass thickness is used as a reference. The procedure involves recording an X-ray spectrum from the reference film for each session of acquisitions on real specimens. There is no need to measure the beam current; the current only needs to be stable for the duration of the session. A new reference standard with a large (1 mm x 1 mm) area of uniform thickness of 100 nm silicon nitride is used to reveal regions of X-ray detector occlusion that would give misleading results for any X-ray method that measures thickness. Unlike previous methods, the new X-ray method does not require an accurate beam current monitor but delivers equivalent accuracy in mass thickness measurement. Quantitative compositional results are also automatically corrected for specimen self-absorption. The new method is tested using a wedge specimen of Inconel 600 that is used to calibrate the high-angle annular dark-field (HAADF) signal to provide a thickness reference and results are compared with electron energy-loss spectrometry (EELS) measurements. For the new X-ray method, element composition results are consistent with the expected composition for the alloy and the mass thickness measurement is shown to provide an accurate alternative to EELS for thickness determination in TEM without the uncertainty associated with mean free path estimates.
ISO radiation sterilization standards
NASA Astrophysics Data System (ADS)
Lambert, Byron J.; Hansen, Joyce M.
1998-06-01
This presentation provides an overview of the current status of the ISO radiation sterilization standards. The ISO standards are voluntary standards which detail both the validation and routine control of the sterilization process. ISO 11137 was approved in 1994 and published in 1995. When reviewing the standard you will note that less than 20% of the standard is devoted to requirements; the remainder is guidance on how to comply with the requirements. Future standards developments in radiation sterilization are focused on providing additional guidance. The guidance currently provided in informative annexes of ISO 11137 covers device/packaging materials, dose setting methods, and dosimeters and dose measurement. In addition, four Technical Reports are being developed to provide further guidance: 1. AAMI Draft TIR, "Radiation Sterilization Material Qualification"; 2. ISO TR 13409-1996, "Sterilization of health care products — Radiation sterilization — Substantiation of 25 kGy as a sterilization dose for small or infrequent production batches"; 3. ISO Draft TR, "Sterilization of health care products — Radiation sterilization — Selection of a sterilization dose for a single production batch"; 4. ISO Draft TR, "Sterilization of health care products — Radiation sterilization — Product Families, Plans for Sampling and Frequency of Dose Audits."
The TIAA Graded Payment Method and the CPI.
ERIC Educational Resources Information Center
King, Francis P.
1995-01-01
The graded payment method of receiving traditional annuity benefits was introduced by the Teachers Insurance and Annuity Association (TIAA) in 1982 to add an inflation-fighting factor to the annuity program. Under the graded method, in contrast to the standard method, a part of current annuity dividend income is withheld each year to…
Current methods invariably require sample concentration, typically solid-phase extraction, so as to be amenable to measurement at ambient concentration levels. Such methods (e.g. EPA Method 544) are only validated for a limited number of the known variants where standards are ...
A study on setting of the fatigue limit of temporary dental implants.
Kim, M H; Cho, E J; Lee, J W; Kim, E K; Yoo, S H; Park, C W
2017-07-01
A temporary dental implant is a medical device which is temporarily used to support a prosthesis such as an artificial tooth used for restoring a patient's masticatory function during implant treatment. It is implanted in the oral cavity to substitute for the role of a tooth. Due to the aging and westernization of current Korean society, the number of tooth extraction and implantation procedures is increasing, leading to an increase in the use and development of temporary dental implants. Because an implant performs a masticatory function in place of a tooth, a dynamic load is repeatedly put on the implant. Thus, fatigue is reported to be the most common cause of implant fracture. According to an investigation and analysis of current domestic and international standards, no standard separately specifies the fatigue of implant fixtures. Although a test method for measuring fatigue is suggested in an ISO standard, it is a standard for permanent dental implants. Most of the test standards of Korean manufacturers and importers apply 250 N or more based on the guidance for the safety and performance evaluation of dental implants. Therefore, this study was intended to determine a fatigue standard that can be applied to temporary dental implants when fatigue is measured according to the test method suggested in the permanent dental implant standard. The results determined that suitable fatigue standards for temporary dental implants should be provided by each manufacturer rather than applying 250 N. This study will be useful for the establishment of the fatigue standards and fatigue test methods of the manufacturers and importers of temporary dental implants.
Shirasaki, Osamu; Asou, Yosuke; Takahashi, Yukio
2007-12-01
Owing to fast or stepwise cuff deflation, or measuring at places other than the upper arm, the clinical accuracy of most recent automated sphygmomanometers (auto-BPMs) cannot be validated by one-arm simultaneous comparison, which would be the only accurate validation method based on auscultation. Two main alternative methods are provided by current standards, that is, two-arm simultaneous comparison (method 1) and one-arm sequential comparison (method 2); however, the accuracy of these validation methods might not be sufficient to compensate for lateral blood pressure (BP) differences (LD) and/or BP variations (BPV) between the device and reference readings. Thus, the Japan ISO-WG for sphygmomanometer standards has been studying a new method that might improve validation accuracy (method 3). The purpose of this study is to determine the appropriateness of method 3 by comparing immunity to LD and BPV with those of the current validation methods (methods 1 and 2). The validation accuracy of the above three methods was assessed in human participants [N=120, 45 ± 15.3 years (mean ± SD)]. An oscillometric automated monitor, Omron HEM-762, was used as the tested device. When compared with the others, methods 1 and 3 showed a smaller intra-individual standard deviation of device error (SD1), suggesting their higher reproducibility of validation. The SD1 by method 2 significantly correlated with the participant's BP (P=0.004), supporting our hypothesis that the increased SD of device error by method 2 is at least partially caused by essential BPV. Method 3 showed a significantly (P=0.0044) smaller interparticipant SD of device error (SD2), suggesting its higher interparticipant consistency of validation. Among the methods of validation of the clinical accuracy of auto-BPMs, method 3, which showed the highest reproducibility and highest interparticipant consistency, can be proposed as the most appropriate.
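The two summary statistics this study compares can be sketched as follows: SD1, the within-participant spread of device error averaged over participants, and SD2, the spread of per-participant mean errors. The abstract does not give the exact estimators used by the working group, so these formulas are an assumption.

```python
# Two summary statistics for validation accuracy, sketched with the stdlib:
#   SD1 = mean within-participant standard deviation of device error
#   SD2 = standard deviation of per-participant mean errors
# (Assumed formulas; the abstract does not give the exact estimators.)

import statistics

def sd1_sd2(errors_by_participant):
    """errors_by_participant[k] = list of (device - reference) errors, in
    mmHg, for participant k (at least two readings each)."""
    within = [statistics.stdev(e) for e in errors_by_participant]
    means = [statistics.mean(e) for e in errors_by_participant]
    sd1 = statistics.mean(within)   # reproducibility of validation
    sd2 = statistics.stdev(means)   # interparticipant consistency
    return sd1, sd2

# Three hypothetical participants with two paired readings each.
errors = [[2.0, 4.0], [-1.0, 1.0], [0.0, 2.0]]
sd1, sd2 = sd1_sd2(errors)
```

On this reading, a validation method with high reproducibility drives SD1 down, while one with high interparticipant consistency drives SD2 down, which is how the study ranks methods 1-3.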
Testing Standard Reliability Criteria
ERIC Educational Resources Information Center
Sherry, David
2017-01-01
Maul's paper, "Rethinking Traditional Methods of Survey Validation" (Andrew Maul), contains two stages. First he presents empirical results that cast doubt on traditional methods for validating psychological measurement instruments. These results motivate the second stage, a critique of current conceptions of psychological measurement…
Grimes, D.J.; Marranzino, A.P.
1968-01-01
Two spectrographic methods are used in mobile field laboratories of the U. S. Geological Survey. In the direct-current arc method, the ground sample is mixed with graphite powder, packed into an electrode crater, and burned to completion. Thirty elements are determined. In the spark method, the sample, ground to pass a 150-mesh screen, is digested in hydrofluoric acid followed by evaporation to dryness and dissolution in aqua regia. The solution is fed into the spark gap by means of a rotating-disk electrode arrangement and is excited with an alternating-current spark discharge. Fourteen elements are determined. In both techniques, light is recorded on Spectrum Analysis No. 1, 35-millimeter film, and the spectra are compared visually with those of standard films.
Evaluating the Performance of the IEEE Standard 1366 Method for Identifying Major Event Days
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eto, Joseph H.; LaCommare, Kristina Hamachi; Sohn, Michael D.
IEEE Standard 1366 offers a method for segmenting reliability performance data to isolate the effects of major events from the underlying year-to-year trends in reliability. Recent analysis by the IEEE Distribution Reliability Working Group (DRWG) has found that reliability performance of some utilities differs from the expectations that helped guide the development of the Standard 1366 method. This paper proposes quantitative metrics to evaluate the performance of the Standard 1366 method in identifying major events and in reducing year-to-year variability in utility reliability. The metrics are applied to a large sample of utility-reported reliability data to assess performance of the method with alternative specifications that have been considered by the DRWG. We find that none of the alternatives performs uniformly 'better' than the current Standard 1366 method. That is, none of the modifications uniformly lowers the year-to-year variability in System Average Interruption Duration Index without major events. Instead, for any given alternative, while it may lower the value of this metric for some utilities, it also increases it for other utilities (sometimes dramatically). Thus, we illustrate some of the trade-offs that must be considered in using the Standard 1366 method and highlight the usefulness of the metrics we have proposed in conducting these evaluations.
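The Standard 1366 machinery being evaluated is the "2.5 beta" method: fit a log-normal distribution to daily SAIDI values and flag any day exceeding exp(alpha + 2.5*beta) as a major event day. A stdlib sketch under simplifying assumptions; the standard's five-year data window, handling of zero-SAIDI days, and exact dispersion estimator are treated more carefully in the standard itself.

```python
# Sketch of the IEEE Std 1366 "2.5 beta" major event day (MED) segmentation:
# take natural logs of daily SAIDI, estimate alpha (mean) and beta (std dev),
# and flag days whose SAIDI exceeds exp(alpha + 2.5*beta). Simplified: zero-
# SAIDI days are skipped and the five-year data window is ignored.

import math
import statistics

def med_threshold(daily_saidi):
    logs = [math.log(x) for x in daily_saidi if x > 0]
    alpha = statistics.mean(logs)
    beta = statistics.pstdev(logs)   # population SD here; the standard's
                                     # exact estimator may differ
    return math.exp(alpha + 2.5 * beta)

def major_event_days(daily_saidi):
    t_med = med_threshold(daily_saidi)
    return [i for i, x in enumerate(daily_saidi) if x > t_med]

daily = [1.0] * 30 + [100.0]     # 30 quiet days, then one storm day
meds = major_event_days(daily)   # only the storm day exceeds the threshold
```

Days above the threshold are segmented out before computing underlying year-to-year reliability trends, which is exactly the behavior whose variability the paper's metrics quantify.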
Ion diffusion may introduce spurious current sources in current-source density (CSD) analysis.
Halnes, Geir; Mäki-Marttunen, Tuomo; Pettersen, Klas H; Andreassen, Ole A; Einevoll, Gaute T
2017-07-01
Current-source density (CSD) analysis is a well-established method for analyzing recorded local field potentials (LFPs), that is, the low-frequency part of extracellular potentials. Standard CSD theory is based on the assumption that all extracellular currents are purely ohmic, and thus neglects the possible impact from ionic diffusion on recorded potentials. However, it has previously been shown that in physiological conditions with large ion-concentration gradients, diffusive currents can evoke slow shifts in extracellular potentials. Using computer simulations, we here show that diffusion-evoked potential shifts can introduce errors in standard CSD analysis, and can lead to prediction of spurious current sources. Further, we here show that the diffusion-evoked prediction errors can be removed by using an improved CSD estimator which accounts for concentration-dependent effects. NEW & NOTEWORTHY Standard CSD analysis does not account for ionic diffusion. Using biophysically realistic computer simulations, we show that unaccounted-for diffusive currents can lead to the prediction of spurious current sources. This finding may be of strong interest for in vivo electrophysiologists doing extracellular recordings in general, and CSD analysis in particular. Copyright © 2017 the American Physiological Society.
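The standard estimator being critiqued treats all extracellular currents as ohmic: in one dimension the CSD is the negative second spatial derivative of the potential scaled by tissue conductivity. A minimal sketch with illustrative values; the conductivity and electrode spacing below are not taken from the paper, and the diffusion-corrected estimator the authors propose would add concentration-dependent terms not shown here.

```python
# One-dimensional "standard" CSD estimate: all currents assumed ohmic, so
# CSD(z) ~ -sigma * d^2(phi)/dz^2, discretized with central differences.
# sigma and the electrode spacing h below are illustrative values only.

def csd_standard(phi, h, sigma):
    """phi: LFP samples (V) at equally spaced depths; h: spacing (m);
    sigma: extracellular conductivity (S/m). Returns CSD (A/m^3) at the
    interior electrodes."""
    return [-sigma * (phi[i - 1] - 2.0 * phi[i] + phi[i + 1]) / h ** 2
            for i in range(1, len(phi) - 1)]

# A negative potential deflection at the middle electrode appears as a
# current sink there (negative CSD), flanked by sources.
phi = [0.0, 0.0, -1e-3, 0.0, 0.0]              # volts
csd = csd_standard(phi, h=1e-4, sigma=0.3)     # 100 um spacing, 0.3 S/m
```

A slow diffusion-evoked potential shift fed into this estimator would be attributed entirely to ohmic current, which is how the spurious sources described above arise.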
A CRITICAL EVALUATION OF A FLOW CYTOMETER USED FOR DETECTING ENTEROCOCCI IN RECREATIONAL WATERS
The current U. S. Environmental Protection Agency-approved method for enterococci (Method 1600) in recreational water is a membrane filter (MF) method that takes 24 hours to obtain results. If the recreational water is not in compliance with the standard, the risk of exposure to...
[Modified Delphi method in the constitution of school sanitation standard].
Yin, Xunqiang; Liang, Ying; Tan, Hongzhuan; Gong, Wenjie; Deng, Jing; Luo, Jiayou; Di, Xiaokang; Wu, Yue
2012-11-01
To constitute a school sanitation standard using the modified Delphi method, and to explore the feasibility and advantages of the Delphi method in the constitution of school sanitation standards. Two rounds of expert consultations were adopted in this study. The data were analyzed with SPSS 15.0 to screen indices for the school sanitation standard. Thirty-two experts accomplished the 2 rounds of consultations. The average length of expert service was (24.69 ± 8.53) years. The authority coefficient was 0.729 ± 0.172. The expert positive coefficient was 94.12% (32/34) in the first round and 100% (32/32) in the second round. The harmonious coefficients of importance, feasibility and rationality in the second round were 0.493 (P<0.05), 0.527 (P<0.01), and 0.535 (P<0.01), respectively, suggesting unanimous expert opinions. According to the second round of consultation, 38 indices were included in the framework. Theoretical analysis, literature review, and investigation are the methods generally used in health standard constitution at present. The Delphi method is a rapid, effective and feasible method in this field.
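The "harmonious coefficients" reported for the second round are expert-agreement statistics of the kind conventionally computed as Kendall's coefficient of concordance W. Whether the authors used exactly this form (or a tie-corrected variant) is an assumption; the sketch below assumes each expert assigns untied ranks.

```python
# Kendall's coefficient of concordance W for m experts ranking n items,
# assuming untied ranks (no tie correction). Used here as the conventional
# form of the "harmonious coefficient" reported in Delphi studies.

def kendalls_w(ratings):
    """ratings[j][i] = rank (1..n) given by expert j to item i."""
    m = len(ratings)           # number of experts
    n = len(ratings[0])        # number of items ranked
    totals = [sum(r[i] for r in ratings) for i in range(n)]
    mean_total = sum(totals) / n
    s = sum((t - mean_total) ** 2 for t in totals)
    return 12.0 * s / (m * m * (n ** 3 - n))

perfect = [[1, 2, 3, 4]] * 3                        # full agreement -> W = 1
mixed = [[1, 2, 3, 4], [2, 1, 3, 4], [1, 2, 4, 3]]  # partial agreement
```

W ranges from 0 (no agreement) to 1 (perfect agreement); values near 0.5 with small P values, as reported above, indicate moderate but statistically significant concordance among experts.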
Using transcranial direct-current stimulation (tDCS) to understand cognitive processing.
Reinhart, Robert M G; Cosman, Josh D; Fukuda, Keisuke; Woodman, Geoffrey F
2017-01-01
Noninvasive brain stimulation methods are becoming increasingly common tools in the kit of the cognitive scientist. In particular, transcranial direct-current stimulation (tDCS) is showing great promise as a tool to causally manipulate the brain and understand how information is processed. The popularity of this method of brain stimulation is based on the fact that it is safe, inexpensive, its effects are long lasting, and you can increase the likelihood that neurons will fire near one electrode and decrease the likelihood that neurons will fire near another. However, this method of manipulating the brain to draw causal inferences is not without complication. Because tDCS methods continue to be refined and are not yet standardized, there are reports in the literature that show some striking inconsistencies. Primary among the complications of the technique is that the tDCS method uses two or more electrodes to pass current and all of these electrodes will have effects on the tissue underneath them. In this tutorial, we will share what we have learned about using tDCS to manipulate how the brain perceives, attends, remembers, and responds to information from our environment. Our goal is to provide a starting point for new users of tDCS and spur discussion of the standardization of methods to enhance replicability.
Using transcranial direct-current stimulation (tDCS) to understand cognitive processing
Reinhart, Robert M.G.; Cosman, Josh D.; Fukuda, Keisuke; Woodman, Geoffrey F.
2017-01-01
Noninvasive brain stimulation methods are becoming increasingly common tools in the kit of the cognitive scientist. In particular, transcranial direct-current stimulation (tDCS) is showing great promise as a tool to causally manipulate the brain and understand how information is processed. The popularity of this method of brain stimulation is based on the fact that it is safe, inexpensive, its effects are long lasting, and you can increase the likelihood that neurons will fire near one electrode and decrease the likelihood that neurons will fire near another. However, this method of manipulating the brain to draw causal inferences is not without complication. Because tDCS methods continue to be refined and are not yet standardized, there are reports in the literature that show some striking inconsistencies. Primary among the complications of the technique is that the tDCS method uses two or more electrodes to pass current and all of these electrodes will have effects on the tissue underneath them. In this tutorial, we will share what we have learned about using tDCS to manipulate how the brain perceives, attends, remembers, and responds to information from our environment. Our goal is to provide a starting point for new users of tDCS and spur discussion of the standardization of methods to enhance replicability. PMID:27804033
Code of Federal Regulations, 2010 CFR
2010-07-01
... 1210.34 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION GENERAL RULES... data, including date of disposal and sales price or the method used to determine current fair market... the following standards. For equipment with a current per unit fair market value of $5,000 or more...
The Cost of a Military Person-Year. A Method for Computing Savings from Force Reductions
2007-01-01
Glossary excerpt: FASAB, Federal Accounting Standards Advisory Board; FASB, Financial Accounting Standards Board; FERS, Federal Employee Retirement System; FG, federal... an average prescribed SMCR of only $61,393, which is clearly problematic: Using the Comptroller's own methods as mandated by DoD financial man... financial commitment [Figure 4.2: Probability-Adjusted Present Values of Retirement Benefits for Current Members by YOS, RAND MG598-4.2]
1990-10-04
methods Category 6: Cryptographic methods (hard-/software) - Tested countermeasures and standard means - Acknowledgements: As the number of antivirus ...Skulason), only our own antiviruses have been mentioned in the catalog. We hope to include the major antivirus packages in the future. The current... [residual list of participating organizations: GTE, SRI International, Trusted Information Systems, Inc., Grumann Data Systems, Software Engineering Institute]
Evaluation of new aquatic toxicity test methods for oil dispersants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pace, C.B.; Clark, J.R.; Bragin, G.E.
1994-12-31
Current aquatic toxicity test methods used for dispersant registration do not address real world exposure scenarios. Current test methods require 48 or 96 hour constant exposure conditions. In contrast, environmentally realistic exposures can be described as a pulse in which the initial concentration declines over time. Recent research using a specially designed testing apparatus (the California system) has demonstrated that exposure to Corexit 9527® under pulsed exposure conditions may be 3 to 22 times less toxic compared to continuous exposure scenarios. The objectives of this study were to compare results of toxicity tests using the California test system to results from standardized tests, evaluate sensitivity of regional (Holmesimysis costata and Atherinops affinis) vs. standard test species (Mysidopsis bahia and Menidia beryllina) and determine if tests using the California test system and method are reproducible. All tests were conducted using Corexit 9527® as the test material. Standard toxicity tests conducted with M. bahia and H. costata resulted in LC50s similar to those from tests using the California apparatus. LC50s from tests conducted in the authors' laboratory with the California system and standard test species were within a factor of 2 to 6 of data previously reported for west coast species. Results of tests conducted with H. costata in the laboratory compared favorably to data reported by Singer et al. 1991.
ERIC Educational Resources Information Center
Hopwood, Christopher J.; Morey, Leslie C.; Edelen, Maria Orlando; Shea, M. Tracie; Grilo, Carlos M.; Sanislow, Charles A.; McGlashan, Thomas H.; Daversa, Maria T.; Gunderson, John G.; Zanarini, Mary C.; Markowitz, John C.; Skodol, Andrew E.
2008-01-01
Interview methods are widely regarded as the standard for the diagnosis of borderline personality disorder (BPD), whereas self-report methods are considered a time-efficient alternative. However, the relative validity of these methods has not been sufficiently tested. The current study used data from the Collaborative Longitudinal Personality…
A Tool for Estimating Variability in Wood Preservative Treatment Retention
Patricia K. Lebow; Adam M. Taylor; Timothy M. Young
2015-01-01
Composite sampling is standard practice for evaluation of preservative retention levels in preservative-treated wood. Current protocols provide an average retention value but no estimate of uncertainty. Here we describe a statistical method for calculating uncertainty estimates using the standard sampling regime with minimal additional chemical analysis. This tool can...
Some Issues in Item Response Theory: Dimensionality Assessment and Models for Guessing
ERIC Educational Resources Information Center
Smith, Jessalyn
2009-01-01
Currently, standardized tests are widely used as a method to measure how well schools and students meet academic standards. As a result, measurement issues have become an increasingly popular topic of study. Unidimensional item response models are used to model latent abilities and specific item characteristics. This class of models makes…
La Barbera, Luigi; Galbusera, Fabio; Wilke, Hans-Joachim; Villa, Tomaso
2016-09-01
To discuss whether the available standard methods for preclinical evaluation of posterior spine stabilization devices can represent basic everyday life activities and how to compare the results obtained with different procedures. A comparative finite element study compared the ASTM F1717 and ISO 12189 standards to validated instrumented L2-L4 segments undergoing standing, upper body flexion and extension. The internal loads on the spinal rod and the maximum stress on the implant are analysed. The ISO recommended anterior support stiffness and force allow for reproducing bending moments measured in vivo on an instrumented physiological segment during upper body flexion. Despite the significance of the ASTM model from an engineering point of view, the overly conservative vertebrectomy model represents an unrealistic worst case scenario. A method is proposed to determine the load to apply on assemblies with different anterior support stiffnesses to guarantee a comparable bending moment and reproduce specific everyday life activities. The study increases our awareness of the use of the current standards to achieve meaningful results that are easy to compare and interpret.
Feliciano, Rodrigo P; Shea, Michael P; Shanmuganayagam, Dhanansayan; Krueger, Christian G; Howell, Amy B; Reed, Jess D
2012-05-09
The 4-(dimethylamino)cinnamaldehyde (DMAC) assay is currently used to quantify proanthocyanidin (PAC) content in cranberry products. However, this method suffers from issues of accuracy and precision in the analysis and comparison of PAC levels across a broad range of cranberry products. Current use of procyanidin A2 as a standard leads to an underestimation of PACs content in certain cranberry products, especially those containing higher molecular weight PACs. To begin to address the issue of accuracy, a method for the production of a cranberry PAC standard, derived from an extraction of cranberry (c-PAC) press cake, was developed and evaluated. Use of the c-PAC standard to quantify PAC content in cranberry samples resulted in values that were 2.2 times higher than those determined by procyanidin A2. Increased accuracy is critical for estimating PAC content in relationship to research on authenticity, efficacy, and bioactivity, especially in designing clinical trials for determination of putative health benefits.
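The quantification step of the DMAC assay reduces to reading sample absorbances off a least-squares standard curve; which material generates the curve (procyanidin A2 vs. the c-PAC extract) changes the slope and hence the reported PAC content. The concentrations and absorbances below are hypothetical; only the workflow reflects the assay described above.

```python
# Linear calibration as used to quantify proanthocyanidins (PACs) in the DMAC
# assay. Values are hypothetical; the workflow is: standards at known
# concentrations -> least-squares line -> read off the unknown sample.

def fit_line(xs, ys):
    """Ordinary least squares for y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

def concentration(absorbance, m, b):
    """Invert the standard curve for an unknown sample."""
    return (absorbance - b) / m

# Hypothetical standard curve (concentration in ug/mL vs. absorbance).
conc = [0.0, 10.0, 20.0, 40.0]
absorb = [0.00, 0.22, 0.44, 0.88]
m, b = fit_line(conc, absorb)
sample = concentration(0.55, m, b)
```

A standard whose curve has a lower slope maps the same absorbance to a higher concentration, which is how switching from procyanidin A2 to the c-PAC reference raised the reported PAC content by roughly 2.2 times in the study above.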
Code of Federal Regulations, 2010 CFR
2010-10-01
... disposal and sales price or the method used to determine current fair market value where a recipient... accordance with the following standards. For equipment with a current per unit fair market value of $5,000 or... fair market value of the equipment. If the recipient has no need for the equipment, the recipient shall...
Code of Federal Regulations, 2010 CFR
2010-01-01
... disposal and sales price or the method used to determine current fair market value where a recipient... accordance with the following standards. For equipment with a current per unit fair market value of $5000 or... fair market value of the equipment. If the recipient has no need for the equipment, the recipient shall...
Code of Federal Regulations, 2010 CFR
2010-07-01
... disposal and sales price or the method used to determine current fair market value where a recipient... accordance with the following standards. For equipment with a current per unit fair market value of $5000 or... fair market value of the equipment. If the recipient has no need for the equipment, the recipient shall...
41 CFR 105-72.404 - Equipment.
Code of Federal Regulations, 2010 CFR
2010-07-01
... disposal and sales price or the method used to determine current fair market value where a recipient... accordance with the following standards. For equipment with a current per unit fair market value of $5000 or... fair market value of the equipment. If the recipient has no need for the equipment, the recipient shall...
Development of Mass Spectrometric Ionization Methods for Fullerenes and Fullerene Derivatives
Currently, investigations into the environmental behavior of fullerenes and fullerene derivatives are hampered by the lack of well-characterized standards and of readily available quantitative analytical methods. Reported herein are investigations into the utility of ma...
The relationship between onsite manufacture of spray polyurethane foam insulation (SPFI) and potential exposures is not well understood. Currently, no comprehensive standard test methods exist for characterizing and quantifying product emissions. Exposures to diisocyanate compoun...
Exciting (the) Vacuum: Possible Manifestations of the Higgs particle at the LHC
David Kaplan
2017-12-09
The Higgs boson is the particle most anticipated at the LHC. However, there is currently no leading theory of electroweak symmetry breaking (and the 'Higgs mechanism'). The many possibilities suggest many ways the Higgs could appear in the detectors, some of which require non-standard search methods. I will review the current state of beyond-the-standard-model physics and the implications for Higgs physics. I then discuss some non-standard Higgs decays and suggest (perhaps naive) new experimental strategies for detecting the Higgs in such cases. In some models, while part of the new physics at the weak scale would be visible, the Higgs would be nearly impossible to detect.
Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong
2014-01-01
Objectives Extension of the standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Methods Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Community Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. Results In total, 188 metadata were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as a part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and extended, CCR+, model. Conclusions A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains: the methods presented here represent an important reference for achieving interoperability between standard and extended models. PMID:24627817
Emerson, Jane F; Emerson, Scott S
2005-01-01
A standardized urinalysis and manual microscopic cell counting system was evaluated for its potential to reduce intra- and interoperator variability in urine and cerebrospinal fluid (CSF) cell counts. Replicate aliquots of pooled specimens were submitted blindly to technologists who were instructed to use either the Kova system with the disposable Glasstic slide (Hycor Biomedical, Inc., Garden Grove, CA) or the standard operating procedure of the University of California-Irvine (UCI), which uses plain glass slides for urine sediments and hemacytometers for CSF. The Hycor system provides a mechanical means of obtaining a fixed volume of fluid in which to resuspend the sediment, and fixes the volume of specimen to be microscopically examined by using capillary filling of a chamber containing in-plane counting grids. Ninety aliquots of pooled specimens of each type of body fluid were used to assess the inter- and intraoperator reproducibility of the measurements. The variability of replicate Hycor measurements made on a single specimen by the same or different observers was compared with that predicted by a Poisson distribution. The Hycor methods generally resulted in test statistics that were slightly lower than those obtained with the laboratory standard methods, indicating a trend toward decreasing the effects of various sources of variability. For 15 paired aliquots of each body fluid, tests for systematically higher or lower measurements with the Hycor methods were performed using the Wilcoxon signed-rank test. Also examined was the average difference between the Hycor and current laboratory standard measurements, along with a 95% confidence interval (CI) for the true average difference. Without increasing labor or the requirement for attention to detail, the Hycor method provides slightly better interrater comparisons than the current method used at UCI. Copyright 2005 Wiley-Liss, Inc.
Alternative volume performance standards for Medicare physicians' services.
Marquis, M S; Kominski, G F
1994-01-01
The Omnibus Budget Reconciliation Act of 1989 (OBRA89) established volume performance standards (VPSs) as a key element in Medicare physician reform. This policy requires making choices along three dimensions: the risk pool, the scope and nature of the standard, and the application of the standard. VPSs have most effectively controlled expenditures and changed physician behavior when they use states as the risk pool, are composed entirely of Medicare Part B services, and establish per capita utilization targets. The institution of separate standards for voluntarily formed physician groups would pose substantial administrative challenges and has the potential to effect adverse outcomes. Instead, Congress should continue to encourage prepaid plans for the purpose of lowering health care use. Under current law, VPSs will be used to adjust future price increases. Congress may not wish to emulate the example of countries that have imposed expenditure ceilings to control costs unless the current method of using VPSs proves unsuccessful.
ERIC Educational Resources Information Center
Patalino, Marianne
Problems in current course evaluation methods are discussed and an alternative method is described for the construction, analysis, and interpretation of a test to evaluate instructional programs. The method presented represents a different approach to the traditional overreliance on standardized achievement tests and the total scores they provide.…
The current U. S. Environmental Protection Agency-approved method for Enterococci (Method 1600) in recreational water is a membrane filter (MF) method that takes 24 hours to obtain results. If the recreational water is not in compliance with the standard, the risk of exposure to...
Various approaches in EPR identification of gamma-irradiated plant foodstuffs: A review.
Aleksieva, Katerina I; Yordanov, Nicola D
2018-03-01
Food irradiation is becoming a preferred method worldwide for sterilization and for extending shelf life. For the purposes of trade, and to respect consumers' rights, irradiated foodstuffs must be labelled, which requires appropriate methods for unambiguous identification of radiation treatment. One-third of the current European Union standards for identifying irradiated foods use Electron Paramagnetic Resonance (EPR) spectroscopy. On the other hand, the current standards for irradiated foods of plant origin have weaknesses that have led to the development of new methodologies for identifying irradiated food. New approaches for EPR identification of radiation treatment of herbs and spices in cases where the specific signal is absent, or disappears after irradiation, are discussed. Direct EPR measurements of dried fruits and vegetables, and different pretreatments for fresh samples, are reviewed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong; Kim, Ju Han
2014-01-01
Extension of the standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Community Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. In total, 188 metadata were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as a part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and extended, CCR+, model. A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains: the methods presented here represent an important reference for achieving interoperability between standard and extended models.
Assessment of bifacial photovoltaic module power rating methodologies–inside and out
Deline, Chris; MacAlpine, Sara; Marion, Bill; ...
2017-01-26
One-sun power ratings for bifacial modules are currently undefined. This is partly because there is no standard definition of rear irradiance given 1000 W·m⁻² on the front. Using field measurements and simulations, we evaluate multiple deployment scenarios for bifacial modules and provide details on the amount of rear irradiance that can be expected. A simplified case representing a single module deployed under conditions consistent with existing one-sun irradiance standards leads to a bifacial reference condition of 1000 W·m⁻² Gfront and 130-140 W·m⁻² Grear. For fielded systems of bifacial modules, Grear magnitude and spatial uniformity will be affected by self-shade from adjacent modules, varied ground cover, and ground-clearance height. A standard measurement procedure for bifacial modules is also currently undefined; a proposed international standard is under development, which provides the motivation for this paper. Here, we compare field measurements of bifacial modules under natural illumination with proposed indoor test methods, where irradiance is applied to only one side at a time. The indoor method has multiple advantages, including a controlled and repeatable irradiance and thermal environment, along with allowing the use of conventional single-sided flash test equipment. The comparison results are promising, showing that indoor and outdoor methods agree within 1%-2% across multiple rear-irradiance conditions and bifacial module constructions. Furthermore, a comparison with single-diode theory also shows good agreement with indoor measurements, within 1%-2% for power and other current-voltage curve parameters.
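The reference condition above (1000 W·m⁻² front, 130-140 W·m⁻² rear) lends itself to a simple effective-irradiance calculation. The sketch below uses the common linear bifaciality-factor model; the function name, the 0.7 bifaciality factor, and the 300 W front rating are illustrative assumptions, not values from the study:

```python
def bifacial_power(p_front_stc, g_front, g_rear, bifaciality=0.7):
    """Estimate bifacial module power with a linear bifaciality-factor model.

    p_front_stc -- front-side power (W) at 1000 W/m^2 (illustrative rating)
    bifaciality -- rear-to-front efficiency ratio (0.7 is an assumed value)
    """
    g_equivalent = g_front + bifaciality * g_rear  # effective one-sun irradiance
    return p_front_stc * g_equivalent / 1000.0

# Reference condition from the study: 1000 W/m^2 front, ~135 W/m^2 rear
print(bifacial_power(300.0, 1000.0, 135.0))  # about 328 W for a 300 W front rating
```

Under this model the proposed reference condition boosts a module's one-sun rating by roughly 9-10% relative to its front side alone.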
Campbell, Rebecca; Pierce, Steven J; Sharma, Dhruv B; Shaw, Jessica; Feeney, Hannah; Nye, Jeffrey; Schelling, Kristin; Fehler-Cabral, Giannina
2017-01-01
A growing number of U.S. cities have large numbers of untested sexual assault kits (SAKs) in police property facilities. Testing older kits and maintaining current case work will be challenging for forensic laboratories, creating a need for more efficient testing methods. We evaluated selective degradation methods for DNA extraction using actual case work from a sample of previously unsubmitted SAKs in Detroit, Michigan. We randomly assigned 350 kits to either standard or selective degradation testing methods and then compared DNA testing rates and CODIS entry rates between the two groups. Continuation-ratio modeling showed no significant differences, indicating that the selective degradation method had no decrement in performance relative to customary methods. Follow-up equivalence tests indicated that CODIS entry rates for the two methods could differ by more than ±5%. Selective degradation methods required less personnel time for testing and scientific review than standard testing. © 2016 American Academy of Forensic Sciences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1969-07-01
The Fifth International Conference on Nondestructive Testing was held in Montreal, Canada, for the purpose of promoting international collaboration in all matters related to the development and use of nondestructive test methods. A total of 82 papers were selected for presentation. Session titles included: evaluation of material quality; ultrasonics - identification and measurements; thermal methods; testing of welds; visual aids in nondestructive testing; measurements of stress and elastic properties; magnetic and eddy-current methods; surface methods and neutron radiography; standardization - general; ultrasonics at elevated temperatures; applications; x-ray techniques; radiography; ultrasonic standardization; training and qualification; and, correlation of weld defects.
Reading the Small Print – Labelling Recommendations for Orthopaedic Implants
Haene, Roger A; Sandhu, Ranbir S; Baxandall, Richard
2009-01-01
INTRODUCTION There are currently no clear guidelines regarding standards for surgical implant labelling. The dimensions of the laminar flow canopies in orthopaedic use fix the distance at which implant labels can be read. Mistakes when reading the label on an implant box can pose health risks for patients, and financial consequences for medical institutions. SUBJECTS AND METHODS Using scientifically validated tools such as the Snellen Chart Formula, a theoretical minimum standard for text on implant labels was reached. This theoretical standard was then tested under real operating conditions. After establishing a minimum practical standard for implant labels, the authors then audited current labels in use on a wide range of orthopaedic implant packages. Other non-text-related labelling problems were also noted. RESULTS There is a definite minimum standard which should be observed when implant labels are manufactured. Implants in current use bear labels on the packaging that are of an insufficient standard to ensure patient safety in theatre. CONCLUSIONS The authors have established text parameters that will increase the legibility of implant labels. In the interests of improving risk management in theatre, therefore, the authors propose a standard for orthopaedic implant labelling, and believe this will provide a useful foundation for further discussion between the orthopaedic community and implant manufacturers. PMID:19686615
The US Environmental Protection Agency (EPA) along with state, local, and tribal governments operate Federal Reference Method (FRM) and Federal Equivalent Method (FEM) instruments to assess compliance with US air pollution standards designed to protect human and ecosystem health....
EVALUATION OF BIOSOLID SAMPLE PROCESSING TECHNIQUES TO MAXIMIZE RECOVERY OF BACTERIA
Current federal regulations (40 CFR 503) require enumeration of fecal coliform or Salmonella prior to land application of Class A biosolids. This regulation specifies use of enumeration methods included in "Standard Methods for the Examination of Water and Wastewater 18th Edition,...
NASA Technical Reports Server (NTRS)
Ratcliffe, James G.
2010-01-01
This paper details part of an effort focused on the development of a standardized facesheet/core peel debonding test procedure. The purpose of the test is to characterize facesheet/core peel in sandwich structure, accomplished through the measurement of the critical strain energy release rate associated with the debonding process. The specific test method selected for the standardized test procedure utilizes a single cantilever beam (SCB) specimen configuration. The objective of the current work is to develop a method for establishing SCB specimen dimensions. This is achieved by imposing specific limitations on specimen dimensions, with the objectives of promoting a linear elastic specimen response, and simplifying the data reduction method required for computing the critical strain energy release rate associated with debonding. The sizing method is also designed to be suitable for incorporation into a standardized test protocol. Preliminary application of the resulting sizing method yields practical specimen dimensions.
Automated feature detection and identification in digital point-ordered signals
Oppenlander, Jane E.; Loomis, Kent C.; Brudnoy, David M.; Levy, Arthur J.
1998-01-01
A computer-based automated method to detect and identify features in digital point-ordered signals. The method is used for processing of non-destructive test signals, such as eddy current signals obtained from calibration standards. The signals are first automatically processed to remove noise and to determine a baseline. Next, features are detected in the signals using mathematical morphology filters. Finally, verification of the features is made using an expert system of pattern recognition methods and geometric criteria. The method has the advantage that standard features can be located without prior knowledge of the number or sequence of the features. Further advantages are that standard features can be differentiated from irrelevant signal features such as noise, and detected features are automatically verified by parameters extracted from the signals. The method proceeds fully automatically without initial operator set-up and without subjective operator feature judgement.
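The morphological filtering step can be illustrated with a white top-hat transform: subtracting a grey-scale opening (erosion followed by dilation with a flat structuring element) from the signal makes narrow features stand out against the baseline. This is a generic sketch of the technique, not the method's actual implementation; the window size and threshold are illustrative:

```python
def erode(signal, k):
    """Grey-scale erosion: sliding-window minimum with a flat element of width k."""
    r, n = k // 2, len(signal)
    return [min(signal[max(0, i - r):min(n, i + r + 1)]) for i in range(n)]

def dilate(signal, k):
    """Grey-scale dilation: sliding-window maximum."""
    r, n = k // 2, len(signal)
    return [max(signal[max(0, i - r):min(n, i + r + 1)]) for i in range(n)]

def top_hat(signal, k):
    """White top-hat: signal minus its opening; isolates peaks narrower than k."""
    opened = dilate(erode(signal, k), k)
    return [s - o for s, o in zip(signal, opened)]

# A flat baseline with one narrow indication, as in a calibration-standard scan
sig = [0, 0, 0, 5, 0, 0, 0]
print([i for i, v in enumerate(top_hat(sig, 5)) if v > 2])  # [3]
```

Because the opening removes any peak narrower than the structuring element, the difference signal is near zero everywhere except at candidate features, which can then be passed to the verification stage.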
Madej, Roberta M.; Davis, Jack; Holden, Marcia J.; Kwang, Stan; Labourier, Emmanuel; Schneider, George J.
2010-01-01
The utility of quantitative molecular diagnostics for patient management depends on the ability to relate patient results to prior results or to absolute values in clinical practice guidelines. To do this, those results need to be comparable across time and methods, either by producing the same value across methods and test versions or by using reliable and stable conversions. Universally available standards and reference materials specific to quantitative molecular technologies are critical to this process but are few in number. This review describes recent history in the establishment of international standards for nucleic acid test development, organizations involved in current efforts, and future issues and initiatives. PMID:20075208
Pharmacist perceptions of new competency standards
Maitreemit, Pagamas; Pongcharoensuk, Petcharat; Kapol, Nattiya; Armstrong, Edward P.
2008-01-01
Objective To suggest revisions to the Thai pharmacy competency standards and determine the perceptions of Thai pharmacy practitioners and faculty about the proposed pharmacy competency standards. Methods The current competency standards were revised in a brainstorming session with nine Thai pharmacy experts according to their perceptions of society's pharmacy needs. The revised standards were proposed and validated by 574 pharmacy practitioners and faculty members using a written questionnaire. The respondents were classified based on their practice setting. Results The revision integrated and added to the current competencies. Of 830 distributed questionnaires, 574 completed questionnaires were received (69.2% response rate). The proposed new competency standards contained 7 domains and 46 competencies. The majority of the respondents were supportive of all 46 proposed competencies. The highest ranked domain was Domain 1 (Practice Pharmacy within Laws, Professional Standards, and Ethics). The second and third highest expectations of pharmacy graduates were Domain 4 (Provide pharmaceutical care) and Domain 3 (Communicate and disseminate knowledge effectively). Conclusion The expectations for pharmacy graduates' competencies were high, and respondents encouraged additional growth in multidisciplinary efforts to improve patient care. PMID:25177401
Visualization of medical data based on EHR standards.
Kopanitsa, G; Hildebrand, C; Stausberg, J; Englmeier, K H
2013-01-01
To organize efficient interaction between a doctor and an EHR, the data have to be presented in the most convenient way. Medical data presentation methods and models must be flexible in order to cover the needs of users with different backgrounds and requirements. Most visualization methods are doctor-oriented; however, there are indications that involving patients can optimize healthcare. The research aims at specifying the state of the art of medical data visualization. The paper analyzes a number of projects and defines requirements for a generic ISO 13606-based data visualization method. To do so, it starts with a systematic search for studies on EHR user interfaces. To identify best practices, visualization methods were evaluated according to the following criteria: limits of application, customizability, and re-usability. The visualization methods were compared using the specified criteria. The review showed that the analyzed projects can contribute knowledge to the development of a generic visualization method; however, none of them proposed a model that meets all the necessary criteria for a re-usable, standard-based visualization method. The shortcomings were mostly related to the structure of current medical concept specifications. The analysis showed that existing medical data visualization methods use hardcoded GUIs, which allow little flexibility, so medical data visualization has to turn from hardcoded user interfaces to generic methods. This requires a great effort because current standards are not suitable for organizing the management of visualization data. This contradiction between a generic method and a flexible, user-friendly data layout has to be overcome.
Multiple imputation to account for measurement error in marginal structural models
Edwards, Jessie K.; Cole, Stephen R.; Westreich, Daniel; Crane, Heidi; Eron, Joseph J.; Mathews, W. Christopher; Moore, Richard; Boswell, Stephen L.; Lesko, Catherine R.; Mugavero, Michael J.
2015-01-01
Background Marginal structural models are an important tool for observational studies. These models typically assume that variables are measured without error. We describe a method to account for differential and non-differential measurement error in a marginal structural model. Methods We illustrate the method estimating the joint effects of antiretroviral therapy initiation and current smoking on all-cause mortality in a United States cohort of 12,290 patients with HIV followed for up to 5 years between 1998 and 2011. Smoking status was likely measured with error, but a subset of 3686 patients who reported smoking status on separate questionnaires composed an internal validation subgroup. We compared a standard joint marginal structural model fit using inverse probability weights to a model that also accounted for misclassification of smoking status using multiple imputation. Results In the standard analysis, current smoking was not associated with increased risk of mortality. After accounting for misclassification, current smoking without therapy was associated with increased mortality [hazard ratio (HR): 1.2 (95% CI: 0.6, 2.3)]. The HR for current smoking and therapy (0.4 (95% CI: 0.2, 0.7)) was similar to the HR for no smoking and therapy (0.4; 95% CI: 0.2, 0.6). Conclusions Multiple imputation can be used to account for measurement error in concert with methods for causal inference to strengthen results from observational studies. PMID:26214338
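The multiple-imputation step sketched in this study ends, conventionally, with combining per-imputation results via Rubin's rules: average the point estimates, then add the between-imputation variance (inflated by 1 + 1/m) to the average within-imputation variance. A minimal sketch of that pooling step, with illustrative numbers rather than the study's data:

```python
import math
from statistics import mean

def pool_rubin(estimates, variances):
    """Combine per-imputation results via Rubin's rules.

    estimates -- point estimates (e.g., log hazard ratios) from m imputations
    variances -- squared standard errors of those estimates
    """
    m = len(estimates)
    q_bar = mean(estimates)                  # pooled point estimate
    w_bar = mean(variances)                  # average within-imputation variance
    b = sum((q - q_bar) ** 2 for q in estimates) / (m - 1)  # between-imputation
    return q_bar, math.sqrt(w_bar + (1 + 1 / m) * b)        # estimate, pooled SE

# Illustrative log-HR estimates and their variances from m = 3 imputed datasets
est, se = pool_rubin([0.15, 0.22, 0.18], [0.010, 0.012, 0.011])
```

The between-imputation term is what carries the extra uncertainty due to the misclassified exposure, which is why the corrected confidence intervals in such analyses are wider than a naive single-dataset fit would suggest.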
An Examination of the Relationship between a Child's Developmental Age and Early Literacy Learning
ERIC Educational Resources Information Center
Moran, Christine E.; Senseny, Karlen
2016-01-01
American students typically attend kindergarten at the chronological age (CA) of five, and with the current implementation of Common Core State Standards there are expectations that children learn how to read in order to meet these academic standards, regardless of whether they are developmentally ready. This mixed methods study examined age…
A Standards-Based Approach for Reporting Assessment Results in South Africa
ERIC Educational Resources Information Center
Kanjee, Anil; Moloi, Qetelo
2016-01-01
This article proposes the use of a standards-based approach to reporting results from large-scale assessment surveys in South Africa. The use of this approach is intended to address the key shortcomings observed in the current reporting framework prescribed in the national curriculum documents. Using the Angoff method and data from the Annual…
[Study on the reorganization of standards related to food contact ceramics and porcelains].
Zhang, Jianbo; Zhu, Lei; Zhang, Hong; Liu, Shan; Wang, Zhutian
2014-07-01
To solve the problems of overlap, redundancy, and conflict among current standards related to food contact ceramics and porcelains, all such current standards were collected and reorganized following settled principles and methods, producing lists of standards to be revoked, revised, merged, kept valid, or excluded from the food safety standard system. Nineteen standards were collected and reorganized in this study. The main food safety indexes in these standards were the limits for lead and cadmium released from food contact ceramics and porcelains. Released limits for lead and cadmium appeared in 10 standards, comprising 4 horizontal standards and 6 commodity standards, and the provisions of these 10 standards were in conflict. Consequently, it was suggested that the 4 horizontal standards be merged and revised into one food safety standard, and that the 6 commodity standards be revised to exclude the lead and cadmium provisions. Another 7 commodity standards only referenced the lead and cadmium limits from the horizontal standards, and it was suggested that these 7 standards be excluded from the food safety standard system. The remaining 2 of the 19 standards contained no food safety indexes; they were considered unrelated to food safety and in no need of reorganization. In summary, the released limits for lead and cadmium conflict among the current standards related to food contact ceramics and porcelains, so it is necessary to set up a new food safety standard for permissible limits of released lead and cadmium that applies to all food contact ceramics and porcelains. This standard should be based on food safety risk assessment and on the actual manufacture and use of food contact ceramics and porcelains; the provisions in international standards and in related standards from other countries can also serve as references.
Gupta, Veer; Henriksen, Kim; Edwards, Melissa; Jeromin, Andreas; Lista, Simone; Bazenet, Chantal; Soares, Holly; Lovestone, Simon; Hampel, Harald; Montine, Thomas; Blennow, Kaj; Foroud, Tatiana; Carrillo, Maria; Graff-Radford, Neill; Laske, Christoph; Breteler, Monique; Shaw, Leslie; Trojanowski, John Q.; Schupf, Nicole; Rissman, Robert A.; Fagan, Anne M.; Oberoi, Pankaj; Umek, Robert; Weiner, Michael W.; Grammas, Paula; Posner, Holly; Martins, Ralph
2015-01-01
The lack of readily available biomarkers is a significant hindrance towards progressing to effective therapeutic and preventative strategies for Alzheimer’s disease (AD). Blood-based biomarkers have potential to overcome access and cost barriers and greatly facilitate advanced neuroimaging and cerebrospinal fluid biomarker approaches. Despite the fact that preanalytical processing is the largest source of variability in laboratory testing, there are no currently available standardized preanalytical guidelines. The current international working group provides the initial starting point for such guidelines for standardized operating procedures (SOPs). It is anticipated that these guidelines will be updated as additional research findings become available. The statement provides (1) a synopsis of selected preanalytical methods utilized in many international AD cohort studies, (2) initial draft guidelines/SOPs for preanalytical methods, and (3) a list of required methodological information and protocols to be made available for publications in the field in order to foster cross-validation across cohorts and laboratories. PMID:25282381
Schoenberg, Mike R; Lange, Rael T; Brickell, Tracey A; Saklofske, Donald H
2007-04-01
Neuropsychologic evaluation requires current test performance be contrasted against a comparison standard to determine if change has occurred. An estimate of premorbid intelligence quotient (IQ) is often used as a comparison standard. The Wechsler Intelligence Scale for Children-Fourth Edition (WISC-IV) is a commonly used intelligence test. However, there is no method to estimate premorbid IQ for the WISC-IV, limiting the test's utility for neuropsychologic assessment. This study develops algorithms to estimate premorbid Full Scale IQ scores. Participants were the American WISC-IV standardization sample (N = 2172). The sample was randomly divided into 2 groups (development and validation). The development group was used to generate 12 algorithms. These algorithms were accurate predictors of WISC-IV Full Scale IQ scores in healthy children and adolescents. These algorithms hold promise as a method to predict premorbid IQ for patients with known or suspected neurologic dysfunction; however, clinical validation is required.
White, Helen E; Hedges, John; Bendit, Israel; Branford, Susan; Colomer, Dolors; Hochhaus, Andreas; Hughes, Timothy; Kamel-Reid, Suzanne; Kim, Dong-Wook; Modur, Vijay; Müller, Martin C; Pagnano, Katia B; Pane, Fabrizio; Radich, Jerry; Cross, Nicholas C P; Labourier, Emmanuel
2013-06-01
Current guidelines for managing Philadelphia-positive chronic myeloid leukemia include monitoring the expression of the BCR-ABL1 (breakpoint cluster region/c-abl oncogene 1, non-receptor tyrosine kinase) fusion gene by quantitative reverse-transcription PCR (RT-qPCR). Our goal was to establish and validate reference panels to mitigate the interlaboratory imprecision of quantitative BCR-ABL1 measurements and to facilitate global standardization on the international scale (IS). Four-level secondary reference panels were manufactured under controlled and validated processes with synthetic Armored RNA Quant molecules (Asuragen) calibrated to reference standards from the WHO and the NIST. Performance was evaluated in IS reference laboratories and with non-IS-standardized RT-qPCR methods. For most methods, percent ratios for BCR-ABL1 e13a2 and e14a2 relative to ABL1 or BCR were robust at 4 different levels and linear over 3 logarithms, from 10% to 0.01% on the IS. The intraassay and interassay imprecision was <2-fold overall. Performance was stable across 3 consecutive lots, in multiple laboratories, and over a period of 18 months to date. International field trials demonstrated the commutability of the reagents and their accurate alignment to the IS within the intra- and interlaboratory imprecision of IS-standardized methods. The synthetic calibrator panels are robust, reproducibly manufactured, analytically calibrated to the WHO primary standards, and compatible with most BCR-ABL1 RT-qPCR assay designs. The broad availability of secondary reference reagents will further facilitate interlaboratory comparative studies and independent quality assessment programs, which are of paramount importance for worldwide standardization of BCR-ABL1 monitoring results and the optimization of current and new therapeutic approaches for chronic myeloid leukemia. © 2013 American Association for Clinical Chemistry.
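Reporting on the International Scale conventionally means multiplying a laboratory's raw percent ratio by a laboratory-specific conversion factor established against reference materials such as the panels described here. A minimal sketch of that conversion; the copy numbers and the conversion factor below are purely illustrative:

```python
def bcr_abl1_is(bcr_abl1_copies, abl1_copies, conversion_factor):
    """Express a raw BCR-ABL1/ABL1 transcript ratio on the International Scale.

    conversion_factor is laboratory-specific, derived from calibrated
    reference materials; the value used below is illustrative only.
    """
    raw_percent_ratio = 100.0 * bcr_abl1_copies / abl1_copies
    return raw_percent_ratio * conversion_factor

# e.g. 50 BCR-ABL1 copies against 10,000 ABL1 control-gene copies with CF = 0.8
print(bcr_abl1_is(50, 10_000, 0.8))  # 0.4
```

Because each laboratory's assay has its own bias, the conversion factor is what makes a 0.4% result comparable across sites; the four-level panels in the study span the clinically relevant range from 10% down to 0.01% on the IS.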
Eddy Current, Magnetic Particle and Hardness Testing, Aviation Quality Control (Advanced): 9227.04.
ERIC Educational Resources Information Center
Dade County Public Schools, Miami, FL.
This unit of instruction includes the principles of eddy current, magnetic particle and hardness testing; standards used for analyzing test results; techniques of operating equipment; interpretation of indications; advantages and limitations of these methods of testing; care and calibration of equipment; and safety and work precautions. Motion…
Back to the Future: Implications of the Neopositivist Research Agenda for Adult Basic Education
ERIC Educational Resources Information Center
Belzer, Alisa; Clair, Ralf St.
2005-01-01
Federal educational policy, funding, and legislation are currently forwarding a research agenda described by the current administration as scientifically based. We characterize this agenda, described in multiple official documents as the gold standard, as neopositivist. We consider the ways in which this research paradigm and the methods that…
Analysis and elimination method of the effects of cables on LVRT testing for offshore wind turbines
NASA Astrophysics Data System (ADS)
Jiang, Zimin; Liu, Xiaohao; Li, Changgang; Liu, Yutian
2018-02-01
The current state, characteristics and necessity of low voltage ride through (LVRT) on-site testing for grid-connected offshore wind turbines are introduced first. The effects of submarine cables on LVRT testing are then analysed based on the equivalent circuit of the testing system, and a scheme for eliminating these effects in the proposed LVRT testing method is presented. The specified voltage dips are kept in compliance with the testing standards by adjusting the ratio between the current limiting impedance and the short circuit impedance according to the steady-state voltage relationship derived from the equivalent circuit. Finally, simulation results demonstrate that the voltage dips at the high voltage side of the wind turbine transformer satisfy the requirements of the testing standards.
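At steady state, the impedance-ratio adjustment the abstract describes is a voltage divider. A minimal sketch under the simplifying assumption that all impedance magnitudes are in phase (the paper's equivalent circuit, which includes the cable impedance, is more detailed):

```python
def dip_voltage(v_grid, z_limit, z_short):
    """Retained voltage at the test point of a divider-type LVRT test rig:
    the short-circuit impedance z_short forms a divider with the current
    limiting impedance z_limit (magnitudes assumed in phase)."""
    return v_grid * z_short / (z_limit + z_short)

def required_limit_impedance(v_grid, v_dip, z_short):
    """Re-solve the divider ratio for the current limiting impedance that
    produces a specified dip level v_dip."""
    return z_short * (v_grid / v_dip - 1.0)

print(dip_voltage(1.0, 4.0, 1.0))               # 0.2 pu retained voltage
print(required_limit_impedance(1.0, 0.2, 1.0))  # 4.0 (same units as z_short)
```

Cable impedance effectively adds to `z_limit`, which is why the ratio must be re-adjusted to hold the specified dip.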
USE OF A MOLECULAR PROBE ASSAY FOR MONITORING SALMONELLA SPP. IN BIOSOLIDS SAMPLES
Current federal regulations (40 CFR 503) require enumeration of fecal coliform or salmonellae prior to land application of biosolids. This regulation specifies use of enumeration methods included in "Standard methods for the Examination of Water and Wastewater 18th Edition," (SM)...
Li, Zhen-hua; Li, Hong-bin; Zhang, Zhi
2013-07-01
Electronic transformers are widely used in power systems because of their wide bandwidth and good transient performance. However, as an emerging technology, electronic transformers have a higher failure rate than traditional transformers, so the calibration period needs to be shortened. Traditional calibration methods require that power to the transmission line be cut off, which complicates operation and causes outage losses. This paper proposes an online calibration system that can calibrate electronic current transformers without a power outage. In this work, the high accuracy standard current transformer and the online operation method are the key techniques. Based on a clamp-shape iron-core coil and a clamp-shape air-core coil, a combined clamp-shape coil is designed as the standard current transformer. By analyzing the output characteristics of the two coils, the accuracy of the combined clamp-shape coil can be verified, so the accuracy of the online calibration system is guaranteed. Moreover, by employing the earth potential working method and using two insulating rods to connect the combined clamp-shape coil to the high voltage bus, the operation becomes simple and safe. Tests at the China National Center for High Voltage Measurement and field experiments show that the proposed system achieves a high accuracy of up to class 0.05.
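The accuracy class cited at the end can be read as a bound on ratio error against the standard transformer. A small illustrative check, with hypothetical current readings (not measurements from the paper):

```python
def ratio_error_percent(i_tested, i_standard):
    """Ratio error of the transformer under calibration, in percent of the
    reference (standard transformer) current reading."""
    return 100.0 * (i_tested - i_standard) / i_standard

# Class 0.05 requires the ratio error to stay within +/-0.05 %.
err = ratio_error_percent(100.04, 100.0)
print(abs(err) <= 0.05)  # True
```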
STANDARD REFERENCE MATERIALS FOR THE POLYMERS INDUSTRY.
McDonough, Walter G; Orski, Sara V; Guttman, Charles M; Migler, Kalman D; Beers, Kathryn L
2016-01-01
The National Institute of Standards and Technology (NIST) provides science, industry, and government with a central source of well-characterized materials certified for chemical composition or for some chemical or physical property. These materials are designated Standard Reference Materials® (SRMs) and are used to calibrate measuring instruments, to evaluate methods and systems, or to produce scientific data that can be referred readily to a common base. In this paper, we discuss the history of polymer based SRMs, their current status, and challenges and opportunities to develop new standards to address industrial measurement challenges.
Lim, Maria A; Louie, Brenton; Ford, Daniel; Heath, Kyle; Cha, Paulyn; Betts-Lacroix, Joe; Lum, Pek Yee; Robertson, Timothy L; Schaevitz, Laura
2017-01-01
Despite a broad spectrum of anti-arthritic drugs currently on the market, there is a constant demand to develop improved therapeutic agents. Efficient compound screening and rapid evaluation of treatment efficacy in animal models of rheumatoid arthritis (RA) can accelerate the development of clinical candidates. Compound screening by evaluation of disease phenotypes in animal models facilitates preclinical research by enhancing understanding of human pathophysiology; however, there is still a continuous need to improve methods for evaluating disease. Current clinical assessment methods are challenged by the subjective nature of scoring-based methods, time-consuming longitudinal experiments, and the requirement for better functional readouts with relevance to human disease. To address these needs, we developed a low-touch, digital platform for phenotyping preclinical rodent models of disease. As a proof-of-concept, we utilized the rat collagen-induced arthritis (CIA) model of RA and developed the Digital Arthritis Index (DAI), an objective and automated behavioral metric that does not require human-animal interaction during the measurement and calculation of disease parameters. The DAI detected the development of arthritis similar to standard in vivo methods, including ankle joint measurements and arthritis scores, as well as demonstrated a positive correlation to ankle joint histopathology. The DAI also determined responses to multiple standard-of-care (SOC) treatments and nine repurposed compounds predicted by the SMarTR™ Engine to have varying degrees of impact on RA. The disease profiles generated by the DAI complemented those generated by standard methods. The DAI is a highly reproducible and automated approach that can be used in conjunction with standard methods for detecting RA disease progression and conducting phenotypic drug screens.
Park, Hae-Jeong; Kwon, Jun Soo; Youn, Tak; Pae, Ji Soo; Kim, Jae-Jin; Kim, Myung-Sun; Ha, Kyoo-Seob
2002-11-01
We describe a method for the statistical parametric mapping of low resolution electromagnetic tomography (LORETA) using high-density electroencephalography (EEG) and individual magnetic resonance images (MRI) to investigate the characteristics of the mismatch negativity (MMN) generators in schizophrenia. LORETA, using a realistic head model of the boundary element method derived from the individual anatomy, estimated the current density maps from the scalp topography of the 128-channel EEG. From the current density maps that covered the whole cortical gray matter (up to 20,000 points), volumetric current density images were reconstructed. Intensity normalization of the smoothed current density images was used to reduce the confounding effect of subject specific global activity. After transforming each image into a standard stereotaxic space, we carried out statistical parametric mapping of the normalized current density images. We applied this method to the source localization of MMN in schizophrenia. The MMN generators, produced by a deviant tone of 1,200 Hz (5% of 1,600 trials) under the standard tone of 1,000 Hz, 80 dB binaural stimuli with 300 msec of inter-stimulus interval, were measured in 14 right-handed schizophrenic subjects and 14 age-, gender-, and handedness-matched controls. We found that the schizophrenic group exhibited significant current density reductions of MMN in the left superior temporal gyrus and the left inferior parietal gyrus (P < 0.0005). This study is the first voxel-by-voxel statistical mapping of current density using individual MRI and high-density EEG. Copyright 2002 Wiley-Liss, Inc.
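The intensity normalization step mentioned above is commonly implemented as division by the image's global mean, so that subject-specific global activity does not confound group comparisons. A minimal sketch under that assumption (the paper's exact normalization scheme may differ):

```python
import numpy as np

def intensity_normalize(image):
    """Scale a current density image by its global mean so that each
    subject's image has unit mean activity."""
    return image / image.mean()

img = np.array([[1.0, 2.0], [3.0, 4.0]])  # toy current density "image"
norm = intensity_normalize(img)
print(norm.mean())  # ~1.0
```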
USDA-ARS?s Scientific Manuscript database
Moisture affects economical and rheological properties of cotton, making its accurate determination important. A significant difference in moisture contents between the current and most cited standard oven drying ASTM method (ASTM D 2495, SOD) and volumetric Karl Fischer Titration (KFT) has been est...
The U.S. Environmental Protection Agency (EPA), Research Triangle Park, North Carolina, has a program to evaluate and standardize source testing methods for hazardous pollutants in support of current and future air quality regulations. ccasionally, questions arise concerning an e...
Direct mapping of local redox current density on a monolith electrode by laser scanning.
Lee, Seung-Woo; Lopez, Jeffrey; Saraf, Ravi F
2013-09-15
An optical method of mapping local redox reaction over a monolith electrode using simple laser scanning is described. As the optical signal is linearly proportional to the maximum redox current that is measured concomitantly by voltammetry, the optical signal quantitatively maps the local redox current density distribution. The method is demonstrated on two types of reactions: (1) a reversible reaction where the redox moieties are ionic, and (2) an irreversible reaction on two different types of enzymes immobilized on the electrode where the reaction moieties are nonionic. To demonstrate the scanning capability, the local redox behavior on a "V-shaped" electrode is studied where the local length scale and, hence, the local current density, is nonuniform. The ability to measure the current density distribution by this method will pave the way for multianalyte analysis on a monolith electrode using a standard three-electrode configuration. The method is called Scanning Electrometer for Electrical Double-layer (SEED). Copyright © 2013 Elsevier B.V. All rights reserved.
Informatics and Standards for Nanomedicine Technology
Thomas, Dennis G.; Klaessig, Fred; Harper, Stacey L.; Fritts, Martin; Hoover, Mark D.; Gaheen, Sharon; Stokes, Todd H.; Reznik-Zellen, Rebecca; Freund, Elaine T.; Klemm, Juli D.; Paik, David S.; Baker, Nathan A.
2011-01-01
There are several issues to be addressed concerning the management and effective use of information (or data), generated from nanotechnology studies in biomedical research and medicine. These data are large in volume, diverse in content, and are beset with gaps and ambiguities in the description and characterization of nanomaterials. In this work, we have reviewed three areas of nanomedicine informatics: information resources; taxonomies, controlled vocabularies, and ontologies; and information standards. Informatics methods and standards in each of these areas are critical for enabling collaboration, data sharing, unambiguous representation and interpretation of data, semantic (meaningful) search and integration of data; and for ensuring data quality, reliability, and reproducibility. In particular, we have considered four types of information standards in this review, which are standard characterization protocols, common terminology standards, minimum information standards, and standard data communication (exchange) formats. Currently, due to gaps and ambiguities in the data, it is also difficult to apply computational methods and machine learning techniques to analyze, interpret and recognize patterns in data that are high dimensional in nature, and also to relate variations in nanomaterial properties to variations in their chemical composition, synthesis, characterization protocols, etc. Progress towards resolving the issues of information management in nanomedicine using informatics methods and standards discussed in this review will be essential to the rapidly growing field of nanomedicine informatics. PMID:21721140
Harvey, Matthew J; Mason, Nicholas J; McLean, Andrew; Rzepa, Henry S
2015-01-01
We describe three different procedures based on metadata standards for enabling automated retrieval of scientific data from digital repositories utilising the persistent identifier of the dataset with optional specification of the attributes of the data document such as filename or media type. The procedures are demonstrated using the JSmol molecular visualizer as a component of a web page and Avogadro as a stand-alone modelling program. We compare our methods for automated retrieval of data from a standards-compliant data repository with those currently in operation for a selection of existing molecular databases and repositories. Our methods illustrate the importance of adopting a standards-based approach of using metadata declarations to increase access to and discoverability of repository-based data.
Prigge, R.; Micke, H.; Krüger, J.
1963-01-01
As part of a collaborative assay of the proposed Fifth International Standard for Gas-Gangrene Antitoxin (Perfringens), five ampoules of the proposed replacement material were assayed in the authors' laboratory against the then current Fourth International Standard. Both in vitro and in vivo methods were used. This paper presents the results and their statistical analysis. The two methods yielded different results which were not likely to have been due to chance, but exact statistical comparison is not possible. It is thought, however, that the differences may be due, at least in part, to differences in the relative proportions of zeta-antitoxin and alpha-antitoxin in the Fourth and Fifth International Standards and the consequent different reactions with the test toxin that was used for titration. PMID:14107746
Evaluation and Field Assessment of Bifacial Photovoltaic Module Power Rating Methodologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deline, Chris; MacAlpine, Sara; Marion, Bill
2016-11-21
1-sun power ratings for bifacial modules are currently undefined. This is partly because there is no standard definition of rear irradiance given 1000 Wm-2 on the front. Using field measurements and simulations, we evaluate multiple deployment scenarios for bifacial modules and provide details on the amount of irradiance that could be expected. A simplified case that represents a single module deployed under conditions consistent with existing 1-sun irradiance standards leads to a bifacial reference condition of 1000 Wm-2 Gfront and 130-140 Wm-2 Grear. For fielded systems of bifacial modules, Grear magnitude and spatial uniformity will be affected by self-shade from adjacent modules, varied ground cover, and ground-clearance height. A standard measurement procedure for bifacial modules is also currently undefined. A proposed international standard is under development, which provides the motivation for this work. Here, we compare outdoor field measurements of bifacial modules with irradiance on both sides with proposed indoor test methods where irradiance is only applied to one side at a time. The indoor method has multiple advantages, including controlled and repeatable irradiance and thermal environment, along with allowing the use of conventional single-sided flash test equipment. The comparison results are promising, showing that the indoor and outdoor methods agree within 1%-2% for multiple rear-irradiance conditions and bifacial module types.
Evaluation and Field Assessment of Bifacial Photovoltaic Module Power Rating Methodologies: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deline, Chris; MacAlpine, Sara; Marion, Bill
2016-06-16
1-sun power ratings for bifacial modules are currently undefined. This is partly because there is no standard definition of rear irradiance given 1000 Wm-2 on the front. Using field measurements and simulations, we evaluate multiple deployment scenarios for bifacial modules and provide details on the amount of irradiance that could be expected. A simplified case that represents a single module deployed under conditions consistent with existing 1-sun irradiance standards leads to a bifacial reference condition of 1000 Wm-2 Gfront and 130-140 Wm-2 Grear. For fielded systems of bifacial modules, Grear magnitude and spatial uniformity will be affected by self-shade from adjacent modules, varied ground cover, and ground-clearance height. A standard measurement procedure for bifacial modules is also currently undefined. A proposed international standard is under development, which provides the motivation for this work. Here, we compare outdoor field measurements of bifacial modules with irradiance on both sides with proposed indoor test methods where irradiance is only applied to one side at a time. The indoor method has multiple advantages, including controlled and repeatable irradiance and thermal environment, along with allowing the use of conventional single-sided flash test equipment. The comparison results are promising, showing that the indoor and outdoor methods agree within 1%-2% for multiple rear-irradiance conditions and bifacial module types.
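A common way to fold the rear irradiance described above into a single front-equivalent number is to weight it by the module's bifaciality coefficient. The coefficient value below is an assumed, typical figure, not one given in the abstract:

```python
def effective_irradiance(g_front, g_rear, bifaciality):
    """Front-equivalent irradiance for a bifacial module: rear-side
    irradiance weighted by the bifaciality coefficient (rear-to-front
    power ratio at equal irradiance)."""
    return g_front + bifaciality * g_rear

# Reference condition from the text: 1000 W/m2 front, ~135 W/m2 rear.
# A bifaciality of 0.7 is an assumed, typical value.
g_eff = effective_irradiance(1000.0, 135.0, 0.7)
print(g_eff)  # ~1094.5 W/m2 front-equivalent
```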
Environmentally safe aviation fuels
NASA Technical Reports Server (NTRS)
Liberio, Patricia D.
1995-01-01
In response to the Air Force directive to remove Ozone Depleting Chemicals (ODCs) from military specifications and the Defense Logistics Agency's Hazardous Waste Minimization Program, we are faced with how to ensure a quality aviation fuel without using such chemicals. Many of these chemicals are found throughout the fuel and fuel-related military specifications and are part of test methods that help qualify the properties and quality of the fuels before they are procured. Many years ago there was a directive for military specifications to use commercially standard test methods in order to provide standard testing in private industry and government. As a result, the test methods used in military specifications are governed by the American Society for Testing and Materials (ASTM). The Air Force has been very proactive in the removal or replacement of the ODCs and hazardous materials in these test methods. For example, ASTM D3703 (Standard Test Method for Peroxide Number of Aviation Turbine Fuels) requires the use of Freon 113, a known ODC. A new rapid, portable hydroperoxide test for jet fuels similar to ASTM D3703 that does not require the use of ODCs has been developed. This test has proved, in limited testing, to be a viable substitute method for ASTM D3703. The Air Force is currently conducting a round robin to allow the method to be accepted by ASTM and therefore replace the current method. This paper describes the initiatives of the Air Force Wright Laboratory to remove ODCs and hazardous materials from the fuel and fuel-related military specifications.
12 CFR 217.210 - Standardized measurement method for specific risk
Code of Federal Regulations, 2014 CFR
2014-01-01
... current fair value of the transaction plus the absolute value of the present value of all remaining... a securitization position and its credit derivative hedge has a specific risk add-on of zero if: (i... institution must multiply the absolute value of the current fair value of each net long or net short debt or...
12 CFR 3.210 - Standardized measurement method for specific risk
Code of Federal Regulations, 2014 CFR
2014-01-01
... purchased credit protection is capped at the current fair value of the transaction plus the absolute value... specific risk add-on of zero if: (i) The debt or securitization position is fully hedged by a total return... absolute value of the current fair value of each net long or net short debt or securitization position in...
Treasure of the Past IX: Exposure Standardization of Iodine-125 Seeds Used for Brachytherapy
Loftus, T. P.
2001-01-01
A method for calibrating iodine-125 seeds in terms of exposure has been established. The standard free-air ionization chamber, used for measuring soft x rays, was chosen for the measurements. Arrays of four to six seeds were used to enhance the ionization-current-to-background-current ratio. Seeds from an array were measured individually in a re-entrant chamber. The quotient of the exposure rate for the array by the sum of the ionization currents in the re-entrant chamber is the calibration factor for the re-entrant chamber. Calibration factors were established for three types of iodine-125 seeds. The overall uncertainty for the seed exposure calibrations is less than 6%. PMID:27500052
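The calibration factor defined above is a simple quotient: array exposure rate divided by the summed re-entrant chamber currents of the individual seeds. A sketch with purely hypothetical numbers, for illustration only:

```python
def reentrant_calibration_factor(array_exposure_rate, seed_currents):
    """Calibration factor for the re-entrant chamber: exposure rate of the
    seed array divided by the sum of the individual seeds' re-entrant
    chamber ionization currents."""
    return array_exposure_rate / sum(seed_currents)

# Hypothetical values: exposure rate in R/s, currents in A.
factor = reentrant_calibration_factor(2.4e-7, [1.0e-13, 1.1e-13, 0.9e-13, 1.0e-13])
print(factor)  # calibration factor in R/s per A
```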
Faravan, Amir; Mohammadi, Nooredin; Alizadeh Ghavidel, Alireza; Toutounchi, Mohammad Zia; Ghanbari, Ameneh; Mazloomi, Mehran
2016-01-01
Introduction: Standards have a significant role in defining the minimum level of optimal and expected performance. Since perfusion technology staff play a leading role in providing quality services to patients undergoing open heart surgery with a cardiopulmonary bypass machine, this study aimed to assess how Iranian perfusion technology staff evaluate and manage patients during cardiopulmonary bypass and to compare their practice with the standards recommended by the American Society of Extracorporeal Technology. Methods: In this descriptive study, data were collected from 48 Iranian public hospitals and educational health centers through a researcher-created questionnaire that assessed the standards recommended by the American Society of Extracorporeal Technology. Results: Appropriate measures to prevent hemodilution and avoid unnecessary transfusion of blood and blood products, determine the initial dose of heparin based on one of the proposed methods, monitor anticoagulation based on ACT measurement, and determine additional doses of heparin during cardiopulmonary bypass based on ACT or protamine titration were carried out by the perfusion technology staff in only 4.2% of the hospitals and health centers. Conclusion: Current cardiopulmonary perfusion practice in Iran is inadequate when measured against the standards of the American Society of Cardiovascular Perfusion. This underscores the need for authorities to attend to validation programs and the development of caring standards on the one hand, and to continuous assessment of adherence to these standards on the other. PMID:27489600
Johnston, Jennifer M.
2014-01-01
The majority of biological processes mediated by G Protein-Coupled Receptors (GPCRs) take place on timescales that are not conveniently accessible to standard molecular dynamics (MD) approaches, notwithstanding the current availability of specialized parallel computer architectures, and efficient simulation algorithms. Enhanced MD-based methods have started to assume an important role in the study of the rugged energy landscape of GPCRs by providing mechanistic details of complex receptor processes such as ligand recognition, activation, and oligomerization. We provide here an overview of these methods in their most recent application to the field. PMID:24158803
Fast analytical spectral filtering methods for magnetic resonance perfusion quantification.
Reddy, Kasireddy V; Mitra, Abhishek; Yalavarthy, Phaneendra K
2016-08-01
Deconvolution in perfusion weighted imaging (PWI) plays an important role in quantifying MR perfusion parameters, and the application of PWI to stroke and brain tumor studies has become standard clinical practice. The standard approaches for this deconvolution are oscillatory-limited singular value decomposition (oSVD) and frequency domain deconvolution (FDD); FDD is widely recognized as the fastest approach currently available for deconvolution of MR perfusion data. In this work, two fast deconvolution methods, analytical Fourier filtering and analytical Showalter spectral filtering, are proposed. Through systematic evaluation, the proposed methods are shown to be computationally efficient and quantitatively accurate compared to FDD and oSVD.
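Frequency domain deconvolution, the baseline the proposed methods are compared against, can be sketched as division in Fourier space with a spectral cutoff to limit noise amplification. This is a generic illustration of the idea, not the paper's analytical filters:

```python
import numpy as np

def fourier_deconvolve(tissue_curve, aif, threshold=0.1):
    """Recover the residue function from a tissue concentration curve and an
    arterial input function (AIF) by spectral division, zeroing frequency
    bins where the AIF spectrum falls below a fraction of its maximum."""
    C = np.fft.fft(tissue_curve)
    A = np.fft.fft(aif)
    keep = np.abs(A) > threshold * np.abs(A).max()
    R = np.zeros_like(C)
    R[keep] = C[keep] / A[keep]
    return np.real(np.fft.ifft(R))

# Synthetic noise-free example: tissue curve is the circular convolution
# of a toy AIF with an exponential residue function.
aif = np.array([0.0, 1.0, 0.5, 0.25, 0.0, 0.0, 0.0, 0.0])
residue = np.exp(-0.5 * np.arange(8))
tissue = np.real(np.fft.ifft(np.fft.fft(aif) * np.fft.fft(residue)))
recovered = fourier_deconvolve(tissue, aif, threshold=0.0)
print(np.allclose(recovered, residue))  # True (noise-free, no filtering needed)
```

With noisy data, raising `threshold` trades bias for noise suppression, which is exactly the role the analytical filters in the paper formalize.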
Zhiyong Cai; Michael O. Hunt; Robert J. Ross; Lawrence A. Soltis
1999-01-01
To date, there is no standard method for evaluating the structural integrity of wood floor systems using nondestructive techniques. Current methods of examination and assessment are often subjective and therefore tend to yield imprecise or variable results. For this reason, estimates of allowable wood floor loads are often conservative. The assignment of conservatively...
Space Technology 5 Multi-point Measurements of Near-Earth Magnetic Fields: Initial Results
NASA Technical Reports Server (NTRS)
Slavin, James A.; Le, G.; Strangeway, R. L.; Wang, Y.; Boardsen, S.A.; Moldwin, M. B.; Spence, H. E.
2007-01-01
The Space Technology 5 (ST-5) mission successfully placed three micro-satellites in a 300 x 4500 km dawn-dusk orbit on 22 March 2006. Each spacecraft carried a boom-mounted vector fluxgate magnetometer that returned highly sensitive and accurate measurements of the geomagnetic field. These data allow, for the first time, the separation of temporal and spatial variations in field-aligned current (FAC) perturbations measured in low-Earth orbit on time scales of approximately 10 sec to 10 min. The constellation measurements are used to directly determine field-aligned current sheet motion, thickness and current density. In doing so, we demonstrate two multi-point methods for the inference of FAC current density that have not previously been possible in low-Earth orbit; 1) the "standard method," based upon s/c velocity, but corrected for FAC current sheet motion, and 2) the "gradiometer method" which uses simultaneous magnetic field measurements at two points with known separation. Future studies will apply these methods to the entire ST-5 data set and expand to include geomagnetic field gradient analyses as well as field-aligned and ionospheric currents.
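The gradiometer method described above rests on Ampère's law: a magnetic field difference between two spacecraft with known separation yields the local current density. A minimal one-dimensional sketch with assumed example values (not ST-5 data):

```python
MU_0 = 4e-7 * 3.141592653589793  # vacuum permeability, T*m/A

def volume_current_density(delta_b, separation):
    """Current density j = (1/mu0) * dB/dx (in A/m^2), estimated from the
    field difference delta_b (T) between two points separated by
    `separation` metres, in a planar current-sheet approximation."""
    return delta_b / (MU_0 * separation)

# Assumed example: 100 nT field difference across 50 km spacecraft separation.
print(volume_current_density(100e-9, 50e3))  # ~1.6e-6 A/m^2
```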
A thioacidolysis method tailored for higher‐throughput quantitative analysis of lignin monomers
Foster, Cliff; Happs, Renee M.; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F.
2016-01-01
Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment, uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying, have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses. PMID:27534715
A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.
Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.
A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers
Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.; ...
2016-09-14
Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.
Ittianuwat, R; Fard, M; Kato, K
2017-01-01
Although much research has gone into developing the current ISO 2631-1 (1997) standard method for assessing seat vibration comfort, little consideration has been given to the influence of vehicle seat structural dynamics on comfort assessment. Previous research has shown inconsistencies between standard methods and subjective evaluations of comfort at around vehicle seat twisting resonant frequencies. This study reports the frequency-weighted r.m.s. accelerations in the [Formula: see text], [Formula: see text] and [Formula: see text] axes and the total vibration (point vibration total value) at five locations on the seatback surface at around vehicle seat twisting resonant frequencies. The results show that the vibration measured at the centre of the seatback surface, the location suggested by the current ISO 2631-1 (1997), was the least at around twisting resonant frequencies for all tested vehicle seats. The greatest point vibration total value on the seatback surface varies among vehicle seats. The variations in vibration measured at different locations on the seatback surface at around twisting resonant frequencies were sufficiently great that they might affect the comfort assessment of a vehicle seat. Practitioner Summary: The influence of vehicle seat structural dynamics has not been considered in the current ISO 2631-1 (1997). The results of this study show that the vibration measured on the seatback surface at around the vehicle seat twisting resonant frequency depends on the seat and dominates at the top or the bottom of the seatback, not at the centre.
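The "point vibration total value" referenced above is, in ISO 2631-1 terms, a root-sum-of-squares of the frequency-weighted axis accelerations with axis multiplying factors. A sketch with placeholder k factors (the standard prescribes specific values depending on posture and axis):

```python
import math

def point_vibration_total_value(awx, awy, awz, kx=1.0, ky=1.0, kz=1.0):
    """Vibration total value at one measurement point: root-sum-of-squares
    of the frequency-weighted r.m.s. accelerations (m/s^2) in the three
    axes, scaled by multiplying factors k (placeholders here)."""
    return math.sqrt((kx * awx) ** 2 + (ky * awy) ** 2 + (kz * awz) ** 2)

print(point_vibration_total_value(0.3, 0.4, 1.2))  # ~1.3 m/s^2
```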
A review of contemporary methods for the presentation of scientific uncertainty.
Makinson, K A; Hamby, D M; Edwards, J A
2012-12-01
Graphic methods for displaying uncertainty are often the most concise and informative way to communicate abstract concepts. Presentation methods currently in use for the display and interpretation of scientific uncertainty are reviewed. Numerous subjective and objective uncertainty display methods are presented, including qualitative assessments, node and arrow diagrams, standard statistical methods, box-and-whisker plots, robustness and opportunity functions, contribution indexes, probability density functions, cumulative distribution functions, and graphical likelihood functions.
Pezzotti, Giuseppe; Affatato, Saverio; Rondinella, Alfredo; Yorifuji, Makiko; Marin, Elia; Zhu, Wenliang; McEntire, Bryan; Bal, Sonny B.; Yamamoto, Kengo
2017-01-01
A clear discrepancy between predicted in vitro and actual in vivo surface phase stability of BIOLOX®delta zirconia-toughened alumina (ZTA) femoral heads has been demonstrated by several independent research groups. Data from retrievals challenge the validity of the standard method currently utilized in evaluating surface stability and raise a series of important questions: (1) Why do in vitro hydrothermal aging treatments conspicuously fail to model actual results from the in vivo environment? (2) What is the preponderant microscopic phenomenon triggering the accelerated transformation in vivo? (3) Ultimately, what revisions of the current in vitro standard are needed in order to obtain consistent predictions of ZTA transformation kinetics in vivo? Reported in this paper is a new in toto method for visualizing the surface stability of femoral heads. It is based on CAD-assisted Raman spectroscopy to quantitatively assess the phase transformation observed in ZTA retrievals. Using a series of independent analytical probes, an evaluation of the microscopic mechanisms responsible for the polymorphic transformation is also provided. An outline is given of the possible ways in which the current hydrothermal simulation standard for artificial joints can be improved in an attempt to reduce the gap between in vitro simulation and reality. PMID:28772828
Method of Calculating the Correction Factors for Cable Dimensioning in Smart Grids
NASA Astrophysics Data System (ADS)
Simutkin, M.; Tuzikova, V.; Tlusty, J.; Tulsky, V.; Muller, Z.
2017-04-01
One of the main causes of overloading of electrical equipment by higher-harmonic currents is the large increase in the number of non-linear electricity consumers. Non-sinusoidal voltages and currents affect the operation of electrical equipment, reducing its lifetime, increasing voltage and power losses in the network, and reducing its capacity. Existing standards limit the emission of higher-harmonic currents but cannot guarantee that interference stays at a safe level in the power grid. The article presents a method for determining a correction factor for the long-term allowable current of a cable that accounts for this influence. Using mathematical models in the Elcut software, the thermal processes in the cable under non-sinusoidal current flow were described. The theoretical principles, methods and mathematical models developed in the article allow the correction factor to be calculated, accounting for the effect of higher harmonics in the current spectrum, for network equipment under any type of non-linear load.
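A derating factor of the kind described above can be sketched from the idea that cable heating scales with the sum of squared harmonic currents weighted by the AC resistance at each harmonic. This is a generic engineering illustration, not the paper's Elcut-based method; in particular, the sqrt(h) resistance ratio used as a default is a rough skin-effect stand-in, not a value from the paper or from any standard.

```python
import math

def harmonic_derating_factor(harmonic_spectrum, resistance_ratio=lambda h: math.sqrt(h)):
    """Correction (derating) factor for a cable's long-term allowable current.

    harmonic_spectrum: {harmonic_order: I_h / I_1}, with the fundamental
    included as {1: 1.0}.  resistance_ratio(h) is the AC resistance at
    harmonic h relative to the fundamental; the sqrt(h) default is only a
    crude skin-effect illustration.

    Heating scales with sum(I_h^2 * r_h); the returned factor scales the
    rated current so that total heating matches the purely sinusoidal case.
    """
    loss_sum = sum((ih ** 2) * resistance_ratio(h)
                   for h, ih in harmonic_spectrum.items())
    return 1.0 / math.sqrt(loss_sum)
```

For a purely sinusoidal load ({1: 1.0}) the factor is 1.0; adding harmonic content drives it below 1.0, i.e. the cable must be derated.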
Standardization of Assays That Detect Anti-Rubella Virus IgG Antibodies
Grangeot-Keros, Liliane; Vauloup-Fellous, Christelle
2015-01-01
SUMMARY Rubella virus usually causes a mild infection in humans but can cause congenital rubella syndrome (CRS). Vaccination programs have significantly decreased primary rubella virus infection and CRS; however, vaccinated individuals usually have lower levels of rubella virus IgG than those with natural infections. Rubella virus IgG is quantified with enzyme immunoassays that have been calibrated against the World Health Organization (WHO) international standard and report results in international units per milliliter. It is recognized that the results reported by these assays are not standardized. This investigation into the reasons for the lack of standardization found that the current WHO international standard (RUB-1-94) fails to satisfy three key metrological principles. The standard is not a pure analyte but is composed of pooled human immunoglobulin. It was not calibrated by certified reference methods; rather, superseded tests were used. Finally, no measurement uncertainty estimations have been provided. There is an analytical and clinical consequence to the lack of standardization of rubella virus IgG assays, which leads to misinterpretation of results. The current approach to standardization of rubella virus IgG assays has not achieved the desired results. A new approach is required. PMID:26607813
Low extractable wipers for cleaning space flight hardware
NASA Technical Reports Server (NTRS)
Tijerina, Veronica; Gross, Frederick C.
1986-01-01
There is a need for low extractable wipers for solvent cleaning of space flight hardware. Soxhlet extraction is the method utilized today by most NASA subcontractors, but there may be alternate methods to achieve the same results. The need for low non-volatile residue materials, the history of soxhlet extraction, and proposed alternate methods are discussed, as well as different types of wipers, test methods, and current standards.
Comparing rapid and culture indicator bacteria methods at inland lake beaches
Francy, Donna S.; Bushon, Rebecca N.; Brady, Amie M.G.; Kephart, Christopher M.
2013-01-01
A rapid method, quantitative polymerase chain reaction (qPCR), for quantifying indicator bacteria in recreational waters is desirable for public health protection. We report that replacing current Escherichia coli standards with new US Environmental Protection Agency beach action values (BAVs) for enterococci by culture or qPCR may result in more advisories being posted at inland recreational lakes. In this study, concentrations of E. coli and enterococci by culture methods were compared to concentrations of Enterococcus spp. by qPCR at 3 inland lake beaches in Ohio. The E. coli and enterococci culture results were significantly related at all beaches; however, the relations between culture results and Enterococcus spp. qPCR results were not always significant and differed among beaches. All the qPCR results exceeded the new BAV for Enterococcus spp. by qPCR, whereas only 23.7% of culture results for E. coli and 79% of culture results for enterococci exceeded the current standard for E. coli or BAV for enterococci.
A simple web-based tool to compare freshwater fish data collected using AFS standard methods
Bonar, Scott A.; Mercado-Silva, Norman; Rahr, Matt; Torrey, Yuta T.; Cate, Averill
2016-01-01
The American Fisheries Society (AFS) recently published Standard Methods for Sampling North American Freshwater Fishes. Enlisting the expertise of 284 scientists from 107 organizations throughout Canada, Mexico, and the United States, this text was developed to facilitate comparisons of fish data across regions or time. Here we describe a user-friendly web tool that automates among-sample comparisons in individual fish condition, population length-frequency distributions, and catch per unit effort (CPUE) data collected using AFS standard methods. Currently, the web tool (1) provides instantaneous summaries of almost 4,000 data sets of condition, length frequency, and CPUE of common freshwater fishes collected using standard gears in 43 states and provinces; (2) is easily appended with new standardized field data to update subsequent queries and summaries; (3) compares fish data from a particular water body with continent, ecoregion, and state data summaries; and (4) provides additional information about AFS standard fish sampling including benefits, ongoing validation studies, and opportunities to comment on specific methods. The web tool—programmed in a PHP-based Drupal framework—was supported by several AFS Sections, agencies, and universities and is freely available from the AFS website and fisheriesstandardsampling.org. With widespread use, the online tool could become an important resource for fisheries biologists.
MUSQA: a CS method to build a multi-standard quality management system
NASA Astrophysics Data System (ADS)
Cros, Elizabeth; Sneed, Isabelle
2002-07-01
CS Communication & Systèmes, through its long experience of quality management, has been able to build and evolve its Quality Management System according to client requirements, norms, standards and models (ISO, DO178, ECSS, CMM, ...), evolving norms (the transition from ISO 9001:1994 to ISO 9001:2000) and the TQM approach currently being deployed. The aim of this paper is to show how, from this enriching and instructive experience, CS has defined and formalised its method: MuSQA (Multi-Standard Quality Approach). This method makes it possible to build a new Quality Management System or to simplify and unify an existing one. MuSQA's objective is to provide any organisation with an open Quality Management System that can evolve easily and serves as a useful instrument for everyone, operational as well as non-operational staff.
The draft of the ASTM Test Method for air entitled: "Airborne Asbestos Concentration in Ambient and Indoor Atmospheres as Determined by Transmission Electron Microscopy Direct Transfer (TEM)" (ASTM Z7077Z) is an adaptation of the International Standard, ISO 10312. It is currently...
Colorimetric micro-assay for accelerated screening of mould inhibitors
Carol A. Clausen; Vina W. Yang
2013-01-01
Since current standard laboratory methods are time-consuming macro-assays that rely on subjective visual ratings of mould growth, rapid and quantitative laboratory methods are needed to screen potential mould inhibitors for use in and on cellulose-based products. A colorimetric micro-assay has been developed that uses XTT tetrazolium salt to enzymatically assess...
A Comparison of Exposure Control Procedures in CATS Using the GPC Model
ERIC Educational Resources Information Center
Leroux, Audrey J.; Dodd, Barbara G.
2016-01-01
The current study compares the progressive-restricted standard error (PR-SE) exposure control method with the Sympson-Hetter, randomesque, and no exposure control (maximum information) procedures using the generalized partial credit model with fixed- and variable-length CATs and two item pools. The PR-SE method administered the entire item pool…
The U.S. Department of Agriculture Automated Multiple-Pass Method accurately assesses sodium intakes
USDA-ARS?s Scientific Manuscript database
Accurate and practical methods to monitor sodium intake of the U.S. population are critical given current sodium reduction strategies. While the gold standard for estimating sodium intake is the 24 hour urine collection, few studies have used this biomarker to evaluate the accuracy of a dietary ins...
The report gives details of a small-chamber test method developed by the EPA for characterizing volatile organic compound (VOC) emissions from interior latex and alkyd paints. Current knowledge about VOC, including hazardous air pollutant, emissions from interior paints generated...
USDA-ARS?s Scientific Manuscript database
Most efforts to harness the power of big data for ecology and environmental sciences focus on data and metadata sharing, standardization, and accuracy. However, many scientists have not accepted the data deluge as an integral part of their research because the current scientific method is not scalab...
New method for determination of ten pesticides in human blood.
García-Repetto, R; Giménez, M P; Repetto, M
2001-01-01
An analytical method was developed for precise identification and quantitation of 10 pesticides in human blood. The pesticides studied, which have appeared frequently in actual cases, were endosulfan, lindane, parathion, ethyl-azinphos, diazinon, malathion, alachlor, tetradifon, fenthion and dicofol (o-p' and p-p' isomers). The current method replaces an earlier method which involved liquid-liquid extraction with a mixture of n-hexane-benzene (1 + 1). The extraction is performed by solid-phase extraction, with C18 cartridges and 2 internal standards, perthane and triphenylphosphate. Eluates were analyzed by gas chromatography (GC) with nitrogen-phosphorus and electrochemical detectors. Results were confirmed by GC-mass spectrometry in the electron impact mode. Blood blank samples spiked with 2 standard mixtures and an internal standard were used for quantitation. Mean recoveries ranged from 71.83 to 97.10%. Detection and quantitation limits are reported for each pesticide. Examples are provided to show the application of the present method to actual samples.
McLain, B.J.
1993-01-01
Graphite furnace atomic absorption spectrophotometry is a sensitive, precise, and accurate method for the determination of chromium in natural water samples. The detection limit for this analytical method is 0.4 microg/L with a working linear limit of 25.0 microg/L. The precision at the detection limit ranges from 20 to 57 percent relative standard deviation (RSD), improving to 4.6 percent RSD for concentrations above 3 microg/L. The accuracy of this method was determined for a variety of reference standards that were representative of the analytical range. The results were within the established standard deviations. Samples were spiked with known concentrations of chromium, with recoveries ranging from 84 to 122 percent. In addition, a comparison of data between graphite furnace atomic absorption spectrophotometry and direct-current plasma atomic emission spectrometry showed suitable agreement between the two methods, with an average deviation of +/- 2.0 microg/L throughout the analytical range.
Superstatistics analysis of the ion current distribution function: Met3PbCl influence study.
Miśkiewicz, Janusz; Trela, Zenon; Przestalski, Stanisław; Karcz, Waldemar
2010-09-01
A novel analysis of ion current time series is proposed. It is shown that higher (second, third and fourth) statistical moments of the ion current probability distribution function (PDF) can yield new information about ion channel properties. The method is illustrated on a two-state model where the PDF of the compound states are given by normal distributions. The proposed method was applied to the analysis of the SV cation channels of vacuolar membrane of Beta vulgaris and the influence of trimethyllead chloride (Met(3)PbCl) on the ion current probability distribution. Ion currents were measured by patch-clamp technique. It was shown that Met(3)PbCl influences the variance of the open-state ion current but does not alter the PDF of the closed-state ion current. Incorporation of higher statistical moments into the standard investigation of ion channel properties is proposed.
The American Society for Aesthetic Plastic Surgery (ASAPS) survey: current trends in liposuction.
Ahmad, Jamil; Eaves, Felmont F; Rohrich, Rod J; Kenkel, Jeffrey M
2011-02-01
The emergence of new technologies necessitates a study of current trends in liposuction and other methods for fat removal. The American Society for Aesthetic Plastic Surgery (ASAPS) conducted a survey of its members to gain valuable information from Board-certified plastic surgeons about their experience with new technologies for fat removal and managing complications after liposuction. The ASAPS Current Trends in Liposuction Survey was emailed to 1713 ASAPS members. Data were tabulated and examined to determine current trends in liposuction and other fat removal techniques performed by ASAPS members. The response rate for the survey was 28.7% (n = 492). Most ASAPS respondents reported performing between 50 and 100 liposuction procedures annually. Most plastic surgeons currently employ or have previous experience with suction-assisted lipectomy/liposuction (SAL), ultrasound-assisted liposuction (UAL), and power-assisted liposuction, but fewer reported experience with laser-assisted liposuction (LAL), mesotherapy, or external, noninvasive devices. SAL was the preferred method of fat removal for 51.4%. UAL, LAL, and SAL were most commonly associated with complications. Only 10.5% of ASAPS members employ LAL; 38% have treated a patient with complications secondary to LAL. Valuable information about current trends in liposuction and other fat removal techniques has been gained from this survey. Although many studies have been published that review issues related to safety, morbidity, aesthetics, and recovery after different methods of fat removal, more prospective studies with standardized objective outcome measures comparing these techniques, particularly newer modalities, are needed to continue improving safety-related standards of care.
Development of Gold Standard Ion-Selective Electrode-Based Methods for Fluoride Analysis
Martínez-Mier, E.A.; Cury, J.A.; Heilman, J.R.; Katz, B.P.; Levy, S.M.; Li, Y.; Maguire, A.; Margineda, J.; O’Mullane, D.; Phantumvanit, P.; Soto-Rojas, A.E.; Stookey, G.K.; Villa, A.; Wefel, J.S.; Whelton, H.; Whitford, G.M.; Zero, D.T.; Zhang, W.; Zohouri, V.
2011-01-01
Background/Aims: Currently available techniques for fluoride analysis are not standardized. Therefore, this study was designed to develop standardized methods for analyzing fluoride in biological and nonbiological samples used for dental research. Methods: A group of nine laboratories analyzed a set of standardized samples for fluoride concentration using their own methods. The group then reviewed existing analytical techniques for fluoride analysis, identified inconsistencies in the use of these techniques and conducted testing to resolve differences. Based on the results of the testing undertaken to define the best approaches for the analysis, the group developed recommendations for direct and microdiffusion methods using the fluoride ion-selective electrode. Results: Initial results demonstrated that there was no consensus regarding the choice of analytical techniques for different types of samples. Although for several types of samples the results of the fluoride analyses were similar among some laboratories, greater differences were observed for saliva, food and beverage samples. In spite of these initial differences, precise and true values of fluoride concentration, as well as smaller differences between laboratories, were obtained once the standardized methodologies were used. Intraclass correlation coefficients ranged from 0.90 to 0.93 for the analysis of a certified reference material using the standardized methodologies. Conclusion: The results of this study demonstrate that the development and use of standardized protocols for F analysis significantly decreased differences among laboratories and resulted in more precise and true values. PMID:21160184
Disseminating the unit of mass from multiple primary realisations
NASA Astrophysics Data System (ADS)
Nielsen, Lars
2016-12-01
When a new definition of the kilogram is adopted, as expected in 2018, the unit of mass will be realised by the watt balance method, the x-ray crystal density method or perhaps other primary methods still to be developed. So far, the standard uncertainties associated with the available primary methods are at least one order of magnitude larger than the standard uncertainty associated with mass comparisons using mass comparators, so differences in primary realisations of the kilogram are easily detected, whereas many National Metrology Institutes would have to increase their calibration and measurement capabilities (CMCs) if they were traceable to a single primary realisation. This paper presents a scheme for obtaining traceability to multiple primary realisations of the kilogram using a small group of stainless steel 1 kg weights, which are allowed to change their masses over time in a way known to be realistic, and which are calibrated and stored in air. An analysis of the scheme shows that if the relative standard uncertainties of future primary realisations are equal to the relative standard uncertainties of the present methods used to measure the Planck constant, the unit of mass can be disseminated with a standard uncertainty less than 0.015 mg, which matches the smallest CMCs currently claimed for the calibration of 1 kg weights.
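Combining several independent primary realisations, as discussed above, amounts in its simplest form to an inverse-variance weighted mean. The sketch below illustrates that idea only; it is not the paper's dissemination scheme, which additionally models mass drift over time.

```python
import math

def combine_realisations(values_mg, u_mg):
    """Weighted-mean mass value and its standard uncertainty from several
    independent primary realisations (inverse-variance weighting).

    values_mg: realised mass values; u_mg: their standard uncertainties.
    """
    weights = [1.0 / u ** 2 for u in u_mg]
    wsum = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values_mg)) / wsum
    u_mean = math.sqrt(1.0 / wsum)   # smaller than any single realisation's u
    return mean, u_mean
```

Note how the combined uncertainty shrinks as realisations are added, which is the motivation for traceability to multiple realisations rather than one.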
Setting the standard, implementation and auditing within haemodialysis.
Jones, J
1997-01-01
With an ever-increasing awareness of the need to deliver a quality of care that is measurable in Nursing, the concept of Standards provides an ideal tool (1). Standards operate outside the boundaries of policies and procedures to provide an audit tool of authenticity and flexibility. Within our five Renal Units, while we felt confident that we were delivering an excellent standard of care to our patients and were continually trying to improve upon it, what we really needed was a method of measuring the current level of care and highlighting key areas where we could improve.
Arpinar, V E; Hamamura, M J; Degirmenci, E; Muftuler, L T
2012-07-07
Magnetic resonance electrical impedance tomography (MREIT) is a technique that produces images of conductivity in tissues and phantoms. In this technique, electrical currents are applied to an object, the resulting magnetic flux density is measured using magnetic resonance imaging (MRI), and the conductivity distribution is reconstructed from these MRI data. Currently, the technique is used in research environments, primarily for studying phantoms and animals. In order to translate MREIT to clinical applications, strict safety standards need to be established, especially for safe current limits. However, there are currently no standards for safe current limits specific to MREIT. Until such standards are established, human MREIT applications need to conform to existing electrical safety standards for medical instrumentation, such as IEC 601. This protocol limits patient auxiliary currents to 100 µA for low frequencies. However, published MREIT studies have utilized currents 10-400 times larger than this limit, bringing into question whether the clinical applications of MREIT are attainable under current standards. In this study, we investigated the feasibility of MREIT to accurately reconstruct the relative conductivity of a simple agarose phantom using 200 µA total injected current and tested the performance of two MREIT reconstruction algorithms: the iterative sensitivity matrix method (SMM) of Ider and Birgul (1998 Elektrik 6 215-25) with Tikhonov regularization, and the harmonic B(z) method proposed by Oh et al (2003 Magn. Reson. Med. 50 875-8). The reconstruction techniques were tested at both 200 µA and 5 mA injected currents to investigate their noise sensitivity at low and high current conditions. It should be noted that a 200 µA total injected current into a cylindrical phantom generates only 14.7 µA in the imaging slice; similarly, a 5 mA total injected current results in 367 µA in the imaging slice. Total acquisition time for the 200 µA and 5 mA experiments was about 1 h and 8.5 min, respectively. The results demonstrate that conductivity imaging is possible at low currents using the suggested imaging parameters and reconstructing the images using the iterative SMM with Tikhonov regularization, which appears to be more tolerant of noisy data than harmonic B(z).
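The Tikhonov-regularized sensitivity matrix method mentioned above solves, at each iteration, a damped linear least-squares problem. Below is a generic sketch of one such linearized update; it is illustrative only (the assembly of the actual B_z sensitivity Jacobian from MRI data is omitted, and the regularization weight is an assumed free parameter).

```python
import numpy as np

def tikhonov_update(J, residual, lam):
    """One linearized conductivity update: (J^T J + lam*I)^(-1) J^T r.

    J: sensitivity (Jacobian) of the measured flux density B_z with respect
    to the discretized conductivity values; residual: measured-minus-predicted
    B_z; lam: Tikhonov regularization weight damping noise amplification.
    """
    n = J.shape[1]
    return np.linalg.solve(J.T @ J + lam * np.eye(n), J.T @ residual)
```

Larger lam suppresses noise at the cost of resolution, which is consistent with the abstract's observation that the regularized SMM tolerates the noisy low-current data better.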
Modified Drop Tower Impact Tests for American Football Helmets.
Rush, G Alston; Prabhu, R; Rush, Gus A; Williams, Lakiesha N; Horstemeyer, M F
2017-02-19
A modified National Operating Committee on Standards for Athletic Equipment (NOCSAE) test method for American football helmet drop impact test standards is presented that would provide better assessment of a helmet's on-field impact performance by including a faceguard on the helmet. In this study, a merger of faceguard and helmet test standards is proposed. The need for a more robust systematic approach to football helmet testing procedures is emphasized by comparing representative results of the Head Injury Criterion (HIC), Severity Index (SI), and peak acceleration values for different helmets at different helmet locations under modified NOCSAE standard drop tower tests. Essentially, these comparative drop test results revealed that the faceguard adds a stiffening kinematic constraint to the shell that lessens total energy absorption. The current NOCSAE standard test methods can be improved to represent on-field helmet hits by attaching the faceguards to helmets and by including two new helmet impact locations (Front Top and Front Top Boss). The reported football helmet test method gives a more accurate representation of a helmet's performance and its ability to mitigate on-field impacts while promoting safer football helmets.
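The HIC and SI metrics compared in the abstract above are defined directly on the recorded headform acceleration trace. A minimal sketch of both (illustrative, not the NOCSAE test procedure; assumes acceleration in g, uniformly sampled, with time in seconds):

```python
import numpy as np

def severity_index(a_g, dt):
    """Gadd Severity Index: time integral of a(t)^2.5 (a in g, t in s)."""
    return float(np.sum(np.asarray(a_g, dtype=float) ** 2.5) * dt)

def hic(a_g, dt, max_window=0.015):
    """Head Injury Criterion: max over windows (t1, t2), up to max_window
    seconds long, of (t2 - t1) * [mean acceleration over the window]^2.5.
    The 0.015 s default corresponds to the common HIC15 variant."""
    a = np.asarray(a_g, dtype=float)
    cum = np.concatenate(([0.0], np.cumsum(a) * dt))  # running integral of a dt
    n = len(a)
    wmax = int(round(max_window / dt))
    best = 0.0
    for i in range(n):
        for j in range(i + 1, min(i + wmax, n) + 1):
            T = (j - i) * dt
            avg = (cum[j] - cum[i]) / T
            best = max(best, T * avg ** 2.5)
    return best
```

For a constant 100 g pulse lasting exactly the window length, both metrics reduce to duration times 100^2.5, which makes a convenient sanity check.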
Negeri, Zelalem F; Shaikh, Mateen; Beyene, Joseph
2018-05-11
Diagnostic or screening tests are widely used in medical fields to classify patients according to their disease status. Several statistical models for meta-analysis of diagnostic test accuracy studies have been developed to synthesize test sensitivity and specificity of a diagnostic test of interest. Because of the correlation between test sensitivity and specificity, modeling the two measures using a bivariate model is recommended. In this paper, we extend the current standard bivariate linear mixed model (LMM) by proposing two variance-stabilizing transformations: the arcsine square root and the Freeman-Tukey double arcsine transformation. We compared the performance of the proposed methods with the standard method through simulations using several performance measures. The simulation results showed that our proposed methods performed better than the standard LMM in terms of bias, root mean square error, and coverage probability in most of the scenarios, even when data were generated assuming the standard LMM. We also illustrated the methods using two real data sets. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
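The two variance-stabilizing transformations proposed above have standard closed forms, sketched here for a single study reporting x events (e.g. true positives) out of n subjects. This is only the transformation step, not the bivariate LMM itself:

```python
import math

def arcsine_sqrt(x, n):
    """Arcsine square-root transform of the proportion x/n."""
    return math.asin(math.sqrt(x / n))

def freeman_tukey(x, n):
    """Freeman-Tukey double arcsine transform of x events out of n."""
    return 0.5 * (math.asin(math.sqrt(x / (n + 1))) +
                  math.asin(math.sqrt((x + 1) / (n + 1))))
```

Both map proportions near 0 or 1 away from the boundary so that the normality assumption of the linear mixed model is more plausible; the double arcsine additionally keeps the variance approximately constant across n.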
The Recognizability and Localizability of Auditory Alarms: Setting Global Medical Device Standards.
Edworthy, Judy; Reid, Scott; McDougall, Siné; Edworthy, Jonathan; Hall, Stephanie; Bennett, Danielle; Khan, James; Pye, Ellen
2017-11-01
Objective: Four sets of eight audible alarms matching the functions specified in IEC 60601-1-8 were designed using known principles from auditory cognition, with the intention that they would be more recognizable and localizable than those currently specified in the standard. Background: The audible alarms associated with IEC 60601-1-8, a global medical device standard, are known to be difficult to learn and retain, and there have been many calls to update them. There are known principles of design and cognition that might form the basis of more readily recognizable alarms. There is also scope for improvement in the localizability of the existing alarms. Method: Four alternative sets of alarms matched to the functions specified in IEC 60601-1-8 were tested for recognizability and localizability and compared with the alarms currently specified in the standard. Results: With a single exception, all prototype sets of alarms outperformed the current IEC set on both recognizability and localizability. Within the prototype sets, auditory icons were the most easily recognized, but the other sets, using word rhythms and simple acoustic metaphors, were also more easily recognized than the current alarms. With the exception of one set, all prototype sets were also easier to localize. Conclusion: Known auditory cognition and perception principles were successfully applied to an existing audible alarm problem. Application: This work constitutes the first (benchmarking) phase of replacing the alarms currently specified in the standard. The design principles used for each set demonstrate the relative ease with which different alarm types can be recognized and localized.
DESIGN NOTE: New apparatus for haze measurement for transparent media
NASA Astrophysics Data System (ADS)
Yu, H. L.; Hsiao, C. C.; Liu, W. C.
2006-08-01
Precise measurement of the luminous transmittance and haze of transparent media is increasingly important to the LCD industry. Currently there are at least three documentary standards for measuring transmission haze. Unfortunately, none of those standard methods by itself can obtain precise values for the diffuse transmittance (DT), total transmittance (TT) and haze. This note presents a new apparatus capable of precisely measuring all three quantities simultaneously. Compared with current structures, the proposed design contains one additional compensatory port. In the optimal design, the light trap absorbs the beam completely, no light is scattered by the instrument itself, and the interior surface of the integrating sphere, the baffle and the reflectance standard all have the same reflectance characteristics. Accurate values of the TT, DT and haze can be obtained using the new apparatus. Even if the design is not optimal, the measurement errors of the new apparatus are smaller than those of other methods, especially for high sphere reflectance. The sphere can therefore be made of a high-reflectance material to increase the signal-to-noise ratio.
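The relationship between the three quantities above is simple: transmission haze is conventionally reported as the diffuse fraction of the total transmitted light. A minimal sketch of that definition (the apparatus corrections described in the note are omitted):

```python
def haze_percent(diffuse_t, total_t):
    """Transmission haze in percent: the fraction of transmitted light that
    is scattered (diffuse transmittance DT over total transmittance TT)."""
    return 100.0 * diffuse_t / total_t
```

This is why an instrument that measures DT and TT simultaneously, as the proposed apparatus does, yields haze with no extra measurement step.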
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Zhen-hua; Li, Hong-bin; Zhang, Zhi
Electronic transformers are widely used in power systems because of their wide bandwidth and good transient performance. However, as an emerging technology, electronic transformers have a higher failure rate than traditional transformers. As a result, the calibration period needs to be shortened. Traditional calibration methods require the power of the transmission line to be cut off, which results in complicated operation and power-off losses. This paper proposes an online calibration system which can calibrate electronic current transformers without a power cut. In this work, the high-accuracy standard current transformer and the online operation method are the key techniques. Based on the clamp-shape iron-core coil and the clamp-shape air-core coil, a combined clamp-shape coil is designed as the standard current transformer. By analyzing the output characteristics of the two coils, the combined clamp-shape coil can verify its own accuracy, so the accuracy of the online calibration system can be guaranteed. Moreover, by employing the earth-potential working method and using two insulating rods to connect the combined clamp-shape coil to the high-voltage bus, the operation becomes simple and safe. Tests at the China National Center for High Voltage Measurement and field experiments show that the proposed system achieves a high accuracy of up to class 0.05.
Harrison, Jesse P; Boardman, Carl; O'Callaghan, Kenneth; Delort, Anne-Marie; Song, Jim
2018-05-01
Plastic litter is encountered in aquatic ecosystems across the globe, including polar environments and the deep sea. To mitigate the adverse societal and ecological impacts of this waste, there has been debate on whether 'biodegradable' materials should be granted exemptions from plastic bag bans and levies. However, great care must be exercised when attempting to define this term, due to the broad and complex range of physical and chemical conditions encountered within natural ecosystems. Here, we review existing international industry standards and regional test methods for evaluating the biodegradability of plastics within aquatic environments (wastewater, unmanaged freshwater and marine habitats). We argue that current standards and test methods are insufficient in their ability to realistically predict the biodegradability of carrier bags in these environments, due to several shortcomings in experimental procedures and a paucity of information in the scientific literature. Moreover, existing biodegradability standards and test methods for aquatic environments do not involve toxicity testing or account for the potentially adverse ecological impacts of carrier bags, plastic additives, polymer degradation products or small (microscopic) plastic particles that can arise via fragmentation. Successfully addressing these knowledge gaps is a key requirement for developing new biodegradability standard(s) for lightweight carrier bags.
Establishment of a bioassay for the toxicity evaluation and quality control of Aconitum herbs.
Qin, Yi; Wang, Jia-bo; Zhao, Yan-ling; Shan, Li-mei; Li, Bao-cai; Fang, Fang; Jin, Cheng; Xiao, Xiao-he
2012-01-15
Currently, no bioassay is available for evaluating the toxicity of Aconitum herbs, which are well known for their lethal cardiotoxicity and neurotoxicity. In this study, we established a bioassay to evaluate the toxicity of Aconitum herbs. Test sample and standard solutions were administered to rats by intravenous infusion to determine their minimum lethal doses (MLD). Toxic potency was calculated by comparing the MLD. The experimental conditions of the method were optimized and standardized to ensure the precision and reliability of the bioassay. The application of the standardized bioassay was then tested by analyzing 18 samples of Aconitum herbs. Additionally, three major toxic alkaloids (aconitine, mesaconitine, and hypaconitine) in Aconitum herbs were analyzed using a liquid chromatographic method, which is the current method of choice for evaluating the toxicity of Aconitum herbs. We found that for all Aconitum herbs, the total toxicity of the extract was greater than the toxicity of the three alkaloids. Therefore, these three alkaloids failed to account for the total toxicity of Aconitum herbs. Compared with individual chemical analysis methods, the chief advantage of the bioassay is that it characterizes the total toxicity of Aconitum herbs. An incorrect toxicity evaluation caused by quantitative analysis of the three alkaloids might be effectively avoided by performing this bioassay. This study revealed that the bioassay is a powerful method for the safety assessment of Aconitum herbs. Copyright © 2011 Elsevier B.V. All rights reserved.
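The potency comparison described above reduces to a ratio of minimum lethal doses. A minimal sketch, with hypothetical dose values (the paper does not report these numbers):

```python
# Toxic potency taken as the ratio of minimum lethal doses (MLD),
# standard vs. sample: a sample needing a smaller lethal dose than the
# standard is proportionally more toxic. Doses are hypothetical (mg/kg).

def toxic_potency(mld_standard: float, mld_sample: float) -> float:
    return mld_standard / mld_sample

mld_standard = 1.2   # hypothetical MLD of the reference solution
mld_sample = 0.8     # hypothetical MLD of an Aconitum extract
print(toxic_potency(mld_standard, mld_sample))  # ~1.5x as toxic as the standard
```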
Im, Hyung-Jun; Bradshaw, Tyler; Solaiyappan, Meiyappan; Cho, Steve Y
2018-02-01
Numerous methods to segment tumors using 18F-fluorodeoxyglucose positron emission tomography (FDG PET) have been introduced. Metabolic tumor volume (MTV) refers to the metabolically active volume of the tumor segmented using FDG PET, and has been shown to be useful in predicting patient outcome and in assessing treatment response. Also, tumor segmentation using FDG PET has useful applications in radiotherapy treatment planning. Despite extensive research on MTV showing promising results, MTV is not yet used in standard clinical practice, mainly because there is no consensus on the optimal method to segment tumors in FDG PET images. In this review, we discuss currently available methods to measure MTV using FDG PET, and assess the advantages and disadvantages of the methods.
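One widely studied family of MTV methods is fixed-threshold segmentation, which keeps voxels above a fraction of SUVmax. The 40% threshold, the toy one-dimensional "image", and the voxel volume below are assumptions for illustration; the review surveys many competing approaches.

```python
# Minimal sketch of fixed-threshold MTV segmentation: keep voxels with
# SUV >= fraction * SUVmax, then multiply the voxel count by the voxel
# volume. Threshold and data are illustrative assumptions.

def mtv_fixed_threshold(suv, voxel_volume_ml, fraction=0.40):
    suv_max = max(suv)
    thresh = fraction * suv_max
    segmented = [v for v in suv if v >= thresh]
    return len(segmented) * voxel_volume_ml

# Toy 1-D list of SUVs standing in for a PET volume; 0.1 ml voxels.
suvs = [0.5, 1.0, 4.2, 8.0, 10.0, 9.1, 3.9, 0.8]
print(mtv_fixed_threshold(suvs, voxel_volume_ml=0.1))  # → 0.4 (ml)
```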
Living systematic reviews: 3. Statistical methods for updating meta-analyses.
Simmonds, Mark; Salanti, Georgia; McKenzie, Joanne; Elliott, Julian
2017-11-01
A living systematic review (LSR) should keep the review current as new research evidence emerges. Any meta-analyses included in the review will also need updating as new material is identified. If the aim of the review is solely to present the best current evidence, standard meta-analysis may be sufficient, provided reviewers are aware that results may change at later updates. If the review is used in a decision-making context, more caution may be needed. When using standard meta-analysis methods, the chance of incorrectly concluding that any updated meta-analysis is statistically significant when there is no effect (the type I error) increases rapidly as more updates are performed. Inaccurate estimation of any heterogeneity across studies may also lead to inappropriate conclusions. This paper considers four methods to avoid some of these statistical problems when updating meta-analyses: two methods (the law of the iterated logarithm and the Shuster method) control primarily for inflation of the type I error, while two others (trial sequential analysis and sequential meta-analysis) control for both type I and type II errors (failing to detect a genuine effect) and take account of heterogeneity. This paper compares the methods and considers how they could be applied to LSRs. Copyright © 2017 Elsevier Inc. All rights reserved.
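The type I error inflation the paper describes can be demonstrated with a small simulation: under a true null effect, re-testing a cumulative analysis at every update raises the chance of at least one spurious "significant" result well above the nominal 5%. Study counts, sizes, and the simple z-test below are arbitrary illustration choices, not the paper's methods.

```python
import math
import random

# Under the null (true effect 0), accumulate studies and re-test the
# pooled mean at each of several updates; count how often ANY update
# crosses the two-sided 5% critical value.

def any_update_significant(n_updates, n_per_study, rng, zcrit=1.959964):
    total, count = 0.0, 0
    for _ in range(n_updates):
        for _ in range(n_per_study):
            total += rng.gauss(0.0, 1.0)  # null: effect 0, unit variance
            count += 1
        z = (total / count) * math.sqrt(count)  # z statistic for pooled mean
        if abs(z) > zcrit:
            return True
    return False

rng = random.Random(1)
trials = 2000
rate = sum(any_update_significant(10, 30, rng) for _ in range(trials)) / trials
print(rate)  # well above 0.05 despite the null being true
```

With ten interim looks, the familywise error rate roughly triples or quadruples relative to the nominal level, which is why the sequential methods the paper reviews are needed.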
Current status of antifungal susceptibility testing methods.
Arikan, Sevtap
2007-11-01
Antifungal susceptibility testing is a very dynamic field of medical mycology. Standardization of in vitro susceptibility tests by the Clinical and Laboratory Standards Institute (CLSI) and the European Committee for Antimicrobial Susceptibility Testing (EUCAST), and current availability of reference methods constituted the major remarkable steps in the field. Based on the established minimum inhibitory concentration (MIC) breakpoints, it is now possible to determine the susceptibilities of Candida strains to fluconazole, itraconazole, voriconazole, and flucytosine. Moreover, utility of fluconazole antifungal susceptibility tests as an adjunct in optimizing treatment of candidiasis has now been validated. While the MIC breakpoints and clinical significance of susceptibility testing for the remaining fungi and antifungal drugs remain yet unclear, modifications of the available methods as well as other methodologies are being intensively studied to overcome the present drawbacks and limitations. Among the other methods under investigation are Etest, colorimetric microdilution, agar dilution, determination of fungicidal activity, flow cytometry, and ergosterol quantitation. Etest offers the advantage of practical application and favorable agreement rates with the reference methods that are frequently above acceptable limits. However, MIC breakpoints for Etest remain to be evaluated and established. Development of commercially available, standardized colorimetric panels that are based on CLSI method parameters has added more to the antifungal susceptibility testing armamentarium. Flow cytometry, on the other hand, appears to offer rapid susceptibility testing but requires specified equipment and further evaluation for reproducibility and standardization. Ergosterol quantitation is another novel approach, which appears potentially beneficial particularly in discrimination of azole-resistant isolates from heavy trailers. 
The method is still investigational and requires further study. Developments in the methodology and applications of antifungal susceptibility testing will hopefully provide enhanced utility in the clinical guidance of antifungal therapy. However, and particularly in the immunosuppressed host, in vitro susceptibility is and will remain only one of several factors that influence clinical outcome.
Perich, C; Ricós, C; Alvarez, V; Biosca, C; Boned, B; Cava, F; Doménech, M V; Fernández-Calle, P; Fernández-Fernández, P; García-Lario, J V; Minchinela, J; Simón, M; Jansen, R
2014-05-15
Current external quality assurance schemes have been classified into six categories, according to their ability to verify the degree of standardization of the participating measurement procedures. SKML (Netherlands) is a Category 1 EQA scheme (commutable EQA materials with values assigned by reference methods), whereas SEQC (Spain) is a Category 5 scheme (replicate analyses of non-commutable materials with no values assigned by reference methods). The results obtained by a group of Spanish laboratories participating in a pilot study organized by SKML are examined, with the aim of pointing out the improvements over our current scheme that a Category 1 program could provide. Imprecision and bias are calculated for each analyte and laboratory, and compared with quality specifications derived from biological variation. Of the 26 analytes studied, 9 had results comparable with those from reference methods, and 10 analytes did not have comparable results. The remaining 7 analytes measured did not have available reference method values, and in these cases, comparison with the peer group showed comparable results. The reasons for disagreement in the second group can be summarized as: use of non-standard methods (IFCC without exogenous pyridoxal phosphate for AST and ALT, Jaffé kinetic at low-normal creatinine concentrations and with eGFR); non-commutability of the reference material used to assign values to the routine calibrator (calcium, magnesium and sodium); use of reference materials without established commutability instead of reference methods for AST and GGT, and lack of a systematic effort by manufacturers to harmonize results. Results obtained in this work demonstrate the important role of external quality assurance programs using commutable materials with values assigned by reference methods to correctly monitor the standardization of laboratory tests with consequent minimization of risk to patients. Copyright © 2013 Elsevier B.V. All rights reserved.
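The quality specifications "derived from biological variation" that the study compares against are commonly computed with the desirable-level formulas of the Stockholm/Fraser tradition. A hedged sketch, with illustrative CVs rather than the paper's analytes:

```python
import math

# Desirable analytical performance specifications from biological
# variation: imprecision <= 0.5 * within-subject CV; bias <=
# 0.25 * sqrt(CVw^2 + CVg^2); total allowable error combines both.
# The CVs below are hypothetical illustration values.

def desirable_specs(cv_within: float, cv_between: float):
    imprecision = 0.5 * cv_within
    bias = 0.25 * math.sqrt(cv_within**2 + cv_between**2)
    total_error = bias + 1.65 * imprecision
    return imprecision, bias, total_error

# Hypothetical analyte: within-subject CV 6%, between-subject CV 15%.
imp, bias, tea = desirable_specs(6.0, 15.0)
print(round(imp, 2), round(bias, 2), round(tea, 2))  # → 3.0 4.04 8.99
```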
An objectively-analyzed method for measuring the useful penetration of x-ray imaging systems.
Glover, Jack L; Hudson, Lawrence T
2016-06-01
The ability to detect wires is an important capability of the cabinet x-ray imaging systems that are used in aviation security as well as the portable x-ray systems that are used by domestic law enforcement and military bomb squads. A number of national and international standards describe methods for testing this capability using the so called useful penetration test metric, where wires are imaged behind different thicknesses of blocking material. Presently, these tests are scored based on human judgments of wire visibility, which are inherently subjective. We propose a new method in which the useful penetration capabilities of an x-ray system are objectively evaluated by an image processing algorithm operating on digital images of a standard test object. The algorithm advantageously applies the Radon transform for curve parameter detection that reduces the problem of wire detection from two dimensions to one. The sensitivity of the wire detection method is adjustable and we demonstrate how the threshold parameter can be set to give agreement with human-judged results. The method was developed to be used in technical performance standards and is currently under ballot for inclusion in a US national aviation security standard.
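The key algorithmic idea above — that a Radon-style projection turns a faint straight wire into a sharp one-dimensional peak — can be sketched in a few lines. The toy image, the projection binning, and the 0.5 detection threshold are assumptions for illustration, not the standard's algorithm.

```python
import math

# Project a sparse image along many angles; a straight wire concentrates
# its mass into a single projection bin at the aligned angle, so the
# largest normalised bin mass serves as a crude detection score.

def radon_peak(image, n_angles=36):
    """Largest single projection-bin mass over all angles (normalised)."""
    pts = [(x, y, v) for y, row in enumerate(image)
           for x, v in enumerate(row) if v > 0]
    total = sum(v for _, _, v in pts)
    best = 0.0
    for k in range(n_angles):
        theta = math.pi * k / n_angles
        bins = {}
        for x, y, v in pts:
            rho = round(x * math.cos(theta) + y * math.sin(theta))
            bins[rho] = bins.get(rho, 0.0) + v
        best = max(best, max(bins.values()) / total)
    return best

# 8x8 toy image: a vertical "wire" at column 3 plus two noise pixels.
img = [[0] * 8 for _ in range(8)]
for y in range(8):
    img[y][3] = 1
img[2][6] = 1
img[5][0] = 1
print(radon_peak(img) > 0.5)  # the wire dominates one projection bin
```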
An objectively-analyzed method for measuring the useful penetration of x-ray imaging systems
Glover, Jack L.; Hudson, Lawrence T.
2016-01-01
The ability to detect wires is an important capability of the cabinet x-ray imaging systems that are used in aviation security as well as the portable x-ray systems that are used by domestic law enforcement and military bomb squads. A number of national and international standards describe methods for testing this capability using the so called useful penetration test metric, where wires are imaged behind different thicknesses of blocking material. Presently, these tests are scored based on human judgments of wire visibility, which are inherently subjective. We propose a new method in which the useful penetration capabilities of an x-ray system are objectively evaluated by an image processing algorithm operating on digital images of a standard test object. The algorithm advantageously applies the Radon transform for curve parameter detection that reduces the problem of wire detection from two dimensions to one. The sensitivity of the wire detection method is adjustable and we demonstrate how the threshold parameter can be set to give agreement with human-judged results. The method was developed to be used in technical performance standards and is currently under ballot for inclusion in a US national aviation security standard. PMID:27499586
Taming Many-Parameter BSM Models with Bayesian Neural Networks
NASA Astrophysics Data System (ADS)
Kuchera, M. P.; Karbo, A.; Prosper, H. B.; Sanchez, A.; Taylor, J. Z.
2017-09-01
The search for physics Beyond the Standard Model (BSM) is a major focus of large-scale high energy physics experiments. One method is to look for specific deviations from the Standard Model that are predicted by BSM models. In cases where the model has a large number of free parameters, standard search methods become intractable due to computation time. This talk presents results using Bayesian Neural Networks, a supervised machine learning method, to enable the study of higher-dimensional models. The popular phenomenological Minimal Supersymmetric Standard Model was studied as an example of the feasibility and usefulness of this method. Graphics Processing Units (GPUs) are used to expedite the calculations. Cross-section predictions for 13 TeV proton collisions will be presented. My participation in the Conference Experience for Undergraduates (CEU) in 2004-2006 exposed me to the national and global significance of cutting-edge research. At the 2005 CEU, I presented work from the previous summer's SULI internship at Lawrence Berkeley Laboratory, where I learned to program while working on the Majorana Project. That work inspired me to follow a similar research path, which led me to my current work on computational methods applied to BSM physics.
An objectively-analyzed method for measuring the useful penetration of x-ray imaging systems
NASA Astrophysics Data System (ADS)
Glover, Jack L.; Hudson, Lawrence T.
2016-06-01
The ability to detect wires is an important capability of the cabinet x-ray imaging systems that are used in aviation security as well as the portable x-ray systems that are used by domestic law enforcement and military bomb squads. A number of national and international standards describe methods for testing this capability using the so called useful penetration test metric, where wires are imaged behind different thicknesses of blocking material. Presently, these tests are scored based on human judgments of wire visibility, which are inherently subjective. We propose a new method in which the useful penetration capabilities of an x-ray system are objectively evaluated by an image processing algorithm operating on digital images of a standard test object. The algorithm advantageously applies the Radon transform for curve parameter detection that reduces the problem of wire detection from two dimensions to one. The sensitivity of the wire detection method is adjustable and we demonstrate how the threshold parameter can be set to give agreement with human-judged results. The method was developed to be used in technical performance standards and is currently under ballot for inclusion in an international aviation security standard.
An Expert System for Classifying Stars on the MK Spectral Classification System
NASA Astrophysics Data System (ADS)
Corbally, Christopher J.; Gray, R. O.
2013-01-01
We will describe an expert computer system designed to classify stellar spectra on the MK Spectral Classification system employing methods similar to those of humans who make direct comparison with the MK classification standards. Like an expert human classifier, MKCLASS first comes up with a rough spectral type, and then refines that type by direct comparison with MK standards drawn from a standards library using spectral criteria appropriate to the spectral class. Certain common spectral-type peculiarities can also be detected by the program. The program is also capable of identifying WD spectra and carbon stars and giving appropriate (but currently approximate) spectral types on the relevant systems. We will show comparisons between spectral types (including luminosity types) performed by MKCLASS and humans. The program currently is capable of competent classifications in the violet-green region, but plans are underway to extend the spectral criteria into the red and near-infrared regions. Two standard libraries with resolutions of 1.8 and 3.6Å are now available, but a higher-resolution standard library, using the new spectrograph on the Vatican Advanced Technology Telescope, is currently under preparation. Once that library is available, MKCLASS and the spectral libraries will be made available to the astronomical community.
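The comparison step described above — refining a type by direct comparison against a standards library — can be caricatured as nearest-template matching under a chi-square-like distance. The tiny "spectra" and type labels below are invented placeholders, not MKCLASS data.

```python
# Pick the library standard whose spectrum minimises a chi-square-like
# distance to the target spectrum. Spectra and labels are toy values.

def closest_standard(target, library):
    def dist(a, b):
        return sum((x - y) ** 2 / max(y, 1e-9) for x, y in zip(a, b))
    return min(library, key=lambda item: dist(target, item[1]))[0]

library = [
    ("A0 V", [1.0, 0.9, 0.7, 0.5]),
    ("G2 V", [0.6, 0.8, 1.0, 0.9]),
    ("K5 III", [0.3, 0.5, 0.9, 1.0]),
]
target = [0.58, 0.79, 1.02, 0.88]
print(closest_standard(target, library))  # → G2 V
```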
Accounting principles, revenue recognition, and the profitability of pharmacy benefit managers.
McLean, Robert A; Garis, Robert I
2005-03-01
To contrast pharmacy benefit management (PBM) companies' measured profitability by using two accounting standards. The first accounting standard is that which, under Generally Accepted Accounting Principles (GAAP), PBMs are currently allowed to employ. The second accounting standard, seemingly more congruent with the PBM business model, treats the PBM as an agent of the plan sponsor. Financial Accounting Standards Board (FASB) Emerging Issues Task Force Issue 99-19, U.S. Securities and Exchange Commission 10-K filings, and financial accounting literature. Under GAAP record keeping, PBM industry profitability appears modest. Using currently applied GAAP, the PBM treats all payment from the plan sponsor as revenue and all payment to the pharmacy as an expense. However, the PBM functions, in practice, as an entity that passes through money collected from one party (the sponsor) to other parties (dispensing pharmacies). Therefore, it would seem that the nature of PBM cash flows would be more accurately recorded as a pass-through entity. When the PBM is evaluated using an accounting method that recognizes the pass-through nature of its business, the PBM profit margin increases dramatically. Current GAAP standards make traditional financial statement analysis of PBMs unrevealing, and may hide genuinely outstanding financial performance. Investors, regulators, pharmacies, and the FASB all have an interest in moving to clarify this accounting anomaly.
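The margin contrast the abstract describes is simple arithmetic: the same cash flows produce very different margins under gross (current GAAP treatment) versus net (pass-through/agent) revenue recognition. Dollar amounts below are hypothetical.

```python
# Same PBM cash flows, two revenue-recognition treatments.
sponsor_payment = 100.0   # received from the plan sponsor
pharmacy_payment = 95.0   # passed through to dispensing pharmacies

profit = sponsor_payment - pharmacy_payment

gross_revenue = sponsor_payment   # gross: PBM books the full flow
net_revenue = profit              # net: PBM books only what it keeps

print(profit / gross_revenue)   # 0.05 -> a "modest" 5% margin
print(profit / net_revenue)     # 1.0  -> 100% margin on retained revenue
```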
Challenges to Global Implementation of Infrared Thermography Technology: Current Perspective
Shterenshis, Michael
2017-01-01
Medical infrared thermography (IT) produces an image of the infrared waves emitted by the human body as part of the thermoregulation process that can vary in intensity based on the health of the person. This review analyzes recent developments in the use of infrared thermography as a screening and diagnostic tool in clinical and nonclinical settings, and identifies possible future routes for improvement of the method. Currently, infrared thermography is not considered to be a fully reliable diagnostic method. If standard infrared protocol is established and a normative database is available, infrared thermography may become a reliable method for detecting inflammatory processes. PMID:29138741
Challenges to Global Implementation of Infrared Thermography Technology: Current Perspective.
Shterenshis, Michael
2017-01-01
Medical infrared thermography (IT) produces an image of the infrared waves emitted by the human body as part of the thermoregulation process that can vary in intensity based on the health of the person. This review analyzes recent developments in the use of infrared thermography as a screening and diagnostic tool in clinical and nonclinical settings, and identifies possible future routes for improvement of the method. Currently, infrared thermography is not considered to be a fully reliable diagnostic method. If standard infrared protocol is established and a normative database is available, infrared thermography may become a reliable method for detecting inflammatory processes.
Zanotti, Cinzia; Amadori, Massimo
2015-03-01
Porcine Circovirus 2 (PCV2) is involved in porcine circovirus-associated disease, that causes great economic losses to the livestock industry worldwide. Vaccination against PCV2 proved to be very effective in reducing disease occurrence and it is currently performed on a large scale. Starting from a previous model concerning Foot-and Mouth Disease Virus antigens, we developed a rapid and simple method to quantify PCV2 whole virion particles in inactivated vaccines. This procedure, based on sucrose gradient analysis and fluorometric evaluation of viral genomic content, allows for a better standardization of the antigen payload in vaccine batches. It also provides a valid indication of virion integrity. Most important, such a method can be applied to whole virion vaccines regardless of the production procedures, thus enabling meaningful comparisons on a common basis. In a future batch consistency approach to PCV2 vaccine manufacture, our procedure represents a valuable tool to improve in-process controls and to guarantee conformity of the final product with passmarks for approval. This might have important repercussions in terms of reduced usage of animals for vaccine batch release, in the framework of the current 3Rs policy. Copyright © 2015 The International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.
DOT National Transportation Integrated Search
2017-09-01
The mechanistic-empirical pavement design method requires the elastic resilient modulus as the key input for characterization of geomaterials. Current density-based QA procedures do not measure resilient modulus. Additionally, the density-based metho...
METHOD FOR THE ANALYSIS OF ASBESTOS IN WATER USING MCE FILTERS
The current Federal Drinking Water Standard makes possible the use of methyl cellulose ester filters rather than the previously proposed Nuclepore™ filter. Updating of the previous counting rules brings them closer to AHERA specifications.
Inverse spin Hall effect in a closed loop circuit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Omori, Y.; Auvray, F.; Wakamura, T.
We present measurements of inverse spin Hall effects (ISHEs), in which the conversion of a spin current into a charge current via the ISHE is detected not as a voltage in a standard open circuit but directly as the charge current generated in a closed loop. The method is applied to the ISHEs of Bi-doped Cu and Pt. The derived expression of ISHE for the loop structure can relate the charge current flowing into the loop to the spin Hall angle of the SHE material and the resistance of the loop.
Sabzghabaei, Foroogh; Salajeghe, Mahla; Soltani Arabshahi, Seyed Kamran
2017-01-01
Background: In this study, ambulatory care training in Firoozgar hospital was evaluated based on Iranian national standards of undergraduate medical education related to ambulatory education using Baldrige Excellence Model. Moreover, some suggestions were offered to promote education quality in the current condition of ambulatory education in Firoozgar hospital and national standards using the gap analysis method. Methods: This descriptive analytic study was a kind of evaluation research performed using the standard check lists published by the office of undergraduate medical education council. Data were collected through surveying documents, interviewing, and observing the processes based on the Baldrige Excellence Model. After confirming the validity and reliability of the check lists, we evaluated the establishment level of the national standards of undergraduate medical education in the clinics of this hospital in the 4 following domains: educational program, evaluation, training and research resources, and faculty members. Data were analyzed according to the national standards of undergraduate medical education related to ambulatory education and the Baldrige table for scoring. Finally, the quality level of the current condition was determined as very appropriate, appropriate, medium, weak, and very weak. Results: In domains of educational program 62%, in evaluation 48%, in training and research resources 46%, in faculty members 68%, and in overall ratio, 56% of the standards were appropriate. Conclusion: The most successful domains were educational program and faculty members, but evaluation and training and research resources domains had a medium performance. Some domains and indicators were determined as weak and their quality needed to be improved, so it is suggested to provide the necessary facilities and improvements by attending to the quality level of the national standards of ambulatory education. PMID:29951400
Magnetostriction measurement by four probe method
NASA Astrophysics Data System (ADS)
Dange, S. N.; Radha, S.
2018-04-01
The present paper describes the design and setting up of an indigenously developed magnetostriction (MS) measurement setup using the four probe method at room temperature. A standard strain gauge is pasted with a special glue on the sample, and its change in resistance with applied magnetic field is measured using a Keithley nanovoltmeter and current source. An electromagnet with a field of up to 1.2 tesla is used to source the magnetic field. The sample is placed between the magnet poles using a self-designed and developed wooden probe stand, capable of moving in three mutually perpendicular directions. The nanovoltmeter and current source are interfaced with a PC using an RS232 serial interface. Software has been developed for logging and processing of data. Proper optimization of the measurement has been done through software to reduce the noise due to thermal emf and electromagnetic induction. The data acquired for some standard magnetic samples are presented. The sensitivity of the setup is 1 microstrain, with an error in measurement of up to 5%.
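The strain-gauge reading maps to magnetostriction through the standard relation strain = (ΔR/R) / gauge factor. The gauge factor and resistance values below are illustrative assumptions, not those of the setup described above.

```python
# Convert a strain-gauge resistance change to strain in microstrain:
# strain = (dR / R0) / gauge_factor. Values are hypothetical.

def microstrain(delta_r: float, r0: float, gauge_factor: float = 2.0) -> float:
    return (delta_r / r0) / gauge_factor * 1e6

# Hypothetical gauge: 120 ohm nominal, 4.8 milliohm change at full field.
print(round(microstrain(0.0048, 120.0), 2))  # → 20.0 microstrain
```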
[Establishment of Assessment Method for Air Bacteria and Fungi Contamination].
Zhang, Hua-ling; Yao, Da-jun; Zhang, Yu; Fang, Zi-liang
2016-03-15
In this paper, in order to address existing problems in the assessment of air bacteria and fungi contamination, the indoor and outdoor air bacteria and fungi field concentrations measured by the impact method and the settlement method in existing documents were collected and analyzed. The chi-square goodness-of-fit test was then used to check whether these concentration data obeyed a normal distribution at the significance level of α = 0.05, and, combined with the 3σ principle of the normal distribution and the current assessment standards, the suggested ranges of air microbial concentrations were determined. The research results could provide a reference for developing air bacteria and fungi contamination assessment standards in the future.
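The 3σ step above is straightforward once normality is accepted: mean ± 3·sd brackets roughly 99.7% of values. A minimal sketch, with a hypothetical list of concentrations (not the paper's data):

```python
import statistics

# 3-sigma screening range for concentration data assumed normal.
# The concentration list is a hypothetical illustration (CFU/m3).
data = [480, 520, 510, 495, 530, 505, 490, 515, 500, 525]
mu = statistics.mean(data)
sigma = statistics.pstdev(data)  # population sd, as for a full survey
low, high = mu - 3 * sigma, mu + 3 * sigma
print(round(low, 1), round(high, 1))  # mean 507, sd ~15.2 -> ~461.4 to ~552.6
```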
Discharge measurements at gaging stations
Turnipseed, D. Phil; Sauer, Vernon B.
2010-01-01
The techniques and standards for making discharge measurements at streamflow gaging stations are described in this publication. The vertical axis rotating-element current meter, principally the Price current meter, has been traditionally used for most measurements of discharge; however, advancements in acoustic technology have led to important developments in the use of acoustic Doppler current profilers, acoustic Doppler velocimeters, and other emerging technologies for the measurement of discharge. These new instruments, based on acoustic Doppler theory, have the advantage of no moving parts, and in the case of the acoustic Doppler current profiler, quickly and easily provide three-dimensional stream-velocity profile data through much of the vertical water column. For much of the discussion of acoustic Doppler current profiler moving-boat methodology, the reader is referred to U.S. Geological Survey Techniques and Methods 3-A22 (Mueller and Wagner, 2009). Personal digital assistants (PDAs), electronic field notebooks, and other personal computers provide fast and efficient data-collection methods that are more error-free than traditional hand methods. The use of portable weirs and flumes, floats, volumetric tanks, indirect methods, and tracers in measuring discharge are briefly described.
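The traditional current-meter measurement summarized above rests on the velocity-area computation; the midsection method is one common form, where each vertical contributes velocity × depth × the half-widths to its neighbours. The station data below are hypothetical, not from the manual.

```python
# Midsection velocity-area discharge: each observation vertical gets a
# width equal to half the distance to each neighbouring vertical.
# Station data are hypothetical.

def midsection_discharge(stations):
    """stations: list of (distance_m, depth_m, velocity_m_per_s)."""
    q = 0.0
    for i, (b, d, v) in enumerate(stations):
        left = stations[i - 1][0] if i > 0 else b
        right = stations[i + 1][0] if i < len(stations) - 1 else b
        width = (right - left) / 2.0
        q += v * d * width
    return q

obs = [(0.0, 0.0, 0.0), (2.0, 0.8, 0.4), (4.0, 1.2, 0.6),
       (6.0, 0.9, 0.5), (8.0, 0.0, 0.0)]
print(midsection_discharge(obs))  # discharge in cubic metres per second
```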
NASA Technical Reports Server (NTRS)
Troy, B. E., Jr.; Maier, E. J.
1975-01-01
The effects of the grid transparency and finite collector size on the values of thermal ion density and temperature determined by the standard RPA (retarding potential analyzer) analysis method are investigated. The current-voltage curves calculated for varying RPA parameters and a given ion mass, temperature, and density are analyzed by the standard RPA method. It is found that only small errors in temperature and density are introduced for an RPA with typical dimensions, and that even when the density error is substantial for nontypical dimensions, the temperature error remains minimum.
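The "standard RPA analysis method" evaluated above exploits the fact that, in the retarding region, collected current falls exponentially with grid voltage, so ion temperature follows from the slope of ln(I) versus V. The sketch below fits that slope on a synthetic curve; all instrument values are invented for illustration.

```python
import math

Q = 1.602e-19   # elementary charge, C
K = 1.381e-23   # Boltzmann constant, J/K

def fit_temperature(volts, currents):
    """Least-squares slope of ln(I) vs V, converted to ion temperature."""
    n = len(volts)
    xm = sum(volts) / n
    ym = sum(math.log(i) for i in currents) / n
    num = sum((v - xm) * (math.log(i) - ym) for v, i in zip(volts, currents))
    den = sum((v - xm) ** 2 for v in volts)
    slope = num / den
    return -Q / (K * slope)

# Synthetic retarding-region curve for 1000 K ions.
T_true, i0 = 1000.0, 1e-9
volts = [0.1 * k for k in range(10)]
amps = [i0 * math.exp(-Q * v / (K * T_true)) for v in volts]
print(round(fit_temperature(volts, amps)))  # recovers ~1000 K
```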
A Proven Method for Meeting Export Control Objectives in Postal and Shipping Sectors
2015-02-01
months, the USPIS team developed and implemented an export screening standard operating procedure, implemented new and updated processes and systems ... support and protect the U.S. Postal Service and its employees, infrastructure, and customers; enforce the laws that defend the nation's mail system ... the incidence of mail shipments violating export control laws, regulations, and standards. • Evaluate current processes and systems and identify
ERIC Educational Resources Information Center
Meester-Delver, Anke; Beelen, Anita; Hennekam, Raoul; Nollet, Frans; Hadders-Algra, Mijna
2007-01-01
The aim of this study was to determine the interrater reliability and stability over time of the Capacity Profile (CAP). The CAP is a standardized method for classifying additional care needs indicated by current impairments in five domains of body functions: physical health, neuromusculoskeletal and movement-related, sensory, mental, and voice…
ERIC Educational Resources Information Center
Owen-Stone, Deborah S.
2012-01-01
The purpose of this concurrent mixed methods study was to examine the collaborative relationship between scientists and science teachers and to incorporate and advocate scientific literacy based on past and current educational theories such as inquiry based teaching. The scope of this study included archived student standardized test scores,…
Recommendations for the treatment of aging in standard technical specifications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orton, R.D.; Allen, R.P.
1995-09-01
As part of the US Nuclear Regulatory Commission's Nuclear Plant Aging Research Program, Pacific Northwest Laboratory (PNL) evaluated the standard technical specifications for nuclear power plants to determine whether the current surveillance requirements (SRs) were effective in detecting age-related degradation. Nuclear Plant Aging Research findings for selected systems and components were reviewed to identify the stressors and operative aging mechanisms and to evaluate the methods available to detect, differentiate, and trend the resulting aging degradation. Current surveillance and testing requirements for these systems and components were reviewed for their effectiveness in detecting degraded conditions and for potential contributions to premature degradation. When the current surveillance and testing requirements appeared ineffective in detecting aging degradation or potentially could contribute to premature degradation, a possible deficiency in the SRs was identified that could result in undetected degradation. Based on this evaluation, PNL developed recommendations for inspection, surveillance, trending, and condition monitoring methods to be incorporated in the SRs to better detect age-related degradation of these selected systems and components.
Proposed design procedure for transmission shafting under fatigue loading
NASA Technical Reports Server (NTRS)
Loewenthal, S. H.
1978-01-01
The B106 American National Standards Committee is currently preparing a new standard for the design of transmission shafting. A design procedure, proposed for use in the new standard, for computing the diameter of rotating solid steel shafts under combined cyclic bending and steady torsion is presented. The formula is based on an elliptical variation of endurance strength with torque exhibited by combined stress fatigue data. Fatigue factors are cited to correct specimen bending endurance strength data for use in the shaft formula. A design example illustrates how the method is to be applied.
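An elliptical combined-loading shaft formula of the kind described above checks cyclic bending against corrected endurance strength and steady torsion against yield. The sketch below uses one common statement of such a criterion; the constants, safety factor, and material values are illustrative assumptions, and the proposed standard's exact formulation may differ.

```python
import math

# Solid-shaft diameter under combined cyclic bending (vs. endurance
# strength) and steady torsion (vs. yield), elliptically combined.
# Inputs are hypothetical; the standard's exact constants may differ.

def shaft_diameter(m_bend, t_steady, s_endurance, s_yield, fs=2.0):
    """Diameter (m) for bending moment and steady torque (N*m),
    strengths in Pa, with safety factor fs."""
    term = math.sqrt((m_bend / s_endurance) ** 2
                     + 0.75 * (t_steady / s_yield) ** 2)
    return ((32.0 * fs / math.pi) * term) ** (1.0 / 3.0)

# Hypothetical steel shaft: 200 N*m bending, 400 N*m steady torsion.
d = shaft_diameter(200.0, 400.0, 2.0e8, 3.5e8, fs=2.0)
print(round(d * 1000.0, 1))  # diameter in millimetres
```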
Rueda, A
2001-01-01
Researchers have identified at least twenty-five pathogens that can be transmitted through blood transfusions. Four percent of patients who receive the average amount of blood during a transfusion are at risk of being infected with a contaminated unit, and exposed to the danger of serious adverse reactions, including future debilitating conditions. Victims of transfusion-related diseases, however, generally have been unsuccessful when making claims against the purveyors of blood products because of blood shield statutes that were initially enacted in response to unknown pathogens that made the blood an "unavoidably unsafe" product. Today, blood purveyors are aware of the possibility of epidemics from unsafe blood and have continued to research and supervise the blood supply to create mechanisms that detect and inactivate various blood-borne pathogens. In response to the current and advancing methods of blood purification, this Article suggests that a hybrid strict liability/negligence standard be implemented to ensure advancements in safety of blood transfusions. A strict liability standard should attach for infections that can be detected and eliminated through current testing and inactivation methods. A negligence standard should govern infections for which no current test or inactivating method is available. Under this approach, blood purveyors would be compelled to take account of the risks of any manufacturing decisions that they make, and they would not enjoy the freedom from liability that the blood shield statutes now provide. The costs necessary to ensure compliance with this hybrid structure are small in comparison to the social and economic costs exacted by thousands of transfusion-related diseases.
Adjuncts to colonic cleansing before colonoscopy.
Park, Sanghoon; Lim, Yun Jeong
2014-03-21
Pre-procedural cleansing of the bowel can maximize the effectiveness and efficiency of colonoscopy. Yet, efficacy of the current gold standard colonic preparation method - high-volume oral administration of purgative agents 12-24 h prior to the procedure - is limited by several factors, such as patient compliance (due to poor palatability and inconvenience of the dosing regimen) and risks of complications (due to drug interactions or intolerance). Attempts to resolve these limitations have included providing adjunctive agents and methods to promote the colonic cleansing ability of the principal purgative agent, with the aim of lessening unpleasant side effects (such as bloating) and reducing the large ingested volume requirement. Several promising adjunctive agents are bisacodyl, magnesium citrate, senna, simethicone, metoclopramide, and prokinetics, and each are being investigated for their potential. This review provides an up to date summary of the reported investigations into the potencies and weaknesses of the key adjuncts currently being applied in clinic as supplements to the traditional bowel preparation agents. While the comparative analysis of these adjuncts showed that no single agent or method has yet achieved the goal of completely overcoming the limitations of the current gold standard preparation method, they at least provide endoscopists with an array of alternatives to help improve the suboptimal efficacy of the main cleansing solutions when used alone. To aid in this clinical endeavor, a subjective grade was assigned to each adjunct to indicate its practical value. In addition, the systematic review of the currently available agents and methods provides insight into the features of each that may be overcome or exploited to create novel drugs and strategies that may become adopted as effective bowel cleansing adjuncts or alternatives. PMID:24659864
The Manufacture, Shipping and Receiving and Quality Control of Rodent Bedding Materials
NASA Technical Reports Server (NTRS)
Kraft, Lisbeth M.
1980-01-01
The criteria for rodent bedding and nesting materials are discussed. The literature is reviewed regarding sources of bedding materials, manufacturing methods, quality control, procedures (microbiological, physical and chemical), storage, methods, shipment, methods of use and disposal, current knowledge concerning bedding effects on animals as related to research and testing and legal aspects. Future needs, especially with respect to the promulgation of standards, also are addressed.
Plastic and reconstructive robotic microsurgery--a review of current practices.
Saleh, D B; Syed, M; Kulendren, D; Ramakrishnan, V; Liverneaux, P A
2015-08-01
We sought to review the current state of robotics in this specialty. A Pubmed and Medline search was performed using key search terms for a comprehensive review of the whole cross-section of plastic and reconstructive practice. Overall, 28 publications specific to robotic plastic and reconstructive procedures were suitable for appraisal. The current evidence suggests robotics is comparable to standard methods despite its infancy. The possible applications are wide and could translate into superior patient outcomes. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
Papas, Rebecca K; Sidle, John E; Wamalwa, Emmanuel S; Okumu, Thomas O; Bryant, Kendall L; Goulet, Joseph L; Maisto, Stephen A; Braithwaite, R Scott; Justice, Amy C
2010-08-01
Traditional homemade brew is believed to represent the highest proportion of alcohol use in sub-Saharan Africa. In Eldoret, Kenya, two types of brew are common: chang'aa (spirits) and busaa (maize beer). Local residents refer to the amount of brew consumed by the amount of money spent, suggesting a culturally relevant estimation method. The purposes of this study were to analyze the ethanol content of chang'aa and busaa, and to compare two methods of alcohol estimation: use by cost, and use by volume, the latter being the current international standard. Laboratory results showed mean ethanol content was 34% (SD = 14%) for chang'aa and 4% (SD = 1%) for busaa. Standard drink unit equivalents for chang'aa and busaa, respectively, were 2 and 1.3 (US) and 3.5 and 2.3 (Great Britain). Using a computational approach, both methods demonstrated comparable results. We conclude that cost estimation of alcohol content is more culturally relevant and does not differ in accuracy from the international standard.
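The two estimation methods both reduce to simple unit conversions. A minimal sketch, where the 100 ml serving size and the local price are illustrative assumptions (the standard-drink masses are the commonly used US 14 g and British 8 g ethanol figures, and 0.789 g/ml is the density of ethanol):

```python
ETHANOL_DENSITY = 0.789  # g/ml

STANDARD_DRINK_GRAMS = {"US": 14.0, "GB": 8.0}

def standard_drinks(volume_ml, ethanol_fraction, country="US"):
    """Volume-based estimate: grams of ethanol over grams per standard drink."""
    grams = volume_ml * ethanol_fraction * ETHANOL_DENSITY
    return grams / STANDARD_DRINK_GRAMS[country]

def standard_drinks_by_cost(money_spent, price_per_ml, ethanol_fraction,
                            country="US"):
    """Cost-based estimate: money is first converted to volume at the
    local price, then handled exactly as in the volume method."""
    return standard_drinks(money_spent / price_per_ml, ethanol_fraction, country)

# An assumed 100 ml serving of chang'aa at the measured 34% ethanol:
drinks_us = standard_drinks(100, 0.34)        # about 1.9 US standard drinks
drinks_gb = standard_drinks(100, 0.34, "GB")  # about 3.4 GB standard drinks
```

With the assumed 100 ml serving, the results land close to the study's reported equivalents of 2 (US) and 3.5 (Great Britain) for chang'aa.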
Efficient method for computing the electronic transport properties of a multiterminal system
NASA Astrophysics Data System (ADS)
Lima, Leandro R. F.; Dusko, Amintor; Lewenkopf, Caio
2018-04-01
We present a multiprobe recursive Green's function method to compute the transport properties of mesoscopic systems using the Landauer-Büttiker approach. By introducing an adaptive partition scheme, we map the multiprobe problem into the standard two-probe recursive Green's function method. We apply the method to compute the longitudinal and Hall resistances of a disordered graphene sample, a system of current interest. We show that the performance and accuracy of our method compares very well with other state-of-the-art schemes.
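The Landauer-Büttiker transmission that such Green's function schemes compute can be illustrated on the simplest possible case: a clean 1D tight-binding chain between two semi-infinite leads, where the lead self-energy is known analytically. A direct matrix inversion stands in for the recursive sweep here, and all parameters are illustrative; inside the band a perfect chain transmits exactly one channel.

```python
import numpy as np

def transmission(E, N=10, t=1.0):
    """Landauer transmission T(E) through a clean N-site chain (hopping t,
    onsite 0) coupled to two semi-infinite 1D leads of the same material."""
    # Analytic retarded self-energy of a semi-infinite 1D lead, valid |E| < 2t.
    sigma = (E - 1j * np.sqrt(4.0 * t * t - E * E)) / 2.0
    H = t * (np.eye(N, k=1) + np.eye(N, k=-1)).astype(complex)
    H[0, 0] += sigma          # left lead attaches to site 0
    H[-1, -1] += sigma        # right lead attaches to site N-1
    G = np.linalg.inv(E * np.eye(N) - H)       # retarded Green's function
    gamma = np.sqrt(4.0 * t * t - E * E)       # Gamma = i(Sigma - Sigma^dag)
    return float(gamma * gamma * abs(G[0, -1]) ** 2)
```

A recursive implementation builds `G[0, -1]` slice by slice instead of inverting the full matrix, which is what makes the method scale to large and multiterminal systems.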
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, Jaromy; Sun Zaijing; Wells, Doug
2009-03-10
Photon activation analysis detected elements in two NIST standards that did not have reported concentration values. A method is currently being developed to infer these concentrations by using scaling parameters and the appropriate known quantities within the NIST standard itself. Scaling parameters include: threshold, peak and endpoint energies; photo-nuclear cross sections for specific isotopes; Bremsstrahlung spectrum; target thickness; and photon flux. Photo-nuclear cross sections and energies for the unknown elements must also be known. With these quantities, the same integral was performed for both the known and unknown elements, resulting in an inference of the concentration of the unreported element based on the reported value. Since Rb and Mn were elements that were reported in the standards, and because they had well-identified peaks, they were used as the standards of inference to determine concentrations of the unreported elements As, I, Nb, Y, and Zr. This method was tested by choosing other known elements within the standards and inferring a value based on the stated procedure. The reported value of Mn in the first NIST standard was 403±15 ppm and the reported value of Ca in the second NIST standard was 87000 ppm (no reported uncertainty). The inferred concentrations were 370±23 ppm and 80200±8700 ppm, respectively.
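The inference step amounts to a linear scaling between a reported and an unreported element. A minimal sketch, where the "sensitivity" factor (cross section, flux overlap, and detection efficiency folded into a single number per element) and all sample values are hypothetical:

```python
def infer_concentration(known_conc, known_peak, known_sens,
                        unknown_peak, unknown_sens):
    """Concentration of an unreported element inferred from a reported one:
    the ratio of measured peak activities, corrected by the ratio of
    per-element sensitivity factors."""
    return known_conc * (unknown_peak / known_peak) * (known_sens / unknown_sens)

# Hypothetical numbers: the unknown element's peak is half as intense,
# but the element is also half as sensitive, so the concentrations match.
c = infer_concentration(403.0, 100.0, 2.0, 50.0, 1.0)
```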
Concrete testing device provides substantial savings : fact sheet.
DOT National Transportation Integrated Search
2011-11-01
Current practices require a permeability test, ASTM C1202: "Standard Test Method for Electrical Indication of Concrete's Ability to Resist Chloride Ion Penetration," for structures with potential salt water intrusion. The test is run at 56 days of age…
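ASTM C1202 grades a specimen by the total charge passed during the six-hour test. A sketch of the commonly tabulated category bounds (these should be verified against the current edition of the standard before use):

```python
def c1202_class(charge_coulombs):
    """Qualitative chloride ion penetrability from total charge passed
    (coulombs) in the ASTM C1202 test, per the commonly cited table."""
    if charge_coulombs > 4000:
        return "high"
    if charge_coulombs > 2000:
        return "moderate"
    if charge_coulombs > 1000:
        return "low"
    if charge_coulombs >= 100:
        return "very low"
    return "negligible"
```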
BACTERIOLOGICAL ANALYSIS WITH SAMPLING AND SAMPLE PRESERVATION SPECIFICS
Current federal regulations (40CFR 503) specify that under certain conditions treated municipal biosolids must be analyzed for fecal coliform or salmonellae. The regulations state that representative samples of biosolids must be collected and analyzed using standard methods. Th...
Hines, Catherine D. G.; Hamilton, Gavin; Sirlin, Claude B.; McKenzie, Charles A.; Yu, Huanzhou; Brittain, Jean H.; Reeder, Scott B.
2011-01-01
Purpose: To prospectively compare an investigational version of a complex-based chemical shift–based fat fraction magnetic resonance (MR) imaging method with MR spectroscopy for the quantification of hepatic steatosis. Materials and Methods: This study was approved by the institutional review board and was HIPAA compliant. Written informed consent was obtained before all studies. Fifty-five patients (31 women, 24 men; age range, 24–71 years) were prospectively imaged at 1.5 T with quantitative MR imaging and single-voxel MR spectroscopy, each within a single breath hold. The effects of T2* correction, spectral modeling of fat, and magnitude fitting for eddy current correction on fat quantification with MR imaging were investigated by reconstructing fat fraction images from the same source data with different combinations of error correction. Single-voxel T2-corrected MR spectroscopy was used to measure fat fraction and served as the reference standard. All MR spectroscopy data were postprocessed at a separate institution by an MR physicist who was blinded to MR imaging results. Fat fractions measured with MR imaging and MR spectroscopy were compared statistically to determine the correlation (r2), and the slope and intercept as measures of agreement between MR imaging and MR spectroscopy fat fraction measurements, to determine whether MR imaging can help quantify fat, and examine the importance of T2* correction, spectral modeling of fat, and eddy current correction. Two-sided t tests (significance level, P = .05) were used to determine whether estimated slopes and intercepts were significantly different from 1.0 and 0.0, respectively. Sensitivity and specificity for the classification of clinically significant steatosis were evaluated. Results: Overall, there was excellent correlation between MR imaging and MR spectroscopy for all reconstruction combinations. 
However, agreement was only achieved when T2* correction, spectral modeling of fat, and magnitude fitting for eddy current correction were used (r2 = 0.99; slope ± standard deviation = 1.00 ± 0.01, P = .77; intercept ± standard deviation = 0.2% ± 0.1, P = .19). Conclusion: T1-independent chemical shift–based water-fat separation MR imaging methods can accurately quantify fat over the entire liver, by using MR spectroscopy as the reference standard, when T2* correction, spectral modeling of fat, and eddy current correction methods are used. © RSNA, 2011 PMID:21248233
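The agreement analysis above reduces to a linear regression of the imaging fat fractions on the spectroscopy reference, with agreement meaning slope near 1, intercept near 0, and r² near 1. A minimal numpy sketch; the data in the example are synthetic, not the study's:

```python
import numpy as np

def agreement(ff_mri, ff_spec):
    """Slope, intercept and r^2 of imaging fat fraction regressed on the
    spectroscopy reference fat fraction."""
    slope, intercept = np.polyfit(ff_spec, ff_mri, 1)
    r2 = float(np.corrcoef(ff_spec, ff_mri)[0, 1] ** 2)
    return float(slope), float(intercept), r2

# Synthetic check: a method that reads 5% high in slope but is otherwise exact.
ref = np.array([2.0, 5.0, 10.0, 20.0, 30.0])   # fat fractions in percent
slope, intercept, r2 = agreement(1.05 * ref, ref)
```

A high r² alone would not establish agreement; it is the slope and intercept tests against 1.0 and 0.0 that distinguish correlation from accuracy, which is exactly the point the abstract makes about the error-correction steps.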
[Undergraduate psychiatric training in Turkey].
Cıngı Başterzi, Ayşe Devrim; Tükel, Raşit; Uluşahin, Aylin; Coşkun, Bülent; Alkın, Tunç; Murat Demet, Mehmet; Konuk, Numan; Taşdelen, Bahar
2010-01-01
The current trend in medical education is to abandon the experience-based traditional model and embrace the competency-based education model (CBE). The basic principle behind CBE is standardization. The first step in standardization is to determine what students must know, what they must accomplish, and what attitude they should display, and the establishment of educational goals. One of the goals of the Psychiatric Association of Turkey, Psychiatric Training Section is to standardize psychiatric training in Turkish medical schools. This study aimed to determine the current state of undergraduate psychiatric training in Turkish medical schools. Questionnaires were sent to the psychiatry department chairs of 41 medical schools. Data were analyzed using descriptive statistical methods. Of the 41 department chairs that were sent the questionnaire, 29 (70%) completed and returned them, of which 16 (66.7%) reported that they had already defined goals and educational objectives for their undergraduate psychiatric training programs. The Core Education Program, prepared by the Turkish Medicine and Health Education Council, was predominately used at 9 (37.5%) medical schools. Pre-clinical and clinical training schedules varied between medical schools. In all, 3 of the medical schools did not offer internships in psychiatry. The majority of chairs emphasized the importance of mood disorders (49.9%) and anxiety disorders (40%), suggesting that these disorders should be treated by general practitioners. Computer technology was commonly used for lecturing; however, utilization of interactive and skill-based teaching methods was limited. The most commonly used evaluation methods were written examination (87.5%) during preclinical training and oral examination (91.6%) during clinical training. The most important finding of this study was the lack of a standardized curriculum for psychiatric training in Turkey. 
A standardized curriculum for psychiatric training in Turkish medical schools needs to be developed.
Constant-current control method of multi-function electromagnetic transmitter.
NASA Astrophysics Data System (ADS)
Xue, Kaichang; Zhou, Fengdao; Wang, Shuang; Lin, Jun
2015-02-01
Based on the requirements of controlled source audio-frequency magnetotelluric, DC resistivity, and induced polarization, a constant-current control method is proposed. Using the required current waveforms in prospecting as a standard, the causes of current waveform distortion and current waveform distortion's effects on prospecting are analyzed. A cascaded topology is adopted to achieve 40 kW constant-current transmitter. The responsive speed and precision are analyzed. According to the power circuit of the transmitting system, the circuit structure of the pulse width modulation (PWM) constant-current controller is designed. After establishing the power circuit model of the transmitting system and the PWM constant-current controller model, analyzing the influence of ripple current, and designing an open-loop transfer function according to the amplitude-frequency characteristic curves, the parameters of the PWM constant-current controller are determined. The open-loop transfer function indicates that the loop gain is no less than 28 dB below 160 Hz, which assures the responsive speed of the transmitting system; the phase margin is 45°, which assures the stabilization of the transmitting system. Experimental results verify that the proposed constant-current control method can keep the control error below 4% and can effectively suppress load change caused by the capacitance of earth load.
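The closed-loop constant-current idea can be illustrated with a generic PI duty-cycle controller driving an R-L load whose resistance changes mid-run. This is a crude stand-in for the transmitter's actual PWM controller; the gains, bus voltage, and load values are all illustrative assumptions.

```python
def simulate_current_loop(I_ref=10.0, V_bus=400.0, L=0.01, R0=2.0,
                          dt=1e-5, steps=20000):
    """PI duty-cycle control of current through an R-L load. The load
    resistance steps up at mid-run to mimic an earth-load change; the
    integral term drives the steady-state current error to zero."""
    Kp, Ki = 0.02, 40.0           # illustrative controller gains
    I, integral = 0.0, 0.0
    for n in range(steps):
        R = R0 if n < steps // 2 else 2.5 * R0   # load change at mid-run
        err = I_ref - I
        integral += err * dt
        duty = min(max(Kp * err + Ki * integral, 0.0), 1.0)  # clamp to [0, 1]
        I += (duty * V_bus - R * I) / L * dt     # explicit Euler step
    return I

I_final = simulate_current_loop()   # settles near the 10 A setpoint
```

The integral term is what holds the final error well inside the 4% band reported in the abstract, even after the load resistance steps up.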
A Criterion to Control Nonlinear Error in the Mixed-Mode Bending Test
NASA Technical Reports Server (NTRS)
Reeder, James R.
2002-01-01
The mixed-mode bending (MMB) test has been widely used to measure delamination toughness and was recently standardized by ASTM as Standard Test Method D6671-01. This simple test is a combination of the standard Mode I (opening) test and a Mode II (sliding) test. It uses a unidirectional composite test specimen with an artificial delamination, subjected to bending loads, to characterize when a delamination will extend. When the displacements become large, the linear theory used to analyze the results of the test yields errors in the calculated toughness values. The current standard places no limit on the specimen loading, and therefore test data can be created using the standard that are significantly in error. A method of limiting the error that can be incurred in the calculated toughness values is needed. In this paper, nonlinear models of the MMB test are refined. One of the nonlinear models is then used to develop a simple criterion for prescribing conditions where the nonlinear error will remain below 5%.
An Automated Method for High-Definition Transcranial Direct Current Stimulation Modeling*
Huang, Yu; Su, Yuzhuo; Rorden, Christopher; Dmochowski, Jacek; Datta, Abhishek; Parra, Lucas C.
2014-01-01
Targeted transcranial stimulation with electric currents requires accurate models of the current flow from scalp electrodes to the human brain. Idiosyncratic anatomy of individual brains and heads leads to significant variability in such current flows across subjects, thus, necessitating accurate individualized head models. Here we report on an automated processing chain that computes current distributions in the head starting from a structural magnetic resonance image (MRI). The main purpose of automating this process is to reduce the substantial effort currently required for manual segmentation, electrode placement, and solving of finite element models. In doing so, several weeks of manual labor were reduced to no more than 4 hours of computation time and minimal user interaction, while current-flow results for the automated method deviated by less than 27.9% from the manual method. Key facilitating factors are the addition of three tissue types (skull, scalp and air) to a state-of-the-art automated segmentation process, morphological processing to correct small but important segmentation errors, and automated placement of small electrodes based on easily reproducible standard electrode configurations. We anticipate that such an automated processing will become an indispensable tool to individualize transcranial direct current stimulation (tDCS) therapy. PMID:23367144
Tracking the hyoid bone in videofluoroscopic swallowing studies
NASA Astrophysics Data System (ADS)
Kellen, Patrick M.; Becker, Darci; Reinhardt, Joseph M.; van Daele, Douglas
2008-03-01
Difficulty swallowing, or dysphagia, has become a growing problem. Swallowing complications can lead to malnutrition, dehydration, respiratory infection, and even death. The current gold standard for analyzing and diagnosing dysphagia is the videofluoroscopic barium swallow study. In these studies, a fluoroscope is used to image the patient ingesting barium solutions of different volumes and viscosities. The hyoid bone anchors many key muscles involved in swallowing and plays a key role in the process. Abnormal hyoid bone motion during a swallow can indicate swallowing dysfunction. Currently in clinical settings, hyoid bone motion is assessed qualitatively, which can be subject to intra-rater and inter-rater bias. This paper presents a semi-automatic method for tracking the hyoid bone that makes quantitative analysis feasible. The user defines a template of the hyoid on one frame, and this template is tracked across subsequent frames. The matching phase is optimized by predicting the position of the template based on kinematics. An expert speech pathologist marked the position of the hyoid on each frame of ten studies to serve as the gold standard. Results from performing Bland-Altman analysis at a 95% confidence interval showed a bias of 0.0±0.08 pixels in x and -0.08±0.09 pixels in y between the manually-defined gold standard and the proposed method. The average Pearson's correlation between the gold standard and the proposed method was 0.987 in x and 0.980 in y. This paper also presents a method for automatically establishing a patient-centric coordinate system for the interpretation of hyoid motion. This coordinate system corrects for upper body patient motion during the study and identifies superior-inferior and anterior-posterior motion components. These tools make the use of quantitative hyoid motion analysis feasible in clinical and research settings.
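The prediction-then-match step can be sketched with plain numpy: normalized cross-correlation is evaluated over a small search window centred on the kinematically predicted position. This is a generic sketch, not the authors' implementation; the window size and test image are arbitrary.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def track(frame, template, predicted, search=8):
    """Best template position (top-left row, col) within +/- `search`
    pixels of the kinematically predicted position."""
    th, tw = template.shape
    r0, c0 = predicted
    best, best_pos = -2.0, predicted
    for r in range(max(r0 - search, 0), min(r0 + search, frame.shape[0] - th) + 1):
        for c in range(max(c0 - search, 0), min(c0 + search, frame.shape[1] - tw) + 1):
            score = ncc(frame[r:r + th, c:c + tw], template)
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

# Synthetic frame: the template is cut from a known location, and the
# kinematic prediction is deliberately a few pixels off.
rng = np.random.default_rng(0)
frame = rng.random((40, 40))
template = frame[12:18, 17:23].copy()
found = track(frame, template, predicted=(10, 15))
```

Restricting the search to the predicted neighbourhood is what makes per-frame matching fast enough for full videofluoroscopic studies.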
NASA Astrophysics Data System (ADS)
Koshti, Ajay M.
2018-03-01
Like other NDE methods, eddy current surface crack detectability is determined using probability of detection (POD) demonstration. The POD demonstration involves eddy current testing of surface crack specimens with known crack sizes. The reliably detectable flaw size, denoted by a90/95, is determined by statistical analysis of POD test data. The surface crack specimens shall be made from a similar material with electrical conductivity close to the part conductivity. A calibration standard with electro-discharge machined (EDM) notches is typically used in eddy current testing for surface crack detection. The calibration standard conductivity shall be within ±15% of the part conductivity. This condition is also applicable to the POD demonstration crack set. Here, a case is considered where the conductivity of the crack specimens available for POD testing differs by more than 15% from that of the part to be inspected. Therefore, a direct POD demonstration of the reliably detectable flaw size is not applicable, and additional testing is necessary to use the demonstrated POD test data. An approach is provided to estimate the reliably detectable flaw size in eddy current testing of a part made from material A using POD crack specimens made from material B with different conductivity. The approach uses additional test data obtained on EDM notch specimens made from materials A and B. EDM notch test data from the two materials are used to create a transfer function between the demonstrated a90/95 size on crack specimens made of material B and the estimated a90/95 size for a part made of material A. Two methods are given. For method A, the a90/95 crack size for material B is given and POD data are available; the objective is to determine the a90/95 crack size for material A using the same relative decision threshold that was used for material B. For method B, the target crack size a90/95 for material A is known; the objective is to determine the decision threshold for inspecting material A.
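The transfer-function idea can be sketched as interpolation between the two materials' EDM-notch response curves: find the notch size in material A whose eddy current signal matches that of the demonstrated a90/95-size notch in material B. The function name and all sample responses below are hypothetical.

```python
import numpy as np

def a90_95_transfer(a90_95_B, notch_sizes, resp_A, resp_B):
    """Notch size in material A whose eddy current response equals that of
    an a90/95-size notch in material B. Responses must increase
    monotonically with notch size; linear interpolation between points."""
    signal = np.interp(a90_95_B, notch_sizes, resp_B)   # signal at a90/95 in B
    return float(np.interp(signal, resp_A, notch_sizes))

# Hypothetical response curves: material A responds twice as strongly,
# so the same signal threshold corresponds to a smaller flaw in A.
sizes = [0.5, 1.0, 1.5, 2.0]        # notch depth, mm
resp_B = [1.0, 2.0, 3.0, 4.0]       # arbitrary signal units
resp_A = [2.0, 4.0, 6.0, 8.0]
a_A = a90_95_transfer(1.0, sizes, resp_A, resp_B)
```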
Raleigh, Veena; Sizmur, Steve; Tian, Yang; Thompson, James
2015-04-01
To examine the impact of patient-mix on National Health Service (NHS) acute hospital trust scores in two national NHS patient surveys. Secondary analysis of 2012 patient survey data for 57,915 adult inpatients at 142 NHS acute hospital trusts and 45,263 adult emergency department attendees at 146 NHS acute hospital trusts in England. Changes in trust scores for selected questions, ranks, inter-trust variance and score-based performance bands were examined using three methods: no adjustment for case-mix; the current standardization method with weighting for age, sex and, for inpatients only, admission method; and a regression model adjusting in addition for ethnicity, presence of a long-term condition, proxy response (inpatients only) and previous emergency attendances (emergency department survey only). For both surveys, all the variables examined were associated with patients' responses and affected inter-trust variance in scores, although the direction and strength of impact differed between variables. Inter-trust variance was generally greatest for the unadjusted scores and lowest for scores derived from the full regression model. Although trust scores derived from the three methods were highly correlated (Kendall's tau coefficients 0.70-0.94), up to 14% of trusts had discordant ranks when the standardization and regression methods were compared. Depending on the survey and question, up to 14 trusts changed performance bands when the regression model with its fuller case-mix adjustment was used rather than the current standardization method. More comprehensive case-mix adjustment of patient survey data than the current limited adjustment reduces performance variation between NHS acute hospital trusts and alters the comparative performance bands of some trusts.
Given the use of these data for high-impact purposes such as performance assessment, regulation, commissioning, quality improvement and patient choice, a review of the long-standing method for analysing patient survey data would be timely, and could improve rigour and comparability across the NHS. Performance comparisons need to be perceived as fair and scientifically robust to maintain confidence in publicly reported data, and to support their use by both the public and the NHS. © The Author(s) 2014.
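The standardization approach the study compares against is, at its core, direct standardization: each trust's stratum-specific mean score is weighted by the national patient-mix proportions so that trusts are compared on a common case-mix. A minimal sketch with invented strata, scores, and weights:

```python
def standardized_score(stratum_means, national_weights):
    """Directly standardized trust score: stratum-specific mean scores
    weighted by the national patient-mix proportions (weights sum to 1),
    so differences in local case-mix do not drive the comparison."""
    return sum(national_weights[s] * m for s, m in stratum_means.items())

# Invented example: an older-skewed trust rescored on the national mix.
trust_means = {"age 16-35": 7.0, "age 36-65": 8.0, "age 66+": 9.0}
national_mix = {"age 16-35": 0.2, "age 36-65": 0.5, "age 66+": 0.3}
score = standardized_score(trust_means, national_mix)
```

A regression model generalizes this by adjusting for many covariates at once (ethnicity, long-term conditions, proxy response), which is why the two approaches can rank some trusts differently.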
Building Energy Monitoring and Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Tianzhen; Feng, Wei; Lu, Alison
This project aimed to develop a standard methodology for building energy data definition, collection, presentation, and analysis; apply the developed methods to a standardized energy monitoring platform, including hardware and software, to collect and analyze building energy use data; and compile offline statistical data and online real-time data in both countries for fully understanding the current status of building energy use. This helps decode the driving forces behind the discrepancy of building energy use between the two countries; identify gaps and deficiencies of current building energy monitoring, data collection, and analysis; and create knowledge and tools to collect and analyze good building energy data to provide valuable and actionable information for key stakeholders.
A framework for automatic creation of gold-standard rigid 3D-2D registration datasets.
Madan, Hennadii; Pernuš, Franjo; Likar, Boštjan; Špiclin, Žiga
2017-02-01
Advanced image-guided medical procedures incorporate 2D intra-interventional information into the pre-interventional 3D image and plan of the procedure through 3D/2D image registration (32R). To enter clinical use, and even for publication purposes, novel and existing 32R methods have to be rigorously validated. The performance of a 32R method can be estimated by comparing it to an accurate reference or gold standard method (usually based on fiducial markers) on the same set of images (gold standard dataset). Objective validation and comparison of methods are possible only if the evaluation methodology is standardized and the gold standard dataset is made publicly available. Currently, very few such datasets exist and only one contains images of multiple patients acquired during a procedure. To encourage the creation of gold standard 32R datasets, we propose an automatic framework. The framework is based on rigid registration of fiducial markers. The main novelty is spatial grouping of fiducial markers on the carrier device, which enables automatic marker localization and identification across the 3D and 2D images. The proposed framework was demonstrated on clinical angiograms of 20 patients. Rigid 32R computed by the framework was more accurate than that obtained manually, with the respective target registration error below 0.027 mm compared to 0.040 mm. The framework is applicable for gold standard setup on any rigid anatomy, provided that the acquired images contain spatially grouped fiducial markers. The gold standard datasets and software will be made publicly available.
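The marker-based rigid registration at the heart of such a framework is a least-squares point-set alignment (the Kabsch/Procrustes solution) followed by a target registration error (TRE) check. A generic numpy sketch, not the authors' code; the synthetic markers and motion in the example are arbitrary:

```python
import numpy as np

def kabsch(P, Q):
    """Least-squares rigid transform (R, t) with Q ~ P @ R.T + t, as used
    to align corresponding fiducial markers localized in two spaces."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    U, _, Vt = np.linalg.svd((P - cp).T @ (Q - cq))
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cq - R @ cp

def tre(R, t, targets, targets_ref):
    """Target registration error: mean distance between mapped target
    points and their reference positions."""
    mapped = targets @ R.T + t
    return float(np.linalg.norm(mapped - targets_ref, axis=1).mean())

# Recover a known rigid motion from six synthetic markers.
rng = np.random.default_rng(1)
P = rng.random((6, 3))
c, s = np.cos(0.3), np.sin(0.3)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
Q = P @ R_true.T + t_true
R, t = kabsch(P, Q)
err = tre(R, t, P, Q)
```

TRE is evaluated at clinically relevant target points rather than at the fiducials themselves, which is why it is the standard accuracy figure for gold standard datasets.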
Atiyeh, Bishara S
2007-01-01
Hypertrophic scars, resulting from alterations in the normal processes of cutaneous wound healing, are characterized by proliferation of dermal tissue with excessive deposition of fibroblast-derived extracellular matrix proteins, especially collagen, over long periods, and by persistent inflammation and fibrosis. Hypertrophic scars are among the most common and frustrating problems after injury. As current aesthetic surgical techniques become more standardized and results more predictable, a fine scar may be the demarcating line between acceptable and unacceptable aesthetic results. However, hypertrophic scars remain notoriously difficult to eradicate because of the high recurrence rates and the incidence of side effects associated with available treatment methods. This review explores the various treatment methods for hypertrophic scarring described in the literature including evidence-based therapies, standard practices, and emerging methods, attempting to distinguish those with clearly proven efficiency from anecdotal reports about therapies of doubtful benefits while trying to differentiate between prophylactic measures and actual treatment methods. Unfortunately, the distinction between hypertrophic scar treatments and keloid treatments is not obvious in most reports, making it difficult to assess the efficacy of hypertrophic scar treatment.
Evaluation of the depth-integration method of measuring water discharge in large rivers
Moody, J.A.; Troutman, B.M.
1992-01-01
The depth-integration method for measuring water discharge makes a continuous measurement of the water velocity from the water surface to the bottom at 20 to 40 locations, or verticals, across a river. It is especially practical for large rivers, where river traffic makes it impractical to use boats attached to taglines strung across the river or to use current meters suspended from bridges. This method has the additional advantage over the standard two- and eight-tenths method in that a discharge-weighted suspended-sediment sample can be collected at the same time. When this method is used in large rivers such as the Missouri, Mississippi and Ohio, a microwave navigation system is used to determine the ship's position at each vertical sampling location across the river, and to make accurate velocity corrections to compensate for ship drift. An essential feature is a hydraulic winch that can lower and raise the current meter at a constant transit velocity so that the velocities at all depths are measured for equal lengths of time. Field calibration measurements show that: (1) the mean velocity measured on the upcast (bottom to surface) is within 1% of the standard mean velocity determined by 9-11 point measurements; (2) if the transit velocity is less than 25% of the mean velocity, then the average error in the mean velocity is 4% or less. The major source of bias error is a result of mounting the current meter above a sounding weight, and sometimes above a suspended-sediment sampling bottle, which prevents measurement of the velocity all the way to the bottom. The measured mean velocity is slightly larger than the true mean velocity. This bias error in the discharge is largest in shallow water (approximately 8% for the Missouri River at Hermann, MO, where the mean depth was 4.3 m) and smallest in deeper water (approximately 3% for the Mississippi River at Vicksburg, MS, where the mean depth was 14.5 m).
The major source of random error in the discharge is the natural variability of river velocities, which we assumed to be independent and random at each vertical. The standard error of the estimated mean velocity, at an individual vertical sampling location, may be as large as 9% for large sand-bed alluvial rivers. The computed discharge, however, is a weighted mean of these random velocities. Consequently the standard error of the computed discharge is divided by the square root of the number of verticals, producing typical values between 1 and 2%. The discharges measured by the depth-integration method agreed within ±5% of those measured simultaneously by the standard two- and eight-tenths, six-tenths and moving-boat methods. © 1992.
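The weighted-mean structure described in this abstract can be illustrated with a small sketch (all numbers hypothetical, not from the study): a midsection-style discharge computation over a handful of verticals, plus the 1/sqrt(n) scaling of the random error that the authors cite.

```python
import math

def midsection_discharge(stations, depths, mean_velocities):
    """Midsection discharge: Q = sum of v_i * d_i * w_i over verticals,
    where w_i is the width assigned to vertical i (half the distance
    to each neighbouring vertical)."""
    n = len(stations)
    q = 0.0
    for i in range(n):
        left = stations[i] - stations[i - 1] if i > 0 else 0.0
        right = stations[i + 1] - stations[i] if i < n - 1 else 0.0
        q += mean_velocities[i] * depths[i] * (left + right) / 2.0
    return q

# Hypothetical 5-vertical cross-section (positions and depths in m, v in m/s)
stations = [10.0, 30.0, 50.0, 70.0, 90.0]
depths = [2.0, 4.0, 5.0, 4.0, 2.0]
velocities = [0.8, 1.2, 1.4, 1.1, 0.7]
Q = midsection_discharge(stations, depths, velocities)  # m^3/s

# Random-error scaling cited in the abstract: with a ~9% standard error
# per vertical and n independent verticals, the standard error of the
# computed discharge scales as 9% / sqrt(n).
n_verticals = 30
se_discharge = 0.09 / math.sqrt(n_verticals)  # roughly 1.6%
```

With 20 to 40 verticals this scaling reproduces the 1-2% range quoted in the abstract.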
ERIC Educational Resources Information Center
Newstreet, Carmen
2008-01-01
Teachers in the secondary social studies classroom do not regularly take the time to practice structured reflection on their teaching methods. In our current standards-driven environment, social studies classrooms are often not seen as places of higher learning. To combat these stereotypes, the author presents a method for accomplishing reflection…
Code of Federal Regulations, 2014 CFR
2014-07-01
... current expenditures or revenues per pupil for free public education among LEAs in the State is no more... State. The method for calculating the percentage of disparity in a State is in the appendix to this... in paragraph (a) of this section. The method for calculating the weighted average disparity...
Code of Federal Regulations, 2013 CFR
2013-07-01
... current expenditures or revenues per pupil for free public education among LEAs in the State is no more... State. The method for calculating the percentage of disparity in a State is in the appendix to this... in paragraph (a) of this section. The method for calculating the weighted average disparity...
Code of Federal Regulations, 2012 CFR
2012-07-01
... current expenditures or revenues per pupil for free public education among LEAs in the State is no more... State. The method for calculating the percentage of disparity in a State is in the appendix to this... in paragraph (a) of this section. The method for calculating the weighted average disparity...
Current federal regulations (40 CFR 503) require enumeration of fecal coliform or Salmonella prior to land application of Class A biosolids. This regulation specifies use of enumeration methods included in "Standard Methods for the Examination of Water and Wastewater 18th Edi...
ERIC Educational Resources Information Center
Fasli, Mukaddes; Hassanpour, Badiossadat
2017-01-01
In this century, all educational efforts strive to achieve quality assurance standards. Therefore, it will be naive to deny the existence of problems in architectural education. The current design studio critique method has been developed upon generations of students and educators. Architectural education is changing towards educating critical…
Wee, Eugene J.H.; Wang, Yuling; Tsao, Simon Chang-Hao; Trau, Matt
2016-01-01
Sensitive and accurate identification of specific DNA mutations can influence clinical decisions. However accurate diagnosis from limiting samples such as circulating tumour DNA (ctDNA) is challenging. Current approaches based on fluorescence such as quantitative PCR (qPCR) and more recently, droplet digital PCR (ddPCR) have limitations in multiplex detection, sensitivity and the need for expensive specialized equipment. Herein we describe an assay capitalizing on the multiplexing and sensitivity benefits of surface-enhanced Raman spectroscopy (SERS) with the simplicity of standard PCR to address the limitations of current approaches. This proof-of-concept method could reproducibly detect as few as 0.1% (10 copies, CV < 9%) of target sequences thus demonstrating the high sensitivity of the method. The method was then applied to specifically detect three important melanoma mutations in multiplex. Finally, the PCR/SERS assay was used to genotype cell lines and ctDNA from serum samples where results subsequently validated with ddPCR. With ddPCR-like sensitivity and accuracy yet at the convenience of standard PCR, we believe this multiplex PCR/SERS method could find wide applications in both diagnostics and research. PMID:27446486
NMR spectroscopy for assessment of lipid oxidation during frying
USDA-ARS?s Scientific Manuscript database
Except for total polar compounds (TPC), polymerized triacylglycerols (PTAG) and fatty acid composition, most other current standard methods for lipid oxidation detect very small amounts of oxidation products such as hydroperoxides, conjugated dienes, aldehydes, and epoxides. Therefore, amounts of th...
Jha, Abhinav K.; Kupinski, Matthew A.; Rodríguez, Jeffrey J.; Stephen, Renu M.; Stopeck, Alison T.
2012-01-01
In many studies, the estimation of the apparent diffusion coefficient (ADC) of lesions in visceral organs in diffusion-weighted (DW) magnetic resonance images requires an accurate lesion-segmentation algorithm. To evaluate these lesion-segmentation algorithms, region-overlap measures are used currently. However, the end task from the DW images is accurate ADC estimation, and the region-overlap measures do not evaluate the segmentation algorithms on this task. Moreover, these measures rely on the existence of gold-standard segmentation of the lesion, which is typically unavailable. In this paper, we study the problem of task-based evaluation of segmentation algorithms in DW imaging in the absence of a gold standard. We first show that using manual segmentations instead of gold-standard segmentations for this task-based evaluation is unreliable. We then propose a method to compare the segmentation algorithms that does not require gold-standard or manual segmentation results. The no-gold-standard method estimates the bias and the variance of the error between the true ADC values and the ADC values estimated using the automated segmentation algorithm. The method can be used to rank the segmentation algorithms on the basis of both accuracy and precision. We also propose consistency checks for this evaluation technique. PMID:22713231
Boardman, Carl; O'Callaghan, Kenneth; Delort, Anne-Marie; Song, Jim
2018-01-01
Plastic litter is encountered in aquatic ecosystems across the globe, including polar environments and the deep sea. To mitigate the adverse societal and ecological impacts of this waste, there has been debate on whether 'biodegradable' materials should be granted exemptions from plastic bag bans and levies. However, great care must be exercised when attempting to define this term, due to the broad and complex range of physical and chemical conditions encountered within natural ecosystems. Here, we review existing international industry standards and regional test methods for evaluating the biodegradability of plastics within aquatic environments (wastewater, unmanaged freshwater and marine habitats). We argue that current standards and test methods are insufficient in their ability to realistically predict the biodegradability of carrier bags in these environments, due to several shortcomings in experimental procedures and a paucity of information in the scientific literature. Moreover, existing biodegradability standards and test methods for aquatic environments do not involve toxicity testing or account for the potentially adverse ecological impacts of carrier bags, plastic additives, polymer degradation products or small (microscopic) plastic particles that can arise via fragmentation. Successfully addressing these knowledge gaps is a key requirement for developing new biodegradability standard(s) for lightweight carrier bags. PMID:29892374
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides a lack of methodical support for natural language formalization, there is no standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
A brief overview on radon measurements in drinking water.
Jobbágy, Viktor; Altzitzoglou, Timotheos; Malo, Petya; Tanner, Vesa; Hult, Mikael
2017-07-01
The aim of this paper is to present information about currently used standard and routine methods for radon analysis in drinking waters. An overview is given of the current situation and the performance of different measurement methods based on literature data. The following parameters are compared and discussed: initial sample volume and sample preparation, detection systems, minimum detectable activity, counting efficiency, interferences, measurement uncertainty, sample capacity and overall turnaround time. Moreover, the parametric levels for radon in drinking water from the different legislations and directives/guidelines on radon are presented. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Lindsey, Brock A; Markel, Justin E; Kleinerman, Eugenie S
2017-06-01
Osteosarcoma (OS) is the most common primary malignancy of bone, and patients with metastatic disease or recurrences continue to have very poor outcomes. Unfortunately, little prognostic improvement has been generated from the last 20 years of research and a new perspective is warranted. OS is extremely heterogeneous in both its origins and manifestations. Although multiple associations have been made between the development of osteosarcoma and race, gender, age, various genomic alterations, and exposure situations among others, the etiology remains unclear and controversial. Noninvasive diagnostic methods include serum markers like alkaline phosphatase and a growing variety of imaging techniques including X-ray, computed tomography, magnetic resonance imaging, and positron emission tomography, as well as combinations thereof. Still, biopsy and microscopic examination are required to confirm the diagnosis and carry additional prognostic implications such as subtype classification and histological response to neoadjuvant chemotherapy. The current standard of care combines surgical and chemotherapeutic techniques, with a multitude of experimental biologics and small molecules currently in development and some in clinical trial phases. In this review, in addition to summarizing the current understanding of OS etiology, diagnostic methods, and the current standard of care, our group describes various experimental therapeutics and provides evidence to encourage a potential paradigm shift toward the introduction of immunomodulation, which may offer a more comprehensive approach to battling cancer pleomorphism.
Recent Work in Hybrid Radiation Transport Methods with Applications to Commercial Nuclear Power
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulesza, Joel A.
This talk will begin with an overview of hybrid radiation transport methods followed by a discussion of the author’s work to advance current capabilities. The talk will then describe applications for these methods in commercial nuclear power reactor analyses and techniques for experimental validation. When discussing these analytical and experimental activities, the importance of technical standards such as those created and maintained by ASTM International will be demonstrated.
Gantner, Martin; Schwarzmann, Günter; Sandhoff, Konrad; Kolter, Thomas
2014-12-01
Within recent years, ganglioside patterns have been increasingly analyzed by MS. However, internal standards for calibration are only available for gangliosides GM1, GM2, and GM3. For this reason, we prepared homologous internal standards bearing nonnatural fatty acids of the major mammalian brain gangliosides GM1, GD1a, GD1b, GT1b, and GQ1b, and of the tumor-associated gangliosides GM2 and GD2. The fatty acid moieties were incorporated after selective chemical or enzymatic deacylation of bovine brain gangliosides. For modification of the sphingoid bases, we developed a new synthetic method based on olefin cross metathesis. This method was used for the preparation of a lyso-GM1 and a lyso-GM2 standard. The total yield of this method was 8.7% for the synthesis of d17:1-lyso-GM1 from d20:1/18:0-GM1 in four steps. The title compounds are currently used as calibration substances for MS quantification and are also suitable for functional studies. Copyright © 2014 by the American Society for Biochemistry and Molecular Biology, Inc.
76 FR 54293 - Review of National Ambient Air Quality Standards for Carbon Monoxide
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-31
...This rule is being issued at this time as required by a court order governing the schedule for completion of this review of the air quality criteria and the national ambient air quality standards (NAAQS) for carbon monoxide (CO). Based on its review, the EPA concludes the current primary standards are requisite to protect public health with an adequate margin of safety, and is retaining those standards. After review of the air quality criteria, EPA further concludes that no secondary standard should be set for CO at this time. EPA is also making changes to the ambient air monitoring requirements for CO, including those related to network design, and is updating, without substantive change, aspects of the Federal reference method.
Intelligence Level Performance Standards Research for Autonomous Vehicles
Bostelman, Roger B.; Hong, Tsai H.; Messina, Elena
2017-01-01
United States and European safety standards have evolved to protect workers near Automatic Guided Vehicles (AGVs). However, performance standards for AGVs and mobile robots have only recently begun development. Lessons can be learned from research and standards efforts for mobile robots applied to emergency response and military applications. Research challenges, tests and evaluations, and programs to develop higher intelligence levels for vehicles can also be used to guide industrial AGV developments towards more adaptable and intelligent systems. These other efforts also provide useful standards development criteria for AGV performance test methods. Current standards areas being considered for AGVs are docking, navigation, obstacle avoidance, and the ground truth systems that measure performance. This paper provides a look to the future with standards developments in both the performance of vehicles and the dynamic perception systems that measure intelligent vehicle performance. PMID:28649189
Mann, D V; Ho, C S; Critchley, L; Fok, B S P; Pang, E W H; Lam, C W K; Hjelm, N M
2007-05-01
The doubly labelled water (DLW) method is the technique of choice for measurement of free-living total energy expenditure (TEE) in humans. A major constraint on the clinical applicability of the method has been the expense of the (18)O isotope. We have used a reduced dose (one-tenth of the currently recommended standard dose) of DLW for the measurement of TEE and body composition in nine healthy adult male volunteers. TEE measured by reduced-dose DLW was positively correlated with resting energy expenditure measured by metabolic cart (r=0.87, P<0.01). Isotope-derived fat mass and body mass index were strongly correlated (r=0.86, P<0.01). In four subjects in whom we performed a complementary evaluation using standard-dose isotope enrichment, the TEE measurements were satisfactorily comparable (mean ± s.d.: reduced dose 2586 ± 155 kcal/day vs standard dose 2843 ± 321 kcal/day; mean difference 257 ± 265 kcal/day). These data indicate that DLW measurements of human energy expenditure and body composition can be performed at a substantially lower dose (and cost) of isotope enrichment than is currently employed.
Hofmann-Amtenbrink, Margarethe; Grainger, David W; Hofmann, Heinrich
2015-10-01
Although nanoparticle research has been ongoing for more than 30 years, the methods and standard protocols required for safety and efficacy testing for human use are still in development. The review covers questions on toxicity, safety, risk and legal issues over the lifecycle of inorganic nanoparticles for medical applications. The following topics were covered: (i) in vitro tests may give only a very first indication of possible toxicity, as current methods largely neglect interactions at the systemic level; (ii) the science-driven and the regulation-driven approaches do not really support decisive strategies on whether or not a nanoparticle should be developed further and may receive a kind of "safety label"; (iii) cost and time of development are the limiting factors for the drug pipeline. Knowing which property of a nanoparticle makes it toxic, it may be feasible to re-engineer the particle for higher safety (safety by design). Testing the safety and efficacy of nanoparticles for human use is still in need of standardization. In this concise review, the authors describe and discuss the current unresolved issues over the application of inorganic nanoparticles for medical applications. Copyright © 2015 Elsevier Inc. All rights reserved.
In Search of Easy-to-Use Methods for Calibrating ADCP's for Velocity and Discharge Measurements
Oberg, K.; ,
2002-01-01
A cost-effective procedure for calibrating acoustic Doppler current profilers (ADCPs) in the field was presented. The advantages and disadvantages of various methods used for calibrating ADCPs were discussed. The proposed method requires the use of a differential global positioning system (DGPS) with sub-meter accuracy and standard software for collecting ADCP data. The method involves traversing a long (400-800 meter) course at a constant compass heading and speed, while collecting simultaneous DGPS and ADCP data.
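The course-traverse idea in this abstract can be sketched roughly as follows (all values hypothetical; the actual procedure involves full DGPS/ADCP data streams and heading corrections): a mean boat velocity derived from DGPS position fixes is compared against the vessel velocity the ADCP reports from bottom tracking over the same run, and a systematic difference indicates a bias such as a moving bed.

```python
def mean_velocity_from_fixes(times, east, north):
    """Mean boat velocity (east, north components in m/s) over a run,
    taken from the first and last DGPS position fixes."""
    dt = times[-1] - times[0]
    return ((east[-1] - east[0]) / dt, (north[-1] - north[0]) / dt)

# Hypothetical 400 m course traversed in 200 s, roughly due east
times = [0.0, 100.0, 200.0]   # s
east = [0.0, 198.0, 400.0]    # m
north = [0.0, 1.0, 0.0]       # m
v_dgps = mean_velocity_from_fixes(times, east, north)

# Hypothetical mean bottom-track velocity reported by the ADCP over the
# same run; a consistent shortfall would suggest a moving-bed bias.
v_bottom_track = (1.9, 0.0)
bias = (v_dgps[0] - v_bottom_track[0], v_dgps[1] - v_bottom_track[1])
```

Here the DGPS-derived velocity is 2.0 m/s east, so the 0.1 m/s shortfall in the bottom-track estimate would be the correction of interest.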
Crapanzano, John P.; Heymann, Jonas J.; Monaco, Sara; Nassar, Aziza; Saqi, Anjali
2014-01-01
Background: In the recent past, algorithms and recommendations to standardize the morphological, immunohistochemical and molecular classification of lung cancers on cytology specimens have been proposed, and several organizations have recommended cell blocks (CBs) as the preferred modality for molecular testing. Based on the literature, there are several different techniques available for CB preparation, suggesting that there is no standard. The aim of this study was to conduct a survey of CB preparation techniques utilized in various practice settings and analyze current issues, if any. Materials and Methods: A single e-mail with a link to an electronic survey was distributed to members of the American Society of Cytopathology and other pathologists. Questions pertaining to the participants' practice setting and CBs (volume, method, quality and satisfaction) were included. Results: Of 95 respondents, 90/95 (94%) completed the survey and comprise the study group. Most participants practice in a community hospital/private practice (44%) or academic center (41%). On average, 14 CBs (range 0-50; median 10) are prepared by a laboratory daily. Over 10 methods are utilized: plasma thrombin (33%), HistoGel (27%), Cellient automated cell block system (8%) and others (31%), respectively. Forty of 90 (44%) respondents are either unsatisfied or only sometimes satisfied with their CB quality, with low cellular yield being the leading cause of dissatisfaction. There was no statistically significant association between the three most common CB preparation methods and satisfaction with quality. Discussion: Many are dissatisfied with their current method of CB preparation, and there is no consistent method to prepare CBs. In today's era of personalized medicine, with an increasing array of molecular tests being applied to cytological specimens, there is a need for a standardized protocol for CB optimization to enhance cellularity. PMID:24799951
Nonclinical dose formulation analysis method validation and sample analysis.
Whitmire, Monica Lee; Bryan, Peter; Henry, Teresa R; Holbrook, John; Lehmann, Paul; Mollitor, Thomas; Ohorodnik, Susan; Reed, David; Wietgrefe, Holly D
2010-12-01
Nonclinical dose formulation analysis methods are used to confirm test article concentration and homogeneity in formulations and determine formulation stability in support of regulated nonclinical studies. There is currently no regulatory guidance for nonclinical dose formulation analysis method validation or sample analysis. Regulatory guidance for the validation of analytical procedures has been developed for drug product/formulation testing; however, verification of the formulation concentrations falls under the framework of GLP regulations (not GMP). The only current related regulatory guidance is the bioanalytical guidance for method validation. The fundamental parameters for bioanalysis and formulation analysis validations that overlap include: recovery, accuracy, precision, specificity, selectivity, carryover, sensitivity, and stability. Divergence in bioanalytical and drug product validations typically centers on the acceptance criteria used. As the dose formulation samples are not true "unknowns", the concept of quality control samples that cover the entire range of the standard curve serving as the indication of confidence in the data generated from the "unknown" study samples may not always be necessary. Also, the standard bioanalytical acceptance criteria may not be directly applicable, especially when the determined concentration does not match the target concentration. This paper attempts to reconcile the different practices being performed in the community and to provide recommendations of best practices and proposed acceptance criteria for nonclinical dose formulation method validation and sample analysis.
Counting glomeruli and podocytes: rationale and methodologies
Puelles, Victor G.; Bertram, John F.
2015-01-01
Purpose of review There is currently much interest in the numbers of both glomeruli and podocytes. This interest stems from greater understanding of the effects of suboptimal fetal events on nephron endowment, the associations between low nephron number and chronic cardiovascular and kidney disease in adults, and the emergence of the podocyte depletion hypothesis. Recent findings Obtaining accurate and precise estimates of glomerular and podocyte number has proven surprisingly difficult. When whole kidneys or large tissue samples are available, design-based stereological methods are considered gold-standard because they are based on principles that negate systematic bias. However, these methods are often tedious and time-consuming, and oftentimes inapplicable when dealing with small samples such as biopsies. Therefore, novel methods suitable for small tissue samples, and innovative approaches to facilitate high-throughput measurements, such as magnetic resonance imaging (MRI) to estimate glomerular number and flow cytometry to estimate podocyte number, have recently been described. Summary This review describes current gold-standard methods for estimating glomerular and podocyte number, as well as methods developed in the past 3 years. We are now better placed than ever before to accurately and precisely estimate glomerular and podocyte number, and to examine relationships between these measurements and kidney health and disease. PMID:25887899
Visual function and fitness to drive.
Kotecha, Aachal; Spratt, Alexander; Viswanathan, Ananth
2008-01-01
Driving is recognized to be a visually intensive task and accordingly there is a legal minimum standard of vision required for all motorists. The purpose of this paper is to review the current United Kingdom (UK) visual requirements for driving and discuss the evidence base behind these legal rules. The role of newer, alternative tests of visual function that may be better indicators of driving safety will also be considered. Finally, the implications of ageing on driving ability are discussed. A search of Medline and PubMed databases was performed using the following keywords: driving, vision, visual function, fitness to drive and ageing. In addition, papers from the Department of Transport website and UK Royal College of Ophthalmologists guidelines were studied. Current UK visual standards for driving are based upon historical concepts, but recent advances in technology have brought about more sophisticated methods for assessing the status of the binocular visual field and examining visual attention. These tests appear to be better predictors of driving performance. Further work is required to establish whether these newer tests should be incorporated in the current UK visual standards when examining an individual's fitness to drive.
Comparison of air-kerma strength determinations for HDR (192)Ir sources.
Rasmussen, Brian E; Davis, Stephen D; Schmidt, Cal R; Micka, John A; Dewerd, Larry A
2011-12-01
To perform a comparison of the interim air-kerma strength standard for high dose rate (HDR) (192)Ir brachytherapy sources maintained by the University of Wisconsin Accredited Dosimetry Calibration Laboratory (UWADCL) with measurements of the various source models using modified techniques from the literature. The current interim standard was established by Goetsch et al. in 1991 and has remained unchanged to date. The improved, laser-aligned seven-distance apparatus of the University of Wisconsin Medical Radiation Research Center (UWMRRC) was used to perform air-kerma strength measurements of five different HDR (192)Ir source models. The results of these measurements were compared with those from well chambers traceable to the original standard. Alternative methodologies for interpolating the (192)Ir air-kerma calibration coefficient from the NIST air-kerma standards at (137)Cs and 250 kVp x rays (M250) were investigated and intercompared. As part of the interpolation method comparison, the Monte Carlo code EGSnrc was used to calculate updated values of A(wall) for the Exradin A3 chamber used for air-kerma strength measurements. The effects of air attenuation and scatter, room scatter, as well as the solution method were investigated in detail. The average measurements when using the inverse N(K) interpolation method for the Classic Nucletron, Nucletron microSelectron, VariSource VS2000, GammaMed Plus, and Flexisource were found to be 0.47%, -0.10%, -1.13%, -0.20%, and 0.89% different than the existing standard, respectively. A further investigation of the differences observed between the sources was performed using MCNP5 Monte Carlo simulations of each source model inside a full model of an HDR 1000 Plus well chamber. 
Although the differences between the source models were found to be statistically significant, the equally weighted average difference between the seven-distance measurements and the well chambers was 0.01%, confirming that it is not necessary to update the current standard maintained at the UWADCL.
Santos, Sara; Oliveira, Manuela; Amorim, António; van Asch, Barbara
2014-11-01
The grapevine (Vitis vinifera subsp. vinifera) is one of the most important agricultural crops worldwide. A long-standing interest in the historical origins of ancient and current cultivated grapevines, as well as the need to establish phylogenetic relationships and parentage, solve homonymies and synonymies, fingerprint cultivars and clones, and assess the authenticity of plants and wines, has encouraged the development of genetic identification methods. STR analysis is currently the most commonly used method for these purposes. A large dataset of grapevine genotypes for many cultivars worldwide has been produced in the last decade using a common set of recommended dinucleotide nuclear STRs. This type of marker has been replaced by long core-repeat loci in standardized state-of-the-art human forensic genotyping. The first steps toward harmonized grapevine genotyping have already been taken by previous authors to bring the genetic identification methods closer to human forensic STR standards. In this context, we bring forward a set of basic suggestions that reinforce the need to (i) guarantee trueness-to-type of the sample; (ii) use long core-repeat markers; (iii) verify the specificity and amplification consistency of PCR primers; (iv) sequence frequent alleles and use these as standardized allele ladders; (v) consider mutation rates when evaluating results of STR-based parentage and pedigree analysis; (vi) genotype large and representative samples in order to obtain allele frequency databases; (vii) standardize genotype data by establishing allele nomenclature based on repeat number to facilitate information exchange and data compilation. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Health Benefits from Large-Scale Ozone Reduction in the United States
Berman, Jesse D.; Fann, Neal; Hollingsworth, John W.; Pinkerton, Kent E.; Rom, William N.; Szema, Anthony M.; Breysse, Patrick N.; White, Ronald H.
2012-01-01
Background: Exposure to ozone has been associated with adverse health effects, including premature mortality and cardiopulmonary and respiratory morbidity. In 2008, the U.S. Environmental Protection Agency (EPA) lowered the primary (health-based) National Ambient Air Quality Standard (NAAQS) for ozone to 75 ppb, expressed as the fourth-highest daily maximum 8-hr average over a 24-hr period. Based on recent monitoring data, U.S. ozone levels still exceed this standard in numerous locations, resulting in avoidable adverse health consequences. Objectives: We sought to quantify the potential human health benefits from achieving the current primary NAAQS standard of 75 ppb and two alternative standard levels, 70 and 60 ppb, which represent the range recommended by the U.S. EPA Clean Air Scientific Advisory Committee (CASAC). Methods: We applied health impact assessment methodology to estimate numbers of deaths and other adverse health outcomes that would have been avoided during 2005, 2006, and 2007 if the current (or lower) NAAQS ozone standards had been met. Estimated reductions in ozone concentrations were interpolated according to geographic area and year, and concentration–response functions were obtained or derived from the epidemiological literature. Results: We estimated that annual numbers of avoided ozone-related premature deaths would have ranged from 1,410 to 2,480 at 75 ppb to 2,450 to 4,130 at 70 ppb, and 5,210 to 7,990 at 60 ppb. Acute respiratory symptoms would have been reduced by 3 million cases and school-loss days by 1 million cases annually if the current 75-ppb standard had been attained. Substantially greater health benefits would have resulted if the CASAC-recommended range of standards (70–60 ppb) had been met. Conclusions: Attaining a more stringent primary ozone standard would significantly reduce ozone-related premature mortality and morbidity. PMID:22809899
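The health impact assessment methodology this abstract applies typically rests on a log-linear concentration-response function. A minimal sketch, with entirely hypothetical coefficients (the study's actual concentration-response functions, baseline rates and populations are not reproduced here):

```python
import math

def avoided_cases(beta, delta_c, baseline_rate, population):
    """Log-linear health impact function commonly used in air-quality
    assessments: avoided cases = y0 * (1 - exp(-beta * delta_c)) * pop,
    where y0 is the baseline incidence rate per person and delta_c is
    the modeled reduction in the pollutant concentration."""
    return baseline_rate * (1.0 - math.exp(-beta * delta_c)) * population

# Entirely hypothetical inputs: a beta per ppb of ozone, a 5 ppb
# reduction in the ozone metric, a baseline daily mortality rate per
# person, and a population of 1 million.
beta = 0.0004
delta_c = 5.0
baseline_daily_rate = 2.2e-5
population = 1_000_000

daily_avoided = avoided_cases(beta, delta_c, baseline_daily_rate, population)
annual_avoided = daily_avoided * 365  # summed over a year
```

Scaling this kind of calculation over monitored areas and years is how totals such as the avoided-death ranges quoted in the abstract are produced.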
2014-01-01
Background Since the global standards for postgraduate medical education (PGME) were published in January 2003, they have gained worldwide attention. The current state of residency training programs in medical-school-affiliated hospitals throughout China was assessed in this study. Methods Based on the internationally recognized global standards for PGME, residents undergoing residency training at that time and the relevant residency training instructors and management personnel from 15 medical-school-affiliated hospitals throughout China were recruited and surveyed regarding the current state of residency training programs. A total of 938 questionnaire surveys were distributed between June 30, 2006 and July 30, 2006; of 892 surveys collected, 841 were valid. Results For six items, the total proportions of “basically meets standards” and “completely meets standards” were <70% for the basic standards. These items were identified in the fields of “training settings and educational resources”, “evaluation of training process”, and “trainees”. In all fields other than “continuous updates”, the average scores of the western regions were significantly lower than those of the eastern regions for both the basic and target standards. Specifically, the average scores for the basic standards on as many as 25 of the 38 items in the nine fields were significantly lower in the western regions. There were significant differences in the basic standards scores on 13 of the 38 items among trainees, instructors, and managers. Conclusions The residency training programs have achieved satisfactory outcomes in the hospitals affiliated with various medical schools in China. However, overall, the programs remain inadequate in certain areas. For the governments, organizations, and institutions responsible for PGME, such global standards for PGME are a very useful self-assessment tool and can help identify problems, promote reform, and ultimately standardize PGME. PMID:24885865
Velocity profile, water-surface slope, and bed-material size for selected streams in Colorado
Marchand, J.P.; Jarrett, R.D.; Jones, L.L.
1984-01-01
Existing methods for determining the mean velocity in a vertical sampling section do not address the conditions present in high-gradient, shallow-depth streams common to mountainous regions such as Colorado. The report presents velocity-profile data that were collected for 11 streamflow-gaging stations in Colorado using both a standard Price type AA current meter and a prototype Price Model PAA current meter. Computational results are compiled that will enable mean velocities calculated from measurements by the two current meters to be compared with each other and with existing methods for determining mean velocity. Water-surface slope, bed-material size, and flow-characteristic data for the 11 sites studied also are presented. (USGS)
Effective classification of the prevalence of Schistosoma mansoni.
Mitchell, Shira A; Pagano, Marcello
2012-12-01
To present an effective classification method based on the prevalence of Schistosoma mansoni in the community. We created decision rules (defined by cut-offs for the number of positive slides) that account for imperfect sensitivity, both with a simple adjustment for fixed sensitivity and with a more complex adjustment for sensitivity that changes with prevalence. To reduce screening costs while maintaining accuracy, we propose a pooled classification method. To estimate sensitivity, we use the De Vlas model for worm and egg distributions. We compare the proposed method with the standard method to investigate differences in efficiency, measured by the number of slides read, and accuracy, measured by the probability of correct classification. Modelling varying sensitivity lowers the lower cut-off more substantially than the upper cut-off, correctly classifying regions as moderate rather than low prevalence, so that they receive life-saving treatment. The pooled method classifies directly on the basis of positive pools, avoiding the need to know sensitivity in order to estimate prevalence. For model parameter values describing worm and egg distributions among children, the pooled method with 25 slides achieves an expected 89.9% probability of correct classification, whereas the standard method with 50 slides achieves 88.7%. Among children, it is more efficient and more accurate to use the pooled method for classification of S. mansoni prevalence than the current standard method. © 2012 Blackwell Publishing Ltd.
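The decision rule described in this abstract (classify a community from the number of positive slides against prevalence cut-offs) can be illustrated in a few lines. The cut-off values and the binomial slide-positivity model below are invented for illustration; they are not the values derived from the De Vlas worm/egg model in the paper.

```python
from math import comb

# Hypothetical cut-offs on the number of positive slides; the paper's
# cut-offs are derived from a worm/egg distribution model instead.
def classify_prevalence(n_positive, low_cutoff=5, high_cutoff=12):
    """Map a count of positive slides to a prevalence class."""
    if n_positive < low_cutoff:
        return "low"
    elif n_positive <= high_cutoff:
        return "moderate"
    return "high"

def prob_correct(p_slide, n_slides, true_class, low=5, high=12):
    """Probability of correct classification under a simple binomial
    model of slide positivity (illustrative only)."""
    total = 0.0
    for k in range(n_slides + 1):
        pk = comb(n_slides, k) * p_slide**k * (1 - p_slide)**(n_slides - k)
        if classify_prevalence(k, low, high) == true_class:
            total += pk
    return total
```

With 25 slides and a 30% per-slide positivity rate, this toy rule classifies a truly moderate-prevalence community correctly most of the time, mirroring the kind of accuracy comparison the authors report.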
DEVELOPMENT OF STANDARDIZED LARGE RIVER BIOASSESSMENT PROTOCOLS (LR-BP) FOR FISH ASSEMBLAGES
We conducted research comparing several methods currently in use for the bioassessment and monitoring of fish and benthic macroinvertebrate assemblages for large rivers. Fish data demonstrate that electrofishing 1000 m of shoreline is sufficient for bioassessments on boatable ri...
A Manual to Identify Sources of Fluvial Sediment
Sedimentation is one of the main causes of stream/river aquatic life use impairments in R3. Currently states lack standard guidance on appropriate tools available to quantify sediment sources and develop sediment budgets in TMDL Development. Methods for distinguishing sediment t...
Separation technologies for the recovery and dehydration of alcohols from fermentation broths
Multi-column distillation followed by molecular sieve adsorption is currently the standard method for producing fuel grade ethanol from dilute fermentation broths in modern corn-to-ethnol facilities. As the liquid biofuels industry transitions to lignocellulosic feedstocks, expan...
Carrasco-Labra, Alonso; Brignardello-Petersen, Romina; Santesso, Nancy; Neumann, Ignacio; Mustafa, Reem A; Mbuagbaw, Lawrence; Ikobaltzeta, Itziar Etxeandia; De Stio, Catherine; McCullagh, Lauren J; Alonso-Coello, Pablo; Meerpohl, Joerg J; Vandvik, Per Olav; Brozek, Jan L; Akl, Elie A; Bossuyt, Patrick; Churchill, Rachel; Glenton, Claire; Rosenbaum, Sarah; Tugwell, Peter; Welch, Vivian; Guyatt, Gordon; Schünemann, Holger
2015-04-16
Systematic reviews represent one of the most important tools for knowledge translation, but users often struggle with understanding and interpreting their results. GRADE Summary-of-Findings tables have been developed to display results of systematic reviews in a concise and transparent manner. The current format of the Summary-of-Findings tables for presenting risks and quality of evidence improves understanding and assists users with finding key information from the systematic review. However, it has been suggested that additional methods to present risks and display results in the Summary-of-Findings tables are needed. We will conduct a non-inferiority parallel-armed randomized controlled trial to determine whether an alternative format to present risks and display Summary-of-Findings tables is not inferior to the current standard format. We will measure participant understanding, accessibility of the information, satisfaction, and preference for both formats. We will invite systematic review users to participate (that is, clinicians, guideline developers, and researchers). The data collection process will be undertaken using the online 'Survey Monkey' system. For the primary outcome, understanding, non-inferiority of the alternative format (Table A) to the current standard format (Table C) of Summary-of-Findings tables will be claimed if the upper limit of a 1-sided 95% confidence interval (for the difference in the proportion of participants answering a given question correctly) excludes a difference in favor of the current format of more than 10%. This study represents an effort to provide systematic reviewers with additional options to display review results using Summary-of-Findings tables. In this way, review authors will have a variety of methods to present risks and more flexibility to choose the most appropriate table features to display (that is, optional columns, risk expressions, complementary methods to display continuous outcomes, and so on).
NCT02022631 (21 December 2013).
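The non-inferiority criterion stated in this protocol (claim non-inferiority if the upper limit of a one-sided 95% confidence interval for the difference in proportions excludes a difference of more than 10% in favor of the current format) can be sketched as follows. The normal-approximation interval and the sample counts are illustrative assumptions, not the trial's actual analysis code.

```python
from math import sqrt

Z_ONE_SIDED_95 = 1.645  # one-sided 95% normal quantile

def noninferior(correct_alt, n_alt, correct_cur, n_cur, margin=0.10):
    """True if the alternative format is non-inferior to the current one.

    diff = p_current - p_alternative; non-inferiority is claimed when
    the upper limit of the one-sided 95% CI for diff is below `margin`.
    """
    p_alt = correct_alt / n_alt
    p_cur = correct_cur / n_cur
    diff = p_cur - p_alt
    se = sqrt(p_alt * (1 - p_alt) / n_alt + p_cur * (1 - p_cur) / n_cur)
    return diff + Z_ONE_SIDED_95 * se < margin
```

For example, 85/100 correct answers with the alternative format versus 86/100 with the current format gives an upper confidence limit of roughly 0.09, so non-inferiority would be claimed at the 10% margin.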
Fibrinolysis standards: a review of the current status.
Thelwell, C
2010-07-01
Biological standards are used to calibrate measurements of components of the fibrinolytic system, either to assign potency values to therapeutic products or to determine levels in human plasma as an indicator of thrombotic risk. Traditionally, WHO International Standards are calibrated in International Units based on consensus values from collaborative studies. The International Unit is defined by the response activity of a given amount of the standard in a bioassay, independent of the method used. Assay validity is based on the assumption that both the standard and the test preparation contain the same analyte, and that the response in an assay is a true function of this analyte. This principle is reflected in the diversity of source materials used to prepare fibrinolysis standards, which has depended on the contemporary preparations they were employed to measure. With advancing recombinant technology and improved analytical techniques, a reference system based on reference materials and associated reference methods has been recommended for future fibrinolysis standards. Careful consideration and scientific judgement must, however, be applied when deciding on an approach to develop a new standard, with decisions based on the suitability of a standard to serve its purpose, and not just to satisfy a metrological ideal. © 2010 The International Association for Biologicals. Published by Elsevier Ltd. All rights reserved.
Revision of the NIST Standard for ²²³Ra: New Measurements and Review of 2008 Data.
Zimmerman, B E; Bergeron, D E; Cessna, J T; Fitzgerald, R; Pibida, L
2015-01-01
After discovering a discrepancy in the transfer standard currently being disseminated by the National Institute of Standards and Technology (NIST), we have performed a new primary standardization of the alpha-emitter ²²³Ra using Live-timed Anticoincidence Counting (LTAC) and the Triple-to-Double Coincidence Ratio Method (TDCR). Additional confirmatory measurements were made with the CIEMAT-NIST efficiency tracing method (CNET) of liquid scintillation counting, integral γ-ray counting using a NaI(Tl) well counter, and several High Purity Germanium (HPGe) detectors in an attempt to understand the origin of the discrepancy and to provide a correction. The results indicate that a -9.5 % difference exists between activity values obtained using the former transfer standard relative to the new primary standardization. During one of the experiments, a 2 % difference in activity was observed between dilutions of the ²²³Ra master solution prepared using the composition used in the original standardization and those prepared using 1 mol·L⁻¹ HCl. This effect appeared to depend on the number of dilutions or the total dilution factor of the master solution, but its magnitude was not reproducible. A new calibration factor ("K-value") has been determined for the NIST Secondary Standard Ionization Chamber (IC "A"), thereby correcting the discrepancy between the primary and secondary standards.
DOE Office of Scientific and Technical Information (OSTI.GOV)
This decision document presents the amendments to the remedial action for the Springfield Township Dump site, Oakland County, Michigan. The amended remedial action changes the selected method of addressing PCB-laden soils and also changes certain soil and groundwater cleanup standards previously selected in the 1990 Record of Decision (ROD) to reflect current state standards. The groundwater and soil vapor extraction and treatment systems and the arsenic and lead groundwater cleanup standards identified as part of the selected remedy in the 1990 ROD and in the 1993 Explanation of Significant Differences remain unchanged.
Faye, Sherry A.; Richards, Jason M.; Gallardo, Athena M.; ...
2017-02-07
Sequential extraction is a useful technique for assessing the potential to leach actinides from soils; however, current literature lacks uniformity in experimental details, making direct comparison of results impossible. This work continued development toward a standardized five-step sequential extraction protocol by analyzing extraction behaviors of ²³²Th, ²³⁸U, ²³⁹,²⁴⁰Pu and ²⁴¹Am from lake and ocean sediment reference materials. Results produced a standardized procedure after creating more defined reaction conditions to improve method repeatability. A NaOH fusion procedure is recommended following sequential leaching for the complete dissolution of insoluble species.
Findlay, L; Desai, T; Heath, A; Poole, S; Crivellone, M; Hauck, W; Ambrose, M; Morris, T; Daas, A; Rautmann, G; Buchheit, K H; Spieser, J M; Terao, E
2015-01-01
An international collaborative study was organised jointly by the World Health Organization (WHO)/National Institute for Biological Standards and Control (NIBSC), the United States Pharmacopeia (USP) and the European Directorate for the Quality of Medicines & HealthCare (EDQM/Council of Europe) for the establishment of harmonised replacement endotoxin standards for these 3 organisations. Thirty-five laboratories worldwide, including Official Medicines Control Laboratories (OMCLs) and manufacturers, enrolled in the study. Three candidate preparations (10/178, 10/190 and 10/196) were produced with the same material and same formulation as the current reference standards with the objective of generating a new (3rd) International Standard (IS) with the same potency (10 000 IU/vial) as the current (2nd) IS, as well as new European Pharmacopoeia (Ph. Eur.) and USP standards. The suitability of the candidate preparations to act as the reference standard in assays for endotoxin performed according to compendial methods was evaluated. Their potency was calibrated against the WHO 2nd IS for Endotoxin (94/580). Gelation and photometric methods produced similar results for each of the candidate preparations. The overall potency estimates for the 3 batches were comparable. Given the intrinsic assay precision, the observed differences between the batches may be considered unimportant for the intended use of these materials. Overall, these results were in line with those generated for the establishment of the current preparations of reference standards. Accelerated degradation testing of vials stored at elevated temperatures supported the long-term stability of the 3 candidate preparations. It was agreed between the 3 organisations that batch 10/178 be shared between WHO and EDQM and that batches 10/190 and 10/196 be allocated to USP, with a common assigned value of 10 000 IU/vial.
This value maintains the continuity of the global harmonisation of reference materials and unitage for the testing of endotoxins in parenteral pharmaceutical products. Based on the results of the collaborative study, batch 10/178 was established by the European Pharmacopoeia Commission as the Ph. Eur. Endotoxin Biological Reference Preparation (BRP) batch 5. The same batch was also established by the Expert Committee on Biological Standardisation (ECBS) of WHO as the WHO 3rd IS for Endotoxin. Batch 10/190 was adopted as the USP Endotoxin Reference Standard, lot H0K354, and vials from this same batch (10/190) will serve as the United States Food and Drug Administration (USFDA) Endotoxin Standard, EC-7.
Johnston, Patrick A; Brown, Robert C
2014-08-13
A rapid method for the quantitation of total sugars in pyrolysis liquids using high-performance liquid chromatography (HPLC) was developed. The method avoids the tedious and time-consuming sample preparation required by current analytical methods. It is possible to directly analyze hydrolyzed pyrolysis liquids, bypassing the neutralization step usually required in determination of total sugars. A comparison with traditional methods was used to determine the validity of the results. The calibration curve coefficient of determination on all standard compounds was >0.999 using a refractive index detector. The relative standard deviation for the new method was 1.13%. The spiked sugar recoveries on the pyrolysis liquid samples were between 104 and 105%. The research demonstrates that it is possible to obtain excellent accuracy and efficiency using HPLC to quantitate glucose after acid hydrolysis of polymeric and oligomeric sugars found in fast pyrolysis bio-oils without neutralization.
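Two of the validation figures quoted in this abstract, relative standard deviation and spiked-sugar recovery, are simple to compute; a minimal sketch follows, with invented measurement values (the paper reports an RSD of 1.13% and recoveries of 104-105%).

```python
from math import sqrt

def relative_std_dev(values):
    """Relative standard deviation (%), using the sample (n-1) variance."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return 100.0 * sqrt(var) / mean

def spike_recovery(measured_spiked, measured_unspiked, amount_added):
    """Spike recovery (%): share of the added analyte that is measured."""
    return 100.0 * (measured_spiked - measured_unspiked) / amount_added
```

For instance, measuring 15.2 g/L of sugar in a sample that read 10.0 g/L before a 5.0 g/L spike corresponds to a 104% recovery, the lower end of the range reported above.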
Evolution of microbiological analytical methods for dairy industry needs
Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence
2014-01-01
Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus, offer new perspectives to integration of microbial physiology monitoring to improve industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry’s needs. Recent studies show that Polymerase chain reaction-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potentialities. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards. PMID:24570675
Adult Current Smoking: Differences in Definitions and Prevalence Estimates—NHIS and NSDUH, 2008
Ryan, Heather; Trosclair, Angela; Gfroerer, Joe
2012-01-01
Objectives. To compare prevalence estimates and assess issues related to the measurement of adult cigarette smoking in the National Health Interview Survey (NHIS) and the National Survey on Drug Use and Health (NSDUH). Methods. 2008 data on current cigarette smoking and current daily cigarette smoking among adults ≥18 years were compared. The standard NHIS current smoking definition, which screens for lifetime smoking ≥100 cigarettes, was used. For NSDUH, both the standard current smoking definition, which does not screen, and a modified definition applying the NHIS current smoking definition (i.e., with screen) were used. Results. NSDUH consistently yielded higher current cigarette smoking estimates than NHIS and lower daily smoking estimates. However, with use of the modified NSDUH current smoking definition, a notable number of subpopulation estimates became comparable between surveys. Younger adults and racial/ethnic minorities were most impacted by the lifetime smoking screen, with Hispanics being the most sensitive to differences in smoking variable definitions among all subgroups. Conclusions. Differences in current cigarette smoking definitions appear to have a greater impact on smoking estimates in some sub-populations than others. Survey mode differences may also limit intersurvey comparisons and trend analyses. Investigators are cautioned to use data most appropriate for their specific research questions. PMID:22649464
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sleiman, Mohamad; Chen, Sharon; Gilbert, Haley E.
A laboratory method to simulate natural exposure of roofing materials has been reported in a companion article. In the current article, we describe the results of an international, nine-participant interlaboratory study (ILS) conducted in accordance with ASTM Standard E691-09 to establish the precision and reproducibility of this protocol. The accelerated soiling and weathering method was applied four times by each laboratory to replicate coupons of 12 products representing a wide variety of roofing categories (single-ply membrane, factory-applied coating (on metal), bare metal, field-applied coating, asphalt shingle, modified-bitumen cap sheet, clay tile, and concrete tile). Participants reported initial and laboratory-aged values of solar reflectance and thermal emittance. Measured solar reflectances were consistent within and across eight of the nine participating laboratories. Measured thermal emittances reported by six participants exhibited comparable consistency. For solar reflectance, the accelerated aging method is both repeatable and reproducible within an acceptable range of standard deviations: the repeatability standard deviation sr ranged from 0.008 to 0.015 (relative standard deviation of 1.2–2.1%) and the reproducibility standard deviation sR ranged from 0.022 to 0.036 (relative standard deviation of 3.2–5.8%). The ILS confirmed that the accelerated aging method can be reproduced by multiple independent laboratories with acceptable precision. In conclusion, this study supports the adoption of the accelerated aging practice to speed the evaluation and performance rating of new cool roofing materials.
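The repeatability (sr) and reproducibility (sR) standard deviations quoted here follow the ASTM E691 analysis for a balanced interlaboratory study: sr pools the within-laboratory variances, and sR adds the between-laboratory component. A minimal sketch of that computation, on invented reflectance data, looks like this.

```python
from math import sqrt

def e691_precision(labs):
    """Repeatability (sr) and reproducibility (sR) per the ASTM E691 scheme.

    `labs` is a list of replicate-measurement lists, one per laboratory
    (balanced design: every laboratory reports the same number of replicates).
    """
    p = len(labs)                      # number of laboratories
    n = len(labs[0])                   # replicates per laboratory
    means = [sum(lab) / n for lab in labs]
    # Pooled within-laboratory variance -> repeatability
    s_r2 = sum(sum((x - m) ** 2 for x in lab) / (n - 1)
               for lab, m in zip(labs, means)) / p
    # Variance of the laboratory means
    grand = sum(means) / p
    s_x2 = sum((m - grand) ** 2 for m in means) / (p - 1)
    # Between-laboratory component, floored at zero
    s_L2 = max(s_x2 - s_r2 / n, 0.0)
    s_r = sqrt(s_r2)
    s_R = max(sqrt(s_L2 + s_r2), s_r)  # sR can never fall below sr
    return s_r, s_R
```

On three hypothetical laboratories each measuring a coupon's solar reflectance three times, the function returns sr and sR in the same units as the measurements, directly comparable to the 0.008-0.015 and 0.022-0.036 ranges reported above.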
Parrish, Nicole; Osterhout, Gerard; Dionne, Kim; Sweeney, Amy; Kwiatkowski, Nicole; Carroll, Karen; Jost, Kenneth C.; Dick, James
2007-01-01
Multidrug-resistant (MDR) Mycobacterium tuberculosis and extensively drug-resistant (XDR) M. tuberculosis are emerging public health threats, compounded by the fact that current techniques for testing the susceptibility of M. tuberculosis require several days to weeks to complete. We investigated the use of high-performance liquid chromatography (HPLC)-based quantitation of mycolic acids as a means of rapidly determining drug resistance and susceptibility in M. tuberculosis. Standard susceptibility testing and determination of the MICs of drug-susceptible (n = 26) and drug-resistant M. tuberculosis strains, including MDR M. tuberculosis strains (n = 34), were performed by using the Bactec radiometric growth system as the reference method. The HPLC-based susceptibilities to the current first-line drugs, isoniazid (INH), rifampin (RIF), ethambutol (EMB), and pyrazinamide (PZA), were determined. The vials were incubated for 72 h, and aliquots were removed for HPLC analysis by using the Sherlock mycobacterial identification system. HPLC quantitation of total mycolic acid peaks (TMAPs) was performed with treated and untreated cultures. At 72 h, the levels of agreement of the HPLC method with the reference method were 99.5% for INH, EMB, and PZA and 98.7% for RIF. The inter- and intra-assay reproducibilities varied by drug, with an average precision of 13.4%. In summary, quantitation of TMAPs is a rapid, sensitive, and accurate method for antibiotic susceptibility testing of all first-line drugs currently used against M. tuberculosis and offers the potential of providing susceptibility testing results within hours, rather than days or weeks, for clinical M. tuberculosis isolates. PMID:17913928
Performance of the AOAC use-dilution method with targeted modifications: collaborative study.
Tomasino, Stephen F; Parker, Albert E; Hamilton, Martin A; Hamilton, Gordon C
2012-01-01
The U.S. Environmental Protection Agency (EPA), in collaboration with an industry work group, spearheaded a collaborative study designed to further enhance the AOAC use-dilution method (UDM). Based on feedback from laboratories that routinely conduct the UDM, improvements to the test culture preparation steps were prioritized. A set of modifications, largely based on culturing the test microbes on agar as specified in the AOAC hard surface carrier test method, was evaluated in a five-laboratory trial. The modifications targeted the preparation of the Pseudomonas aeruginosa test culture due to the difficulty of separating the pellicle from the broth in the current UDM. The proposed modifications (i.e., the modified UDM) were compared to the current UDM methodology for P. aeruginosa and Staphylococcus aureus. Salmonella choleraesuis was not included in the study. The goal was to determine if the modifications reduced method variability. Three efficacy response variables were statistically analyzed: the number of positive carriers, the log reduction, and the pass/fail outcome. The scope of the collaborative study was limited to testing one liquid disinfectant (an EPA-registered quaternary ammonium product) at two levels of presumed product efficacy, high and low. Test conditions included use of 400 ppm hard water as the product diluent and a 5% organic soil load (horse serum) added to the inoculum. Unfortunately, the study failed to support the adoption of the major modification (use of an agar-based approach to grow the test cultures) based on an analysis of the method's variability. The repeatability and reproducibility standard deviations for the modified method were equal to or greater than those for the current method across the various test variables.
However, the authors propose retaining the frozen stock preparation step of the modified method, and based on the statistical equivalency of the control log densities, support its adoption as a procedural change to the current UDM. The current UDM displayed acceptable responsiveness to changes in product efficacy; acceptable repeatability across multiple tests in each laboratory for the control counts and log reductions; and acceptable reproducibility across multiple laboratories for the control log density values and log reductions. Although the data do not support the adoption of all modifications, the UDM collaborative study data are valuable for assessing sources of method variability and a reassessment of the performance standard for the UDM.
Dong, Ren G; Sinsel, Erik W; Welcome, Daniel E; Warren, Christopher; Xu, Xueyan S; McDowell, Thomas W; Wu, John Z
2015-09-01
The hand coordinate systems for measuring vibration exposures and biodynamic responses have been standardized, but they are not actually used in many studies. This contradicts the purpose of the standardization. The objectives of this study were to identify the major sources of this problem, and to help define or identify better coordinate systems for the standardization. This study systematically reviewed the principles and definition methods, and evaluated typical hand coordinate systems. This study confirms that, as accelerometers remain the major technology for vibration measurement, it is reasonable to standardize two types of coordinate systems: a tool-based basicentric (BC) system and an anatomically based biodynamic (BD) system. However, these coordinate systems are not well defined in the current standard. Definition of the standard BC system is confusing, and it can be interpreted differently; as a result, it has been inconsistently applied in various standards and studies. The standard hand BD system is defined using the orientation of the third metacarpal bone. It is neither convenient nor defined based on important biological or biodynamic features. This explains why it is rarely used in practice. To resolve these inconsistencies and deficiencies, we proposed a revised method for defining the realistic handle BC system and an alternative method for defining the hand BD system. A fingertip-based BD system for measuring the principal grip force is also proposed based on an important feature of the grip force confirmed in this study.
NASA Astrophysics Data System (ADS)
Goh, Chin-Teng; Cruden, Andrew
2014-11-01
Capacitance and resistance are the fundamental electrical parameters used to evaluate the electrical characteristics of a supercapacitor, namely the dynamic voltage response, energy capacity, state of charge, and health condition. The constant capacitance method of British Standards EN62391 and EN62576 can be improved upon with a differential capacitance that more accurately describes the dynamic voltage response of supercapacitors. This paper presents a novel bivariate quadratic based method to model the dynamic voltage response of supercapacitors under high-current charge-discharge cycling, and to enable the derivation of the differential capacitance and energy capacity directly from terminal measurements, i.e. voltage and current, rather than from multiple pulsed-current or excitation-signal tests across different bias levels. The estimation results are in close agreement with experimental measurements, within a relative error of 0.2% at various high current levels (25-200 A), more accurate than the constant capacitance method (4-7%). The archival value of this paper is the introduction of an improved quantification method for the electrical characteristics of supercapacitors, and the disclosure of distinct properties of supercapacitors: the nonlinear capacitance-voltage characteristic, the variation of capacitance between charging and discharging, and the distribution of energy capacity across the operating voltage window.
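A differential capacitance of the kind described in this abstract can be estimated directly from sampled terminal voltage and current as C(v) = i / (dv/dt). The sketch below uses a plain central-difference derivative; the constant-current test data are synthetic, not the bivariate quadratic model fitted in the paper.

```python
def differential_capacitance(t, v, i):
    """Central-difference estimate of C = i / (dv/dt) at interior samples.

    t: sample times (s), v: terminal voltage (V), i: terminal current (A).
    Returns one capacitance estimate (F) per interior sample point.
    """
    caps = []
    for k in range(1, len(t) - 1):
        dv_dt = (v[k + 1] - v[k - 1]) / (t[k + 1] - t[k - 1])
        caps.append(i[k] / dv_dt)
    return caps
```

For an ideal constant capacitor charged at constant current, the voltage ramps linearly and every estimate coincides; on real supercapacitor data the estimates vary with voltage, which is exactly the nonlinear capacitance-voltage characteristic the paper discusses.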
Plastics processing: statistics, current practices, and evaluation.
Cooke, F
1993-11-01
The health care industry uses a huge quantity of plastic materials each year. Much of the machinery currently used, or supplied, for plastics processing is unsuitable for use in a clean environment. In this article, the author outlines the reasons for the current situation and urges companies to re-examine their plastic-processing methods, whether performed in-house or subcontracted out. Some of the factors that should be considered when evaluating plastics-processing equipment are outlined to assist companies in remaining competitive and complying with impending EC regulations on clean room standards for manufacturing areas.
Warrell, Mary J.; Riddell, Anna; Yu, Ly-Mee; Phipps, Judith; Diggle, Linda; Bourhy, Hervé; Deeks, Jonathan J.; Fooks, Anthony R.; Audry, Laurent; Brookes, Sharon M.; Meslin, François-Xavier; Moxon, Richard; Pollard, Andrew J.; Warrell, David A.
2008-01-01
Background The need for economical rabies post-exposure prophylaxis (PEP) is increasing in developing countries. Implementation of the two currently approved economical intradermal (ID) vaccine regimens is restricted due to confusion over different vaccines, regimens and dosages, lack of confidence in intradermal technique, and pharmaceutical regulations. We therefore compared a simplified 4-site economical PEP regimen with standard methods. Methods Two hundred and fifty-four volunteers were randomly allocated to a single blind controlled trial. Each received purified vero cell rabies vaccine by one of four PEP regimens: the currently accepted 2-site ID; the 8-site regimen using 0.05 ml per ID site; a new 4-site ID regimen (on day 0, approximately 0.1 ml at 4 ID sites, using the whole 0.5 ml ampoule of vaccine; on day 7, 0.1 ml ID at 2 sites and at one site on days 28 and 90); or the standard 5-dose intramuscular regimen. All ID regimens required the same total amount of vaccine, 60% less than the intramuscular method. Neutralising antibody responses were measured five times over a year in 229 people, for whom complete data were available. Findings All ID regimens showed similar immunogenicity. The intramuscular regimen gave the lowest geometric mean antibody titres. Using the rapid fluorescent focus inhibition test, some sera had unexpectedly high antibody levels that were not attributable to previous vaccination. The results were confirmed using the fluorescent antibody virus neutralisation method. Conclusions This 4-site PEP regimen proved as immunogenic as current regimens, and has the advantages of requiring fewer clinic visits, being more practicable, and having a wider margin of safety, especially in inexperienced hands, than the 2-site regimen. It is more convenient than the 8-site method, and can be used economically with vaccines formulated in 1.0 or 0.5 ml ampoules. 
The 4-site regimen now meets all requirements of immunogenicity for PEP and can be introduced without further studies. Trial Registration Controlled-Trials.com ISRCTN 30087513 PMID:18431444
Van Herpe, Tom; De Brabanter, Jos; Beullens, Martine; De Moor, Bart; Van den Berghe, Greet
2008-01-01
Introduction Blood glucose (BG) control performed by intensive care unit (ICU) nurses is becoming standard practice for critically ill patients. New (semi-automated) 'BG control' algorithms (or 'insulin titration' algorithms) are under development, but these require stringent validation before they can replace the currently used algorithms. Existing methods for objectively comparing different insulin titration algorithms show weaknesses. In the current study, a new approach for appropriately assessing the adequacy of different algorithms is proposed. Methods Two ICU patient populations (with different baseline characteristics) were studied, both treated with a similar 'nurse-driven' insulin titration algorithm targeting BG levels of 80 to 110 mg/dl. A new method for objectively evaluating BG deviations from normoglycemia was founded on a smooth penalty function. Next, the performance of this new evaluation tool was compared with the current standard assessment methods, on an individual as well as a population basis. Finally, the impact of four selected parameters (the average BG sampling frequency, the duration of algorithm application, the severity of disease, and the type of illness) on the performance of an insulin titration algorithm was determined by multiple regression analysis. Results The glycemic penalty index (GPI) was proposed as a tool for assessing the overall glycemic control behavior in ICU patients. The GPI of a patient is the average of all penalties that are individually assigned to each measured BG value based on the optimized smooth penalty function. The computation of this index returns a number between 0 (no penalty) and 100 (the highest penalty). For some patients, the assessment of the BG control behavior using the traditional standard evaluation methods was different from the evaluation with GPI. Two parameters were found to have a significant impact on GPI: the BG sampling frequency and the duration of algorithm application. 
A higher BG sampling frequency and a longer algorithm application duration resulted in an apparently better performance, as indicated by a lower GPI. Conclusion The GPI is an alternative method for evaluating the performance of BG control algorithms. The blood glucose sampling frequency and the duration of algorithm application should be similar when comparing algorithms. PMID:18302732
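The GPI computation described above can be sketched as follows. The penalty function here is a hypothetical stand-in (zero inside the 80-110 mg/dl target band, quadratic outside, capped at 100); the published index uses an optimized smooth penalty function whose exact form is not reproduced in the abstract.

```python
def penalty(bg, lo=80.0, hi=110.0):
    """Penalty for one blood-glucose value (mg/dl): zero inside the
    80-110 target band, quadratic outside, capped at 100. Hypothetical
    stand-in for the paper's optimized smooth penalty function."""
    if lo <= bg <= hi:
        return 0.0
    dist = (lo - bg) if bg < lo else (bg - hi)
    return min(100.0, 0.01 * dist ** 2)   # illustrative scaling only

def glycemic_penalty_index(bg_values):
    """GPI: average of the per-sample penalties, between 0 and 100."""
    return sum(penalty(b) for b in bg_values) / len(bg_values)

gpi = glycemic_penalty_index([95, 105, 130, 60, 100])   # -> 1.6
```

Note how the index penalizes both hyper- and hypoglycemic excursions from a single scale, which is what allows it to summarize overall control behavior in one number per patient.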
Management of tinnitus in English NHS audiology departments: an evaluation of current practice
Hoare, Derek J; Gander, Phillip E; Collins, Luke; Smith, Sandra; Hall, Deborah A
2012-01-01
Rationale, aim and objective In 2009, the UK Department of Health formalized recommended National Health Service practices for the management of tinnitus from primary care onwards. It is timely therefore to evaluate the perceived practicality, utility and impact of those guidelines in the context of current practice. Methods We surveyed current practice by posting a 36-item questionnaire to all audiology and hearing therapy staff that we were able to identify as being involved in tinnitus patient care in England. Results In total, 138 out of 351 clinicians responded (39% response rate). The findings indicate a consensus opinion that management should be tailored to individual symptom profiles but that there is little standardization of assessment procedures or tools in use. Conclusions While the lack of standardized practice might provide flexibility to meet local demand, it has drawbacks. It makes it difficult to ascertain key standards of best practice, it complicates the process of clinical audit, it implies unequal patient access to care, and it limits the implementation of translational research outcomes. We recommend that core elements of practice should be standardized, including the use of validated tinnitus questionnaires and an agreed pathway for decision making, to better understand the rationale for the management strategies offered. PMID:21087449
Probabilistic Parameter Uncertainty Analysis of Single Input Single Output Control Systems
NASA Technical Reports Server (NTRS)
Smith, Brett A.; Kenny, Sean P.; Crespo, Luis G.
2005-01-01
The current standards for handling uncertainty in control systems use interval bounds for definition of the uncertain parameters. This approach gives no information about the likelihood of system performance, but simply gives the response bounds. When used in design, current methods such as μ-analysis can lead to overly conservative controller designs. With these methods, worst case conditions are weighted equally with the most likely conditions. This research explores a unique approach for probabilistic analysis of control systems. Current reliability methods are examined, showing the strong areas of each in handling probability. A hybrid method is developed using these reliability tools for efficiently propagating probabilistic uncertainty through classical control analysis problems. The method developed is applied to classical response analysis as well as analysis methods that explore the effects of the uncertain parameters on stability and performance metrics. The benefits of using this hybrid approach for calculating the mean and variance of response cumulative distribution functions are shown. Results of the probabilistic analysis of a missile pitch control system, and a non-collocated mass spring system, show the added information provided by this hybrid analysis.
Cost effectiveness of the U.S. Geological Survey's stream-gaging program in Illinois
Mades, D.M.; Oberg, K.A.
1984-01-01
Data uses and funding sources were identified for 138 continuous-record discharge-gaging stations currently (1983) operated as part of the stream-gaging program in Illinois. Streamflow data from five of those stations are used only for regional hydrology studies. Most streamflow data are used for defining regional hydrology, defining rainfall-runoff relations, flood forecasting, regulating navigation systems, and water-quality sampling. Based on the evaluations of data use and of alternative methods for determining streamflow in place of stream gaging, no stations in the 1983 stream-gaging program should be deactivated. The current budget (in 1983 dollars) for operating the 138-station program is $768,000 per year. The average standard error of instantaneous discharge for the current practice for visiting the gaging stations is 36.5 percent. Missing stage record accounts for one-third of the 36.5 percent average standard error. (USGS)
1993-12-01
Generally Accepted Process While neither DoD Directives nor USAF Regulations specify exact mandatory TDY order processing methods, most USAF units...functional input. Finally, TDY order processing functional experts at Hanscom, Los Angeles and McClellan AFBs provided inputs based on their experiences...current electronic auditing capabilities. 81 DTPS Initiative. This DFAS-initiated action to standardize TDY order processing throughout DoD is currently
NASA Astrophysics Data System (ADS)
Salatino, Maria
2017-06-01
In the current submm and mm cosmology experiments the focal planes are populated by kilopixel transition edge sensors (TESes). Varying incoming power load requires frequent rebiasing of the TESes through standard current-voltage (IV) acquisition. The time required to perform IVs on such large arrays and the resulting transient heating of the bath reduces the sky observation time. We explore a bias step method that significantly reduces the time required for the rebiasing process. This exploits the detectors' responses to the injection of a small square wave signal on top of the dc bias current and knowledge of the shape of the detector transition R(T,I). This method has been tested on two detector arrays of the Atacama Cosmology Telescope (ACT). In this paper, we focus on the first step of the method, the estimate of the TES %Rn.
Current Taxonomical Situation of Streptococcus suis
Okura, Masatoshi; Osaki, Makoto; Nomoto, Ryohei; Arai, Sakura; Osawa, Ro; Sekizaki, Tsutomu; Takamatsu, Daisuke
2016-01-01
Streptococcus suis, a major porcine pathogen and an important zoonotic agent, is considered to be composed of phenotypically and genetically diverse strains. However, recent studies reported several “S. suis-like strains” that were identified as S. suis by commonly used methods for the identification of this bacterium, but were regarded as distinct species from S. suis according to the standards of several taxonomic analyses. Furthermore, it has been suggested that some S. suis-like strains can be assigned to several novel species. In this review, we discuss the current taxonomical situation of S. suis with a focus on (1) the classification history of the taxon of S. suis; (2) S. suis-like strains revealed by taxonomic analyses; (3) methods for detecting and identifying this species, including a novel method that can distinguish S. suis isolates from S. suis-like strains; and (4) current topics on the reclassification of S. suis-like strains. PMID:27348006
The production and quality of tomato concentrates.
Hayes, W A; Smith, P G; Morris, A E
1998-10-01
The standards and specifications for the quality and composition of tomato concentrates are reviewed. The main quality parameters of tomato puree and paste are color, consistency and flavor. Overall, there is an absence of standardization of methods and instruments to define quality. While color can now be measured objectively, there are currently no standard color requirements for tomato concentrates. Rheological measurements on both tomato juice and concentrates are reviewed; the power law finds wide applicability, although other rheological characteristics, particularly time dependency, have received far less attention and there has been little effort to relate rheological understanding to the commonly used empirical tests such as consistency measurements. The volatiles responsible for flavor and odor have been identified to the point where the natural odor of tomato paste can be imitated. Attempts to develop objective methods as a substitute for sensory assessment are reviewed.
Quantitative estimation of dust fall and smoke particles in Quetta Valley*
Sami, Muhammad; Waseem, Amir; Akbar, Sher
2006-01-01
Tightening of air quality standards for populated urban areas has led to increasing attention to assessment of air quality management areas, where violation of air quality standards occurs, and development of control strategies to eliminate such violation of air quality standards. The Quetta urban area is very densely built and has heavy motorized traffic. The increase of emissions mainly from traffic and industry are responsible for the increase in atmospheric pollution levels during the last years. The dust examined in the current study was collected by both deposit gauge and Petri dish methods at various sites of Quetta Valley. Smoke particles were obtained by bladder method from the exhausts of various types of motor vehicles. The concentration of lead found in the smoke ranged from 1.5×10−6 to 4.5×10−6. PMID:16773727
Deport, Coralie; Ratel, Jérémy; Berdagué, Jean-Louis; Engel, Erwan
2006-05-26
The current work describes a new method, the comprehensive combinatory standard correction (CCSC), for the correction of instrumental signal drifts in GC-MS systems. The method consists in analyzing, together with the products of interest, a mixture of n selected internal standards, normalizing the peak area of each analyte by the sum of standard areas, and then selecting, among the Σ(p=1..n) C(n,p) possible sums, the one that enables the best product discrimination. The CCSC method was compared with classical techniques of data pre-processing, such as internal normalization (IN) and single standard correction (SSC), on their ability to correct raw data from the main drifts occurring in a dynamic headspace-gas chromatography-mass spectrometry system. Three edible oils with closely similar compositions in volatile compounds were analysed using a device whose performance was modulated by using new or used dynamic headspace traps and GC columns, and by modifying the tuning of the mass spectrometer. According to one-way ANOVA, the CCSC method increased the number of analytes discriminating the products (31 after CCSC versus 25 with raw data or after IN, and 26 after SSC). Moreover, CCSC enabled a satisfactory discrimination of the products irrespective of the drifts. In a factorial discriminant analysis, 100% of the samples (n = 121) were well classified after CCSC, versus 45% for raw data and 90 and 93%, respectively, after IN and SSC.
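The CCSC subset search lends itself to a brute-force sketch: normalize each analyte area by the summed areas of every non-empty subset of the internal standards and keep the subset that best separates the product groups. The discrimination score below (between-group variance of group means over mean within-group variance) is a simplified stand-in for the paper's one-way ANOVA criterion, and all data values are hypothetical.

```python
from itertools import combinations
import statistics

def _discrimination(norm, groups):
    """Between-group variance of the group means over the mean
    within-group variance; a simplified stand-in for one-way ANOVA."""
    labels = sorted(set(groups))
    means, withins = [], []
    for lab in labels:
        vals = [v for v, g in zip(norm, groups) if g == lab]
        means.append(statistics.mean(vals))
        withins.append(statistics.pvariance(vals))
    return statistics.pvariance(means) / (statistics.mean(withins) + 1e-12)

def ccsc_best_subset(analyte_areas, standard_areas, groups):
    """For every non-empty subset of the n internal standards, normalize
    each analyte area by the sum of that subset's standard areas, and
    return the subset giving the best product discrimination."""
    n = len(standard_areas[0])
    best_score, best_subset = float("-inf"), None
    for p in range(1, n + 1):
        for subset in combinations(range(n), p):
            norm = [a / sum(s[j] for j in subset)
                    for a, s in zip(analyte_areas, standard_areas)]
            score = _discrimination(norm, groups)
            if score > best_score:
                best_score, best_subset = score, subset
    return best_subset

# Hypothetical data: standard 0 tracks the run-to-run drift, standard 1
# does not, so normalizing by standard 0 alone removes the drift.
analyte_areas  = [10.0, 20.0, 20.0, 40.0]
standard_areas = [[5.0, 7.0], [10.0, 7.0], [5.0, 7.0], [10.0, 7.0]]
groups         = ["A", "A", "B", "B"]
best = ccsc_best_subset(analyte_areas, standard_areas, groups)   # -> (0,)
```

The exhaustive search over all 2^n - 1 subsets is what the "comprehensive combinatory" in the method's name refers to; for the small n typical of internal-standard mixtures this remains cheap.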
40 CFR 63.2390 - What records must I keep?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Standards for Hazardous Air Pollutants: Organic Liquids Distribution (Non-Gasoline) Notifications, Reports... into which organic liquids are loaded at a transfer rack that is subject to control based on the... Method 27. (2) For transport vehicles without vapor collection equipment, current certification in...
Currently there are no standard radioanalytical methods applicable to the initial phase of a radiological emergency, for the early identification and quantification of alpha emitting radionuclides. Of particular interest are determinations of the presence and concentration of is...
Addiction Competencies in the 2009 CACREP Clinical Mental Health Counseling Program Standards
ERIC Educational Resources Information Center
Lee, Tiffany K.; Craig, Stephen E.; Fetherson, Bianca T. L.; Simpson, C. Dennis
2013-01-01
The Council for Accreditation of Counseling and Related Educational Programs developed addiction competencies for clinical mental health counseling students. This article highlights these competencies, provides an overview of current addiction training, and describes methods to integrate addiction education into curricula.
We conducted research comparing several methods currently in use for the bioassessment and monitoring of fish and benthic macroinvertebrate assemblages of large rivers. Fish data demonstrate that electrofishing 1000 m of shoreline is sufficient for bioassessments on boatable riv...
Standard methods for tracheal mite research
USDA-ARS?s Scientific Manuscript database
This chapter, for the COLOSS Beebook from the Bee Research Center in Switzerland, summarizes all the current information about the tracheal mite (Acarapis woodi) infesting honey bees (Apis mellifera). The chapter covers the effects on bees, its life history, and its range, as well as the identifica...
Kakkar, Chandan; Sripathi, Smiti; Parakh, Anushri; Shrivastav, Rajendra
2016-01-01
Introduction Urolithiasis is a major recurring problem in young individuals, and CT is the most commonly used diagnostic modality. Because these patients are young and stone formation is a recurring process, one of the simplest ways to reduce the radiation dose would be low-dose CT along with tube current modulation. Aim The aim of this study was to compare the sensitivity and specificity of a low dose (70 mAs) with a standard dose (250 mAs) protocol in detecting urolithiasis, and to define the tube current and mean effective patient dose for these protocols. Materials and Methods A prospective study was conducted in 200 patients presenting with acute flank pain over a period of 2 years. CT was performed in 100 cases with the standard dose and in another 100 with the low dose protocol using tube current modulation. Sensitivity and specificity for calculus detection, and the percentage reduction of dose and tube current with the low dose protocol, were calculated. Results Urolithiasis was detected in 138 patients; 67 were examined with the standard dose and 71 with the low dose protocol. Sensitivity and specificity of the low dose protocol were 97.1% and 96.4%, with similar results found in high-BMI patients. Tube current modulation resulted in a reduction of effective tube current by 12.17%. The mean effective patient dose for the standard dose was 10.33 mSv, whereas it was 2.92 mSv for the low dose, with a 51.13-53.8% reduction in the low dose protocol. Conclusion The study reinforced that low-dose CT with tube current modulation is appropriate for the diagnosis of urolithiasis, with a significant reduction in tube current and patient effective dose. PMID:27437322
Sidle, John E.; Wamalwa, Emmanuel S.; Okumu, Thomas O.; Bryant, Kendall L.; Goulet, Joseph L.; Maisto, Stephen A.; Braithwaite, R. Scott; Justice, Amy C.
2010-01-01
Traditional homemade brew is believed to represent the highest proportion of alcohol use in sub-Saharan Africa. In Eldoret, Kenya, two types of brew are common: chang’aa (a spirit) and busaa (a maize beer). Local residents refer to the amount of brew consumed by the amount of money spent, suggesting a culturally relevant estimation method. The purposes of this study were to analyze the ethanol content of chang’aa and busaa, and to compare two methods of alcohol estimation: use by cost, and use by volume, the latter being the current international standard. Laboratory results showed mean ethanol content was 34% (SD = 14%) for chang’aa and 4% (SD = 1%) for busaa. Standard drink unit equivalents for chang’aa and busaa, respectively, were 2 and 1.3 (US) and 3.5 and 2.3 (Great Britain). Using a computational approach, both methods demonstrated comparable results. We conclude that cost estimation of alcohol content is more culturally relevant and does not differ in accuracy from the international standard. PMID:19015972
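The standard-drink conversion behind figures like those above can be sketched as follows. The serving volume used here is hypothetical (the abstract does not state serving sizes); the fixed inputs are the measured 34% ethanol content, ethanol's density of about 0.789 g/ml, and the conventional standard-drink sizes of 14 g of ethanol (US) and 8 g per unit (Great Britain).

```python
ETHANOL_DENSITY_G_PER_ML = 0.789
STANDARD_DRINK_G = {"US": 14.0, "GB": 8.0}   # grams of ethanol per drink/unit

def standard_drinks(volume_ml, abv, country):
    """Convert a serving to standard-drink equivalents:
    grams of ethanol = volume x ABV x density, divided by the
    country's standard-drink size."""
    grams = volume_ml * abv * ETHANOL_DENSITY_G_PER_ML
    return grams / STANDARD_DRINK_G[country]

# Hypothetical 100 ml serving of chang'aa at the measured 34% ethanol:
us = standard_drinks(100, 0.34, "US")   # about 1.9 US drinks
gb = standard_drinks(100, 0.34, "GB")   # about 3.4 GB units
```

Dividing the same gram total by different national drink sizes is why the US and Great Britain equivalents in the abstract differ while describing the same beverage.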
Franc, M A; Cohen, N; Warner, A W; Shaw, P M; Groenen, P; Snapir, A
2011-04-01
DNA samples collected in clinical trials and stored for future research are valuable to pharmaceutical drug development. Given the perceived higher risk associated with genetic research, industry has implemented complex coding methods for DNA. Following years of experience with these methods and with addressing questions from institutional review boards (IRBs), ethics committees (ECs) and health authorities, the industry has started reexamining the extent of the added value offered by these methods. With the goal of harmonization, the Industry Pharmacogenomics Working Group (I-PWG) conducted a survey to gain an understanding of company practices for DNA coding and to solicit opinions on their effectiveness at protecting privacy. The results of the survey and the limitations of the coding methods are described. The I-PWG recommends dialogue with key stakeholders regarding coding practices such that equal standards are applied to DNA and non-DNA samples. The I-PWG believes that industry standards for privacy protection should provide adequate safeguards for DNA and non-DNA samples/data and suggests a need for more universal standards for samples stored for future research.
Theoretical Study of Watershed Eco-Compensation Standards
NASA Astrophysics Data System (ADS)
Yan, Dandan; Fu, Yicheng; Liu, Biu; Sha, Jinxia
2018-01-01
Watershed eco-compensation is an effective way to solve conflicts over water allocation and ecological destruction problems in the exploitation of water resources. Despite increasing interest in the topic, previous research has neglected the effect of water quality and has lacked a systematic calculation method. In this study we reviewed and analyzed the current literature and proposed a theoretical framework to improve the calculation of eco-compensation standards. Considering the perspectives of river, forest and wetland ecosystems, the benefit compensation standard was determined from the input-output relationship. Based on the opportunity costs related to limiting development and on water conservation losses, the eco-compensation standard was calculated. To eliminate defects in the implementation of eco-compensation, suggestions were proposed for improving the calculation and implementation of the compensation standard.
Kroll, Mark W; Panescu, Dorin; Hinz, Andrew F; Lakkireddy, Dhanunjaya
2010-01-01
It has long been recognized that there are two methods for inducing VF (ventricular fibrillation) with electrical currents. These are: (1) delivering a high-charge shock into the cardiac T-wave, and (2) delivering lower-level currents for 1-5 seconds. Present electrical safety standards are based on this understanding. We present new data showing a third mechanism of inducing VF, which involves delivering sufficient current to cause high-rate cardiac capture; this collapses cardiac output, leading to ischemia of sufficient duration to lower the VFT (VF threshold) to the level of the applied current, which finally results in VF. This mechanism requires about 40% of the normal VF-induction current, but a duration of minutes instead of seconds for VF to be induced. Anesthetized and ventilated swine (n=6) had current delivered from a probe tip 10 mm from the epicardium, sufficient to cause hypotensive capture but not to directly induce VF within 5 s. After a median time of 90 s, VF was induced. This third mechanism of VF induction should be studied further and considered for electrical safety standards, and is relevant to long-duration TASER Electronic Control Device applications.
PKC regulates capsaicin-induced currents of dorsal root ganglion neurons in rats.
Zhou, Y; Zhou, Z S; Zhao, Z Q
2001-10-01
Capsaicin activates a non-specific cation conductance in a subset of dorsal root ganglion (DRG) neurons. The inward current and membrane potential of acutely isolated DRG neurons were examined using whole-cell patch recording methods. We report here that the current and voltage responses activated by capsaicin were markedly increased by phorbol 12-myristate 13-acetate (PMA), an activator of protein kinase C (PKC). The mean current, after application of 0.3 microM PMA, was 153.5+/-5.7% of control (n=32) in Ca(2+)-free external solution and 181.6+/-6.8% of control (n=15) in standard external solution. Under current-clamp conditions, 0.3 microM PMA facilitated capsaicin-induced depolarization and action potential generation. Bisindolylmaleimide I (BIM), a specific inhibitor of PKC activity, abolished the effect of PMA. In addition, capsaicin-evoked current was attenuated to 68.3+/-5.0% of control (n=13) by individual administration of 1 microM BIM in standard external solution, while 0.3 microM BIM did not have this effect. These data suggest that PKC can directly regulate the capsaicin response in DRG neurons, which could increase nociceptive sensory transmission and contribute to hyperalgesia.
Crapanzano, John P; Heymann, Jonas J; Monaco, Sara; Nassar, Aziza; Saqi, Anjali
2014-01-01
In the recent past, algorithms and recommendations to standardize the morphological, immunohistochemical and molecular classification of lung cancers on cytology specimens have been proposed, and several organizations have recommended cell blocks (CBs) as the preferred modality for molecular testing. Based on the literature, there are several different techniques available for CB preparation, suggesting that there is no standard. The aim of this study was to conduct a survey of CB preparation techniques utilized in various practice settings and analyze current issues, if any. A single E-mail with a link to an electronic survey was distributed to members of the American Society of Cytopathology and other pathologists. Questions pertaining to the participants' practice setting and CBs (volume, method, quality and satisfaction) were included. Of 95 respondents, 90/95 (94%) completed the survey and comprise the study group. Most participants practice in a community hospital/private practice (44%) or academic center (41%). On average, 14 CBs (range 0-50; median 10) are prepared by a laboratory daily. Over 10 methods are utilized: plasma thrombin (33%), HistoGel (27%), Cellient automated cell block system (8%) and others (31%), respectively. Forty of 90 (44%) respondents are either unsatisfied or only sometimes satisfied with their CB quality, with low cellular yield being the leading cause of dissatisfaction. There was no statistical significance between the three most common CB preparation methods and satisfaction with quality. Many are dissatisfied with their current method of CB preparation, and there is no consistent method to prepare CBs. In today's era of personalized medicine, with an increasing array of molecular tests being applied to cytological specimens, there is a need for a standardized protocol for CB optimization to enhance cellularity.
[Research strategies in standard decoction of medicinal slices].
Chen, Shi-Lin; Liu, An; Li, Qi; Toru, Sugita; Zhu, Guang-Wei; Sun, Yi; Dai, Yun-Tao; Zhang, Jun; Zhang, Tie-Jun; Takehisa, Tomoda; Liu, Chang-Xiao
2016-04-01
This paper discusses the research situation of the standard decoction of medicinal slices at home and abroad. Combined with the experimental data, the author proposes that the standard decoction of medicinal slices be made from a single herb using a standardized process, guided by the theory of traditional Chinese medicine, based on clinical practice, and carried out with reference to modern extraction methods. The author also proposes principles for establishing specifications of process parameters and quality standards, and for establishing the basis of efficacy-related material and biological references. As a standard material and standard system, the standard decoction of medicinal slices can provide standards for clinical medication and standardize the use of new types of medicinal slices, especially the dispensing granules widely used in clinical practice. It can ensure the accuracy of drugs and consistency of dose, and solve current supervision difficulties. Moreover, the study of standard decoctions of medicinal slices will provide a useful reference for research on dispensing granules, standard decoctions of traditional Chinese medicine prescriptions, and standard decoctions of couplet medicines. Copyright© by the Chinese Pharmaceutical Association.
International Standards for Genomes, Transcriptomes, and Metagenomes
Mason, Christopher E.; Afshinnekoo, Ebrahim; Tighe, Scott; Wu, Shixiu; Levy, Shawn
2017-01-01
Challenges and biases in preparing, characterizing, and sequencing DNA and RNA can have significant impacts on research in genomics across all kingdoms of life, including experiments in single-cells, RNA profiling, and metagenomics (across multiple genomes). Technical artifacts and contamination can arise at each point of sample manipulation, extraction, sequencing, and analysis. Thus, the measurement and benchmarking of these potential sources of error are of paramount importance as next-generation sequencing (NGS) projects become more global and ubiquitous. Fortunately, a variety of methods, standards, and technologies have recently emerged that improve measurements in genomics and sequencing, from the initial input material to the computational pipelines that process and annotate the data. Here we review current standards and their applications in genomics, including whole genomes, transcriptomes, mixed genomic samples (metagenomes), and the modified bases within each (epigenomes and epitranscriptomes). These standards, tools, and metrics are critical for quantifying the accuracy of NGS methods, which will be essential for robust approaches in clinical genomics and precision medicine. PMID:28337071
2010-08-01
available). It is assumed after this method is formally published that various standard vendors will offer other sources than the current single standard... single isomer. D Alkyl PAHs used to determine the SPME-GC/MS relative response factors including alkyl naphthalenes (1-methyl-, 2-methyl-, 1,2...Flag all compound results in the sample which were estimated above the upper calibration level with an “E” qualifier. 15. Precision and Bias 15.1 Single
Evaluation of Acoustic Doppler Current Profiler measurements of river discharge
Morlock, S.E.
1996-01-01
The standard deviations of the ADCP measurements ranged from approximately 1 to 6 percent and were generally higher than the measurement errors predicted by error-propagation analysis of ADCP instrument performance. These error-prediction methods assume that the largest component of ADCP discharge measurement error is instrument related. The larger standard deviations indicate that substantial portions of measurement error may be attributable to sources unrelated to ADCP electronics or signal processing and are functions of the field environment.
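The gap described here, between the error predicted from instrument performance and the larger observed standard deviations, can be illustrated with a standard root-sum-square error budget. A minimal sketch, assuming independent error sources expressed in percent of discharge; the component values and function names are hypothetical, not from the study:

```python
import math

def propagated_error(*component_errors_pct):
    """Combine independent error sources in quadrature (root-sum-square)."""
    return math.sqrt(sum(e * e for e in component_errors_pct))

# Hypothetical instrument-related error components, in percent of discharge
# (e.g. velocity, depth, and heading contributions):
instrument_error = propagated_error(0.8, 0.5, 0.4)

# If the observed measurement standard deviation exceeds the instrument-only
# prediction, the quadrature remainder is attributable to non-instrument
# (field environment) sources, as the abstract suggests:
observed_sd = 3.0  # percent
field_component = math.sqrt(max(observed_sd**2 - instrument_error**2, 0.0))
```
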
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faye, Sherry A.; Richards, Jason M.; Gallardo, Athena M.
Sequential extraction is a useful technique for assessing the potential to leach actinides from soils; however, current literature lacks uniformity in experimental details, making direct comparison of results impossible. This work continued development toward a standardized five-step sequential extraction protocol by analyzing extraction behaviors of 232Th, 238U, 239,240Pu and 241Am from lake and ocean sediment reference materials. Results produced a standardized procedure after creating more defined reaction conditions to improve method repeatability. A NaOH fusion procedure is recommended following sequential leaching for the complete dissolution of insoluble species.
Exchange of Standardized Flight Dynamics Data
NASA Technical Reports Server (NTRS)
Martin-Mur, Tomas J.; Berry, David; Flores-Amaya, Felipe; Folliard, J.; Kiehling, R.; Ogawa, M.; Pallaschke, S.
2004-01-01
Spacecraft operations require the knowledge of the vehicle trajectory and attitude and also that of other spacecraft or natural bodies. This knowledge is normally provided by the Flight Dynamics teams of the different space organizations and, as spacecraft operations very often involve more than one organization, this information needs to be exchanged between Agencies. This is why the Navigation Working Group has been instituted within the CCSDS (Consultative Committee for Space Data Systems) with the task of establishing standards for the exchange of Flight Dynamics data. This exchange encompasses trajectory data, attitude data, and tracking data. The Navigation Working Group includes regular members and observers representing the participating Space Agencies. Currently the group includes representatives from CNES, DLR, ESA, NASA and JAXA. This Working Group meets twice per year in order to devise standardized language, methods, and formats for the description and exchange of Navigation data. Early versions of some of these standards have been used to support mutual tracking of ESA and NASA interplanetary spacecraft, especially during the arrival of the 2003 missions to Mars. This paper provides a summary of the activities carried out by the group, briefly outlines the current and envisioned standards, describes the tests and operational activities that have been performed using the standards, and lists and discusses the lessons learned from these activities.
Path Planning Method in Multi-obstacle Marine Environment
NASA Astrophysics Data System (ADS)
Zhang, Jinpeng; Sun, Hanxv
2017-12-01
In this paper, an improved particle swarm optimization algorithm is proposed for underwater robot path planning in complex marine environments. The planner considers not only obstacle avoidance but also the direction and magnitude of the current and their effect on the robot's dynamic performance. The algorithm uses a trunk binary tree structure to construct the path search space, and A* heuristic search is applied within that space to find a baseline path for evaluation. The particle swarm algorithm then optimizes the path by adjusting the evaluation function, which makes the underwater robot easier to control when navigating in a current and reduces its energy consumption.
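The paper's planner first finds a baseline path with A* and then refines it with particle swarm optimization. As a rough illustration of the first stage only, here is a minimal A* search on a 4-connected occupancy grid with an optional per-cell current penalty; the grid representation and the `current_cost` hook are assumptions for illustration, not the paper's trunk-binary-tree search space:

```python
import heapq

def a_star(grid, start, goal, current_cost=None):
    """Minimal A* on a 4-connected grid. grid[r][c] == 1 marks an obstacle.
    current_cost(r, c), if given, adds a penalty for entering cell (r, c),
    e.g. for moving against the current."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, [start])]  # (f, g, node, path)
    seen = set()
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc]:
                step = 1 + (current_cost(nr, nc) if current_cost else 0)
                heapq.heappush(open_set, (g + step + h((nr, nc)), g + step,
                                          (nr, nc), path + [(nr, nc)]))
    return None  # no path exists
```
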
NASA Technical Reports Server (NTRS)
O'Brien, T. Kevin; Johnston, William M.; Toland, Gregory J.
2010-01-01
Mode II interlaminar fracture toughness and delamination onset and growth characterization data were generated for IM7/8552 graphite epoxy composite materials from two suppliers for use in fracture mechanics analyses. Both the fracture toughness testing and the fatigue testing were conducted using the End-notched Flexure (ENF) test. The ENF test for mode II fracture toughness is currently under review by ASTM as a potential standard test method. This current draft ASTM protocol was used as a guide to conduct the tests on the IM7/8552 material. This report summarizes the test approach, methods, procedures and results of this characterization effort.
Pérez, Teresa; Makrestsov, Nikita; Garatt, John; Torlakovic, Emina; Gilks, C Blake; Mallett, Susan
The Canadian Immunohistochemistry Quality Control program monitors clinical laboratory performance for estrogen receptor and progesterone receptor tests used in breast cancer treatment management in Canada. Current methods assess sensitivity and specificity at each time point, compared with a reference standard. We investigate alternative performance analysis methods to enhance the quality assessment. We used 3 methods of analysis: meta-analysis of sensitivity and specificity of each laboratory across all time points; sensitivity and specificity at each time point for each laboratory; and fitting models for repeated measurements to examine differences between laboratories adjusted by test and time point. Results show 88 laboratories participated in quality control at up to 13 time points using typically 37 to 54 histology samples. In meta-analysis across all time points no laboratories have sensitivity or specificity below 80%. Current methods, presenting sensitivity and specificity separately for each run, result in wide 95% confidence intervals, typically spanning 15% to 30%. Models of a single diagnostic outcome demonstrated that 82% to 100% of laboratories had no difference to reference standard for estrogen receptor and 75% to 100% for progesterone receptor, with the exception of 1 progesterone receptor run. Laboratories with significant differences to reference standard identified with Generalized Estimating Equation modeling also have reduced performance by meta-analysis across all time points. The Canadian Immunohistochemistry Quality Control program has a good design, and with this modeling approach has sufficient precision to measure performance at each time point and allow laboratories with a significantly lower performance to be targeted for advice.
Kondori, N; Svensson, E; Mattsby-Baltzer, I
2011-09-01
The use of anti-fungal agents has increased dramatically in recent years and new drugs have been developed. Several methods are available for determining their specific biological activities; for example, the standard method for minimum inhibitory concentration (MIC) determination is described in Clinical and Laboratory Standards Institute document M-38 (CLSI M-38). However, alternative methods, such as the E-test, are currently available in mycology laboratories. The susceptibilities of clinical isolates of Aspergillus spp. (n = 29), Fusarium spp. (n = 5), zygomycetes (n = 21) and Schizophyllum (n = 1) were determined for itraconazole, voriconazole and posaconazole, using the CLSI M-38-A broth dilution method and also by the E-test. A good overall agreement (83.7%) between the two methods for all drugs and organisms was observed. Analyses of voriconazole showed a better agreement between the methods (93%) than posaconazole and itraconazole (85% and 74%, respectively). Aspergillus spp. were the most susceptible fungi to the anti-fungal agents tested in this study. Posaconazole was the most active drug against filamentous fungi in vitro, followed by itraconazole and voriconazole. Voriconazole demonstrated no significant in vitro activity against zygomycetes. © 2010 Blackwell Verlag GmbH.
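Agreement between an E-test and the CLSI broth-dilution reference is conventionally scored as "essential agreement": MICs within a fixed number of two-fold dilutions (commonly ±2). A hedged sketch of that tally; the MIC pairs below are hypothetical examples, not the study's data:

```python
import math

def essential_agreement(mic_ref, mic_test, dilutions=2):
    """True if the two MICs differ by at most `dilutions` two-fold steps."""
    return abs(math.log2(mic_test) - math.log2(mic_ref)) <= dilutions

def percent_agreement(pairs, dilutions=2):
    """Percentage of (reference, test) MIC pairs in essential agreement."""
    hits = sum(essential_agreement(r, t, dilutions) for r, t in pairs)
    return 100.0 * hits / len(pairs)

# Hypothetical CLSI broth-dilution vs E-test MIC pairs (mg/L):
pairs = [(0.25, 0.5), (1.0, 1.0), (2.0, 16.0), (0.5, 0.125), (4.0, 8.0)]
```
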
Update to the USDA-ARS fixed-wing spray nozzle models
USDA-ARS?s Scientific Manuscript database
The current USDA ARS Aerial Spray Nozzle Models were updated to reflect both new standardized measurement methods and systems, as well as, to increase operational spray pressure, aircraft airspeed and nozzle orientation angle limits. The new models were developed using both Central Composite Design...
78 FR 30810 - Paleontological Resources Preservation
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-23
... fossil resource management on public lands (see S. Rep. 105- 227, at 60 (1998)). The request directed the... of fossils; (2) the need for standards that would maximize the availability of fossils for scientific study; and (3) the effectiveness of current methods for storing and preserving fossils collected from...
19 CFR 146.95 - Methods of attribution.
Code of Federal Regulations, 2010 CFR
2010-04-01
... attribution. (a) Producibility—(1) General. A subzone operator must attribute the source of each final product... current or prior manufacturing period. Attribution of final products is allowable to the extent that the quantity of such products could have been produced from such feedstocks, using the industry standards of...
FRACTIONAL AEROSOL FILTRATION EFFICIENCY OF IN-DUCT VENTILATION AIR CLEANERS
The filtration efficiency of ventilation air cleaners is highly particle-size dependent over the 0.01 to 3 μm diameter size range. Current standardized test methods, which determine only overall efficiencies for ambient aerosol or other test aerosols, provide data of limited util...
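The size dependence emphasized above is captured by computing the efficiency per particle-size bin rather than a single overall value: eta(d) = 1 − C_down(d)/C_up(d). A minimal sketch with hypothetical concentrations, shaped to mimic the typical efficiency minimum near the most-penetrating particle size:

```python
def fractional_efficiency(upstream, downstream):
    """Size-resolved filtration efficiency: eta(d) = 1 - C_down(d) / C_up(d).
    Inputs map particle-diameter bins (um) to number concentrations."""
    return {d: 1.0 - downstream[d] / upstream[d] for d in upstream}

# Hypothetical counts per size bin (particles/cm^3), illustrating strong
# size dependence with a minimum efficiency in the mid-size bin:
up = {0.03: 1000.0, 0.3: 1000.0, 3.0: 1000.0}
down = {0.03: 150.0, 0.3: 600.0, 3.0: 50.0}
eta = fractional_efficiency(up, down)
```
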
Evaluation of standardized sample collection, packaging, and ...
Journal Sample collection procedures and primary receptacle (sample container and bag) decontamination methods should prevent contaminant transfer between contaminated and non-contaminated surfaces and areas during bio-incident operations. Cross-contamination of personnel, equipment, or sample containers may result in the exfiltration of biological agent from the exclusion (hot) zone and have unintended negative consequences on response resources, activities and outcomes. The current study was designed to: (1) evaluate currently recommended sample collection and packaging procedures to identify procedural steps that may increase the likelihood of spore exfiltration or contaminant transfer; (2) evaluate the efficacy of currently recommended primary receptacle decontamination procedures; and (3) evaluate the efficacy of outer packaging decontamination methods. Wet- and dry-deposited fluorescent tracer powder was used in contaminant transfer tests to qualitatively evaluate the currently-recommended sample collection procedures. Bacillus atrophaeus spores, a surrogate for Bacillus anthracis, were used to evaluate the efficacy of spray- and wipe-based decontamination procedures.
Characterization of YBa2Cu3O7, including critical current density Jc, by trapped magnetic field
NASA Technical Reports Server (NTRS)
Chen, In-Gann; Liu, Jianxiong; Weinstein, Roy; Lau, Kwong
1992-01-01
Spatial distributions of persistent magnetic field trapped by sintered and melt-textured ceramic-type high-temperature superconductor (HTS) samples have been studied. The trapped field can be reproduced by a model of the current consisting of two components: (1) a surface current Js and (2) a uniform volume current Jv. This Js + Jv model gives a satisfactory account of the spatial distribution of the magnetic field trapped by different types of HTS samples. The magnetic moment can be calculated, based on the Js + Jv model, and the result agrees well with that measured by standard vibrating sample magnetometer (VSM). As a consequence, Jc predicted by VSM methods agrees with Jc predicted from the Js + Jv model. The field mapping method described is also useful to reveal the granular structure of large HTS samples and regions of weak links.
Fuels characterization studies. [jet fuels
NASA Technical Reports Server (NTRS)
Seng, G. T.; Antoine, A. C.; Flores, F. J.
1980-01-01
Current analytical techniques used in the characterization of broadened properties fuels are briefly described. Included are liquid chromatography, gas chromatography, and nuclear magnetic resonance spectroscopy. High performance liquid chromatographic ground-type methods development is being approached from several directions, including aromatic fraction standards development and the elimination of standards through removal or partial removal of the alkene and aromatic fractions or through the use of whole fuel refractive index values. More sensitive methods for alkene determinations using an ultraviolet-visible detector are also being pursued. Some of the more successful gas chromatographic physical property determinations for petroleum derived fuels are the distillation curve (simulated distillation), heat of combustion, hydrogen content, API gravity, viscosity, flash point, and (to a lesser extent) freezing point.
Wang, Jingzhu; Yang, Rui; Yang, Wenning; Liu, Xin; Xing, Yanyi; Xu, Youxuan
2014-12-10
Isotope ratio mass spectrometry (IRMS) is applied to confirm testosterone (T) abuse by determining carbon isotope ratios (δ(13)C values). However, (13)C labeled standards can be used to control the δ(13)C value and produce manipulated T which cannot be detected by the current method. A method was explored to remove the (13)C labeled atom at C-3 from the molecule of androsterone (Andro), the metabolite of T in urine, to produce the resultant compound (A-nor-5α-androstane-2,17-dione, ANAD). The difference in δ(13)C values between Andro and ANAD (Δδ(13)CAndro-ANAD, ‰) would change significantly if manipulated T is abused. Twenty-one volunteers were administered T manipulated with different (13)C labeled standards. The collected urine samples were analyzed with the established method, and the maximum value of Δδ(13)CAndro-ANAD post ingestion ranged from 3.0‰ to 8.8‰. Based on the population reference, a cut-off value of Δδ(13)CAndro-ANAD of 1.2‰ was suggested for a positive result. The developed method could be used to detect T manipulated with 3-(13)C labeled standards. Copyright © 2014 Elsevier B.V. All rights reserved.
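The δ(13)C notation used here expresses a 13C/12C ratio relative to the VPDB reference standard, in per mil, and the screening logic reduces to comparing the Andro minus ANAD difference against the suggested 1.2‰ cut-off. A sketch; the numerical examples are hypothetical, and only the VPDB ratio and the cut-off come from standard usage and the abstract:

```python
R_VPDB = 0.0111802  # 13C/12C ratio of the VPDB reference standard

def delta13C(r_sample):
    """delta-13C in per mil relative to VPDB: (R_sample / R_std - 1) * 1000."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

def manipulated(delta_andro, delta_anad, cutoff=1.2):
    """Flag a sample when the Andro-minus-ANAD difference exceeds the
    suggested 1.2 per-mil cut-off. A 3-(13)C label enriches Andro but not
    ANAD, since ANAD has had the labelled C-3 atom removed."""
    return (delta_andro - delta_anad) > cutoff
```
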
Rocha, José Celso; Passalia, Felipe José; Matos, Felipe Delestro; Takahashi, Maria Beatriz; Ciniciato, Diego de Souza; Maserati, Marc Peter; Alves, Mayra Fernanda; de Almeida, Tamie Guibu; Cardoso, Bruna Lopes; Basso, Andrea Cristina; Nogueira, Marcelo Fábio Gouveia
2017-08-09
Morphological analysis is the standard method of assessing embryo quality; however, its inherent subjectivity tends to generate discrepancies among evaluators. Using genetic algorithms and artificial neural networks (ANNs), we developed a new method for embryo analysis that is more robust and reliable than standard methods. Bovine blastocysts produced in vitro were classified as grade 1 (excellent or good), 2 (fair), or 3 (poor) by three experienced embryologists according to the International Embryo Technology Society (IETS) standard. The images (n = 482) were subjected to automatic feature extraction, and the results were used as input for a supervised learning process. One part of the dataset (15%) was used for a blind test posterior to the fitting, for which the system had an accuracy of 76.4%. Interestingly, when the same embryologists evaluated a sub-sample (10%) of the dataset, there was only 54.0% agreement with the standard (mode for grades). However, when using the ANN to assess this sub-sample, there was 87.5% agreement with the modal values obtained by the evaluators. The presented methodology is covered by National Institute of Industrial Property (INPI) and World Intellectual Property Organization (WIPO) patents and is currently undergoing a commercial evaluation of its feasibility.
Validation of a standardized extraction method for formalin-fixed paraffin-embedded tissue samples.
Lagheden, Camilla; Eklund, Carina; Kleppe, Sara Nordqvist; Unger, Elizabeth R; Dillner, Joakim; Sundström, Karin
2016-07-01
Formalin-fixed paraffin-embedded (FFPE) samples can be DNA-extracted and used for human papillomavirus (HPV) genotyping. The xylene-based gold standard for extracting FFPE samples is laborious, suboptimal and involves health hazards for the personnel involved. Our objective was to compare the standard xylene extraction method with a xylene-free method used in an HPV LabNet Global Reference Laboratory at the Centers for Disease Control (CDC), based on a commercial method with an extra heating step. Fifty FFPE samples were randomly selected from a national audit of all cervical cancer cases diagnosed in Sweden during 10 years. For each case-block, a blank-block was sectioned as a control for contamination. For xylene extraction, the standard WHO Laboratory Manual protocol was used. For the CDC method, the manufacturer's protocol was followed except for an extra heating step, 120°C for 20 min. Samples were extracted and tested in parallel with β-globin real-time PCR, HPV16 real-time PCR and HPV typing using modified general primers (MGP)-PCR and Luminex assays. For a valid result the blank-block had to be β-globin-negative in all tests and the case-block positive for β-globin. Overall, detection was improved with the heating method and the amount of HPV-positive samples increased from 70% to 86% (p=0.039). For all samples where HPV type concordance could be evaluated, there was 100% type concordance. A xylene-free and robust extraction method for HPV-DNA typing in FFPE material is currently in great demand. Our proposed standardized protocol appears to be generally useful. Copyright © 2016. Published by Elsevier B.V.
It's no debate, debates are great.
Dy-Boarman, Eliza A; Nisly, Sarah A; Costello, Tracy J
A debate can be a pedagogical method used to instill essential functions in pharmacy students. This non-traditional teaching method may help to further develop a number of skills that are highlighted in the current Accreditation Council for Pharmacy Education Standards 2016 and Center for the Advancement of Pharmacy Education Educational Outcomes 2013. Debates have also been used as an educational tool in other health disciplines. Current pharmacy literature does illustrate the use of debates in various areas within the pharmacy curriculum in both required and elective courses; however, the current body of literature would suggest that debates are an underutilized teaching tool in pharmacy experiential education. With all potential benefits of debates as a teaching tool, pharmacy experiential preceptors should further explore their use in the experiential setting. Copyright © 2017 Elsevier Inc. All rights reserved.
Wolf, Heinz; Stauffer, Tony; Chen, Shu-Chen Y; Lee, Yoojin; Forster, Ronald; Ludzinski, Miron; Kamat, Madhav; Mulhall, Brian; Guazzo, Dana Morton
2009-01-01
Part 1 of this series demonstrated that a container closure integrity test performed according to ASTM F2338-09 Standard Test Method for Nondestructive Detection of Leaks in Packages by Vacuum Decay Method using a VeriPac 325/LV vacuum decay leak tester by Packaging Technologies & Inspection, LLC (PTI) is capable of detecting leaks > or = 5.0 microm (nominal diameter) in rigid, nonporous package systems, such as prefilled glass syringes. The current study compared USP, Ph.Eur. and ISO dye ingress integrity test methods to PTI's vacuum decay technology for the detection of these same 5-, 10-, and 15-microm laser-drilled hole defects in 1-mL glass prefilled syringes. The study was performed at three test sites using several inspectors and a variety of inspection conditions. No standard dye ingress method was found to reliably identify all holed syringes. Modifications to these standard dye tests' challenge conditions increased the potential for dye ingress, and adjustments to the visual inspection environment improved dye ingress detection. However, the risk of false positive test results with dye ingress tests remained. In contrast, the nondestructive vacuum decay leak test method reliably identified syringes with holes > or = 5.0 microm.
Non-invasive prediction of forthcoming cirrhosis-related complications
Kang, Wonseok; Kim, Seung Up; Ahn, Sang Hoon
2014-01-01
In patients with chronic liver diseases, identification of significant liver fibrosis and cirrhosis is essential for determining treatment strategies, assessing therapeutic response, and stratifying long-term prognosis. Although liver biopsy remains the reference standard for evaluating the extent of liver fibrosis in patients with chronic liver diseases, several non-invasive methods have been developed as alternatives to liver biopsies. Some of these non-invasive methods have demonstrated clinical accuracy for diagnosing significant fibrosis or cirrhosis in many cross-sectional studies with the histological fibrosis stage as a reference standard. However, non-invasive methods cannot be fully validated through cross-sectional studies since liver biopsy is not a perfect surrogate endpoint marker. Accordingly, recent studies have focused on assessing the performance of non-invasive methods through long-term, longitudinal, follow-up studies with solid clinical endpoints related to advanced stages of liver fibrosis and cirrhosis. As a result, the current view is that these alternative methods can independently predict future cirrhosis-related complications, such as hepatic decompensation, liver failure, hepatocellular carcinoma, or liver-related death. The clinical role of non-invasive models seems to be shifting from a simple tool for predicting the extent of fibrosis to a surveillance tool for predicting future liver-related events. In this article, we will summarize recent longitudinal studies of non-invasive methods for predicting forthcoming complications related to liver cirrhosis and discuss the clinical value of currently available non-invasive methods based on evidence from the literature. PMID:24627597
Duff, Kevin
2012-01-01
Repeated assessments are a relatively common occurrence in clinical neuropsychology. The current paper will review some of the relevant concepts (e.g., reliability, practice effects, alternate forms) and methods (e.g., reliable change index, standardized regression-based change formulas) that are used in repeated neuropsychological evaluations. The focus will be on the understanding and application of these concepts and methods in the evaluation of the individual patient through examples. Finally, some future directions for assessing change will be described. PMID:22382384
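One of the methods named, the reliable change index, has a compact closed form (the Jacobson-Truax formulation): the retest difference divided by the standard error of the difference, derived from the baseline SD and the test-retest reliability. A minimal sketch with hypothetical inputs:

```python
import math

def reliable_change_index(x1, x2, sd_baseline, r_xx):
    """Jacobson-Truax reliable change index: (x2 - x1) / S_diff, where
    SEM = SD * sqrt(1 - r_xx) and S_diff = sqrt(2 * SEM^2).
    |RCI| > 1.96 suggests change beyond measurement error (alpha = .05)."""
    sem = sd_baseline * math.sqrt(1.0 - r_xx)
    s_diff = math.sqrt(2.0 * sem * sem)
    return (x2 - x1) / s_diff

# Hypothetical example: test-retest reliability .90, baseline SD 10,
# scores of 100 at baseline and 112 at retest:
rci = reliable_change_index(100.0, 112.0, 10.0, 0.90)
```

Note this simple form does not correct for practice effects; practice-adjusted and standardized regression-based variants exist for that purpose.
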
Targeted methods for quantitative analysis of protein glycosylation
Goldman, Radoslav; Sanda, Miloslav
2018-01-01
Quantification of proteins by LC-MS/MS-MRM has become a standard method with broad projected clinical applicability. MRM quantification of protein modifications is, however, far less utilized, especially in the case of glycoproteins. This review summarizes current methods for quantitative analysis of protein glycosylation with a focus on MRM methods. We describe advantages of this quantitative approach, analytical parameters that need to be optimized to achieve reliable measurements, and point out the limitations. Differences between major classes of N- and O-glycopeptides are described and class-specific glycopeptide assays are demonstrated. PMID:25522218
Analysis of drugs in human tissues by supercritical fluid extraction/immunoassay
NASA Astrophysics Data System (ADS)
Furton, Kenneth G.; Sabucedo, Alberta; Rein, Joseph; Hearn, W. L.
1997-02-01
A rapid, readily automated method has been developed for the quantitative analysis of phenobarbital from human liver tissues based on supercritical carbon dioxide extraction followed by fluorescence enzyme immunoassay. The method developed significantly reduces sample handling and utilizes the entire liver homogenate. The current method yields comparable recoveries and precision and does not require the use of an internal standard, although traditional GC/MS confirmation can still be performed on sample extracts. Additionally, the proposed method uses non-toxic, inexpensive carbon dioxide, thus eliminating the use of halogenated organic solvents.
Hysteroscopic Sterilization: History and Current Methods
Greenberg, James A
2008-01-01
For many practicing obstetrician-gynecologists, tubal ligation was the gold standard by which female sterilization techniques were measured. Yet gynecologic surgeons have simultaneously sought to occlude the fallopian tubes transcervically to avoid discomfort and complications associated with transabdominal approaches. In this review, the history of transcervical sterilization is discussed. Past, current, and upcoming techniques are reviewed. This article focuses on interval sterilization techniques, thus removing post-vaginal and post-cesarean delivery tubal ligations from the discussion. PMID:19015762
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schultz, Courtney A., E-mail: courtney.schultz@colostate.edu
Cumulative effects analysis (CEA) allows natural resource managers to understand the status of resources in historical context, learn from past management actions, and adapt future activities accordingly. U.S. federal agencies are required to complete CEA as part of environmental impact assessment under the National Environmental Policy Act (NEPA). Past research on CEA as part of NEPA has identified significant deficiencies in CEA practice, suggested methodologies for handling difficult aspects of CEA, and analyzed the rise in litigation over CEA in U.S. courts. This article provides a review of the literature and legal standards related to CEA as it is done under NEPA and then examines current practice on a U.S. National Forest, utilizing qualitative methods in order to provide a detailed understanding of current approaches to CEA. Research objectives were to understand current practice, investigate ongoing challenges, and identify impediments to improvement. Methods included a systematic review of a set of NEPA documents and semi-structured interviews with practitioners, scientists, and members of the public. Findings indicate that the primary challenges associated with CEA include: issues of both geographic and temporal scale of analysis, confusion over the purpose of the requirement, the lack of monitoring data, and problems coordinating and disseminating data. Improved monitoring strategies and programmatic analyses could support improved CEA practice.
NASA Technical Reports Server (NTRS)
Vogt, R. A.
1979-01-01
The use of the mission planning and analysis division (MPAD) common format trajectory data tape to predict temperatures for preflight and postflight mission analysis is presented and evaluated. All of the analyses utilized the latest Space Transportation System 1 flight (STS-1) MPAD trajectory tape and the simplified 136-node midsection/payload bay thermal math model. For the first 6.7 hours of the STS-1 flight profile, transient temperatures are presented for selected nodal locations with both the current standard method and the trajectory tape method. Whether the differences are considered significant depends upon the viewpoint. Other transient temperature predictions are also presented. These results were obtained to investigate an initial concern that the predicted temperature differences between the two methods would be caused not only by the inaccuracies of the current method's assumed nominal attitude profile but also by an insufficient number of orbit points in the current method. Comparisons between 6, 12, and 24 orbit points showed a surprising insensitivity to the number of orbit points.
Meisamy, Sina; Hines, Catherine D G; Hamilton, Gavin; Sirlin, Claude B; McKenzie, Charles A; Yu, Huanzhou; Brittain, Jean H; Reeder, Scott B
2011-03-01
To prospectively compare an investigational version of a complex-based chemical shift-based fat fraction magnetic resonance (MR) imaging method with MR spectroscopy for the quantification of hepatic steatosis. This study was approved by the institutional review board and was HIPAA compliant. Written informed consent was obtained before all studies. Fifty-five patients (31 women, 24 men; age range, 24-71 years) were prospectively imaged at 1.5 T with quantitative MR imaging and single-voxel MR spectroscopy, each within a single breath hold. The effects of T2 correction, spectral modeling of fat, and magnitude fitting for eddy current correction on fat quantification with MR imaging were investigated by reconstructing fat fraction images from the same source data with different combinations of error correction. Single-voxel T2-corrected MR spectroscopy was used to measure fat fraction and served as the reference standard. All MR spectroscopy data were postprocessed at a separate institution by an MR physicist who was blinded to MR imaging results. Fat fractions measured with MR imaging and MR spectroscopy were compared statistically to determine the correlation (r(2)), and the slope and intercept as measures of agreement between MR imaging and MR spectroscopy fat fraction measurements, to determine whether MR imaging can help quantify fat, and examine the importance of T2 correction, spectral modeling of fat, and eddy current correction. Two-sided t tests (significance level, P = .05) were used to determine whether estimated slopes and intercepts were significantly different from 1.0 and 0.0, respectively. Sensitivity and specificity for the classification of clinically significant steatosis were evaluated. Overall, there was excellent correlation between MR imaging and MR spectroscopy for all reconstruction combinations. 
However, agreement was only achieved when T2 correction, spectral modeling of fat, and magnitude fitting for eddy current correction were used (r(2) = 0.99; slope ± standard deviation = 1.00 ± 0.01, P = .77; intercept ± standard deviation = 0.2% ± 0.1, P = .19). T1-independent chemical shift-based water-fat separation MR imaging methods can accurately quantify fat over the entire liver, by using MR spectroscopy as the reference standard, when T2 correction, spectral modeling of fat, and eddy current correction methods are used. © RSNA, 2011.
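The quantity compared between MR imaging and spectroscopy above is the signal fat fraction, F/(W+F), and the agreement analysis reduces to the slope and intercept of imaging versus spectroscopy values. A minimal sketch (pure-Python least squares; the example numbers in the test are hypothetical, not the study's data):

```python
def fat_fraction_percent(water_signal, fat_signal):
    """Signal fat fraction from separated water (W) and fat (F) signals:
    FF = F / (W + F) * 100. With T2* correction, spectral modeling of fat,
    and eddy current correction, this approaches the proton-density
    fat fraction."""
    return 100.0 * fat_signal / (water_signal + fat_signal)

def agreement_line(spectroscopy_ff, imaging_ff):
    """Ordinary least-squares slope and intercept of imaging vs spectroscopy
    fat fractions; perfect agreement gives slope 1.0 and intercept 0.0."""
    n = len(spectroscopy_ff)
    mx = sum(spectroscopy_ff) / n
    my = sum(imaging_ff) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(spectroscopy_ff, imaging_ff))
    sxx = sum((x - mx) ** 2 for x in spectroscopy_ff)
    slope = sxy / sxx
    return slope, my - slope * mx
```
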
Measurement of eddy-current distribution in the vacuum vessel of the Sino-UNIted Spherical Tokamak.
Li, G; Tan, Y; Liu, Y Q
2015-08-01
Eddy currents have an important effect on tokamak plasma equilibrium and on the control of magnetohydrodynamic activity. The vacuum vessel of the Sino-UNIted Spherical Tokamak is separated into two hemispherical sections by a toroidal insulating barrier. Consequently, the characteristics of eddy currents are more complex than those found in a standard tokamak. Thus, it is necessary to measure and analyze the eddy-current distribution. In this study, we propose an experimental method for measuring the eddy-current distribution in a vacuum vessel. By placing a flexible printed circuit board with magnetic probes onto the external surface of the vacuum vessel to measure the magnetic field parallel to the surface and then subtracting the magnetic field generated by the vertical-field coils, the magnetic field due to the eddy current can be obtained, and its distribution can be determined. We successfully applied this method to the Sino-UNIted Spherical Tokamak, and thus, we obtained the eddy-current distribution despite the presence of the magnetic field generated by the external coils.
Matrix effect and recovery terminology issues in regulated drug bioanalysis.
Huang, Yong; Shi, Robert; Gee, Winnie; Bonderud, Richard
2012-02-01
Understanding the meaning of the terms used in the bioanalytical method validation guidance is essential for practitioners to implement best practice. However, terms that have several meanings or that have different interpretations exist within bioanalysis, and this may give rise to differing practices. In this perspective we discuss an important but often confusing term - 'matrix effect (ME)' - in regulated drug bioanalysis. The ME can be interpreted as either the ionization change or the measurement bias of the method caused by the nonanalyte matrix. The ME definition dilemma makes its evaluation challenging. The matrix factor is currently used as a standard method for evaluation of ionization changes caused by the matrix in MS-based methods. Standard additions to pre-extraction samples have been suggested to evaluate the overall effects of a matrix from different sources on the analytical system, because it covers ionization variation and extraction recovery variation. We also provide our personal views on the term 'recovery'.
Simonella, Lucio E; Gaiero, Diego M; Palomeque, Miriam E
2014-10-01
Iron is an essential micronutrient for phytoplankton growth and is supplied to the remote areas of the ocean mainly through atmospheric dust/ash. The amount of soluble Fe in dust/ash is a major source of uncertainty in modeling Fe dissolution and deposition to the surface ocean. Currently in the literature, there exist almost as many different methods to estimate fractional solubility as researchers in the field, making it difficult to compare results between research groups. An additional important constraint when evaluating Fe solubility in atmospheric dust is the limited mass of sample, which is usually only available in microgram to milligram amounts. A continuous flow (CF) method that can be run with a low mass of sediment (<10 mg) was tested against a standard method that requires about 1 g of sediment (the BCR method of the European Union). To validate the CF experiment, we ran both methods on South American surface sediment and deposited volcanic ash. Both materials tested are easily eroded by wind and are representative of the atmospheric dust/ash exported from this region. The uncertainty of the CF method was obtained from seven replicates of one surface sediment sample and shows very good reproducibility. The replication was conducted on different days over a span of two years, and the uncertainty ranged between 8 and 22% (the uncertainty of the standard method was 6-19%). Compared to other standardized methods, the CF method allows studies of the dissolution kinetics of metals and consumes less reagent and time (<3 h). We suggest the method validated here as a standardized method for Fe solubility studies on dust/ash. Copyright © 2014 Elsevier B.V. All rights reserved.
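The reproducibility figures quoted above are relative standard deviations across replicate leaches of one sample. A minimal sketch of that calculation, with hypothetical Fe masses:

```python
import statistics

def fractional_solubility(soluble_fe_ug, total_fe_ug):
    """Fractional Fe solubility (%): soluble Fe released in the leach
    relative to total Fe in the sediment/ash aliquot."""
    return 100.0 * soluble_fe_ug / total_fe_ug

# Hypothetical replicate leaches of one surface-sediment sample.
soluble = [2.1, 2.3, 1.9, 2.2, 2.0, 2.4, 2.1]   # ug Fe released
total = 50.0                                     # ug Fe per aliquot

values = [fractional_solubility(s, total) for s in soluble]
# Reproducibility as relative standard deviation (%) of the replicates.
rsd = statistics.stdev(values) / statistics.mean(values) * 100
```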
Engineering Documentation and Data Control
NASA Technical Reports Server (NTRS)
Matteson, Michael J.; Bramley, Craig; Ciaruffoli, Veronica
2001-01-01
Mississippi Space Services (MSS), the facility services contractor for NASA's John C. Stennis Space Center (SSC), is utilizing technology to improve engineering documentation and data control. Two identified improvement areas, labor-intensive documentation research and outdated drafting standards, were targeted as top priority. MSS selected AutoManager(R) WorkFlow from Cyco software to manage engineering documentation. The software is currently installed on over 150 desktops. The outdated SSC drafting standard was written for pre-CADD drafting methods, in other words, board drafting. Implementation of COTS software solutions to manage engineering documentation and update the drafting standard resulted in significant increases in productivity by reducing the time spent searching for documents.
Verbal autopsy: current practices and challenges.
Soleman, Nadia; Chandramohan, Daniel; Shibuya, Kenji
2006-01-01
Cause-of-death data derived from verbal autopsy (VA) are increasingly used for health planning, priority setting, monitoring and evaluation in countries with incomplete or no vital registration systems. In some regions of the world it is the only method available to obtain estimates on the distribution of causes of death. Currently, the VA method is routinely used at over 35 sites, mainly in Africa and Asia. In this paper, we present an overview of the VA process and the results of a review of VA tools and operating procedures used at demographic surveillance sites and sample vital registration systems. We asked for information from 36 field sites about field-operating procedures and reviewed 18 verbal autopsy questionnaires and 10 cause-of-death lists used in 13 countries. The format and content of VA questionnaires, field-operating procedures, cause-of-death lists and the procedures to derive causes of death from VA process varied substantially among sites. We discuss the consequences of using varied methods and conclude that the VA tools and procedures must be standardized and reliable in order to make accurate national and international comparisons of VA data. We also highlight further steps needed in the development of a standard VA process. PMID:16583084
Ogata, Norio
2006-09-01
The strategy to eliminate hepatitis B virus (HBV) infection by administering an HB vaccine is changing worldwide; however, this is not the case in Japan. An important concern about the HBV infection-prevention strategy in Japan is that the assay methods for the antibody to hepatitis B surface antigen (anti-HBs) are not standardized. The minimum protective anti-HBs titer against HBV infection has been established worldwide as 10 mIU/ml by World Health Organization (WHO)-standardized assay methods, but in Japan protection is still judged by a "positive" result of the passive hemagglutination (PHA) method. We compared anti-HBs measurements in given samples among PHA (Mycell II, Institute of Immunology), chemiluminescent enzyme immunoassay (CLEIA) (Lumipulse, Fujirebio), and chemiluminescent immunoassay (CLIA) (Architect, Abbott), all of which are currently in wide use in Japan. First, anti-HBs measurements were compared in serum from individuals who received a yeast-derived recombinant HB vaccine composed of the major surface protein of either subtype adr or subtype ayw. The results clearly showed that in subtype adr vaccinees CLIA underestimated the anti-HBs amount compared with CLEIA and PHA, whereas in ayw vaccinees the discordance in the measurements among the three kits was not prominent. Second, anti-HBs measurements in the standard or calibration solutions of each assay kit were compared. Surprisingly, CLEIA gave higher measurements than CLIA in all three kit-associated standard or calibration solutions. Thus, the anti-HBs titer of 10 mIU/ml is difficult to introduce in Japan as the minimum protective level against HBV infection. Efforts to standardize anti-HBs assay methods are needed so that Japan can share the international evidence on the HBV infection-prevention strategy.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-21
...The U.S. Nuclear Regulatory Commission (NRC or Commission) is issuing a revision to regulatory guide (RG) 3.39, ``Standard Format and Content of License Applications for Mixed Oxide Fuel Fabrication Facilities.'' This guide endorses the standard format and content for license applications and integrated safety analysis (ISA) summaries described in the current version of NUREG-1718, ``Standard Review Plan for the Review of an Application for a Mixed Oxide (MOX) Fuel Fabrication Facility,'' as a method that the NRC staff finds acceptable for meeting the regulatory requirements of Title 10 of the Code of Federal Regulations (10 CFR) part 70, ``Domestic Licensing of Special Nuclear Material'' for mixed oxide fuel fabrication facilities.
The FBI compression standard for digitized fingerprint images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brislawn, C.M.; Bradley, J.N.; Onyshczak, R.J.
1996-10-01
The FBI has formulated national standards for digitization and compression of gray-scale fingerprint images. The compression algorithm for the digitized images is based on adaptive uniform scalar quantization of a discrete wavelet transform subband decomposition, a technique referred to as the wavelet/scalar quantization method. The algorithm produces archival-quality images at compression ratios of around 15 to 1 and will allow the current database of paper fingerprint cards to be replaced by digital imagery. A compliance testing program is also being implemented to ensure high standards of image quality and interchangeability of data between different implementations. We will review the current status of the FBI standard, including the compliance testing process and the details of the first-generation encoder.
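The scalar quantization step can be illustrated with a dead-zone uniform quantizer, the family of quantizers this kind of wavelet coder applies per subband. The step size and dead-zone width below are illustrative choices, not the FBI-specified parameters.

```python
import numpy as np

def deadzone_quantize(coeffs, step, zero_width=1.2):
    """Uniform scalar quantizer with a widened zero bin ('dead zone'):
    small wavelet coefficients collapse to 0, larger ones map to
    integer bin indices of width `step`."""
    z = zero_width * step / 2.0
    signs = np.sign(coeffs)
    mags = np.abs(coeffs)
    bins = np.where(mags <= z, 0.0, np.floor((mags - z) / step) + 1)
    return signs * bins

def dequantize(indices, step, zero_width=1.2):
    """Reconstruct each nonzero coefficient at the center of its bin."""
    z = zero_width * step / 2.0
    return np.sign(indices) * np.where(indices == 0, 0.0,
                                       z + (np.abs(indices) - 0.5) * step)

coeffs = np.array([0.3, -0.5, 2.7, -4.1, 0.0, 9.8])  # toy subband values
q = deadzone_quantize(coeffs, step=1.0)
rec = dequantize(q, step=1.0)
```

Many small coefficients becoming exact zeros is what makes the subsequent entropy coding effective, hence the high compression ratios.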
An update on 'dose calibrator' settings for nuclides used in nuclear medicine.
Bergeron, Denis E; Cessna, Jeffrey T
2018-06-01
Most clinical measurements of radioactivity, whether for therapeutic or imaging nuclides, rely on commercial re-entrant ionization chambers ('dose calibrators'). The National Institute of Standards and Technology (NIST) maintains a battery of representative calibrators and works to link calibration settings ('dial settings') to primary radioactivity standards. Here, we provide a summary of NIST-determined dial settings for 22 radionuclides. We collected previously published dial settings and determined some new ones using either the calibration curve method or the dialing-in approach. The dial settings with their uncertainties are collected in a comprehensive table. In general, current manufacturer-provided calibration settings give activities that agree with National Institute of Standards and Technology standards to within a few percent.
Analytical procedures for water-soluble vitamins in foods and dietary supplements: a review.
Blake, Christopher J
2007-09-01
Water-soluble vitamins include the B-group vitamins and vitamin C. In order to correctly monitor water-soluble vitamin content in fortified foods for compliance monitoring, as well as to establish accurate data banks, an accurate and precise analytical method is a prerequisite. For many years microbiological assays have been used for analysis of B vitamins. However, they are no longer considered to be the gold standard in vitamin analysis, as many studies have revealed their deficiencies. This review describes the current status of analytical methods, including microbiological assays and spectrophotometric, biosensor and chromatographic techniques. In particular it describes the current status of the official methods and highlights some new developments in chromatographic procedures and detection methods. An overview is made of multivitamin extractions and analyses for foods and supplements.
Kolb, Marit; Bahadir, Müfit; Teichgräber, Burkhard
2017-10-01
Worldwide, the standard methods for determining the important wastewater parameter chemical oxygen demand (COD) are still based on the hazardous chemicals mercury sulfate and chromium(VI). Because of their properties, these substances are now classified as "priority pollutants" and are to be phased out or banned by the European Union under REACH (the current European chemicals regulation: Registration, Evaluation, Authorisation and Restriction of Chemicals). Hence, a new wet-chemical method free of mercury and chromium(VI) was developed. Manganese(III) was used as the oxidant and silver nitrate for the removal of chloride ions. Quantification was performed by back titration of manganese(III) with iron(II), as in the standard method. To minimize losses of organic substances during the precipitation of silver chloride, suspended and colloidal organic matter first had to be separated by precipitation with aluminum hydroxide. In these cases, two fractions, one of the suspended and colloidal matter and a second of the dissolved organic substances, are prepared and oxidized separately. The method was tested with potassium hydrogen phthalate (KHP) as the conventional COD reference substance and with different types of wastewater samples. The oxidation of KHP was reproducible over a COD range of 20-500 mg/L, with a mean recovery rate of 88.7% relative to the standard COD method (DIN 38409-41). Even in the presence of 1000 mg/L chloride, a recovery rate of 84.1% was reached. For a series of industrial and municipal wastewater samples, a high correlation (R² = 0.9935) with the standard method was determined, with a mean recovery rate of 78.1% (±5.2%). Although the new method does not recover 100% of the standard method's values, its high correlation with the standard method and its reproducibility offer an environmentally benign alternative with no need to purchase new laboratory equipment. Copyright © 2017 Elsevier Ltd. All rights reserved.
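The back-titration arithmetic follows the usual COD pattern: unreacted oxidant is titrated with Fe(II), and oxygen's equivalent weight (8 g/eq, hence the 8000 mg/eq factor) converts the blank-minus-sample difference into mg O2/L. A sketch with assumed titration values, in the style of the dichromate standard method; the published Mn(III) method's exact factors may differ.

```python
def cod_back_titration(v_blank_ml, v_sample_ml, c_titrant_mol_l,
                       sample_volume_ml):
    """COD (mg O2/L) from back titration of unreacted oxidant with Fe(II).
    Each mol of Fe(II) corresponds to one electron equivalent; oxygen's
    equivalent weight is 8 g/eq, hence the factor 8000 (mg/eq)."""
    return ((v_blank_ml - v_sample_ml) * c_titrant_mol_l * 8000.0
            / sample_volume_ml)

# Assumed titration data for a KHP standard solution.
cod = cod_back_titration(v_blank_ml=12.0, v_sample_ml=8.0,
                         c_titrant_mol_l=0.125, sample_volume_ml=20.0)
# A smaller sample titrant volume than the blank means oxidant was
# consumed by organics, i.e. nonzero COD.
```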
Future Direction of IMIA Standardization
Kimura, M.; Ogishima, S.; Shabo, A.; Kim, I. K.; Parisot, C.; de Faria Leao, B.
2014-01-01
Summary Objectives Standardization in the field of health informatics has increased in importance for establishing interoperability and compatibility internationally through global alliances. Standardization is organized by standard development organizations (SDOs) such as ISO (International Organization for Standardization), CEN (European Committee for Standardization), IHE (Integrating the Healthcare Enterprise), and HL7 (Health Level 7). This paper reports the status of these SDOs’ activities. Methods In this workshop, we reviewed the past activities and the current situation of standardization in health care informatics with the standard development organizations ISO, CEN, IHE, and HL7. We then discussed the future direction of standardization in health informatics toward “future medicine” based on standardized technologies. Results We shared the status of each SDO through an exchange of opinions in the workshop. Some WHO members joined our discussion to support this constructive activity. Conclusion At this meeting, the workshop speakers were appointed as new members of the IMIA working group on Standards in Health Care Informatics (WG16). We reached the conclusion that we will collaborate on international standardization in health informatics toward “future medicine”. PMID:25123729
Warrell, Mary J; Riddell, Anna; Yu, Ly-Mee; Phipps, Judith; Diggle, Linda; Bourhy, Hervé; Deeks, Jonathan J; Fooks, Anthony R; Audry, Laurent; Brookes, Sharon M; Meslin, François-Xavier; Moxon, Richard; Pollard, Andrew J; Warrell, David A
2008-04-23
The need for economical rabies post-exposure prophylaxis (PEP) is increasing in developing countries. Implementation of the two currently approved economical intradermal (ID) vaccine regimens is restricted due to confusion over different vaccines, regimens and dosages, lack of confidence in intradermal technique, and pharmaceutical regulations. We therefore compared a simplified 4-site economical PEP regimen with standard methods. Two hundred and fifty-four volunteers were randomly allocated to a single blind controlled trial. Each received purified vero cell rabies vaccine by one of four PEP regimens: the currently accepted 2-site ID; the 8-site regimen using 0.05 ml per ID site; a new 4-site ID regimen (on day 0, approximately 0.1 ml at 4 ID sites, using the whole 0.5 ml ampoule of vaccine; on day 7, 0.1 ml ID at 2 sites and at one site on days 28 and 90); or the standard 5-dose intramuscular regimen. All ID regimens required the same total amount of vaccine, 60% less than the intramuscular method. Neutralising antibody responses were measured five times over a year in 229 people, for whom complete data were available. All ID regimens showed similar immunogenicity. The intramuscular regimen gave the lowest geometric mean antibody titres. Using the rapid fluorescent focus inhibition test, some sera had unexpectedly high antibody levels that were not attributable to previous vaccination. The results were confirmed using the fluorescent antibody virus neutralisation method. This 4-site PEP regimen proved as immunogenic as current regimens, and has the advantages of requiring fewer clinic visits, being more practicable, and having a wider margin of safety, especially in inexperienced hands, than the 2-site regimen. It is more convenient than the 8-site method, and can be used economically with vaccines formulated in 1.0 or 0.5 ml ampoules. The 4-site regimen now meets all requirements of immunogenicity for PEP and can be introduced without further studies. 
Controlled-Trials.com ISRCTN 30087513.
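Antibody responses such as those above are conventionally summarized as geometric mean titres, since titre data are roughly log-normal. A minimal sketch with hypothetical titres:

```python
import math

def geometric_mean_titre(titres):
    """Geometric mean of neutralising antibody titres: the exponential
    of the arithmetic mean of the log-transformed values."""
    return math.exp(sum(math.log(t) for t in titres) / len(titres))

# Hypothetical titres (IU/ml) for one regimen group at one time point.
titres = [2.0, 8.0, 4.0, 16.0, 1.0]
gmt = geometric_mean_titre(titres)
```

The geometric mean damps the influence of occasional very high titres, which is why it is preferred over the arithmetic mean for serological data.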
NASA Astrophysics Data System (ADS)
Endramawan, T.; Sifa, A.
2018-02-01
The purpose of this research is to identify the types of discontinuity in SMAW welding results and to determine acceptance criteria based on the American Society of Mechanical Engineers (ASME) standard. The material used was mild steel (98.71% Fe, 0.212% C) with a hardness of 230 VHN, in specimens of 20 cm diameter and 1.2 cm thickness, butt-joint welded by SMAW using an LB 52U electrode of 2.6 mm diameter for the root pass (current 70 A, voltage 380 V) and an LB 5218 filler electrode of 3.2 mm diameter (current 80 A, voltage 380 V). The welds were analyzed with two non-destructive test methods: dye penetrant (PT) to reveal indications on the surface of the object and ultrasonic testing (UT) to reveal indications below the surface. The discontinuities were recorded, analyzed, and assessed against the acceptance criteria of the ASME standards. The results show porosity on the surface of the weld and inclusions below the surface found by the ultrasonic test; any indication rejected in either the dye penetrant or ultrasonic test requires gouging of the rejected part followed by re-welding.
Effective Strategies for Teaching in K-8 Classrooms
ERIC Educational Resources Information Center
Moore, Kenneth D.; Hansen, Jacqueline
2011-01-01
Featuring a wealth of reflection activities and connections to standards, this concise, easy-to-read teaching methods text equips students with the content knowledge and skills they need to become effective K-8 teachers. The book maximizes instructional flexibility, reflects current educational issues, highlights recent research, and models best…
Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses
ERIC Educational Resources Information Center
Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo
2018-01-01
Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…
Your Science Classroom: Becoming an Elementary/Middle School Science Teacher
ERIC Educational Resources Information Center
Goldston, M. Jenice; Downey, Laura
2012-01-01
Designed around a practical "practice-what-you-teach" approach to methods instruction, "Your Science Classroom: Becoming an Elementary/Middle School Science Teacher" is based on current constructivist philosophy, organized around 5E inquiry, and guided by the National Science Education Teaching Standards. Written in a reader-friendly style, the…
Food Buying Guide for Child Nutrition Programs. Revised.
ERIC Educational Resources Information Center
Davis, Dorothy W.; And Others
This guide is based on the latest federal regulations and meal pattern requirements for the National School Lunch and Breakfast Programs. It considers current food production and marketing techniques, packaging methods, grading standards, and changing food habits in the American population. The guide gives average yield information on over 600…
42 CFR 37.52 - Method of obtaining definitive interpretations.
Code of Federal Regulations, 2012 CFR
2012-10-01
... other diseases must be demonstrated by those physicians who desire to be B Readers by taking and passing... specified by NIOSH. Each physician who desires to take the digital version of the examination will be provided a complete set of the current NIOSH-approved standard reference digital radiographs. Physicians...
Code of Federal Regulations, 2010 CFR
2010-07-01
... available demonstrated control technology, processes, operating methods, or other alternatives, including... technology currently available as determined by the Administrator pursuant to section 304(b)(1) of the Act... available technology economically achievable which will result in reasonable further progress toward the...
RAPID PCR-BASED MONITORING OF INFECTIOUS ENTEROVIRUSES IN DRINKING WATER. (R824756)
Currently, the standard method for the detection of enteroviruses and hepatitis A virus in water involves cell culture assay which is expensive and time consuming. Direct RT-PCR offers a rapid and sensitive alternative to virus detection but sensitivity is oft...
75 FR 12793 - Petitions for Modification
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-17
... number'' on the subject line, by any of the following methods: 1. Electronic Mail: Standards-Petitions... cables will be no smaller than 10 American Wire Gauge (AWG); (4) all circuit breakers used to protect... unit calibrated to trip at 70% of phase to phase short circuit current. The trip setting of these...
Multidimensional Scaling of High School Students' Perceptions of Academic Dishonesty
ERIC Educational Resources Information Center
Schmelkin, Liora Pedhazur; Gilbert, Kimberly A.; Silva, Rebecca
2010-01-01
Although cheating on tests and other forms of academic dishonesty are considered rampant, no standard definition of academic dishonesty exists. The current study was conducted to investigate the perceptions of academic dishonesty in high school students, utilizing an innovative methodology, multidimensional scaling (MDS). Two methods were used to…
A pseudo differential Gm-C complex filter with frequency tuning for IEEE802.15.4 applications
NASA Astrophysics Data System (ADS)
Xin, Cheng; Lungui, Zhong; Haigang, Yang; Fei, Liu; Tongqiang, Gao
2011-07-01
This paper presents a CMOS Gm-C complex filter for a low-IF receiver of the IEEE 802.15.4 standard. A pseudo differential OTA with reconfigurable common mode feedback and common mode feed-forward is proposed, as well as a frequency tuning method based on a relaxation oscillator. A detailed analysis of the non-ideality of the OTA and of the frequency tuning method is elaborated. The analysis and measurement results show that the center frequency of the complex filter can be tuned accurately. The chip was fabricated in a standard 0.35 μm CMOS process with a single 3.3 V power supply. The filter consumes a 2.1 mA current, has a measured in-band group delay ripple of less than 0.16 μs and an IRR larger than 28 dB at 2 MHz apart, which meets the requirements of the IEEE 802.15.4 standard.
Greenhouse Gas Analysis by GC/MS
NASA Astrophysics Data System (ADS)
Bock, E. M.; Easton, Z. M.; Macek, P.
2015-12-01
Current methods to analyze greenhouse gases rely on dedicated complex, multiple-column, multiple-detector gas chromatographs. A novel method was developed in partnership with Shimadzu for simultaneous quantification of carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) in environmental gas samples. Gas bulbs were used to make custom standard mixtures by injecting small volumes of pure analyte into the nitrogen-filled bulb. The resulting calibration curves were validated using a certified gas standard. The use of GC/MS systems to perform this analysis has the potential to move the analysis of greenhouse gases from expensive, custom GC systems to standard single-quadrupole GC/MS systems that are available in most laboratories and have a wide variety of applications beyond greenhouse gas analysis. Additionally, use of mass spectrometry can provide confirmation of the identity of target analytes and will assist in the identification of unknown peaks should they be present in the chromatogram.
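The bulb-standard and calibration-curve workflow can be sketched as follows; the bulb volume, injection volumes, and peak areas are hypothetical, and a simple linear fit stands in for the instrument software's calibration routine.

```python
import numpy as np

def bulb_standard_ppm(analyte_ul, bulb_ml):
    """Mixing ratio (ppm v/v) of a standard made by injecting a small
    volume of pure analyte gas into a nitrogen-filled bulb."""
    return analyte_ul / (bulb_ml * 1000.0) * 1e6

# Hypothetical CH4 standards in a 1 L bulb and their measured peak areas.
vols_ul = np.array([1.0, 2.0, 5.0, 10.0])
conc_ppm = bulb_standard_ppm(vols_ul, bulb_ml=1000.0)
areas = np.array([510.0, 1020.0, 2540.0, 5080.0])

# Linear calibration: area = slope * concentration + intercept.
slope, intercept = np.polyfit(conc_ppm, areas, 1)

# Quantify an unknown sample from its peak area.
unknown_ppm = (2032.0 - intercept) / slope
```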
Besley, Aiken; Vijver, Martina G; Behrens, Paul; Bosker, Thijs
2017-01-15
Microplastics are ubiquitous in the environment, are frequently ingested by organisms, and may potentially cause harm. A range of studies have found significant levels of microplastics in beach sand. However, there is a considerable amount of methodological variability among these studies. Methodological variation currently limits comparisons as there is no standard procedure for sampling or extraction of microplastics. We identify key sampling and extraction procedures across the literature through a detailed review. We find that sampling depth, sampling location, number of repeat extractions, and settling times are the critical parameters of variation. Next, using a case-study we determine whether and to what extent these differences impact study outcomes. By investigating the common practices identified in the literature with the case-study, we provide a standard operating procedure for sampling and extracting microplastics from beach sand. Copyright © 2016 Elsevier Ltd. All rights reserved.
Expert system verification and validation study. Delivery 3A and 3B: Trip summaries
NASA Technical Reports Server (NTRS)
French, Scott
1991-01-01
Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.
Detection of Only Viable Bacterial Spores Using a Live/Dead Indicator in Mixed Populations
NASA Technical Reports Server (NTRS)
Behar, Alberto E.; Stam, Christina N.; Smiley, Ronald
2013-01-01
This method uses a photoaffinity label that recognizes DNA and can be used to distinguish populations of bacterial cells from bacterial spores without the use of heat shocking during conventional culture, and live from dead bacterial spores using molecular-based methods. Biological validation of commercial sterility using traditional and alternative technologies remains challenging. Recovery of viable spores is cumbersome, as the process requires substantial incubation time, and the extended time to results limits the ability to quickly evaluate the efficacy of existing technologies. Nucleic acid amplification approaches such as PCR (polymerase chain reaction) have shown promise for improving time to detection for a wide range of applications. Recent real-time PCR methods are particularly promising, as these methods can be made at least semi-quantitative by correspondence to a standard curve. Nonetheless, PCR-based methods are rarely used for process validation, largely because the DNA from dead bacterial cells is highly stable and hence, DNA-based amplification methods fail to discriminate between live and inactivated microorganisms. Currently, no published method has been shown to effectively distinguish between live and dead bacterial spores. This technology uses a DNA binding photoaffinity label that can be used to distinguish between live and dead bacterial spores with detection limits ranging from 10^9 down to 10^2 spores/mL. An environmental sample suspected of containing a mixture of live and dead vegetative cells and bacterial endospores is treated with a photoaffinity label. This step will eliminate any vegetative cells (live or dead) and dead endospores present in the sample. To further determine the bacterial spore viability, DNA is extracted from the spores and total population is quantified by real-time PCR. The current NASA standard assay takes 72 hours for results.
Part of this procedure requires a heat shock step at 80 °C for 15 minutes before the sample can be plated. Using a photoaffinity label would remove this step from the current assay as the label readily penetrates both live and dead bacterial cells. Secondly, the photoaffinity label can only penetrate dead bacterial spores, leaving behind the viable spore population. This would allow for rapid bacterial spore detection in a matter of hours compared to the several days that it takes for the NASA standard assay.
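Real-time PCR quantification against a standard curve, as mentioned above, inverts a fitted line Ct = slope·log10(N0) + intercept. A sketch with an idealized dilution series (all numbers hypothetical; a slope near -3.3 corresponds to roughly 100% amplification efficiency):

```python
def fit_standard_curve(log10_copies, ct_values):
    """Least-squares line Ct = slope * log10(N0) + intercept, fitted to
    a dilution series of known starting quantities."""
    n = len(log10_copies)
    xbar = sum(log10_copies) / n
    ybar = sum(ct_values) / n
    sxy = sum((x - xbar) * (y - ybar)
              for x, y in zip(log10_copies, ct_values))
    sxx = sum((x - xbar) ** 2 for x in log10_copies)
    slope = sxy / sxx
    return slope, ybar - slope * xbar

def quantify(ct, slope, intercept):
    """Invert the standard curve: estimate starting copies from a Ct."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical 10-fold dilution series: 10^2 .. 10^6 starting copies.
logs = [2, 3, 4, 5, 6]
cts = [33.2, 29.9, 26.6, 23.3, 20.0]   # idealized, evenly spaced Cts
slope, intercept = fit_standard_curve(logs, cts)
estimate = quantify(25.0, slope, intercept)   # unknown sample's Ct
```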
Improving automation standards via semantic modelling: Application to ISA88.
Dombayci, Canan; Farreres, Javier; Rodríguez, Horacio; Espuña, Antonio; Graells, Moisès
2017-03-01
Standardization is essential for automation. Extensibility, scalability, and reusability are important features for automation software that rely on efficient modelling of the addressed systems. The work presented here is part of the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check the consistency of technical documents and support the improvement of technical document consistency. The formalization of conceptual models and the subsequent writing of technical standards are analyzed together, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the suggested paradigm for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects that are worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, along with presenting the systematic consistency checking method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
A protocol for lifetime energy and environmental impact assessment of building insulation materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shrestha, Som S., E-mail: shresthass@ornl.gov; Biswas, Kaushik; Desjarlais, Andre O.
This article describes a proposed protocol that is intended to provide a comprehensive list of factors to be considered in evaluating the direct and indirect environmental impacts of building insulation materials, as well as detailed descriptions of standardized calculation methodologies to determine those impacts. The energy and environmental impacts of insulation materials can generally be divided into two categories: (1) the direct impact due to the embodied energy of the insulation materials and other factors, and (2) the indirect or environmental impacts avoided as a result of reduced building energy use due to the addition of insulation. Standards and product category rules exist which provide guidelines for the life cycle assessment (LCA) of materials, including building insulation products. However, critical reviews have suggested that these standards fail to provide complete guidance for LCA studies and suffer from ambiguities regarding the determination of the environmental impacts of building insulation and other products. The focus of the assessment protocol described here is to identify all factors that contribute to the total energy and environmental impacts of different building insulation products and, more importantly, to provide standardized determination methods that will allow comparison of different insulation material types. Further, the intent is not to replace current LCA standards but to provide a well-defined, easy-to-use comparison method for insulation materials using existing LCA guidelines. - Highlights: • We propose a protocol to evaluate the environmental impacts of insulation materials. • The protocol considers all life cycle stages of an insulation material. • Both the direct environmental impacts and the indirect impacts are defined. • Standardized calculation methods for the ‘avoided operational energy’ are defined. • Standardized calculation methods for the ‘avoided environmental impact’ are defined.
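The two impact categories can be combined into a net lifetime figure: embodied (direct) energy minus avoided (indirect) operational energy. A deliberately simplified heating-degree-day sketch with assumed values; the protocol itself prescribes full standardized calculations rather than this shortcut.

```python
def avoided_operational_energy_kwh(u_before, u_after, area_m2, hdd):
    """Degree-day estimate of annual conduction-loss reduction (kWh)
    from lowering an envelope U-value (W/m2K) by adding insulation.
    Simplified sketch; a standardized method would use whole-building
    simulation and account for system efficiencies."""
    return (u_before - u_after) * area_m2 * hdd * 24.0 / 1000.0

def net_lifetime_impact(embodied_kwh, annual_avoided_kwh, service_years):
    """Direct (embodied) impact minus indirect (avoided) impact over the
    service life; a negative result means a net energy benefit."""
    return embodied_kwh - annual_avoided_kwh * service_years

# Assumed values: uninsulated vs insulated wall, 100 m2, 3000 K-day climate.
annual = avoided_operational_energy_kwh(u_before=1.5, u_after=0.3,
                                        area_m2=100.0, hdd=3000.0)
net = net_lifetime_impact(embodied_kwh=8000.0, annual_avoided_kwh=annual,
                          service_years=30)
```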
Uncertainty and instream flow standards
Castleberry, D.; Cech, J.; Erman, D.; Hankin, D.; Healey, M.; Kondolf, M.; Mengel, M.; Mohr, M.; Moyle, P.; Nielsen, Jennifer L.; Speed, T.; Williams, J.
1996-01-01
Several years ago, Science published an important essay (Ludwig et al. 1993) on the need to confront the scientific uncertainty associated with managing natural resources. The essay did not discuss instream flow standards explicitly, but its arguments apply. At an April 1995 workshop in Davis, California, all 12 participants agreed that currently no scientifically defensible method exists for defining the instream flows needed to protect particular species of fish or aquatic ecosystems (Williams, in press). We also agreed that acknowledging this fact is an essential step in dealing rationally and effectively with the problem. Practical necessity and the protection of fishery resources require that new instream flow standards be established and that existing standards be revised. However, if standards cannot be defined scientifically, how can this be done? We join others in recommending the approach of adaptive management. Applied to instream flow standards, this approach involves at least three elements.
Faravan, Amir; Mohammadi, Nooredin; Alizadeh Ghavidel, Alireza; Toutounchi, Mohammad Zia; Ghanbari, Ameneh; Mazloomi, Mehran
2016-01-01
Standards have a significant role in defining the minimum level of optimal and expected performance. Since perfusion technology staff play a leading role in providing quality services to patients undergoing open heart surgery with a cardiopulmonary bypass machine, this study aimed to assess how Iranian perfusion technology staff evaluate and manage patients during cardiopulmonary bypass and to compare their practice with the standards recommended by the American Society of Extracorporeal Technology. In this descriptive study, data was collected from 48 Iranian public hospitals and educational health centers through a researcher-created questionnaire. The data collection questionnaire assessed the standards which are recommended by the American Society of Extracorporeal Technology. Findings showed that appropriate measures were carried out by the perfusion technology staff to prevent hemodilution and avoid transfusion of blood and unnecessary blood products, determine the initial dose of heparin based on one of the proposed methods, monitor anticoagulation based on ACT measurement, and determine additional doses of heparin during cardiopulmonary bypass based on ACT or protamine titration; however, all of these measures were performed in only 4.2% of the hospitals and health centers. Current practices of cardiopulmonary perfusion technology in Iran are inappropriate based on the standards of the American Society of Cardiovascular Perfusion. This represents the necessity of authorities' attention to validation programs and the development of caring standards on one hand, and continuous assessment of the use of these standards on the other.
Towards a Framework for Developing Semantic Relatedness Reference Standards
Pakhomov, Serguei V.S.; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B.; Ruggieri, Alexander; Chute, Christopher G.
2010-01-01
Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of the medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the “moderate” range, we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide both the reference standard containing individual ratings and the R program used to analyze the ratings as open-source. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics including automatic classification, information retrieval from medical records and vocabulary/ontology development. PMID:21044697
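The data-driven screening of raters described above can be illustrated with a toy computation: correlate each rater's scores with the mean of the remaining raters and flag anyone who disagrees systematically. This is only a sketch of the idea, not the study's clustering and factor analysis; the rater names and ratings below are hypothetical.

```python
from statistics import mean

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

# Hypothetical ratings (4 raters x 6 term pairs, scale 1-4); rater "D" disagrees
ratings = {
    "A": [4, 3, 1, 2, 4, 1],
    "B": [4, 3, 2, 2, 3, 1],
    "C": [3, 4, 1, 1, 4, 2],
    "D": [1, 2, 4, 4, 1, 3],   # systematically reversed judgements
}

def rater_vs_rest(ratings):
    """Correlate each rater with the per-pair mean of all other raters."""
    out = {}
    for name, vals in ratings.items():
        others = [v for n, v in ratings.items() if n != name]
        rest_mean = [mean(col) for col in zip(*others)]
        out[name] = pearson(vals, rest_mean)
    return out

scores = rater_vs_rest(ratings)
outliers = [n for n, r in scores.items() if r < 0.2]
print(outliers)  # -> ['D']
```

A real screening would also report inter-rater reliability (e.g., an intraclass correlation) before and after removing flagged raters.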
Hendrickson, Carolyn M; Dobbins, Sarah; Redick, Brittney J; Greenberg, Molly D; Calfee, Carolyn S; Cohen, Mitchell Jay
2015-09-01
Adherence to rigorous research protocols for identifying adult respiratory distress syndrome (ARDS) after trauma is variable. To examine how misclassification of ARDS may bias observational studies in trauma populations, we evaluated the agreement of two methods for adjudicating ARDS after trauma: the current gold standard, direct review of chest radiographs, and review of dictated radiology reports, a commonly used alternative. This nested cohort study included 123 mechanically ventilated patients between 2005 and 2008, with at least one PaO2/FIO2 less than 300 within the first 8 days of admission. Two blinded physician investigators adjudicated ARDS by two methods. The investigators directly reviewed all chest radiographs to evaluate for bilateral infiltrates. Several months later, blinded to their previous assessments, they adjudicated ARDS using a standardized rubric to classify radiology reports. A κ statistic was calculated. Regression analyses quantified the association between established risk factors as well as important clinical outcomes and ARDS determined by the aforementioned methods as well as hypoxemia as a surrogate marker. The κ was 0.47 for the observed agreement between ARDS adjudicated by direct review of chest radiographs and ARDS adjudicated by review of radiology reports. Both the magnitude and direction of bias on the estimates of association between ARDS and established risk factors as well as clinical outcomes varied by method of adjudication. Classification of ARDS by review of dictated radiology reports had only moderate agreement with the current gold standard, ARDS adjudicated by direct review of chest radiographs. While the misclassification of ARDS had varied effects on the estimates of associations with established risk factors, it tended to weaken the association of ARDS with important clinical outcomes.
A standardized approach to ARDS adjudication after trauma by direct review of chest radiographs will minimize misclassification bias in future observational studies. Diagnostic study, level II.
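The agreement statistic reported above (κ = 0.47) is Cohen's kappa, which discounts the agreement expected by chance. A minimal sketch, using an illustrative 2×2 table rather than the study's actual counts:

```python
def cohens_kappa(table):
    """Cohen's kappa for a square contingency table (list of rows of counts)."""
    n = sum(sum(row) for row in table)
    po = sum(table[i][i] for i in range(len(table))) / n      # observed agreement
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    pe = sum(r * c for r, c in zip(row_tot, col_tot)) / n**2  # chance agreement
    return (po - pe) / (1 - pe)

# Illustrative counts for 123 patients: rows = ARDS yes/no by radiograph review,
# columns = ARDS yes/no by report review (not the study's data)
table = [[41, 14],
         [19, 49]]
kappa = cohens_kappa(table)
print(round(kappa, 2))  # -> 0.46, in the "moderate" agreement range
```

Values between roughly 0.41 and 0.60 are conventionally read as moderate agreement, which matches the paper's interpretation of its 0.47.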
What is the lifetime risk of developing cancer?: the effect of adjusting for multiple primaries
Sasieni, P D; Shelton, J; Ormiston-Smith, N; Thomson, C S; Silcocks, P B
2011-01-01
Background: The ‘lifetime risk’ of cancer is generally estimated by combining current incidence rates with current all-cause mortality (the ‘current probability’ method) rather than by describing the experience of a birth cohort. As individuals may get more than one type of cancer, what is generally estimated is the average (mean) number of cancers over a lifetime. This is not the same as the probability of getting cancer. Methods: We describe a method for estimating lifetime risk that corrects for the inclusion of multiple primary cancers in the incidence rates routinely published by cancer registries. The new method applies cancer incidence rates to the estimated probability of being alive without a previous cancer. The new method is illustrated using data from the Scottish Cancer Registry and is compared with ‘gold-standard’ estimates that use (unpublished) data on first primaries. Results: The effect of this correction is to make the estimated ‘lifetime risk’ smaller. The new estimates are extremely similar to those obtained using incidence based on first primaries. The usual ‘current probability’ method considerably overestimates the lifetime risk of all cancers combined, although the correction for any single cancer site is minimal. Conclusion: Estimation of the lifetime risk of cancer should either be based on first primaries or should use the new method. PMID:21772332
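The correction described above can be sketched as a simple life-table loop: incidence is applied to the probability of being alive *and cancer-free*, rather than merely alive, so the corrected figure is a probability rather than a mean count. The age-band rates below are invented for illustration; real estimates use registry rates by age band.

```python
def lifetime_risk(age_bands):
    """age_bands: list of (cancer_incidence, other_mortality) rates per band.
    Returns (naive_mean_count, corrected_probability)."""
    alive = 1.0          # P(alive), regardless of cancer history
    alive_free = 1.0     # P(alive and never had cancer)
    mean_count = 0.0     # expected number of cancers ('current probability' method)
    prob_cancer = 0.0    # P(at least one cancer) -- corrected method
    for inc, mort in age_bands:
        mean_count += alive * inc
        prob_cancer += alive_free * inc
        # Simplification for illustration: cancers are not immediately fatal
        alive *= (1 - mort)
        alive_free *= (1 - mort - inc)
    return mean_count, prob_cancer

# Toy example: 8 decades of life, rising incidence and mortality (per decade)
bands = [(0.002, 0.01), (0.003, 0.01), (0.005, 0.02), (0.01, 0.03),
         (0.03, 0.06), (0.06, 0.12), (0.10, 0.25), (0.15, 0.45)]
naive, corrected = lifetime_risk(bands)
print(naive > corrected)  # -> True: the naive method overstates the risk
```

Because `alive_free` always falls at least as fast as `alive`, the corrected probability is never larger than the naive mean count, which reproduces the paper's qualitative finding.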
Survey of Sterile Admixture Practices in Canadian Hospital Pharmacies: Part 1. Methods and Results
Warner, Travis; Nishi, Cesilia; Checkowski, Ryan; Hall, Kevin W.
2009-01-01
Background: The 1996 Guidelines for Preparation of Sterile Products in Pharmacies of the Canadian Society of Hospital Pharmacists (CSHP) represent the current standard of practice for sterile compounding in Canada. However, these guidelines are practice recommendations, not enforceable standards. Previous surveys of sterile compounding practices have shown that actual practice deviates markedly from voluntary practice recommendations. In 2004, the United States Pharmacopeia (USP) published its “General Chapter <797> Pharmaceutical Compounding—Sterile Preparations”, which set a more rigorous and enforceable standard for sterile compounding in the United States. Objectives: To assess sterile compounding practices in Canadian hospital pharmacies and to compare them with current CSHP recommendations and USP chapter <797> standards. Methods: An online survey, based on previous studies of sterile compounding practices, the CSHP guidelines, and the chapter <797> standards, was created and distributed to 193 Canadian hospital pharmacies. Results: A total of 133 pharmacies completed at least part of the survey, for a response rate of 68.9%. All respondents reported the preparation of sterile products. Various degrees of deviation from the practice recommendations were noted for virtually all areas of the CSHP guidelines and the USP standards. Low levels of compliance were most notable in the areas of facilities and equipment, process validation, and product testing. Availability in the central pharmacy of a clean room facility meeting or exceeding the criteria of International Organization for Standardization (ISO) class 8 is a requirement of the chapter <797> standards, but more than 40% of responding pharmacies reported that they did not have such a facility. Higher levels of compliance were noted for policies and procedures, garbing requirements, aseptic technique, and handling of hazardous products. 
Part 1 of this series reports the survey methods and results relating to policies, personnel, raw materials, storage and handling, facilities and equipment, and garments. Part 2 will report results relating to preparation of aseptic products, expiry dating, labelling, process validation, product testing and release, documentation, records, and disposal of hazardous pharmaceuticals. It will also highlight some of the key areas where there is considerable opportunity for improvement. Conclusion: This survey identified numerous deficiencies in sterile compounding practices in Canadian hospital pharmacies. Awareness of these deficiencies may create an impetus for critical assessment and improvements in practice. PMID:22478875
Calibration of laser vibrometers at frequencies up to 100 kHz and higher
NASA Astrophysics Data System (ADS)
Silva Pineda, Guillermo; von Martens, Hans-Jürgen; Rojas, Sergio; Ruiz, Arturo; Muñiz, Lorenzo
2008-06-01
Manufacturers and users of laser vibrometers exploit the wide frequency and intensity ranges of laser techniques, spanning many decades (e.g., from 0.1 Hz to 100 MHz). Traceability to primary measurement standards is demanded over the specified measurement ranges of any measurement instrumentation. As the primary documentary standard ISO 16063-11 for the calibration of vibration transducers is restricted to 10 kHz, a new international standard for the calibration of laser vibrometers, ISO 16063-41, is under development. The current stage of the 2nd Committee Draft (CD) of the ISO standard specifies calibration methods for frequencies from 0.4 Hz to 50 kHz, which does not meet the demand for providing traceability at higher frequencies. New investigations will be presented which demonstrate the applicability of the laser interferometer methods specified in ISO 16063-11 and in the 2nd CD at higher frequencies of 100 kHz and beyond. The three standard methods were simultaneously used for vibration displacement and acceleration measurements up to 100 kHz, and a fourth high-accuracy method has been developed and used. Their results in displacement and acceleration measurements deviated by less than 1 % from each other at vibration displacement amplitudes on the order of 100 nm. Measurement results will be presented for the three interferometer methods specified in ISO 16063-11 and 16063-15, as well as in the 2nd Committee Draft of 16063-41, and examples of laser vibrometer calibrations will be demonstrated. Further investigations are aimed
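For the sinusoidal vibrations used in such calibrations, the displacement, velocity, and acceleration amplitudes are linked by the excitation frequency (v = 2πf·s, a = (2πf)²·s), which is why a 100 nm displacement at 100 kHz corresponds to a very large acceleration. A quick sketch of the conversion:

```python
import math

def velocity_amplitude(disp_amp_m, freq_hz):
    """Velocity amplitude of a sinusoidal vibration: v = 2*pi*f*s."""
    return 2 * math.pi * freq_hz * disp_amp_m

def accel_amplitude(disp_amp_m, freq_hz):
    """Acceleration amplitude of a sinusoidal vibration: a = (2*pi*f)^2 * s."""
    return (2 * math.pi * freq_hz) ** 2 * disp_amp_m

# 100 nm displacement amplitude at 100 kHz, as in the measurements above:
a = accel_amplitude(100e-9, 100e3)
v = velocity_amplitude(100e-9, 100e3)
print(f"a = {a:.3g} m/s^2, v = {v:.3g} m/s")  # a is roughly 4e4 m/s^2
```

The quadratic frequency dependence of acceleration is what makes sub-fringe interferometric resolution necessary at these frequencies: large accelerations coexist with displacements well below one optical wavelength.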
Detection of fatigue cracks by nondestructive testing methods
NASA Technical Reports Server (NTRS)
Anderson, R. T.; Delacy, T. J.; Stewart, R. C.
1973-01-01
The effectiveness of various NDT methods in detecting small, tight cracks was assessed by randomly introducing fatigue cracks into aluminum sheets. The study included optimizing the NDT methods, calibrating NDT equipment with fatigue-cracked standards, and evaluating a number of cracked specimens by the optimized NDT methods. The evaluations were conducted by highly trained personnel, provided with detailed procedures, in order to minimize the effects of human variability. These personnel performed the NDT on the test specimens without knowledge of the flaw locations and reported on the flaws detected. The performance of these tests was measured by comparing the flaws detected against the flaws present. The principal NDT methods utilized were radiographic, ultrasonic, penetrant, and eddy current. Holographic interferometry, acoustic emission monitoring, and replication methods were also applied to a reduced number of specimens. Generally, the best performance was shown by eddy current, ultrasonic, penetrant, and holographic tests. Etching provided no measurable improvement, while proof loading improved flaw detectability. Data are shown that quantify the performances of the NDT methods applied.
NASA Astrophysics Data System (ADS)
Adamkowski, A.; Krzemianowski, Z.
2012-11-01
The paper presents experiences gathered during many years of utilizing the current-meter and pressure-time methods for flow rate measurements in many hydropower plants. The integration techniques used in both of these methods differ from the recommendations contained in the relevant international standards, mainly the graphical and arithmetical ones. The results of a comparative analysis of both methods, applied at the same time during hydraulic performance tests of two Kaplan turbines in one of the Polish hydropower plants, are presented in the final part of the paper. For the pressure-time method, the concrete penstocks of the tested turbines required installing special measuring instrumentation inside the penstock. The comparison has shown satisfactory agreement between the results of discharge measurements executed using both methods. Maximum differences between the discharge values have not exceeded 1.0 % and the average differences have not been greater than 0.5 %.
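The pressure-time (Gibson) principle behind the second method can be sketched numerically: during gate closure, the deceleration of the water column produces a pressure-difference transient whose time integral is proportional to the initial discharge, Q0 = (A/(ρL))·∫Δp dt. The penstock dimensions and Δp samples below are invented and friction losses are neglected; this illustrates the integration step, not the plant procedure.

```python
def trapezoid(ys, dt):
    """Trapezoidal integral of uniformly sampled values."""
    return dt * (sum(ys) - 0.5 * (ys[0] + ys[-1]))

RHO = 1000.0   # water density, kg/m^3
A = 3.0        # penstock cross-section, m^2 (hypothetical)
L = 50.0       # measuring-section length, m (hypothetical)

# Synthetic pressure-difference transient (Pa) sampled at dt = 0.1 s
# during gate closure, starting and ending at steady state:
dt = 0.1
dp = [0, 8000, 15000, 18000, 15000, 8000, 2000, 0]

q0 = (A / (RHO * L)) * trapezoid(dp, dt)
print(round(q0, 3))  # -> 0.396 m^3/s
```

In practice the standards add friction-loss and leakage-flow corrections to the integrand, and the choice of integration technique (graphical, arithmetical, numerical) is exactly what the comparison above examines.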
Hageman, Philip L.
2007-01-01
New methods for the determination of total mercury in geologic materials and dissolved mercury in aqueous samples have been developed that will replace the methods currently (2006) in use. The new methods eliminate the use of sodium dichromate (Na2Cr2O7·2H2O) as an oxidizer and preservative and significantly lower the detection limit for geologic and aqueous samples. The new methods also update instrumentation from the traditional use of cold vapor-atomic absorption spectrometry to cold vapor-atomic fluorescence spectrometry. At the same time, the new digestion procedures for geologic materials use the same size test tubes, and the same aluminum heating block and hot plate as required by the current methods. New procedures for collecting and processing of aqueous samples use the same procedures that are currently (2006) in use except that the samples are now preserved with concentrated hydrochloric acid/bromine monochloride instead of sodium dichromate/nitric acid. Both the 'old' and new methods have the same analyst productivity rates. These similarities should permit easy migration to the new methods. Analysis of geologic and aqueous reference standards using the new methods show that these procedures provide mercury recoveries that are as good as or better than the previously used methods.
Fox, W.E.; McCollum, D.W.; Mitchell, J.E.; Swanson, L.E.; Kreuter, U.P.; Tanaka, J.A.; Evans, G.R.; Theodore, Heintz H.; Breckenridge, R.P.; Geissler, P.H.
2009-01-01
Currently, there is no standard method to assess the complex systems in rangeland ecosystems. Decision makers need baselines to create a common language of current rangeland conditions and standards for continued rangeland assessment. The Sustainable Rangeland Roundtable (SRR), a group of private and public organizations and agencies, has created a forum to discuss rangeland sustainability and assessment. The SRR has worked to integrate social, economic, and ecological disciplines related to rangelands and has identified a standard set of indicators that can be used to assess rangeland sustainability. As part of this process, SRR has developed a two-tiered conceptual framework from a systems perspective to study the validity of indicators and the relationships among them. The first tier categorizes rangeland characteristics into four states. The second tier defines processes affecting these states through time and space. The framework clearly shows that the processes affect and are affected by each other. © 2009 Taylor & Francis Group, LLC.
Building the United States National Vegetation Classification
Franklin, S.B.; Faber-Langendoen, D.; Jennings, M.; Keeler-Wolf, T.; Loucks, O.; Peet, R.; Roberts, D.; McKerrow, A.
2012-01-01
The Federal Geographic Data Committee (FGDC) Vegetation Subcommittee, the Ecological Society of America Panel on Vegetation Classification, and NatureServe have worked together to develop the United States National Vegetation Classification (USNVC). The current standard was accepted in 2008 and fosters consistency across Federal agencies and non-federal partners for the description of each vegetation concept and its hierarchical classification. The USNVC is structured as a dynamic standard, where changes to types at any level may be proposed at any time as new information comes in. But, because much information already exists from previous work, the NVC partners first established methods for screening existing types to determine their acceptability with respect to the 2008 standard. Current efforts include a screening process to assign confidence to Association and Group level descriptions, and a review of the upper three levels of the classification. For the upper levels especially, the expectation is that the review process includes international scientists. Immediate future efforts include the review of remaining levels and the development of a proposal review process.
Bringing Standardized Processes in Atom-Probe Tomography: I Establishing Standardized Terminology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Ian M; Danoix, F; Forbes, Richard
2011-01-01
Defining standardized methods requires careful consideration of the entire field and its applications. The International Field Emission Society (IFES) has elected a Standards Committee, whose task is to determine the steps needed to establish atom-probe tomography as an accepted metrology technique. Specific tasks include developing protocols or standards for: terminology and nomenclature; metrology and instrumentation, including specifications for reference materials; test methodologies; modeling and simulations; and science-based health, safety, and environmental practices. The Committee is currently working on defining terminology related to atom-probe tomography, with the goal of including these terms in a document published by the International Organization for Standardization (ISO). Many terms also used in other disciplines have already been defined and will be discussed for adoption in the context of atom-probe tomography.
A Formal Semantics for the WS-BPEL Recovery Framework
NASA Astrophysics Data System (ADS)
Dragoni, Nicola; Mazzara, Manuel
While current studies on Web services composition are mostly focused - from the technical viewpoint - on standards and protocols, this work investigates the adoption of formal methods for dependable composition. The Web Services Business Process Execution Language (WS-BPEL) - an OASIS standard widely adopted in both academic and industrial environments - is considered as a touchstone for concrete composition languages, and an analysis of its ambiguous Recovery Framework specification is offered. In order to show the use of formal methods, a precise and unambiguous description of its (simplified) mechanisms is provided by means of a conservative extension of the π-calculus. This is intended as a well-known case study providing methodological arguments for the adoption of formal methods in software specification. The aspect of verification is not the main topic of the paper, but some hints are given.
NASA Astrophysics Data System (ADS)
Shi, Liehang; Ling, Tonghui; Zhang, Jianguo
2016-03-01
Radiologists currently use a variety of terminologies and standards in most hospitals in China, and multiple terminologies are often used across different sections of a single department. In this presentation, we introduce a medical semantic comprehension system (MedSCS) to extract semantic information about clinical findings and conclusions from free-text radiology reports so that the reports can be classified correctly based on medical term indexing standards such as RadLex or SNOMED-CT. Our system (MedSCS) is based on both rule-based methods and statistics-based methods, which improves the performance and the scalability of MedSCS. In order to evaluate the overall performance of the system and measure the accuracy of the outcomes, we developed computation methods to calculate the parameters of precision rate, recall rate, F-score and exact confidence interval.
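The evaluation metrics named above are the standard ones for extraction tasks: precision is the fraction of extracted findings that are correct, recall is the fraction of true findings that were extracted, and the F-score is their harmonic mean. A minimal sketch with hypothetical counts:

```python
def prf(tp, fp, fn):
    """Precision, recall, and F1 from true-positive, false-positive,
    and false-negative counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical evaluation: 90 findings extracted correctly,
# 10 spurious extractions, 30 true findings missed
p, r, f = prf(tp=90, fp=10, fn=30)
print(round(p, 2), round(r, 2), round(f, 2))  # -> 0.9 0.75 0.82
```

The "exact confidence interval" mentioned in the abstract would typically be a Clopper-Pearson binomial interval around each of these proportions, which requires the beta distribution and is omitted from this sketch.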
Minimum current principle and variational method in theory of space charge limited flow
NASA Astrophysics Data System (ADS)
Rokhlenko, A.
2015-10-01
The principle of least action suggests that when a perturbation is applied to a physical system, the system reacts by modifying its state to "agree" with the perturbation through a "minimal" change of its initial state. In particular, electron field emission should produce the minimum current consistent with the boundary conditions. This current can be found theoretically by solving the corresponding equations using different techniques. We apply here the variational method for the current calculation, which can be quite effective even with a short set of trial functions. The approach toward a better result can be monitored through the total current, which should decrease when we are on the right track. Here, we present only an illustration for simple geometries of devices with electron flow. The development of these methods can be useful when the emitter and/or anode shapes make the use of standard approaches difficult. Though direct numerical calculations, including the particle-in-cell technique, are very effective, theoretical calculations can provide important insight for understanding general features of flow formation and can sometimes even be carried out with simpler routines.
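For the simplest geometry, the planar vacuum diode, the space-charge-limited current that such variational calculations approximate has a classical closed form, the Child-Langmuir law, which is a useful sanity check for any trial-function result. A sketch in SI units (the voltage and gap values below are arbitrary illustrative choices):

```python
import math

EPS0 = 8.854e-12      # vacuum permittivity, F/m
E_CHARGE = 1.602e-19  # electron charge, C
E_MASS = 9.109e-31    # electron mass, kg

def child_langmuir(voltage, gap):
    """Space-charge-limited current density (A/m^2) for a planar vacuum
    diode: J = (4*eps0/9) * sqrt(2e/m) * V^(3/2) / d^2."""
    return (4 * EPS0 / 9) * math.sqrt(2 * E_CHARGE / E_MASS) \
        * voltage ** 1.5 / gap ** 2

J = child_langmuir(1000.0, 1e-3)  # 1 kV across a 1 mm gap
print(f"{J:.3g} A/m^2")  # on the order of 7e4 A/m^2
```

A variational estimate for a non-planar device should reduce to this value as the geometry approaches the planar limit, and, per the monitoring criterion above, improved trial functions should only lower the computed current toward the true space-charge-limited value.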
Olmos, Jorge A; Piskorz, María Marta; Vela, Marcelo F
2016-06-01
GERD is a highly prevalent disease in our country. It has a deep impact on patients' quality of life and entails extremely high healthcare costs. A correct understanding of its pathophysiology is crucial for the rational use of diagnostic methods and the implementation of appropriate treatment adjusted to each individual case. In this review we evaluate this disorder based on the best available evidence, focusing on pathophysiological mechanisms, epidemiology, modern diagnostic methods and current management standards.
Shields, Margaret V; Abdullah, Leath; Namdari, Surena
2016-06-01
Propionibacterium acnes is the most common cause of infection after shoulder arthroplasty. Whereas there are several methods that can aid in the diagnosis of P. acnes infection, there is not a single "gold standard" because of the difficulties inherent in identifying this bacterium. We present an evidence-based discussion of the demographic, clinical, and radiographic predictors of P. acnes infection and review the current options for diagnosis. This review was written after a comprehensive analysis of the current literature related to shoulder periprosthetic joint infection and P. acnes identification. Of the techniques reviewed, α-defensin had the highest sensitivity in detecting P. acnes infection (63%). C-reactive protein level and erythrocyte sedimentation rate were often normal in cases of infection. Whereas P. acnes can be challenging to successfully diagnose, there are several options that are considered preferable because of their higher sensitivities and specificities. The current gold standard is intraoperative culture, but major advances in molecular techniques may provide future improvements in diagnostic accuracy. Copyright © 2016 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.
Detection methods and performance criteria for genetically modified organisms.
Bertheau, Yves; Diolez, Annick; Kobilinsky, André; Magin, Kimberly
2002-01-01
Detection methods for genetically modified organisms (GMOs) are necessary for many applications, from seed purity assessment to compliance of food labeling in several countries. Numerous analytical methods are currently used or under development to support these needs. The currently used methods are bioassays and protein- and DNA-based detection protocols. To avoid discrepancy of results between such largely different methods and, for instance, the potential resulting legal actions, compatibility of the methods is urgently needed. Performance criteria of methods allow evaluation against a common standard. The more-common performance criteria for detection methods are precision, accuracy, sensitivity, and specificity, which together specifically address other terms used to describe the performance of a method, such as applicability, selectivity, calibration, trueness, precision, recovery, operating range, limit of quantitation, limit of detection, and ruggedness. Performance criteria should provide objective tools to accept or reject specific methods, to validate them, to ensure compatibility between validated methods, and be used on a routine basis to reject data outside an acceptable range of variability. When selecting a method of detection, it is also important to consider its applicability, its field of applications, and its limitations, by including factors such as its ability to detect the target analyte in a given matrix, the duration of the analyses, its cost effectiveness, and the necessary sample sizes for testing. Thus, the current GMO detection methods should be evaluated against a common set of performance criteria.
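Two of the performance criteria listed above, limit of detection and limit of quantitation, are commonly estimated from a calibration curve, e.g. via the ICH-style convention LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the standard deviation of blank responses and S is the calibration slope. A sketch with hypothetical numbers (this is one common convention, not a requirement of the GMO methods discussed above):

```python
def lod_loq(sd_blank, slope):
    """ICH-style estimates from a linear calibration:
    LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    lod = 3.3 * sd_blank / slope
    loq = 10.0 * sd_blank / slope
    return lod, loq

# Hypothetical calibration: blank-response SD 0.02 signal units,
# slope 0.5 signal units per concentration unit
lod, loq = lod_loq(sd_blank=0.02, slope=0.5)
print(lod, loq)  # -> 0.132 and 0.4, in the calibration's concentration units
```

Whatever convention is used, method validation requires that it be stated explicitly, since LOD and LOQ figures are only comparable between methods when computed the same way.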
[Laboratory diagnosis of mucormycosis].
Garcia-Hermoso, Dea
2013-03-01
Mucormycoses are deep infections caused by ubiquitous filamentous fungi of the order Mucorales. The disease occurs mostly in immunocompromised patients, diabetics, or solid organ transplant recipients. There are currently no specific diagnostic guidelines for mucormycosis. Histological examination and culture of the clinical sample remain the most useful approaches for diagnosis. Furthermore, alternative methods to fungal culture are yet to be standardized. Here we review the current microbiological approaches used for the diagnosis and identification of Mucorales. © 2013 médecine/sciences – Inserm / SRMS.
In-Situ Transfer Standard and Coincident-View Intercomparisons for Sensor Cross-Calibration
NASA Technical Reports Server (NTRS)
Thome, Kurt; McCorkel, Joel; Czapla-Myers, Jeff
2013-01-01
There exist numerous methods for accomplishing on-orbit calibration. Methods include the reflectance-based approach, relying on measurements of surface and atmospheric properties at the time of a sensor overpass, as well as invariant scene approaches relying on knowledge of the temporal characteristics of the site. The current work examines typical cross-calibration methods and discusses the expected uncertainties of the methods. Data from the Advanced Land Imager (ALI), Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Enhanced Thematic Mapper Plus (ETM+), Moderate Resolution Imaging Spectroradiometer (MODIS), and Thematic Mapper (TM) are used to demonstrate the limits of relative sensor-to-sensor calibration as applied to current sensors, while Landsat-5 TM and Landsat-7 ETM+ are used to evaluate the limits of in situ site characterizations for SI-traceable cross calibration. The current work examines the difficulties in trending of results from cross-calibration approaches, taking into account sampling issues, site-to-site variability, and accuracy of the method. Special attention is given to the differences caused in the cross-comparison of sensors in radiance space as opposed to reflectance space. The results show that cross calibrations with absolute uncertainties less than 1.5 percent (1 sigma) are currently achievable even for sensors without coincident views.
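The radiance-space versus reflectance-space distinction mentioned above comes down to a standard conversion: band-averaged TOA radiance equals TOA reflectance scaled by the exoatmospheric solar irradiance, the cosine of the solar zenith angle, and the Earth-Sun distance. Differences in the solar irradiance model adopted by each sensor team therefore show up in radiance-space comparisons but cancel in reflectance space. A sketch with a hypothetical band (the irradiance and geometry values are illustrative, not from any of the sensors above):

```python
import math

def toa_radiance(reflectance, esun, sun_zenith_deg, d_au=1.0):
    """TOA spectral radiance (W/m^2/sr/um) from TOA reflectance:
    L = rho * Esun * cos(theta_s) / (pi * d^2)."""
    cos_theta = math.cos(math.radians(sun_zenith_deg))
    return reflectance * esun * cos_theta / (math.pi * d_au ** 2)

# Hypothetical band: Esun = 1536 W/m^2/um, 30 deg solar zenith,
# Earth-Sun distance 1 AU, bright desert site at 0.35 reflectance
L = toa_radiance(0.35, 1536.0, 30.0)
print(round(L, 1))  # -> 148.2
```

Cross-comparing two sensors in radiance space with, say, 1% disagreement between their adopted `esun` values injects that 1% directly into the result, which is one reason reflectance-space comparison is often preferred.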
Chen, Kai; Zhou, Lian; Chen, Xiaodong; Bi, Jun; Kinney, Patrick L.
2017-01-01
Background: Few multicity studies have addressed the health effects of ozone in China due to the scarcity of ozone monitoring data. A critical scientific and policy-relevant question is whether a threshold exists in the ozone-mortality relationship. Methods: Using a generalized additive model and a univariate random-effects meta-analysis, this research evaluated the relationship between short-term ozone exposure and daily total mortality in seven cities of Jiangsu Province, China during 2013–2014. Spline, subset, and threshold models were applied to further evaluate whether a safe threshold level exists. Results: This study found strong evidence that short-term ozone exposure is significantly associated with premature total mortality. A 10 μg/m3 increase in the average of the current and previous days’ maximum 8-h average ozone concentration was associated with a 0.55% (95% posterior interval: 0.34%, 0.76%) increase of total mortality. This finding is robust when considering the confounding effect of PM2.5, PM10, NO2, and SO2. No consistent evidence was found for a threshold in the ozone-mortality concentration-response relationship down to concentrations well below the current Chinese Ambient Air Quality Standard (CAAQS) level 2 standard (160 μg/m3). Conclusions: Our findings suggest that ozone concentrations below the current CAAQS level 2 standard could still induce increased mortality risks in Jiangsu Province, China. Continuous air pollution control measures could yield important health benefits in Jiangsu Province, China, even in cities that meet the current CAAQS level 2 standard. PMID:28231551
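The reported effect size follows the log-linear form conventional in such time-series studies, where the rate ratio for an exposure increase Δ is exp(β·Δ). The sketch below backs the per-unit coefficient out of the reported 0.55% per 10 μg/m³ and rescales it to a hypothetical 30 μg/m³ increase; this illustrates the arithmetic only, not the study's fitted model.

```python
import math

def excess_risk_pct(beta, delta):
    """Percent excess mortality for an exposure increase `delta`,
    under the log-linear model RR = exp(beta * delta)."""
    return (math.exp(beta * delta) - 1) * 100

# Back out the per-(ug/m^3) coefficient from 0.55% per 10 ug/m^3:
beta = math.log(1.0055) / 10

# Rescale to a hypothetical 30 ug/m^3 increase:
pct = excess_risk_pct(beta, 30)
print(round(pct, 2))  # -> 1.66
```

Note that excess risks compound multiplicatively rather than additively (1.66% rather than 3 × 0.55% = 1.65%); the difference is negligible at these effect sizes but matters for larger exposure contrasts.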
A Nanocoaxial-Based Electrochemical Sensor for the Detection of Cholera Toxin
NASA Astrophysics Data System (ADS)
Archibald, Michelle M.; Rizal, Binod; Connolly, Timothy; Burns, Michael J.; Naughton, Michael J.; Chiles, Thomas C.
2015-03-01
Sensitive, real-time detection of biomarkers is of critical importance for rapid and accurate diagnosis of disease for point of care (POC) technologies. Current methods do not allow for POC applications due to several limitations, including sophisticated instrumentation, high reagent consumption, limited multiplexing capability, and cost. Here, we report a nanocoaxial-based electrochemical sensor for the detection of bacterial toxins using an electrochemical enzyme-linked immunosorbent assay (ELISA) and differential pulse voltammetry (DPV). Proof-of-concept was demonstrated for the detection of cholera toxin (CT). The linear dynamic range of detection was 10 ng/ml - 1 μg/ml, and the limit of detection (LOD) was found to be 2 ng/ml. This level of sensitivity is comparable to the standard optical ELISA used widely in clinical applications. In addition to matching the detection profile of the standard ELISA, the nanocoaxial array provides a simple electrochemical readout and a miniaturized platform with multiplexing capabilities for the simultaneous detection of multiple biomarkers, giving the nanocoax a desirable advantage over the standard method towards POC applications. This work was supported by the National Institutes of Health (National Cancer Institute award No. CA137681 and National Institute of Allergy and Infectious Diseases Award No. AI100216).
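The abstract reports an LOD of 2 ng/ml but does not state the estimation rule. A common convention in electroanalysis is the 3σ criterion: three standard deviations of replicate blank signals divided by the calibration slope. A sketch of that convention with purely hypothetical numbers, not the authors' data:

```python
import statistics

def limit_of_detection(blank_signals, slope):
    """Common 3-sigma criterion: LOD = 3 * sd(blank) / calibration slope.
    Units of the LOD follow the slope (signal per ng/ml -> LOD in ng/ml)."""
    return 3.0 * statistics.stdev(blank_signals) / slope

# Hypothetical DPV peak currents (uA) for replicate blanks, and an assumed
# calibration slope of 0.9 uA per ng/ml -- illustrative numbers only.
blanks = [0.52, 0.49, 0.55, 0.51, 0.48, 0.53]
print(round(limit_of_detection(blanks, 0.9), 3))
```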
Multiplex cDNA quantification method that facilitates the standardization of gene expression data
Gotoh, Osamu; Murakami, Yasufumi; Suyama, Akira
2011-01-01
Microarray-based gene expression measurement is one of the major methods for transcriptome analysis. However, current microarray data are substantially affected by microarray platforms and RNA references because the microarray method provides merely the relative amounts of gene expression levels. Therefore, valid comparisons of the microarray data require standardized platforms, internal and/or external controls and complicated normalizations. These requirements impose limitations on the extensive comparison of gene expression data. Here, we report an effective approach to removing the unfavorable limitations by measuring the absolute amounts of gene expression levels on common DNA microarrays. We have developed a multiplex cDNA quantification method called GEP-DEAN (Gene expression profiling by DCN-encoding-based analysis). The method was validated by using chemically synthesized DNA strands of known quantities and cDNA samples prepared from mouse liver, demonstrating that the absolute amounts of cDNA strands were successfully measured with a sensitivity of 18 zmol in a highly multiplexed manner in 7 h. PMID:21415008
FISHing for bacteria in food--a promising tool for the reliable detection of pathogenic bacteria?
Rohde, Alexander; Hammerl, Jens Andre; Appel, Bernd; Dieckmann, Ralf; Al Dahouk, Sascha
2015-04-01
Foodborne pathogens cause millions of infections every year and are responsible for considerable economic losses worldwide. The current gold standard for the detection of bacterial pathogens in food is still conventional cultivation following standardized and generally accepted protocols. However, these methods are time-consuming and do not provide fast information about food contaminations, and thus are limited in their ability to protect consumers in time from potential microbial hazards. Fluorescence in situ hybridization (FISH) represents a rapid and highly specific technique for whole-cell detection. This review aims to summarize the current data on FISH testing for the detection of pathogenic bacteria in different food matrices and to evaluate its suitability for implementation in routine testing. In this context, the use of FISH in different matrices and their pretreatment will be presented, the sensitivity and specificity of FISH tests will be considered, and the need for automation will be discussed, as well as the use of technological improvements to overcome current hurdles for a broad application in monitoring food safety. In addition, the overall economic feasibility will be assessed in a rough calculation of costs, and strengths and weaknesses of FISH are considered in comparison with traditional and well-established detection methods. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Moyer, Jason T; Gnatkovsky, Vadym; Ono, Tomonori; Otáhal, Jakub; Wagenaar, Joost; Stacey, William C; Noebels, Jeffrey; Ikeda, Akio; Staley, Kevin; de Curtis, Marco; Litt, Brian; Galanopoulou, Aristea S
2017-11-01
Electroencephalography (EEG)-the direct recording of the electrical activity of populations of neurons-is a tremendously important tool for diagnosing, treating, and researching epilepsy. Although standard procedures for recording and analyzing human EEG exist and are broadly accepted, there are no such standards for research in animal models of seizures and epilepsy-recording montages, acquisition systems, and processing algorithms may differ substantially among investigators and laboratories. The lack of standard procedures for acquiring and analyzing EEG from animal models of epilepsy hinders the interpretation of experimental results and reduces the ability of the scientific community to efficiently translate new experimental findings into clinical practice. Accordingly, the intention of this report is twofold: (1) to review current techniques for the collection and software-based analysis of neural field recordings in animal models of epilepsy, and (2) to offer pertinent standards and reporting guidelines for this research. Specifically, we review current techniques for signal acquisition, signal conditioning, signal processing, data storage, and data sharing, and include applicable recommendations to standardize collection and reporting. We close with a discussion of challenges and future opportunities, and include a supplemental report of currently available acquisition systems and analysis tools. This work represents a collaboration on behalf of the American Epilepsy Society/International League Against Epilepsy (AES/ILAE) Translational Task Force (TASK1-Workgroup 5), and is part of a larger effort to harmonize video-EEG interpretation and analysis methods across studies using in vivo and in vitro seizure and epilepsy models. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.
Future Concepts for Realtime Data Interfaces for Control Centers
NASA Technical Reports Server (NTRS)
Kearney, Mike W., III
2004-01-01
Existing methods of exchanging realtime data between the major control centers in the International Space Station program have resulted in a patchwork of local formats being imposed on each Mission Control Center. This puts the burden on a data customer to comply with the proprietary data formats of each data supplier. This has increased the cost and complexity for each participant, limited access to mission data and hampered the development of efficient and flexible operations concepts. Ideally, a universal format should be promoted in the industry to prevent the unnecessary burden of each center processing a different data format standard for every external interface with another center. With the broad acceptance of XML and other conventions used in other industries, it is now time for the Aerospace industry to fully engage and establish such a standard. This paper will briefly consider the components that would be required by such a standard (XML schema, data dictionaries, etc.) in order to accomplish the goal of a universal low-cost interface, and acquire broad industry acceptance. We will then examine current approaches being developed by standards bodies and other groups. The current state of CCSDS panel work will be reviewed, with a survey of the degree of industry acceptance. Other widely accepted commercial approaches will be considered, sometimes complementary to the standards work, but sometimes not. The question is whether de facto industry standards are in concert with, or in conflict with the direction of the standards bodies. And given that state of affairs, the author will consider whether a new program establishing its Mission Control Center should implement a data interface based on those standards. The author proposes that broad industry support to unify the various efforts will enable collaboration between control centers and space programs to a wider degree than is currently available. 
This will reduce the cost for programs to provide realtime access to their data, hence reducing the cost of access to space, and benefiting the industry as a whole.
Kamesh Iyer, Srikant; Tasdizen, Tolga; Burgon, Nathan; Kholmovski, Eugene; Marrouche, Nassir; Adluru, Ganesh; DiBella, Edward
2016-09-01
Current late gadolinium enhancement (LGE) imaging of left atrial (LA) scar or fibrosis is relatively slow and requires 5-15 min to acquire an undersampled (R=1.7) 3D navigated dataset. The GeneRalized Autocalibrating Partially Parallel Acquisitions (GRAPPA) based parallel imaging method is the current clinical standard for accelerating 3D LGE imaging of the LA and permits an acceleration factor ~R=1.7. Two compressed sensing (CS) methods have been developed to achieve higher acceleration factors: a patch based collaborative filtering technique tested with acceleration factor R~3, and a technique that uses a 3D radial stack-of-stars acquisition pattern (R~1.8) with a 3D total variation constraint. The long reconstruction time of these CS methods makes them unwieldy to use, especially the patch based collaborative filtering technique. In addition, the effect of CS techniques on the quantification of percentage of scar/fibrosis is not known. We sought to develop a practical compressed sensing method for imaging the LA at high acceleration factors. In order to develop a clinically viable method with short reconstruction time, a Split Bregman (SB) reconstruction method with 3D total variation (TV) constraints was developed and implemented. The method was tested on 8 atrial fibrillation patients (4 pre-ablation and 4 post-ablation datasets). Blur metric, normalized mean squared error and peak signal to noise ratio were used as metrics to analyze the quality of the reconstructed images. Quantification of the extent of LGE was performed on the undersampled images and compared with the fully sampled images. Quantification of scar from post-ablation datasets and quantification of fibrosis from pre-ablation datasets showed that acceleration factors up to R~3.5 gave good 3D LGE images of the LA wall using the 3D TV-constrained SB method. This corresponds to reducing the scan time by half, compared to currently used GRAPPA methods. 
Reconstruction of 3D LGE images using the SB method was over 20 times faster than standard gradient descent methods. Copyright © 2016 Elsevier Inc. All rights reserved.
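The speed advantage reported above comes from how Split Bregman decouples each iteration into a quadratic data-fidelity solve plus an elementwise soft-thresholding ("shrink") step on the TV auxiliary variables. A minimal sketch of that shrink operator, the standard textbook form rather than the authors' implementation:

```python
def shrink(x, threshold):
    """Soft-thresholding ("shrinkage") operator used in each Split Bregman
    iteration to solve the decoupled TV subproblem elementwise:
        shrink(v, t) = sign(v) * max(|v| - t, 0)
    """
    out = []
    for v in x:
        mag = abs(v) - threshold
        out.append((1 if v >= 0 else -1) * mag if mag > 0 else 0.0)
    return out

# Values well above the threshold shrink toward zero; small ones vanish:
print(shrink([3.0, -0.2, -1.5, 0.0], 1.0))  # [2.0, 0.0, -0.5, 0.0]
```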
Thermoelectric converters for alternating current standards
NASA Astrophysics Data System (ADS)
Anatychuk, L. I.; Taschuk, D. D.
2012-06-01
Thermoelectric converters of alternating current remain priority instruments when creating standard equipment. This work presents the results of the design and manufacture of an alternating current converter for a military alternating current standard in Ukraine. Results of simulation of temperature distribution in converter elements and ways of optimization to improve the accuracy of alternating current signal reproduction are presented. Results of metrological trials are given. The quality of a thermoelectric material specially created for alternating current metrology is verified. The converter was used in an alternating current standard for the frequency range from 10 Hz to 30 MHz. The efficiency of using thermoelectric signal converters in measuring instruments is confirmed.
Novel hermetic packaging methods for MOEMS
NASA Astrophysics Data System (ADS)
Stark, David
2003-01-01
Hermetic packaging of micro-optoelectromechanical systems (MOEMS) is an immature technology, lacking industry-consensus methods and standards. Off-the-shelf, catalog window assemblies are not yet available. Window assemblies are in general custom designed and manufactured for each new product, resulting in longer than acceptable cycle times, high procurement costs and questionable reliability. There are currently two dominant window-manufacturing methods wherein a metal frame is attached to glass, as well as a third, less-used method. The first method creates a glass-to-metal seal by heating the glass above its Tg to fuse it to the frame. The second method involves first metallizing the glass where it is to be attached to the frame, and then soldering the glass to the frame. The third method employs solder-glass to bond the glass to the frame. A novel alternative with superior features compared to the three previously described window-manufacturing methods is proposed. The new approach lends itself to a plurality of glass-to-metal attachment techniques. Benefits include lower temperature processing than two of the current methods and potentially more cost-effective manufacturing than all three of today's attachment methods.
Methods for stable recording of short-circuit current in a Na+-transporting epithelium.
Gondzik, Veronika; Awayda, Mouhamed S
2011-07-01
Epithelial Na(+) transport as measured by a variety of techniques, including the short-circuit current technique, has been described to exhibit a "rundown" phenomenon. This phenomenon manifests as a time-dependent decrease of current and resistance and precludes the ability to carry out prolonged experiments aimed at examining the regulation of this transport. We developed methods for prolonged stable recordings of epithelial Na(+) transport using modifications of the short-circuit current technique and commercial Ussing-type chambers. We utilize the polarized MDCK cell line expressing the epithelial Na(+) channel (ENaC) to describe these methods. Briefly, existing commercial chambers were modified to allow continuous flow of Ringer solution and precise control of such flow. Chamber manifolds and associated plumbing were modified to allow a precise temperature clamp preventing temperature oscillations. Recording electrodes were modified to eliminate the use of KCl and prevent membrane depolarization from KCl leakage. Solutions utilized standard bicarbonate-based buffers, but all gases were prehydrated to clamp buffer osmolarity. We demonstrate that these modifications result in measurements of current and resistance that are stable for at least 2 h. We further demonstrate that drifts in osmolarity similar to those obtained prior to our modifications can lead to a decrease of current and resistance similar to those attributed to rundown.
DOE Office of Scientific and Technical Information (OSTI.GOV)
New, Joshua Ryan; Kumar, Jitendra; Hoffman, Forrest M.
Statement of the Problem: ASHRAE releases updates to 90.1 “Energy Standard for Buildings except Low-Rise Residential Buildings” every three years, resulting in a 3.7%-17.3% increase in energy efficiency for buildings with each release. This is adopted by or informs building codes in nations across the globe, is the National Standard for the US, and individual states elect which release year of the standard they will enforce. These codes are built upon Standard 169 “Climatic Data for Building Design Standards,” the latest 2017 release of which defines climate zones based on 8,118 weather stations throughout the world and data from the past 8-25 years. This data may not be indicative of the weather that new buildings built today will see during their upcoming 30-120 year lifespan. Methodology & Theoretical Orientation: Using more modern, high-resolution datasets from climate satellites, IPCC climate models (PCM and HadGCM), high performance computing resources (Titan), and new capabilities for clustering and optimization, the authors briefly analyzed different methods for redefining climate zones. The clustering used a bottom-up analysis of multiple meteorological variables, which subject matter experts selected as being important to energy consumption, rather than the heating/cooling degree days currently used. Findings: We analyzed the accuracy of redefined climate zones compared to current climate zones, examined how the climate zones moved under different climate change scenarios, and quantified the accuracy of these methods on a local level, at a national scale for the US. Conclusion & Significance: There is likely to be a significant annual, national energy and cost (billions USD) savings that could be realized by adjusting climate zones to take into account anticipated trends or scenarios in regional weather patterns.
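The bottom-up clustering of station-level meteorological variables described above can be illustrated with a bare-bones Lloyd's k-means over feature vectors. This is an illustrative sketch only; the feature values, initialization, and two-cluster setup are assumptions, not the authors' actual algorithm or data:

```python
import math

def kmeans(points, centers, iters=50):
    """Bare-bones Lloyd's k-means: group weather-station feature vectors
    (e.g., normalized temperature/humidity summaries) into climate-like
    clusters. `centers` is the list of initial cluster centers."""
    k = len(centers)
    for _ in range(iters):
        # Assign each point to its nearest current center.
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centers[c]))
            groups[nearest].append(p)
        # Move each center to the mean of its assigned points.
        centers = [
            tuple(sum(d) / len(g) for d in zip(*g)) if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers

# Two well-separated blobs of hypothetical station features:
pts = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15), (0.9, 0.8), (0.8, 0.9), (0.85, 0.85)]
print(sorted(kmeans(pts, [pts[0], pts[-1]])))
```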
NASA Technical Reports Server (NTRS)
Lee, Hyung B.; Ghia, Urmila; Bayyuk, Sami; Oberkampf, William L.; Roy, Christopher J.; Benek, John A.; Rumsey, Christopher L.; Powers, Joseph M.; Bush, Robert H.; Mani, Mortaza
2016-01-01
Computational fluid dynamics (CFD) and other advanced modeling and simulation (M&S) methods are increasingly relied on for predictive performance, reliability and safety of engineering systems. Analysts, designers, decision makers, and project managers, who must depend on simulation, need practical techniques and methods for assessing simulation credibility. The AIAA Guide for Verification and Validation of Computational Fluid Dynamics Simulations (AIAA G-077-1998 (2002)), originally published in 1998, was the first engineering standards document available to the engineering community for verification and validation (V&V) of simulations. Much progress has been made in these areas since 1998. The AIAA Committee on Standards for CFD is currently updating this Guide to incorporate in it the important developments that have taken place in V&V concepts, methods, and practices, particularly with regard to the broader context of predictive capability and uncertainty quantification (UQ) methods and approaches. This paper will provide an overview of the changes and extensions currently underway to update the AIAA Guide. Specifically, a framework for predictive capability will be described for incorporating a wide range of error and uncertainty sources identified during the modeling, verification, and validation processes, with the goal of estimating the total prediction uncertainty of the simulation. The Guide's goal is to provide a foundation for understanding and addressing major issues and concepts in predictive CFD. However, this Guide will not recommend specific approaches in these areas as the field is rapidly evolving. It is hoped that the guidelines provided in this paper, and explained in more detail in the Guide, will aid in the research, development, and use of CFD in engineering decision-making.
Metrology for hydrogen energy applications: a project to address normative requirements
NASA Astrophysics Data System (ADS)
Haloua, Frédérique; Bacquart, Thomas; Arrhenius, Karine; Delobelle, Benoît; Ent, Hugo
2018-03-01
Hydrogen represents a clean and storable energy solution that could meet worldwide energy demands and reduce greenhouse gases emission. The joint research project (JRP) ‘Metrology for sustainable hydrogen energy applications’ addresses standardisation needs through pre- and co-normative metrology research in the fast emerging sector of hydrogen fuel that meet the requirements of the European Directive 2014/94/EU by supplementing the revision of two ISO standards that are currently too generic to enable a sustainable implementation of hydrogen. The hydrogen purity dispensed at refueling points should comply with the technical specifications of ISO 14687-2 for fuel cell electric vehicles. The rapid progress of fuel cell technology now requires revising this standard towards less constraining limits for the 13 gaseous impurities. In parallel, optimized validated analytical methods are proposed to reduce the number of analyses. The study also aims at developing and validating traceable methods to assess accurately the hydrogen mass absorbed and stored in metal hydride tanks; this is a research axis for the revision of the ISO 16111 standard to develop this safe storage technique for hydrogen. The probability of hydrogen impurity presence affecting fuel cells and analytical techniques for traceable measurements of hydrogen impurities will be assessed and new data of maximum concentrations of impurities based on degradation studies will be proposed. Novel validated methods for measuring the hydrogen mass absorbed in hydride tanks of AB, AB2 and AB5 types referenced to ISO 16111 will be determined, as the methods currently available do not provide accurate results. The outputs here will have a direct impact on the standardisation works for ISO 16111 and ISO 14687-2 revisions in the relevant working groups of ISO/TC 197 ‘Hydrogen technologies’.
2017 National Standards for Diabetes Self-Management Education and Support.
Beck, Joni; Greenwood, Deborah A; Blanton, Lori; Bollinger, Sandra T; Butcher, Marcene K; Condon, Jo Ellen; Cypress, Marjorie; Faulkner, Priscilla; Fischl, Amy Hess; Francis, Theresa; Kolb, Leslie E; Lavin-Tompkins, Jodi M; MacLeod, Janice; Maryniuk, Melinda; Mensing, Carolé; Orzeck, Eric A; Pope, David D; Pulizzi, Jodi L; Reed, Ardis A; Rhinehart, Andrew S; Siminerio, Linda; Wang, Jing
2018-02-01
Purpose The purpose of this study is to review the literature for Diabetes Self-Management Education and Support (DSMES) to ensure the National Standards for DSMES (Standards) align with current evidence-based practices and utilization trends. Methods The 10 Standards were divided among 20 interdisciplinary workgroup members. Members searched the current research for diabetes education and support, behavioral health, clinical, health care environment, technical, reimbursement, and business practice for the strongest evidence that guided the Standards revision. Results Diabetes Self-Management Education and Support facilitates the knowledge, skills, and ability necessary for diabetes self-care as well as activities that assist a person in implementing and sustaining the behaviors needed to manage their condition on an ongoing basis. The evidence indicates that health care providers and people affected by diabetes are embracing technology, and this is having a positive impact on DSMES access, utilization, and outcomes. Conclusion Quality DSMES continues to be a critical element of care for all people with diabetes. The DSMES services must be individualized and guided by the concerns, preferences, and needs of the person affected by diabetes. Even with the abundance of evidence supporting the benefits of DSMES, it continues to be underutilized, but as with other health care services, technology is changing the way DSMES is delivered and utilized with positive outcomes.
Estimating Adolescent Risk for Hearing Loss Based on Data From a Large School-Based Survey
Verschuure, Hans; van der Ploeg, Catharina P. B.; Brug, Johannes; Raat, Hein
2010-01-01
Objectives. We estimated whether and to what extent a group of adolescents were at risk of developing permanent hearing loss as a result of voluntary exposure to high-volume music, and we assessed whether such exposure was associated with hearing-related symptoms. Methods. In 2007, 1512 adolescents (aged 12–19 years) in Dutch secondary schools completed questionnaires about their music-listening behavior and whether they experienced hearing-related symptoms after listening to high-volume music. We used their self-reported data in conjunction with published average sound levels of music players, discotheques, and pop concerts to estimate their noise exposure, and we compared that exposure to our own “loosened” (i.e., less strict) version of current European safety standards for occupational noise exposure. Results. About half of the adolescents exceeded safety standards for occupational noise exposure. About one third of the respondents exceeded safety standards solely as a result of listening to MP3 players. Hearing symptoms that occurred after using an MP3 player or going to a discotheque were associated with exposure to high-volume music. Conclusions. Adolescents often exceeded current occupational safety standards for noise exposure, highlighting the need for specific safety standards for leisure-time noise exposure. PMID:20395587
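Comparing leisure-time listening against an occupational-style allowance is typically done with the equal-energy (3 dB exchange) rule. A hedged sketch assuming an illustrative 80 dB(A) / 40 h weekly criterion; the study's own "loosened" version of the European occupational standard may use different reference values:

```python
def weekly_noise_dose(exposures, limit_db=80.0, limit_hours=40.0):
    """Fraction of an occupational-style weekly noise allowance used, under
    the equal-energy (3 dB exchange) rule:
        dose = sum(h_i * 10**((L_i - limit_db) / 10)) / limit_hours
    `exposures` is a list of (level_dBA, hours_per_week) pairs; dose > 1
    means the allowance is exceeded. The 80 dB(A)/40 h reference is an
    illustrative assumption, not the study's exact criterion."""
    return sum(h * 10 ** ((L - limit_db) / 10.0) for L, h in exposures) / limit_hours

# Hypothetical week: 10 h of MP3 listening at 95 dB(A), 3 h at a 100 dB(A) venue.
dose = weekly_noise_dose([(95.0, 10.0), (100.0, 3.0)])
print(dose > 1.0)  # True: the allowance is exceeded
```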
Characterization of Triaxial Braided Composite Material Properties for Impact Simulation
NASA Technical Reports Server (NTRS)
Roberts, Gary D.; Goldberg, Robert K.; Biniendak, Wieslaw K.; Arnold, William A.; Littell, Justin D.; Kohlman, Lee W.
2009-01-01
The reliability of impact simulations for aircraft components made with triaxial braided carbon fiber composites is currently limited by inadequate material property data and lack of validated material models for analysis. Improvements to standard quasi-static test methods are needed to account for the large unit cell size and localized damage within the unit cell. The deformation and damage of a triaxial braided composite material was examined using standard quasi-static in-plane tension, compression, and shear tests. Some modifications to standard test specimen geometries are suggested, and methods for measuring the local strain at the onset of failure within the braid unit cell are presented. Deformation and damage at higher strain rates is examined using ballistic impact tests on 61 by 61 cm by 3.2 mm (24 by 24 by 0.125 in.) composite panels. Digital image correlation techniques were used to examine full-field deformation and damage during both quasi-static and impact tests. An impact analysis method is presented that utilizes both local and global deformation and failure information from the quasi-static tests as input for impact simulations. Improvements that are needed in test and analysis methods for better predictive capability are examined.
Cedergren, A
1974-06-01
A rapid and sensitive method using true potentiometric end-point detection has been developed and compared with the conventional amperometric method for Karl Fischer determination of water. The effect of the sulphur dioxide concentration on the shape of the titration curve is shown. By using kinetic data it was possible to calculate the course of titrations and make comparisons with those found experimentally. The results prove that the main reaction is the slow step, both in the amperometric and the potentiometric method. Results obtained in the standardization of the Karl Fischer reagent showed that the potentiometric method, including titration to a preselected potential, gave a standard deviation of 0.001(1) mg of water per ml, the amperometric method using extrapolation 0.002(4) mg of water per ml and the amperometric titration to a pre-selected diffusion current 0.004(7) mg of water per ml. Theories and results dealing with dilution effects are presented. The time of analysis was 1-1.5 min for the potentiometric and 4-5 min for the amperometric method using extrapolation.
State of the art of prostatic arterial embolization for benign prostatic hyperplasia.
Petrillo, Mario; Pesapane, Filippo; Fumarola, Enrico Maria; Emili, Ilaria; Acquasanta, Marzia; Patella, Francesca; Angileri, Salvatore Alessio; Rossi, Umberto G; Piacentini, Igor; Granata, Antonio Maria; Ierardi, Anna Maria; Carrafiello, Gianpaolo
2018-04-01
Prostatectomy via open surgery or transurethral resection of the prostate (TURP) is the standard treatment for benign prostatic hyperplasia (BPH). However, several patients present contraindications to the standard approach: individuals older than 60 years with urinary tract infection, strictures, post-operative pain, incontinence or urinary retention, sexual dysfunction, or blood loss are not good candidates for surgery. Prostatic artery embolization (PAE) is emerging as a viable method for patients unsuitable for surgery. In this article, we report results on the technical and clinical success and safety of the procedure to define its current status.
Study on Standard Fatigue Vehicle Load Model
NASA Astrophysics Data System (ADS)
Huang, H. Y.; Zhang, J. P.; Li, Y. H.
2018-02-01
Based on measured truck data from three arterial expressways in Guangdong Province, a statistical analysis of truck weight was conducted according to axle number. A standard fatigue vehicle model applicable to industrial areas was obtained by adopting the equivalent damage principle, Miner's linear damage accumulation law, the water discharge method, and damage ratio theory. Compared with the fatigue vehicle model specified by the current bridge design code, the proposed model has better applicability. It provides a useful reference for the fatigue design of bridges in China.
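The equivalence-damage step behind a standard fatigue vehicle can be sketched with Miner's rule: the damage-equivalent gross weight of a measured weight histogram is the m-th-power mean of the weights. Illustrative only; the S-N exponent m = 3 and the histogram below are assumptions, not the paper's survey data:

```python
def equivalent_fatigue_weight(counts_weights, m=3.0):
    """Damage-equivalent vehicle weight under Miner's linear damage rule:
        W_eq = (sum(n_i * W_i**m) / sum(n_i)) ** (1/m)
    `counts_weights` is a list of (count, gross_weight) pairs from a weigh
    survey; m is the assumed S-N curve slope (m = 3, an illustrative choice)."""
    total = sum(n for n, _ in counts_weights)
    return (sum(n * w ** m for n, w in counts_weights) / total) ** (1.0 / m)

# Hypothetical truck-weight histogram: (count, gross weight in kN).
survey = [(500, 100.0), (300, 250.0), (50, 450.0)]
print(round(equivalent_fatigue_weight(survey), 1))
```

Because heavy vehicles enter at the m-th power, the equivalent weight sits well above the simple average; this is why a small number of heavy trucks dominates fatigue design.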
Molecular analysis of breast sentinel lymph nodes.
Blumencranz, Peter W; Pieretti, Maura; Allen, Kathleen G; Blumencranz, Lisa E
2011-07-01
Lymphatic mapping and sentinel lymph node (SLN) biopsy have become the standard of care for staging the axilla in patients with invasive breast cancer. Current histologic methods for SLN evaluation have limitations, including subjectivity, limited sensitivity, and lack of standardization. The discovery of molecular markers to detect metastases has been reported over the last 2 decades. The authors review the historical development of these markers and the clinical use of one of the molecular platforms in 478 patients at their institution. Controversies and future directions are discussed. Copyright © 2011 Elsevier Inc. All rights reserved.
Real-time combustion control and diagnostics sensor-pressure oscillation monitor
Chorpening, Benjamin T [Morgantown, WV; Thornton, Jimmy [Morgantown, WV; Huckaby, E David [Morgantown, WV; Richards, George A [Morgantown, WV
2009-07-14
An apparatus and method for monitoring and controlling the combustion process in a combustion system to determine the amplitude and/or frequencies of dynamic pressure oscillations during combustion. An electrode in communication with the combustion system senses hydrocarbon ions and/or electrons produced by the combustion process and calibration apparatus calibrates the relationship between the standard deviation of the current in the electrode and the amplitudes of the dynamic pressure oscillations by applying a substantially constant voltage between the electrode and ground resulting in a current in the electrode and by varying one or more of (1) the flow rate of the fuel, (2) the flow rate of the oxidant, (3) the equivalence ratio, (4) the acoustic tuning of the combustion system, and (5) the fuel distribution in the combustion chamber such that the amplitudes of the dynamic pressure oscillations in the combustion chamber are calculated as a function of the standard deviation of the electrode current. Thereafter, the supply of fuel and/or oxidant is varied to modify the dynamic pressure oscillations.
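The calibration described above maps the standard deviation of the electrode current to a pressure-oscillation amplitude. A minimal sketch of that mapping, assuming a linear calibration (the `gain` and `offset` constants below are illustrative placeholders, not values from the patent):

```python
import statistics

def oscillation_amplitude(current_samples_ma, gain=2.5, offset=0.0):
    """Estimate the dynamic pressure oscillation amplitude (kPa) from
    the standard deviation of the electrode current (mA).

    gain/offset are hypothetical calibration constants; in the method
    above they would be fitted while varying fuel flow, oxidant flow,
    equivalence ratio, acoustic tuning, and fuel distribution.
    """
    sigma = statistics.stdev(current_samples_ma)
    return gain * sigma + offset
```

A controller would compare this estimate against a setpoint and trim the fuel and/or oxidant supply accordingly.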
A voltage-controlled capacitive discharge method for electrical activation of peripheral nerves.
Rosellini, Will M; Yoo, Paul B; Engineer, Navzer; Armstrong, Scott; Weiner, Richard L; Burress, Chester; Cauller, Larry
2011-01-01
A voltage-controlled capacitive discharge (VCCD) method was investigated as an alternative to rectangular stimulus pulses currently used in peripheral nerve stimulation therapies. In two anesthetized Gottingen mini pigs, the threshold (total charge per phase) for evoking a compound nerve action potential (CNAP) was compared between constant current (CC) and VCCD methods. Electrical pulses were applied to the tibial and posterior cutaneous femoralis nerves using standard and modified versions of the Medtronic 3778 Octad. In contrast to CC stimulation, the combined application of VCCD pulses with a modified Octad resulted in a marked decrease (-73 ± 7.4%) in the stimulation threshold for evoking a CNAP. This was consistent for different myelinated fiber types and locations of stimulation. The VCCD method provides a highly charge-efficient means of activating myelinated fibers that could potentially be used within a wireless peripheral nerve stimulator system. © 2011 International Neuromodulation Society.
[Current macro-diagnostic trends of forensic medicine in the Czech Republic].
Frišhons, Jan; Kučerová, Štěpánka; Jurda, Mikoláš; Sokol, Miloš; Vojtíšek, Tomáš; Hejna, Petr
2017-01-01
Over the last few years, advanced diagnostic methods have entered the realm of forensic medicine in addition to standard autopsy techniques supported by traditional X-ray examination and macro-diagnostic laboratory tests. Despite the progress of imaging methods, the conventional autopsy has remained the basic and essential diagnostic tool in forensic medicine. Postmortem computed tomography and magnetic resonance imaging are by far the most progressive modern radiodiagnostic methods, setting the current trend of virtual autopsies all over the world. Up to now, only two institutes of forensic medicine in the Czech Republic have had postmortem computed tomography available for routine diagnostic purposes. Postmortem magnetic resonance is currently unattainable for routine diagnostic use and has been employed only for experimental purposes. Photogrammetry is a digital method focused primarily on body surface imaging. Recently, the most fruitful results have come from interdisciplinary cooperation between forensic medicine and forensic anthropology, with the implementation of body scanning techniques and 3D printing. Non-invasive and mini-invasive investigative methods such as postmortem sonography and postmortem endoscopy have been tested unsystematically for diagnostic performance, with good outcomes despite the limitations of these methods in postmortem application. Other futuristic methods, such as the use of a drone to inspect the crime scene, are still experimental. The authors present a basic overview of both routinely and experimentally used investigative methods and current macro-diagnostic trends in forensic medicine in the Czech Republic.
Brodén, Cyrus; Olivecrona, Henrik; Maguire, Gerald Q; Noz, Marilyn E; Zeleznik, Michael P; Sköldenberg, Olof
2016-01-01
Background and Purpose. The gold standard for detection of implant wear and migration is currently radiostereometry (RSA). The purpose of this study is to compare a three-dimensional computed tomography technique (3D CT) to standard RSA as an alternative technique for measuring migration of acetabular cups in total hip arthroplasty. Materials and Methods. With tantalum beads, we marked one cemented and one uncemented cup and mounted these on a similarly marked pelvic model. A comparison was made between 3D CT and standard RSA for measuring migration. Twelve repeated stereoradiographs and CT scans with double examinations in each position and gradual migration of the implants were made. Precision and accuracy of the 3D CT were calculated. Results. The accuracy of the 3D CT ranged between 0.07 and 0.32 mm for translations and 0.21 and 0.82° for rotation. The precision ranged between 0.01 and 0.09 mm for translations and 0.06 and 0.29° for rotations, respectively. For standard RSA, the precision ranged between 0.04 and 0.09 mm for translations and 0.08 and 0.32° for rotations, respectively. There was no significant difference in precision between 3D CT and standard RSA. The effective radiation dose of the 3D CT method, comparable to RSA, was estimated to be 0.33 mSv. Interpretation. Low dose 3D CT is a comparable method to standard RSA in an experimental setting.
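Precision in studies of this kind is typically derived from the repeated double examinations. A minimal sketch, assuming the common convention of taking the standard deviation of paired differences (the paper's exact formula is not restated here):

```python
import math

def precision_from_double_exams(pairs):
    """Precision of a migration measurement from double examinations:
    sample standard deviation of the paired differences (mm or degrees).
    Each pair holds the two repeated measurements of one position."""
    diffs = [a - b for a, b in pairs]
    mean = sum(diffs) / len(diffs)
    var = sum((d - mean) ** 2 for d in diffs) / (len(diffs) - 1)
    return math.sqrt(var)
```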
A biochemical protocol for the isolation and identification of current species of Vibrio in seafood.
Ottaviani, D; Masini, L; Bacchiocchi, S
2003-01-01
We report a biochemical method for the isolation and identification of the current species of vibrios using just one operative protocol. The method involves an enrichment phase with incubation at 30 degrees C for 8-24 h in alkaline peptone water and an isolation phase on thiosulphate-citrate-bile salts-sucrose (TCBS) agar plates incubated at 30 degrees C for 24 h. Four biochemical tests and Alsina's scheme were performed for genus and species identification, respectively. All biochemical tests were optimized with regard to temperature, incubation time and media composition. The whole standardized protocol was always able to give a correct identification when applied to 25 reference strains of Vibrio and 134 field isolates. The data demonstrated that the assay method allows efficient recovery, isolation and identification of the current species of Vibrio in seafood, obtaining results within 2-7 days. This method, based on biochemical tests, could be applicable even in basic microbiology laboratories, and can be used simultaneously to isolate and discriminate all clinically relevant species of Vibrio.
DOT National Transportation Integrated Search
2010-02-01
Many entities currently use permeability specifications in Portland cement concrete (PCC) pavements and structures. For those states using permeability specifications, two test methods are generally used and include ASTM C 1202 (Standard Test M...
Trends in Teacher Evaluation: What Every Special Education Teacher Should Know
ERIC Educational Resources Information Center
Benedict, Amber E; Thomas, Rachel A.; Kimerling, Jenna; Leko, Christopher
2013-01-01
The article reflects on current methods of teacher evaluation within the context of recent accountability policy, specifically No Child Left Behind. An overview is given of the most common forms of teacher evaluation, including performance evaluations, checklists, peer review, portfolios, the CEC and InTASC standards, the Charlotte Danielson…
Show the Data, Don't Conceal Them
ERIC Educational Resources Information Center
Drummond, Gordon B.; Vowler, Sarah L.
2011-01-01
Current standards of data presentation and analysis in biological journals often fall short of ideal. This is the first of a planned series of short articles, to be published in a number of journals, aiming to highlight the principles of clear data presentation and appropriate statistical analysis. This article considers the methods used to show…
Preliminary assessment of lint cotton water content in gin-drying temperature studies
USDA-ARS?s Scientific Manuscript database
Prior studies to measure total water (free and bound) in lint cotton by Karl Fischer Titration showed the method is more accurate and precise than moisture content by standard oven drying. The objective of the current study was to compare the moisture and total water contents from five cultivars de...
Meadows in the Sierra Nevada of California: state of knowledge
Raymond D. Ratliff
1985-01-01
This state-of-knowledge report summarizes the best available information on maintenance, restoration, and management of meadows of the Sierra Nevada, California. Major topics discussed include how to classify meadows, meadow soils, productivity of meadows, management problems, and how to evaluate range conditions and trends. Current methods and standards are reviewed,...
ERIC Educational Resources Information Center
Embrey, Karen K.
2012-01-01
Cognitive task analysis (CTA) is a knowledge elicitation technique employed for acquiring expertise from domain specialists to support the effective instruction of novices. CTA guided instruction has proven effective in improving surgical skills training for medical students and surgical residents. The standard, current method of teaching clinical…
USDA-ARS?s Scientific Manuscript database
Using next-generation-sequencing technology to assess entire transcriptomes requires high quality starting RNA. Currently, RNA quality is routinely judged using automated microfluidic gel electrophoresis platforms and associated algorithms. Here we report that such automated methods generate false-n...
Opportunities and Possibilities: Philosophical Hermeneutics and the Educational Researcher
ERIC Educational Resources Information Center
Agrey, Loren G.
2014-01-01
The opportunities that philosophical hermeneutics provide as a research tool are explored and it is shown that this qualitative research method can be employed as a valuable tool for the educational researcher. Used as an alternative to the standard quantitative approach to educational research, currently being the dominant paradigm of data…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-14
... standardized reporting methods to collect and analyze fire incident data at the Federal, State, and local levels. Data analysis helps local fire departments and States to focus on current problems, predict..., effort and resources used by respondents to respond) and cost, and the actual data collection instruments...
ERIC Educational Resources Information Center
Buys, L.; Aird, R.; Miller, E.
2012-01-01
Background: Considerable attention is currently being directed towards both active ageing and the revising of standards for disability services within Australia and internationally. Yet, to date, no consideration appears to have been given to ways to promote active ageing among older adults with intellectual disabilities (IDs). Methods:…
12 CFR 324.210 - Standardized measurement method for specific risk.
Code of Federal Regulations, 2014 CFR
2014-01-01
... purchased credit protection is capped at the current fair value of the transaction plus the absolute value... hedge has a specific risk add-on of zero if: (i) The debt or securitization position is fully hedged by... debt or securitization positions, an FDIC-supervised institution must multiply the absolute value of...
Evaluating the Strength of School Tobacco Policies: The Development of a Practical Rating System
ERIC Educational Resources Information Center
Boyce, Jennifer C.; Mueller, Nancy B.; Hogan-Watts, Melissa; Luke, Douglas A.
2009-01-01
Background: School tobacco control policies vary widely in their strength, extensiveness, and enforcement. Currently, no standardized method exists to assess the comprehensiveness of school tobacco policies. The purpose of this study was to develop a new practical rating system for school tobacco policies, assess its reliability, and present…
Objectives. We estimate the risk of highly credible gastrointestinal illness (HCGI) among adults 55 and older in a community drinking tap water meeting current U.S. standards. Methods. We conducted a randomized, triple-blinded, crossover trial in 714 households (988 indiv...
Antianaerobic Antimicrobials: Spectrum and Susceptibility Testing
Wexler, Hannah M.; Goldstein, Ellie J. C.
2013-01-01
SUMMARY Susceptibility testing of anaerobic bacteria recovered from selected cases can influence the choice of antimicrobial therapy. The Clinical and Laboratory Standards Institute (CLSI) has standardized many laboratory procedures, including anaerobic susceptibility testing (AST), and has published documents for AST. The standardization of testing methods by the CLSI allows comparisons of resistance trends among various laboratories. Susceptibility testing should be performed on organisms recovered from sterile body sites, those that are isolated in pure culture, or those that are clinically important and have variable or unique susceptibility patterns. Organisms that should be considered for individual isolate testing include highly virulent pathogens for which susceptibility cannot be predicted, such as Bacteroides, Prevotella, Fusobacterium, and Clostridium spp.; Bilophila wadsworthia; and Sutterella wadsworthensis. This review describes the current methods for AST in research and reference laboratories. These methods include the use of agar dilution, broth microdilution, Etest, and the spiral gradient endpoint system. The antimicrobials potentially effective against anaerobic bacteria include beta-lactams, combinations of beta-lactams and beta-lactamase inhibitors, metronidazole, chloramphenicol, clindamycin, macrolides, tetracyclines, and fluoroquinolones. The spectrum of efficacy, antimicrobial resistance mechanisms, and resistance patterns against these agents are described. PMID:23824372
Development of a nematode offspring counting assay for rapid and simple soil toxicity assessment.
Kim, Shin Woong; Moon, Jongmin; Jeong, Seung-Woo; An, Youn-Joo
2018-05-01
Since the introduction of standardized nematode toxicity assays by the American Society for Testing and Materials (ASTM) and International Organization for Standardization (ISO), many studies have reported their use. Given that the currently used standardized nematode toxicity assays have certain limitations, in this study, we examined the use of a novel nematode offspring counting assay for evaluating soil ecotoxicity based on a previous soil-agar isolation method used to recover live adult nematodes. In this new assay, adult Caenorhabditis elegans were exposed to soil using a standardized toxicity assay procedure, and the resulting offspring in test soils attracted by a microbial food source in agar plates were counted. This method differs from previously used assays in terms of its endpoint, namely, the number of nematode offspring. The applicability of the bioassay was demonstrated using metal-spiked soils, which revealed metal concentration-dependent responses, and with 36 field soil samples characterized by different physicochemical properties and containing various metals. Principal component analysis revealed that texture fraction (clay, sand, and silt) and electrical conductivity values were the main factors influencing the nematode offspring counting assay, and these findings warrant further investigation. The nematode offspring counting assay is a rapid and simple process that can provide multi-directional toxicity assessment when used in conjunction with other standard methods. Copyright © 2018 Elsevier Ltd. All rights reserved.
Different strategies for detection of HbA1c emphasizing on biosensors and point-of-care analyzers.
Kaur, Jagjit; Jiang, Cheng; Liu, Guozhen
2018-06-07
Measurement of glycosylated hemoglobin (HbA1c) is a gold standard procedure for assessing long term glycemic control in individuals with diabetes mellitus, as it gives a stable and reliable measure of blood glucose levels over a period of 90-120 days. HbA1c is formed by the non-enzymatic glycation of the terminal valine of hemoglobin. The analysis of HbA1c tends to be complicated because there are more than 300 different assay methods for measuring HbA1c, which leads to variations in values reported from the same samples. Therefore, standardization of detection methods is recommended. The review outlines the current research activities on developing assays, including biosensors, for the detection of HbA1c. The pros and cons of different techniques for measuring HbA1c are outlined. The performance of current point-of-care HbA1c analyzers available on the market is also compared and discussed. Future perspectives for HbA1c detection and diabetes management are proposed. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Van de Casteele, Elke; Parizel, Paul; Sijbers, Jan
2012-03-01
Adaptive statistical iterative reconstruction (ASiR) is a new reconstruction algorithm used in the field of medical X-ray imaging. This new reconstruction method combines the idealized system representation, as we know it from the standard Filtered Back Projection (FBP) algorithm, and the strength of iterative reconstruction by including a noise model in the reconstruction scheme. It studies how noise propagates through the reconstruction steps, feeds this model back into the loop and iteratively reduces noise in the reconstructed image without affecting spatial resolution. In this paper the effect of ASiR on the contrast to noise ratio is studied using the low contrast module of the Catphan phantom. The experiments were done on a GE LightSpeed VCT system at different voltages and currents. The results show reduced noise and increased contrast for the ASiR reconstructions compared to the standard FBP method. For the same contrast to noise ratio the images from ASiR can be obtained using 60% less current, leading to a reduction in dose of the same amount.
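The comparison above rests on the contrast-to-noise ratio measured in the low-contrast module. A minimal sketch, assuming the simple and common CNR definition below (one of several in use; the ROI statistics would come from the Catphan phantom images):

```python
def contrast_to_noise_ratio(roi_mean, bg_mean, bg_std):
    """CNR of a low-contrast insert: absolute difference between the
    insert and background mean pixel values, divided by the background
    standard deviation (the noise)."""
    return abs(roi_mean - bg_mean) / bg_std
```

At a matched CNR, the required tube current (and hence dose) can be compared directly between FBP and ASiR reconstructions.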
Malone, Matthew; Goeres, Darla M; Gosbell, Iain; Vickery, Karen; Jensen, Slade; Stoodley, Paul
2017-02-01
The concept of biofilms in human health and disease is now widely accepted as a cause of chronic infection. Typically, biofilms show remarkable tolerance to many forms of treatment and to the host immune response. This has led to a vast increase in research to identify new (and sometimes old) anti-biofilm strategies that demonstrate effectiveness against these tolerant phenotypes. Areas covered: Unfortunately, a standardized methodological approach to biofilm models has not been adopted, leading to a large disparity between testing conditions. This has made it almost impossible to compare data across multiple laboratories, leaving large gaps in the evidence. Furthermore, many biofilm models testing anti-biofilm strategies aimed at the medical arena have not considered relevance to an intended application. This may explain why some in vitro models based on methodological designs that do not consider relevance to an intended application fail when applied in vivo at the clinical level. Expert commentary: This review will explore the issues that need to be considered in developing performance standards for anti-biofilm therapeutics and provide a rationale for the need to standardize models/methods so that they are clinically relevant. We also provide some rationale as to why no standards currently exist.
Theoretical considerations and measurements for phoropters
NASA Astrophysics Data System (ADS)
Zhang, Jiyan; Liu, Wenli; Sun, Jie
2008-10-01
A phoropter is one of the most popular ophthalmic instruments used in current optometry practice. The quality and verification of the instrument are of the utmost importance. In 1997, the International Organization for Standardization published the first ISO standard specifying requirements for phoropters. However, in China, few standards and test methods exist for phoropters. Research work on test methods for phoropters was carried out beginning in 2004 by the China National Institute of Metrology. In this paper, first, the structure of phoropters is described. Then, theoretical considerations for their optical design are analyzed. Next, a newly developed instrument is introduced and measurements are taken. By calibration, the indication error of the instrument is not over 0.05 m^-1. Finally, measurement results show that the quality of phoropters is not as good as expected because of production and assembly errors. Optical design must be improved, especially for combinations of spherical and cylindrical lenses with higher power. In addition, the optical requirements specified in the ISO standard were found to be somewhat strict and hard to meet. A proposal for revision of this international standard was drafted and discussed at the 2007 ISO meeting held in Tokyo.
How do laboratory embryo transfer techniques affect IVF outcomes? A review of current literature.
Sigalos, George; Triantafyllidou, Olga; Vlahos, Nikos
2017-04-01
Over the last few years, many studies have focused on embryo selection methods, whereas little attention has been given to the standardization of the embryo transfer procedure. In this review, several parameters of the embryo transfer procedure are examined: (i) culture medium volume and loading technique; (ii) syringe and catheters used for embryo transfer; (iii) viscosity and composition of the embryo transfer medium; (iv) environment of embryo culture; (v) timing of embryo transfer; and (vi) standardization of the embryo transfer techniques. The aim of this manuscript is to review these factors, compare the existing embryo transfer techniques, and highlight the need for better embryo transfer standardization.
Landsat Image Map Production Methods at the U. S. Geological Survey
Kidwell, R.D.; Binnie, D.R.; Martin, S.
1987-01-01
To maintain consistently high quality in satellite image map production, the U. S. Geological Survey (USGS) has developed standard procedures for the photographic and digital production of Landsat image mosaics, and for lithographic printing of multispectral imagery. This paper gives a brief review of the photographic, digital, and lithographic procedures currently in use for producing image maps from Landsat data. It is shown that consistency in the printing of image maps is achieved by standardizing the materials and procedures that affect the image detail and color balance of the final product. Densitometric standards are established by printing control targets using the pressplates, inks, pre-press proofs, and paper to be used for printing.
Novel methods of imaging and analysis for the thermoregulatory sweat test.
Carroll, Michael Sean; Reed, David W; Kuntz, Nancy L; Weese-Mayer, Debra Ellyn
2018-06-07
The thermoregulatory sweat test (TST) can be central to the identification and management of disorders affecting sudomotor function and small sensory and autonomic nerve fibers, but the cumbersome nature of the standard testing protocol has prevented its widespread adoption. A high resolution, quantitative, clean and simple assay of sweating could significantly improve identification and management of these disorders. Images from 89 clinical TSTs were analyzed retrospectively using two novel techniques. First, using the standard indicator powder, skin surface sweat distributions were determined algorithmically for each patient. Second, a fundamentally novel method using thermal imaging of forced evaporative cooling was evaluated through comparison with the standard technique. Correlation and receiver operating characteristic analyses were used to determine the degree of match between these methods, and the potential limits of thermal imaging were examined through cumulative analysis of all studied patients. Algorithmic encoding of sweating and non-sweating regions produces a more objective analysis for clinical decision making. Additionally, results from the forced cooling method correspond well with those from indicator powder imaging, with a correlation across spatial regions of -0.78 (CI: -0.84 to -0.71). The method works similarly across body regions, and frame-by-frame analysis suggests the ability to identify sweating regions within about 1 second of imaging. While algorithmic encoding can enhance the standard sweat testing protocol, thermal imaging with forced evaporative cooling can dramatically improve the TST by making it less time-consuming and more patient-friendly than the current approach.
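The reported agreement between the indicator-powder and thermal-imaging maps is a correlation across spatial regions. A minimal Pearson-correlation sketch (evaporative cooling lowers surface temperature where sweat is present, which is why the reported r is negative; the per-region series below are illustrative):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two per-region series, e.g. sweat
    coverage from indicator powder vs. temperature drop under forced
    evaporative cooling."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```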
Inductive High Power Transfer Technologies for Electric Vehicles
NASA Astrophysics Data System (ADS)
Madzharov, Nikolay D.; Tonchev, Anton T.
2014-03-01
Problems associated with how to charge the battery pack of an electric vehicle become more important with every passing day. The most logical solution currently is non-contact charging, which possesses a number of advantages over standard contact methods. This article focuses on methods for inductive high-power contactless transfer of energy at relatively small distances, and on their advantages and disadvantages. A developed Inductive Power Transfer (IPT) system is described for fast charging of electric vehicles, with a nominal power of 30 kW over a 7 to 9 cm air gap.
Schacherer, Lindsey J; Xie, Weiping; Owens, Michaela A; Alarcon, Clara; Hu, Tiger X
2016-09-01
Liquid chromatography coupled with tandem mass spectrometry is increasingly used for protein detection in transgenic crops research. Currently this is achieved with protein reference standards, which may take significant time or effort to obtain, so there is a need for rapid protein detection without protein reference standards. A sensitive and specific method was developed to detect target proteins in transgenic maize leaf crude extract at concentrations as low as ∼30 ng mg(-1) dry leaf without the need for reference standards or any sample enrichment. A hybrid Q-TRAP mass spectrometer was used to monitor all potential tryptic peptides of the target proteins in both transgenic and non-transgenic samples. The multiple reaction monitoring-initiated detection and sequencing (MIDAS) approach was used for initial peptide/protein identification via Mascot database search. Further confirmation was achieved by direct comparison between transgenic and non-transgenic samples. Definitive confirmation was provided by running the same experiments with synthetic peptides or protein standards, if available. A targeted proteomic mass spectrometry method using the MIDAS approach is an ideal methodology for detection of new proteins in early stages of transgenic crop research and development, when neither protein reference standards nor antibodies are available. © 2016 Society of Chemical Industry.
Weighing in on international growth standards: testing the case in Australian preschool children.
Pattinson, C L; Staton, S L; Smith, S S; Trost, S G; Sawyer, E F; Thorpe, K J
2017-10-01
Overweight and obesity in preschool-aged children are major health concerns. Accurate and reliable estimates of prevalence are necessary to direct public health and clinical interventions. There are currently three international growth standards used to determine prevalence of overweight and obesity, each using different methodologies: Centers for Disease Control and Prevention (CDC), World Health Organization (WHO) and International Obesity Task Force (IOTF). Adoption and use of each method were examined through a systematic review of Australian population studies (2006-2017). For this period, systematically identified population studies (N = 20) reported prevalence of overweight and obesity ranging between 15 and 38%, with most (n = 16) applying the IOTF standards. To demonstrate the differences in prevalence estimates yielded by the IOTF in comparison to the WHO and CDC standards, the methods were applied to a sample of N = 1,926 Australian children, aged 3-5 years. As expected, the three standards yielded significantly different estimates when applied to this single population. Prevalence of overweight/obesity was WHO - 9.3%, IOTF - 21.7% and CDC - 33.1%. Judicious selection of growth standards, taking account of their underpinning methodologies, and provision of access to study data sets to allow prevalence comparisons, is recommended. © 2017 World Obesity Federation.
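The sensitivity of prevalence estimates to the chosen standard can be illustrated with a toy classifier. The real WHO, CDC, and IOTF standards use age- and sex-specific cutoff tables, so the single scalar cutoffs and the cohort below are illustrative assumptions only:

```python
def overweight_prevalence(bmis, cutoff):
    """Fraction of children at or above an overweight/obesity BMI
    cutoff. A scalar cutoff stands in for the age/sex-specific
    tables that the actual growth standards use."""
    return sum(b >= cutoff for b in bmis) / len(bmis)

# The same cohort yields different prevalence under different cutoffs.
cohort = [14.8, 15.5, 16.1, 16.9, 17.4, 18.2, 19.0, 20.3]
strict = overweight_prevalence(cohort, 17.0)   # stricter standard
lenient = overweight_prevalence(cohort, 19.0)  # more lenient standard
```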
Standard biological parts knowledgebase.
Galdzicki, Michal; Rodriguez, Cesar; Chandran, Deepak; Sauro, Herbert M; Gennari, John H
2011-02-24
We have created the Knowledgebase of Standard Biological Parts (SBPkb) as a publicly accessible Semantic Web resource for synthetic biology (sbolstandard.org). The SBPkb allows researchers to query and retrieve standard biological parts for research and use in synthetic biology. Its initial version includes all of the information about parts stored in the Registry of Standard Biological Parts (partsregistry.org). SBPkb transforms this information so that it is computable, using our semantic framework for synthetic biology parts. This framework, known as SBOL-semantic, was built as part of the Synthetic Biology Open Language (SBOL), a project of the Synthetic Biology Data Exchange Group. SBOL-semantic represents commonly used synthetic biology entities, and its purpose is to improve the distribution and exchange of descriptions of biological parts. In this paper, we describe the data, our methods for transformation to SBPkb, and finally, we demonstrate the value of our knowledgebase with a set of sample queries. We use RDF technology and SPARQL queries to retrieve candidate "promoter" parts that are known to be both negatively and positively regulated. This method provides new web-based data access to perform searches for parts that are not currently possible.
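The promoter query can be mirrored in plain Python over an in-memory parts list. The record layout and regulation labels below are illustrative assumptions, not the actual SBOL-semantic vocabulary; the real SBPkb answers such questions via SPARQL over RDF:

```python
# Hypothetical part records; only the query logic mirrors the paper.
parts = [
    {"name": "BBa_R0051", "type": "promoter",
     "regulation": {"negative", "positive"}},
    {"name": "BBa_J23100", "type": "promoter",
     "regulation": {"positive"}},
    {"name": "BBa_B0034", "type": "rbs", "regulation": set()},
]

def dual_regulated_promoters(parts):
    """Promoter parts annotated as both negatively and positively
    regulated, mirroring the sample SPARQL query in spirit."""
    return [p["name"] for p in parts
            if p["type"] == "promoter"
            and {"negative", "positive"} <= p["regulation"]]
```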
Wavelength selection method with standard deviation: application to pulse oximetry.
Vazquez-Jaccaud, Camille; Paez, Gonzalo; Strojnik, Marija
2011-07-01
Near-infrared spectroscopy provides useful biological information after the radiation has penetrated the tissue, within the therapeutic window. One significant shortcoming of current applications of spectroscopic techniques to a live subject is that the subject may be uncooperative and the sample undergoes significant temporal variations due to the subject's health status which, from a radiometric point of view, introduce measurement noise. We describe a novel wavelength selection method for monitoring, based on a standard deviation map, that allows low sensitivity to noise. It may be used with spectral transillumination, transmission, or reflection signals, including those corrupted by noise and unavoidable temporal effects. We apply it to the selection of two wavelengths for the case of pulse oximetry. Using spectroscopic data, we generate a map of standard deviation that we propose as a figure of merit in the presence of the noise introduced by the living subject. Even in the presence of diverse sources of noise, we identify four wavelength domains whose standard deviation is minimally sensitive to temporal noise, and two wavelength domains with low sensitivity to temporal noise.
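Per wavelength, the standard-deviation map reduces to the spread of repeated intensity readings; wavelengths where that spread is smallest are least sensitive to the subject's temporal noise. A minimal sketch under that reading of the method (the selection rule below is an assumption, not the paper's exact algorithm):

```python
import statistics

def select_wavelengths(spectra, wavelengths, n=2):
    """spectra: repeated measurements, each a list of intensities
    aligned with `wavelengths`. Returns the n wavelengths whose
    intensity varies least across measurements (minimal standard
    deviation), i.e. those least sensitive to temporal noise."""
    stds = [statistics.pstdev(col) for col in zip(*spectra)]
    ranked = sorted(zip(stds, wavelengths))
    return [w for _, w in ranked[:n]]
```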
Qualification and Selection of Flight Diode Lasers for Space Applications
NASA Technical Reports Server (NTRS)
Liebe, Carl C.; Dillon, Robert P.; Gontijo, Ivair; Forouhar, Siamak; Shapiro, Andrew A.; Cooper, Mark S.; Meras, Patrick L.
2010-01-01
The reliability and lifetime of laser diodes is critical to space missions. The Nuclear Spectroscopic Telescope Array (NuSTAR) mission includes a metrology system that is based upon laser diodes. An operational test facility has been developed to qualify and select, by mission standards, laser diodes that will survive the intended space environment and mission lifetime. The facility is situated in an electrostatic discharge (ESD) certified clean-room and consists of an enclosed temperature-controlled stage that can accommodate up to 20 laser diodes. The facility is designed to characterize a single laser diode, in addition to conducting laser lifetime testing on up to 20 laser diodes simultaneously. A standard laser current driver is used to drive a single laser diode. Laser diode current, voltage, power, and wavelength are measured for each laser diode, and a method of selecting the most suitable laser diodes for space deployment is implemented. The method consists of creating histograms of laser threshold currents, powers at a designated current, and wavelengths at designated power. From these histograms, laser diodes whose performance falls outside the norm are rejected, and the remaining lasers are considered spaceborne candidates. To perform laser lifetime testing, the facility is equipped with 20 custom laser drivers that were designed and built by California Institute of Technology specifically to drive NuSTAR metrology lasers. The laser drivers can be operated in constant-current mode or alternating-current mode. Situated inside the enclosure, in front of the laser diodes, are 20 power-meter heads to record laser power throughout the duration of lifetime testing. Prior to connecting a laser diode to the current source for characterization and lifetime testing, a background program is initiated to collect current, voltage, and resistance. This backstage data collection enables the operational test facility to have full laser diode traceability.
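The histogram-based screening step can be sketched as a simple out-of-family rejection on any measured parameter (threshold current, power at designated current, or wavelength); the cutoff k and the data below are illustrative, not NuSTAR's actual criteria:

```python
import statistics

def screen_candidates(values, k=2.0):
    """Keep devices whose parameter falls within mean +/- k standard deviations,
    mirroring the histogram-based rejection of out-of-family laser diodes."""
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mu) <= k * sigma]

# Illustrative threshold currents (mA) for a batch of 8 diodes; the last
# one is an out-of-family device.
thresholds = [20.1, 19.8, 20.3, 20.0, 19.9, 20.2, 20.1, 27.5]
print(screen_candidates(thresholds))  # prints [0, 1, 2, 3, 4, 5, 6]
```

In practice the same screen would be applied per parameter, and only diodes passing every screen would remain spaceborne candidates.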
Animal Disease Import Risk Analysis--a Review of Current Methods and Practice.
Peeler, E J; Reese, R A; Thrush, M A
2015-10-01
The application of risk analysis to the spread of disease with international trade in animals and their products, that is, import risk analysis (IRA), has been largely driven by the Sanitary and Phytosanitary (SPS) agreement of the World Trade Organization (WTO). The degree to which the IRA standard established by the World Organization for Animal Health (OIE), and associated guidance, meets the needs of the SPS agreement is discussed. The use of scenario trees is the core modelling approach used to represent the steps necessary for the hazard to occur. There is scope to elaborate scenario trees for commodity IRA so that the quantity of hazard at each step is assessed, which is crucial to the likelihood of establishment. The dependence between exposure and establishment suggests that they should fall within the same subcomponent. IRA undertaken for trade reasons must include an assessment of consequences to meet SPS criteria, but guidance is sparse. The integration of epidemiological and economic modelling may open a path for better methods. Matrices have been used in qualitative IRA to combine estimates of entry and exposure, and consequences with likelihood, but this approach has flaws and better methods are needed. OIE IRA standards and guidance indicate that the volume of trade should be taken into account, but offer no detail. Some published qualitative IRAs have assumed current levels and patterns of trade without specifying the volume of trade, which constrains the use of IRA to determine mitigation measures (to reduce risk to an acceptable level) and whether the principle of equivalence, fundamental to the SPS agreement, has been observed. It is questionable whether qualitative IRA can meet all the criteria set out in the SPS agreement. Nevertheless, scope exists to elaborate the current standards and guidance, so they better serve the principle of science-based decision-making. © 2013 Crown copyright. 
This article is published with the permission of the Controller of HMSO and the Queen's Printer for Scotland.
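The scenario-tree core of an IRA reduces, in its simplest quantitative form, to multiplying conditional step probabilities along the pathway from release to establishment; the steps and numbers below are illustrative only:

```python
# Minimal scenario-tree sketch: the likelihood that a hazard completes every
# step of the pathway is the product of the conditional step probabilities.
# All probabilities are illustrative, not from any actual IRA.
steps = {
    "agent present in exporting herd": 0.10,
    "survives processing and transport": 0.50,
    "susceptible animal exposed": 0.20,
    "infection establishes": 0.30,
}

def pathway_probability(step_probs):
    p = 1.0
    for prob in step_probs.values():
        p *= prob
    return p

print(pathway_probability(steps))  # 0.1 * 0.5 * 0.2 * 0.3 = 0.003
```

A fuller commodity IRA would also track the quantity of hazard at each step, since, as the review notes, that quantity drives the likelihood of establishment.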
Stirling Analysis Comparison of Commercial vs. High-Order Methods
NASA Technical Reports Server (NTRS)
Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako
2007-01-01
Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.
Stirling Analysis Comparison of Commercial Versus High-Order Methods
NASA Technical Reports Server (NTRS)
Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako
2005-01-01
Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model, although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.
Measurement of edge residual stresses in glass by the phase-shifting method
NASA Astrophysics Data System (ADS)
Ajovalasit, A.; Petrucci, G.; Scafidi, M.
2011-05-01
Control and measurement of residual stress in glass is of great importance in the industrial field. Since glass is a birefringent material, the residual stress analysis is based mainly on the photoelastic method. This paper considers two methods of automated analysis of membrane residual stress in glass sheets, based on the phase-shifting concept in monochromatic light. In particular these methods are the automated versions of goniometric compensation methods of Tardy and Sénarmont. The proposed methods can effectively replace manual methods of compensation (goniometric compensation of Tardy and Sénarmont, Babinet and Babinet-Soleil compensators) provided by current standards on the analysis of residual stresses in glasses.
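The goniometric compensation that the paper automates rests on two standard photoelastic relations: the Tardy method gives the fractional fringe order from the analyzer rotation, and the stress-optic law converts fringe order to principal stress difference. A sketch with illustrative values (the material fringe value and thickness below are assumptions):

```python
def tardy_fringe_order(n, analyzer_rotation_deg):
    """Total fringe order by the Tardy method: integer order n plus the
    fractional part theta/180 from the analyzer rotation theta (degrees)."""
    return n + analyzer_rotation_deg / 180.0

def principal_stress_difference(N, fringe_value, thickness_mm):
    """Stress-optic law: sigma1 - sigma2 = N * f_sigma / t."""
    return N * fringe_value / thickness_mm

N = tardy_fringe_order(2, 45.0)  # 2.25 fringes
# Illustrative f_sigma = 7 N/(mm*fringe), t = 5 mm -> 3.15 N/mm^2 (MPa)
print(principal_stress_difference(N, 7.0, 5.0))
```

Phase-shifting automates the measurement of the fractional term from a set of intensity images instead of a manual analyzer rotation.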
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harding, Samuel F.; Romero-Gomez, Pedro D. J.; Richmond, Marshall C.
Standards provide recommendations for the best practices in the installation of current meters for measuring fluid flow in closed conduits. These include PTC-18 and IEC-41. Both of these standards refer to the requirements of ISO Standard 3354 for cases where the velocity distribution is assumed to be regular and the flow steady. Due to the nature of the short converging intakes of Kaplan hydroturbines, these assumptions may be invalid if current meters are intended to be used to characterize turbine flows. In this study, we examine a combination of measurement guidelines from both ISO standards by means of virtual current meters (VCM) set up over a simulated hydroturbine flow field. To this purpose, a computational fluid dynamics (CFD) model was developed to model the velocity field of a short converging intake of the Ice Harbor Dam on the Snake River, in the State of Washington. The detailed geometry and resulting wake of the submersible traveling screen (STS) at the first gate slot was of particular interest in the development of the CFD model using a detached eddy simulation (DES) turbulence solution. An array of virtual point velocity measurements was extracted from the resulting velocity field to simulate VCM at two virtual measurement (VM) locations at different distances downstream of the STS. The discharge through each bay was calculated from the VM using the graphical integration solution to the velocity-area method. This method of representing practical velocimetry techniques in a numerical flow field has been successfully used in a range of marine and conventional hydropower applications. A sensitivity analysis was performed to observe the effect of the VCM array resolution on the discharge error. The downstream VM section required 11–33% fewer VCM in the array than the upstream VM location to achieve a given discharge error. In general, more instruments were required to quantify the discharge at high levels of accuracy when the STS was introduced because of the increased spatial variability of the flow velocity.
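The velocity-area method used to compute bay discharge from the VCM arrays is, at heart, a weighted sum of point velocities over the panel areas they represent; a uniform-grid sketch with illustrative velocities, not values from the Ice Harbor model:

```python
# Velocity-area sketch: discharge through a measurement section is the sum of
# point velocities times the panel area each (virtual) current meter represents.
def discharge(velocities, panel_area):
    """Q = sum(v_i * A_i) for a uniform grid of current meters."""
    return sum(v * panel_area for v in velocities)

# 3 x 4 grid of normal velocities (m/s) on panels of 0.5 m^2 each (illustrative)
v_grid = [1.9, 2.0, 2.1, 2.0,
          2.2, 2.3, 2.4, 2.2,
          1.8, 1.9, 2.0, 1.9]
print(round(discharge(v_grid, 0.5), 2))  # about 12.35 m^3/s
```

The graphical integration solution in the standards refines this by fitting the velocity profile between meters rather than assuming it constant over each panel.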
Ingrassia, Pier Luigi; Foletti, Marco; Djalali, Ahmadreza; Scarone, Piercarlo; Ragazzoni, Luca; Corte, Francesco Della; Kaptan, Kubilay; Lupescu, Olivera; Arculeo, Chris; von Arnim, Gotz; Friedl, Tom; Ashkenazi, Michael; Heselmann, Deike; Hreckovski, Boris; Khorram-Manesh, Amir; Komadina, Radko; Lechner, Kostanze; Patru, Cristina; Burkle, Frederick M; Fisher, Philipp
2014-04-01
Education and training are key elements of disaster management. Despite national and international educational programs in disaster management, there is no standardized curriculum available to guide the European Union (EU) member states. The European-based Disaster Training Curriculum (DITAC), a multiple university-based project financially supported by the EU, is charged with developing a holistic and highly-structured curriculum and courses for responders and crisis managers at a strategic and tactical level. The purpose of this study is to qualitatively assess the prevailing preferences and characteristics of disaster management educational and training initiatives (ETIs) at a postgraduate level that currently exist in the EU countries. An Internet-based qualitative search was conducted in 2012 to identify and analyze the current training programs in disaster management. The course characteristics were evaluated for curriculum, teaching methods, modality of delivery, target groups, and funding. The literature search identified 140 ETIs, the majority (78%) located in the United Kingdom, France, and Germany. Master level degrees were the primary certificates granted to graduates. Face-to-face education was the most common teaching method (84%). Approximately 80% of the training initiatives offered multi- and cross-disciplinary disaster management content. A competency-based approach to curriculum content was present in 61% of the programs. Emergency responders at the tactical level were the main target group. Almost all programs were self-funded. Although ETIs currently exist, they are not broadly available in all 27 EU countries. Also, the curricula do not cover all key elements of disaster management in a standardized and competency-based structure.
This study has identified the need to develop a standardized competency-based educational and training program for all European countries that will ensure the practice and policies that meet both the standards of care and the broader expectations for professionalization of the disaster and crisis workforce.
Assembling Appliances Standards from a Basket of Functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siderious, Hans-Paul; Meier, Alan
2014-08-11
Rapid innovation in product design challenges the current methodology for setting standards and labels, especially for electronics, software and networking. Major problems include defining the product, measuring its energy consumption, and choosing the appropriate metric and level for the standard. Most governments have tried to solve these problems by defining ever more specific product subcategories, along with their corresponding test methods and metrics. An alternative approach would treat each energy-using product as something that delivers a basket of functions. Then separate standards would be constructed for the individual functions that can be defined, tested, and evaluated. Case studies of thermostats, displays and network equipment are presented to illustrate the problems with the classical approach for setting standards and indicate the merits and drawbacks of the alternative. The functional approach appears best suited to products whose primary purpose is processing information and that have multiple functions.
Wavelet/scalar quantization compression standard for fingerprint images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brislawn, C.M.
1996-06-12
The US Federal Bureau of Investigation (FBI) has recently formulated a national standard for digitization and compression of gray-scale fingerprint images. Fingerprints are scanned at a spatial resolution of 500 dots per inch, with 8 bits of gray-scale resolution. The compression algorithm for the resulting digital images is based on adaptive uniform scalar quantization of a discrete wavelet transform subband decomposition (wavelet/scalar quantization method). The FBI standard produces archival-quality images at compression ratios of around 15 to 1 and will allow the current database of paper fingerprint cards to be replaced by digital imagery. The compression standard specifies a class of potential encoders and a universal decoder with sufficient generality to reconstruct compressed images produced by any compliant encoder, allowing flexibility for future improvements in encoder technology. A compliance testing program is also being implemented to ensure high standards of image quality and interchangeability of data between different implementations.
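The scalar-quantization half of the method can be illustrated with a bare uniform quantizer: each transform coefficient is mapped to an integer bin index and reconstructed at the bin center. The real WSQ specification uses subband-dependent step sizes and a dead zone around zero, so the single step below is purely illustrative:

```python
# Uniform scalar quantization sketch (no dead zone, unlike the actual WSQ spec).
def quantize(coeffs, step):
    """Map each coefficient to its nearest integer bin index."""
    return [round(c / step) for c in coeffs]

def dequantize(indices, step):
    """Reconstruct each coefficient at its bin center."""
    return [i * step for i in indices]

subband = [10.2, -3.7, 0.4, 25.1]   # illustrative wavelet coefficients
idx = quantize(subband, step=2.0)
print(idx)                          # prints [5, -2, 0, 13]
print(dequantize(idx, step=2.0))    # reconstruction error bounded by step/2
```

Compression comes from entropy-coding the integer indices, which cluster heavily around zero in the high-frequency subbands.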
Standards for vision science libraries: 2014 revision
Motte, Kristin; Caldwell, C. Brooke; Lamson, Karen S.; Ferimer, Suzanne; Nims, J. Chris
2014-01-01
Objective: This Association of Vision Science Librarians revision of the “Standards for Vision Science Libraries” aspires to provide benchmarks to address the needs for the services and resources of modern vision science libraries (academic, medical or hospital, pharmaceutical, and so on), which share a core mission, are varied by type, and are located throughout the world. Methods: Through multiple meeting discussions, member surveys, and a collaborative revision process, the standards have been updated for the first time in over a decade. Results: While the range of types of libraries supporting vision science services, education, and research is wide, all libraries, regardless of type, share core attributes, which the standards address. Conclusions: The current standards can and should be used to help develop new vision science libraries or to expand the growth of existing libraries, as well as to support vision science librarians in their work to better provide services and resources to their respective users. PMID:25349547
Safaei-Asl, Afshin; Enshaei, Mercede; Heydarzadeh, Abtin; Maleknejad, Shohreh
2016-01-01
Assessment of glomerular filtration rate (GFR) is an important tool for monitoring renal function. Given the limitations of available methods, we aimed to calculate GFR by cystatin C (Cys C) based formulas and determine their correlation with current methods. We studied 72 children (38 boys and 34 girls) with renal disorders. The 24-hour urinary creatinine (Cr) clearance was the gold standard method. GFR was measured with the Schwartz formula and Cys C-based formulas (Grubb, Hoek, Larsson and Simple). The correlation of each formula with the standard was then determined. Using the Pearson correlation coefficient, a significant positive correlation between all formulas and the standard method was seen (R(2) for the Schwartz, Hoek, Larsson, Grubb and Simple formulas was 0.639, 0.722, 0.705, 0.712, 0.722, respectively) (P<0.001). Cys C-based formulas could predict the variance of the standard method results with high power. These formulas correlated with the Schwartz formula with R(2) of 0.62-0.65 (intermediate correlation). Linear regression with a constant (y-intercept) revealed that the Larsson, Hoek and Grubb formulas can estimate GFR with no statistical difference compared with the standard method, but the Schwartz and Simple formulas overestimate GFR. This study shows that Cys C-based formulas have a strong relationship with 24-hour urinary Cr clearance. Hence, they can determine GFR in children with kidney injury more easily and with sufficient accuracy. This helps the physician diagnose renal disease in its early stages and improves prognosis.
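Two of the formulas compared above can be sketched directly; the coefficients are the commonly published ones and should be verified against the original papers before any clinical use:

```python
def schwartz_gfr(height_cm, serum_cr_mg_dl, k=0.55):
    """Original Schwartz estimate: GFR = k * height / serum creatinine.
    k varies with age and sex; 0.55 is a commonly cited pediatric value."""
    return k * height_cm / serum_cr_mg_dl

def hoek_gfr(cys_c_mg_l):
    """Hoek formula: GFR = -4.32 + 80.35 / cystatin C.
    Coefficients as commonly published; verify before clinical use."""
    return -4.32 + 80.35 / cys_c_mg_l

# Illustrative inputs, not study data (units: cm, mg/dL, mg/L)
print(round(schwartz_gfr(120, 0.6), 1))  # mL/min/1.73 m^2
print(round(hoek_gfr(0.9), 1))
```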
NASA Astrophysics Data System (ADS)
To, Anthony; Downs, Corey; Fu, Elain
2017-05-01
Wax printing has become a common method of fabricating channels in cellulose-based microfluidic devices. However, a limitation of wax printing is that it is restricted to relatively thin, smooth substrates that are compatible with processing by a commercial wax printer. In the current report, we describe a simple patterning method that extends the utility of wax printers for creating hydrophobic barriers on non-standard porous substrates via a process called wax transfer printing. We demonstrate the use of multiple wax transfer cycles to create well-defined, robust, and reproducible barriers in a thick cellulose substrate that is not compatible with feeding through a wax printer. We characterize the method for (i) wax spreading within the substrate as a function of heating time, (ii) the ability to create functional barriers in a substrate, and (iii) reproducibility in line width.
Kokoris, M; Nabavi, M; Lancaster, C; Clemmens, J; Maloney, P; Capadanno, J; Gerdes, J; Battrell, C F
2005-09-01
One current challenge facing point-of-care cancer detection is that existing methods make it difficult, time consuming and too costly to (1) collect relevant cell types directly from a patient sample, such as blood and (2) rapidly assay those cell types to determine the presence or absence of a particular type of cancer. We present a proof of principle method for an integrated, sample-to-result, point-of-care detection device that employs microfluidics technology, accepted assays, and a silica membrane for total RNA purification on a disposable, credit card sized laboratory-on-card ("lab card") device in which results are obtained in minutes. Both yield and quality of on-card purified total RNA, as determined by both LightCycler and standard reverse transcriptase amplification of G6PDH and BCR-ABL transcripts, were found to be better than or equal to accepted standard purification methods.
Solar cell and module performance assessment based on indoor calibration methods
NASA Astrophysics Data System (ADS)
Bogus, K.
A combined space/terrestrial solar cell test calibration method that requires five steps and can be performed indoors is described. The test conditions are designed to qualify the cell or module output data in standard illumination and temperature conditions. Measurements are made of the short-circuit current, the open circuit voltage, the maximum power, the efficiency, and the spectral response. Standard sunlight must be replicated both in earth surface and AM0 conditions; Xe lamps are normally used for the light source, with spectral measurements taken of the light. Cell and module spectral response are assayed by using monochromators and narrow band pass monochromatic filters. Attention is required to define the performance characteristics of modules under partial shadowing. Error sources that may affect the measurements are discussed, as are previous cell performance testing and calibration methods and their effectiveness in comparison with the behaviors of satellite solar power panels.
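The standard figures of merit follow directly from the measured quantities listed above; a sketch with illustrative numbers (standard reporting conditions fix the irradiance and temperature, e.g. 1000 W/m^2 at 25 C for terrestrial, roughly 1367 W/m^2 for AM0):

```python
# Figures of merit from an illuminated I-V measurement; values are illustrative.
def fill_factor(p_max, i_sc, v_oc):
    """FF = P_max / (I_sc * V_oc)."""
    return p_max / (i_sc * v_oc)

def efficiency(p_max, area_m2, irradiance_w_m2):
    """Conversion efficiency = P_max / (incident power on the cell area)."""
    return p_max / (area_m2 * irradiance_w_m2)

p_max, i_sc, v_oc = 0.30, 0.95, 0.42      # W, A, V for a small test cell
print(round(fill_factor(p_max, i_sc, v_oc), 3))
print(round(efficiency(p_max, 0.0015, 1000.0), 3))  # 20% for a 15 cm^2 cell
```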
Application of DNA-based methods in forensic entomology.
Wells, Jeffrey D; Stevens, Jamie R
2008-01-01
A forensic entomological investigation can benefit from a variety of widely practiced molecular genotyping methods. The most commonly used is DNA-based specimen identification. Other applications include the identification of insect gut contents and the characterization of the population genetic structure of a forensically important insect species. The proper application of these procedures demands that the analyst be technically expert. However, one must also be aware of the extensive list of standards and expectations that many legal systems have developed for forensic DNA analysis. We summarize the DNA techniques that are currently used in, or have been proposed for, forensic entomology and review established genetic analyses from other scientific fields that address questions similar to those in forensic entomology. We describe how accepted standards for forensic DNA practice and method validation are likely to apply to insect evidence used in a death or other forensic entomological investigation.
The Contribution of Expanding Portion Sizes to the US Obesity Epidemic
Young, Lisa R.; Nestle, Marion
2002-01-01
Objectives. Because larger food portions could be contributing to the increasing prevalence of overweight and obesity, this study was designed to weigh samples of marketplace foods, identify historical changes in the sizes of those foods, and compare current portions with federal standards. Methods. We obtained information about current portions from manufacturers or from direct weighing; we obtained information about past portions from manufacturers or contemporary publications. Results. Marketplace food portions have increased in size and now exceed federal standards. Portion sizes began to grow in the 1970s, rose sharply in the 1980s, and have continued in parallel with increasing body weights. Conclusions. Because energy content increases with portion size, educational and other public health efforts to address obesity should focus on the need for people to consume smaller portions. PMID:11818300
Final Report on Jobin Yvon Contained Inductively Coupled Plasma Emission Spectrometer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pennebaker, F.M.
2003-03-17
A new Inductively Coupled Plasma -- Emission Spectrometer (ICP-ES) was recently purchased and installed in Lab B-147/151 at SRTC. The contained JY Model Ultima 170-C ICP-ES has been tested and compared to current ADS ICP-ES instrumentation. The testing has included both performance tests to evaluate instrumental ability, and the measurement of matrix standards commonly analyzed by ICP-ES at Savannah River. In developing operating procedures for this instrument, we have implemented the use of internal standards and off-peak background subtraction. Both of these techniques are recommended by EPA SW-846 ICP-ES methods and are common to current ICP-ES operations. Based on the testing and changes, the JY Model Ultima 170-C ICP-ES provides improved performance for elemental analysis of radioactive samples in the Analytical Development Section.
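The two corrections mentioned, internal standards and off-peak background subtraction, are simple arithmetic on measured intensities; a sketch with illustrative counts:

```python
# Sketch of two routine ICP-ES corrections; intensities are illustrative counts.
def background_corrected(peak, off_peak_left, off_peak_right):
    """Subtract the averaged off-peak background from the on-peak intensity."""
    return peak - (off_peak_left + off_peak_right) / 2.0

def internal_standard_corrected(analyte, is_measured, is_expected):
    """Scale the analyte signal by the internal standard's recovery to
    compensate for drift and matrix effects."""
    return analyte * (is_expected / is_measured)

net = background_corrected(10500.0, 480.0, 520.0)   # 10000.0 net counts
print(internal_standard_corrected(net, is_measured=9500.0, is_expected=10000.0))
```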
NASA Astrophysics Data System (ADS)
Green, David L.; Berry, Lee A.; Simpson, Adam B.; Younkin, Timothy R.
2018-04-01
We present the KINETIC-J code, a computational kernel for evaluating the linearized Vlasov equation with application to calculating the kinetic plasma response (current) to an applied time harmonic wave electric field. This code addresses the need for a configuration space evaluation of the plasma current to enable kinetic full-wave solvers for waves in hot plasmas to move beyond the limitations of the traditional Fourier spectral methods. We benchmark the kernel via comparison with the standard k-space forms of the hot plasma conductivity tensor.
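For reference, a standard textbook form of the linearized Vlasov equation and the configuration-space plasma current it yields (conventions vary by author; a single species is shown, and the total response sums over species):

```latex
% Linearized Vlasov equation for the perturbation f_1 about an equilibrium f_0,
% driven by a time-harmonic wave field (E_1, B_1) in a background field B_0:
\frac{\partial f_1}{\partial t} + \mathbf{v}\cdot\nabla f_1
  + \frac{q}{m}\left(\mathbf{v}\times\mathbf{B}_0\right)\cdot\frac{\partial f_1}{\partial \mathbf{v}}
  = -\frac{q}{m}\left(\mathbf{E}_1 + \mathbf{v}\times\mathbf{B}_1\right)\cdot\frac{\partial f_0}{\partial \mathbf{v}}

% The kinetic plasma response (current) evaluated in configuration space:
\mathbf{J}_p(\mathbf{r}) = q \int \mathbf{v}\, f_1(\mathbf{r},\mathbf{v})\, d^3v
```

The traditional Fourier route instead expresses this response through the hot plasma conductivity tensor in k-space, which is the benchmark the paper compares against.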
Towards a framework for developing semantic relatedness reference standards.
Pakhomov, Serguei V S; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B; Ruggieri, Alexander; Chute, Christopher G
2011-04-01
Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of the medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the "moderate" range, we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide both the reference standard containing individual ratings and the R program used to analyze the ratings as open-source. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics including automatic classification, information retrieval from medical records and vocabulary/ontology development. Copyright © 2010 Elsevier Inc. All rights reserved.
The development of professional practice standards for Australian general practice nurses.
Halcomb, Elizabeth; Stephens, Moira; Bryce, Julianne; Foley, Elizabeth; Ashley, Christine
2017-08-01
The aim of this study was to explore the current role of general practice nurses and the scope of nursing practice to inform the development of national professional practice standards for Australian general practice nurses. Increasing numbers of nurses have been employed in Australian general practice to meet the growing demand for primary care services. This has brought significant changes to the nursing role. Competency standards for nurses working in general practice were first developed in Australia in 2005, but limited attention has been placed on articulating the contemporary scope of practice for nurses in this setting. Concurrent mixed methods design. Data collection was conducted during 2013-2014 and involved two online surveys of Registered and Enrolled Nurses currently working in general practice, a series of 14 focus groups across Australia and a series of consultations with key experts. Data collection enabled the development of 22 Practice Standards separated into four domains: (i) Professional Practice; (ii) Nursing Care; (iii) General Practice Environment and (iv) Collaborative Practice. To differentiate the variations in enacting these Standards, performance indicators for the Enrolled Nurse, Registered Nurse and Registered Nurse Advanced Practice are provided under each Standard. The development of national professional practice standards for nurses working in Australian general practice will support ongoing workforce development. These Standards are also an important means of articulating the role and scope of the nurses' practice for both consumers and other health professionals, as well as being a guide for curriculum development and measurement of performance. © 2017 John Wiley & Sons Ltd.
Schoenberg, Mike R; Lange, Rael T; Saklofske, Donald H
2007-11-01
Establishing a comparison standard in neuropsychological assessment is crucial to determining change in function. There is no available method to estimate premorbid intellectual functioning for the Wechsler Intelligence Scale for Children-Fourth Edition (WISC-IV). The WISC-IV provided normative data for both American and Canadian children aged 6 to 16 years old. This study developed regression algorithms as a proposed method to estimate full-scale intelligence quotient (FSIQ) for the Canadian WISC-IV. Participants were the Canadian WISC-IV standardization sample (n = 1,100). The sample was randomly divided into two groups (development and validation groups). The development group was used to generate regression algorithms; one algorithm included only demographics, and 11 combined demographic variables with WISC-IV subtest raw scores. The algorithms accounted for 18% to 70% of the variance in FSIQ (standard error of estimate, SEE = 8.6 to 14.2). Estimated FSIQ significantly correlated with actual FSIQ (r = .30 to .80), and the majority of individual FSIQ estimates were within +/-10 points of actual FSIQ. The demographic-only algorithm was less accurate than algorithms combining demographic variables with subtest raw scores. The current algorithms yielded accurate estimates of current FSIQ for Canadian individuals aged 6-16 years old. The potential application of the algorithms to estimate premorbid FSIQ is reviewed. While promising, clinical validation of the algorithms in a sample of children and/or adolescents with known neurological dysfunction is needed to establish these algorithms as a premorbid estimation procedure.
[Surgical treatment of burns : Special aspects of pediatric burns].
Bührer, G; Beier, J P; Horch, R E; Arkudas, A
2017-05-01
Treatment of pediatric burn patients is very important because of the sheer frequency of burn wounds and their possible long-term ramifications. Extensive burns need special care and are treated in specialized burn centers. The goal of this work is to present current standards in burn therapy and important innovations in the treatment of burns in children, so that common, small-area burn wounds and scalds in pediatric patients can be adequately treated in day-to-day dermatological practice. Analysis of current literature, discussion of reviews, incorporation of current guidelines. Burns in pediatric patients are common. Improvement of survival can be achieved by treatment in burn centers. The assessment of burn depth and area is an important factor for proper treatment. We give an overview of outpatient treatment of partial-thickness burns. New methods may result in better long-term outcomes. Adequate treatment of burn injuries in line with current literature and guidelines improves patient outcome. Rational implementation of new methods is recommended.
Finite-element lattice Boltzmann simulations of contact line dynamics
NASA Astrophysics Data System (ADS)
Matin, Rastin; Misztal, Marek Krzysztof; Hernández-García, Anier; Mathiesen, Joachim
2018-01-01
The lattice Boltzmann method has become one of the standard techniques for simulating a wide range of fluid flows. However, the intrinsic coupling of momentum and space discretization restricts the traditional lattice Boltzmann method to regular lattices. Alternative off-lattice Boltzmann schemes exist for both single- and multiphase flows that decouple the velocity discretization from the underlying spatial grid. The current study extends the applicability of these off-lattice methods by introducing a finite element formulation that enables simulating contact line dynamics for partially wetting fluids. This work exemplifies the implementation of the scheme and furthermore presents benchmark experiments that show the scheme reduces spurious currents at the liquid-vapor interface by at least two orders of magnitude compared to a nodal implementation and allows for predicting the equilibrium states accurately in the range of moderate contact angles.
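For orientation, a minimal nodal D2Q9 lattice Boltzmann step (BGK collision plus streaming on a regular periodic lattice) can be sketched as below. This is the standard on-lattice scheme the abstract contrasts against, not the finite-element off-lattice formulation itself; grid size and relaxation time are arbitrary illustrative choices.

```python
import numpy as np

# D2Q9 lattice: discrete velocities and quadrature weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, u):
    """Second-order truncated Maxwell-Boltzmann equilibrium on D2Q9."""
    cu = np.einsum('qd,xyd->qxy', c, u)       # c_q . u
    uu = np.einsum('xyd,xyd->xy', u, u)       # |u|^2
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*uu)

def step(f, tau=0.8):
    """One collide-and-stream update with BGK relaxation, periodic walls."""
    rho = f.sum(axis=0)
    u = np.einsum('qd,qxy->xyd', c, f) / rho[..., None]
    f = f + (equilibrium(rho, u) - f) / tau   # BGK collision
    for q in range(9):                        # streaming by lattice shifts
        f[q] = np.roll(f[q], shift=tuple(c[q]), axis=(0, 1))
    return f

# Small density perturbation on a 32x32 periodic grid
nx = ny = 32
rho0 = 1.0 + 0.01 * np.sin(2*np.pi*np.arange(nx)/nx)[:, None] * np.ones((nx, ny))
f = equilibrium(rho0, np.zeros((nx, ny, 2)))
mass0 = f.sum()
for _ in range(10):
    f = step(f)
```

Because both the BGK collision and the roll-based streaming conserve mass exactly, the total density is unchanged to floating-point precision, a basic sanity check for any lattice Boltzmann implementation.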
AmO₂ Analysis for Analytical Method Testing and Assessment: Analysis Support for AmO₂ Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhn, Kevin John; Bland, Galey Jean; Fulwyler, James Brent
Americium oxide samples will be measured for various analytes to support AmO₂ production. The key analytes currently requested by the Am production customer at LANL include total Am content, Am isotopics, Pu assay, Pu isotopics, and trace element content including 237Np content. Multiple analytical methods will be utilized depending on the sensitivity, accuracy and precision needs of the Am matrix. Traceability to the National Institute of Standards and Technology (NIST) will be achieved, where applicable, by running NIST-traceable quality control materials, given that there are no suitable AmO₂ reference materials currently available for the requested analytes. The primary objective is to demonstrate the suitability of actinide analytical chemistry methods to support AmO₂ production operations.
Determination of Ethanol in Kombucha Products: Single-Laboratory Validation, First Action 2016.12.
Ebersole, Blake; Liu, Ying; Schmidt, Rich; Eckert, Matt; Brown, Paula N
2017-05-01
Kombucha is a fermented nonalcoholic beverage that has drawn government attention due to the possible presence of excess ethanol (≥0.5% alcohol by volume; ABV). A validated method that provides better precision and accuracy for measuring ethanol levels in kombucha is urgently needed by the kombucha industry. The current study validated a method for determining ethanol content in commercial kombucha products. The ethanol content in kombucha was measured using headspace GC with flame ionization detection. An ethanol standard curve ranging from 0.05 to 5.09% ABV was used, with correlation coefficients greater than 99.9%. The method detection limit was 0.003% ABV and the LOQ was 0.01% ABV. The RSDr ranged from 1.62 to 2.21% and the Horwitz ratio ranged from 0.4 to 0.6. The average accuracy of the method was 98.2%. This method was validated following the guidelines for single-laboratory validation by AOAC INTERNATIONAL and meets the requirements set by AOAC SMPR 2016.001, "Standard Method Performance Requirements for Determination of Ethanol in Kombucha."
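The calibration-curve and Horwitz-ratio figures quoted above can be reproduced schematically. The calibration data below are invented for illustration; only the Horwitz formula PRSD(%) = 2·C^(-0.1505), with C a dimensionless mass fraction, comes from standard practice.

```python
import numpy as np

# Hypothetical calibration: ethanol standards (% ABV) vs GC-FID peak-area
# ratio to an internal standard (values invented, not from the paper)
conc = np.array([0.05, 0.1, 0.5, 1.0, 2.5, 5.09])
area = np.array([0.021, 0.041, 0.203, 0.409, 1.014, 2.061])

slope, intercept = np.polyfit(conc, area, 1)   # linear standard curve
r = np.corrcoef(conc, area)[0, 1]              # abstract requires r > 0.999

def horrat(rsd_r_percent, conc_mass_fraction):
    """Horwitz ratio: observed repeatability RSD over the Horwitz-predicted
    RSD, PRSD(%) = 2 * C**(-0.1505) with C a mass fraction."""
    prsd = 2.0 * conc_mass_fraction ** (-0.1505)
    return rsd_r_percent / prsd

# e.g. an RSDr of 2% at roughly 1% ethanol (C = 0.01)
hr = horrat(2.0, 0.01)
```

At C = 0.01 the Horwitz-predicted RSD is about 4%, so an observed RSDr of 2% gives a HorRat near 0.5, consistent with the 0.4 to 0.6 range reported above.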
Evaluating Sleep Disturbance: A Review of Methods
NASA Technical Reports Server (NTRS)
Smith, Roy M.; Oyung, R.; Gregory, K.; Miller, D.; Rosekind, M.; Rosekind, Mark R. (Technical Monitor)
1996-01-01
There are three general approaches to evaluating noise-related sleep disturbance: subjective, behavioral, and physiological. Subjective methods range from standardized questionnaires and scales to self-report measures designed for specific research questions. There are two behavioral methods that provide useful sleep disturbance data. One behavioral method is actigraphy, a motion detector that provides an empirical estimate of sleep quantity and quality. An actigraph, worn on the non-dominant wrist, provides a 24-hr estimate of the rest/activity cycle. The other method involves a behavioral response, either to a specific probe or stimulus, or subject initiated (e.g., indicating wakefulness). The classic gold standard for evaluating sleep disturbance is continuous physiological monitoring of brain, eye, and muscle activity. This allows detailed distinctions of the states and stages of sleep, awakenings, and sleep continuity. Physiological data can be obtained in controlled laboratory settings and in natural environments. Current ambulatory physiological recording equipment allows evaluation in home and work settings. These approaches will be described and the relative strengths and limitations of each method will be discussed.
Micro-costing studies in the health and medical literature: protocol for a systematic review
2014-01-01
Background Micro-costing is a cost estimation method that allows for precise assessment of the economic costs of health interventions. It has been demonstrated to be particularly useful for estimating the costs of new interventions, for interventions with large variability across providers, and for estimating the true costs to the health system and to society. However, existing guidelines for economic evaluations do not provide sufficient detail of the methods and techniques to use when conducting micro-costing analyses. Therefore, the purpose of this study is to review the current literature on micro-costing studies of health and medical interventions, strategies, and programs to assess the variation in micro-costing methodology and the quality of existing studies. This will inform current practice in conducting and reporting micro-costing studies and lead to greater standardization in methodology in the future. Methods/Design We will perform a systematic review of the current literature on micro-costing studies of health and medical interventions, strategies, and programs. Using rigorously designed search strategies, we will search Ovid MEDLINE, EconLit, BIOSIS Previews, Embase, Scopus, and the National Health Service Economic Evaluation Database (NHS EED) to identify relevant English-language articles. These searches will be supplemented by a review of the references of relevant articles identified. Two members of the review team will independently extract detailed information on the design and characteristics of each included article using a standardized data collection form. A third reviewer will be consulted to resolve discrepancies. We will use checklists that have been developed for critical appraisal of health economics studies to evaluate the quality and potential risk of bias of included studies. 
Discussion This systematic review will provide useful information to help standardize the methods and techniques for conducting and reporting micro-costing studies in research, which can improve the quality and transparency of future studies and enhance comparability and interpretation of findings. In the long run, these efforts will facilitate clinical and health policy decision-making about resource allocation. Trial registration Systematic review registration: PROSPERO CRD42014007453. PMID:24887208
Vision 20/20: perspectives on automated image segmentation for radiotherapy.
Sharp, Gregory; Fritscher, Karl D; Pekar, Vladimir; Peroni, Marta; Shusharina, Nadya; Veeraraghavan, Harini; Yang, Jinzhong
2014-05-01
Due to rapid advances in radiation therapy (RT), especially image guidance and treatment adaptation, a fast and accurate segmentation of medical images is a very important part of the treatment. Manual delineation of target volumes and organs at risk is still the standard routine for most clinics, even though it is time consuming and prone to intra- and interobserver variations. Automated segmentation methods seek to reduce delineation workload and unify the organ boundary definition. In this paper, the authors review the current autosegmentation methods particularly relevant for applications in RT. The authors outline the methods' strengths and limitations and propose strategies that could lead to wider acceptance of autosegmentation in routine clinical practice. The authors conclude that currently, autosegmentation technology in RT planning is an efficient tool for the clinicians to provide them with a good starting point for review and adjustment. Modern hardware platforms including GPUs allow most of the autosegmentation tasks to be done in a range of a few minutes. In the nearest future, improvements in CT-based autosegmentation tools will be achieved through standardization of imaging and contouring protocols. In the longer term, the authors expect a wider use of multimodality approaches and better understanding of correlation of imaging with biology and pathology.
Genetic Testing as a New Standard for Clinical Diagnosis of Color Vision Deficiencies
Davidoff, Candice; Neitz, Maureen; Neitz, Jay
2016-01-01
Purpose The genetics underlying inherited color vision deficiencies is well understood: causative mutations change the copy number or sequence of the long (L), middle (M), or short (S) wavelength sensitive cone opsin genes. This study evaluated the potential of opsin gene analyses for use in clinical diagnosis of color vision defects. Methods We tested 1872 human subjects using direct sequencing of opsin genes and a novel genetic assay that characterizes single nucleotide polymorphisms (SNPs) using the MassArray system. Of the subjects, 1074 also were given standard psychophysical color vision tests for a direct comparison with current clinical methods. Results Protan and deutan deficiencies were classified correctly in all subjects identified by MassArray as having red–green defects. Estimates of defect severity based on SNPs that control photopigment spectral tuning correlated with estimates derived from Nagel anomaloscopy. Conclusions The MassArray assay provides genetic information that can be useful in the diagnosis of inherited color vision deficiency including presence versus absence, type, and severity, and it provides information to patients about the underlying pathobiology of their disease. Translational Relevance The MassArray assay provides a method that directly analyzes the molecular substrates of color vision that could be used in combination with, or as an alternative to current clinical diagnosis of color defects. PMID:27622081
Lung function imaging methods in Cystic Fibrosis pulmonary disease.
Kołodziej, Magdalena; de Veer, Michael J; Cholewa, Marian; Egan, Gary F; Thompson, Bruce R
2017-05-17
Monitoring of pulmonary physiology is fundamental to the clinical management of patients with Cystic Fibrosis. The current standard clinical practice uses spirometry to assess lung function, which delivers a clinically relevant functional readout of total lung function but does not supply any visual or localised information. High Resolution Computed Tomography (HRCT) is a well-established current 'gold standard' method for monitoring lung anatomical changes in Cystic Fibrosis patients. HRCT provides excellent morphological information; however, the X-ray radiation dose can become significant if multiple scans are required to monitor chronic diseases such as cystic fibrosis. X-ray phase-contrast imaging is another emerging X-ray based methodology for Cystic Fibrosis lung assessment which provides dynamic morphological and functional information, albeit with even higher X-ray doses than HRCT. Magnetic Resonance Imaging (MRI) is a non-ionising radiation imaging method that is garnering growing interest among researchers and clinicians working with Cystic Fibrosis patients. Recent advances in MRI have opened up the possibility of observing lung function in real time, potentially allowing sensitive and accurate assessment of disease progression. The use of hyperpolarized gas or non-contrast enhanced MRI can be tailored to clinical needs. While MRI offers significant promise, it still suffers from poor spatial resolution and requires the development of an objective scoring system, especially for ventilation assessment.
A Microfluidic, High Throughput Protein Crystal Growth Method for Microgravity
Carruthers Jr, Carl W.; Gerdts, Cory; Johnson, Michael D.; Webb, Paul
2013-01-01
The attenuation of sedimentation and convection in microgravity can sometimes decrease irregularities formed during macromolecular crystal growth. Current terrestrial protein crystal growth (PCG) capabilities are very different from those used during the Shuttle era and those currently on the International Space Station (ISS). The focus of this experiment was to demonstrate the use of a commercial off-the-shelf, high throughput PCG method in microgravity. Using Protein BioSolutions' microfluidic Plug Maker™/CrystalCard™ system, we tested the ability to grow crystals of the regulator of glucose metabolism and adipogenesis: peroxisome proliferator-activated receptor gamma (apo-hPPAR-γ LBD), as well as several PCG standards. Overall, we sent 25 CrystalCards™ to the ISS, containing ~10,000 individual microgravity PCG experiments in a 3U NanoRacks NanoLab (1U = 10³ cm³). After 70 days on the ISS, our samples were returned with 16 of 25 (64%) microgravity cards having crystals, compared to 12 of 25 (48%) of the ground controls. Encouragingly, there were more apo-hPPAR-γ LBD crystals in the microgravity PCG cards than in the 1g controls. These positive results support introducing the PCG standard of low sample volume and high experimental density to the microgravity environment, and provide new opportunities for macromolecular samples that may crystallize poorly in standard laboratories. PMID:24278480
A per-cent-level determination of the nucleon axial coupling from quantum chromodynamics.
Chang, C C; Nicholson, A N; Rinaldi, E; Berkowitz, E; Garron, N; Brantley, D A; Monge-Camacho, H; Monahan, C J; Bouchard, C; Clark, M A; Joó, B; Kurth, T; Orginos, K; Vranas, P; Walker-Loud, A
2018-06-01
The axial coupling of the nucleon, g_A, is the strength of its coupling to the weak axial current of the standard model of particle physics, in much the same way as the electric charge is the strength of the coupling to the electromagnetic current. This axial coupling dictates the rate at which neutrons decay to protons, the strength of the attractive long-range force between nucleons and other features of nuclear physics. Precision tests of the standard model in nuclear environments require a quantitative understanding of nuclear physics that is rooted in quantum chromodynamics, a pillar of the standard model. The importance of g_A makes it a benchmark quantity to determine theoretically, a difficult task because quantum chromodynamics is non-perturbative, precluding known analytical methods. Lattice quantum chromodynamics provides a rigorous, non-perturbative definition of quantum chromodynamics that can be implemented numerically. It has been estimated that a precision of two per cent would be possible by 2020 if two challenges are overcome [1,2]: contamination of g_A from excited states must be controlled in the calculations, and statistical precision must be improved markedly [2-10]. Here we use an unconventional method [11] inspired by the Feynman-Hellmann theorem that overcomes these challenges. We calculate a g_A value of 1.271 ± 0.013, which has a precision of about one per cent.
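The Feynman-Hellmann idea behind the method can be stated schematically (this is the generic theorem, not the paper's full lattice construction):

```latex
% Perturb the Hamiltonian by the operator of interest, here the axial-charge
% operator \hat{O}: H(\lambda) = H + \lambda \, \hat{O}
\frac{\partial E_n(\lambda)}{\partial \lambda}
  = \left\langle n(\lambda) \right|
      \frac{\partial H(\lambda)}{\partial \lambda}
    \left| n(\lambda) \right\rangle
  = \left\langle n(\lambda) \right| \hat{O} \left| n(\lambda) \right\rangle ,
\qquad
g_A = \left. \frac{\partial E_N(\lambda)}{\partial \lambda} \right|_{\lambda = 0} .
```

In words: the desired matrix element is read off from the response of the nucleon energy E_N to the coupling λ, rather than from a three-point correlation function, which is how the excited-state and statistical-noise challenges are sidestepped.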
Analysis of clonazepam in a tablet dosage form using smallbore HPLC.
Spell, J C; Stewart, J T
1998-11-01
A stability indicating, reversed phase high-performance liquid chromatographic method utilizing a smallbore HPLC column has been developed for the determination of clonazepam in a commercial tablet dosage form. The use of a smallbore column results in a substantial solvent savings, as well as a greater mass sensitivity, especially in the identification of degradation peaks in a chromatogram. The method involves ultraviolet detection at 254 nm and utilized a 150 x 3.0 mm i.d. column packed with 3 microm octadecylsilane particles with a mobile phase of water-methanol-acetonitrile (40:30:30, v/v/v) at a flow rate of 400 microl min(-1) at ambient temperature, with and without the use of 1,2-dichlorobenzene as the internal standard. The current USP method for the analysis of clonazepam, using a 300 x 3.9 mm i.d. conventional octadecylsilane column, was utilized as a comparison to the smallbore method. The retention times for clonazepam and the internal standard on the 3.0 mm i.d. column were 4.0 and 12.5 min, respectively. The intra- and interday RSDs on the 3.0 mm i.d. column were < 0.55% (n = 4) using the internal standard and < 0.19% (n = 4) without the internal standard at the lower limit of the standard curve (50 microg ml(-1)); the limit of detection was 24 ng ml(-1). The assay using the 3.0 mm i.d. column was shown to be suitable for measuring clonazepam in a tablet dosage form.
Li, Li; Liu, Dong-Jun
2014-01-01
Since 2012, China has been facing haze-fog weather conditions, and haze-fog pollution and PM2.5 have become hot topics. It is necessary to evaluate and analyze the ecological status of China's air environment, which is of great significance for environmental protection measures. In this study the current situation of haze-fog pollution in China was analyzed first, and the new Ambient Air Quality Standards were introduced. For the issue of air quality evaluation, a comprehensive evaluation model based on an entropy weighting method and a nearest neighbor method was developed. The entropy weighting method was used to determine the weights of indicators, and the nearest neighbor method was utilized to evaluate the air quality levels. The comprehensive evaluation model was then applied to the practical problem of evaluating air quality in Beijing in order to analyze haze-fog pollution. Two simulation experiments were implemented in this study. One experiment included the indicator PM2.5 and was carried out based on the new Ambient Air Quality Standards (GB 3095-2012); the other experiment excluded PM2.5 and was carried out based on the old Ambient Air Quality Standards (GB 3095-1996). Their results were compared, and the simulation results showed that PM2.5 was an important indicator of air quality and that the evaluation results under the new Air Quality Standards were more scientific than those under the old ones. The haze-fog pollution situation in Beijing was also analyzed based on these results, and corresponding management measures were suggested. PMID:25170682
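The entropy-weighting step, combined with a nearest-neighbour grading against class-standard vectors, can be sketched as follows. The pollutant values and grade standards below are hypothetical; only the entropy-weight formula itself is the standard one.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weighting: indicators that vary more across samples (lower
    entropy of their normalized column) receive larger weights.
    X: (n_samples, n_indicators) array of nonnegative indicator values."""
    P = X / X.sum(axis=0, keepdims=True)
    with np.errstate(divide='ignore', invalid='ignore'):
        plogp = np.where(P > 0, P * np.log(P), 0.0)   # 0*log(0) := 0
    e = -plogp.sum(axis=0) / np.log(X.shape[0])       # column entropies in [0,1]
    d = 1.0 - e                                       # degree of divergence
    return d / d.sum()

# Hypothetical daily concentrations (rows: days; cols: PM2.5, SO2, NO2)
X = np.array([[80., 20., 40.],
              [150., 22., 42.],
              [35., 21., 41.],
              [220., 19., 43.]])
w = entropy_weights(X)

# Nearest-neighbour grading against hypothetical class-standard vectors
standards = np.array([[35., 20., 40.],     # grade 1 (good)
                      [75., 30., 60.],     # grade 2
                      [150., 40., 80.]])   # grade 3 (polluted)

def grade(sample):
    dists = np.sqrt((((standards - sample) * w) ** 2).sum(axis=1))
    return int(np.argmin(dists)) + 1
```

In this toy data PM2.5 varies far more than SO2 or NO2 across days, so it dominates the weights, echoing the finding above that PM2.5 is an important indicator.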
Echegaray, Sebastian; Nair, Viswam; Kadoch, Michael; Leung, Ann; Rubin, Daniel; Gevaert, Olivier; Napel, Sandy
2016-12-01
Quantitative imaging approaches compute features within images' regions of interest. Segmentation is rarely completely automatic, requiring time-consuming editing by experts. We propose a new paradigm, called "digital biopsy," that allows for the collection of intensity- and texture-based features from these regions at least 1 order of magnitude faster than the current manual or semiautomated methods. A radiologist reviewed automated segmentations of lung nodules from 100 preoperative volume computed tomography scans of patients with non-small cell lung cancer, and manually adjusted the nodule boundaries in each section, to be used as a reference standard, requiring up to 45 minutes per nodule. We also asked a different expert to generate a digital biopsy for each patient using a paintbrush tool to paint a contiguous region of each tumor over multiple cross-sections, a procedure that required an average of <3 minutes per nodule. We simulated additional digital biopsies using morphological procedures. Finally, we compared the features extracted from these digital biopsies with our reference standard using intraclass correlation coefficient (ICC) to characterize robustness. Comparing the reference standard segmentations to our digital biopsies, we found that 84/94 features had an ICC >0.7; comparing erosions and dilations, using a sphere of 1.5-mm radius, of our digital biopsies to the reference standard segmentations resulted in 41/94 and 53/94 features, respectively, with ICCs >0.7. We conclude that many intensity- and texture-based features remain consistent between the reference standard and our method while substantially reducing the amount of operator time required.
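Robustness comparisons of this kind are typically summarized with an intraclass correlation coefficient. A minimal sketch using the one-way random, single-rater form ICC(1,1) is below; the abstract does not specify which ICC variant was used, and the feature values here are invented.

```python
import numpy as np

def icc_oneway(a, b):
    """One-way random, single-rater ICC(1,1) between two paired measurement
    vectors a and b (e.g. a feature from reference vs digital-biopsy
    segmentations of the same nodules)."""
    Y = np.column_stack([a, b])          # n targets x k=2 measurements
    n, k = Y.shape
    grand = Y.mean()
    ms_between = k * ((Y.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_within = ((Y - Y.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical feature values from the two segmentation approaches
ref = np.array([10.2, 14.1, 9.8, 20.5, 17.3, 12.0])
dig = np.array([10.5, 13.8, 10.1, 20.0, 17.9, 11.6])
icc = icc_oneway(ref, dig)
```

An ICC above 0.7, the threshold used in the study, indicates that between-nodule differences dominate the disagreement between the two segmentation methods for that feature.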
Faassen, Elisabeth J; Antoniou, Maria G; Beekman-Lukassen, Wendy; Blahova, Lucie; Chernova, Ekaterina; Christophoridis, Christophoros; Combes, Audrey; Edwards, Christine; Fastner, Jutta; Harmsen, Joop; Hiskia, Anastasia; Ilag, Leopold L; Kaloudis, Triantafyllos; Lopicic, Srdjan; Lürling, Miquel; Mazur-Marzec, Hanna; Meriluoto, Jussi; Porojan, Cristina; Viner-Mozzini, Yehudit; Zguna, Nadezda
2016-02-29
Exposure to β-N-methylamino-l-alanine (BMAA) might be linked to the incidence of amyotrophic lateral sclerosis, Alzheimer's disease and Parkinson's disease. Analytical chemistry plays a crucial role in determining human BMAA exposure and the associated health risk, but the performance of various analytical methods currently employed is rarely compared. A CYANOCOST initiated workshop was organized aimed at training scientists in BMAA analysis, creating mutual understanding and paving the way towards interlaboratory comparison exercises. During this workshop, we tested different methods (extraction followed by derivatization and liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) analysis, or directly followed by LC-MS/MS analysis) for trueness and intermediate precision. We adapted three workup methods for the underivatized analysis of animal, brain and cyanobacterial samples. Based on recovery of the internal standard D₃BMAA, the underivatized methods were accurate (mean recovery 80%) and precise (mean relative standard deviation 10%), except for the cyanobacterium Leptolyngbya. However, total BMAA concentrations in the positive controls (cycad seeds) showed higher variation (relative standard deviation 21%-32%), implying that D₃BMAA was not a good indicator for the release of BMAA from bound forms. Significant losses occurred during workup for the derivatized method, resulting in low recovery (<10%). Most BMAA was found in a trichloroacetic acid soluble, bound form and we recommend including this fraction during analysis.
Harmonic reduction of Direct Torque Control of six-phase induction motor.
Taheri, A
2016-07-01
In this paper, a new switching method for Direct Torque Control (DTC) of a six-phase induction machine, aimed at reduction of current harmonics, is introduced. Selecting a single suitable vector in each sampling period is the ordinary method in the ST-DTC drive of a six-phase induction machine. The six-phase induction machine has 64 voltage vectors, divided into four groups. In the proposed DTC method, the suitable voltage vectors are selected from two vector groups. By a suitable selection of two vectors in each sampling period, the harmonic amplitude is decreased further in comparison to that of the ST-DTC drive. The harmonic losses are greatly reduced and the electromechanical energy loss is decreased, while the switching loss shows a slight increase. Spectrum analysis of the phase current in the standard and new switching-table DTC of the six-phase induction machine, and determination of the amplitude of each harmonic, is presented in this paper. The proposed method has a shorter sampling time in comparison to the ordinary method. Harmonic analyses of the current at low and high speed show the performance of the presented method. The simplicity of the proposed method and its implementation without any extra hardware are further advantages. The simulation and experimental results show the advantages of the proposed method. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
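The spectrum-analysis step, determining the amplitude of each current harmonic, can be sketched with an FFT over one fundamental period. The waveform below is synthetic (a 50 Hz fundamental with 5th and 7th harmonics of assumed amplitudes), sampled coherently so each harmonic falls exactly on an FFT bin.

```python
import numpy as np

# Synthetic phase current: fundamental plus 5th and 7th harmonics, sampled
# over exactly one fundamental period (amplitudes are illustrative)
f1, fs, n = 50.0, 10000.0, 200           # 200 samples at 10 kHz = one 50 Hz period
t = np.arange(n) / fs
i_phase = (10.0 * np.sin(2*np.pi*f1*t)
           + 1.5 * np.sin(2*np.pi*5*f1*t)
           + 0.8 * np.sin(2*np.pi*7*f1*t))

# Single-sided amplitude spectrum; with this coherent sampling, bin k is k*f1
spec = np.abs(np.fft.rfft(i_phase)) * 2 / n
fund, h5, h7 = spec[1], spec[5], spec[7]

# Total harmonic distortion: RMS of all harmonics over the fundamental
thd = np.sqrt((spec[2:] ** 2).sum()) / fund
```

Comparing such spectra of the phase current under the standard and the proposed switching tables is exactly how the harmonic-amplitude reduction claimed above would be quantified.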
Health effects of indoor odorants.
Cone, J E; Shusterman, D
1991-01-01
People assess the quality of the air indoors primarily on the basis of its odors and on their perception of associated health risk. The major current contributors to indoor odorants are human occupant odors (body odor), environmental tobacco smoke, volatile building materials, bio-odorants (particularly mold and animal-derived materials), air fresheners, deodorants, and perfumes. These are most often present as complex mixtures, making measurement of the total odorant problem difficult. There is no current method of measuring human body odor, other than by human panel studies of expert judges of air quality. Human body odors have been quantitated in terms of the "olf" which is the amount of air pollution produced by the average person. Another quantitative unit of odorants is the "decipol," which is the perceived level of pollution produced by the average human ventilated by 10 L/sec of unpolluted air or its equivalent level of dissatisfaction from nonhuman air pollutants. The standard regulatory approach, focusing on individual constituents or chemicals, is not likely to be successful in adequately controlling odorants in indoor air. Besides the current approach of setting minimum ventilation standards to prevent health effects due to indoor air pollution, a standard based on the olf or decipol unit might be more efficacious as well as simpler to measure. PMID:1821378
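The olf/decipol arithmetic is simple: one decipol is, by definition, the perceived pollution from one olf (one standard person) ventilated by 10 L/s of unpolluted air. A sketch under that standard definition:

```python
def decipol(olf_load, ventilation_l_per_s):
    """Perceived air pollution in decipol. By definition, a source of 1 olf
    ventilated by 10 L/s of unpolluted air gives 1 decipol, so the perceived
    level scales as 10 * G / Q for source strength G (olf) and airflow Q (L/s)."""
    return 10.0 * olf_load / ventilation_l_per_s

# One standard occupant at 10 L/s -> 1 decipol;
# four occupants sharing 20 L/s -> 2 decipol
c1 = decipol(1, 10)
c2 = decipol(4, 20)
```

This illustrates why an olf/decipol-based standard could be simpler than per-chemical limits: a single perceived-pollution target fixes the required ventilation for any given occupant load.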
Cost effectiveness of the US Geological Survey's stream-gaging programs in New Hampshire and Vermont
Smath, J.A.; Blackey, F.E.
1986-01-01
Data uses and funding sources were identified for the 73 continuous stream gages currently (1984) being operated. Eight stream gages were identified as having insufficient reason to continue their operation. Parts of New Hampshire and Vermont were identified as needing additional hydrologic data. New gages should be established in these regions as funds become available. Alternative methods for providing hydrologic data at the stream gaging stations currently being operated were found to lack the accuracy that is required for their intended use. The current policy for operation of the stream gages requires a net budget of $297,000/yr. The average standard error of estimation of the streamflow records is 17.9%. This overall level of accuracy could be maintained with a budget of $285,000 if resources were redistributed among gages. Cost-effective analysis indicates that with the present budget, the average standard error could be reduced to 16.6%. A minimum budget of $278,000 is required to operate the present stream gaging program. Below this level, the gages and recorders would not receive the proper service and maintenance. At the minimum budget, the average standard error would be 20.4%. The loss of correlative data is a significant component of the error in streamflow records, especially at lower budgetary levels.
Zimmerman, Christian E.; Nielsen, Roger L.
2003-01-01
The use of strontium-to-calcium (Sr/Ca) ratios in otoliths is becoming a standard method to describe life history type and the chronology of migrations between freshwater and seawater habitats in teleosts (e.g. Kalish, 1990; Radtke et al., 1990; Secor, 1992; Rieman et al., 1994; Radtke, 1995; Limburg, 1995; Tzeng et al., 1997; Volk et al., 2000; Zimmerman, 2000; Zimmerman and Reeves, 2000, 2002). This method provides critical information concerning the relationship and ecology of species exhibiting phenotypic variation in migratory behavior (Kalish, 1990; Secor, 1999). Methods and procedures, however, vary among laboratories because a standard method or protocol for measurement of Sr in otoliths does not exist. In this note, we examine variations in analytical conditions in an effort to increase the precision of Sr/Ca measurements. From these findings we argue that precision can be maximized with a higher beam current (although there is specimen damage) than previously recommended by Gunn et al. (1992).
Towards a Better Corrosion Resistance and Biocompatibility Improvement of Nitinol Medical Devices
NASA Astrophysics Data System (ADS)
Rokicki, Ryszard; Hryniewicz, Tadeusz; Pulletikurthi, Chandan; Rokosz, Krzysztof; Munroe, Norman
2015-04-01
Haemocompatibility of Nitinol implantable devices and their corrosion resistance, as well as resistance to fracture, are very important features of advanced medical implants. The authors present some novel methods capable of improving Nitinol implantable devices to a marked degree beyond currently used electropolishing (EP) processes; a magnetoelectropolishing process is advised instead. The polarization study shows that a magnetoelectropolished Nitinol surface is more corrosion resistant than that obtained after a standard EP and has a unique ability to repassivate. Currently used sterilization processes for Nitinol implantable devices can dramatically change the physicochemical properties of a medical device and thereby influence its biocompatibility. The authors' experimental results clearly show a way to improve the biocompatibility of the NiTi alloy surface. A final sodium hypochlorite treatment should replace currently used sterilization methods for Nitinol implantable devices; the rationale was also given in our previous study.
Malavera, Alejandra; Vasquez, Alejandra; Fregni, Felipe
2015-01-01
Transcranial direct current stimulation (tDCS) is a neuromodulatory technique that has been extensively studied. While there have been initial positive results in some clinical trials, there is still variability in tDCS results. The aim of this article is to review and discuss patents assessing novel methods to optimize the use of tDCS. A systematic review was performed using Google patents database with tDCS as the main technique, with patents filling date between 2010 and 2015. Twenty-two patents met our inclusion criteria. These patents attempt to address current tDCS limitations. Only a few of them have been investigated in clinical trials (i.e., high-definition tDCS), and indeed most of them have not been tested before in human trials. Further clinical testing is required to assess which patents are more likely to optimize the effects of tDCS. We discuss the potential optimization of tDCS based on these patents and the current experience with standard tDCS.
Convolutional auto-encoder for image denoising of ultra-low-dose CT.
Nishio, Mizuho; Nagashima, Chihiro; Hirabayashi, Saori; Ohnishi, Akinori; Sasaki, Kaori; Sagawa, Tomoyuki; Hamada, Masayuki; Yamashita, Tatsuo
2017-08-01
The purpose of this study was to validate a patch-based image denoising method for ultra-low-dose CT images. Neural network with convolutional auto-encoder and pairs of standard-dose CT and ultra-low-dose CT image patches were used for image denoising. The performance of the proposed method was measured by using a chest phantom. Standard-dose and ultra-low-dose CT images of the chest phantom were acquired. The tube currents for standard-dose and ultra-low-dose CT were 300 and 10 mA, respectively. Ultra-low-dose CT images were denoised with our proposed method using neural network, large-scale nonlocal mean, and block-matching and 3D filtering. Five radiologists and three technologists assessed the denoised ultra-low-dose CT images visually and recorded their subjective impressions of streak artifacts, noise other than streak artifacts, visualization of pulmonary vessels, and overall image quality. For the streak artifacts, noise other than streak artifacts, and visualization of pulmonary vessels, the results of our proposed method were statistically better than those of block-matching and 3D filtering (p-values < 0.05). On the other hand, the difference in the overall image quality between our proposed method and block-matching and 3D filtering was not statistically significant (p-value = 0.07272). The p-values obtained between our proposed method and large-scale nonlocal mean were all less than 0.05. Neural network with convolutional auto-encoder could be trained using pairs of standard-dose and ultra-low-dose CT image patches. According to the visual assessment by radiologists and technologists, the performance of our proposed method was superior to that of large-scale nonlocal mean and block-matching and 3D filtering.
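A toy, numpy-only stand-in for the training setup, pairs of clean ("standard-dose") and noisy ("ultra-low-dose") patches with a filter learned by gradient descent on patchwise MSE, is sketched below. A single learned 3x3 filter replaces the paper's multi-layer convolutional auto-encoder, and all data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(img, k):
    """'Valid' 2-D correlation with a 3x3 kernel, via shifted slices."""
    h, w = img.shape
    return sum(k[a, b] * img[a:a+h-2, b:b+w-2]
               for a in range(3) for b in range(3))

# Paired training patches: a smooth clean target and noisy counterparts
# (purely synthetic stand-ins for CT image patches)
x = np.linspace(0, 1, 16)
clean = np.add.outer(np.sin(3 * x), np.cos(2 * x))
pairs = [(clean, clean + rng.normal(0, 0.3, clean.shape)) for _ in range(10)]

def loss(k):
    """Mean squared error between filtered noisy patches and clean targets."""
    return float(np.mean([((conv2d(noisy, k) - tgt[1:-1, 1:-1]) ** 2).mean()
                          for tgt, noisy in pairs]))

# Train a single 3x3 filter by gradient descent, starting from the identity
k = np.zeros((3, 3)); k[1, 1] = 1.0
lr = 0.01
loss0 = loss(k)
for _ in range(100):
    grad = np.zeros_like(k)
    for tgt, noisy in pairs:
        err = conv2d(noisy, k) - tgt[1:-1, 1:-1]
        for a in range(3):
            for b in range(3):
                grad[a, b] += 2 * (err * noisy[a:a+14, b:b+14]).mean()
    k -= lr * grad / len(pairs)
loss1 = loss(k)
```

Since the identity filter reproduces the noisy input, the initial loss equals the noise power; training pulls the filter toward a denoising smoother, which is the same patchwise supervised principle the auto-encoder applies with far more capacity.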
Khangura, Jaspreet; Culleton, Bruce F; Manns, Braden J; Zhang, Jianguo; Barnieh, Lianne; Walsh, Michael; Klarenbach, Scott W; Tonelli, Marcello; Sarna, Magdalena; Hemmelgarn, Brenda R
2010-06-24
Left ventricular (LV) hypertrophy is common among patients on hemodialysis. While a relationship between blood pressure (BP) and LV hypertrophy has been established, it is unclear which BP measurement method is the strongest correlate of LV hypertrophy. We sought to determine agreement between various blood pressure measurement methods, as well as identify which method was the strongest correlate of LV hypertrophy among patients on hemodialysis. This was a post-hoc analysis of data from a randomized controlled trial. We evaluated the agreement between seven BP measurement methods: standardized measurement at baseline; single pre- and post-dialysis, as well as mean intra-dialytic measurement at baseline; and cumulative pre-, intra- and post-dialysis readings (an average of 12 monthly readings based on a single day per month). Agreement was assessed using Lin's concordance correlation coefficient (CCC) and the Bland-Altman method. Association between BP measurement method and LV hypertrophy on baseline cardiac MRI was determined using receiver operating characteristic curves and area under the curve (AUC). Agreement between BP measurement methods in the 39 patients on hemodialysis varied considerably, from a CCC of 0.35 to 0.94, with overlapping 95% confidence intervals. Pre-dialysis measurements were the weakest predictors of LV hypertrophy, while standardized, post- and intra-dialytic measurements had similar and strong (AUC 0.79 to 0.80) predictive power for LV hypertrophy. A single standardized BP has strong predictive power for LV hypertrophy and performs just as well as more resource-intensive cumulative measurements, whereas pre-dialysis blood pressure measurements have the weakest predictive power for LV hypertrophy. Current guidelines, which recommend using pre-dialysis measurements, should be revisited to confirm these results.
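The agreement statistic used above, Lin's concordance correlation coefficient, has a closed form and is easy to compute. Below is a minimal sketch in plain Python; the systolic BP readings are made-up illustrative numbers, not data from the trial.

```python
def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between paired series:
    CCC = 2*s_xy / (s_x^2 + s_y^2 + (mean_x - mean_y)^2),
    with variances and covariance computed using a 1/n divisor (Lin, 1989).
    Unlike Pearson's r, CCC penalizes both location and scale shifts,
    so it measures agreement, not just correlation."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x) / n
    syy = sum((b - my) ** 2 for b in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * sxy / (sxx + syy + (mx - my) ** 2)

# Hypothetical systolic BP (mmHg) from two measurement methods:
pre_dialysis = [150.0, 142.0, 160.0, 138.0, 155.0]
standardized = [144.0, 139.0, 151.0, 136.0, 149.0]
ccc = lins_ccc(pre_dialysis, standardized)  # 1.0 would be perfect agreement
```

Because the two made-up series track each other but are offset, the CCC falls below 1 even though their correlation is high; that offset penalty is why CCC (rather than Pearson's r) is the right tool for method-agreement questions like the one in this study.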
Current Standardization and Cooperative Efforts Related to Industrial Information Infrastructures.
1993-05-01
Data Management Systems: Components used to store, manage, and retrieve data. Data management includes knowledge bases, database management... Application Development Tools and Methods: X/Open and POSIX APIs, Integrated Design Support System (IDS), Knowledge-Based Systems (KBS), Application... (IDEF1x), Yourdon, Jackson System Design (JSD), Knowledge-Based Systems (KBSs), Structured Systems Development (SSD), Semantic Unification Meta-Model
ERIC Educational Resources Information Center
Newton, Jill A.
2012-01-01
Although the question of whether written curricula are implemented according to the intentions of curriculum developers has already spurred much research, current methods for documenting curricular implementation seem to be missing a critical piece: the mathematics. To add a mathematical perspective to the discussion of the admittedly…
76 FR 26853 - Commercial Driver's License Testing and Commercial Learner's Permit Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-09
... b. Pre-Trip Inspection c. Skills Test Banking Prohibition d. Gross Vehicle Weight Rating (GVWR... electronic method of transmitting test scores works best for them. At least one State currently has an... Issuing a CLP a. Passing the General Knowledge Test To Obtain a CLP b. Requiring the CLP To Be a Separate...
USDA-ARS?s Scientific Manuscript database
Moisture is paramount to cotton fiber properties, dictating harvesting, ginning, storage, and spinning, among other processes. Currently, oven drying in air is often utilized to determine the percentage of moisture in cotton fibers. Karl Fischer titration, another method for measuring cotton moisture, has been compa...
ERIC Educational Resources Information Center
Scott, Judith; Wishart, Jennifer; Currie, Candace
2011-01-01
Background: The language, format and length of typical national health survey questionnaires may make them inaccessible to many school-aged children with an intellectual disability. Materials and Methods: Using the standard delivery protocol, the WHO Health Behaviour in School-aged Children (HBSC) Questionnaire, currently in use in 43 countries,…
A Review of Standardized Tests of Nonverbal Oral and Speech Motor Performance in Children
ERIC Educational Resources Information Center
McCauley, Rebecca J.; Strand, Edythe A.
2008-01-01
Purpose: To review the content and psychometric characteristics of 6 published tests currently available to aid in the study, diagnosis, and treatment of motor speech disorders in children. Method: We compared the content of the 6 tests and critically evaluated the degree to which important psychometric characteristics support the tests' use for…
A Critical Analysis of the CELF-4: The Responsible Clinician's Guide to the CELF-4
ERIC Educational Resources Information Center
Crowley, Catherine Jane
2010-01-01
Purpose: To provide an analysis of the accuracy and effectiveness of using the Clinical Evaluation of Language Fundamentals-Fourth Edition (CELF-4) to identify students as having language-based disabilities. Method: The CELF-4 is analyzed within the current standards set by the federal law on special education, the available research, preferred…
The Adequacy of Different Robust Statistical Tests in Comparing Two Independent Groups
ERIC Educational Resources Information Center
Pero-Cebollero, Maribel; Guardia-Olmos, Joan
2013-01-01
In the current study, we evaluated various robust statistical methods for comparing two independent groups. Two scenarios for simulation were generated: one of equality and another of population mean differences. In each of the scenarios, 33 experimental conditions were used as a function of sample size, standard deviation and asymmetry. For each…
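As a concrete example of a robust method for comparing two independent groups, here is a sketch of Yuen's trimmed-mean t test in plain Python. The snippet above does not name the specific methods evaluated, so this choice, the 20% trim proportion, and the sample data are all illustrative assumptions. Trimming discards the extreme order statistics, so a single outlier barely moves the statistic.

```python
import math

def trimmed_mean(x, prop):
    """Mean after dropping the lowest and highest prop-fraction of values."""
    x = sorted(x)
    g = int(prop * len(x))
    core = x[g:len(x) - g]
    return sum(core) / len(core)

def winsorized_var(x, prop):
    """Sample variance after clamping the tails to the trim boundaries."""
    x = sorted(x)
    n = len(x)
    g = int(prop * n)
    w = [x[g]] * g + x[g:n - g] + [x[n - g - 1]] * g
    m = sum(w) / n
    return sum((v - m) ** 2 for v in w) / (n - 1)

def yuen_t(x, y, prop=0.2):
    """Yuen's (1974) trimmed-mean t test for two independent groups.
    Returns the t statistic and Welch-style degrees of freedom."""
    nx, ny = len(x), len(y)
    hx = nx - 2 * int(prop * nx)  # effective sizes after trimming
    hy = ny - 2 * int(prop * ny)
    dx = winsorized_var(x, prop) * (nx - 1) / (hx * (hx - 1))
    dy = winsorized_var(y, prop) * (ny - 1) / (hy * (hy - 1))
    t = (trimmed_mean(x, prop) - trimmed_mean(y, prop)) / math.sqrt(dx + dy)
    df = (dx + dy) ** 2 / (dx ** 2 / (hx - 1) + dy ** 2 / (hy - 1))
    return t, df

# Made-up samples: group_a has one gross outlier (9.0); after 20% trimming
# the two groups have identical trimmed means, so t is essentially zero.
group_a = [2.1, 2.3, 2.2, 2.4, 2.0, 9.0]
group_b = [2.2, 2.1, 2.3, 2.2, 2.4, 2.3]
t_stat, df = yuen_t(group_a, group_b)
```

A classical Student's t test on the same data would be dominated by the 9.0 outlier; the trimmed statistic illustrates the kind of robustness to asymmetry and heavy tails that simulation studies like the one above are designed to probe.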
USDA-ARS?s Scientific Manuscript database
Sorghum [Sorghum bicolor (L.) Moench] has been shown to contain the cyanogenic glycoside dhurrin, which is responsible for the disorder known as prussic acid poisoning in livestock. The current standard method for estimating HCN uses spectrophotometry to measure the aglycone of dhurrin, p-hydro...
Deep Aquifer Remediation Tools (DARTs): A new technology for ground-water remediation
Naftz, David L.; Davis, James A.
1999-01-01
Potable ground-water supplies throughout the world are contaminated or threatened by advancing plumes containing radionuclides, metals, and organic compounds. Currently (1999), the most widely used method of ground-water remediation is a combination of extraction, ex-situ treatment, and discharge of the treated water, commonly known as pump and treat. Pump-and-treat methods are costly and often ineffective in meeting long-term protection standards (Travis and Doty, 1990; Gillham and Burris, 1992; National Research Council, 1994). This fact sheet describes a new and potentially cost-effective technology for removal of organic and inorganic contaminants from ground water. The U.S. Geological Survey (USGS) is currently exploring the possibilities of obtaining a U.S. Patent for this technology.
Tamblyn, R
1994-06-01
Governments have traditionally looked to the medical profession for leadership in health planning and have charged the profession with the responsibility of establishing and monitoring standards of medical practice. Training program accreditation and licensure/certification exams have been used as the primary methods of preventing unqualified individuals from entering medical practice. Despite the critical nature of the decision made at the time of licensure/certification, there is no information about the validity of these examinations for predicting subsequent practice and health outcome. In this article, the assumptions implicit in the current use of licensing/certifying examinations are identified, the relevant evidence is reviewed, and the implications of this evidence for current methods of measurement are discussed.