Quinn, Terence J; Livingstone, Iain; Weir, Alexander; Shaw, Robert; Breckenridge, Andrew; McAlpine, Christine; Tarbert, Claire M
2018-01-01
Visual impairment affects up to 70% of stroke survivors. We designed an app (StrokeVision) to facilitate screening for common post stroke visual issues (acuity, visual fields, and visual inattention). We sought to describe the test time, feasibility, acceptability, and accuracy of our app-based digital visual assessments against (a) current methods used for bedside screening and (b) gold standard measures. Patients were prospectively recruited from acute stroke settings. Index tests were app-based assessments of fields and inattention performed by a trained researcher. We compared against usual clinical screening practice of visual fields to confrontation, including inattention assessment (simultaneous stimuli). We also compared app to gold standard assessments of formal kinetic perimetry (Goldman or Octopus Visual Field Assessment); and pencil and paper-based tests of inattention (Albert's, Star Cancelation, and Line Bisection). Results of inattention and field tests were adjudicated by a specialist Neuro-ophthalmologist. All assessors were masked to each other's results. Participants and assessors graded acceptability using a bespoke scale that ranged from 0 (completely unacceptable) to 10 (perfect acceptability). Of 48 stroke survivors recruited, the complete battery of index and reference tests for fields was successfully completed in 45. Similar acceptability scores were observed for app-based [assessor median score 10 (IQR: 9-10); patient 9 (IQR: 8-10)] and traditional bedside testing [assessor 10 (IQR: 9-10); patient 10 (IQR: 9-10)]. Median test time was longer for app-based testing [combined time to completion of all digital tests 420 s (IQR: 390-588)] when compared with conventional bedside testing [70 s, (IQR: 40-70)], but shorter than gold standard testing [1,260 s, (IQR: 1005-1,620)]. Compared with gold standard assessments, usual screening practice demonstrated 79% sensitivity and 82% specificity for detection of a stroke-related field defect. This compares with 79% sensitivity and 88% specificity for StrokeVision digital assessment. StrokeVision shows promise as a screening tool for visual complications in the acute phase of stroke. The app is at least as good as usual screening and offers other functionality that may make it attractive for use in acute stroke. https://ClinicalTrials.gov/ct2/show/NCT02539381.
Field testing energy-saving hermetic compressors in residential refrigerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sauber, R.S.; Middleton, M.G.
The design of an energy saving compressor for low back pressure applications is reviewed. Calorimeter performance results are stated for two sizes of the efficient design and compared with performance test results for a standard compressor. Power consumption of a refrigerator-freezer is given with a standard compressor and with the energy saving compressor. The preparation of the refrigerators used in the field test are discussed along with the criteria used in selecting the instrumentation for the project. Results of the energy saving compressor in the field test along with a comparison to a standard production compressor are presented. Some conclusionsmore » are drawn, based on the data, in relation to important factors in residential refrigerator power consumption.« less
Environmental and reliability test of FBG based geophone as geophysical exploration instrument
NASA Astrophysics Data System (ADS)
Zhang, Xiaolei; Min, Li; Li, Ming; Jiang, Shaodong; Zhang, Faxiang; Sun, Zhihui; Ni, Jiasheng; Peng, Gangding; Wang, Chang
2017-10-01
A fiber Bragg grating (FBG) based geophone is designed for low-frequency signal detection has high acceleration response of about 60 dB re pm/g in a low frequency range of 5 Hz 60 Hz. To Guarantee normal operation in field test and practical application, an acceleration amplitude restriction is added in the mechanical design of the FBG geophone. Then a series of environmental and reliability test have been proceeded with online or offline monitoring of its working performance, including high and low temperature test, vibration test, shock test and free drop test. All the tests are planned according to National standard or Oil & Gas Industry Standard. And the experimental results indicate that our FBG geophone meet the criterion of oil and gas industry product and is capable of field application.
2012-01-01
Background The traditional Korean medical diagnoses employ pattern identification (PI), a diagnostic system that entails the comprehensive analysis of symptoms and signs. The PI needs to be standardized due to its ambiguity. Therefore, this study was performed to establish standard indicators of the PI for stroke through the traditional Korean medical literature, expert consensus and a clinical field test. Methods We sorted out stroke patterns with an expert committee organized by the Korean Institute of Oriental Medicine. The expert committee composed a document for a standardized pattern of identification for stroke based on the traditional Korean medical literature, and we evaluated the clinical significance of the document through a field test. Results We established five stroke patterns from the traditional Korean medical literature and extracted 117 indicators required for diagnosis. The indicators were evaluated by a field test and verified by the expert committee. Conclusions This study sought to develop indicators of PI based on the traditional Korean medical literature. This process contributed to the standardization of traditional Korean medical diagnoses. PMID:22410195
Peres experiment using photons: No test for hypercomplex (quaternionic) quantum theories
NASA Astrophysics Data System (ADS)
Adler, Stephen L.
2017-06-01
Assuming the standard axioms for quaternionic quantum theory and a spatially localized scattering interaction, the S matrix in quaternionic quantum theory is complex valued, not quaternionic. Using the standard connections between the S matrix, the forward scattering amplitude for electromagnetic wave scattering, and the index of refraction, we show that the index of refraction is necessarily complex, not quaternionic. This implies that the recent optical experiment of Procopio et al. [Nat. Commun. 8, 15044 (2017), 10.1038/ncomms15044] based on the Peres proposal does not test for hypercomplex or quaternionic quantum effects arising within the standard Hilbert space framework. Such a test requires looking at near zone fields, not radiation zone fields.
Inter-laboratory Comparison of Three Earplug Fit-test Systems
Byrne, David C.; Murphy, William J.; Krieg, Edward F.; Ghent, Robert M.; Michael, Kevin L.; Stefanson, Earl W.; Ahroon, William A.
2017-01-01
The National Institute for Occupational Safety and Health (NIOSH) sponsored tests of three earplug fit-test systems (NIOSH HPD Well-Fit™, Michael & Associates FitCheck, and Honeywell Safety Products VeriPRO®). Each system was compared to laboratory-based real-ear attenuation at threshold (REAT) measurements in a sound field according to ANSI/ASA S12.6-2008 at the NIOSH, Honeywell Safety Products, and Michael & Associates testing laboratories. An identical study was conducted independently at the U.S. Army Aeromedical Research Laboratory (USAARL), which provided their data for inclusion in this report. The Howard Leight Airsoft premolded earplug was tested with twenty subjects at each of the four participating laboratories. The occluded fit of the earplug was maintained during testing with a soundfield-based laboratory REAT system as well as all three headphone-based fit-test systems. The Michael & Associates lab had highest average A-weighted attenuations and smallest standard deviations. The NIOSH lab had the lowest average attenuations and the largest standard deviations. Differences in octave-band attenuations between each fit-test system and the American National Standards Institute (ANSI) sound field method were calculated (Attenfit-test - AttenANSI). A-weighted attenuations measured with FitCheck and HPD Well-Fit systems demonstrated approximately ±2 dB agreement with the ANSI sound field method, but A-weighted attenuations measured with the VeriPRO system underestimated the ANSI laboratory attenuations. For each of the fit-test systems, the average A-weighted attenuation across the four laboratories was not significantly greater than the average of the ANSI sound field method. Standard deviations for residual attenuation differences were about ±2 dB for FitCheck and HPD Well-Fit compared to ±4 dB for VeriPRO. Individual labs exhibited a range of agreement from less than a dB to as much as 9.4 dB difference with ANSI and REAT estimates. Factors such as the experience of study participants and test administrators, and the fit-test psychometric tasks are suggested as possible contributors to the observed results. PMID:27786602
Evaluation of Troxler model 3411 nuclear gage.
DOT National Transportation Integrated Search
1978-01-01
The performance of the Troxler Electronics Laboratory Model 3411 nuclear gage was evaluated through laboratory tests on the Department's density and moisture standards and field tests on various soils, base courses, and bituminous concrete overlays t...
Visual field defects after temporal lobe resection for epilepsy.
Steensberg, Alvilda T; Olsen, Ane Sophie; Litman, Minna; Jespersen, Bo; Kolko, Miriam; Pinborg, Lars H
2018-01-01
To determine visual field defects (VFDs) using methods of varying complexity and compare results with subjective symptoms in a population of newly operated temporal lobe epilepsy patients. Forty patients were included in the study. Two patients failed to perform VFD testing. Humphrey Field Analyzer (HFA) perimetry was used as the gold standard test to detect VFDs. All patients performed a web-based visual field test called Damato Multifixation Campimetry Online (DMCO). A bedside confrontation visual field examination ad modum Donders was extracted from the medical records in 27/38 patients. All participants had a consultation by an ophthalmologist. A questionnaire described the subjective complaints. A VFD in the upper quadrant was demonstrated with HFA in 29 (76%) of the 38 patients after surgery. In 27 patients tested ad modum Donders, the sensitivity of detecting a VFD was 13%. Eight patients (21%) had a severe VFD similar to a quadrant anopia, thus, questioning their permission to drive a car. In this group of patients, a VFD was demonstrated in one of five (sensitivity=20%) ad modum Donders and in seven of eight (sensitivity=88%) with DMCO. Subjective symptoms were only reported by 28% of the patients with a VFD and in two of eight (sensitivity=25%) with a severe VFD. Most patients (86%) considered VFD information mandatory. VFD continue to be a frequent adverse event after epilepsy surgery in the medial temporal lobe and may affect the permission to drive a car in at least one in five patients. Subjective symptoms and bedside visual field testing ad modum Donders are not sensitive to detect even a severe VFD. Newly developed web-based visual field test methods appear sensitive to detect a severe VFD but perimetry remains the golden standard for determining if visual standards for driving is fulfilled. Patients consider VFD information as mandatory. Copyright © 2017. Published by Elsevier Ltd.
Laurin, E; Thakur, K K; Gardner, I A; Hick, P; Moody, N J G; Crane, M S J; Ernst, I
2018-05-01
Design and reporting quality of diagnostic accuracy studies (DAS) are important metrics for assessing utility of tests used in animal and human health. Following standards for designing DAS will assist in appropriate test selection for specific testing purposes and minimize the risk of reporting biased sensitivity and specificity estimates. To examine the benefits of recommending standards, design information from published DAS literature was assessed for 10 finfish, seven mollusc, nine crustacean and two amphibian diseases listed in the 2017 OIE Manual of Diagnostic Tests for Aquatic Animals. Of the 56 DAS identified, 41 were based on field testing, eight on experimental challenge studies and seven on both. Also, we adapted human and terrestrial-animal standards and guidelines for DAS structure for use in aquatic animal diagnostic research. Through this process, we identified and addressed important metrics for consideration at the design phase: study purpose, targeted disease state, selection of appropriate samples and specimens, laboratory analytical methods, statistical methods and data interpretation. These recommended design standards for DAS are presented as a checklist including risk-of-failure points and actions to mitigate bias at each critical step. Adherence to standards when designing DAS will also facilitate future systematic review and meta-analyses of DAS research literature. © 2018 John Wiley & Sons Ltd.
van der Slikke, Rienk M A; Bregman, Daan J J; Berger, Monique A M; de Witte, Annemarie M H; Veeger, Dirk-Jan H E J
2017-11-01
Classification is a defining factor for competition in wheelchair sports, but it is a delicate and time-consuming process with often questionable validity. 1 New inertial sensor based measurement methods applied in match play and field tests, allow for more precise and objective estimates of the impairment effect on wheelchair mobility performance. It was evaluated if these measures could offer an alternative point of view for classification. Six standard wheelchair mobility performance outcomes of different classification groups were measured in match play (n=29), as well as best possible performance in a field test (n=47). In match-results a clear relationship between classification and performance level is shown, with increased performance outcomes in each adjacent higher classification group. Three outcomes differed significantly between the low and mid-class groups, and one between the mid and high-class groups. In best performance (field test), a split between the low and mid-class groups shows (5 out of 6 outcomes differed significantly) but hardly any difference between the mid and high-class groups. This observed split was confirmed by cluster analysis, revealing the existence of only two performance based clusters. The use of inertial sensor technology to get objective measures of wheelchair mobility performance, combined with a standardized field-test, brought alternative views for evidence based classification. The results of this approach provided arguments for a reduced number of classes in wheelchair basketball. Future use of inertial sensors in match play and in field testing could enhance evaluation of classification guidelines as well as individual athlete performance.
Redesigning Design: Field Testing a Revised Design Rubric Based on iNACOL Quality Course Standards
ERIC Educational Resources Information Center
Adelstein, David; Barbour, Michael K.
2016-01-01
Designers have a limited selection of K-12 online course creation standards to choose from that are not blocked behind proprietary or pay walls. For numerous institutions and states, the use of the iNACOL "National Standards for Quality Online Courses" is becoming a widely used resource. This article presents the final phase in a…
Student science achievement and the integration of Indigenous knowledge on standardized tests
NASA Astrophysics Data System (ADS)
Dupuis, Juliann; Abrams, Eleanor
2017-09-01
In this article, we examine how American Indian students in Montana performed on standardized state science assessments when a small number of test items based upon traditional science knowledge from a cultural curriculum, "Indian Education for All", were included. Montana is the first state in the US to mandate the use of a culturally relevant curriculum in all schools and to incorporate this curriculum into a portion of the standardized assessment items. This study compares White and American Indian student test scores on these particular test items to determine how White and American Indian students perform on culturally relevant test items compared to traditional standard science test items. The connections between student achievement on adapted culturally relevant science test items versus traditional items brings valuable insights to the fields of science education, research on student assessments, and Indigenous studies.
Solar industrial process heat systems: An assessment of standards for materials and components
NASA Astrophysics Data System (ADS)
Rossiter, W. J.; Shipp, W. E.
1981-09-01
A study was conducted to obtain information on the performance of materials and components in operational solar industrial process heat (PH) systems, and to provide recommendations for the development of standards including evaluative test procedures for materials and components. An assessment of the needs for standards for evaluating the long-term performance of materials and components of IPH systems was made. The assessment was based on the availability of existing standards, and information obtained from a field survey of operational systems, the literature, and discussions with individuals in the industry. Field inspections of 10 operational IPH systems were performed.
A Field-Portable Cell Analyzer without a Microscope and Reagents.
Seo, Dongmin; Oh, Sangwoo; Lee, Moonjin; Hwang, Yongha; Seo, Sungkyu
2017-12-29
This paper demonstrates a commercial-level field-portable lens-free cell analyzer called the NaviCell (No-stain and Automated Versatile Innovative cell analyzer) capable of automatically analyzing cell count and viability without employing an optical microscope and reagents. Based on the lens-free shadow imaging technique, the NaviCell (162 × 135 × 138 mm³ and 1.02 kg) has the advantage of providing analysis results with improved standard deviation between measurement results, owing to its large field of view. Importantly, the cell counting and viability testing can be analyzed without the use of any reagent, thereby simplifying the measurement procedure and reducing potential errors during sample preparation. In this study, the performance of the NaviCell for cell counting and viability testing was demonstrated using 13 and six cell lines, respectively. Based on the results of the hemocytometer ( de facto standard), the error rate (ER) and coefficient of variation (CV) of the NaviCell are approximately 3.27 and 2.16 times better than the commercial cell counter, respectively. The cell viability testing of the NaviCell also showed an ER and CV performance improvement of 5.09 and 1.8 times, respectively, demonstrating sufficient potential in the field of cell analysis.
A Cross-Sectional Evaluation of Student Achievement Using Standardized and Performance-Based Tests
ERIC Educational Resources Information Center
Pinter, Brad; Matchock, Robert L.; Charles, Eric P.; Balch, William R.
2014-01-01
Three groups of undergraduates (42 senior graduating psychology majors, 27 first-year premajors taking introductory psychology, and 24 first-year, high-performing nonmajors taking introductory psychology) completed the Psychology Major Field Test (MFT) and a short-answer (SA) essay test on reasoning about core knowledge in psychology. Graduating…
NASA Technical Reports Server (NTRS)
Herrman, B. D.; Uman, M. A.; Brantley, R. D.; Krider, E. P.
1976-01-01
The principle of operation of a wideband crossed-loop magnetic-field direction finder is studied by comparing the bearing determined from the NS and EW magnetic fields at various times up to 155 microsec after return stroke initiation with the TV-determined lightning channel base direction. For 40 lightning strokes in the 3 to 12 km range, the difference between the bearings found from magnetic fields sampled at times between 1 and 10 microsec and the TV channel-base data has a standard deviation of 3-4 deg. Included in this standard deviation is a 2-3 deg measurement error. For fields sampled at progressively later times, both the mean and the standard deviation of the difference between the direction-finder bearing and the TV bearing increase. Near 150 microsec, means are about 35 deg and standard deviations about 60 deg. The physical reasons for the late-time inaccuracies in the wideband direction finder and the occurrence of these effects in narrow-band VLF direction finders are considered.
The four-meter confrontation visual field test.
Kodsi, S R; Younge, B R
1992-01-01
The 4-m confrontation visual field test has been successfully used at the Mayo Clinic for many years in addition to the standard 0.5-m confrontation visual field test. The 4-m confrontation visual field test is a test of macular function and can identify small central or paracentral scotomas that the examiner may not find when the patient is tested only at 0.5 m. Also, macular sparing in homonymous hemianopias and quadrantanopias may be identified with the 4-m confrontation visual field test. We recommend use of this confrontation visual field test, in addition to the standard 0.5-m confrontation visual field test, on appropriately selected patients to obtain the most information possible by confrontation visual field tests. PMID:1494829
The four-meter confrontation visual field test.
Kodsi, S R; Younge, B R
1992-01-01
The 4-m confrontation visual field test has been successfully used at the Mayo Clinic for many years in addition to the standard 0.5-m confrontation visual field test. The 4-m confrontation visual field test is a test of macular function and can identify small central or paracentral scotomas that the examiner may not find when the patient is tested only at 0.5 m. Also, macular sparing in homonymous hemianopias and quadrantanopias may be identified with the 4-m confrontation visual field test. We recommend use of this confrontation visual field test, in addition to the standard 0.5-m confrontation visual field test, on appropriately selected patients to obtain the most information possible by confrontation visual field tests.
Support vector machines-based modelling of seismic liquefaction potential
NASA Astrophysics Data System (ADS)
Pal, Mahesh
2006-08-01
This paper investigates the potential of support vector machines (SVM)-based classification approach to assess the liquefaction potential from actual standard penetration test (SPT) and cone penetration test (CPT) field data. SVMs are based on statistical learning theory and found to work well in comparison to neural networks in several other applications. Both CPT and SPT field data sets is used with SVMs for predicting the occurrence and non-occurrence of liquefaction based on different input parameter combination. With SPT and CPT test data sets, highest accuracy of 96 and 97%, respectively, was achieved with SVMs. This suggests that SVMs can effectively be used to model the complex relationship between different soil parameter and the liquefaction potential. Several other combinations of input variable were used to assess the influence of different input parameters on liquefaction potential. Proposed approach suggest that neither normalized cone resistance value with CPT data nor the calculation of standardized SPT value is required with SPT data. Further, SVMs required few user-defined parameters and provide better performance in comparison to neural network approach.
Towards standardized assessment of endoscope optical performance: geometric distortion
NASA Astrophysics Data System (ADS)
Wang, Quanzeng; Desai, Viraj N.; Ngo, Ying Z.; Cheng, Wei-Chung; Pfefer, Joshua
2013-12-01
Technological advances in endoscopes, such as capsule, ultrathin and disposable devices, promise significant improvements in safety, clinical effectiveness and patient acceptance. Unfortunately, the industry lacks test methods for preclinical evaluation of key optical performance characteristics (OPCs) of endoscopic devices that are quantitative, objective and well-validated. As a result, it is difficult for researchers and developers to compare image quality and evaluate equivalence to, or improvement upon, prior technologies. While endoscope OPCs include resolution, field of view, and depth of field, among others, our focus in this paper is geometric image distortion. We reviewed specific test methods for distortion and then developed an objective, quantitative test method based on well-defined experimental and data processing steps to evaluate radial distortion in the full field of view of an endoscopic imaging system. Our measurements and analyses showed that a second-degree polynomial equation could well describe the radial distortion curve of a traditional endoscope. The distortion evaluation method was effective for correcting the image and can be used to explain other widely accepted evaluation methods such as picture height distortion. Development of consensus standards based on promising test methods for image quality assessment, such as the method studied here, will facilitate clinical implementation of innovative endoscopic devices.
Music--A Resource Guide for Standards-Based Instruction.
ERIC Educational Resources Information Center
New York State Education Dept., Albany.
This guide is designed to provide guidance to New York state school districts and teachers to help students achieve the music standards. Teachers throughout New York state met to compile these field-tested lesson plans, teaching strategies, assessments, and resources for teachers of students in pre-kindergarten through grade 12, in all areas of…
Intelligent Mobile Technologies
NASA Technical Reports Server (NTRS)
Alena, Rick; Gilbaugh, Bruce; Glass, Brian; Swanson, Keith (Technical Monitor)
2000-01-01
Testing involves commercial radio equipment approved for export and use in Canada. Testing was conducted in the Canadian High Arctic, where hilly terrain provided the worst-case testing. SFU and Canadian governmental agencies made significant technical contributions. The only technical data related to radio testing was exchanged with SFU. Test protocols are standard radio tests performed by communication technicians worldwide. The Joint Fields Operations objectives included the following: (1) to provide Internet communications services for field science work and mobile exploration systems; (2) to evaluate the range and throughput of three different medium-range radio link technologies for providing coverage of the crater area; and (3) to demonstrate collaborative software such as NetMeeting with multi-point video for exchange of scientific information between remote node and base-base camp and science centers as part of communications testing.
Zebrafish developmental toxicity testing is an emerging field, which faces considerable challenges regarding data meta-analysis and the establishment of standardized test protocols. Here, we present an initial correlation study on toxicity of 133 chemicals based on data in the li...
Rylands, Lee P; Roberts, Simon J; Hurst, Howard T
2015-09-01
The aim of this study was to ascertain the variation in elite male bicycle motocross (BMX) cyclists' peak power, torque, and time of power production during laboratory and field-based testing. Eight elite male BMX riders volunteered for the study, and each rider completed 3 maximal sprints using both a Schoberer Rad Messtechnik (SRM) ergometer in the laboratory and a portable SRM power meter on an Olympic standard indoor BMX track. The results revealed a significantly higher peak power (p ≤ 0.001, 34 ± 9%) and reduced time of power production (p ≤ 0.001, 105 ± 24%) in the field tests when compared with laboratory-derived values. Torque was also reported to be lower in the laboratory tests but not to an accepted level of significance (p = 0.182, 6 ± 8%). These results suggest that field-based testing may be a more effective and accurate measure of a BMX rider's peak power, torque, and time of power production.
This report adapts the standard U.S. EPA methodology for deriving ambient water quality criteria. Rather than use toxicity test results, the adaptation uses field data to determine the loss of 5% of genera from streams. The method is applied to derive effect benchmarks for disso...
Li, Li; Xiong, De-fu; Liu, Jia-wen; Li, Zi-xin; Zeng, Guang-cheng; Li, Hua-liang
2014-03-01
We aimed to evaluate the interference of 50 Hz extremely low frequency electromagnetic field (ELF-EMF) occupational exposure on the neurobehavior tests of workers performing tour-inspection close to transformers and distribution power lines. Occupational short-term "spot" measurements were carried out. 310 inspection workers and 300 logistics staff were selected as exposure and control. The neurobehavior tests were performed through computer-based neurobehavior evaluation system, including mental arithmetic, curve coincide, simple visual reaction time, visual retention, auditory digit span and pursuit aiming. In 500 kV areas electric field intensity at 71.98% of total measured 590 spots were above 5 kV/m (national occupational standard), while in 220 kV areas electric field intensity at 15.69% of total 701 spots were above 5 kV/m. Magnetic field flux density at all the spots was below 1,000 μT (ICNIRP occupational standard). The neurobehavior score changes showed no statistical significance. Results of neurobehavior tests among different age, seniority groups showed no significant changes. Neurobehavior changes caused by daily repeated ELF-EMF exposure were not observed in the current study.
NASA Astrophysics Data System (ADS)
Hardikar, Kedar Y.; Liu, Bill J. J.; Bheemreddy, Venkata
2016-09-01
Gaining an understanding of degradation mechanisms and their characterization are critical in developing relevant accelerated tests to ensure PV module performance warranty over a typical lifetime of 25 years. As newer technologies are adapted for PV, including new PV cell technologies, new packaging materials, and newer product designs, the availability of field data over extended periods of time for product performance assessment cannot be expected within the typical timeframe for business decisions. In this work, to enable product design decisions and product performance assessment for PV modules utilizing newer technologies, Simulation and Mechanism based Accelerated Reliability Testing (SMART) methodology and empirical approaches to predict field performance from accelerated test results are presented. The method is demonstrated for field life assessment of flexible PV modules based on degradation mechanisms observed in two accelerated tests, namely, Damp Heat and Thermal Cycling. The method is based on design of accelerated testing scheme with the intent to develop relevant acceleration factor models. The acceleration factor model is validated by extensive reliability testing under different conditions going beyond the established certification standards. Once the acceleration factor model is validated for the test matrix a modeling scheme is developed to predict field performance from results of accelerated testing for particular failure modes of interest. Further refinement of the model can continue as more field data becomes available. While the demonstration of the method in this work is for thin film flexible PV modules, the framework and methodology can be adapted to other PV products.
In July 1997, EPA promulgated a new National Ambient Air Quality Standard (NAAQS) for fine particulate matter (PM2.5). This new standard was based on collection of an integrated mass sample on a filter. Field studies have demonstrated that the collection of semivolatile compoun...
How Valid Are the Responses to Nursing Home Survey Questions? Some Issues and Concerns
ERIC Educational Resources Information Center
Tyler, Denise A.; Shield, Renee R.; Rosenthal, Marsha; Miller, Susan C.; Wetle, Terrie; Clark, Melissa A.
2011-01-01
Purpose: Although surveys are usually piloted before fielding, cognitive-based testing of surveys is not standard practice in nursing home (NH) research. Many terms used in the literature do not have standard definitions and may be interpreted differently by researchers, respondents, and policy makers. The purpose of this study was to ensure that…
A Field-Portable Cell Analyzer without a Microscope and Reagents
Oh, Sangwoo; Lee, Moonjin; Hwang, Yongha
2017-01-01
This paper demonstrates a commercial-level field-portable lens-free cell analyzer called the NaviCell (No-stain and Automated Versatile Innovative cell analyzer) capable of automatically analyzing cell count and viability without employing an optical microscope and reagents. Based on the lens-free shadow imaging technique, the NaviCell (162 × 135 × 138 mm3 and 1.02 kg) has the advantage of providing analysis results with improved standard deviation between measurement results, owing to its large field of view. Importantly, the cell counting and viability testing can be analyzed without the use of any reagent, thereby simplifying the measurement procedure and reducing potential errors during sample preparation. In this study, the performance of the NaviCell for cell counting and viability testing was demonstrated using 13 and six cell lines, respectively. Based on the results of the hemocytometer (de facto standard), the error rate (ER) and coefficient of variation (CV) of the NaviCell are approximately 3.27 and 2.16 times better than the commercial cell counter, respectively. The cell viability testing of the NaviCell also showed an ER and CV performance improvement of 5.09 and 1.8 times, respectively, demonstrating sufficient potential in the field of cell analysis. PMID:29286336
Hong, Na; Li, Dingcheng; Yu, Yue; Xiu, Qiongying; Liu, Hongfang; Jiang, Guoqian
2016-10-01
Constructing standard and computable clinical diagnostic criteria is an important but challenging research field in the clinical informatics community. The Quality Data Model (QDM) is emerging as a promising information model for standardizing clinical diagnostic criteria. To develop and evaluate automated methods for converting textual clinical diagnostic criteria in a structured format using QDM. We used a clinical Natural Language Processing (NLP) tool known as cTAKES to detect sentences and annotate events in diagnostic criteria. We developed a rule-based approach for assigning the QDM datatype(s) to an individual criterion, whereas we invoked a machine learning algorithm based on the Conditional Random Fields (CRFs) for annotating attributes belonging to each particular QDM datatype. We manually developed an annotated corpus as the gold standard and used standard measures (precision, recall and f-measure) for the performance evaluation. We harvested 267 individual criteria with the datatypes of Symptom and Laboratory Test from 63 textual diagnostic criteria. We manually annotated attributes and values in 142 individual Laboratory Test criteria. The average performance of our rule-based approach was 0.84 of precision, 0.86 of recall, and 0.85 of f-measure; the performance of CRFs-based classification was 0.95 of precision, 0.88 of recall and 0.91 of f-measure. We also implemented a web-based tool that automatically translates textual Laboratory Test criteria into the QDM XML template format. The results indicated that our approaches leveraging cTAKES and CRFs are effective in facilitating diagnostic criteria annotation and classification. Our NLP-based computational framework is a feasible and useful solution in developing diagnostic criteria representation and computerization. Copyright © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, J; Liu, X
2016-06-15
Purpose: To perform a quantitative study to verify that the mechanical field center coincides with the radiation field center when both are off from the isocenter during the single-isocenter technique in linear accelerator-based SRS/SBRT procedure to treat multiple lesions. Methods: We developed an innovative method to measure this accuracy, called the off-isocenter Winston-Lutz test, and here we provide a practical clinical guideline to implement this technique. We used ImagePro V.6 to analyze images of a Winston-Lutz phantom obtained using a Varian 21EX linear accelerator with an electronic portal imaging device, set up as for single-isocenter SRS/SBRT for multiple lesions. Wemore » investigated asymmetry field centers that were 3 cm and 5 cm away from the isocenter, as well as performing the standard Winston-Lutz test. We used a special beam configuration to acquire images while avoiding collision, and we investigated both jaw and multileaf collimation. Results: For the jaw collimator setting, at 3 cm off-isocenter, the mechanical field deviated from the radiation field by about 2.5 mm; at 5 cm, the deviation was above 3 mm, up to 4.27 mm. For the multileaf collimator setting, at 3 cm off-isocenter, the deviation was below 1 mm; at 5 cm, the deviation was above 1 mm, up to 1.72 mm, which is 72% higher than the tolerance threshold. Conclusion: These results indicated that the further the asymmetry field center is from the machine isocenter, the larger the deviation of the mechanical field from the radiation field, and the distance between the center of the asymmetry field and the isocenter should not exceed 3 cm in of our clinic. We recommend that every clinic that uses linear accelerator, multileaf collimator-based SRS/SBRT perform the off-isocenter Winston-Lutz test in addition to the standard Winston-Lutz test and use their own deviation data to design the treatment plan.« less
Heinrich, Andreas; Teichgräber, Ulf K; Güttler, Felix V
2015-12-01
The standard ASTM F2119 describes a test method for measuring the size of a susceptibility artifact based on the example of a passive implant. A pixel in an image is considered to be a part of an image artifact if the intensity is changed by at least 30% in the presence of a test object, compared to a reference image in which the test object is absent (reference value). The aim of this paper is to simplify and accelerate the test method using a histogram-based reference value. Four test objects were scanned parallel and perpendicular to the main magnetic field, and the largest susceptibility artifacts were measured using two methods of reference value determination (reference image-based and histogram-based reference value). The results between both methods were compared using the Mann-Whitney U-test. The difference between both reference values was 42.35 ± 23.66. The difference of artifact size was 0.64 ± 0.69 mm. The artifact sizes of both methods did not show significant differences; the p-value of the Mann-Whitney U-test was between 0.710 and 0.521. A standard-conform method for a rapid, objective, and reproducible evaluation of susceptibility artifacts could be implemented. The result of the histogram-based method does not significantly differ from the ASTM-conform method.
Jankowska, Petra J; Kong, Christine; Burke, Kevin; Harrington, Kevin J; Nutting, Christopher
2007-10-01
High dose irradiation of the posterior cervical lymph nodes usually employs applied electron fields to treat the target volume and maintain the spinal cord dose within tolerance. In the light of recent advances in elective lymph node localisation we investigated optimization of field shape and electron energy to treat this target volume. In this study, three sequential hypotheses were tested. Firstly, that customization of the electron fields based on the nodal PTV outlined gives better PTV coverage than conventional field delineation. Using the consensus guidelines, customization of the electron field shape was compared to conventional fields based on bony landmarks. Secondly, that selection of electron energy using DVHs for spinal cord and PTV improves the minimum dose to PTV. Electron dose-volume histograms (DVHs) for the PTV, spinal cord and para-vertebral muscles, were generated using the Monte Carlo electron algorithm. These DVHs were used to compare standard vs optimized electron energy calculations. Finally, that combination of field customization and electron energy optimization improves both the minimum and mean doses to PTV compared with current standard practice. Customized electron beam shaping based on the consensus guidelines led to fewer geographical misses than standard field shaping. Customized electron energy calculation led to higher minimum doses to the PTV. Overall, the customization of field shape and energy resulted in an improved mean dose to the PTV (92% vs 83% p=0.02) and a 27% improvement in the minimum dose delivered to the PTV (45% vs 18% p=0.0009). Optimization of electron field shape and beam energy based on current consensus guidelines led to significant improvement in PTV coverage and may reduce recurrence rates.
EAACI position paper for practical patch testing in allergic contact dermatitis in children.
de Waard-van der Spek, Flora B; Darsow, Ulf; Mortz, Charlotte G; Orton, David; Worm, Margitta; Muraro, Antonella; Schmid-Grendelmeier, Peter; Grimalt, Ramon; Spiewak, Radoslaw; Rudzeviciene, Odilija; Flohr, Carsten; Halken, Susanne; Fiocchi, Alessandro; Borrego, Luis Miguel; Oranje, Arnold P
2015-11-01
Allergic contact dermatitis (ACD) in children appears to be on the increase, and contact sensitization may already begin in infancy. The diagnosis of contact dermatitis requires a careful evaluation of a patient's clinical history, physical examination, and skin testing. Patch testing is the gold standard diagnostic test. Based on consensus, the EAACI Task Force on Allergic Contact Dermatitis in Children produced this document to provide details on clinical aspects, the standardization of patch test methodology, and suggestions for future research in the field. We provide a baseline list of test allergens to be tested in children with suspected ACD. Additional tests should be performed only on specific indications. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Evaluation of a combined index of optic nerve structure and function for glaucoma diagnosis
2011-01-01
Background The definitive diagnosis of glaucoma is currently based on congruent damage to both optic nerve structure and function. Given widespread quantitative assessment of both structure (imaging) and function (automated perimetry) in glaucoma, it should be possible to combine these quantitative data to diagnose disease. We have therefore defined and tested a new approach to glaucoma diagnosis by combining imaging and visual field data, using the anatomical organization of retinal ganglion cells. Methods Data from 1499 eyes of glaucoma suspects and 895 eyes with glaucoma were identified at a single glaucoma center. Each underwent Heidelberg Retinal Tomograph (HRT) imaging and standard automated perimetry. A new measure combining these two tests, the structure function index (SFI), was defined in 3 steps: 1) calculate the probability that each visual field point is abnormal, 2) calculate the probability of abnormality for each of the six HRT optic disc sectors, and 3) combine those probabilities with the probability that a field point and disc sector are linked by ganglion cell anatomy. The SFI was compared to the HRT and visual field using receiver operating characteristic (ROC) analysis. Results The SFI produced an area under the ROC curve (0.78) that was similar to that for both visual field mean deviation (0.78) and pattern standard deviation (0.80) and larger than that for a normalized measure of HRT rim area (0.66). The cases classified as glaucoma by the various tests were significantly non-overlapping. Based on the distribution of test values in the population with mild disease, the SFI may be better able to stratify this group while still clearly identifying those with severe disease. Conclusions The SFI reflects the traditional clinical diagnosis of glaucoma by combining optic nerve structure and function. In doing so, it identifies a different subset of patients than either visual field testing or optic nerve head imaging alone. Analysis of prospective data will allow us to determine whether the combined index of structure and function can provide an improved standard for glaucoma diagnosis. PMID:21314957
Plant-based insect repellents: a review of their efficacy, development and testing
2011-01-01
Plant-based repellents have been used for generations in traditional practice as a personal protection measure against host-seeking mosquitoes. Knowledge on traditional repellent plants obtained through ethnobotanical studies is a valuable resource for the development of new natural products. Recently, commercial repellent products containing plant-based ingredients have gained increasing popularity among consumers, as these are commonly perceived as “safe” in comparison to long-established synthetic repellents although this is sometimes a misconception. To date insufficient studies have followed standard WHO Pesticide Evaluation Scheme guidelines for repellent testing. There is a need for further standardized studies in order to better evaluate repellent compounds and develop new products that offer high repellency as well as good consumer safety. This paper presents a summary of recent information on testing, efficacy and safety of plant-based repellents as well as promising new developments in the field. PMID:21411012
Modified Drop Tower Impact Tests for American Football Helmets.
Rush, G Alston; Prabhu, R; Rush, Gus A; Williams, Lakiesha N; Horstemeyer, M F
2017-02-19
A modified National Operating Committee on Standards for Athletic Equipment (NOCSAE) test method for American football helmet drop impact test standards is presented that would provide better assessment of a helmet's on-field impact performance by including a faceguard on the helmet. In this study, a merger of faceguard and helmet test standards is proposed. The need for a more robust systematic approach to football helmet testing procedures is emphasized by comparing representative results of the Head Injury Criterion (HIC), Severity Index (SI), and peak acceleration values for different helmets at different helmet locations under modified NOCSAE standard drop tower tests. Essentially, these comparative drop test results revealed that the faceguard adds a stiffening kinematic constraint to the shell that lessens total energy absorption. The current NOCSAE standard test methods can be improved to represent on-field helmet hits by attaching the faceguards to helmets and by including two new helmet impact locations (Front Top and Front Top Boss). The reported football helmet test method gives a more accurate representation of a helmet's performance and its ability to mitigate on-field impacts while promoting safer football helmets.
Performance evaluation of infrared imaging system in field test
NASA Astrophysics Data System (ADS)
Wang, Chensheng; Guo, Xiaodong; Ren, Tingting; Zhang, Zhi-jie
2014-11-01
Infrared imaging system has been applied widely in both military and civilian fields. Since the infrared imager has various types and different parameters, for system manufacturers and customers, there is great demand for evaluating the performance of IR imaging systems with a standard tool or platform. Since the first generation IR imager was developed, the standard method to assess the performance has been the MRTD or related improved methods which are not perfect adaptable for current linear scanning imager or 2D staring imager based on FPA detector. For this problem, this paper describes an evaluation method based on the triangular orientation discrimination metric which is considered as the effective and emerging method to evaluate the synthesis performance of EO system. To realize the evaluation in field test, an experiment instrument is developed. And considering the importance of operational environment, the field test is carried in practical atmospheric environment. The test imagers include panoramic imaging system and staring imaging systems with different optics and detectors parameters (both cooled and uncooled). After showing the instrument and experiment setup, the experiment results are shown. The target range performance is analyzed and discussed. In data analysis part, the article gives the range prediction values obtained from TOD method, MRTD method and practical experiment, and shows the analysis and results discussion. The experimental results prove the effectiveness of this evaluation tool, and it can be taken as a platform to give the uniform performance prediction reference.
Bringing Standardized Processes in Atom-Probe Tomography: I Establishing Standardized Terminology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Ian M; Danoix, F; Forbes, Richard
2011-01-01
Defining standardized methods requires careful consideration of the entire field and its applications. The International Field Emission Society (IFES) has elected a Standards Committee, whose task is to determine the needed steps to establish atom-probe tomography as an accepted metrology technique. Specific tasks include developing protocols or standards for: terminology and nomenclature; metrology and instrumentation, including specifications for reference materials; test methodologies; modeling and simulations; and science-based health, safety, and environmental practices. The Committee is currently working on defining terminology related to atom-probe tomography with the goal to include terms into a document published by the International Organization for Standardsmore » (ISO). A lot of terms also used in other disciplines have already been defined) and will be discussed for adoption in the context of atom-probe tomography.« less
Calibration and assessment of full-field optical strain measurement procedures and instrumentation
NASA Astrophysics Data System (ADS)
Kujawinska, Malgorzata; Patterson, E. A.; Burguete, R.; Hack, E.; Mendels, D.; Siebert, T.; Whelan, Maurice
2006-09-01
There are no international standards or norms for the use of optical techniques for full-field strain measurement. In the paper the rationale and design of a reference material and a set of standarized materials for the calibration and evaluation of optical systems for full-field measurements of strain are outlined. A classification system for the steps in the measurement process is also proposed and allows the development of a unified approach to diagnostic testing of components in an optical system for strain measurement based on any optical technique. The results described arise from a European study known as SPOTS whose objectives were to begin to fill the gap caused by a lack of standards.
NASA Astrophysics Data System (ADS)
Bednarski, M.; Larsen, K.
2008-11-01
Astronomy activities often pose problems for in-service teachers, especially at the elementary level, as many do not have a solid content background. Often astronomy instruction revolves around reading and answering questions. This is not an effective way to work with abstract concepts or engage students, and also fails to meet the standards of inquiry-based instruction recommended by the National Science Teachers Association and national and state standards. Science museums and planetariums bring unique and exciting perspectives to astronomy education. However, bringing students to the museum can sometimes be perceived as only a ``cool field trip.'' With mounting pressure for teachers to teach to the new standardized tests demanded by No Child Left Behind, and shrinking school budgets, field trips are rapidly becoming an endangered species. Coordinating museum, science center, and planetarium offerings with national and state science standards can renew interest in (and perceived relevance of) field trips. Therefore, university faculty, in-service teachers, and museum/planetarium staff can form successful partnerships which can both improve student learning and increase attendance at informal education science events and facilities. This workshop will first briefly introduce participants to national and representative state standards as well as research on in-service teachers' astronomy content knowledge and the educational value of field trips. For the majority of the workshop, participants will engage in the actual steps of coordinating, planning, and writing inquiry-based astronomy curriculum embedded performance tasks that collectively meet the learning needs of students in elementary, middle, or high school.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodriguez, J.R.; Ahrens, J.S.; Lowe, D.L.
Throughout the years, Sandia National Laboratories (SNL) has performed various laboratory evaluations of entry control devices, including biometric identity verifiers. The reports which resulted from this testing have been very well received by the physical security community. This same community now requires equally informative field study data. To meet this need we have conducted a field study in an effort to develop the tools and methods which our customers can use to translate laboratory data into operational field performance. The field testing described in this report was based on the Recognition Systems Inc.`s (RSI) model ID3D HandKey biometric verifier. Thismore » device was selected because it is referenced in DOE documents such as the Guide for Implementation of the DOE Standard Badge and is the de facto biometric standard for the DOE. The ID3D HandKey is currently being used at several DOE sites such as Hanford, Rocky Flats, Pantex, Savannah River, and Idaho Nuclear Engineering Laboratory. The ID3D HandKey was laboratory tested at SNL. It performed very well during this test, exhibiting an equal error point of 0.2 percent. The goals of the field test were to identify operational characteristics and design guidelines to help system engineers translate laboratory data into field performance. A secondary goal was to develop tools which could be used by others to evaluate system effectiveness or improve the performance of their systems. Operational characteristics were determined by installing a working system and studying its operation over a five month period. Throughout this test we developed tools which could be used by others to similarly gauge system effectiveness.« less
A Novel Field Deployable Point-of-Care Diagnostic Test for Cutaneous Leishmaniasis
2015-10-01
include localized cutaneous leishmaniasis (LCL), and destructive nasal and oropharyngeal lesions of mucosal leishmaniasis (ML). LCL in the New World...the high costs, personnel training and need of sophisticated equipment. Therefore, novel methods to detect leishmaniasis at the POC are urgently needed...To date, there is no field-standardized molecular method based on DNA amplification coupled with Lateral Flow reading to detect leishmaniasis
NASA Technical Reports Server (NTRS)
Hill, Charles S.; Oliveras, Ovidio M.
2011-01-01
Evolution of the 3D strain field during ASTM-D-7078 v-notch rail shear tests on 8-ply quasi-isotropic carbon fiber/epoxy laminates was determined by optical photogrammetry using an ARAMIS system. Specimens having non-optimal geometry and minor discrepancies in dimensional tolerances were shown to display non-symmetry and/or stress concentration in the vicinity of the notch relative to a specimen meeting the requirements of the standard, but resulting shear strength and modulus values remained within acceptable bounds of standard deviation. Based on these results, and reported difficulty machining specimens to the required tolerances using available methods, it is suggested that a parametric study combining analytical methods and experiment may provide rationale to increase the tolerances on some specimen dimensions, reducing machining costs, increasing the proportion of acceptable results, and enabling a wider adoption of the test method.
2006-07-01
All Quality Control Reference Materials are acquired only from authorized vendors or sources commonly used by U.S. EPA Regional Laboratories...are traceable to the National Institue of Standards and Testing (NITS) Standard Reference Materials (SRM) or to the U.S. EPA Reference Standards... clothing or equipment by blowing, shaking or any other means that may disperse material into the air is prohibited. 7.1.3. All disposable personal
NASA Astrophysics Data System (ADS)
Bednarski, Marsha; Larsen, K.
2008-05-01
Astronomy activities often pose problems for in-service teachers, especially at the elementary level, as many do not have a solid content background. Often astronomy instruction revolves around reading and answering questions. This is not an effective way to work with abstract concepts or engage students, and also fails to meet the standards of inquiry-based instruction recommended by the National Science Teachers Association and national and state standards. Science museums and planetariums bring unique and exciting perspectives to astronomy education. However, bringing students to the museum can sometimes be perceived as only a "cool field trip.” With mounting pressure for teachers to teach to the new standardized tests demanded by No Child Left Behind, and shrinking school budgets, field trips are rapidly becoming an endangered species. Coordinating museum, science center, and planetarium offerings with national and state science standards can renew interest in (and perceived relevance of) field trips. Therefore, university faculty, in-service teachers, and museum/planetarium staff can form successful partnerships which can both improve student learning and increase attendance at informal education science events and facilities. This workshop will first briefly introduce participants to national and representative state standards as well as research on in-service teachers’ astronomy content knowledge and the educational value of field trips. For the majority of the workshop, participants will engage in the actual steps of coordinating, planning, and writing inquiry-based astronomy curriculum embedded performance tasks that collectively meet the learning needs of students in elementary, middle, or high school. Participants are encouraged to bring a copy of their own state standards (available on their state's Department of Education website) for their preferred target age group.
NASA Technical Reports Server (NTRS)
Marte, J. E.; Bryant, J. A.; Livingston, R.
1983-01-01
Dynamometer performance of a South Coast Technology electric conversion of a Volkswagen (VW) Rabbit designated SCT-8 was tested. The SCT-8 vehicle was fitted with a transistorized chopper in the motor armature circuit to supplement the standard motor speed control via field weakening. The armature chopper allowed speed control below the motor base speed. This low speed control was intended to reduce energy loss at idle during stop-and-go traffic; to eliminate the need for using the clutch below base motor speed; and to improve the drivability. Test results indicate an improvement of about 3.5% in battery energy economy for the SAE J227a-D driving cycle and 6% for the C-cycle with only a minor reduction in acceleration performance. A further reduction of about 6% would be possible if provision were made for shutting down field power during the idle phases of the driving cycles. Drivability of the vehicle equipped with the armature chopper was significantly improved compared with the standard SCT Electric Rabbit.
Kocabeyoglu, Sibel; Uzun, Salih; Mocan, Mehmet Cem; Bozkurt, Banu; Irkec, Murat; Orhan, Mehmet
2013-10-01
The aim of this study was to compare the visual field test results in healthy children obtained via the Humphrey matrix 24-2 threshold program and standard automated perimetry (SAP) using the Swedish interactive threshold algorithm (SITA)-Standard 24-2 test. This prospective study included 55 healthy children without ocular or systemic disorders who underwent both SAP and frequency doubling technology (FDT) perimetry visual field testing. Visual field test reliability indices, test duration, global indices (mean deviation [MD], and pattern standard deviation [PSD]) were compared between the 2 tests using the Wilcoxon signed-rank test and paired t-test. The performance of the Humphrey field analyzer (HFA) 24-2 SITA-standard and frequency-doubling technology Matrix 24-2 tests between genders were compared with Mann-Whitney U-test. Fifty-five healthy children with a mean age of 12.2 ± 1.9 years (range from 8 years to 16 years) were included in this prospective study. The test durations of SAP and FDT were similar (5.2 ± 0.5 and 5.1 ± 0.2 min, respectively, P = 0.651). MD and the PSD values obtained via FDT Matrix were significantly higher than those obtained via SAP (P < 0.001), and fixation losses and false negative errors were significantly less with SAP (P < 0.05). A weak positive correlation between the two tests in terms of MD (r = 0.352, P = 0.008) and PSD (r = 0.329, P = 0.014) was observed. Children were able to complete both the visual test algorithms successfully within 6 min. However, SAP testing appears to be associated with less depression of the visual field indices of healthy children. FDT Matrix and SAP should not be used interchangeably in the follow-up of children.
Seelye, James G.; Mac, Michael J.
1984-01-01
A literature review of sediment bioassessment was conducted as the first step in the development of a more standardized and ecologically sound test procedure for evaluating sediment quality. Based on the review, the authors concluded that 1) a standardized laboratory bioassessment test should consist of flowthrough exposure of at least 10 days duration using more than one aquatic organism including at least an infaunal benthic invertebrate and a fish species. 2) Before adoption of a laboratory sediment bioassessment procedure, the laboratory results should be evaluated by comparison with field conditions. 3) Most current sediment bioassessment regulatory tests measure acute toxicity or bioaccumulation. Development of tests to evaluate chronic, sublethal effects is needed.
Hommen, Udo; Schmitt, Walter; Heine, Simon; Brock, Theo Cm; Duquesne, Sabine; Manson, Phil; Meregalli, Giovanna; Ochoa-Acuña, Hugo; van Vliet, Peter; Arts, Gertie
2016-01-01
This case study of the Society of Environmental Toxicology and Chemistry (SETAC) workshop MODELINK demonstrates the potential use of mechanistic effects models for macrophytes to extrapolate from effects of a plant protection product observed in laboratory tests to effects resulting from dynamic exposure on macrophyte populations in edge-of-field water bodies. A standard European Union (EU) risk assessment for an example herbicide based on macrophyte laboratory tests indicated risks for several exposure scenarios. Three of these scenarios are further analyzed using effect models for 2 aquatic macrophytes, the free-floating standard test species Lemna sp., and the sediment-rooted submerged additional standard test species Myriophyllum spicatum. Both models include a toxicokinetic (TK) part, describing uptake and elimination of the toxicant, a toxicodynamic (TD) part, describing the internal concentration-response function for growth inhibition, and a description of biomass growth as a function of environmental factors to allow simulating seasonal dynamics. The TK-TD models are calibrated and tested using laboratory tests, whereas the growth models were assumed to be fit for purpose based on comparisons of predictions with typical growth patterns observed in the field. For the risk assessment, biomass dynamics are predicted for the control situation and for several exposure levels. Based on specific protection goals for macrophytes, preliminary example decision criteria are suggested for evaluating the model outputs. The models refined the risk indicated by lower tier testing for 2 exposure scenarios, while confirming the risk associated for the third. Uncertainties related to the experimental and the modeling approaches and their application in the risk assessment are discussed. Based on this case study and the assumption that the models prove suitable for risk assessment once fully evaluated, we recommend that 1) ecological scenarios be developed that are also linked to the exposure scenarios, and 2) quantitative protection goals be set to facilitate the interpretation of model results for risk assessment. © 2015 SETAC.
Practical Issues in Field Based Testing of Oral Reading Fluency at Upper Elementary Grades
ERIC Educational Resources Information Center
Duesbery, Luke; Braun-Monegan, Jenelle; Werblow, Jacob; Braun, Drew
2012-01-01
In this series of studies, we explore the ideal frequency, duration, and relative effectiveness of measuring oral reading fluency. In study one, a sample of 389 fifth graders read out loud for 1 min and then took a traditional state-level standardized reading test. Results suggest administering three passages and using the median yields the…
Validation of the standardized field sobriety test battery at BACs below 0.10 percent
DOT National Transportation Integrated Search
1998-08-01
This study evaluated the accuracy of the National Highway Traffic Safety Administration's (NHTSA's) Standardized Field Sobriety Test (SFST) battery to assist officers in making arrest decisions for Driving While Intoxicated (DWI) at blood alcohol con...
Current status of antifungal susceptibility testing methods.
Arikan, Sevtap
2007-11-01
Antifungal susceptibility testing is a very dynamic field of medical mycology. Standardization of in vitro susceptibility tests by the Clinical and Laboratory Standards Institute (CLSI) and the European Committee for Antimicrobial Susceptibility Testing (EUCAST), and current availability of reference methods constituted the major remarkable steps in the field. Based on the established minimum inhibitory concentration (MIC) breakpoints, it is now possible to determine the susceptibilities of Candida strains to fluconazole, itraconazole, voriconazole, and flucytosine. Moreover, utility of fluconazole antifungal susceptibility tests as an adjunct in optimizing treatment of candidiasis has now been validated. While the MIC breakpoints and clinical significance of susceptibility testing for the remaining fungi and antifungal drugs remain yet unclear, modifications of the available methods as well as other methodologies are being intensively studied to overcome the present drawbacks and limitations. Among the other methods under investigation are Etest, colorimetric microdilution, agar dilution, determination of fungicidal activity, flow cytometry, and ergosterol quantitation. Etest offers the advantage of practical application and favorable agreement rates with the reference methods that are frequently above acceptable limits. However, MIC breakpoints for Etest remain to be evaluated and established. Development of commercially available, standardized colorimetric panels that are based on CLSI method parameters has added more to the antifungal susceptibility testing armamentarium. Flow cytometry, on the other hand, appears to offer rapid susceptibility testing but requires specified equipment and further evaluation for reproducibility and standardization. Ergosterol quantitation is another novel approach, which appears potentially beneficial particularly in discrimination of azole-resistant isolates from heavy trailers. The method is yet investigational and requires to be further studied. Developments in methodology and applications of antifungal susceptibility testing will hopefully provide enhanced utility in clinical guidance of antifungal therapy. However, and particularly in immunosuppressed host, in vitro susceptibility is and will remain only one of several factors that influence clinical outcome.
Demonstration & Testing of ClimaStat for Improved DX Air-Conditioning Efficiency
2013-04-01
impaired productivity and increased transmission of viruses and bacteria. Allowing indoor RH to rise above an average of 60%rh or a peak of 70%rh can...testing of an engineering prototype culminated in issuance of US Patent 6,427,454 in 2002. Then, development, testing and refinement of a production ...field tests on four Trane (American Standard) systems at a university site were concluded in 2009. A production prototype was constructed based on
Laboratory and field measurements and evaluations of vibration at the handles of riveting hammers
McDOWELL, THOMAS W.; WARREN, CHRISTOPHER; WELCOME, DANIEL E.; DONG, REN G.
2015-01-01
The use of riveting hammers can expose workers to harmful levels of hand-transmitted vibration (HTV). As a part of efforts to reduce HTV exposures through tool selection, the primary objective of this study was to evaluate the applicability of a standardized laboratory-based riveting hammer assessment protocol for screening riveting hammers. The second objective was to characterize the vibration emissions of reduced vibration riveting hammers and to make approximations of the HTV exposures of workers operating these tools in actual work tasks. Eight pneumatic riveting hammers were selected for the study. They were first assessed in a laboratory using the standardized method for measuring vibration emissions at the tool handle. The tools were then further assessed under actual working conditions during three aircraft sheet metal riveting tasks. Although the average vibration magnitudes of the riveting hammers measured in the laboratory test were considerably different from those measured in the field study, the rank orders of the tools determined via these tests were fairly consistent, especially for the lower vibration tools. This study identified four tools that consistently exhibited lower frequency-weighted and unweighted accelerations in both the laboratory and workplace evaluations. These observations suggest that the standardized riveting hammer test is acceptable for identifying tools that could be expected to exhibit lower vibrations in workplace environments. However, the large differences between the accelerations measured in the laboratory and field suggest that the standardized laboratory-based tool assessment is not suitable for estimating workplace riveting hammer HTV exposures. Based on the frequency-weighted accelerations measured at the tool handles during the three work tasks, the sheet metal mechanics assigned to these tasks at the studied workplace are unlikely to exceed the daily vibration exposure action value (2.5 m s−2) using any of the evaluated riveting hammers. PMID:22539561
Calibration of GPS based high accuracy speed meter for vehicles
NASA Astrophysics Data System (ADS)
Bai, Yin; Sun, Qiao; Du, Lei; Yu, Mei; Bai, Jie
2015-02-01
GPS based high accuracy speed meter for vehicles is a special type of GPS speed meter which uses Doppler Demodulation of GPS signals to calculate the speed of a moving target. It is increasingly used as reference equipment in the field of traffic speed measurement, but acknowledged standard calibration methods are still lacking. To solve this problem, this paper presents the set-ups of simulated calibration, field test signal replay calibration, and in-field test comparison with an optical sensor based non-contact speed meter. All the experiments were carried out on particular speed values in the range of (40-180) km/h with the same GPS speed meter. The speed measurement errors of simulated calibration fall in the range of +/-0.1 km/h or +/-0.1%, with uncertainties smaller than 0.02% (k=2). The errors of replay calibration fall in the range of +/-0.1% with uncertainties smaller than 0.10% (k=2). The calibration results justify the effectiveness of the two methods. The relative deviations of the GPS speed meter from the optical sensor based noncontact speed meter fall in the range of +/-0.3%, which validates the use of GPS speed meter as reference instruments. The results of this research can provide technical basis for the establishment of internationally standard calibration methods of GPS speed meters, and thus ensures the legal status of GPS speed meters as reference equipment in the field of traffic speed metrology.
Chen, Lei Tai; Sun, Ai Qing; Yang, Min; Chen, Lu Lu; Ma, Xue Li; Li, Mei Ling; Yin, Yan Ping
2016-09-01
A total of 16 wheat cultivars were selected to detect seed vigor of different genotypes using standard germination test, seed germination test under stress conditions and field emergence test. The adversity resistance indices of seed vigor indices and field emergence percentage under different germination conditions were used as the indices to evaluate adversity resistance. Principal component analysis and cluster analysis were used for the comprehensive evaluation of seed vigor. Results showed that drought stress, artificial aging and cold soaking treatments affected seed vigor to some extent. The adversity resistance indices of the artificial aging and cold soaking tests were significantly positively correlated with the field emergence percentage, while the adversity resistance index of drought stress test had no significant correlation with the field emergence percentage. 16 wheat cultivars were classified as three groups based on the principal component analysis and cluster analysis. Yunong 949, Yumai 49-198, Luyuan 502, Zhengyumai 9987, Shimai 21, Shannong 23, and Shixin 828 belonged to high vigor seeds. Xunong 5, Yunong 982, Tangmai 8, Jimai 20, Jimai 22, Jinan 17, and Shannong 20 belonged to medium vigor seeds. The other two cultivars, Chang 4738 and Lunxuan 061, belonged to low vigor seeds.
Hrovatin, Karin; Kunej, Tanja
2018-01-01
Erstwhile, sex was determined by observation, which is not always feasible. Nowadays, genetic methods are prevailing due to their accuracy, simplicity, low costs, and time-efficiency. However, there is no comprehensive review enabling overview and development of the field. The studies are heterogeneous, lacking a standardized reporting strategy. Therefore, our aim was to collect genetic sexing assays for mammals and assemble them in a catalogue with unified terminology. Publications were extracted from online databases using key words such as sexing and molecular. The collected data were supplemented with species and gene IDs and the type of sex-specific sequence variant (SSSV). We developed a catalogue and graphic presentation of diagnostic tests for molecular sex determination of mammals, based on 58 papers published from 2/1991 to 10/2016. The catalogue consists of five categories: species, genes, SSSVs, methods, and references. Based on the analysis of published literature, we propose minimal requirements for reporting, consisting of: species scientific name and ID, genetic sequence with name and ID, SSSV, methodology, genomic coordinates (e.g., restriction sites, SSSVs), amplification system, and description of detected amplicon and controls. The present study summarizes vast knowledge that has up to now been scattered across databases, representing the first step toward standardization regarding molecular sexing, enabling a better overview of existing tests and facilitating planned designs of novel tests. The project is ongoing; collecting additional publications, optimizing field development, and standardizing data presentation are needed.
New methods to quantify the cracking performance of cementitious systems made with internal curing
NASA Astrophysics Data System (ADS)
Schlitter, John L.
The use of high performance concretes that utilize low water-cement ratios have been promoted for use in infrastructure based on their potential to increase durability and service life because they are stronger and less porous. Unfortunately, these benefits are not always realized due to the susceptibility of high performance concrete to undergo early age cracking caused by shrinkage. This problem is widespread and effects federal, state, and local budgets that must maintain or replace deterioration caused by cracking. As a result, methods to reduce or eliminate early age shrinkage cracking have been investigated. Internal curing is one such method in which a prewetted lightweight sand is incorporated into the concrete mixture to provide internal water as the concrete cures. This action can significantly reduce or eliminate shrinkage and in some cases causes a beneficial early age expansion. Standard laboratory tests have been developed to quantify the shrinkage cracking potential of concrete. Unfortunately, many of these tests may not be appropriate for use with internally cured mixtures and only provide limited amounts of information. Most standard tests are not designed to capture the expansive behavior of internally cured mixtures. This thesis describes the design and implementation of two new testing devices that overcome the limitations of current standards. The first device discussed in this thesis is called the dual ring. The dual ring is a testing device that quantifies the early age restrained shrinkage performance of cementitious mixtures. The design of the dual ring is based on the current ASTM C 1581-04 standard test which utilizes one steel ring to restrain a cementitious specimen. The dual ring overcomes two important limitations of the standard test. First, the standard single ring test cannot restrain the expansion that takes place at early ages which is not representative of field conditions. The dual ring incorporates a second restraining ring which is located outside of the sample to provide restraint against expansion. Second, the standard ring test is a passive test that only relies on the autogenous and drying shrinkage of the mixture to induce cracking. The dual ring test can be an active test because it has the ability to vary the temperature of the specimen in order to induce thermal stress and produce cracking. This ability enables the study of the restrained cracking capacity as the mixture ages in order to quantify crack sensitive periods of time. Measurements made with the dual ring quantify the benefits from using larger amounts of internal curing. Mixtures that resupplied internal curing water to match that of chemical shrinkage could sustain three times the magnitude of thermal change before cracking. The second device discussed in this thesis is a large scale slab testing device. This device tests the cracking potential of 15' long by 4" thick by 24" wide slab specimens in an environmentally controlled chamber. The current standard testing devices can be considered small scale and encounter problems when linking their results to the field due to size effects. Therefore, the large scale slab testing device was developed in order to calibrate the results of smaller scale tests to real world field conditions such as a pavement or bridge deck. Measurements made with the large scale testing device showed that the cracking propensity of the internally cured mixtures was reduced and that a significant benefit could be realized.
NASA Astrophysics Data System (ADS)
Shiba, Kenji; Koshiji, Kohji
Transcutaneous Energy Transmission (TET) is one way of providing the energy needed to power a totally implantable artificial heart (TIAH). In the present study, an externally coupled TET system was implanted in a prototype human phantom to evaluate emission and immunity. In the emission evaluation, measurements were conducted based on CISPR Pub.11 and VDE 0871 standards, while immunity tests were based on the standards of the IEC 61000-4 series. The magnetic field of the radiated emission was measured using a loop antenna. At 0.1[MHz], we found the greatest magnetic field of 47.8 [dBμA/m], somewhat less than CISPR’s upper limit of 54 [dBμA/m]. For the conducted emission, by installing a noise filter and ferrite beads in the input section of the DC-power supply, conducted emission could be kept within the allowable limits of CISPR Pub.11 and VDE 0871. Finally, the immunity tests against radiated and conducted emission, electrostatic discharge and voltage fluctuation proved that the prototype could withstand the maximum level of disturbance. These results confirmed that the TET system implanted in a human phantom could, through modification, meet the emission and immunity standards.
NASA Astrophysics Data System (ADS)
Gabai, Haniel; Baranes-Zeevi, Maya; Zilberman, Meital; Shaked, Natan T.
2013-04-01
We propose an off-axis interferometric imaging system as a simple and unique modality for continuous, non-contact and non-invasive wide-field imaging and characterization of drug release from its polymeric device used in biomedicine. In contrast to the current gold-standard methods in this field, usually based on chromatographic and spectroscopic techniques, our method requires no user intervention during the experiment, and only one test-tube is prepared. We experimentally demonstrate imaging and characterization of drug release from soy-based protein matrix, used as skin equivalent for wound dressing with controlled anesthetic, Bupivacaine drug release. Our preliminary results demonstrate the high potential of our method as a simple and low-cost modality for wide-field imaging and characterization of drug release from drug delivery devices.
Negeri, Zelalem F; Shaikh, Mateen; Beyene, Joseph
2018-05-11
Diagnostic or screening tests are widely used in medical fields to classify patients according to their disease status. Several statistical models for meta-analysis of diagnostic test accuracy studies have been developed to synthesize test sensitivity and specificity of a diagnostic test of interest. Because of the correlation between test sensitivity and specificity, modeling the two measures using a bivariate model is recommended. In this paper, we extend the current standard bivariate linear mixed model (LMM) by proposing two variance-stabilizing transformations: the arcsine square root and the Freeman-Tukey double arcsine transformation. We compared the performance of the proposed methods with the standard method through simulations using several performance measures. The simulation results showed that our proposed methods performed better than the standard LMM in terms of bias, root mean square error, and coverage probability in most of the scenarios, even when data were generated assuming the standard LMM. We also illustrated the methods using two real data sets. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
40 CFR 60.52Da - Recordkeeping requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Electric Utility... opacity field data sheets; (2) For each performance test conducted using Method 22 of appendix A-4 of this... performance test; (iii) Copies of all visible emission observer opacity field data sheets; and (iv...
Near-infrared fluorescence image quality test methods for standardized performance evaluation
NASA Astrophysics Data System (ADS)
Kanniyappan, Udayakumar; Wang, Bohan; Yang, Charles; Ghassemi, Pejhman; Wang, Quanzeng; Chen, Yu; Pfefer, Joshua
2017-03-01
Near-infrared fluorescence (NIRF) imaging has gained much attention as a clinical method for enhancing visualization of cancers, perfusion and biological structures in surgical applications where a fluorescent dye is monitored by an imaging system. In order to address the emerging need for standardization of this innovative technology, it is necessary to develop and validate test methods suitable for objective, quantitative assessment of device performance. Towards this goal, we develop target-based test methods and investigate best practices for key NIRF imaging system performance characteristics including spatial resolution, depth of field and sensitivity. Characterization of fluorescence properties was performed by generating excitation-emission matrix properties of indocyanine green and quantum dots in biological solutions and matrix materials. A turbid, fluorophore-doped target was used, along with a resolution target for assessing image sharpness. Multi-well plates filled with either liquid or solid targets were generated to explore best practices for evaluating detection sensitivity. Overall, our results demonstrate the utility of objective, quantitative, target-based testing approaches as well as the need to consider a wide range of factors in establishing standardized approaches for NIRF imaging system performance.
Field testing energy-saving hermetic compressors in residential refrigerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sauber, R.S.; Middleton, M.G.
The design of an energy-saving compressor for low back pressure applications is reviewed. Calorimeter performance results are stated for two sizes of the efficient design and compared with performance test results for a standard compressor. Power consumption of a refrigerator-freezer is given with a standard compressor and with the energy-saving compressor. The preparation of the refrigerators used in the field test is discussed, along with the criteria used in selecting the instrumentation for the project. Results of the energy-saving compressor in the field test, along with a comparison to a standard production compressor, are presented. Some conclusions are drawn, basedmore » on the data, in relation to important factors in residential refrigerator power consumption.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Iain S.; Wray, Craig P.; Guillot, Cyril
2003-08-01
In this report, we discuss the accuracy of flow hoods for residential applications, based on laboratory tests and field studies. The results indicate that commercially available hoods are often inadequate to measure flows in residential systems, and that there can be a wide range of performance between different flow hoods. The errors are due to poor calibrations, sensitivity of existing hoods to grille flow non-uniformities, and flow changes from added flow resistance. We also evaluated several simple techniques for measuring register airflows that could be adopted by the HVAC industry and homeowners as simple diagnostics that are often as accuratemore » as commercially available devices. Our test results also show that current calibration procedures for flow hoods do not account for field application problems. As a result, organizations such as ASHRAE or ASTM need to develop a new standard for flow hood calibration, along with a new measurement standard to address field use of flow hoods.« less
Elastomer Compound Developed for High Wear Applications
NASA Technical Reports Server (NTRS)
Crawford, D.; Feuer, H.; Flanagan, D.; Rodriguez, G.; Teets, A.; Touchet, P.
1993-01-01
The U.S. Army is currently spending 300 million dollars per year replacing rubber track pads. An experimental rubber compound has been developed which exhibits 2 to 3 times greater service life than standard production pad compounds. To improve the service life of the tank track pads various aspects of rubber chemistry were explored including polymer, curing and reinforcing systems. Compounds that exhibited superior physical properties based on laboratory data were then fabricated into tank pads and field tested. This paper will discuss the compounding studies, laboratory data and field testing that led to the high wear elastomer compound.
Current progress in patient-specific modeling
2010-01-01
We present a survey of recent advancements in the emerging field of patient-specific modeling (PSM). Researchers in this field are currently simulating a wide variety of tissue and organ dynamics to address challenges in various clinical domains. The majority of this research employs three-dimensional, image-based modeling techniques. Recent PSM publications mostly represent feasibility or preliminary validation studies on modeling technologies, and these systems will require further clinical validation and usability testing before they can become a standard of care. We anticipate that with further testing and research, PSM-derived technologies will eventually become valuable, versatile clinical tools. PMID:19955236
Finding SDSS Galaxy Clusters in 4-dimensional Color Space Using the False Discovery Rate
NASA Astrophysics Data System (ADS)
Nichol, R. C.; Miller, C. J.; Reichart, D.; Wasserman, L.; Genovese, C.; SDSS Collaboration
2000-12-01
We describe a recently developed statistical technique that provides a meaningful cut-off in probability-based decision making. We are concerned with multiple testing, where each test produces a well-defined probability (or p-value). By well-known, we mean that the null hypothesis used to determine the p-value is fully understood and appropriate. The method is entitled False Discovery Rate (FDR) and its largest advantage over other measures is that it allows one to specify a maximal amount of acceptable error. As an example of this tool, we apply FDR to a four-dimensional clustering algorithm using SDSS data. For each galaxy (or test galaxy), we count the number of neighbors that fit within one standard deviation of a four dimensional Gaussian centered on that test galaxy. The mean and standard deviation of that Gaussian are determined from the colors and errors of the test galaxy. We then take that same Gaussian and place it on a random selection of n galaxies and make a similar count. In the limit of large n, we expect the median count around these random galaxies to represent a typical field galaxy. For every test galaxy we determine the probability (or p-value) that it is a field galaxy based on these counts. A low p-value implies that the test galaxy is in a cluster environment. Once we have a p-value for every galaxy, we use FDR to determine at what level we should make our probability cut-off. Once this cut-off is made, we have a final sample of galaxies that are cluster-like galaxies. Using FDR, we also know the maximum amount of field contamination in our cluster galaxy sample. We present our preliminary galaxy clustering results using these methods.
Modelling rollover behaviour of exacavator-based forest machines
M.W. Veal; S.E. Taylor; Robert B. Rummer
2003-01-01
This poster presentation provides results from analytical and computer simulation models of rollover behaviour of hydraulic excavators. These results are being used as input to the operator protective structure standards development process. Results from rigid body mechanics and computer simulation methods agree well with field rollover test data. These results show...
A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinn, Heather M; Graham, Paul S; Morgan, Keith S
2008-01-01
Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state is stored in radiation-tolerant memory, both could be alterd by the harsh space radiation environment. Both the circuit and the circuit's state can be prote cted by triple-moduler redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper will describe a three-tiered methodology for testing FPGA usermore » designs for space-readiness. We will describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.« less
Design and analysis of the federal aviation administration next generation fire test burner
NASA Astrophysics Data System (ADS)
Ochs, Robert Ian
The United States Federal Aviation Administration makes use of threat-based fire test methods for the certification of aircraft cabin materials to enhance the level of safety in the event of an in-flight or post-crash fire on a transport airplane. The global nature of the aviation industry results in these test methods being performed at hundreds of laboratories around the world; in some cases testing identical materials at multiple labs but yielding different results. Maintenance of this standard for an elevated level of safety requires that the test methods be as well defined as possible, necessitating a comprehensive understanding of critical test method parameters. The tests have evolved from simple Bunsen burner material tests to larger, more complicated apparatuses, requiring greater understanding of the device for proper application. The FAA specifies a modified home heating oil burner to simulate the effects of large, intense fires for testing of aircraft seat cushions, cargo compartment liners, power plant components, and thermal acoustic insulation. Recently, the FAA has developed a Next Generation (NexGen) Fire Test burner to replace the original oil burner that has become commercially unavailable. The NexGen burner design is based on the original oil burner but with more precise control of the air and fuel flow rates with the addition of a sonic nozzle and a pressurized fuel system. Knowledge of the fundamental flow properties created by various burner configurations is desired to develop an updated and standardized burner configuration for use around the world for aircraft materials fire testing and airplane certification. To that end, the NexGen fire test burner was analyzed with Particle Image Velocimetry (PIV) to resolve the non-reacting exit flow field and determine the influence of the configuration of burner components. The correlation between the measured flow fields and the standard burner performance metrics of flame temperature and burnthrough time was studied. Potential design improvements were also evaluated that could simplify burner set up and operation.
ERIC Educational Resources Information Center
Ward, William C.
The Open Field Test was used to assess variables that might not be manifested in a more standard testing situation. In this test, the child was shown 10 standard play objects in the room, and was told to do anything he wished with the toys. The tester initiated no interaction with the child and responded minimally to any overture made by the…
Du, Lei; Sun, Qiao; Cai, Changqing; Bai, Jie; Fan, Zhe; Zhang, Yue
2018-01-01
Traffic speed meters are important legal measuring instruments specially used for traffic speed enforcement and must be tested and verified in the field every year using a vehicular mobile standard speed-measuring instrument to ensure speed-measuring performances. The non-contact optical speed sensor and the GPS speed sensor are the two most common types of standard speed-measuring instruments. The non-contact optical speed sensor requires extremely high installation accuracy, and its speed-measuring error is nonlinear and uncorrectable. The speed-measuring accuracy of the GPS speed sensor is rapidly reduced if the amount of received satellites is insufficient enough, which often occurs in urban high-rise regions, tunnels, and mountainous regions. In this paper, a new standard speed-measuring instrument using a dual-antenna Doppler radar sensor is proposed based on a tradeoff between the installation accuracy requirement and the usage region limitation, which has no specified requirements for its mounting distance and no limitation on usage regions and can automatically compensate for the effect of an inclined installation angle on its speed-measuring accuracy. Theoretical model analysis, simulated speed measurement results, and field experimental results compared with a GPS speed sensor with high accuracy showed that the dual-antenna Doppler radar sensor is effective and reliable as a new standard speed-measuring instrument. PMID:29621142
Du, Lei; Sun, Qiao; Cai, Changqing; Bai, Jie; Fan, Zhe; Zhang, Yue
2018-04-05
Traffic speed meters are important legal measuring instruments specially used for traffic speed enforcement and must be tested and verified in the field every year using a vehicular mobile standard speed-measuring instrument to ensure speed-measuring performances. The non-contact optical speed sensor and the GPS speed sensor are the two most common types of standard speed-measuring instruments. The non-contact optical speed sensor requires extremely high installation accuracy, and its speed-measuring error is nonlinear and uncorrectable. The speed-measuring accuracy of the GPS speed sensor is rapidly reduced if the amount of received satellites is insufficient enough, which often occurs in urban high-rise regions, tunnels, and mountainous regions. In this paper, a new standard speed-measuring instrument using a dual-antenna Doppler radar sensor is proposed based on a tradeoff between the installation accuracy requirement and the usage region limitation, which has no specified requirements for its mounting distance and no limitation on usage regions and can automatically compensate for the effect of an inclined installation angle on its speed-measuring accuracy. Theoretical model analysis, simulated speed measurement results, and field experimental results compared with a GPS speed sensor with high accuracy showed that the dual-antenna Doppler radar sensor is effective and reliable as a new standard speed-measuring instrument.
Development and field test of psychophysical tests for DWI arrest
DOT National Transportation Integrated Search
1981-03-01
Administration and scoring procedures were standardized for a sobriety test battery consisting of the walk-and-turn test, the one leg stand test, and horizontal gaze nystagmus. The effectiveness of the standardized battery was then evaluated in the l...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Middleton, M.G.; Sauber, R.S.
Two models of a high-efficiency compressor were manufactured in a pilot production run. These compressors were for low back-pressure applications. While based on a production compressor, there were many changes that required production process changes. Some changes were performed within our company and others were made by outside vendors. The compressors were used in top mount refrigerator-freezers and sold in normal distribution channels. Forty units were placed in residences for a one-year field test. Additional compressors were built so that a life test program could be performed. The results of the field test reveal a 27.0% improvement in energy consumptionmore » for the 18 ft/sup 3/ high-efficiency model and a 15.6% improvement in the 21 ft/sup 3/ improvement in the 21 ft/sup 3/ high-efficiency model as compared to the standard production unit.« less
Field-programmable lab-on-a-chip based on microelectrode dot array architecture.
Wang, Gary; Teng, Daniel; Lai, Yi-Tse; Lu, Yi-Wen; Ho, Yingchieh; Lee, Chen-Yi
2014-09-01
The fundamentals of electrowetting-on-dielectric (EWOD) digital microfluidics are very strong: advantageous capability in the manipulation of fluids, small test volumes, precise dynamic control and detection, and microscale systems. These advantages are very important for future biochip developments, but the development of EWOD microfluidics has been hindered by the absence of: integrated detector technology, standard commercial components, on-chip sample preparation, standard manufacturing technology and end-to-end system integration. A field-programmable lab-on-a-chip (FPLOC) system based on microelectrode dot array (MEDA) architecture is presented in this research. The MEDA architecture proposes a standard EWOD microfluidic component called 'microelectrode cell', which can be dynamically configured into microfluidic components to perform microfluidic operations of the biochip. A proof-of-concept prototype FPLOC, containing a 30 × 30 MEDA, was developed by using generic integrated circuits computer aided design tools, and it was manufactured with standard low-voltage complementary metal-oxide-semiconductor technology, which allows smooth on-chip integration of microfluidics and microelectronics. By integrating 900 droplet detection circuits into microelectrode cells, the FPLOC has achieved large-scale integration of microfluidics and microelectronics. Compared to the full-custom and bottom-up design methods, the FPLOC provides hierarchical top-down design approach, field-programmability and dynamic manipulations of droplets for advanced microfluidic operations.
Determination of antenna factors using a three-antenna method at open-field test site
NASA Astrophysics Data System (ADS)
Masuzawa, Hiroshi; Tejima, Teruo; Harima, Katsushige; Morikawa, Takao
1992-09-01
Recently NIST has used the three-antenna method for calibration of the antenna factor of an antenna used for EMI measurements. This method does not require the specially designed standard antennas which are necessary in the standard field method or the standard antenna method, and can be used at an open-field test site. This paper theoretically and experimentally examines the measurement errors of this method and evaluates the precision of the antenna-factor calibration. It is found that the main source of the error is the non-ideal propagation characteristics of the test site, which should therefore be measured before the calibration. The precision of the antenna-factor calibration at the test site used in these experiments, is estimated to be 0.5 dB.
29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.
Code of Federal Regulations, 2011 CFR
2011-07-01
... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...
29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.
Code of Federal Regulations, 2013 CFR
2013-07-01
... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...
29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.
Code of Federal Regulations, 2012 CFR
2012-07-01
... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...
29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.
Code of Federal Regulations, 2010 CFR
2010-07-01
... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...
29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.
Code of Federal Regulations, 2014 CFR
2014-07-01
... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...
Descent advisor preliminary field test
NASA Technical Reports Server (NTRS)
Green, Steven M.; Vivona, Robert A.; Sanford, Beverly
1995-01-01
A field test of the Descent Advisor (DA) automation tool was conducted at the Denver Air Route Traffic Control Center in September 1994. DA is being developed to assist Center controllers in the efficient management and control of arrival traffic. DA generates advisories, based on trajectory predictions, to achieve accurate meter-fix arrival times in a fuel efficient manner while assisting the controller with the prediction and resolution of potential conflicts. The test objectives were to evaluate the accuracy of DA trajectory predictions for conventional- and flight-management-system-equipped jet transports, to identify significant sources of trajectory prediction error, and to investigate procedural and training issues (both air and ground) associated with DA operations. Various commercial aircraft (97 flights total) and a Boeing 737-100 research aircraft participated in the test. Preliminary results from the primary test set of 24 commercial flights indicate a mean DA arrival time prediction error of 2.4 sec late with a standard deviation of 13.1 sec. This paper describes the field test and presents preliminary results for the commercial flights.
The repeatability of mean defect with size III and size V standard automated perimetry.
Wall, Michael; Doyle, Carrie K; Zamba, K D; Artes, Paul; Johnson, Chris A
2013-02-15
The mean defect (MD) of the visual field is a global statistical index used to monitor overall visual field change over time. Our goal was to investigate the relationship of MD and its variability for two clinically used strategies (Swedish Interactive Threshold Algorithm [SITA] standard size III and full threshold size V) in glaucoma patients and controls. We tested one eye, at random, for 46 glaucoma patients and 28 ocularly healthy subjects with Humphrey program 24-2 SITA standard for size III and full threshold for size V each five times over a 5-week period. The standard deviation of MD was regressed against the MD for the five repeated tests, and quantile regression was used to show the relationship of variability and MD. A Wilcoxon test was used to compare the standard deviations of the two testing methods following quantile regression. Both types of regression analysis showed increasing variability with increasing visual field damage. Quantile regression showed modestly smaller MD confidence limits. There was a 15% decrease in SD with size V in glaucoma patients (P = 0.10) and a 12% decrease in ocularly healthy subjects (P = 0.08). The repeatability of size V MD appears to be slightly better than size III SITA testing. When using MD to determine visual field progression, a change of 1.5 to 4 decibels (dB) is needed to be outside the normal 95% confidence limits, depending on the size of the stimulus and the amount of visual field damage.
Visual field defects may not affect safe driving.
Dow, Jamie
2011-10-01
In Quebec a driver whose acquired visual field defect renders them ineligible for a driver's permit renewal may request an exemption from the visual field standard by demonstrating safe driving despite the defect. For safety reasons it was decided to attempt to identify predictors of failure on the road test in order to avoid placing driving evaluators in potentially dangerous situations when evaluating drivers with visual field defects. During a 4-month period in 2009 all requests for exemptions from the visual field standard were collected and analyzed. All available medical and visual field data were collated for 103 individuals, of whom 91 successfully completed the evaluation process and obtained a waiver. The collated data included age, sex, type of visual field defect, visual field characteristics, and concomitant medical problems. No single factor, or combination of factors, could predict failure of the road test. All 5 failures of the road test had cognitive problems but 6 of the successful drivers also had known cognitive problems. Thus, cognitive problems influence the risk of failure but do not predict certain failure. Most of the applicants for an exemption were able to complete the evaluation process successfully, thereby demonstrating safe driving despite their handicap. Consequently, jurisdictions that have visual field standards for their driving permit should implement procedures to evaluate drivers with visual field defects that render them unable to meet the standard but who wish to continue driving.
Recreational Pilot Practical Test Standards for Airplane, Rotorcraft
DOT National Transportation Integrated Search
1989-04-01
The Aviation Standards National Field Office of the FAA has : developed this book to be used as a standard by FAA inspectors : and designated pilot examiners when conducting recreational pilot : airmen practical tests. Flight instructors are expected...
2011-01-01
Background We studied the worst-case radiated radiofrequency (RF) susceptibility of automated external defibrillators (AEDs) based on the electromagnetic compatibility (EMC) requirements of a current standard for cardiac defibrillators, IEC 60601-2-4. Square wave modulation was used to mimic cardiac physiological frequencies of 1 - 3 Hz. Deviations from the IEC standard were a lower frequency limit of 30 MHz to explore frequencies where the patient-connected leads could resonate. Also testing up to 20 V/m was performed. We tested AEDs with ventricular fibrillation (V-Fib) and normal sinus rhythm signals on the patient leads to enable testing for false negatives (inappropriate "no shock advised" by the AED). Methods We performed radiated exposures in a 10 meter anechoic chamber using two broadband antennas to generate E fields in the 30 - 2500 MHz frequency range at 1% frequency steps. An AED patient simulator was housed in a shielded box and delivered normal and fibrillation waveforms to the AED's patient leads. We developed a technique to screen ECG waveforms stored in each AED for electromagnetic interference at all frequencies without waiting for the long cycle times between analyses (normally 20 to over 200 s). Results Five of the seven AEDs tested were susceptible to RF interference, primarily at frequencies below 80 MHz. Some induced errors could cause AEDs to malfunction and effectively inhibit operator prompts to deliver a shock to a patient experiencing lethal fibrillation. Failures occurred in some AEDs exposed to E fields between 3 V/m and 20 V/m, in the 38 - 50 MHz range. These occurred when the patient simulator was delivering a V-Fib waveform to the AED. Also, we found it is not possible to test modern battery-only-operated AEDs for EMI using a patient simulator if the IEC 60601-2-4 defibrillator standard's simulated patient load is used. Conclusions AEDs experienced potentially life-threatening false-negative failures from radiated RF, primarily below the lower frequency limit of present AED standards. Field strengths causing failures were at levels as low as 3 V/m at frequencies below 80 MHz where resonance of the patient leads and the AED input circuitry occurred. This plus problems with the standard's' prescribed patient load make changes to the standard necessary. PMID:21801368
Umberger, Ken; Bassen, Howard I
2011-07-29
We studied the worst-case radiated radiofrequency (RF) susceptibility of automated external defibrillators (AEDs) based on the electromagnetic compatibility (EMC) requirements of a current standard for cardiac defibrillators, IEC 60601-2-4. Square wave modulation was used to mimic cardiac physiological frequencies of 1-3 Hz. Deviations from the IEC standard were a lower frequency limit of 30 MHz to explore frequencies where the patient-connected leads could resonate. Also testing up to 20 V/m was performed. We tested AEDs with ventricular fibrillation (V-Fib) and normal sinus rhythm signals on the patient leads to enable testing for false negatives (inappropriate "no shock advised" by the AED). We performed radiated exposures in a 10 meter anechoic chamber using two broadband antennas to generate E fields in the 30-2500 MHz frequency range at 1% frequency steps. An AED patient simulator was housed in a shielded box and delivered normal and fibrillation waveforms to the AED's patient leads. We developed a technique to screen ECG waveforms stored in each AED for electromagnetic interference at all frequencies without waiting for the long cycle times between analyses (normally 20 to over 200 s). Five of the seven AEDs tested were susceptible to RF interference, primarily at frequencies below 80 MHz. Some induced errors could cause AEDs to malfunction and effectively inhibit operator prompts to deliver a shock to a patient experiencing lethal fibrillation. Failures occurred in some AEDs exposed to E fields between 3 V/m and 20 V/m, in the 38 - 50 MHz range. These occurred when the patient simulator was delivering a V-Fib waveform to the AED. Also, we found it is not possible to test modern battery-only-operated AEDs for EMI using a patient simulator if the IEC 60601-2-4 defibrillator standard's simulated patient load is used. AEDs experienced potentially life-threatening false-negative failures from radiated RF, primarily below the lower frequency limit of present AED standards. Field strengths causing failures were at levels as low as 3 V/m at frequencies below 80 MHz where resonance of the patient leads and the AED input circuitry occurred. This plus problems with the standard's' prescribed patient load make changes to the standard necessary.
NASA Astrophysics Data System (ADS)
Archer, Andrew J.; Chacko, Blesson; Evans, Robert
2017-07-01
In classical density functional theory (DFT), the part of the Helmholtz free energy functional arising from attractive inter-particle interactions is often treated in a mean-field or van der Waals approximation. On the face of it, this is a somewhat crude treatment as the resulting functional generates the simple random phase approximation (RPA) for the bulk fluid pair direct correlation function. We explain why using standard mean-field DFT to describe inhomogeneous fluid structure and thermodynamics is more accurate than one might expect based on this observation. By considering the pair correlation function g(x) and structure factor S(k) of a one-dimensional model fluid, for which exact results are available, we show that the mean-field DFT, employed within the test-particle procedure, yields results much superior to those from the RPA closure of the bulk Ornstein-Zernike equation. We argue that one should not judge the quality of a DFT based solely on the approximation it generates for the bulk pair direct correlation function.
Laboratory and Field Evaluation of Rapid Setting Cementitious Materials for Large Crater Repair
2010-05-01
frame used within which to complete the repair was the current NATO standard of 4 hr. A total of 6 simulated craters were prepared, with each repair...Combat Command 129 Andrews Street Langley Air Force Base, VA 23665 ERDC TR-10-4 ii Abstract: Current practice for expedient runway repair...penalty. Numerous commercial products are available. A full-scale field test was conducted using rapid setting materials to repair simulated bomb craters
DOT National Transportation Integrated Search
2008-05-23
This report presents the results of the ITS Standards Testing Program for the field testing, assessment, and evaluation of the three volumes comprising the Standards for Traffic Management Center to Center Communications (TMDD) version 2.1 and the NT...
Environmental characterisation of coal mine waste rock in the field: an example from New Zealand
NASA Astrophysics Data System (ADS)
Hughes, J.; Craw, D.; Peake, B.; Lindsay, P.; Weber, P.
2007-08-01
Characterisation of mine waste rock with respect to acid generation potential is a necessary part of routine mine operations, so that environmentally benign waste rock stacks can be constructed for permanent storage. Standard static characterisation techniques, such as acid neutralisation capacity (ANC), maximum potential acidity, and associated acid-base accounting, require laboratory tests that can be difficult to obtain rapidly at remote mine sites. We show that a combination of paste pH and a simple portable carbonate dissolution test, both techniques that can be done in the field in a 15 min time-frame, is useful for distinguishing rocks that are potentially acid-forming from those that are acid-neutralising. Use of these techniques could allow characterisation of mine wastes at the metre scale during mine excavation operations. Our application of these techniques to pyrite-bearing (total S = 1-4 wt%) but variably calcareous coal mine overburden shows that there is a strong correlation between the portable carbonate dissolution technique and laboratory-determined ANC measurements (range of 0-10 wt% calcite equivalent). Paste pH measurements on the same rocks are bimodal, with high-sulphur, low-calcite rocks yielding pH near 3 after 10 min, whereas high-ANC rocks yield paste pH of 7-8. In our coal mine example, the field tests were most effective when used in conjunction with stratigraphy. However, the same field tests have potential for routine use in any mine in which distinction of acid-generating rocks from acid-neutralising rocks is required. Calibration of field-based acid-base accounting characteristics of the rocks with laboratory-based static and/or kinetic tests is still necessary.
Chen, Xiang-Wu; Zhao, Ying-Xi
2017-01-01
AIM To compare the diagnostic performance of isolated-check visual evoked potential (icVEP) and standard automated perimetry (SAP), for evaluating the application values of icVEP in the detection of early glaucoma. METHODS Totally 144 subjects (288 eyes) were enrolled in this study. icVEP testing was performed with the Neucodia visual electrophysiological diagnostic system. A 15% positive-contrast (bright) condition pattern was used in this device to differentiate between glaucoma patients and healthy control subjects. Signal-to-noise ratios (SNR) were derived based on a multivariate statistic. The eyes were judged as abnormal if the test yielded an SNR≤1. SAP testing was performed with the Humphrey Field Analyzer II. The visual fields were deemed as abnormality if the glaucoma hemifield test results outside normal limits; or the pattern standard deviation with P<0.05; or the cluster of three or more non-edge points on the pattern deviation plot in a single hemifield with P<0.05, one of which must have a P<0.01. Disc photographs were graded as either glaucomatous optic neuropathy or normal by two experts who were masked to all other patient information. Moorfields regression analysis (MRA) used as a separate diagnostic classification was performed by Heidelberg retina tomograph (HRT). RESULTS When the disc photograph grader was used as diagnostic standard, the sensitivity for SAP and icVEP was 32.3% and 38.5% respectively and specificity was 82.3% and 77.8% respectively. When the MRA Classifier was used as the diagnostic standard, the sensitivity for SAP and icVEP was 48.6% and 51.4% respectively and specificity was 84.1% and 78.0% respectively. When the combined structural assessment was used as the diagnostic standard, the sensitivity for SAP and icVEP was 59.2% and 53.1% respectively and specificity was 84.2% and 84.6% respectivlely. There was no statistical significance between the sensitivity or specificity of SAP and icVEP, regardless of which diagnostic standard was based on. CONCLUSION The diagnostic performance of icVEP is not better than that of SAP in the detection of early glaucoma. PMID:28503434
[Research progress on mechanical performance evaluation of artificial intervertebral disc].
Li, Rui; Wang, Song; Liao, Zhenhua; Liu, Weiqiang
2018-03-01
The mechanical properties of artificial intervertebral disc (AID) are related to long-term reliability of prosthesis. There are three testing methods involved in the mechanical performance evaluation of AID based on different tools: the testing method using mechanical simulator, in vitro specimen testing method and finite element analysis method. In this study, the testing standard, testing equipment and materials of AID were firstly introduced. Then, the present status of AID static mechanical properties test (static axial compression, static axial compression-shear), dynamic mechanical properties test (dynamic axial compression, dynamic axial compression-shear), creep and stress relaxation test, device pushout test, core pushout test, subsidence test, etc. were focused on. The experimental techniques using in vitro specimen testing method and testing results of available artificial discs were summarized. The experimental methods and research status of finite element analysis were also summarized. Finally, the research trends of AID mechanical performance evaluation were forecasted. The simulator, load, dynamic cycle, motion mode, specimen and test standard would be important research fields in the future.
Travensolo, Cristiane; Goessler, Karla; Poton, Roberto; Pinto, Roberta Ramos; Polito, Marcos Doederlein
2018-04-13
The literature concerning the effects of cardiac rehabilitation (CR) on field tests results is inconsistent. To perform a systematic review with meta-analysis on field tests results after programs of CR. Studies published in PubMed and Web of Science databases until May 2016 were analyzed. The standard difference in means correct by bias (Hedges' g) was used as effect size (g) to measure que amount of modifications in performance of field tests after CR period. Potential differences between subgroups were analyzed by Q-test based on ANOVA. Fifteen studies published between 1996 e 2016 were included in the review, 932 patients and age ranged 54,4 - 75,3 years old. Fourteen studies used the six-minutes walking test to evaluate the exercise capacity and one study used the Shuttle Walk Test. The random Hedges's g was 0.617 (P<0.001), representing a drop of 20% in the performance of field test after CR. The meta-regression showed significantly association (P=0.01) to aerobic exercise duration, i.e., for each 1-min increase in aerobic exercise duration, there is a 0.02 increase in effect size for performance in the field test. Field tests can detect physical modification after CR, and the large duration of aerobic exercise during CR was associated with a better result. Copyright © 2018 Sociedade Portuguesa de Cardiologia. Publicado por Elsevier España, S.L.U. All rights reserved.
Huang, Wen-Yen; Hung, Weiteng; Vu, Chi Thanh; Chen, Wei-Ting; Lai, Jhih-Wei; Lin, Chitsan
2016-11-01
Taiwan has a large number of poorly managed contaminated sites in need of remediation. This study proposes a framework, a set of standards, and a spreadsheet-based evaluation tool for implementing green and sustainable principles into remediation projects and evaluating the projects from this perspective. We performed a case study to understand how the framework would be applied. For the case study, we used a spreadsheet-based evaluation tool (SEFA) and performed field scale cultivation tests on a site contaminated with total petroleum hydrocarbons (TPHs). The site was divided into two lots: one treated by chemical oxidation and the other by bioremediation. We evaluated five core elements of green and sustainable remediation (GSR): energy, air, water resources, materials and wastes, and land and ecosystem. The proposed evaluation tool and field scale cultivation test were found to efficiently assess the effectiveness of the two remediation alternatives. The framework and related tools proposed herein can potentially be used to support decisions about the remediation of contaminated sites taking into account engineering management, cost effectiveness, and social reconciliation.
Gargis, Amy S; Kalman, Lisa; Lubin, Ira M
2016-12-01
Clinical microbiology and public health laboratories are beginning to utilize next-generation sequencing (NGS) for a range of applications. This technology has the potential to transform the field by providing approaches that will complement, or even replace, many conventional laboratory tests. While the benefits of NGS are significant, the complexities of these assays require an evolving set of standards to ensure testing quality. Regulatory and accreditation requirements, professional guidelines, and best practices that help ensure the quality of NGS-based tests are emerging. This review highlights currently available standards and guidelines for the implementation of NGS in the clinical and public health laboratory setting, and it includes considerations for NGS test validation, quality control procedures, proficiency testing, and reference materials. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
Mechanistic evaluation of test data from LTPP flexible pavement test sections, Vol. I
DOT National Transportation Integrated Search
1996-01-01
This report summarizes the process and lessons learned from the Standardized Travel Time Surveys and Field Test project. The field tests of travel time data collection were conducted in Boston, Seattle, and Lexington in 1993. The methodologies tested...
Methods for the field evaluation of quantitative G6PD diagnostics: a review.
Ley, Benedikt; Bancone, Germana; von Seidlein, Lorenz; Thriemer, Kamala; Richards, Jack S; Domingo, Gonzalo J; Price, Ric N
2017-09-11
Individuals with glucose-6-phosphate dehydrogenase (G6PD) deficiency are at risk of severe haemolysis following the administration of 8-aminoquinoline compounds. Primaquine is the only widely available 8-aminoquinoline for the radical cure of Plasmodium vivax. Tafenoquine is under development with the potential to simplify treatment regimens, but point-of-care (PoC) tests will be needed to provide quantitative measurement of G6PD activity prior to its administration. There is currently a lack of appropriate G6PD PoC tests, but a number of new tests are in development and are likely to enter the market in the coming years. As these are implemented, they will need to be validated in field studies. This article outlines the technical details for the field evaluation of novel quantitative G6PD diagnostics such as sample handling, reference testing and statistical analysis. Field evaluation is based on the comparison of paired samples, including one sample tested by the new assay at point of care and one sample tested by the gold-standard reference method, UV spectrophotometry in an established laboratory. Samples can be collected as capillary or venous blood; the existing literature suggests that potential differences in capillary or venous blood are unlikely to affect results substantially. The collection and storage of samples is critical to ensure preservation of enzyme activity, it is recommended that samples are stored at 4 °C and testing occurs within 4 days of collection. Test results can be visually presented as scatter plot, Bland-Altman plot, and a histogram of the G6PD activity distribution of the study population. Calculating the adjusted male median allows categorizing results according to G6PD activity to calculate standard performance indicators and to perform receiver operating characteristic (ROC) analysis.
Visual function and fitness to drive.
Kotecha, Aachal; Spratt, Alexander; Viswanathan, Ananth
2008-01-01
Driving is recognized to be a visually intensive task and accordingly there is a legal minimum standard of vision required for all motorists. The purpose of this paper is to review the current United Kingdom (UK) visual requirements for driving and discuss the evidence base behind these legal rules. The role of newer, alternative tests of visual function that may be better indicators of driving safety will also be considered. Finally, the implications of ageing on driving ability are discussed. A search of Medline and PubMed databases was performed using the following keywords: driving, vision, visual function, fitness to drive and ageing. In addition, papers from the Department of Transport website and UK Royal College of Ophthalmologists guidelines were studied. Current UK visual standards for driving are based upon historical concepts, but recent advances in technology have brought about more sophisticated methods for assessing the status of the binocular visual field and examining visual attention. These tests appear to be better predictors of driving performance. Further work is required to establish whether these newer tests should be incorporated in the current UK visual standards when examining an individual's fitness to drive.
Geostatistics as a validation tool for setting ozone standards for durum wheat.
De Marco, Alessandra; Screpanti, Augusto; Paoletti, Elena
2010-02-01
Which is the best standard for protecting plants from ozone? To answer this question, we must validate the standards by testing biological responses vs. ambient data in the field. A validation is missing for European and USA standards, because the networks for ozone, meteorology and plant responses are spatially independent. We proposed geostatistics as validation tool, and used durum wheat in central Italy as a test. The standards summarized ozone impact on yield better than hourly averages. Although USA criteria explained ozone-induced yield losses better than European criteria, USA legal level (75 ppb) protected only 39% of sites. European exposure-based standards protected > or =90%. Reducing the USA level to the Canadian 65 ppb or using W126 protected 91% and 97%, respectively. For a no-threshold accumulated stomatal flux, 22 mmol m(-2) was suggested to protect 97% of sites. In a multiple regression, precipitation explained 22% and ozone explained <0.9% of yield variability. Copyright (c) 2009 Elsevier Ltd. All rights reserved.
Hydrogen Field Test Standard: Laboratory and Field Performance
Pope, Jodie G.; Wright, John D.
2015-01-01
The National Institute of Standards and Technology (NIST) developed a prototype field test standard (FTS) that incorporates three test methods that could be used by state weights and measures inspectors to periodically verify the accuracy of retail hydrogen dispensers, much as gasoline dispensers are tested today. The three field test methods are: 1) gravimetric, 2) Pressure, Volume, Temperature (PVT), and 3) master meter. The FTS was tested in NIST's Transient Flow Facility with helium gas and in the field at a hydrogen dispenser location. All three methods agree within 0.57 % and 1.53 % for all test drafts of helium gas in the laboratory setting and of hydrogen gas in the field, respectively. The time required to perform six test drafts is similar for all three methods, ranging from 6 h for the gravimetric and master meter methods to 8 h for the PVT method. The laboratory tests show that 1) it is critical to wait for thermal equilibrium to achieve density measurements in the FTS that meet the desired uncertainty requirements for the PVT and master meter methods; in general, we found a wait time of 20 minutes introduces errors < 0.1 % and < 0.04 % in the PVT and master meter methods, respectively and 2) buoyancy corrections are important for the lowest uncertainty gravimetric measurements. The field tests show that sensor drift can become a largest component of uncertainty that is not present in the laboratory setting. The scale was calibrated after it was set up at the field location. Checks of the calibration throughout testing showed drift of 0.031 %. Calibration of the master meter and the pressure sensors prior to travel to the field location and upon return showed significant drifts in their calibrations; 0.14 % and up to 1.7 %, respectively. This highlights the need for better sensor selection and/or more robust sensor testing prior to putting into field service. All three test methods are capable of being successfully performed in the field and give equivalent answers if proper sensors without drift are used. PMID:26722192
NASA Astrophysics Data System (ADS)
Zenoni, A.; Bignotti, F.; Donzella, A.; Donzella, G.; Ferrari, M.; Pandini, S.; Andrighetto, A.; Ballan, M.; Corradetti, S.; Manzolaro, M.; Monetti, A.; Rossignoli, M.; Scarpa, D.; Alloni, D.; Prata, M.; Salvini, A.; Zelaschi, F.
2017-11-01
Materials and components employed in the presence of intense neutron and gamma fields are expected to absorb high dose levels that may induce deep modifications of their physical and mechanical properties, possibly causing loss of their function. A protocol for irradiating elastomeric materials in reactor mixed neutron and gamma fields and for testing the evolution of their main mechanical and physical properties with absorbed dose has been developed. Four elastomeric compounds used for vacuum O-rings, one fluoroelastomer polymer (FPM) based and three ethylene propylene diene monomer rubber (EPDM) based, presently available on the market have been selected for the test. One EPDM is rated as radiation resistant in gamma fields, while the other elastomers are general purpose products. Particular care has been devoted to dosimetry calculations, since absorbed dose in neutron fields, unlike pure gamma fields, is strongly dependent on the material composition and, in particular, on the hydrogen content. The products have been tested up to about 2 MGy absorbed dose. The FPM based elastomer, in spite of its lower dose absorption in fast neutron fields, features the largest variations of properties, with a dramatic increase in stiffness and brittleness. Out of the three EPDM based compounds, one shows large and rapid changes in the main mechanical properties, whereas the other two feature more stable behaviors. The performance of the EPDM rated as radiation resistant in pure gamma fields does not appear significantly better than that of the standard product. The predictive capability of the accelerated irradiation tests performed as well as the applicable concepts of threshold of radiation damage is discussed in view of the use of the examined products in the selective production of exotic species facility, now under construction at the Legnaro National Laboratories of the Italian Istituto Nazionale di Fisica Nucleare. It results that a careful account of dose rate effects and oxygen penetration in the material, both during test irradiations and in operating conditions, is needed to obtain reliable predictions.
Zenoni, A; Bignotti, F; Donzella, A; Donzella, G; Ferrari, M; Pandini, S; Andrighetto, A; Ballan, M; Corradetti, S; Manzolaro, M; Monetti, A; Rossignoli, M; Scarpa, D; Alloni, D; Prata, M; Salvini, A; Zelaschi, F
2017-11-01
Materials and components employed in the presence of intense neutron and gamma fields are expected to absorb high dose levels that may induce deep modifications of their physical and mechanical properties, possibly causing loss of their function. A protocol for irradiating elastomeric materials in reactor mixed neutron and gamma fields and for testing the evolution of their main mechanical and physical properties with absorbed dose has been developed. Four elastomeric compounds used for vacuum O-rings, one fluoroelastomer polymer (FPM) based and three ethylene propylene diene monomer rubber (EPDM) based, presently available on the market have been selected for the test. One EPDM is rated as radiation resistant in gamma fields, while the other elastomers are general purpose products. Particular care has been devoted to dosimetry calculations, since absorbed dose in neutron fields, unlike pure gamma fields, is strongly dependent on the material composition and, in particular, on the hydrogen content. The products have been tested up to about 2 MGy absorbed dose. The FPM based elastomer, in spite of its lower dose absorption in fast neutron fields, features the largest variations of properties, with a dramatic increase in stiffness and brittleness. Out of the three EPDM based compounds, one shows large and rapid changes in the main mechanical properties, whereas the other two feature more stable behaviors. The performance of the EPDM rated as radiation resistant in pure gamma fields does not appear significantly better than that of the standard product. The predictive capability of the accelerated irradiation tests performed as well as the applicable concepts of threshold of radiation damage is discussed in view of the use of the examined products in the selective production of exotic species facility, now under construction at the Legnaro National Laboratories of the Italian Istituto Nazionale di Fisica Nucleare. It results that a careful account of dose rate effects and oxygen penetration in the material, both during test irradiations and in operating conditions, is needed to obtain reliable predictions.
Laboratory and field evaluation of hot mix asphalt with high contents of reclaimed asphalt pavement
NASA Astrophysics Data System (ADS)
Van Winkle, Clinton Isaac
Currently in Iowa, the amount of RAP materials allowed for the surface layer is limited to 15% by weight. The objective of this project was to develop quality standards for inclusion of RAP content higher than 15% in asphalt mixtures. To meet Superpave mix design requirements, it was necessary to fractionate the RAP materials. Based on the extensive sieve-by-sieve analysis of RAP materials, the optimum sieve size to fractionate RAP materials was identified. To determine if the higher percentage of RAP materials than 15% can be used in Iowa's state highway, three test sections with 30.0%, 35.5% and 39.2% of RAP materials were constructed on Highway 6 in Iowa City. The construction of the field test sections was monitored and the cores were obtained to measure field densities of test sections. Field mixtures collected from test sections were compacted in the laboratory in order to test the moisture sensitivity using a Hamburg Wheel Tracking Device. The binder was extracted from the field mixtures with varying amounts of RAP materials and tested to determine the effects of RAP materials on the PG grade of a virgin binder. Field cores were taken from the various mix designs to determine the percent density of each test section. A condition survey of the test sections was then performed to evaluate the short-term performance.
Analysis of complex environment effect on near-field emission
NASA Astrophysics Data System (ADS)
Ravelo, B.; Lalléchère, S.; Bonnet, P.; Paladian, F.
2014-10-01
The article is dealing with uncertainty analyses of radiofrequency circuits electromagnetic compatibility emission based on the near-field/near-field (NF/NF) transform combined with stochastic approach. By using 2D data corresponding to electromagnetic (EM) field (X=E or H) scanned in the observation plane placed at the position z0 above the circuit under test (CUT), the X field map was extracted. Then, uncertainty analyses were assessed via the statistical moments from X component. In addition, stochastic collocation based was considered and calculations were applied to planar EM NF radiated by the CUTs as Wilkinson power divider and a microstrip line operating at GHz levels. After Matlab implementation, the mean and standard deviation were assessed. The present study illustrates how the variations of environmental parameters may impact EM fields. The NF uncertainty methodology can be applied to any physical parameter effects in complex environment and useful for printed circuit board (PCBs) design guideline.
NASA Astrophysics Data System (ADS)
Miranda, Jorge; Cabral, Jorge; Ravelo, Blaise; Wagner, Stefan; Pedersen, Christian F.; Memon, Mukhtiar; Mathiesen, Morten
2015-01-01
An innovative e-healthcare platform named common recognition and identification platform (CRIP) was developed and tested as part of the CareStore project. CareStore and CRIP aims at delivering accurate and safe disease management by minimising human operator errors in hospitals and care facilities. To support this, the CRIP platform features fingerprint biometrics and near field communication (NFC) for user identification; and Bluetooth communication support for a range of telemedicine medical devices adhering to the IEEE 11073 standard. The aim of this study was to evaluate the electromagnetic compatibility (EMC) immunity of the CRIP platform in order to validate it for medical application use. The first prototype of CRIP was demonstrated to operate as expected by showing the user identification function feasibility, both via NFC and biometric, and by detection of Bluetooth devices via radio frequency (RF) scanning. The NFC module works in the 13.56 MHz band and the Bluetooth module work in the 2.4 GHz band, according to the IEEE 802.15.1 standard. The standard test qualification of the CRIP was performed based on the radiated EMC immunity with respect to the EN 61000-4-3 standard. The immunity tests were conducted under industrial EMC compliance with electric field aggression, with levels up to 10 V/m in both horizontal and vertical polarisations when the test antenna and the CRIP were placed at a distance of 3 m. It was found that the CRIP device complies with the European electromagnetic (EM) radiation immunity requirements.
29 CFR Appendix A to Subpart Q of... - References to subpart Q of Part 1926
Code of Federal Regulations, 2010 CFR
2010-07-01
... (ASTM C39-86). • Standard Test Method for Making and Curing Concrete Test Specimens in the Field (ASTM C31-85). • Standard Test Method for Penetration Resistance of Hardened Concrete (ASTM C803-82... (ASTM C873-85). • Standard Method for Developing Early Age Compressive Test Values and Projecting Later...
Early detection of glaucoma by means of a novel 3D computer‐automated visual field test
Nazemi, Paul P; Fink, Wolfgang; Sadun, Alfredo A; Francis, Brian; Minckler, Donald
2007-01-01
Purpose A recently devised 3D computer‐automated threshold Amsler grid test was used to identify early and distinctive defects in people with suspected glaucoma. Further, the location, shape and depth of these field defects were characterised. Finally, the visual fields were compared with those obtained by standard automated perimetry. Patients and methods Glaucoma suspects were defined as those having elevated intraocular pressure (>21 mm Hg) or cup‐to‐disc ratio of >0.5. 33 patients and 66 eyes with risk factors for glaucoma were examined. 15 patients and 23 eyes with no risk factors were tested as controls. The recently developed 3D computer‐automated threshold Amsler grid test was used. The test exhibits a grid on a computer screen at a preselected greyscale and angular resolution, and allows patients to trace those areas on the grid that are missing in their visual field using a touch screen. The 5‐minute test required that the patients repeatedly outline scotomas on a touch screen with varied displays of contrast while maintaining their gaze on a central fixation marker. A 3D depiction of the visual field defects was then obtained that was further characterised by the location, shape and depth of the scotomas. The exam was repeated three times per eye. The results were compared to Humphrey visual field tests (ie, achromatic standard or SITA standard 30‐2 or 24‐2). Results In this pilot study 79% of the eyes tested in the glaucoma‐suspect group repeatedly demonstrated visual field loss with the 3D perimetry. The 3D depictions of visual field loss associated with these risk factors were all characteristic of or compatible with glaucoma. 71% of the eyes demonstrated arcuate defects or a nasal step. Constricted visual fields were shown in 29% of the eyes. No visual field changes were detected in the control group. Conclusions The 3D computer‐automated threshold Amsler grid test may demonstrate visual field abnormalities characteristic of glaucoma in glaucoma suspects with normal achromatic Humphrey visual field testing. This test may be used as a screening tool for the early detection of glaucoma. PMID:17504855
Field reliability of Ricor microcoolers
NASA Astrophysics Data System (ADS)
Pundak, N.; Porat, Z.; Barak, M.; Zur, Y.; Pasternak, G.
2009-05-01
Over the recent 25 years Ricor has fielded in excess of 50,000 Stirling cryocoolers, among which approximately 30,000 units are of micro integral rotary driven type. The statistical population of the fielded units is counted in thousands/ hundreds per application category. In contrast to MTTF values as gathered and presented based on standard reliability demonstration tests, where the failure of the weakest component dictates the end of product life, in the case of field reliability, where design and workmanship failures are counted and considered, the values are usually reported in number of failures per million hours of operation. These values are important and relevant to the prediction of service capabilities and plan.
A Field-Based Aquatic Life Benchmark for Conductivity in ...
This report adapts the standard U.S. EPA methodology for deriving ambient water quality criteria. Rather than use toxicity test results, the adaptation uses field data to determine the loss of 5% of genera from streams. The method is applied to derive effect benchmarks for dissolved salts as measured by conductivity in Central Appalachian streams using data from West Virginia and Kentucky. This report provides scientific evidence for a conductivity benchmark in a specific region rather than for the entire United States.
Results of the long range position-determining system tests. [Field Army system
NASA Technical Reports Server (NTRS)
Rhode, F. W.
1973-01-01
The long range position-determining system (LRPDS) has been developed by the Corps of Engineers to provide the Field Army with a rapid and accurate positioning capability. The LRPDS consists of an airborne reference position set (RPS), up to 30 ground based positioning sets (PS), and a position computing central (PCC). The PCC calculates the position of each PS based on the range change information provided by each Set. The positions can be relayed back to the PS again via RPS. Each PS unit contains a double oven precise crystal oscillator. The RPS contains a Hewlett-Packard cesium beam standard. Frequency drifts and off-sets of the crystal oscillators are taken in account in the data reduction process. A field test program was initiated in November 1972. A total of 54 flights were made which included six flights for equipment testing and 48 flights utilizing the field test data reduction program. The four general types of PS layouts used were: short range; medium range; long range; tactical configuration. The overall RMS radial error of the unknown positions varied from about 2.3 meters for the short range to about 15 meters for the long range. The corresponding elevation RMS errors vary from about 12 meters to 37 meters.
Pressure calculation in hybrid particle-field simulations
NASA Astrophysics Data System (ADS)
Milano, Giuseppe; Kawakatsu, Toshihiro
2010-12-01
In the framework of a recently developed scheme for a hybrid particle-field simulation techniques where self-consistent field (SCF) theory and particle models (molecular dynamics) are combined [J. Chem. Phys. 130, 214106 (2009)], we developed a general formulation for the calculation of instantaneous pressure and stress tensor. The expressions have been derived from statistical mechanical definition of the pressure starting from the expression for the free energy functional in the SCF theory. An implementation of the derived formulation suitable for hybrid particle-field molecular dynamics-self-consistent field simulations is described. A series of test simulations on model systems are reported comparing the calculated pressure with those obtained from standard molecular dynamics simulations based on pair potentials.
Full-Field Strain Methods for Investigating Failure Mechanisms in Triaxial Braided Composites
NASA Technical Reports Server (NTRS)
Littell, Justin D.; Binienda, Wieslaw K.; Goldberg, Robert K.; Roberts, Gary D.
2008-01-01
Recent advancements in braiding technology have led to commercially viable manufacturing approaches for making large structures with complex shape out of triaxial braided composite materials. In some cases, the static load capability of structures made using these materials has been higher than expected based on material strength properties measured using standard coupon tests. A more detailed investigation of deformation and failure processes in large-unit-cell-size triaxial braid composites is needed to evaluate the applicability of standard test methods for these materials and to develop alternative testing approaches. This report presents some new techniques that have been developed to investigate local deformation and failure using digital image correlation techniques. The methods were used to measure both local and global strains during standard straight-sided coupon tensile tests on composite materials made with 12- and 24-k yarns and a 0 /+60 /-60 triaxial braid architecture. Local deformation and failure within fiber bundles was observed and correlations were made between these local failures and global composite deformation and strength.
Physical employment standards for U.K. fire and rescue service personnel.
Blacker, S D; Rayson, M P; Wilkinson, D M; Carter, J M; Nevill, A M; Richmond, V L
2016-01-01
Evidence-based physical employment standards are vital for recruiting, training and maintaining the operational effectiveness of personnel in physically demanding occupations. (i) Develop criterion tests for in-service physical assessment, which simulate the role-related physical demands of UK fire and rescue service (UK FRS) personnel. (ii) Develop practical physical selection tests for FRS applicants. (iii) Evaluate the validity of the selection tests to predict criterion test performance. Stage 1: we conducted a physical demands analysis involving seven workshops and an expert panel to document the key physical tasks required of UK FRS personnel and to develop 'criterion' and 'selection' tests. Stage 2: we measured the performance of 137 trainee and 50 trained UK FRS personnel on selection, criterion and 'field' measures of aerobic power, strength and body size. Statistical models were developed to predict criterion test performance. Stage 3: matter experts derived minimum performance standards. We developed single person simulations of the key physical tasks required of UK FRS personnel as criterion and selection tests (rural fire, domestic fire, ladder lift, ladder extension, ladder climb, pump assembly, enclosed space search). Selection tests were marginally stronger predictors of criterion test performance (r = 0.88-0.94, 95% Limits of Agreement [LoA] 7.6-14.0%) than field test scores (r = 0.84-0.94, 95% LoA 8.0-19.8%) and offered greater face and content validity and more practical implementation. This study outlines the development of role-related, gender-free physical employment tests for the UK FRS, which conform to equal opportunities law. © The Author 2015. Published by Oxford University Press on behalf of the Society of Occupational Medicine. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Electromagnetic Waves in a Uniform Gravitational Field and Planck's Postulate
ERIC Educational Resources Information Center
Acedo, Luis; Tung, Michael M.
2012-01-01
The gravitational redshift forms the central part of the majority of the classical tests for the general theory of relativity. It could be successfully checked even in laboratory experiments on the earth's surface. The standard derivation of this effect is based on the distortion of the local structure of spacetime induced by large masses. The…
ERIC Educational Resources Information Center
Hirumi, Atsusi; Johnson, Teresa; Reyes, Ramsamooj Javier; Lok, Benjamin; Johnsen, Kyle; Rivera-Gutierrez, Diego J.; Bogert, Kenneth; Kubovec, Stacey; Eakins, Michael; Kleinsmith, Andrea; Bellew, Michael; Cendan, Juan
2016-01-01
In Part I of this two-part series, we examined the design and development of NERVE: A virtual patient simulation created to give medical students standardized experiences in interviewing, examining, and diagnosing virtual patients with cranial nerve disorders. We illustrated key design features and discussed how design-based research studies…
Sathantriphop, Sunaiyana; Kongmee, Monthathip; Tainchum, Krajana; Suwansirisilp, Kornwika; Sanguanpong, Unchalee; Bangs, Michael J; Chareonviriyaphap, Theeraphap
2015-12-01
The repellent and irritant effects of three essential oils-clove, hairy basil, and sweet basil-were compared using an excito-repellency test system against an insecticide-resistant strain of Aedes aegypti (L.) females from Pu Teuy, Kanchanaburi Province. DEET was used as the comparison standard compound. Tests were conducted under field and controlled laboratory conditions. The most marked repellent effect (spatial noncontact assay) among the three test essential oils was exhibited by sweet basil, Ocimum basilicum L. (53.8% escaped mosquitoes in 30-min exposure period) under laboratory conditions while hairy basil, Ocimum americanum L. and clove, Syzygium aromaticum (L.) Merill et. L.M. Perry from laboratory tests and sweet basil from field tests were the least effective as repellents (0-14%). In contrast, the contact assays measuring combined irritancy (excitation) and repellency effects found the best contact irritant response to hairy basil and DEET in field tests, whereas all others in laboratory and field were relatively ineffective in stimulating mosquitoes to move out the test chambers (0-5.5%). All three essential oils demonstrated significant differences in behavioral responses between field and laboratory conditions, whereas there was no significant difference in contact and noncontact assays for DEET between the two test conditions (P > 0.05). © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Mitigating Upsets in SRAM-Based FPGAs from the Xilinx Virtex 2 Family
NASA Technical Reports Server (NTRS)
Swift, G. M.; Yui, C. C.; Carmichael, C.; Koga, R.; George, J. S.
2003-01-01
Static random access memory (SRAM) upset rates in field programmable gate arrays (FPGAs) from the Xilinx Virtex 2 family have been tested for radiation effects on configuration memory, block RAM and the power-on-reset (POR) and SelectMAP single event functional interrupts (SEFIs). Dynamic testing has shown the effectiveness and value of Triple Module Redundancy (TMR) and partial reconfiguration when used in conjunction. Continuing dynamic testing for more complex designs and other Virtex 2 capabilities (i.e., I/O standards, digital clock managers (DCM), etc.) is scheduled.
Cryogenic Insulation Standard Data and Methodologies Project
NASA Technical Reports Server (NTRS)
Summerfield, Burton; Thompson, Karen; Zeitlin, Nancy; Mullenix, Pamela; Fesmire, James; Swanger, Adam
2015-01-01
Extending some recent developments in the area of technical consensus standards for cryogenic thermal insulation systems, a preliminary Inter-Laboratory Study of foam insulation materials was performed by NASA Kennedy Space Center and LeTourneau University. The initial focus was ambient pressure cryogenic boil off testing using the Cryostat-400 flat-plate instrument. Completion of a test facility at LETU has enabled direct, comparative testing, using identical cryostat instruments and methods, and the production of standard thermal data sets for a number of materials under sub-ambient conditions. The two sets of measurements were analyzed and indicate there is reasonable agreement between the two laboratories. Based on cryogenic boiloff calorimetry, new equipment and methods for testing thermal insulation systems have been successfully developed. These boiloff instruments (or cryostats) include both flat plate and cylindrical models and are applicable to a wide range of different materials under a wide range of test conditions. Test measurements are generally made at large temperature difference (boundary temperatures of 293 K and 78 K are typical) and include the full vacuum pressure range. Results are generally reported in effective thermal conductivity (ke) and mean heat flux (q) through the insulation system. The new cryostat instruments provide an effective and reliable way to characterize the thermal performance of materials under subambient conditions. Proven in through thousands of tests of hundreds of material systems, they have supported a wide range of aerospace, industry, and research projects. Boiloff testing technology is not just for cryogenic testing but is a cost effective, field-representative methodology to test any material or system for applications at sub-ambient temperatures. This technology, when adequately coupled with a technical standards basis, can provide a cost-effective, field-representative methodology to test any material or system for applications at sub-ambient to cryogenic temperatures. A growing need for energy efficiency and cryogenic applications is creating a worldwide demand for improved thermal insulation systems for low temperatures. The need for thermal characterization of these systems and materials raises a corresponding need for insulation test standards and thermal data targeted for cryogenic-vacuum applications. Such standards have a strong correlation to energy, transportation, and environment and the advancement of new materials technologies in these areas. In conjunction with this project, two new standards on cryogenic insulation were recently published by ASTM International: C1774 and C740. Following the requirements of NPR 7120.10, Technical Standards for NASA Programs and Projects, the appropriate information in this report can be provided to the NASA Chief Engineer as input for NASA's annual report to NIST, as required by OMB Circular No. A-119, describing NASA's use of voluntary consensus standards and participation in the development of voluntary consensus standards and bodies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wen, Haifang; Li, Xiaojun; Edil, Tuncer
The purpose of this study was to evaluate the performance of cementitious high carbon fly ash (CHCFA) stabilized recycled asphalt pavement as a base course material in a real world setting. Three test road cells were built at MnROAD facility in Minnesota. These cells have the same asphalt surface layers, subbases, and subgrades, but three different base courses: conventional crushed aggregates, untreated recycled pavement materials (RPM), and CHCFA stabilized RPM materials. During and after the construction of the three cells, laboratory and field tests were carried out to characterize the material properties. The test results were used in the mechanistic-empiricalmore » pavement design guide (MEPDG) to predict the pavement performance. Based on the performance prediction, the life cycle analyses of cost, energy consumption, and greenhouse gasses were performed. The leaching impacts of these three types of base materials were compared. The laboratory and field tests showed that fly ash stabilized RPM had higher modulus than crushed aggregate and RPM did. Based on the MEPDG performance prediction, the service life of the Cell 79 containing fly ash stabilized RPM, is 23.5 years, which is about twice the service life (11 years) of the Cell 77 with RPM base, and about three times the service life (7.5 years) of the Cell 78 with crushed aggregate base. The life cycle analysis indicated that the usage of the fly ash stabilized RPM as the base of the flexible pavement can significantly reduce the life cycle cost, the energy consumption, the greenhouse gases emission. Concentrations of many trace elements, particularly those with relatively low water quality standards, diminish over time as water flows through the pavement profile. For many elements, concentrations below US water drinking water quality standards are attained at the bottom of the pavement profile within 2-4 pore volumes of flow.« less
Standardizing biomarker testing for Canadian patients with advanced lung cancer
Melosky, B.; Blais, N.; Cheema, P.; Couture, C.; Juergens, R.; Kamel-Reid, S.; Tsao, M.-S.; Wheatley-Price, P.; Xu, Z.; Ionescu, D.N.
2018-01-01
Background The development and approval of both targeted and immune therapies for patients with advanced non-small cell lung cancer (nsclc) has significantly improved patient survival rates and quality of life. Biomarker testing for patients newly diagnosed with nsclc, as well as for patients progressing after treatment with epidermal growth factor receptor (EGFR) inhibitors, is the standard of care in Canada and many parts of the world. Methods A group of thoracic oncology experts in the field of thoracic oncology met to describe the standard for biomarker testing for lung cancer in the Canadian context, focusing on evidence-based recommendations for standard-of-care testing for EGFR, anaplastic lymphoma kinase (ALK), ROS1, BRAF V600 and programmed death-ligand (PD-L1) at the time of diagnosis of advanced disease and EGFR T790M upon progression. As well, additional exploratory molecules and targets are likely to impact future patient care, including MET exon 14 skipping mutations and whole gene amplification, RET translocations, HER2 (ERBB2) mutations, NTRK, RAS (KRAS and NRAS), as well as TP53. Results The standard of care must include the incorporation of testing for novel biomarkers as they become available, as it will be difficult for national guidelines to keep pace with technological advances in this area. Conclusions Canadian patients with nsclc should be treated equally; the minimum standard of care is defined in this paper. PMID:29507487
Do we really need in-situ bioassays?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salazar, M.H.; Salazar, S.M.
1995-12-31
In-situ bioassays are needed to validate the results from laboratory testing and to understand biological interactions. Standard laboratory protocols provide reproducible test results, and the precision of those tests can be mathematically defined. Significant correlations between toxic substances and levels of response (bioaccumulation and bioeffects) have also been demonstrated with natural field populations and suggest that the laboratory results can accurately predict field responses. An equal number of studies have shown a lack of correlation between laboratory bioassay results and responses of natural field populations. The best way to validate laboratory results is with manipulative field testing; i.e., in-situ bioassaysmore » with caged organisms. Bioaccumulation in transplanted bivalves has probably been the most frequently used form of an in-situ bioassay. The authors have refined those methods to include synoptic measurements of bioaccumulation and growth. Growth provides an easily-measured bioeffects endpoint and a means of calibrating bioaccumulation. Emphasis has been on minimizing the size range of test animals, repetitive measurements of individuals and standardization of test protocols for a variety of applications. They are now attempting to standardize criteria for accepting and interpreting data in the same way that laboratory bioassays have been standardized. Others have developed methods for in-situ bioassays using eggs, larvae, unicellular organisms, crustaceans, benthic invertebrates, bivalves, and fish. In the final analysis, the in-situ approach could be considered as an exposure system where any clinical measurements are possible. The most powerful approach would be to use the same species in laboratory and field experiments with the same endpoints.« less
Dorronzoro, Enrique; Gómez, Isabel; Medina, Ana Verónica; Gómez, José Antonio
2015-01-29
Solutions in the field of Ambient Assisted Living (AAL) do not generally use standards to implement a communication interface between sensors and actuators. This makes these applications isolated solutions because it is so difficult to integrate them into new or existing systems. The objective of this research was to design and implement a prototype with a standardized interface for sensors and actuators to facilitate the integration of different solutions in the field of AAL. Our work is based on the roadmap defined by AALIANCE, using motes with TinyOS telosb, 6LoWPAN, sensors, and the IEEE 21451 standard protocol. This prototype allows one to upgrade sensors to a smart status for easy integration with new applications and already existing ones. The prototype has been evaluated for autonomy and performance. As a use case, the prototype has been tested in a serious game previously designed for people with mobility problems, and its advantages and disadvantages have been analysed.
Dorronzoro, Enrique; Gómez, Isabel; Medina, Ana Verónica; Gómez, José Antonio
2015-01-01
Solutions in the field of Ambient Assisted Living (AAL) do not generally use standards to implement a communication interface between sensors and actuators. This makes these applications isolated solutions because it is so difficult to integrate them into new or existing systems. The objective of this research was to design and implement a prototype with a standardized interface for sensors and actuators to facilitate the integration of different solutions in the field of AAL. Our work is based on the roadmap defined by AALIANCE, using motes with TinyOS telosb, 6LoWPAN, sensors, and the IEEE 21451 standard protocol. This prototype allows one to upgrade sensors to a smart status for easy integration with new applications and already existing ones. The prototype has been evaluated for autonomy and performance. As a use case, the prototype has been tested in a serious game previously designed for people with mobility problems, and its advantages and disadvantages have been analysed. PMID:25643057
Descent Advisor Preliminary Field Test
NASA Technical Reports Server (NTRS)
Green, Steven M.; Vivona, Robert A.; Sanford, Beverly
1995-01-01
A field test of the Descent Advisor (DA) automation tool was conducted at the Denver Air Route Traffic Control Center in September 1994. DA is being developed to assist Center controllers in the efficient management and control of arrival traffic. DA generates advisories, based on trajectory predictions, to achieve accurate meter-fix arrival times in a fuel efficient manner while assisting the controller with the prediction and resolution of potential conflicts. The test objectives were: (1) to evaluate the accuracy of DA trajectory predictions for conventional and flight-management system equipped jet transports, (2) to identify significant sources of trajectory prediction error, and (3) to investigate procedural and training issues (both air and ground) associated with DA operations. Various commercial aircraft (97 flights total) and a Boeing 737-100 research aircraft participated in the test. Preliminary results from the primary test set of 24 commercial flights indicate a mean DA arrival time prediction error of 2.4 seconds late with a standard deviation of 13.1 seconds. This paper describes the field test and presents preliminary results for the commercial flights.
Celio, Mark A; Vetter-O'Hagen, Courtney S; Lisman, Stephen A; Johansen, Gerard E; Spear, Linda P
2011-12-01
Field methodologies offer a unique opportunity to collect ecologically valid data on alcohol use and its associated problems within natural drinking environments. However, limitations in follow-up data collection methods have left unanswered questions regarding the psychometric properties of field-based measures. The aim of the current study is to evaluate the reliability of self-report data collected in a naturally occurring environment - as indexed by the Alcohol Use Disorders Identification Test (AUDIT) - compared to self-report data obtained through an innovative web-based follow-up procedure. Individuals recruited outside of bars (N=170; mean age=21; range 18-32) provided a BAC sample and completed a self-administered survey packet that included the AUDIT. BAC feedback was provided anonymously through a dedicated web page. Upon sign in, follow-up participants (n=89; 52%) were again asked to complete the AUDIT before receiving their BAC feedback. Reliability analyses demonstrated that AUDIT scores - both continuous and dichotomized at the standard cut-point - were stable across field- and web-based administrations. These results suggest that self-report data obtained from acutely intoxicated individuals in naturally occurring environments are reliable when compared to web-based data obtained after a brief follow-up interval. Furthermore, the results demonstrate the feasibility, utility, and potential of integrating field methods and web-based data collection procedures. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Gierczak, R F D; Devlin, J F; Rudolph, D L
2006-01-05
Elevated nitrate concentrations within a municipal water supply aquifer led to pilot testing of a field-scale, in situ denitrification technology based on carbon substrate injections. In advance of the pilot test, detailed characterization of the site was undertaken. The aquifer consisted of complex, discontinuous and interstratified silt, sand and gravel units, similar to other well studied aquifers of glaciofluvial origin, 15-40 m deep. Laboratory and field tests, including a conservative tracer test, a pumping test, a borehole flowmeter test, grain-size analysis of drill cuttings and core material, and permeameter testing performed on core samples, were performed on the most productive depth range (27-40 m), and the results were compared. The velocity profiles derived from the tracer tests served as the basis for comparison with other methods. The spatial variation in K, based on grain-size analysis, using the Hazen method, were poorly correlated with the breakthrough data. Trends in relative hydraulic conductivity (K/K(avg)) from permeameter testing compared somewhat better. However, the trends in transient drawdown with depth, measured in multilevel sampling points, corresponded particularly well with those of solute mass flux. Estimates of absolute K, based on standard pumping test analysis of the multilevel drawdown data, were inversely correlated with the tracer test data. The inverse nature of the correlation was attributed to assumptions in the transient drawdown packages that were inconsistent with the variable diffusivities encountered at the scale of the measurements. Collectively, the data showed that despite a relatively low variability in K within the aquifer under study (within a factor of 3), water and solute mass fluxes were concentrated in discrete intervals that could be targeted for later bioremediation.
ERIC Educational Resources Information Center
Cramer, Stephen E.
A standard-setting procedure was developed for the Georgia Teacher Certification Testing Program as tests in 30 teaching fields were revised. A list of important characteristics of a standard-setting procedure was derived, drawing on the work of R. A. Berk (1986). The best method was found to be a highly formalized judgmental, empirical Angoff…
Thongdee, Pimwan; Chaijaroenkul, Wanna; Kuesap, Jiraporn; Na-Bangchang, Kesara
2014-08-01
Microscopy is considered as the gold standard for malaria diagnosis although its wide application is limited by the requirement of highly experienced microscopists. PCR and serological tests provide efficient diagnostic performance and have been applied for malaria diagnosis and research. The aim of this study was to investigate the diagnostic performance of nested PCR and a recently developed an ELISA-based new rapid diagnosis test (RDT), NovaLisa test kit, for diagnosis of malaria infection, using microscopic method as the gold standard. The performance of nested-PCR as a malaria diagnostic tool is excellent with respect to its high accuracy, sensitivity, specificity, and ability to discriminate Plasmodium species. The sensitivity and specificity of nested-PCR compared with the microscopic method for detection of Plasmodium falciparum, Plasmodium vivax, and P. falciparum/P. vivax mixed infection were 71.4 vs 100%, 100 vs 98.7%, and 100 vs 95.0%, respectively. The sensitivity and specificity of the ELISA-based NovaLisa test kit compared with the microscopic method for detection of Plasmodium genus were 89.0 vs 91.6%, respectively. NovaLisa test kit provided comparable diagnostic performance. Its relatively low cost, simplicity, and rapidity enables large scale field application.
Artes, Paul H; Hutchison, Donna M; Nicolela, Marcelo T; LeBlanc, Raymond P; Chauhan, Balwantray C
2005-07-01
To compare test results from second-generation Frequency-Doubling Technology perimetry (FDT2, Humphrey Matrix; Carl-Zeiss Meditec, Dublin, CA) and standard automated perimetry (SAP) in patients with glaucoma. Specifically, to examine the relationship between visual field sensitivity and test-retest variability and to compare total and pattern deviation probability maps between both techniques. Fifteen patients with glaucoma who had early to moderately advanced visual field loss with SAP (mean MD, -4.0 dB; range, +0.2 to -16.1) were enrolled in the study. Patients attended three sessions. During each session, one eye was examined twice with FDT2 (24-2 threshold test) and twice with SAP (Swedish Interactive Threshold Algorithm [SITA] Standard 24-2 test), in random order. We compared threshold values between FDT2 and SAP at test locations with similar visual field coordinates. Test-retest variability, established in terms of test-retest intervals and standard deviations (SDs), was investigated as a function of visual field sensitivity (estimated by baseline threshold and mean threshold, respectively). The magnitude of visual field defects apparent in total and pattern deviation probability maps were compared between both techniques by ordinal scoring. The global visual field indices mean deviation (MD) and pattern standard deviation (PSD) of FDT2 and SAP correlated highly (r > 0.8; P < 0.001). At test locations with high sensitivity (>25 dB with SAP), threshold estimates from FDT2 and SAP exhibited a close, linear relationship, with a slope of approximately 2.0. However, at test locations with lower sensitivity, the relationship was much weaker and ceased to be linear. In comparison with FDT2, SAP showed a slightly larger proportion of test locations with absolute defects (3.0% vs. 2.2% with SAP and FDT2, respectively, P < 0.001). Whereas SAP showed a significant increase in test-retest variability at test locations with lower sensitivity (P < 0.001), there was no relationship between variability and sensitivity with FDT2 (P = 0.46). In comparison with SAP, FDT2 exhibited narrower test-retest intervals at test locations with lower sensitivity (SAP thresholds <25 dB). A comparison of the total and pattern deviation maps between both techniques showed that the total deviation analyses of FDT2 may slightly underestimate the visual field loss apparent with SAP. However, the pattern-deviation maps of both instruments agreed well with each other. The test-retest variability of FDT2 is uniform over the measurement range of the instrument. These properties may provide advantages for the monitoring of patients with glaucoma that should be investigated in longitudinal studies.
Electromagnetic Interference Tests
1994-05-31
for Safety Levels with Respect to Human Exposure to Radio Frequency Electromagnetic Fields (300 kHz - 100 GHz), American National Standards Institute...Respect to Human Exposure to Radio Frequency Electromagnetic Fields (300 kHz - 100 GHz), American National Standards Institute, C95.1-1982, 30 July 1980...II il~l I!I 11 lll i 13. ABSTkACT (Waxlrnun 200woruh) This TOP is a general guideline for electromagnetic interference testing of electronic
Saraceno, John F.; Shanley, James B.; Downing, Bryan D.; Pellerin, Brian A.
2017-01-01
In situ fluorescent dissolved organic matter (fDOM) measurements have gained increasing popularity as a proxy for dissolved organic carbon (DOC) concentrations in streams. One challenge to accurate fDOM measurements in many streams is light attenuation due to suspended particles. Downing et al. (2012) evaluated the need for corrections to compensate for particle interference on fDOM measurements using a single sediment standard in a laboratory study. The application of those results to a large river improved unfiltered field fDOM accuracy. We tested the same correction equation in a headwater tropical stream and found that it overcompensated fDOM when turbidity exceeded ∼300 formazin nephelometric units (FNU). Therefore, we developed a site-specific, field-based fDOM correction equation through paired in situ fDOM measurements of filtered and unfiltered streamwater. The site-specific correction increased fDOM accuracy up to a turbidity as high as 700 FNU, the maximum observed in this study. The difference in performance between the laboratory-based correction equation of Downing et al. (2012) and our site-specific, field-based correction equation likely arises from differences in particle size distribution between the sediment standard used in the lab (silt) and that observed in our study (fine to medium sand), particularly during high flows. Therefore, a particle interference correction equation based on a single sediment type may not be ideal when field sediment size is significantly different. Given that field fDOM corrections for particle interference under turbid conditions are a critical component in generating accurate DOC estimates, we describe a way to develop site-specific corrections.
Six-month Longitudinal Comparison of a Portable Tablet Perimeter With the Humphrey Field Analyzer.
Prea, Selwyn Marc; Kong, Yu Xiang George; Mehta, Aditi; He, Mingguang; Crowston, Jonathan G; Gupta, Vinay; Martin, Keith R; Vingrys, Algis J
2018-06-01
To establish the medium-term repeatability of the iPad perimetry app Melbourne Rapid Fields (MRF) compared to Humphrey Field Analyzer (HFA) 24-2 SITA-standard and SITA-fast programs. Multicenter longitudinal observational clinical study. Sixty patients (stable glaucoma/ocular hypertension/glaucoma suspects) were recruited into a 6-month longitudinal clinical study with visits planned at baseline and at 2, 4, and 6 months. At each visit patients undertook visual field assessment using the MRF perimetry application and either HFA SITA-fast (n = 21) or SITA-standard (n = 39). The primary outcome measure was the association and repeatability of mean deviation (MD) for the MRF and HFA tests. Secondary measures were the point-wise threshold and repeatability for each test, as well as test time. MRF was similar to SITA-fast in speed and significantly faster than SITA-standard (MRF 4.6 ± 0.1 minutes vs SITA-fast 4.3 ± 0.2 minutes vs SITA-standard 6.2 ± 0.1 minutes, P < .001). Intraclass correlation coefficients (ICC) between MRF and SITA-fast for MD at the 4 visits ranged from 0.71 to 0.88. ICC values between MRF and SITA-standard for MD ranged from 0.81 to 0.90. Repeatability of MRF MD outcomes was excellent, with ICC for baseline and the 6-month visit being 0.98 (95% confidence interval: 0.96-0.99). In comparison, ICC at 6-month retest for SITA-fast was 0.95 and SITA-standard 0.93. Fewer points changed with the MRF, although for those that did, the MRF gave greater point-wise variability than did the SITA tests. MRF correlated strongly with HFA across 4 visits over a 6-month period, and has good test-retest reliability. MRF is suitable for monitoring visual fields in settings where conventional perimetry is not readily accessible. Copyright © 2018 Elsevier Inc. All rights reserved.
Aquatic toxicity information retrieval data base (AQUIRE for non-vms) (1600 bpi). Data file
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The purpose of AQUIRE is to provide scientists and managers quick access to a comprehensive, systematic, computerized compilation of aquatic toxicity data. During 1992 and early 1993, nine data updates were made to the AQUIRE system. AQUIRE now contains 109,338 individual aquatic toxicity test results for 5,159 chemicals, 2,429 organisms, and over 160 endpoints reviewed from 7,517 publications. New features include a data selection option that permits searches that are restricted to data added or modified through any of the eight most recent updates, and a report generation (Full Record Detail) that displays the entire AQUIRE record for each testmore » identified in a search. Selection of the Full Record Detail feature allows the user to peruse all AQUIRE fields for a given test, including the information stored in the remarks section, while the standard AQUIRE output format presents selected data fields in a concise table. The standard report remains an available option for rapid viewing of system output.« less
Aquatic toxicity information retrieval data base (AQUIRE for non-vms) (6250 bpi). Data file
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The purpose of AQUIRE is to provide scientists and managers quick access to a comprehensive, systematic, computerized compilation of aquatic toxicity data. During 1992 and early 1993, nine data updates were made to the AQUIRE system. AQUIRE now contains 109,338 individual aquatic toxicity test results for 5,159 chemicals, 2,429 organisms, and over 160 endpoints reviewed from 7,517 publications. New features include a data selection option that permits searches that are restricted to data added or modified through any of the eight most recent updates, and a report generation (Full Record Detail) that displays the entire AQUIRE record for each testmore » identified in a search. Selection of the Full Record Detail feature allows the user to peruse all AQUIRE fields for a given test, including the information stored in the remarks section, while the standard AQUIRE output format presents selected data fields in a concise table. The standard report remains an available option for rapid viewing of system output.« less
Chapinal, Núria; Schumaker, Brant A; Joly, Damien O; Elkin, Brett T; Stephen, Craig
2015-07-01
We estimated the sensitivity and specificity of the caudal-fold skin test (CFT), the fluorescent polarization assay (FPA), and the rapid lateral-flow test (RT) for the detection of Mycobacterium bovis in free-ranging wild wood bison (Bison bison athabascae), in the absence of a gold standard, by using Bayesian analysis, and then used those estimates to forecast the performance of a pairwise combination of tests in parallel. In 1998-99, 212 wood bison from Wood Buffalo National Park (Canada) were tested for M. bovis infection using CFT and two serologic tests (FPA and RT). The sensitivity and specificity of each test were estimated using a three-test, one-population, Bayesian model allowing for conditional dependence between FPA and RT. The sensitivity and specificity of the combination of CFT and each serologic test in parallel were calculated assuming conditional independence. The test performance estimates were influenced by the prior values chosen. However, the rank of tests and combinations of tests based on those estimates remained constant. The CFT was the most sensitive test and the FPA was the least sensitive, whereas RT was the most specific test and CFT was the least specific. In conclusion, given the fact that gold standards for the detection of M. bovis are imperfect and difficult to obtain in the field, Bayesian analysis holds promise as a tool to rank tests and combinations of tests based on their performance. Combining a skin test with an animal-side serologic test, such as RT, increases sensitivity in the detection of M. bovis and is a good approach to enhance disease eradication or control in wild bison.
Ahmad, Farooq; Richardson, Michael K
2013-01-01
This study aimed to develop and characterize a novel (standard) open field test adapted for larval zebrafish. We also developed and characterized a variant of the same assay consisting of a colour-enriched open field; this was used to assess the impact of environmental complexity on patterns of exploratory behaviours as well to determine natural colour preference/avoidance. We report the following main findings: (1) zebrafish larvae display characteristic patterns of exploratory behaviours in the standard open field, such as thigmotaxis/centre avoidance; (2) environmental complexity (i.e. presence of colours) differentially affects patterns of exploratory behaviours and greatly attenuates natural zone preference; (3) larvae displayed the ability to discriminate colours. As reported previously in adult zebrafish, larvae showed avoidance towards blue and black; however, in contrast to the reported adult behaviour, larvae displayed avoidance towards red. Avoidance towards yellow and preference for green and orange are shown for the first time, (4) compared to standard open field tests, exposure to the colour-enriched open field resulted in an enhanced expression of anxiety-like behaviours. To conclude, we not only developed and adapted a traditional rodent behavioural assay that serves as a gold standard in preclinical drug screening, but we also provide a version of the same test that affords the possibility to investigate the impact of environmental stress on behaviour in larval zebrafish while representing the first test for assessment of natural colour preference/avoidance in larval zebrafish. In the future, these assays will improve preclinical drug screening methodologies towards the goal to uncover novel drugs. This article is part of a Special Issue entitled: insert SI title. Copyright © 2012 Elsevier B.V. All rights reserved.
Hearing in Noise Test Brazil: standardization for young adults with normal hearing.
Sbompato, Andressa Forlevise; Corteletti, Lilian Cassia Bornia Jacob; Moret, Adriane de Lima Mortari; Jacob, Regina Tangerino de Souza
2015-01-01
Individuals with the same ability of speech recognition in quiet can have extremely different results in noisy environments. To standardize speech perception in adults with normal hearing in the free field using the Brazilian Hearing in Noise Test. Contemporary, cross-sectional cohort study. 79 adults with normal hearing and without cognitive impairment participated in the study. Lists of Hearing in Noise Test sentences were randomly in quiet, noise front, noise right, and noise left. There were no significant differences between right and left ears at all frequencies tested (paired t-1 test). Nor were significant differences observed when comparing gender and interaction between these conditions. A difference was observed among the free field positions tested, except in the situations of noise right and noise left. Results of speech perception in adults with normal hearing in the free field during different listening situations in noise indicated poorer performance during the condition with noise and speech in front, i.e., 0°/0°. The values found in the standardization of the Hearing in Noise Test free field can be used as a reference in the development of protocols for tests of speech perception in noise, and for monitoring individuals with hearing impairment. Copyright © 2015 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.
Hall, Lawrence B.
1955-01-01
The new demands placed upon application equipment by the introduction of modern insecticides have revealed the deficiencies of this equipment when required for continuous use on a large scale. If adequate equipment is to be produced, specifications must be based not only on basic materials tests but also on “use” tests, in which the conditions of field use are simulated. The author outlines suggested techniques to be followed and standards to be adopted in testing the performance of compression sprayers and allied equipment, with reference to the following features: compression-sprayer tank fatigue; tank impact; pump resistance to bursting; pump resistance to collapse; pump friction; cut-off valve durability; constant-pressure valves; cut-off valve actuation; hose flexure; hose tension and bursting-pressure; hose friction; gaskets, valve faces, and similar non-metallic parts; nozzle-orifice erosion; and nozzle pattern. ImagesFIG. 1FIG. 14FIG. 20 PMID:14364189
NASA Astrophysics Data System (ADS)
Khabarova, K. Yu.; Kudeyarov, K. S.; Kolachevsky, N. N.
2017-06-01
Research and development in the field of optical clocks based on ultracold atoms and ions have enabled the relative uncertainty in frequency to be reduced down to a few parts in 1018. The use of novel, precise frequency comparison methods opens up new possibilities for basic research (sensitive tests of general relativity, a search for a drift of fundamental constants and a search for ‘dark matter’) as well as for state-of-the-art navigation and gravimetry. We discuss the key methods that are used in creating precision clocks (including transportable clocks) based on ultracold atoms and ions and the feasibility of using them in resolving current relativistic gravimetry issues.
ERIC Educational Resources Information Center
Lu, Qun; Liu, Enshan
2012-01-01
With the development and implementation of new curriculum standards, the field tests of education reform in senior high schools began in 2004 in four pilot provinces in mainland China. After five years of the reform, it is necessary to know how and to what extent the curriculum standard guides test classroom instruction. The present study was…
Collé, R.; Kotrappa, P.; Hutchinson, J. M. R.
1995-01-01
The recently developed 222Rn emanation standards that are based on polyethylene-encapsulated 226Ra solutions were employed for a first field-measurement application test to demonstrate their efficacy in calibrating passive integral radon monitors. The performance of the capsules was evaluated with respect to the calibration needs of electret ionization chambers (E-PERM®, Rad Elec Inc.). The encapsulated standards emanate well-characterized and known quantities of 222Rn, and were used in two different-sized, relatively-small, accumulation vessels (about 3.6 L and 10 L) which also contained the deployed electret monitors under test. Calculated integral 222Rn activities from the capsules over various accumulation times were compared to the averaged electret responses. Evaluations were made with four encapsulated standards ranging in 226Ra activity from approximately 15 Bq to 540 Bq (with 222Rn emanation fractions of 0.888); over accumulation times from 1 d to 33 d; and with four different types of E-PERM detectors that were independently calibrated. The ratio of the electret chamber response ERn to the integral 222Rn activity IRn was constant (within statistical variations) over the variables of the specific capsule used, the accumulation volume, accumulation time, and detector type. The results clearly demonstrated the practicality and suitability of the encapsulated standards for providing a simple and readily-available calibration for those measurement applications. However, the mean ratio ERn/IRn was approximately 0.91, suggesting a possible systematic bias in the extant E-PERM calibrations. This 9 % systematic difference was verified by an independent test of the E-PERM calibration based on measurements with the NIST radon-in-water standard generator. PMID:29151765
Collé, R; Kotrappa, P; Hutchinson, J M R
1995-01-01
The recently developed 222 Rn emanation standards that are based on polyethylene-encapsulated 226 Ra solutions were employed for a first field-measurement application test to demonstrate their efficacy in calibrating passive integral radon monitors. The performance of the capsules was evaluated with respect to the calibration needs of electret ionization chambers (E-PERM ® , Rad Elec Inc.). The encapsulated standards emanate well-characterized and known quantities of 222 Rn, and were used in two different-sized, relatively-small, accumulation vessels (about 3.6 L and 10 L) which also contained the deployed electret monitors under test. Calculated integral 222 Rn activities from the capsules over various accumulation times were compared to the averaged electret responses. Evaluations were made with four encapsulated standards ranging in 226 Ra activity from approximately 15 Bq to 540 Bq (with 222 Rn emanation fractions of 0.888); over accumulation times from 1 d to 33 d; and with four different types of E-PERM detectors that were independently calibrated. The ratio of the electret chamber response E Rn to the integral 222 Rn activity I Rn was constant (within statistical variations) over the variables of the specific capsule used, the accumulation volume, accumulation time, and detector type. The results clearly demonstrated the practicality and suitability of the encapsulated standards for providing a simple and readily-available calibration for those measurement applications. However, the mean ratio E Rn / I Rn was approximately 0.91, suggesting a possible systematic bias in the extant E-PERM calibrations. This 9 % systematic difference was verified by an independent test of the E-PERM calibration based on measurements with the NIST radon-in-water standard generator.
Broadband standard dipole antenna for antenna calibration
NASA Astrophysics Data System (ADS)
Koike, Kunimasa; Sugiura, Akira; Morikawa, Takao
1995-06-01
Antenna calibration of EMI antennas is mostly performed by the standard antenna method at an open-field test site using a specially designed dipole antenna as a reference. In order to develop broadband standard antennas, the antenna factors of shortened dipples are theoretically investigated. First, the effects of the dipole length are analyzed using the induced emf method. Then, baluns and loads are examined to determine their influence on the antenna factors. It is found that transformer-type baluns are very effective for improving the height dependence of the antenna factors. Resistive loads are also useful for flattening the frequency dependence. Based on these studies, a specification is developed for a broadband standard antenna operating in the 30 to 150 MHz frequency range.
ANZSoilML: An Australian - New Zealand standard for exchange of soil data
NASA Astrophysics Data System (ADS)
Simons, Bruce; Wilson, Peter; Ritchie, Alistair; Cox, Simon
2013-04-01
The Australian-New Zealand soil information exchange standard (ANZSoilML) is a GML-based standard designed to allow the discovery, query and delivery of soil and landscape data via standard Open Geospatial Consortium (OGC) Web Feature Services. ANZSoilML modifies the Australian soil exchange standard (OzSoilML), which is based on the Australian Soil Information Transfer and Evaluation System (SITES) database design and exchange protocols, to meet the New Zealand National Soils Database requirements. The most significant change was the removal of the lists of CodeList terms in OzSoilML, which were based on the field methods specified in the 'Australian Soil and Land Survey Field Handbook'. These were replaced with empty CodeLists as placeholders to external vocabularies to allow the use of New Zealand vocabularies without violating the data model. Testing of the use of these separately governed Australian and New Zealand vocabularies has commenced. ANZSoilML attempts to accommodate the proposed International Organization for Standardization ISO/DIS 28258 standard for soil quality. For the most part, ANZSoilML is consistent with the ISO model, although major differences arise as a result of: • The need to specify the properties appropriate for each feature type; • The inclusion of soil-related 'Landscape' features; • Allowing the mapping of soil surfaces, bodies, layers and horizons, independent of the soil profile; • Allowing specifying the relationships between the various soil features; • Specifying soil horizons as specialisations of soil layers; • Removing duplication of features provided by the ISO Observation & Measurements standard. The International Union of Soil Sciences (IUSS) Working Group on Soil Information Standards (WG-SIS) aims to develop, promote and maintain a standard to facilitate the exchange of soils data and information. Developing an international exchange standard that is compatible with existing and emerging national and regional standards is a considerable challenge. ANZSoilML is proposed as a profile of the more generalised SoilML model being progressed through the IUSS Working Group.
DOT National Transportation Integrated Search
2008-04-25
This report presents the results of the ITS Standards Testing Program for the field testing, assessment, and evaluation of the NTCIP standards that apply in the domain of Dynamic Message Signs (DMS). Specifically, the National Transportation Communic...
Pechacek, Judith; Shanedling, Janet; Lutfiyya, May Nawal; Brandt, Barbara F; Cerra, Frank B; Delaney, Connie White
2015-01-01
Understanding the impact that interprofessional education and collaborative practice (IPECP) might have on triple aim patient outcomes is of high interest to health care providers, educators, administrators, and policy makers. Before the work undertaken by the National Center for Interprofessional Practice and Education at the University of Minnesota, no standard mechanism to acquire and report outcome data related to interprofessional education and collaborative practice and its effect on triple aim outcomes existed. This article describes the development and adoption of the National Center Data Repository (NCDR) designed to capture data related to IPECP processes and outcomes to support analyses of the relationship of IPECP on the Triple Aim. The data collection methods, web-based survey design and implementation process are discussed. The implications of this informatics work to the field of IPECP and health care quality and safety include creating standardized capacity to describe interprofessional practice and measure outcomes connecting interprofessional education and collaborative practice to the triple aim within and across sites/settings, leveraging an accessible data collection process using user friendly web-based survey design to support large data scholarship and instrument testing, and establishing standardized data elements and variables that can potentially lead to enhancements to national/international information system and academic accreditation standards to further team-based, interprofessional, collaborative research in the field.
Pechacek, Judith; Shanedling, Janet; Lutfiyya, May Nawal; Brandt, Barbara F.; Cerra, Frank B.; Delaney, Connie White
2015-01-01
Abstract Understanding the impact that interprofessional education and collaborative practice (IPECP) might have on triple aim patient outcomes is of high interest to health care providers, educators, administrators, and policy makers. Before the work undertaken by the National Center for Interprofessional Practice and Education at the University of Minnesota, no standard mechanism to acquire and report outcome data related to interprofessional education and collaborative practice and its effect on triple aim outcomes existed. This article describes the development and adoption of the National Center Data Repository (NCDR) designed to capture data related to IPECP processes and outcomes to support analyses of the relationship of IPECP on the Triple Aim. The data collection methods, web-based survey design and implementation process are discussed. The implications of this informatics work to the field of IPECP and health care quality and safety include creating standardized capacity to describe interprofessional practice and measure outcomes connecting interprofessional education and collaborative practice to the triple aim within and across sites/settings, leveraging an accessible data collection process using user friendly web-based survey design to support large data scholarship and instrument testing, and establishing standardized data elements and variables that can potentially lead to enhancements to national/international information system and academic accreditation standards to further team-based, interprofessional, collaborative research in the field. PMID:26652631
Mobile phone-based clinical guidance for rural health providers in India.
Gautham, Meenakshi; Iyengar, M Sriram; Johnson, Craig W
2015-12-01
There are few tried and tested mobile technology applications to enhance and standardize the quality of health care by frontline rural health providers in low-resource settings. We developed a media-rich, mobile phone-based clinical guidance system for management of fevers, diarrhoeas and respiratory problems by rural health providers. Using a randomized control design, we field tested this application with 16 rural health providers and 128 patients at two rural/tribal sites in Tamil Nadu, Southern India. Protocol compliance for both groups, phone usability, acceptability and patient feedback for the experimental group were evaluated. Linear mixed-model analyses showed statistically significant improvements in protocol compliance in the experimental group. Usability and acceptability among patients and rural health providers were very high. Our results indicate that mobile phone-based, media-rich procedural guidance applications have significant potential for achieving consistently standardized quality of care by diverse frontline rural health providers, with patient acceptance. © The Author(s) 2014.
2012-06-01
executed a concerted effort to employ reliability standards and testing from the design phase through fielding. Reliability programs remain standard...performed flight test engineer duties on several developmental flight test programs and served as Chief Engineer for a flight test squadron. Major...Quant is an acquisition professional with over 250 flight test hours in various aircraft, including the F-16, Airborne Laser, and HH-60. She holds a
Hoffmann, Sebastian; Hartung, Thomas; Stephens, Martin
Evidence-based toxicology (EBT) was introduced independently by two groups in 2005, in the context of toxicological risk assessment and causation as well as based on parallels between the evaluation of test methods in toxicology and evidence-based assessment of diagnostics tests in medicine. The role model of evidence-based medicine (EBM) motivated both proposals and guided the evolution of EBT, whereas especially systematic reviews and evidence quality assessment attract considerable attention in toxicology.Regarding test assessment, in the search of solutions for various problems related to validation, such as the imperfectness of the reference standard or the challenge to comprehensively evaluate tests, the field of Diagnostic Test Assessment (DTA) was identified as a potential resource. DTA being an EBM discipline, test method assessment/validation therefore became one of the main drivers spurring the development of EBT.In the context of pathway-based toxicology, EBT approaches, given their objectivity, transparency and consistency, have been proposed to be used for carrying out a (retrospective) mechanistic validation.In summary, implementation of more evidence-based approaches may provide the tools necessary to adapt the assessment/validation of toxicological test methods and testing strategies to face the challenges of toxicology in the twenty first century.
Field evaluation of a malaria rapid diagnostic test (ICT Pf).
Moonasar, Devanand; Goga, Ameena E; Kruger, Philip S; La Cock, Christine; Maharaj, Rajendra; Frean, John; Chandramohan, Daniel
2009-11-01
Malaria rapid diagnostic tests (MRDTs) are quick and easy to perform and useful for diagnosing malaria in primary health care settings. In South Africa most malaria infections are due to Plasmodium falciparurrm, and HRPII-based MRDTs have been used since 2001. Previous studies in Africa showed variability in sensitivity and specificity of HRPII-based MRDTs; hence, we conducted a field evaluation in Limpopo province to determine the accuracy of the MRDT currently used in public sector clinics and hospitals. A cross-sectional observational study was conducted to determine the sensitivity and specificity of an ICT Pf MRDT. We tested 405 patients with fever with ICT Pf MRDT and compared the results with blood film microscopy (the gold standard). RESULTS. The overall sensitivity of the ICT Pf MRDT was 99.48% (95% confidence interval (CI) 96.17-100%), while specificity was 96.26% (95% CI 94.7-100%). The positive predictive value of the test was 98.48 (99% CI 98.41-100%), and the negative predictive value was 99.52% (95% CI 96.47-100%). The ICT Pf MRDT is an appropriate test to use in the field in South Africa where laboratory facilities are not available. It has a high degree of sensitivity and acceptable level of specificity in accordance with the World Health Organization criteria. However, sensitivity of MRDT at low levels of parasitaemia (<100 parasites/microl of blood) in field conditions must still be established.
Ashraf, Sania; Kao, Angie; Hugo, Cecilia; Christophel, Eva M; Fatunmbi, Bayo; Luchavez, Jennifer; Lilley, Ken; Bell, David
2012-10-24
Malaria diagnosis has received renewed interest in recent years, associated with the increasing accessibility of accurate diagnosis through the introduction of rapid diagnostic tests and new World Health Organization guidelines recommending parasite-based diagnosis prior to anti-malarial therapy. However, light microscopy, established over 100 years ago and frequently considered the reference standard for clinical diagnosis, has been neglected in control programmes and in the malaria literature and evidence suggests field standards are commonly poor. Microscopy remains the most accessible method for parasite quantitation, for drug efficacy monitoring, and as a reference of assessing other diagnostic tools. This mismatch between quality and need highlights the importance of the establishment of reliable standards and procedures for assessing and assuring quality. This paper describes the development, function and impact of a multi-country microscopy external quality assurance network set up for this purpose in Asia. Surveys were used for key informants and past participants for feedback on the quality assurance programme. Competency scores for each country from 14 participating countries were compiled for analyses using paired sample t-tests. In-depth interviews were conducted with key informants including the programme facilitators and national level microscopists. External assessments and limited retraining through a formalized programme based on a reference slide bank has demonstrated an increase in standards of competence of senior microscopists over a relatively short period of time, at a potentially sustainable cost. The network involved in the programme now exceeds 14 countries in the Asia-Pacific, and the methods are extended to other regions. While the impact on national programmes varies, it has translated in some instances into a strengthening of national microscopy standards and offers a possibility both for supporting revival of national microcopy programmes, and for the development of globally recognized standards of competency needed both for patient management and field research.
2012-01-01
Background Malaria diagnosis has received renewed interest in recent years, associated with the increasing accessibility of accurate diagnosis through the introduction of rapid diagnostic tests and new World Health Organization guidelines recommending parasite-based diagnosis prior to anti-malarial therapy. However, light microscopy, established over 100 years ago and frequently considered the reference standard for clinical diagnosis, has been neglected in control programmes and in the malaria literature and evidence suggests field standards are commonly poor. Microscopy remains the most accessible method for parasite quantitation, for drug efficacy monitoring, and as a reference of assessing other diagnostic tools. This mismatch between quality and need highlights the importance of the establishment of reliable standards and procedures for assessing and assuring quality. This paper describes the development, function and impact of a multi-country microscopy external quality assurance network set up for this purpose in Asia. Methods Surveys were used for key informants and past participants for feedback on the quality assurance programme. Competency scores for each country from 14 participating countries were compiled for analyses using paired sample t-tests. In-depth interviews were conducted with key informants including the programme facilitators and national level microscopists. Results External assessments and limited retraining through a formalized programme based on a reference slide bank has demonstrated an increase in standards of competence of senior microscopists over a relatively short period of time, at a potentially sustainable cost. The network involved in the programme now exceeds 14 countries in the Asia-Pacific, and the methods are extended to other regions. Conclusions While the impact on national programmes varies, it has translated in some instances into a strengthening of national microscopy standards and offers a possibility both for supporting revival of national microcopy programmes, and for the development of globally recognized standards of competency needed both for patient management and field research. PMID:23095668
Flegar-Meštrić, Zlata; Perkov, Sonja; Radeljak, Andrea
2016-03-26
Considering the fact that the results of laboratory tests provide useful information about the state of health of patients, determination of reference value is considered an intrinsic part in the development of laboratory medicine. There are still huge differences in the analytical methods used as well as in the associated reference intervals which could consequently significantly affect the proper assessment of patient health. In a constant effort to increase the quality of patients' care, there are numerous international initiatives for standardization and/or harmonization of laboratory diagnostics in order to achieve maximum comparability of laboratory test results and improve patient safety. Through the standardization and harmonization processes of analytical methods the ability to create unique reference intervals is achieved. Such reference intervals could be applied globally in all laboratories using methods traceable to the same reference measuring system and analysing the biological samples from the populations with similar socio-demographic and ethnic characteristics. In this review we outlined the results of the harmonization processes in Croatia in the field of population based reference intervals for clinically relevant blood and serum constituents which are in accordance with ongoing activity for worldwide standardization and harmonization based on traceability in laboratory medicine.
Flegar-Meštrić, Zlata; Perkov, Sonja; Radeljak, Andrea
2016-01-01
Considering the fact that the results of laboratory tests provide useful information about the state of health of patients, determination of reference value is considered an intrinsic part in the development of laboratory medicine. There are still huge differences in the analytical methods used as well as in the associated reference intervals which could consequently significantly affect the proper assessment of patient health. In a constant effort to increase the quality of patients’ care, there are numerous international initiatives for standardization and/or harmonization of laboratory diagnostics in order to achieve maximum comparability of laboratory test results and improve patient safety. Through the standardization and harmonization processes of analytical methods the ability to create unique reference intervals is achieved. Such reference intervals could be applied globally in all laboratories using methods traceable to the same reference measuring system and analysing the biological samples from the populations with similar socio-demographic and ethnic characteristics. In this review we outlined the results of the harmonization processes in Croatia in the field of population based reference intervals for clinically relevant blood and serum constituents which are in accordance with ongoing activity for worldwide standardization and harmonization based on traceability in laboratory medicine. PMID:27019800
Saver, Jeffrey L.; Warach, Steven; Janis, Scott; Odenkirchen, Joanne; Becker, Kyra; Benavente, Oscar; Broderick, Joseph; Dromerick, Alexander W.; Duncan, Pamela; Elkind, Mitchell S. V.; Johnston, Karen; Kidwell, Chelsea S.; Meschia, James F.; Schwamm, Lee
2012-01-01
Background and Purpose The National Institute of Neurological Disorders and Stroke initiated development of stroke-specific Common Data Elements (CDEs) as part of a project to develop data standards for funded clinical research in all fields of neuroscience. Standardizing data elements in translational, clinical and population research in cerebrovascular disease could decrease study start-up time, facilitate data sharing, and promote well-informed clinical practice guidelines. Methods A Working Group of diverse experts in cerebrovascular clinical trials, epidemiology, and biostatistics met regularly to develop a set of Stroke CDEs, selecting among, refining, and adding to existing, field-tested data elements from national registries and funded trials and studies. Candidate elements were revised based on comments from leading national and international neurovascular research organizations and the public. Results The first iteration of the NINDS stroke-specific CDEs comprises 980 data elements spanning nine content areas: 1) Biospecimens and Biomarkers; 2) Hospital Course and Acute Therapies; 3) Imaging; 4) Laboratory Tests and Vital Signs; 5) Long Term Therapies; 6) Medical History and Prior Health Status; 7) Outcomes and Endpoints; 8) Stroke Presentation; 9) Stroke Types and Subtypes. A CDE website provides uniform names and structures for each element, a data dictionary, and template case report forms (CRFs) using the CDEs. Conclusion Stroke-specific CDEs are now available as standardized, scientifically-vetted variable structures to facilitate data collection and data sharing in cerebrovascular patient-oriented research. The CDEs are an evolving resource that will be iteratively improved based on investigator use, new technologies, and emerging concepts and research findings. PMID:22308239
Is vision function related to physical functional ability in older adults?
West, Catherine G; Gildengorin, Ginny; Haegerstrom-Portnoy, Gunilla; Schneck, Marilyn E; Lott, Lori; Brabyn, John A
2002-01-01
To assess the relationship between a broad range of vision functions and measures of physical performance in older adults. Cross-sectional study. Population-based cohort of community-dwelling older adults, subset of an on-going longitudinal study. Seven hundred eighty-two adults aged 55 and older (65% of living eligible subjects) had subjective health measures and objective physical performance evaluated in 1989/91 and again in 1993/95 and a battery of vision functions tested in 1993/95. Comprehensive battery of vision tests (visual acuity, contrast sensitivity, effects of illumination level, contrast and glare on acuity, visual fields with and without attentional load, color vision, temporal sensitivity, and the impact of dimming light on walking ability) and physical function measures (self-reported mobility limitations and observed measures of walking, rising from a chair and tandem balance). The failure rate for all vision functions and physical performance measures increased exponentially with age. Standard high-contrast visual acuity and standard visual fields showed the lowest failure rates. Nonstandard vision tests showed much higher failure rates. Poor performance on many individual vision functions was significantly associated with particular individual measures of physical performance. Using constructed combination vision variables, significant associations were found between spatial vision, field integrity, binocularity and/or adaptation, and each of the functional outcomes. Vision functions other than standard visual acuity may affect day-to-day functioning of older adults. Additional studies of these other aspects of vision and how they can be treated or rehabilitated are needed to determine whether these aspects play a role in strategies for reducing disability in older adults.
Mechanical testing of hydrogels in cartilage tissue engineering: beyond the compressive modulus.
Xiao, Yinghua; Friis, Elizabeth A; Gehrke, Stevin H; Detamore, Michael S
2013-10-01
Injuries to articular cartilage result in significant pain to patients and high medical costs. Unfortunately, cartilage repair strategies have been notoriously unreliable and/or complex. Biomaterial-based tissue-engineering strategies offer great promise, including the use of hydrogels to regenerate articular cartilage. Mechanical integrity is arguably the most important functional outcome of engineered cartilage, although mechanical testing of hydrogel-based constructs to date has focused primarily on deformation rather than failure properties. In addition to deformation testing, as the field of cartilage tissue engineering matures, this community will benefit from the addition of mechanical failure testing to outcome analyses, given the crucial clinical importance of the success of engineered constructs. However, there is a tremendous disparity in the methods used to evaluate mechanical failure of hydrogels and articular cartilage. In an effort to bridge the gap in mechanical testing methods of articular cartilage and hydrogels in cartilage regeneration, this review classifies the different toughness measurements for each. The urgency for identifying the common ground between these two disparate fields is high, as mechanical failure is ready to stand alongside stiffness as a functional design requirement. In comparing toughness measurement methods between hydrogels and cartilage, we recommend that the best option for evaluating mechanical failure of hydrogel-based constructs for cartilage tissue engineering may be tensile testing based on the single edge notch test, in part because specimen preparation is more straightforward and a related American Society for Testing and Materials (ASTM) standard can be adopted in a fracture mechanics context.
Type testing of the Siemens Plessey electronic personal dosemeter.
Hirning, C R; Yuen, P S
1995-07-01
This paper presents the results of a laboratory assessment of the performance of a new type of personal dosimeter, the Electronic Personal Dosemeter made by Siemens Plessey Controls Limited. Twenty pre-production dosimeters and a reader were purchased by Ontario Hydro for the assessment. Tests were performed on radiological performance, including reproducibility, accuracy, linearity, detection threshold, energy response, angular response, neutron response, and response time. There were also tests on the effects of a variety of environmental factors, such as temperature, humidity, pulsed magnetic and electric fields, low- and high-frequency electromagnetic fields, light exposure, drop impact, vibration, and splashing. Other characteristics that were tested were alarm volume, clip force, and battery life. The test results were compared with the relevant requirements of three standards: an Ontario Hydro standard for personal alarming dosimeters, an International Electrotechnical Commission draft standard for direct reading personal dose monitors, and an International Electrotechnical Commission standard for thermoluminescence dosimetry systems for personal monitoring. In general, the performance of the Electronic Personal Dosemeter was found to be quite acceptable: it met most of the relevant requirements of the three standards. However, the following deficiencies were found: slow response time; sensitivity to high-frequency electromagnetic fields; poor resistance to dropping; and an alarm that was not loud enough. In addition, the response of the electronic personal dosimeter to low-energy beta rays may be too low for some applications. Problems were experienced with the reliability of operation of the pre-production dosimeters used in these tests.
Buchholz, Bernhard; Kallweit, Sören; Ebert, Volker
2016-12-30
Instrument operation in harsh environments often significantly impacts the trust level of measurement data. While commercial instrument manufacturers clearly define the deployment conditions to achieve trustworthy data in typical standard applications, it is frequently unavoidable in scientific field applications to operate instruments outside these commercial standard application specifications. Scientific instrumentation, however, is employing cutting-edge technology and often highly optimized but also lacks long-term field tests to assess the field vs. laboratory performance. Recently, we developed the Selective Extractive Laser Diode Hygrometer (SEALDH-II), which addresses field and especially airborne applications as well as metrological laboratory validations. SEALDH-II targets reducing deviations between airborne hygrometers (currently up to 20% between the most advanced hygrometers) with a new holistic, internal control and validation concept, which guarantees the transfer of the laboratory performance into a field scenario by capturing more than 80 instrument internal "housekeeping" data to nearly perfectly control SEALDH-II's health status. SEALDH-II uses a calibration-free, first principles based, direct Tuneable Diode Laser Absorption Spectroscopy (dTDLAS) approach, to cover the entire atmospheric humidity measurement range from about 3 to 40,000 ppmv with a calculated maximum uncertainty of 4.3% ± 3 ppmv. This is achieved not only by innovations in internal instrument monitoring and design, but also by active control algorithms such as a high resolution spectral stabilization. This paper describes the setup, working principles, and instrument stabilization, as well as its precision validation and long-term stress tests in an environmental chamber over an environmental temperature and humidity range of ΔT = 50 K and ΔRH = 80% RH, respectively.
Buchholz, Bernhard; Kallweit, Sören; Ebert, Volker
2016-01-01
Instrument operation in harsh environments often significantly impacts the trust level of measurement data. While commercial instrument manufacturers clearly define the deployment conditions to achieve trustworthy data in typical standard applications, it is frequently unavoidable in scientific field applications to operate instruments outside these commercial standard application specifications. Scientific instrumentation, however, is employing cutting-edge technology and often highly optimized but also lacks long-term field tests to assess the field vs. laboratory performance. Recently, we developed the Selective Extractive Laser Diode Hygrometer (SEALDH-II), which addresses field and especially airborne applications as well as metrological laboratory validations. SEALDH-II targets reducing deviations between airborne hygrometers (currently up to 20% between the most advanced hygrometers) with a new holistic, internal control and validation concept, which guarantees the transfer of the laboratory performance into a field scenario by capturing more than 80 instrument internal “housekeeping” data to nearly perfectly control SEALDH-II’s health status. SEALDH-II uses a calibration-free, first principles based, direct Tuneable Diode Laser Absorption Spectroscopy (dTDLAS) approach, to cover the entire atmospheric humidity measurement range from about 3 to 40,000 ppmv with a calculated maximum uncertainty of 4.3% ± 3 ppmv. This is achieved not only by innovations in internal instrument monitoring and design, but also by active control algorithms such as a high resolution spectral stabilization. This paper describes the setup, working principles, and instrument stabilization, as well as its precision validation and long-term stress tests in an environmental chamber over an environmental temperature and humidity range of ΔT = 50 K and ΔRH = 80% RH, respectively. PMID:28042844
NASA Astrophysics Data System (ADS)
Sallaberry, Fabienne; Fernández-García, Aránzazu; Lüpfert, Eckhard; Morales, Angel; Vicente, Gema San; Sutter, Florian
2017-06-01
Precise knowledge of the optical properties of the components used in the solar field of concentrating solar thermal power plants is primordial to ensure their optimum power production. Those properties are measured and evaluated by different techniques and equipment, in laboratory conditions and/or in the field. Standards for such measurements and international consensus for the appropriate techniques are in preparation. The reference materials used as a standard for the calibration of the equipment are under discussion. This paper summarizes current testing methodologies and guidelines for the characterization of optical properties of solar mirrors and absorbers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poerschke, Andrew; Stecher, Dave
2014-06-01
Field testing was performed in a new construction unoccupied test house in Pittsburgh, PA. Four air-based heating, ventilation, and air conditioning distribution systems—a typical airflow ducted system to the bedrooms, a low airflow ducted system to the bedrooms, a system with transfer fans to the bedrooms, and a system with no ductwork to the bedrooms—were evaluated during heating, cooling, and midseason conditions. The relative ability of each system was assessed with respect to relevant Air Conditioning Contractors of America and ASHRAE standards for house temperature uniformity and stability, respectively.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poerschke, A.; Stecher, D.
2014-06-01
Field testing was performed in a new construction unoccupied test house in Pittsburgh, Pennsylvania. Four air-based heating, ventilation, and air conditioning distribution systems--a typical airflow ducted system to the bedrooms, a low airflow ducted system to the bedrooms, a system with transfer fans to the bedrooms, and a system with no ductwork to the bedrooms--were evaluated during heating, cooling, and midseason conditions. The relative ability of each system was assessed with respect to relevant Air Conditioning Contractors of America and ASHRAE standards for house temperature uniformity and stability, respectively.
The Making of a Self-Neglect Severity Scale
NASA Technical Reports Server (NTRS)
Smith, Scott M.; Dyer, C. B.; Pavlik, V. N.; Kelly, P. A.; Lee, J.; Doody, R. S.; Regev, C.; Pickens, C.; Burnett, J.
2006-01-01
Research in elder self-neglect has lagged behind that of other forms of mistreatment, despite the fact that self-neglect is the most common allegation reported to Adult Protective Service agencies throughout the US. The lack of a gold-standard to measure self-neglect has hampered efforts to study this phenomenon. Researchers designed the Self-neglect Severity Scale (SSS) based on interviews with Adult Protective Service workers and a national expert panel. The SSS is based on observation and interview and is administered in the home to include an environmental assessment. It was piloted, extensively field tested and then revised. The CREST SSS was developed using survey data and consultation with experts in the field. This instrument utilizes observer ratings, interview responses, and assesses subjects physical and environmental domains. It also assesses functional status as it relates to health and safety issues. After field and pilot testing the SSS was finalized and is currently undergoing reliability and validity testing. The CREST SSS was developed as a state scale to provide a common language for describing cases of self-neglect. It is the first self-neglect severity scale available to researchers. If found to be both reliable and valid it can be used in future intervention studies.
The making of a self-neglect severity scale.
Dyer, Carmel Bitondo; Kelly, P Adam; Pavlik, Valory N; Lee, Jessica; Doody, Rachelle S; Regev, Tziona; Pickens, Sabrina; Burnett, Jason; Smith, Scott M
2006-01-01
Research in elder self-neglect has lagged behind that of other forms of mistreatment, despite the fact that self-neglect is the most common allegation reported to Adult Protective Service agencies throughout the US. The lack of a gold standard to measure self-neglect has hampered efforts to study this phenomenon. Researchers designed the Self-Neglect Severity Scale (SSS) based on interviews with Adult Protective Service workers and a national expert panel. The SSS is based on observation and interview and is administered in the home to include an environmental assessment. It was piloted, extensively field tested and then revised. The CREST SSS was developed using survey data and consultation with experts in the field. This instrument utilizes observer ratings, interview responses, and assesses subjects' physical and environmental domains. It also assesses functional status as it relates to health and safety issues. After field and pilot testing, the SSS was finalized and is currently undergoing reliability and validity testing. The CREST SSS was developed as a state scale to provide a common language for describing cases of self-neglect. It is the first self-neglect severity scale available to researchers. If found to be both reliable and valid, it may be used in future intervention studies.
A Five-Year Study of the First Edition of the Core-Plus Mathematics Curriculum
ERIC Educational Resources Information Center
Schoen, Harold, Ed.; Ziebarth, Steven W., Ed.; Hirsch, Christian R., Ed.; BrckaLorenz, Allison, Ed.
2010-01-01
The study reported in this volume adds to the growing body of evaluation studies that focus on the use of NSF-funded Standards-based high school mathematics curricula. Most previous evaluations have studied the impact of field-test versions of a curriculum. Since these innovative curricula were so new at the time of many of these studies, students…
NASA Astrophysics Data System (ADS)
Pacheco-Sanchez, Anibal; Claus, Martin; Mothes, Sven; Schröter, Michael
2016-11-01
Three different methods for the extraction of the contact resistance based on both the well-known transfer length method (TLM) and two variants of the Y-function method have been applied to simulation and experimental data of short- and long-channel CNTFETs. While for TLM special CNT test structures are mandatory, standard electrical device characteristics are sufficient for the Y-function methods. The methods have been applied to CNTFETs with low and high channel resistance. It turned out that the standard Y-function method fails to deliver the correct contact resistance in case of a relatively high channel resistance compared to the contact resistances. A physics-based validation is also given for the application of these methods based on applying traditional Si MOSFET theory to quasi-ballistic CNTFETs.
Metropolitan Quantum Key Distribution with Silicon Photonics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bunandar, Darius; Lentine, Anthony; Lee, Catherine
Photonic integrated circuits provide a compact and stable platform for quantum photonics. Here we demonstrate a silicon photonics quantum key distribution (QKD) encoder in the first high-speed polarization-based QKD field tests. The systems reach composable secret key rates of 1.039 Mbps in a local test (on a 103.6-m fiber with a total emulated loss of 9.2 dB) and 157 kbps in an intercity metropolitan test (on a 43-km fiber with 16.4 dB loss). Our results represent the highest secret key generation rate for polarization-based QKD experiments at a standard telecom wavelength and demonstrate photonic integrated circuits as a promising, scalablemore » resource for future formation of metropolitan quantum-secure communications networks.« less
Metropolitan Quantum Key Distribution with Silicon Photonics
Bunandar, Darius; Lentine, Anthony; Lee, Catherine; ...
2018-04-06
Photonic integrated circuits provide a compact and stable platform for quantum photonics. Here we demonstrate a silicon photonics quantum key distribution (QKD) encoder in the first high-speed polarization-based QKD field tests. The systems reach composable secret key rates of 1.039 Mbps in a local test (on a 103.6-m fiber with a total emulated loss of 9.2 dB) and 157 kbps in an intercity metropolitan test (on a 43-km fiber with 16.4 dB loss). Our results represent the highest secret key generation rate for polarization-based QKD experiments at a standard telecom wavelength and demonstrate photonic integrated circuits as a promising, scalablemore » resource for future formation of metropolitan quantum-secure communications networks.« less
Metropolitan Quantum Key Distribution with Silicon Photonics
NASA Astrophysics Data System (ADS)
Bunandar, Darius; Lentine, Anthony; Lee, Catherine; Cai, Hong; Long, Christopher M.; Boynton, Nicholas; Martinez, Nicholas; DeRose, Christopher; Chen, Changchen; Grein, Matthew; Trotter, Douglas; Starbuck, Andrew; Pomerene, Andrew; Hamilton, Scott; Wong, Franco N. C.; Camacho, Ryan; Davids, Paul; Urayama, Junji; Englund, Dirk
2018-04-01
Photonic integrated circuits provide a compact and stable platform for quantum photonics. Here we demonstrate a silicon photonics quantum key distribution (QKD) encoder in the first high-speed polarization-based QKD field tests. The systems reach composable secret key rates of 1.039 Mbps in a local test (on a 103.6-m fiber with a total emulated loss of 9.2 dB) and 157 kbps in an intercity metropolitan test (on a 43-km fiber with 16.4 dB loss). Our results represent the highest secret key generation rate for polarization-based QKD experiments at a standard telecom wavelength and demonstrate photonic integrated circuits as a promising, scalable resource for future formation of metropolitan quantum-secure communications networks.
NASA Astrophysics Data System (ADS)
Rukmana, Y. Y.; Ridwan, M.
2018-01-01
This paper presents the results of soil investigation on the residual soil at Gayungan Surabaya. The methodology of the research consists of Drilling + Standard Penetration Test (ASTM D1586-99), sampling and laboratory test for index properties & mechanical of soil, then analyzed for Soil Bearing Capacity (Meyerhoff, 1976). Field test analysis data showed that Bore Hole.01(BH.01) and Bore Hole.03 (BH.03) were dominated by Sand / Sandy clay layer with Standart Penetration Test (SPT) values: 6-68, whereas in BH.02 was dominated by Clayey sand layer with Standard Penetration Test (SPT) values: 32-68. Based on Soil classification according to Unified Soil Classification System (USCS), the soil type at the research area consisted of ML (Silt with Low plasticity), CL ( Clay with low plasticity), MH (Silt with High plasticity), and SP (Sand with Poor gradation). Based on the borlog data and soil bearing capacity analysis of the research area is recommended: for The Deep foundation to reaches at least 16 meters depth with Qa = 1160.40-2032.80 kN / m2, and Shallow foundation reaches at least 1-2 meters deep with Qa = 718.25 kN / M2.
Public domain optical character recognition
NASA Astrophysics Data System (ADS)
Garris, Michael D.; Blue, James L.; Candela, Gerald T.; Dimmick, Darrin L.; Geist, Jon C.; Grother, Patrick J.; Janet, Stanley A.; Wilson, Charles L.
1995-03-01
A public domain document processing system has been developed by the National Institute of Standards and Technology (NIST). The system is a standard reference form-based handprint recognition system for evaluating optical character recognition (OCR), and it is intended to provide a baseline of performance on an open application. The system's source code, training data, performance assessment tools, and type of forms processed are all publicly available. The system recognizes the handprint entered on handwriting sample forms like the ones distributed with NIST Special Database 1. From these forms, the system reads hand-printed numeric fields, upper and lowercase alphabetic fields, and unconstrained text paragraphs comprised of words from a limited-size dictionary. The modular design of the system makes it useful for component evaluation and comparison, training and testing set validation, and multiple system voting schemes. The system contains a number of significant contributions to OCR technology, including an optimized probabilistic neural network (PNN) classifier that operates a factor of 20 times faster than traditional software implementations of the algorithm. The source code for the recognition system is written in C and is organized into 11 libraries. In all, there are approximately 19,000 lines of code supporting more than 550 subroutines. Source code is provided for form registration, form removal, field isolation, field segmentation, character normalization, feature extraction, character classification, and dictionary-based postprocessing. The recognition system has been successfully compiled and tested on a host of UNIX workstations. This paper gives an overview of the recognition system's software architecture, including descriptions of the various system components along with timing and accuracy statistics.
Adaptive Set-Based Methods for Association Testing
Su, Yu-Chen; Gauderman, W. James; Kiros, Berhane; Lewinger, Juan Pablo
2017-01-01
With a typical sample size of a few thousand subjects, a single genomewide association study (GWAS) using traditional one-SNP-at-a-time methods can only detect genetic variants conferring a sizable effect on disease risk. Set-based methods, which analyze sets of SNPs jointly, can detect variants with smaller effects acting within a gene, a pathway, or other biologically relevant sets. While self-contained set-based methods (those that test sets of variants without regard to variants not in the set) are generally more powerful than competitive set-based approaches (those that rely on comparison of variants in the set of interest with variants not in the set), there is no consensus as to which self-contained methods are best. In particular, several self-contained set tests have been proposed to directly or indirectly ‘adapt’ to the a priori unknown proportion and distribution of effects of the truly associated SNPs in the set, which is a major determinant of their power. A popular adaptive set-based test is the adaptive rank truncated product (ARTP), which seeks the set of SNPs that yields the best-combined evidence of association. We compared the standard ARTP, several ARTP variations we introduced, and other adaptive methods in a comprehensive simulation study to evaluate their performance. We used permutations to assess significance for all the methods and thus provide a level playing field for comparison. We found the standard ARTP test to have the highest power across our simulations followed closely by the global model of random effects (GMRE) and a LASSO based test. PMID:26707371
Testing of visual field with virtual reality goggles in manual and visual grasp modes.
Wroblewski, Dariusz; Francis, Brian A; Sadun, Alfredo; Vakili, Ghazal; Chopra, Vikas
2014-01-01
Automated perimetry is used for the assessment of visual function in a variety of ophthalmic and neurologic diseases. We report development and clinical testing of a compact, head-mounted, and eye-tracking perimeter (VirtualEye) that provides a more comfortable test environment than the standard instrumentation. VirtualEye performs the equivalent of a full threshold 24-2 visual field in two modes: (1) manual, with patient response registered with a mouse click, and (2) visual grasp, where the eye tracker senses change in gaze direction as evidence of target acquisition. 59 patients successfully completed the test in manual mode and 40 in visual grasp mode, with 59 undergoing the standard Humphrey field analyzer (HFA) testing. Large visual field defects were reliably detected by VirtualEye. Point-by-point comparison between the results obtained with the different modalities indicates: (1) minimal systematic differences between measurements taken in visual grasp and manual modes, (2) the average standard deviation of the difference distributions of about 5 dB, and (3) a systematic shift (of 4-6 dB) to lower sensitivities for VirtualEye device, observed mostly in high dB range. The usability survey suggested patients' acceptance of the head-mounted device. The study appears to validate the concepts of a head-mounted perimeter and the visual grasp mode.
Mazerand, Edouard; Le Renard, Marc; Hue, Sophie; Lemée, Jean-Michel; Klinger, Evelyne; Menei, Philippe
2017-01-01
Brain mapping during awake craniotomy is a well-known technique to preserve neurological functions, especially the language. It is still challenging to map the optic radiations due to the difficulty to test the visual field intraoperatively. To assess the visual field during awake craniotomy, we developed the Functions' Explorer based on a virtual reality headset (FEX-VRH). The impaired visual field of 10 patients was tested with automated perimetry (the gold standard examination) and the FEX-VRH. The proof-of-concept test was done during the surgery performed on a patient who was blind in his right eye and presenting with a left parietotemporal glioblastoma. The FEX-VRH was used intraoperatively, simultaneously with direct subcortical electrostimulation, allowing identification and preservation of the optic radiations. The FEX-VRH detected 9 of the 10 visual field defects found by automated perimetry. The patient who underwent an awake craniotomy with intraoperative mapping of the optic tract using the FEX-VRH had no permanent postoperative visual field defect. Intraoperative visual field assessment with the FEX-VRH during direct subcortical electrostimulation is a promising approach to mapping the optical radiations and preventing a permanent visual field defect during awake surgery for epilepsy or tumor. Copyright © 2016 Elsevier Inc. All rights reserved.
29 CFR 1630.10 - Qualification standards, tests, and other selection criteria.
Code of Federal Regulations, 2011 CFR
2011-07-01
... business necessity. (b) Qualification standards and tests related to uncorrected vision. Notwithstanding..., or other selection criteria based on an individual's uncorrected vision unless the standard, test, or... application of a qualification standard, test, or other criterion based on uncorrected vision need not be a...
NASA Technical Reports Server (NTRS)
Larkin, Paul; Goldstein, Bob
2008-01-01
This paper presents an update to the methods and procedures used in Direct Field Acoustic Testing (DFAT). The paper will discuss some of the recent techniques and developments that are currently being used and the future publication of a reference standard. Acoustic testing using commercial sound system components is becoming a popular and cost effective way of generating a required acoustic test environment both in and out of a reverberant chamber. This paper will present the DFAT test method, the usual setup and procedure and the development and use of a closed-loop, narrow-band control system. Narrow-band control of the acoustic PSD allows all standard techniques and procedures currently used in random control to be applied to acoustics and some examples are given. The paper will conclude with a summary of the development of a standard practice guideline that is hoped to be available in the first quarter of next year.
NASA Astrophysics Data System (ADS)
Snaith, Henry J.; Hacke, Peter
2018-06-01
Photovoltaic modules are expected to operate in the field for more than 25 years, so reliability assessment is critical for the commercialization of new photovoltaic technologies. In early development stages, understanding and addressing the device degradation mechanisms are the priorities. However, any technology targeting large-scale deployment must eventually pass industry-standard qualification tests and undergo reliability testing to validate the module lifetime. In this Perspective, we review the methodologies used to assess the reliability of established photovoltaics technologies and to develop standardized qualification tests. We present the stress factors and stress levels for degradation mechanisms currently identified in pre-commercial perovskite devices, along with engineering concepts for mitigation of those degradation modes. Recommendations for complete and transparent reporting of stability tests are given, to facilitate future inter-laboratory comparisons and to further the understanding of field-relevant degradation mechanisms, which will benefit the development of accelerated stress tests.
A new biological test of water toxicity-yeast Saccharomyces cerevisiae conductometric test.
Dolezalova, Jaroslava; Rumlova, Lubomira
2014-11-01
This new biological test of water toxicity is based on monitoring of specific conductivity changes of yeast Saccharomyces cerevisiae suspension as a result of yeast fermentation activity inhibition in toxic conditions. The test was verified on ten substances with various mechanisms of toxic effect and the results were compared with two standard toxicity tests based on Daphnia magna mobility inhibition (EN ISO 6341) and Vibrio fischeri bioluminescence inhibition (EN ISO 11348-2) and with the results of the S. cerevisiae lethal test (Rumlova and Dolezalova, 2012). The new biological test - S. cerevisiae conductometric test - is an express method developed primarily for field conditions. It is applicable in case of need of immediate information about water toxicity. Fast completion is an advantage of this test (time necessary for test completion is about 60min), the test is simple and the test organism - dried instant yeast - belongs among its biggest advantages because of its long-term storage life and broad availability. Copyright © 2014 Elsevier B.V. All rights reserved.
Mousa, Mohammad F; Cubbidge, Robert P; Al-Mansouri, Fatima; Bener, Abdulbari
2014-02-01
Multifocal visual evoked potential (mfVEP) is a newly introduced method used for objective visual field assessment. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique, some were successful in detection of field defects, which were comparable to the standard automated perimetry (SAP) visual field assessment, and others were not very informative and needed more adjustment and research work. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. Three groups were tested in this study; normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects had a two standard Humphrey field analyzer (HFA) test 24-2 and a single mfVEP test undertaken in one session. Analysis of the mfVEP results was done using the new analysis protocol; the hemifield sector analysis (HSA) protocol. Analysis of the HFA was done using the standard grading system. Analysis of mfVEP results showed that there was a statistically significant difference between the three groups in the mean signal to noise ratio (ANOVA test, p < 0.001 with a 95% confidence interval). The difference between superior and inferior hemispheres in all subjects were statistically significant in the glaucoma patient group in all 11 sectors (t-test, p < 0.001), partially significant in 5 / 11 (t-test, p < 0.01), and no statistical difference in most sectors of the normal group (1 / 11 sectors was significant, t-test, p < 0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma was 97% and 86%, respectively, and for glaucoma suspect patients the values were 89% and 79%, respectively. The new HSA protocol used in the mfVEP testing can be applied to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. Sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucoma field loss.
Mousa, Mohammad F.; Cubbidge, Robert P.; Al-Mansouri, Fatima
2014-01-01
Purpose Multifocal visual evoked potential (mfVEP) is a newly introduced method used for objective visual field assessment. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique, some were successful in detection of field defects, which were comparable to the standard automated perimetry (SAP) visual field assessment, and others were not very informative and needed more adjustment and research work. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. Methods Three groups were tested in this study; normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects had a two standard Humphrey field analyzer (HFA) test 24-2 and a single mfVEP test undertaken in one session. Analysis of the mfVEP results was done using the new analysis protocol; the hemifield sector analysis (HSA) protocol. Analysis of the HFA was done using the standard grading system. Results Analysis of mfVEP results showed that there was a statistically significant difference between the three groups in the mean signal to noise ratio (ANOVA test, p < 0.001 with a 95% confidence interval). The difference between superior and inferior hemispheres in all subjects were statistically significant in the glaucoma patient group in all 11 sectors (t-test, p < 0.001), partially significant in 5 / 11 (t-test, p < 0.01), and no statistical difference in most sectors of the normal group (1 / 11 sectors was significant, t-test, p < 0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma was 97% and 86%, respectively, and for glaucoma suspect patients the values were 89% and 79%, respectively. Conclusions The new HSA protocol used in the mfVEP testing can be applied to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. Sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucoma field loss. PMID:24511212
NASA Astrophysics Data System (ADS)
Robertis, G. De; Fanizzi, G.; Loddo, F.; Manzari, V.; Rizzi, M.
2018-02-01
In this work the MOSAIC ("MOdular System for Acquisition, Interface and Control") board, designed for the readout and testing of the pixel modules for the silicon tracker upgrade of the ALICE (A Large Ion Collider Experiment) experiment at teh CERN LHC, is described. It is based on an Artix7 Field Programmable Gate Array device by Xilinx and is compliant with the six unit "Versa Modular Eurocard" standard (6U-VME) for easy housing in a standard VMEbus crate from which it takes only power supplies and cooling.
Hydrogen slush density reference system
NASA Technical Reports Server (NTRS)
Weitzel, D. H.; Lowe, L. T.; Ellerbruch, D. A.; Cruz, J. E.; Sindt, C. F.
1971-01-01
A hydrogen slush density reference system was designed for calibration of field-type instruments and/or transfer standards. The device is based on the buoyancy principle of Archimedes. The solids are weighed in a low-mass container so arranged that solids and container are buoyed by triple-point liquid hydrogen during the weighing process. Several types of hydrogen slush density transducers were developed and tested for possible use as transfer standards. The most successful transducers found were those which depend on change in dielectric constant, after which the Clausius-Mossotti function is used to relate dielectric constant and density.
Fragmentation modeling of a resin bonded sand
NASA Astrophysics Data System (ADS)
Hilth, William; Ryckelynck, David
2017-06-01
Cemented sands exhibit a complex mechanical behavior that can lead to sophisticated models, with numerous parameters without real physical meaning. However, using a rather simple generalized critical state bonded soil model has proven to be a relevant compromise between an easy calibration and good results. The constitutive model formulation considers a non-associated elasto-plastic formulation within the critical state framework. The calibration procedure, using standard laboratory tests, is complemented by the study of an uniaxial compression test observed by tomography. Using finite elements simulations, this test is simulated considering a non-homogeneous 3D media. The tomography of compression sample gives access to 3D displacement fields by using image correlation techniques. Unfortunately these fields have missing experimental data because of the low resolution of correlations for low displacement magnitudes. We propose a recovery method that reconstructs 3D full displacement fields and 2D boundary displacement fields. These fields are mandatory for the calibration of the constitutive parameters by using 3D finite element simulations. The proposed recovery technique is based on a singular value decomposition of available experimental data. This calibration protocol enables an accurate prediction of the fragmentation of the specimen.
NASA Technical Reports Server (NTRS)
Sanchez, Jose Enrique; Auge, Estanislau; Santalo, Josep; Blanes, Ian; Serra-Sagrista, Joan; Kiely, Aaron
2011-01-01
A new standard for image coding is being developed by the MHDC working group of the CCSDS, targeting onboard compression of multi- and hyper-spectral imagery captured by aircraft and satellites. The proposed standard is based on the "Fast Lossless" adaptive linear predictive compressor, and is adapted to better overcome issues of onboard scenarios. In this paper, we present a review of the state of the art in this field, and provide an experimental comparison of the coding performance of the emerging standard in relation to other state-of-the-art coding techniques. Our own independent implementation of the MHDC Recommended Standard, as well as of some of the other techniques, has been used to provide extensive results over the vast corpus of test images from the CCSDS-MHDC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wohlgemuth, John; Silverman, Timothy; Miller, David C.
This paper describes an effort to inspect and evaluate PV modules in order to determine what failure or degradation modes are occurring in field installations. This paper will report on the results of six site visits, including the Sacramento Municipal Utility District (SMUD) Hedge Array, Tucson Electric Power (TEP) Springerville, Central Florida Utility, Florida Solar Energy Center (FSEC), the TEP Solar Test Yard, and University of Toledo installations. The effort here makes use of a recently developed field inspection data collection protocol, and the results were input into a corresponding database. The results of this work have also been usedmore » to develop a draft of the IEC standard for climate and application specific accelerated stress testing beyond module qualification. TEP Solar Test Yard, and University of Toledo installations. The effort here makes use of a recently developed field inspection data collection protocol, and the results were input into a corresponding database. The results of this work have also been used to develop a draft of the IEC standard for climate and application specific accelerated stress testing beyond module qualification. TEP Solar Test Yard, and University of Toledo installations. The effort here makes use of a recently developed field inspection data collection protocol, and the results were input into a corresponding database. The results of this work have also been used to develop a draft of the IEC standard for climate and application specific accelerated stress testing beyond module qualification.« less
A Portable Platform for Evaluation of Visual Performance in Glaucoma Patients
Rosen, Peter N.; Boer, Erwin R.; Gracitelli, Carolina P. B.; Abe, Ricardo Y.; Diniz-Filho, Alberto; Marvasti, Amir H.; Medeiros, Felipe A.
2015-01-01
Purpose To propose a new tablet-enabled test for evaluation of visual performance in glaucoma, the PERformance CEntered Portable Test (PERCEPT), and to evaluate its ability to predict history of falls and motor vehicle crashes. Design Cross-sectional study. Methods The study involved 71 patients with glaucomatous visual field defects on standard automated perimetry (SAP) and 59 control subjects. The PERCEPT was based on the concept of increasing visual task difficulty to improve detection of central visual field losses in glaucoma patients. Subjects had to perform a foveal 8-alternative-forced-choice orientation discrimination task, while detecting a simultaneously presented peripheral stimulus within a limited presentation time. Subjects also underwent testing with the Useful Field of View (UFOV) divided attention test. The ability to predict history of motor vehicle crashes and falls was investigated by odds ratios and incident-rate ratios, respectively. Results When adjusted for age, only the PERCEPT processing speed parameter showed significantly larger values in glaucoma compared to controls (difference: 243ms; P<0.001). PERCEPT results had a stronger association with history of motor vehicle crashes and falls than UFOV. Each 1 standard deviation increase in PERCEPT processing speed was associated with an odds ratio of 2.69 (P = 0.003) for predicting history of motor vehicle crashes and with an incident-rate ratio of 1.95 (P = 0.003) for predicting history of falls. Conclusion A portable platform for testing visual function was able to detect functional deficits in glaucoma, and its results were significantly associated with history of involvement in motor vehicle crashes and history of falls. PMID:26445501
Differential Gender Performance on the Major Field Test-Business
ERIC Educational Resources Information Center
Bielinska-Kwapisz, Agnieszka; Brown, F. William
2013-01-01
The Major Field Test in Business (MFT-B), a standardized assessment test of business knowledge among undergraduate business seniors, is widely used to measure student achievement. Many previous studies analyzing scores on the MFT-B report gender differences on the exam even after controlling for student's aptitude, general intellectual ability,…
Web-Based Evaluation System to Measure Learning Effectiveness in Kampo Medicine
Usuku, Koichiro; Segawa, Makoto; Wang, Yue; Ogashiwa, Kahori; Fujita, Yusuke; Ogihara, Hiroyuki; Tazuma, Susumu
2016-01-01
Measuring the learning effectiveness of Kampo Medicine (KM) education is challenging. The aim of this study was to develop a web-based test to measure the learning effectiveness of KM education among medical students (MSs). We used an open-source Moodle platform to test 30 multiple-choice questions classified into 8-type fields (eight basic concepts of KM) including “qi-blood-fluid” and “five-element” theories, on 117 fourth-year MSs. The mean (±standard deviation [SD]) score on the web-based test was 30.2 ± 11.9 (/100). The correct answer rate ranged from 17% to 36%. A pattern-based portfolio enabled these rates to be individualized in terms of KM proficiency. MSs with scores higher (n = 19) or lower (n = 14) than mean ± 1SD were defined as high or low achievers, respectively. Cluster analysis using the correct answer rates for the 8-type field questions revealed clear divisions between high and low achievers. Interestingly, each high achiever had a different proficiency pattern. In contrast, three major clusters were evident among low achievers, all of whom responded with a low percentage of or no correct answers. In addition, a combination of three questions accurately classified high and low achievers. These findings suggest that our web-based test allows individual quantitative assessment of the learning effectiveness of KM education among MSs. PMID:27738440
Web-Based Evaluation System to Measure Learning Effectiveness in Kampo Medicine.
Iizuka, Norio; Usuku, Koichiro; Nakae, Hajime; Segawa, Makoto; Wang, Yue; Ogashiwa, Kahori; Fujita, Yusuke; Ogihara, Hiroyuki; Tazuma, Susumu; Hamamoto, Yoshihiko
2016-01-01
Measuring the learning effectiveness of Kampo Medicine (KM) education is challenging. The aim of this study was to develop a web-based test to measure the learning effectiveness of KM education among medical students (MSs). We used an open-source Moodle platform to test 30 multiple-choice questions classified into 8-type fields (eight basic concepts of KM) including "qi-blood-fluid" and "five-element" theories, on 117 fourth-year MSs. The mean (±standard deviation [SD]) score on the web-based test was 30.2 ± 11.9 (/100). The correct answer rate ranged from 17% to 36%. A pattern-based portfolio enabled these rates to be individualized in terms of KM proficiency. MSs with scores higher ( n = 19) or lower ( n = 14) than mean ± 1SD were defined as high or low achievers, respectively. Cluster analysis using the correct answer rates for the 8-type field questions revealed clear divisions between high and low achievers. Interestingly, each high achiever had a different proficiency pattern. In contrast, three major clusters were evident among low achievers, all of whom responded with a low percentage of or no correct answers. In addition, a combination of three questions accurately classified high and low achievers. These findings suggest that our web-based test allows individual quantitative assessment of the learning effectiveness of KM education among MSs.
Proposed Robust Entanglement-Based Magnetic Field Sensor Beyond the Standard Quantum Limit.
Tanaka, Tohru; Knott, Paul; Matsuzaki, Yuichiro; Dooley, Shane; Yamaguchi, Hiroshi; Munro, William J; Saito, Shiro
2015-10-23
Recently, there have been significant developments in entanglement-based quantum metrology. However, entanglement is fragile against experimental imperfections, and quantum sensing to beat the standard quantum limit in scaling has not yet been achieved in realistic systems. Here, we show that it is possible to overcome such restrictions so that one can sense a magnetic field with an accuracy beyond the standard quantum limit even under the effect of decoherence, by using a realistic entangled state that can be easily created even with current technology. Our scheme could pave the way for the realizations of practical entanglement-based magnetic field sensors.
An Enclosed Laser Calibration Standard
NASA Astrophysics Data System (ADS)
Adams, Thomas E.; Fecteau, M. L.
1985-02-01
We have designed, evaluated and calibrated an enclosed, safety-interlocked laser calibration standard for use in US Army Secondary Reference Calibration Laboratories. This Laser Test Set Calibrator (LTSC) represents the Army's first-generation field laser calibration standard. Twelve LTSC's are now being fielded world-wide. The main requirement on the LTSC is to provide calibration support for the Test Set (TS3620) which, in turn, is a GO/NO GO tester of the Hand-Held Laser Rangefinder (AN/GVS-5). However, we believe it's design is flexible enough to accommodate the calibration of other laser test, measurement and diagnostic equipment (TMDE) provided that single-shot capability is adequate to perform the task. In this paper we describe the salient aspects and calibration requirements of the AN/GVS-5 Rangefinder and the Test Set which drove the basic LTSC design. Also, we detail our evaluation and calibration of the LTSC, in particular, the LTSC system standards. We conclude with a review of our error analysis from which uncertainties were assigned to the LTSC calibration functions.
Hyyti, Janne; Escoto, Esmerando; Steinmeyer, Günter
2017-10-01
A novel algorithm for the ultrashort laser pulse characterization method of interferometric frequency-resolved optical gating (iFROG) is presented. Based on a genetic method, namely, differential evolution, the algorithm can exploit all available information of an iFROG measurement to retrieve the complex electric field of a pulse. The retrieval is subjected to a series of numerical tests to prove the robustness of the algorithm against experimental artifacts and noise. These tests show that the integrated error-correction mechanisms of the iFROG method can be successfully used to remove the effect from timing errors and spectrally varying efficiency in the detection. Moreover, the accuracy and noise resilience of the new algorithm are shown to outperform retrieval based on the generalized projections algorithm, which is widely used as the standard method in FROG retrieval. The differential evolution algorithm is further validated with experimental data, measured with unamplified three-cycle pulses from a mode-locked Ti:sapphire laser. Additionally introducing group delay dispersion in the beam path, the retrieval results show excellent agreement with independent measurements with a commercial pulse measurement device based on spectral phase interferometry for direct electric-field retrieval. Further experimental tests with strongly attenuated pulses indicate resilience of differential-evolution-based retrieval against massive measurement noise.
On standardization of low symmetry crystal fields
NASA Astrophysics Data System (ADS)
Gajek, Zbigniew
2015-07-01
Standardization methods of low symmetry - orthorhombic, monoclinic and triclinic - crystal fields are formulated and discussed. Two alternative approaches are presented, the conventional one, based on the second-rank parameters and the standardization based on the fourth-rank parameters. Mainly f-electron systems are considered but some guidelines for d-electron systems and the spin Hamiltonian describing the zero-field splitting are given. The discussion focuses on premises for choosing the most suitable method, in particular on inadequacy of the conventional one. Few examples from the literature illustrate this situation.
NASA Astrophysics Data System (ADS)
Guffey, S. K.; Slater, T. F.; Slater, S. J.
2017-12-01
Discipline-based geoscience education researchers have considerable need for criterion-referenced, easy-to-administer and easy-to-score, conceptual diagnostic surveys for undergraduates taking introductory science survey courses in order for faculty to better be able to monitor the learning impacts of various interactive teaching approaches. To support ongoing discipline-based science education research to improve teaching and learning across the geosciences, this study establishes the reliability and validity of a 28-item, multiple-choice, pre- and post- Exam of GeoloGy Standards, hereafter simply called EGGS. The content knowledge EGGS addresses is based on 11 consensus concepts derived from a systematic, thematic analysis of the overlapping ideas presented in national science education reform documents including the Next Generation Science Standards, the AAAS Benchmarks for Science Literacy, the Earth Science Literacy Principles, and the NRC National Science Education Standards. Using community agreed upon best-practices for creating, field-testing, and iteratively revising modern multiple-choice test items using classical item analysis techniques, EGGS emphasizes natural student language over technical scientific vocabulary, leverages illustrations over students' reading ability, specifically targets students' misconceptions identified in the scholarly literature, and covers the range of topics most geology educators expect general education students to know at the end of their formal science learning experiences. The current version of EGGS is judged to be valid and reliable with college-level, introductory science survey students based on both standard quantitative and qualitative measures, including extensive clinical interviews with targeted students and systematic expert review.
Rao, Harsha L; Yadav, Ravi K; Addepalli, Uday K; Begum, Viquar U; Senthil, Sirisha; Choudhari, Nikhil S; Garudadri, Chandra S
2015-08-01
To evaluate the relationship between the reference standard used to diagnose glaucoma and the diagnostic ability of spectral domain optical coherence tomograph (SDOCT). In a cross-sectional study, 280 eyes of 175 consecutive subjects, referred to a tertiary eye care center for glaucoma evaluation, underwent optic disc photography, visual field (VF) examination, and SDOCT examination. The cohort was divided into glaucoma and control groups based on 3 reference standards for glaucoma diagnosis: first based on the optic disc classification (179 glaucoma and 101 control eyes), second on VF classification (glaucoma hemifield test outside normal limits and pattern SD with P-value of <5%, 130 glaucoma and 150 control eyes), and third on the presence of both glaucomatous optic disc and glaucomatous VF (125 glaucoma and 155 control eyes). Relationship between the reference standards and the diagnostic parameters of SDOCT were evaluated using areas under the receiver operating characteristic curve, sensitivity, and specificity. Areas under the receiver operating characteristic curve and sensitivities of most of the SDOCT parameters obtained with the 3 reference standards (ranging from 0.74 to 0.88 and 72% to 88%, respectively) were comparable (P>0.05). However, specificities of SDOCT parameters were significantly greater (P<0.05) with optic disc classification as reference standard (74% to 88%) compared with VF classification as reference standard (57% to 74%). Diagnostic parameters of SDOCT that was significantly affected by reference standard was the specificity, which was greater with optic disc classification as the reference standard. This has to be considered when comparing the diagnostic ability of SDOCT across studies.
Collaborative Data Publication Utilizing the Open Data Repository's (ODR) Data Publisher
NASA Technical Reports Server (NTRS)
Stone, N.; Lafuente, B.; Bristow, T.; Keller, R. M.; Downs, R. T.; Blake, D.; Fonda, M.; Dateo, C.; Pires, A.
2017-01-01
Introduction: For small communities in diverse fields such as astrobiology, publishing and sharing data can be a difficult challenge. While large, homogenous fields often have repositories and existing data standards, small groups of independent researchers have few options for publishing standards and data that can be utilized within their community. In conjunction with teams at NASA Ames and the University of Arizona, the Open Data Repository's (ODR) Data Publisher has been conducting ongoing pilots to assess the needs of diverse research groups and to develop software to allow them to publish and share their data collaboratively. Objectives: The ODR's Data Publisher aims to provide an easy-to-use and implement software tool that will allow researchers to create and publish database templates and related data. The end product will facilitate both human-readable interfaces (web-based with embedded images, files, and charts) and machine-readable interfaces utilizing semantic standards. Characteristics: The Data Publisher software runs on the standard LAMP (Linux, Apache, MySQL, PHP) stack to provide the widest server base available. The software is based on Symfony (www.symfony.com) which provides a robust framework for creating extensible, object-oriented software in PHP. The software interface consists of a template designer where individual or master database templates can be created. A master database template can be shared by many researchers to provide a common metadata standard that will set a compatibility standard for all derivative databases. Individual researchers can then extend their instance of the template with custom fields, file storage, or visualizations that may be unique to their studies. This allows groups to create compatible databases for data discovery and sharing purposes while still providing the flexibility needed to meet the needs of scientists in rapidly evolving areas of research. Research: As part of this effort, a number of ongoing pilot and test projects are currently in progress. The Astrobiology Habitable Environments Database Working Group is developing a shared database standard using the ODR's Data Publisher and has a number of example databases where astrobiology data are shared. Soon these databases will be integrated via the template-based standard. Work with this group helps determine what data researchers in these diverse fields need to share and archive. Additionally, this pilot helps determine what standards are viable for sharing these types of data from internally developed standards to existing open standards such as the Dublin Core (http://dublincore.org) and Darwin Core (http://rs.twdg.org) metadata standards. Further studies are ongoing with the University of Arizona Department of Geosciences where a number of mineralogy databases are being constructed within the ODR Data Publisher system. Conclusions: Through the ongoing pilots and discussions with individual researchers and small research teams, a definition of the tools desired by these groups is coming into focus. As the software development moves forward, the goal is to meet the publication and collaboration needs of these scientists in an unobtrusive and functional way.
[An optical-fiber-sensor-based spectrophotometer for soil non-metallic nutrient determination].
He, Dong-xian; Hu, Juan-xiu; Lu, Shao-kun; He, Hou-yong
2012-01-01
In order to achieve rapid, convenient and efficient soil nutrient determination in soil testing and fertilizer recommendation, a portable optical-fiber-sensor-based spectrophotometer including immersed fiber sensor, flat field holographic concave grating, and diode array detector was developed for soil non-metallic nutrient determination. According to national standard of ultraviolet and visible spectrophotometer with JJG 178-2007, the wavelength accuracy and repeatability, baseline stability, transmittance accuracy and repeatability measured by the prototype instrument were satisfied with the national standard of III level; minimum spectral bandwidth, noise and excursion, and stray light were satisfied with the national standard of IV level. Significant linear relationships with slope of closing to 1 were found between the soil available nutrient contents including soil nitrate nitrogen, ammonia nitrogen, available phosphorus, available sulfur, available boron, and organic matter measured by the prototype instrument compared with that measured by two commercial single-beam-based and dual-beam-based spectrophotometers. No significant differences were revealed from the above comparison data. Therefore, the optical-fiber-sensor-based spectrophotometer can be used for rapid soil non-metallic nutrient determination with a high accuracy.
Hearing Aid–Related Standards and Test Systems
Ravn, Gert; Preves, David
2015-01-01
Many documents describe standardized methods and standard equipment requirements in the field of audiology and hearing aids. These standards will ensure a uniform level and a high quality of both the methods and equipment used in audiological work. The standards create the basis for measuring performance in a reproducible manner and independent from how and when and by whom parameters have been measured. This article explains, and focuses on, relevant acoustic and electromagnetic compatibility parameters and describes several test systems available. PMID:27516709
Bamdad, Shahram; Beigi, Vahid; Sedaghat, Mohammad Reza
2017-01-01
Perimetry is one of the mainstays in glaucoma diagnosis and treatment. Various strategies offer different accuracies in glaucoma testing. Our aim was to determine and compare the diagnostic sensitivity and specificity of Swedish Interactive Threshold Algorithm (SITA) Fast and Standard Full Threshold (SFT) strategies of the Humphrey Field Analyzer (HFA) in identifying patients with visual field defect in glaucoma disease. This prospective observational case series study was conducted in a university-based eye hospital. A total of 37 eyes of 20 patients with glaucoma were evaluated using the central 30-2 program and both the SITA Fast and SFT strategies. Both strategies were performed for each strategy in each session and for four times in a 2-week period. Data were analyzed using the Student's t-test, analysis of variance, and chi-square test. The SITA Fast and SFT strategies had similar sensitivity of 93.3%. The specificity of SITA Fast and SFT strategies was 57.4% and 71.4% respectively. The mean duration of SFT tests was 14.6 minutes, and that of SITA Fast tests was 5.45 minutes (a statistically significant 62.5% reduction). In gray scale plots, visual field defect was less deep in SITA Fast than in SFT; however, more points had significant defect (p < 0.5% and p < 1%) in pattern deviation plots in SITA Fast than in SFT; these differences were not clinically significant. In conclusion, the SITA Fast strategy showed higher sensitivity for detection of glaucoma compared to the SFT strategy, yet with reduced specificity; however, the shorter test duration makes it a more acceptable choice in many clinical situations, especially for children, elderly, and those with musculoskeletal diseases.
A high resolution Passive Flux Meter approach based on colorimetric responses
NASA Astrophysics Data System (ADS)
Chardi, K.; Dombrowski, K.; Cho, J.; Hatfield, K.; Newman, M.; Annable, M. D.
2016-12-01
Subsurface water and contaminant mass flux measurements are critical in determining risk, optimizing remediation strategies, and monitoring contaminant attenuation. The standard Passive Flux Meter, hereafter knows as a (PFM), is a well-developed device used for determining and monitoring rates of groundwater and contaminant mass flux in screened wells. The current PFM is a permeable device that contains granular activated carbon impregnated with alcohol tracers which is deployed in a flow field for a designated period of time. Once extracted, sampling requires laboratory analysis to quantify Darcy flux, which can be time consuming and have significant cost. To expedite test results, a modified PFM based on the image analysis of colorimetric responses, herein referred to as a colorimetric Passive Flux Meter (cPFM), was developed. Various dyes and sorbents were selected and evaluated to determine colorimetric response to water flow. Rhodamine, fluorescent yellow, fluorescent orange, and turmeric were the dye candidates while 100% wool and a 35% wool blend with 65% rayon were the sorbent candidates selected for use in the cPFM. Ultraviolet light image analysis was used to calculate average color intensity using ImageJ, a Java-based image processing program. These results were then used to quantify Darcy flux. Error ranges evaluated for Darcy flux using the cPFM are comparable to those with the standard, activated carbon based, PFM. The cPFM has the potential to accomplish the goal of obtaining high resolution Darcy flux data while eliminating high costs and analysis time. Implications of groundwater characteristics, such as PH and contaminant concentrations, on image analysis are to be tested through laboratory analysis followed by field testing of the cPFM.
Genotoxicity evaluation of So-ochim-tang-gamibang (SOCG), a herbal medicine.
Lee, Mi Young; Park, Yang-Chun; Jin, Mirim; Kim, Eunseok; Choi, Jeong June; Jung, In Chul
2018-02-02
So-ochim-tang-gamibang (SOCG) is a traditional Korean medicine frequently used for depression in the clinical field. In this study, we evaluated the potential genotoxicity of SOCG using three standard batteries of tests as part of a safety evaluation. SOCG was evaluated for potential genotoxic effects using the standard three tests recommended by the Ministry of Food and Drug Safety (MFDS) of Korea. These tests were the bacterial reverse mutation test (Ames test), in vitro mammalian chromosomal aberration test using Chinese hamster lung cells, and in vivo micronucleus test using ICR mice. The Ames test with Salmonella typhimurium strains TA98, TA100, TA1535 and TA1537 and the Escherichia coli strain WP2uvrA(pKM101) showed that SOCG did not induce gene mutations at any dose level in all of the strains. SOCG did not induce any chromosomal aberrations in the in vitro chromosomal aberration test (for both the 6 and 24 h test) and the in vivo micronucleus test. Based on the results of these tests, it was concluded that SOCG does not exhibit any genotoxic risk under the experimental conditions of this study.
Measurement and Analysis of Failures in Computer Systems
NASA Technical Reports Server (NTRS)
Thakur, Anshuman
1997-01-01
This thesis presents a study of software failures spanning several different releases of Tandem's NonStop-UX operating system running on Tandem Integrity S2(TMR) systems. NonStop-UX is based on UNIX System V and is fully compliant with industry standards, such as the X/Open Portability Guide, the IEEE POSIX standards, and the System V Interface Definition (SVID) extensions. In addition to providing a general UNIX interface to the hardware, the operating system has built-in recovery mechanisms and audit routines that check the consistency of the kernel data structures. The analysis is based on data on software failures and repairs collected from Tandem's product report (TPR) logs for a period exceeding three years. A TPR log is created when a customer or an internal developer observes a failure in a Tandem Integrity system. This study concentrates primarily on those TPRs that report a UNIX panic that subsequently crashes the system. Approximately 200 of the TPRs fall into this category. Approximately 50% of the failures reported are from field systems, and the rest are from the testing and development sites. It has been observed by Tandem developers that fewer cases are encountered from the field than from the test centers. Thus, the data selection mechanism has introduced a slight skew.
Michaud, Ginette Y
2005-01-01
In the field of clinical laboratory medicine, standardization is aimed at increasing the trueness and reliability of measured values. Standardization relies on the use of written standards, reference measurement procedures and reference materials. These are important tools for the design and validation of new tests, and for establishing the metrological traceability of diagnostic assays. Their use supports the translation of research technologies into new diagnostic assays and leads to more rapid advances in science and medicine, as well as improvements in the quality of patient care. The various standardization tools are described, as are the procedures by which written standards, reference procedures and reference materials are developed. Recent efforts to develop standards for use in the field of molecular diagnostics are discussed. The recognition of standardization tools by the FDA and other regulatory authorities is noted as evidence of their important role in ensuring the safety and performance of in vitro diagnostic devices.
A Florida validation study of the Standardized Field Sobriety Test (S.F.S.T.) battery
DOT National Transportation Integrated Search
1997-01-01
During the years 1975 - 1981, a battery of field sobriety tests was developed under funding by the National Highway Traffic Safety Administration (NHTSA), U.S. Department of Transportation (Burns and Moskowitz, 1977; Tharp, Burns, and Moskowitz, 1981...
Plucinski, Mateusz; Dimbu, Rafael; Candrinho, Baltazar; Colborn, James; Badiane, Aida; Ndiaye, Daouda; Mace, Kimberly; Chang, Michelle; Lemoine, Jean F; Halsey, Eric S; Barnwell, John W; Udhayakumar, Venkatachalam; Aidoo, Michael; Rogier, Eric
2017-11-07
Rapid diagnostic test (RDT) positivity is supplanting microscopy as the standard measure of malaria burden at the population level. However, there is currently no standard for externally validating RDT results from field surveys. Individuals' blood concentration of the Plasmodium falciparum histidine rich protein 2 (HRP2) protein were compared to results of HRP2-detecting RDTs in participants from field surveys in Angola, Mozambique, Haiti, and Senegal. A logistic regression model was used to estimate the HRP2 concentrations corresponding to the 50 and 90% level of detection (LOD) specific for each survey. There was a sigmoidal dose-response relationship between HRP2 concentration and RDT positivity for all surveys. Variation was noted in estimates for field RDT sensitivity, with the 50% LOD ranging between 0.076 and 6.1 ng/mL and the 90% LOD ranging between 1.1 and 53 ng/mL. Surveys conducted in two different provinces of Angola using the same brand of RDT and same study methodology showed a threefold difference in LOD. Measures of malaria prevalence estimated using population RDT positivity should be interpreted in the context of potentially large variation in RDT LODs between, and even within, surveys. Surveys based on RDT positivity would benefit from external validation of field RDT results by comparing RDT positivity and antigen concentration.
2010-01-01
Background The order Carnivora is well represented in India, with 58 of the 250 species found globally, occurring here. However, small carnivores figure very poorly in research and conservation policies in India. This is mainly due to the dearth of tested and standardized techniques that are both cost effective and conducive to small carnivore studies in the field. In this paper we present a non-invasive genetic technique standardized for the study of Indian felids and canids with the use of PCR amplification and restriction enzyme digestion of scat collected in the field. Findings Using existing sequences of felids and canids from GenBank, we designed primers from the 16S rRNA region of the mitochondrial genome and tested these on ten species of felids and five canids. We selected restriction enzymes that would cut the selected region differentially for various species within each family. We produced a restriction digestion profile for the potential differentiation of species based on fragment patterns. To test our technique, we used felid PCR primers on scats collected from various habitats in India, representing varied environmental conditions. Amplification success with field collected scats was 52%, while 86% of the products used for restriction digestion could be accurately assigned to species. We verified this through sequencing. A comparison of costs across the various techniques currently used for scat assignment showed that this technique was the most practical and cost effective. Conclusions The species-specific key developed in this paper provides a means for detailed investigations in the future that focus on elusive carnivores in India and this approach provides a model for other studies in areas of Asia where many small carnivores co-occur. PMID:20525407
Mechanical Testing of Hydrogels in Cartilage Tissue Engineering: Beyond the Compressive Modulus
Xiao, Yinghua; Friis, Elizabeth A.; Gehrke, Stevin H.
2013-01-01
Injuries to articular cartilage result in significant pain to patients and high medical costs. Unfortunately, cartilage repair strategies have been notoriously unreliable and/or complex. Biomaterial-based tissue-engineering strategies offer great promise, including the use of hydrogels to regenerate articular cartilage. Mechanical integrity is arguably the most important functional outcome of engineered cartilage, although mechanical testing of hydrogel-based constructs to date has focused primarily on deformation rather than failure properties. In addition to deformation testing, as the field of cartilage tissue engineering matures, this community will benefit from the addition of mechanical failure testing to outcome analyses, given the crucial clinical importance of the success of engineered constructs. However, there is a tremendous disparity in the methods used to evaluate mechanical failure of hydrogels and articular cartilage. In an effort to bridge the gap in mechanical testing methods of articular cartilage and hydrogels in cartilage regeneration, this review classifies the different toughness measurements for each. The urgency for identifying the common ground between these two disparate fields is high, as mechanical failure is ready to stand alongside stiffness as a functional design requirement. In comparing toughness measurement methods between hydrogels and cartilage, we recommend that the best option for evaluating mechanical failure of hydrogel-based constructs for cartilage tissue engineering may be tensile testing based on the single edge notch test, in part because specimen preparation is more straightforward and a related American Society for Testing and Materials (ASTM) standard can be adopted in a fracture mechanics context. PMID:23448091
Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria
2012-01-01
This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5–7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure informantion (ΔV, NASS-CDS 1992–2002) suggests a 8.9% probability of incurring AIS3+ rib fractures for a 60 year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122
Cetin, K.O.; Seed, R.B.; Der Kiureghian, A.; Tokimatsu, K.; Harder, L.F.; Kayen, R.E.; Moss, R.E.S.
2004-01-01
This paper presents'new correlations for assessment of the likelihood of initiation (or triggering) of soil liquefaction. These new correlations eliminate several sources of bias intrinsic to previous, similar correlations, and provide greatly reduced overall uncertainty and variance. Key elements in the development of these new correlations are (1) accumulation of a significantly expanded database of field performance case histories; (2) use of improved knowledge and understanding of factors affecting interpretation of standard penetration test data; (3) incorporation of improved understanding of factors affecting site-specific earthquake ground motions (including directivity effects, site-specific response, etc.); (4) use of improved methods for assessment of in situ cyclic shear stress ratio; (5) screening of field data case histories on a quality/uncertainty basis; and (6) use of high-order probabilistic tools (Bayesian updating). The resulting relationships not only provide greatly reduced uncertainty, they also help to resolve a number of corollary issues that have long been difficult and controversial including: (1) magnitude-correlated duration weighting factors, (2) adjustments for fines content, and (3) corrections for overburden stress. ?? ASCE.
Viladomat, Júlia; Mazumder, Rahul; McInturff, Alex; McCauley, Douglas J; Hastie, Trevor
2014-06-01
We propose a method to test the correlation of two random fields when they are both spatially autocorrelated. In this scenario, the assumption of independence for the pair of observations in the standard test does not hold, and as a result we reject in many cases where there is no effect (the precision of the null distribution is overestimated). Our method recovers the null distribution taking into account the autocorrelation. It uses Monte-Carlo methods, and focuses on permuting, and then smoothing and scaling one of the variables to destroy the correlation with the other, while maintaining at the same time the initial autocorrelation. With this simulation model, any test based on the independence of two (or more) random fields can be constructed. This research was motivated by a project in biodiversity and conservation in the Biology Department at Stanford University. © 2014, The International Biometric Society.
Newtonian CAFE: a new ideal MHD code to study the solar atmosphere
NASA Astrophysics Data System (ADS)
González-Avilés, J. J.; Cruz-Osorio, A.; Lora-Clavijo, F. D.; Guzmán, F. S.
2015-12-01
We present a new code designed to solve the equations of classical ideal magnetohydrodynamics (MHD) in three dimensions, submitted to a constant gravitational field. The purpose of the code centres on the analysis of solar phenomena within the photosphere-corona region. We present 1D and 2D standard tests to demonstrate the quality of the numerical results obtained with our code. As solar tests we present the transverse oscillations of Alfvénic pulses in coronal loops using a 2.5D model, and as 3D tests we present the propagation of impulsively generated MHD-gravity waves and vortices in the solar atmosphere. The code is based on high-resolution shock-capturing methods, uses the Harten-Lax-van Leer-Einfeldt (HLLE) flux formula combined with Minmod, MC, and WENO5 reconstructors. The divergence free magnetic field constraint is controlled using the Flux Constrained Transport method.
Fort Dix Remedial Investigation/Feasibility Study for MAG-1 Area
1994-01-01
by PID headspace results or odor ), samples should be diluted to bring the target compound concentrations within the instrument calibration range...Conductivity Testing ................... 2-38 2.9 ANALYTICAL PROCEDURES FOR FIELD SCREENING SAMPLES .. 2-38 2.9.1 Volatile Organic Compounds ...ANALYSIS OF VOLATILE ORGANIC COMPOUNDS BY FIELD GAS CHROMATOGRAPHY - STANDARD OPERATING PROCEDURE APPENDIX B RDX EXPLOSIVES FIELD TEST KIT PROCEDURES
Testing of Visual Field with Virtual Reality Goggles in Manual and Visual Grasp Modes
Wroblewski, Dariusz; Francis, Brian A.; Sadun, Alfredo; Vakili, Ghazal; Chopra, Vikas
2014-01-01
Automated perimetry is used for the assessment of visual function in a variety of ophthalmic and neurologic diseases. We report development and clinical testing of a compact, head-mounted, and eye-tracking perimeter (VirtualEye) that provides a more comfortable test environment than the standard instrumentation. VirtualEye performs the equivalent of a full threshold 24-2 visual field in two modes: (1) manual, with patient response registered with a mouse click, and (2) visual grasp, where the eye tracker senses change in gaze direction as evidence of target acquisition. 59 patients successfully completed the test in manual mode and 40 in visual grasp mode, with 59 undergoing the standard Humphrey field analyzer (HFA) testing. Large visual field defects were reliably detected by VirtualEye. Point-by-point comparison between the results obtained with the different modalities indicates: (1) minimal systematic differences between measurements taken in visual grasp and manual modes, (2) the average standard deviation of the difference distributions of about 5 dB, and (3) a systematic shift (of 4–6 dB) to lower sensitivities for VirtualEye device, observed mostly in high dB range. The usability survey suggested patients' acceptance of the head-mounted device. The study appears to validate the concepts of a head-mounted perimeter and the visual grasp mode. PMID:25050326
DOT National Transportation Integrated Search
2008-03-14
This report contains the results, findings and conclusions generated from the evaluation and field testing of a specific subset of ITS Standards applicable to the center-to-center exchange of advanced traveler information as deployed by the Nebraska ...
R&D Priorities for Educational Testing and Evaluation: The Testimony of the CRESST National Faculty.
ERIC Educational Resources Information Center
Herman, Joan L., Ed.
At the 1989 meeting of the National Faculty of the Center for Research on Evaluation, Standards, and Student Testing (CRESST), faculty members were invited to present testimony on what they viewed as the most pressing research and policy issues in the fields of testing, evaluation, and standards. These views are expressed in this document,…
Adaptive Set-Based Methods for Association Testing.
Su, Yu-Chen; Gauderman, William James; Berhane, Kiros; Lewinger, Juan Pablo
2016-02-01
With a typical sample size of a few thousand subjects, a single genome-wide association study (GWAS) using traditional one single nucleotide polymorphism (SNP)-at-a-time methods can only detect genetic variants conferring a sizable effect on disease risk. Set-based methods, which analyze sets of SNPs jointly, can detect variants with smaller effects acting within a gene, a pathway, or other biologically relevant sets. Although self-contained set-based methods (those that test sets of variants without regard to variants not in the set) are generally more powerful than competitive set-based approaches (those that rely on comparison of variants in the set of interest with variants not in the set), there is no consensus as to which self-contained methods are best. In particular, several self-contained set tests have been proposed to directly or indirectly "adapt" to the a priori unknown proportion and distribution of effects of the truly associated SNPs in the set, which is a major determinant of their power. A popular adaptive set-based test is the adaptive rank truncated product (ARTP), which seeks the set of SNPs that yields the best-combined evidence of association. We compared the standard ARTP, several ARTP variations we introduced, and other adaptive methods in a comprehensive simulation study to evaluate their performance. We used permutations to assess significance for all the methods and thus provide a level playing field for comparison. We found the standard ARTP test to have the highest power across our simulations followed closely by the global model of random effects (GMRE) and a least absolute shrinkage and selection operator (LASSO)-based test. © 2015 WILEY PERIODICALS, INC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hacke, P.
2012-03-01
Over the past decade, there have been observations of module degradation and power loss because of the stress that system voltage bias exerts. This results in part from qualification tests and standards note adequately evaluating for the durability of modules to the long-term effects of high voltage bias that they experience in fielded arrays. This talk deals with factors for consideration, progress, and information still needed for a standardized test for degradation due to system voltage stress.
ERIC Educational Resources Information Center
Tumthong, Suwut; Piriyasurawong, Pullop; Jeerangsuwan, Namon
2016-01-01
This research proposes a functional competency development model for academic personnel based on international professional qualification standards in computing field and examines the appropriateness of the model. Specifically, the model consists of three key components which are: 1) functional competency development model, 2) blended training…
Zhao, Yuan; Stepto, Hannah; Schneider, Christian K
2017-01-01
Gene therapy is a rapidly evolving field. So far, there have been >2,400 gene therapy products in clinical trials and four products on the market. A prerequisite for producing gene therapy products is ensuring their quality and safety. This requires appropriately controlled and standardized production and testing procedures that result in consistent safety and efficacy. Assuring the quality and safety of lentivirus-based gene therapy products in particular presents a great challenge because they are cell-based multigene products that include viral and therapeutic proteins as well as modified cells. In addition to the continuous refinement of a product, changes in production sites and manufacturing processes have become more and more common, posing challenges to developers regarding reproducibility and comparability of results. This paper discusses the concept of developing a first World Health Organization International Standard, suitable for the standardization of assays and enabling comparison of cross-trial and cross-manufacturing results for this important vector platform. The standard will be expected to optimize the development of gene therapy medicinal products, which is especially important, given the usually orphan nature of the diseases to be treated, naturally hampering reproducibility and comparability of results. PMID:28747142
Di Gianfilippo, Martina; Hyks, Jiri; Verginelli, Iason; Costa, Giulia; Hjelmar, Ole; Lombardi, Francesco
2018-03-01
Several types of standardized laboratory leaching tests have been developed during the past few decades to evaluate the leaching behaviour of waste materials as a function of different parameters, such as the pH of the eluate and the liquid to solid ratio. However, the link between the results of these tests and leaching data collected from the field (e.g. in disposal or reuse scenarios) is not always straightforward. In this work, we compare data obtained from an on-going large scale field trial, in which municipal solid waste incineration bottom ash is being tested as road sub-base material, with the results obtained from percolation column and pH-dependence laboratory leaching tests carried out on the bottom ash at the beginning of the test. The comparisons reported in this paper show that for soluble substances (e.g. Cl, K and SO 4 ), percolation column tests can provide a good indication of the release expected in the field with deviations usually within a factor of 3. For metals characterized by a solubility-controlled release, i.e. that depends more on eluate pH than the liquid to solid ratio applied, the results of pH-dependence tests describe more accurately the eluate concentration trends observed in the field with deviations that in most cases (around 80%) are within one order of magnitude (see e.g. Al and Cd). The differences between field and lab-scale data might be in part ascribed to the occurrence in the field of weathering reactions (e.g. carbonation) but also to microbial decomposition of organic matter that modifying leachate pH affect the solubility of several constituents (e.g. Ca, Ba and Cr). Besides, weathering reactions can result in enhanced adsorption of fulvic acids to iron/aluminum (hydr)oxides, leading to a decrease in the leaching of fulvic acids and hence of elements such as Cu, Ni and Pb that strongly depend on DOC leaching. Overall, this comparison shows that percolation column tests and pH-dependence tests can represent a reliable screening tool to derive data that could be employed in risk-based analysis or life cycle assessment (LCA) frameworks for evaluating potential environmental impacts deriving from specific disposal/reuse options for waste materials. Copyright © 2017 Elsevier Ltd. All rights reserved.
Lee, Eun Gyung; Nelson, John H.; Kashon, Michael L.; Harper, Martin
2015-01-01
A Japanese round-robin study revealed that analysts who used a dark-medium (DM) objective lens reported higher fiber counts from American Industrial Hygiene Association (AIHA) Proficiency Analytical Testing (PAT) chrysotile samples than those using a standard objective lens, but the cause of this difference was not investigated at that time. The purpose of this study is to determine any major source of this difference by performing two sets of round-robin studies. For the first round-robin study, 15 AIHA PAT samples (five each of chrysotile and amosite generated by water-suspended method, and five chrysotile generated by aerosolization method) were prepared with relocatable cover slips and examined by nine laboratories. A second round-robin study was then performed with six chrysotile field sample slides by six out of nine laboratories who participated in the first round-robin study. In addition, two phase-shift test slides to check analysts’ visibility and an eight-form diatom test plate to compare resolution between the two objectives were examined. For the AIHA PAT chrysotile reference slides, use of the DM objective resulted in consistently higher fiber counts (1.45 times for all data) than the standard objective (P-value < 0.05), regardless of the filter generation (water-suspension or aerosol) method. For the AIHA PAT amosite reference and chrysotile field sample slides, the fiber counts between the two objectives were not significantly different. No statistically significant differences were observed in the visibility of blocks of the test slides between the two objectives. Also, the DM and standard objectives showed no pattern of differences in viewing the fine lines and/or dots of each species images on the eight-form diatom test plate. Among various potential factors that might affect the analysts’ performance of fiber counts, this study supports the greater contrast caused by the different phase plate absorptions as the main cause of high counts for the AIHA PAT chrysotile slides using the DM objective. The comparison of fiber count ratios (DM/standard) between the AIHA PAT chrysotile samples and chrysotile field samples indicates that there is a fraction of fibers in the PAT samples approaching the theoretical limit of visibility of the phase-contrast microscope with 3-degree phase-shift. These fibers become more clearly visible through the greater contrast from the phase plate absorption of the DM objective. However, as such fibers are not present in field samples, no difference in counts between the two objectives was observed in this study. The DM objective, therefore, could be allowed for routine fiber counting as it will maintain continuity with risk assessments based on earlier phase-contrast microscopy fiber counts from field samples. Published standard methods would need to be modified to allow a higher aperture specification for the objective. PMID:25737333
ERIC Educational Resources Information Center
de Avila, Edward A.; Havassy, Barbara
Approximately 1,225 Mexican American and Anglo American children in grades 1-6 (ages 6-14) from California, Colorado, New Mexico, and Texas were tested using school achievement and IQ standardized tests and four Piagetian-derived measures (Cartoon Conservation Scales, Water Level Task, Figural Intersection Test, and Serial Task). The field study's…
Applying a Genetic Algorithm to Reconfigurable Hardware
NASA Technical Reports Server (NTRS)
Wells, B. Earl; Weir, John; Trevino, Luis; Patrick, Clint; Steincamp, Jim
2004-01-01
This paper investigates the feasibility of applying genetic algorithms to solve optimization problems that are implemented entirely in reconfgurable hardware. The paper highlights the pe$ormance/design space trade-offs that must be understood to effectively implement a standard genetic algorithm within a modem Field Programmable Gate Array, FPGA, reconfgurable hardware environment and presents a case-study where this stochastic search technique is applied to standard test-case problems taken from the technical literature. In this research, the targeted FPGA-based platform and high-level design environment was the Starbridge Hypercomputing platform, which incorporates multiple Xilinx Virtex II FPGAs, and the Viva TM graphical hardware description language.
Landsat for practical forest type mapping - A test case
NASA Technical Reports Server (NTRS)
Bryant, E.; Dodge, A. G., Jr.; Warren, S. D.
1980-01-01
Computer classified Landsat maps are compared with a recent conventional inventory of forest lands in northern Maine. Over the 196,000 hectare area mapped, estimates of the areas of softwood, mixed wood and hardwood forest obtained by a supervised classification of the Landsat data and a standard inventory based on aerial photointerpretation, probability proportional to prediction, field sampling and a standard forest measurement program are found to agree to within 5%. The cost of the Landsat maps is estimated to be $0.065/hectare. It is concluded that satellite techniques are worth developing for forest inventories, although they are not yet refined enough to be incorporated into current practical inventories.
Assessment of soil compaction properties based on surface wave techniques
NASA Astrophysics Data System (ADS)
Jihan Syamimi Jafri, Nur; Rahim, Mohd Asri Ab; Zahid, Mohd Zulham Affandi Mohd; Faizah Bawadi, Nor; Munsif Ahmad, Muhammad; Faizal Mansor, Ahmad; Omar, Wan Mohd Sabki Wan
2018-03-01
Soil compaction plays an important role in every construction activities to reduce risks of any damage. Traditionally, methods of assessing compaction include field tests and invasive penetration tests for compacted areas have great limitations, which caused time-consuming in evaluating large areas. Thus, this study proposed the possibility of using non-invasive surface wave method like Multi-channel Analysis of Surface Wave (MASW) as a useful tool for assessing soil compaction. The aim of this study was to determine the shear wave velocity profiles and field density of compacted soils under varying compaction efforts by using MASW method. Pre and post compaction of MASW survey were conducted at Pauh Campus, UniMAP after applying rolling compaction with variation of passes (2, 6 and 10). Each seismic data was recorded by GEODE seismograph. Sand replacement test was conducted for each survey line to obtain the field density data. All seismic data were processed using SeisImager/SW software. The results show the shear wave velocity profiles increase with the number of passes from 0 to 6 passes, but decrease after 10 passes. This method could attract the interest of geotechnical community, as it can be an alternative tool to the standard test for assessing of soil compaction in the field operation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colle, R.; Hutchinson, J.M.R.; Kotrappa, P.
1995-11-01
The recently developed {sup 222}Rn emanation standards that are based on polyethylene-encapsulated {sup 226}Ra solutions were employed for a first field-measurement application test to demonstrate their efficacy in calibrating passive integral radon monitors. The performance of the capsules was evaluated with respect to the calibration needs of electret ionization chambers (E-PERM{reg_sign}, Rad Elec Inc.). The encapsulated standards emanate well-characterized and known quantities of {sup 222}Rn, and were used in two different-sized, relatively-small, accumulation vessels (about 3.6 L and 10 L) which also contained the deployed electret monitors under test. Calculated integral {sup 222}Rn activities from the capsules over various accumulationmore » times were compared to the averaged electret responses. Evaluations were made with four encapsulated standards ranging in {sup 226}Ra activity from approximately 15 Bq to 540 Bq (with {sup 222}Rn emanation fractions of 0.888); over accumulation times from 1 d to 33 d; and with four different types of E-PERM detectors that were independently calibrated. The ratio of the electret chamber response E{sub Rn} to the integral {sup 222}Rn activity I{sub Rn} was constant (within statistical variations) over the variables of the specific capsule used, the accumulation volume, accumulation time, and detector type. The results clearly demonstrated the practicality and suitability of the encapsulated standards for providing a simple and readily-available calibration for those measurement applications. However, the mean ratio E{sub Rn}/I{sub Rn} was approximately 0.91, suggesting a possible systematic bias in the extant E-PERM calibrations. This 9% systematic difference was verified by an independent test of the E-PERM calibration based on measurements with the NIST radon-in-water standard generator.« less
Using weighted power mean for equivalent square estimation.
Zhou, Sumin; Wu, Qiuwen; Li, Xiaobo; Ma, Rongtao; Zheng, Dandan; Wang, Shuo; Zhang, Mutian; Li, Sicong; Lei, Yu; Fan, Qiyong; Hyun, Megan; Diener, Tyler; Enke, Charles
2017-11-01
Equivalent Square (ES) enables the calculation of many radiation quantities for rectangular treatment fields, based only on measurements from square fields. While it is widely applied in radiotherapy, its accuracy, especially for extremely elongated fields, still leaves room for improvement. In this study, we introduce a novel explicit ES formula based on Weighted Power Mean (WPM) function and compare its performance with the Sterling formula and Vadash/Bjärngard's formula. The proposed WPM formula is ESWPMa,b=waα+1-wbα1/α for a rectangular photon field with sides a and b. The formula performance was evaluated by three methods: standard deviation of model fitting residual error, maximum relative model prediction error, and model's Akaike Information Criterion (AIC). Testing datasets included the ES table from British Journal of Radiology (BJR), photon output factors (S cp ) from the Varian TrueBeam Representative Beam Data (Med Phys. 2012;39:6981-7018), and published S cp data for Varian TrueBeam Edge (J Appl Clin Med Phys. 2015;16:125-148). For the BJR dataset, the best-fit parameter value α = -1.25 achieved a 20% reduction in standard deviation in ES estimation residual error compared with the two established formulae. For the two Varian datasets, employing WPM reduced the maximum relative error from 3.5% (Sterling) or 2% (Vadash/Bjärngard) to 0.7% for open field sizes ranging from 3 cm to 40 cm, and the reduction was even more prominent for 1 cm field sizes on Edge (J Appl Clin Med Phys. 2015;16:125-148). The AIC value of the WPM formula was consistently lower than its counterparts from the traditional formulae on photon output factors, most prominent on very elongated small fields. The WPM formula outperformed the traditional formulae on three testing datasets. With increasing utilization of very elongated, small rectangular fields in modern radiotherapy, improved photon output factor estimation is expected by adopting the WPM formula in treatment planning and secondary MU check. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
Gabbett, Tim J; Carius, Josh; Mulvey, Mike
2008-11-01
This study investigated the effects of video-based perceptual training on pattern recognition and pattern prediction ability in elite field sport athletes and determined whether enhanced perceptual skills influenced the physiological demands of game-based activities. Sixteen elite women soccer players (mean +/- SD age, 18.3 +/- 2.8 years) were allocated to either a video-based perceptual training group (N = 8) or a control group (N = 8). The video-based perceptual training group watched video footage of international women's soccer matches. Twelve training sessions, each 15 minutes in duration, were conducted during a 4-week period. Players performed assessments of speed (5-, 10-, and 20-m sprint), repeated-sprint ability (6 x 20-m sprints, with active recovery on a 15-second cycle), estimated maximal aerobic power (V O2 max, multistage fitness test), and a game-specific video-based perceptual test of pattern recognition and pattern prediction before and after the 4 weeks of video-based perceptual training. The on-field assessments included time-motion analysis completed on all players during a standardized 45-minute small-sided training game, and assessments of passing, shooting, and dribbling decision-making ability. No significant changes were detected in speed, repeated-sprint ability, or estimated V O2 max during the training period. However, video-based perceptual training improved decision accuracy and reduced the number of recall errors, indicating improved game awareness and decision-making ability. Importantly, the improvements in pattern recognition and prediction ability transferred to on-field improvements in passing, shooting, and dribbling decision-making skills. No differences were detected between groups for the time spent standing, walking, jogging, striding, and sprinting during the small-sided training game. These findings demonstrate that video-based perceptual training can be used effectively to enhance the decision-making ability of field sport athletes; however, it has no effect on the physiological demands of game-based activities.
Field ion spectrometry: a new technology for cocaine and heroin detection
NASA Astrophysics Data System (ADS)
Carnahan, Byron L.; Day, Stephen; Kouznetsov, Viktor; Tarassov, Alexandre
1997-02-01
Field ion spectrometry, also known as transverse field compensation ion mobility spectrometry, is a new technique for trace gas analysis that can be applied to the detection of cocaine and heroin. Its principle is based on filtering ion species according to the functional dependence of their mobilities with electric field strength. Field ion spectrometry eliminates the gating electrodes needed in conventional IMS to pulse ions into the spectrometer; instead, ions are injected in to the spectrometer and reach the detector continuously, resulting in improved sensitivity. The technique enables analyses that are difficult with conventional constant field strength ion mobility spectrometers. We have shown that a filed ion spectrometer can selectively detect the vapors from cocaine and heroin emitted from both their base and hydrochloride forms. The estimated volumetric limits of detection are in the low pptv range, based on testing with standardized drug vapor generation systems. The spectrometer can detect cocaine base in the vapor phase, at concentrations well below its estimated 100 pptv vapor pressure equivalent at 20 degrees C. This paper describes the underlying principles of field ion spectrometry in relation to narcotic drug detection, and recent results obtained for cocaine and heroin. The work has been sponsored in part by the United States Advanced Research Projects Agency under contract DAAB10-95C-0004, for the DOD Counterdrug Technology Development Program.
Geue, Lutz; Stieber, Bettina; Monecke, Stefan; Engelmann, Ines; Gunzer, Florian; Slickers, Peter; Braun, Sascha D; Ehricht, Ralf
2014-08-01
In this study, we developed a new rapid, economic, and automated microarray-based genotyping test for the standardized subtyping of Shiga toxins 1 and 2 of Escherichia coli. The microarrays from Alere Technologies can be used in two different formats, the ArrayTube and the ArrayStrip (which enables high-throughput testing in a 96-well format). One microarray chip harbors all the gene sequences necessary to distinguish between all Stx subtypes, facilitating the identification of single and multiple subtypes within a single isolate in one experiment. Specific software was developed to automatically analyze all data obtained from the microarray. The assay was validated with 21 Shiga toxin-producing E. coli (STEC) reference strains that were previously tested by the complete set of conventional subtyping PCRs. The microarray results showed 100% concordance with the PCR results. Essentially identical results were detected when the standard DNA extraction method was replaced by a time-saving heat lysis protocol. For further validation of the microarray, we identified the Stx subtypes or combinations of the subtypes in 446 STEC field isolates of human and animal origin. In summary, this oligonucleotide array represents an excellent diagnostic tool that provides some advantages over standard PCR-based subtyping. The number of the spotted probes on the microarrays can be increased by additional probes, such as for novel alleles, species markers, or resistance genes, should the need arise. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
NASA Astrophysics Data System (ADS)
Rohmanu, Ajar; Everhard, Yan
2017-04-01
A technological development, especially in the field of electronics is very fast. One of the developments in the electronics hardware device is Flexible Flat Cable (FFC), which serves as a media liaison between the main boards with other hardware parts. The production of Flexible Flat Cable (FFC) will go through the process of testing and measuring of the quality Flexible Flat Cable (FFC). Currently, the testing and measurement is still done manually by observing the Light Emitting Diode (LED) by the operator, so there were many problems. This study will be made of test quality Flexible Flat Cable (FFC) computationally utilize Open Source Embedded System. The method used is the measurement with Short Open Test method using Ohm’s Law approach to 4-wire (Kelvin) and fuzzy logic as a decision maker measurement results based on Open Source Arduino Data Logger. This system uses a sensor current INA219 as a sensor to read the voltage value thus obtained resistance value Flexible Flat Cable (FFC). To get a good system we will do the Black-box testing as well as testing the accuracy and precision with the standard deviation method. In testing the system using three models samples were obtained the test results in the form of standard deviation for the first model of 1.921 second model of 4.567 and 6.300 for the third model. While the value of the Standard Error of Mean (SEM) for the first model of the model 0.304 second at 0.736 and 0.996 of the third model. In testing this system, we will also obtain the average value of the measurement tolerance resistance values for the first model of - 3.50% 4.45% second model and the third model of 5.18% with the standard measurement of prisoners and improve productivity becomes 118.33%. From the results of the testing system is expected to improve the quality and productivity in the process of testing Flexible Flat Cable (FFC).
Wall, Michael; Woodward, Kimberly R; Doyle, Carrie K; Artes, Paul H
2009-02-01
Standard automated perimetry (SAP) shows a marked increase in variability in damaged areas of the visual field. This study was conducted to test the hypothesis that larger stimuli are associated with more uniform variability, by investigating the retest variability of four perimetry tests: standard automated perimetry size III (SAP III), with the SITA standard strategy; SAP size V (SAP V), with the full-threshold strategy; Matrix (FDT II), and Motion perimetry. One eye each of 120 patients with glaucoma was examined on the same day with these four perimetric tests and retested 1 to 8 weeks later. The decibel scales were adjusted to make the test's scales numerically similar. Retest variability was examined by establishing the distributions of retest threshold estimates, for each threshold level observed at the first test. The 5th and 95th percentiles of the retest distribution were used as point-wise limits of retest variability. Regression analyses were performed to quantify the relationship between visual field sensitivity and variability. With SAP III, the retest variability increased substantially with reducing sensitivity. Corresponding increases with SAP V, Matrix, and Motion perimetry were considerably smaller or absent. With SAP III, sensitivity explained 22% of the retest variability (r(2)), whereas corresponding data for SAP V, Matrix, and Motion perimetry were 12%, 2%, and 2%, respectively. Variability of Matrix and Motion perimetry does not increase as substantially as that of SAP III in damaged areas of the visual field. Increased sampling with the larger stimuli of these techniques is the likely explanation for this finding. These properties may make these stimuli excellent candidates for early detection of visual field progression.
Products of steel slags an opportunity to save natural resources.
Motz, H; Geiseler, J
2001-01-01
In Germany, and in the most industrial countries, the use of blast furnace and steel slags as an aggregate for civil engineering, for metallurgical use and as fertiliser has a very long tradition. Since the introduction of the basic oxygen steel making furnace (BOF) process and the electric arc furnace (EAF) process the German steel industry started extensive research on the development of fields of application for BOF and EAF slags. These investigations have been mainly performed by Forschungsgemeinschaft Eisenhüttenschlacken e. V. (FEhS), the Research Association for blast furnace and steel slags. Today steel slags are well characterised and long-term experienced materials mainly used as aggregates for road construction (e.g. asphaltic or unbound layers), as armour-stones for hydraulic engineering constructions (e.g. stabilisation of shores), and as fertiliser for agriculture purposes. These multifarious fields of application could only be achieved because the steelworks influence the quality of slags by a careful selection of raw materials and a suitable process route. Furthermore, subsequent procedures like a treatment of the liquid slag, an appropriate heat treatment and a suitable processing have been developed to ensure that the quality of steel slags is always adequate for the end use. Depending on the respective field of application, the suitability of steel slags has to be proven by determining the technical properties, as well as the environmental compatibility. For this reason test methods have been developed to evaluate the technical properties especially the volume stability and the environmental behaviour. To evaluate the volume stability a suitable test (steam test) has been developed and the results from laboratory tests were compared with the behaviour of steel slags under practical conditions, e.g. in a road. To determine the environmental behaviour leaching tests have been developed. In the meanwhile most of these test methods are drafted or already accepted as a CEN standard and are used for a continuous quality control. Usually the suitability of steel slags is stated by fulfilling the requirements of national and/or international standards and regulations. Based on these standards and regulations in Germany in 1998 about 97% of the produced steel slags have been used as aggregates for road construction (e.g. as surface layer, road base and sub base for high trafficked roads), ways, earthworks, and armourstones for hydraulic structures. Consistent to the successful long-term experience not only products of steel slags but also products of blast furnace slags have been eliminated from the European Waste Catalogue and the European Shipment of Waste Regulation of the European Community, as well as from the lists of OECD for transfrontier movements by the decision of the OECD-Council from 21 September, 1995.
Comparison of electric field exposure monitoring instrumentation. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bracken, T.D.
1985-06-01
Electric field exposure monitoring instrumentation was compared and evaluated during three days of tests performed in 60-Hz electric fields. A conducting vest exposure meter and a small electric field exposure meter (EFEM) located in a shirt pocket, arm band or hard hat were compared in a series of static and dynamic tests. In some tests, the devices were worn simultaneously without interference to provide separate measures of identical exposure. Tests with stationary subjects wearing the instruments were used to measure the effects of grounding, and to establish the meter response in a standard posture for each subject. Dynamic occupational exposuremore » simulations were used to compare accumulated measurements of exposure between instruments and to compare measurements with predicted exposures. The simulations were based on analysis of the work-related behavior of substation electricians and operators. Electrician's tasks at ground level and in a bucket truck were simulated near an energized test line. A simulated substation inspection was performed in a 230 kV substation. The exposure measurements demonstrated an overall consistency between the meters. The vest demonstrated less intersubject variability and less detailed exposure characterization. Measurements with the shirt pocket EFEM were below those made with the vest and with the EFEM in other locations. Insulation provided by shoe soles appeared to be the largest factor in reducing measured exposures during the substation inspection below those predicted from the unperturbed field. Improvements in meter design and additional measurements are suggested. 11 refs., 20 figs., 28 tabs.« less
ERIC Educational Resources Information Center
Seaberg, James R.; And Others
The National Center on Child Abuse and Neglect funded a project to develop and field-test an evaluation procedure that could be used by interested states or communities to determine the extent of congruity between (1) their provisions for responding to the problems of child abuse and neglect, and (2) provisions prescribed in the Federal Standards…
The accuracy of confrontation visual field test in comparison with automated perimetry.
Johnson, L. N.; Baloh, F. G.
1991-01-01
The accuracy of confrontation visual field testing was determined for 512 visual fields using automated static perimetry as the reference standard. The sensitivity of confrontation testing excluding patchy defects was 40% for detecting anterior visual field defects, 68.3% for posterior defects, and 50% for both anterior and posterior visual field defects combined. The sensitivity within each group varied depending on the type of visual field defect encountered. Confrontation testing had a high sensitivity (75% to 100%) for detecting altitudinal visual loss, central/centrocecal scotoma, and homonymous hemianopsia. Confrontation testing was fairly insensitive (20% to 50% sensitivity) for detecting arcuate scotoma and bitemporal hemianopsia. The specificity of confrontation testing was high at 93.4%. The high positive predictive value (72.6%) and negative predictive value (75.7%) would indicate that visual field defects identified during confrontation testing are often true visual field defects. However, the many limitations of confrontation testing should be remembered, particularly its low sensitivity for detecting visual field loss associated with parasellar tumors, glaucoma, and compressive optic neuropathies. PMID:1800764
ERIC Educational Resources Information Center
Needham, Martha Elaine
2010-01-01
This research compares differences between standardized test scores in problem-based learning (PBL) classrooms and a traditional classroom for 6th grade students using a mixed-method, quasi-experimental and qualitative design. The research shows that problem-based learning is as effective as traditional teaching methods on standardized tests. The…
NASA Technical Reports Server (NTRS)
Polzin, Kurt A.; Hill, Carrie S.; Turchi, Peter J.; Burton, Rodney L.; Messer, Sarah; Lovberg, Ralph H.; Hallock, Ashley K.
2013-01-01
Inductive magnetic field probes (also known as B-dot probes and sometimes as B-probes or magnetic probes) are often employed to perform field measurements in electric propulsion applications where there are time-varying fields. Magnetic field probes provide the means to measure these magnetic fields and can even be used to measure the plasma current density indirectly through the application of Ampere's law. Measurements of this type can yield either global information related to a thruster and its performance or detailed, local data related to the specific physical processes occurring in the plasma. Results of the development of a standard for B-dot probe measurements are presented, condensing the available literature on the subject into an accessible set of rules, guidelines, and techniques to standardize the performance and presentation of future measurements.
Tenebrio beetles use magnetic inclination compass
NASA Astrophysics Data System (ADS)
Vácha, Martin; Drštková, Dana; Půžová, Tereza
2008-08-01
Animals that guide directions of their locomotion or their migration routes by the lines of the geomagnetic field use either polarity or inclination compasses to determine the field polarity (the north or south direction). Distinguishing the two compass types is a guideline for estimation of the molecular principle of reception and has been achieved for a number of animal groups, with the exception of insects. A standard diagnostic method to distinguish a compass type is based on reversing the vertical component of the geomagnetic field, which leads to the opposite reactions of animals with two different compass types. In the present study, adults of the mealworm beetle Tenebrio molitor were tested by means of a two-step laboratory test of magnetoreception. Beetles that were initially trained to memorize the magnetic position of the light source preferred, during the subsequent test, this same direction, pursuant geomagnetic cues only. In the following step, the vertical component was reversed between the training and the test. The beetles significantly turned their preferred direction by 180°. Our results brought until then unknown original findings that insects, represented here by the T. molitor species, use—in contrast to another previously researched Arthropod, spiny lobster—the inclination compass.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hacke, Peter; Lokanath, Sumanth; Williams, Paul
Data indicate that the inverter is the element of the photovoltaic plant that has the highest number of service calls and the greatest operation and maintenance cost burden. This paper describes the projects and relevant background needed in developing design qualification standards that would serve to establish a minimum level of reliability, along with a review of photovoltaic inverter quality and safety standards, most of which are in their infancy. We compare stresses and levels for accelerated testing of inverters proposed in the standard drafts, and those proposed by manufacturers and purchasers of inverters. We also review bases for themore » methods, stress types, and stress levels for durability testing of key inverter components. Many of the test protocols appear to need more comprehensive inclusion of stress factors existing in the natural environment such as wind driven rain, dust, and grid disturbances. Further understanding of how temperature, humidity ingress, and voltage bias affect the inverters and their components is also required. We provide data indicating inconsistent quality of the inverters and the durability of components leading to greater cost for the photovoltaic plant operator. Accordingly, the recommendation for data collection within quality standards for obtaining cost of ownership metrics is made. Design validation testing using realistic operation, environmental, and connection conditions, including under end-use field conditions with feedback for continuous improvement is recommended for inclusion within a quality standard.« less
Hacke, Peter; Lokanath, Sumanth; Williams, Paul; ...
2017-10-10
Data indicate that the inverter is the element of the photovoltaic plant that has the highest number of service calls and the greatest operation and maintenance cost burden. This paper describes the projects and relevant background needed in developing design qualification standards that would serve to establish a minimum level of reliability, along with a review of photovoltaic inverter quality and safety standards, most of which are in their infancy. We compare stresses and levels for accelerated testing of inverters proposed in the standard drafts, and those proposed by manufacturers and purchasers of inverters. We also review bases for themore » methods, stress types, and stress levels for durability testing of key inverter components. Many of the test protocols appear to need more comprehensive inclusion of stress factors existing in the natural environment such as wind driven rain, dust, and grid disturbances. Further understanding of how temperature, humidity ingress, and voltage bias affect the inverters and their components is also required. We provide data indicating inconsistent quality of the inverters and the durability of components leading to greater cost for the photovoltaic plant operator. Accordingly, the recommendation for data collection within quality standards for obtaining cost of ownership metrics is made. Design validation testing using realistic operation, environmental, and connection conditions, including under end-use field conditions with feedback for continuous improvement is recommended for inclusion within a quality standard.« less
Optimizing hydraulic fracture design in the diatomite formation, Lost Hills Field
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, D.G.; Klins, M.A.; Manrique, J.F.
1996-12-31
Since 1988, over 1.3 billion pounds of proppant have been placed in the Lost Hills Field of Kern County. California in over 2700 hydraulic fracture treatments involving investments of about $150 million. In 1995, systematic reevaluation of the standard, field trial-based fracture design began. Reservoir, geomechanical, and hydraulic fracture characterization; production and fracture modeling; sensitivity analysis; and field test results were integrated to optimize designs with regard to proppant volume, proppant ramps, and perforating strategy. The results support a reduction in proppant volume from 2500 to 1700 lb/ft which will save about $50,000 per well, totalling over $3 million permore » year. Vertical coverage was found to be a key component of fracture quality which could be optimized by eliminating perforations from lower stress intervals, reducing the total number of perforations, and reducing peak slurry loading from 16 to 12 ppa. A relationship between variations in lithology, pore pressure, and stress was observed. Point-source, perforating strategies were investigated and variable multiple fracture behavior was observed. The discussed approach has application in areas where stresses are variable; pay zones are thick; hydraulic fracture design is based primarily on empirical, trial-and-error field test results; and effective, robust predictive models involving real-data feedback have not been incorporated into the design improvement process.« less
Web-based health care agents; the case of reminders and todos, too (R2Do2).
Silverman, B G; Andonyadis, C; Morales, A
1998-11-01
This paper describes efforts to develop and field an agent-based, healthcare middleware framework that securely connects practice rule sets to patient records to anticipate health todo items and to remind and alert users about these items over the web. Reminders and todos, too (R2Do2) is an example of merging data- and document-centric architectures, and of integrating agents into patient-provider collaboration environments. A test of this capability verifies that R2Do2 is progressing toward its two goals: (1) an open standards framework for middleware in the healthcare field; and (2) an implementation of the 'principle of optimality' to derive the best possible health plans for each user. This paper concludes with lessons learned to date.
NASA Astrophysics Data System (ADS)
Leka, K. D.; Barnes, G.
2003-10-01
We apply statistical tests based on discriminant analysis to the wide range of photospheric magnetic parameters described in a companion paper by Leka & Barnes, with the goal of identifying those properties that are important for the production of energetic events such as solar flares. The photospheric vector magnetic field data from the University of Hawai'i Imaging Vector Magnetograph are well sampled both temporally and spatially, and we include here data covering 24 flare-event and flare-quiet epochs taken from seven active regions. The mean value and rate of change of each magnetic parameter are treated as separate variables, thus evaluating both the parameter's state and its evolution, to determine which properties are associated with flaring. Considering single variables first, Hotelling's T2-tests show small statistical differences between flare-producing and flare-quiet epochs. Even pairs of variables considered simultaneously, which do show a statistical difference for a number of properties, have high error rates, implying a large degree of overlap of the samples. To better distinguish between flare-producing and flare-quiet populations, larger numbers of variables are simultaneously considered; lower error rates result, but no unique combination of variables is clearly the best discriminator. The sample size is too small to directly compare the predictive power of large numbers of variables simultaneously. Instead, we rank all possible four-variable permutations based on Hotelling's T2-test and look for the most frequently appearing variables in the best permutations, with the interpretation that they are most likely to be associated with flaring. These variables include an increasing kurtosis of the twist parameter and a larger standard deviation of the twist parameter, but a smaller standard deviation of the distribution of the horizontal shear angle and a horizontal field that has a smaller standard deviation but a larger kurtosis. To support the ``sorting all permutations'' method of selecting the most frequently occurring variables, we show that the results of a single 10-variable discriminant analysis are consistent with the ranking. We demonstrate that individually, the variables considered here have little ability to differentiate between flaring and flare-quiet populations, but with multivariable combinations, the populations may be distinguished.
Cognitive Learning Strategy as a Partial Effect on Major Field Test in Business Results
ERIC Educational Resources Information Center
Strang, Kenneth David
2014-01-01
An experiment was developed to determine if cognitive learning strategies improved standardized university business exam results. Previous studies revealed that factors such as prior ability, age, gender, and culture predicted a student's Major Field Test in Business (MFTB) score better than course content. The experiment control consisted of…
Dombo, Eileen A; Bass, Ami P
2014-01-01
In practice with adult women who survived childhood sexual abuse, the field of social work currently lacks an evidence-based intervention. The current interventions, from the 1990s, come primarily from psychologists. The hypothesis that the Feminist-Cognitive-Relational Social Work Model and Intervention will be more effective in decreasing cognitive distortions, and increasing intimacy and relational health when compared to the standard agency intervention was tested in a quasi-experimental study. The challenges in carrying out the study in small, non-profit organizations are explored to highlight the difficulties in developing evidence-based interventions. Changes to implementation that resulted from the research findings are discussed.
Singh, Manav Deep; Jain, Kanika
2017-11-01
To find out whether 30-2 Swedish Interactive Threshold Algorithm (SITA) Fast is comparable to 30-2 SITA Standard as a tool for perimetry among the patients with intracranial tumors. This was a prospective cross-sectional study involving 80 patients aged ≥18 years with imaging proven intracranial tumors and visual acuity better than 20/60. The patients underwent multiple visual field examinations using the two algorithms till consistent and repeatable results were obtained. A total of 140 eyes of 80 patients were analyzed. Almost 60% of patients undergoing perimetry with SITA Standard required two or more sessions to obtain consistent results, whereas the same could be obtained in 81.42% with SITA Fast in the first session itself. Of 140 eyes, 70 eyes had recordable field defects and the rest had no defects as detected by either of the two algorithms. Mean deviation (MD) (P = 0.56), pattern standard deviation (PSD) (P = 0.22), visual field index (P = 0.83) and number of depressed points at P < 5%, 2%, 1%, and 0.5% on MD and PSD probability plots showed no statistically significant difference between two algorithms. Bland-Altman test showed that considerable variability existed between two algorithms. Perimetry performed by SITA Standard and SITA Fast algorithm of Humphrey Field Analyzer gives comparable results among the patients of intracranial tumors. Being more time efficient and with a shorter learning curve, SITA Fast my be recommended as a standard test for the purpose of perimetry among these patients.
Lin, Yuehe; Bennett, Wendy D.; Timchalk, Charles; Thrall, Karla D.
2004-03-02
Microanalytical systems based on a microfluidics/electrochemical detection scheme are described. Individual modules, such as microfabricated piezoelectrically actuated pumps and a microelectrochemical cell were integrated onto portable platforms. This allowed rapid change-out and repair of individual components by incorporating "plug and play" concepts now standard in PC's. Different integration schemes were used for construction of the microanalytical systems based on microfluidics/electrochemical detection. In one scheme, all individual modules were integrated in the surface of the standard microfluidic platform based on a plug-and-play design. Microelectrochemical flow cell which integrated three electrodes based on a wall-jet design was fabricated on polymer substrate. The microelectrochemical flow cell was then plugged directly into the microfluidic platform. Another integration scheme was based on a multilayer lamination method utilizing stacking modules with different functionality to achieve a compact microanalytical device. Application of the microanalytical system for detection of lead in, for example, river water and saliva samples using stripping voltammetry is described.
Development of a rapid antibody test for point-of-care diagnosis of animal African trypanosomosis.
Boulangé, Alain; Pillay, Davita; Chevtzoff, Cyrille; Biteau, Nicolas; Comé de Graça, Vanessa; Rempeters, Leonie; Theodoridis, Dimitrios; Baltz, Théo
2017-01-15
Trypanosoma congolense and T. vivax are the main causative agents of animal African trypanosomosis (AAT), a disease which hinders livestock production throughout sub-Saharan Africa and in some parts of South America. Although two trypanocidal drugs are currently available, the level of treatment is low due to the difficulty in diagnosing the disease in the field. The major clinical signs of AAT such as anaemia, weight loss, and infertility, are common to several other endemic livestock diseases. Current diagnostic methods, based on the visualization of the parasite in the blood, or on the detection of its DNA or the antibodies it triggers in the host, are not suitable for direct use in the field as they require specialized equipment and personnel. Thus, we developed a quick-format diagnostic test (15min) based on the recombinant TcoCB and TvGM6 antigens for detection of T. congolense and T. vivax, respectively, aimed at providing farmers and veterinarians in the field with the means to conduct a quick diagnosis. The specificity and sensitivity of the test were evaluated using sera from experimentally infected cattle, and fresh blood when possible. The prototype, which includes both antigens, shows a specificity of 95.9 (95% C.I., 90.4%-100%) and a sensitivity of 92.0% (95% C.I., 85.9%-98.1%) for T. congolense and 98.2% (95% C.I., 94.7%-100%) for T. vivax. The high levels of sensitivity and specificity of this rapid test, the possibility of using directly whole blood, and the ease of interpreting the result, all contribute to make of this test a valuable candidate to contribute to the control of AAT in the field. However, further tests with more representative, numerous and fresh reference samples are necessary in order to compare this test with the ELISA, the current gold standard serological test for trypanosomosis. Copyright © 2016 Elsevier B.V. All rights reserved.
Demonstration and Commercialization of the Sediment Ecosystem Assessment Protocol (SEAP)
2017-07-09
undergone severe erosion (Peeling 1975). Zuniga Jetty, which runs parallel to Point Loma at the bay’s inlet, was built to control erosion near the inlet...consistent conditions and level of effort required to run the tests. A per site unit cost is less amenable to a field-based deployment, given the many...support in situ tetsing: 1) a standard exposure of spores to a reference toxicant dilutuion series; and 2) exposure of sporophyll blades to a
Testing Nonassociative Quantum Mechanics.
Bojowald, Martin; Brahma, Suddhasattwa; Büyükçam, Umut
2015-11-27
The familiar concepts of state vectors and operators in quantum mechanics rely on associative products of observables. However, these notions do not apply to some exotic systems such as magnetic monopoles, which have long been known to lead to nonassociative algebras. Their quantum physics has remained obscure. This Letter presents the first derivation of potentially testable physical results in nonassociative quantum mechanics, based on effective potentials. They imply new effects which cannot be mimicked in usual quantum mechanics with standard magnetic fields.
Polilli, Ennio; Sozio, Federica; Di Stefano, Paola; Clerico, Luigi; Di Iorio, Giancarlo; Parruti, Giustino
2018-04-01
This study aimed to analyze the efficacy of a Web-based testing programme in terms of the prevention of late HIV presentation. The clinical characteristics of patients diagnosed with HIV via the Web-based testing programme were compared to those of patients diagnosed in parallel via standard diagnostic care procedures. This study included the clinical and demographic data of newly diagnosed HIV patients enrolled at the study clinic between February 2014 and June 2017. These patients were diagnosed either via standard diagnostic procedures or as a result of the Web-based testing programme. Eighty-eight new cases of HIV were consecutively enrolled; their mean age was 39.1±13.0 years. Fifty-nine patients (67%) were diagnosed through standard diagnostic procedures and 29 (33%) patients came from the Web-based testing programme. Late presentation (62% vs. 34%, p=0.01) and AIDS-defining conditions at presentation (13 vs. 1, p=0.02) were significantly more frequent in the standard care group than in the Web-based group; four of 13 patients with AIDS diagnosed under standard diagnostic procedures died, versus none in the Web-based testing group (p<0.001). Web-based recruitment for voluntary and free HIV testing helped to diagnose patients with less advanced HIV disease and no risk of death, from all at-risk groups, in comparison with standard care testing. Copyright © 2018 The Authors. Published by Elsevier Ltd.. All rights reserved.
Bosker, Wendy M; Kuypers, Kim P C; Theunissen, Eef L; Surinx, Anke; Blankespoor, Roos J; Skopp, Gisela; Jeffery, Wayne K; Walls, H Chip; van Leeuwen, Cees J; Ramaekers, Johannes G
2012-10-01
The acute and chronic effects of dronabinol [medicinal Δ(9) -tetrahydrocannabinol (THC)] on actual driving performance and the Standard Field Sobriety Test (SFST) were assessed. It was hypothesized that occasional users would be impaired on these tests and that heavy users would show less impairment due to tolerance. Double-blind, placebo-controlled, randomized, three-way cross-over study. Twelve occasional and 12 heavy cannabis users (14 males/10 females) received single doses of placebo, 10 and 20 mg dronabinol. Standard deviation of lateral position (SDLP; i.e. weaving) is the primary measure of road-tracking control. Time to speed adaptation (TSA) is the primary reaction-time measure in the car-following test. Percentage of impaired individuals on the SFST and subjective high on a visual analogue scale were secondary measures. Superiority tests showed that SDLP (P = 0.008) and TSA (P = 0.011) increased after dronabinol in occasional users. Equivalence tests demonstrated that dronabinol-induced increments in SDLP were bigger than impairment associated with BAC of 0.5 mg/ml in occasional and heavy users, although the magnitude of driving impairment was generally less in heavy users. The SFST did not discriminate between conditions. Levels of subjective high were comparable in occasional and heavy users. Dronabinol (medicinal tetrahydrocannabinol) impairs driving performance in occasional and heavy users in a dose-dependent way, but to a lesser degree in heavy users due possibly to tolerance. The Standard Field Sobriety Test is not sensitive to clinically relevant driving impairment caused by oral tetrahydrocannabinol. © 2012 The Authors. Addiction © 2012 Society for the Study of Addiction.
McTrusty, Alice D; Cameron, Lorraine A; Perperidis, Antonios; Brash, Harry M; Tatham, Andrew J; Agarwal, Pankaj K; Murray, Ian C; Fleck, Brian W; Minns, Robert A
2017-09-01
We compared patterns of visual field loss detected by standard automated perimetry (SAP) to saccadic vector optokinetic perimetry (SVOP) and examined patient perceptions of each test. A cross-sectional study was done of 58 healthy subjects and 103 with glaucoma who were tested using SAP and two versions of SVOP (v1 and v2). Visual fields from both devices were categorized by masked graders as: 0, normal; 1, paracentral defect; 2, nasal step; 3, arcuate defect; 4, altitudinal; 5, biarcuate; and 6, end-stage field loss. SVOP and SAP classifications were cross-tabulated. Subjects completed a questionnaire on their opinions of each test. We analyzed 142 (v1) and 111 (v2) SVOP and SAP test pairs. SVOP v2 had a sensitivity of 97.7% and specificity of 77.9% for identifying normal versus abnormal visual fields. SAP and SVOP v2 classifications showed complete agreement in 54% of glaucoma patients, with a further 23% disagreeing by one category. On repeat testing, 86% of SVOP v2 classifications agreed with the previous test, compared to 91% of SAP classifications; 71% of subjects preferred SVOP compared to 20% who preferred SAP. Eye-tracking perimetry can be used to obtain threshold visual field sensitivity values in patients with glaucoma and produce maps of visual field defects, with patterns exhibiting close agreement to SAP. Patients preferred eye-tracking perimetry compared to SAP. This first report of threshold eye tracking perimetry shows good agreement with conventional automated perimetry and provides a benchmark for future iterations.
NASA Technical Reports Server (NTRS)
Schmitt, Jeff G.; Stahnke, Brian
2017-01-01
This report describes test results from an assessment of the acoustically treated 9x15 Foot Low Speed Wind Tunnel at the NASA Glenn Research Center in Cleveland, Ohio in July of 2016. The tests were conducted in accordance with the recently adopted international standard ISO 26101-2012 on Qualification of Free Field Test Environments. This method involves moving a microphone relative to a source and comparing the sound pressure level versus distance measurements with theoretical inverse square law spreading.
The robustness of the horizontal gaze nystagmus test
DOT National Transportation Integrated Search
2007-09-01
Police officers follow procedures set forth in the NHTSA/IACP curriculum when they administer the Standardized Field Sobriety Tests (SFSTs) to suspected alcohol-impaired drivers. The SFSTs include Horizontal Gaze Nystagmus (HGN) test, Walk-and-Turn (...
Monajjemzadeh, Farnaz; Shokri, Javad; Mohajel Nayebi, Ali Reza; Nemati, Mahboob; Azarmi, Yadollah; Charkhpour, Mohammad; Najafi, Moslem
2014-01-01
Purpose: This study was aimed to design Objective Structured Field Examination (OSFE) and also standardize the course plan of community pharmacy clerkship at Pharmacy Faculty of Tabriz University of Medical Sciences (Iran). Methods: The study was composed of several stages including; evaluation of the old program, standardization and implementation of the new course plan, design and implementation of OSFE, and finally results evaluation. Results: Lack of a fair final assessment protocol and proper organized educating system in various fields of community pharmacy clerkship skills were assigned as the main weaknesses of the old program. Educational priorities were determined and student’s feedback was assessed to design the new curriculum consisting of sessions to fulfill a 60-hour training course. More than 70% of the students were satisfied and successfulness and efficiency of the new clerkship program was significantly greater than the old program (P<0.05). In addition, they believed that OSFE was a suitable testing method. Conclusion: The defined course plan was successfully improved different skills of the students and OSFE was concluded as a proper performance based assessment method. This is easily adoptable by pharmacy faculties to improve the educational outcomes of the clerkship course. PMID:24511477
Prick test: evolution towards automated reading.
Justo, X; Díaz, I; Gil, J J; Gastaminza, G
2016-08-01
The prick test is one of the most common medical methods for diagnosing allergies, and it has been carried out in a similar and laborious manner over many decades. In an attempt to standardize the reading of the test, many researchers have tried to automate the process of measuring the allergic reactions found by developing systems and algorithms based on multiple technologies. This work reviews the techniques for automatic wheal measurement with the aim of pointing out their advantages and disadvantages and the progress in the field. Furthermore, it provides a classification scheme for the different technologies applied. The works discussed herein provide evidence that significant challenges still exist for the development of an automatic wheal measurement system that not only helps allergists in their medical practice but also allows for the standardization of the reading and data exchange. As such, the aim of the work was to serve as guideline for the development of a proper and feasible system. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Procedures to evaluate the efficiency of protective clothing worn by operators applying pesticide.
Espanhol-Soares, Melina; Nociti, Leticia A S; Machado-Neto, Joaquim Gonçalves
2013-10-01
The evaluation of the efficiency of whole-body protective clothing against pesticides has already been carried out through field tests and procedures defined by international standards, but there is a need to determine the useful life of these garments to ensure worker safety. The aim of this article is to compare the procedures for evaluating efficiency of two whole-body protective garments, both new and previously used by applicators of herbicides, using a laboratory test with a mannequin and in the field with the operator. The evaluation of the efficiency of protective clothing used both quantitative and qualitative methodologies, leading to a proposal for classification according to efficiency, and determination of the useful life of protective clothing for use against pesticides, based on a quantitative assessment. The procedures used were in accordance with the standards of the modified American Society for Testing and Materials (ASTM) F 1359:2007 and International Organization for Standardization 17491-4. The protocol used in the field was World Health Organization Vector Biology and Control (VBC)/82.1. Clothing tested was personal water repellent and pesticide protective. Two varieties of fabric were tested: Beige (100% cotton) and Camouflaged (31% polyester and 69% cotton). The efficiency in exposure control of the personal protective clothing was measured before use and after 5, 10, 20, and 30 uses and washes under field conditions. Personal protective clothing was worn by workers in the field during the application of the herbicide glyphosate on weed species in mature sugar cane plantations using a knapsack sprayer. The modified ASTM 1359:2007 procedure was chosen as the most appropriate due to its greater repeatability (lower coefficient of variation). This procedure provides quantitative evaluation needed to determine the efficiency and useful life of individual protective clothing, not just at specific points of failure, but according to dermal protection as a whole. The qualitative assessment, which is suitable for verification of garment design and stitching flaws, does not aid in determining useful life, but does complement the quantitative evaluation. The proposed classification is appropriate and accurate for determining the useful life of personal protective clothing against pesticide materials relative to number of uses and washes after each use. For example, the Beige garment had a useful life of 30 uses and washes, while the Camouflaged garment had a useful life of 5 uses and washes. The quantitative evaluation aids in determining the efficiency and useful life of individual protective clothing according to dermal protection as a whole, not just at specific points of failure.
Natural and Artificial Playing Fields: Characteristics and Safety Features.
ERIC Educational Resources Information Center
Schmidt, Roger C., Ed.; Hoerner, Earl F., Ed.; Milner, Edward M., Ed.; Morehouse, C. A., Ed.
These papers are on the subjects of playing field standards, surface traction, testing and correlation to actual field experience, and state-of-the-art natural and artificial surfaces. The papers, presented at the Symposium on the Characteristics and Safety of Playing Surfaces (Artificial and Natural) for Field Sports in 1998, cover the…
[The standardization of medical care and the training of medical personnel].
Korbut, V B; Tyts, V V; Boĭshenko, V A
1997-09-01
The medical specialist training at all levels (medical orderly, doctor's assistant, general practitioner, doctors) should be based on the medical care standards. Preliminary studies in the field of military medicine standards have demonstrated that the medical service of the Armed Forces of Russia needs medical resources' standards, structure and organization standards, technology standards. Military medical service resources' standards should reflect the requisitions for: all medical specialists' qualification, equipment and material for medical set-ups, field medical systems, drugs, etc. Standards for structures and organization should include requisitions for: command and control systems in military formations' and task forces' medical services and their information support; health-care and evacuation functions, sanitary control and anti-epidemic measures and personnel health protection. Technology standards development could improve and regulate the health care procedures in the process of evacuation. Standards' development will help to solve the problem of the data-base for the military medicine education system and medical research.
How Have State Level Standards-Based Tests Related to Norm-Referenced Tests in Alaska?.
ERIC Educational Resources Information Center
Fenton, Ray
This overview of the Alaska system for test development, scoring, and reporting explored differences and similarities between norm-referenced and standards-based tests. The current Alaska testing program is based on legislation passed in 1997 and 1998, and is designed to meet the requirements of the federal No Child Left Behind Legislation. In…
ERIC Educational Resources Information Center
Kass, Darrin; Grandzol, Christian
2014-01-01
The use of standardized tests as a piece of outcomes assessment has risen in recent years in order to satisfy external accrediting bodies such as the Association to Advance Collegiate Schools of Business International. The authors explore the value added by the Graduate Major Field Test in Business (GMFT-B) for assurance of learning in a master of…
Code of Federal Regulations, 2013 CFR
2013-04-01
... under the HUD building product standard and certification program for construction adhesives for wood... program for construction adhesives for wood floor systems. (a) Applicable standards. (1) All construction adhesives for field glued wood floor systems shall be designed, manufactured, and tested in compliance with...
Code of Federal Regulations, 2014 CFR
2014-04-01
... under the HUD building product standard and certification program for construction adhesives for wood... program for construction adhesives for wood floor systems. (a) Applicable standards. (1) All construction adhesives for field glued wood floor systems shall be designed, manufactured, and tested in compliance with...
Code of Federal Regulations, 2010 CFR
2010-04-01
... under the HUD building product standard and certification program for construction adhesives for wood... program for construction adhesives for wood floor systems. (a) Applicable standards. (1) All construction adhesives for field glued wood floor systems shall be designed, manufactured, and tested in compliance with...
Gupta, Veer; Henriksen, Kim; Edwards, Melissa; Jeromin, Andreas; Lista, Simone; Bazenet, Chantal; Soares, Holly; Lovestone, Simon; Hampel, Harald; Montine, Thomas; Blennow, Kaj; Foroud, Tatiana; Carrillo, Maria; Graff-Radford, Neill; Laske, Christoph; Breteler, Monique; Shaw, Leslie; Trojanowski, John Q.; Schupf, Nicole; Rissman, Robert A.; Fagan, Anne M.; Oberoi, Pankaj; Umek, Robert; Weiner, Michael W.; Grammas, Paula; Posner, Holly; Martins, Ralph
2015-01-01
The lack of readily available biomarkers is a significant hindrance towards progressing to effective therapeutic and preventative strategies for Alzheimer’s disease (AD). Blood-based biomarkers have potential to overcome access and cost barriers and greatly facilitate advanced neuroimaging and cerebrospinal fluid biomarker approaches. Despite the fact that preanalytical processing is the largest source of variability in laboratory testing, there are no currently available standardized preanalytical guidelines. The current international working group provides the initial starting point for such guidelines for standardized operating procedures (SOPs). It is anticipated that these guidelines will be updated as additional research findings become available. The statement provides (1) a synopsis of selected preanalytical methods utilized in many international AD cohort studies, (2) initial draft guidelines/SOPs for preanalytical methods, and (3) a list of required methodological information and protocols to be made available for publications in the field in order to foster cross-validation across cohorts and laboratories. PMID:25282381
Evaluation of a new automated instrument for pretransfusion testing.
Morelati, F; Revelli, N; Maffei, L M; Poretti, M; Santoro, C; Parravicini, A; Rebulla, P; Cole, R; Sirchia, G
1998-10-01
A number of automated devices for pretransfusion testing have recently become available. This study evaluated a fully automated device based on column agglutination technology (AutoVue System, Ortho, Raritan, NJ). Some 6747 tests including forward and reverse ABO group, Rh type and phenotype, antibody screen, autocontrol, and crossmatch were performed on random samples from 1069 blood donors, 2063 patients, and 98 newborns and cord blood. Also tested were samples from 168 immunized patients and 53 donors expressing weak or variant A and D antigens. Test results and technician times required for their performance were compared with those obtained by standard methods (manual column agglutination technology, slide, semiautomatic handler). No erroneous conclusions were found in regard to the 5028 ABO group and Rh type or phenotype determinations carried out with the device. The device rejected 1.53 percent of tests for sample inadequacy. Of the remaining 18 tests with discrepant results found with the device and not confirmed with the standard methods, 6 gave such results because of mixed-field reactions, 10 gave negative results with A2 RBCs in reverse ABO grouping, and 2 gave very weak positive reactions in antibody screening and crossmatching. In the samples from immunized patients, the device missed one weak anti-K, whereas standard methods missed five weak antibodies. In addition, 48, 34, and 31 of the 53 weak or variant antigens were detected by the device, the slide method, and the semiautomated handler, respectively. Technician time with the standard methods was 1.6 to 7 times higher than that with the device. The technical performance of the device compared favorably with that of standard methods, with a number of advantages, including in particular the saving of technician time. Sample inadequacy was the most common cause of discrepancy, which suggests that standardization of sample collection can further improve the performance of the device.
Nayana, M Ravi Shashi; Sekhar, Y Nataraja; Nandyala, Haritha; Muttineni, Ravikumar; Bairy, Santosh Kumar; Singh, Kriti; Mahmood, S K
2008-10-01
In the present study, a series of 179 quinoline and quinazoline heterocyclic analogues exhibiting inhibitory activity against Gastric (H+/K+)-ATPase were investigated using the comparative molecular field analysis (CoMFA) and comparative molecular similarity indices (CoMSIA) methods. Both the models exhibited good correlation between the calculated 3D-QSAR fields and the observed biological activity for the respective training set compounds. The most optimal CoMFA and CoMSIA models yielded significant leave-one-out cross-validation coefficient, q(2) of 0.777, 0.744 and conventional cross-validation coefficient, r(2) of 0.927, 0.914 respectively. The predictive ability of generated models was tested on a set of 52 compounds having broad range of activity. CoMFA and CoMSIA yielded predicted activities for test set compounds with r(pred)(2) of 0.893 and 0.917 respectively. These validation tests not only revealed the robustness of the models but also demonstrated that for our models r(pred)(2) based on the mean activity of test set compounds can accurately estimate external predictivity. The factors affecting activity were analyzed carefully according to standard coefficient contour maps of steric, electrostatic, hydrophobic, acceptor and donor fields derived from the CoMFA and CoMSIA. These contour plots identified several key features which explain the wide range of activities. The results obtained from models offer important structural insight into designing novel peptic-ulcer inhibitors prior to their synthesis.
NASA Astrophysics Data System (ADS)
Chen, Xiaol; Guo, Bei; Tuo, Jinliang; Zhou, Ruixin; Lu, Yang
2017-08-01
Nowadays, people are paying more and more attention to the noise reduction of household refrigerator compressor. This paper established a sound field bounded by compressor shell and ISO3744 standard field points. The Acoustic Transfer Vector (ATV) in the sound field radiated by a refrigerator compressor shell were calculated which fits the test result preferably. Then the compressor shell surface is divided into several parts. Based on Acoustic Transfer Vector approach, the sound pressure contribution to the field points and the sound power contribution to the sound field of each part were calculated. To obtain the noise radiation in the sound field, the sound pressure cloud charts were analyzed, and the contribution curves in different frequency of each part were acquired. Meanwhile, the sound power contribution of each part in different frequency was analyzed, to ensure those parts where contributes larger sound power. Through the analysis of acoustic contribution, those parts where radiate larger noise on the compressor shell were determined. This paper provides a credible and effective approach on the structure optimal design of refrigerator compressor shell, which is meaningful in the noise and vibration reduction.
Patel, Dipesh E; Cumberland, Phillippa M; Walters, Bronwen C; Russell-Eggitt, Isabelle; Brookes, John; Papadopoulos, Maria; Khaw, Peng Tee; Viswanathan, Ananth C; Garway-Heath, David; Cortina-Borja, Mario; Rahi, Jugnoo S
2018-02-01
There is limited evidence to support the development of guidance for visual field testing in children with glaucoma. To compare different static and combined static/kinetic perimetry approaches in children with glaucoma. Cross-sectional, observational study recruiting children prospectively between May 2013 and June 2015 at 2 tertiary specialist pediatric ophthalmology centers in London, England (Moorfields Eye Hospital and Great Ormond Street Hospital). The study included 65 children aged 5 to 15 years with glaucoma (108 affected eyes). A comparison of test quality and outcomes for static and combined static/kinetic techniques, with respect to ability to quantify glaucomatous loss. Children performed perimetric assessments using Humphrey static (Swedish Interactive Thresholding Algorithm 24-2 FAST) and Octopus combined static tendency-oriented perimetry/kinetic perimetry (isopter V4e, III4e, or I4e) in a single sitting, using standardized clinical protocols, administered by a single examiner. Information was collected about test duration, completion, and quality (using automated reliability indices and our qualitative Examiner-Based Assessment of Reliability score). Perimetry outputs were scored using the Aulhorn and Karmeyer classification. One affected eye in 19 participants was retested with Swedish Interactive Thresholding Algorithm 24-2 FAST and 24-2 standard algorithms. Sixty-five children (33 girls [50.8%]), with a median age of 12 years (interquartile range, 9-14 years), were tested. Test quality (Examiner-Based Assessment of Reliability score) improved with increasing age for both Humphrey and Octopus strategies and were equivalent in children older than 10 years (McNemar test, χ2 = 0.33; P = .56), but better-quality tests with Humphrey perimetry were achieved in younger children (McNemar test, χ2 = 4.0; P = .05). Octopus and Humphrey static MD values worse than or equal to -6 dB showed disagreement (Bland-Altman, mean difference, -0.70; limit of agreement, -7.74 to 6.35) but were comparable when greater than this threshold (mean difference, -0.03; limit of agreement, -2.33 to 2.27). Visual field classification scores for static perimetry tests showed substantial agreement (linearly weighted κ, 0.79; 95% CI, 0.65-0.93), although 25 of 80 (31%) were graded with a more severe defect for Octopus static perimetry. Of the 7 severe cases of visual field loss (grade 5), 5 had lower kinetic than static classification scores. A simple static perimetry approach potentially yields high-quality results in children younger than 10 years. For children older than 10 years, without penalizing quality, the addition of kinetic perimetry enabled measurement of far-peripheral sensitivity, which is particularly useful in children with severe visual field restriction.
ERIC Educational Resources Information Center
Lee, Jaekyung; Liu, Xiaoyan; Amo, Laura Casey; Wang, Weichun Leilani
2014-01-01
Drawing on national and state assessment datasets in reading and math, this study tested "external" versus "internal" standards-based education models. The goal was to understand whether and how student performance standards work in multilayered school systems under No Child Left Behind Act of 2001 (NCLB). Under the…
Peters, Brenton C; Fitzgerald, Christopher J
2006-10-01
Laboratory and field data reported in the literature are confusing with regard to "adequate" protection thresholds for borate timber preservatives. The confusion is compounded by differences in termite species, timber species and test methodology. Laboratory data indicate a borate retention of 0.5% mass/mass (m/m) boric acid equivalent (BAE) would cause > 90% termite mortality and restrict mass loss in test specimens to < or = 5%. Field data generally suggest that borate retentions appreciably > 0.5% m/m BAE are required. We report two field experiments with varying amounts of untreated feeder material in which Coptotermes acinaciformis (Froggatt) (Isoptera: Rhinotermitidae) responses to borate-treated radiata (Monterey) pine, Pinus radiata D. Don, were measured. The apparently conflicting results between laboratory and field data are explained by the presence or absence of untreated feeder material in the test environment. In the absence of untreated feeder material, wood containing 0.5% BAE provided adequate protection from Coptotermes sp., whereas in the presence of untreated feeder material, increased retentions were required. Furthermore, the retentions required increased with increased amounts of susceptible material present. Some termites, Nasutitermes sp. and Mastotermes darwiniensis Froggatt, for example, are borate-tolerant and borate timber preservatives are not a viable management option with these species. The lack of uniform standards for termite test methodology and assessment criteria for efficacy across the world is recognized as a difficulty with research into the performance of timber preservatives with termites. The many variables in laboratory and field assays make "prescriptive" standards difficult to recommend. The use of "performance" standards to define efficacy criteria ("adequate" protection) is discussed.
Inzaule, Seth C; Hamers, Ralph L; Paredes, Roger; Yang, Chunfu; Schuurman, Rob; Rinke de Wit, Tobias F
2017-01-01
Global scale-up of antiretroviral treatment has dramatically changed the prospects of HIV/AIDS disease, rendering life-long chronic care and treatment a reality for millions of HIV-infected patients. Affordable technologies to monitor antiretroviral treatment are needed to ensure long-term durability of limited available drug regimens. HIV drug resistance tests can complement existing strategies in optimizing clinical decision-making for patients with treatment failure, in addition to facilitating population-based surveillance of HIV drug resistance. This review assesses the current landscape of HIV drug resistance technologies and discusses the strengths and limitations of existing assays available for expanding testing in resource-limited settings. These include sequencing-based assays (Sanger sequencing assays and nextgeneration sequencing), point mutation assays, and genotype-free data-based prediction systems. Sanger assays are currently considered the gold standard genotyping technology, though only available at a limited number of resource-limited setting reference and regional laboratories, but high capital and test costs have limited their wide expansion. Point mutation assays present opportunities for simplified laboratory assays, but HIV genetic variability, extensive codon redundancy at or near the mutation target sites with limited multiplexing capability have restricted their utility. Next-generation sequencing, despite high costs, may have potential to reduce the testing cost significantly through multiplexing in high-throughput facilities, although the level of bioinformatics expertise required for data analysis is currently still complex and expensive and lacks standardization. Web-based genotype-free prediction systems may provide enhanced antiretroviral treatment decision-making without the need for laboratory testing, but require further clinical field evaluation and implementation scientific research in resource-limited settings.
Laser transit anemometer software development program
NASA Technical Reports Server (NTRS)
Abbiss, John B.
1989-01-01
Algorithms were developed for the extraction of two components of mean velocity, standard deviation, and the associated correlation coefficient from laser transit anemometry (LTA) data ensembles. The solution method is based on an assumed two-dimensional Gaussian probability density function (PDF) model of the flow field under investigation. The procedure consists of transforming the data ensembles from the data acquisition domain (consisting of time and angle information) to the velocity space domain (consisting of velocity component information). The mean velocity results are obtained from the data ensemble centroid. Through a least squares fitting of the transformed data to an ellipse representing the intersection of a plane with the PDF, the standard deviations and correlation coefficient are obtained. A data set simulation method is presented to test the data reduction process. Results of using the simulation system with a limited test matrix of input values is also given.
TERRESTRIAL PLANT REPRODUCTIVE TESTING: SHOULD WILDLIFE TOXICOLOGISTS CARE?
Standard phytotoxicity testing using the seedling emergence and vegetative vigor tests have been shown to be inadequate for the protection of plant reproduction. Both experimental evidence and unintended field exposures have shown vegetation can be minimally or not significantly...
Schulze, P.A.; Capel, P.D.; Squillace, P.J.; Helsel, D.R.
1993-01-01
The usefulness and sensitivity, of a portable immunoassay test for the semiquantitative field screening of water samples was evaluated by means of laboratory and field studies. Laboratory results indicated that the tests were useful for the determination of atrazine concentrations of 0.1 to 1.5 μg/L. At a concentration of 1 μg/L, the relative standard deviation in the difference between the regression line and the actual result was about 40 percent. The immunoassay was less sensitive and produced similar errors for other triazine herbicides. After standardization, the test results were relatively insensitive to ionic content and variations in pH (range, 4 to 10), mildly sensitive to temperature changes, and quite sensitive to the timing of the final incubation step, variances in timing can be a significant source of error. Almost all of the immunoassays predicted a higher atrazine concentration in water samples when compared to results of gas chromatography. If these tests are used as a semiquantitative screening tool, this tendency for overprediction does not diminish the tests' usefulness. Generally, the tests seem to be a valuable method for screening water samples for triazine herbicides.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) Electrical Tests .453(b)(3) Field Modifications .453(a)(2) Ladder Trucks .453(b)(1) Tower Trucks .453(b)(1...) Shoring and reshoring .703(b) Test Requirements—Compressive Strength .703(e)(1) Equipment and Tools .702 Lift-Slab Operations .705 Lockout/Tagout .702(j) Post-Tensioning .701(c) Precast .704 Pumping Systems...
Code of Federal Regulations, 2010 CFR
2010-07-01
... .453(b)(4) Electrical Tests .453(b)(3) Field Modifications .453(a)(2) Ladder Trucks .453(b)(1) Tower... .703(a) Formwork, Removal .703(e) Reinforcing steel .701(b), .703(d) Shoring and reshoring .703(b) Test... .702(j) Post-Tensioning .701(c) Precast .704 Pumping Systems .702(e) Conductors, Deenergized .955(c...
Buckus, Raimondas; Strukčinskienė, Birute; Raistenskis, Juozas; Stukas, Rimantas; Šidlauskienė, Aurelija; Čerkauskienė, Rimantė; Isopescu, Dorina Nicolina; Stabryla, Jan; Cretescu, Igor
2017-01-01
During the last two decades, the number of macrocell mobile telephony base station antennas emitting radiofrequency (RF) electromagnetic radiation (EMR) in residential areas has increased significantly, and therefore much more attention is being paid to RF EMR and its effects on human health. Scientific field measurements of public exposure to RF EMR (specifically to radio frequency radiation) from macrocell mobile telephony base station antennas and RF electromagnetic field (EMF) intensity parameters in the environment are discussed in this article. The research methodology is applied according to the requirements of safety norms and Lithuanian Standards in English (LST EN). The article presents and analyses RF EMFs generated by mobile telephony base station antennas in areas accessible to the general public. Measurements of the RF electric field strength and RF EMF power density were conducted in the near- and far-fields of the mobile telephony base station antenna. Broadband and frequency-selective measurements were performed outside (on the roof and on the ground) and in a residential area. The tests performed on the roof in front of the mobile telephony base station antennas in the near-field revealed the presence of a dynamic energy interaction within the antenna electric field, which changes rapidly with distance. The RF EMF power density values on the ground at distances of 50, 100, 200, 300, 400, and 500 m from the base station are very low and are scattered within intervals of 0.002 to 0.05 μW/cm2. The results were compared with international exposure guidelines (ICNIRP). PMID:28257069
Buckus, Raimondas; Strukčinskienė, Birute; Raistenskis, Juozas; Stukas, Rimantas; Šidlauskienė, Aurelija; Čerkauskienė, Rimantė; Isopescu, Dorina Nicolina; Stabryla, Jan; Cretescu, Igor
2017-03-01
During the last two decades, the number of macrocell mobile telephony base station antennas emitting radiofrequency (RF) electromagnetic radiation (EMR) in residential areas has increased significantly, and therefore much more attention is being paid to RF EMR and its effects on human health. Scientific field measurements of public exposure to RF EMR (specifically to radio frequency radiation) from macrocell mobile telephony base station antennas and RF electromagnetic field (EMF) intensity parameters in the environment are discussed in this article. The research methodology is applied according to the requirements of safety norms and Lithuanian Standards in English (LST EN). The article presents and analyses RF EMFs generated by mobile telephony base station antennas in areas accessible to the general public. Measurements of the RF electric field strength and RF EMF power density were conducted in the near- and far-fields of the mobile telephony base station antenna. Broadband and frequency-selective measurements were performed outside (on the roof and on the ground) and in a residential area. The tests performed on the roof in front of the mobile telephony base station antennas in the near-field revealed the presence of a dynamic energy interaction within the antenna electric field, which changes rapidly with distance. The RF EMF power density values on the ground at distances of 50, 100, 200, 300, 400, and 500 m from the base station are very low and are scattered within intervals of 0.002 to 0.05 μW/cm². The results were compared with international exposure guidelines (ICNIRP).
Merrill, Rebecca D.; Shamim, Abu Ahmed; Ali, Hasmot; Schulze, Kerry; Rashid, Mahbubur; Christian, Parul; West, Jr., Keith P.
2009-01-01
Iron is ubiquitous in natural water sources used around the world for drinking and cooking. The health impact of chronic exposure to iron through water, which in groundwater sources can reach well above the World Health Organization's defined aesthetic limit of 0.3 mg/L, is not currently understood. To quantify the impact of consumption of iron in groundwater on nutritional status, it is important to accurately assess naturally-occurring exposure levels among populations. In this study, the validity of iron quantification in water was evaluated using two portable instruments: the HACH DR/890 portable colorimeter (colorimeter) and HACH Iron test-kit, Model IR-18B (test-kit), by comparing field-based iron estimates for 25 tubewells located in northwestern Bangladesh with gold standard atomic absorption spectrophotometry analysis. Results of the study suggest that the HACH test-kit delivers more accurate point-of-use results across a wide range of iron concentrations under challenging field conditions. PMID:19507757
Merrill, Rebecca D; Shamim, Abu Ahmed; Labrique, Alain B; Ali, Hasmot; Schulze, Kerry; Rashid, Mahbubur; Christian, Parul; West, Keith P
2009-06-01
Iron is ubiquitous in natural water sources used around the world for drinking and cooking. The health impact of chronic exposure to iron through water, which in groundwater sources can reach well above the World Health Organization's defined aesthetic limit of 0.3 mg/L, is not currently understood. To quantify the impact of consumption of iron in groundwater on nutritional status, it is important to accurately assess naturally-occurring exposure levels among populations. In this study, the validity of iron quantification in water was evaluated using two portable instruments: the HACH DR/890 portable colorimeter (colorimeter) and HACH Iron test-kit, Model IR-18B (test-kit), by comparing field-based iron estimates for 25 tubewells located in northwestern Bangladesh with gold standard atomic absorption spectrophotometry analysis. Results of the study suggest that the HACH test-kit delivers more accurate point-of-use results across a wide range of iron concentrations under challenging field conditions.
Martin, Jeffrey D.
2002-01-01
Correlation analysis indicates that for most pesticides and concentrations, pooled estimates of relative standard deviation rather than pooled estimates of standard deviation should be used to estimate variability because pooled estimates of relative standard deviation are less affected by heteroscedasticity. The 2 Variability of Pesticide Detections and Concentrations in Field Replicate Water Samples, 1992–97 median pooled relative standard deviation was calculated for all pesticides to summarize the typical variability for pesticide data collected for the NAWQA Program. The median pooled relative standard deviation was 15 percent at concentrations less than 0.01 micrograms per liter (µg/L), 13 percent at concentrations near 0.01 µg/L, 12 percent at concentrations near 0.1 µg/L, 7.9 percent at concentrations near 1 µg/L, and 2.7 percent at concentrations greater than 5 µg/L. Pooled estimates of standard deviation or relative standard deviation presented in this report are larger than estimates based on averages, medians, smooths, or regression of the individual measurements of standard deviation or relative standard deviation from field replicates. Pooled estimates, however, are the preferred method for characterizing variability because they provide unbiased estimates of the variability of the population. Assessments of variability based on standard deviation (rather than variance) underestimate the true variability of the population. Because pooled estimates of variability are larger than estimates based on other approaches, users of estimates of variability must be cognizant of the approach used to obtain the estimate and must use caution in the comparison of estimates based on different approaches.
Development of a novel SCADA system for laboratory testing.
Patel, M; Cole, G R; Pryor, T L; Wilmot, N A
2004-07-01
This document summarizes the supervisory control and data acquisition (SCADA) system that allows communication with, and controlling the output of, various I/O devices in the renewable energy systems and components test facility RESLab. This SCADA system differs from traditional SCADA systems in that it supports a continuously changing operating environment depending on the test to be performed. The SCADA System is based on the concept of having one Master I/O Server and multiple client computer systems. This paper describes the main features and advantages of this dynamic SCADA system, the connections of various field devices to the master I/O server, the device servers, and numerous software features used in the system. The system is based on the graphical programming language "LabVIEW" and its "Datalogging and Supervisory Control" (DSC) module. The DSC module supports a real-time database called the "tag engine," which performs the I/O operations with all field devices attached to the master I/O server and communications with the other tag engines running on the client computers connected via a local area network. Generic and detailed communication block diagrams illustrating the hierarchical structure of this SCADA system are presented. The flow diagram outlining a complete test performed using this system in one of its standard configurations is described.
Evaluation of a culture-based pathogen identification kit for bacterial causes of bovine mastitis.
Viora, L; Graham, E M; Mellor, D J; Reynolds, K; Simoes, P B A; Geraghty, T E
2014-07-26
Accurate identification of mastitis-causing bacteria supports effective management and can be used to implement selective use of antimicrobials for treatment. The objectives of this study were to compare the results from a culture-based mastitis pathogen detection test kit ('VetoRapid', Vétoquinol) with standard laboratory culture and to evaluate the potential suitability of the test kit to inform a selective treatment programme. Overall 231 quarter milk samples from five UK dairy farms were collected. The sensitivity and specificity of the test kit for the identification of Escherichia coli, Staphylococcus aureus, coagulase-negative staphylococci, Streptococcus uberis and Enterococcus spp. ranged from 17 per cent to 84 per cent and 92 per cent to 98 per cent, respectively. In total, 23 of 68 clinical samples were assigned as meeting the requirement for antimicrobial treatment (Gram-positive organism cultured) according to standard culture results, with the test kit results having sensitivity and specificity of 91 per cent and 78 per cent, respectively. Several occurrences of misidentification are reported, including S. aureus being misidentified as coagulase-negative staphylococci and vice versa. The test kit provides rapid preliminary identification of five common causes of bovine mastitis under UK field conditions and is likely to be suitable for informing selective treatment of clinical mastitis caused by Gram-positive organisms. British Veterinary Association.
NASA Technical Reports Server (NTRS)
McFarland, Shane M.
2010-01-01
Field of view has always been a design feature paramount to helmet design, and in particular spacesuit design, where the helmet must provide an adequate field of view for a large range of activities, environments, and body positions. Historically, suited field of view has been evaluated either qualitatively in parallel with design or quantitatively using various test methods and protocols. As such, oftentimes legacy suit field of view information is either ambiguous for lack of supporting data or contradictory to other field of view tests performed with different subjects and test methods. This paper serves to document a new field of view testing method that is more reliable and repeatable than its predecessors. It borrows heavily from standard ophthalmologic field of vision tests such as the Goldmann kinetic perimetry test, but is designed specifically for evaluating field of view of a spacesuit helmet. In this test, four suits utilizing three different helmet designs were tested for field of view. Not only do these tests provide more reliable field of view data for legacy and prototype helmet designs, they also provide insight into how helmet design impacts field of view and what this means for the Constellation Project spacesuit helmet, which must meet stringent field of view requirements that are more generous to the crewmember than legacy designs.
Partially composite particle physics with and without supersymmetry
NASA Astrophysics Data System (ADS)
Kramer, Thomas A.
Theories in which the Standard Model fields are partially compositeness provide elegant and phenomenologically viable solutions to the Hierarchy Problem. In this thesis we will study types of models from two different perspectives. We first derive an effective field theory describing the interactions of the Standard Models fields with their lightest composite partners based on two weakly coupled sectors. Technically, via the AdS/CFT correspondence, our model is dual to a highly deconstructed theory with a single warped extra-dimension. This two sector theory provides a simplified approach to the phenomenology of this important class of theories. We then use this effective field theoretic approach to study models with weak scale accidental supersymmetry. Particularly, we will investigate the possibility that the Standard Model Higgs field is a member of a composite supersymmetric sector interacting weakly with the known Standard Model fields.
Information Technology Measurement and Testing Activities at NIST
Hogan, Michael D.; Carnahan, Lisa J.; Carpenter, Robert J.; Flater, David W.; Fowler, James E.; Frechette, Simon P.; Gray, Martha M.; Johnson, L. Arnold; McCabe, R. Michael; Montgomery, Douglas; Radack, Shirley M.; Rosenthal, Robert; Shakarji, Craig M.
2001-01-01
Our high technology society continues to rely more and more upon sophisticated measurements, technical standards, and associated testing activities. This was true for the industrial society of the 20th century and remains true for the information society of the 21st century. Over the last half of the 20th century, information technology (IT) has been a powerful agent of change in almost every sector of the economy. The complexity and rapidly changing nature of IT have presented unique technical challenges to the National Institute of Standards and Technology (NIST) and to the scientific measurement community in developing a sound measurement and testing infrastructure for IT. This measurement and testing infrastructure for the important non-physical and non-chemical properties associated with complex IT systems is still in an early stage of development. This paper explains key terms and concepts of IT metrology, briefly reviews the history of the National Bureau of Standards/National Institute of Standards and Technology (NBS/NIST) in the field of IT, and reviews NIST’s current capabilities and work in measurement and testing for IT. It concludes with a look at what is likely to occur in the field of IT over the next ten years and what metrology roles NIST is likely to play. PMID:27500026
Layton, Kelvin J; Gallichan, Daniel; Testud, Frederik; Cocosco, Chris A; Welz, Anna M; Barmet, Christoph; Pruessmann, Klaas P; Hennig, Jürgen; Zaitsev, Maxim
2013-09-01
It has recently been demonstrated that nonlinear encoding fields result in a spatially varying resolution. This work develops an automated procedure to design single-shot trajectories that create a local resolution improvement in a region of interest. The technique is based on the design of optimized local k-space trajectories and can be applied to arbitrary hardware configurations that employ any number of linear and nonlinear encoding fields. The trajectories designed in this work are tested with the currently available hardware setup consisting of three standard linear gradients and two quadrupolar encoding fields generated from a custom-built gradient insert. A field camera is used to measure the actual encoding trajectories up to third-order terms, enabling accurate reconstructions of these demanding single-shot trajectories, although the eddy current and concomitant field terms of the gradient insert have not been completely characterized. The local resolution improvement is demonstrated in phantom and in vivo experiments. Copyright © 2012 Wiley Periodicals, Inc.
Precise SAR measurements in the near-field of RF antenna systems
NASA Astrophysics Data System (ADS)
Hakim, Bandar M.
Wireless devices must meet specific safety radiation limits, and in order to assess the health affects of such devices, standard procedures are used in which standard phantoms, tissue-equivalent liquids, and miniature electric field probes are used. The accuracy of such measurements depend on the precision in measuring the dielectric properties of the tissue-equivalent liquids and the associated calibrations of the electric-field probes. This thesis describes work on the theoretical modeling and experimental measurement of the complex permittivity of tissue-equivalent liquids, and associated calibration of miniature electric-field probes. The measurement method is based on measurements of the field attenuation factor and power reflection coefficient of a tissue-equivalent sample. A novel method, to the best of the authors knowledge, for determining the dielectric properties and probe calibration factors is described and validated. The measurement system is validated using saline at different concentrations, and measurements of complex permittivity and calibration factors have been made on tissue-equivalent liquids at 900MHz and 1800MHz. Uncertainty analysis have been conducted to study the measurement system sensitivity. Using the same waveguide to measure tissue-equivalent permittivity and calibrate e-field probes eliminates a source of uncertainty associated with using two different measurement systems. The measurement system is used to test GSM cell-phones at 900MHz and 1800MHz for Specific Absorption Rate (SAR) compliance using a Specific Anthropomorphic Mannequin phantom (SAM).
24 CFR 891.150 - Operating cost standards.
Code of Federal Regulations, 2010 CFR
2010-04-01
... through 891.790, the operating cost standard for group homes shall be based on the number of residents... as differences in costs based on location within the field office jurisdiction. The operating cost...
A standard protocol for describing individual-based and agent-based models
Grimm, Volker; Berger, Uta; Bastiansen, Finn; Eliassen, Sigrunn; Ginot, Vincent; Giske, Jarl; Goss-Custard, John; Grand, Tamara; Heinz, Simone K.; Huse, Geir; Huth, Andreas; Jepsen, Jane U.; Jorgensen, Christian; Mooij, Wolf M.; Muller, Birgit; Pe'er, Guy; Piou, Cyril; Railsback, Steven F.; Robbins, Andrew M.; Robbins, Martha M.; Rossmanith, Eva; Ruger, Nadja; Strand, Espen; Souissi, Sami; Stillman, Richard A.; Vabo, Rune; Visser, Ute; DeAngelis, Donald L.
2006-01-01
Simulation models that describe autonomous individual organisms (individual based models, IBM) or agents (agent-based models, ABM) have become a widely used tool, not only in ecology, but also in many other disciplines dealing with complex systems made up of autonomous entities. However, there is no standard protocol for describing such simulation models, which can make them difficult to understand and to duplicate. This paper presents a proposed standard protocol, ODD, for describing IBMs and ABMs, developed and tested by 28 modellers who cover a wide range of fields within ecology. This protocol consists of three blocks (Overview, Design concepts, and Details), which are subdivided into seven elements: Purpose, State variables and scales, Process overview and scheduling, Design concepts, Initialization, Input, and Submodels. We explain which aspects of a model should be described in each element, and we present an example to illustrate the protocol in use. In addition, 19 examples are available in an Online Appendix. We consider ODD as a first step for establishing a more detailed common format of the description of IBMs and ABMs. Once initiated, the protocol will hopefully evolve as it becomes used by a sufficiently large proportion of modellers.
Value of the Cosmological Constant in Emergent Quantum Gravity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogan, Craig
It is suggested that the exact value of the cosmological constant could be derived from first principles, based on entanglement of the Standard Model field vacuum with emergent holographic quantum geometry. For the observed value of the cosmological constant, geometrical information is shown to agree closely with the spatial information density of the QCD vacuum, estimated in a free-field approximation. The comparison is motivated by a model of exotic rotational fluctuations in the inertial frame that can be precisely tested in laboratory experiments. Cosmic acceleration in this model is always positive, but fluctuates with characteristic coherence lengthmore » $$\\approx 100$$km and bandwidth $$\\approx 3000$$ Hz.« less
The fast multipole method and point dipole moment polarizable force fields.
Coles, Jonathan P; Masella, Michel
2015-01-14
We present an implementation of the fast multipole method for computing Coulombic electrostatic and polarization forces from polarizable force-fields based on induced point dipole moments. We demonstrate the expected O(N) scaling of that approach by performing single energy point calculations on hexamer protein subunits of the mature HIV-1 capsid. We also show the long time energy conservation in molecular dynamics at the nanosecond scale by performing simulations of a protein complex embedded in a coarse-grained solvent using a standard integrator and a multiple time step integrator. Our tests show the applicability of fast multipole method combined with state-of-the-art chemical models in molecular dynamical systems.
Polynuclear aromatic hydrocarbon analysis using the synchronous scanning luminoscope
NASA Astrophysics Data System (ADS)
Hyfantis, George J., Jr.; Teglas, Matthew S.; Wilbourn, Robert G.
2001-02-01
12 The Synchronous Scanning Luminoscope (SSL) is a field- portable, synchronous luminescence spectrofluorometer that was developed for on-site analysis of contaminated soil and ground water. The SSL is capable of quantitative analysis of total polynuclear aromatic hydrocarbons (PAHs) using phosphorescence and fluorescence techniques with a high correlation to laboratory data as illustrated by this study. The SSL is also capable of generating benzo(a)pyrene equivalency results, based on seven carcinogenic PAHs and Navy risk numbers, with a high correlation to laboratory data as illustrated by this study. These techniques allow rapid field assessments of total PAHs and benzo(a)pyrene equivalent concentrations. The Luminoscope is capable of detecting total PAHs to the parts per billion range. This paper describes standard field methods for using the SSL and describes the results of field/laboratory testing of PAHs. SSL results from two different hazardous waste sites are discussed.
Soil pH Mapping with an On-The-Go Sensor
Schirrmann, Michael; Gebbers, Robin; Kramer, Eckart; Seidel, Jan
2011-01-01
Soil pH is a key parameter for crop productivity, therefore, its spatial variation should be adequately addressed to improve precision management decisions. Recently, the Veris pH Manager™, a sensor for high-resolution mapping of soil pH at the field scale, has been made commercially available in the US. While driving over the field, soil pH is measured on-the-go directly within the soil by ion selective antimony electrodes. The aim of this study was to evaluate the Veris pH Manager™ under farming conditions in Germany. Sensor readings were compared with data obtained by standard protocols of soil pH assessment. Experiments took place under different scenarios: (a) controlled tests in the lab, (b) semicontrolled test on transects in a stop-and-go mode, and (c) tests under practical conditions in the field with the sensor working in its typical on-the-go mode. Accuracy issues, problems, options, and potential benefits of the Veris pH Manager™ were addressed. The tests demonstrated a high degree of linearity between standard laboratory values and sensor readings. Under practical conditions in the field (scenario c), the measure of fit (r2) for the regression between the on-the-go measurements and the reference data was 0.71, 0.63, and 0.84, respectively. Field-specific calibration was necessary to reduce systematic errors. Accuracy of the on-the-go maps was considerably higher compared with the pH maps obtained by following the standard protocols, and the error in calculating lime requirements was reduced by about one half. However, the system showed some weaknesses due to blockage by residual straw and weed roots. If these problems were solved, the on-the-go sensor investigated here could be an efficient alternative to standard sampling protocols as a basis for liming in Germany. PMID:22346591
Soil pH mapping with an on-the-go sensor.
Schirrmann, Michael; Gebbers, Robin; Kramer, Eckart; Seidel, Jan
2011-01-01
Soil pH is a key parameter for crop productivity, therefore, its spatial variation should be adequately addressed to improve precision management decisions. Recently, the Veris pH Manager™, a sensor for high-resolution mapping of soil pH at the field scale, has been made commercially available in the US. While driving over the field, soil pH is measured on-the-go directly within the soil by ion selective antimony electrodes. The aim of this study was to evaluate the Veris pH Manager™ under farming conditions in Germany. Sensor readings were compared with data obtained by standard protocols of soil pH assessment. Experiments took place under different scenarios: (a) controlled tests in the lab, (b) semicontrolled test on transects in a stop-and-go mode, and (c) tests under practical conditions in the field with the sensor working in its typical on-the-go mode. Accuracy issues, problems, options, and potential benefits of the Veris pH Manager™ were addressed. The tests demonstrated a high degree of linearity between standard laboratory values and sensor readings. Under practical conditions in the field (scenario c), the measure of fit (r(2)) for the regression between the on-the-go measurements and the reference data was 0.71, 0.63, and 0.84, respectively. Field-specific calibration was necessary to reduce systematic errors. Accuracy of the on-the-go maps was considerably higher compared with the pH maps obtained by following the standard protocols, and the error in calculating lime requirements was reduced by about one half. However, the system showed some weaknesses due to blockage by residual straw and weed roots. If these problems were solved, the on-the-go sensor investigated here could be an efficient alternative to standard sampling protocols as a basis for liming in Germany.
40 CFR 1065.940 - Emission calculations.
Code of Federal Regulations, 2011 CFR
2011-07-01
... CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065.940...-specific emissions for each test interval using any applicable information and instructions in the standard.... Determine this fixed value by engineering analysis. [75 FR 68467, Nov. 8, 2010] ...
NASA Astrophysics Data System (ADS)
Bernas, Martin; Páta, Petr; Hudec, René; Soldán, Jan; Rezek, Tomáš; Castro-Tirado, Alberto J.
1998-05-01
Although there are several optical GRB follow-up systems in operation and/or in development, some of them with a very short response time, they will never be able to provide true simultaneous (no delay) and pre-burst optical data for GRBs. We report on the development and tests of a monitoring experiment expected to be put into test operation in 1998. The system should detect Optical Transients down to mag 6-7 (few seconds duration assumed) over a wide field of view. The system is based on the double CCD wide-field cameras ST8. For the real time evaluation of the signal from both cameras, two TMS 320C40 processors are used. Using two channels differing in spectral sensitivity and processing of temporal sequence of images allows us to eliminate man-made objects and defects of the CCD electronics. The system is controlled by a standard PC computer.
Newtonian CAFE: a new ideal MHD code to study the solar atmosphere
NASA Astrophysics Data System (ADS)
González, J. J.; Guzmán, F.
2015-12-01
In this work we present a new independent code designed to solve the equations of classical ideal magnetohydrodynamics (MHD) in three dimensions, submitted to a constant gravitational field. The purpose of the code centers on the analysis of solar phenomena within the photosphere-corona region. In special the code is capable to simulate the propagation of impulsively generated linear and non-linear MHD waves in the non-isothermal solar atmosphere. We present 1D and 2D standard tests to demonstrate the quality of the numerical results obtained with our code. As 3D tests we present the propagation of MHD-gravity waves and vortices in the solar atmosphere. The code is based on high-resolution shock-capturing methods, uses the HLLE flux formula combined with Minmod, MC and WENO5 reconstructors. The divergence free magnetic field constraint is controlled using the Flux Constrained Transport method.
A field test of cut-off importance sampling for bole volume
Jeffrey H. Gove; Harry T. Valentine; Michael J. Holmes
2000-01-01
Cut-off importance sampling has recently been introduced as a technique for estimating bole volume to some point below the tree tip, termed the cut-off point. A field test of this technique was conducted on a small population of eastern white pine trees using dendrometry as the standard for volume estimation. Results showed that the differences in volume estimates...
Evaluation of engineering plastic for rollover protective structure (ROPS) mounting.
Comer, R S; Ayers, P D; Liu, J
2007-04-01
Agriculture has one of the highest fatality rates of any industry in America. Tractor rollovers are a significant contributor to the high death rate. Rollover protective structures (ROPS) have helped lower these high fatality rates on full-size tractors. However, a large number of older tractors still do not use ROPS due to the difficulty of designing and creating a mounting structure. To help reduce this difficulty, engineering plastics were evaluated for use in a ROPS mounting structure on older tractors. The use of engineering plastics around axle housings could provide a uniform mounting configuration as well as lower costs for aftermarket ROPS. Various plastics were examined through shear testing, scale model testing, and compressive strength testing. Once a material was chosen based upon strength and cost, full-scale testing of the plastic's strength on axle housings was conducted. Finally, a mounting structure was tested in static ROPS tests, and field upset tests were performed in accordance with SAE Standard J2194. Initial tests revealed that the ROPS mounting structure and axle housing combination had higher torsional strength with less twisting than the axle housing alone. An engineering plastic ROPS mounting structure was easily successful in withstanding the forces applied during the static longitudinal and lateral ROPS tests. Field upset testing revealed that the mounting structure could withstand the impact loads seen during actual upsets without a failure. During both static testing and field upset testing, no permanent twisting of the mounting structure was found. Engineering plastic could therefore be a viable option for a universal ROPS mounting structure for older tractors.
Jäckel, S; Eiden, M; Balkema-Buschmann, A; Ziller, M; van Vuren, P Jansen; Paweska, J T; Groschup, M H
2013-10-01
Rift Valley fever virus (RVFV) is an emerging zoonotic pathogen that causes high morbidity and mortality in humans and livestock. In this paper, we describe the cloning, expression and purification of RVFV glycoprotein Gn and its application as a diagnostic antigen in an indirect ELISA for the specific detection of RVF IgG antibodies in sheep and goats. The performance of this Gn based ELISA is validated using a panel of almost 2000 field samples from sheep and goats from Mozambique, Senegal, Uganda and Yemen. All serum samples were also tested by virus neutralization test (VNT), the gold standard method for RVFV serological testing. Compared to the VNT results the Gn based ELISA proved to have an excellent sensitivity (94.56%) and specificity (95.57%). Apart from establishing this new diagnostic assay, these results also demonstrate a close correlation between the presence of RVFV Gn and neutralizing antibodies. Copyright © 2013 Elsevier Ltd. All rights reserved.
From field data collection to earth sciences dissemination: mobile examples in the digital era
NASA Astrophysics Data System (ADS)
Giardino, Marco; Ghiraldi, Luca; Palomba, Mauro; Perotti, Luigi
2015-04-01
In the framework of the technological and cultural revolution related to the massive diffusion of mobile devices, as smartphones and tablets, the information management and accessibility is changing, and many software houses and developer communities realized applications that can meet various people's needs. Modern collection, storing and sharing of data have radically changed, and advances in ICT increasingly involve field-based activities. Progresses in these researches and applications depend on three main components: hardware, software and web system. Since 2008 the geoSITLab multidisciplinary group (Earth Sciences Department and NatRisk Centre of the University of Torino and the Natural Sciences Museum of the Piemonte Region) is active in defining and testing methods for collecting, managing and sharing field information using mobile devices. Key issues include: Geomorphological Digital Mapping, Natural Hazards monitoring, Geoheritage assessment and applications for the teaching of Earth Sciences. An overview of the application studies is offered here, including the use of Mobile tools for data collection, the construction of relational databases for inventory activities and the test of Web-Mapping tools and mobile apps for data dissemination. The fil rouge of connection is a standardized digital approach allowing the use of mobile devices in each step of the process, which will be analysed within different projects set up by the research group (Geonathaz, EgeoFieldwork, Progeo Piemonte, GeomediaWeb). The hardware component mainly consists of the availability of handheld mobile devices (e.g. smartphones, PDAs and Tablets). The software component corresponds to applications for spatial data visualization on mobile devices, such as composite mobile GIS or simple location-based apps. The web component allows the integration of collected data into geodatabase based on client-server architecture, where the information can be easily loaded, uploaded and shared between field staff and data management team, in order to disseminate collected information to media or to inform the decision makers. Results demonstrated the possibility to record field observations in a fast and reliable way, using standardized formats that can improve the precision of collected information and lower the possibility of errors and data omission. Dedicated forms have been set up for gathering different thematic data (geologic/geomorphologic, faunal and floristic, path system…etc.). Field data allowed to arrange maps and SDI useful for many application purposes: from country-planning to disaster risk management, from Geoheritage management to Earth Science concepts dissemination.
Equivalence between solar irradiance and solar simulators in aging tests of sunglasses.
Masili, Mauro; Ventura, Liliane
2016-08-26
This work is part of a broader research that focuses on ocular health. Three outlines are the basis of the pyramid that comprehend the research as a whole: authors' previous work, which has provided the public to self-check their own sunglasses regarding the ultraviolet protection compatible to their category; Brazilian national survey in order to improve nationalization of sunglasses standards; and studies conducted on revisiting requirements of worldwide sunglasses standards, in which this work is inserted. It is still controversial on the literature the ultraviolet (UV) radiation effects on the ocular media, but the World Health Organization has established safe limits on the exposure of eyes to UV radiation based on the studies reported in literature. Sunglasses play an important role in providing safety, and their lenses should provide adequate UV filters. Regarding UV protection for ocular media, the resistance-to-irradiance test for sunglasses under many national standards requires irradiating lenses for 50 uninterrupted hours with a 450 W solar simulator. This artificial aging test may provide a corresponding evaluation of exposure to the sun. Calculating the direct and diffuse solar irradiance at a vertical surface and the corresponding radiant exposure for the entire year, we compare the latter with the 50-h radiant exposure of a 450 W xenon arc lamp from a solar simulator required by national standards. Our calculations indicate that this stress test is ineffective in its present form. We provide evidence of the need to re-evaluate the parameters of the tests to establish appropriate safe limits for UV irradiance. This work is potentially significant for scientists and legislators in the field of sunglasses standards to improve the requirements of sunglasses quality and safety.
Depiction of pneumothoraces in a large animal model using x-ray dark-field radiography.
Hellbach, Katharina; Baehr, Andrea; De Marco, Fabio; Willer, Konstantin; Gromann, Lukas B; Herzen, Julia; Dmochewitz, Michaela; Auweter, Sigrid; Fingerle, Alexander A; Noël, Peter B; Rummeny, Ernst J; Yaroshenko, Andre; Maack, Hanns-Ingo; Pralow, Thomas; van der Heijden, Hendrik; Wieberneit, Nataly; Proksa, Roland; Koehler, Thomas; Rindt, Karsten; Schroeter, Tobias J; Mohr, Juergen; Bamberg, Fabian; Ertl-Wagner, Birgit; Pfeiffer, Franz; Reiser, Maximilian F
2018-02-08
The aim of this study was to assess the diagnostic value of x-ray dark-field radiography to detect pneumothoraces in a pig model. Eight pigs were imaged with an experimental grating-based large-animal dark-field scanner before and after induction of a unilateral pneumothorax. Image contrast-to-noise ratios between lung tissue and the air-filled pleural cavity were quantified for transmission and dark-field radiograms. The projected area in the object plane of the inflated lung was measured in dark-field images to quantify the collapse of lung parenchyma due to a pneumothorax. Means and standard deviations for lung sizes and signal intensities from dark-field and transmission images were tested for statistical significance using Student's two-tailed t-test for paired samples. The contrast-to-noise ratio between the air-filled pleural space of lateral pneumothoraces and lung tissue was significantly higher in the dark-field (3.65 ± 0.9) than in the transmission images (1.13 ± 1.1; p = 0.002). In case of dorsally located pneumothoraces, a significant decrease (-20.5%; p > 0.0001) in the projected area of inflated lung parenchyma was found after a pneumothorax was induced. Therefore, the detection of pneumothoraces in x-ray dark-field radiography was facilitated compared to transmission imaging in a large animal model.
Murumkar, Prashant R; Giridhar, Rajani; Yadav, Mange Ram
2008-04-01
A set of 29 benzothiadiazepine hydroxamates having selective tumor necrosis factor-alpha converting enzyme inhibitory activity were used to compare the quality and predictive power of 3D-quantitative structure-activity relationship, comparative molecular field analysis, and comparative molecular similarity indices models for the atom-based, centroid/atom-based, data-based, and docked conformer-based alignment. Removal of two outliers from the initial training set of molecules improved the predictivity of models. Among the 3D-quantitative structure-activity relationship models developed using the above four alignments, the database alignment provided the optimal predictive comparative molecular field analysis model for the training set with cross-validated r(2) (q(2)) = 0.510, non-cross-validated r(2) = 0.972, standard error of estimates (s) = 0.098, and F = 215.44 and the optimal comparative molecular similarity indices model with cross-validated r(2) (q(2)) = 0.556, non-cross-validated r(2) = 0.946, standard error of estimates (s) = 0.163, and F = 99.785. These models also showed the best test set prediction for six compounds with predictive r(2) values of 0.460 and 0.535, respectively. The contour maps obtained from 3D-quantitative structure-activity relationship studies were appraised for activity trends for the molecules analyzed. The comparative molecular similarity indices models exhibited good external predictivity as compared with that of comparative molecular field analysis models. The data generated from the present study helped us to further design and report some novel and potent tumor necrosis factor-alpha converting enzyme inhibitors.
Ishibashi, Midori
2015-01-01
The cost, speed, and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Based on this background, the importance of laboratory tests is increasing, especially in the evaluation of clinical study participants' entry and safety, and drug efficacy. To assure the quality of laboratory tests, providing high-quality laboratory tests is mandatory. For providing adequate quality assurance in laboratory tests, quality control in the three fields of pre-analytical, analytical, and post-analytical processes is extremely important. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage, and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; however, a part of laboratory tests is done in offsite central laboratories after specimen shipping. As factors affecting laboratory tests, individual and inter-individual variations are well-known. Besides these factors, standardizing the factors of specimen collection, handling, preparation, storage, and shipping, may improve and maintain the high quality of clinical studies in general. Furthermore, the analytical method, units, and reference interval are also important factors. It is concluded that, to overcome the problems derived from pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.
The Development of a Pediatric Inpatient Experience of Care Measure: Child HCAHPS®
Toomey, Sara L.; Zaslavsky, Alan M.; Elliott, Marc N.; Gallagher, Patricia M.; Fowler, Floyd J.; Klein, David J.; Shulman, Shanna; Ratner, Jessica; McGovern, Caitriona; LeBlanc, Jessica L.; Schuster, Mark A.
2016-01-01
CMS uses Adult HCAHPS® scores for public reporting and pay-for-performance for most U.S. hospitals, but no publicly available standardized survey of inpatient experience of care exists for pediatrics. To fill the gap, CMS/AHRQ commissioned the development of the Consumer Assessment of Healthcare Providers and Systems Hospital Survey – Child Version (Child HCAHPS), a survey of parents/guardians of pediatric patients (<18 years old) who were recently hospitalized. This Special Article describes the development of Child HCAHPS, which included an extensive review of the literature and quality measures, expert interviews, focus groups, cognitive testing, pilot testing of the draft survey, a national field test with 69 hospitals in 34 states, psychometric analysis, and end-user testing of the final survey. We conducted extensive validity and reliability testing to determine which items would be included in the final survey instrument and to develop composite measures. We analyzed national field test data from 17,727 surveys collected from 11/12-1/14 from parents of recently hospitalized children. The final Child HCAHPS instrument has 62 items, including 39 patient experience items, 10 screeners, 12 demographic/descriptive items, and 1 open-ended item. The 39 experience items are categorized based on testing into 18 composite and single-item measures. Our composite and single-item measures demonstrated good to excellent hospital-level reliability at 300 responses per hospital. Child HCAHPS was developed to be a publicly available standardized survey of pediatric inpatient experience of care. It can be used to benchmark pediatric inpatient experience across hospitals and assist in efforts to improve the quality of inpatient care. PMID:26195542
NASA Astrophysics Data System (ADS)
Song, X. P.; Potapov, P.; Adusei, B.; King, L.; Khan, A.; Krylov, A.; Di Bella, C. M.; Pickens, A. H.; Stehman, S. V.; Hansen, M.
2016-12-01
Reliable and timely information on agricultural production is essential for ensuring world food security. Freely available medium-resolution satellite data (e.g. Landsat, Sentinel) offer the possibility of improved global agriculture monitoring. Here we develop and test a method for estimating in-season crop acreage using a probability sample of field visits and producing wall-to-wall crop type maps at national scales. The method is first illustrated for soybean cultivated area in the US for 2015. A stratified, two-stage cluster sampling design was used to collect field data to estimate national soybean area. The field-based estimate employed historical soybean extent maps from the U.S. Department of Agriculture (USDA) Cropland Data Layer to delineate and stratify U.S. soybean growing regions. The estimated 2015 U.S. soybean cultivated area based on the field sample was 341,000 km2 with a standard error of 23,000 km2. This result is 1.0% lower than USDA's 2015 June survey estimate and 1.9% higher than USDA's 2016 January estimate. Our area estimate was derived in early September, about 2 months ahead of harvest. To map soybean cover, the Landsat image archive for the year 2015 growing season was processed using an active learning approach. Overall accuracy of the soybean map was 84%. The field-based sample estimated area was then used to calibrate the map such that the soybean acreage of the map derived through pixel counting matched the sample-based area estimate. The strength of the sample-based area estimation lies in the stratified design that takes advantage of the spatially explicit cropland layers to construct the strata. The success of the mapping was built upon an automated system which transforms Landsat images into standardized time-series metrics. The developed method produces reliable and timely information on soybean area in a cost-effective way and could be implemented in an operational mode. The approach has also been applied for other crops in other regions, such as winter wheat in Pakistan, soybean in Argentina and soybean in the entire South America. Similar levels of accuracy and timeliness were achieved as in the US.
NASA Astrophysics Data System (ADS)
Rahrig, M.; Drewello, R.; Lazzeri, A.
2018-05-01
Monitoring is an essential requirement for the planning, assessment and evaluation of conservation measures. It should be based on a standardized and reproducible observation of the historical surface. For many areas and materials suitable methods for long-term monitoring already exist. But hardly any non-destructive testing methods have been used to test new materials for conservation of damaged stone surfaces. The Nano-Cathedral project, funded by the European Union's Horizon 2020 research and innovation program, is developing new materials and technologies for preserving damaged stone surfaces of built heritage. The prototypes developed are adjusted to the needs and problems of a total of six major cultural monuments in Europe. In addition to the testing of the materials under controlled laboratory conditions, the products have been applied to trial areas on the original stone surfaces. For a location-independent standardized assessment of surface changes of the entire trial areas a monitoring method based on opto-technical, non-contact and non-destructive testing methods has been developed. This method involves a three-dimensional measurement of the surface topography using Structured-Light-Scanning and the analysis of the surfaces in different light ranges using high resolution VIS photography, as well as UV-A-fluorescence photography and reflected near-field IR photography. The paper will show the workflow of this methodology, including a detailed description of the equipment used data processing and the advantages for monitoring highly valuable stone surfaces. Alongside the theoretical discussion, the results of two measuring campaigns on trial areas of the Nano-Cathedral project will be shown.
2016-01-01
We introduce a portable biochemical analysis platform for rapid field deployment of nucleic acid-based diagnostics using consumer-class quadcopter drones. This approach exploits the ability to isothermally perform the polymerase chain reaction (PCR) with a single heater, enabling the system to be operated using standard 5 V USB sources that power mobile devices (via battery, solar, or hand crank action). Time-resolved fluorescence detection and quantification is achieved using a smartphone camera and integrated image analysis app. Standard sample preparation is enabled by leveraging the drone’s motors as centrifuges via 3D printed snap-on attachments. These advancements make it possible to build a complete DNA/RNA analysis system at a cost of ∼$50 ($US). Our instrument is rugged and versatile, enabling pinpoint deployment of sophisticated diagnostics to distributed field sites. This capability is demonstrated by successful in-flight replication of Staphylococcus aureus and λ-phage DNA targets in under 20 min. The ability to perform rapid in-flight assays with smartphone connectivity eliminates delays between sample collection and analysis so that test results can be delivered in minutes, suggesting new possibilities for drone-based systems to function in broader and more sophisticated roles beyond cargo transport and imaging. PMID:26898247
Priye, Aashish; Wong, Season; Bi, Yuanpeng; Carpio, Miguel; Chang, Jamison; Coen, Mauricio; Cope, Danielle; Harris, Jacob; Johnson, James; Keller, Alexandra; Lim, Richard; Lu, Stanley; Millard, Alex; Pangelinan, Adriano; Patel, Neal; Smith, Luke; Chan, Kamfai; Ugaz, Victor M
2016-05-03
We introduce a portable biochemical analysis platform for rapid field deployment of nucleic acid-based diagnostics using consumer-class quadcopter drones. This approach exploits the ability to isothermally perform the polymerase chain reaction (PCR) with a single heater, enabling the system to be operated using standard 5 V USB sources that power mobile devices (via battery, solar, or hand crank action). Time-resolved fluorescence detection and quantification is achieved using a smartphone camera and integrated image analysis app. Standard sample preparation is enabled by leveraging the drone's motors as centrifuges via 3D printed snap-on attachments. These advancements make it possible to build a complete DNA/RNA analysis system at a cost of ∼$50 ($US). Our instrument is rugged and versatile, enabling pinpoint deployment of sophisticated diagnostics to distributed field sites. This capability is demonstrated by successful in-flight replication of Staphylococcus aureus and λ-phage DNA targets in under 20 min. The ability to perform rapid in-flight assays with smartphone connectivity eliminates delays between sample collection and analysis so that test results can be delivered in minutes, suggesting new possibilities for drone-based systems to function in broader and more sophisticated roles beyond cargo transport and imaging.
Drinking water test methods in crisis-afflicted areas: comparison of methods under field conditions.
Merle, Roswitha; Bleul, Ingo; Schulenburg, Jörg; Kreienbrock, Lothar; Klein, Günter
2011-11-01
To simplify the testing of drinking water in crisis-afflicted areas (as in Kosovo in 2007), rapid test methods were compared with the standard test. For Escherichia coli and coliform pathogens, rapid tests were made available: Colilert(®)-18, P/A test with 4-methylumbelliferyl-β-D-glucoronid, and m-Endo Broth. Biochemical differentiation was carried out by Enterotube™ II. Enterococci were determined following the standard ISO test and by means of Enterolert™. Four hundred ninety-nine water samples were tested for E. coli and coliforms using four methods. Following the standard method, 20.8% (n=104) of the samples contained E. coli, whereas the rapid tests detected between 19.6% (m-Endo Broth, 92.0% concordance) and 20.0% (concordance: 93.6% Colilert-18 and 94.8% P/A-test) positive samples. Regarding coliforms, the percentage of concordant results ranged from 98.4% (P/A-test) to 99.0% (Colilert-18). Colilert-18 and m-Endo Broth detected even more positive samples than the standard method did. Enterococci were detected in 93 of 573 samples by the standard method, but in 92 samples by Enterolert (concordance: 99.5%). Considering the high-quality equipment and time requirements of the standard method, the use of rapid tests in crisis-afflicted areas is sufficiently reliable.
Center/TRACON Automation System: Development and Evaluation in the Field
DOT National Transportation Integrated Search
1993-10-01
Technological advances are changing the way that advanced air traffic control : automation should be developed and assessed. Current standards and practices of : system development place field testing at the end of the development process. : While su...
Hill, Kylie; Dolmage, Thomas E; Woon, Lynda; Coutts, Debbie; Goldstein, Roger; Brooks, Dina
2012-02-01
Field and laboratory-based tests are used to measure exercise capacity in people with COPD. A comparison of the cardiorespiratory responses to field tests, referenced to a laboratory test, is needed to appreciate the relative physiological demands. We sought to compare peak and submaximal cardiorespiratory responses to the 6-min walk test, incremental shuttle walk test and endurance shuttle walk test with a ramp cycle ergometer test (CET) in patients with COPD. Twenty-four participants (FEV(1) 50 ± 14%; 66.5 ± 7.7 years; 15 men) completed four sessions, separated by ≥24 h. During an individual session, participants completed either two 6-min walk tests, incremental shuttle walk tests, endurance shuttle walk tests using standardized protocols, or a single CET, wearing a portable gas analysis unit (Cosmed K4b(2)) which included measures of heart rate and arterial oxygen saturation (SpO(2)). Between tests, no difference was observed in the peak rate of oxygen uptake (F(3,69) = 1.2; P = 0.31), end-test heart rate (F(2,50) = 0.6; P = 0.58) or tidal volume (F(3,69) = 1.5; P = 0.21). Compared with all walking tests, the CET elicited a higher peak rate of carbon dioxide output (1173 ± 350 mL/min; F(3,62) = 4.8; P = 0.006), minute ventilation (48 ± 17 L/min; F(3,69) = 10.2; P < 0.001) and a higher end-test SpO(2) (95 ± 4%; F(3,63) = 24.9; P < 0.001). In patients with moderate COPD, field walking tests elicited a similar peak rate of oxygen uptake and heart rate as a CET, demonstrating that both self- and externally paced walking tests progress to high intensities. © 2011 The Authors. Respirology © 2011 Asian Pacific Society of Respirology.
NASA Technical Reports Server (NTRS)
Schreiner, John; Clancy, Daniel (Technical Monitor)
2002-01-01
The Collaborative Information Portal (CIP) is a web-based information management and retrieval system. Its purpose is to provide users at MER (Mars Exploration Rover) mission operations with easy access to a broad range of mission data and products and contextual information such as the current operations schedule. The CIP web-server provides this content in a user customizable web-portal environment. Since CIP is still under development, only a subset of the full feature set will be available for the EDO field test. The CIP web-portal will be accessed through a standard web browser. CIP is intended to be intuitive and simple to use, however, at the training session, users will receive a one to two page reference guide, which should aid them in using CIP. Users must provide their own computers for accessing CIP during the field test. These computers should be configured with Java 1.3 and a Java 2 enabled browser. Macintosh computers should be running OS 10.1.3 or later. Classic Mac OS (OS 9) is not supported. For more information please read section 7.3 in the FIASCO Rover Science Operations Test Mission Plan. Several screen shots of the Beta Release of CIP are shown on the following pages.
Visual detection of Brucella in bovine biological samples using DNA-activated gold nanoparticles
Kumar, Satish; Kaur, Gurpreet; Ali, Syed Atif; Shrivastava, Sameer; Gupta, Praveen K.; Cooper, Jonathan M.; Chaudhuri, Pallab
2017-01-01
Brucellosis is a bacterial disease, which, although affecting cattle primarily, has been associated with human infections, making its detection an important challenge. The existing gold standard diagnosis relies on the culture of bacteria which is a lengthy and costly process, taking up to 45 days. New technologies based on molecular diagnosis have been proposed, either through dip-stick, immunological assays, which have limited specificity, or using nucleic acid tests, which enable to identify the pathogen, but are impractical for use in the field, where most of the reservoir cases are located. Here we demonstrate a new test based on hybridization assays with metal nanoparticles, which, upon detection of a specific pathogen-derived DNA sequence, yield a visual colour change. We characterise the components used in the assay with a range of analytical techniques and show sensitivities down to 1000 cfu/ml for the detection of Brucella. Finally, we demonstrate that the assay works in a range of bovine samples including semen, milk and urine, opening up the potential for its use in the field, in low-resource settings. PMID:28719613
The validity of three tests of temperament in guppies (Poecilia reticulata).
Burns, James G
2008-11-01
Differences in temperament (consistent differences among individuals in behavior) can have important effects on fitness-related activities such as dispersal and competition. However, evolutionary ecologists have put limited effort into validating their tests of temperament. This article attempts to validate three standard tests of temperament in guppies: the open-field test, emergence test, and novel-object test. Through multiple reliability trials, and comparison of results between different types of test, this study establishes the confidence that can be placed in these temperament tests. The open-field test is shown to be a good test of boldness and exploratory behavior; the open-field test was reliable when tested in multiple ways. There were problems with the emergence test and novel-object test, which leads one to conclude that the protocols used in this study should not be considered valid tests for this species. (PsycINFO Database Record (c) 2008 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Ilev, Ilko K.; Walker, Bennett; Calhoun, William; Hassan, Moinuddin
2016-03-01
Biophotonics is an emerging field in modern biomedical technology that has opened up new horizons for transfer of state-of-the-art techniques from the areas of lasers, fiber optics and biomedical optics to the life sciences and medicine. This field continues to vastly expand with advanced developments across the entire spectrum of biomedical applications ranging from fundamental "bench" laboratory studies to clinical patient "bedside" diagnostics and therapeutics. However, in order to translate these technologies to clinical device applications, the scientific and industrial community, and FDA are facing the requirement for a thorough evaluation and review of laser radiation safety and efficacy concerns. In many cases, however, the review process is complicated due the lack of effective means and standard test methods to precisely analyze safety and effectiveness of some of the newly developed biophotonics techniques and devices. There is, therefore, an immediate public health need for new test protocols, guidance documents and standard test methods to precisely evaluate fundamental characteristics, performance quality and safety of these technologies and devices. Here, we will overview our recent developments of novel test methodologies for safety and efficacy evaluation of some emerging biophotonics technologies and medical devices. These methodologies are based on integrating the advanced features of state-of-the-art optical sensor technologies and approaches such as high-resolution fiber-optic sensing, confocal and optical coherence tomography imaging, and infrared spectroscopy. The presentation will also illustrate some methodologies developed and implemented for testing intraocular lens implants, biochemical contaminations of medical devices, ultrahigh-resolution nanoscopy, and femtosecond laser therapeutics.
NASA Astrophysics Data System (ADS)
Wang, Jing; Tronville, Paolo
2014-06-01
The filtration of airborne nanoparticles is an important control technique as the environmental, health, and safety impacts of nanomaterials grow. A review of the literature shows that significant progress has been made on airborne nanoparticle filtration in the academic field in the recent years. We summarize the filtration mechanisms of fibrous and membrane filters; the air flow resistance and filter media figure of merit are discussed. Our review focuses on the air filtration test methods and instrumentation necessary to implement them; recent experimental studies are summarized accordingly. Two methods using monodisperse and polydisperse challenging aerosols, respectively, are discussed in detail. Our survey shows that the commercial instruments are already available for generating a large amount of nanoparticles, sizing, and quantifying them accurately. The commercial self-contained filter test systems provide the possibility of measurement for particles down to 15 nm. Current international standards dealing with efficiency test for filters and filter media focus on measurement of the minimum efficiency at the most penetrating particle size. The available knowledge and instruments provide a solid base for development of test methods to determine the effectiveness of filtration media against airborne nanoparticles down to single-digit nanometer range.
Normal Threshold Size of Stimuli in Children Using a Game-Based Visual Field Test.
Wang, Yanfang; Ali, Zaria; Subramani, Siddharth; Biswas, Susmito; Fenerty, Cecilia; Henson, David B; Aslam, Tariq
2017-06-01
The aim of this study was to demonstrate and explore the ability of novel game-based perimetry to establish normal visual field thresholds in children. One hundred and eighteen children (aged 8.0 ± 2.8 years old) with no history of visual field loss or significant medical history were recruited. Each child had one eye tested using a game-based visual field test 'Caspar's Castle' at four retinal locations 12.7° (N = 118) from fixation. Thresholds were established repeatedly using up/down staircase algorithms with stimuli of varying diameter (luminance 20 cd/m 2 , duration 200 ms, background luminance 10 cd/m 2 ). Relationships between threshold and age were determined along with measures of intra- and intersubject variability. The Game-based visual field test was able to establish threshold estimates in the full range of children tested. Threshold size reduced with increasing age in children. Intrasubject variability and intersubject variability were inversely related to age in children. Normal visual field thresholds were established for specific locations in children using a novel game-based visual field test. These could be used as a foundation for developing a game-based perimetry screening test for children.
Mebane, C.A.
2010-01-01
Criteria to protect aquatic life are intended to protect diverse ecosystems, but in practice are usually developed from compilations of single-species toxicity tests using standard test organisms that were tested in laboratory environments. Species sensitivity distributions (SSDs) developed from these compilations are extrapolated to set aquatic ecosystem criteria. The protectiveness of the approach was critically reviewed with a chronic SSD for cadmium comprising 27 species within 21 genera. Within the data set, one genus had lower cadmium effects concentrations than the SSD fifth percentile-based criterion, so in theory this genus, the amphipod Hyalella, could be lost or at least allowed some level of harm by this criteria approach. However, population matrix modeling projected only slightly increased extinction risks for a temperate Hyalella population under scenarios similar to the SSD fifth percentile criterion. The criterion value was further compared to cadmium effects concentrations in ecosystem experiments and field studies. Generally, few adverse effects were inferred from ecosystem experiments at concentrations less than the SSD fifth percentile criterion. Exceptions were behavioral impairments in simplified food web studies. No adverse effects were apparent in field studies under conditions that seldom exceeded the criterion. At concentrations greater than the SSD fifth percentile, the magnitudes of adverse effects in the field studies were roughly proportional to the laboratory-based fraction of species with adverse effects in the SSD. Overall, the modeling and field validation comparisons of the chronic criterion values generally supported the relevance and protectiveness of the SSD fifth percentile approach with cadmium. ?? 2009 Society for Risk Analysis.
MOSFET detectors in quality assurance of tomotherapy treatments.
Cherpak, Amanda; Studinski, Ryan C N; Cygler, Joanna E
2008-02-01
The purpose of this work was to characterize metal oxide semiconductor field-effect transistors (MOSFETs) in a 6 MV conventional linac and investigate their use for quality assurance of radiotherapy treatments with a tomotherapy Hi-Art unit. High sensitivity and standard sensitivity MOSFETs were first calibrated and then tested for reproducibility, field size dependence, and accuracy of measuring surface dose in a 6 MV beam as well as in a tomotherapy Hi-Art unit. In vivo measurements were performed on both a RANDO phantom and several head and neck cancer patients treated with tomotherapy and compared to TLD measurements and treatment plan doses to evaluate the performance of MOSFETs in a high gradient radiation field. The average calibration factor found was 0.345+/-2.5%cGy/mV for the high sensitivity MOSFETs tested and 0.901+/-2.4%cGy/mV for the standard sensitivity MOSFETs. MOSFET measured surface doses had an average agreement with ion chamber measurements of 1.55% for the high sensitivity MOSFET and 5.23% for the standard sensitivity MOSFET when averaged over all trials and field sizes tested. No significant dependence on field size was found for the standard sensitivity MOSFETs, however a maximum difference of 5.34% was found for the high sensitivity MOSFET calibration factors in the field sizes tested. Measurements made with MOSFETS on head and neck patients treated on a tomotherapy Hi-Art unit had an average agreement of (3.26+/-0.03)% with TLD measurements, however the average of the absolute difference between the MOSFET measurements and the treatment plan skin doses was (12.2+/-7.5)%. The MOSFET measured patient skin doses also had good reproducibility, with inter-fraction deviations ranging from 1.4% to 6.6%. Similar results were found from trials using a RANDO phantom. The MOSFETs performed well when used in the tomotherapy Hi-Art unit and did not increase the overall treatment set-up time when used for patient measurements. It was found that MOSFETs are suitable detectors for surface dose measurements in both conventional beam and tomotherapy treatments and they can provide valuable skin dose information in areas where the treatment planning system may not be accurate.
Weather forecasting with open source software
NASA Astrophysics Data System (ADS)
Rautenhaus, Marc; Dörnbrack, Andreas
2013-04-01
To forecast the weather situation during aircraft-based atmospheric field campaigns, we employ a tool chain of existing and self-developed open source software tools and open standards. Of particular value are the Python programming language with its extension libraries NumPy, SciPy, PyQt4, Matplotlib and the basemap toolkit, the NetCDF standard with the Climate and Forecast (CF) Metadata conventions, and the Open Geospatial Consortium Web Map Service standard. These open source libraries and open standards helped to implement the "Mission Support System", a Web Map Service based tool to support weather forecasting and flight planning during field campaigns. The tool has been implemented in Python and has also been released as open source (Rautenhaus et al., Geosci. Model Dev., 5, 55-71, 2012). In this presentation we discuss the usage of free and open source software for weather forecasting in the context of research flight planning, and highlight how the field campaign work benefits from using open source tools and open standards.
ERIC Educational Resources Information Center
Oberle, Alex; Palacios, Fabian Araya
2012-01-01
Overseas experiences provide educators with exceptional opportunities to incorporate field study, firsthand experiences, and tangible artifacts into the classroom. Despite this potential, teachers must consider curricular standards that direct how such international endeavors can be integrated. Furthermore, geography curriculum development is more…
2014-01-01
Background The use of radio frequency identification (RFID) systems in healthcare is increasing, and concerns for electromagnetic compatibility (EMC) pose one of the biggest obstacles for widespread adoption. Numerous studies have demonstrated that RFID systems can interfere with medical devices; however, the majority of past studies relied on time-consuming and burdensome test schemes based on ad hoc test methods applied to individual RFID systems. Methods This paper presents the results of using an RFID simulator that allows for faster evaluation of RFID-medical device EMC against a library of RFID test signals at various field strengths. Results The results of these tests demonstrate the feasibility and adequacy of simulator testing and can be used to support its incorporation into applicable consensus standards. Conclusions This work can aid the medical device community in better assessing the risks associated with medical device exposure to RFID. PMID:25086451
Seidman, Seth J; Bekdash, Omar; Guag, Joshua; Mehryar, Maryam; Booth, Paul; Frisch, Paul
2014-08-03
The use of radio frequency identification (RFID) systems in healthcare is increasing, and concerns for electromagnetic compatibility (EMC) pose one of the biggest obstacles for widespread adoption. Numerous studies have demonstrated that RFID systems can interfere with medical devices; however, the majority of past studies relied on time-consuming and burdensome test schemes based on ad hoc test methods applied to individual RFID systems. This paper presents the results of using an RFID simulator that allows for faster evaluation of RFID-medical device EMC against a library of RFID test signals at various field strengths. The results of these tests demonstrate the feasibility and adequacy of simulator testing and can be used to support its incorporation into applicable consensus standards. This work can aid the medical device community in better assessing the risks associated with medical device exposure to RFID.
Microscopic or occult hematuria, when reflex testing is not good laboratory practice.
Froom, Paul; Barak, Mira
2010-01-01
Consensus opinion suggests that hematuria found by dipstick and not confirmed on microscopic examination (<2 erythrocytes per high power field) signifies a false-positive reagent strip test result. Standard practice is to repeat the dipstick test several days later and if still positive to confirm by microscopic examination. If discordant results are obtained, experts recommend reflex testing for urinary myoglobin and hemoglobin concentrations. The question is whether or not this approach represents good laboratory practice. These recommendations are not evidence based. We conclude that the reference range for red blood cells on the reagent strip should be increased to 25x10(6) cells/L for young men, and 50x10(6) cells/L for the rest of the adult population, ranges consistent with flow cytometry reports. Confirmation reflex testing using tests that have inferior sensitivity, precision and probably accuracy is not recommended.
Test-Based Accountability: The Promise and the Perils
ERIC Educational Resources Information Center
Loveless, Tom
2005-01-01
In the early 1990s, states began establishing standards in academic subjects backed by test-based accountability systems to see that the standards were met. Incentives were implemented for schools and students based on pupil test scores. These early accountability systems paved the way for passage of landmark federal legislation, the No Child Left…
NASA Technical Reports Server (NTRS)
Polzin, Kurt A.; Hill, Carrie S.
2013-01-01
Inductive magnetic field probes (also known as B-dot probes and sometimes as B-probes or magnetic probes) are useful for performing measurements in electric space thrusters and various plasma accelerator applications where a time-varying magnetic field is present. Magnetic field probes have proven to be a mainstay in diagnosing plasma thrusters where changes occur rapidly with respect to time, providing the means to measure the magnetic fields produced by time-varying currents and even an indirect measure of the plasma current density through the application of Ampère's law. Examples of applications where this measurement technique has been employed include pulsed plasma thrusters and quasi-steady magnetoplasmadynamic thrusters. The Electric Propulsion Technical Committee (EPTC) of the American Institute of Aeronautics and Astronautics (AIAA) was asked to assemble a Committee on Standards (CoS) for Electric Propulsion Testing. The assembled CoS was tasked with developing Standards and Recommended Practices for various diagnostic techniques used in the evaluation of plasma thrusters. These include measurements that can yield either global information related to a thruster and its performance or detailed, local data related to the specific physical processes occurring in the plasma. This paper presents a summary of the standard, describing the preferred methods for fabrication, calibration, and usage of inductive magnetic field probes for use in diagnosing plasma thrusters. Inductive magnetic field probes (also called B-dot probes throughout this document) are commonly used in electric propulsion (EP) research and testing to measure unsteady magnetic fields produced by time-varying currents. The B-dot probe is relatively simple in construction, and requires minimal cost, making it a low-cost technique that is readily accessible to most researchers. While relatively simple, the design of a B-dot probe is not trivial and there are many opportunities for errors in probe construction, calibration, and usage, and in the post-processing of data that is produced by the probe. There are typically several ways in which each of these steps can be approached, and different applications may require more or less vigorous attention to various issues.
Integrated System Health Management: Pilot Operational Implementation in a Rocket Engine Test Stand
NASA Technical Reports Server (NTRS)
Figueroa, Fernando; Schmalzel, John L.; Morris, Jonathan A.; Turowski, Mark P.; Franzl, Richard
2010-01-01
This paper describes a credible implementation of integrated system health management (ISHM) capability, as a pilot operational system. Important core elements that make possible fielding and evolution of ISHM capability have been validated in a rocket engine test stand, encompassing all phases of operation: stand-by, pre-test, test, and post-test. The core elements include an architecture (hardware/software) for ISHM, gateways for streaming real-time data from the data acquisition system into the ISHM system, automated configuration management employing transducer electronic data sheets (TEDS?s) adhering to the IEEE 1451.4 Standard for Smart Sensors and Actuators, broadcasting and capture of sensor measurements and health information adhering to the IEEE 1451.1 Standard for Smart Sensors and Actuators, user interfaces for management of redlines/bluelines, and establishment of a health assessment database system (HADS) and browser for extensive post-test analysis. The ISHM system was installed in the Test Control Room, where test operators were exposed to the capability. All functionalities of the pilot implementation were validated during testing and in post-test data streaming through the ISHM system. The implementation enabled significant improvements in awareness about the status of the test stand, and events and their causes/consequences. The architecture and software elements embody a systems engineering, knowledge-based approach; in conjunction with object-oriented environments. These qualities are permitting systematic augmentation of the capability and scaling to encompass other subsystems.
Continuous Codes and Standards Improvement (CCSI)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rivkin, Carl H; Burgess, Robert M; Buttner, William J
2015-10-21
As of 2014, the majority of the codes and standards required to initially deploy hydrogen technologies infrastructure in the United States have been promulgated. These codes and standards will be field tested through their application to actual hydrogen technologies projects. Continuous codes and standards improvement (CCSI) is a process of identifying code issues that arise during project deployment and then developing codes solutions to these issues. These solutions would typically be proposed amendments to codes and standards. The process is continuous because as technology and the state of safety knowledge develops there will be a need to monitor the applicationmore » of codes and standards and improve them based on information gathered during their application. This paper will discuss code issues that have surfaced through hydrogen technologies infrastructure project deployment and potential code changes that would address these issues. The issues that this paper will address include (1) setback distances for bulk hydrogen storage, (2) code mandated hazard analyses, (3) sensor placement and communication, (4) the use of approved equipment, and (5) system monitoring and maintenance requirements.« less
Field Evaluations Test Plan for Validation of Alternatives to Aliphatic Isocyanate Polyurethanes
NASA Technical Reports Server (NTRS)
Lewis, Pattie
2005-01-01
The objective of this project is to qualify candidate alternatives to Aliphatic Isocyanate Polyurethane coatings under the specifications for the standard system. This project will compare coating performance of the proposed alternatives to existing coating systems or standards.
A Rapid Assessment Tool for affirming good practice in midwifery education programming.
Fullerton, Judith T; Johnson, Peter; Lobe, Erika; Myint, Khine Haymar; Aung, Nan Nan; Moe, Thida; Linn, Nay Aung
2016-03-01
to design a criterion-referenced assessment tool that could be used globally in a rapid assessment of good practices and bottlenecks in midwifery education programs. a standard tool development process was followed, to generate standards and reference criteria; followed by external review and field testing to document psychometric properties. review of standards and scoring criteria were conducted by stakeholders around the globe. Field testing of the tool was conducted in Myanmar. eleven of Myanmar׳s 22 midwifery education programs participated in the assessment. the clinimetric tool was demonstrated to have content validity and high inter-rater reliability in use. a globally validated tool, and accompanying user guide and handbook are now available for conducting rapid assessments of compliance with good practice criteria in midwifery education programming. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
Gordon, Sarah; Daneshian, Mardas; Bouwstra, Joke; Caloni, Francesca; Constant, Samuel; Davies, Donna E; Dandekar, Gudrun; Guzman, Carlos A; Fabian, Eric; Haltner, Eleonore; Hartung, Thomas; Hasiwa, Nina; Hayden, Patrick; Kandarova, Helena; Khare, Sangeeta; Krug, Harald F; Kneuer, Carsten; Leist, Marcel; Lian, Guoping; Marx, Uwe; Metzger, Marco; Ott, Katharina; Prieto, Pilar; Roberts, Michael S; Roggen, Erwin L; Tralau, Tewes; van den Braak, Claudia; Walles, Heike; Lehr, Claus-Michael
2015-01-01
Models of the outer epithelia of the human body - namely the skin, the intestine and the lung - have found valid applications in both research and industrial settings as attractive alternatives to animal testing. A variety of approaches to model these barriers are currently employed in such fields, ranging from the utilization of ex vivo tissue to reconstructed in vitro models, and further to chip-based technologies, synthetic membrane systems and, of increasing current interest, in silico modeling approaches. An international group of experts in the field of epithelial barriers was convened from academia, industry and regulatory bodies to present both the current state of the art of non-animal models of the skin, intestinal and pulmonary barriers in their various fields of application, and to discuss research-based, industry-driven and regulatory-relevant future directions for both the development of new models and the refinement of existing test methods. Issues of model relevance and preference, validation and standardization, acceptance, and the need for simplicity versus complexity were focal themes of the discussions. The outcomes of workshop presentations and discussions, in relation to both current status and future directions in the utilization and development of epithelial barrier models, are presented by the attending experts in the current report.
Virtual Shaker Testing: Simulation Technology Improves Vibration Test Performance
NASA Technical Reports Server (NTRS)
Ricci, Stefano; Peeters, Bart; Fetter, Rebecca; Boland, Doug; Debille, Jan
2008-01-01
In the field of vibration testing, the interaction between the structure being tested and the instrumentation hardware used to perform the test is a critical issue. This is particularly true when testing massive structures (e.g. satellites), because due to physical design and manufacturing limits, the dynamics of the testing facility often couples with the test specimen one in the frequency range of interest. A further issue in this field is the standard use of a closed loop real-time vibration control scheme, which could potentially shift poles and change damping of the aforementioned coupled system. Virtual shaker testing is a novel approach to deal with these issues. It means performing a simulation which closely represents the real vibration test on the specific facility by taking into account all parameters which might impact the dynamic behavior of the specimen. In this paper, such a virtual shaker testing approach is developed. It consists of the following components: (1) Either a physical-based or an equation-based coupled electro-mechanical lumped parameter shaker model is created. The model parameters are obtained from manufacturer's specifications or by carrying out some dedicated experiments; (2) Existing real-time vibration control algorithm are ported to the virtual simulation environment; and (3) A structural model of the test object is created and after defining proper interface conditions structural modes are computed by means of the well-established Craig-Bampton CMS technique. At this stage, a virtual shaker test has been run, by coupling the three described models (shaker, control loop, structure) in a co-simulation routine. Numerical results have eventually been correlated with experimental ones in order to assess the robustness of the proposed methodology.
Desland, Fiona A; Afzal, Aqeela; Warraich, Zuha; Mocco, J
2014-01-01
Animal models of stroke have been crucial in advancing our understanding of the pathophysiology of cerebral ischemia. Currently, the standards for determining neurological deficit in rodents are the Bederson and Garcia scales, manual assessments scoring animals based on parameters ranked on a narrow scale of severity. Automated open field analysis of a live-video tracking system that analyzes animal behavior may provide a more sensitive test. Results obtained from the manual Bederson and Garcia scales did not show significant differences between pre- and post-stroke animals in a small cohort. When using the same cohort, however, post-stroke data obtained from automated open field analysis showed significant differences in several parameters. Furthermore, large cohort analysis also demonstrated increased sensitivity with automated open field analysis versus the Bederson and Garcia scales. These early data indicate use of automated open field analysis software may provide a more sensitive assessment when compared to traditional Bederson and Garcia scales.
Photovoltaic performance and reliability workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kroposki, B
1996-10-01
This proceedings is the compilation of papers presented at the ninth PV Performance and Reliability Workshop held at the Sheraton Denver West Hotel on September 4--6, 1996. This years workshop included presentations from 25 speakers and had over 100 attendees. All of the presentations that were given are included in this proceedings. Topics of the papers included: defining service lifetime and developing models for PV module lifetime; examining and determining failure and degradation mechanisms in PV modules; combining IEEE/IEC/UL testing procedures; AC module performance and reliability testing; inverter reliability/qualification testing; standardization of utility interconnect requirements for PV systems; need activitiesmore » to separate variables by testing individual components of PV systems (e.g. cells, modules, batteries, inverters,charge controllers) for individual reliability and then test them in actual system configurations; more results reported from field experience on modules, inverters, batteries, and charge controllers from field deployed PV systems; and system certification and standardized testing for stand-alone and grid-tied systems.« less
Automating linear accelerator quality assurance.
Eckhause, Tobias; Al-Hallaq, Hania; Ritter, Timothy; DeMarco, John; Farrey, Karl; Pawlicki, Todd; Kim, Gwe-Ya; Popple, Richard; Sharma, Vijeshwar; Perez, Mario; Park, SungYong; Booth, Jeremy T; Thorwarth, Ryan; Moran, Jean M
2015-10-01
The purpose of this study was 2-fold. One purpose was to develop an automated, streamlined quality assurance (QA) program for use by multiple centers. The second purpose was to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish as a reference for other centers. The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool which can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations which exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from the expected values. The test suite was designed to focus on the mechanical tests of the Linac to include jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm. The standard deviation in MLC position, as determined by EPID measurements, across the consortium was 0.33 mm for IMRT fields. With respect to the log files, the deviations between expected and actual positions for parameters were small (<0.12 mm) for all Linacs. Considering both log files and EPID measurements, all parameters were well within published tolerance values. Variations in collimator angle, MLC position, and gantry sag were also evaluated for all Linacs. The performance of the TrueBeam Linac model was shown to be consistent based on automated analysis of trajectory log files and EPID images acquired during delivery of a standardized test suite. The results can be compared directly to tolerance thresholds. In addition, sharing of results from standard tests across institutions can facilitate the identification of QA process and Linac changes. These reference values are presented along with the standard deviation for common tests so that the test suite can be used by other centers to evaluate their Linac performance against those in this consortium.
NASA Technical Reports Server (NTRS)
2006-01-01
Topics covered include: Magnetic-Field-Response Measurement-Acquisition System; Platform for Testing Robotic Vehicles on Simulated Terrain; Interferometer for Low-Uncertainty Vector Metrology; Rayleigh Scattering for Measuring Flow in a Nozzle Testing Facility; "Virtual Feel" Capaciflectors; FETs Based on Doped Polyaniline/Polyethylene Oxide Fibers; Miniature Housings for Electronics With Standard Interfaces; Integrated Modeling Environment; Modified Recursive Hierarchical Segmentation of Data; Sizing Structures and Predicting Weight of a Spacecraft; Stress Testing of Data-Communication Networks; Framework for Flexible Security in Group Communications; Software for Collaborative Use of Large Interactive Displays; Microsphere Insulation Panels; Single-Wall Carbon Nanotube Anodes for Lithium Cells; Tantalum-Based Ceramics for Refractory Composites; Integral Flexure Mounts for Metal Mirrors for Cryogenic Use; Templates for Fabricating Nanowire/Nanoconduit- Based Devices; Measuring Vapors To Monitor the State of Cure of a Resin; Partial-Vacuum-Gasketed Electrochemical Corrosion Cell; Theodolite Ring Lights; Integrating Terrain Maps Into a Reactive Navigation Strategy; Reducing Centroid Error Through Model-Based Noise Reduction; Adaptive Modeling Language and Its Derivatives; Stable Satellite Orbits for Global Coverage of the Moon; and Low-Cost Propellant Launch From a Tethered Balloon
NASA Astrophysics Data System (ADS)
Bolduc, A.; Gauthier, P.-A.; Berry, A.
2017-12-01
While perceptual evaluation and sound quality testing with jury are now recognized as essential parts of acoustical product development, they are rarely implemented with spatial sound field reproduction. Instead, monophonic, stereophonic or binaural presentations are used. This paper investigates the workability and interest of a method to use complete vibroacoustic engineering models for auralization based on 2.5D Wave Field Synthesis (WFS). This method is proposed in order that spatial characteristics such as directivity patterns and direction-of-arrival are part of the reproduced sound field while preserving the model complete formulation that coherently combines frequency and spatial responses. Modifications to the standard 2.5D WFS operators are proposed for extended primary sources, affecting the reference line definition and compensating for out-of-plane elementary primary sources. Reported simulations and experiments of reproductions of two physically-accurate vibroacoustic models of thin plates show that the proposed method allows for an effective reproduction in the horizontal plane: Spatial and frequency domains features are recreated. Application of the method to the sound rendering of a virtual transmission loss measurement setup shows the potential of the method for use in virtual acoustical prototyping for jury testing.
Convergence of broadband optical and wireless access networks
NASA Astrophysics Data System (ADS)
Chang, Gee-Kung; Jia, Zhensheng; Chien, Hung-Chang; Chowdhury, Arshad; Hsueh, Yu-Ting; Yu, Jianjun
2009-01-01
This paper describes convergence of optical and wireless access networks for delivering high-bandwidth integrated services over optical fiber and air links. Several key system technologies are proposed and experimentally demonstrated. We report here, for the first ever, a campus-wide field trial demonstration of radio-over-fiber (RoF) system transmitting uncompressed standard-definition (SD) high-definition (HD) real-time video contents, carried by 2.4-GHz radio and 60- GHz millimeter-wave signals, respectively, over 2.5-km standard single mode fiber (SMF-28) through the campus fiber network at Georgia Institute of Technology (GT). In addition, subsystem technologies of Base Station and wireless tranceivers operated at 60 GHz for real-time video distribution have been developed and tested.
NASA Astrophysics Data System (ADS)
Roger-Estrade, Jean; Boizard, Hubert; Peigné, Josephine; Sasal, Maria Carolina; Guimaraes, Rachel; Piron, Denis; Tomis, Vincent; Vian, Jean-François; Cadoux, Stephane; Ralisch, Ricardo; Filho, Tavares; Heddadj, Djilali; de Battista, Juan; Duparque, Annie
2016-04-01
In France, agronomists have studied the effects of cropping systems on soil structure, using a field method based on a visual description of soil structure. The "profil cultural" method (Manichon and Gautronneau, 1987) has been designed to perform a field diagnostic of the effects of tillage and compaction on soil structure dynamics. This method is of great use to agronomists improving crop management for a better preservation of soil structure. However, this method was developed and mainly used in conventional tillage systems, with ploughing. As several forms of reduced, minimum and no tillage systems are expanding in many parts of the world, it is necessary to re-evaluate the ability of this method to describe and interpret soil macrostructure in unploughed situations. In unploughed fields, soil structure dynamics of untilled layers is mainly driven by compaction and regeneration by natural agents (climatic conditions, root growth and macrofauna) and it is of major importance to evaluate the importance of these natural processes on soil structure regeneration. These concerns have led us to adapt the standard method and to propose amendments based on a series of field observations and experimental work in different situations of cropping systems, soil types and climatic conditions. We improved the description of crack type and we introduced an index of biological activity, based on the visual examination of clods. To test the improved method, a comparison with the reference method was carried out and the ability of the "profil cultural" method to make a diagnosis was tested on five experiments in France, Brazil and Argentina. Using the improved method, the impact of cropping systems on soil functioning was better assessed when natural processes were integrated into the description.
NASA Astrophysics Data System (ADS)
Kiekebusch, Mario J.; Di Lieto, Nicola; Sandrock, Stefan; Popovic, Dan; Chiozzi, Gianluca
2014-07-01
ESO is in the process of implementing a new development platform, based on PLCs, for upcoming VLT control systems (new instruments and refurbishing of existing systems to manage obsolescence issues). In this context, we have evaluated the integration and reuse of existing C++ libraries and Simulink models into the real-time environment of BECKHOFF Embedded PCs using the capabilities of the latest version of TwinCAT software and MathWorks Embedded Coder. While doing so the aim was to minimize the impact of the new platform by adopting fully tested solutions implemented in C++. This allows us to reuse the in house expertise, as well as extending the normal capabilities of the traditional PLC programming environments. We present the progress of this work and its application in two concrete cases: 1) field rotation compensation for instrument tracking devices like derotators, 2) the ESO standard axis controller (ESTAC), a generic model-based controller implemented in Simulink and used for the control of telescope main axes.
NASA Technical Reports Server (NTRS)
Sherriff, Abigail
2015-01-01
The Field Test study is currently in full swing, preceded by the successful completion of the Pilot Field Test study that paved the way for collecting data on the astronauts in the medical tent in Kazakhstan. Abigail Sherriff worked alongside Logan Dobbe on one Field Test aspect to determine foot clearance over obstacles (5cm, 10cm, and 15cm) using APDM Inc. Internal Measurement Units (IMU) worn by the astronauts. They created a program to accurately calculate foot clearance using the accelerometer, magnetometer, and gyroscope data with the IMUs attached to the top of the shoes. To validate the functionality of their program, they completed a successful study on test subjects performing various tasks in an optical motion studio, considered a gold standard in biomechanics research. Future work will include further validation and expanding the program to include other analyses.
Developing product quality standards for wheelchairs used in less-resourced environments
McCambridge, Matt; Reese, Norman; Schoendorfer, Don; Wunderlich, Eric; Rushman, Chris; Mahilo, Dave
2017-01-01
Background Premature failures of wheelchairs in less-resourced environments (LREs) may be because of shortcomings in product regulation and quality standards. The standards published by the International Organization for Standardization (ISO) specify wheelchair tests for durability, safety and performance, but their applicability to products used in the rugged conditions of LREs is unclear. Because of this, wheelchair-related guidelines published by the World Health Organization recommended developing more rigorous durability tests for wheelchairs. Objectives This study was performed to identify the additional tests needed for LREs. Methods First, a literature review of the development of ISO test standards, wheelchair standards testing studies and wheelchair evaluations in LREs was performed. Second, expert advice from members of the Standards Working Group of the International Society of Wheelchair Professionals (ISWP) was compiled and reviewed. Results A total of 35 articles were included in the literature review. Participation from LREs was not observed in the ISO standards development. As per wheelchair testing study evidence, wheelchair models delivered in LREs did not meet the minimum standards requirement. Multiple part failures and repairs were observed with reviewed field evaluation studies. ISWP experts noted that several testing factors responsible for premature failures with wheelchair parts are not included in the standards and accordingly provided advice for additional test development. Conclusion The study findings indicate the need to develop a wide range of tests, with specific tests for measuring corrosion resistance of the entire wheelchair, rolling resistance of castors and rear wheels, and durability of whole wheelchair and castor assemblies. PMID:28936410
Meyer, B; Morin, V N; Rödger, H-J; Holah, J; Bird, C
2010-04-01
The results from European standard disinfectant tests are used as one basis to approve the use of disinfectants in Europe. The design of these laboratory-based tests should thus simulate as closely as possible the practical conditions and challenges that the disinfectants would encounter in use. No evidence is available that the organic and microbial loading in these tests simulates actual levels in the food service sector. Total organic carbon (TOC) and total viable count (TVC) were determined on 17 visibly clean and 45 visibly dirty surfaces in two restaurants and the food preparation surfaces of a large retail store. These values were compared to reference values recovered from surfaces soiled with the organic and microbial loading, following the standard conditions of the European Surface Test for bactericidal efficacy, EN 13697. The TOC reference values for clean and dirty conditions were higher than the data from practice, but cannot be regarded as statistical outliers. This was considered as a conservative assessment; however, as additional nine TOC samples from visibly dirty surfaces were discarded from the analysis, as their loading made them impossible to process. Similarly, the recovery of test organisms from surfaces contaminated according to EN 13697 was higher than the TVC from visibly dirty surfaces in practice; though they could not be regarded as statistical outliers of the whole data field. No correlation was found between TVC and TOC in the sampled data, which re-emphasizes the potential presence of micro-organisms on visibly clean surfaces and thus the need for the same degree of disinfection as visibly dirty surfaces. The organic soil and the microbial burden used in EN disinfectant standards represent a realistic worst-case scenario for disinfectants used in the food service and food-processing areas.
NASA Astrophysics Data System (ADS)
Kim, Byung Chan; Park, Seong-Ook
In order to determine exposure compliance with the electromagnetic fields from a base station's antenna in the far-field region, we should calculate the spatially averaged field value in a defined space. This value is calculated based on the measured value obtained at several points within the restricted space. According to the ICNIRP guidelines, at each point in the space, the reference levels are averaged over any 6min (from 100kHz to 10GHz) for the general public. Therefore, the more points we use, the longer the measurement time becomes. For practical application, it is very advantageous to spend less time for measurement. In this paper, we analyzed the difference of average values between 6min and lesser periods and compared it with the standard uncertainty for measurement drift. Based on the standard deviation from the 6min averaging value, the proposed minimum averaging time is 1min.
Zainabadi, Kayvan; Adams, Matthew; Han, Zay Yar; Lwin, Hnin Wai; Han, Kay Thwe; Ouattara, Amed; Thura, Si; Plowe, Christopher V; Nyunt, Myaing M
2017-09-18
Greater Mekong Subregion countries are committed to eliminating Plasmodium falciparum malaria by 2025. Current elimination interventions target infections at parasite densities that can be detected by standard microscopy or rapid diagnostic tests (RDTs). More sensitive detection methods have been developed to detect lower density "asymptomatic" infections that may represent an important transmission reservoir. These ultrasensitive polymerase chain reaction (usPCR) tests have been used to identify target populations for mass drug administration (MDA). To date, malaria usPCR tests have used either venous or capillary blood sampling, which entails complex sample collection, processing and shipping requirements. An ultrasensitive method performed on standard dried blood spots (DBS) would greatly facilitate the molecular surveillance studies needed for targeting elimination interventions. A highly sensitive method for detecting Plasmodium falciparum and P. vivax 18S ribosomal RNA from DBS was developed by empirically optimizing nucleic acid extraction conditions. The limit of detection (LoD) was determined using spiked DBS samples that were dried and stored under simulated field conditions. Further, to assess its utility for routine molecular surveillance, two cross-sectional surveys were performed in Myanmar during the wet and dry seasons. The lower LoD of the DBS-based ultrasensitive assay was 20 parasites/mL for DBS collected on Whatman 3MM filter paper and 23 parasites/mL for Whatman 903 Protein Saver cards-equivalent to 1 parasite per 50 µL DBS. This is about 5000-fold more sensitive than standard RDTs and similar to the LoD of ≤16-22 parasites/mL reported for other ultrasensitive methods based on whole blood. In two cross-sectional surveys in Myanmar, nearly identical prevalence estimates were obtained from contemporaneous DBS samples and capillary blood samples collected during the wet and dry season. The DBS-based ultrasensitive method described in this study shows equal sensitivity as previously described methods based on whole blood, both in its limit of detection and prevalence estimates in two field surveys. The reduced cost and complexity of this method will allow for the scale-up of surveillance studies to target MDA and other malaria elimination interventions, and help lead to a better understanding of the epidemiology of low-density malaria infections.
NASA Astrophysics Data System (ADS)
Salyer, Terry
2017-06-01
For the bulk of detonation performance experiments, a fairly basic set of diagnostic techniques has evolved as the standard for acquiring the necessary measurements. Gold standard techniques such as pin switches and streak cameras still produce the high-quality data required, yet much room remains for improvement with regard to ease of use, cost of fielding, breadth of data, and diagnostic versatility. Over the past several years, an alternate set of diagnostics has been under development to replace many of these traditional techniques. Pulse Correlation Reflectometry (PCR) is a capable substitute for pin switches with the advantage of obtaining orders of magnitude more data at a small fraction of the cost and fielding time. Spectrally Encoded Imaging (SEI) can replace most applications of streak camera with the advantage of imaging surfaces through a single optical fiber that are otherwise optically inaccessible. Such diagnostics advance the measurement state of the art, but even further improvements may come through revamping the standardized tests themselves such as the copper cylinder expansion test. At the core of this modernization, the aforementioned diagnostics play a significant role in revamping and improving the standard test suite for the present era. This research was performed under the auspices of the United States Department of Energy.
Environmental assessment of the 40 kilowatt fuel cell system field test operation
NASA Technical Reports Server (NTRS)
Bollenbacher, G.
1982-01-01
This environmental assessment examines the potential environmental consequences, both adverse and beneficial, of the 40 kW fuel cell system system field test operation. The assessment is of necessity generic in nature since actual test sites were not selected. This assessment provides the basis for determining the need for an environmental impact statement. In addition, this assessment provides siting criteria to avoid or minimize negative environmental impacts and standards for determining candidate test sites, if any, for which site specific assessments may be required.
Testing Modified Gravity Theories via Wide Binaries and GAIA
NASA Astrophysics Data System (ADS)
Pittordis, Charalambos; Sutherland, Will
2018-06-01
The standard ΛCDM model based on General Relativity (GR) including cold dark matter (CDM) is very successful at fitting cosmological observations, but recent non-detections of candidate dark matter (DM) particles mean that various modified-gravity theories remain of significant interest. The latter generally involve modifications to GR below a critical acceleration scale ˜10-10 m s-2. Wide-binary (WB) star systems with separations ≳ 5 kAU provide an interesting test for modified gravity, due to being in or near the low-acceleration regime and presumably containing negligible DM. Here, we explore the prospects for new observations pending from the GAIA spacecraft to provide tests of GR against MOND or TeVes-like theories in a regime only partially explored to date. In particular, we find that a histogram of (3D) binary relative velocities, relative to equilibrium circular velocity predicted from the (2D) projected separation predicts a rather sharp feature in this distribution for standard gravity, with an 80th (90th) percentile value close to 1.025 (1.14) with rather weak dependence on the eccentricity distribution. However, MOND/TeVeS theories produce a shifted distribution, with a significant increase in these upper percentiles. In MOND-like theories without an external field effect, there are large shifts of order unity. With the external field effect included, the shifts are considerably reduced to ˜0.04 - 0.08, but are still potentially detectable statistically given reasonably large samples and good control of contaminants. In principle, followup of GAIA-selected wide binaries with ground-based radial velocities accurate to ≲ 0.03 { km s^{-1}} should be able to produce an interesting new constraint on modified-gravity theories.
Assuring long-term reliability of concentrator PV systems
NASA Astrophysics Data System (ADS)
McConnell, R.; Garboushian, V.; Brown, J.; Crawford, C.; Darban, K.; Dutra, D.; Geer, S.; Ghassemian, V.; Gordon, R.; Kinsey, G.; Stone, K.; Turner, G.
2009-08-01
Concentrator PV (CPV) systems have attracted significant interest because these systems incorporate the world's highest efficiency solar cells and they are targeting the lowest cost production of solar electricity for the world's utility markets. Because these systems are just entering solar markets, manufacturers and customers need to assure their reliability for many years of operation. There are three general approaches for assuring CPV reliability: 1) field testing and development over many years leading to improved product designs, 2) testing to internationally accepted qualification standards (especially for new products) and 3) extended reliability tests to identify critical weaknesses in a new component or design. Amonix has been a pioneer in all three of these approaches. Amonix has an internal library of field failure data spanning over 15 years that serves as the basis for its seven generations of CPV systems. An Amonix product served as the test CPV module for the development of the world's first qualification standard completed in March 2001. Amonix staff has served on international standards development committees, such as the International Electrotechnical Commission (IEC), in support of developing CPV standards needed in today's rapidly expanding solar markets. Recently Amonix employed extended reliability test procedures to assure reliability of multijunction solar cell operation in its seventh generation high concentration PV system. This paper will discuss how these three approaches have all contributed to assuring reliability of the Amonix systems.
Evaluation and Field Assessment of Bifacial Photovoltaic Module Power Rating Methodologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deline, Chris; MacAlpine, Sara; Marion, Bill
2016-11-21
1-sun power ratings for bifacial modules are currently undefined. This is partly because there is no standard definition of rear irradiance given 1000 Wm-2 on the front. Using field measurements and simulations, we evaluate multiple deployment scenarios for bifacial modules and provide details on the amount of irradiance that could be expected. A simplified case that represents a single module deployed under conditions consistent with existing 1-sun irradiance standards leads to a bifacial reference condition of 1000 Wm-2 Gfront and 130-140 Wm-2 Grear. For fielded systems of bifacial modules, Grear magnitude and spatial uniformity will be affected by self-shade frommore » adjacent modules, varied ground cover, and ground-clearance height. A standard measurement procedure for bifacial modules is also currently undefined. A proposed international standard is under development, which provides the motivation for this work. Here, we compare outdoor field measurements of bifacial modules with irradiance on both sides with proposed indoor test methods where irradiance is only applied to one side at a time. The indoor method has multiple advantages, including controlled and repeatable irradiance and thermal environment, along with allowing the use of conventional single-sided flash test equipment. The comparison results are promising, showing that the indoor and outdoor methods agree within 1%-2% for multiple rear-irradiance conditions and bifacial module types.« less
Evaluation and Field Assessment of Bifacial Photovoltaic Module Power Rating Methodologies: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deline, Chris; MacAlpine, Sara; Marion, Bill
2016-06-16
1-sun power ratings for bifacial modules are currently undefined. This is partly because there is no standard definition of rear irradiance given 1000 Wm-2 on the front. Using field measurements and simulations, we evaluate multiple deployment scenarios for bifacial modules and provide details on the amount of irradiance that could be expected. A simplified case that represents a single module deployed under conditions consistent with existing 1-sun irradiance standards leads to a bifacial reference condition of 1000 Wm-2 Gfront and 130-140 Wm-2 Grear. For fielded systems of bifacial modules, Grear magnitude and spatial uniformity will be affected by self-shade frommore » adjacent modules, varied ground cover, and ground-clearance height. A standard measurement procedure for bifacial modules is also currently undefined. A proposed international standard is under development, which provides the motivation for this work. Here, we compare outdoor field measurements of bifacial modules with irradiance on both sides with proposed indoor test methods where irradiance is only applied to one side at a time. The indoor method has multiple advantages, including controlled and repeatable irradiance and thermal environment, along with allowing the use of conventional single-sided flash test equipment. The comparison results are promising, showing that the indoor and outdoor methods agree within 1%-2% for multiple rear-irradiance conditions and bifacial module types.« less
Skaane, Per; Young, Kari; Skjennald, Arnulf
2003-12-01
To compare screen-film and full-field digital mammography with soft-copy reading in a population-based screening program. Full-field digital and screen-film mammography were performed in 3,683 women aged 50-69 years. Two standard views of each breast were acquired with each modality. Images underwent independent double reading with use of a five-point rating scale for probability of cancer. Recall rates and positive predictive values were calculated. Cancer detection rates determined with both modalities were compared by using the McNemar test for paired proportions. Retrospective side-by-side analysis for conspicuity of cancers was performed by an external independent radiologist group with experience in both modalities. In 3,683 cases, 31 cancers were detected. Screen-film mammography depicted 28 (0.76%) malignancies, and full-field digital mammography depicted 23 (0.62%) malignancies. The difference between cancer detection rates was not significant (P =.23). The recall rate for full-field digital mammography (4.6%; 168 of 3,683 cases) was slightly higher than that for screen-film mammography (3.5%; 128 of 3,683 cases). The positive predictive value based on needle biopsy results was 46% for screen-film mammography and 39% for full-field digital mammography. Side-by-side image comparison for cancer conspicuity led to classification of 19 cancers as equal for probability of malignancy, six cancers as slightly better demonstrated at screen-film mammography, and six cancers as slightly better demonstrated at full-field digital mammography. There was no statistically significant difference in cancer detection rate between screen-film and full-field digital mammography. Cancer conspicuity was equal with both modalities. Full-field digital mammography with soft-copy reading is comparable to screen-film mammography in population-based screening.
Validation of the Puumala virus rapid field test for bank voles in Germany.
Reil, D; Imholt, C; Rosenfeld, U M; Drewes, S; Fischer, S; Heuser, E; Petraityte-Burneikiene, R; Ulrich, R G; Jacob, J
2017-02-01
Puumala virus (PUUV) causes many human infections in large parts of Europe and can lead to mild to moderate disease. The bank vole (Myodes glareolus) is the only reservoir of PUUV in Central Europe. A commercial PUUV rapid field test for rodents was validated for bank-vole blood samples collected in two PUUV-endemic regions in Germany (North Rhine-Westphalia and Baden-Württemberg). A comparison of the results of the rapid field test and standard ELISAs indicated a test efficacy of 93-95%, largely independent of the origin of the antigens used in the ELISA. In ELISAs, reactivity for the German PUUV strain was higher compared to the Swedish strain but not compared to the Finnish strain, which was used for the rapid field test. In conclusion, the use of the rapid field test can facilitate short-term estimation of PUUV seroprevalence in bank-vole populations in Germany and can aid in assessing human PUUV infection risk.
1987-07-01
Groundwater." Developments in Industrial Microbiology, Volume 24, pp. 225-234. Society of Industrial Microbiology, Arlington, Virginia. 18. Product ...ESL-TR-85-52 cv) VOLUME II CN IN SITU BIOLOGICAL TREATMENT TEST AT KELLY AIR FORCE BASE, VOLUME !1: FIELD TEST RESULTS AND COST MODEL R.S. WETZEL...Kelly Air Force Base, Volume II: Field Test Results and Cost Model (UNCLASSIFIED) 12 PERSONAL AUTHOR(S) Roger S. Wetzel, Connie M. Durst, Donald H
In Situ Solid Particle Generator
NASA Technical Reports Server (NTRS)
Agui, Juan H.; Vijayakumar, R.
2013-01-01
Particle seeding is a key diagnostic component of filter testing and flow imaging techniques. Typical particle generators rely on pressurized air or gas sources to propel the particles into the flow field. Other techniques involve liquid droplet atomizers. These conventional techniques have drawbacks that include challenging access to the flow field, flow and pressure disturbances to the investigated flow, and they are prohibitive in high-temperature, non-standard, extreme, and closed-system flow conditions and environments. In this concept, the particles are supplied directly within a flow environment. A particle sample cartridge containing the particles is positioned somewhere inside the flow field. The particles are ejected into the flow by mechanical brush/wiper feeding and sieving that takes place within the cartridge chamber. Some aspects of this concept are based on established material handling techniques, but they have not been used previously in the current configuration, in combination with flow seeding concepts, and in the current operational mode. Unlike other particle generation methods, this concept has control over the particle size range ejected, breaks up agglomerates, and is gravity-independent. This makes this device useful for testing in microgravity environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dana, S.; Damiani, R.; vanDam, J.
As part of an ongoing effort to improve the modeling and prediction of small wind turbine dynamics, NREL tested a small horizontal axis wind turbine in the field at the National Wind Technology Center (NWTC). The test turbine was a 2.1-kW downwind machine mounted on an 18-meter multi-section fiberglass composite tower. The tower was instrumented and monitored for approximately 6 months. The collected data were analyzed to assess the turbine and tower loads and further validate the simplified loads equations from the International Electrotechnical Commission (IEC) 61400-2 design standards. Field-measured loads were also compared to the output of an aeroelasticmore » model of the turbine. Ultimate loads at the tower base were assessed using both the simplified design equations and the aeroelastic model output. The simplified design equations in IEC 61400-2 do not accurately model fatigue loads. In this project, we compared fatigue loads as measured in the field, as predicted by the aeroelastic model, and as calculated using the simplified design equations.« less
Zarkovic, Andrea; Mora, Justin; McKelvie, James; Gamble, Greg
2007-12-01
The aim of the study was to establish the correlation between visual filed loss as shown by second-generation Frequency Doubling Technology (Humphrey Matrix) and Standard Automated Perimetry (Humphrey Field Analyser) in patients with glaucoma. Also, compared were the test duration and reliability. Forty right eyes from glaucoma patients from a private ophthalmology practice were included in this prospective study. All participants had tests within an 8-month period. Pattern deviation plots and mean deviation were compared to establish the correlation between the two perimetry tests. Overall correlation and correlation between hemifields, quadrants and individual test locations were assessed. Humphrey Field Analyser tests were slightly more reliable (37/40 vs. 34/40 for Matrix)) but overall of longer duration. There was good correlation (0.69) between mean deviations. Superior hemifields and superonasal quadrants had the highest correlation (0.88 [95% CI 0.79, 0.94]). Correlation between individual points was independent of distance from the macula. Generally, the Matrix and Humphrey Field Analyser perimetry correlate well; however, each machine utilizes a different method of analysing data and thus the direct comparison should be made with caution.
Static Load Test on Instrumented Pile - Field Data and Numerical Simulations
NASA Astrophysics Data System (ADS)
Krasiński, Adam; Wiszniewski, Mateusz
2017-09-01
Static load tests on foundation piles are generally carried out in order to determine load - the displacement characteristic of the pile head. For standard (basic) engineering practices this type of test usually provides enough information. However, the knowledge of force distribution along the pile core and its division into the friction along the shaft and the resistance under the base can be very useful. Such information can be obtained by strain gage pile instrumentation [1]. Significant investigations have been completed on this technology, proving its utility and correctness [8], [10], [12]. The results of static tests on instrumented piles are not easy to interpret. There are many factors and processes affecting the final outcome. In order to understand better the whole testing process and soil-structure behavior some investigations and numerical analyses were done. In the paper, real data from a field load test on instrumented piles is discussed and compared with numerical simulation of such a test in similar conditions. Differences and difficulties in the results interpretation with their possible reasons are discussed. Moreover, the authors used their own analytical solution for more reliable determination of force distribution along the pile. The work was presented at the XVII French-Polish Colloquium of Soil and Rock Mechanics, Łódź, 28-30 November 2016.
EVALUATION OF A FIELD TEST KIT FOR MONITORING LEAD IN DRINKING WATER.
This article describes a conceptual framework for designing evaluation studies of test kits for the analysis of significant drinking water constituents. A commercial test kit for the analysis of lead in tap waters was evaluated and compared with a standard graphite furnace atomic...
NASA Astrophysics Data System (ADS)
Lai, T.; Biggie, R.; Brooks, A.; Potter, B. G.; Simmons-Potter, K.
2015-09-01
Lifecycle degradation testing of photovoltaic (PV) modules in accelerated-degradation chambers can enable the prediction both of PV performance lifetimes and of return-on-investment for installations of PV systems. With degradation results strongly dependent on chamber test parameters, the validity of such studies relative to fielded, installed PV systems must be determined. In the present work, accelerated aging of a 250 W polycrystalline silicon module is compared to real-time performance degradation in a similar polycrystalline-silicon, fielded, PV technology that has been operating since October 2013. Investigation of environmental aging effects are performed in a full-scale, industrial-standard environmental chamber equipped with single-sun irradiance capability providing illumination uniformity of 98% over a 2 x 1.6 m area. Time-dependent, photovoltaic performance (J-V) is evaluated over a recurring, compressed night-day cycle providing representative local daily solar insolation for the southwestern United States, followed by dark (night) cycling. This cycle is synchronized with thermal and humidity environmental variations that are designed to mimic, as closely as possible, test-yard conditions specific to a 12 month weather profile for a fielded system in Tucson, AZ. Results confirm the impact of environmental conditions on the module long-term performance. While the effects of temperature de-rating can be clearly seen in the data, removal of these effects enables the clear interpretation of module efficiency degradation with time and environmental exposure. With the temperature-dependent effect removed, the normalized efficiency is computed and compared to performance results from another panel of similar technology that has previously experienced identical climate changes in the test yard. Analysis of relative PV module efficiency degradation for the chamber-tested system shows good comparison to the field-tested system with ~2.5% degradation following an equivalent year of testing.
Field evaluation of a prototype paper-based point-of-care fingerstick transaminase test.
Pollock, Nira R; McGray, Sarah; Colby, Donn J; Noubary, Farzad; Nguyen, Huyen; Nguyen, The Anh; Khormaee, Sariah; Jain, Sidhartha; Hawkins, Kenneth; Kumar, Shailendra; Rolland, Jason P; Beattie, Patrick D; Chau, Nguyen V; Quang, Vo M; Barfield, Cori; Tietje, Kathy; Steele, Matt; Weigl, Bernhard H
2013-01-01
Monitoring for drug-induced liver injury (DILI) via serial transaminase measurements in patients on potentially hepatotoxic medications (e.g., for HIV and tuberculosis) is routine in resource-rich nations, but often unavailable in resource-limited settings. Towards enabling universal access to affordable point-of-care (POC) screening for DILI, we have performed the first field evaluation of a paper-based, microfluidic fingerstick test for rapid, semi-quantitative, visual measurement of blood alanine aminotransferase (ALT). Our objectives were to assess operational feasibility, inter-operator variability, lot variability, device failure rate, and accuracy, to inform device modification for further field testing. The paper-based ALT test was performed at POC on fingerstick samples from 600 outpatients receiving HIV treatment in Vietnam. Results, read independently by two clinic nurses, were compared with gold-standard automated (Roche Cobas) results from venipuncture samples obtained in parallel. Two device lots were used sequentially. We demonstrated high inter-operator agreement, with 96.3% (95% C.I., 94.3-97.7%) agreement in placing visual results into clinically-defined "bins" (<3x, 3-5x, and >5x upper limit of normal), >90% agreement in validity determination, and intraclass correlation coefficient of 0.89 (95% C.I., 0.87-0.91). Lot variability was observed in % invalids due to hemolysis (21.1% for Lot 1, 1.6% for Lot 2) and correlated with lots of incorporated plasma separation membranes. Invalid rates <1% were observed for all other device controls. Overall bin placement accuracy for the two readers was 84% (84.3%/83.6%). Our findings of extremely high inter-operator agreement for visual reading-obtained in a target clinical environment, as performed by local practitioners-indicate that the device operation and reading process is feasible and reproducible. Bin placement accuracy and lot-to-lot variability data identified specific targets for device optimization and material quality control. This is the first field study performed with a patterned paper-based microfluidic device and opens the door to development of similar assays for other important analytes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2014-10-01
In this project, Building America team IBACOS performed field testing in a new construction unoccupied test house in Pittsburgh, Pennsylvania to evaluate heating, ventilating, and air conditioning (HVAC) distribution systems during heating, cooling, and midseason conditions. Four air-based HVAC distribution systems were assessed:-a typical airflow ducted system to the bedrooms, a low airflow ducted system to the bedrooms, a system with transfer fans to the bedrooms, and a system with no ductwork to the bedrooms. The relative ability of each system was considered with respect to relevant Air Conditioning Contractors of America and ASHRAE standards for house temperature uniformity andmore » stability, respectively.« less
Jacobson, R H; Downing, D R; Lynch, T J
1982-11-15
A computer-assisted enzyme-linked immunosorbent assay (ELISA) system, based on kinetics of the reaction between substrate and enzyme molecules, was developed for testing large numbers of sera in laboratory applications. Systematic and random errors associated with conventional ELISA technique were identified leading to results formulated on a statistically validated, objective, and standardized basis. In a parallel development, an inexpensive system for field and veterinary office applications contained many of the qualities of the computer-assisted ELISA. This system uses a fluorogenic indicator (rather than the enzyme-substrate interaction) in a rapid test (15 to 20 minutes' duration) which promises broad application in serodiagnosis.
NASA Astrophysics Data System (ADS)
Miles, Natasha L.; Martins, Douglas K.; Richardson, Scott J.; Rella, Christopher W.; Arata, Caleb; Lauvaux, Thomas; Davis, Kenneth J.; Barkley, Zachary R.; McKain, Kathryn; Sweeney, Colm
2018-03-01
Four in situ cavity ring-down spectrometers (G2132-i, Picarro, Inc.) measuring methane dry mole fraction (CH4), carbon dioxide dry mole fraction (CO2), and the isotopic ratio of methane (δ13CH4) were deployed at four towers in the Marcellus Shale natural gas extraction region of Pennsylvania. In this paper, we describe laboratory and field calibration of the analyzers for tower-based applications and characterize their performance in the field for the period January-December 2016. Prior to deployment, each analyzer was tested using bottles with various isotopic ratios, from biogenic to thermogenic source values, which were diluted to varying degrees in zero air, and an initial calibration was performed. Furthermore, at each tower location, three field tanks were employed, from ambient to high mole fractions, with various isotopic ratios. Two of these tanks were used to adjust the calibration of the analyzers on a daily basis. We also corrected for the cross-interference from ethane on the isotopic ratio of methane. Using an independent field tank for evaluation, the standard deviation of 4 h means of the isotopic ratio of methane difference from the known value was found to be 0.26 ‰ δ13CH4. Following improvements in the field tank testing scheme, the standard deviation of 4 h means was 0.11 ‰, well within the target compatibility of 0.2 ‰. Round-robin style testing using tanks with near-ambient isotopic ratios indicated mean errors of -0.14 to 0.03 ‰ for each of the analyzers. Flask to in situ comparisons showed mean differences over the year of 0.02 and 0.08 ‰, for the east and south towers, respectively. Regional sources in this region were difficult to differentiate from strong perturbations in the background. During the afternoon hours, the median differences of the isotopic ratio measured at three of the towers, compared to the background tower, were &minus0.15 to 0.12 ‰ with standard deviations of the 10 min isotopic ratio differences of 0.8 ‰. In terms of source attribution, analyzer compatibility of 0.2 ‰ δ13CH4 affords the ability to distinguish a 50 ppb CH4 peak from a biogenic source (at -60 ‰, for example) from one originating from a thermogenic source (-35 ‰), with the exact value dependent upon the source isotopic ratios. Using a Keeling plot approach for the non-afternoon data at a tower in the center of the study region, we determined the source isotopic signature to be -31.2 ± 1.9 ‰, within the wide range of values consistent with a deep-layer Marcellus natural gas source.
Mödden, Claudia; Behrens, Marion; Damke, Iris; Eilers, Norbert; Kastrup, Andreas; Hildebrandt, Helmut
2012-06-01
Compensatory and restorative treatments have been developed to improve visual field defects after stroke. However, no controlled trials have compared these interventions with standard occupational therapy (OT). A total of 45 stroke participants with visual field defect admitted for inpatient rehabilitation were randomized to restorative computerized training (RT) using computer-based stimulation of border areas of their visual field defects or to a computer-based compensatory therapy (CT) teaching a visual search strategy. OT, in which different compensation strategies were used to train for activities of daily living, served as standard treatment for the active control group. Each treatment group received 15 single sessions of 30 minutes distributed over 3 weeks. The primary outcome measures were visual field expansion for RT, visual search performance for CT, and reading performance for both treatments. Visual conjunction search, alertness, and the Barthel Index were secondary outcomes. Compared with OT, CT resulted in a better visual search performance, and RT did not result in a larger expansion of the visual field. Intragroup pre-post comparisons demonstrated that CT improved all defined outcome parameters and RT several, whereas OT only improved one. CT improved functional deficits after visual field loss compared with standard OT and may be the intervention of choice during inpatient rehabilitation. A larger trial that includes lesion location in the analysis is recommended.
Prewhitening of Colored Noise Fields for Detection of Threshold Sources
1993-11-07
determines the noise covariance matrix, prewhitening techniques allow detection of threshold sources. The multiple signal classification ( MUSIC ...SUBJECT TERMS 1S. NUMBER OF PAGES AR Model, Colored Noise Field, Mixed Spectra Model, MUSIC , Noise Field, 52 Prewhitening, SNR, Standardized Test...EXAMPLE 2: COMPLEX AR COEFFICIENT .............................................. 5 EXAMPLE 3: MUSIC IN A COLORED BACKGROUND NOISE ...................... 6
Recent Progress on Sonic Boom Research at NASA
NASA Technical Reports Server (NTRS)
Loubeau, Alexandra
2012-01-01
Sonic boom research conducted at NASA through the Supersonics Project of the Fundamental Aeronautics Program is oriented toward understanding the potential impact of sonic boom noise on communities from new low-boom supersonic aircraft designs. Encompassing research in atmospheric propagation, structural response, and human response, NASA research contributes to knowledge in key areas needed to support development of a new noise-based standard for supersonic aircraft certification. Partnerships with several industry, government, and academic institutions have enabled the recent execution of several acoustic field studies on sonic booms. An overview of recent activities funded by NASA includes: focus boom model development and experimental validation, field experiments of structural transmission of sonic booms into large buildings, and low boom community response testing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prabakar, Kumaraguru; Shirazi, Mariko; Singh, Akanksha
Penetration levels of solar photovoltaic (PV) generation on the electric grid have increased in recent years. In the past, most PV installations have not included grid-support functionalities. But today, standards such as the upcoming revisions to IEEE 1547 recommend grid support and anti-islanding functions-including volt-var, frequency-watt, volt-watt, frequency/voltage ride-through, and other inverter functions. These functions allow for the standardized interconnection of distributed energy resources into the grid. This paper develops and tests low-level inverter current control and high-level grid support functions. The controller was developed to integrate advanced inverter functions in a systematic approach, thus avoiding conflict among the differentmore » control objectives. The algorithms were then programmed on an off-the-shelf, embedded controller with a dual-core computer processing unit and field-programmable gate array (FPGA). This programmed controller was tested using a controller-hardware-in-the-loop (CHIL) test bed setup using an FPGA-based real-time simulator. The CHIL was run at a time step of 500 ns to accommodate the 20-kHz switching frequency of the developed controller. The details of the advanced control function and CHIL test bed provided here will aide future researchers when designing, implementing, and testing advanced functions of PV inverters.« less
[Phantom studies of ultrasound equipment for quality improvement in breast diagnosis].
Madjar, H; Mundinger, A; Lattermann, U; Gufler, H; Prömpeler, H J
1996-04-01
According to the German guidelines for quality control of ultrasonic equipment, the following conditions are required for breast ultrasound: A transducer frequency between 5-7.5 MHz and a minimum field of view of 5 cm. Satisfactory images must be obtained in a depth between 0.5 and 4 cm with a wide tolerance of the focal zones. This allows the use of poor quality equipment which does not produce satisfactory image quality and it excludes a number of high frequency and high resolution transducers with a field of view below 5 cm. This study with a test phantom was performed to define image quality objectively. Sixteen ultrasound instruments in different price categories were used to perform standardized examinations on a breast phantom model 550 (ATS Laboratories, Bridgeport, USA). Contrast and spatial resolution in different penetration depths were investigated on cyst phantoms from 1-4 mm diameter and wire targets with defined distances between 0.5-3 mm 4 investigations reported the images. A positive correlation was seen between price category and image quality. This study demonstrates that transducer frequency and image geometry do not allow sufficient quality control. An improvement of ultrasound diagnosis is only possible if equipment guidelines are based on standard examinations with test phantoms.
NASA Astrophysics Data System (ADS)
Gosses, Moritz; Nowak, Wolfgang; Wöhling, Thomas
2018-05-01
In recent years, proper orthogonal decomposition (POD) has become a popular model reduction method in the field of groundwater modeling. It is used to mitigate the problem of long run times that are often associated with physically-based modeling of natural systems, especially for parameter estimation and uncertainty analysis. POD-based techniques reproduce groundwater head fields sufficiently accurate for a variety of applications. However, no study has investigated how POD techniques affect the accuracy of different boundary conditions found in groundwater models. We show that the current treatment of boundary conditions in POD causes inaccuracies for these boundaries in the reduced models. We provide an improved method that splits the POD projection space into a subspace orthogonal to the boundary conditions and a separate subspace that enforces the boundary conditions. To test the method for Dirichlet, Neumann and Cauchy boundary conditions, four simple transient 1D-groundwater models, as well as a more complex 3D model, are set up and reduced both by standard POD and POD with the new extension. We show that, in contrast to standard POD, the new method satisfies both Dirichlet and Neumann boundary conditions. It can also be applied to Cauchy boundaries, where the flux error of standard POD is reduced by its head-independent contribution. The extension essentially shifts the focus of the projection towards the boundary conditions. Therefore, we see a slight trade-off between errors at model boundaries and overall accuracy of the reduced model. The proposed POD extension is recommended where exact treatment of boundary conditions is required.
Characterization of commercial magnetorheological fluids at high shear rate: influence of the gap
NASA Astrophysics Data System (ADS)
Golinelli, Nicola; Spaggiari, Andrea
2018-07-01
This paper reports the experimental tests on the behaviour of a commercial MR fluid at high shear rates and the effect of the gap. Three gaps were considered at multiple magnetic fields and shear rates. From an extended set of almost two hundred experimental flow curves, a set of parameters for the apparent viscosity are retrieved by using the Ostwald de Waele model for non-Newtonian fluids. It is possible to simplify the parameter correlation by making the following considerations: the consistency of the model depends only on the magnetic field, the flow index depends on the fluid type and the gap shows an important effect only at null or very low magnetic fields. This lead to a simple and useful model, especially in the design phase of a MR based product. During the off state, with no applied field, it is possible to use a standard viscous model. During the active state, with high magnetic field, a strong non-Newtonian nature becomes prevalent over the viscous one even at very high shear rate; the magnetic field dominates the apparent viscosity change, while the gap does not play any relevant role on the system behaviour. This simple assumption allows the designer to dimension the gap only considering the non-active state, as in standard viscous systems, and taking into account only the magnetic effect in the active state, where the gap does not change the proposed fluid model.
Static pile load tests on driven piles in Intermediate-Geo Materials : research brief.
DOT National Transportation Integrated Search
2017-02-01
Research Objectives: : Investigate the use of modified standard penetration tests (MSPT) : Compare field results with predictions made by the WisDOT driving formula, PDA and CAPWAP : Improve prediction of pile lengths and pile capacities ...
Geotechnical Descriptions of Rock and Rock Masses.
1985-04-01
determined in the field on core speci ns by the standard Rock Testing Handbook Methods . afls GA DTIC TAB thannounod 13 Justifiatlo By Distributin...to provide rock strength descriptions from the field. The point-load test has proven to be a reliable method of determining rock strength properties...report should qualify the reported spacing values by stating the methods used to determine spacing. Preferably the report should make the determination
D-RATS 2011: RAFT Protocol Overview
NASA Technical Reports Server (NTRS)
Utz, Hans
2011-01-01
A brief overview presentation on the protocol used during the D-RATS2011 field test for file transfer from the field-test robots at Black Point Lava Flow AZ to Johnson Space Center, Houston TX over a simulated time-delay. The file transfer actually uses a commercial implementation of an open communications standard. The focus of the work lies on how to make the state of the distributed system observable.
Development of wheelchair caster testing equipment and preliminary testing of caster models
Mhatre, Anand; Ott, Joseph
2017-01-01
Background Because of the adverse environmental conditions present in less-resourced environments (LREs), the World Health Organization (WHO) has recommended that specialised wheelchair test methods may need to be developed to support product quality standards in these environments. A group of experts identified caster test methods as a high priority because of their common failure in LREs, and the insufficiency of existing test methods described in the International Organization for Standardization (ISO) Wheelchair Testing Standards (ISO 7176). Objectives To develop and demonstrate the feasibility of a caster system test method. Method Background literature and expert opinions were collected to identify existing caster test methods, caster failures common in LREs and environmental conditions present in LREs. Several conceptual designs for the caster testing method were developed, and through an iterative process using expert feedback, a final concept and a design were developed and a prototype was fabricated. Feasibility tests were conducted by testing a series of caster systems from wheelchairs used in LREs, and failure modes were recorded and compared to anecdotal reports about field failures. Results The new caster testing system was developed and it provides the flexibility to expose caster systems to typical conditions in LREs. Caster failures such as stem bolt fractures, fork fractures, bearing failures and tire cracking occurred during testing trials and are consistent with field failures. Conclusion The new caster test system has the capability to incorporate necessary test factors that degrade caster quality in LREs. Future work includes developing and validating a testing protocol that results in failure modes common during wheelchair use in LRE. PMID:29062762
Evaluation of the Hydrolab HL4 water-quality sonde and sensors
Snazelle, Teri T.
2017-12-18
The U.S. Geological Survey (USGS) Hydrologic Instrumentation Facility evaluated three Hydrolab HL4 multiparameter water-quality sondes by OTT Hydromet. The sondes were equipped with temperature, conductivity, pH, dissolved oxygen (DO), and turbidity sensors. The sensors were evaluated for compliance with the USGS National Field Manual for the Collection of Water-Quality Data (NFM) criteria for continuous water-quality monitors and to verify the validity of the manufacturer’s technical specifications. The conductivity sensors were evaluated for the accuracy of the specific conductance (SC) values (conductance at 25 degrees Celsius [oC]), that were calculated by using the vendor default method, Hydrolab Fresh. The HL4’s communication protocols and operating temperature range along with accuracy of the water-quality sensors were tested in a controlled laboratory setting May 1–19, 2016. To evaluate the sonde’s performance in a surface-water field application, an HL4 equipped with temperature, conductivity, pH, DO, and turbidity sensors was deployed June 20–July 22, 2016, at USGS water-monitoring site 02492620, Pearl River at National Space Technology Laboratories (NSTL) Station, Mississippi, located near Bay Saint Louis, Mississippi, and compared to the adjacent well-maintained EXO2 site sonde.The three HL4 sondes met the USGS temperature testing criteria and the manufacturer’s technical specifications for temperature based upon the median room temperature difference between the measured and standard temperatures, but two of the three sondes exceeded the allowable difference criteria at the temperature extremes of approximately 5 and 40 ºC. Two sondes met the USGS criteria for SC. One of the sondes failed the criteria for SC when evaluated in a 100,000-microsiemens-per-centimeter (μS/cm) standard at room temperature, and also failed in a 10,000-μS/cm standard at 5, 15, and 40 ºC. All three sondes met the USGS criteria for pH and DO at room temperature, but one sonde exceeded the allowable difference criteria when tested in pH 5.00 buffer and at 40 ºC. The USGS criteria and the technical specifications for turbidity were met by one sonde in standards ranging from 10 to 3,000 nephelometric turbidity units (NTU). A second sonde met the USGS criteria and the technical specifications except in the 3,000-NTU standard, and the third sonde exceeded the USGS calibration criteria in the 10- and 20-NTU standards and the technical specifications in the 20-NTU standard.Results of the field test showed acceptable performance and revealed that differences in data sample processing between sonde manufacturers may result in variances between the reported measurements when comparing one sonde to another. These variances in data would be more pronounced in dynamic site conditions. The lack of a wiper or other sensor-cleaning device on the DO sensor could prove problematic, and could limit the use of the HL4 to profiling applications or at sites with limited biofouling.
Riccardi, M; Mele, G; Pulvento, C; Lavini, A; d'Andria, R; Jacobsen, S-E
2014-06-01
Leaf chlorophyll content provides valuable information about physiological status of plants; it is directly linked to photosynthetic potential and primary production. In vitro assessment by wet chemical extraction is the standard method for leaf chlorophyll determination. This measurement is expensive, laborious, and time consuming. Over the years alternative methods, rapid and non-destructive, have been explored. The aim of this work was to evaluate the applicability of a fast and non-invasive field method for estimation of chlorophyll content in quinoa and amaranth leaves based on RGB components analysis of digital images acquired with a standard SLR camera. Digital images of leaves from different genotypes of quinoa and amaranth were acquired directly in the field. Mean values of each RGB component were evaluated via image analysis software and correlated to leaf chlorophyll provided by standard laboratory procedure. Single and multiple regression models using RGB color components as independent variables have been tested and validated. The performance of the proposed method was compared to that of the widely used non-destructive SPAD method. Sensitivity of the best regression models for different genotypes of quinoa and amaranth was also checked. Color data acquisition of the leaves in the field with a digital camera was quick, more effective, and lower cost than SPAD. The proposed RGB models provided better correlation (highest R (2)) and prediction (lowest RMSEP) of the true value of foliar chlorophyll content and had a lower amount of noise in the whole range of chlorophyll studied compared with SPAD and other leaf image processing based models when applied to quinoa and amaranth.
Advanced development of non-discoloring EVA-based PV encapsulants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holley, W.H.; Galica, J.P.; Argo, S.C.
1996-01-01
The purpose of this investigation was to better define the problem of field yellowing of EVA-based PV encapsulant, through laboratory study of probable chemical mechanisms and the development of stabilization strategies for protecting EVA from discoloration. EVA from fielded modules was analyzed for vinyl acetate content, unsaturation, and additive levels. These test results were then compared to results from Xenon Arc Weather-Ometer aged glass/EVA/glass laminates made in the laboratory. Variables evaluated in Weather-Ometer aged laminates included ``standard-cure`` A9918P EVA, ``fast-cure`` 15295P EVA, low iron glass superstrate containing cerium oxide, and systematic elimination or addition of specific additives. Six significant findingsmore » were revealed: 1) Improved ``standard-cure`` and ``fast-cure`` type EVA encapsulants, formulations X9903P and X15303P, respectively, showed little or no yellowing after extended Weather-Ometer exposure; 2) The use of {open_quote}{open_quote}fast-cure{close_quote}{close_quote} EVA reduced discoloration when compared with {open_quote}{open_quote}standard-cure{close_quote}{close_quote} A9918P EVA; 3) Glass superstrate containing cerium oxide resulted in a reduced rate of EVA discoloration; 4) {open_quote}{open_quote}Fast-cure{close_quote}{close_quote} EVA used with glass superstrate containing cerium oxide showed no visible yellowing after 32 weeks in the Weather-Ometer{emdash}a period estimated to be roughly equivalent to 20{endash}30 years of exposure in the Southwest; 5) Severely discolored EVA samples from the field showed no measurable loss of acetate group and little detectable unsaturation; and 6) EVA encapsulant with a Tefzel cover exhibited no yellowing after extended Weather-Ometer exposure. {copyright} {ital 1996 American Institute of Physics.}« less
NASA Astrophysics Data System (ADS)
Hedley, Mikell Lynne
2008-10-01
The purpose of the study was to use geospatial technologies to improve the spatial abilities and specific atmospheric science content knowledge of students in high schools and junior highs in primarily high-needs schools. These technologies include remote sensing, geographic information systems, and global positioning systems. The program involved training the teachers in the use of the technologies at a five-day institute. Scientists who use the technologies in their research taught the basics of their use and scientific background. Standards-based activities were used to integrate the technologies in the classroom setting. Students were tested before any instruction in the technologies and then tested two other times. They used the technologies in field data collection and used that data in an inquiry-based project. Their projects were presented at a mini-science conference with scientists, teachers, parents, and other students in attendance. Significant differences were noted from pre-test to second post-test in the test in both the spatial abilities and science section. There was a gain in both spatial abilities and in specific atmospheric science content knowledge.
Saver, Jeffrey L; Warach, Steven; Janis, Scott; Odenkirchen, Joanne; Becker, Kyra; Benavente, Oscar; Broderick, Joseph; Dromerick, Alexander W; Duncan, Pamela; Elkind, Mitchell S V; Johnston, Karen; Kidwell, Chelsea S; Meschia, James F; Schwamm, Lee
2012-04-01
The National Institute of Neurological Disorders and Stroke initiated development of stroke-specific Common Data Elements (CDEs) as part of a project to develop data standards for funded clinical research in all fields of neuroscience. Standardizing data elements in translational, clinical, and population research in cerebrovascular disease could decrease study start-up time, facilitate data sharing, and promote well-informed clinical practice guidelines. A working group of diverse experts in cerebrovascular clinical trials, epidemiology, and biostatistics met regularly to develop a set of stroke CDEs, selecting among, refining, and adding to existing, field-tested data elements from national registries and funded trials and studies. Candidate elements were revised on the basis of comments from leading national and international neurovascular research organizations and the public. The first iteration of the National Institute of Neurological Disorders and Stroke (NINDS) stroke-specific CDEs comprises 980 data elements spanning 9 content areas: (1) biospecimens and biomarkers; (2) hospital course and acute therapies; (3) imaging; (4) laboratory tests and vital signs; (5) long-term therapies; (6) medical history and prior health status; (7) outcomes and end points; (8) stroke presentation; and (9) stroke types and subtypes. A CDE website provides uniform names and structures for each element, a data dictionary, and template case report forms, using the CDEs. Stroke-specific CDEs are now available as standardized, scientifically vetted, variable structures to facilitate data collection and data sharing in cerebrovascular patient-oriented research. The CDEs are an evolving resource that will be iteratively improved based on investigator use, new technologies, and emerging concepts and research findings.
Comparative A/B testing a mobile data acquisition app for hydrogeochemistry
NASA Astrophysics Data System (ADS)
Klump, Jens; Golodoniuc, Pavel; Reid, Nathan; Gray, David; Ross, Shawn
2015-04-01
In the context of a larger study on the Capricorn Orogen of Western Australia, the CSIRO Mineral Discovery Program is conducting a regional study of the hydrogeochemistry on water from agricultural and other bores. Over time, the sampling process was standardised and a form for capturing metadata and data from initial measurements was developed. In 2014 an extensive technology review was conducted with an aim to automate field data acquisition process. A prototype hydrogeochemistry data capture form was implemented as a mobile application for Windows Mobile devices. This version of the software was a standalone application with an interface to export data as CSV files. A second candidate version of the hydrogeochemistry data capture form was implemented as an Android mobile application in the FAIMS framework. FAIMS is a framework for mobile field data capture, originally developed by at the University of New South Wales for archaeological field data collection. A benefit of the FAIMS application was the ability to associate photographs taken with the device's embedded camera with the captured data. FAIMS also allows networked collaboration within a field team, using the mobile applications as asynchronous rich clients. The network infrastructure can be installed in the field ("FAIMS in a Box") to supply data synchronisation, backup and transfer. This aspect will be tested in the next field season. A benefit of the FAIMS application was the ability to associate photographs taken with the device's embedded camera with the captured data. Having two data capture applications available allowed us to conduct an A/B test, comparing two different implementations for the same task. Both applications were trialled in the field by different field crews and user feedback will be used to improve the usability of the app for the next field season. A key learning was that the ergonomics of the app is at paramount importance to gain the user acceptance. This extends from general fit with the standard procedures used in the field during data acquisition to self-descriptive and intuitive user interface features well aligned with the workflows and sequence of actions performed by a user that ultimately contributes to the implementation of a Collect-As-You-Go approach. In the Australian outback, issues such as absence of network connectivity, heat and sun glare may challenge the utility of tablet based applications in the field. Due to limitations of tablet use in the field we also consider the use of smart pens for data capture. A smart pen application based on Anoto forms and software by Formidable will be tested in the next field season.
Nakanishi, Masaki; Wang, Yu-Te; Jung, Tzyy-Ping; Zao, John K; Chien, Yu-Yi; Diniz-Filho, Alberto; Daga, Fabio B; Lin, Yuan-Pin; Wang, Yijun; Medeiros, Felipe A
2017-06-01
The current assessment of visual field loss in diseases such as glaucoma is affected by the subjectivity of patient responses and the lack of portability of standard perimeters. To describe the development and initial validation of a portable brain-computer interface (BCI) for objectively assessing visual function loss. This case-control study involved 62 eyes of 33 patients with glaucoma and 30 eyes of 17 healthy participants. Glaucoma was diagnosed based on a masked grading of optic disc stereophotographs. All participants underwent testing with a BCI device and standard automated perimetry (SAP) within 3 months. The BCI device integrates wearable, wireless, dry electroencephalogram and electrooculogram systems and a cellphone-based head-mounted display to enable the detection of multifocal steady state visual-evoked potentials associated with visual field stimulation. The performances of global and sectoral multifocal steady state visual-evoked potentials metrics to discriminate glaucomatous from healthy eyes were compared with global and sectoral SAP parameters. The repeatability of the BCI device measurements was assessed by collecting results of repeated testing in 20 eyes of 10 participants with glaucoma for 3 sessions of measurements separated by weekly intervals. Receiver operating characteristic curves summarizing diagnostic accuracy. Intraclass correlation coefficients and coefficients of variation for assessing repeatability. Among the 33 participants with glaucoma, 19 (58%) were white, 12 (36%) were black, and 2 (6%) were Asian, while among the 17 participants with healthy eyes, 9 (53%) were white, 8 (47%) were black, and none were Asian. The receiver operating characteristic curve area for the global BCI multifocal steady state visual-evoked potentials parameter was 0.92 (95% CI, 0.86-0.96), which was larger than for SAP mean deviation (area under the curve, 0.81; 95% CI, 0.72-0.90), SAP mean sensitivity (area under the curve, 0.80; 95% CI, 0.69-0.88; P = .03), and SAP pattern standard deviation (area under the curve, 0.77; 95% CI, 0.66-0.87; P = .01). No statistically significant differences were seen for the sectoral measurements between the BCI and SAP. Intraclass coefficients for global and sectoral parameters ranged from 0.74 to 0.92, and mean coefficients of variation ranged from 3.03% to 7.45%. The BCI device may be useful for assessing the electrical brain responses associated with visual field stimulation. The device discriminated eyes with glaucomatous neuropathy from healthy eyes in a clinically based setting. Further studies should investigate the feasibility of the BCI device for home-based testing as well as for detecting visual function loss over time.
Berg, Brandon; Cortazar, Bingen; Tseng, Derek; Ozkan, Haydar; Feng, Steve; Wei, Qingshan; Chan, Raymond Yan-Lok; Burbano, Jordi; Farooqui, Qamar; Lewinski, Michael; Di Carlo, Dino; Garner, Omai B; Ozcan, Aydogan
2015-08-25
Standard microplate based enzyme-linked immunosorbent assays (ELISA) are widely utilized for various nanomedicine, molecular sensing, and disease screening applications, and this multiwell plate batched analysis dramatically reduces diagnosis costs per patient compared to nonbatched or nonstandard tests. However, their use in resource-limited and field-settings is inhibited by the necessity for relatively large and expensive readout instruments. To mitigate this problem, we created a hand-held and cost-effective cellphone-based colorimetric microplate reader, which uses a 3D-printed opto-mechanical attachment to hold and illuminate a 96-well plate using a light-emitting-diode (LED) array. This LED light is transmitted through each well, and is then collected via 96 individual optical fibers. Captured images of this fiber-bundle are transmitted to our servers through a custom-designed app for processing using a machine learning algorithm, yielding diagnostic results, which are delivered to the user within ∼1 min per 96-well plate, and are visualized using the same app. We successfully tested this mobile platform in a clinical microbiology laboratory using FDA-approved mumps IgG, measles IgG, and herpes simplex virus IgG (HSV-1 and HSV-2) ELISA tests using a total of 567 and 571 patient samples for training and blind testing, respectively, and achieved an accuracy of 99.6%, 98.6%, 99.4%, and 99.4% for mumps, measles, HSV-1, and HSV-2 tests, respectively. This cost-effective and hand-held platform could assist health-care professionals to perform high-throughput disease screening or tracking of vaccination campaigns at the point-of-care, even in resource-poor and field-settings. Also, its intrinsic wireless connectivity can serve epidemiological studies, generating spatiotemporal maps of disease prevalence and immunity.
DOT National Transportation Integrated Search
1999-09-01
This report presents the results of the field test portion of the Development, Evaluation, and Application of Performance-Based Brake Testing Technologies sponsored by the Federal Highway Administrations (FHWA) Office of Motor Carriers.
Sigma models with negative curvature
Alonso, Rodrigo; Jenkins, Elizabeth E.; Manohar, Aneesh V.
2016-03-16
Here, we construct Higgs Effective Field Theory (HEFT) based on the scalar manifold Hn, which is a hyperbolic space of constant negative curvature. The Lagrangian has a non-compact O(n, 1) global symmetry group, but it gives a unitary theory as long as only a compact subgroup of the global symmetry is gauged. Whether the HEFT manifold has positive or negative curvature can be tested by measuring the S-parameter, and the cross sections for longitudinal gauge boson and Higgs boson scattering, since the curvature (including its sign) determines deviations from Standard Model values.
Simulation-Based Training for Colonoscopy
Preisler, Louise; Svendsen, Morten Bo Søndergaard; Nerup, Nikolaj; Svendsen, Lars Bo; Konge, Lars
2015-01-01
Abstract The aim of this study was to create simulation-based tests with credible pass/fail standards for 2 different fidelities of colonoscopy models. Only competent practitioners should perform colonoscopy. Reliable and valid simulation-based tests could be used to establish basic competency in colonoscopy before practicing on patients. Twenty-five physicians (10 consultants with endoscopic experience and 15 fellows with very little endoscopic experience) were tested on 2 different simulator models: a virtual-reality simulator and a physical model. Tests were repeated twice on each simulator model. Metrics with discriminatory ability were identified for both modalities and reliability was determined. The contrasting-groups method was used to create pass/fail standards and the consequences of these were explored. The consultants significantly performed faster and scored higher than the fellows on both the models (P < 0.001). Reliability analysis showed Cronbach α = 0.80 and 0.87 for the virtual-reality and the physical model, respectively. The established pass/fail standards failed one of the consultants (virtual-reality simulator) and allowed one fellow to pass (physical model). The 2 tested simulations-based modalities provided reliable and valid assessments of competence in colonoscopy and credible pass/fail standards were established for both the tests. We propose to use these standards in simulation-based training programs before proceeding to supervised training on patients. PMID:25634177
Full-field Strain Methods for Investigating Failure Mechanisms in Triaxial Braided Composites
NASA Technical Reports Server (NTRS)
Littell, Justin D.; Binienda, Wieslaw K.; Goldberg, Robert K.; Roberts, Gary D.
2008-01-01
Composite materials made with triaxial braid architecture and large tow size carbon fibers are beginning to be used in many applications, including composite aircraft and engine structures. Recent advancements in braiding technology have led to commercially viable manufacturing approaches for making large structures with complex shape. Although the large unit cell size of these materials is an advantage for manufacturing efficiency, the fiber architecture presents some challenges for materials characterization, design, and analysis. In some cases, the static load capability of structures made using these materials has been higher than expected based on material strength properties measured using standard coupon tests. A potential problem with using standard tests methods for these materials is that the unit cell size can be an unacceptably large fraction of the specimen dimensions. More detailed investigation of deformation and failure processes in large unit cell size triaxial braid composites is needed to evaluate the applicability of standard test methods for these materials and to develop alternative testing approaches. In recent years, commercial equipment has become available that enables digital image correlation to be used on a more routine basis for investigation of full field 3D deformation in materials and structures. In this paper, some new techniques that have been developed to investigate local deformation and failure using digital image correlation techniques are presented. The methods were used to measure both local and global strains during standard straight-sided coupon tensile tests on composite materials made with 12 and 24 k yarns and a 0/+60/-60 triaxial braid architecture. Local deformation and failure within fiber bundles was observed, and this local failure had a significant effect on global stiffness and strength. The matrix material had a large effect on local damage initiation for the two matrix materials used in this investigation. Premature failure in regions of the unit cell near the edge of the straight-sided specimens was observed for transverse tensile tests in which the braid axial fibers were perpendicular to the specimen axis and the bias fibers terminated on the cut edges in the specimen gage section. This edge effect is one factor that could contribute to a measured strength that is lower than the actual material strength in a structure without edge effects.
Hermassi, Souhail; Chelly, Mohamed-Souhaiel; Wollny, Rainer; Hoffmeyer, Birgit; Fieseler, Georg; Schulze, Stephan; Irlenbusch, Lars; Delank, Karl-Stefan; Shephard, Roy J; Bartels, Thomas; Schwesig, René
2018-06-01
This study assessed the validity of the handball-specific complex test (HBCT) and two non-specific field tests in professional elite handball athletes, using the match performance score (MPS) as the gold standard of performance. Thirteen elite male handball players (age: 27.4±4.8 years; premier German league) performed the HBCT, the Yo-Yo Intermittent Recovery (YYIR) test and a repeated shuttle sprint ability (RSA) test at the beginning of pre-season training. The RSA results were evaluated in terms of best time, total time, and fatigue decrement. Heart rates (HR) were assessed at selected times throughout all tests; the recovery HR was measured immediately post-test and 10 minutes later. The match performance score was based on various handball specific parameters (e.g., field goals, assists, steals, blocks, and technical mistakes) as seen during all matches of the immediately subsequent season (2015/2016). The parameters of run 1, run 2, and HR recovery at minutes 6 and 10 of the RSA test all showed a variance of more than 10% (range: 11-15%). However, the variance of scores for the YYIR test was much smaller (range: 1-7%). The resting HR (r2=0.18), HR recovery at minute 10 (r2=0.10), lactate concentration at rest (r2=0.17), recovery of heart rate from 0 to 10 minutes (r2=0.15), and velocity of second throw at first trial (r2=0.37) were the most valid HBCT parameters. Much effort is necessary to assess MPS and to develop valid tests. Speed and the rate of functional recovery seem the best predictors of competitive performance for elite handball players.
NASA Astrophysics Data System (ADS)
Day-Lewis, F. D.; Gray, M. B.
2004-12-01
Development of our Hydrogeophysics Well Field has enabled new opportunities for field-based undergraduate research and active-learning at Bucknell University. Installed in 2001-2002, the on-campus well field has become a cornerstone of field labs for hydrogeology and applied geophysics courses, and for introductory labs in engineering and environmental geology. In addition to enabling new field experiences, the well field serves as a meeting place for students and practicing geoscientists. In the last three years, we have hosted field demonstrations by alumni working in the environmental, geophysical, and water-well drilling industries; researchers from government agencies; graduate students from other universities; and geophysical equipment vendors seeking to test and demonstrate new instruments. Coordinating undergraduate research and practical course labs with field experiments led by alumni and practicing geoscientists provides students hands-on experience with new technology while educating them about career and graduate-school opportunities. In addition to being effective pedagogical strategy, these experiences are well received by students -- enrollment in our geophysics course has tripled from three years ago. The Bucknell Hydrogeophysics Well Field consists of five bedrock wells, installed in a fractured-rock aquifer in the Wills Creek Shale. The wells are open in the bedrock, facilitating geophysical and hydraulic measurements. To date, student have helped acquire from one or more wells: (1) open-hole slug- and aquifer-test data; (2) packer test data from isolated borehole intervals; (3) flow-meter logs; (4) acoustic and optical televiewer logs; (5) standard borehole logs including single-point resistance, caliper, and natural-gamma; (6) borehole video camera; (7) electrical resistivity tomograms; (8) water levels while drilling; and (9) water chemistry and temperature logs. Preliminary student-led data analysis indicates that sparse discrete fractures dominate the response of water levels to pumping. The three sets of fractures observed in the wells are consistent with those observed in outcrops around Bucknell: (1) bedding sub-parallel fractures; (2) joints; and (3) fractures parallel to rock cleavage. Efforts are ongoing to develop a CD-ROM of field data, photographs and video footage documenting the site and experiments; the CD is intended for publication as a "Virtual Field Laboratory" teaching tool for undergraduate hydrogeology and applied geophysics. We have seen the benefits of merging theory and practice in our undergraduate curriculum, and we seek to make these benefits available to other schools.
Lockie, Robert G; Schultz, Adrian B; Callaghan, Samuel J; Jeffriess, Matthew D; Berry, Simon P
2013-01-01
Field sport coaches must use reliable and valid tests to assess change-of-direction speed in their athletes. Few tests feature linear sprinting with acute change- of-direction maneuvers. The Change-of-Direction and Acceleration Test (CODAT) was designed to assess field sport change-of-direction speed, and includes a linear 5-meter (m) sprint, 45° and 90° cuts, 3- m sprints to the left and right, and a linear 10-m sprint. This study analyzed the reliability and validity of this test, through comparisons to 20-m sprint (0-5, 0-10, 0-20 m intervals) and Illinois agility run (IAR) performance. Eighteen Australian footballers (age = 23.83 ± 7.04 yrs; height = 1.79 ± 0.06 m; mass = 85.36 ± 13.21 kg) were recruited. Following familiarization, subjects completed the 20-m sprint, CODAT, and IAR in 2 sessions, 48 hours apart. Intra-class correlation coefficients (ICC) assessed relative reliability. Absolute reliability was analyzed through paired samples t-tests (p ≤ 0.05) determining between-session differences. Typical error (TE), coefficient of variation (CV), and differences between the TE and smallest worthwhile change (SWC), also assessed absolute reliability and test usefulness. For the validity analysis, Pearson's correlations (p ≤ 0.05) analyzed between-test relationships. Results showed no between-session differences for any test (p = 0.19-0.86). CODAT time averaged ~6 s, and the ICC and CV equaled 0.84 and 3.0%, respectively. The homogeneous sample of Australian footballers meant that the CODAT's TE (0.19 s) exceeded the usual 0.2 x standard deviation (SD) SWC (0.10 s). However, the CODAT is capable of detecting moderate performance changes (SWC calculated as 0.5 x SD = 0.25 s). There was a near perfect correlation between the CODAT and IAR (r = 0.92), and very large correlations with the 20-m sprint (r = 0.75-0.76), suggesting that the CODAT was a valid change-of-direction speed test. Due to movement specificity, the CODAT has value for field sport assessment. Key pointsThe change-of-direction and acceleration test (CODAT) was designed specifically for field sport athletes from specific speed research, and data derived from time-motion analyses of sports such as rugby union, soccer, and Australian football. The CODAT features a linear 5-meter (m) sprint, 45° and 90° cuts and 3-m sprints to the left and right, and a linear 10-m sprint.The CODAT was found to be a reliable change-of-direction speed assessment when considering intra-class correlations between two testing sessions, and the coefficient of variation between trials. A homogeneous sample of Australian footballers resulted in absolute reliability limitations when considering differences between the typical error and smallest worthwhile change. However, the CODAT will detect moderate (0.5 times the test's standard deviation) changes in performance.The CODAT correlated with the Illinois agility run, highlighting that it does assess change-of-direction speed. There were also significant relationships with short sprint performance (i.e. 0-5 m and 0-10 m), demonstrating that linear acceleration is assessed within the CODAT, without the extended duration and therefore metabolic limitations of the IAR. Indeed, the average duration of the test (~6 seconds) is field sport-specific. Therefore, the CODAT could be used as an assessment of change-of-direction speed in field sport athletes.
Field evaluation of ventilation system performance in enclosed parking garages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ayari, A.M.; Grot, D.A.; Krarti, M.
2000-07-01
This paper summarizes the results of a field study to determine the ventilation requirements and the contaminant levels in existing enclosed parking garages. The testing was conducted in seven parking garages with different sizes, traffic flow patterns, vehicle types, and locations. In particular, the study compares the actual ventilation rates measured using the tracer gas technique with the ventilation requirements of ANSI/ASHRAE Standard 62-1989. In addition, the field test evaluated the effectiveness of the existing ventilation systems in maintaining acceptable contaminant levels within enclosed parking garages.
TH-EF-BRB-11: Volumetric Modulated Arc Therapy for Total Body Irradiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ouyang, L; Folkerts, M; Hrycushko, B
Purpose: To develop a modern, patient-comfortable total body irradiation (TBI) technique suitable for standard-sized linac vaults. Methods: An indexed rotatable immobilization system (IRIS) was developed to make possible total-body CT imaging and radiation delivery on conventional couches. Treatment consists of multi-isocentric volumetric modulated arc therapy (VMAT) to the upper body and parallel-opposed fields to the lower body. Each isocenter is indexed to the couch and includes a 180° IRIS rotation between the upper and lower body fields. VMAT fields are optimized to satisfy lung dose objectives while achieving a uniform therapeutic dose to the torso. End-to-end tests with a randomore » phantom were used to verify dosimetric characteristics. Treatment plan robustness regarding setup uncertainty was assessed by simulating global and regional isocenter setup shifts on patient data sets. Dosimetric comparisons were made with conventional extended distance, standing TBI (cTBI) plans using a Monte Carlo-based calculation. Treatment efficiency was assessed for eight courses of patient treatment. Results: The IRIS system is level and orthogonal to the scanned CT image plane, with lateral shifts <2mm following rotation. End-to-end tests showed surface doses within ±10% of the prescription dose, field junction doses within ±15% of prescription dose. Plan robustness tests showed <15% changes in dose with global setup errors up to 5mm in each direction. Local 5mm relative setup errors in the chest resulted in < 5% dose changes. Local 5mm shift errors in the pelvic and upper leg junction resulted in <10% dose changes while a 10mm shift error causes dose changes up to 25%. Dosimetric comparison with cTBI showed VMAT-TBI has advantages in preserving chest wall dose with flexibility in leveraging the PTV-body and PTV-lung dose. Conclusion: VMAT-TBI with the IRIS system was shown clinically feasible as a cost-effective approach to TBI for standard-sized linac vaults.« less
Faris, Allison T.; Seed, Raymond B.; Kayen, Robert E.; Wu, Jiaer
2006-01-01
During the 1906 San Francisco Earthquake, liquefaction-induced lateral spreading and resultant ground displacements damaged bridges, buried utilities, and lifelines, conventional structures, and other developed works. This paper presents an improved engineering tool for the prediction of maximum displacement due to liquefaction-induced lateral spreading. A semi-empirical approach is employed, combining mechanistic understanding and data from laboratory testing with data and lessons from full-scale earthquake field case histories. The principle of strain potential index, based primary on correlation of cyclic simple shear laboratory testing results with in-situ Standard Penetration Test (SPT) results, is used as an index to characterized the deformation potential of soils after they liquefy. A Bayesian probabilistic approach is adopted for development of the final predictive model, in order to take fullest advantage of the data available and to deal with the inherent uncertainties intrinstiic to the back-analyses of field case histories. A case history from the 1906 San Francisco Earthquake is utilized to demonstrate the ability of the resultant semi-empirical model to estimate maximum horizontal displacement due to liquefaction-induced lateral spreading.
Temperature Control Diagnostics for Sample Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santodonato, Louis J; Walker, Lakeisha MH; Church, Andrew J
2010-01-01
In a scientific laboratory setting, standard equipment such as cryocoolers are often used as part of a custom sample environment system designed to regulate temperature over a wide range. The end user may be more concerned with precise sample temperature control than with base temperature. But cryogenic systems tend to be specified mainly in terms of cooling capacity and base temperature. Technical staff at scientific user facilities (and perhaps elsewhere) often wonder how to best specify and evaluate temperature control capabilities. Here we describe test methods and give results obtained at a user facility that operates a large sample environmentmore » inventory. Although this inventory includes a wide variety of temperature, pressure, and magnetic field devices, the present work focuses on cryocooler-based systems.« less
Development of a nematode offspring counting assay for rapid and simple soil toxicity assessment.
Kim, Shin Woong; Moon, Jongmin; Jeong, Seung-Woo; An, Youn-Joo
2018-05-01
Since the introduction of standardized nematode toxicity assays by the American Society for Testing and Materials (ASTM) and International Organization for Standardization (ISO), many studies have reported their use. Given that the currently used standardized nematode toxicity assays have certain limitations, in this study, we examined the use of a novel nematode offspring counting assay for evaluating soil ecotoxicity based on a previous soil-agar isolation method used to recover live adult nematodes. In this new assay, adult Caenorhabditis elegans were exposed to soil using a standardized toxicity assay procedure, and the resulting offspring in test soils attracted by a microbial food source in agar plates were counted. This method differs from previously used assays in terms of its endpoint, namely, the number of nematode offspring. The applicability of the bioassay was demonstrated using metal-spiked soils, which revealed metal concentration-dependent responses, and with 36 field soil samples characterized by different physicochemical properties and containing various metals. Principal component analysis revealed that texture fraction (clay, sand, and silt) and electrical conductivity values were the main factors influencing the nematode offspring counting assay, and these findings warrant further investigation. The nematode offspring counting assay is a rapid and simple process that can provide multi-directional toxicity assessment when used in conjunction with other standard methods. Copyright © 2018 Elsevier Ltd. All rights reserved.
Comparison of Breast Density Between Synthesized Versus Standard Digital Mammography.
Haider, Irfanullah; Morgan, Matthew; McGow, Anna; Stein, Matthew; Rezvani, Maryam; Freer, Phoebe; Hu, Nan; Fajardo, Laurie; Winkler, Nicole
2018-06-12
To evaluate perceptual difference in breast density classification using synthesized mammography (SM) compared with standard or full-field digital mammography (FFDM) for screening. This institutional review board-approved, retrospective, multireader study evaluated breast density on 200 patients who underwent baseline screening mammogram during which both SM and FFDM were obtained contemporaneously from June 1, 2016, through November 30, 2016. Qualitative breast density was independently assigned by seven readers initially evaluating FFDM alone. Then, in a separate session, these same readers assigned breast density using synthetic views alone on the same 200 patients. The readers were again blinded to each other's assignment. Qualitative density assessment was based on BI-RADS fifth edition. Interreader agreement was evaluated with κ statistic using 95% confidence intervals. Testing for homogeneity in paired proportions was performed using McNemar's test with a level of significance of .05. For patients across the SM and standard 2-D data set, diagnostic testing with McNemar's test with P = 0.32 demonstrates that the minimal density transitions across FFDM and SM are not statistically significant density shifts. Taking clinical significance into account, only 8 of 200 (4%) patients had clinically significant transition (dense versus not dense). There was substantial interreader agreement with overall κ in FFDM of 0.71 (minimum 0.53, maximum 0.81) and overall SM κ average of 0.63 (minimum 0.56, maximum 0.87). Overall subjective breast density assignment by radiologists on SM is similar to density assignment on standard 2-D mammogram. Copyright © 2018 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Electromagnetic system for detection and localization of miners caught in mine accidents
NASA Astrophysics Data System (ADS)
Pronenko, Vira; Dudkin, Fedir
2016-12-01
The profession of a miner is one of the most dangerous in the world. Among the main causes of fatalities in underground coal mines are the delayed alert of the accident and the lack of information regarding the actual location of the miners after the accident. In an emergency situation (failure or destruction of underground infrastructure), personnel search behind and beneath blockage needs to be performed urgently. However, none of the standard technologies - radio-frequency identification (RFID), Digital Enhanced Cordless Telecommunications (DECT), Wi-Fi, emitting cables, which use the stationary technical devices in mines - provide information about the miners location with the necessary precision. The only technology that is able to provide guaranteed delivery of messages to mine personnel, regardless of their location and under any destruction in the mine, is low-frequency radio technology, which is able to operate through the thickness of rocks even if they are wet. The proposed new system for miner localization is based on solving the inverse problem of determining the magnetic field source coordinates using the data of magnetic field measurements. This approach is based on the measurement of the magnetic field radiated by the miner's responder beacon using two fixed and spaced three-component magnetic field receivers and the inverse problem solution. As a result, a working model of the system for miner's beacon search and localization (MILES - MIner's Location Emergency System) was developed and successfully tested. This paper presents the most important aspects of this development and the results of experimental tests.
NASA Astrophysics Data System (ADS)
Aad, G.; Abbott, B.; Abdinov, O.; Abdallah, J.; Abeloos, B.; Aben, R.; Abolins, M.; Aben, R.; Abolins, M.; AbouZeid, O. S.; Abraham, N. L.; Abramowicz, H.; Abreu, H.; Abreu, R.; Abulaiti, Y.; Acharya, B. S.; Adamczyk, L.; Adams, D. L.; Adelman, J.; Adomeit, S.; Adye, T.; Affolder, A. A.; Agatonovic-Jovin, T.; Agricola, J.; Aguilar-Saavedra, J. A.; Ahlen, S. P.; Ahmadov, F.; Aielli, G.; Akerstedt, H.; Åkesson, T. P. A.; Akimov, A. V.; Alberghi, G. L.; Albert, J.; Albrand, S.; Verzini, M. J. Alconada; Aleksa, M.; Aleksandrov, I. N.; Alexa, C.; Alexander, G.; Alexopoulos, T.; Alhroob, M.; Alimonti, G.; Alison, J.; Alkire, S. P.; Allbrooke, B. M. M.; Allen, B. W.; Allport, P. P.; Aloisio, A.; Alonso, A.; Alonso, F.; Alpigiani, C.; Gonzalez, B. Alvarez; Piqueras, D. Álvarez; Alviggi, M. G.; Amadio, B. T.; Amako, K.; Coutinho, Y. Amaral; Amelung, C.; Amidei, D.; Santos, S. P. Amor Dos; Amorim, A.; Amoroso, S.; Amram, N.; Amundsen, G.; Anastopoulos, C.; Ancu, L. S.; Andari, N.; Andeen, T.; Anders, C. F.; Anders, G.; Anders, J. K.; Anderson, K. J.; Andreazza, A.; Andrei, V.; Angelidakis, S.; Angelozzi, I.; Anger, P.; Angerami, A.; Anghinolfi, F.; Anisenkov, A. V.; Anjos, N.; Annovi, A.; Antonelli, M.; Antonov, A.; Antos, J.; Anulli, F.; Aoki, M.; Bella, L. Aperio; Arabidze, G.; Arai, Y.; Araque, J. P.; Arce, A. T. H.; Arduh, F. A.; Arguin, J.-F.; Argyropoulos, S.; Arik, M.; Armbruster, A. J.; Armitage, L. J.; Arnaez, O.; Arnold, H.; Arratia, M.; Arslan, O.; Artamonov, A.; Artoni, G.; Artz, S.; Asai, S.; Asbah, N.; Ashkenazi, A.; Åsman, B.; Asquith, L.; Assamagan, K.; Astalos, R.; Atkinson, M.; Atlay, N. B.; Augsten, K.; Avolio, G.; Axen, B.; Ayoub, M. K.; Azuelos, G.; Baak, M. A.; Baas, A. E.; Baca, M. J.; Bachacou, H.; Bachas, K.; Backes, M.; Backhaus, M.; Bagiacchi, P.; Bagnaia, P.; Bai, Y.; Baines, J. T.; Baker, O. K.; Baldin, E. M.; Balek, P.; Balestri, T.; Balli, F.; Balunas, W. K.; Banas, E.; Banerjee, Sw.; Bannoura, A. A. E.; Barak, L.; Barberio, E. L.; Barberis, D.; Barbero, M.; Barillari, T.; Barisonzi, M.; Barklow, T.; Barlow, N.; Barnes, S. L.; Barnett, B. M.; Barnett, R. M.; Barnovska, Z.; Baroncelli, A.; Barone, G.; Barr, A. J.; Navarro, L. Barranco; Barreiro, F.; da Costa, J. Barreiro Guimarães; Bartoldus, R.; Barton, A. E.; Bartos, P.; Basalaev, A.; Bassalat, A.; Basye, A.; Bates, R. L.; Batista, S. J.; Batley, J. R.; Battaglia, M.; Bauce, M.; Bauer, F.; Bawa, H. S.; Beacham, J. B.; Beattie, M. D.; Beau, T.; Beauchemin, P. H.; Bechtle, P.; Beck, H. P.; Becker, K.; Becker, M.; Beckingham, M.; Becot, C.; Beddall, A. J.; Beddall, A.; Bednyakov, V. A.; Bedognetti, M.; Bee, C. P.; Beemster, L. J.; Beermann, T. A.; Begel, M.; Behr, J. K.; Belanger-Champagne, C.; Bell, A. S.; Bell, W. H.; Bella, G.; Bellagamba, L.; Bellerive, A.; Bellomo, M.; Belotskiy, K.; Beltramello, O.; Belyaev, N. L.; Benary, O.; Benchekroun, D.; Bender, M.; Bendtz, K.; Benekos, N.; Benhammou, Y.; Noccioli, E. Benhar; Benitez, J.; Garcia, J. A. Benitez; Benjamin, D. P.; Bensinger, J. R.; Bentvelsen, S.; Beresford, L.; Beretta, M.; Berge, D.; Kuutmann, E. Bergeaas; Berger, N.; Berghaus, F.; Beringer, J.; Berlendis, S.; Bernard, N. R.; Bernius, C.; Bernlochner, F. U.; Berry, T.; Berta, P.; Bertella, C.; Bertoli, G.; Bertolucci, F.; Bertram, I. A.; Bertsche, C.; Bertsche, D.; Besjes, G. J.; Bylund, O. Bessidskaia; Bessner, M.; Besson, N.; Betancourt, C.; Bethke, S.; Bevan, A. J.; Bhimji, W.; Bianchi, R. M.; Bianchini, L.; Bianco, M.; Biebel, O.; Biedermann, D.; Bielski, R.; Biesuz, N. V.; Biglietti, M.; De Mendizabal, J. Bilbao; Bilokon, H.; Bindi, M.; Binet, S.; Bingul, A.; Bini, C.; Biondi, S.; Bjergaard, D. M.; Black, C. W.; Black, J. E.; Black, K. M.; Blackburn, D.; Blair, R. E.; Blanchard, J.-B.; Blanco, J. E.; Blazek, T.; Bloch, I.; Blocker, C.; Blum, W.; Blumenschein, U.; Blunier, S.; Bobbink, G. J.; Bobrovnikov, V. S.; Bocchetta, S. S.; Bocci, A.; Bock, C.; Boehler, M.; Boerner, D.; Bogaerts, J. A.; Bogavac, D.; Bogdanchikov, A. G.; Bohm, C.; Boisvert, V.; Bold, T.; Boldea, V.; Boldyrev, A. S.; Bomben, M.; Bona, M.; Boonekamp, M.; Borisov, A.; Borissov, G.; Bortfeldt, J.; Bortoletto, D.; Bortolotto, V.; Bos, K.; Boscherini, D.; Bosman, M.; Sola, J. D. Bossio; Boudreau, J.; Bouffard, J.; Bouhova-Thacker, E. V.; Boumediene, D.; Bourdarios, C.; Boutle, S. K.; Boveia, A.; Boyd, J.; Boyko, I. R.; Bracinik, J.; Brandt, A.; Brandt, G.; Brandt, O.; Bratzler, U.; Brau, B.; Brau, J. E.; Braun, H. M.; Madden, W. D. Breaden; Brendlinger, K.; Brennan, A. J.; Brenner, L.; Brenner, R.; Bressler, S.; Bristow, T. M.; Britton, D.; Britzger, D.; Brochu, F. M.; Brock, I.; Brock, R.; Brooijmans, G.; Brooks, T.; Brooks, W. K.; Brosamer, J.; Brost, E.; Broughton, J. H.; de Renstrom, P. A. Bruckman; Bruncko, D.; Bruneliere, R.; Bruni, A.; Bruni, G.; Brunt, B. H.; Bruschi, M.; Bruscino, N.; Bryant, P.; Bryngemark, L.; Buanes, T.; Buat, Q.; Buchholz, P.; Buckley, A. G.; Budagov, I. A.; Buehrer, F.; Bugge, M. K.; Bulekov, O.; Bullock, D.; Burckhart, H.; Burdin, S.; Burgard, C. D.; Burghgrave, B.; Burka, K.; Burke, S.; Burmeister, I.; Busato, E.; Büscher, D.; Büscher, V.; Bussey, P.; Butler, J. M.; Butt, A. I.; Buttar, C. M.; Butterworth, J. M.; Butti, P.; Buttinger, W.; Buzatu, A.; Buzykaev, A. R.; Urbán, S. Cabrera; Caforio, D.; Cairo, V. M.; Cakir, O.; Calace, N.; Calafiura, P.; Calandri, A.; Calderini, G.; Calfayan, P.; Caloba, L. P.; Calvet, D.; Calvet, S.; Calvet, T. P.; Toro, R. Camacho; Camarda, S.; Camarri, P.; Cameron, D.; Armadans, R. Caminal; Camincher, C.; Campana, S.; Campanelli, M.; Campoverde, A.; Canale, V.; Canepa, A.; Bret, M. Cano; Cantero, J.; Cantrill, R.; Cao, T.; Garrido, M. D. M. Capeans; Caprini, I.; Caprini, M.; Capua, M.; Caputo, R.; Carbone, R. M.; Cardarelli, R.; Cardillo, F.; Carli, T.; Carlino, G.; Carminati, L.; Caron, S.; Carquin, E.; Carrillo-Montoya, G. D.; Carter, J. R.; Carvalho, J.; Casadei, D.; Casado, M. P.; Casolino, M.; Casper, D. W.; Castaneda-Miranda, E.; Castelli, A.; Gimenez, V. Castillo; Castro, N. F.; Catinaccio, A.; Catmore, J. R.; Cattai, A.; Caudron, J.; Cavaliere, V.; Cavallaro, E.; Cavalli, D.; Cavalli-Sforza, M.; Cavasinni, V.; Ceradini, F.; Alberich, L. Cerda; Cerio, B. C.; Cerqueira, A. S.; Cerri, A.; Cerrito, L.; Cerutti, F.; Cerv, M.; Cervelli, A.; Cetin, S. A.; Chafaq, A.; Chakraborty, D.; Chalupkova, I.; Chan, S. K.; Chan, Y. L.; Chang, P.; Chapman, J. D.; Charlton, D. G.; Chatterjee, A.; Chau, C. C.; Barajas, C. A. Chavez; Che, S.; Cheatham, S.; Chegwidden, A.; Chekanov, S.; Chekulaev, S. V.; Chelkov, G. A.; Chelstowska, M. A.; Chen, C.; Chen, H.; Chen, K.; Chen, S.; Chen, S.; Chen, X.; Chen, Y.; Cheng, H. C.; Cheng, H. J.; Cheng, Y.; Cheplakov, A.; Cheremushkina, E.; Moursli, R. Cherkaoui El; Chernyatin, V.; Cheu, E.; Chevalier, L.; Chiarella, V.; Chiarelli, G.; Chiodini, G.; Chisholm, A. S.; Chitan, A.; Chizhov, M. V.; Choi, K.; Chomont, A. R.; Chouridou, S.; Chow, B. K. B.; Christodoulou, V.; Chromek-Burckhart, D.; Chudoba, J.; Chuinard, A. J.; Chwastowski, J. J.; Chytka, L.; Ciapetti, G.; Ciftci, A. K.; Cinca, D.; Cindro, V.; Cioara, I. A.; Ciocio, A.; Cirotto, F.; Citron, Z. H.; Ciubancan, M.; Clark, A.; Clark, B. L.; Clark, P. J.; Clarke, R. N.; Clement, C.; Coadou, Y.; Cobal, M.; Coccaro, A.; Cochran, J.; Coffey, L.; Colasurdo, L.; Cole, B.; Cole, S.; Colijn, A. P.; Collot, J.; Colombo, T.; Compostella, G.; Muiño, P. Conde; Coniavitis, E.; Connell, S. H.; Connelly, I. A.; Consorti, V.; Constantinescu, S.; Conta, C.; Conti, G.; Conventi, F.; Cooke, M.; Cooper, B. D.; Cooper-Sarkar, A. M.; Cornelissen, T.; Corradi, M.; Corriveau, F.; Corso-Radu, A.; Cortes-Gonzalez, A.; Cortiana, G.; Costa, G.; Costa, M. J.; Costanzo, D.; Cottin, G.; Cowan, G.; Cox, B. E.; Cranmer, K.; Crawley, S. J.; Cree, G.; Crépé-Renaudin, S.; Crescioli, F.; Cribbs, W. A.; Ortuzar, M. Crispin; Cristinziani, M.; Croft, V.; Crosetti, G.; Donszelmann, T. Cuhadar; Cummings, J.; Curatolo, M.; Cúth, J.; Cuthbert, C.; Czirr, H.; Czodrowski, P.; D'Auria, S.; D'Onofrio, M.; De Sousa, M. J. Da Cunha Sargedas; Via, C. Da; Dabrowski, W.; Dai, T.; Dale, O.; Dallaire, F.; Dallapiccola, C.; Dam, M.; Dandoy, J. R.; Dang, N. P.; Daniells, A. C.; Dann, N. S.; Danninger, M.; Hoffmann, M. Dano; Dao, V.; Darbo, G.; Darmora, S.; Dassoulas, J.; Dattagupta, A.; Davey, W.; David, C.; Davidek, T.; Davies, M.; Davison, P.; Davygora, Y.; Dawe, E.; Dawson, I.; Daya-Ishmukhametova, R. K.; De, K.; de Asmundis, R.; De Benedetti, A.; De Castro, S.; De Cecco, S.; De Groot, N.; de Jong, P.; De la Torre, H.; De Lorenzi, F.; De Pedis, D.; De Salvo, A.; De Sanctis, U.; De Santo, A.; De Regie, J. B. De Vivie; Dearnaley, W. J.; Debbe, R.; Debenedetti, C.; Dedovich, D. V.; Deigaard, I.; Del Peso, J.; Del Prete, T.; Delgove, D.; Deliot, F.; Delitzsch, C. M.; Deliyergiyev, M.; Dell'Acqua, A.; Dell'Asta, L.; Dell'Orso, M.; Della Pietra, M.; della Volpe, D.; Delmastro, M.; Delsart, P. A.; Deluca, C.; DeMarco, D. A.; Demers, S.; Demichev, M.; Demilly, A.; Denisov, S. P.; Denysiuk, D.; Derendarz, D.; Derkaoui, J. E.; Derue, F.; Dervan, P.; Desch, K.; Deterre, C.; Dette, K.; Deviveiros, P. O.; Dewhurst, A.; Dhaliwal, S.; Di Ciaccio, A.; Di Ciaccio, L.; Di Clemente, W. K.; Di Domenico, A.; Di Donato, C.; Di Girolamo, A.; Di Girolamo, B.; Di Mattia, A.; Di Micco, B.; Di Nardo, R.; Di Simone, A.; Di Sipio, R.; Di Valentino, D.; Diaconu, C.; Diamond, M.; Dias, F. A.; Diaz, M. A.; Diehl, E. B.; Dietrich, J.; Diglio, S.; Dimitrievska, A.; Dingfelder, J.; Dita, P.; Dita, S.; Dittus, F.; Djama, F.; Djobava, T.; Djuvsland, J. I.; do Vale, M. A. B.; Dobos, D.; Dobre, M.; Doglioni, C.; Dohmae, T.; Dolejsi, J.; Dolezal, Z.; Dolgoshein, B. A.; Donadelli, M.; Donati, S.; Dondero, P.; Donini, J.; Dopke, J.; Doria, A.; Dova, M. T.; Doyle, A. T.; Drechsler, E.; Dris, M.; Du, Y.; Duarte-Campderros, J.; Duchovni, E.; Duckeck, G.; Ducu, O. A.; Duda, D.; Dudarev, A.; Duflot, L.; Duguid, L.; Dührssen, M.; Dunford, M.; Yildiz, H. Duran; Düren, M.; Durglishvili, A.; Duschinger, D.; Dutta, B.; Dyndal, M.; Eckardt, C.; Ecker, K. M.; Edgar, R. C.; Edson, W.; Edwards, N. C.; Eifert, T.; Eigen, G.; Einsweiler, K.; Ekelof, T.; Kacimi, M. El; Ellajosyula, V.; Ellert, M.; Elles, S.; Ellinghaus, F.; Elliot, A. A.; Ellis, N.; Elmsheuser, J.; Elsing, M.; Emeliyanov, D.; Enari, Y.; Endner, O. C.; Endo, M.; Ennis, J. S.; Erdmann, J.; Ereditato, A.; Ernis, G.; Ernst, J.; Ernst, M.; Errede, S.; Ertel, E.; Escalier, M.; Esch, H.; Escobar, C.; Esposito, B.; Etienvre, A. I.; Etzion, E.; Evans, H.; Ezhilov, A.; Fabbri, F.; Fabbri, L.; Facini, G.; Fakhrutdinov, R. M.; Falciano, S.; Falla, R. J.; Faltova, J.; Fang, Y.; Fanti, M.; Farbin, A.; Farilla, A.; Farina, C.; Farooque, T.; Farrell, S.; Farrington, S. M.; Farthouat, P.; Fassi, F.; Fassnacht, P.; Fassouliotis, D.; Giannelli, M. Faucci; Favareto, A.; Fawcett, W. J.; Fayard, L.; Fedin, O. L.; Fedorko, W.; Feigl, S.; Feligioni, L.; Feng, C.; Feng, E. J.; Feng, H.; Fenyuk, A. B.; Feremenga, L.; Martinez, P. Fernandez; Perez, S. Fernandez; Ferrando, J.; Ferrari, A.; Ferrari, P.; Ferrari, R.; de Lima, D. E. Ferreira; Ferrer, A.; Ferrere, D.; Ferretti, C.; Parodi, A. Ferretto; Fiedler, F.; Filipčič, A.; Filipuzzi, M.; Filthaut, F.; Fincke-Keeler, M.; Finelli, K. D.; Fiolhais, M. C. N.; Fiorini, L.; Firan, A.; Fischer, A.; Fischer, C.; Fischer, J.; Fisher, W. C.; Flaschel, N.; Fleck, I.; Fleischmann, P.; Fletcher, G. T.; Fletcher, G.; Fletcher, R. R. M.; Flick, T.; Floderus, A.; Castillo, L. R. Flores; Flowerdew, M. J.; Forcolin, G. T.; Formica, A.; Forti, A.; Foster, A. G.; Fournier, D.; Fox, H.; Fracchia, S.; Francavilla, P.; Franchini, M.; Francis, D.; Franconi, L.; Franklin, M.; Frate, M.; Fraternali, M.; Freeborn, D.; Fressard-Batraneanu, S. M.; Friedrich, F.; Froidevaux, D.; Frost, J. A.; Fukunaga, C.; Torregrosa, E. Fullana; Fusayasu, T.; Fuster, J.; Gabaldon, C.; Gabizon, O.; Gabrielli, A.; Gabrielli, A.; Gach, G. P.; Gadatsch, S.; Gadomski, S.; Gagliardi, G.; Gagnon, L. G.; Gagnon, P.; Galea, C.; Galhardo, B.; Gallas, E. J.; Gallop, B. J.; Gallus, P.; Galster, G.; Gan, K. K.; Gao, J.; Gao, Y.; Gao, Y. S.; Walls, F. M. Garay; García, C.; Navarro, J. E. García; Garcia-Sciveres, M.; Gardner, R. W.; Garelli, N.; Garonne, V.; Bravo, A. Gascon; Gatti, C.; Gaudiello, A.; Gaudio, G.; Gaur, B.; Gauthier, L.; Gavrilenko, I. L.; Gay, C.; Gaycken, G.; Gazis, E. N.; Gecse, Z.; Gee, C. N. P.; Geich-Gimbel, Ch.; Geisler, M. P.; Gemme, C.; Genest, M. H.; Geng, C.; Gentile, S.; George, S.; Gerbaudo, D.; Gershon, A.; Ghasemi, S.; Ghazlane, H.; Ghneimat, M.; Giacobbe, B.; Giagu, S.; Giannetti, P.; Gibbard, B.; Gibson, S. M.; Gignac, M.; Gilchriese, M.; Gillam, T. P. S.; Gillberg, D.; Gilles, G.; Gingrich, D. M.; Giokaris, N.; Giordani, M. P.; Giorgi, F. M.; Giorgi, F. M.; Giraud, P. F.; Giromini, P.; Giugni, D.; Giuli, F.; Giuliani, C.; Giulini, M.; Gjelsten, B. K.; Gkaitatzis, S.; Gkialas, I.; Gkougkousis, E. L.; Gladilin, L. K.; Glasman, C.; Glatzer, J.; Glaysher, P. C. F.; Glazov, A.; Goblirsch-Kolb, M.; Godlewski, J.; Goldfarb, S.; Golling, T.; Golubkov, D.; Gomes, A.; Gonçalo, R.; Costa, J. Goncalves Pinto Firmino Da; Gonella, L.; Gongadze, A.; de la Hoz, S. González; Parra, G. Gonzalez; Gonzalez-Sevilla, S.; Goossens, L.; Gorbounov, P. A.; Gordon, H. A.; Gorelov, I.; Gorini, B.; Gorini, E.; Gorišek, A.; Gornicki, E.; Goshaw, A. T.; Gössling, C.; Gostkin, M. I.; Goudet, C. R.; Goujdami, D.; Goussiou, A. G.; Govender, N.; Gozani, E.; Graber, L.; Grabowska-Bold, I.; Gradin, P. O. J.; Grafström, P.; Gramling, J.; Gramstad, E.; Grancagnolo, S.; Gratchev, V.; Gray, H. M.; Graziani, E.; Greenwood, Z. D.; Grefe, C.; Gregersen, K.; Gregor, I. M.; Grenier, P.; Grevtsov, K.; Griffiths, J.; Grillo, A. A.; Grimm, K.; Grinstein, S.; Gris, Ph.; Grivaz, J.-F.; Groh, S.; Grohs, J. P.; Gross, E.; Grosse-Knetter, J.; Grossi, G. C.; Grout, Z. J.; Guan, L.; Guan, W.; Guenther, J.; Guescini, F.; Guest, D.; Gueta, O.; Guido, E.; Guillemin, T.; Guindon, S.; Gul, U.; Gumpert, C.; Guo, J.; Guo, Y.; Gupta, S.; Gustavino, G.; Gutierrez, P.; Ortiz, N. G. Gutierrez; Gutschow, C.; Guyot, C.; Gwenlan, C.; Gwilliam, C. B.; Haas, A.; Haber, C.; Hadavand, H. K.; Haddad, N.; Hadef, A.; Haefner, P.; Hageböck, S.; Hajduk, Z.; Hakobyan, H.; Haleem, M.; Haley, J.; Hall, D.; Halladjian, G.; Hallewell, G. D.; Hamacher, K.; Hamal, P.; Hamano, K.; Hamilton, A.; Hamity, G. N.; Hamnett, P. G.; Han, L.; Hanagaki, K.; Hanawa, K.; Hance, M.; Haney, B.; Hanke, P.; Hanna, R.; Hansen, J. B.; Hansen, J. D.; Hansen, M. C.; Hansen, P. H.; Hara, K.; Hard, A. S.; Harenberg, T.; Hariri, F.; Harkusha, S.; Harrington, R. D.; Harrison, P. F.; Hartjes, F.; Hasegawa, M.; Hasegawa, Y.; Hasib, A.; Hassani, S.; Haug, S.; Hauser, R.; Hauswald, L.; Havranek, M.; Hawkes, C. M.; Hawkings, R. J.; Hawkins, A. D.; Hayden, D.; Hays, C. P.; Hays, J. M.; Hayward, H. S.; Haywood, S. J.; Head, S. J.; Heck, T.; Hedberg, V.; Heelan, L.; Heim, S.; Heim, T.; Heinemann, B.; Heinrich, J. J.; Heinrich, L.; Heinz, C.; Hejbal, J.; Helary, L.; Hellman, S.; Helsens, C.; Henderson, J.; Henderson, R. C. W.; Heng, Y.; Henkelmann, S.; Correia, A. M. Henriques; Henrot-Versille, S.; Herbert, G. H.; Jiménez, Y. Hernández; Herten, G.; Hertenberger, R.; Hervas, L.; Hesketh, G. G.; Hessey, N. P.; Hetherly, J. W.; Hickling, R.; Higón-Rodriguez, E.; Hill, E.; Hill, J. C.; Hiller, K. H.; Hillier, S. J.; Hinchliffe, I.; Hines, E.; Hinman, R. R.; Hirose, M.; Hirschbuehl, D.; Hobbs, J.; Hod, N.; Hodgkinson, M. C.; Hodgson, P.; Hoecker, A.; Hoeferkamp, M. R.; Hoenig, F.; Hohlfeld, M.; Hohn, D.; Holmes, T. R.; Homann, M.; Hong, T. M.; Hooberman, B. H.; Hopkins, W. H.; Horii, Y.; Horton, A. J.; Hostachy, J.-Y.; Hou, S.; Hoummada, A.; Howard, J.; Howarth, J.; Hrabovsky, M.; Hristova, I.; Hrivnac, J.; Hryn'ova, T.; Hrynevich, A.; Hsu, C.; Hsu, P. J.; Hsu, S.-C.; Hu, D.; Hu, Q.; Huang, Y.; Hubacek, Z.; Hubaut, F.; Huegging, F.; Huffman, T. B.; Hughes, E. W.; Hughes, G.; Huhtinen, M.; Hülsing, T. A.; Huseynov, N.; Huston, J.; Huth, J.; Iacobucci, G.; Iakovidis, G.; Ibragimov, I.; Iconomidou-Fayard, L.; Ideal, E.; Idrissi, Z.; Iengo, P.; Igonkina, O.; Iizawa, T.; Ikegami, Y.; Ikeno, M.; Ilchenko, Y.; Iliadis, D.; Ilic, N.; Ince, T.; Introzzi, G.; Ioannou, P.; Iodice, M.; Iordanidou, K.; Ippolito, V.; Quiles, A. Irles; Isaksson, C.; Ishino, M.; Ishitsuka, M.; Ishmukhametov, R.; Issever, C.; Istin, S.; Ito, F.; Ponce, J. M. Iturbe; Iuppa, R.; Ivarsson, J.; Iwanski, W.; Iwasaki, H.; Izen, J. M.; Izzo, V.; Jabbar, S.; Jackson, B.; Jackson, M.; Jackson, P.; Jain, V.; Jakobi, K. B.; Jakobs, K.; Jakobsen, S.; Jakoubek, T.; Jamin, D. O.; Jana, D. K.; Jansen, E.; Jansky, R.; Janssen, J.; Janus, M.; Jarlskog, G.; Javadov, N.; Javůrek, T.; Jeanneau, F.; Jeanty, L.; Jejelava, J.; Jeng, G.-Y.; Jennens, D.; Jenni, P.; Jentzsch, J.; Jeske, C.; Jézéquel, S.; Ji, H.; Jia, J.; Jiang, H.; Jiang, Y.; Jiggins, S.; Pena, J. Jimenez; Jin, S.; Jinaru, A.; Jinnouchi, O.; Johansson, P.; Johns, K. A.; Johnson, W. J.; Jon-And, K.; Jones, G.; Jones, R. W. L.; Jones, S.; Jones, T. J.; Jongmanns, J.; Jorge, P. M.; Jovicevic, J.; Ju, X.; Rozas, A. Juste; Köhler, M. K.; Kaczmarska, A.; Kado, M.; Kagan, H.; Kagan, M.; Kahn, S. J.; Kajomovitz, E.; Kalderon, C. W.; Kaluza, A.; Kama, S.; Kamenshchikov, A.; Kanaya, N.; Kaneti, S.; Kantserov, V. A.; Kanzaki, J.; Kaplan, B.; Kaplan, L. S.; Kapliy, A.; Kar, D.; Karakostas, K.; Karamaoun, A.; Karastathis, N.; Kareem, M. J.; Karentzos, E.; Karnevskiy, M.; Karpov, S. N.; Karpova, Z. M.; Karthik, K.; Kartvelishvili, V.; Karyukhin, A. N.; Kasahara, K.; Kashif, L.; Kass, R. D.; Kastanas, A.; Kataoka, Y.; Kato, C.; Katre, A.; Katzy, J.; Kawade, K.; Kawagoe, K.; Kawamoto, T.; Kawamura, G.; Kazama, S.; Kazanin, V. F.; Keeler, R.; Kehoe, R.; Keller, J. S.; Kempster, J. J.; Keoshkerian, H.; Kepka, O.; Kerševan, B. P.; Kersten, S.; Keyes, R. A.; Khalil-zada, F.; Khandanyan, H.; Khanov, A.; Kharlamov, A. G.; Khoo, T. J.; Khovanskiy, V.; Khramov, E.; Khubua, J.; Kido, S.; Kim, H. Y.; Kim, S. H.; Kim, Y. K.; Kimura, N.; Kind, O. M.; King, B. T.; King, M.; King, S. B.; Kirk, J.; Kiryunin, A. E.; Kishimoto, T.; Kisielewska, D.; Kiss, F.; Kiuchi, K.; Kivernyk, O.; Kladiva, E.; Klein, M. H.; Klein, M.; Klein, U.; Kleinknecht, K.; Klimek, P.; Klimentov, A.; Klingenberg, R.; Klinger, J. A.; Klioutchnikova, T.; Kluge, E.-E.; Kluit, P.; Kluth, S.; Knapik, J.; Kneringer, E.; Knoops, E. B. F. G.; Knue, A.; Kobayashi, A.; Kobayashi, D.; Kobayashi, T.; Kobel, M.; Kocian, M.; Kodys, P.; Koffas, T.; Koffeman, E.; Kogan, L. A.; Kohriki, T.; Koi, T.; Kolanoski, H.; Kolb, M.; Koletsou, I.; Komar, A. A.; Komori, Y.; Kondo, T.; Kondrashova, N.; Köneke, K.; König, A. C.; Kono, T.; Konoplich, R.; Konstantinidis, N.; Kopeliansky, R.; Koperny, S.; Köpke, L.; Kopp, A. K.; Korcyl, K.; Kordas, K.; Korn, A.; Korol, A. A.; Korolkov, I.; Korolkova, E. V.; Kortner, O.; Kortner, S.; Kosek, T.; Kostyukhin, V. V.; Kotov, V. M.; Kotwal, A.; Kourkoumeli-Charalampidi, A.; Kourkoumelis, C.; Kouskoura, V.; Koutsman, A.; Kowalewska, A. B.; Kowalewski, R.; Kowalski, T. Z.; Kozanecki, W.; Kozhin, A. S.; Kramarenko, V. A.; Kramberger, G.; Krasnopevtsev, D.; Krasny, M. W.; Krasznahorkay, A.; Kraus, J. K.; Kravchenko, A.; Kretz, M.; Kretzschmar, J.; Kreutzfeldt, K.; Krieger, P.; Krizka, K.; Kroeninger, K.; Kroha, H.; Kroll, J.; Kroseberg, J.; Krstic, J.; Kruchonak, U.; Krüger, H.; Krumnack, N.; Kruse, A.; Kruse, M. C.; Kruskal, M.; Kubota, T.; Kucuk, H.; Kuday, S.; Kuechler, J. T.; Kuehn, S.; Kugel, A.; Kuger, F.; Kuhl, A.; Kuhl, T.; Kukhtin, V.; Kukla, R.; Kulchitsky, Y.; Kuleshov, S.; Kuna, M.; Kunigo, T.; Kupco, A.; Kurashige, H.; Kurochkin, Y. A.; Kus, V.; Kuwertz, E. S.; Kuze, M.; Kvita, J.; Kwan, T.; Kyriazopoulos, D.; Rosa, A. La; Navarro, J. L. La Rosa; Rotonda, L. La; Lacasta, C.; Lacava, F.; Lacey, J.; Lacker, H.; Lacour, D.; Lacuesta, V. R.; Ladygin, E.; Lafaye, R.; Laforge, B.; Lagouri, T.; Lai, S.; Lammers, S.; Lampl, W.; Lançon, E.; Landgraf, U.; Landon, M. P. J.; Lang, V. S.; Lange, J. C.; Lankford, A. J.; Lanni, F.; Lantzsch, K.; Lanza, A.; Laplace, S.; Lapoire, C.; Laporte, J. F.; Lari, T.; Manghi, F. Lasagni; Lassnig, M.; Laurelli, P.; Lavrijsen, W.; Law, A. T.; Laycock, P.; Lazovich, T.; Lazzaroni, M.; Dortz, O. Le; Guirriec, E. Le; Menedeu, E. Le; Quilleuc, E. P. Le; LeBlanc, M.; LeCompte, T.; Ledroit-Guillon, F.; Lee, C. A.; Lee, S. C.; Lee, L.; Lefebvre, G.; Lefebvre, M.; Legger, F.; Leggett, C.; Lehan, A.; Miotto, G. Lehmann; Lei, X.; Leight, W. A.; Leisos, A.; Leister, A. G.; Leite, M. A. L.; Leitner, R.; Lellouch, D.; Lemmer, B.; Leney, K. J. C.; Lenz, T.; Lenzi, B.; Leone, R.; Leone, S.; Leonidopoulos, C.; Leontsinis, S.; Lerner, G.; Leroy, C.; Lesage, A. A. J.; Lester, C. G.; Levchenko, M.; Levêque, J.; Levin, D.; Levinson, L. J.; Levy, M.; Leyko, A. M.; Leyton, M.; Li, B.; Li, H.; Li, H. L.; Li, L.; Li, L.; Li, Q.; Li, S.; Li, X.; Li, Y.; Liang, Z.; Liao, H.; Liberti, B.; Liblong, A.; Lichard, P.; Lie, K.; Liebal, J.; Liebig, W.; Limbach, C.; Limosani, A.; Lin, S. C.; Lin, T. H.; Lindquist, B. E.; Lipeles, E.; Lipniacka, A.; Lisovyi, M.; Liss, T. M.; Lissauer, D.; Lister, A.; Litke, A. M.; Liu, B.; Liu, D.; Liu, H.; Liu, H.; Liu, J.; Liu, J. B.; Liu, K.; Liu, L.; Liu, M.; Liu, M.; Liu, Y. L.; Liu, Y.; Livan, M.; Lleres, A.; Merino, J. Llorente; Lloyd, S. L.; Sterzo, F. Lo; Lobodzinska, E.; Loch, P.; Lockman, W. S.; Loebinger, F. K.; Loevschall-Jensen, A. E.; Loew, K. M.; Loginov, A.; Lohse, T.; Lohwasser, K.; Lokajicek, M.; Long, B. A.; Long, J. D.; Long, R. E.; Longo, L.; Looper, K. A.; Lopes, L.; Mateos, D. Lopez; Paredes, B. Lopez; Paz, I. Lopez; Solis, A. Lopez; Lorenz, J.; Martinez, N. Lorenzo; Losada, M.; Lösel, P. J.; Lou, X.; Lounis, A.; Love, J.; Love, P. A.; Lu, H.; Lu, N.; Lubatti, H. J.; Luci, C.; Lucotte, A.; Luedtke, C.; Luehring, F.; Lukas, W.; Luminari, L.; Lundberg, O.; Lund-Jensen, B.; Lynn, D.; Lysak, R.; Lytken, E.; Lyubushkin, V.; Ma, H.; Ma, L. L.; Ma, Y.; Maccarrone, G.; Macchiolo, A.; Macdonald, C. M.; Maček, B.; Miguens, J. Machado; Madaffari, D.; Madar, R.; Maddocks, H. J.; Mader, W. F.; Madsen, A.; Maeda, J.; Maeland, S.; Maeno, T.; Maevskiy, A.; Magradze, E.; Mahlstedt, J.; Maiani, C.; Maidantchik, C.; Maier, A. A.; Maier, T.; Maio, A.; Majewski, S.; Makida, Y.; Makovec, N.; Malaescu, B.; Malecki, Pa.; Maleev, V. P.; Malek, F.; Mallik, U.; Malon, D.; Malone, C.; Maltezos, S.; Malyshev, V. M.; Malyukov, S.; Mamuzic, J.; Mancini, G.; Mandelli, B.; Mandelli, L.; Mandić, I.; Maneira, J.; Andrade Filho, L. Manhaes de; Ramos, J. Manjarres; Mann, A.; Mansoulie, B.; Mantifel, R.; Mantoani, M.; Manzoni, S.; Mapelli, L.; Marceca, G.; March, L.; Marchiori, G.; Marcisovsky, M.; Marjanovic, M.; Marley, D. E.; Marroquim, F.; Marsden, S. P.; Marshall, Z.; Marti, L. F.; Marti-Garcia, S.; Martin, B.; Martin, T. A.; Martin, V. J.; Latour, B. Martin dit; Martinez, M.; Martin-Haugh, S.; Martoiu, V. S.; Martyniuk, A. C.; Marx, M.; Marzano, F.; Marzin, A.; Masetti, L.; Mashimo, T.; Mashinistov, R.; Masik, J.; Maslennikov, A. L.; Massa, I.; Massa, L.; Mastrandrea, P.; Mastroberardino, A.; Masubuchi, T.; Mättig, P.; Mattmann, J.; Maurer, J.; Maxfield, S. J.; Maximov, D. A.; Mazini, R.; Mazza, S. M.; Fadden, N. C. Mc; Goldrick, G. Mc; Kee, S. P. Mc; McCarn, A.; McCarthy, R. L.; McCarthy, T. G.; McClymont, L. I.; McFarlane, K. W.; Mcfayden, J. A.; Mchedlidze, G.; McMahon, S. J.; McPherson, R. A.; Medinnis, M.; Meehan, S.; Mehlhase, S.; Mehta, A.; Meier, K.; Meineck, C.; Meirose, B.; Garcia, B. R. Mellado; Meloni, F.; Mengarelli, A.; Menke, S.; Meoni, E.; Mercurio, K. M.; Mergelmeyer, S.; Mermod, P.; Merola, L.; Meroni, C.; Merritt, F. S.; Messina, A.; Metcalfe, J.; Mete, A. S.; Meyer, C.; Meyer, C.; Meyer, J.-P.; Meyer, J.; Theenhausen, H. Meyer Zu; Middleton, R. P.; Miglioranzi, S.; Mijović, L.; Mikenberg, G.; Mikestikova, M.; Mikuž, M.; Milesi, M.; Milic, A.; Miller, D. W.; Mills, C.; Milov, A.; Milstead, D. A.; Minaenko, A. A.; Minami, Y.; Minashvili, I. A.; Mincer, A. I.; Mindur, B.; Mineev, M.; Ming, Y.; Mir, L. M.; Mistry, K. P.; Mitani, T.; Mitrevski, J.; Mitsou, V. A.; Miucci, A.; Miyagawa, P. S.; Mjörnmark, J. U.; Moa, T.; Mochizuki, K.; Mohapatra, S.; Mohr, W.; Molander, S.; Moles-Valls, R.; Monden, R.; Mondragon, M. C.; Mönig, K.; Monk, J.; Monnier, E.; Montalbano, A.; Berlingen, J. Montejo; Monticelli, F.; Monzani, S.; Moore, R. W.; Morange, N.; Moreno, D.; Llácer, M. Moreno; Morettini, P.; Mori, D.; Mori, T.; Morii, M.; Morinaga, M.; Morisbak, V.; Moritz, S.; Morley, A. K.; Mornacchi, G.; Morris, J. D.; Mortensen, S. S.; Morvaj, L.; Mosidze, M.; Moss, J.; Motohashi, K.; Mount, R.; Mountricha, E.; Mouraviev, S. V.; Moyse, E. J. W.; Muanza, S.; Mudd, R. D.; Mueller, F.; Mueller, J.; Mueller, R. S. P.; Mueller, T.; Muenstermann, D.; Mullen, P.; Mullier, G. A.; Sanchez, F. J. Munoz; Quijada, J. A. Murillo; Murray, W. J.; Murrone, A.; Musheghyan, H.; Muskinja, M.; Myagkov, A. G.; Myska, M.; Nachman, B. P.; Nackenhorst, O.; Nadal, J.; Nagai, K.; Nagai, R.; Nagano, K.; Nagasaka, Y.; Nagata, K.; Nagel, M.; Nagy, E.; Nairz, A. M.; Nakahama, Y.; Nakamura, K.; Nakamura, T.; Nakano, I.; Namasivayam, H.; Garcia, R. F. Naranjo; Narayan, R.; Villar, D. I. Narrias; Naryshkin, I.; Naumann, T.; Navarro, G.; Nayyar, R.; Neal, H. A.; Nechaeva, P. Yu.; Neep, T. J.; Nef, P. D.; Negri, A.; Negrini, M.; Nektarijevic, S.; Nellist, C.; Nelson, A.; Nemecek, S.; Nemethy, P.; Nepomuceno, A. A.; Nessi, M.; Neubauer, M. S.; Neumann, M.; Neves, R. M.; Nevski, P.; Newman, P. R.; Nguyen, D. H.; Nickerson, R. B.; Nicolaidou, R.; Nicquevert, B.; Nielsen, J.; Nikiforov, A.; Nikolaenko, V.; Nikolic-Audit, I.; Nikolopoulos, K.; Nilsen, J. K.; Nilsson, P.; Ninomiya, Y.; Nisati, A.; Nisius, R.; Nobe, T.; Nodulman, L.; Nomachi, M.; Nomidis, I.; Nooney, T.; Norberg, S.; Nordberg, M.; Norjoharuddeen, N.; Novgorodova, O.; Nowak, S.; Nozaki, M.; Nozka, L.; Ntekas, K.; Nurse, E.; Nuti, F.; O'grady, F.; O'Neil, D. C.; O'Rourke, A. A.; O'Shea, V.; Oakham, F. G.; Oberlack, H.; Obermann, T.; Ocariz, J.; Ochi, A.; Ochoa, I.; Ochoa-Ricoux, J. P.; Oda, S.; Odaka, S.; Ogren, H.; Oh, A.; Oh, S. H.; Ohm, C. C.; Ohman, H.; Oide, H.; Okawa, H.; Okumura, Y.; Okuyama, T.; Olariu, A.; Seabra, L. F. Oleiro; Pino, S. A. Olivares; Damazio, D. Oliveira; Olszewski, A.; Olszowska, J.; Onofre, A.; Onogi, K.; Onyisi, P. U. E.; Oram, C. J.; Oreglia, M. J.; Oren, Y.; Orestano, D.; Orlando, N.; Orr, R. S.; Osculati, B.; Ospanov, R.; Garzon, G. Otero y.; Otono, H.; Ouchrif, M.; Ould-Saada, F.; Ouraou, A.; Oussoren, K. P.; Ouyang, Q.; Ovcharova, A.; Owen, M.; Owen, R. E.; Ozcan, V. E.; Ozturk, N.; Pachal, K.; Pages, A. Pacheco; Aranda, C. Padilla; Pagáčová, M.; Griso, S. Pagan; Paige, F.; Pais, P.; Pajchel, K.; Palacino, G.; Palestini, S.; Palka, M.; Pallin, D.; Palma, A.; Panagiotopoulou, E. St.; Pandini, C. E.; Vazquez, J. G. Panduro; Pani, P.; Panitkin, S.; Pantea, D.; Paolozzi, L.; Papadopoulou, Th. D.; Papageorgiou, K.; Paramonov, A.; Hernandez, D. Paredes; Parker, A. J.; Parker, M. A.; Parker, K. A.; Parodi, F.; Parsons, J. A.; Parzefall, U.; Pascuzzi, V.; Pasqualucci, E.; Passaggio, S.; Pastore, F.; Pastore, Fr.; Pásztor, G.; Pataraia, S.; Patel, N. D.; Pater, J. R.; Pauly, T.; Pearce, J.; Pearson, B.; Pedersen, L. E.; Pedersen, M.; Lopez, S. Pedraza; Pedro, R.; Peleganchuk, S. V.; Pelikan, D.; Penc, O.; Peng, C.; Peng, H.; Penwell, J.; Peralva, B. S.; Perego, M. M.; Perepelitsa, D. V.; Codina, E. Perez; Perini, L.; Pernegger, H.; Perrella, S.; Peschke, R.; Peshekhonov, V. D.; Peters, K.; Peters, R. F. Y.; Petersen, B. A.; Petersen, T. C.; Petit, E.; Petridis, A.; Petridou, C.; Petroff, P.; Petrolo, E.; Petrov, M.; Petrucci, F.; Pettersson, N. E.; Peyaud, A.; Pezoa, R.; Phillips, P. W.; Piacquadio, G.; Pianori, E.; Picazio, A.; Piccaro, E.; Piccinini, M.; Pickering, M. A.; Piegaia, R.; Pilcher, J. E.; Pilkington, A. D.; Pin, A. W. J.; Pina, J.; Pinamonti, M.; Pinfold, J. L.; Pingel, A.; Pires, S.; Pirumov, H.; Pitt, M.; Plazak, L.; Pleier, M.-A.; Pleskot, V.; Plotnikova, E.; Plucinski, P.; Pluth, D.; Poettgen, R.; Poggioli, L.; Pohl, D.; Polesello, G.; Poley, A.; Policicchio, A.; Polifka, R.; Polini, A.; Pollard, C. S.; Polychronakos, V.; Pommès, K.; Pontecorvo, L.; Pope, B. G.; Popeneciu, G. A.; Popovic, D. S.; Poppleton, A.; Pospisil, S.; Potamianos, K.; Potrap, I. N.; Potter, C. J.; Potter, C. T.; Poulard, G.; Poveda, J.; Pozdnyakov, V.; Astigarraga, M. E. Pozo; Pralavorio, P.; Pranko, A.; Prell, S.; Price, D.; Price, L. E.; Primavera, M.; Prince, S.; Proissl, M.; Prokofiev, K.; Prokoshin, F.; Protopopescu, S.; Proudfoot, J.; Przybycien, M.; Puddu, D.; Puldon, D.; Purohit, M.; Puzo, P.; Qian, J.; Qin, G.; Qin, Y.; Quadt, A.; Quayle, W. B.; Queitsch-Maitland, M.; Quilty, D.; Raddum, S.; Radeka, V.; Radescu, V.; Radhakrishnan, S. K.; Radloff, P.; Rados, P.; Ragusa, F.; Rahal, G.; Raine, J. A.; Rajagopalan, S.; Rammensee, M.; Rangel-Smith, C.; Ratti, M. G.; Rauscher, F.; Rave, S.; Ravenscroft, T.; Raymond, M.; Read, A. L.; Readioff, N. P.; Rebuzzi, D. M.; Redelbach, A.; Redlinger, G.; Reece, R.; Reeves, K.; Rehnisch, L.; Reichert, J.; Reisin, H.; Rembser, C.; Ren, H.; Rescigno, M.; Resconi, S.; Rezanova, O. L.; Reznicek, P.; Rezvani, R.; Richter, R.; Richter, S.; Richter-Was, E.; Ricken, O.; Ridel, M.; Rieck, P.; Riegel, C. J.; Rieger, J.; Rifki, O.; Rijssenbeek, M.; Rimoldi, A.; Rinaldi, L.; Ristić, B.; Ritsch, E.; Riu, I.; Rizatdinova, F.; Rizvi, E.; Rizzi, C.; Robertson, S. H.; Robichaud-Veronneau, A.; Robinson, D.; Robinson, J. E. M.; Robson, A.; Roda, C.; Rodina, Y.; Perez, A. Rodriguez; Rodriguez, D. Rodriguez; Roe, S.; Rogan, C. S.; Røhne, O.; Romaniouk, A.; Romano, M.; Saez, S. M. Romano; Adam, E. Romero; Rompotis, N.; Ronzani, M.; Roos, L.; Ros, E.; Rosati, S.; Rosbach, K.; Rose, P.; Rosenthal, O.; Rossetti, V.; Rossi, E.; Rossi, L. P.; Rosten, J. H. N.; Rosten, R.; Rotaru, M.; Roth, I.; Rothberg, J.; Rousseau, D.; Royon, C. R.; Rozanov, A.; Rozen, Y.; Ruan, X.; Rubbo, F.; Rubinskiy, I.; Rud, V. I.; Rudolph, M. S.; Rühr, F.; Ruiz-Martinez, A.; Rurikova, Z.; Rusakovich, N. A.; Ruschke, A.; Russell, H. L.; Rutherfoord, J. P.; Ruthmann, N.; Ryabov, Y. F.; Rybar, M.; Rybkin, G.; Ryu, S.; Ryzhov, A.; Saavedra, A. F.; Sabato, G.; Sacerdoti, S.; Sadrozinski, H. F.-W.; Sadykov, R.; Tehrani, F. Safai; Saha, P.; Sahinsoy, M.; Saimpert, M.; Saito, T.; Sakamoto, H.; Sakurai, Y.; Salamanna, G.; Salamon, A.; Loyola, J. E. Salazar; Salek, D.; De Bruin, P. H. Sales; Salihagic, D.; Salnikov, A.; Salt, J.; Salvatore, D.; Salvatore, F.; Salvucci, A.; Salzburger, A.; Sammel, D.; Sampsonidis, D.; Sanchez, A.; Sánchez, J.; Martinez, V. Sanchez; Sandaker, H.; Sandbach, R. L.; Sander, H. G.; Sanders, M. P.; Sandhoff, M.; Sandoval, C.; Sandstroem, R.; Sankey, D. P. C.; Sannino, M.; Sansoni, A.; Santoni, C.; Santonico, R.; Santos, H.; Castillo, I. Santoyo; Sapp, K.; Sapronov, A.; Saraiva, J. G.; Sarrazin, B.; Sasaki, O.; Sasaki, Y.; Sato, K.; Sauvage, G.; Sauvan, E.; Savage, G.; Savard, P.; Sawyer, C.; Sawyer, L.; Saxon, J.; Sbarra, C.; Sbrizzi, A.; Scanlon, T.; Scannicchio, D. A.; Scarcella, M.; Scarfone, V.; Schaarschmidt, J.; Schacht, P.; Schaefer, D.; Schaefer, R.; Schaeffer, J.; Schaepe, S.; Schaetzel, S.; Schäfer, U.; Schaffer, A. C.; Schaile, D.; Schamberger, R. D.; Scharf, V.; Schegelsky, V. A.; Scheirich, D.; Schernau, M.; Schiavi, C.; Schillo, C.; Schioppa, M.; Schlenker, S.; Schmieden, K.; Schmitt, C.; Schmitt, S.; Schmitz, S.; Schneider, B.; Schnellbach, Y. J.; Schnoor, U.; Schoeffel, L.; Schoening, A.; Schoenrock, B. D.; Schopf, E.; Schorlemmer, A. L. S.; Schott, M.; Schovancova, J.; Schramm, S.; Schreyer, M.; Schuh, N.; Schultens, M. J.; Schultz-Coulon, H.-C.; Schulz, H.; Schumacher, M.; Schumm, B. A.; Schune, Ph.; Schwanenberger, C.; Schwartzman, A.; Schwarz, T. A.; Schwegler, Ph.; Schweiger, H.; Schwemling, Ph.; Schwienhorst, R.; Schwindling, J.; Schwindt, T.; Sciolla, G.; Scuri, F.; Scutti, F.; Searcy, J.; Seema, P.; Seidel, S. C.; Seiden, A.; Seifert, F.; Seixas, J. M.; Sekhniaidze, G.; Sekhon, K.; Sekula, S. J.; Seliverstov, D. M.; Semprini-Cesari, N.; Serfon, C.; Serin, L.; Serkin, L.; Sessa, M.; Seuster, R.; Severini, H.; Sfiligoj, T.; Sforza, F.; Sfyrla, A.; Shabalina, E.; Shaikh, N. W.; Shan, L. Y.; Shang, R.; Shank, J. T.; Shapiro, M.; Shatalov, P. B.; Shaw, K.; Shaw, S. M.; Shcherbakova, A.; Shehu, C. Y.; Sherwood, P.; Shi, L.; Shimizu, S.; Shimmin, C. O.; Shimojima, M.; Shiyakova, M.; Shmeleva, A.; Saadi, D. Shoaleh; Shochet, M. J.; Shojaii, S.; Shrestha, S.; Shulga, E.; Shupe, M. A.; Sicho, P.; Sidebo, P. E.; Sidiropoulou, O.; Sidorov, D.; Sidoti, A.; Siegert, F.; Sijacki, Dj.; Silva, J.; Silverstein, S. B.; Simak, V.; Simard, O.; Simic, Lj.; Simion, S.; Simioni, E.; Simmons, B.; Simon, D.; Simon, M.; Sinervo, P.; Sinev, N. B.; Sioli, M.; Siragusa, G.; Sivoklokov, S. Yu.; Sjölin, J.; Sjursen, T. B.; Skinner, M. B.; Skottowe, H. P.; Skubic, P.; Slater, M.; Slavicek, T.; Slawinska, M.; Sliwa, K.; Slovak, R.; Smakhtin, V.; Smart, B. H.; Smestad, L.; Smirnov, S. Yu.; Smirnov, Y.; Smirnova, L. N.; Smirnova, O.; Smith, M. N. K.; Smith, R. W.; Smizanska, M.; Smolek, K.; Snesarev, A. A.; Snidero, G.; Snyder, S.; Sobie, R.; Socher, F.; Soffer, A.; Soh, D. A.; Sokhrannyi, G.; Sanchez, C. A. Solans; Solar, M.; Soldatov, E. Yu.; Soldevila, U.; Solodkov, A. A.; Soloshenko, A.; Solovyanov, O. V.; Solovyev, V.; Sommer, P.; Son, H.; Song, H. Y.; Sood, A.; Sopczak, A.; Sopko, V.; Sorin, V.; Sosa, D.; Sotiropoulou, C. L.; Soualah, R.; Soukharev, A. M.; South, D.; Sowden, B. C.; Spagnolo, S.; Spalla, M.; Spangenberg, M.; Spanò, F.; Sperlich, D.; Spettel, F.; Spighi, R.; Spigo, G.; Spiller, L. A.; Spousta, M.; Denis, R. D. St.; Stabile, A.; Staerz, S.; Stahlman, J.; Stamen, R.; Stamm, S.; Stanecka, E.; Stanek, R. W.; Stanescu, C.; Stanescu-Bellu, M.; Stanitzki, M. M.; Stapnes, S.; Starchenko, E. A.; Stark, G. H.; Stark, J.; Staroba, P.; Starovoitov, P.; Staszewski, R.; Steinberg, P.; Stelzer, B.; Stelzer, H. J.; Stelzer-Chilton, O.; Stenzel, H.; Stewart, G. A.; Stillings, J. A.; Stockton, M. C.; Stoebe, M.; Stoicea, G.; Stolte, P.; Stonjek, S.; Stradling, A. R.; Straessner, A.; Stramaglia, M. E.; Strandberg, J.; Strandberg, S.; Strandlie, A.; Strauss, M.; Strizenec, P.; Ströhmer, R.; Strom, D. M.; Stroynowski, R.; Strubig, A.; Stucci, S. A.; Stugu, B.; Styles, N. A.; Su, D.; Su, J.; Subramaniam, R.; Suchek, S.; Sugaya, Y.; Suk, M.; Sulin, V. V.; Sultansoy, S.; Sumida, T.; Sun, S.; Sun, X.; Sundermann, J. E.; Suruliz, K.; Susinno, G.; Sutton, M. R.; Suzuki, S.; Svatos, M.; Swiatlowski, M.; Sykora, I.; Sykora, T.; Ta, D.; Taccini, C.; Tackmann, K.; Taenzer, J.; Taffard, A.; Tafirout, R.; Taiblum, N.; Takai, H.; Takashima, R.; Takeda, H.; Takeshita, T.; Takubo, Y.; Talby, M.; Talyshev, A. A.; Tam, J. Y. C.; Tan, K. G.; Tanaka, J.; Tanaka, R.; Tanaka, S.; Tannenwald, B. B.; Araya, S. Tapia; Tapprogge, S.; Tarem, S.; Tartarelli, G. F.; Tas, P.; Tasevsky, M.; Tashiro, T.; Tassi, E.; Delgado, A. Tavares; Tayalati, Y.; Taylor, A. C.; Taylor, G. N.; Taylor, P. T. E.; Taylor, W.; Teischinger, F. A.; Teixeira-Dias, P.; Temming, K. K.; Temple, D.; Kate, H. Ten; Teng, P. K.; Teoh, J. J.; Tepel, F.; Terada, S.; Terashi, K.; Terron, J.; Terzo, S.; Testa, M.; Teuscher, R. J.; Theveneaux-Pelzer, T.; Thomas, J. P.; Thomas-Wilsker, J.; Thompson, E. N.; Thompson, P. D.; Thompson, R. J.; Thompson, A. S.; Thomsen, L. A.; Thomson, E.; Thomson, M.; Tibbetts, M. J.; Torres, R. E. Ticse; Tikhomirov, V. O.; Tikhonov, Yu. A.; Timoshenko, S.; Tipton, P.; Tisserant, S.; Todome, K.; Todorov, T.; Todorova-Nova, S.; Tojo, J.; Tokár, S.; Tokushuku, K.; Tolley, E.; Tomlinson, L.; Tomoto, M.; Tompkins, L.; Toms, K.; Tong, B.; Torrence, E.; Torres, H.; Pastor, E. Torró; Toth, J.; Touchard, F.; Tovey, D. R.; Trefzger, T.; Tremblet, L.; Tricoli, A.; Trigger, I. M.; Trincaz-Duvoid, S.; Tripiana, M. F.; Trischuk, W.; Trocmé, B.; Trofymov, A.; Troncon, C.; Trottier-McDonald, M.; Trovatelli, M.; Truong, L.; Trzebinski, M.; Trzupek, A.; Tseng, J. C.-L.; Tsiareshka, P. V.; Tsipolitis, G.; Tsirintanis, N.; Tsiskaridze, S.; Tsiskaridze, V.; Tskhadadze, E. G.; Tsui, K. M.; Tsukerman, I. I.; Tsulaia, V.; Tsuno, S.; Tsybychev, D.; Tudorache, A.; Tudorache, V.; Tuna, A. N.; Tupputi, S. A.; Turchikhin, S.; Turecek, D.; Turgeman, D.; Turra, R.; Turvey, A. J.; Tuts, P. M.; Tyndel, M.; Ucchielli, G.; Ueda, I.; Ueno, R.; Ughetto, M.; Ukegawa, F.; Unal, G.; Undrus, A.; Unel, G.; Ungaro, F. C.; Unno, Y.; Unverdorben, C.; Urban, J.; Urquijo, P.; Urrejola, P.; Usai, G.; Usanova, A.; Vacavant, L.; Vacek, V.; Vachon, B.; Valderanis, C.; Santurio, E. Valdes; Valencic, N.; Valentinetti, S.; Valero, A.; Valery, L.; Valkar, S.; Vallecorsa, S.; Ferrer, J. A. Valls; Van Den Wollenberg, W.; Van Der Deijl, P. C.; van der Geer, R.; van der Graaf, H.; van Eldik, N.; van Gemmeren, P.; Van Nieuwkoop, J.; van Vulpen, I.; van Woerden, M. C.; Vanadia, M.; Vandelli, W.; Vanguri, R.; Vaniachine, A.; Vankov, P.; Vardanyan, G.; Vari, R.; Varnes, E. W.; Varol, T.; Varouchas, D.; Vartapetian, A.; Varvell, K. E.; Vasquez, J. G.; Vazeille, F.; Schroeder, T. Vazquez; Veatch, J.; Veloce, L. M.; Veloso, F.; Veneziano, S.; Ventura, A.; Venturi, M.; Venturi, N.; Venturini, A.; Vercesi, V.; Verducci, M.; Verkerke, W.; Vermeulen, J. C.; Vest, A.; Vetterli, M. C.; Viazlo, O.; Vichou, I.; Vickey, T.; Boeriu, O. E. Vickey; Viehhauser, G. H. A.; Viel, S.; Vigani, L.; Vigne, R.; Villa, M.; Perez, M. Villaplana; Vilucchi, E.; Vincter, M. G.; Vinogradov, V. B.; Vittori, C.; Vivarelli, I.; Vlachos, S.; Vlasak, M.; Vogel, M.; Vokac, P.; Volpi, G.; Volpi, M.; von der Schmitt, H.; von Toerne, E.; Vorobel, V.; Vorobev, K.; Vos, M.; Voss, R.; Vossebeld, J. H.; Vranjes, N.; Milosavljevic, M. Vranjes; Vrba, V.; Vreeswijk, M.; Vuillermet, R.; Vukotic, I.; Vykydal, Z.; Wagner, P.; Wagner, W.; Wahlberg, H.; Wahrmund, S.; Wakabayashi, J.; Walder, J.; Walker, R.; Walkowiak, W.; Wallangen, V.; Wang, C.; Wang, C.; Wang, F.; Wang, H.; Wang, H.; Wang, J.; Wang, J.; Wang, K.; Wang, R.; Wang, S. M.; Wang, T.; Wang, T.; Wang, X.; Wanotayaroj, C.; Warburton, A.; Ward, C. P.; Wardrope, D. R.; Washbrook, A.; Watkins, P. M.; Watson, A. T.; Watson, I. J.; Watson, M. F.; Watts, G.; Watts, S.; Waugh, B. M.; Webb, S.; Weber, M. S.; Weber, S. W.; Webster, J. S.; Weidberg, A. R.; Weinert, B.; Weingarten, J.; Weiser, C.; Weits, H.; Wells, P. S.; Wenaus, T.; Wengler, T.; Wenig, S.; Wermes, N.; Werner, M.; Werner, P.; Wessels, M.; Wetter, J.; Whalen, K.; Whallon, N. L.; Wharton, A. M.; White, A.; White, M. J.; White, R.; White, S.; Whiteson, D.; Wickens, F. J.; Wiedenmann, W.; Wielers, M.; Wienemann, P.; Wiglesworth, C.; Wiik-Fuchs, L. A. M.; Wildauer, A.; Wilk, F.; Wilkens, H. G.; Williams, H. H.; Williams, S.; Willis, C.; Willocq, S.; Wilson, J. A.; Wingerter-Seez, I.; Winklmeier, F.; Winston, O. J.; Winter, B. T.; Wittgen, M.; Wittkowski, J.; Wollstadt, S. J.; Wolter, M. W.; Wolters, H.; Wosiek, B. K.; Wotschack, J.; Woudstra, M. J.; Wozniak, K. W.; Wu, M.; Wu, M.; Wu, S. L.; Wu, X.; Wu, Y.; Wyatt, T. R.; Wynne, B. M.; Xella, S.; Xu, D.; Xu, L.; Yabsley, B.; Yacoob, S.; Yakabe, R.; Yamaguchi, D.; Yamaguchi, Y.; Yamamoto, A.; Yamamoto, S.; Yamanaka, T.; Yamauchi, K.; Yamazaki, Y.; Yan, Z.; Yang, H.; Yang, H.; Yang, Y.; Yang, Z.; Yao, W.-M.; Yap, Y. C.; Yasu, Y.; Yatsenko, E.; Wong, K. H. Yau; Ye, J.; Ye, S.; Yeletskikh, I.; Yen, A. L.; Yildirim, E.; Yorita, K.; Yoshida, R.; Yoshihara, K.; Young, C.; Young, C. J. S.; Youssef, S.; Yu, D. R.; Yu, J.; Yu, J. M.; Yu, J.; Yuan, L.; Yuen, S. P. Y.; Yusuff, I.; Zabinski, B.; Zaidan, R.; Zaitsev, A. M.; Zakharchuk, N.; Zalieckas, J.; Zaman, A.; Zambito, S.; Zanello, L.; Zanzi, D.; Zeitnitz, C.; Zeman, M.; Zemla, A.; Zeng, J. C.; Zeng, Q.; Zengel, K.; Zenin, O.; Ženiš, T.; Zerwas, D.; Zhang, D.; Zhang, F.; Zhang, G.; Zhang, H.; Zhang, J.; Zhang, L.; Zhang, R.; Zhang, R.; Zhang, X.; Zhang, Z.; Zhao, X.; Zhao, Y.; Zhao, Z.; Zhemchugov, A.; Zhong, J.; Zhou, B.; Zhou, C.; Zhou, L.; Zhou, L.; Zhou, M.; Zhou, N.; Zhu, C. G.; Zhu, H.; Zhu, J.; Zhu, Y.; Zhuang, X.; Zhukov, K.; Zibell, A.; Zieminska, D.; Zimine, N. I.; Zimmermann, C.; Zimmermann, S.; Zinonos, Z.; Zinser, M.; Ziolkowski, M.; Živković, L.; Zobernig, G.; Zoccoli, A.; Nedden, M. zur; Zurzolo, G.; Zwalinski, L.
2016-12-01
A test of CP invariance in Higgs boson production via vector-boson fusion using the method of the Optimal Observable is presented. The analysis exploits the decay mode of the Higgs boson into a pair of τ leptons and is based on 20.3 fb^{-1} of proton-proton collision data at √{s} = 8 TeV collected by the ATLAS experiment at the LHC. Contributions from CP-violating interactions between the Higgs boson and electroweak gauge bosons are described in an effective field theory framework, in which the strength of CP violation is governed by a single parameter tilde{d}. The mean values and distributions of CP-odd observables agree with the expectation in the Standard Model and show no sign of CP violation. The CP-mixing parameter tilde{d} is constrained to the interval (-0.11,0.05) at 68% confidence level, consistent with the Standard Model expectation of tilde{d}=0.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baxter, V. D.; Rice, K.; Murphy, R.
Between October 2008 and May 2013 ORNL and ClimateMaster, Inc. (CM) engaged in a Cooperative Research and Development Agreement (CRADA) to develop a groundsource integrated heat pump (GS-IHP) system for the US residential market. A initial prototype was designed and fabricated, lab-tested, and modeled in TRNSYS (SOLAR Energy Laboratory, et al, 2010) to predict annual performance relative to 1) a baseline suite of equipment meeting minimum efficiency standards in effect in 2006 (combination of air-source heat pump (ASHP) and resistance water heater) and 2) a state-of-the-art (SOA) two-capacity ground-source heat pump with desuperheater water heater (WH) option (GSHPwDS). Predicted totalmore » annual energy savings, while providing space conditioning and water heating for a 2600 ft{sup 2} (242 m{sup 2}) house at 5 U.S. locations, ranged from 52 to 59%, averaging 55%, relative to the minimum efficiency suite. Predicted energy use for water heating was reduced 68 to 78% relative to resistance WH. Predicted total annual savings for the GSHPwDS relative to the same baseline averaged 22.6% with water heating energy use reduced by 10 to 30% from desuperheater contributions. The 1st generation (or alpha) prototype design for the GS-IHP was finalized in 2010 and field test samples were fabricated for testing by CM and by ORNL. Two of the alpha units were installed in 3700 ft{sup 2} (345 m{sup 2}) houses at the ZEBRAlliance site in Oak Ridge and field tested during 2011. Based on the steady-state performance demonstrated by the GS-IHPs it was projected that it would achieve >52% energy savings relative to the minimum efficiency suite at this specific site. A number of operational issues with the alpha units were identified indicating design changes needed to the system before market introduction could be accomplished. These were communicated to CM throughout the field test period. Based on the alpha unit test results and the diagnostic information coming from the field test experience, CM developed a 2nd generation (or beta) prototype in 2012. Field test verification units were fabricated and installed at the ZEBRAlliance site in Oak Ridge in May 2012 and at several sites near CM headquarters in Oklahoma. Field testing of the units continued through February 2013. Annual performance analyses of the beta unit (prototype 2) with vertical well ground heat exchangers (GHX) in 5 U.S. locations predict annual energy savings of 57% to 61%, averaging 59% relative to the minimum efficiency suite and 38% to 56%, averaging 46% relative to the SOA GSHPwDS. Based on the steady-state performance demonstrated by the test units it was projected that the 2nd generation units would achieve ~58% energy savings relative to the minimum efficiency suite at the Zebra Alliance site with horizontal GHX. A new product based on the beta unit design was announced by CM in 2012 – the Trilogy 40® Q-mode™ (http://cmdealernet.com/trilogy_40.html). The unit was formally introduced in a March 2012 press release (see Appendix A) and was available for order beginning in December 2012.« less
Tourism Standards: Western Canada. Certification Field Test. Final Report. Formative Evaluation.
ERIC Educational Resources Information Center
Alberta Tourism Education Council, Edmonton.
The Tourism Standards Consortium (TSC) is a partnership of the governments of Canada's western provinces (Alberta, Manitoba, Saskatchewan, and British Columbia), the provinces' tourism industries, and the Pacific Rim Institute of Tourism in British Columbia. In an effort to increase professionalism in Western Canada's tourism industry, the TSC…
Competency Based Assessment in Fashion Design
NASA Astrophysics Data System (ADS)
Russanti, Irma; Nurlaela, Lutfiyah; Basuki, Ismet; Munoto
2018-04-01
Professional certification is a form of stipulation on certain competency standards provided by one professional organization to the performance of a person through assessment. For that an assessment needs to be standardized so that there exists a general standardized scale to measure competence. In the professional certification of fashion design department, an instrument of competency based assessment is essential to be developed. The purpose of this review is to know the application of competency based assessment in the field of fashion design. The literature reviews were found by journal searching with keywords competency based assessment and fashion design in Google scholar, of which was gotten over 20 journals from 2006 to 2016. Afterwards, the search of the free-downloaded e-books in libgen was conducted under competency based assessment and fashion design, which is then found some related references. The obtained literatures were used to review the definition, approach, and implementation of competency based assessment in the field of fashion design. Results show that it is important to develop an assessment sheet in the field of fashion design covering garment, apparel and embroidery sectors by patterning the criteria of performers along with the qualifications.
NASA Astrophysics Data System (ADS)
Lewis, Ray A.; Modanese, Giovanni
Vibrating media offer an important testing ground for reconciling conflicts between General Relativity, Quantum Mechanics and other branches of physics. For sources like a Weber bar, the standard covariant formalism for elastic bodies can be applied. The vibrating string, however, is a source of gravitational waves which requires novel computational techniques, based on the explicit construction of a conserved and renormalized energy-momentum tensor. Renormalization (in a classical sense) is necessary to take into account the effect of external constraints, which affect the emission considerably. Our computation also relaxes usual simplifying assumptions like far-field approximation, spherical or plane wave symmetry, TT gauge and absence of internal interference. In a further step towards unification, the method is then adapted to give the radiation field of a transversal Alfven wave in a rarefied astrophysical plasma, where the tension is produced by an external static magnetic field.
Near Field Enhanced Photocurrent Generation in P-type Dye-Sensitized Solar Cells
Xu, Xiaobao; Cui, Jin; Han, Junbo; Zhang, Junpei; Zhang, Yibo; Luan, Lin; Alemu, Getachew; Wang, Zhong; Shen, Yan; Xiong, Dehua; Chen, Wei; Wei, Zhanhua; Yang, Shihe; Hu, Bin; Cheng, Yibing; Wang, Mingkui
2014-01-01
Over the past few decades, the field of p-type dye-sensitized solar cell (p-DSSC) devices has undergone tremendous advances, in which Cu-based delafossite nanocrystal is of prime interest. This paper presents an augment of about 87% improvement in photocurrent observed in a particular configuration of organic dye P1 sensitized CuCrO2 delafossite nanocrystal electrode coupled with organic redox shuttle, 1-methy-1H- tetrazole-5-thiolate and its disulfide dimer when Au nanoparticles (NPs, with diameter of about 20 nm) is added into the photocathode, achieving a power convert efficiency of 0.31% (measured under standard AM 1.5 G test conditions). Detailed investigation shows that the local electrical-magnetic field effect, induced by Au NPs among the mesoporous CuCrO2 film, can improve the charge injection efficiency at dye/semiconductor interface, which is responsible for the bulk of the gain in photocurrent. PMID:24492539
Micromechanical Characterization of Polysilicon Films through On-Chip Tests
Mirzazadeh, Ramin; Eftekhar Azam, Saeed; Mariani, Stefano
2016-01-01
When the dimensions of polycrystalline structures become comparable to the average grain size, some reliability issues can be reported for the moving parts of inertial microelectromechanical systems (MEMS). Not only the overall behavior of the device turns out to be affected by a large scattering, but also the sensitivity to imperfections gets enhanced. In this work, through on-chip tests, we experimentally investigate the behavior of thin polysilicon samples using standard electrostatic actuation/sensing. The discrepancy between the target and actual responses of each sample has then been exploited to identify: (i) the overall stiffness of the film and, according to standard continuum elasticity, a morphology-based value of its Young’s modulus; (ii) the relevant over-etch induced by the fabrication process. To properly account for the aforementioned stochastic features at the micro-scale, the identification procedure has been based on particle filtering. A simple analytical reduced-order model of the moving structure has been also developed to account for the nonlinearities in the electrical field, up to pull-in. Results are reported for a set of ten film samples of constant slenderness, and the effects of different actuation mechanisms on the identified micromechanical features are thoroughly discussed. PMID:27483268
Micromechanical Characterization of Polysilicon Films through On-Chip Tests.
Mirzazadeh, Ramin; Eftekhar Azam, Saeed; Mariani, Stefano
2016-07-28
When the dimensions of polycrystalline structures become comparable to the average grain size, some reliability issues can be reported for the moving parts of inertial microelectromechanical systems (MEMS). Not only the overall behavior of the device turns out to be affected by a large scattering, but also the sensitivity to imperfections gets enhanced. In this work, through on-chip tests, we experimentally investigate the behavior of thin polysilicon samples using standard electrostatic actuation/sensing. The discrepancy between the target and actual responses of each sample has then been exploited to identify: (i) the overall stiffness of the film and, according to standard continuum elasticity, a morphology-based value of its Young's modulus; (ii) the relevant over-etch induced by the fabrication process. To properly account for the aforementioned stochastic features at the micro-scale, the identification procedure has been based on particle filtering. A simple analytical reduced-order model of the moving structure has been also developed to account for the nonlinearities in the electrical field, up to pull-in. Results are reported for a set of ten film samples of constant slenderness, and the effects of different actuation mechanisms on the identified micromechanical features are thoroughly discussed.
High Resolution Eddy-Current Wire Testing Based on a Gmr Sensor-Array
NASA Astrophysics Data System (ADS)
Kreutzbruck, Marc; Allweins, Kai; Strackbein, Chris; Bernau, Hendrick
2009-03-01
Increasing demands in materials quality and cost effectiveness have led to advanced standards in manufacturing technology. Especially when dealing with high quality standards in conjunction with high throughput quantitative NDE techniques are vital to provide reliable and fast quality control systems. In this work we illuminate a modern electromagnetic NDE approach using a small GMR sensor array for testing superconducting wires. Four GMR sensors are positioned around the wire. Each GMR sensor provides a field sensitivity of 200 pT/√Hz and a spatial resolution of about 100 μm. This enables us to detect under surface defects of 100 μm in size in a depth of 200 μm with a signal-to-noise ratio of better than 400. Surface defects could be detected with a SNR of up to 10,000. Besides this remarkably SNR the small extent of GMR sensors results in a spatial resolution which offers new visualisation techniques for defect localisation, defect characterization and tomography-like mapping techniques. We also report on inverse algorithms based on either a Finite Element Method or an analytical approach. These allow for accurate defect localization on the urn scale and an estimation of the defect size.
ELF magnetic fields in electric and gasoline-powered vehicles.
Tell, R A; Sias, G; Smith, J; Sahl, J; Kavet, R
2013-02-01
We conducted a pilot study to assess magnetic field levels in electric compared to gasoline-powered vehicles, and established a methodology that would provide valid data for further assessments. The sample consisted of 14 vehicles, all manufactured between January 2000 and April 2009; 6 were gasoline-powered vehicles and 8 were electric vehicles of various types. Of the eight models available, three were represented by a gasoline-powered vehicle and at least one electric vehicle, enabling intra-model comparisons. Vehicles were driven over a 16.3 km test route. Each vehicle was equipped with six EMDEX Lite broadband meters with a 40-1,000 Hz bandwidth programmed to sample every 4 s. Standard statistical testing was based on the fact that the autocorrelation statistic damped quickly with time. For seven electric cars, the geometric mean (GM) of all measurements (N = 18,318) was 0.095 µT with a geometric standard deviation (GSD) of 2.66, compared to 0.051 µT (N = 9,301; GSD = 2.11) for four gasoline-powered cars (P < 0.0001). Using the data from a previous exposure assessment of residential exposure in eight geographic regions in the United States as a basis for comparison (N = 218), the broadband magnetic fields in electric vehicles covered the same range as personal exposure levels recorded in that study. All fields measured in all vehicles were much less than the exposure limits published by the International Commission on Non-Ionizing Radiation Protection (ICNIRP) and the Institute of Electrical and Electronics Engineers (IEEE). Future studies should include larger sample sizes representative of a greater cross-section of electric-type vehicles. Copyright © 2012 Wiley Periodicals, Inc.
Gatenby, J. Christopher; Gore, John C.; Tong, Frank
2012-01-01
High-resolution functional MRI is a leading application for very high field (7 Tesla) human MR imaging. Though higher field strengths promise improvements in signal-to-noise ratios (SNR) and BOLD contrast relative to fMRI at 3 Tesla, these benefits may be partially offset by accompanying increases in geometric distortion and other off-resonance effects. Such effects may be especially pronounced with the single-shot EPI pulse sequences typically used for fMRI at standard field strengths. As an alternative, one might consider multishot pulse sequences, which may lead to somewhat lower temporal SNR than standard EPI, but which are also often substantially less susceptible to off-resonance effects. Here we consider retinotopic mapping of human visual cortex as a practical test case by which to compare examples of these sequence types for high-resolution fMRI at 7 Tesla. We performed polar angle retinotopic mapping at each of 3 isotropic resolutions (2.0, 1.7, and 1.1 mm) using both accelerated single-shot 2D EPI and accelerated multishot 3D gradient-echo pulse sequences. We found that single-shot EPI indeed led to greater temporal SNR and contrast-to-noise ratios (CNR) than the multishot sequences. However, additional distortion correction in postprocessing was required in order to fully realize these advantages, particularly at higher resolutions. The retinotopic maps produced by both sequence types were qualitatively comparable, and showed equivalent test/retest reliability. Thus, when surface-based analyses are planned, or in other circumstances where geometric distortion is of particular concern, multishot pulse sequences could provide a viable alternative to single-shot EPI. PMID:22514646
Bastian, Thomas; Maire, Aurélia; Dugas, Julien; Ataya, Abbas; Villars, Clément; Gris, Florence; Perrin, Emilie; Caritu, Yanis; Doron, Maeva; Blanc, Stéphane; Jallon, Pierre; Simon, Chantal
2015-03-15
"Objective" methods to monitor physical activity and sedentary patterns in free-living conditions are necessary to further our understanding of their impacts on health. In recent years, many software solutions capable of automatically identifying activity types from portable accelerometry data have been developed, with promising results in controlled conditions, but virtually no reports on field tests. An automatic classification algorithm initially developed using laboratory-acquired data (59 subjects engaging in a set of 24 standardized activities) to discriminate between 8 activity classes (lying, slouching, sitting, standing, walking, running, and cycling) was applied to data collected in the field. Twenty volunteers equipped with a hip-worn triaxial accelerometer performed at their own pace an activity set that included, among others, activities such as walking the streets, running, cycling, and taking the bus. Performances of the laboratory-calibrated classification algorithm were compared with those of an alternative version of the same model including field-collected data in the learning set. Despite good results in laboratory conditions, the performances of the laboratory-calibrated algorithm (assessed by confusion matrices) decreased for several activities when applied to free-living data. Recalibrating the algorithm with data closer to real-life conditions and from an independent group of subjects proved useful, especially for the detection of sedentary behaviors while in transports, thereby improving the detection of overall sitting (sensitivity: laboratory model = 24.9%; recalibrated model = 95.7%). Automatic identification methods should be developed using data acquired in free-living conditions rather than data from standardized laboratory activity sets only, and their limits carefully tested before they are used in field studies. Copyright © 2015 the American Physiological Society.
Swisher, Jascha D; Sexton, John A; Gatenby, J Christopher; Gore, John C; Tong, Frank
2012-01-01
High-resolution functional MRI is a leading application for very high field (7 Tesla) human MR imaging. Though higher field strengths promise improvements in signal-to-noise ratios (SNR) and BOLD contrast relative to fMRI at 3 Tesla, these benefits may be partially offset by accompanying increases in geometric distortion and other off-resonance effects. Such effects may be especially pronounced with the single-shot EPI pulse sequences typically used for fMRI at standard field strengths. As an alternative, one might consider multishot pulse sequences, which may lead to somewhat lower temporal SNR than standard EPI, but which are also often substantially less susceptible to off-resonance effects. Here we consider retinotopic mapping of human visual cortex as a practical test case by which to compare examples of these sequence types for high-resolution fMRI at 7 Tesla. We performed polar angle retinotopic mapping at each of 3 isotropic resolutions (2.0, 1.7, and 1.1 mm) using both accelerated single-shot 2D EPI and accelerated multishot 3D gradient-echo pulse sequences. We found that single-shot EPI indeed led to greater temporal SNR and contrast-to-noise ratios (CNR) than the multishot sequences. However, additional distortion correction in postprocessing was required in order to fully realize these advantages, particularly at higher resolutions. The retinotopic maps produced by both sequence types were qualitatively comparable, and showed equivalent test/retest reliability. Thus, when surface-based analyses are planned, or in other circumstances where geometric distortion is of particular concern, multishot pulse sequences could provide a viable alternative to single-shot EPI.
Fathi, Roya; Sheehan, Orla C; Garrigues, Sarah K; Saliba, Debra; Leff, Bruce; Ritchie, Christine S
2016-08-01
The unique needs of homebound adults receiving home-based medical care (HBMC) (ie, home-based primary care and home-based palliative care services) are ideally provided by interdisciplinary care teams (IDTs) that provide coordinated care. The composition of team members from an array of organizations and the unique dimension of providing care in the home present specific challenges to timely access and communication of patient care information. The objective of this work was to develop a conceptual framework and corresponding quality indicators (QIs) that assess how IDT members for HBMC practices access and communicate key patient information with each other. A systematic review of peer-reviewed and gray literature was performed to inform a framework for care coordination in the home and the development of candidate QIs to assess processes by which all IDT members optimally access and use patient information. A technical expert panel (TEP) participated in a modified Delphi process to assess the validity and feasibility of each QI and to identify which would be most suitable for testing in the field. Thematic analysis of literature revealed 4 process themes for how HBMC practices might engage in high-quality care coordination: using electronic medical records, conducting interdisciplinary team meetings, sharing standardized patient assessments, and communicating via secure e-messaging. Based on these themes, 9 candidate QIs were developed to reflect these processes. Three candidate QIs were assessed by the TEP as valid and feasible to measure in an HBMC practice setting. These indicators focused on use of IDT meetings, standardized patient assessments, and secure e-messaging. Translating the complex issue of care coordination into QIs will improve care delivered to vulnerable home-limited adults who receive HBMC. Guided by the literature, we developed a framework to reflect optimal care coordination in the home setting and identified 3 candidate QIs to field-test in HBMC practices. Published by Elsevier Inc.
Concept of an upright wearable positron emission tomography imager in humans.
Bauer, Christopher E; Brefczynski-Lewis, Julie; Marano, Gary; Mandich, Mary-Beth; Stolin, Alexander; Martone, Peter; Lewis, James W; Jaliparthi, Gangadhar; Raylman, Raymond R; Majewski, Stan
2016-09-01
Positron Emission Tomography (PET) is traditionally used to image patients in restrictive positions, with few devices allowing for upright, brain-dedicated imaging. Our team has explored the concept of wearable PET imagers which could provide functional brain imaging of freely moving subjects. To test feasibility and determine future considerations for development, we built a rudimentary proof-of-concept prototype (Helmet_PET) and conducted tests in phantoms and four human volunteers. Twelve Silicon Photomultiplier-based detectors were assembled in a ring with exterior weight support and an interior mechanism that could be adjustably fitted to the head. We conducted brain phantom tests as well as scanned four patients scheduled for diagnostic F(18-) FDG PET/CT imaging. For human subjects the imager was angled such that field of view included basal ganglia and visual cortex to test for typical resting-state pattern. Imaging in two subjects was performed ~4 hr after PET/CT imaging to simulate lower injected F(18-) FDG dose by taking advantage of the natural radioactive decay of the tracer (F(18) half-life of 110 min), with an estimated imaging dosage of 25% of the standard. We found that imaging with a simple lightweight ring of detectors was feasible using a fraction of the standard radioligand dose. Activity levels in the human participants were quantitatively similar to standard PET in a set of anatomical ROIs. Typical resting-state brain pattern activation was demonstrated even in a 1 min scan of active head rotation. To our knowledge, this is the first demonstration of imaging a human subject with a novel wearable PET imager that moves with robust head movements. We discuss potential research and clinical applications that will drive the design of a fully functional device. Designs will need to consider trade-offs between a low weight device with high mobility and a heavier device with greater sensitivity and larger field of view.
NASA Astrophysics Data System (ADS)
Zhang, Zhongya; Pan, Bing; Grédiac, Michel; Song, Weidong
2018-04-01
The virtual fields method (VFM) is generally used with two-dimensional digital image correlation (2D-DIC) or grid method (GM) for identifying constitutive parameters. However, when small out-of-plane translation/rotation occurs to the test specimen, 2D-DIC and GM are prone to yield inaccurate measurements, which further lessen the accuracy of the parameter identification using VFM. In this work, an easy-to-implement but effective "special" stereo-DIC (SS-DIC) method is proposed for accuracy-enhanced VFM identification. The SS-DIC can not only deliver accurate deformation measurement without being affected by unavoidable out-of-plane movement/rotation of a test specimen, but can also ensure evenly distributed calculation data in space, which leads to simple data processing. Based on the accurate kinematics fields with evenly distributed measured points determined by SS-DIC method, constitutive parameters can be identified by VFM with enhanced accuracy. Uniaxial tensile tests of a perforated aluminum plate and pure shear tests of a prismatic aluminum specimen verified the effectiveness and accuracy of the proposed method. Experimental results show that the constitutive parameters identified by VFM using SS-DIC are more accurate and stable than those identified by VFM using 2D-DIC. It is suggested that the proposed SS-DIC can be used as a standard measuring tool for mechanical identification using VFM.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 16 Commercial Practices 2 2012-01-01 2012-01-01 false Requirements. 1204.3 Section 1204.3... STANDARD FOR OMNIDIRECTIONAL CITIZENS BAND BASE STATION ANTENNAS The Standard § 1204.3 Requirements. All omnidirectional CB base station antennas are required to comply with the following requirements. (a) Field joints...
US EPA Base Study Standard Operating Procedure for Preliminary Visits to Buildings
The objective of this standard operation procedure is to give the preliminary visit (PV) field investigator a description of how to conduct a BASE PV, as well as to detail the informational requirements that are gathered as part of this investigation.
NASA Astrophysics Data System (ADS)
Markelin, L.; Honkavaara, E.; Näsi, R.; Nurminen, K.; Hakala, T.
2014-08-01
Remote sensing based on unmanned airborne vehicles (UAVs) is a rapidly developing field of technology. UAVs enable accurate, flexible, low-cost and multiangular measurements of 3D geometric, radiometric, and temporal properties of land and vegetation using various sensors. In this paper we present a geometric processing chain for multiangular measurement system that is designed for measuring object directional reflectance characteristics in a wavelength range of 400-900 nm. The technique is based on a novel, lightweight spectral camera designed for UAV use. The multiangular measurement is conducted by collecting vertical and oblique area-format spectral images. End products of the geometric processing are image exterior orientations, 3D point clouds and digital surface models (DSM). This data is needed for the radiometric processing chain that produces reflectance image mosaics and multiangular bidirectional reflectance factor (BRF) observations. The geometric processing workflow consists of the following three steps: (1) determining approximate image orientations using Visual Structure from Motion (VisualSFM) software, (2) calculating improved orientations and sensor calibration using a method based on self-calibrating bundle block adjustment (standard photogrammetric software) (this step is optional), and finally (3) creating dense 3D point clouds and DSMs using Photogrammetric Surface Reconstruction from Imagery (SURE) software that is based on semi-global-matching algorithm and it is capable of providing a point density corresponding to the pixel size of the image. We have tested the geometric processing workflow over various targets, including test fields, agricultural fields, lakes and complex 3D structures like forests.
A standard telemental health evaluation model: the time is now.
Kramer, Greg M; Shore, Jay H; Mishkind, Matt C; Friedl, Karl E; Poropatich, Ronald K; Gahm, Gregory A
2012-05-01
The telehealth field has advanced historic promises to improve access, cost, and quality of care. However, the extent to which it is delivering on its promises is unclear as the scientific evidence needed to justify success is still emerging. Many have identified the need to advance the scientific knowledge base to better quantify success. One method for advancing that knowledge base is a standard telemental health evaluation model. Telemental health is defined here as the provision of mental health services using live, interactive video-teleconferencing technology. Evaluation in the telemental health field largely consists of descriptive and small pilot studies, is often defined by the individual goals of the specific programs, and is typically focused on only one outcome. The field should adopt new evaluation methods that consider the co-adaptive interaction between users (patients and providers), healthcare costs and savings, and the rapid evolution in communication technologies. Acceptance of a standard evaluation model will improve perceptions of telemental health as an established field, promote development of a sounder empirical base, promote interagency collaboration, and provide a framework for more multidisciplinary research that integrates measuring the impact of the technology and the overall healthcare aspect. We suggest that consideration of a standard model is timely given where telemental health is at in terms of its stage of scientific progress. We will broadly recommend some elements of what such a standard evaluation model might include for telemental health and suggest a way forward for adopting such a model.
Magnetic Field Experiment Data Analysis System
NASA Technical Reports Server (NTRS)
Holland, D. B.; Zanetti, L. J.; Suther, L. L.; Potemra, T. A.; Anderson, B. J.
1995-01-01
The Johns Hopkins University Applied Physics Laboratory (JHU/APL) Magnetic Field Experiment Data Analysis System (MFEDAS) has been developed to process and analyze satellite magnetic field experiment data from the TRIAD, MAGSAT, AMPTE/CCE, Viking, Polar BEAR, DMSP, HILAT, UARS, and Freja satellites. The MFEDAS provides extensive data management and analysis capabilities. The system is based on standard data structures and a standard user interface. The MFEDAS has two major elements: (1) a set of satellite unique telemetry processing programs for uniform and rapid conversion of the raw data to a standard format and (2) the program Magplot which has file handling, data analysis, and data display sections. This system is an example of software reuse, allowing new data sets and software extensions to be added in a cost effective and timely manner. Future additions to the system will include the addition of standard format file import routines, modification of the display routines to use a commercial graphics package based on X-Window protocols, and a generic utility for telemetry data access and conversion.
The Effect of Seasonal and Long-Period Geopotential Variations on the GPS Orbits
NASA Technical Reports Server (NTRS)
Melachroinos, Stavros A.; Lemoine, Frank G.; Chinn, Douglas S.; Zelensky, Nikita P.; Nicholas, Joseph B.; Beckley, Brian D.
2013-01-01
We examine the impact of using seasonal and long-period time-variable gravity field (TVG) models on GPS orbit determination, through simulations from 1994 to 2012. The models of time-variable gravity that we test include the GRGS release RL02 GRACE-derived 10-day gravity field models up to degree and order 20 (grgs20x20), a 4 x 4 series of weekly coefficients using GGM03S as a base derived from SLR and DORIS tracking to 11 satellites (tvg4x4), and a harmonic fit to the above 4 x 4 SLR-DORIS time series (goco2s_fit2). These detailed models are compared to GPS orbit simulations using a reference model (stdtvg) based on the International Earth Rotation Service (IERS) and International GNSS Service (IGS) repro1 standards. We find that the new TVG modeling produces significant along, cross-track orbit differences as well as annual, semi-annual, draconitic and long-period effects in the Helmert translation parameters (Tx, Ty, Tz) of the GPS orbits with magnitudes of several mm. We show that the simplistic TVG modeling approach used by all of the IGS Analysis Centers, which is based on the models provided by the IERS standards, becomes progressively less adequate following 2006 when compared to the seasonal and long-period TVG models.
Arizona's Instrument to Measure Standards (AIMS DPA). Student Guide, Grade 8
ERIC Educational Resources Information Center
Arizona Department of Education, 2006
2006-01-01
Arizona's Instrument to Measure Standards (AIMS), a Standards-Based test, provides educators and the public with valuable information regarding the progress of Arizona's students toward mastering Arizona's reading, writing and mathematics Standards. This specific test, Arizona's Instrument to Measure Standards Dual Purpose Assessment (AIMS DPA) is…
Soucek, David J; Dickinson, Amy
2015-09-01
Although insects occur in nearly all freshwater ecosystems, few sensitive insect models exist for use in determining the toxicity of contaminants. The objectives of the present study were to adapt previously developed culturing and toxicity testing methods for the mayfly Neocloeon triangulifer (Ephemeroptera: Baetidae), and to further develop a method for chronic toxicity tests spanning organism ages of less than 24 h post hatch to adult emergence, using a laboratory cultured diatom diet. The authors conducted 96-h fed acute tests and full-life chronic toxicity tests with sodium chloride, sodium nitrate, and sodium sulfate. The authors generated 96-h median lethal concentrations (LC50s) of 1062 mg Cl/L (mean of 3 tests), 179 mg N-NO3 /L, and 1227 mg SO4 /L. Acute to chronic ratios ranged from 2.1 to 6.4 for chloride, 2.5 to 5.1 for nitrate, and 2.3 to 8.5 for sulfate. The endpoints related to survival and development time were consistently the most sensitive in the tests. The chronic values generated for chloride were in the same range as those generated by others using natural foods. Furthermore, our weight-versus-fecundity plots were similar to those previously published using the food culturing method on which the present authors' method was based, indicating good potential for standardization. The authors believe that the continued use of this sensitive mayfly species in laboratory studies will help to close the gap in understanding between standard laboratory toxicity test results and field-based observations of community impairment. © 2015 SETAC.
Reusable Models of Pedagogical Concepts--A Framework for Pedagogical and Content Design.
ERIC Educational Resources Information Center
Pawlowski, Jan M.
Standardization initiatives in the field of learning technologies have produced standards for the interoperability of learning environments and learning management systems. Learning resources based on these standards can be reused, recombined, and adapted to the user. However, these standards follow a content-oriented approach; the process of…
Spatial regression test for ensuring temperature data quality in southern Spain
NASA Astrophysics Data System (ADS)
Estévez, J.; Gavilán, P.; García-Marín, A. P.
2018-01-01
Quality assurance of meteorological data is crucial for ensuring the reliability of applications and models that use such data as input variables, especially in the field of environmental sciences. Spatial validation of meteorological data is based on the application of quality control procedures using data from neighbouring stations to assess the validity of data from a candidate station (the station of interest). These kinds of tests, which are referred to in the literature as spatial consistency tests, take data from neighbouring stations in order to estimate the corresponding measurement at the candidate station. These estimations can be made by weighting values according to the distance between the stations or to the coefficient of correlation, among other methods. The test applied in this study relies on statistical decision-making and uses a weighting based on the standard error of the estimate. This paper summarizes the results of the application of this test to maximum, minimum and mean temperature data from the Agroclimatic Information Network of Andalusia (southern Spain). This quality control procedure includes a decision based on a factor f, the fraction of potential outliers for each station across the region. Using GIS techniques, the geographic distribution of the errors detected has been also analysed. Finally, the performance of the test was assessed by evaluating its effectiveness in detecting known errors.
Joint Test Report For Validation of Alternatives to Aliphatic Isocyanate Polyurethanes
NASA Technical Reports Server (NTRS)
Lewis, Pattie
2007-01-01
National Aeronautics and Space Administration (NASA) and Air Force Space Command (AFSPC) have similar missions and therefore similar facilities and structures in similar environments. The standard practice for protecting metallic substrates in atmospheric environments is the application of an applied coating system. The most common topcoats used in coating systems are polyurethanes that contain isocyanates. Isocyanates are classified as potential human carcinogens and are known to cause cancer in animals. The primary objective of this effort was to demonstrate and validate alternatives to aliphatic isocyanate polyurethanes resulting in one or more isocyanate-free coatings qualified for use at AFSPC and NASA installations participating in this project. This joint Test Report (JTR) documents the results of the laboratory and field testing as well as any test modifications made during the execution of the testing. The technical stakeholders agreed upon test procedure modifications documented in this document. This JTR is made available as a reference for future pollution prevention endeavors by other NASA centers, the Department of Defense and commercial users to minimize duplication of effort. All coating system candidates were tested using approved NASA and AFSPC standard coating systems as experimental controls. This study looked at eight alternative coating systems and two control coating systems and was divided into Phase I Screening Tests, Phase II Tests, and Field Testing. The Phase I Screening Tests were preliminary tests performed on all the selected candidate coating systems. Candidate coating systems that did not meet the acceptance criteria of the screening tests were eliminated from further testing. Phase I Screening Tests included: Ease of Application, Surface Appearance, Dry-To-Touch (Sanding), Accelerated Storage Stability, Pot Life (Viscosity), Cure Time (Solvent Rubs), Cleanability, Knife Test, Tensile (pull-off) Adhesion, and X-Cut Adhesion by Wet Tape After a review of the Phase I test results, four of the alternative coating systems showed substandard performance in relation to the Control Systems and were eliminated from the Phase II testing. Due to the interest of stakeholders and time constraints, however, all eight alternatives were subjected to the following Phase II tests, along with field testing at Stennis Space Center (SSC), Mississippi: Hypergol Compatibility, Liquid Oxygen Compatibility, 18-Month Marine Exposure (Gloss Retention, Color Retention, Blistering, Visual Corrosion, Creepage from Scribe, Heat Adhesion), and Field Exposure (6- and 12-month Evaluation for Coating Condition, Color Retention, Gloss Retention). The remaining four alternative coating systems determined to be the best viable alternatives were carried on to Phase II testing that included: Removability, Repairability, Abrasion Resistance, Gravelometer, Fungus Resistance, Accelerated Weathering, Mandrel Bend Flexibility, and Cyclic Corrosion Resistance. Of the systems that continued to Phase II, three (3) alternative coating systems meet the performance requirements as identified by stakeholders. Two (2) other systems, that were not included in Phase II testing, performed well enough on the 18-Month Marine Exposure, the primary requirement for NASA technical standard NASA-STD-5008, Protective Coating of Carbon Steel, Stainless Steel, and Aluminum on Launch Structures, Facilities, and Ground Support Equipment, that they were also considered to be successful candidates. In total, five (5) alternative coating systems were approved for inclusion in the NASA-STD- 5008 Qualified Products List (QPL). The standard is intended to provide a common framework for consistent practices across NASA and is often used by other entities. The standard's QPL does not connote endorsement of the products by NASA, but lists those products that have been tested and meet the requirements as specified.
Chapter 16: text mining for translational bioinformatics.
Cohen, K Bretonnel; Hunter, Lawrence E
2013-04-01
Text mining for translational bioinformatics is a new field with tremendous research potential. It is a subfield of biomedical natural language processing that concerns itself directly with the problem of relating basic biomedical research to clinical practice, and vice versa. Applications of text mining fall both into the category of T1 translational research-translating basic science results into new interventions-and T2 translational research, or translational research for public health. Potential use cases include better phenotyping of research subjects, and pharmacogenomic research. A variety of methods for evaluating text mining applications exist, including corpora, structured test suites, and post hoc judging. Two basic principles of linguistic structure are relevant for building text mining applications. One is that linguistic structure consists of multiple levels. The other is that every level of linguistic structure is characterized by ambiguity. There are two basic approaches to text mining: rule-based, also known as knowledge-based; and machine-learning-based, also known as statistical. Many systems are hybrids of the two approaches. Shared tasks have had a strong effect on the direction of the field. Like all translational bioinformatics software, text mining software for translational bioinformatics can be considered health-critical and should be subject to the strictest standards of quality assurance and software testing.
Development of EPA Protocol Information Enquiry Service System Based on Embedded ARM Linux
NASA Astrophysics Data System (ADS)
Peng, Daogang; Zhang, Hao; Weng, Jiannian; Li, Hui; Xia, Fei
Industrial Ethernet is a new technology for industrial network communications developed in recent years. In the field of industrial automation in China, EPA is the first standard accepted and published by ISO, and has been included in the fourth edition IEC61158 Fieldbus of NO.14 type. According to EPA standard, Field devices such as industrial field controller, actuator and other instruments are all able to realize communication based on the Ethernet standard. The Atmel AT91RM9200 embedded development board and open source embedded Linux are used to develop an information inquiry service system of EPA protocol based on embedded ARM Linux in this paper. The system is capable of designing an EPA Server program for EPA data acquisition procedures, the EPA information inquiry service is available for programs in local or remote host through Socket interface. The EPA client can access data and information of other EPA equipments on the EPA network when it establishes connection with the monitoring port of the server.
NASA Astrophysics Data System (ADS)
Frassinetti, L.; Olofsson, K. E. J.; Fridström, R.; Setiadi, A. C.; Brunsell, P. R.; Volpe, F. A.; Drake, J.
2013-08-01
A new method for the estimate of the wall diffusion time of non-axisymmetric fields is developed. The method based on rotating external fields and on the measurement of the wall frequency response is developed and tested in EXTRAP T2R. The method allows the experimental estimate of the wall diffusion time for each Fourier harmonic and the estimate of the wall diffusion toroidal asymmetries. The method intrinsically considers the effects of three-dimensional structures and of the shell gaps. Far from the gaps, experimental results are in good agreement with the diffusion time estimated with a simple cylindrical model that assumes a homogeneous wall. The method is also applied with non-standard configurations of the coil array, in order to mimic tokamak-relevant settings with a partial wall coverage and active coils of large toroidal extent. The comparison with the full coverage results shows good agreement if the effects of the relevant sidebands are considered.
What To Do, Instead of Counterproductive, Standardized Curricula and Testing.
ERIC Educational Resources Information Center
Keegan, Mark
1993-01-01
Presents evidence against the use of standardized curricula and testing and in favor of discovery-based instruction. Describes a successful experience with a relatively new form of discovery-based instruction, the scenario educational computer software. (AEF)
Gardner, Ian A; Whittington, Richard J; Caraguel, Charles G B; Hick, Paul; Moody, Nicholas J G; Corbeil, Serge; Garver, Kyle A.; Warg, Janet V.; Arzul, Isabelle; Purcell, Maureen; St. J. Crane, Mark; Waltzek, Thomas B.; Olesen, Niels J; Lagno, Alicia Gallardo
2016-01-01
Complete and transparent reporting of key elements of diagnostic accuracy studies for infectious diseases in cultured and wild aquatic animals benefits end-users of these tests, enabling the rational design of surveillance programs, the assessment of test results from clinical cases and comparisons of diagnostic test performance. Based on deficiencies in the Standards for Reporting of Diagnostic Accuracy (STARD) guidelines identified in a prior finfish study (Gardner et al. 2014), we adapted the Standards for Reporting of Animal Diagnostic Accuracy Studies—paratuberculosis (STRADAS-paraTB) checklist of 25 reporting items to increase their relevance to finfish, amphibians, molluscs, and crustaceans and provided examples and explanations for each item. The checklist, known as STRADAS-aquatic, was developed and refined by an expert group of 14 transdisciplinary scientists with experience in test evaluation studies using field and experimental samples, in operation of reference laboratories for aquatic animal pathogens, and in development of international aquatic animal health policy. The main changes to the STRADAS-paraTB checklist were to nomenclature related to the species, the addition of guidelines for experimental challenge studies, and the designation of some items as relevant only to experimental studies and ante-mortem tests. We believe that adoption of these guidelines will improve reporting of primary studies of test accuracy for aquatic animal diseases and facilitate assessment of their fitness-for-purpose. Given the importance of diagnostic tests to underpin the Sanitary and Phytosanitary agreement of the World Trade Organization, the principles outlined in this paper should be applied to other World Organisation for Animal Health (OIE)-relevant species.
Gardner, Ian A; Whittington, Richard J; Caraguel, Charles G B; Hick, Paul; Moody, Nicholas J G; Corbeil, Serge; Garver, Kyle A; Warg, Janet V; Arzul, Isabelle; Purcell, Maureen K; Crane, Mark St J; Waltzek, Thomas B; Olesen, Niels J; Gallardo Lagno, Alicia
2016-02-25
Complete and transparent reporting of key elements of diagnostic accuracy studies for infectious diseases in cultured and wild aquatic animals benefits end-users of these tests, enabling the rational design of surveillance programs, the assessment of test results from clinical cases and comparisons of diagnostic test performance. Based on deficiencies in the Standards for Reporting of Diagnostic Accuracy (STARD) guidelines identified in a prior finfish study (Gardner et al. 2014), we adapted the Standards for Reporting of Animal Diagnostic Accuracy Studies-paratuberculosis (STRADAS-paraTB) checklist of 25 reporting items to increase their relevance to finfish, amphibians, molluscs, and crustaceans and provided examples and explanations for each item. The checklist, known as STRADAS-aquatic, was developed and refined by an expert group of 14 transdisciplinary scientists with experience in test evaluation studies using field and experimental samples, in operation of reference laboratories for aquatic animal pathogens, and in development of international aquatic animal health policy. The main changes to the STRADAS-paraTB checklist were to nomenclature related to the species, the addition of guidelines for experimental challenge studies, and the designation of some items as relevant only to experimental studies and ante-mortem tests. We believe that adoption of these guidelines will improve reporting of primary studies of test accuracy for aquatic animal diseases and facilitate assessment of their fitness-for-purpose. Given the importance of diagnostic tests to underpin the Sanitary and Phytosanitary agreement of the World Trade Organization, the principles outlined in this paper should be applied to other World Organisation for Animal Health (OIE)-relevant species.
Research on Generating Method of Embedded Software Test Document Based on Dynamic Model
NASA Astrophysics Data System (ADS)
Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying
2018-03-01
This paper provides a dynamic model-based test document generation method for embedded software that provides automatic generation of two documents: test requirements specification documentation and configuration item test documentation. This method enables dynamic test requirements to be implemented in dynamic models, enabling dynamic test demand tracking to be easily generated; able to automatically generate standardized, standardized test requirements and test documentation, improved document-related content inconsistency and lack of integrity And other issues, improve the efficiency.
Homogeneity revisited: analysis of updated precipitation series in Turkey
NASA Astrophysics Data System (ADS)
Bickici Arikan, Bugrayhan; Kahya, Ercan
2018-01-01
Homogeneous time series of meteorological variables are necessary for hydrologic and climate studies. Dependability of historical precipitation data is subjected to keen evaluation prior to every study in water resources, hydrology, and climate change fields. This study aims to characterize the homogeneity of long-term Turkish precipitation data in order to ensure that they can be reliably used. The homogeneity of monthly precipitation data set was tested using the standard normal homogeneity test, Buishand test, Von Neumann ratio test, and Pettitt test at the 5% significance level across Turkey. Our precipitation records including the most updated observations, extracted from 160 meteorological stations, for the periods 1974-2014 were analyzed by all the four homogeneity tests. According to the results of all tests, five out of 160 stations have an inhomogeneity. With regard to our strict confirmation rule, 44 out of 160 stations are said to be inhomogeneous since they failed from at least one of the four tests. The breaks captured by the Buishand and Pettitt tests usually tend to appear in the middle of the precipitation series, whereas the ability of standard normal homogeneity test is in favor of identifying inhomogeneities mostly at the beginning or at the end of the records. Our results showed that 42 out of 44 inhomogeneous stations passed all the four tests after applying a correction procedure based on the double mass curve analysis. Available metadata was used to interpret the detected inhomogeneity.
A new methodology for hydro-abrasive erosion tests simulating penstock erosive flow
NASA Astrophysics Data System (ADS)
Aumelas, V.; Maj, G.; Le Calvé, P.; Smith, M.; Gambiez, B.; Mourrat, X.
2016-11-01
Hydro-abrasive resistance is an important property requirement for hydroelectric power plant penstock coating systems used by EDF. The selection of durable coating systems requires an experimental characterization of coating performance. This can be achieved by performing accelerated and representative laboratory tests. In case of severe erosion induced by a penstock flow, there is no suitable method or standard representative of real erosive flow conditions. The presented study aims at developing a new methodology and an associated laboratory experimental device. The objective of the laboratory apparatus is to subject coated test specimens to wear conditions similar to the ones generated at the penstock lower generatrix in actual flow conditions. Thirteen preselected coating solutions were first been tested during a 45 hours erosion test. A ranking of the thirteen coating solutions was then determined after characterisation. To complete this first evaluation and to determine the wear kinetic of the four best coating solutions, additional erosion tests were conducted with a longer duration of 216 hours. A comparison of this new method with standardized tests and with real service operating flow conditions is also discussed. To complete the final ranking based on hydro-abrasive erosion tests, some trial tests were carried out on penstock samples to check the application method of selected coating systems. The paper gives some perspectives related to erosion test methodologies for materials and coating solutions for hydraulic applications. The developed test method can also be applied in other fields.
Developing a Strategy for Using Technology-Enhanced Items in Large-Scale Standardized Tests
ERIC Educational Resources Information Center
Bryant, William
2017-01-01
As large-scale standardized tests move from paper-based to computer-based delivery, opportunities arise for test developers to make use of items beyond traditional selected and constructed response types. Technology-enhanced items (TEIs) have the potential to provide advantages over conventional items, including broadening construct measurement,…
Xiao, Xiang; Wang, Tianping; Ye, Hongzhuan; Qiang, Guangxiang; Wei, Haiming; Tian, Zhigang
2005-01-01
OBJECTIVE: To determine the validity of a recently developed rapid test--a colloidal dye immunofiltration assay (CDIFA)--used by health workers in field settings to identify villagers infected with Schistosoma japonicum. METHODS: Health workers in the field used CDIFA to test samples from 1553 villagers in two areas of low endemicity and an area where S. japonicum was not endemic in Anhui, China. All the samples were then tested in the laboratory by laboratory staff using a standard parasitological method (Kato-Katz), an indirect haemagglutination assay (IHA), and CDIFA. The results of CDIFA performed by health workers were compared with those obtained by Kato-Katz and IHA. FINDINGS: Concordance between the results of CDIFA performed in field settings and in the laboratory was high (kappa index, 0.95; 95% confidence interval, 0.93-0.97). When Kato-Katz was used as the reference test, the overall sensitivity and specificity of CDIFA were 98.5% and 83.6%, respectively in the two villages in areas of low endemicity, while the specificity was 99.8% in the nonendemic village. Compared with IHA, the overall specificity and sensitivity of CDIFA were greater than 99% and 96%, respectively. With the combination of Kato-Katz and IHA as the reference standard, CDIFA had a sensitivity of 95.8% and a specificity of 99.5%, and an accuracy of 98.6% in the two areas of low endemicity. CONCLUSION: CDIFA is a specific, sensitive, and reliable test that can be used for rapid screening for schistosomiasis by health workers in field settings. PMID:16175827
Hereditary arrhythmias and cardiomyopathies: decision-making about genetic testing.
Louis, Clauden; Calamaro, Emily; Vinocur, Jeffrey M
2018-01-01
The modern field of clinical genetics has advanced beyond the traditional teachings familiar to most practicing cardiologists. Increased understanding of the roles of genetic testing may improve uptake and appropriateness of use. Clinical genetics has become integral to the management of patients with hereditary arrhythmia and cardiomyopathy diagnoses. Depending on the condition, genetic testing may be useful for diagnosis, prognosis, treatment, family screening, and reproductive planning. However, genetic testing is a powerful tool with potential for underuse, overuse, and misuse. In the absence of a substantial body of literature on how these guidelines are applied in clinical practice, we use a case-based approach to highlight key lessons and pitfalls. Importantly, in many scenarios genetic testing has become the standard of care supported by numerous class I recommendations; genetic counselors can improve accessibility to and appropriate use and application of testing. Optimal management of hereditary arrhythmias and cardiomyopathies incorporates genetic testing, applied as per consensus guidelines, with involvement of a multidisciplinary team.
A comparison of exact tests for trend with binary endpoints using Bartholomew's statistic.
Consiglio, J D; Shan, G; Wilding, G E
2014-01-01
Tests for trend are important in a number of scientific fields when trends associated with binary variables are of interest. Implementing the standard Cochran-Armitage trend test requires an arbitrary choice of scores assigned to represent the grouping variable. Bartholomew proposed a test for qualitatively ordered samples using asymptotic critical values, but type I error control can be problematic in finite samples. To our knowledge, use of the exact probability distribution has not been explored, and we study its use in the present paper. Specifically we consider an approach based on conditioning on both sets of marginal totals and three unconditional approaches where only the marginal totals corresponding to the group sample sizes are treated as fixed. While slightly conservative, all four tests are guaranteed to have actual type I error rates below the nominal level. The unconditional tests are found to exhibit far less conservatism than the conditional test and thereby gain a power advantage.
NASA Astrophysics Data System (ADS)
Pechousek, J.; Prochazka, R.; Mashlan, M.; Jancik, D.; Frydrych, J.
2009-01-01
The digital proportional-integral-derivative (PID) velocity controller used in the Mössbauer spectrometer implemented in field programmable gate array (FPGA) is based on the National Instruments CompactRIO embedded system and LabVIEW graphical programming tools. The system works as a remote system accessible via the Ethernet. The digital controller operates in real-time conditions, and the maximum sampling frequency is approximately 227 kS s-1. The system was tested with standard sample measurements of α-Fe and α-57Fe2O3 on two different electromechanical velocity transducers. The nonlinearities of the velocity scales in the relative form are better than 0.2%. The replacement of the standard analog PID controller by the new system brings the possibility of optimizing the control process more precisely.
Towards standard testbeds for numerical relativity
NASA Astrophysics Data System (ADS)
Alcubierre, Miguel; Allen, Gabrielle; Bona, Carles; Fiske, David; Goodale, Tom; Guzmán, F. Siddhartha; Hawke, Ian; Hawley, Scott H.; Husa, Sascha; Koppitz, Michael; Lechner, Christiane; Pollney, Denis; Rideout, David; Salgado, Marcelo; Schnetter, Erik; Seidel, Edward; Shinkai, Hisa-aki; Shoemaker, Deirdre; Szilágyi, Béla; Takahashi, Ryoji; Winicour, Jeff
2004-01-01
In recent years, many different numerical evolution schemes for Einstein's equations have been proposed to address stability and accuracy problems that have plagued the numerical relativity community for decades. Some of these approaches have been tested on different spacetimes, and conclusions have been drawn based on these tests. However, differences in results originate from many sources, including not only formulations of the equations, but also gauges, boundary conditions, numerical methods and so on. We propose to build up a suite of standardized testbeds for comparing approaches to the numerical evolution of Einstein's equations that are designed to both probe their strengths and weaknesses and to separate out different effects, and their causes, seen in the results. We discuss general design principles of suitable testbeds, and we present an initial round of simple tests with periodic boundary conditions. This is a pivotal first step towards building a suite of testbeds to serve the numerical relativists and researchers from related fields who wish to assess the capabilities of numerical relativity codes. We present some examples of how these tests can be quite effective in revealing various limitations of different approaches, and illustrating their differences. The tests are presently limited to vacuum spacetimes, can be run on modest computational resources and can be used with many different approaches used in the relativity community.
Evaluation of Breast Sentinel Lymph Node Coverage by Standard Radiation Therapy Fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rabinovitch, Rachel; Ballonoff, Ari; Newman, Francis M.S.
2008-04-01
Background: Biopsy of the breast sentinel lymph node (SLN) is now a standard staging procedure for early-stage invasive breast cancer. The anatomic location of the breast SLN and its relationship to standard radiation fields has not been described. Methods and Materials: A retrospective review of radiotherapy treatment planning data sets was performed in patients with breast cancer who had undergone SLN biopsy, and those with a surgical clip at the SLN biopsy site were identified. The location of the clip was evaluated relative to vertebral body level on an anterior-posterior digitally reconstructed radiograph, treated whole-breast tangential radiation fields, and standardmore » axillary fields in 106 data sets meeting these criteria. Results: The breast SLN varied in vertebral body level position, ranging from T2 to T7 but most commonly opposite T4. The SLN clip was located below the base of the clavicle in 90%, and hence would be excluded from standard axillary radiotherapy fields where the inferior border is placed at this level. The clip was within the irradiated whole-breast tangent fields in 78%, beneath the superior-posterior corner multileaf collimators in 12%, and outside the tangent field borders in 10%. Conclusions: Standard axillary fields do not encompass the lymph nodes at highest risk of containing tumor in breast cancer patients. Elimination of the superior-posterior corner MLCs from the tangent field design would result in inclusion of the breast SLN in 90% of patients treated with standard whole-breast irradiation.« less
Mass Trapping for Anastrepha suspensa
USDA-ARS?s Scientific Manuscript database
ABSTRACT In field tests conducted in south Florida to test grape juice as an alternative inexpensive bait for Anastrepha suspensa Loew, high numbers of Zaprionus indianus Gupta were captured in traps baited with aqueous grape juice. These experiments included comparisons of grape juice with standard...
Østergaard, Mia L; Nielsen, Kristina R; Albrecht-Beste, Elisabeth; Konge, Lars; Nielsen, Michael B
2018-01-01
This study aimed to develop a test with validity evidence for abdominal diagnostic ultrasound with a pass/fail-standard to facilitate mastery learning. The simulator had 150 real-life patient abdominal scans of which 15 cases with 44 findings were selected, representing level 1 from The European Federation of Societies for Ultrasound in Medicine and Biology. Four groups of experience levels were constructed: Novices (medical students), trainees (first-year radiology residents), intermediates (third- to fourth-year radiology residents) and advanced (physicians with ultrasound fellowship). Participants were tested in a standardized setup and scored by two blinded reviewers prior to an item analysis. The item analysis excluded 14 diagnoses. Both internal consistency (Cronbach's alpha 0.96) and inter-rater reliability (0.99) were good and there were statistically significant differences (p < 0.001) between all four groups, except the intermediate and advanced groups (p = 1.0). There was a statistically significant correlation between experience and test scores (Pearson's r = 0.82, p < 0.001). The pass/fail-standard failed all novices (no false positives) and passed all advanced (no false negatives). All intermediate participants and six out of 14 trainees passed. We developed a test for diagnostic abdominal ultrasound with solid validity evidence and a pass/fail-standard without any false-positive or false-negative scores. • Ultrasound training can benefit from competency-based education based on reliable tests. • This simulation-based test can differentiate between competency levels of ultrasound examiners. • This test is suitable for competency-based education, e.g. mastery learning. • We provide a pass/fail standard without false-negative or false-positive scores.
Mbare, Oscar; Lindsay, Steven W; Fillinger, Ulrike
2013-03-14
Recently research has shown that larviciding can be an effective tool for integrated malaria vector control. Nevertheless, the uptake of this intervention has been hampered by the need to re-apply larvicides frequently. There is a need to explore persistent, environmentally friendly larvicides for malaria vector control to reduce intervention efforts and costs by reducing the frequency of application. In this study, the efficacy of a 0.5% pyriproxyfen granule (Surmilarv®0.5G, Sumitomo Chemicals) was assessed for the control of Anopheles gambiae sensu stricto and Anopheles arabiensis, the major malaria vectors in sub-Saharan Africa. Dose-response and standardized field tests were implemented following standard procedures of the World Health Organization's Pesticide Evaluation Scheme to determine: (i) the susceptibility of vectors to this formulation; (ii) the residual activity and appropriate retreatment schedule for field application; and, (iii) sub-lethal impacts on the number and viability of eggs laid by adults after exposure to Sumilarv®0.5G during larval development. Anopheles gambiae s.s. and An. arabiensis were highly susceptible to Sumilarv®0.5G. Estimated emergence inhibition (EI) values were very low and similar for both species. The minimum dosage that completely inhibited adult emergence was between 0.01-0.03 parts per million (ppm) active ingredient (ai). Compared to the untreated control, an application of 0.018 ppm ai prevented 85% (95% confidence interval (CI) 82%-88%) of adult emergence over six weeks under standardized field conditions. A fivefold increase in dosage of 0.09 ppm ai prevented 97% (95% CI 94%-98%) emergence. Significant sub-lethal effects were observed in the standardized field tests. Female An. gambiae s.s. that were exposed to 0.018 ppm ai as larvae laid 47% less eggs, and females exposed to 0.09 ppm ai laid 74% less eggs than females that were unexposed to the treatment. Furthermore, 77% of eggs laid by females exposed to 0.018 ppm ai failed to hatch, whilst 98% of eggs laid by females exposed to 0.09 ppm ai did not hatch. Anopheles gambiae s.s. and An. arabiensis are highly susceptible to Sumilarv®0.5G at very low dosages. The persistence of this granule formulation in treated habitats under standardized field conditions and its sub-lethal impact, reducing the number of viable eggs from adults emerging from treated ponds, enhances its potential as malaria vector control tool. These unique properties warrant further field testing to determine its suitability for inclusion in malaria vector control programmes.
2013-01-01
Background Recently research has shown that larviciding can be an effective tool for integrated malaria vector control. Nevertheless, the uptake of this intervention has been hampered by the need to re-apply larvicides frequently. There is a need to explore persistent, environmentally friendly larvicides for malaria vector control to reduce intervention efforts and costs by reducing the frequency of application. In this study, the efficacy of a 0.5% pyriproxyfen granule (Surmilarv®0.5G, Sumitomo Chemicals) was assessed for the control of Anopheles gambiae sensu stricto and Anopheles arabiensis, the major malaria vectors in sub-Saharan Africa. Methods Dose–response and standardized field tests were implemented following standard procedures of the World Health Organization’s Pesticide Evaluation Scheme to determine: (i) the susceptibility of vectors to this formulation; (ii) the residual activity and appropriate retreatment schedule for field application; and, (iii) sub-lethal impacts on the number and viability of eggs laid by adults after exposure to Sumilarv®0.5G during larval development. Results Anopheles gambiae s.s. and An. arabiensis were highly susceptible to Sumilarv®0.5G. Estimated emergence inhibition (EI) values were very low and similar for both species. The minimum dosage that completely inhibited adult emergence was between 0.01-0.03 parts per million (ppm) active ingredient (ai). Compared to the untreated control, an application of 0.018 ppm ai prevented 85% (95% confidence interval (CI) 82%-88%) of adult emergence over six weeks under standardized field conditions. A fivefold increase in dosage of 0.09 ppm ai prevented 97% (95% CI 94%-98%) emergence. Significant sub-lethal effects were observed in the standardized field tests. Female An. gambiae s.s. that were exposed to 0.018 ppm ai as larvae laid 47% less eggs, and females exposed to 0.09 ppm ai laid 74% less eggs than females that were unexposed to the treatment. Furthermore, 77% of eggs laid by females exposed to 0.018 ppm ai failed to hatch, whilst 98% of eggs laid by females exposed to 0.09 ppm ai did not hatch. Conclusion Anopheles gambiae s.s. and An. arabiensis are highly susceptible to Sumilarv®0.5G at very low dosages. The persistence of this granule formulation in treated habitats under standardized field conditions and its sub-lethal impact, reducing the number of viable eggs from adults emerging from treated ponds, enhances its potential as malaria vector control tool. These unique properties warrant further field testing to determine its suitability for inclusion in malaria vector control programmes. PMID:23497149
Pre-Clinical Tests of an Integrated CMOS Biomolecular Sensor for Cardiac Diseases Diagnosis.
Lee, Jen-Kuang; Wang, I-Shun; Huang, Chi-Hsien; Chen, Yih-Fan; Huang, Nien-Tsu; Lin, Chih-Ting
2017-11-26
Coronary artery disease and its related complications pose great threats to human health. In this work, we aim to clinically evaluate a CMOS field-effect biomolecular sensor for cardiac biomarkers, cardiac-specific troponin-I (cTnI), N -terminal prohormone brain natriuretic peptide (NT-proBNP), and interleukin-6 (IL-6). The CMOS biosensor is implemented via a standard commercialized 0.35 μm CMOS process. To validate the sensing characteristics, in buffer conditions, the developed CMOS biosensor has identified the detection limits of IL-6, cTnI, and NT-proBNP as being 45 pM, 32 pM, and 32 pM, respectively. In clinical serum conditions, furthermore, the developed CMOS biosensor performs a good correlation with an enzyme-linked immuno-sorbent assay (ELISA) obtained from a hospital central laboratory. Based on this work, the CMOS field-effect biosensor poses good potential for accomplishing the needs of a point-of-care testing (POCT) system for heart disease diagnosis.
2006-11-01
All Quality Control Reference Materials are acquired only from authorized vendors or sources commonly used by U.S. EPA Regional Laboratories...Institue of Standards and Testing (NITS) Standard Reference Materials (SRM) or to the U.S. EPA Reference Standards. Working Standards The commercial...contaminants from clothing or equipment by blowing, shaking or any other means that may disperse material into the air is prohibited. 7.1.3. All disposable
Ope, Maurice; Nyoka, Raymond; Unshur, Ahmed; Oyier, Fredrick O.; Mowlid, Shafe A.; Owino, Brian; Ochieng, Steve B.; Okello, Charles I.; Montgomery, Joel M.; Wagacha, Burton; Galev, Aleksandar; Abdow, Abdikadir; Esona, Mathew D.; Tate, Jacqueline; Fitter, David; Cookson, Susan T.; Arunmozhi, Balajee; Marano, Nina
2017-01-01
Rotavirus commonly causes diarrhea in children, leading to hospitalization and even death. Rapid diagnostic tests are feasible alternatives for determining rotavirus outbreaks in refugee camps that have inadequate laboratory capacity. We evaluated the field performance of ImmunoCard STAT!® Rotavirus (ICS-RV) in Dadaab Refugee Camp and at the Kenya–Somalia border. From May to December 2014, we prospectively enrolled children aged < 5 years hospitalized with acute diarrhea, defined as ≥ 3 episodes of loose stool in 24 hours for < 7 days. Stool samples were collected and tested by trained surveillance clerks using ICS-RV per manufacturer's instructions. The field performance characteristics of ICS-RV were evaluated against the gold standard test, Premier™ Rotaclone® enzyme immunoassay. The operational characteristics were evaluated using World Health Organization (WHO) ASSURED criteria to determine whether ICS-RV is appropriate as a point-of-care test by administering a standard questionnaire and observing surveillance clerks performing the test. We enrolled 213 patients with a median age of 10 months (range = 1–48); 58.2% were male. A total of 71 (33.3%) and 60 (28.2%) patients tested positive for rotavirus infection by immunoassay and ICS-RV, respectively. The sensitivity, specificity, and positive and negative predictive values of ICS-RV compared with the immunoassay were 83.1% (95% confidence interval [CI] = 72.3–91.0), 99.3% (95% CI = 96.1–100), 98.3% (95% CI = 91.1–100), and 92.1% (95% CI = 86.6–95.5), respectively. The ICS-RV fulfilled the WHO ASSURED criteria for point-of-care testing. ICS-RV is a field-ready point-of-care test with good field performance and operational characteristics. It can be useful in determining rotavirus outbreaks in resource-limited settings. PMID:28719278
Ope, Maurice; Nyoka, Raymond; Unshur, Ahmed; Oyier, Fredrick O; Mowlid, Shafe A; Owino, Brian; Ochieng, Steve B; Okello, Charles I; Montgomery, Joel M; Wagacha, Burton; Galev, Aleksandar; Abdow, Abdikadir; Esona, Mathew D; Tate, Jacqueline; Fitter, David; Cookson, Susan T; Arunmozhi, Balajee; Marano, Nina
2017-06-01
AbstractRotavirus commonly causes diarrhea in children, leading to hospitalization and even death. Rapid diagnostic tests are feasible alternatives for determining rotavirus outbreaks in refugee camps that have inadequate laboratory capacity. We evaluated the field performance of ImmunoCard STAT! ® Rotavirus (ICS-RV) in Dadaab Refugee Camp and at the Kenya-Somalia border. From May to December 2014, we prospectively enrolled children aged < 5 years hospitalized with acute diarrhea, defined as ≥ 3 episodes of loose stool in 24 hours for < 7 days. Stool samples were collected and tested by trained surveillance clerks using ICS-RV per manufacturer's instructions. The field performance characteristics of ICS-RV were evaluated against the gold standard test, Premier ™ Rotaclone ® enzyme immunoassay. The operational characteristics were evaluated using World Health Organization (WHO) ASSURED criteria to determine whether ICS-RV is appropriate as a point-of-care test by administering a standard questionnaire and observing surveillance clerks performing the test. We enrolled 213 patients with a median age of 10 months (range = 1-48); 58.2% were male. A total of 71 (33.3%) and 60 (28.2%) patients tested positive for rotavirus infection by immunoassay and ICS-RV, respectively. The sensitivity, specificity, and positive and negative predictive values of ICS-RV compared with the immunoassay were 83.1% (95% confidence interval [CI] = 72.3-91.0), 99.3% (95% CI = 96.1-100), 98.3% (95% CI = 91.1-100), and 92.1% (95% CI = 86.6-95.5), respectively. The ICS-RV fulfilled the WHO ASSURED criteria for point-of-care testing. ICS-RV is a field-ready point-of-care test with good field performance and operational characteristics. It can be useful in determining rotavirus outbreaks in resource-limited settings.
Epoxy-based production of wind turbine rotor blades: occupational contact allergies.
Pontén, Ann; Carstensen, Ole; Rasmussen, Kurt; Gruvberger, Birgitta; Isaksson, Marléne; Bruze, Magnus
2004-03-01
An industry producing rotor blades for wind turbines with an epoxy-based technology had experienced an increasing number of workers with dermatitis, among whom the frequency of occupational contact allergy (OCA) was suspected to be underestimated. To investigate the frequency of OCA by patch-testing with a specially profiled occupational patch test series. In a blinded study design, 603 workers were first interviewed and thereafter clinically examined. Based on a history of work-related skin disease, clinical findings of dermatitis, or both, 325 (53.9%) of the workers were patch-tested with an occupational patch test series and the European Standard patch test series. Of the 603 investigated workers, 10.9% had OCA and 5.6% had contact allergy to epoxy resin in the standard test series. Contact allergy to amine hardeners/catalysts was found in 4.1% of the workers. Among the workers with OCA, 48.5% reacted to work material other than epoxy resin in the European Standard patch test series. Approximately 50% of the workers with OCA would not have been detected if only the European Standard patch test series had been used.
WE-AB-BRB-10: Filmless QA of CyberKnife MLC-Collimated and Iris-Collimated Fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gersh, J; Spectrum Medical Physics, LLC, Greenville, SC
Purpose: Current methods of CK field shape QA is based on the use of radiochromic film. Though accurate results can be attained, these methods are prone to error, time consuming, and expensive. The techniques described herein perform similar QA using the FOIL Detector (Field, Output, and Image Localization). A key feature of this in-house QA solution, and central to this study, is an aSi flat-panel detector which provides the user with the means to perform accurate, immediate, and quantitative field analysis. Methods: The FOIL detector is automatically aligned in the CK beam using fiducial markers implanted within the detector case.more » Once the system is aligned, a treatment plan is delivered which irradiates the flat-panel imager using the field being tested. The current study tests each of the clinically-used fields shaped using the Iris variable-aperture collimation system using a plan which takes 6 minutes to deliver. The user is immediately provided with field diameter and beam profile, as well as a comparison to baseline values. Additionally, the detector is used to acquire and analyze leaf positions of the InCise multi-leaf collimation system. Results: Using a 6-minute plan consisting of 11 beams of 25MU-per-beam, the FOIL detector provided the user with a quantitative analysis of all clinically-used field shapes. The FOIL detector was also able to clearly resolve field edge junctions in a picket fence test, including slight over-travel of individual leaves as well as inter-leaf leakage. Conclusion: The FOIL system provided comparable field diameter and profile data when compared to methods using film; providing results much faster and with 5% of the MU used for film. When used with the MLC system, the FOIL detector provided the means for immediate quantification of the performance of the system through analysis of leaf positions in a picket fence test field. Author is the President/Owner of Spectrum Medical Physics, LLC, a company which maintains contracts with Siemens Healthcare and Standard Imaging, Inc.« less
Modeling Sound Propagation Through Non-Axisymmetric Jets
NASA Technical Reports Server (NTRS)
Leib, Stewart J.
2014-01-01
A method for computing the far-field adjoint Green's function of the generalized acoustic analogy equations under a locally parallel mean flow approximation is presented. The method is based on expanding the mean-flow-dependent coefficients in the governing equation and the scalar Green's function in truncated Fourier series in the azimuthal direction and a finite difference approximation in the radial direction in circular cylindrical coordinates. The combined spectral/finite difference method yields a highly banded system of algebraic equations that can be efficiently solved using a standard sparse system solver. The method is applied to test cases, with mean flow specified by analytical functions, corresponding to two noise reduction concepts of current interest: the offset jet and the fluid shield. Sample results for the Green's function are given for these two test cases and recommendations made as to the use of the method as part of a RANS-based jet noise prediction code.
Microcomputer-based system for registration of oxygen tension in peripheral muscle.
Odman, S; Bratt, H; Erlandsson, I; Sjögren, L
1986-01-01
For registration of oxygen tension fields in peripheral muscle a microcomputer based system was designed on the M6800 microprocessor. The system was designed to record the signals from a multiwire oxygen electrode, MDO, which is a multiwire electrode for measuring oxygen on the surface of an organ. The system contained patient safety isolation unit built on optocopplers and the upper frequency limit was 0.64 Hz. Collected data were corrected for drift and temperature changes during the measurement by using pre- and after calibrations and a linear compensation technique. Measure drift of the electrodes were proved to be linear and thus the drift could be compensated for. The system was tested in an experiment on pig. To study the distribution of oxygen statistically mean, standard deviation, skewness and curtosis were calculated. To see changes or differences between histograms a Kolmogorv-Smirnov test was used.
Research Says…/High-Stakes Testing Narrows the Curriculum
ERIC Educational Resources Information Center
David, Jane L.
2011-01-01
The current rationale for standards-based reform goes like this: If standards are demanding and tests accurately measure achievement of those standards, then curriculum and instruction will become richer and more rigorous. By attaching serious consequences to schools that fail to increase test scores, U.S. policymakers believe that educators will…
Redmond, Tony; O'Leary, Neil; Hutchison, Donna M; Nicolela, Marcelo T; Artes, Paul H; Chauhan, Balwantray C
2013-12-01
A new analysis method called permutation of pointwise linear regression measures the significance of deterioration over time at each visual field location, combines the significance values into an overall statistic, and then determines the likelihood of change in the visual field. Because the outcome is a single P value, individualized to that specific visual field and independent of the scale of the original measurement, the method is well suited for comparing techniques with different stimuli and scales. To test the hypothesis that frequency-doubling matrix perimetry (FDT2) is more sensitive than standard automated perimetry (SAP) in identifying visual field progression in glaucoma. Patients with open-angle glaucoma and healthy controls were examined by FDT2 and SAP, both with the 24-2 test pattern, on the same day at 6-month intervals in a longitudinal prospective study conducted in a hospital-based setting. Only participants with at least 5 examinations were included. Data were analyzed with permutation of pointwise linear regression. Permutation of pointwise linear regression is individualized to each participant, in contrast to current analyses in which the statistical significance is inferred from population-based approaches. Analyses were performed with both total deviation and pattern deviation. Sixty-four patients and 36 controls were included in the study. The median age, SAP mean deviation, and follow-up period were 65 years, -2.6 dB, and 5.4 years, respectively, in patients and 62 years, +0.4 dB, and 5.2 years, respectively, in controls. Using total deviation analyses, statistically significant deterioration was identified in 17% of patients with FDT2, in 34% of patients with SAP, and in 14% of patients with both techniques; in controls these percentages were 8% with FDT2, 31% with SAP, and 8% with both. Using pattern deviation analyses, statistically significant deterioration was identified in 16% of patients with FDT2, in 17% of patients with SAP, and in 3% of patients with both techniques; in controls these values were 3% with FDT2 and none with SAP. No evidence was found that FDT2 is more sensitive than SAP in identifying visual field deterioration. In about one-third of healthy controls, age-related deterioration with SAP reached statistical significance.
Psychopharmacology curriculum field test.
Zisook, Sidney; Balon, Richard; Benjamin, Sheldon; Beresin, Eugene; Goldberg, David A; Jibson, Michael D; Thrall, Grace
2009-01-01
As part of an effort to improve psychopharmacology training in psychiatric residency programs, a committee of residency training directors and associate directors adapted an introductory schizophrenia presentation from the American Society of Clinical Psychopharmacology's Model Psychopharmacology Curriculum to develop a multimodal, interactive training module. This article describes the module, its development, and the results of a field trial to test its feasibility and usefulness. Nineteen residency programs volunteered to use the module during the first half of the 2007-2008 academic year. Evaluation consisted of a structured phone interview with the training director or teaching faculty of participating programs during February and early March 2008, asking whether and how they used the curriculum, which components they found most useful, and how it was received by faculty and residents. Of the 19 programs, 14 used the module and 13 participated in the evaluation. The most commonly used components were the pre- and postmodule questions, video-enhanced presentation, standard presentation, problem- or team-based teaching module, and other problem-based teaching modules. No two programs used the module in the same fashion, but it was well received by instructors and residents regardless of use. The results of this field trial suggest that a dynamic, adult-centered curriculum that is exciting, innovative, and informative enough for a wide variety of programs can be developed; however, the development and programmatic barriers require considerable time and effort to overcome.
Portable atomic frequency standard based on coherent population trapping
NASA Astrophysics Data System (ADS)
Shi, Fan; Yang, Renfu; Nian, Feng; Zhang, Zhenwei; Cui, Yongshun; Zhao, Huan; Wang, Nuanrang; Feng, Keming
2015-05-01
In this work, a portable atomic frequency standard based on coherent population trapping is designed and demonstrated. To achieve a portable prototype, in the system, a single transverse mode 795nm VCSEL modulated by a 3.4GHz RF source is used as a pump laser which generates coherent light fields. The pump beams pass through a vapor cell containing atom gas and buffer gas. This vapor cell is surrounded by a magnetic shield and placed inside a solenoid which applies a longitudinal magnetic field to lift the Zeeman energy levels' degeneracy and to separate the resonance signal, which has no first-order magnetic field dependence, from the field-dependent resonances. The electrical control system comprises two control loops. The first one locks the laser wavelength to the minimum of the absorption spectrum; the second one locks the modulation frequency and output standard frequency. Furthermore, we designed the micro physical package and realized the locking of a coherent population trapping atomic frequency standard portable prototype successfully. The short-term frequency stability of the whole system is measured to be 6×10-11 for averaging times of 1s, and reaches 5×10-12 at an averaging time of 1000s.
The additional benefit of the ML Flow test to classify leprosy patients.
Bührer-Sékula, Samira; Illarramendi, Ximena; Teles, Rose B; Penna, Maria Lucia F; Nery, José Augusto C; Sales, Anna Maria; Oskam, Linda; Sampaio, Elizabeth P; Sarno, Euzenir N
2009-08-01
The use of the skin lesion counting classification leads to both under and over diagnosis of leprosy in many instances. Thus, there is a need to complement this classification with another simple and robust test for use in the field. Data of 202 untreated leprosy patients diagnosed at FIOCRUZ, Rio de Janeiro, Brazil, was analyzed. There were 90 patients classified as PB and 112 classified as MB according to the reference standard. The BI was positive in 111 (55%) patients and the ML Flow test in 116 (57.4%) patients. The ML Flow test was positive in 95 (86%) of the patients with a positive BI. The lesion counting classification was confirmed by both BI and ML Flow tests in 65% of the 92 patients with 5 or fewer lesions, and in 76% of the 110 patients with 6 or more lesions. The combination of skin lesion counting and the ML Flow test results yielded a sensitivity of 85% and a specificity of 87% for MB classification, and correctly classified 86% of the patients when compared to the standard reference. A considerable proportion of the patients (43.5%) with discordant test results in relation to standard classification was in reaction. The use of any classification system has limitations, especially those that oversimplify a complex disease such as leprosy. In the absence of an experienced dermatologist and slit skin smear, the ML Flow test could be used to improve treatment decisions in field conditions.
NASA Astrophysics Data System (ADS)
Purohit, A.; Satapathy, A.
2017-02-01
Use of industrial wastes, such as slag and sludge particles, as filler in polymers is not very common in the field of composite research. Therefore in this paper, a comparison of mechanical characteristics of epoxy based composites filled with LD sludge, BF slag and LD slag (wastes generated in iron and steel industries) were presented. A comparative study among these composites in regard to their dry sliding wear characteristics under similar test conditions was also included. Composites with different weight proportions (0, 5, 10, 15 and 20 wt.%) of LD sludge were fabricated by solution casting technique. Mechanical properties were evaluated as per ASTM test standards and sliding wear test was performed following a design of experiment approach based on Taguchi’s orthogonal array. The test results for epoxy-LD sludge composites were compared with those of epoxy-BF slag and epoxy-LD slag composites reported by previous investigators. The comparison reveals that epoxy filled with LD sludge exhibits superior mechanical and wear characteristics among the three types of composites considered in this study.
NASA Astrophysics Data System (ADS)
Che, Ailan; Luo, Xianqi; Qi, Jinghua; Wang, Deyong
Shear wave velocity (Vs) of soil is one of the key parameters used in assessment of liquefaction potential of saturated soils in the base with leveled ground surface; determination of shear module of soils used in seismic response analyses. Such parameter can be experimentally obtained from laboratory soil tests and field measurements. Statistical relation of shear wave velocity with soil properties based on the surface wave survey investigation, and resonant column triaxial tests, which are taken from more than 14 sites within the depth of 10 m under ground surface, is obtained in Tianjin (China) area. The relationship between shear wave velocity and the standard penetration test N value (SPT-N value) of silt and clay in the quaternary formation are summarized. It is an important problem to research the effect of shear wave velocity on liquefaction resistance of saturated silts (sandy loams) for evaluating liquefaction resistance. According the results of cyclic triaxial tests, a correlation between liquefaction resistance and shear wave velocity is presented. The results are useful for ground liquefaction investigation and the evaluation of liquefaction resistance.
2017-05-11
NUMBER 5e. TASK NUMBER 5f. WORK UNIT NUMBER 7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES) US Army Aberdeen Test Center (TEDT-AT-WFS) 400...AND ADDRESS(ES) Policy and Standardization Division (CSTE-TM) U.S. Army Test and Evaluation Command 2202 Aberdeen Boulevard Aberdeen Proving Ground
A new IRT-based standard setting method: application to eCat-listening.
García, Pablo Eduardo; Abad, Francisco José; Olea, Julio; Aguado, David
2013-01-01
Criterion-referenced interpretations of tests are highly necessary, which usually involves the difficult task of establishing cut scores. Contrasting with other Item Response Theory (IRT)-based standard setting methods, a non-judgmental approach is proposed in this study, in which Item Characteristic Curve (ICC) transformations lead to the final cut scores. eCat-Listening, a computerized adaptive test for the evaluation of English Listening, was administered to 1,576 participants, and the proposed standard setting method was applied to classify them into the performance standards of the Common European Framework of Reference for Languages (CEFR). The results showed a classification closely related to relevant external measures of the English language domain, according to the CEFR. It is concluded that the proposed method is a practical and valid standard setting alternative for IRT-based tests interpretations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yow, T.G.
The Transportation Operational Personal Property Standard System (TOPS) is an automated information management system to help administer the personal property transporation program for the Department of Defense (DOD). TOPS was fielded at four prototype sites in the late summer of 1988. Prototype testing is currently underway, with system deployment scheduled for 1989. When fully deployed, TOPS will save DOD both time and money and help ensure that all shipments made by armed services personnel are handled quickly and efficiently. The success of the TOPS system depends upon several key factors. Of course, TOPS must give transportation clerks at military personalmore » property shipping offices a tool with which they can perform their jobs with greater ease, speed, and correctness. However, before TOPS can achieve success in the field, it must first find acceptance from the transportation clerks themselves. The purpose of this document is to examine the user interface techniques used in the Counseling module of TOPS to ensure user acceptance and data base integrity, both key elements in the ultimate success of TOPS. 6 refs., 12 figs.« less
Catomeris, Peter; Baxter, Nancy N; Boss, Sheila C; Paszat, Lawrence F; Rabeneck, Linda; Randell, Edward; Serenity, Mardie L; Sutradhar, Rinku; Tinmouth, Jill
2018-01-01
- Although promising for colorectal cancer screening, hemoglobin (Hb) stability remains a concern with fecal immunochemical tests. This study implemented a novel, standardized method to compare Hb stability across various fecal immunochemical tests. The method can be used to inform decisions when selecting a kit for use in colorectal cancer screening. In so doing, this work addressed a critical need for standardization in this field. - To compare the stability of Hb across 5 different immunochemical kits and one guaiac kit. - The stability of Hb was analyzed in collection devices inoculated with Hb-spiked feces and (1) stored at various temperatures (frozen, refrigerated, ambient, and elevated) for more than 60 days; (2) after undergoing 3 controlled, freeze-thaw cycles; and (3) after being transported by courier or postal services in uncontrolled temperature conditions from 3 locations in Ontario, Canada, to a central testing center. - The stability of Hb varied with time and temperature and by kit. Lower Hb recoveries occurred with increasing temperature and increasing time from sample collection to testing. Refrigeration provided the best stability, although results varied across kits (eg, from 4.2 days to >60 days before a prespecified threshold [<70% probability of the test results remaining positive] was reached). Freeze-thaw stability varied across kits and cycles (Hb recoveries: NS Plus [Alfresa Pharma, Chuo-ku, Osaka, Japan], 91.7% to 95.4%; OC Diana [Eiken Chemical, Taito-ku, Tokyo, Japan], 57.6% to 74.9%). Agreement regarding Hb levels before and after transportation varied across kits (from 57% to 100%). - Important differences in Hb stability were found across the included fecal immunochemical tests. These findings should inform practice-based and population-based colorectal cancer screening.
Cost-effective and Rapid Blood Analysis on a Cell-phone
Zhu, Hongying; Sencan, Ikbal; Wong, Justin; Dimitrov, Stoyan; Tseng, Derek; Nagashima, Keita; Ozcan, Aydogan
2013-01-01
We demonstrate a compact and cost-effective imaging cytometry platform installed on a cell-phone for the measurement of the density of red and white blood cells as well as hemoglobin concentration in human blood samples. Fluorescent and bright-field images of blood samples are captured using separate optical attachments to the cell-phone and are rapidly processed through a custom-developed smart application running on the phone for counting of blood cells and determining hemoglobin density. We evaluated the performance of this cell-phone based blood analysis platform using anonymous human blood samples and achieved comparable results to a standard bench-top hematology analyser. Test results can either be stored on the cell-phone memory or be transmitted to a central server, providing remote diagnosis opportunities even in field settings. PMID:23392286
Cost-effective and rapid blood analysis on a cell-phone.
Zhu, Hongying; Sencan, Ikbal; Wong, Justin; Dimitrov, Stoyan; Tseng, Derek; Nagashima, Keita; Ozcan, Aydogan
2013-04-07
We demonstrate a compact and cost-effective imaging cytometry platform installed on a cell-phone for the measurement of the density of red and white blood cells as well as hemoglobin concentration in human blood samples. Fluorescent and bright-field images of blood samples are captured using separate optical attachments to the cell-phone and are rapidly processed through a custom-developed smart application running on the phone for counting of blood cells and determining hemoglobin density. We evaluated the performance of this cell-phone based blood analysis platform using anonymous human blood samples and achieved comparable results to a standard bench-top hematology analyser. Test results can either be stored on the cell-phone memory or be transmitted to a central server, providing remote diagnosis opportunities even in field settings.
NASA Astrophysics Data System (ADS)
Arino de La Rubia, L.; Butler, J.; Gary, T.; Stockman, S.; Mumma, M.; Pfiffner, S.; Davis, K.; Edmonds, J.
2009-12-01
The Minority Institution Astrobiology Collaborative began working with the NASA Goddard Center for Astrobiology in 2003 to develop curriculum materials for high school chemistry and Earth science classes based on astrobiology concepts. The Astrobiology in Secondary Classrooms modules are being developed to emphasize interdisciplinary connections in astronomy, biology, chemistry, geoscience, physics, mathematics, and ethics through hands-on activities that address national educational standards. Since this time, more NASA Astrobiology Institute Teams have joined this education and public outreach (EPO)effort. Field-testing of the Astrobiology in Secondary Classrooms materials began in 2007 in five US locations, each with populations that are underrepresented in the career fields of science, technology, engineering, and mathematics.
EFFECTS OF RADIATION ON ESTABLISHED FORENSIC EVIDENCE CONTAINMENT METHODS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferguson, C.; Duff, M.; Clark, E.
2010-11-29
The Federal Bureau of Investigation (FBI) Laboratory is currently exploring needs and protocols for the storage of evidentiary items contaminated with radioactive material. While a large body of knowledge on the behavior of storage polymers in radiation fields exists, this knowledge has not been applied to the field of forensics and maintaining evidentiary integrity. The focus of this research was to evaluate the behavior of several traditional evidentiary containment polymers when exposed to significant alpha, beta, gamma, neutron and mixed radiation sources. Doses were designed to simulate exposures possible during storage of materials. Several products were found to be poorlymore » suited for use in this specific application based on standardized mechanical testing results. Remaining products were determined to warrant further investigation for the storage of radiologically contaminated evidence.« less
NASA Technical Reports Server (NTRS)
Rengarajan, Govind; Aminpour, Mohammad A.; Knight, Norman F., Jr.
1992-01-01
An improved four-node quadrilateral assumed-stress hybrid shell element with drilling degrees of freedom is presented. The formulation is based on Hellinger-Reissner variational principle and the shape functions are formulated directly for the four-node element. The element has 12 membrane degrees of freedom and 12 bending degrees of freedom. It has nine independent stress parameters to describe the membrane stress resultant field and 13 independent stress parameters to describe the moment and transverse shear stress resultant field. The formulation encompasses linear stress, linear buckling, and linear free vibration problems. The element is validated with standard tests cases and is shown to be robust. Numerical results are presented for linear stress, buckling, and free vibration analyses.
Wide gap active brazing of ceramic-to-metal-joints for high temperature applications
NASA Astrophysics Data System (ADS)
Bobzin, K.; Zhao, L.; Kopp, N.; Samadian Anavar, S.
2014-03-01
Applications like solid oxide fuel cells and sensors increasingly demand the possibility to braze ceramics to metals with a good resistance to high temperatures and oxidative atmospheres. Commonly used silver based active filler metals cannot fulfill these requirements, if application temperatures higher than 600°C occur. Au and Pd based active fillers are too expensive for many fields of use. As one possible solution nickel based active fillers were developed. Due to the high brazing temperatures and the low ductility of nickel based filler metals, the modification of standard nickel based filler metals were necessary to meet the requirements of above mentioned applications. To reduce thermally induced stresses wide brazing gaps and the addition of Al2O3 and WC particles to the filler metal were applied. In this study, the microstructure of the brazed joints and the thermo-chemical reactions between filler metal, active elements and WC particles were analyzed to understand the mechanism of the so called wide gap active brazing process. With regard to the behavior in typical application oxidation and thermal cycle tests were conducted as well as tensile tests.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Driscoll, Frederick R.
The University of Washington (UW) - Northwest National Marine Renewable Energy Center (UW-NNMREC) and the National Renewable Energy Laboratory (NREL) will collaborate to advance research and development (R&D) of Marine Hydrokinetic (MHK) renewable energy technology, specifically renewable energy captured from ocean tidal currents. UW-NNMREC is endeavoring to establish infrastructure, capabilities and tools to support in-water testing of marine energy technology. NREL is leveraging its experience and capabilities in field testing of wind systems to develop protocols and instrumentation to advance field testing of MHK systems. Under this work, UW-NNMREC and NREL will work together to develop a common instrumentation systemmore » and testing methodologies, standards and protocols. UW-NNMREC is also establishing simulation capabilities for MHK turbine and turbine arrays. NREL has extensive experience in wind turbine array modeling and is developing several computer based numerical simulation capabilities for MHK systems. Under this CRADA, UW-NNMREC and NREL will work together to augment single device and array modeling codes. As part of this effort UW NNMREC will also work with NREL to run simulations on NREL's high performance computer system.« less
Installation Restoration Program Phase 2. Confirmation/Quantification Stage 2.
1988-04-03
MOORE STANDARD OPERATING PROCEDURES WORK PRACTICES 1. Smoking, eating, drinking, and chewing tobacco are prohibited in the contaminated or...meter and detector are transported in a protective foam-lined case. The cell is tested before going into the field using the test feature and is...available prior to the start-up of field work, the 1"o1l ,ii . rvicts, materials, work space, and items of equipment to zu F.Vft. the nt1 tor
SU-G-IeP4-06: Feasibility of External Beam Treatment Field Verification Using Cherenkov Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Black, P; Na, Y; Wuu, C
2016-06-15
Purpose: Cherenkov light emission has been shown to correlate with ionizing radiation (IR) dose delivery in solid tissue. In order to properly correlate Cherenkov light images with real time dose delivery in a patient, we must account for geometric and intensity distortions arising from observation angle, as well as the effect of monitor units (MU) and field size on Cherenkov light emission. To test the feasibility of treatment field verification, we first focused on Cherenkov light emission efficiency based on MU and known field size (FS). Methods: Cherenkov light emission was captured using a PI-MAX4 intensified charge coupled device(ICCD) systemmore » (Princeton Instruments), positioned at a fixed angle of 40° relative to the beam central axis. A Varian TrueBeam linear accelerator (linac) was operated at 6MV and 600MU/min to deliver an Anterior-Posterior beam to a 5cm thick block phantom positioned at 100cm Source-to-Surface-Distance(SSD). FS of 10×10, 5×5, and 2×2cm{sup 2} were used. Before beam delivery projected light field images were acquired, ensuring that geometric distortions were consistent when measuring Cherenkov field discrepancies. Cherenkov image acquisition was triggered by linac target current. 500 frames were acquired for each FS. Composite images were created through summation of frames and background subtraction. MU per image was calculated based on linac pulse delay of 2.8ms. Cherenkov and projected light FS were evaluated using ImageJ software. Results: Mean Cherenkov FS discrepancies compared to light field were <0.5cm for 5.6, 2.8, and 8.6 MU for 10×10, 5×5, and 2×2cm{sup 2} FS, respectably. Discrepancies were reduced with increasing field size and MU. We predict a minimum of 100 frames is needed for reliable confirmation of delivered FS. Conclusion: Current discrepancies in Cherenkov field sizes are within a usable range to confirm treatment delivery in standard and respiratory gated clinical scenarios at MU levels appropriate to standard MLC position segments.« less
A detailed numerical simulation of a liquid-propellant rocket engine ground test experiment
NASA Astrophysics Data System (ADS)
Lankford, D. W.; Simmons, M. A.; Heikkinen, B. D.
1992-07-01
A computational simulation of a Liquid Rocket Engine (LRE) ground test experiment was performed using two modeling approaches. The results of the models were compared with selected data to assess the validity of state-of-the-art computational tools for predicting the flowfield and radiative transfer in complex flow environments. The data used for comparison consisted of in-band station radiation measurements obtained in the near-field portion of the plume exhaust. The test article was a subscale LRE with an afterbody, resulting in a large base region. The flight conditions were such that afterburning regions were observed in the plume flowfield. A conventional standard modeling approach underpredicted the extent of afterburning and the associated radiation levels. These results were attributed to the absence of the base flow region which is not accounted for in this model. To assess the effects of the base region a Navier-Stokes model was applied. The results of this calculation indicate that the base recirculation effects are dominant features in the immediate expansion region and resulted in a much improved comparison. However, the downstream in-band station radiation data remained underpredicted by this model.
[Optimization of cluster analysis based on drug resistance profiles of MRSA isolates].
Tani, Hiroya; Kishi, Takahiko; Gotoh, Minehiro; Yamagishi, Yuka; Mikamo, Hiroshige
2015-12-01
We examined 402 methicillin-resistant Staphylococcus aureus (MRSA) strains isolated from clinical specimens in our hospital between November 19, 2010 and December 27, 2011 to evaluate the similarity between cluster analysis of drug susceptibility tests and pulsed-field gel electrophoresis (PFGE). The results showed that the 402 strains tested were classified into 27 PFGE patterns (151 subtypes of patterns). Cluster analyses of drug susceptibility tests with the cut-off distance yielding a similar classification capability showed favorable results--when the MIC method was used, and minimum inhibitory concentration (MIC) values were used directly in the method, the level of agreement with PFGE was 74.2% when 15 drugs were tested. The Unweighted Pair Group Method with Arithmetic mean (UPGMA) method was effective when the cut-off distance was 16. Using the SIR method in which susceptible (S), intermediate (I), and resistant (R) were coded as 0, 2, and 3, respectively, according to the Clinical and Laboratory Standards Institute (CLSI) criteria, the level of agreement with PFGE was 75.9% when the number of drugs tested was 17, the method used for clustering was the UPGMA, and the cut-off distance was 3.6. In addition, to assess the reproducibility of the results, 10 strains were randomly sampled from the overall test and subjected to cluster analysis. This was repeated 100 times under the same conditions. The results indicated good reproducibility of the results, with the level of agreement with PFGE showing a mean of 82.0%, standard deviation of 12.1%, and mode of 90.0% for the MIC method and a mean of 80.0%, standard deviation of 13.4%, and mode of 90.0% for the SIR method. In summary, cluster analysis for drug susceptibility tests is useful for the epidemiological analysis of MRSA.
NASA Astrophysics Data System (ADS)
Yarnykh, V.; Korostyshevskaya, A.
2017-08-01
Macromolecular proton fraction (MPF) is a biophysical parameter describing the amount of macromolecular protons involved into magnetization exchange with water protons in tissues. MPF represents a significant interest as a magnetic resonance imaging (MRI) biomarker of myelin for clinical applications. A recent fast MPF mapping method enabled clinical translation of MPF measurements due to time-efficient acquisition based on the single-point constrained fit algorithm. However, previous MPF mapping applications utilized only 3 Tesla MRI scanners and modified pulse sequences, which are not commonly available. This study aimed to test the feasibility of MPF mapping implementation on a 1.5 Tesla clinical scanner using standard manufacturer’s sequences and compare the performance of this method between 1.5 and 3 Tesla scanners. MPF mapping was implemented on 1.5 and 3 Tesla MRI units of one manufacturer with either optimized custom-written or standard product pulse sequences. Whole-brain three-dimensional MPF maps obtained from a single volunteer were compared between field strengths and implementation options. MPF maps demonstrated similar quality at both field strengths. MPF values in segmented brain tissues and specific anatomic regions appeared in close agreement. This experiment demonstrates the feasibility of fast MPF mapping using standard sequences on 1.5 T and 3 T clinical scanners.
NASA Astrophysics Data System (ADS)
Yan, Peng; Lu, Wenbo; Zhang, Jing; Zou, Yujun; Chen, Ming
2017-04-01
Ground vibration, as the most critical public hazard of blasting, has received much attention from the community. Many countries established national standards to suppress vibration impact on structures, but a world-accepted blasting vibration criterion on human safety is still missing. In order to evaluate human response to the vibration from blasting excavation of a large-scale rock slope in China, this study aims to suggest a revised criterion. The vibration frequency was introduced to improve the existing single-factor (peak particle velocity) standard recommended by the United States Bureau of Mines (USBM). The feasibility of the new criterion was checked based on field vibration monitoring and investigation of human reactions. Moreover, the air overpressure or blast effects on human beings have also been discussed. The result indicates that the entire zone of influence can be divided into three subzones: severe-annoyance, light-annoyance and perception zone according to the revised safety standard. Both the construction company and local residents have provided positive comments on this influence degree assessment, which indicates that the presented criterion is suitable for evaluating human response to nearby blasts. Nevertheless, this specific criterion needs more field tests and verifications before it can be
Clements, William H; Cadmus, Pete; Brinkman, Stephen F
2013-07-02
Field surveys of metal-contaminated streams suggest that some aquatic insects, particularly mayflies (Ephemeroptera) and stoneflies (Plecoptera), are highly sensitive to metals. However, results of single species toxicity tests indicate these organisms are quite tolerant, with LC50 values often several orders of magnitude greater than those obtained using standard test organisms (e.g., cladocerans and fathead minnows). Reconciling these differences is a critical research need, particularly since water quality criteria for metals are based primarily on results of single species toxicity tests. In this research we provide evidence based on community-level microcosm experiments to support the hypothesis that some aquatic insects are highly sensitive to metals. We present results of three experiments that quantified effects of Cu and Zn, alone and in combination, on stream insect communities. EC50 values, defined as the metal concentration that reduced abundance of aquatic insects by 50%, were several orders of magnitude lower than previously published values obtained from single species tests. We hypothesize that the short duration of laboratory toxicity tests and the failure to evaluate effects of metals on sensitive early life stages are the primary factors responsible for unrealistically high LC50 values in the literature. We also observed that Cu alone was significantly more toxic to aquatic insects than the combination of Cu and Zn, despite the fact that exposure concentrations represented theoretically similar toxicity levels. Our results suggest that water quality criteria for Zn were protective of most aquatic insects, whereas Cu was highly toxic to some species at concentrations near water quality criteria. Because of the functional significance of aquatic insects in stream ecosystems and their well-established importance as indicators of water quality, reconciling differences between field and laboratory responses and understanding the mechanisms responsible for variation in sensitivity among metals and metal mixtures is of critical importance.
An online ID identification system for liquefied-gas cylinder plant
NASA Astrophysics Data System (ADS)
He, Jin; Ding, Zhenwen; Han, Lei; Zhang, Hao
2017-11-01
An automatic ID identification system for gas cylinders' online production was developed based on the production conditions and requirements of the Technical Committee for Standardization of Gas Cylinders. A cylinder ID image acquisition system was designed to improve the image contrast of ID regions on gas cylinders against the background. Then the ID digits region was located by the CNN template matching algorithm. Following that, an adaptive threshold method based on the analysis of local average grey value and standard deviation was proposed to overcome defects of non-uniform background in the segmentation results. To improve the single digit identification accuracy, two BP neural networks were trained respectively for the identification of all digits and the easily confusable digits. If the single digit was classified as one of confusable digits by the former BP neural network, it was further tested by the later one, and the later result was taken as the final identification result of this single digit. At last, the majority voting was adopted to decide the final identification result for the 6-digit cylinder ID. The developed system was installed on a production line of a liquefied-petroleum-gas cylinder plant and worked in parallel with the existing weighing step on the line. Through the field test, the correct identification rate for single ID digit was 94.73%, and none of the tested 2000 cylinder ID was misclassified through the majority voting.
Thomas, Jobin; Singh, Mithilesh; Goswami, T K; Glora, Philma; Chakravarti, Soumendu; Chander, Vishal; Upmanyu, Vikramaditya; Verma, Suman; Sharma, Chhavi; Mahendran, K
2017-09-01
Canine parvoviral enteritis is a highly contagious viral illness caused by canine parvovirus-2 (CPV-2) which affects puppies of mainly 6-20 weeks of age. Vaccination is pivotal in preventing and controlling CPV-2 infection. Determination of antibody status is a critical determinant for successful vaccination. The hemagglutination inhibition (HI) test is 'gold standard' test for quantification of antibodies specific to CPV-2, although the execution of this test is not feasible under field conditions. The present study was undertaken to develop a point of care testing to determine immune status prior to CPV-2 vaccination or to detect seroconversion in immunized dogs by latex agglutination test (LAT) using recombinant antigen. Truncated portion of VP2 protein (tVP2) of CPV-2 was selected on the basis of antigenic indices, overexpressed the recombinant protein in E. coli system and was subsequently used in development of LAT. A total of 59 serum samples obtained from vaccinated (n = 54) and healthy unvaccinated (n = 5) dogs were tested. The positivity was observed in 85% (46/54) of these dogs with varying agglutination pattern. The overall sensitivity and specificity of latex agglutination test in comparison to HI test was recorded as 90% and 88% respectively with an agreement value of 90% (CI = 95%). Copyright © 2017 International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.
Food for thought ... A toxicology ontology roadmap.
Hardy, Barry; Apic, Gordana; Carthew, Philip; Clark, Dominic; Cook, David; Dix, Ian; Escher, Sylvia; Hastings, Janna; Heard, David J; Jeliazkova, Nina; Judson, Philip; Matis-Mitchell, Sherri; Mitic, Dragana; Myatt, Glenn; Shah, Imran; Spjuth, Ola; Tcheremenskaia, Olga; Toldo, Luca; Watson, David; White, Andrew; Yang, Chihae
2012-01-01
Foreign substances can have a dramatic and unpredictable adverse effect on human health. In the development of new therapeutic agents, it is essential that the potential adverse effects of all candidates be identified as early as possible. The field of predictive toxicology strives to profile the potential for adverse effects of novel chemical substances before they occur, both with traditional in vivo experimental approaches and increasingly through the development of in vitro and computational methods which can supplement and reduce the need for animal testing. To be maximally effective, the field needs access to the largest possible knowledge base of previous toxicology findings, and such results need to be made available in such a fashion so as to be interoperable, comparable, and compatible with standard toolkits. This necessitates the development of open, public, computable, and standardized toxicology vocabularies and ontologies so as to support the applications required by in silico, in vitro, and in vivo toxicology methods and related analysis and reporting activities. Such ontology development will support data management, model building, integrated analysis, validation and reporting, including regulatory reporting and alternative testing submission requirements as required by guidelines such as the REACH legislation, leading to new scientific advances in a mechanistically-based predictive toxicology. Numerous existing ontology and standards initiatives can contribute to the creation of a toxicology ontology supporting the needs of predictive toxicology and risk assessment. Additionally, new ontologies are needed to satisfy practical use cases and scenarios where gaps currently exist. Developing and integrating these resources will require a well-coordinated and sustained effort across numerous stakeholders engaged in a public-private partnership. In this communication, we set out a roadmap for the development of an integrated toxicology ontology, harnessing existing resources where applicable. We describe the stakeholders' requirements analysis from the academic and industry perspectives, timelines, and expected benefits of this initiative, with a view to engagement with the wider community.
The development of STS payload environmental engineering standards
NASA Technical Reports Server (NTRS)
Bangs, W. F.
1982-01-01
The presently reported effort to provide a single set of standards for the design, analysis and testing of Space Transportation System (STS) payloads throughout the NASA organization must be viewed as essentially experimental, since the concept of incorporating the diverse opinions and experiences of several separate field research centers may in retrospect be judged too ambitious or perhaps even naive. While each STS payload may have unique characteristics, and the project should formulate its own criteria for environmental design, testing and evaluation, a reference source document providing coordinated standards is expected to minimize the duplication of effort and limit random divergence of practices among the various NASA payload programs. These standards would provide useful information to all potential STS users, and offer a degree of standardization to STS users outside the NASA organization.
Keogh, Justin W L; Weber, Clare L; Dalton, Carl T
2003-06-01
The purpose of the present study was to develop an effective testing battery for female field hockey by using anthropometric, physiological, and skill-related tests to distinguish between regional representative (Rep, n = 35) and local club level (Club, n = 39) female field hockey players. Rep players were significantly leaner and recorded faster times for the 10-m and 40-m sprints as well as the Illinois Agility Run (with and without dribbling a hockey ball). Rep players also had greater aerobic and lower body muscular power and were more accurate in the shooting accuracy test, p < 0.05. No significant differences between groups were evident for height, body mass, speed decrement in 6 x 40-m repeated sprints, handgrip strength, or pushing speed. These results indicate that %BF, sprinting speed, agility, dribbling control, aerobic and muscular power, and shooting accuracy can distinguish between female field hockey players of varying standards. Therefore talent identification programs for female field hockey should include assessments of these physical parameters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Augustoni, Arnold L.
A laser safety and hazard analysis was performed for the temperature stabilized Big Sky Laser Technology (BSLT) laser central to the ARES system based on the 2007 version of the American National Standards Institutes (ANSI) Standard Z136.1, for Safe Use of Lasers and the 2005 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. The ARES laser system is a Van/Truck based mobile platform, which is used to perform laser interaction experiments and tests at various national test sites.
NASA Astrophysics Data System (ADS)
Bailey, Quentin G.
2007-08-01
This work explores the theoretical and experimental aspects of Lorentz violation in gravity. A set of modified Einstein field equations is derived from the general Lorentz-violating Standard-Model Extension (SME). Some general theoretical implications of these results are discussed. The experimental consequences for weak-field gravitating systems are explored in the Earth- laboratory setting, the solar system, and beyond. The role of spontaneous Lorentz-symmetry breaking is discussed in the context of the pure-gravity sector of the SME. To establish the low-energy effective Einstein field equations, it is necessary to take into account the dynamics of 20 coefficients for Lorentz violation. As an example, the results are compared with bumblebee models, which are general theories of vector fields with spontaneous Lorentz violation. The field equations are evaluated in the post- newtonian limit using a perfect fluid description of matter. The post-newtonian metric of the SME is derived and compared with some standard test models of gravity. The possible signals for Lorentz violation due to gravity-sector coefficients are studied. Several new effects are identified that have experimental implications for current and future tests. Among the unconventional effects are a new type of spin precession for a gyroscope in orbit and a modification to the local gravitational acceleration on the Earth's surface. These and other tests are expected to yield interesting sensitivities to dimensionless gravity- sector coefficients.
ERIC Educational Resources Information Center
Fowell, S. L.; Fewtrell, R.; McLaughlin, P. J.
2008-01-01
Absolute standard setting procedures are recommended for assessment in medical education. Absolute, test-centred standard setting procedures were introduced for written assessments in the Liverpool MBChB in 2001. The modified Angoff and Ebel methods have been used for short answer question-based and extended matching question-based papers,…
Translational Imaging Spectroscopy for Proximal Sensing
Rogass, Christian; Koerting, Friederike M.; Mielke, Christian; Brell, Maximilian; Boesche, Nina K.; Bade, Maria; Hohmann, Christian
2017-01-01
Proximal sensing as the near field counterpart of remote sensing offers a broad variety of applications. Imaging spectroscopy in general and translational laboratory imaging spectroscopy in particular can be utilized for a variety of different research topics. Geoscientific applications require a precise pre-processing of hyperspectral data cubes to retrieve at-surface reflectance in order to conduct spectral feature-based comparison of unknown sample spectra to known library spectra. A new pre-processing chain called GeoMAP-Trans for at-surface reflectance retrieval is proposed here as an analogue to other algorithms published by the team of authors. It consists of a radiometric, a geometric and a spectral module. Each module consists of several processing steps that are described in detail. The processing chain was adapted to the broadly used HySPEX VNIR/SWIR imaging spectrometer system and tested using geological mineral samples. The performance was subjectively and objectively evaluated using standard artificial image quality metrics and comparative measurements of mineral and Lambertian diffuser standards with standard field and laboratory spectrometers. The proposed algorithm provides highly qualitative results, offers broad applicability through its generic design and might be the first one of its kind to be published. A high radiometric accuracy is achieved by the incorporation of the Reduction of Miscalibration Effects (ROME) framework. The geometric accuracy is higher than 1 μpixel. The critical spectral accuracy was relatively estimated by comparing spectra of standard field spectrometers to those from HySPEX for a Lambertian diffuser. The achieved spectral accuracy is better than 0.02% for the full spectrum and better than 98% for the absorption features. It was empirically shown that point and imaging spectrometers provide different results for non-Lambertian samples due to their different sensing principles, adjacency scattering impacts on the signal and anisotropic surface reflection properties. PMID:28800111
Estimating Critical Values for Strength of Alignment among Curriculum, Assessments, and Instruction
ERIC Educational Resources Information Center
Fulmer, Gavin W.
2010-01-01
School accountability decisions based on standardized tests hinge on the degree of alignment of the test with a state's standards. Yet no established criteria were available for judging strength of alignment. Previous studies of alignment among tests, standards, and teachers' instruction have yielded mixed results that are difficult to interpret…