Culturally Sensitive Career Assessment: A Quandary. ERIC Digest No. 210.
ERIC Educational Resources Information Center
Austin, James T.
Multicultural perspectives on assessment challenge traditional views by advancing an additional source of variation in test responses that is presumed to escape test developers and test users. Increasing and convergent evidence from multiple sources indicates the following types of ethnocentric errors in test development, administration,…
Psychological testing and psychological assessment. A review of evidence and issues.
Meyer, G J; Finn, S E; Eyde, L D; Kay, G G; Moreland, K L; Dies, R R; Eisman, E J; Kubiszyn, T W; Reed, G M
2001-02-01
This article summarizes evidence and issues associated with psychological assessment. Data from more than 125 meta-analyses on test validity and 800 samples examining multimethod assessment suggest 4 general conclusions: (a) Psychological test validity is strong and compelling, (b) psychological test validity is comparable to medical test validity, (c) distinct assessment methods provide unique sources of information, and (d) clinicians who rely exclusively on interviews are prone to incomplete understandings. Following principles for optimal nomothetic research, the authors suggest that a multimethod assessment battery provides a structured means for skilled clinicians to maximize the validity of individualized assessments. Future investigations should move beyond an examination of test scales to focus more on the role of psychologists who use tests as helpful tools to furnish patients and referral sources with professional consultation.
The Chandra Source Catalog: Source Variability
NASA Astrophysics Data System (ADS)
Nowak, Michael; Rots, A. H.; McCollough, M. L.; Primini, F. A.; Glotfelty, K. J.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hain, R.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.
2009-01-01
The Chandra Source Catalog (CSC) contains fields of view that have been studied with individual, uninterrupted observations that span integration times ranging from 1 ksec to 160 ksec, a large number of which have received (multiple) repeat observations days to years later. The CSC thus offers an unprecedented look at the variability of the X-ray sky over a broad range of time scales, and across a wide diversity of variable X-ray sources: stars in the local galactic neighborhood, galactic and extragalactic X-ray binaries, Active Galactic Nuclei, etc. Here we describe the methods used to identify and quantify source variability within a single observation, and the methods used to assess the variability of a source when detected in multiple, individual observations. Three tests are used to detect source variability within a single observation: the Kolmogorov-Smirnov test and its variant, the Kuiper test, and a Bayesian approach originally suggested by Gregory and Loredo. The latter test not only provides an indicator of variability, but is also used to create a best estimate of the variable lightcurve shape. We assess the performance of these tests via simulation of statistically stationary, variable processes with arbitrary input power spectral densities (here we concentrate on results of red noise simulations) at a variety of mean count rates and fractional root mean square variabilities relevant to CSC sources. We also assess the false positive rate via simulations of constant sources whose sole source of fluctuation is Poisson noise. We compare these simulations to a preliminary assessment of the variability found in real CSC sources, and estimate the variability sensitivities of the CSC.
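The within-observation tests named above (Kolmogorov-Smirnov and its Kuiper variant) lend themselves to a short illustration. The sketch below is a toy example under stated assumptions, not CSC pipeline code: the event list is simulated, and only the one-sample K-S test from scipy is shown.

```python
# A minimal sketch (not the CSC pipeline): test a single observation's event
# list for variability by comparing photon arrival times against the uniform
# distribution expected from a constant source. The event list is simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical photon arrival times (seconds) within a 10 ks observation.
t_start, t_stop = 0.0, 10_000.0
arrival_times = np.sort(rng.uniform(t_start, t_stop, size=500))

# Under the constant-source hypothesis, arrival times are uniform on
# [t_start, t_stop]; rescale to [0, 1] and apply the one-sample K-S test.
u = (arrival_times - t_start) / (t_stop - t_start)
ks_stat, ks_pvalue = stats.kstest(u, "uniform")

print(f"K-S statistic = {ks_stat:.3f}, p-value = {ks_pvalue:.3f}")
# A small p-value would flag the source as variable within this observation.
# The Kuiper variant (equally sensitive near the start and end of the
# observation) and a Gregory-Loredo style Bayesian odds ratio over binned
# light curves would be applied in the same spirit.
```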
The Chandra Source Catalog: Source Variability
NASA Astrophysics Data System (ADS)
Nowak, Michael; Rots, A. H.; McCollough, M. L.; Primini, F. A.; Glotfelty, K. J.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Evans, I.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hain, R.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.
2009-09-01
The Chandra Source Catalog (CSC) contains fields of view that have been studied with individual, uninterrupted observations that span integration times ranging from 1 ksec to 160 ksec, a large number of which have received (multiple) repeat observations days to years later. The CSC thus offers an unprecedented look at the variability of the X-ray sky over a broad range of time scales, and across a wide diversity of variable X-ray sources: stars in the local galactic neighborhood, galactic and extragalactic X-ray binaries, Active Galactic Nuclei, etc. Here we describe the methods used to identify and quantify source variability within a single observation, and the methods used to assess the variability of a source when detected in multiple, individual observations. Three tests are used to detect source variability within a single observation: the Kolmogorov-Smirnov test and its variant, the Kuiper test, and a Bayesian approach originally suggested by Gregory and Loredo. The latter test not only provides an indicator of variability, but is also used to create a best estimate of the variable lightcurve shape. We assess the performance of these tests via simulation of statistically stationary, variable processes with arbitrary input power spectral densities (here we concentrate on results of red noise simulations) at a variety of mean count rates and fractional root mean square variabilities relevant to CSC sources. We also assess the false positive rate via simulations of constant sources whose sole source of fluctuation is Poisson noise. We compare these simulations to an assessment of the variability found in real CSC sources, and estimate the variability sensitivities of the CSC.
Assessment and control of spacecraft electromagnetic interference
NASA Technical Reports Server (NTRS)
1972-01-01
Design criteria are presented to provide guidance in assessing electromagnetic interference from onboard sources and establishing requisite control in spacecraft design, development, and testing. A comprehensive state-of-the-art review is given which covers flight experience, sources and transmission of electromagnetic interference, susceptible equipment, design procedure, control techniques, and test methods.
2017-01-27
Final Report: Designing, Assessing, and Demonstrating Sustainable Bioaugmentation for Treatment of DNAPL Sources in Fractured Bedrock (ESTCP, contract W912HQ-12-C-0062).
What Does a Verbal Test Measure? A New Approach to Understanding Sources of Item Difficulty.
ERIC Educational Resources Information Center
Berk, Eric J. Vanden; Lohman, David F.; Cassata, Jennifer Coyne
Assessing the construct relevance of mental test results continues to present many challenges, and it has proven to be particularly difficult to assess the construct relevance of verbal items. This study was conducted to gain a better understanding of the conceptual sources of verbal item difficulty using a unique approach that integrates…
Psychological Testing and Psychological Assessment: A Review of Evidence and Issues.
ERIC Educational Resources Information Center
Meyer, Gregory J.; Finn, Stephen E.; Eyde, Lorraine D.; Kay, Gary G.; Moreland, Kevin L.; Dies, Robert R.; Eisman, Elena J.; Kubiszyn, Tom W.; Reed, Geoffrey M.
2001-01-01
Summarizes issues associated with psychological assessment, concluding that: psychological test validity is strong and is comparable to medical test validity; distinct assessment methods provide unique sources of information; and clinicians who rely solely on interviews are prone to incomplete understandings. Suggests that multimethod assessment…
Testing Information Sources for Educators. ERIC/TME Report 94.
ERIC Educational Resources Information Center
Fabiano, Emily; O'Brien, Nancy
This guide provides annotated lists of books, journals, indexes, and computer-based services and organizations that are sources of test information. The guide directs educators to test information about assessing academic ability, aptitude, achievement, personality, vocational aptitude, and intelligence, as well as specialized topics such as…
Can Sanitary Surveys Replace Water Quality Testing? Evidence from Kisii, Kenya.
Misati, Aaron Gichaba; Ogendi, George; Peletz, Rachel; Khush, Ranjiv; Kumpel, Emily
2017-02-07
Information about the quality of rural drinking water sources can be used to manage their safety and mitigate risks to health. Sanitary surveys, which are observational checklists to assess hazards present at water sources, are simpler to conduct than microbial tests. We assessed whether sanitary survey results were associated with measured indicator bacteria levels in rural drinking water sources in Kisii Central, Kenya. Overall, thermotolerant coliform (TTC) levels were high: all of the samples from the 20 tested dug wells, almost all (95%) of the samples from the 25 tested springs, and 61% of the samples from the 16 tested rainwater harvesting systems were contaminated with TTC. There were no significant associations between TTC levels and overall sanitary survey scores or their individual components. Contamination by TTC was associated with source type (dug wells and springs were more contaminated than rainwater systems). While sanitary surveys cannot be substituted for microbial water quality results in this context, they could be used to identify potential hazards and contribute to a comprehensive risk management approach.
Theoretical value of pre-trade testing for Salmonella in Swedish cattle herds.
Sternberg Lewerin, Susanna
2018-05-01
The Swedish Salmonella control programme includes mandatory action if Salmonella is detected in a herd. The aim of this study was to assess the relative value of different strategies for pre-movement testing of cattle. Three fictitious herds were included: dairy, beef and specialised calf-fattening. The yearly risks of introducing Salmonella with and without individual serological or bulk milk testing were assessed as well as the effects of sourcing animals from low-prevalence areas or reducing the number of source herds. The initial risk was highest for the calf-fattening herd and lowest for the beef herd. For the beef and dairy herds, the yearly risk of Salmonella introduction was reduced by about 75% with individual testing. Sourcing animals from low-prevalence areas reduced the risk by >99%. For the calf-fattening herd, the yearly risk was reduced by almost 50% by individual testing or sourcing animals from a maximum of five herds. The method was useful for illustrating effects of risk mitigation when introducing animals into a herd. Sourcing animals from low-risk areas (or herds) is more effective than single testing of individual animals or bulk milk. A comprehensive approach to reduce the risk of introducing Salmonella from source herds is justified. Copyright © 2017 Elsevier Ltd. All rights reserved.
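The kind of comparison described above can be illustrated with a toy calculation. The sketch below is not the model used in the study; the prevalence, test sensitivity, and purchase numbers are invented, and the function name is hypothetical.

```python
# A rough, hypothetical sketch of the type of calculation behind such
# comparisons (not the study's model): yearly risk of introducing Salmonella
# as a function of per-animal prevalence, test sensitivity, and number of
# animals sourced per year. All parameter values are illustrative.
def yearly_introduction_risk(n_animals: int,
                             per_animal_prevalence: float,
                             test_sensitivity: float = 0.0) -> float:
    """Probability that at least one infected animal is introduced in a year.

    With no testing, pass test_sensitivity=0.0; with individual pre-movement
    testing, only animals escaping detection (1 - sensitivity) are introduced.
    """
    p_introduced_per_animal = per_animal_prevalence * (1.0 - test_sensitivity)
    return 1.0 - (1.0 - p_introduced_per_animal) ** n_animals

# Illustrative comparison for a calf-fattening herd buying 400 calves per year.
baseline = yearly_introduction_risk(400, per_animal_prevalence=0.002)
with_testing = yearly_introduction_risk(400, per_animal_prevalence=0.002,
                                        test_sensitivity=0.6)

print(f"no testing:         {baseline:.1%}")
print(f"individual testing: {with_testing:.1%}")
print(f"relative reduction: {1 - with_testing / baseline:.0%}")
```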
ERIC Educational Resources Information Center
Balogun, Joseph; Abiona, Titilayo; Lukobo-Durrell, Mainza; Adefuye, Adedeji; Amosun, Seyi; Frantz, Jose; Yakut, Yavuz
2011-01-01
Objective: This comparative study evaluated the readability and test-retest reliability of a questionnaire designed to assess the attitudes, beliefs, behaviours, and sources of information about HIV/AIDS among young adults recruited from universities in the United States of America (USA), Turkey and South Africa. Design/Setting: The instrument was…
ERIC Educational Resources Information Center
Heric, Matthew; Carter, Jenn
2011-01-01
Cognitive readiness (CR) and performance for operational time-critical environments are continuing points of focus for military and academic communities. In response to this need, we designed an open source interactive CR assessment application as a highly adaptive and efficient open source testing administration and analysis tool. It is capable…
Brandt, Marc; Becker, Eva; Jöhncke, Ulrich; Sättler, Daniel; Schulte, Christoph
2016-01-01
One important purpose of the European REACH Regulation (EC No. 1907/2006) is to promote the use of alternative methods for assessing the hazards of substances in order to avoid animal testing. Experience with environmental hazard assessment under REACH shows that efficient alternative methods are needed to assess chemicals when standard test data are missing. One such assessment method is the weight-of-evidence (WoE) approach. In this study, the WoE approach was used to assess the persistence of certain phenolic benzotriazoles, a group that also includes substances of very high concern (SVHC). For phenolic benzotriazoles, assessment of environmental persistence is challenging because standard information, i.e. simulation tests on biodegradation, is not available. Thus, the WoE approach was used: overall information from many sources was considered, and the individual uncertainties of each source were analysed separately. In a second step, all information was aggregated to give an overall picture of persistence and to assess the degradability of the phenolic benzotriazoles under consideration, even though the reliability of individual sources was incomplete. Overall, the evidence suggesting that phenolic benzotriazoles are very persistent in the environment is unambiguous. This was demonstrated by a WoE approach that met the prerequisites of REACH by combining several limited information sources. The combination enabled a clear overall assessment that can be reliably used for SVHC identification. Finally, it is recommended that WoE approaches be included as an important tool in future environmental risk assessments.
Open source database of images DEIMOS: extension for large-scale subjective image quality assessment
NASA Astrophysics Data System (ADS)
Vítek, Stanislav
2014-09-01
DEIMOS (Database of Images: Open Source) is an open-source database of images and video sequences for testing, verification and comparison of various image and/or video processing techniques such as compression, reconstruction and enhancement. This paper deals with an extension of the database that allows large-scale web-based subjective image quality assessment to be performed. The extension implements both an administrative and a client interface. The proposed system is aimed mainly at mobile communication devices and takes advantage of HTML5 technology, meaning that participants do not need to install any application and the assessment can be performed in a web browser. The assessment campaign administrator can select images from the large database and then apply rules defined by various test procedure recommendations. The standard test procedures may be fully customized and saved as a template. Alternatively, the administrator can define a custom test using images from the pool and other components, such as evaluation forms and ongoing questionnaires. The image sequence is delivered to the online client, e.g. a smartphone or tablet, as a fully automated assessment sequence, or the viewer can decide on the timing of the assessment if required. Environmental data and viewing conditions (e.g. illumination, vibrations, GPS coordinates) may be collected and subsequently analyzed.
Healy-Profitós, Jessica; Lee, Seungjun; Mouhaman, Arabi; Garabed, Rebecca; Moritz, Mark; Piperata, Barbara; Lee, Jiyoung
2016-06-01
This study examined the spatial variation of potential gastrointestinal pathogens within drinking water sources and home storage containers in four neighborhoods in Maroua, Cameroon. Samples were collected from sources (n = 28) and home containers (n = 60) in each study neighborhood. Pathogen contamination was assessed using quantitative polymerase chain reaction, targeting Campylobacter spp., Shiga toxin-producing Escherichia coli (virulence genes stx1 and stx2), and Salmonella spp. Microbial source tracking (MST) targeted three different host-specific markers: HF183 (human), Rum2Bac (ruminant) and GFD (poultry) to identify contamination sources. Staphylococcus aureus and the tetracycline-resistance gene (tetQ) were assessed to measure human hand contact and the presence of antibiotic-resistant bacteria. Pathogen/MST levels were compared statistically and spatially, and neighborhood variation was compared with previously collected demographic information. All of the tested fecal markers and pathogens (except Arcobacter) were detected in home and source samples. Two neighborhoods tested positive for most pathogens/MST while the others only tested positive for one or two. Spatial variation of pathogens/MST existed between sources, storage containers, and neighborhoods. Differing population density and ethno-economic characteristics could potentially explain this variation. Future research should explore the influence of demographic and ethno-economic factors on water quality during microbial risk assessments in urban Africa.
Nakamura, Satoshi; Nishioka, Shie; Iijima, Kotaro; Wakita, Akihisa; Abe, Yukinao; Tohyama, Naoki; Kawamura, Shinji; Minemura, Toshiyuki; Itami, Jun
2017-01-01
Purpose: The aim of this study is to describe a phantom designed for independent examination of source position in brachytherapy that is suitable for inclusion in an external auditing program. Material and methods: We developed a phantom with a special design and a simple mechanism, capable of firmly fixing a radiochromic film and tandem-ovoid applicators to assess discrepancies in source positions between the measurements and the treatment planning system (TPS). Three tests were conducted: 1) reproducibility of the source positions (n = 5); 2) source movements inside the applicator tube; 3) changes in source position caused by changing the curvature of the transfer tubes. In addition, as a trial study, the phantom was mailed to 12 institutions, and 23 trial data sets were examined. The source displacements ΔX and ΔY (reference = TPS) were expressed according to coordinates in which the positive direction on the X-axis corresponds to the external side of the applicator, perpendicular to the source transfer direction (Y-axis). Results: Test 1: the 1σ fell within 1 mm irrespective of the dwell positions. Test 2: ΔX was greater around the tip of the applicator owing to the source cable. Test 3: all of the source position changes fell within 1 mm. For the postal audit, the mean and 1.96σ in ΔX were 0.8 and 0.8 mm, respectively. Almost all data were located within the positive region along the X-axis due to the source cable. The mean and 1.96σ in ΔY were 0.3 and 1.6 mm, respectively. The variance in ΔY was greater than that in ΔX, and large uncertainties exist in the determination of the first dwell position. The 95% confidence limit was 2.1 mm. Conclusions: In HDR brachytherapy, the effectiveness of independent source position assessment was demonstrated. The 95% confidence limit was 2.1 mm for a tandem-ovoid applicator. PMID:29204169
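The postal-audit statistics quoted above (mean and 1.96σ of ΔX and ΔY, plus a 95% confidence limit) can be reproduced on example data. The sketch below uses made-up displacement values, and the radial-percentile convention at the end is an assumption, not necessarily the authors' exact method.

```python
# A minimal sketch of the audit summary statistics: mean and 1.96*sigma of the
# dwell-position discrepancies measured from film vs. TPS. Values are invented.
import numpy as np

delta_x = np.array([0.9, 0.6, 1.1, 0.7, 0.8, 1.0, 0.5, 0.9])  # mm, hypothetical
delta_y = np.array([0.2, -0.4, 0.8, 0.3, 0.5, -0.1, 0.6, 0.4])  # mm, hypothetical

for label, d in (("dX", delta_x), ("dY", delta_y)):
    mean = d.mean()
    spread = 1.96 * d.std(ddof=1)
    print(f"{label}: mean = {mean:.1f} mm, 1.96 sigma = {spread:.1f} mm")

# One convention (an assumption here) for a single 95% confidence limit on the
# combined radial discrepancy:
radial = np.hypot(delta_x, delta_y)
print(f"95th percentile of radial discrepancy: {np.percentile(radial, 95):.1f} mm")
```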
NASA Astrophysics Data System (ADS)
Moulton, J. D.; Steefel, C. I.; Yabusaki, S.; Castleton, K.; Scheibe, T. D.; Keating, E. H.; Freedman, V. L.
2013-12-01
The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments use a graded and iterative approach, beginning with simplified, highly abstracted models and adding geometric and geologic complexity as understanding is gained. To build confidence in this assessment capability, extensive testing of the underlying tools is needed. Since the tools themselves, such as the subsurface flow and reactive-transport simulator Amanzi, are under active development, testing must be both hierarchical and highly automated. In this presentation we show how we have met these requirements by leveraging the Python-based open-source documentation system Sphinx together with several other open-source tools. Sphinx builds on the reStructuredText tool docutils, with important extensions that include high-quality formatting of equations and integrated plotting through matplotlib. This allows the documentation, as well as the input files for tests, benchmark and tutorial problems, to be maintained with the source code under a version control system. In addition, it enables developers to build documentation in several different formats (e.g., HTML and PDF) from a single source. We will highlight these features and discuss important benefits of this approach for Amanzi. In addition, we will show that some of ASCEM's other tools, such as the sampling provided by the Uncertainty Quantification toolset, are naturally leveraged to enable more comprehensive testing. Finally, we will highlight the integration of this hierarchical testing and documentation framework with our build system and tools (CMake, CTest, and CDash).
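A minimal Sphinx configuration along the lines described (MathJax equations plus matplotlib's plot directive, built to HTML or PDF from one reStructuredText source) might look like the sketch below. The project name and settings are placeholders, not the actual Amanzi/ASCEM configuration.

```python
# conf.py -- a minimal Sphinx setup sketch, illustrative only.
project = "benchmark-docs"
author = "documentation example"

extensions = [
    "sphinx.ext.mathjax",                   # LaTeX-quality equations in HTML output
    "matplotlib.sphinxext.plot_directive",  # run plotting scripts at build time
]

plot_include_source = True  # show the plotting code alongside each generated figure

# Both output formats come from the same reStructuredText sources, e.g.:
#   sphinx-build -b html  source/ build/html
#   sphinx-build -b latex source/ build/latex && make -C build/latex
```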
Can Sanitary Surveys Replace Water Quality Testing? Evidence from Kisii, Kenya
Misati, Aaron Gichaba; Ogendi, George; Peletz, Rachel; Khush, Ranjiv; Kumpel, Emily
2017-01-01
Information about the quality of rural drinking water sources can be used to manage their safety and mitigate risks to health. Sanitary surveys, which are observational checklists to assess hazards present at water sources, are simpler to conduct than microbial tests. We assessed whether sanitary survey results were associated with measured indicator bacteria levels in rural drinking water sources in Kisii Central, Kenya. Overall, thermotolerant coliform (TTC) levels were high: all of the samples from the 20 tested dug wells, almost all (95%) of the samples from the 25 tested springs, and 61% of the samples from the 16 tested rainwater harvesting systems were contaminated with TTC. There were no significant associations between TTC levels and overall sanitary survey scores or their individual components. Contamination by TTC was associated with source type (dug wells and springs were more contaminated than rainwater systems). While sanitary surveys cannot be substituted for microbial water quality results in this context, they could be used to identify potential hazards and contribute to a comprehensive risk management approach. PMID:28178226
Sources of self-efficacy for physical activity.
Warner, Lisa M; Schüz, Benjamin; Wolff, Julia K; Parschau, Linda; Wurm, Susanne; Schwarzer, Ralf
2014-11-01
The effects of self-efficacy beliefs on physical activity are well documented, but much less is known about the origins of self-efficacy beliefs. This article proposes scales to assess the sources of self-efficacy for physical activity and aims to comparatively test their predictive power for physical activity, via self-efficacy, over time in order to detect the principal sources of self-efficacy beliefs for physical activity. A study of 1,406 German adults aged 16-90 years was conducted to construct scales to assess the sources of self-efficacy for physical activity (Study 1). In Study 2, the scales' predictive validity for self-efficacy and physical activity was tested in a sample of 310 older German adults. Short, reliable and valid instruments to measure six sources of self-efficacy for physical activity were developed that enable researchers to comparatively test the predictive value of the sources of self-efficacy. The results suggest that mastery experience, self-persuasion, and reduction in negative affective states are the most important predictors of self-efficacy for physical activity in community-dwelling older adults. PsycINFO Database Record (c) 2014 APA, all rights reserved.
Interior noise in the untreated Gulfstream II Propfan Test Assessment (PTA) aircraft
NASA Technical Reports Server (NTRS)
Kuntz, H. L.; Prydz, R. A.
1989-01-01
Interior noise on the Gulfstream II Propfan Test Assessment (PTA) aircraft was measured using 19 wing, 22 fuselage, and 32 cabin-interior microphones to determine the sources of the cabin noise. Results from ground and flight test acoustic and vibration measurements and analyses show that the major source of cabin noise was the airborne propfan blade passage frequency tones. The radiated sound pressure levels and the richness of the harmonic content of the propfan increased with increasing altitude. The acoustic output of the propfan also depended on the shaft power, helical Mach number, and blade passage frequency.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Amy N; Wendt, Fabian F; Jonkman, Jason
The objective of this paper is to assess the sources of experimental uncertainty in an offshore wind validation campaign focused on better understanding the nonlinear hydrodynamic response behavior of a floating semisubmersible. The test specimen and conditions were simplified compared to other floating wind test campaigns to reduce potential sources of uncertainties and better focus on the hydrodynamic load attributes. Repeat tests were used to understand the repeatability of the test conditions and to assess the level of random uncertainty in the measurements. Attention was also given to understanding bias in all components of the test. The end goal of this work is to set uncertainty bounds on the response metrics of interest, which will be used in future work to evaluate the success of modeling tools in accurately calculating hydrodynamic loads and the associated motion responses of the system.
Building Assessment Survey and Evaluation Study: Summarized Data - Test Space Pollutant Sources
Information collected regarding sources that may potentially affect the building's indoor air quality, including past or current water damage, pesticide application practices, special-use spaces, etc.
NASA Astrophysics Data System (ADS)
Kwon, Hyeokjun; Kang, Yoojin; Jang, Junwoo
2017-09-01
Color fidelity has been used as one of the indices to evaluate the performance of light sources. Since the Color Rendering Index (CRI) was proposed by the CIE, many color fidelity metrics have been proposed to increase the accuracy of the metric. This paper focuses on comparing color fidelity metrics in terms of their agreement with human visual assessments. To visually evaluate the color fidelity of light sources, we built a simulator that reproduces color samples under given lighting conditions. Eighteen color samples of the Macbeth color checker under the test light sources and a reference illuminant for each of them are simulated and displayed on a well-characterized monitor. With only the spectra of the test light source and reference illuminant, color samples under any lighting condition can be reproduced. The spectra of two LED and two OLED light sources that have similar CRI values are used for the visual assessment. In addition, the results of the visual assessment are compared with two color fidelity metrics, CRI and IES TM-30-15 (Rf), the latter proposed by the Illuminating Engineering Society (IES) in 2015. Experimental results indicate that Rf outperforms CRI in terms of correlation with the visual assessment.
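The final comparison, correlating each fidelity metric with the visual assessment, can be sketched as follows. All numbers are placeholders rather than the study's data, and Pearson correlation is assumed as the agreement measure.

```python
# A small illustrative sketch: correlate each fidelity metric (CRI Ra,
# TM-30-15 Rf) with mean visual-assessment scores across the test light
# sources. Values below are invented, not the study's measurements.
import numpy as np
from scipy.stats import pearsonr

light_sources = ["LED-1", "LED-2", "OLED-1", "OLED-2"]
cri_ra = np.array([82.0, 83.0, 82.5, 81.5])    # deliberately similar CRI values
tm30_rf = np.array([80.0, 86.0, 78.0, 84.0])   # hypothetical Rf values
visual_score = np.array([3.1, 4.0, 2.9, 3.8])  # hypothetical mean observer ratings

for name, metric in (("CRI Ra", cri_ra), ("TM-30 Rf", tm30_rf)):
    r, p = pearsonr(metric, visual_score)
    print(f"{name}: Pearson r = {r:.2f} (p = {p:.2f}) with the visual assessment")
```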
A New Generation of Leaching Tests – The Leaching Environmental Assessment Framework
Provides an overview of newly released leaching tests that provide a more accurate source term when estimating environmental release of metals and other constituents of potential concern (COPCs). The Leaching Environmental Assessment Framework (LEAF) methods have been (1) develo...
Evaluation of seismic spatial interaction effects through an impact testing program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, B.D.; Driesen, G.E.
The consequences of non-seismically qualified objects falling and striking essential, seismically qualified objects are analytically difficult to assess. Analytical solutions to impact problems are conservative and only available for simple situations. In a nuclear facility, the numerous "sources" and "targets" requiring evaluation often have complex geometric configurations, which makes calculations and computer modeling difficult. Few industry or regulatory rules are available for this specialized assessment. A drop test program was recently conducted to "calibrate" the judgment of seismic qualification engineers who perform interaction evaluations and to further develop seismic interaction criteria. Impact tests on varying combinations of sources and targets were performed by dropping the sources from various heights onto targets that were connected to instruments. This paper summarizes the scope, test configurations, and some results of the drop test program. Force and acceleration time history data and general observations are presented on the ruggedness of various targets when subjected to impacts from different types of sources.
Uittenbogaard, Annemieke J M; de Deckere, Ernie R J T; Sandel, Maro H; Vis, Alice; Houser, Christine M; de Groot, Bas
2014-06-01
Timely administration of effective antibiotics is important in sepsis management. Source-targeted antibiotics are believed to be most effective, but source identification could cause time delays. The aims were, first, to describe the accuracy and time delays of the diagnostic work-up and their association with time to antibiotics in septic emergency department (ED) patients and, second, to assess the fraction of patients in whom source-targeted antibiotics could have been administered solely on the basis of patient history and physical examination. A secondary analysis of a prospective observational study of septic ED patients was carried out. The time to test result availability was associated with time to antibiotics. The accuracy of the suspected source of infection in the ED was assessed. For patients with pneumosepsis, urosepsis, and abdominal sepsis, combinations of signs and symptoms were assessed to achieve a maximal positive predictive value for the sepsis source, identifying a subset of patients in whom source-targeted antibiotics could be administered without waiting for diagnostic test results. The time to antibiotics increased by 18 min (95% confidence interval: 12-24) per hour of delay in test result availability (n=323). In 38-79% of patients, antibiotics were administered after additional tests, whereas the ED diagnosis was correct in 68-85% of patients. The maximal positive predictive value of signs and symptoms was 0.87 for patients with pneumosepsis and urosepsis and 0.75 for those with abdominal sepsis. Use of signs and symptoms would have led to a correct ED diagnosis in 33% of patients. Diagnostic tests are associated with delayed administration of antibiotics to septic ED patients while increasing the diagnostic accuracy to only 68-85%. In one-third of septic ED patients, the choice of antibiotics could have been accurately determined solely on the basis of patient history and physical examination.
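The positive predictive values reported above follow the standard definition, which the short sketch below spells out; the counts are invented for illustration and are not the study data.

```python
# A minimal sketch of how a positive predictive value for a suspected sepsis
# source is computed; the counts are hypothetical, not the study's data.
def positive_predictive_value(true_positives: int, false_positives: int) -> float:
    """PPV = TP / (TP + FP): fraction of patients flagged by the sign/symptom
    combination whose final diagnosis confirms that source of infection."""
    return true_positives / (true_positives + false_positives)

# Hypothetical example: 87 of 100 patients flagged as pneumosepsis by history
# and physical examination alone turned out to have a pulmonary source.
print(f"PPV (pneumosepsis rule): {positive_predictive_value(87, 13):.2f}")
```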
Using Multiple-Variable Matching to Identify Cultural Sources of Differential Item Functioning
ERIC Educational Resources Information Center
Wu, Amery D.; Ercikan, Kadriye
2006-01-01
Identifying the sources of differential item functioning (DIF) in international assessments is very challenging, because such sources are often nebulous and intertwined. Even though researchers frequently focus on test translation and content area, few actually go beyond these factors to investigate other cultural sources of DIF. This article…
Generation of Alternative Assessment Scores using TEST and online data sources
Alternatives assessment frameworks such as DfE (Design for the Environment) evaluate chemical alternatives in terms of human health effects, ecotoxicity, and fate. T.E.S.T. (Toxicity Estimation Software Tool) can be utilized to evaluate human health in terms of acute oral rat tox...
2017-01-04
…configurations with a restrained manikin, was evaluated in four different test series. Test Series 1 was conducted to determine the materials and…5 ms TTP. Test Series 2 was conducted to determine the materials and drop heights required for energy attenuation of the seat pan to generate a 4 m…
Testing of focal plane arrays at the AEDC
NASA Astrophysics Data System (ADS)
Nicholson, Randy A.; Mead, Kimberly D.; Smith, Robert W.
1992-07-01
A facility was developed at the Arnold Engineering Development Center (AEDC) to provide complete radiometric characterization of focal plane arrays (FPAs). The highly versatile facility provides the capability to test single detectors, detector arrays, and hybrid FPAs. The primary component of the AEDC test facility is the Focal Plane Characterization Chamber (FPCC). The FPCC provides a cryogenic, low-background environment for the test focal plane. Focal plane testing in the FPCC includes flood source testing, during which the array is uniformly irradiated with IR radiation, and spot source testing, during which the target radiation is focused onto a single pixel or group of pixels. During flood source testing, performance parameters such as power consumption, responsivity, noise equivalent input, dynamic range, radiometric stability, recovery time, and array uniformity can be assessed. Crosstalk is evaluated during spot source testing. Spectral response testing is performed in a spectral response test station using a three-grating monochromator. Because the chamber can accommodate several types of testing in a single test installation, a high throughput rate and good economy of operation are possible.
ERIC Educational Resources Information Center
Jukes, Matthew C. H.; Grigorenko, Elena L.
2010-01-01
Background: The use of cognitive tests is increasing in Africa but little is known about how such tests are affected by the great ethnic and linguistic diversity on the continent. Aim: To assess ethnic and linguistic group differences in cognitive test performance in the West African country of the Gambia and to investigate the sources of these…
A field test of emulsified zero valent iron (EZVI) nanoparticles was conducted at Parris Island, SC, USA and was monitored for two and half years to assess the treatment of subsurface-source zone chlorinated volatile organic compounds (CVOCs) dominated by tetrachloroethene (PCE) ...
ERIC Educational Resources Information Center
Young, John Q.; Lieu, Sandra; O'Sullivan, Patricia; Tong, Lowell
2011-01-01
Objective: The authors developed and tested the feasibility and utility of a new direct-observation instrument to assess trainee performance of a medication management session. Methods: The Psychopharmacotherapy-Structured Clinical Observation (P-SCO) instrument was developed based on multiple sources of expertise and then implemented in 4…
Piolino, Pascale; Lamidey, Virginie; Desgranges, Béatrice; Eustache, Francis
2007-01-01
Fifty-two subjects between ages 40 and 79 years were administered a questionnaire assessing their ability to recall semantic information about famous people from 4 different decades and to recollect its episodic source of acquisition together with autonoetic consciousness via the remember-know paradigm. In addition, they underwent a battery of standardized neuropsychological tests to assess episodic and semantic memory and executive functions. The analyses of age reveal differences for the episodic source score but no differences between age groups for the semantic scores within each decade. Regardless of participants' age, the analyses also show that semantic memory subcomponents of the famous person test are highly associated with each other as well as with the source component. The recall of semantic information on the famous person test relies on participants' semantic abilities, whereas the recall of its episodic source depends on their executive functions. The present findings confirm the existence of an episodic-semantic distinction in knowledge about famous people. They provide further evidence that personal source and semantic information are at once distinct and highly interactive within the framework of remote memory. (c) 2007 APA, all rights reserved.
2012-06-01
Figure: Source Compositions for HPS Dataset; Figure 25: Comparison of Source Apportionment for HPS Dataset. The similarity in the three source patterns from HPS makes the apportionment less certain at that site compared to the four source patterns at…apportionment of these sources across the site. Overall these techniques passed all the performance assessment tests that are presented in Section 6.
2008-01-01
on such tests as the Embedded Figures Test (EFT) (Witkin et al., 1971) or the Rod and Frame Test (RFT) (Witkin, Dyk, Faterson, Goodenough, & Karp…one starts to tap sources of individual differences measured little or not at all by such tests. Thus, when assessing intelligence, it is important to…in requiring verbal skills or the ability to analyze one's own ideas-Sternberg & Lubart, 1995) but also tap skills beyond those measured even by
NASA Astrophysics Data System (ADS)
Woodward, Richard P.; Loeffler, Irvin J.
1993-04-01
Flight tests to define the far-field tone source at cruise conditions were completed on the full-scale SR-7L advanced turboprop that was installed on the left wing of a Gulfstream 2 aircraft. This program, designated Propfan Test Assessment (PTA), involved aeroacoustic testing of the propeller over a range of test conditions. These measurements defined source levels for input into long-distance propagation models to predict en route noise. In-flight data were taken for seven test cases. Near-field acoustic data were taken on the Gulfstream fuselage and on a microphone boom that was mounted on the Gulfstream wing outboard of the propeller. Far-field acoustic data were taken by an acoustically instrumented Learjet that flew in formation with the Gulfstream. These flight tests were flown from El Paso, Texas, and from the NASA Lewis Research Center. A comprehensive listing of the aeroacoustic results from these flight tests which may be used for future analysis are presented.
NASA Technical Reports Server (NTRS)
Woodward, Richard P.; Loeffler, Irvin J.
1993-01-01
Flight tests to define the far-field tone source at cruise conditions were completed on the full-scale SR-7L advanced turboprop that was installed on the left wing of a Gulfstream 2 aircraft. This program, designated Propfan Test Assessment (PTA), involved aeroacoustic testing of the propeller over a range of test conditions. These measurements defined source levels for input into long-distance propagation models to predict en route noise. In-flight data were taken for seven test cases. Near-field acoustic data were taken on the Gulfstream fuselage and on a microphone boom that was mounted on the Gulfstream wing outboard of the propeller. Far-field acoustic data were taken by an acoustically instrumented Learjet that flew in formation with the Gulfstream. These flight tests were flown from El Paso, Texas, and from the NASA Lewis Research Center. A comprehensive listing of the aeroacoustic results from these flight tests which may be used for future analysis are presented.
ERIC Educational Resources Information Center
Mendes-Barnett, Sharon; Ercikan, Kadriye
2006-01-01
This study contributes to understanding sources of gender differential item functioning (DIF) on mathematics tests. This study focused on identifying sources of DIF and differential bundle functioning for boys and girls on the British Columbia Principles of Mathematics Exam (Grade 12) using a confirmatory SIBTEST approach based on a…
Johnson, Ian; Hutchings, Matt; Benstead, Rachel; Thain, John; Whitehouse, Paul
2004-07-01
In the UK Direct Toxicity Assessment Programme, carried out in 1998-2000, a series of internationally recognised short-term toxicity test methods for algae, invertebrates and fishes, and rapid methods (ECLOX and Microtox) were used extensively. Abbreviated versions of conventional tests (algal growth inhibition tests, Daphnia magna immobilisation test and the oyster embryo-larval development test) were valuable for toxicity screening of effluent discharges and the identification of causes and sources of toxicity. Rapid methods based on chemiluminescence and bioluminescence were not generally useful in this programme, but may have a role where the rapid test has been shown to be an acceptable surrogate for a standardised test method. A range of quality assurance and control measures were identified. Requirements for quality control/assurance are most stringent when deriving data for characterising the toxic hazards of effluents and monitoring compliance against a toxicity reduction target. Lower quality control/assurance requirements can be applied to discharge screening and the identification of causes and sources of toxicity.
Brzonkalik, Katrin; Herrling, Tanja; Syldatk, Christoph; Neumann, Anke
2011-05-27
The aim of this study was to determine the influence of different carbon and nitrogen sources on the production of the mycotoxins alternariol (AOH), alternariol monomethyl ether (AME) and tenuazonic acid (TA) by Alternaria alternata at 28°C using a semi-synthetic medium (modified Czapek-Dox broth) supplemented with nitrogen and carbon sources. Additionally the effect of shaken and static cultivation on mycotoxin production was tested. Initial experiments showed a clear dependency between nitrogen depletion and mycotoxin production. To assess whether nitrogen limitation in general or the type of nitrogen source triggers the production, various nitrogen sources including several ammonium/nitrate salts and amino acids were tested. In static culture the production of AOH/AME can be enhanced greatly with phenylalanine whereas some nitrogen sources seem to inhibit the AOH/AME production completely. TA was not significantly affected by the choice of nitrogen source. In shaken culture the overall production of all mycotoxins was lower compared to static cultivation. Furthermore tests with a wide variety of carbon sources including monosaccharides, disaccharides, complex saccharides such as starch as well as glycerol and acetate were performed. In shaken culture AOH was produced when glucose, fructose, sucrose, acetate or mixtures of glucose/sucrose and glucose/acetate were used as carbon sources. AME production was not detected. The use of sodium acetate resulted in the highest AOH production. In static culture AOH production was also stimulated by acetate and the amount is comparable to shaken conditions. Under static conditions production of AOH was lower except when cultivated with acetate. In static cultivation 9 of 14 tested carbon sources induced mycotoxin production compared to 4 in shaken culture. This is the first study which analyses the influence of carbon and nitrogen sources in a semi-synthetic medium and assesses the effects of culture conditions on mycotoxin production by A. alternata. Copyright © 2011 Elsevier B.V. All rights reserved.
Source term model evaluations for the low-level waste facility performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yim, M.S.; Su, S.I.
1995-12-31
The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.
Fan Noise Prediction with Applications to Aircraft System Noise Assessment
NASA Technical Reports Server (NTRS)
Nark, Douglas M.; Envia, Edmane; Burley, Casey L.
2009-01-01
This paper describes an assessment of current fan noise prediction tools by comparing measured and predicted sideline acoustic levels from a benchmark fan noise wind tunnel test. Specifically, an empirical method and newly developed coupled computational approach are utilized to predict aft fan noise for a benchmark test configuration. Comparisons with sideline noise measurements are performed to assess the relative merits of the two approaches. The study identifies issues entailed in coupling the source and propagation codes, as well as provides insight into the capabilities of the tools in predicting the fan noise source and subsequent propagation and radiation. In contrast to the empirical method, the new coupled computational approach provides the ability to investigate acoustic near-field effects. The potential benefits/costs of these new methods are also compared with the existing capabilities in a current aircraft noise system prediction tool. The knowledge gained in this work provides a basis for improved fan source specification in overall aircraft system noise studies.
St-Jacques, Sylvie; Grenier, Sonya; Charland, Marc; Forest, Jean-Claude; Rousseau, François; Légaré, France
2008-12-01
To identify decisional needs of women, their partners and health professionals regarding prenatal testing for Down syndrome through a systematic review. Articles reporting original data from real clinical situations on sources of difficulty and/or ease in making decisions regarding prenatal testing for Down syndrome were selected. Data were extracted using a taxonomy adapted from the Ottawa Decision-Support Framework and the quality of the studies was assessed using Qualsyst validated tools. In all, 40 publications covering 32 unique studies were included. The majority concerned women. The most often reported sources of difficulty for decision-making in women were pressure from others, emotions and lack of information; in partners, emotion; in health professionals, lack of information, length of consultation, and personal values. The most important sources of ease were, in women, personal values, understanding and confidence in the medical system; in partners, personal values, information from external sources, and income; in health professionals, peer support and scientific meetings. Interventions regarding a decision about prenatal testing for Down syndrome should address many decisional needs, which may indeed vary among the parties involved, whether women, their partners or health professionals. Very little is known about the decisional needs of partners and health professionals.
Teacher Assessment in Wales--The TAPS Cymru Project
ERIC Educational Resources Information Center
Jones, Bethan; Coakley, Ruth; Fenn, Lisa; Earle, Sarah; Davies, Dan
2018-01-01
Making accurate, manageable assessments of children's scientific understanding, skills and progress is one of the biggest challenges facing primary teachers. In Wales, where Statutory Assessment Tests (SATs) at age 11 were phased out in 2005, teacher assessment has been the only source of pupil attainment data in science for a much longer period…
Web-Based Portfolio Assessment: Validation of an Open Source Platform
ERIC Educational Resources Information Center
Collins, Regina; Elliot, Norbert; Klobucar, Andrew; Deek, Fadi P.
2013-01-01
Assessment of educational outcomes through purchased tests is commonplace in the evaluation of individual student ability and of educational programs. Focusing on the assessment of writing performance in a longitudinal study of first-time, full-time students (n = 598), this research describes the design, use, and assessment of an open-source…
Technical assessment for quality control of resins
NASA Technical Reports Server (NTRS)
Gosnell, R. B.
1977-01-01
Survey visits to companies involved in the manufacture and use of graphite-epoxy prepregs were conducted to assess the factors which may contribute to variability in the mechanical properties of graphite-epoxy composites. In particular, the purpose was to assess the contributions of the epoxy resins to variability. Companies represented three segments of the composites industry - aircraft manufacturers, prepreg manufacturers, and epoxy resin manufacturers. Several important sources of performance variability were identified from among the complete spectrum of potential sources which ranged from raw materials to composite test data interpretation.
Final Environmental Assessment/Overseas Environmental Assessment for Flight Experiment 1 (FE-1)
2017-08-01
bird habitat. A crater would form as a result of this impact and leave debris that would need to be recovered. Post-test debris recovery and…sources. Ozone, NO2, and some particulates are formed through atmospheric chemical reactions that are influenced by weather, ultraviolet light…combined emissions rate representing all GHGs. Under the rule, suppliers of fossil fuels or industrial GHGs, manufacturers of mobile sources and
Access to safe water in rural Artibonite, Haiti 16 months after the onset of the cholera epidemic.
Patrick, Molly; Berendes, David; Murphy, Jennifer; Bertrand, Fabienne; Husain, Farah; Handzel, Thomas
2013-10-01
Haiti has the lowest improved water and sanitation coverage in the Western Hemisphere and is suffering from the largest cholera epidemic on record. In May of 2012, an assessment was conducted in rural areas of the Artibonite Department to describe the type and quality of water sources and determine knowledge, access, and use of household water treatment products to inform future programs. It was conducted after emergency response was scaled back but before longer-term water, sanitation, and hygiene activities were initiated. The household survey and source water quality analysis documented low access to safe water, with only 42.3% of households using an improved drinking water source. One-half (50.9%) of the improved water sources tested positive for Escherichia coli. Of households with water to test, 12.7% had positive chlorine residual. The assessment reinforces the identified need for major investments in safe water and sanitation infrastructure and the importance of household water treatment to improve access to safe water in the near term.
Soleimanifar, Manijeh; Karimi, Noureddin; Arab, Amir Massoud
2017-04-01
The sacroiliac joint (SIJ) has been implicated as a potential source of low back and buttock pain. Several types of motion palpation and pain provocation tests are used to evaluate SIJ dysfunction. The purpose of this study was to investigate the relationship between motion palpation and pain provocation tests in the assessment of SIJ problems. This was a descriptive correlational study in which 50 patients between the ages of 20 and 65 participated. Four motion palpation tests (sitting flexion, standing flexion, prone knee flexion, Gillet test) and three pain provocation tests (FABER, posterior shear, resisted abduction test) were examined. Chi-square analysis was used to assess the relationship between the results of the individual tests and composites of these two groups of tests. No significant relationship was found between the two groups of tests. It seems that motion palpation tests assess SIJ dysfunction whereas provocation tests assess SIJ pain, and these do not appear to be related. Copyright © 2016 Elsevier Ltd. All rights reserved.
Testing in America's Schools. Policy Information Report.
ERIC Educational Resources Information Center
Barton, Paul E.; Coley, Richard J.
This report provides a profile of state testing programs in 1992-93, as well as a view of classroom testing practices by state, school district, school, or individual teacher. Information, taken from a variety of sources, including the National Assessment of Educational Progress and a General Accounting Office study, indicates that the…
Tarrant, Marie; Knierim, Aimee; Hayes, Sasha K; Ware, James
2006-12-01
Multiple-choice questions are a common assessment method in nursing examinations. Few nurse educators, however, have formal preparation in constructing multiple-choice questions. Consequently, questions used in baccalaureate nursing assessments often contain item-writing flaws, or violations of accepted item-writing guidelines. In one nursing department, 2770 MCQs were collected from tests and examinations administered over a five-year period from 2001 to 2005. Questions were evaluated for 19 frequently occurring item-writing flaws, for cognitive level, for question source, and for the distribution of correct answers. Results show that almost half (46.2%) of the questions contained violations of item-writing guidelines and over 90% were written at low cognitive levels. Only a small proportion of questions were teacher-generated (14.1%), while 36.2% were taken from testbanks and almost half (49.4%) had no source identified. MCQs written at a lower cognitive level were significantly more likely to contain item-writing flaws. While there was no relationship between the source of the question and item-writing flaws, teacher-generated questions were more likely to be written at higher cognitive levels (p<0.001). Correct answers were evenly distributed across all four options and no bias was noted in the placement of correct options. Further training in item-writing is recommended for all faculty members who are responsible for developing tests. Pre-test review and quality assessment are also recommended to reduce the occurrence of item-writing flaws and to improve the quality of test questions.
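Associations such as "lower cognitive level items were more likely to contain flaws" are typically checked with a contingency-table chi-square test, as sketched below; the cell counts are hypothetical, not the 2770-item dataset.

```python
# A brief sketch of the contingency-table test used for such item analyses
# (cognitive level vs. presence of item-writing flaws). Counts are invented.
import numpy as np
from scipy.stats import chi2_contingency

#                 flawed   not flawed
table = np.array([[1150,       1350],   # low cognitive level
                  [ 130,        140]])  # higher cognitive level

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, dof = {dof}, p = {p:.3f}")
# A small p-value would indicate that flaw frequency differs by cognitive level.
```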
Dubinsky, Eric A; Butkus, Steven R; Andersen, Gary L
2016-11-15
Sources of fecal indicator bacteria are difficult to identify in watersheds that are impacted by a variety of non-point sources. We developed a molecular source tracking test using the PhyloChip microarray that detects and distinguishes fecal bacteria from humans, birds, ruminants, horses, pigs and dogs with a single test. The multiplexed assay targets 9001 different 25-mer fragments of 16S rRNA genes that are common to the bacterial community of each source type. Both random forests and SourceTracker were tested as discrimination tools, with SourceTracker classification producing superior specificity and sensitivity for all source types. Validation with 12 different mammalian sources in mixtures found 100% correct identification of the dominant source and 84-100% specificity. The test was applied to identify sources of fecal indicator bacteria in the Russian River watershed in California. We found widespread contamination by human sources during the wet season proximal to settlements with antiquated septic infrastructure and during the dry season at beaches during intense recreational activity. The test was more sensitive than common fecal indicator tests that failed to identify potential risks at these sites. Conversely, upstream beaches and numerous creeks with less reliance on onsite wastewater treatment contained no fecal signal from humans or other animals; however these waters did contain high counts of fecal indicator bacteria after rain. Microbial community analysis revealed that increased E. coli and enterococci at these locations did not co-occur with common fecal bacteria, but rather co-varied with copiotrophic bacteria that are common in freshwaters with high nutrient and carbon loading, suggesting runoff likely promoted the growth of environmental strains of E. coli and enterococci. These results indicate that machine-learning classification of PhyloChip microarray data can outperform conventional single marker tests that are used to assess health risks, and is an effective tool for distinguishing numerous fecal and environmental sources of pathogen indicators. Copyright © 2016 Elsevier Ltd. All rights reserved.
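A much-simplified sketch of microarray-based source classification in the spirit described (a random forest over per-probe intensities) is shown below. It is not the PhyloChip/SourceTracker pipeline, and both the training profiles and the unknown sample are randomly generated.

```python
# An illustrative sketch only: classify an unknown water sample's probe-intensity
# profile against known fecal sources with a random forest. Data are random.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
n_probes = 9001  # one feature per source-diagnostic 25-mer probe, as in the abstract
sources = ["human", "bird", "ruminant", "horse", "pig", "dog"]

# Hypothetical training data: probe-intensity profiles of known fecal samples.
X_train = rng.random((120, n_probes))
y_train = rng.choice(sources, size=120)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# Report class probabilities for an unknown sample, analogous to asking which
# fecal source(s) dominate the signal in a water sample.
unknown = rng.random((1, n_probes))
for source, prob in zip(clf.classes_, clf.predict_proba(unknown)[0]):
    print(f"{source:9s} {prob:.2f}")
```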
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-13
... level); and (b) Pre- and post- tests to assess participant knowledge. (c) Monthly activity logs from... activity, and (c) increased ability to identify healthier food options (increase in post test scores vs... (increase in post test scores vs. pre test scores). Year Three Data (2010-2011) of AI/AN Children 1762 (839...
Multiple Sources of Test Bias on the WISC-R and Bender-Gestalt Test.
ERIC Educational Resources Information Center
Oakland, Thomas; Feigenbaum, David
1979-01-01
Assessed test bias on the Wechsler Intelligence Scale for Children-Revised (WISC-R) and the Bender-Gestalt. On the Bender, evidence of bias was infrequent and irregular. On the WISC-R, group differences were most discernible for age, sex, family structure, and race. Consistent patterns of bias were not apparent among comparison groups. (Author)
Generation of GHS Scores from TEST and online sources ...
Alternatives assessment frameworks such as DfE (Design for the Environment) evaluate chemical alternatives in terms of human health effects, ecotoxicity, and fate. T.E.S.T. (Toxicity Estimation Software Tool) can be utilized to evaluate human health in terms of acute oral rat toxicity, developmental toxicity, endocrine activity, and mutagenicity. It can be used to evaluate ecotoxicity (in terms of acute fathead minnow toxicity) and fate (in terms of bioconcentration factor). It can also be used to estimate a variety of key physicochemical properties such as melting point, boiling point, vapor pressure, water solubility, and bioconcentration factor. A web-based version of T.E.S.T. is currently being developed to allow predictions to be made from other web tools. Online data sources such as NCCT's Chemistry Dashboard, REACH dossiers, or ChemHat.org can also be utilized to obtain GHS (Globally Harmonized System) scores for comparing alternatives. The purpose of this talk is to show how GHS data can be obtained from literature sources and from T.E.S.T. (Toxicity Estimation Software Tool). These data will be used to compare chemical alternatives in the alternatives assessment dashboard (a 2018 CSS product).
Evaluating online direct-to-consumer marketing of genetic tests: informed choices or buyers beware?
Geransar, Rose; Einsiedel, Edna
2008-03-01
Commercialization of genetic technologies is expanding the horizons for the marketing and sales of genetic tests direct-to-consumers (DTCs). This study assesses the information provision and access requirements that are in place for genetic tests that are being advertised DTC over the Internet. Sets of key words specific to DTC genetic testing were entered into popular Internet search engines to generate a list of 24 companies engaging in DTC advertising. Company requirements for physician mediation, genetic counseling arrangements, and information provision were coded to develop categories for quantitative analysis within each variable. Results showed that companies offering risk assessment and diagnostic testing were most likely to require that testing be mediated by a clinician, and to recommend physician-arranged counseling. Companies offering enhancement testing were less likely to require physician mediation of services and more likely to provide long-distance genetic counseling. DTC advertisements often provided information on disease etiology; this was most common in the case of multifactorial diseases. The majority of companies cited outside sources to support the validity of claims about clinical utility of the tests being advertised; companies offering risk assessment tests most frequently cited all information sources. DTC advertising for genetic tests that lack independent professional oversight raises troubling questions about appropriate use and interpretation of these tests by consumers and carries implications for the standards of patient care. These implications are discussed in the context of a public healthcare system.
Demonstration of a Small Modular BioPower System Using Poultry Litter
DOE Office of Scientific and Technical Information (OSTI.GOV)
John P. Reardon; Art Lilley; Jim Wimberly
2002-05-22
The purpose of this project was to assess poultry grower residue, or litter (manure plus absorbent biomass), as a fuel source for Community Power Corporation's small modular biopower system (SMB). A second objective was to assess the poultry industry to identify potential 'on-site' applications of the SMB system using poultry litter residue as a fuel source, and to adapt CPC's existing SMB to generate electricity and heat from the poultry litter biomass fuel. Bench-scale testing and pilot testing were used to gain design information for the SMB retrofit. A system design approach for the Phase II application of the SMB was the goal of Phase I testing. Cost estimates for an onsite poultry litter SMB were prepared. Finally, a market estimate was prepared for implementation of the on-farm SMB using poultry litter.
Testing a Neurocomputational Model of Recollection, Familiarity, and Source Recognition
ERIC Educational Resources Information Center
Elfman, Kane W.; Parks, Colleen M.; Yonelinas, Andrew P.
2008-01-01
The authors assess whether the complementary learning systems model of the medial temporal lobes (Norman & O'Reilly, 2003) is able to account for source recognition receiver operating characteristics (ROCs). The model assumes that recognition reflects the contribution of a hippocampally mediated recollection process and a cortically mediated…
Evaluation of phosphorus site assessment tools: lessons from the USA
USDA-ARS?s Scientific Manuscript database
Critical source area identification through phosphorus (P) site assessment is a fundamental part of modern nutrient management planning in the U.S. To date, the P Index has been the primary tool for P site assessment adopted by US states, but there has been only patchy testing of the many versions ...
Applying a Web and Simulation-Based System for Adaptive Competence Assessment of Spinal Anaesthesia
NASA Astrophysics Data System (ADS)
Hockemeyer, Cord; Nussbaumer, Alexander; Lövquist, Erik; Aboulafia, Annette; Breen, Dorothy; Shorten, George; Albert, Dietrich
The authors present an approach for implementing a system for the assessment of medical competences using a haptic simulation device. Based on Competence based Knowledge Space Theory (CbKST), information on the learners’ competences is gathered from different sources (test questions, data from the simulator, and supervising experts’ assessments).
Inflight source noise of an advanced full-scale single-rotation propeller
NASA Technical Reports Server (NTRS)
Woodward, Richard P.; Loeffler, Irvin J.
1991-01-01
Flight tests to define the far-field tone source at cruise conditions were completed on the full-scale SR-7L advanced turboprop, which was installed on the left wing of a Gulfstream II aircraft. This program, designated Propfan Test Assessment (PTA), involved aeroacoustic testing of the propeller over a range of test conditions. These measurements defined source levels for input into long-distance propagation models to predict en route noise. Inflight data were taken for 7 test cases. The sideline directivities measured by the Learjet showed expected maximum levels near 105 degrees from the propeller upstream axis. However, azimuthal directivities based on the maximum observed sideline tone levels showed highest levels below the aircraft. An investigation of the effect of propeller tip speed showed that the tone level reduction associated with reductions in propeller tip speed is more significant in the horizontal plane than below the aircraft.
Equivalent magnetic vector potential model for low-frequency magnetic exposure assessment
NASA Astrophysics Data System (ADS)
Diao, Y. L.; Sun, W. N.; He, Y. Q.; Leung, S. W.; Siu, Y. M.
2017-10-01
In this paper, a novel source model based on a magnetic vector potential for the assessment of induced electric field strength in a human body exposed to the low-frequency (LF) magnetic field of an electrical appliance is presented. The construction of the vector potential model requires only a single-component magnetic field to be measured close to the appliance under test, hence relieving considerable practical measurement effort—the radial basis functions (RBFs) are adopted for the interpolation of discrete measurements; the magnetic vector potential model can then be directly constructed by summing a set of simple algebraic functions of RBF parameters. The vector potentials are then incorporated into numerical calculations as the equivalent source for evaluations of the induced electric field in the human body model. The accuracy and effectiveness of the proposed model are demonstrated by comparing the induced electric field in a human model to that of the full-wave simulation. This study presents a simple and effective approach for modelling the LF magnetic source. The result of this study could simplify the compliance test procedure for assessing an electrical appliance regarding LF magnetic exposure.
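The interpolation step named in the abstract, fitting radial basis functions to discrete single-component field measurements, can be sketched with scipy's RBFInterpolator. The measurement points and field values below are synthetic, and the closed-form construction of the vector potential from the RBF parameters is not reproduced.

# Sketch of the interpolation step: fit radial basis functions to discrete
# single-component magnetic-field measurements taken near the appliance.
# Points and values are synthetic stand-ins.
import numpy as np
from scipy.interpolate import RBFInterpolator

pts = np.random.default_rng(1).uniform(-0.1, 0.1, size=(50, 3))  # measurement sites (m)
bz = np.exp(-np.sum(pts**2, axis=1) / 0.01)                      # stand-in Bz values (T)

rbf = RBFInterpolator(pts, bz, kernel="thin_plate_spline")
grid = np.mgrid[-0.1:0.1:20j, -0.1:0.1:20j, 0.05:0.05:1j].reshape(3, -1).T
bz_grid = rbf(grid)                        # interpolated field on an evaluation grid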
ERIC Educational Resources Information Center
Blitz, Mark H.; Modeste, Marsha
2015-01-01
The Comprehensive Assessment of Leadership for Learning (CALL) is a multi-source assessment of distributed instructional leadership. As part of the validation of CALL, researchers examined differences between teacher and leader ratings in assessing distributed leadership practices. The authors utilized a t-test for equality of means for the…
Computational Approaches and Tools for Exposure Prioritization and Biomonitoring Data Interpretation
The ability to describe the source-environment-exposure-dose-response continuum is essential for identifying exposures of greater concern to prioritize chemicals for toxicity testing or risk assessment, as well as for interpreting biomarker data for better assessment of exposure ...
Xu, Jiao; Shi, Guo-Liang; Guo, Chang-Sheng; Wang, Hai-Ting; Tian, Ying-Ze; Huangfu, Yan-Qi; Zhang, Yuan; Feng, Yin-Chang; Xu, Jian
2018-01-01
A hybrid model based on the positive matrix factorization (PMF) model and the health risk assessment model for assessing risks associated with sources of perfluoroalkyl substances (PFASs) in water was established and applied at Dianchi Lake to test its applicability. The new method contains 2 stages: 1) the sources of PFASs were apportioned by the PMF model and 2) the contribution of health risks from each source was calculated by the new hybrid model. Two factors were extracted by PMF, with factor 1 identified as aqueous fire-fighting foams source and factor 2 as fluoropolymer manufacturing and processing and perfluorooctanoic acid production source. The health risk of PFASs in the water assessed by the health risk assessment model was 9.54 × 10⁻⁷ a⁻¹ on average, showing no obvious adverse effects to human health. The 2 sources' risks estimated by the new hybrid model ranged from 2.95 × 10⁻¹⁰ to 6.60 × 10⁻⁶ a⁻¹ and from 1.64 × 10⁻⁷ to 1.62 × 10⁻⁶ a⁻¹, respectively. The new hybrid model can provide useful information on the health risks of PFAS sources, which is helpful for pollution control and environmental management. Environ Toxicol Chem 2018;37:107-115. © 2017 SETAC.
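The second stage of the hybrid model, attributing the assessed risk to PMF-resolved sources, can be illustrated under a strong simplification in which each source's risk scales with its contribution to each congener. All numbers below are placeholders and the risk model is reduced to per-congener slope factors; this is an illustration of the apportionment bookkeeping, not the published model.

# Sketch of the hybrid idea: apportion a concentration-driven health risk to
# PMF-resolved sources by weighting each source's contribution. Values are
# placeholders and the risk model is reduced to simple per-congener factors.
import numpy as np

conc = np.array([12.0, 8.0, 5.0])               # PFAS congener concentrations (ng/L)
contrib = np.array([[0.7, 0.4, 0.2],            # source 1 share per congener (from PMF)
                    [0.3, 0.6, 0.8]])           # source 2 share per congener
unit_risk = np.array([2e-8, 1e-8, 3e-8])        # assumed per-congener risk per ng/L

total_risk = float(np.sum(conc * unit_risk))
source_risk = contrib @ (conc * unit_risk)      # per-source annual risk
print(total_risk, source_risk)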
2014 Assessment of the Ballistic Missile Defense System (BMDS)
2015-03-23
for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of...take several more years to collect the test data needed to adequately VV&A the BMDS M&S required to perform such assessments. As data are collected ...Accreditation is possible only if a sufficient quantity and quality of flight test data have been collected to support model verification and
Adverse outcome pathways (AOPs) to enhance EDC ...
Screening and testing for endocrine active chemicals was mandated under 1996 amendments to the Safe Drinking Water Act and Food Quality Protection Act. Efficiencies can be gained in the endocrine disruptor screening program by using available biological and toxicological knowledge to facilitate greater use of high throughput screening data and other data sources to inform endocrine disruptor assessments. Likewise, existing knowledge, when properly organized, can help aid interpretation of test results. The adverse outcome pathway (AOP) framework, which organizes information concerning measureable changes that link initial biological interactions with a chemical to adverse effects that are meaningful to risk assessment and management, can aid this process. This presentation outlines the ways in which the AOP framework has already been employed to support EDSP and how it may further enhance endocrine disruptor assessments in the future. Screening and testing for endocrine active chemicals was mandated under 1996 amendments to the Safe Drinking Water Act and Food Quality Protection Act. Efficiencies can be gained in the endocrine disruptor screening program by using available biological and toxicological knowledge to facilitate greater use of high throughput screening data and other data sources to inform endocrine disruptor assessments. Likewise, existing knowledge, when properly organized, can help aid interpretation of test results. The adverse outcome pathway
Deal, Shanley B; Lendvay, Thomas S; Haque, Mohamad I; Brand, Timothy; Comstock, Bryan; Warren, Justin; Alseidi, Adnan
2016-02-01
Objective, unbiased assessment of surgical skills remains a challenge in surgical education. We sought to evaluate the feasibility and reliability of Crowd-Sourced Assessment of Technical Skills. Seven volunteer general surgery interns were given time for training and then testing on laparoscopic peg transfer, precision cutting, and intracorporeal knot-tying. Six faculty experts (FEs) and 203 Amazon.com Mechanical Turk crowd workers (CWs) evaluated 21 deidentified video clips using the Global Objective Assessment of Laparoscopic Skills validated rating instrument. We received 662 eligible ratings from the 203 CWs within 19 hours and 15 minutes, and 126 ratings from the 6 FEs over 10 days. FE video ratings were of borderline internal consistency (Krippendorff's alpha = .55). FE ratings were highly correlated with CW ratings (Pearson's correlation coefficient = .78, P < .001). We propose the use of Crowd-Sourced Assessment of Technical Skills as a reliable, basic tool to standardize the evaluation of technical skills in general surgery. Copyright © 2016 Elsevier Inc. All rights reserved.
Palazón, L; Navas, A
2017-06-01
Information on sediment contribution and transport dynamics from the contributing catchments is needed to develop management plans to tackle environmental problems related to the effects of fine sediment, such as reservoir siltation. In this respect, the fingerprinting technique is an indirect technique known to be valuable and effective for sediment source identification in river catchments. Large variability in sediment delivery was found in previous studies in the Barasona catchment (1509 km², Central Spanish Pyrenees). Simulation results with SWAT and fingerprinting approaches identified badlands and agricultural uses as the main contributors to sediment supply in the reservoir. In this study the <63 μm sediment fraction from the surface reservoir sediments (2 cm) is investigated following the fingerprinting procedure to assess how the use of different statistical procedures affects the amounts of source contributions. Three optimum composite fingerprints were selected to discriminate between source contributions based on land uses/land covers from the same dataset by the application of (1) discriminant function analysis, and its combination (as a second step) with (2) the Kruskal-Wallis H-test and (3) principal components analysis. Source contribution results differed between the assessed options, with the greatest differences observed for option (3), the two-step process of principal components analysis and discriminant function analysis. The characteristics of the solutions from the applied mixing model and the conceptual understanding of the catchment showed that the most reliable solution was achieved using option (2), the two-step process of Kruskal-Wallis H-test and discriminant function analysis. The assessment showed the importance of the statistical procedure used to define the optimum composite fingerprint for sediment fingerprinting applications. Copyright © 2016 Elsevier Ltd. All rights reserved.
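A two-step tracer selection in the spirit of option (2), Kruskal-Wallis screening followed by discriminant function analysis, might look like the sketch below. Tracer values, group labels, and thresholds are synthetic assumptions rather than the Barasona dataset.

# Sketch of a two-step composite-fingerprint selection: screen tracers that
# separate the source groups (Kruskal-Wallis), then feed the survivors to a
# discriminant analysis. Tracer values and group labels are synthetic.
import numpy as np
from scipy.stats import kruskal
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
groups = np.repeat(["badland", "forest", "agric"], 15)
tracers = rng.normal(size=(45, 10)) + np.arange(10) * (groups == "badland")[:, None] * 0.5

keep = [j for j in range(tracers.shape[1])
        if kruskal(*(tracers[groups == g, j] for g in np.unique(groups))).pvalue < 0.05]

lda = LinearDiscriminantAnalysis().fit(tracers[:, keep], groups)
print(keep, lda.score(tracers[:, keep], groups))   # selected tracers, reclassification rate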
Test Area C-64 Range Environmental Assessment, Revision 1
2010-10-01
DOI U.S. Department of the Interior DNL Day–Night Average Sound Level DU Depleted Uranium EBD Environmental Baseline Document EIAP Environmental...vulnerability, burning sensitivity, drop tests, bullet impact tests, sympathetic detonation tests, advanced warhead design tests, and depleted uranium (DU...land back to range use. Source: U.S. Air Force, 2009 DU = depleted uranium ; ERP = Environmental Restoration Program; LUC = land use control; RW
Dorman, Michael F; Natale, Sarah; Loiselle, Louise
2018-03-01
Sentence understanding scores for patients with cochlear implants (CIs) when tested in quiet are relatively high. However, sentence understanding scores for patients with CIs plummet with the addition of noise. To assess, for patients with CIs (MED-EL), (1) the value to speech understanding of two new, noise-reducing microphone settings and (2) the effect of the microphone settings on sound source localization. Single-subject, repeated measures design. For tests of speech understanding, repeated measures on (1) number of CIs (one, two), (2) microphone type (omni, natural, adaptive beamformer), and (3) type of noise (restaurant, cocktail party). For sound source localization, repeated measures on type of signal (low-pass [LP], high-pass [HP], broadband noise). Ten listeners, ranging in age from 48 to 83 yr (mean = 57 yr), participated in this prospective study. Speech understanding was assessed in two noise environments using monaural and bilateral CIs fit with three microphone types. Sound source localization was assessed using three microphone types. In Experiment 1, sentence understanding scores (in terms of percent words correct) were obtained in quiet and in noise. For each patient, noise was first added to the signal to drive performance off the ceiling in the bilateral CI-omni microphone condition. The other conditions were then administered at that signal-to-noise ratio in quasi-random order. In Experiment 2, sound source localization accuracy was assessed for three signal types using a 13-loudspeaker array over a 180° arc. The dependent measure was root-mean-square error. Both the natural and adaptive microphone settings significantly improved speech understanding in the two noise environments. The magnitude of the improvement varied between 16 and 19 percentage points for tests conducted in the restaurant environment and between 19 and 36 percentage points for tests conducted in the cocktail party environment. In the restaurant and cocktail party environments, both the natural and adaptive settings, when implemented on a single CI, allowed scores that were as good as, or better than, scores in the bilateral omni test condition. Sound source localization accuracy was unaltered by either the natural or adaptive settings for LP, HP, or wideband noise stimuli. The data support the use of the natural microphone setting as a default setting. The natural setting (1) provides better speech understanding in noise than the omni setting, (2) does not impair sound source localization, and (3) retains low-frequency sensitivity to signals from the rear. Moreover, bilateral CIs equipped with adaptive beamforming technology can engender speech understanding scores in noise that fall only a little short of scores for a single CI in quiet. American Academy of Audiology
Unc, Adrian; Zurek, Ludek; Peterson, Greg; Narayanan, Sanjeev; Springthorpe, Susan V; Sattar, Syed A
2012-01-01
Potential risks associated with impaired surface water quality have commonly been evaluated by indirect description of potential sources using various fecal microbial indicators and derived source-tracking methods. These approaches are valuable for assessing and monitoring the impacts of land-use changes and changes in management practices at the source of contamination. A more detailed evaluation of putative etiologically significant genetic determinants can add value to these assessments. We evaluated the utility of using a microarray that integrates virulence genes with antibiotic and heavy metal resistance genes to describe and discriminate among spatially and seasonally distinct water samples from an agricultural watershed creek in Eastern Ontario. Because microarray signals may be analyzed as binomial distributions, the significance of ambiguous signals can be easily evaluated by using available off-the-shelf software. The FAMD software was used to evaluate uncertainties in the signal data. Analysis of multilocus fingerprinting data sets containing missing data has shown that, for the tested system, any variability in microarray signals had a marginal effect on data interpretation. For the tested watershed, results suggest that in general the wet fall season increased the downstream detection of virulence and resistance genes. Thus, the tested microarray technique has the potential to rapidly describe the quality of surface waters and thus to provide a qualitative tool to augment quantitative microbial risk assessments. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
An open-source framework for testing tracking devices using Lego Mindstorms
NASA Astrophysics Data System (ADS)
Jomier, Julien; Ibanez, Luis; Enquobahrie, Andinet; Pace, Danielle; Cleary, Kevin
2009-02-01
In this paper, we present an open-source framework for testing tracking devices in surgical navigation applications. At the core of image-guided intervention systems is the tracking interface that handles communication with the tracking device and gathers tracking information. Given that the correctness of tracking information is critical for protecting patient safety and for ensuring the successful execution of an intervention, the tracking software component needs to be thoroughly tested on a regular basis. Furthermore, with the widespread use of extreme programming methodology that emphasizes continuous and incremental testing of application components, testing design becomes critical. While it is easy to automate most of the testing process, it is often more difficult to test components that require manual intervention, such as tracking devices. Our framework consists of a robotic arm built from a set of Lego Mindstorms and an open-source toolkit written in C++ to control the robot movements and assess the accuracy of the tracking devices. The application program interface (API) is cross-platform and runs on Windows, Linux and MacOS. We applied this framework to the continuous testing of the Image-Guided Surgery Toolkit (IGSTK), an open-source toolkit for image-guided surgery, and showed that regression testing of tracking devices can be performed at low cost and significantly improves software quality.
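The accuracy check at the heart of such a framework (drive the robot to known positions, read the tracker, and fail the regression test if the error exceeds a tolerance) can be sketched generically. IGSTK itself is written in C++; the Python below is a concept sketch only, and the robot and tracker interfaces are hypothetical stand-ins.

# Concept sketch (not IGSTK code): drive the robot to known positions, read the
# tracking device, and fail the regression test if the RMS error exceeds a
# tolerance. Robot/tracker interfaces are hypothetical stand-ins.
import numpy as np

def rms_error(commanded, reported):
    d = np.linalg.norm(np.asarray(commanded) - np.asarray(reported), axis=1)
    return float(np.sqrt(np.mean(d ** 2)))

def test_tracker(robot, tracker, waypoints, tol_mm=1.0):
    reported = []
    for p in waypoints:
        robot.move_to(p)                 # hypothetical robot API
        reported.append(tracker.read())  # hypothetical tracker API
    err = rms_error(waypoints, reported)
    assert err <= tol_mm, f"tracking RMS error {err:.2f} mm exceeds {tol_mm} mm"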
Mobile Source Observation Database (MSOD)
The Mobile Source Observation Database (MSOD) is a relational database being developed by the Assessment and Standards Division (ASD) of the US Environmental Protection Agency Office of Transportation and Air Quality (formerly the Office of Mobile Sources). The MSOD contains emission test data from in-use mobile air-pollution sources such as cars, trucks, and engines from trucks and nonroad vehicles. Data in the database have been collected from 1982 to the present and are intended to be representative of in-use vehicle emissions in the United States.
Comprehensive Assessment of Emotional Disturbance: A Cross-Validation Approach
ERIC Educational Resources Information Center
Fisher, Emily S.; Doyon, Katie E.; Saldana, Enrique; Allen, Megan Redding
2007-01-01
Assessing a student for emotional disturbance is a serious and complex task given the stigma of the label and the ambiguities of the federal definition. One way that school psychologists can be more confident in their assessment results is to cross validate data from different sources using the RIOT approach (Review, Interview, Observe, Test).…
Access to Safe Water in Rural Artibonite, Haiti 16 Months after the Onset of the Cholera Epidemic
Patrick, Molly; Berendes, David; Murphy, Jennifer; Bertrand, Fabienne; Husain, Farah; Handzel, Thomas
2013-01-01
Haiti has the lowest improved water and sanitation coverage in the Western Hemisphere and is suffering from the largest cholera epidemic on record. In May of 2012, an assessment was conducted in rural areas of the Artibonite Department to describe the type and quality of water sources and determine knowledge, access, and use of household water treatment products to inform future programs. It was conducted after emergency response was scaled back but before longer-term water, sanitation, and hygiene activities were initiated. The household survey and source water quality analysis documented low access to safe water, with only 42.3% of households using an improved drinking water source. One-half (50.9%) of the improved water sources tested positive for Escherichia coli. Of households with water to test, 12.7% had positive chlorine residual. The assessment reinforces the identified need for major investments in safe water and sanitation infrastructure and the importance of household water treatment to improve access to safe water in the near term. PMID:24106191
Kessels, Roy P C; Kortrijk, Hans E; Wester, Arie J; Nys, Gudrun M S
2008-04-01
Confabulation behavior is common in patients with Korsakoff's syndrome. A distinction can be made between spontaneous and provoked confabulations, which may have different underlying cognitive mechanisms. Provoked confabulations may be related to intrusions on memory tests, whereas spontaneous confabulations may be due to executive dysfunction or a source memory deficit. In 19 chronic Korsakoff patients, spontaneous confabulations were quantified by third-party rating (Likert scale). Provoked confabulations were assessed using the Dalla Barba Confabulation Battery. Furthermore, assessment of executive function was performed using an extensive neuropsychological battery. False memories (i.e. intrusions) and source memory were measured using two parallel versions of a word-list learning paradigm (a modification of the Rey Auditory Verbal Learning Test). There were deficits in source memory, in which patients incorrectly assigned previously learned words to an incorrect word list. Also, Korsakoff patients had extensive executive deficits, but no relationship between the severity of these deficits and the severity of confabulation or intrusions on a memory task was found. The present findings provide evidence for a dissociation between spontaneous confabulation, provoked confabulation and false memories.
Attention during memory retrieval enhances future remembering.
Dudukovic, Nicole M; Dubrow, Sarah; Wagner, Anthony D
2009-10-01
Memory retrieval is a powerful learning event that influences whether an experience will be remembered in the future. Although retrieval can succeed in the presence of distraction, dividing attention during retrieval may reduce the power of remembering as an encoding event. In the present experiments, participants studied pictures of objects under full attention and then engaged in item recognition and source memory retrieval under full or divided attention. Two days later, a second recognition and source recollection test assessed the impact of attention during initial retrieval on long-term retention. On this latter test, performance was superior for items that had been tested initially under full versus divided attention. More importantly, even when items were correctly recognized on the first test, divided attention reduced the likelihood of subsequent recognition on the second test. The same held true for source recollection. Additionally, foils presented during the first test were also less likely to be later recognized if they had been encountered initially under divided attention. These findings demonstrate that attentive retrieval is critical for learning through remembering.
Test of US Federal Life Cycle Inventory Data Interoperability
Life cycle assessment practitioners must gather data from a variety of sources. For modeling activities in the US, practitioners may wish to use life cycle inventory data from public databases and libraries provided by US government entities. An exercise was conducted to test if ...
Deterministic approach for multiple-source tsunami hazard assessment for Sines, Portugal
NASA Astrophysics Data System (ADS)
Wronna, M.; Omira, R.; Baptista, M. A.
2015-11-01
In this paper, we present a deterministic approach to tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe). Sines has one of the most important deep-water ports, which has oil-bearing, petrochemical, liquid-bulk, coal, and container terminals. The port and its industrial infrastructures face the ocean southwest towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water model wIth Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level), and MHHW (mean higher high water). For each scenario, the tsunami hazard is described by maximum values of wave height, flow depth, drawback, maximum inundation area and run-up. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site considering the single scenarios at mean sea level, the aggregate scenario, and the influence of the tide on the aggregate scenario. The results confirm the composite source of Horseshoe and Marques de Pombal faults as the worst-case scenario, with wave heights of over 10 m, which reach the coast approximately 22 min after the rupture. It dominates the aggregate scenario by about 60% of the impact area at the test site, considering maximum wave height and maximum flow depth. The HSMPF scenario inundates a total area of 3.5 km².
Uncertainty Analysis of OC5-DeepCwind Floating Semisubmersible Offshore Wind Test Campaign
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Amy N
This paper examines how to assess the uncertainty levels for test measurements of the Offshore Code Comparison, Continued, with Correlation (OC5)-DeepCwind floating offshore wind system, examined within the OC5 project. The goal of the OC5 project was to validate the accuracy of ultimate and fatigue load estimates from a numerical model of the floating semisubmersible using data measured during scaled tank testing of the system under wind and wave loading. The examination of uncertainty was done after the test, and it was found that the limited amount of data available did not allow for an acceptable uncertainty assessment. Therefore, this paper instead qualitatively examines the sources of uncertainty associated with this test to start a discussion of how to assess uncertainty for these types of experiments and to summarize what should be done during future testing to acquire the information needed for a proper uncertainty assessment. Foremost, future validation campaigns should initiate numerical modeling before testing to guide the test campaign, which should include a rigorous assessment of uncertainty, and perform validation during testing to ensure that the tests address all of the validation needs.
Exploring a Source of Uneven Score Equity across the Test Score Range
ERIC Educational Resources Information Center
Huggins-Manley, Anne Corinne; Qiu, Yuxi; Penfield, Randall D.
2018-01-01
Score equity assessment (SEA) refers to an examination of population invariance of equating across two or more subpopulations of test examinees. Previous SEA studies have shown that score equity may be present for examinees scoring at particular test score ranges but absent for examinees scoring at other score ranges. No studies to date have…
Explanatory Item Response Modeling of Children's Change on a Dynamic Test of Analogical Reasoning
ERIC Educational Resources Information Center
Stevenson, Claire E.; Hickendorff, Marian; Resing, Wilma C. M.; Heiser, Willem J.; de Boeck, Paul A. L.
2013-01-01
Dynamic testing is an assessment method in which training is incorporated into the procedure with the aim of gauging cognitive potential. Large individual differences are present in children's ability to profit from training in analogical reasoning. The aim of this experiment was to investigate sources of these differences on a dynamic test of…
Acoustical evaluation of the NASA Lewis 9 by 15 foot low speed wind tunnel
NASA Technical Reports Server (NTRS)
Dahl, Milo D.; Woodward, Richard P.
1992-01-01
The test section of the NASA Lewis 9- by 15-Foot Low Speed Wind Tunnel was acoustically treated to allow the measurement of acoustic sources located within the tunnel test section under simulated free field conditions. The treatment was designed for high sound absorption at frequencies above 250 Hz and to withstand tunnel airflow velocities up to 0.2 Mach. Evaluation tests with no tunnel airflow were conducted in the test section to assess the performance of the installed treatment. This performance would not be significantly affected by low speed airflow. Time delay spectrometry tests showed that interference ripples in the incident signal resulting from reflections occurring within the test section average from 1.7 dB to 3.2 dB wide over a 500 to 5150 Hz frequency range. Late reflections, from upstream and downstream of the test section, were found to be insignificant at the microphone measuring points. For acoustic sources with low directivity characteristics, decay with distance measurements in the test section showed that incident free field behavior can be measured on average with an accuracy of +/- 1.5 dB or better at source frequencies from 400 Hz to 10 kHz. The free field variations are typically much smaller with an omnidirectional source.
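Incident free-field behavior implies a 6 dB drop per doubling of distance from a compact source, so decay-with-distance measurements can be compared against the inverse-square expectation SPL(r) = SPL(r0) - 20*log10(r/r0). The sketch below uses hypothetical levels chosen to fall within the ±1.5 dB envelope quoted above.

# Sketch: compare measured sound-pressure levels along a traverse with the
# inverse-square (free-field) expectation. The measurement arrays are placeholders.
import numpy as np

r = np.array([1.0, 2.0, 4.0, 8.0])             # distances from source (m)
spl_meas = np.array([94.0, 88.2, 81.9, 76.3])  # hypothetical measured levels (dB)

spl_free = spl_meas[0] - 20 * np.log10(r / r[0])
deviation = spl_meas - spl_free                # +/- dB departure from free field
print(deviation)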
The assessment of data sources for influenza virologic surveillance in New York State.
Escuyer, Kay L; Waters, Christine L; Gowie, Donna L; Maxted, Angie M; Farrell, Gregory M; Fuschino, Meghan E; St George, Kirsten
2017-03-01
Following the 2013 USA release of the Influenza Virologic Surveillance Right Size Roadmap, the New York State Department of Health (NYSDOH) embarked on an evaluation of data sources for influenza virologic surveillance. To assess NYS data sources, additional to data generated by the state public health laboratory (PHL), which could enhance influenza surveillance at the state and national level. Potential sources of laboratory test data for influenza were analyzed for quantity and quality. Computer models, designed to assess sample sizes and the confidence of data for statistical representation of influenza activity, were used to compare PHL test data to results from clinical and commercial laboratories, reported between June 8, 2013 and May 31, 2014. Sample sizes tested for influenza at the state PHL were sufficient for situational awareness surveillance with optimal confidence levels, only during peak weeks of the influenza season. Influenza data pooled from NYS PHLs and clinical laboratories generated optimal confidence levels for situational awareness throughout the influenza season. For novel influenza virus detection in NYS, combined real-time (rt) RT-PCR data from state and regional PHLs achieved ≥85% confidence during peak influenza activity, and ≥95% confidence for most of low season and all of off-season. In NYS, combined data from clinical, commercial, and public health laboratories generated optimal influenza surveillance for situational awareness throughout the season. Statistical confidence for novel virus detection, which is reliant on only PHL data, was achieved for most of the year. © 2016 The Authors. Influenza and Other Respiratory Viruses Published by John Wiley & Sons Ltd.
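Sample-size reasoning of the kind used in "right size" virologic surveillance planning often reduces to a binomial detection calculation: how many specimens are needed to detect at least one novel-virus positive at a given prevalence with a given confidence. The sketch below shows that generic formula only; it is not claimed to be the specific model used by NYSDOH.

# Sketch of a standard detection sample-size calculation: specimens needed to
# detect at least one novel-virus positive at a given prevalence with a given
# confidence. Generic binomial formula, not the NYSDOH model.
import math

def specimens_needed(prevalence, confidence):
    return math.ceil(math.log(1 - confidence) / math.log(1 - prevalence))

print(specimens_needed(0.01, 0.95))   # about 299 specimens for 1% prevalence, 95% confidence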
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeRosier, R.; Waterland, L.R.
1986-02-01
This report is a compendium of detailed test sampling and analysis data obtained in field tests of a watertube industrial boiler burning a coal/water slurry (CWS). Test data reported include preliminary stack test data, boiler operating data, and complete flue-gas emission results. Flue-gas emission measurements included continuous monitoring for criteria pollutants; onsite gas chromatography (GC) for volatile hydrocarbons (C1-C6); Methods 5/8 sampling for particulate, SO₂, and SO₃ emissions; source assessment sampling system (SASS) for total organics in two boiling point ranges (100 to 300 °C and >300 °C), organic compound category information using infrared spectrometry (IR), liquid column (LC) chromatography separation, and low-resolution mass spectrometry (LRMS), specific quantitation of the semivolatile organic priority pollutants using gas chromatography/mass spectrometry (GC/MS), and trace-element emissions using spark-source mass spectrometry (SSMS) and atomic absorption spectroscopy (AAS); N₂O emissions by gas chromatography/electron-capture detector (GC/ECD); and biological assay testing of SASS and ash-stream samples.
Portable Imagery Quality Assessment Test Field for Uav Sensors
NASA Astrophysics Data System (ADS)
Dąbrowski, R.; Jenerowicz, A.
2015-08-01
Nowadays the imagery data acquired from UAV sensors are the main source of data used in various remote sensing applications, photogrammetry projects, and imagery intelligence (IMINT), as well as in other tasks such as decision support. Therefore, quality assessment of such imagery is an important task. The research team from the Military University of Technology, Faculty of Civil Engineering and Geodesy, Geodesy Institute, Department of Remote Sensing and Photogrammetry, has designed and prepared a special test field, the Portable Imagery Quality Assessment Test Field (PIQuAT), that provides quality assessment, in field conditions, of images obtained with sensors mounted on UAVs. The PIQuAT consists of 6 individual segments which, when combined, allow the radiometric, spectral and spatial resolution of images acquired from UAVs to be determined. All segments of the PIQuAT can be used together in various configurations or independently. All elements of the Portable Imagery Quality Assessment Test Field were tested in laboratory conditions in terms of their radiometry and spectral reflectance characteristics.
A recently published test method for Neocloeon triangulifer assessed the sensitivities of larval mayflies to several reference toxicants (NaCl, KCl, and CuSO4). Subsequent exposures have shown discrepancies from those results previously reported. To identify potential sources of ...
Alaska national hydrography dataset positional accuracy assessment study
Arundel, Samantha; Yamamoto, Kristina H.; Constance, Eric; Mantey, Kim; Vinyard-Houx, Jeremy
2013-01-01
Initial visual assessments showed a wide range in the quality of fit between features in the NHD and these new image sources. No statistical analysis has been performed to quantify accuracy, and determining absolute accuracy is cost prohibitive (it requires collecting independent, well-defined test points). Quantitative analysis of relative positional error, however, is feasible.
Writing Assessment: Issues and Strategies. Longman Series in College Composition and Communication.
ERIC Educational Resources Information Center
Greenberg, Karen L., Ed.; And Others
Data compiled from more than 2,000 member institutions of the National Testing Network in Writing were the source of this guide to writing assessment. Using an interdisciplinary approach, with insights from cognitive psychology, sociology, linguistics, educational measurement, rhetoric, and English education, the book gives suggestions on…
The photon fluence non-uniformity correction for air kerma near Cs-137 brachytherapy sources.
Rodríguez, M L; deAlmeida, C E
2004-05-07
The use of brachytherapy sources in radiation oncology requires their proper calibration to guarantee the correctness of the dose delivered to the treatment volume of a patient. One of the elements to take into account in the dose calculation formalism is the non-uniformity of the photon fluence due to the beam divergence that causes a steep dose gradient near the source. The correction factors for this phenomenon have been usually evaluated by the two theories available, both of which were conceived only for point sources. This work presents the Monte Carlo assessment of the non-uniformity correction factors for a Cs-137 linear source and a Farmer-type ionization chamber. The results have clearly demonstrated that for linear sources there are some important differences among the values obtained from different calculation models, especially at short distances from the source. The use of experimental values for each specific source geometry is recommended in order to assess the non-uniformity factors for linear sources in clinical situations that require special dose calculations or when the correctness of treatment planning software is verified during the acceptance tests.
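The underlying geometric effect, divergence making the photon fluence vary across the chamber cavity, can be illustrated by averaging an inverse-square fluence from a segmented line source over the cavity volume and comparing it with the value at the cavity centre. The sketch below ignores scatter and attenuation and uses assumed dimensions; it is a conceptual illustration, not the paper's Monte Carlo calculation.

# Simplified geometric illustration of the non-uniformity concept: average an
# inverse-square fluence over a chamber cavity (approximated as a sphere) for a
# linear source split into point elements, and compare with the fluence at the
# cavity centre. Dimensions are assumed; scatter and attenuation are ignored.
import numpy as np

rng = np.random.default_rng(3)
src_z = np.linspace(-0.007, 0.007, 29)   # assumed 1.4 cm active length, point elements
cav_r, d = 0.003, 0.05                   # assumed 3 mm cavity radius, 5 cm source distance

# sample points uniformly inside the spherical cavity centred at (d, 0, 0)
pts = rng.normal(size=(20000, 3))
pts *= (cav_r * rng.random(20000) ** (1 / 3) / np.linalg.norm(pts, axis=1))[:, None]
pts[:, 0] += d

def fluence(points):
    dx = points[:, 0][:, None]
    dy = points[:, 1][:, None]
    dz = points[:, 2][:, None] - src_z[None, :]
    return np.mean(1.0 / (dx**2 + dy**2 + dz**2), axis=1)

a_nu = fluence(np.array([[d, 0.0, 0.0]]))[0] / fluence(pts).mean()
print(a_nu)    # non-uniformity factor relative to the cavity centre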
NASA Astrophysics Data System (ADS)
Cai, Z.; Wilson, R. D.
2009-05-01
Techniques for optimizing the removal of NAPL mass in source zones have advanced at a more rapid rate than strategies to assess treatment performance. Informed selection of remediation approaches would be easier if measurements of performance were more directly transferable. We developed a number of methods based on data generated from multilevel sampler (MLS) transects to assess the effectiveness of a bioaugmentation/biostimulation trial in a TCE source residing in a terrace gravel aquifer in the East Midlands, UK. In this spatially complex aquifer, treatment inferred from long-screen monitoring well data was not as reliable as that from consideration of mass flux changes across transects installed in and downgradient of the source. Falling head tests were conducted in the MLS ports to generate the necessary hydraulic conductivity (K) data. Combining K with concentration provides a mass flux map that allows calculation of mass turnover and an assessment of where in the complex geology the greatest turnover occurred. Five snapshots over a 600-day period indicate a marked reduction in TCE flux, suggesting a significant reduction in DNAPL mass over that expected due to natural processes. However, persistence of daughter products suggested that complete dechlorination did not occur. The MLS fence data also revealed that delivery of both carbon source and pH buffer was not uniform across the test zone. This may have led to the generation of niches of iron(III) and sulphate reduction as well as methanogenesis, which impacted on dechlorination processes. In the absence of this spatial data, it is difficult to reconcile apparent treatment as indicated in monitoring well data to on-going processes.
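The mass-flux bookkeeping described above (Darcy flux from per-port K and gradient, multiplied by concentration and the area each port represents, summed across the fence) can be sketched as follows; all values are placeholders rather than site data.

# Sketch of the transect bookkeeping: per-port mass flux J = q * C with
# q = K * gradient (Darcy flux), summed over the area each port represents to
# give mass discharge across the fence. All values are placeholders.
import numpy as np

K = np.array([2e-4, 5e-5, 8e-4, 1e-4])       # hydraulic conductivity per port (m/s)
grad = 0.005                                 # head gradient along flow (dimensionless)
conc = np.array([0.12, 0.04, 0.30, 0.015])   # TCE concentration per port (g/m3)
area = np.array([0.5, 0.5, 0.5, 0.5])        # cross-sectional area per port (m2)

q = K * grad                                 # Darcy flux (m/s)
flux = q * conc                              # mass flux (g/m2/s)
discharge = float(np.sum(flux * area))       # mass discharge across transect (g/s)
print(discharge * 86400, "g/day")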
Groeber, F; Schober, L; Schmid, F F; Traube, A; Kolbus-Hernandez, S; Daton, K; Hoffmann, S; Petersohn, D; Schäfer-Korting, M; Walles, H; Mewes, K R
2016-10-01
To replace the Draize skin irritation assay (OECD guideline 404), several test methods based on reconstructed human epidermis (RHE) have been developed and were adopted in the OECD test guideline 439. However, all validated test methods in the guideline are linked to RHE provided by only three companies. Thus, the availability of these test models is dependent on the commercial interest of the producer. To overcome this limitation and thus to increase the accessibility of in vitro skin irritation testing, an open source reconstructed epidermis (OS-REp) was introduced. To demonstrate the capacity of the OS-REp in regulatory risk assessment, a catch-up validation study was performed. The participating laboratories used in-house generated OS-REp to assess the set of 20 reference substances according to the performance standards amending the OECD test guideline 439. Testing was performed under blinded conditions. The within-laboratory reproducibility of 87% and the inter-laboratory reproducibility of 85% demonstrate the high reliability of irritancy testing using the OS-REp protocol. In addition, the prediction capacity, with an accuracy of 80%, was comparable to previously published RHE-based test protocols. Taken together, the results indicate that the OS-REp test method can be used as a standalone alternative skin irritation test replacing the OECD test guideline 404. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
2013-11-15
was conducted. As expected, a cylinder was formed similar to the one shown in Figure 5.9 using potassium permanganate , with slight elongation in the...clean water injections at 400 mg/L. This was not necessary during the ISCO disturbance test, as potassium permanganate (KMnO4), which forms a deep
Implementation of the Leaching Environmental Assessment ...
New leaching tests are available in the U.S. for developing more accurate source terms for use in fate and transport models. For beneficial use or disposal, the leaching environmental assessment framework (LEAF) will provide leaching results that reflect field conditions for either the use or the disposal of the material or waste. This work provides an overview of the implementation of the new leaching tests for presentation at the MEGA symposium, which serves the coal-fired power industry.
Fournier, K B; Brown, C G; Yeoman, M F; Fisher, J H; Seiler, S W; Hinshelwood, D; Compton, S; Holdener, F R; Kemp, G E; Newlander, C D; Gilliam, R P; Froula, N; Lilly, M; Davis, J F; Lerch, Maj A; Blue, B E
2016-11-01
Our team has developed an experimental platform to evaluate the x-ray-generated stress and impulse in materials. Experimental activities include x-ray source development, design of the sample mounting hardware and sensors interfaced to the National Ignition Facility's diagnostics insertion system, and system integration into the facility. This paper focuses on the X-ray Transport and Radiation Response Assessment (XTRRA) test cassettes built for these experiments. The test cassette is designed to position six samples at three predetermined distances from the source, each known to within ±1% accuracy. Built-in calorimeters give in situ measurements of the x-ray environment along the sample lines of sight. The measured accuracy of sample responses as well as planned modifications to the XTRRA cassette is discussed.
Guo, Junfeng; Wang, Chao; Chan, Kung-Sik; Jin, Dakai; Saha, Punam K; Sieren, Jered P; Barr, R G; Han, MeiLan K; Kazerooni, Ella; Cooper, Christopher B; Couper, David; Newell, John D; Hoffman, Eric A
2016-05-01
A test object (phantom) is an important tool to evaluate comparability and stability of CT scanners used in multicenter and longitudinal studies. However, there are many sources of error that can interfere with the test object-derived quantitative measurements. Here the authors investigated three major possible sources of operator error in the use of a test object employed to assess pulmonary density-related as well as airway-related metrics. Two kinds of experiments were carried out to assess measurement variability caused by imperfect scanning status. The first one consisted of three experiments. A COPDGene test object was scanned using a dual source multidetector computed tomographic scanner (Siemens Somatom Flash) with the Subpopulations and Intermediate Outcome Measures in COPD Study (SPIROMICS) inspiration protocol (120 kV, 110 mAs, pitch = 1, slice thickness = 0.75 mm, slice spacing = 0.5 mm) to evaluate the effects of tilt angle, water bottle offset, and air bubble size. After analysis of these results, a guideline was reached in order to achieve more reliable results for this test object. Next the authors applied the above findings to 2272 test object scans collected over 4 years as part of the SPIROMICS study. The authors compared changes of the data consistency before and after excluding the scans that failed to pass the guideline. This study established the following limits for the test object: tilt index ≤0.3, water bottle offset limits of [-6.6 mm, 7.4 mm], and no air bubble within the water bottle, where tilt index is a measure incorporating two tilt angles around x- and y-axis. With 95% confidence, the density measurement variation for all five interested materials in the test object (acrylic, water, lung, inside air, and outside air) resulting from all three error sources can be limited to ±0.9 HU (summed in quadrature), when all the requirements are satisfied. The authors applied these criteria to 2272 SPIROMICS scans and demonstrated a significant reduction in measurement variation associated with the test object. Three operator errors were identified which significantly affected the usability of the acquired scan images of the test object used for monitoring scanner stability in a multicenter study. The authors' results demonstrated that at the time of test object scan receipt at a radiology core laboratory, quality control procedures should include an assessment of tilt index, water bottle offset, and air bubble size within the water bottle. Application of this methodology to 2272 SPIROMICS scans indicated that their findings were not limited to the scanner make and model used for the initial test but was generalizable to both Siemens and GE scanners which comprise the scanner types used within the SPIROMICS study.
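The acceptance limits established above translate directly into a simple quality-control gate. The sketch below applies the tilt-index, bottle-offset, and air-bubble criteria reported in the abstract; the function name and inputs are illustrative, not the study's code.

# Sketch: gate a test-object scan on the acceptance limits reported above
# (tilt index <= 0.3, water-bottle offset within [-6.6 mm, 7.4 mm], no air
# bubble). Function name and inputs are illustrative stand-ins.
def scan_passes_qc(tilt_index, bottle_offset_mm, has_air_bubble):
    reasons = []
    if tilt_index > 0.3:
        reasons.append(f"tilt index {tilt_index:.2f} > 0.3")
    if not (-6.6 <= bottle_offset_mm <= 7.4):
        reasons.append(f"bottle offset {bottle_offset_mm:.1f} mm outside [-6.6, 7.4]")
    if has_air_bubble:
        reasons.append("air bubble present in water bottle")
    return (len(reasons) == 0, reasons)

print(scan_passes_qc(0.12, -2.0, False))   # (True, [])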
Noise-Source Separation Using Internal and Far-Field Sensors for a Full-Scale Turbofan Engine
NASA Technical Reports Server (NTRS)
Hultgren, Lennart S.; Miles, Jeffrey H.
2009-01-01
Noise-source separation techniques for the extraction of the sub-dominant combustion noise from the total noise signatures obtained in static-engine tests are described. Three methods are applied to data from a static, full-scale engine test. Both 1/3-octave and narrow-band results are discussed. The results are used to assess the combustion-noise prediction capability of the Aircraft Noise Prediction Program (ANOPP). A new additional phase-angle-based discriminator for the three-signal method is also introduced.
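The three-signal method mentioned above is commonly implemented with cross-spectral densities between three sensors that share the coherent (combustion-noise) component but carry independent local noise. The Python sketch below shows that generic formulation under stated assumptions; it is not the processing chain used in the NASA test, and the sensor names, segment length, and synthetic data are illustrative only.

    import numpy as np
    from scipy.signal import csd

    # Three-signal method sketch: estimate the auto-spectrum of the coherent
    # component seen by sensor a, using two additional sensors b and c that
    # share that component but have independent local noise.
    def coherent_power(x_a, x_b, x_c, fs, nperseg=4096):
        f, G_ab = csd(x_a, x_b, fs=fs, nperseg=nperseg)
        _, G_ac = csd(x_a, x_c, fs=fs, nperseg=nperseg)
        _, G_bc = csd(x_b, x_c, fs=fs, nperseg=nperseg)
        # Classic three-signal estimate: |G_ab| |G_ac| / |G_bc|
        return f, np.abs(G_ab) * np.abs(G_ac) / np.abs(G_bc)

    # Example with synthetic data: a shared "source" plus independent noise.
    fs = 8192
    rng = np.random.default_rng(0)
    s = rng.normal(size=fs * 10)
    x_a, x_b, x_c = (s + rng.normal(size=s.size) for _ in range(3))
    f, Gss = coherent_power(x_a, x_b, x_c, fs)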
Remembering the snake in the grass: Threat enhances recognition but not source memory.
Meyer, Miriam Magdalena; Bell, Raoul; Buchner, Axel
2015-12-01
Research on the influence of emotion on source memory has yielded inconsistent findings. The object-based framework (Mather, 2007) predicts that negatively arousing stimuli attract attention, resulting in enhanced within-object binding, and, thereby, enhanced source memory for intrinsic context features of emotional stimuli. To test this prediction, we presented pictures of threatening and harmless animals, the color of which had been experimentally manipulated. In a memory test, old-new recognition for the animals and source memory for their color were assessed. In all 3 experiments, old-new recognition was better for the more threatening material, which supports previous reports of an emotional memory enhancement. This recognition advantage was due to the emotional properties of the stimulus material, and not specific to snake stimuli. However, inconsistent with the prediction of the object-based framework, intrinsic source memory was not affected by emotion. (c) 2015 APA, all rights reserved.
Hexavalent chromium emissions from aerospace operations: A case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chaurushia, A.; Bajza, C.
1994-12-31
Northrop Aircraft Division (NAD) is subject to several air toxic regulations such as EPA SARA Title 3, California Assembly Bill 2588 (AB2588), and Proposition 65 and is a voluntary participant in air toxic emissions reduction programs such as the EPA 33/50 and MERIT Program. To quantify emissions, NAD initially followed regulatory guidelines which recommend that emission inventories of air toxics be based on engineering assumptions and conservative emission factors in the absence of specific source test data. NAD was concerned that Chromium VI emissions from NAD's spray coating and chemical tank line operations were not representative due to these techniques. More recently, NAD has relied upon information from its ongoing source testing program to determine emission rates of Chromium VI. Based on these source test results, NAD revised emission calculations for use in Chromium VI inventories, impact assessments and control strategies. NAD has been successful in demonstrating a significant difference between emissions calculated utilizing the source test results and emissions based on the traditional mass balance using agency-suggested methods.
Introducing genetic testing for cardiovascular disease in primary care: a qualitative study.
Middlemass, Jo B; Yazdani, Momina F; Kai, Joe; Standen, Penelope J; Qureshi, Nadeem
2014-05-01
While primary care systematically offers conventional cardiovascular risk assessment, genetic tests for coronary heart disease (CHD) are increasingly commercially available to patients. It is unclear how individuals may respond to these new sources of risk information. To explore how patients who have had a recent conventional cardiovascular risk assessment, perceive additional information from genetic testing for CHD. Qualitative interview study in 12 practices in Nottinghamshire from both urban and rural settings. Interviews were conducted with 29 adults, who consented to genetic testing after having had a conventional cardiovascular risk assessment. Individuals' principal motivation for genetic testing was their family history of CHD and a desire to convey the results to their children. After testing, however, there was limited recall of genetic test results and scepticism about the value of informing their children. Participants dealt with conflicting findings from the genetic test, family history, and conventional assessment by either focusing on genetic risk or environmental lifestyle factors. In some participants, genetic test results appeared to reinforce healthy behaviour but others were falsely reassured, despite having an 'above-average' conventional cardiovascular risk score. Although genetic testing was acceptable, participants were unclear how to interpret genetic risk results. To facilitate healthy behaviour, health professionals should explore patients' understanding of genetic test results in light of their family history and conventional risk assessment.
NASA Astrophysics Data System (ADS)
Park, Junghyun; Hayward, Chris; Stump, Brian W.
2018-06-01
Ground truth sources in Utah during 2003-2013 are used to assess the contribution of temporal atmospheric conditions to infrasound detection and the predictive capabilities of atmospheric models. Ground truth sources consist of 28 long duration static rocket motor burn tests and 28 impulsive rocket body demolitions. Automated infrasound detections from a hybrid of regional seismometers and infrasound arrays use a combination of short-term time average/long-term time average ratios and spectral analyses. These detections are grouped into station triads using a Delaunay triangulation network and then associated to estimate phase velocity and azimuth to filter signals associated with a particular source location. The resulting range and azimuth distribution from sources to detecting stations varies seasonally and is consistent with predictions based on seasonal atmospheric models. Impulsive signals from rocket body detonations are observed at greater distances (>700 km) than the extended duration signals generated by the rocket burn test (up to 600 km). Infrasound energy attenuation associated with the two source types is quantified as a function of range and azimuth from infrasound amplitude measurements. Ray-tracing results using Ground-to-Space atmospheric specifications are compared to these observations and illustrate the degree to which the time variations in characteristics of the observations can be predicted over a multiple year time period.
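The short-term average/long-term average (STA/LTA) ratio underlying the automated detections described above can be illustrated with a minimal sketch; the window lengths, sample rate, and trigger threshold below are assumed values for illustration, not those used in the Utah study.

    import numpy as np

    # Minimal STA/LTA detector: ratio of a short-window to a long-window running
    # average of the squared signal (characteristic function).
    def sta_lta(signal, fs, sta_win=1.0, lta_win=30.0):
        """Return the STA/LTA ratio of the squared signal."""
        cf = signal.astype(float) ** 2
        n_sta, n_lta = int(sta_win * fs), int(lta_win * fs)
        kernel = lambda n: np.ones(n) / n
        sta = np.convolve(cf, kernel(n_sta), mode="same")
        lta = np.convolve(cf, kernel(n_lta), mode="same")
        return sta / np.maximum(lta, 1e-12)

    fs = 20.0                                   # assumed infrasound sample rate, Hz
    rng = np.random.default_rng(0)
    x = rng.normal(size=int(600 * fs))          # 10 minutes of noise
    x[6000:6200] += 5.0 * rng.normal(size=200)  # injected transient "signal"
    ratio = sta_lta(x, fs)
    print("triggered:", bool(np.any(ratio > 4.0)))   # assumed trigger threshold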
Solomon, Ethan B; Niemira, Brendan A; Sapers, Gerald M; Annous, Bassam A
2005-05-01
The ability of 71 strains of Salmonella enterica originating from produce, meat, or clinical sources to form biofilms was investigated. A crystal violet binding assay demonstrated no significant differences in biofilm formation by isolates from any source when tested in any of the following three media: Luria-Bertani broth supplemented with 2% glucose, tryptic soy broth (TSB), or 1/20th-strength TSB. Incubation was overnight at 30 degrees C under static conditions. Curli production and cellulose production were monitored by assessing morphotypes on Luria-Bertani agar without salt containing Congo red and by assessing fluorescence on Luria-Bertani agar containing calcofluor, respectively. One hundred percent of the clinical isolates exhibited curli biosynthesis, and 73% demonstrated cellulose production. All meat-related isolates formed curli, and 84% produced cellulose. A total of 80% of produce-related isolates produced curli, but only 52% produced cellulose. Crystal violet binding was not statistically different between isolates representing the three morphotypes when grown in TSB; however, significant differences were observed when strains were cultured in the two other media tested. These data demonstrate that the ability to form biofilms is not dependent on the source of the test isolate and suggest a relationship between crystal violet binding and morphotype, with curli- and cellulose-deficient isolates being least effective in biofilm formation.
NASA Technical Reports Server (NTRS)
Gates, Thomas S.; Johnson, Theodore F.; Whitley, Karen S.
2005-01-01
The objective of this report is to contribute to the independent assessment of the Space Shuttle External Tank Foam Material. This report specifically addresses material modeling, characterization testing, data reduction methods, and data pedigree. A brief description of the External Tank foam materials, locations, and standard failure modes is provided to develop suitable background information. A review of mechanics based analysis methods from the open literature is used to provide an assessment of the state-of-the-art in material modeling of closed cell foams. Further, this report assesses the existing material property database and investigates sources of material property variability. The report presents identified deficiencies in testing methods and procedures, recommendations for additional testing as required, identification of near-term improvements that should be pursued, and long-term capabilities or enhancements that should be developed.
NASA Technical Reports Server (NTRS)
2005-01-01
Three sources have been considered to provide information allowing the evaluation of the Collision Conflict Avoidance (CCA) functional requirements: existing data, simulation, and flight test. The existing data sources that have been evaluated have been found to be lacking in two areas: the actual data that was recorded and missing elements of the system architecture. Many previous tests addressing collision avoidance were conducted without a remote operator. As such, they are missing critical elements that are required to assess the CCA functional requirements. Tests such as ERAST were conducted with all of the UAS elements. However, ERAST tests were conducted as a demonstration and the data recorded was of end-to-end performance. Many contributing elements of the system were not individually recorded or were recorded at a data rate insufficient for the purposes of evaluating the CCA functional requirements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Helland, B.; Summers, B.G.
1996-09-01
As the classroom paradigm shifts from being teacher-centered to being learner-centered, student assessments are evolving from typical paper and pencil testing to other methods of evaluation. Students should be probed for understanding, reasoning, and critical thinking abilities rather than their ability to return memorized facts. The assessment of the Department of Energy's pilot program, Adventures in Supercomputing (AiS), offers one example of assessment techniques developed for learner-centered curricula. This assessment has employed a variety of methods to collect student data. Methods of assessment used were traditional testing, performance testing, interviews, short questionnaires via email, and student presentations of projects. The data obtained from these sources have been analyzed by a professional assessment team at the Center for Children and Technology. The results have been used to improve the AiS curriculum and establish the quality of the overall AiS program. This paper will discuss the various methods of assessment used and the results.
Biological characterization of a novel in vitro cell irradiator
Fowler, Tyler L.; Fisher, Michael M.; Bailey, Alison M.; Bednarz, Bryan P.
2017-01-01
To evaluate the overall robustness of a novel cellular irradiator, we performed a series of well-characterized, dose-responsive assays to assess the consequences of DNA damage. We used a previously described novel irradiation system and a traditional 137Cs source to irradiate a cell line. The generation of reactive oxygen species was assessed using chloromethyl-H2DCFDA dye, the induction of DNA DSBs was observed using the comet assay, and the initiation of DNA break repair was assessed through γH2AX image cytometry. A high correlation between physical absorbed dose and biologic dose was seen for the production of intracellular reactive oxygen species, physical DNA double strand breaks, and modulation of the cellular double strand break repair pathway. The results compared favorably to irradiation with a traditional 137Cs source. The rapid, straightforward tests described form a reasonable approach for biologic characterization of novel irradiators. These additional testing metrics go beyond standard physics testing such as Monte Carlo simulation and thermo-luminescent dosimeter evaluation to confirm that a novel irradiator can produce the desired dose effects in vitro. Further, assessment of these biological metrics confirms that the physical handling of the cells during the irradiation process results in biologic effects that scale appropriately with dose. PMID:29232400
Beyond the Bubble in History/Social Studies Assessments
ERIC Educational Resources Information Center
Breakstone, Joel; Smith, Mark; Wineburg, Sam
2013-01-01
Teachers need tools and assessments that will prepare students to meet the ambitious goals laid out by the Common Core State Standards. The multiple-choice tests that dominate in history will not prepare students to analyze primary and secondary sources, cite textual evidence to support arguments, consider the influence of an author's perspective,…
ERIC Educational Resources Information Center
Mitchell, Karen J.; Raye, Carol L.; McGuire, Joseph T.; Frankel, Hillary; Greene, Erich J.; Johnson, Marcia K.
2008-01-01
A short-term source monitoring procedure with functional magnetic resonance imaging assessed neural activity when participants made judgments about the format of 1 of 4 studied items (picture, word), the encoding task performed (cost, place), or whether an item was old or new. The results support findings from long-term memory studies showing that…
Mitigation of Cognitive Bias with a Serious Game: Two Experiments Testing Feedback Timing and Source
ERIC Educational Resources Information Center
Dunbar, Norah E.; Jensen, Matthew L.; Miller, Claude H.; Bessarabova, Elena; Lee, Yu-Hao; Wilson, Scott N.; Elizondo, Javier; Adame, Bradley J.; Valacich, Joseph; Straub, Sara; Burgoon, Judee K.; Lane, Brianna; Piercy, Cameron W.; Wilson, David; King, Shawn; Vincent, Cindy; Schuetzler, Ryan M.
2017-01-01
One of the benefits of using digital games for education is that games can provide feedback for learners to assess their situation and correct their mistakes. We conducted two studies to examine the effectiveness of different feedback design (timing, duration, repeats, and feedback source) in a serious game designed to teach learners about…
Estuarine and Riverine Areas Final Programmatic Environmental Assessment
2004-06-25
sources in the study area include WWTP spray field runoff, urban and agricultural runoff, septic tank leachate, landfill leachate, silviculture...overland sheet flow. Urban and agricultural runoff are sources of fecal and total coliform and fecal streptococcus bacteria. Septic tank leachate and...in leachate from experiments using sand showed the greatest mobility of tungsten. Outdoor exposures and accelerated aging tests studied the
The sound field of a rotating dipole in a plug flow.
Wang, Zhao-Huan; Belyaev, Ivan V; Zhang, Xiao-Zheng; Bi, Chuan-Xing; Faranosov, Georgy A; Dowell, Earl H
2018-04-01
An analytical far field solution for a rotating point dipole source in a plug flow is derived. The shear layer of the jet is modelled as an infinitely thin cylindrical vortex sheet and the far field integral is calculated by the stationary phase method. Four numerical tests are performed to validate the derived solution as well as to assess the effects of sound refraction from the shear layer. First, the calculated results using the derived formulations are compared with the known solution for a rotating dipole in a uniform flow to validate the present model in this fundamental test case. After that, the effects of sound refraction for different rotating dipole sources in the plug flow are assessed. Then the refraction effects on different frequency components of the signal at the observer position, as well as the effects of the motion of the source and of the type of source are considered. Finally, the effect of different sound speeds and densities outside and inside the plug flow is investigated. The solution obtained may be of particular interest for propeller and rotor noise measurements in open jet anechoic wind tunnels.
A Home Ignition Assessment Model Applied to Structures in the Wildland-Urban Interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biswas, Kaushik; Werth, David; Gupta, Narendra
2013-01-01
The issue of exterior fire threat to buildings, from either wildfires in the wildland-urban interface or neighboring structure fires, is critically important. To address this, the Wildfire Ignition Resistant Home Design (WIRHD) program was initiated. The WIRHD program developed a tool, the WildFIRE Wizard, that will allow homeowners to estimate the external fire threat to their homes based on specific features and characteristics of the homes and yards. The software then makes recommendations to reduce the threat. The inputs include the structural and material features of the home and information about any ignition sources or flammable objects in its immediate vicinity, known as the home ignition zone. The tool comprises an ignition assessment model that performs explicit calculations of the radiant and convective heating of the building envelope from the potential ignition sources. This article describes a series of material ignition and flammability tests that were performed to calibrate and/or validate the ignition assessment model. The tests involved exposing test walls with different external siding types to radiant heating and/or direct flame contact. The responses of the test walls were used to determine the conditions leading to melting, ignition, or any other mode of failure of the walls. Temperature data were used to verify the model predictions of temperature rises and ignition times of the test walls.
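As a rough illustration of the kind of radiant-heating calculation such an ignition assessment model performs, the sketch below evaluates the incident flux on a wall element from a flame source via a configuration (view) factor and the Stefan-Boltzmann law. The view factor, emissivity, flame temperature, and ignition threshold are generic assumptions, not values from the WIRHD tool.

    # Hedged sketch of a radiant-heating calculation; all inputs are illustrative.
    SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)

    def radiant_flux(view_factor, emissivity, flame_temp_k):
        """Incident radiant flux (W/m^2) on a wall element from a flame source."""
        return view_factor * emissivity * SIGMA * flame_temp_k**4

    q = radiant_flux(view_factor=0.25, emissivity=0.9, flame_temp_k=1100.0)
    critical_flux = 12.5e3    # assumed piloted-ignition threshold for wood, W/m^2
    print(f"incident flux = {q/1000:.1f} kW/m^2, ignition risk: {q > critical_flux}")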
Renewable Energy Assessment Methodology for Japanese OCONUS Army Installations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solana, Amy E.; Horner, Jacob A.; Russo, Bryan J.
2010-08-30
Since 2005, Pacific Northwest National Laboratory (PNNL) has been asked by Installation Management Command (IMCOM) to conduct strategic assessments at selected US Army installations of the potential use of renewable energy resources, including solar, wind, geothermal, biomass, waste, and ground source heat pumps (GSHPs). IMCOM has the same economic, security, and legal drivers to develop alternative, renewable energy resources overseas as it has for installations located in the US. The approach for continental US (CONUS) studies has been to use known, US-based renewable resource characterizations and information sources coupled with local, site-specific sources and interviews. However, the extent to which this sort of data might be available for outside the continental US (OCONUS) sites was unknown. An assessment at Camp Zama, Japan was completed as a trial to test the applicability of the CONUS methodology at OCONUS installations. It was found that, with some help from Camp Zama personnel in translating and locating a few Japanese sources, there was relatively little difficulty in finding sources that should provide a solid basis for conducting an assessment of comparable depth to those conducted for US installations. Project implementation will likely be more of a challenge, but the feasibility analysis will be able to use the same basic steps, with some adjusted inputs, as PNNL’s established renewable resource assessment methodology.
Makris, Susan L.; Raffaele, Kathleen; Allen, Sandra; Bowers, Wayne J.; Hass, Ulla; Alleva, Enrico; Calamandrei, Gemma; Sheets, Larry; Amcoff, Patric; Delrue, Nathalie; Crofton, Kevin M.
2009-01-01
Objective We conducted a review of the history and performance of developmental neurotoxicity (DNT) testing in support of the finalization and implementation of Organisation of Economic Co-operation and Development (OECD) DNT test guideline 426 (TG 426). Information sources and analysis In this review we summarize extensive scientific efforts that form the foundation for this testing paradigm, including basic neurotoxicology research, interlaboratory collaborative studies, expert workshops, and validation studies, and we address the relevance, applicability, and use of the DNT study in risk assessment. Conclusions The OECD DNT guideline represents the best available science for assessing the potential for DNT in human health risk assessment, and data generated with this protocol are relevant and reliable for the assessment of these end points. The test methods used have been subjected to an extensive history of international validation, peer review, and evaluation, which is contained in the public record. The reproducibility, reliability, and sensitivity of these methods have been demonstrated, using a wide variety of test substances, in accordance with OECD guidance on the validation and international acceptance of new or updated test methods for hazard characterization. Multiple independent, expert scientific peer reviews affirm these conclusions. PMID:19165382
Identifying Sources of Bias in EFL Writing Assessment through Multiple Trait Scoring
ERIC Educational Resources Information Center
Salmani-Nodoushan, Mohammad Ali
2009-01-01
For purposes of the present study, it was hypothesized that field (in)dependence would introduce systematic variance into EFL learners' performance on composition tests. 1743 freshman, sophomore, junior, and senior students all majoring in English at different Iranian universities and colleges took the Group Embedded Figures Test (GEFT). The…
Erroneous Memories Arising from Repeated Attempts to Remember
ERIC Educational Resources Information Center
Henkel, Linda A.
2004-01-01
The impact of repeated and prolonged attempts at remembering on false memory rates was assessed in three experiments. Participants saw and imagined pictures and then made repeated recall attempts before taking a source memory test. Although the number of items recalled increased with repeated tests, the net gains were associated with more source…
Rater Expertise in a Second Language Speaking Assessment: The Influence of Training and Experience
ERIC Educational Resources Information Center
Davis, Lawrence Edward
2012-01-01
Speaking performance tests typically employ raters to produce scores; accordingly, variability in raters' scoring decisions has important consequences for test reliability and validity. One such source of variability is the rater's level of expertise in scoring. Therefore, it is important to understand how raters' performance is influenced by…
A recently published test method for Neocloeon triangulifer assessed the survival and growth of larval mayflies exposed to several reference toxicants (NaCl, KCl, and CuSO4). Results were not able to be replicated in subsequent experiments. To identify potential sources of variab...
NASA Technical Reports Server (NTRS)
Roman, Monsi C.; Mittelman, Marc W.
2010-01-01
This slide presentation summarizes the studies performed to assess the bulk phase microbial community during the Space Station Water Recovery Tests (WRT) from 1990-1998. These tests show that it is possible to recycle water from different sources, including urine, and produce water that can exceed the quality of municipally produced tap water.
Development of visible spectroscopy diagnostics for W sources assessment in WEST
NASA Astrophysics Data System (ADS)
Meyer, O.; Jones, O. M.; Giacalone, J. C.; Pascal, J. Y.; Raulin, D.; Xu, H.; Aumeunier, M. H.; Baude, R.; Escarguel, A.; Gil, C.; Harris, J. H.; Hatchressian, J.-C.; Klepper, C. C.; Larroque, S.; Lotte, Ph.; Moreau, Ph.; Pégourié, B.; Vartanian, S.
2016-11-01
The present work concerns the development of a W sources assessment system in the framework of the W (tungsten) Environment in Steady-state Tokamak (WEST) project, which aims at equipping the existing Tore Supra device with a tungsten divertor in order to test actively cooled tungsten Plasma Facing Components (PFCs) in view of preparing ITER operation. The goal is to assess W sources and D recycling with spectral, spatial, and temporal resolution adapted to the PFCs observed. The originality of the system is that all optical elements are installed in the vacuum vessel and compatible with steady state operation. Our system is optimized to measure radiance as low as 10^16 Ph/(m^2 s sr). A total of 240 optical fibers will be deployed to the detection systems such as the "Filterscope," developed by Oak Ridge National Laboratory (USA) and consisting of photomultiplier tubes and filters, or imaging spectrometers dedicated to Multiview analysis.
Comparison of seven protocols to identify fecal contamination sources using Escherichia coli
Stoeckel, D.M.; Mathes, M.V.; Hyer, K.E.; Hagedorn, C.; Kator, H.; Lukasik, J.; O'Brien, T. L.; Fenger, T.W.; Samadpour, M.; Strickler, K.M.; Wiggins, B.A.
2004-01-01
Microbial source tracking (MST) uses various approaches to classify fecal-indicator microorganisms to source hosts. Reproducibility, accuracy, and robustness of seven phenotypic and genotypic MST protocols were evaluated by use of Escherichia coli from an eight-host library of known-source isolates and a separate, blinded challenge library. In reproducibility tests, measuring each protocol's ability to reclassify blinded replicates, only one (pulsed-field gel electrophoresis; PFGE) correctly classified all test replicates to host species; three protocols classified 48-62% correctly, and the remaining three classified fewer than 25% correctly. In accuracy tests, measuring each protocol's ability to correctly classify new isolates, ribotyping with EcoRI and PvuII approached 100% correct classification but only 6% of isolates were classified; four of the other six protocols (antibiotic resistance analysis, PFGE, and two repetitive-element PCR protocols) achieved better than random accuracy rates when 30-100% of challenge isolates were classified. In robustness tests, measuring each protocol's ability to recognize isolates from nonlibrary hosts, three protocols correctly classified 33-100% of isolates as "unknown origin," whereas four protocols classified all isolates to a source category. A relevance test, summarizing interpretations for a hypothetical water sample containing 30 challenge isolates, indicated that false-positive classifications would hinder interpretations for most protocols. Study results indicate that more representation in known-source libraries and better classification accuracy would be needed before field application. Thorough reliability assessment of classification results is crucial before and during application of MST protocols.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fournier, K. B., E-mail: fournier2@llnl.gov; Brown, C. G.; Yeoman, M. F.
2016-11-15
Our team has developed an experimental platform to evaluate the x-ray-generated stress and impulse in materials. Experimental activities include x-ray source development, design of the sample mounting hardware and sensors interfaced to the National Ignition Facility’s diagnostics insertion system, and system integration into the facility. This paper focuses on the X-ray Transport and Radiation Response Assessment (XTRRA) test cassettes built for these experiments. The test cassette is designed to position six samples at three predetermined distances from the source, each known to within ±1% accuracy. Built-in calorimeters give in situ measurements of the x-ray environment along the sample lines of sight. The measured accuracy of sample responses as well as planned modifications to the XTRRA cassette is discussed.
Fournier, K. B.; Brown, Jr., C. G.; Yeoman, M. F.; ...
2016-08-10
Our team has developed an experimental platform to evaluate the x-ray-generated stress and impulse in materials. Experimental activities include x-ray source development, design of the sample mounting hardware and sensors interfaced to the NIF’s diagnostics insertion system, and system integration into the facility. This paper focuses on the X-ray Transport and Radiation Response Assessment (XTRRA) test cassettes built for these experiments. The test cassette is designed to position six samples at three predetermined distances from the source, each known to within ±1% accuracy. Built-in calorimeters give in situ measurements of the x-ray environment along the sample lines of sight. We discuss the measured accuracy of sample responses, as well as planned modifications to the XTRRA cassette.
Network sensitivity solutions for regional moment-tensor inversions
Ford, Sean R.; Dreger, Douglas S.; Walter, William R.
2010-09-20
Well-resolved moment-tensor solutions reveal information about the sources of seismic waves. In this paper, we introduce a new way of assessing confidence in the regional full moment-tensor inversion via the introduction of the network sensitivity solution (NSS). The NSS takes into account the unique station distribution, frequency band, and signal-to-noise ratio of a given event scenario. The NSS compares both a hypothetical pure source (for example, an explosion or an earthquake) and the actual data with several thousand sets of synthetic data from a uniform distribution of all possible sources. The comparison with a hypothetical pure source provides the theoretically best-constrained source-type distribution for a given set of stations; and with it, one can determine whether further analysis with the data is warranted. The NSS that employs the actual data gives a direct comparison of all other source types with the best-fit source. In this way, one can choose a threshold level of fit in which the solution is comfortably constrained. The method is tested for the well-recorded nuclear test, JUNCTION, at the Nevada Test Site. Sources that fit comparably well to a hypothetical pure explosion recorded with no noise at the JUNCTION data stations have a large volumetric component and are not described well by a double-couple (DC) source. The NSS using the real data from JUNCTION is even more tightly constrained to an explosion because the data contain some energy that precludes fitting with any type of deviatoric source. We also calculate the NSS for the October 2006 North Korea test and a nearby earthquake, where the station coverage is poor and the event magnitude is small. As a result, the earthquake solution is very well fit by a DC source, and the best-fit solution to the nuclear test (Mw 4.1) is dominantly an explosion.
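The core of the network sensitivity solution, comparing observed (or hypothetical pure-source) waveforms against synthetics from a large cloud of trial sources, can be sketched as follows. The uniform moment-tensor sampling, the variance-reduction misfit, and the toy forward operator are assumptions made for illustration; they are not the authors' implementation.

    import numpy as np

    rng = np.random.default_rng(1)

    def random_moment_tensor():
        """Draw a random symmetric moment tensor, normalized to unit scalar moment."""
        m = rng.normal(size=(3, 3))
        m = 0.5 * (m + m.T)
        return m / np.linalg.norm(m)

    def variance_reduction(d_obs, d_syn):
        """Standard waveform-fit measure: 1 - ||d_obs - d_syn||^2 / ||d_obs||^2."""
        return 1.0 - np.sum((d_obs - d_syn) ** 2) / np.sum(d_obs ** 2)

    def nss(d_obs, forward, n_samples=2000):
        """Return (trial moment tensors, waveform fits) for a cloud of sources."""
        sources = [random_moment_tensor() for _ in range(n_samples)]
        fits = np.array([variance_reduction(d_obs, forward(m)) for m in sources])
        return sources, fits

    # Toy forward operator: a fixed linear map from the 6 independent MT components.
    G = rng.normal(size=(200, 6))
    def forward(m):
        m6 = np.array([m[0, 0], m[1, 1], m[2, 2], m[0, 1], m[0, 2], m[1, 2]])
        return G @ m6

    d_obs = forward(random_moment_tensor())          # stand-in for observed data
    sources, fits = nss(d_obs, forward)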
Radiation Testing at Sandia National Laboratories: Sandia – JPL Collaboration for Europa Lander
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hattar, Khalid Mikhiel; Olszewska-Wasiolek, Maryla Aleksandra
Sandia National Laboratories (SNL) is assisting Jet Propulsion Laboratory in undertaking feasibility studies and performance assessments for the Planetary Protection aspect of the Europa Lander mission. The specific areas of interest for this project are described by task number. This white paper presents the evaluation results for Task 2, Radiation Testing, which was stated as follows: Survey SNL facilities and capabilities for simulating the Europan radiation environment and assess suitability for: A. Testing batteries, electronics, and other components and subsystems B. Exposing biological organisms to assess their survivability metrics. The radiation environment the Europa Lander will encounter en route and in orbit upon arrival at its destination consists primarily of charged particles, energetic protons and electrons with energies up to 1 GeV. The charged particle environments can be simulated using the accelerators at the Ion Beam Laboratory. The Gamma Irradiation Facility and its annex, the Low Dose Rate Irradiation Facility, offer irradiations using Co-60 gamma sources (1.17 and 1.33 MeV), as well as Cs-137 gamma (0.661 MeV) and AmBe neutron (0-10 MeV) sources.
Cryogenic Fluid Film Bearing Tester Development Study
NASA Technical Reports Server (NTRS)
Scharrer, Joseph K. (Editor); Murphy, Brian T.; Hawkins, Lawrence A.
1993-01-01
Conceptual designs were developed for the determination of rotordynamic coefficients of cryogenic fluid film bearings. The designs encompassed the use of magnetic and conventional excitation sources as well as the use of magnetic bearings as support bearings. Test article configurations reviewed included overhung, floating housing, and fixed housing. Uncertainty and forced response analyses were performed to assess quality of data and suitability of each for testing a variety of fluid film bearing designs. Development cost and schedule estimates were developed for each design. Facility requirements were reviewed and compared with existing MSFC capability. The recommended configuration consisted of a fixed test article housing centrally located between two magnetic bearings. The magnetic bearings would also serve as the excitation source.
Zeitler, Daniel M; Dorman, Michael F; Natale, Sarah J; Loiselle, Louise; Yost, William A; Gifford, Rene H
2015-09-01
To assess improvements in sound source localization and speech understanding in complex listening environments after unilateral cochlear implantation for single-sided deafness (SSD). Nonrandomized, open, prospective case series. Tertiary referral center. Nine subjects with a unilateral cochlear implant (CI) for SSD (SSD-CI) were tested. Reference groups for the task of sound source localization included young (n = 45) and older (n = 12) normal-hearing (NH) subjects and 27 bilateral CI (BCI) subjects. Unilateral cochlear implantation. Sound source localization was tested with 13 loudspeakers in a 180-degree arc in front of the subject. Speech understanding was tested with the subject seated in an 8-loudspeaker sound system arrayed in a 360-degree pattern. Directionally appropriate noise, originally recorded in a restaurant, was played from each loudspeaker. Speech understanding in noise was tested using the AzBio sentence test and sound source localization quantified using root mean square error. All CI subjects showed poorer-than-normal sound source localization. SSD-CI subjects showed a bimodal distribution of scores: six subjects had scores near the mean of those obtained by BCI subjects, whereas three had scores just outside the 95th percentile of NH listeners. Speech understanding improved significantly in the restaurant environment when the signal was presented to the side of the CI. Cochlear implantation for SSD can offer improved speech understanding in complex listening environments and improved sound source localization in both children and adults. On tasks of sound source localization, SSD-CI patients typically perform as well as BCI patients and, in some cases, achieve scores at the upper boundary of normal performance.
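Sound source localization accuracy above is quantified as a root mean square (RMS) error in degrees; a minimal sketch of that computation for a 13-loudspeaker, 180-degree arc follows, using invented trial data rather than the study's responses.

    import numpy as np

    speaker_az = np.linspace(-90, 90, 13)            # loudspeaker azimuths, degrees

    def rms_error(presented_az, reported_az):
        """RMS difference (degrees) between presented and reported source azimuths."""
        presented = np.asarray(presented_az, dtype=float)
        reported = np.asarray(reported_az, dtype=float)
        return float(np.sqrt(np.mean((presented - reported) ** 2)))

    rng = np.random.default_rng(0)
    presented = rng.choice(speaker_az, size=50)                  # 50 hypothetical trials
    reported = presented + rng.normal(scale=20.0, size=50)       # assumed response scatter
    print(f"RMS error = {rms_error(presented, reported):.1f} deg")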
A quality monitor and monitoring technique employing optically stimulated electron emission
NASA Technical Reports Server (NTRS)
Yost, William T. (Inventor); Welch, Christopher S. (Inventor); Joe, Edmond J. (Inventor); Hefner, Bill Bryan, Jr. (Inventor)
1995-01-01
A light source directs ultraviolet light onto a test surface and a detector detects a current of photoelectrons generated by the light. The detector includes a collector which is positively biased with respect to the test surface. Quality is indicated based on the photoelectron current. The collector is then negatively biased to replace charges removed by the measurement of a nonconducting substrate to permit subsequent measurements. Also, the intensity of the ultraviolet light at a particular wavelength is monitored and the voltage of the light source varied to maintain the light at a constant desired intensity. The light source is also cooled via a gas circulation system. If the test surface is an insulator, the surface is bombarded with ultraviolet light in the presence of an electron field to remove the majority of negative charges from the surface. The test surface is then exposed to an ion field until it possesses no net charge. The technique described above is then performed to assess quality.
Conceptualizing and assessing improvement capability: a review
Boaden, Ruth; Walshe, Kieran
2017-01-01
Abstract Purpose The literature is reviewed to examine how ‘improvement capability’ is conceptualized and assessed and to identify future areas for research. Data sources An iterative and systematic search of the literature was carried out across all sectors including healthcare. The search was limited to literature written in English. Data extraction The study identifies and analyses 70 instruments and frameworks for assessing or measuring improvement capability. Information about the source of the instruments, the sectors in which they were developed or used, the measurement constructs or domains they employ, and how they were tested was extracted. Results of data synthesis The instruments and framework constructs are very heterogeneous, demonstrating the ambiguity of improvement capability as a concept, and the difficulties involved in its operationalisation. Two-thirds of the instruments and frameworks have been subject to tests of reliability and half to tests of validity. Many instruments have little apparent theoretical basis and do not seem to have been used widely. Conclusion The assessment and development of improvement capability needs clearer and more consistent conceptual and terminological definition, used consistently across disciplines and sectors. There is scope to learn from existing instruments and frameworks, and this study proposes a synthetic framework of eight dimensions of improvement capability. Future instruments need robust testing for reliability and validity. This study contributes to practice and research by presenting the first review of the literature on improvement capability across all sectors including healthcare. PMID:28992146
Leaching of PFC from soils contaminated with PFC of different origin
NASA Astrophysics Data System (ADS)
Kalbe, Ute; Piechotta, Christian; Rothe, Robert
2017-04-01
Leaching tests are fundamental tools for the assessment of groundwater impact by contaminated soils concerning the soil-groundwater pathway. Such procedures are supposed to serve as the basis for a reliable leachate prognosis. They can be applied to determine the short- and long-term leaching behaviour as well as the source term of contaminated soils. For this purpose two types of leaching procedures have been validated in Germany for the examination of the leaching behaviour of frequently occurring organic substances (DIN 19528 - column test and DIN 19529 - batch test). Liquid-to-solid ratios (L/S) of 2 L/kg and 10 L/kg are the basis for the risk assessment which is implemented in different German regulations. The equivalence of test results for both tests for the same material under investigation has been investigated for a variety of pollutants in order to assess their reliability in compliance testing. However, for emerging pollutants there is hardly any data available on this issue. Leaching tests on soils contaminated with emerging pollutants such as PFC (Perfluorinated Surfactants) are currently coming more into consideration due to the increasing detection of contaminated sites. Therefore, two soils from different contamination sources (paper sludge containing compost and fire extinguishing foam) were investigated in this study using both leaching tests and both liquid-to-solid ratios. The leachability of the various perfluorinated compounds in relation to their content in solid matter was considered. Furthermore, the eluate pre-treatment prior to analysis (in particular the liquid/solid separation step needed for batch tests) has been taken into account. The comparability of the results from batch and column tests is dependent on the solubility of the various compounds, on the L/S and on the turbidity in the eluates.
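For readers unfamiliar with how batch and column eluate concentrations relate to released amounts, a small worked example follows: cumulative release per kilogram of dry soil is the eluate concentration multiplied by the liquid-to-solid ratio. The concentrations used are placeholders, not measurements from this study.

    # Release (ug/kg dry soil) = eluate concentration (ug/L) x L/S (L/kg).
    def released_amount(eluate_conc_ug_per_l, liquid_to_solid_l_per_kg):
        """Cumulative release in ug per kg dry soil at the given L/S ratio."""
        return eluate_conc_ug_per_l * liquid_to_solid_l_per_kg

    for ls, conc in [(2.0, 0.8), (10.0, 0.3)]:       # assumed PFC eluate concs, ug/L
        print(f"L/S {ls:>4} L/kg -> {released_amount(conc, ls):.1f} ug/kg released")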
ERIC Educational Resources Information Center
Aitken, Joan E.; Neer, Michael R.
This paper provides an example procedure used to design and install a program of assessment to improve communication instruction through a competency-based core curriculum at a mid-sized, urban university. The paper models the various steps in the process, and includes specific tests, forms, memos, course description, sources, and procedures which…
Role of NAEP Could Change with Common Assessments
ERIC Educational Resources Information Center
Cavanagh, Sean
2009-01-01
For decades, when elected officials, researchers, educators, and parents have wanted a clear-eyed measure of what students know in a range of subjects, they have turned to an authoritative source: the National Assessment of Educational Progress (NAEP). Now the country stands poised to enter a new testing era. All but two states have agreed to work…
Acoustic Source Localization in Aircraft Interiors Using Microphone Array Technologies
NASA Technical Reports Server (NTRS)
Sklanka, Bernard J.; Tuss, Joel R.; Buehrle, Ralph D.; Klos, Jacob; Williams, Earl G.; Valdivia, Nicolas
2006-01-01
Using three microphone array configurations at two aircraft body stations on a Boeing 777-300ER flight test, the acoustic radiation characteristics of the sidewall and outboard floor system are investigated by experimental measurement. Analysis of the experimental data is performed using sound intensity calculations for closely spaced microphones, PATCH Inverse Boundary Element Nearfield Acoustic Holography, and Spherical Nearfield Acoustic Holography. Each method is compared by assessing strengths and weaknesses, evaluating source identification capability for both broadband and narrowband sources, evaluating sources during transient and steady-state conditions, and quantifying field reconstruction continuity using multiple array positions.
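The sound intensity calculation for closely spaced microphones referred to above is typically a p-p (two-microphone) finite-difference estimate; a generic sketch of that formulation is given below and should not be read as the Boeing/NASA processing chain. The sample rate, microphone spacing, and test signal are assumptions.

    import numpy as np

    # p-p sound intensity estimate: pressure is the average of the two signals,
    # particle velocity comes from the finite-difference pressure gradient
    # integrated in time (Euler's equation).
    def pp_intensity(p1, p2, fs, spacing_m, rho=1.21):
        """Time-averaged active intensity (W/m^2) along the mic-pair axis."""
        p_mid = 0.5 * (p1 + p2)
        grad = (p2 - p1) / spacing_m
        u = -np.cumsum(grad) / (rho * fs)            # crude time integration
        return float(np.mean(p_mid * u))

    # Synthetic check: a 200 Hz plane wave sampled at 25.6 kHz with a 12 mm spacing.
    fs, f0, d = 25600, 200.0, 0.012
    t = np.arange(fs) / fs
    c, amp = 343.0, 1.0
    p1 = amp * np.sin(2 * np.pi * f0 * t)
    p2 = amp * np.sin(2 * np.pi * f0 * (t - d / c))  # delayed copy at the second mic
    print(pp_intensity(p1, p2, fs, d))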
Digital education and dynamic assessment of tongue diagnosis based on Mashup technique.
Tsai, Chin-Chuan; Lo, Yen-Cheng; Chiang, John Y; Sainbuyan, Natsagdorj
2017-01-24
To assess the digital education and dynamic assessment of tongue diagnosis based on the Mashup technique (DEDATD) according to a specific user's answering pattern, and to provide pertinent information tailored to the user's specific needs, supplemented by teaching materials constantly updated through the Mashup technique. Fifty-four undergraduate students were tested with the DEDATD developed. The efficacy of the DEDATD was evaluated based on pre- and post-test performance, with interleaving training sessions targeting the weaknesses of the student under test. The t-test demonstrated a significant difference between scores gained in the pre- and post-test sessions and a positive correlation between scores gained and length of time spent on learning, while no significant differences were found between gender and post-test score, or between students' year in school and the progress in scores gained. The DEDATD, coupled with the Mashup technique, could provide updated materials filtered through diverse sources located across the network. The dynamic assessment could tailor learning to each individual learner's needs and offer custom-made learning materials. The DEDATD represents a substantial improvement over traditional teaching methods.
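The pre-/post-test comparison described above is a standard paired t-test with an accompanying correlation against study time; a minimal sketch follows, using invented scores and study times rather than the data from the 54 students.

    from scipy import stats

    # Paired t-test on pre- and post-test scores, plus a correlation between the
    # score gains and minutes of study. All numbers are invented placeholders.
    pre = [55, 60, 48, 70, 62, 58, 65, 52, 67, 61]
    post = [68, 72, 59, 78, 70, 66, 75, 63, 80, 69]
    minutes = [30, 45, 20, 50, 40, 35, 55, 25, 60, 42]   # assumed study times

    t_stat, p_value = stats.ttest_rel(post, pre)
    gains = [p - q for p, q in zip(post, pre)]
    r, p_corr = stats.pearsonr(gains, minutes)
    print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}; gain vs. time r = {r:.2f}")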
Introducing genetic testing for cardiovascular disease in primary care: a qualitative study
Middlemass, Jo B; Yazdani, Momina F; Kai, Joe; Standen, Penelope J; Qureshi, Nadeem
2014-01-01
Background While primary care systematically offers conventional cardiovascular risk assessment, genetic tests for coronary heart disease (CHD) are increasingly commercially available to patients. It is unclear how individuals may respond to these new sources of risk information. Aim To explore how patients who have had a recent conventional cardiovascular risk assessment, perceive additional information from genetic testing for CHD. Design and setting Qualitative interview study in 12 practices in Nottinghamshire from both urban and rural settings. Method Interviews were conducted with 29 adults, who consented to genetic testing after having had a conventional cardiovascular risk assessment. Results Individuals’ principal motivation for genetic testing was their family history of CHD and a desire to convey the results to their children. After testing, however, there was limited recall of genetic test results and scepticism about the value of informing their children. Participants dealt with conflicting findings from the genetic test, family history, and conventional assessment by either focusing on genetic risk or environmental lifestyle factors. In some participants, genetic test results appeared to reinforce healthy behaviour but others were falsely reassured, despite having an ‘above-average’ conventional cardiovascular risk score. Conclusion Although genetic testing was acceptable, participants were unclear how to interpret genetic risk results. To facilitate healthy behaviour, health professionals should explore patients’ understanding of genetic test results in light of their family history and conventional risk assessment. PMID:24771842
An evaluation and assessment of flow quality in selected NASA wind tunnels
NASA Technical Reports Server (NTRS)
Harvey, W. D.; Stainback, P. C.; Owen, F. K.
1983-01-01
Tests have been conducted in a number of NASA wind tunnels to measure disturbance levels and spectra in their respective settling chambers, test sections, and diffusers to determine the sources of their disturbances. The present data supplements previous results in other NASA tunnels and adds to the ongoing acquisition of a disturbance level data base. The present results also serve to explain flow related sources which cause relatively large disturbance amplitudes at discrete frequencies. The installation of honeycomb, screens, and acoustic baffles in or upstream of the settling chamber can significantly reduce the disturbance levels.
Sokolova, Ekaterina; Petterson, Susan R; Dienus, Olaf; Nyström, Fredrik; Lindgren, Per-Eric; Pettersson, Thomas J R
2015-09-01
Norovirus contamination of drinking water sources is an important cause of waterborne disease outbreaks. Knowledge on pathogen concentrations in source water is needed to assess the ability of a drinking water treatment plant (DWTP) to provide safe drinking water. However, pathogen enumeration in source water samples is often not sufficient to describe the source water quality. In this study, the norovirus concentrations were characterised at the contamination source, i.e. in sewage discharges. Then, the transport of norovirus within the water source (the river Göta älv in Sweden) under different loading conditions was simulated using a hydrodynamic model. Based on the estimated concentrations in source water, the required reduction of norovirus at the DWTP was calculated using quantitative microbial risk assessment (QMRA). The required reduction was compared with the estimated treatment performance at the DWTP. The average estimated concentration in source water varied between 4.8×10^2 and 7.5×10^3 genome equivalents L^-1, and the average required reduction by treatment was between 7.6 and 8.8 log10. The treatment performance at the DWTP was estimated to be adequate to deal with all tested loading conditions, but was heavily dependent on chlorine disinfection, with the risk of poor reduction by conventional treatment and slow sand filtration. To our knowledge, this is the first article to employ discharge-based QMRA, combined with hydrodynamic modelling, in the context of drinking water. Copyright © 2015 Elsevier B.V. All rights reserved.
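The required-reduction calculation described above can be sketched in a few lines: given a source-water concentration, a daily consumption volume, a target risk, and a dose-response model, solve for the smallest log10 reduction that keeps the infection risk below the target. The exponential dose-response, its parameter, and the targets below are generic textbook placeholders, not the model or values used by the authors, so the printed numbers will not reproduce the paper's 7.6-8.8 log10 range.

    import math

    def required_log_reduction(source_conc_per_l, daily_intake_l=1.0,
                               target_daily_risk=2.7e-7, r=0.5):
        """Smallest LRV such that P_inf = 1 - exp(-r * dose) stays below the target."""
        max_dose = -math.log(1.0 - target_daily_risk) / r      # allowable daily dose
        return math.log10(source_conc_per_l * daily_intake_l / max_dose)

    for conc in (4.8e2, 7.5e3):                                # abstract's range, GE/L
        print(f"{conc:.1e} GE/L -> ~{required_log_reduction(conc):.1f} log10 required")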
Nickel release from surgical instruments and operating room equipment.
Boyd, Anne H; Hylwa, Sara A
2018-04-15
Background There has been no systematic study assessing nickel release from surgical instruments and equipment used within the operating suite. This equipment represents important potential sources of exposure for nickel-sensitive patients and hospital staff. To investigate nickel release from commonly used surgical instruments and operating room equipment. Using the dimethylglyoxime nickel spot test, a variety of surgical instruments and operating room equipment were tested for nickel release at our institution. Of the 128 surgical instruments tested, only 1 was positive for nickel release. Of the 43 operating room items tested, 19 were positive for nickel release, 7 of which have the potential for direct contact with patients and/or hospital staff. Hospital systems should be aware of surgical instruments and operating room equipment as potential sources of nickel exposure.
40 CFR 63.11498 - What are the standards and compliance requirements for wastewater systems?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Chemical Manufacturing Area Sources Standards and Compliance Requirements § 63.11498 What are the standards... each wastewater stream using process knowledge, engineering assessment, or test data. Also, you must...
Test readiness assessment summary for Integrated Dynamic Transit Operations (IDTO).
DOT National Transportation Integrated Search
2012-10-01
In support of USDOT's Intelligent Transportation Systems (ITS) Mobility Program, the Dynamic Mobility Applications (DMA) program seeks to create applications that fully leverage frequently collected and rapidly disseminated multi-source data gat...
40 CFR 63.11498 - What are the standards and compliance requirements for wastewater systems?
Code of Federal Regulations, 2012 CFR
2012-07-01
... Chemical Manufacturing Area Sources Standards and Compliance Requirements § 63.11498 What are the standards... each wastewater stream using process knowledge, engineering assessment, or test data. Also, you must...
Benson, Nsikak U.; Asuquo, Francis E.; Williams, Akan B.; Essien, Joseph P.; Ekong, Cyril I.; Akpabio, Otobong; Olajire, Abaas A.
2016-01-01
Concentrations of trace metals (Cd, Cr, Cu, Ni and Pb) in benthic sediments were analyzed through a multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in the Niger Delta (Nigeria). The degree of contamination was assessed using the individual contamination factors (ICF) and global contamination factor (GCF). Multivariate statistical approaches including principal component analysis (PCA), cluster analysis and correlation tests were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. Ecological risk assessment by ICF showed significant potential mobility and bioavailability for Cu and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metals contamination in the ecosystems was influenced by multiple pollution sources. PMID:27257934
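The individual and global contamination factors mentioned above are commonly computed from sequential-extraction data as the ratio of non-residual to residual fractions, summed across metals for the global factor; a sketch with invented fraction values (not the Niger Delta measurements) follows.

    # ICF = (sum of non-residual fractions) / residual fraction; GCF = sum of ICFs.
    fractions = {   # mg/kg in (exchangeable+carbonate, Fe/Mn-oxide, organic, residual)
        "Cu": (1.8, 2.2, 1.5, 2.0),
        "Cr": (0.9, 1.4, 1.1, 1.6),
        "Ni": (0.5, 0.7, 0.6, 1.2),
        "Cd": (0.10, 0.12, 0.08, 0.25),
        "Pb": (0.6, 0.9, 0.7, 3.5),
    }

    icf = {m: sum(f[:-1]) / f[-1] for m, f in fractions.items()}   # per-metal ICF
    gcf = sum(icf.values())                                        # global factor
    print({m: round(v, 2) for m, v in icf.items()}, round(gcf, 2))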
Blood: Tests Used to Assess the Physiological and Immunological Properties of Blood
ERIC Educational Resources Information Center
Quinn, J. G.; Tansey, E. A.; Johnson, C. D.; Roe, S. M.; Montgomery, L. E. A.
2016-01-01
The properties of blood and the relative ease of access to which it can be retrieved make it an ideal source to gauge different aspects of homeostasis within an individual, form an accurate diagnosis, and formulate an appropriate treatment regime. Tests used to determine blood parameters such as the erythrocyte sedimentation rate, hemoglobin…
Cao, Weidan; Zhang, Xinyao; Xu, Kaibin; Wang, Yuanxin
2016-09-01
The outbreak of severe acute respiratory syndrome (SARS) in 2003 marked the explosion of health information seeking online in China and the increasing emergence of Chinese health websites. There are both benefits and potential hazards of people's online health information seeking. This article intended to test part of Wilson's second model of information behavior, including source characteristics and activating mechanisms, and to identify the relationships among perceived access, perceived expertise credibility, reward assessment, Internet self-efficacy, and online health information-seeking behavior. Data were drawn from face-to-face surveys and an online survey of health information seekers (N = 393) in China. The results showed that source characteristics predicted activating mechanisms, which in turn predicted online health information-seeking behavior. Activating mechanisms, that is, reward assessment and Internet self-efficacy, mediated the relationship between source characteristics (i.e., access and credibility) and online health information-seeking behavior. Strategies for improving information access, expertise credibility, and Internet self-efficacy are discussed in order to maximize the benefits of online health information seeking and to minimize the potential harm.
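The mediation claim above (source characteristics acting on information-seeking behavior through activating mechanisms) is often checked with a simple regression-based indirect effect; a minimal simulated sketch follows, in which the path strengths, noise levels, and variable mapping are assumptions rather than the study's actual analysis.

    import numpy as np

    # Regression-based mediation sketch (indirect effect = a*b):
    # X ~ source characteristic, M ~ activating mechanism, Y ~ seeking behavior.
    rng = np.random.default_rng(42)
    n = 393                                   # matches the reported sample size
    X = rng.normal(size=n)
    M = 0.6 * X + rng.normal(scale=0.8, size=n)             # a-path (assumed)
    Y = 0.5 * M + 0.1 * X + rng.normal(scale=0.8, size=n)   # b-path plus direct effect

    def slope(y, *preds):
        """OLS coefficients of y on the given predictors (intercept dropped)."""
        A = np.column_stack([np.ones_like(y)] + list(preds))
        return np.linalg.lstsq(A, y, rcond=None)[0][1:]

    a = slope(M, X)[0]
    b, c_prime = slope(Y, M, X)
    print(f"indirect effect a*b = {a*b:.2f}, direct effect c' = {c_prime:.2f}")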
Improved source inversion from joint measurements of translational and rotational ground motions
NASA Astrophysics Data System (ADS)
Donner, S.; Bernauer, M.; Reinwald, M.; Hadziioannou, C.; Igel, H.
2017-12-01
Waveform inversion for seismic point (moment tensor) and kinematic sources is a standard procedure. However, especially at local and regional distances, a lack of appropriate velocity models, sparse station networks, or a low signal-to-noise ratio combined with more complex waveforms hampers the successful retrieval of reliable source solutions. We assess the potential of rotational ground motion recordings to increase the resolution power and reduce non-uniqueness for point and kinematic source solutions. Based on synthetic waveform data, we perform a Bayesian (i.e. probabilistic) inversion. Thus, we avoid the subjective selection of the most reliable solution according to the lowest misfit or other constructed criterion. In addition, we obtain unbiased measures of resolution and possible trade-offs. Testing different earthquake mechanisms and scenarios, we can show that the resolution of the source solutions can be improved significantly. Depth-dependent components in particular show significant improvement. In addition to synthetic data from full station networks, we also tested sparse-network and single-station cases.
Nevada Test Site Wetlands Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. J. Hansen
1997-05-01
This report identifies 16 Nevada Test Site (NTS) natural water sources that may be classified by the U.S. Army Corps of Engineers (USACE) as jurisdictional wetlands and identifies eight water sources that may be classified as waters of the United States. These water sources are rare, localized habitats on the NTS that are important to regional wildlife and to isolated populations of water tolerant plants and aquatic organisms. No field investigations on the NTS have been conducted in the past to identify those natural water sources which would be protected as rare habitats and which may fall under regulatory authority of the Clean Water Act (CWA) of 1997. This report identifies and summarizes previous studies of NTS natural water sources, and identifies the current DOE management practices related to the protection of NTS wetlands. This report also presents management goals specific for NTS wetlands that incorporate the intent of existing wetlands legislation, the principles of ecosystem management, and the interests of regional land managers and other stakeholders.
Illuminant and observer metamerism and the Hardy-Rand-Rittler color vision test editions.
Dain, Stephen J
2006-01-01
A previous study identified a significant metamerism in the several editions of the Hardy-Rand-Rittler pseudoisochromatic plates (HRR) but did not proceed to quantify the consequences of that metamerism (Dain, 2004). Metamerism arises from two sources and is almost inevitable when a printed color vision test is reproduced in several editions. Metamerism has two consequences; these are illuminant/source-based changes in performance and changes in performance with observer (less well known) when assessing anomalous trichromats. This study addresses the effects of illuminant/source and observer metamerism on the fourth editions of HRR. Groups of colors intended to lie on a dichromat confusion line generally remain on a confusion line when the source is changed. The plates appear to be resistant to each form of metamerism, perhaps because the features of the spectral reflectance are similar for figure color and background gray. As a consequence, the clinician needs to be less concerned about using a non-recommended source than was previously believed.
Kumpel, Emily; Peletz, Rachel; Bonham, Mateyo; Khush, Ranjiv
2016-10-18
Universal access to safe drinking water is prioritized in the post-2015 Sustainable Development Goals. Collecting reliable and actionable water quality information in low-resource settings, however, is challenging, and little is known about the correspondence between water quality data collected by local monitoring agencies and global frameworks for water safety. Using 42 926 microbial water quality test results from 32 surveillance agencies and water suppliers in seven sub-Saharan African countries, we determined the degree to which water sources were monitored, how water quality varied by source type, and institutional responses to results. Sixty-four percent of the water samples were collected from piped supplies, although the majority of Africans rely on nonpiped sources. Piped supplies had the lowest levels of fecal indicator bacteria (FIB) compared to any other source type: only 4% of samples of water piped to plots and 2% of samples from water piped to public taps/standpipes were positive for FIB (n = 14 948 and n = 12 278, respectively). Among other types of improved sources, samples from harvested rainwater and boreholes were less often positive for FIB (22%, n = 167 and 31%, n = 3329, respectively) than protected springs or protected dug wells (39%, n = 472 and 65%, n = 505). When data from different settings were aggregated, the FIB levels in different source types broadly reflected the source-type water safety framework used by the Joint Monitoring Programme. However, the insufficient testing of nonpiped sources relative to their use indicates important gaps in current assessments. Our results emphasize the importance of local data collection for water safety management and measurement of progress toward universal safe drinking water access.
Microbial Groundwater Quality Status of Hand-Dug Wells and Boreholes in the Dodowa Area of Ghana
Lutterodt, George; Hoiting, Yvonne; Kamara, Alimamy K.; Oduro-Kwarteng, Sampson; Foppen, Jan Willem A.
2018-01-01
To assess the suitability of water sources for drinking purposes, samples were taken from groundwater sources (boreholes and hand-dug wells) used for drinking water in the Dodowa area of Ghana. The samples were analyzed for the presence of fecal indicator bacteria (Escherichia coli) and viruses (Adenovirus and Rotavirus), using membrane filtration with plating and glass wool filtration with quantitative polymerase chain reaction (PCR), respectively. In addition, sanitary inspection of surroundings of the sources was conducted to identify their vulnerability to pollution. The presence of viruses was also assessed in water samples from the Dodowa River. More than 70% of the hand-dug wells were sited within 10 m of nearby sources of contamination. All sources contained E. coli bacteria, and their numbers in samples of water between dug wells and boreholes showed no significant difference (p = 0.48). Quantitative PCR results for Adenovirus indicated 27% and 55% were positive for the boreholes and hand-dug wells, respectively. Samples from all boreholes tested negative for the presence of Rotavirus while 27% of the dug wells were positive for Rotavirus. PCR tests of 20% of groundwater samples were inhibited. Based on these results we concluded that there is systemic microbial and fecal contamination of groundwater in the area. On-site sanitation facilities, e.g., pit latrines and unlined wastewater drains, are likely the most common sources of fecal contamination of groundwater in the area. Water abstracted from groundwater sources needs to be treated before use for consumption purposes. In addition, efforts should be made to delineate protected areas around groundwater abstraction points to minimize contamination from point sources of pollution. PMID:29649111
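The abstract reports no significant difference in E. coli numbers between dug wells and boreholes (p = 0.48) without naming the test used; the sketch below applies a nonparametric Mann-Whitney U comparison as one plausible choice, on hypothetical counts.

```python
import numpy as np
from scipy import stats

# Hypothetical E. coli counts (CFU/100 mL); the study's actual data are not shown.
dug_wells = np.array([12, 150, 30, 8, 400, 55, 90, 23])
boreholes = np.array([5, 60, 200, 15, 75, 10, 33, 120])

# Counts are skewed, so a nonparametric two-sample comparison is a common choice.
u_stat, p_value = stats.mannwhitneyu(dug_wells, boreholes, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.2f}")  # p > 0.05 -> no evidence of a difference
```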
Quantitative assessment of workload and stressors in clinical radiation oncology.
Mazur, Lukasz M; Mosaly, Prithima R; Jackson, Marianne; Chang, Sha X; Burkhardt, Katharin Deschesne; Adams, Robert D; Jones, Ellen L; Hoyle, Lesley; Xu, Jing; Rockwell, John; Marks, Lawrence B
2012-08-01
Workload level and sources of stressors have been implicated as sources of error in multiple settings. We assessed workload levels and sources of stressors among radiation oncology professionals. Furthermore, we explored the potential association between workload and the frequency of radiotherapy incidents reported by the World Health Organization (WHO). Data collection was aimed at various tasks performed by 21 study participants from different radiation oncology professional subgroups (simulation therapists, radiation therapists, physicists, dosimetrists, and physicians). Workload was assessed using the National Aeronautics and Space Administration Task-Load Index (NASA TLX). Sources of stressors were quantified using observational methods and segregated using a standard taxonomy. Comparisons between professional subgroups and tasks were made using analysis of variance (ANOVA), multivariate ANOVA, and the Duncan test. An association between workload levels (NASA TLX) and the frequency of radiotherapy incidents (WHO incidents) was explored (Pearson correlation test). A total of 173 workload assessments were obtained. Overall, simulation therapists had relatively low workloads (NASA TLX range, 30-36), and physicists had relatively high workloads (NASA TLX range, 51-63). NASA TLX scores for physicians, radiation therapists, and dosimetrists ranged from 40-52. There was marked intertask/professional subgroup variation (P<.0001). Mental demand (P<.001), physical demand (P=.001), and effort (P=.006) significantly differed among professional subgroups. Typically, there were 3-5 stressors per cycle of analyzed tasks with the following distribution: interruptions (41.4%), time factors (17%), technical factors (13.6%), teamwork issues (11.6%), patient factors (9.0%), and environmental factors (7.4%). A positive association between workload and the frequency of radiotherapy incidents reported by the WHO was found (r = 0.87, P=.045). Workload level and sources of stressors vary among professional subgroups. Understanding the factors that influence these findings can guide adjustments to the workflow procedures, physical layout, and/or communication protocols to enhance safety. Additional evaluations are needed in order to better understand if these findings are systemic. Copyright © 2012 Elsevier Inc. All rights reserved.
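As an illustration of the exploratory association reported here (r = 0.87, P=.045), the sketch below computes a Pearson correlation between subgroup-level mean NASA TLX scores and incident frequencies; the numbers are invented for illustration, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical data: mean NASA TLX score per professional subgroup and the
# frequency of corresponding WHO-reported radiotherapy incidents (illustrative only).
subgroups = ["simulation therapist", "radiation therapist", "dosimetrist",
             "physicist", "physician"]
mean_tlx = np.array([33, 44, 48, 57, 46])      # unweighted ("raw") TLX means
incident_freq = np.array([4, 9, 11, 16, 10])   # incidents attributed to related tasks

# Association between workload and incident frequency, as in the abstract's
# exploratory analysis.
r, p = stats.pearsonr(mean_tlx, incident_freq)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```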
Quantitative Assessment of Workload and Stressors in Clinical Radiation Oncology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mazur, Lukasz M., E-mail: lukasz_mazur@ncsu.edu; Industrial Extension Service, North Carolina State University, Raleigh, North Carolina; Biomedical Engineering, North Carolina State University, Raleigh, North Carolina
2012-08-01
Purpose: Workload level and sources of stressors have been implicated as sources of error in multiple settings. We assessed workload levels and sources of stressors among radiation oncology professionals. Furthermore, we explored the potential association between workload and the frequency of radiotherapy incidents reported by the World Health Organization (WHO). Methods and Materials: Data collection was aimed at various tasks performed by 21 study participants from different radiation oncology professional subgroups (simulation therapists, radiation therapists, physicists, dosimetrists, and physicians). Workload was assessed using the National Aeronautics and Space Administration Task-Load Index (NASA TLX). Sources of stressors were quantified using observational methods and segregated using a standard taxonomy. Comparisons between professional subgroups and tasks were made using analysis of variance (ANOVA), multivariate ANOVA, and the Duncan test. An association between workload levels (NASA TLX) and the frequency of radiotherapy incidents (WHO incidents) was explored (Pearson correlation test). Results: A total of 173 workload assessments were obtained. Overall, simulation therapists had relatively low workloads (NASA TLX range, 30-36), and physicists had relatively high workloads (NASA TLX range, 51-63). NASA TLX scores for physicians, radiation therapists, and dosimetrists ranged from 40-52. There was marked intertask/professional subgroup variation (P<.0001). Mental demand (P<.001), physical demand (P=.001), and effort (P=.006) significantly differed among professional subgroups. Typically, there were 3-5 stressors per cycle of analyzed tasks with the following distribution: interruptions (41.4%), time factors (17%), technical factors (13.6%), teamwork issues (11.6%), patient factors (9.0%), and environmental factors (7.4%). A positive association between workload and the frequency of radiotherapy incidents reported by the WHO was found (r = 0.87, P=.045). Conclusions: Workload level and sources of stressors vary among professional subgroups. Understanding the factors that influence these findings can guide adjustments to the workflow procedures, physical layout, and/or communication protocols to enhance safety. Additional evaluations are needed in order to better understand if these findings are systemic.
Next Generation of Leaching Tests
A corresponding abstract has been cleared for this presentation. The four methods comprising the Leaching Environmental Assessment Framework are described along with the tools to support implementation of the more rigorous and accurate source terms that are developed using LEAF ...
Rey-Martinez, Jorge; Pérez-Fernández, Nicolás
2016-12-01
The proposed validation goal of 0.9 for the intra-class correlation coefficient was reached with the results of this study. With the obtained results we consider that the developed software (RombergLab) is a validated balance assessment software. The reliability of this software depends on the technical specifications of the force platform used. The objective was to develop and validate a posturography software and share its source code in open source terms. Prospective non-randomized validation study: 20 consecutive adults underwent two balance assessment tests; six-condition posturography was performed using a clinically approved software and force platform, and the same conditions were measured using the newly developed open source software with a low-cost force platform. The intra-class correlation index of the sway area obtained from the center of pressure variations in both devices for the six conditions was the main variable used for validation. Excellent concordance between RombergLab and the clinically approved force platform was obtained (intra-class correlation coefficient = 0.94). A Bland and Altman graphic concordance plot was also obtained. The source code used to develop RombergLab was published in open source terms.
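The two concordance summaries mentioned (intra-class correlation and a Bland-Altman plot) can be computed from paired sway-area measurements as sketched below. The values are hypothetical, and the ICC form chosen here (two-way random effects, single measure, ICC(2,1)) is an assumption, since the abstract does not specify the variant.

```python
import numpy as np

# Hypothetical paired sway-area measurements (cm^2): one column per device
# (clinical platform vs. RombergLab + low-cost platform), one row per test.
clinical = np.array([2.1, 3.4, 1.8, 4.0, 2.7, 3.1, 5.2, 2.9])
romberglab = np.array([2.3, 3.2, 1.9, 4.3, 2.5, 3.3, 5.0, 3.1])
data = np.column_stack([clinical, romberglab])
n, k = data.shape

# Two-way random-effects, single-measure ICC(2,1) from the classical
# mean-square decomposition (Shrout & Fleiss).
grand = data.mean()
ms_rows = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)   # subjects
ms_cols = n * np.sum((data.mean(axis=0) - grand) ** 2) / (k - 1)   # devices
ss_err = np.sum((data - data.mean(axis=1, keepdims=True)
                 - data.mean(axis=0, keepdims=True) + grand) ** 2)
ms_err = ss_err / ((n - 1) * (k - 1))
icc21 = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                              + k * (ms_cols - ms_err) / n)
print(f"ICC(2,1) = {icc21:.2f}")

# Bland-Altman limits of agreement between the two devices.
diff = romberglab - clinical
bias, sd = diff.mean(), diff.std(ddof=1)
print(f"bias = {bias:.2f}, 95% limits of agreement = "
      f"[{bias - 1.96 * sd:.2f}, {bias + 1.96 * sd:.2f}]")
```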
Petty, J.D.; Jones, S.B.; Huckins, J.N.; Cranor, W.L.; Parris, J.T.; McTague, T.B.; Boyle, T.P.
2000-01-01
As an integral part of our continued development of water quality assessment approaches, we combined integrative sampling, instrumental analysis of widely occurring anthropogenic contaminants, and the application of a suite of bioindicator tests as a specific part of a broader survey of ecological conditions, species diversity, and habitat quality in the Santa Cruz River in Arizona, USA. Lipid-containing semipermeable membrane devices (SPMDs) were employed to sequester waterborne hydrophobic chemicals. Instrumental analysis and a suite of bioindicator tests were used to determine the presence and potential toxicological relevance of mixtures of bioavailable chemicals in two major water sources of the Santa Cruz River. The SPMDs were deployed at two sites: the effluent weir of the International Wastewater Treatment Plant (IWWTP) and the Nogales Wash. Both of these systems empty into the Santa Cruz River, and the IWWTP effluent is a potential source of water for a constructed wetland complex. Analysis of the SPMD sample extracts revealed the presence of organochlorine pesticides (OCs), polychlorinated biphenyls (PCBs), and polycyclic aromatic hydrocarbons (PAHs). The bioindicator tests demonstrated increased liver enzyme activity, perturbation of neurotransmitter systems, and potential endocrine-disrupting effects (vitellogenin induction) in fish exposed to the extracts. With increasing global demands on limited water resources, the approach described herein provides an assessment paradigm applicable to determining the quality of water in a broad range of aquatic systems.
NASA Technical Reports Server (NTRS)
Smith, Ramsey; Reuter, Dennis; Irons, James; Lunsford, Allen; Montanero, Matthew; Tesfaye, Zelalem; Wenny, Brian; Thome, Kurtis
2011-01-01
The preflight calibration testing of TIRS evaluates the performance of the instrument at the component, subsystem, and system level. The overall objective is to provide an instrument that is well calibrated and well characterized, with specification-compliant data that will ensure the data continuity of Landsat from the previous missions to the LDCM. The TIRS flight build unit and the flight instrument were assessed through a series of calibration tests at NASA Goddard Space Flight Center. Instrument-level requirements played a strong role in defining the test equipment and procedures used for the calibration in the thermal/vacuum chamber. The calibration ground support equipment (CGSE), manufactured by MEI and ATK Corporation, was used to measure the optical, radiometric, and geometric characteristics of TIRS. The CGSE operates in three test configurations: GeoRad (geometric, radiometric, and spatial), flood source, and spectral. TIRS was evaluated through the following tests: bright target recovery, radiometry, spectral response, spatial shape, scatter, stray light, focus, and uniformity. Data were obtained for the instrument and various subsystems under conditions simulating those on orbit. In the spectral configuration, a monochromator system with a blackbody source is used for in-band and out-of-band relative spectral response characterization. In the flood source configuration, the entire focal plane array is illuminated simultaneously to investigate pixel-to-pixel uniformity and dead or inoperable pixels. The remaining tests were executed in the GeoRad configuration and use a NIST-calibrated cavity blackbody source. The NIST calibration is transferred to the TIRS sensor and to the blackbody source on board TIRS. The onboard calibrator will be the primary calibration source for the TIRS sensor on orbit.
PFLOTRAN-RepoTREND Source Term Comparison Summary.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frederick, Jennifer M.
Code inter-comparison studies are useful exercises to verify and benchmark independently developed software to ensure proper function, especially when the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment. This summary describes the results of the first portion of the code inter-comparison between PFLOTRAN and RepoTREND, which compares the radionuclide source term used in a typical performance assessment.
Earthquake Source Inversion Blindtest: Initial Results and Further Developments
NASA Astrophysics Data System (ADS)
Mai, P.; Burjanek, J.; Delouis, B.; Festa, G.; Francois-Holden, C.; Monelli, D.; Uchide, T.; Zahradnik, J.
2007-12-01
Images of earthquake ruptures, obtained from modelling/inverting seismic and/or geodetic data, exhibit a high degree of spatial complexity. This earthquake source heterogeneity controls seismic radiation, and is determined by the details of the dynamic rupture process. In turn, such rupture models are used for studying source dynamics and for ground-motion prediction. But how reliable and trustworthy are these earthquake source inversions? Rupture models for a given earthquake, obtained by different research teams, often display striking disparities (see http://www.seismo.ethz.ch/srcmod). However, well resolved, robust, and hence reliable source-rupture models are an integral part of efforts to better understand earthquake source physics and to improve seismic hazard assessment. Therefore it is timely to conduct a large-scale validation exercise for comparing the methods, parameterization, and data-handling in earthquake source inversions. We recently started a blind test in which several research groups derive a kinematic rupture model from synthetic seismograms calculated for an input model unknown to the source modelers. The first results, for an input rupture model with heterogeneous slip but constant rise time and rupture velocity, reveal large differences between the input and inverted model in some cases, while a few studies achieve high correlation between the input and inferred model. Here we report on the statistical assessment of the set of inverted rupture models to quantitatively investigate their degree of (dis-)similarity. We briefly discuss the different inversion approaches, their possible strengths and weaknesses, and the use of appropriate misfit criteria. Finally we present new blind-test models, with increasing source complexity and ambient noise on the synthetics. The goal is to attract a large group of source modelers to join this source-inversion blind test in order to conduct a large-scale validation exercise to rigorously assess the performance and reliability of current inversion methods and to discuss future developments.
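One simple way to quantify the (dis-)similarity between an input rupture model and an inverted one is a zero-lag pattern correlation on co-registered slip grids, sketched below with synthetic arrays; this is only an illustrative metric, not necessarily the statistic used in the blind test.

```python
import numpy as np

def slip_correlation(input_slip: np.ndarray, inverted_slip: np.ndarray) -> float:
    """Zero-lag normalized correlation between two co-registered slip maps.

    Both arrays are 2-D (along-strike x down-dip) on the same fault grid.
    Returns a value in [-1, 1]; 1 means an identical spatial pattern.
    """
    a = input_slip - input_slip.mean()
    b = inverted_slip - inverted_slip.mean()
    return float(np.sum(a * b) / np.sqrt(np.sum(a**2) * np.sum(b**2)))

# Hypothetical example: a smooth input asperity vs. a noisier inverted model.
rng = np.random.default_rng(0)
x, z = np.meshgrid(np.linspace(0, 1, 40), np.linspace(0, 1, 20))
input_slip = np.exp(-((x - 0.6) ** 2 + (z - 0.4) ** 2) / 0.02)
inverted_slip = input_slip + 0.3 * rng.standard_normal(input_slip.shape)

print(f"pattern correlation = {slip_correlation(input_slip, inverted_slip):.2f}")
```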
High performance jet-engine flight test data base for HSR
NASA Technical Reports Server (NTRS)
Kelly, Jeffrey
1992-01-01
The primary acoustic priority of the flight test data base for HSR is the validation of the NASA Aircraft Noise Prediction Program (ANOPP) and other source noise codes. Also, the noise measurements are an important support function for the High Lift Program devoted to HSR. Another concern that will be addressed is a possible noise problem 7-20 miles from take-off during climbout. This concern arises because the higher speeds envisioned for the HSCT, compared to conventional aircraft, cause noise levels to increase through Doppler amplification in conjunction with high source levels due to jet noise. An attempt may be made to measure airframe noise for the F-16XL test, which would provide an assessment of this noise component for delta wing aircraft.
Assessment of COPD-related outcomes via a national electronic medical record database.
Asche, Carl; Said, Quayyim; Joish, Vijay; Hall, Charles Oaxaca; Brixner, Diana
2008-01-01
The technology and sophistication of healthcare utilization databases have expanded over the last decade to include results of lab tests, vital signs, and other clinical information. This review provides an assessment of the methodological and analytical challenges of conducting chronic obstructive pulmonary disease (COPD) outcomes research in a national electronic medical records (EMR) dataset and its potential application towards the assessment of national health policy issues, as well as a description of the challenges or limitations. An EMR database and its application to measuring outcomes for COPD are described. The ability to measure adherence to the COPD evidence-based practice guidelines, generated by the NIH and HEDIS quality indicators, in this database was examined. Case studies, before and after their publication, were used to assess the adherence to guidelines and gauge the conformity to quality indicators. EMR was the only source of information for pulmonary function tests, but low frequency in ordering by primary care was an issue. The EMR data can be used to explore impact of variation in healthcare provision on clinical outcomes. The EMR database permits access to specific lab data and biometric information. The richness and depth of information on "real world" use of health services for large population-based analytical studies at relatively low cost render such databases an attractive resource for outcomes research. Various sources of information exist to perform outcomes research. It is important to understand the desired endpoints of such research and choose the appropriate database source.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neary, Vincent Sinclair; Yang, Zhaoqing; Wang, Taiping
A wave model test bed is established to benchmark, test and evaluate spectral wave models and modeling methodologies (i.e., best practices) for predicting the wave energy resource parameters recommended by the International Electrotechnical Commission, IEC TS 62600-101 Ed. 1.0 (2015). Among other benefits, the model test bed can be used to investigate the suitability of different models, specifically what source terms should be included in spectral wave models under different wave climate conditions and for different classes of resource assessment. The overarching goal is to use these investigations to provide industry guidance for model selection and modeling best practices depending on the wave site conditions and desired class of resource assessment. Modeling best practices are reviewed, and limitations and knowledge gaps in predicting wave energy resource parameters are identified.
Gomathi, Kadayam G; Ahmed, Soofia; Sreedharan, Jayadevan
2012-01-01
Objectives: The aim of this study was to assess the psychological health of first-year health professional students and to study sources of student stress. Methods: All first-year students (N = 125) of the Gulf Medical University (GMU) in Ajman, United Arab Emirates (UAE), were invited to participate in a voluntary, anonymous, self-administered, questionnaire-based survey in January 2011. Psychological health was assessed using the 12-item General Health Questionnaire. A 24-item questionnaire, with items related to academic, psychosocial and health domains was used to identify sources of stress. Pearson’s chi-squared test and the Mann-Whitney U-test were used for testing the association between psychological morbidity and sources of stress. Results: A total of 112 students (89.6%) completed the survey and the overall prevalence of psychological morbidity was found to be 33.6%. The main academic-related sources of stress were ‘frequency of exams’, ‘academic workload’, and ‘time management’. Major psychosocial stressors were ‘worries regarding future’, ‘high parental expectations’, ‘anxiety’, and ‘dealing with members of the opposite sex’. Health-related issues were ‘irregular eating habits’, ‘lack of exercise’, and ‘sleep-related problems’. Psychological morbidity was not significantly associated with any of the demographic factors studied. However, total stress scores and academics-related domain scores were significantly associated with psychological morbidity. Conclusion: Psychological morbidity was seen in one in three first-year students attending GMU. While worries regarding the future and parental expectations were sources of stress for many students, psychological morbidity was found to be significantly associated with only the total stress and the academic-related domain scores. PMID:22548140
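A minimal sketch of the conventional GHQ-12 scoring step is given below: responses are collapsed with the bimodal 0-0-1-1 scheme and a caseness cutoff is applied. The cutoff of 4 and the example responses are assumptions; the study's exact threshold is not stated in the abstract.

```python
import numpy as np

def ghq12_caseness(responses: np.ndarray, cutoff: int = 4) -> np.ndarray:
    """Flag probable psychological morbidity from GHQ-12 responses.

    `responses` is an (n_students, 12) array of Likert answers coded 0-3.
    The conventional bimodal scoring collapses 0/1 -> 0 and 2/3 -> 1; a total
    at or above `cutoff` is treated as a probable case. The cutoff of 4 is an
    assumption here, not necessarily the one used in the study.
    """
    bimodal = (responses >= 2).astype(int)   # 0-0-1-1 scoring
    totals = bimodal.sum(axis=1)             # 0..12 per student
    return totals >= cutoff

# Hypothetical responses for three students.
resp = np.array([[0, 1, 2, 3, 1, 0, 2, 2, 1, 0, 3, 2],
                 [0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0],
                 [3, 2, 2, 3, 2, 1, 2, 3, 2, 2, 1, 3]])
print(ghq12_caseness(resp))                               # [ True False  True]
print(f"prevalence = {ghq12_caseness(resp).mean():.0%}")  # 67%
```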
ERIC Educational Resources Information Center
Liu, Xiufeng; Ruiz, Miguel E.
2008-01-01
This article reports a study on using data mining to predict K-12 students' competence levels on test items related to energy. Data sources are the 1995 Third International Mathematics and Science Study (TIMSS), 1999 TIMSS-Repeat, 2003 Trend in International Mathematics and Science Study (TIMSS), and the National Assessment of Educational…
ERIC Educational Resources Information Center
Leighton, Jacqueline P.; Bustos Gómez, María Clara
2018-01-01
Formative assessments and feedback are vital to enhancing learning outcomes but require that learners feel at ease identifying their errors, and receiving feedback from a trusted source--teachers. An experimental test of a new theoretical framework was conducted to cultivate a pedagogical alliance to enhance students' (a) trust in the teacher, (b)…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peruzzo, S., E-mail: simone.peruzzo@igi.cnr.it; Cervaro, V.; Dalla Palma, M.
2016-02-15
This paper presents the results of numerical simulations and experimental tests carried out to assess the feasibility and suitability of graphite castellated tiles as beam-facing component in the diagnostic calorimeter of the negative ion source SPIDER (Source for Production of Ions of Deuterium Extracted from Radio frequency plasma). The results indicate that this concept could be a reliable, although less performing, alternative for the present design based on carbon fiber composite tiles, as it provides thermal measurements on the required spatial scale.
NASA Astrophysics Data System (ADS)
Peruzzo, S.; Cervaro, V.; Dalla Palma, M.; Delogu, R.; De Muri, M.; Fasolo, D.; Franchin, L.; Pasqualotto, R.; Pimazzoni, A.; Rizzolo, A.; Tollin, M.; Zampieri, L.; Serianni, G.
2016-02-01
This paper presents the results of numerical simulations and experimental tests carried out to assess the feasibility and suitability of graphite castellated tiles as beam-facing component in the diagnostic calorimeter of the negative ion source SPIDER (Source for Production of Ions of Deuterium Extracted from Radio frequency plasma). The results indicate that this concept could be a reliable, although less performing, alternative for the present design based on carbon fiber composite tiles, as it provides thermal measurements on the required spatial scale.
Why Education Practitioners and Stakeholders Should Care about Person Fit in Educational Assessments
ERIC Educational Resources Information Center
Walker, A. Adrienne
2017-01-01
In this article, A. Adrienne Walker introduces the concept of person fit to education stakeholders as a source of evidence to inform the trustworthiness of a test score for interpretation and use (validity). Person fit analyses are used in educational measurement research to explore the degree to which a person's test score can be interpreted as a…
NASA Astrophysics Data System (ADS)
Sund, Per
2016-09-01
Science teachers regard practical work as important and many claim that it helps students to learn science. Besides theoretical knowledge, such as concepts and formulas, practical work is considered to be an integral and basic part of science education. As practical work is perceived and understood in different ways, comparing the results between classes and schools is difficult. One way of making the results comparable is to develop systematic inquiries to be assessed in national large-scale tests. However, introducing similar testing conditions in a laboratory environment is not always possible. Although the instructions and assessment guides for such tests are detailed, many obstacles need to be overcome if equality in the overall test situation is to be achieved. This empirical case study investigates two secondary school science teachers' assessments of 15-16-year-old students in three separate groups in the practical part of a Swedish national test in chemistry. Data are gathered using two video cameras and three pairs of spy camera glasses. The results show that individual and independent assessments are difficult due to the social interactions that take place and the physical sources of error that occur in this type of setting.
NASA Astrophysics Data System (ADS)
Mai, P. M.; Schorlemmer, D.; Page, M.
2012-04-01
Earthquake source inversions image the spatio-temporal rupture evolution on one or more fault planes using seismic and/or geodetic data. Such studies are critically important for earthquake seismology in general, and for advancing seismic hazard analysis in particular, as they reveal earthquake source complexity and help (i) to investigate earthquake mechanics; (ii) to develop spontaneous dynamic rupture models; (iii) to build models for generating rupture realizations for ground-motion simulations. In applications (i)-(iii), the underlying finite-fault source models are regarded as "data" (input information), but their uncertainties are essentially unknown. After all, source models are obtained from solving an inherently ill-posed inverse problem to which many a priori assumptions and uncertain observations are applied. The Source Inversion Validation (SIV) project is a collaborative effort to better understand the variability between rupture models for a single earthquake (as manifested in the finite-source rupture model database) and to develop robust uncertainty quantification for earthquake source inversions. The SIV project highlights the need to develop a long-standing and rigorous testing platform to examine the current state-of-the-art in earthquake source inversion, and to develop and test novel source inversion approaches. We will review the current status of the SIV project, and report the findings and conclusions of the recent workshops. We will briefly discuss several source-inversion methods, how they treat uncertainties in data, and assess the posterior model uncertainty. Case studies include initial forward-modeling tests on Green's function calculations, and inversion results for synthetic data from a spontaneous dynamic crack-like strike-slip earthquake on a steeply dipping fault, embedded in a layered crustal velocity-density structure.
Organizing and Presenting Program Outcome Data.
ERIC Educational Resources Information Center
Anema, Marion G.; Brown, Barbara E.; Stringfield, Yvonne N.
2003-01-01
Data collection and assessment processes used by a nursing school are described. Sources include student achievement data from tests, projects, journals, case studies, community service, and clinical practicums. The ways in which data are organized, presented, and used are discussed. (SK)
Integrating data types to enhance shoreline change assessments
NASA Astrophysics Data System (ADS)
Long, J.; Henderson, R.; Plant, N. G.; Nelson, P. R.
2016-12-01
Shorelines represent the variable boundary between terrestrial and marine environments. Assessment of geographic and temporal variability in shoreline position and related variability in shoreline change rates are an important part of studies and applications related to impacts from sea-level rise and storms. The results from these assessments are used to quantify future ecosystem services and coastal resilience and guide selection of appropriate coastal restoration and protection designs. But existing assessments typically fail to incorporate all available shoreline observations because they are derived from multiple data types and have different or unknown biases and uncertainties. Shoreline-change research and assessments often focus on either the long-term trajectory using sparse data over multiple decades or shorter-term evolution using data collected more frequently but over a shorter period of time. The combination of data collected with significantly different temporal resolution is not often considered. Also, differences in the definition of the shoreline metric itself can occur, whether using a single or multiple data source(s), due to variation in the signal being detected in the data (e.g. instantaneous land/water interface, swash zone, wrack line, or topographic contours). Previous studies have not explored whether more robust shoreline change assessments are possible if all available data are utilized and all uncertainties are considered. In this study, we test the hypothesis that incorporating all available shoreline data will lead to both improved historical assessments and enhance the predictive capability of shoreline-change forecasts. Using over 250 observations of shoreline position at Dauphin Island, Alabama over the last century, we compare shoreline-change rates derived from individual data sources (airborne lidar, satellite, aerial photographs) with an assessment using the combination of all available data. Biases or simple uncertainties in the shoreline metric from different data types and varying temporal/spatial resolution of the data are examined. As part of this test, we also demonstrate application of data assimilation techniques to predict shoreline position by accurately including the uncertainty in each type of data.
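As a minimal illustration of combining observations with different uncertainties, the sketch below fits a long-term shoreline change rate by inverse-variance weighted least squares; the positions, dates, and uncertainties are hypothetical, and this is far simpler than the data assimilation techniques the study applies.

```python
import numpy as np

# Hypothetical shoreline positions (m, cross-shore) from three data types with
# different 1-sigma uncertainties (m); times are decimal years.
t = np.array([1990.5, 1998.2, 2004.7, 2010.1, 2015.6, 2016.3])
y = np.array([12.0, 9.5, 8.1, 5.2, 4.0, 4.4])        # shoreline position
sigma = np.array([10.0, 5.0, 3.0, 1.0, 0.5, 0.5])     # e.g. photos vs. satellite vs. lidar

# Weighted least squares for rate (m/yr) and intercept, weights = 1/sigma^2.
W = np.diag(1.0 / sigma**2)
A = np.column_stack([t - t.mean(), np.ones_like(t)])
cov = np.linalg.inv(A.T @ W @ A)
rate, intercept = cov @ A.T @ W @ y
rate_sigma = np.sqrt(cov[0, 0])
print(f"shoreline change rate = {rate:.2f} +/- {rate_sigma:.2f} m/yr")
```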
NASA Technical Reports Server (NTRS)
Kerslake, Thomas W.; Scheiman, David A.
2005-01-01
This paper documents testing and analyses to quantify International Space Station (ISS) Solar Array Wing (SAW) string electrical performance under highly off-nominal, low-temperature-low-intensity (LILT) operating conditions with nonsolar light sources. This work is relevant for assessing feasibility and risks associated with a Sequential Shunt Unit (SSU) remove and replace (R&R) Extravehicular Activity (EVA). During eclipse, SAW strings can be energized by moonlight, EVA suit helmet lights, or video camera lights. To quantify SAW performance under these off-nominal conditions, solar cell performance testing was performed using full moon, solar simulator, and Video Camera Luminaire (VCL) light sources. Test conditions included temperatures of +25 to -110 C and illumination intensities of 1 to 0.0001 Sun. Electrical performance data and calculated eclipse lighting intensities were combined to predict SAW current-voltage output for comparison with electrical hazard thresholds. Worst-case predictions show there is no connector pin molten metal hazard, but crew shock hazard limits are exceeded due to VCL illumination. Assessment uncertainties and limitations are discussed along with operational solutions to mitigate SAW electrical hazards from VCL illumination. Results from a preliminary assessment of SAW arcing are also discussed. The authors recommend further analyses once SSU R&R and EVA procedures are better defined.
Development of visible spectroscopy diagnostics for W sources assessment in WEST
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, O., E-mail: olivier.meyer@cea.fr; Giacalone, J. C.; Pascal, J. Y.
2016-11-15
The present work concerns the development of a W sources assessment system in the framework of the tungsten (W) Environment in Steady-state Tokamak (WEST) project, which aims at equipping the existing Tore Supra device with a tungsten divertor in order to test actively cooled tungsten Plasma Facing Components (PFCs) in view of preparing ITER operation. The goal is to assess W sources and D recycling with spectral, spatial, and temporal resolution adapted to the PFCs observed. The originality of the system is that all optical elements are installed in the vacuum vessel and compatible with steady state operation. Our system is optimized to measure radiance as low as 10^16 Ph/(m^2 s sr). A total of 240 optical fibers will be deployed to the detection systems, such as the "Filterscope," developed by Oak Ridge National Laboratory (USA) and consisting of photomultiplier tubes and filters, or imaging spectrometers dedicated to multi-view analysis.
Bell, Raoul; Giang, Trang; Buchner, Axel
2012-01-01
Previous research has shown a source memory advantage for faces presented in negative contexts. As yet it remains unclear whether participants remember the specific type of context in which the faces were presented or whether they can only remember that the face was associated with negative valence. In the present study, participants saw faces together with descriptions of two different types of negative behaviour and neutral behaviour. In Experiment 1, we examined whether the participants were able to discriminate between two types of other-relevant negative context information (cheating and disgusting behaviour) in a source memory test. In Experiment 2, we assessed source memory for other-relevant negative (threatening) context information (other-aggressive behaviour) and self-relevant negative context information (self-aggressive behaviour). A multinomial source memory model was used to separately assess partial source memory for the negative valence of the behaviour and specific source memory for the particular type of negative context the face was associated with. In Experiment 1, source memory was specific for the particular type of negative context presented (i.e., cheating or disgusting behaviour). Experiment 2 showed that source memory for other-relevant negative information was more specific than source memory for self-relevant information. Thus, emotional source memory may vary in specificity depending on the degree to which the negative emotional context is perceived as threatening.
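A compact sketch of the general multinomial-modeling approach is shown below: a simplified two-high-threshold source-monitoring tree is fit to hypothetical response counts by maximum likelihood. This is an illustration only, not the specific model, parameter set, or data used in the study.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical response counts. Rows: faces studied with "cheating" descriptions
# and genuinely new faces; columns: test responses (cheating, disgusting, new).
old_counts = np.array([62, 23, 15])   # studied (source = cheating)
new_counts = np.array([10, 12, 78])   # unstudied

def probs(params):
    D, d, a, b = params   # D: detect old; d: remember source; a: source guess; b: "old" guess
    p_old = np.array([
        D * d + D * (1 - d) * a + (1 - D) * b * a,       # correct source
        D * (1 - d) * (1 - a) + (1 - D) * b * (1 - a),   # wrong source
        (1 - D) * (1 - b),                               # "new"
    ])
    p_new = np.array([b * a, b * (1 - a), 1 - b])        # false alarms split by guessing
    return p_old, p_new

def neg_log_lik(params):
    p_old, p_new = (np.clip(p, 1e-9, 1.0) for p in probs(params))
    return -(np.sum(old_counts * np.log(p_old)) + np.sum(new_counts * np.log(p_new)))

res = minimize(neg_log_lik, x0=[0.6, 0.5, 0.5, 0.3],
               bounds=[(0.01, 0.99)] * 4, method="L-BFGS-B")
D, d, a, b = res.x
print(f"item detection D = {D:.2f}, source memory d = {d:.2f}, guessing a = {a:.2f}, b = {b:.2f}")
```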
Assessment of Microphone Phased Array for Measuring Launch Vehicle Lift-off Acoustics
NASA Technical Reports Server (NTRS)
Garcia, Roberto
2012-01-01
The specific purpose of the present work was to demonstrate the suitability of a microphone phased array for launch acoustics applications via participation in selected firings of the Ares I Scale Model Acoustics Test. The Ares I Scale Model Acoustics Test is a part of the discontinued Constellation Program Ares I Project, but the basic understanding gained from this test is expected to help development of the Space Launch System vehicles. Correct identification of sources not only improves the predictive ability, but provides guidance for a quieter design of the launch pad and optimization of the water suppression system. This document contains the results of the NASA Engineering and Safety Center assessment.
Malhotra, Sita; Sidhu, Shailpreet K; Devi, Pushpa
2015-08-29
Safe water is a precondition for health and development and is a basic human right, yet it is still denied to hundreds of millions of people throughout the developing world. Water-related diseases caused by insufficient safe water supplies, coupled with poor sanitation and hygiene, cause 3.4 million deaths a year, mostly in children. The present study was conducted on 1,317 drinking water samples from various water sources in Amritsar district in northern India. All the samples were analyzed to assess the bacteriological quality of water for presumptive coliform count by the multiple tube test. A total of 42.9% (565/1,317) of samples from various sources were found to be unfit for human consumption. Of the total 565 unsatisfactory samples, 253 were from submersible pumps, 197 were from taps of piped supply (domestic/public), 79 were from hand pumps, and 36 were from various other sources. A significantly high level of contamination was observed in samples collected from submersible pumps (47.6%) and water tanks (47.3%), as these sources of water are more exposed and liable to contamination. Despite continuous efforts by the government, civil society, and the international community, over a billion people still do not have access to improved water resources. Bacteriological assessment of all sources of drinking water should be planned and conducted on a regular basis to prevent waterborne dissemination of diseases.
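To test whether contamination rates differ across source types, counts such as these can be arranged in a contingency table and assessed with a chi-square test, as sketched below; the satisfactory-sample counts are illustrative assumptions, since the abstract reports only the unsatisfactory breakdown.

```python
from scipy.stats import chi2_contingency

# Hypothetical 3 x 2 table of sample outcomes by source type. The unsatisfactory
# counts follow the abstract; the satisfactory counts are assumed for illustration.
#         unfit  fit
table = [[253, 279],   # submersible pumps (~47.6% unfit)
         [197, 303],   # piped-supply taps
         [ 79, 206]]   # hand pumps

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")
```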
ERIC Educational Resources Information Center
Brockmann, Frank
2011-01-01
State testing programs today are more extensive than ever, and their results are required to serve more purposes and high-stakes decisions than one might have imagined. Assessment results are used to hold schools, districts, and states accountable for student performance and to help guide a multitude of important decisions. This report describes…
Effects of Lambertian sources design on uniformity and measurements
NASA Astrophysics Data System (ADS)
Cariou, Nadine; Durell, Chris; McKee, Greg; Wilks, Dylan; Glastre, Wilfried
2014-10-01
Integrating sphere (IS) based uniform sources are a primary tool for ground-based calibration, characterization, and testing of flight radiometric equipment. The idea of a Lambertian field of energy is a very useful tool in radiometric testing, but this concept is being checked in many ways by newly lowered uncertainty goals. At an uncertainty goal of 2%, one needs to carefully assess uniformity in addition to calibration uncertainties, as even sources with 0.5% uniformity now represent substantial proportions of uncertainty budgets. The paper explores integrating sphere design options for achieving 99.5% and better uniformity of exit-port radiance and spectral irradiance created by an integrating sphere. Uniformity in broad spectrum and spectral bands is explored. We discuss mapping techniques and results as a function of observed uniformity, as well as laboratory testing results customized to match the customer's instrumentation field of view. We also discuss recommendations, based on the basic commercial instrumentation we have used, for validating, inspecting, and improving the correlation of uniformity measurements with the intended application.
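The budgeting argument can be made concrete with a root-sum-square combination of independent uncertainty contributions, as sketched below; the individual terms are assumed values for illustration only.

```python
import math

# Illustrative root-sum-square (RSS) combination of independent uncertainty
# contributions for a sphere-based radiance calibration (values are assumptions).
contributions_pct = {
    "calibration transfer": 1.5,
    "exit-port non-uniformity": 0.5,
    "lamp drift / stability": 0.8,
    "detector linearity": 0.5,
}

combined = math.sqrt(sum(u**2 for u in contributions_pct.values()))
print(f"combined standard uncertainty ~ {combined:.2f}%")
# With a 2% overall goal, even a 0.5% uniformity term consumes a noticeable
# share of the budget, which is the point the abstract makes.
```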
Comparison of Parent Report and Direct Assessment of Child Skills in Toddlers.
Miller, Lauren E; Perkins, Kayla A; Dai, Yael G; Fein, Deborah A
2017-09-01
There are unique challenges associated with measuring development in early childhood. Two primary sources of information are used: parent report and direct assessment. Each approach has strengths and weaknesses, particularly when used to identify and diagnose developmental delays. The present study aimed to evaluate consistency between parent report and direct assessment of child skills in toddlers with and without Autism Spectrum Disorder (ASD) across receptive language, expressive language, and fine motor domains. 109 children were evaluated at an average age of two years; data on child skills were collected via parent report and direct assessment. Children were classified into three groups (i.e., ASD, Other Developmental Disorder, or Typical Development) based on DSM-IV-TR diagnosis. Mixed design ANOVAs, with data source as a within subjects factor and diagnostic group as a between subjects factor, were used to assess agreement. Chi square tests of agreement were then used to examine correspondence at the item level. Results suggested that parent report of language and fine motor skills did not significantly differ from direct assessment, and this finding held across diagnostic groups. Item level analyses revealed that, in most cases of significant disagreement, parents reported a skill as present, but it was not seen on direct testing. Results indicate that parents are generally reliable reporters of child language and fine motor abilities in toddlerhood, even when their children have developmental disorders such as ASD. However, the fullest picture may be obtained by using both parent report and direct assessment.
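The item-level disagreement pattern described here (parents report a skill that is not seen on testing) can be probed with an exact McNemar test on the discordant pairs, sketched below; this is a related alternative to the chi-square tests of agreement the study used, and the counts are hypothetical.

```python
from scipy.stats import binomtest

# Hypothetical item-level agreement table for one skill, pairing parent report
# with direct assessment (counts are illustrative only).
parent_yes = {"seen": 41, "not_seen": 18}   # parent reports skill present
parent_no  = {"seen": 6,  "not_seen": 44}   # parent reports skill absent

# Exact McNemar test on the discordant pairs: does disagreement run more often
# in the "parent reports present / not seen on testing" direction?
discordant = parent_yes["not_seen"] + parent_no["seen"]
result = binomtest(parent_yes["not_seen"], discordant, p=0.5)
print(f"discordant pairs = {discordant}, p = {result.pvalue:.3f}")
```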
Ladics, Gregory S; Holsapple, Michael P; Astwood, James D; Kimber, Ian; Knippels, Leon M J; Helm, Ricki M; Dong, Wumin
2003-05-01
There is a need to assess the safety of foods deriving from genetically modified (GM) crops, including the allergenic potential of novel gene products. Presently, there is no single in vitro or in vivo model that has been validated for the identification or characterization of potential food allergens. Instead, the evaluation focuses on risk factors such as source of the gene (i.e., allergenic vs. nonallergenic sources), physicochemical and genetic comparisons to known allergens, and exposure assessments. The purpose of this workshop was to gather together researchers working on various strategies for assessing protein allergenicity: (1) to describe the current state of knowledge and progress that has been made in the development and evaluation of appropriate testing strategies and (2) to identify critical issues that must now be addressed. This overview begins with a consideration of the current issues involved in assessing the allergenicity of GM foods. The second section presents information on in vitro models of digestibility, bioinformatics, and risk assessment in the context of clinical prevention and management of food allergy. Data on rodent models are presented in the next two sections. Finally, nonrodent models for assessing protein allergenicity are discussed. Collectively, these studies indicate that significant progress has been made in developing testing strategies. However, further efforts are needed to evaluate and validate the sensitivity, specificity, and reproducibility of many of these assays for determining the allergenicity potential of GM foods.
Aquatic assessment of the Ely Copper Mine Superfund site, Vershire, Vermont
Seal, Robert R.; Kiah, Richard G.; Piatak, Nadine M.; Besser, John M.; Coles, James F.; Hammarstrom, Jane M.; Argue, Denise M.; Levitan, Denise M.; Deacon, Jeffrey R.; Ingersoll, Christopher G.
2010-01-01
The information was used to develop an overall assessment of the impact on the aquatic system that appears to be a result of the acid rock drainage at the Ely Mine. More than 700 meters of Ely Brook, including two of the six ponds, were found to be severely impacted, on the basis of water-quality data and biological assessments. The reference location was of good quality based on the water quality and biological assessment. More than 3,125 meters of Schoolhouse Brook are also severely impacted, on the basis of water-quality data and biological assessments. The biological community begins to recover near the confluence with the Ompompanoosuc River. The evidence is less conclusive regarding the Ompompanoosuc River. The sediment data suggest that the sediments could be a source of toxicity in Ely Brook and Schoolhouse Brook. The surface-water assessment is consistent with the outcome of a surface-water toxicity testing program performed by the U.S. Environmental Protection Agency for Ely Brook and Schoolhouse Brook and a surface-water toxicity testing program and in situ amphibian testing program for the ponds.
NASA Technical Reports Server (NTRS)
Horne, William C.
2011-01-01
Measurements of background noise were recently obtained with a 24-element phased microphone array in the test section of the Arnold Engineering Development Center 80- by 120-Foot Wind Tunnel at speeds of 50 to 100 knots (27.5 to 51.4 m/s). The array was mounted in an aerodynamic fairing positioned with the array center 1.2 m from the floor and 16 m from the tunnel centerline. The array plate was mounted flush with the fairing surface as well as recessed 0.5 in. (1.27 cm) behind a porous Kevlar screen. Wind-off speaker measurements were also acquired every 15 degrees on a 10 m semicircular arc to assess the directional resolution of the array with various processing algorithms, and to estimate minimum detectable source strengths for future wind tunnel aeroacoustic studies. The dominant background noise of the facility is from the six drive fans downstream of the test section and first set of turning vanes. Directional array response and processing methods such as background-noise cross-spectral-matrix subtraction suggest that sources 10-15 dB weaker than the background can be detected.
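A minimal sketch of the background-noise cross-spectral-matrix subtraction step, followed by a conventional beamformer map, is given below; the FFT blocks and steering vectors are random placeholders, so only the structure of the computation is meaningful.

```python
import numpy as np

def csm(block_ffts: np.ndarray) -> np.ndarray:
    """Average cross-spectral matrix from FFT blocks of shape (n_blocks, n_mics)."""
    return np.einsum("bi,bj->ij", block_ffts, block_ffts.conj()) / block_ffts.shape[0]

def beamform_map(C: np.ndarray, steering: np.ndarray) -> np.ndarray:
    """Conventional beamformer output for steering vectors of shape (n_points, n_mics)."""
    w = steering / np.linalg.norm(steering, axis=1, keepdims=True)
    return np.real(np.einsum("pi,ij,pj->p", w.conj(), C, w))

# Hypothetical measurement: CSM with the model installed minus the CSM measured
# in the empty tunnel at the same speed (background-noise subtraction).
rng = np.random.default_rng(1)
ffts_model = rng.standard_normal((64, 24)) + 1j * rng.standard_normal((64, 24))
ffts_empty = rng.standard_normal((64, 24)) + 1j * rng.standard_normal((64, 24))
C_clean = csm(ffts_model) - csm(ffts_empty)

# Placeholder steering vectors (random phases) standing in for real array geometry.
steering = np.exp(1j * rng.uniform(0, 2 * np.pi, size=(100, 24)))
print(beamform_map(C_clean, steering)[:5])
```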
NASA Astrophysics Data System (ADS)
Aubrey, A. D.; Thorpe, A. K.; Christensen, L. E.; Dinardo, S.; Frankenberg, C.; Rahn, T. A.; Dubey, M.
2013-12-01
It is critical to constrain both natural and anthropogenic sources of methane to better predict the impact on global climate change. Critical technologies for this assessment include those that can detect methane point and concentrated diffuse sources over large spatial scales. Airborne spectrometers can potentially fill this gap for large scale remote sensing of methane, while in situ sensors, both ground-based and mounted on aerial platforms, can monitor and quantify at small to medium spatial scales. The Jet Propulsion Laboratory (JPL) and collaborators recently conducted a field test located near Casper, WY, at the Rocky Mountain Oilfield Test Center (RMOTC). These tests were focused on demonstrating the performance of remote and in situ sensors for quantification of point-sourced methane. A series of three controlled release points was set up at RMOTC, and over the course of six experiment days the point-source flux rates were varied from 50 to 2400 LPM (liters per minute). During these releases, in situ sensors measured real-time methane concentration from field towers (downwind from the release point) and using a small Unmanned Aerial System (sUAS) to characterize the spatiotemporal variability of the plume structure. Concurrent with these methane point source controlled releases, airborne sensor overflights were conducted using three aircraft. The NASA Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) participated with a payload consisting of a Fourier Transform Spectrometer (FTS) and an in situ methane sensor. Two imaging spectrometers provided assessment of optical and thermal infrared detection of methane plumes. The AVIRIS-next generation (AVIRIS-ng) sensor has been demonstrated for detection of atmospheric methane in the short wave infrared region, specifically using the absorption features at ~2.3 μm. Detection of methane in the thermal infrared region was evaluated by flying the Hyperspectral Thermal Emission Spectrometer (HyTES), whose retrievals interrogate spectral features in the 7.5 to 8.5 μm region. Here we discuss preliminary results from the JPL activities during the RMOTC controlled release experiment, including capabilities of airborne sensors for total columnar atmospheric methane detection and comparison to results from ground measurements and dispersion models. Potential application areas for these remote sensing technologies include assessment of anthropogenic and natural methane sources over wide spatial scales that represent significant unconstrained factors to the global methane budget.
Cardiovascular (CV) safety concerns are a significant source of drug development attrition in the pharmaceutical industry today. Though current nonclinical testing paradigms have largely prevented catastrophic CV events in Phase I studies, many challenges relating to the inabil...
In vitro eye irritation testing using the open source reconstructed hemicornea - a ring trial.
Mewes, Karsten R; Engelke, Maria; Zorn-Kruppa, Michaela; Bartok, Melinda; Tandon, Rashmi; Brandner, Johanna M; Petersohn, Dirk
2017-01-01
The aim of the present ring trial was to test whether two new methodological approaches for the in vitro classification of eye-irritating chemicals can be reliably transferred from the developers' laboratories to other sites. Both test methods are based on the well-established open source reconstructed 3D hemicornea models. In the first approach, the initial depth of injury after chemical treatment in the hemicornea model is derived from the quantitative analysis of histological sections. In the second approach, tissue viability, as a measure of corneal damage after chemical treatment, is analyzed separately for the epithelium and stroma of the hemicornea model. The three independent laboratories that participated in the ring trial produced their own hemicornea models according to the test producer's instructions, thus supporting the open source concept. A total of 9 chemicals with different physicochemical and eye-irritating properties were tested to assess the between-laboratory reproducibility (BLR), the predictive performance, and possible limitations of the test systems. The BLR was 62.5% for the first and 100% for the second method. Both methods made it possible to discriminate Cat. 1 chemicals from all non-Cat. 1 substances, which qualifies them for use in a top-down approach. However, the selectivity between No Cat. and Cat. 2 chemicals still needs optimization.
Final Environmental Assessment: Demolition/Restoration of Ipswich Antenna Test Facility
2012-05-01
sourc--es \\ Veta quantified b y using fual oU e onsu.mption from the-last full yesr of use ( CY20 1 0). Buildings S-3, S-5, and S-15 ware haated b y #2...the environment. E nvironmental Consequences Environmental analyses focused on the following areas: land use, socioeconomics, utilities...the wetland as possib I e . In addition the Conservation Commission may impose additional requirements such as: staking the wetland boundaries
The microbial quality of drinking water in Manonyane community: Maseru District (Lesotho).
Gwimbi, P
2011-09-01
Provision of good quality household drinking water is an important means of improving public health in rural communities, especially in Africa, and is the rationale behind protecting drinking water sources and promoting healthy practices at and around such sources. The objective was to examine the microbial content of drinking water from different types of drinking water sources in the Manonyane community of Lesotho. The community's hygienic practices around the water sources were also assessed to establish their contribution to water quality. Water samples from thirty-five water sources comprising 22 springs, 6 open wells, 6 boreholes, and 1 open reservoir were assessed. Total coliform and Escherichia coli bacteria were analyzed in the water sampled. Results of the tests were compared with the prescribed World Health Organization desirable limits. A household survey and field observations were conducted to assess the hygienic conditions and practices at and around the water sources. Total coliforms were detected in 97% and Escherichia coli in 71% of the water samples. The concentration levels of total coliforms and Escherichia coli were above the permissible limits of the World Health Organization drinking water quality guidelines in each case. Protected sources had significantly fewer colony-forming units (cfu) per 100 ml of water sample compared to unprotected sources (56% versus 95%, p < 0.05). Similarly, in terms of Escherichia coli, protected sources had lower counts (7% versus 40%, p < 0.05) compared with those from unprotected sources. Hygiene conditions and practices that seemed to potentially contribute to increased total coliform and Escherichia coli counts included non-protection of water sources from livestock faeces, laundry practices, and water sources being downslope of pit latrines in some cases. These findings suggest source water protection and good hygiene practices can improve the quality of household drinking water where disinfection is not available. The results also suggest important lines of inquiry and provide support and input for environmental and public health programmes, particularly those related to water and sanitation.
The Determinants of Productivity in Medical Testing: Intensity and Allocation of Care*
Abaluck, Jason; Agha, Leila; Kabrhel, Chris; Raja, Ali; Venkatesh, Arjun
2017-01-01
A large body of research has investigated whether physicians overuse care. There is less evidence on whether, for a fixed level of spending, doctors allocate resources to patients with the highest expected returns. We assess both sources of inefficiency exploiting variation in rates of negative imaging tests for pulmonary embolism. We document enormous across-doctor heterogeneity in testing conditional on patient population, which explains the negative relationship between physicians’ testing rates and test yields. Furthermore, doctors do not target testing to the highest risk patients, reducing test yields by one third. Our calibration suggests misallocation is more costly than overuse. PMID:29104293
NASA Astrophysics Data System (ADS)
Roth, Wolff-Michael; Oliveri, Maria Elena; Dallie Sandilands, Debra; Lyons-Thomas, Juliette; Ercikan, Kadriye
2013-03-01
Even if national and international assessments are designed to be comparable, subsequent psychometric analyses often reveal differential item functioning (DIF). Central to achieving comparability is to examine the presence of DIF and, if DIF is found, to investigate its sources to ensure that differentially functioning items do not lead to bias. In this study, sources of DIF were examined using think-aloud protocols. Think-aloud protocols with expert reviewers were conducted to compare the English and French versions of 40 items previously identified as DIF (N = 20) and non-DIF (N = 20). Three highly trained and experienced experts in verifying and accepting/rejecting multi-lingual versions of curriculum and testing materials for government purposes participated in this study. Although there is a considerable amount of agreement in the identification of differentially functioning items, experts do not consistently identify and distinguish DIF and non-DIF items. Our analyses of the think-aloud protocols identified particular linguistic, general pedagogical, content-related, and cognitive factors related to sources of DIF. Implications are provided for the process of arriving at the identification of DIF prior to the actual administration of tests at national and international levels.
Wireless acquisition of multi-channel seismic data using the Seismobile system
NASA Astrophysics Data System (ADS)
Isakow, Zbigniew
2017-11-01
This paper describes the wireless acquisition of multi-channel seismic data using a specialized mobile system, Seismobile, designed for subsoil diagnostics for transportation routes. The paper presents examples of multi-channel seismic records obtained during system tests in a configuration with 96 channels (four 24-channel landstreamers) and various seismic sources. Seismic waves were generated at the same point using different sources: a 5-kg hammer, a Gisco source with a 90-kg pile-driver, and two other pile-drivers of 45 and 70 kg. Particular attention is paid to the synchronization of source timing, the measurement of geometry by autonomous GPS systems, and the repeatability of triggering measurements constrained by an accelerometer identifying the seismic waveform. The tests were designed to evaluate the registration, reliability, and range of the wireless transmission of survey signals. The effectiveness of the automatic numbering of measuring modules was tested as the system components were arranged and fixed to the streamers. After measurements were completed, the accuracy and speed of data downloading from the internal memory (SDHC 32 GB, WiFi) were determined. Additionally, the functionality of automatic battery recharging, the maximum survey duration, and the reliability of battery discharge signalling were assessed.
Schmidt, Robert L; Walker, Brandon S; Cohen, Michael B
2015-03-01
Reliable estimates of accuracy are important for any diagnostic test. Diagnostic accuracy studies are subject to unique sources of bias. Verification bias and classification bias are 2 sources of bias that commonly occur in diagnostic accuracy studies. Statistical methods are available to estimate the impact of these sources of bias when they occur alone. The impact of interactions when these types of bias occur together has not been investigated. We developed mathematical relationships to show the combined effect of verification bias and classification bias. A wide range of case scenarios were generated to assess the impact of bias components and interactions on total bias. Interactions between verification bias and classification bias caused overestimation of sensitivity and underestimation of specificity. Interactions had more effect on sensitivity than specificity. Sensitivity was overestimated by at least 7% in approximately 6% of the tested scenarios. Specificity was underestimated by at least 7% in less than 0.1% of the scenarios. Interactions between verification bias and classification bias create distortions in accuracy estimates that are greater than would be predicted from each source of bias acting independently. © 2014 American Cancer Society.
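A small simulation makes the interaction concrete: verification depends on the index-test result while the reference standard itself misclassifies some patients, and the naive estimates drift from the true sensitivity and specificity. All parameter values below are assumptions for illustration, not the article's scenarios.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
true_sens, true_spec, prevalence = 0.80, 0.90, 0.20
p_verify_pos, p_verify_neg = 0.95, 0.30     # verification bias: positives verified more often
gold_sens, gold_spec = 0.95, 0.97           # imperfect reference (classification bias)

disease = rng.random(n) < prevalence
index_pos = np.where(disease, rng.random(n) < true_sens, rng.random(n) > true_spec)
verified = np.where(index_pos, rng.random(n) < p_verify_pos, rng.random(n) < p_verify_neg)
gold_pos = np.where(disease, rng.random(n) < gold_sens, rng.random(n) > gold_spec)

# Naive estimates use only verified patients and treat the reference as truth.
v = verified
est_sens = np.mean(index_pos[v & gold_pos])
est_spec = np.mean(~index_pos[v & ~gold_pos])
print(f"true sens/spec   = {true_sens:.2f}/{true_spec:.2f}")
print(f"biased estimates = {est_sens:.2f}/{est_spec:.2f}")
```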
2014-11-05
usable simulations. This procedure was to be tested using real-world data collected from open-source venues. The final system would support rapid...assess social change. Construct is an agent-based dynamic-network simulation system designed to allow the user to assess the spread of information and...protest or violence. Technical Challenges Addressed Re-use: Most agent-based simulation models (ABMs) in use today are one-off. In contrast, we
The influences of implementing state-mandated science assessment on teacher practice
NASA Astrophysics Data System (ADS)
Katzmann, Jason Matthew
Four high school Biology teachers, two novice and two experienced, participated in a year-and-a-half case study. Using a naturalistic paradigm, the four individuals were studied in their natural environment: their classrooms. Data sources included three semi-structured interviews, classroom observation field notes, and classroom artifacts. Through cross-case analysis and a constant comparative methodology, coding nodes were combined and refined, resulting in the final themes for discussion. The following research question was investigated: what is the impact of high-stakes testing on high school Biology teachers' instructional planning, instructional practices, and classroom assessments? Seven final themes were identified: Assessment, CSAP, Planning, Pressure, Standards, Teaching, and Time. Each theme was developed and discussed using each participant's voice. Trustworthiness of this study was established via five avenues: triangulation of data sources, credibility, transferability, dependability, and confirmability. A model of the influences of high-stakes testing on teacher practice was developed to describe the seven themes (Figure 5). This model serves as an illustration of the complex nature of teacher practice and the influences upon it. The four participants in this study were influenced by high-stakes assessment. It influenced their instructional decisions, assessment practices, use of time, and planning decisions, and it decreased the amount of inquiry that occurred in the classroom. Implications of this research and future research directions are described.
Respiratory source control using a surgical mask: An in vitro study.
Patel, Rajeev B; Skaria, Shaji D; Mansour, Mohamed M; Smaldone, Gerald C
2016-07-01
Cough etiquette and respiratory hygiene are forms of source control encouraged to prevent the spread of respiratory infection. The use of surgical masks as a means of source control has not been quantified in terms of reducing exposure to others. We designed an in vitro model using various facepieces to assess their contribution to exposure reduction when worn at the infectious source (Source) relative to facepieces worn for primary (Receiver) protection, and the factors that contribute to each. In a chamber with various airflows, radiolabeled aerosols were exhaled via a ventilated soft-face manikin head using tidal breathing and cough (Source). Another manikin, containing a filter, quantified recipient exposure (Receiver). The natural fit surgical mask, fitted (SecureFit) surgical mask and an N95-class filtering facepiece respirator (commonly known as an "N95 respirator") with and without a Vaseline-seal were tested. With cough, source control (mask or respirator on Source) was statistically superior to mask or unsealed respirator protection on the Receiver (Receiver protection) in all environments. To equal source control during coughing, the N95 respirator must be Vaseline-sealed. During tidal breathing, source control was comparable or superior to mask or respirator protection on the Receiver. Source control via surgical masks may be an important adjunct defense against the spread of respiratory infections. The fit of the mask or respirator, in combination with the airflow patterns in a given setting, is a significant contributor to source control efficacy. Future clinical trials should include a surgical mask source control arm to assess the contribution of source control in overall protection against airborne infection.
Abd-Elmaksoud, Sherif; Naranjo, Jaime E; Gerba, Charles P
2013-06-01
Effective individual microbiological water purifiers are needed for the consumption of untreated water sources by campers, for emergency use, by the military, and in developing countries. A handheld UV light device was tested to assess whether it could meet the virus reduction requirements established by the United States Environmental Protection Agency, the National Science Foundation, and the World Health Organization. The device was found capable of inactivating at least 4 log₁₀ of poliovirus type 1, rotavirus SA-11 and MS-2 virus in 500 mL volumes of general case test water. In the presence of high turbidity and organic matter, however, filtration was necessary to achieve a 4 log₁₀ reduction of the test viruses.
Assessment of semi-volatile organic compounds in drinking water sources in Jiangsu, China.
Wu, Yifeng; Jia, Yongzhi; Lu, Xiwu
2013-08-01
Many xenobiotic compounds, especially organic pollutants in drinking water, can threaten human health and natural ecosystems. The ability to predict the level of pollutants and identify their sources is crucial for the design of pollutant risk reduction plans. In this study, 25 semi-volatile organic compounds (SVOCs) were assessed at 16 monitoring sites of drinking water sources in Jiangsu, east China, to evaluate water quality conditions and the sources of pollutants. Four multivariate statistical techniques were used for this analysis. The correlation test indicated that the 25 SVOC parameters showed significant spatial variability (P<0.05). The results of correlation analysis, principal component analysis (PCA) and cluster analysis (CA) suggested that at least four sources, i.e., agricultural residual pesticides, industrial sewage, water transportation vehicles and miscellaneous sources, were responsible for the presence of SVOCs in the drinking water sites examined, accounting for 89.6% of the total variance in the dataset. The analysis of site similarity showed that the 16 sites could be divided into high, moderate, and low pollutant level groups at (D(link)/D(max))×25<10, and each group had its own typical SVOCs. These results provide useful information for developing appropriate strategies for contaminant control in drinking water sources. Copyright © 2013 Elsevier Inc. All rights reserved.
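For readers who want to see the general shape of such an analysis, the following is a minimal, hypothetical sketch of PCA plus hierarchical clustering on a random site-by-SVOC matrix; it is not the authors' workflow or data.

```python
# Illustrative sketch of the kind of multivariate analysis described above:
# PCA to suggest candidate pollutant source factors and hierarchical clustering
# to group monitoring sites. The data matrix here is random and purely hypothetical.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(42)
n_sites, n_svocs = 16, 25
concentrations = rng.lognormal(mean=0.0, sigma=1.0, size=(n_sites, n_svocs))

# Standardize, then extract principal components as candidate "sources".
z = StandardScaler().fit_transform(concentrations)
pca = PCA(n_components=4)
scores = pca.fit_transform(z)
print("variance explained by 4 PCs:", pca.explained_variance_ratio_.sum().round(3))

# Cluster sites on their PC scores (Ward linkage), e.g., into three pollution-level groups.
tree = linkage(scores, method="ward")
groups = fcluster(tree, t=3, criterion="maxclust")
print("site group assignments:", groups)
```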
DOT National Transportation Integrated Search
2013-10-01
The research has been conducted on laboratory-cast concrete prism specimens containing both fine and coarse aggregates obtained from different sources to provide a spectrum of reactivity for assessment through the developed NIRAS technique. The NIRAS...
40 CFR 63.11502 - What definitions apply to this subpart?
Code of Federal Regulations, 2013 CFR
2013-07-01
...: process knowledge, an engineering assessment, or test data. Byproduct means a chemical (liquid, gas, or... limit applicable to the process vent. (4) Design analysis based on accepted chemical engineering... (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Chemical Manufacturing Area Sources...
40 CFR 63.11502 - What definitions apply to this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
...: process knowledge, an engineering assessment, or test data. Byproduct means a chemical (liquid, gas, or... limit applicable to the process vent. (4) Design analysis based on accepted chemical engineering... (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Chemical Manufacturing Area Sources...
40 CFR 63.11502 - What definitions apply to this subpart?
Code of Federal Regulations, 2012 CFR
2012-07-01
...: process knowledge, an engineering assessment, or test data. Byproduct means a chemical (liquid, gas, or... (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Chemical Manufacturing Area Sources...., for chemical value as a product, isolated intermediate, byproduct, or coproduct, or for heat value...
Mir Contamination Observations and Implications to the International Space Station
NASA Technical Reports Server (NTRS)
Soares, Carlos; Mikatarian, Ron
2000-01-01
A series of external contamination measurements were made on the Russian Mir Space Station. The Mir external contamination observations summarized in this paper were essential in assessing the system level impact of Russian Segment induced contamination on the International Space Station (ISS). Mir contamination observations include results from a series of flight experiments: CNES Comes-Aragatz, retrieved NASA camera bracket, Euro-Mir '95 ICA, retrieved NASA Trek blanket, Russian Astra-II, Mir Solar Array Return Experiment (SARE), etc. Results from these experiments were studied in detail to characterize Mir induced contamination. In conjunction with Mir contamination observations, Russian materials samples were tested for condensable outgassing rates in the U.S. These test results were essential in the characterization of Mir contamination sources. Once Mir contamination sources were identified and characterized, activities to assess the implications to ISS were implemented. As a result, modifications in Russian materials selection and/or usage were implemented to control contamination and mitigate risk to ISS.
Attenuation of X and Gamma Rays in Personal Radiation Shielding Protective Clothing.
Kozlovska, Michaela; Cerny, Radek; Otahal, Petr
2015-11-01
A collection of personal radiation shielding protective clothing, suitable for use in case of accidents in nuclear facilities or radiological emergency situations involving radioactive agents, was gathered and tested at the Nuclear Protection Department of the National Institute for Nuclear, Chemical and Biological Protection, Czech Republic. The attenuating qualities of the shielding layers in individual protective clothing were tested via measurement of the spectra of x and gamma rays penetrating them. The rays originated from different radionuclide point sources whose gamma-ray energies cover a broad energy range. The spectra were measured by handheld spectrometers, both scintillation and High Purity Germanium. Different narrow-beam geometries were adjusted using a special testing bench and a set of various collimators. The main experimentally determined quantity for individual samples of personal radiation shielding protective clothing was the x- and gamma-ray attenuation at significant energies of the spectra. The attenuation was assessed by comparing net peak areas (after background subtraction) in spectra where a tested sample was placed between the source and the detector with the corresponding net peak areas in spectra measured without the sample. Mass attenuation coefficients, which describe the attenuating qualities of the shielding-layer materials in individual samples, together with the corresponding lead equivalents, were determined as well. The experimentally assessed mass attenuation coefficients of the samples were compared to reference values for the individual heavy metals.
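A small worked example of the attenuation quantities described above, under a narrow-beam assumption and with invented count and thickness values:

```python
# Hypothetical worked example of the quantities described above: attenuation at a
# given photon energy from net peak areas with and without the sample, plus the
# mass attenuation coefficient under a narrow-beam assumption. Numbers are invented.
import math

net_area_no_sample = 125_000.0   # counts in the full-energy peak, source only
net_area_sample = 41_000.0       # counts with the shielding layer in the beam
areal_density = 0.85             # g/cm^2 (density x thickness of the shielding layer)

attenuation = 1.0 - net_area_sample / net_area_no_sample                       # fraction removed
mu_over_rho = math.log(net_area_no_sample / net_area_sample) / areal_density   # cm^2/g

print(f"attenuation            = {attenuation:.1%}")
print(f"mass attenuation coef. = {mu_over_rho:.3f} cm^2/g")
```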
Panda, R; Ariyarathna, H; Amnuaycheewa, P; Tetteh, A; Pramod, S N; Taylor, S L; Ballmer-Weber, B K; Goodman, R E
2013-02-01
Premarket, genetically modified (GM) plants are assessed for potential risks of food allergy. The major risk would be the transfer of a gene encoding an allergen, or a protein nearly identical to an allergen, into a different food source, which can be assessed by specific serum testing. The potential that a newly expressed protein might become an allergen is evaluated based on resistance to digestion in pepsin and abundance in food fractions. If the modified plant is a common allergenic source (e.g. soybean), regulatory guidelines suggest testing for increases in the expression of endogenous allergens. Some regulators request evaluating endogenous allergens for rarely allergenic plants (e.g. maize and rice). Since allergic individuals must avoid foods containing their allergen (e.g. peanut, soybean, maize, or rice), the relevance of the tests is unclear. Furthermore, no acceptance criteria are established and little is known about the natural variation in allergen concentrations in these crops. Our results demonstrate a 15-fold difference in the major maize allergen, lipid transfer protein, between nine varieties, and complex variation in IgE binding to various soybean varieties. We question the value of evaluating endogenous allergens in GM plants unless the intent of the modification was production of a hypoallergenic crop. © 2012 John Wiley & Sons A/S.
Soil-transmitted helminth eggs assessment in wastewater in an urban area in India.
Grego, Sonia; Barani, Viswa; Hegarty-Craver, Meghan; Raj, Antony; Perumal, Prasanna; Berg, Adrian B; Archer, Colleen
2018-02-01
Water quality and sanitation are inextricably linked to the prevalence and control of soil-transmitted helminth infections, a public health concern in resource-limited settings. India bears a large burden of disease associated with poor sanitation. Transformative onsite sanitation technologies are being developed that feature elimination of pathogens, including helminth eggs, in wastewater treatment. We are conducting third-party testing of multiple sanitation technology systems in Coimbatore (Tamil Nadu), India. To ensure stringent testing of the pathogen removal ability of sanitation technologies, the presence of helminth eggs in wastewater across the town of Coimbatore was assessed. Wastewater samples were collected from existing test sites as well as from desludging trucks servicing residential and non-residential septic tanks. The AmBic methodology (based on washing, sieving, sedimenting and floating) was used for helminth egg isolation. We tested 29 different source samples and found a 52% prevalence of potentially infective helminth eggs. Identification and enumeration of helminth species is reported against the septage source (private residential vs. shared toilet facility) and total solids content. Trichuris egg counts were higher than those of hookworm and Ascaris from desludging trucks, whereas hookworm egg counts were higher in fresh wastewater samples. Surprisingly, no correlation between soil-transmitted helminth eggs and total solids was observed.
Performance characterization of a solenoid-type gas valve for the H- magnetron source at FNAL
NASA Astrophysics Data System (ADS)
Sosa, A.; Bollinger, D. S.; Karns, P. R.
2017-08-01
The magnetron-style H- ion sources currently in operation at Fermilab use piezoelectric gas valves to function. This kind of gas valve is sensitive to small changes in ambient temperature, which affect the stability and performance of the ion source. This motivates the need to find an alternative way of feeding H2 gas into the source. A solenoid-type gas valve has been characterized in a dedicated off-line test stand to assess the feasibility of its use in the operational ion sources. H- ion beams have been extracted at 35 keV using this valve. In this study, the performance of the solenoid gas valve has been characterized measuring the beam current output of the magnetron source with respect to the voltage and pulse width of the signal applied to the gas valve.
NASA Technical Reports Server (NTRS)
Mosher, Marianne
1990-01-01
The principal objective is to assess the adequacy of linear acoustic theory with an impedance wall boundary condition to model the detailed sound field of an acoustic source in a duct. Measurements and calculations of a simple acoustic source in a rectangular concrete duct, lined with foam on the walls and with anechoic end terminations, are compared. Measurement of acoustic pressure at twelve wave numbers provides variation in the frequency and absorption characteristics of the duct walls. Close to the source, where the interference of wall reflections is minimal, correlation is very good. Away from the source, correlation degrades, especially at the lower frequencies. Sensitivity studies show little effect on the predicted results for changes in impedance boundary condition values, source location, measurement location, temperature, and source model for variations spanning the expected measurement error.
Large Dataset of Acute Oral Toxicity Data Created for Testing ...
Acute toxicity data is a common requirement for substance registration in the US. Currently only data derived from animal tests are accepted by regulatory agencies, and the standard in vivo tests use lethality as the endpoint. Non-animal alternatives such as in silico models are being developed due to animal welfare and resource considerations. We compiled a large dataset of oral rat LD50 values to assess the predictive performance of currently available in silico models. Our dataset combines LD50 values from five different sources: literature data provided by The Dow Chemical Company, REACH data from eChemportal, HSDB (Hazardous Substances Data Bank), RTECS data from Leadscope, and the training set underpinning TEST (Toxicity Estimation Software Tool). Combined, these data sources yield 33848 chemical-LD50 pairs (data points), with 23475 unique data points covering 16439 compounds. The entire dataset was loaded into a chemical properties database. All of the compounds were registered in DSSTox and 59.5% have publicly available structures. Compounds without a structure in DSSTox are currently having their structures registered. The structural data will be used to evaluate the predictive performance and applicable chemical domains of three QSAR models (TIMES, PROTOX, and TEST). Future work will combine the dataset with information from ToxCast assays and, using random forest modeling, assess whether ToxCast assays are useful in predicting acute oral toxicity. Pre
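A minimal sketch of the combine-and-deduplicate step described above, using tiny invented records in place of the five real sources (file-free, in-memory data; the column names are assumptions):

```python
# Sketch of the dataset-combination step described above, with hypothetical,
# in-memory example records standing in for the real sources: concatenate
# LD50 records, then count chemical-LD50 pairs before and after de-duplication.
import pandas as pd

# Tiny invented records; a real workflow would load each source from file.
dow = pd.DataFrame({"casrn": ["50-00-0", "64-17-5"], "ld50_mg_per_kg": [500, 7060]})
reach = pd.DataFrame({"casrn": ["64-17-5", "71-43-2"], "ld50_mg_per_kg": [7060, 930]})
hsdb = pd.DataFrame({"casrn": ["50-00-0"], "ld50_mg_per_kg": [800]})

frames = {"dow": dow, "reach": reach, "hsdb": hsdb}
combined = pd.concat(
    [df.assign(source=name) for name, df in frames.items()], ignore_index=True
)
unique_pairs = combined.drop_duplicates(subset=["casrn", "ld50_mg_per_kg"])

print("total data points:  ", len(combined))       # analogous to the 33848 pairs
print("unique data points: ", len(unique_pairs))   # analogous to the 23475 unique pairs
print("compounds covered:  ", unique_pairs["casrn"].nunique())
```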
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeRosier, R.
1984-09-01
This volume is a compendium of detailed emission and test data from field tests of a firetube industrial boiler burning a coal/oil/water (COW) mixture. The boiler was tested while burning COW fuel, and COW with soda ash added (COW+SA) to serve as an SO2 sorbent. The test data include: preliminary equipment calibration data, boiler operating data for both tests, fuel analysis results, and complete flue gas emission measurement and laboratory analysis results. Flue gas emission measurements included: continuous monitoring for criteria gas pollutants; gas chromatography (GC) of gas grab samples for volatile organics (C1-C6); EPA Method 5 for particulate; controlled condensation system for SO2 emissions; and source assessment sampling system (SASS) for total organics in two boiling point ranges (100 to 300 C and > 300 C), organic compound category information using infrared spectrometry (IR) and low resolution mass spectrometry (LRMS), specific quantitation of the semivolatile organic priority pollutants using gas chromatography/mass spectrometry (GC/MS), liquid chromatography (LC) separation of organic extracts into seven polarity fractions with total organic and IR analyses of eluted fractions, flue gas concentrations of trace elements by spark source mass spectrometry (SSMS) and atomic absorption spectroscopy (AAS), and biological assays of organic extracts.
Hemachandra, Chamini K; Pathiratne, Asoka
2017-01-01
Biological-effect-directed in vivo tests with model organisms are useful in assessing potential health risks associated with chemical contamination in surface waters. This study examined the applicability of two in vivo test systems, viz. plant (Allium cepa root based) tests and fish (Oreochromis niloticus erythrocyte based) tests, for screening the cytogenotoxic potential of raw source water, water treatment waste (effluents) and treated water of drinking water treatment plants (DWTPs), using two DWTPs associated with a major river in Sri Lanka. Measured physico-chemical parameters of the raw water, effluent and treated water samples complied with the respective Sri Lankan standards. In the in vivo tests, raw water induced statistically significant root growth retardation, mitodepression and chromosomal abnormalities in the root meristem of the plant, and micronuclei/nuclear bud formation and genetic damage (as reflected by comet scores) in the erythrocytes of the fish, compared to the aged tap water controls, signifying greater genotoxicity of the source water, especially in the dry period. The effluents provoked relatively high cytogenotoxic effects in both test systems, but in most cases the toxicity was considerably reduced to the raw-water level at a 1:8 effluent dilution. The in vivo tests indicated a reduction of cytogenotoxic potential in the tested drinking water samples. The results support the potential application of practically feasible in vivo biological test systems, such as the A. cepa root based tests and the fish erythrocyte based tests, as complementary tools for screening the cytogenotoxic potential of source water and water treatment waste reaching downstream aquatic ecosystems, and for evaluating the cytogenotoxicity-eliminating efficacy of DWTPs in different seasons, in view of human and ecological safety. Copyright © 2016 Elsevier Ltd. All rights reserved.
The Chuar Petroleum System, Arizona and Utah
Lillis, Paul G.
2016-01-01
The Neoproterozoic Chuar Group consists of marine mudstone, sandstone and dolomitic strata divided into the Galeros and Kwagunt Formations, and is exposed only in the eastern Grand Canyon, Arizona. Research by the U.S. Geological Survey (USGS) in the late 1980s identified strata within the group to be possible petroleum source rocks, and in particular the Walcott Member of the Kwagunt Formation. Industry interest in a Chuar oil play led to several exploratory wells drilled in the 1990s in southern Utah and northern Arizona to test the overlying Cambrian Tapeats Sandstone reservoir, and confirm the existence of the Chuar in subcrop. USGS geochemical analyses of Tapeats oil shows in two wells have been tentatively correlated to Chuar bitumen extracts. Distribution of the Chuar in the subsurface is poorly constrained with only five well penetrations, but recently published gravity/aeromagnetic interpretations provide further insight into the Chuar subcrop distribution. The Chuar petroleum system was reexamined as part of the USGS Paradox Basin resource assessment in 2011. A map was constructed to delineate the Chuar petroleum system that encompasses the projected Chuar source rock distribution and all oil shows in the Tapeats Sandstone, assuming that the Chuar is the most likely source for such oil shows. Two hypothetical plays were recognized but not assessed: (1) a conventional play with a Chuar source and Tapeats reservoir, and (2) an unconventional play with a Chuar source and reservoir. The conventional play has been discouraging because most surface structures have been tested by drilling with minimal petroleum shows, and there is some evidence that petroleum may have been flushed by CO2 from Tertiary volcanism. The unconventional play is untested and remains promising even though the subcrop distribution of source facies within the Chuar Group is largely unknown.
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeRosier, R.
1984-07-01
This volume of the report gives emission results from field tests of a crude-oil process heater burning a combination of oil and refinery gas. The heater had been modified by adding a system for injecting secondary air to reduce NOx emissions. One test was conducted with the staged air system (low NOx), and the other without (baseline). Tests included continuous monitoring of flue gas emissions and source assessment sampling system (SASS) sampling of the flue gas, with subsequent laboratory analysis of the samples utilizing gas chromatography (GC), infrared spectrometry (IR), gas chromatography/mass spectrometry (GC/MS), low resolution mass spectrometry (LRMS), and spark source mass spectrometry (SSMS) for trace metals. LRMS analysis suggested the presence of eight compound categories in the organic emissions during the baseline test and four in the low-NOx test.
NASA Technical Reports Server (NTRS)
Klos, Jacob; Palumbo, Daniel L.; Buehrle, Ralph D.; Williams, Earl G.; Valdivia, Nicolas; Herdic, Peter C.; Sklanka, Bernard
2005-01-01
A series of tests was planned and conducted in the Interior Noise Test Facility at Boeing Field, on the NASA Aries 757 flight research aircraft, and in the Structural Acoustic Loads and Transmission Facility at NASA Langley Research Center. These tests were designed to answer several questions concerning the use of array methods in flight. One focus of the tests was determining whether, and to what extent, array methods could be used to identify the effects of an acoustical treatment applied to a limited portion of an aircraft fuselage. Another focus of the tests was to verify that the arrays could be used to localize and quantify a known source purposely placed in front of the arrays. Thus the issues related to backside sources and flanking paths present in the complicated sound field were addressed during these tests. These issues were addressed through the use of reference transducers, both accelerometers mounted to the fuselage and microphones in the cabin, that were used to correlate the pressure holograms measured by the microphone arrays using either SVD methods or partial coherence methods. This correlation analysis accepts only energy that is coherent with the sources sensed by the reference transducers, allowing a noise control engineer to identify and study only the vibratory sources of interest. The remainder of this paper will present a detailed description of the test setups that were used in this test sequence and typical results of the NAH/IBEM analysis used to reconstruct the sound fields. Also, a comparison of data obtained in the laboratory environments and during flights of the 757 aircraft will be made.
An assessment technique for computer-socket manufacturing
Sanders, Joan; Severance, Michael
2015-01-01
An assessment strategy is presented for testing the quality of carving and forming of individual computer aided manufacturing facilities. The strategy is potentially useful to facilities making sockets and companies marketing manufacturing equipment. To execute the strategy, an evaluator fabricates a collection of test models and sockets using the manufacturing suite under evaluation, and then measures their shapes using scanning equipment. Overall socket quality is assessed by comparing socket shapes with electronic file shapes. Then model shapes are compared with electronic file shapes to characterize carving performance. Socket shapes are compared with model shapes to characterize forming performance. The mean radial error (MRE), which is the average difference in radii between the two shapes being compared, provides insight into sizing quality. Inter-quartile range (IQR), the range of radial error for the best matched half of the points on the surfaces being compared, provides insight into shape quality. By determining MRE and IQR for carving and forming separately, the source(s) of socket shape error may be pinpointed. The developed strategy may provide a useful tool to the prosthetics community and industry to help identify problems and limitations in computer aided manufacturing and insight into appropriate modifications to overcome them. PMID:21938663
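A short sketch of how the two metrics defined above could be computed once corresponding radial profiles are available; the alignment step is omitted and the radii are synthetic:

```python
# Minimal sketch of the two metrics defined above, assuming the two surfaces have
# already been aligned and sampled at corresponding points (radii in mm are invented).
import numpy as np

rng = np.random.default_rng(7)
radii_reference = 45.0 + rng.normal(0.0, 2.0, 5000)                   # e.g., electronic file shape
radii_measured = radii_reference - 0.3 + rng.normal(0.0, 0.4, 5000)   # e.g., carved model shape

radial_error = radii_measured - radii_reference

# Mean radial error: average difference in radii (sizing quality).
mre = radial_error.mean()

# Inter-quartile range: spread of radial error for the best-matched half of points
# (shape quality) -- approximated here as the 25th-to-75th percentile range.
q25, q75 = np.percentile(radial_error, [25, 75])
iqr = q75 - q25

print(f"MRE = {mre:.2f} mm, IQR = {iqr:.2f} mm")
```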
Minouflet, Marion; Ayrault, Sophie; Badot, Pierre-Marie; Cotelle, Sylvie; Ferard, Jean-François
2005-01-01
Since the middle of the 20th century, ionizing radiation from radioactive isotopes, including 137Cs, has been investigated to determine its genotoxic impact on living organisms. The present study was designed to compare the effectiveness of three plant bioassays in assessing DNA damage induced by low doses of 137Cs: the Vicia-micronucleus test (Vicia-MCN), the Tradescantia-micronucleus test (Trad-MCN) and the Tradescantia-stamen-hair mutation test (Trad-SH) were used. Vicia faba (broad bean) and Tradescantia clone 4430 (spiderwort) were exposed to 137Cs according to different scenarios: external and internal (contamination) irradiation. Experiments were conducted with various levels of radioactivity in solution or in soil, using solid or liquid 137Cs sources. The three bioassays showed different sensitivities to the treatments. Trad-MCN appeared to be the most sensitive test (significant response from 1.5 kBq/200 ml after 30 h of contamination). Moreover, at comparable doses, internal irradiation led to larger effects for all three bioassays. These bioassays are effective tests for assessing the genotoxic effects of radioactive 137Cs pollution.
Lee King, Patricia A; Pate, David J
2014-02-01
Perinatal HIV transmission disproportionately affects African American, Latina and potentially Hmong women in the United States. Understanding racially and ethnically diverse women's perceptions of and experiences with perinatal health care, HIV testing and HIV/AIDS may inform effective health communications to reduce the risk of perinatal HIV transmission among disproportionate risk groups. We used a qualitative descriptive research design with content analysis of five focus groups of African American, Caucasian, Hmong and Latina women of reproductive age with low socioeconomic status distinguished by their race/ethnicity or HIV status. A purposive stratified sample of 37 women shared their health-care experiences, health information sources and perceptions of HIV testing and HIV/AIDS. Women's responses highlighted the importance of developing and leveraging trusted provider and community-based relationships and assessing a woman's beliefs and values in her sociocultural context, to ensure clear, consistent and relevant communications. Perinatal health communications that are culturally sensitive and based on an assessment of women's knowledge and understanding of perinatal health and HIV/AIDS may be an effective tool for health educators addressing racial and ethnic disparities in perinatal HIV transmission.
NASA Astrophysics Data System (ADS)
Schultz, Michael; Verbesselt, Jan; Herold, Martin; Avitabile, Valerio
2013-10-01
Researchers who use remotely sensed data can spend half of their total effort on data preparation prior to analysis. If this preprocessing does not match the application, the time spent on data analysis can increase considerably and can lead to inaccuracies. Despite the existence of a number of methods for pre-processing Landsat time series, each method has shortcomings, particularly for mapping forest changes under varying illumination, data availability and atmospheric conditions. Based on the requirements of mapping forest changes as defined by the United Nations (UN) Reducing Emissions from Deforestation and Forest Degradation (REDD) program, the accurate reporting of the spatio-temporal properties of these changes is necessary. We compared the impact of three fundamentally different radiometric preprocessing techniques, Moderate Resolution Atmospheric TRANsmission (MODTRAN), Second Simulation of a Satellite Signal in the Solar Spectrum (6S) and simple Dark Object Subtraction (DOS), on mapping forest changes using Landsat time series data. A modification of the Breaks For Additive Season and Trend (BFAST) Monitor approach was used to jointly map the spatial and temporal agreement of forest changes at test sites in Ethiopia and Viet Nam. The suitability of the pre-processing methods for the occurring forest change drivers was assessed using recently captured ground truth and high resolution data (1000 points). A method for creating robust generic forest maps used for the sampling design is presented. An assessment of error sources identified haze as a major source of commission error in the time series analysis.
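Of the three pre-processing methods compared, simple dark object subtraction is the easiest to illustrate; the sketch below applies a per-band dark-object offset to a synthetic image stack. It is only a schematic of the DOS idea, not the authors' processing chain, and a real workflow would first convert Landsat digital numbers to radiance or reflectance.

```python
# Sketch of simple dark object subtraction (DOS), the least complex of the three
# radiometric pre-processing methods compared above. The image array is synthetic.
import numpy as np

rng = np.random.default_rng(3)
bands, rows, cols = 6, 200, 200
image = rng.integers(40, 255, size=(bands, rows, cols)).astype(np.float32)

def dark_object_subtraction(stack, percentile=0.1):
    """Subtract a per-band dark-object value (a low percentile of the band)."""
    corrected = np.empty_like(stack)
    for b in range(stack.shape[0]):
        dark_value = np.percentile(stack[b], percentile)   # proxy for path radiance
        corrected[b] = np.clip(stack[b] - dark_value, 0, None)
    return corrected

corrected = dark_object_subtraction(image)
print("per-band minima before:", image.min(axis=(1, 2)))
print("per-band minima after: ", corrected.min(axis=(1, 2)))
```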
Evenden, M L; Gries, R
2010-06-01
Sex pheromone monitoring lures from five different commercial sources were compared for their attractiveness to male diamondback moth, Plutella xylostella L. (Lepidoptera: Plutellidae) in canola, Brassica napus L., fields in western Canada. Lures that had the highest pheromone release rate, as determined by aeration analyses in the laboratory, were the least attractive in field tests. Lures from all the commercial sources tested released more (Z)-11-hexadecenal than (Z)-11-hexadecenyl acetate and the most attractive lures released a significantly higher aldehyde to acetate ratio than less attractive lures. Traps baited with sex pheromone lures from APTIV Inc. (Portland, OR) and ConTech Enterprises Inc. (Delta, BC, Canada) consistently captured more male diamondback moths than traps baited with lures from the other sources tested. In two different lure longevity field trapping experiments, older lures were more attractive to male diamondback moths than fresh lures. Pheromone release from aged lures was constant at very low release rates. The most attractive commercially available sex pheromone lures tested attracted fewer diamondback moth males than calling virgin female moths suggesting that research on the development of a more attractive synthetic sex pheromone lure is warranted.
Source-Type Identification Analysis Using Regional Seismic Moment Tensors
NASA Astrophysics Data System (ADS)
Chiang, A.; Dreger, D. S.; Ford, S. R.; Walter, W. R.
2012-12-01
Waveform inversion to determine the seismic moment tensor is a standard approach to determining the source mechanism of natural and manmade seismicity, and may be used to identify, or discriminate among, different types of seismic sources. The successful applications of the regional moment tensor method at the Nevada Test Site (NTS) and to the 2006 and 2009 North Korean nuclear tests (Ford et al., 2009a, 2009b, 2010) show that the method is robust and capable of source-type discrimination at regional distances. The well-separated populations of explosions, earthquakes and collapses on a Hudson et al. (1989) source-type diagram enable source-type discrimination; however, the question remains whether or not the separation of events is universal in other regions, where we have limited station coverage and knowledge of Earth structure. Ford et al. (2012) have shown that combining regional waveform data and P-wave first motions removes the CLVD-isotropic tradeoff and uniquely discriminates the 2009 North Korean test as an explosion. Therefore, including additional constraints from regional and teleseismic P-wave first motions enables source-type discrimination in regions with limited station coverage. We present moment tensor analyses of earthquakes and explosions (M6) from the Lop Nor and Semipalatinsk test sites for station paths crossing Kazakhstan and Western China. We also present analyses of smaller events from industrial sites. In these sparse-coverage situations we combine regional long-period waveforms and high-frequency P-wave polarity from the same stations, as well as from teleseismic arrays, to constrain the source type. Discrimination capability with respect to velocity model and station coverage is examined, and additionally we investigate the velocity-model dependence of vanishing free-surface traction effects on seismic moment tensor inversion of shallow sources and recovery of the explosive scalar moment. Our synthetic data tests indicate that biases in scalar seismic moment and discrimination for shallow sources are small and can be understood in a systematic manner. We are presently investigating the frequency dependence of vanishing traction for a very shallow (10 m depth) M2+ chemical explosion recorded at distances of several kilometers; preliminary results indicate that, in the typical frequency passband we employ, the bias does not affect our ability to retrieve the correct source mechanism but may affect retrieval of the correct scalar seismic moment. Finally, we assess discrimination capability in a composite P-value statistical framework.
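As background to the source-type diagrams mentioned above, the sketch below decomposes a hypothetical moment tensor into isotropic, CLVD and double-couple fractions under one common convention. Sign and normalization conventions vary in the literature, and this is not the authors' inversion or discrimination code.

```python
# Illustrative decomposition of a (hypothetical) moment tensor into isotropic,
# CLVD and double-couple fractions -- the quantities behind source-type diagrams.
# Several conventions exist; this follows one common deviatoric decomposition.
import numpy as np

M = np.array([[ 2.1,  0.3, -0.1],
              [ 0.3,  1.8,  0.2],
              [-0.1,  0.2,  1.5]])   # hypothetical, roughly explosion-like (units arbitrary)

iso = np.trace(M) / 3.0
M_dev = M - iso * np.eye(3)

# Deviatoric eigenvalues ordered by absolute value: |m1| <= |m2| <= |m3|
m = np.linalg.eigvalsh(M_dev)
m = m[np.argsort(np.abs(m))]
eps = -m[0] / abs(m[2]) if m[2] != 0 else 0.0   # CLVD parameter; 0 = pure double couple

m_dev_max = abs(m[2])
m_total = abs(iso) + m_dev_max
iso_frac = abs(iso) / m_total
clvd_frac = 2.0 * abs(eps) * (1.0 - iso_frac)
dc_frac = 1.0 - iso_frac - clvd_frac

print(f"ISO {iso_frac:.0%}  CLVD {clvd_frac:.0%}  DC {dc_frac:.0%}")
```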
TOTAL PARTICLE, SULFATE, AND ACIDIC AEROSOL EMISSIONS FROM KEROSENE SPACE HEATERS
Chamber studies were conducted on four unvented kerosene space heaters to assess emissions of total particle, sulfate, and acidic aerosol. The heaters tested represented four burner designs currently in use by the public. Kerosene space heaters are a potential source of fine part...
Experimental Creep Life Assessment for the Advanced Stirling Convertor Heater Head
NASA Technical Reports Server (NTRS)
Krause, David L.; Kalluri, Sreeramesh; Shah, Ashwin R.; Korovaichuk, Igor
2010-01-01
The United States Department of Energy is planning to develop the Advanced Stirling Radioisotope Generator (ASRG) for the National Aeronautics and Space Administration (NASA) for potential use on future space missions. The ASRG provides substantial efficiency and specific power improvements over radioisotope power systems of heritage designs. The ASRG would use General Purpose Heat Source modules as energy sources and the free-piston Advanced Stirling Convertor (ASC) to convert heat into electrical energy. Lockheed Martin Corporation of Valley Forge, Pennsylvania, is integrating the ASRG systems, and Sunpower, Inc., of Athens, Ohio, is designing and building the ASC. NASA Glenn Research Center of Cleveland, Ohio, manages the Sunpower contract and provides technology development in several areas for the ASC. One area is reliability assessment for the ASC heater head, a critical pressure vessel within which heat is converted into mechanical oscillation of a displacer piston. For high system efficiency, the ASC heater head operates at very high temperature (850 °C) and therefore is fabricated from an advanced heat-resistant nickel-based superalloy, Microcast MarM-247. Since use of MarM-247 in a thin-walled pressure vessel is atypical, much effort is required to assure that the system will operate reliably for its design life of 17 years. One life-limiting structural response for this application is creep; creep deformation is the accumulation of time-dependent inelastic strain under sustained loading over time. If allowed to progress, the deformation eventually results in creep rupture. Since creep material properties are not available in the open literature, a detailed creep life assessment of the ASC heater head is underway. This paper presents an overview of that creep life assessment approach, including the reliability-based creep criteria developed from coupon testing, and the associated heater head deterministic and probabilistic analyses. The approach also includes direct benchmark experimental creep assessment. This element provides high-fidelity creep testing of prototypical heater head test articles to investigate the relevant material issues and multiaxial stress state. Benchmark testing provides required data to evaluate the complex life assessment methodology and to validate that analysis. Results from current benchmark heater head tests and newly developed experimental methods are presented. In the concluding remarks, the test results are shown to compare favorably with the creep strain predictions and are the first experimental evidence for a robust ASC heater head creep life.
Davis, Gregg S; Waits, Kara; Nordstrom, Lora; Weaver, Brett; Aziz, Maliha; Gauld, Lori; Grande, Heidi; Bigler, Rick; Horwinski, Joseph; Porter, Stephen; Stegger, Marc; Johnson, James R; Liu, Cindy M; Price, Lance B
2015-09-15
Klebsiella pneumoniae is a common colonizer of the gastrointestinal tract of humans, companion animals, and livestock. To better understand potential contributions of foodborne K. pneumoniae to human clinical infections, we compared K. pneumoniae isolates from retail meat products and human clinical specimens to assess their similarity based on antibiotic resistance, genetic relatedness, and virulence. Klebsiella pneumoniae was isolated from retail meats from Flagstaff grocery stores in 2012 and from urine and blood specimens from Flagstaff Medical Center in 2011-2012. Isolates underwent antibiotic susceptibility testing and whole-genome sequencing. Genetic relatedness of the isolates was assessed using multilocus sequence typing and phylogenetic analyses. Extraintestinal virulence of several closely related meat-source and urine isolates was assessed using a murine sepsis model. Meat-source isolates were significantly more likely to be multidrug resistant and resistant to tetracycline and gentamicin than clinical isolates. Four sequence types occurred among both meat-source and clinical isolates. Phylogenetic analyses confirmed close relationships among meat-source and clinical isolates. Isolates from both sources showed similar virulence in the mouse sepsis model. Meat-source K. pneumoniae isolates were more likely than clinical isolates to be antibiotic resistant, which could reflect selective pressures from antibiotic use in food-animal production. The close genetic relatedness of meat-source and clinical isolates, coupled with similarities in virulence, suggest that the barriers to transmission between these 2 sources are low. Taken together, our results suggest that retail meat is a potential vehicle for transmitting virulent, antibiotic-resistant K. pneumoniae from food animals to humans. © The Author 2015. Published by Oxford University Press on behalf of the Infectious Diseases Society of America.
NASA Technical Reports Server (NTRS)
Conner, David A.; Page, Juliet A.
2002-01-01
To improve aircraft noise impact modeling capabilities and to provide a tool to aid in the development of low noise terminal area operations for rotorcraft and tiltrotors, the Rotorcraft Noise Model (RNM) was developed by the NASA Langley Research Center and Wyle Laboratories. RNM is a simulation program that predicts how sound will propagate through the atmosphere and accumulate at receiver locations on flat ground or varying terrain, for single and multiple vehicle flight operations. At the core of RNM are the vehicle noise sources, input as sound hemispheres. As the vehicle "flies" along its prescribed flight trajectory, the source sound propagation is simulated and accumulated at the receiver locations (single points of interest or multiple grid points) in a systematic time-based manner. These sound signals at the receiver locations may then be analyzed to obtain single event footprints, integrated noise contours, time histories, or numerous other features. RNM may also be used to generate spectral time history data over a ground mesh for the creation of single event sound animation videos. Acoustic properties of the noise source(s) are defined in terms of sound hemispheres that may be obtained from theoretical predictions, wind tunnel experimental results, flight test measurements, or a combination of the three. The sound hemispheres may contain broadband data (source levels as a function of one-third octave band) and pure-tone data (in the form of specific frequency sound pressure levels and phase). A PC executable version of RNM is publicly available and has been adopted by a number of organizations for Environmental Impact Assessment studies of rotorcraft noise. This paper provides a review of the required input data, the theoretical framework of RNM's propagation model and the output results. Code validation results are provided from a NATO helicopter noise flight test as well as a tiltrotor flight test program that used the RNM as a tool to aid in the development of low noise approach profiles.
Prototyping Control and Data Acquisition for the ITER Neutral Beam Test Facility
NASA Astrophysics Data System (ADS)
Luchetta, Adriano; Manduchi, Gabriele; Taliercio, Cesare; Soppelsa, Anton; Paolucci, Francesco; Sartori, Filippo; Barbato, Paolo; Breda, Mauro; Capobianco, Roberto; Molon, Federico; Moressa, Modesto; Polato, Sandro; Simionato, Paola; Zampiva, Enrico
2013-10-01
The ITER Neutral Beam Test Facility will be the project's R&D facility for heating neutral beam injectors (HNB) for fusion research operating with H/D negative ions. Its mission is to develop technology to build the HNB prototype injector meeting the stringent HNB requirements (16.5 MW injection power, -1 MeV acceleration energy, 40 A ion current and one hour continuous operation). Two test-beds will be built in sequence in the facility: first SPIDER, the ion source test-bed, to optimize the negative ion source performance, and second MITICA, the actual prototype injector, to optimize ion beam acceleration and neutralization. The SPIDER control and data acquisition system is under design. To validate the main architectural choices, a system prototype has been assembled and performance tests have been executed to assess the prototype's capability to meet the control and data acquisition system requirements. The prototype is based on open-source software frameworks running under Linux: EPICS is the slow control engine, MDSplus is the data handler and MARTe is the fast control manager. The prototype addresses low- and high-frequency data acquisition (10 kS/s and 10 MS/s, respectively), camera image acquisition, data archiving, data streaming, data retrieval and visualization, real-time fast control with a 100 μs control cycle, and supervisory control.
Fusar-Poli, Paolo; Rutigliano, Grazia; Stahl, Daniel; Schmidt, André; Ramella-Cravaro, Valentina; Hitesh, Shetty; McGuire, Philip
2016-12-01
Pretest risk estimation is routinely used in clinical medicine to inform further diagnostic testing in individuals with suspected diseases. To our knowledge, the overall characteristics and specific determinants of pretest risk of psychosis onset in individuals undergoing clinical high risk (CHR) assessment are unknown. To investigate the characteristics and determinants of pretest risk of psychosis onset in individuals undergoing CHR assessment and to develop and externally validate a pretest risk stratification model. Clinical register-based cohort study. Individuals were drawn from electronic, real-world, real-time clinical records relating to routine mental health care of CHR services in South London and the Maudsley National Health Service Trust in London, United Kingdom. The study included nonpsychotic individuals referred on suspicion of psychosis risk and assessed by the Outreach and Support in South London CHR service from 2002 to 2015. Model development and validation were performed with machine-learning methods based on the Least Absolute Shrinkage and Selection Operator (LASSO) for the Cox proportional hazards model. Pretest risk of psychosis onset in individuals undergoing CHR assessment. Predictors included age, sex, age × sex interaction, race/ethnicity, socioeconomic status, marital status, referral source, and referral year. A total of 710 nonpsychotic individuals undergoing CHR assessment were included. The mean age was 23 years. Three hundred ninety-nine individuals were men (56%), their race/ethnicity was heterogeneous, and they were referred from a variety of sources. The cumulative 6-year pretest risk of psychosis was 14.55% (95% CI, 11.71% to 17.99%), confirming substantial pretest risk enrichment during the recruitment of individuals undergoing CHR assessment. Race/ethnicity and source of referral were associated with pretest risk enrichment. The predictive model based on these factors was externally validated, showing moderately good discrimination and sufficient calibration. It was used to stratify individuals undergoing CHR assessment into 4 classes of pretest risk (6-year): low, 3.39% (95% CI, 0.96% to 11.56%); moderately low, 11.58% (95% CI, 8.10% to 16.40%); moderately high, 23.69% (95% CI, 16.58% to 33.20%); and high, 53.65% (95% CI, 36.78% to 72.46%). Significant risk enrichment occurs before individuals are assessed for a suspected CHR state. Race/ethnicity and source of referral are associated with pretest risk enrichment in individuals undergoing CHR assessment. A stratification model can identify individuals at differential pretest risk of psychosis. Identification of these subgroups may inform outreach campaigns and subsequent testing and eventually optimize psychosis prediction.
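A minimal sketch of the general modelling pattern described above (L1-penalized Cox regression followed by quartile risk classes), assuming the lifelines Python package and entirely synthetic data; it does not reproduce the study's model, predictors or validation.

```python
# Minimal sketch of an L1-penalized Cox model followed by stratification into four
# pretest-risk classes. Assumes the lifelines package; the synthetic data below
# only mimic the predictor structure and are not the study's data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 710
df = pd.DataFrame({
    "age": rng.normal(23, 5, n),
    "male": rng.integers(0, 2, n),
    "referral_source": rng.integers(0, 5, n),   # coded categories (illustrative)
    "referral_year": rng.integers(2002, 2016, n),
})
df["age_x_male"] = df["age"] * df["male"]

# Synthetic outcome: time to psychosis onset (years) with administrative censoring at 6 years.
hazard = np.exp(0.02 * (df["age"] - 23) + 0.3 * (df["referral_source"] == 0))
df["time_years"] = rng.exponential(20.0 / hazard)
df["onset"] = (df["time_years"] < 6).astype(int)
df.loc[df["onset"] == 0, "time_years"] = 6.0

cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)   # LASSO-type penalty
cph.fit(df, duration_col="time_years", event_col="onset")

# Stratify individuals into four pretest-risk classes from the fitted risk score.
df["risk_class"] = pd.qcut(cph.predict_partial_hazard(df), q=4,
                           labels=["low", "moderately low", "moderately high", "high"])
print(df.groupby("risk_class", observed=True)["onset"].mean())
```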
Kalman, Lisa V.; Lubin, Ira M.; Barker, Shannon; du Sart, Desiree; Elles, Rob; Grody, Wayne W.; Pazzagli, Mario; Richards, Sue; Schrijver, Iris; Zehnbauer, Barbara
2015-01-01
Context Participation in proficiency testing (PT) or external quality assessment (EQA) programs allows the assessment and comparison of test performance among different clinical laboratories and technologies. In addition to the approximately 2300 tests for individual genetic disorders, recent advances in technology have enabled the development of clinical tests which quickly and economically analyze the entire human genome. New PT/EQA approaches are needed to ensure the continued quality of these complex tests. Objective To review the availability and scope of PT/EQA for molecular genetic testing for inherited conditions in Europe, Australasia and the United States; to evaluate the successes and demonstrated value of available PT/EQA programs; and to examine the challenges to the provision of comprehensive PT/EQA posed by new laboratory practices and methodologies. Data Sources The available literature on this topic was reviewed and supplemented with personal experiences of several PT/EQA providers. Conclusions PT/EQA schemes are available for common genetic disorders tested in many clinical laboratories, but are not available for most genetic tests offered by only one or a few laboratories. Provision of broad, method-based PT schemes, such as DNA sequencing, would allow assessment of a large number of tests for which formal PT is not currently available. Participation in PT/EQA improves the quality of testing by identifying inaccuracies that laboratories can trace to errors in the testing process. Areas of research and development to ensure that PT/EQA programs can meet the needs of new and evolving genetic tests and technologies are identified and discussed. PMID:23808472
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vevera, Bradley J; Hyres, James W; McClintock, David A
2014-01-01
Irradiated AISI 316L stainless steel disks were removed from the Spallation Neutron Source (SNS) for post-irradiation examination (PIE) to assess mechanical property changes due to radiation damage and erosion of the target vessel. Topics reviewed include high-resolution photography of the disk specimens, cleaning to remove mercury (Hg) residue and surface oxides, profile mapping of cavitation pits using high frequency ultrasonic testing (UT), high-resolution surface replication, and machining of test specimens using wire electrical discharge machining (EDM), tensile testing, Rockwell Superficial hardness testing, Vickers microhardness testing, scanning electron microscopy (SEM), and energy dispersive spectroscopy (EDS). The effectiveness of the cleaning procedure was evident in the pre- and post-cleaning photography and permitted accurate placement of the test specimens on the disks. Due to the limited amount of material available and the unique geometry of the disks, machine fixturing and test specimen design were critical aspects of this work. Multiple designs were considered and refined during mock-up test runs on unirradiated disks. The techniques used to successfully machine and test the various specimens will be presented along with a summary of important findings from the laboratory examinations.
Assessment of source-based nitrogen removal alternatives in leather tanning industry wastewater.
Zengin, G; Olmez, T; Doğruel, S; Kabdaşli, I; Tünay, O
2002-01-01
Nitrogen is an important parameter of leather tanning wastewaters. Magnesium ammonium phosphate (MAP) precipitation is a chemical treatment alternative for ammonia removal. In this study, a detailed source-based wastewater characterisation of a bovine leather tannery was made, and nitrogen speciation as well as other basic pollutant parameter values were evaluated. This evaluation led to the definition of alternatives for source-based MAP treatment. MAP precipitation experiments conducted on these alternatives yielded over 90% ammonia removal at pH 9.5 using stoichiometric doses. Among the alternatives tested, liming-deliming and bating-washing was found to be the most advantageous, providing 71% ammonia removal.
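A worked example of a stoichiometric MAP dose calculation at the 1:1:1 Mg:NH4:PO4 molar ratio; the ammonia concentration and reagent choices below are invented for illustration and are not taken from the study.

```python
# Worked example of a stoichiometric dose calculation for MAP precipitation
# (Mg : NH4 : PO4 at a 1:1:1 molar ratio). The ammonia concentration and the
# choice of reagents are hypothetical; they are not the study's values.
NH4_N = 800.0             # mg N/L in the segregated stream (invented value)
M_N = 14.01               # g/mol
M_MGCL2_6H2O = 203.30     # g/mol, assumed magnesium source
M_NA2HPO4_12H2O = 358.14  # g/mol, assumed phosphate source

moles_n = NH4_N / 1000.0 / M_N                        # mol NH4-N per litre
mgcl2_dose = moles_n * M_MGCL2_6H2O * 1000.0          # mg/L of MgCl2.6H2O
phosphate_dose = moles_n * M_NA2HPO4_12H2O * 1000.0   # mg/L of Na2HPO4.12H2O

print(f"NH4-N: {NH4_N:.0f} mg/L -> dose {mgcl2_dose:.0f} mg/L MgCl2.6H2O "
      f"and {phosphate_dose:.0f} mg/L Na2HPO4.12H2O (stoichiometric, pH ~9.5)")
```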
Emotional memory: No source memory without old-new recognition.
Bell, Raoul; Mieth, Laura; Buchner, Axel
2017-02-01
Findings reported in the memory literature suggest that the emotional components of an encoding episode can be dissociated from nonemotional memory. In particular, it has been found that the previous association with threatening events can be retrieved in aversive conditioning even in the absence of item identification. In the present study, we test whether emotional source memory can be independent of item recognition. Participants saw pictures of snakes paired with threatening and nonthreatening context information (poisonousness or nonpoisonousness). In the source memory test, participants were required to remember whether a snake was associated with poisonousness or nonpoisonousness. A simple extension of a well-established multinomial source monitoring model was used to measure source memory for unrecognized items. By using this model, it was possible to assess directly whether participants were able to associate a previously seen snake with poisonousness or nonpoisonousness even if the snake itself was not recognized as having been presented during the experiment. In 3 experiments, emotional source memory was only found for recognized items. While source memory for recognized items differed between emotional and nonemotional information, source memory for unrecognized items was equally absent for emotional and nonemotional information. We conclude that emotional context information is bound to item representations and cannot be retrieved in the absence of item recognition. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
MacLean, Charles E; Lamparello, Adam
2014-01-01
Forensic DNA Phenotyping ("FDP"), estimating the externally visible characteristics ("EVCs") of the source of human DNA left at a crime scene, is evolving from science fiction toward science fact. FDP can already identify a source's gender with 100% accuracy, and likely hair color, iris color, adult height, and a number of other EVCs with accuracy rates approaching 70%. Patent applications have been filed for approaches to generating 3D likenesses of DNA sources based on the DNA alone. Nonetheless, criminal investigators, particularly in the United States, have been reticent to apply FDP in their casework. The reticence is likely related to a number of perceived and real dilemmas associated with FDP: is FDP racial profiling, should we test unknown and unseen physical conditions, does testing for behavioral characteristics impermissibly violate the source's privacy, ought testing be permitted for samples from known sources or DNA databases, and should FDP be limited to use in investigations only or is FDP appropriate for use in a criminal court. As this article explains, although those dilemmas are substantive, they are not insurmountable, and can be quite easily managed with appropriate regulation and protocols. As FDP continues to develop, there will be less need for criminal investigators to shy away from FDP. Cold cases, missing persons, and victims in crimes without other evidence will one day soon all be well served by FDP.
DOE Office of Scientific and Technical Information (OSTI.GOV)
E.H. Seabury; D.L. Chichester; C.J. Wharton
2008-08-01
Prompt Gamma Neutron Activation Analysis (PGNAA) systems employ neutrons as a probe to interrogate items, e.g. chemical warfare materiel-filled munitions. The choice of a neutron source in field-portable systems is determined by its ability to excite nuclei of interest, operational concerns such as radiological safety and ease-of-use, and cost. Idaho National Laboratory’s PINS Chemical Assay System has traditionally used a Cf-252 isotopic neutron source, but recently a Deuterium-Tritium (DT) Electronic Neutron Generator (ENG) has been tested as an alternate neutron source. This paper presents the results of using both of these neutron sources to interrogate chemical warfare materiel (CWM) and high explosive (HE) filled munitions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seabury, E. H.; Chichester, D. L.; Wharton, C. J.
2009-03-10
Prompt Gamma Neutron Activation Analysis (PGNAA) systems employ neutrons as a probe to interrogate items, e.g. chemical warfare materiel-filled munitions. The choice of a neutron source in field-portable systems is determined by its ability to excite nuclei of interest, operational concerns such as radiological safety and ease-of-use, and cost. Idaho National Laboratory's PINS Chemical Assay System has traditionally used a {sup 252}Cf isotopic neutron source, but recently a deuterium-tritium (DT) electronic neutron generator (ENG) has been tested as an alternate neutron source. This paper presents the results of using both of these neutron sources to interrogate chemical warfare materiel (CWM) and high explosive (HE) filled munitions.
Groshkova, Teodora; Best, David; White, William
2013-03-01
Sociological work on social capital and its impact on health behaviours has been translated into the addiction field in the form of 'recovery capital' as the construct for assessing individual progress on a recovery journey. Yet there has been little attempt to quantify recovery capital. The aim of the project was to create a scale that assessed addiction recovery capital. Initial focus group work identified and tested candidate items and domains, followed by data collection from multiple sources to enable psychometric assessment of a scale measuring recovery capital. The scale shows moderate test-retest reliability at 1 week and acceptable concurrent validity. Principal component analysis determined a single-factor structure. The Assessment of Recovery Capital (ARC) is a brief and easy-to-administer measure of recovery capital that has acceptable psychometric properties and may be a useful complement to deficit-based assessment and outcome monitoring instruments for substance-dependent individuals in and out of treatment. © 2012 Australasian Professional Society on Alcohol and other Drugs.
Neurobehavioral epidemiology: application in risk assessment.
Grandjean, P; White, R F; Weihe, P
1996-01-01
Neurobehavioral epidemiology may contribute information to risk assessment in relation to a) characterization of neurotoxicity and its time course; b) the dose-effect relationship; c) the dose-response relationship; and d) predisposing factors. The quality of this information relies on the validity of the exposure data, the validity and sensitivity of neurobehavioral function tests, and the degree to which sources of bias are controlled. With epidemiologic studies of methylmercury-associated neurotoxicity as an example, the field of research involves numerous uncertainties that should be taken into account in the risk assessment process. PMID:9182047
Boncyk, Wayne C.; Markham, Brian L.; Barker, John L.; Helder, Dennis
1996-01-01
The Landsat-7 Image Assessment System (IAS), part of the Landsat-7 Ground System, will calibrate and evaluate the radiometric and geometric performance of the Enhanced Thematic Mapper Plus (ETM+) instrument. The IAS incorporates new instrument radiometric artifact correction and absolute radiometric calibration techniques which overcome some limitations to calibration accuracy inherent in historical calibration methods. Knowledge of ETM+ instrument characteristics gleaned from analysis of archival Thematic Mapper in-flight data and from ETM+ prelaunch tests allows the determination and quantification of the sources of instrument artifacts. This a priori knowledge will be utilized in IAS algorithms designed to minimize the effects of the noise sources before calibration, in both ETM+ image and calibration data.
The Use of EEG as a Workload Assessment Tool in Flight Test
1993-10-01
...resource, single pool, mental model (Wickens) [9], which postulates that the human has a limited source of mental potential and when tasked with multiple... psychological spectrum presents an interesting challenge for future research. [Figure: EP amplitude (microvolts), single task vs. difficult task.] ...for example, they obtained a p value of .000025 for a single test and then applied a Bonferroni correction to yield a conservatively corrected value of p
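The Bonferroni adjustment mentioned in this excerpt is simple to reproduce. The Python sketch below multiplies a per-test p value by an assumed number of simultaneous comparisons; only the single-test p value of .000025 comes from the excerpt, and the count of 40 comparisons is hypothetical.

```python
# Bonferroni correction: multiply the per-test p-value by the number of
# comparisons made (equivalently, divide the alpha level by that number).
p_single = 0.000025          # per-test p-value quoted in the excerpt
n_comparisons = 40           # assumed number of simultaneous EEG comparisons (hypothetical)

p_corrected = min(p_single * n_comparisons, 1.0)
print(f"Bonferroni-corrected p = {p_corrected:.6f}")   # 0.001000 under these assumptions
```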
Weppelmann, Thomas A; Alam, Meer T; Widmer, Jocelyn; Morrissey, David; Rashid, Mohammed H; De Rochars, Valery M Beau; Morris, J Glenn; Ali, Afsar; Johnson, Judith A
2014-12-01
In 2010, a magnitude 7.0 earthquake struck Haiti, severely damaging the drinking and wastewater infrastructure and leaving millions homeless. Compounding this problem, the introduction of Vibrio cholerae resulted in a massive cholera outbreak that infected over 700,000 people and threatened the safety of Haiti's drinking water. To mitigate this public health crisis, non-government organizations installed thousands of wells to provide communities with safe drinking water. However, despite increased access, Haiti currently lacks the monitoring capacity to assure the microbial safety of any of its water resources. For these reasons, this study was designed to assess the feasibility of using a simple, low-cost method to detect indicators of fecal contamination of drinking water that could be implemented at the community level. Water samples from 358 sources of drinking water in the Léogâne flood basin were screened with a commercially available hydrogen sulfide test and a standard membrane method for the enumeration of thermotolerant coliforms. When compared with the gold standard method, the hydrogen sulfide test had a sensitivity of 65 % and a specificity of 93 %. While the sensitivity of the assay increased at higher fecal coliform concentrations, it never exceeded 88 %, even with fecal coliform concentrations greater than 100 colony-forming units per 100 ml. While its simplicity makes the hydrogen sulfide test attractive for assessing water quality in low-resource settings, the low sensitivity raises concerns about its use as the sole indicator of the presence or absence of fecal coliforms in individual or community water sources.
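As a quick illustration of how the reported 65% sensitivity and 93% specificity arise from the comparison against the gold-standard membrane method, the Python sketch below computes both rates from a 2x2 table; the counts are hypothetical and chosen only so that the quoted percentages are reproduced.

```python
# Hypothetical 2x2 comparison of the hydrogen sulfide (H2S) screening test
# against the gold-standard membrane-filtration result (counts are assumed,
# not taken from the study).
tp, fn = 65, 35   # contaminated samples: H2S-positive / H2S-negative
tn, fp = 93, 7    # clean samples:        H2S-negative / H2S-positive

sensitivity = tp / (tp + fn)   # fraction of contaminated samples the H2S test detects
specificity = tn / (tn + fp)   # fraction of clean samples the H2S test correctly clears
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```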
Weppelmann, Thomas A.; Alam, Meer T.; Widmer, Jocelyn; Morrissey, David; Rashid, Mohammed H.; Beau De Rochars, Valery M.; Morris, J. Glenn; Ali, Afsar; Johnson, Judith A.
2014-01-01
In 2010, a magnitude 7.0 earthquake struck Haiti, severely damaging the drinking and waste water infrastructure and leaving millions homeless. Compounding this problem, the introduction of Vibrio cholerae resulted in a massive cholera outbreak that infected over 700,000 people and threatened the safety of Haiti’s drinking water. To mitigate this public health crisis, non-government organizations installed thousands of wells to provide communities with safe drinking water. However, despite increased access, Haiti currently lacks the monitoring capacity to assure the microbial safety of any of its water resources. For these reasons, this study was designed to assess the feasibility of using a simple, low-cost method to detect indicators of fecal contamination of drinking water that could be implemented at the community level. Water samples from 358 sources of drinking water in the Léogâne flood basin were screened with a commercially available hydrogen sulfide test and a standard membrane method for the enumeration of thermotolerant coliforms. When compared with the gold standard method, the hydrogen sulfide test had a sensitivity of 65% and a specificity of 93%. While the sensitivity of the assay increased at higher fecal coliform concentrations, it never exceeded 88%, even with fecal coliform concentrations greater than 100 colony-forming units per 100 milliliters. While its simplicity makes the hydrogen sulfide test attractive for assessing water quality in low-resource settings, the low sensitivity raises concerns about its use as the sole indicator of the presence or absence of fecal coliforms in individual or community water sources. PMID:25182685
NASA Technical Reports Server (NTRS)
Schmitt, Jeff G.; Stahnke, Brian
2017-01-01
This report describes test results from an assessment of the acoustically treated 9x15 Foot Low Speed Wind Tunnel at the NASA Glenn Research Center in Cleveland, Ohio in July of 2016. The tests were conducted in accordance with the recently adopted international standard ISO 26101-2012 on Qualification of Free Field Test Environments. This method involves moving a microphone relative to a source and comparing the sound pressure level versus distance measurements with theoretical inverse square law spreading.
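A minimal sketch of the inverse-square-law comparison underlying this kind of free-field qualification is shown below; the microphone distances and levels are assumed values rather than measurements from the 9x15 facility, and the comparison against the ISO 26101 tolerance limits is omitted.

```python
import numpy as np

# Ideal free-field decay from a point source follows 20*log10(d/d_ref), i.e.
# 6 dB per doubling of distance.  Deviations of the measured levels from this
# curve are what the qualification procedure checks against the standard's
# tolerances.
d = np.array([1.0, 2.0, 4.0, 8.0])                    # microphone distances, m (assumed)
spl_measured = np.array([94.0, 87.8, 82.1, 75.6])     # measured SPL, dB (assumed)

spl_ideal = spl_measured[0] - 20.0 * np.log10(d / d[0])   # ideal inverse-square decay
deviation_db = spl_measured - spl_ideal
print(deviation_db)   # roughly [0.0, -0.2, 0.1, -0.3] dB under these assumptions
```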
NASA Technical Reports Server (NTRS)
Hass, Neal; Mizukami, Masashi; Neal, Bradford A.; St. John, Clinton; Beil, Robert J.; Griffin, Timothy P.
1999-01-01
This paper presents pertinent results and assessment of propellant feed system leak detection as applied to the Linear Aerospike SR-71 Experiment (LASRE) program flown at the NASA Dryden Flight Research Center, Edwards, California. The LASRE was a flight test of an aerospike rocket engine using liquid oxygen and high-pressure gaseous hydrogen as propellants. The flight safety of the crew and the experiment demanded proven technologies and techniques that could detect leaks and assess the integrity of hazardous propellant feed systems. Point source detection and systematic detection were used. Point source detection was adequate for catching gross leakage from components of the propellant feed systems, but insufficient for clearing LASRE to levels of acceptability. Systematic detection, which used high-resolution instrumentation to evaluate the health of the system within a closed volume, provided a better means for assessing leak hazards. Oxygen sensors detected a leak rate of approximately 0.04 cubic inches per second of liquid oxygen. Pressure sensor data revealed speculated cryogenic boiloff through the fittings of the oxygen system, but location of the source(s) was indeterminable. Ultimately, LASRE was cancelled because leak detection techniques were unable to verify that oxygen levels could be maintained below flammability limits.
Drinking water quality and source reliability in rural Ashanti region, Ghana.
Arnold, Meghan; VanDerslice, James A; Taylor, Brooke; Benson, Scott; Allen, Sam; Johnson, Mark; Kiefer, Joe; Boakye, Isaac; Arhinn, Bernard; Crookston, Benjamin T; Ansong, Daniel
2013-03-01
Site-specific information about local water sources is an important part of a community-driven effort to improve environmental conditions. The purpose of this assessment was to gather this information for residents of rural villages in Ghana. Sanitary surveys and bacteriological testing for total coliforms and Escherichia coli (EC) using Colilert(®) were conducted at nearly 80 water sources serving eight villages. A focus group was carried out to assess the desirability and perceived quality of water sources. Standpipes accounted for almost half of the available water sources; however, a third of them were not functioning at the time of the survey. EC bacteria were found in the majority of shallow wells (80%), rivers (67%), and standpipes (61%), as well as 28% of dug wells. Boreholes were free of EC. Residents felt that the standpipes and boreholes produced safe drinking water. Intermittent service and poor water quality from the piped supply has led to limited access to drinking water. The perception of residents, that the water from standpipes is clean and does not need to be treated at home, is particularly troubling in light of the poor bacteriological quality of water from the standpipes.
Dueri, Sibylle; Marinov, Dimitar; Fiandrino, Annie; Tronczyński, Jacek; Zaldívar, José-Manuel
2010-01-01
A 3D hydrodynamic and contaminant fate model was implemented for polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) in Thau lagoon. The hydrodynamic model was tested against temperature and salinity measurements, while the contaminant fate model was assessed against available data collected at different stations inside the lagoon. The model results allow an assessment of the spatial and temporal variability of the distribution of contaminants in the lagoon, the seasonality of loads and the role of atmospheric deposition for the input of PCDD/Fs. The outcome suggests that air is an important source of PCDD/Fs for this ecosystem, therefore the monitoring of air pollution is very appropriate for assessing the inputs of these contaminants. These results call for the development of integrated environmental protection policies. PMID:20617040
The Use of Structural-Acoustic Techniques to Assess Potential Structural Damage From Sonic Booms
NASA Technical Reports Server (NTRS)
Garrelick, Joel; Martini, Kyle
1996-01-01
The potential impact of supersonic operations includes structural damage from the sonic boom overpressure. This paper describes a study of how structural-acoustic modeling and testing techniques may be used to assess the potential for such damage in the absence of actual flyovers. Procedures are described whereby transfer functions relating structural response to sonic boom signature may be obtained with a stationary acoustic source and appropriate data processing. Further, by invoking structural-acoustic reciprocity, these transfer functions may also be acquired by measuring the radiated sound from the structure under a mechanical drive. The approach is based on the fundamental assumption of linearity, both with regard to the (acoustic) propagation of the boom in the vicinity of the structure and to the structure's response. Practical issues revolve around acoustic far field and source directivity requirements. The technique was implemented on a specially fabricated test structure at Edwards AFB, CA with the support of Wyle Laboratories, Inc. Blank shots from a cannon served as our acoustic source and taps from an instrumented hammer generated the mechanical drive. Simulated response functions were constructed. Results of comparisons with corresponding measurements recorded during dedicated supersonic flyovers with F-15 aircraft are presented for a number of sensor placements.
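The transfer-function idea described above can be illustrated with a generic frequency-response estimate from measured drive and response time histories. The Python sketch below uses the standard H1 estimator (cross-spectrum divided by input auto-spectrum) on synthetic signals; it is not the authors' processing chain, and the signal definitions are placeholders.

```python
import numpy as np
from scipy import signal

# H1 transfer-function estimate H(f) = Sxy / Sxx between a drive x(t)
# (e.g. an acoustic blast or hammer tap train) and a structural response y(t).
# Both signals here are synthetic stand-ins for measured time histories.
fs = 2048.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
x = rng.standard_normal(t.size)                                   # broadband drive
impulse_response = np.exp(-50 * t[:256]) * np.sin(2 * np.pi * 40 * t[:256])
y = np.convolve(x, impulse_response, mode="same")                 # simulated response

f, Sxy = signal.csd(x, y, fs=fs, nperseg=1024)   # cross-spectral density
_, Sxx = signal.welch(x, fs=fs, nperseg=1024)    # drive auto-spectral density
H1 = Sxy / Sxx                                   # frequency response function
```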
Development of a hardware-based AC microgrid for AC stability assessment
NASA Astrophysics Data System (ADS)
Swanson, Robert R.
As more power electronic-based devices enable the development of high-bandwidth AC microgrids, the topic of microgrid power distribution stability has become of increased interest. Recently, researchers have proposed a relatively straightforward method to assess the stability of AC systems based upon the time-constants of sources, the net bus capacitance, and the rate limits of sources. In this research, a focus has been to develop a hardware test system to evaluate AC system stability. As a first step, a time domain model of a two converter microgrid was established in which a three phase inverter acts as a power source and an active rectifier serves as an adjustable constant power AC load. The constant power load can be utilized to create rapid power flow transients to the generating system. As a second step, the inverter and active rectifier were designed using a Smart Power Module IGBT for switching and an embedded microcontroller as a processor for algorithm implementation. The inverter and active rectifier were designed to operate simultaneously using a synchronization signal to ensure each respective local controller operates in a common reference frame. Finally, the physical system was created and initial testing performed to validate the hardware functionality as a variable amplitude and variable frequency AC system.
Breastfeeding and Mental and Motor Development at 5 ½ Years
Clark, Katy M.; Castillo, Marcela; Calatroni, Agustin; Walter, Tomas; Cayazzo, Marisol; Pino, Paulina; Lozoff, Betsy
2006-01-01
Objective Breastfeeding is associated with better child development outcomes, but uncertainty remains primarily due to the close relationship between breastfeeding and socioeconomic status. This study assesses the issue in a low socioeconomic status sample where breastfeeding was close to universal. Methods 784 Chilean children were followed longitudinally from infancy. All but 4 were initially breastfed, 40% nursed beyond 12 months, and infant growth was normal. Child development was assessed at 5 ½ years by a cognitive, language, and motor test battery. The duration of breastfeeding as the sole milk source was analyzed as a continuous variable, adjusting for a comprehensive set of background factors. Results The relationship between breastfeeding and most 5 ½ -year developmental outcomes was non-linear, with poorer outcome for periods of breastfeeding as the sole milk source for < 2 months or > 8 months – statistically significant for language, motor, and one comprehensive cognitive test, with a suggestive trend for IQ. Conclusions The observed non-linear relationships showed that breastfeeding as the sole milk source for < 2 months or > 8 months, compared to 2–8 months, was associated with poorer development in this sample. The latter finding requires replication in other samples where long breastfeeding is common and socioeconomic status is relatively homogeneous. PMID:16530141
Semaphore network encryption report
NASA Astrophysics Data System (ADS)
Johnson, Karen L.
1994-03-01
This paper documents the results of a preliminary assessment performed on the commercial off-the-shelf (COTS) Semaphore Communications Corporation (SCC) Network Security System (NSS). The Semaphore NSS is a family of products designed to address important network security concerns, such as network source address authentication and data privacy. The assessment was performed in the INFOSEC Core Integration Laboratory, and its scope was product usability focusing on interoperability and system performance in an existing operational network. Included in this paper are preliminary findings. Fundamental features and functionality of the Semaphore NSS are identified, followed by details of the assessment, including test descriptions and results. A summary of test results and future plans are also included. These findings will be useful to those investigating the use of commercially available solutions to network authentication and data privacy.
40 CFR 63.11502 - What definitions apply to this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
...: process knowledge, an engineering assessment, or test data. Byproduct means a chemical (liquid, gas, or... (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Chemical Manufacturing Area Sources... system(s); (5) A gas stream routed to other processes for reaction or other use in another process (i.e...
Additional Evidence for the Accuracy of Biographical Data: Long-Term Retest and Observer Ratings.
ERIC Educational Resources Information Center
Shaffer, Garnett Stokes; And Others
1986-01-01
Investigated accuracy of responses to biodata questionnaire using a test-retest design and informed external observers for verification. Responses from 237 subjects and 200 observers provided evidence that many responses to biodata questionnaire were accurate. Assessed sources of inaccuracy, including social desirability effects, and noted…
2013-01-01
Background There is little or no information available on the impact of funding by the food industry on trial outcomes and methodological quality of synbiotics, probiotics and prebiotics research in infants. The objective of this study was to compare the methodological quality and outcomes of food industry-sponsored trials versus non-industry-sponsored trials, with regard to supplementation of synbiotics, probiotics and prebiotics in infant formula. Methods A comprehensive search was conducted to identify published and unpublished randomized clinical trials (RCTs). Cochrane methodology was used to assess the risk of bias of included RCTs in the following domains: 1) sequence generation; 2) allocation concealment; 3) blinding; 4) incomplete outcome data; 5) selective outcome reporting; and 6) other bias. Clinical outcomes and authors’ conclusions were reported in frequencies and percentages. The association between source of funding, risk of bias, clinical outcomes and conclusions was assessed using Pearson’s Chi-square test and the Fisher’s exact test. A p-value < 0.05 was considered statistically significant. Results Sixty-seven completed and 3 ongoing RCTs were included. Forty (59.7%) were funded by the food industry, 11 (16.4%) by non-industry entities and 16 (23.9%) did not specify source of funding. Several risk of bias domains, especially sequence generation, allocation concealment and blinding, were not adequately reported. There was no significant association between the source of funding and sequence generation, allocation concealment, blinding and selective reporting, the majority of reported clinical outcomes, or authors’ conclusions. On the other hand, source of funding was significantly associated with the domains of incomplete outcome data and freedom from other bias, as well as with reported antibiotic use and conclusions on weight gain. Conclusion In RCTs on infants fed infant formula containing probiotics, prebiotics or synbiotics, the source of funding did not influence the majority of outcomes in favour of the sponsors’ products. More non-industry funded research is needed to further assess the impact of funding on methodological quality, reported clinical outcomes and authors’ conclusions. PMID:24219082
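The association tests named in the Methods can be illustrated with a small contingency-table example. The counts below are hypothetical (not taken from the review) and simply show how a chi-square test and Fisher's exact test would be applied to a funding-source by risk-of-bias-domain table.

```python
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

# Hypothetical 2x2 table: funding source against adequacy of one risk-of-bias
# domain.  Counts are invented for illustration only.
table = np.array([[12, 28],    # industry-funded:     adequate / not adequate (assumed)
                  [ 6,  5]])   # non-industry-funded: adequate / not adequate (assumed)

chi2, p_chi2, dof, _ = chi2_contingency(table)
odds_ratio, p_fisher = fisher_exact(table)   # preferred when expected counts are small
print(f"chi-square p = {p_chi2:.3f}, Fisher exact p = {p_fisher:.3f}")
```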
3D source localization of interictal spikes in epilepsy patients with MRI lesions
NASA Astrophysics Data System (ADS)
Ding, Lei; Worrell, Gregory A.; Lagerlund, Terrence D.; He, Bin
2006-08-01
The present study aims to accurately localize epileptogenic regions which are responsible for epileptic activities in epilepsy patients by means of a new subspace source localization approach, i.e. first principle vectors (FINE), using scalp EEG recordings. Computer simulations were first performed to assess source localization accuracy of FINE in the clinical electrode set-up. The source localization results from FINE were compared with the results from a classic subspace source localization approach, i.e. MUSIC, and their differences were tested statistically using the paired t-test. Other factors influencing the source localization accuracy were assessed statistically by ANOVA. The interictal epileptiform spike data from three adult epilepsy patients with medically intractable partial epilepsy and well-defined symptomatic MRI lesions were then studied using both FINE and MUSIC. The comparison between the electrical sources estimated by the subspace source localization approaches and MRI lesions was made through the coregistration between the EEG recordings and MRI scans. The accuracy of estimations made by FINE and MUSIC was also evaluated and compared by R2 statistic, which was used to indicate the goodness-of-fit of the estimated sources to the scalp EEG recordings. The three-concentric-spheres head volume conductor model was built for each patient with three spheres of different radii which takes the individual head size and skull thickness into consideration. The results from computer simulations indicate that the improvement of source spatial resolvability and localization accuracy of FINE as compared with MUSIC is significant when simulated sources are closely spaced, deep, or signal-to-noise ratio is low in a clinical electrode set-up. The interictal electrical generators estimated by FINE and MUSIC are in concordance with the patients' structural abnormality, i.e. MRI lesions, in all three patients. The higher R2 values achieved by FINE than MUSIC indicate that FINE provides a more satisfactory fitting of the scalp potential measurements than MUSIC in all patients. The present results suggest that FINE provides a useful brain source imaging technique, from clinical EEG recordings, for identifying and localizing epileptogenic regions in epilepsy patients with focal partial seizures. The present study may lead to the establishment of a high-resolution source localization technique from scalp-recorded EEGs for aiding presurgical planning in epilepsy patients.
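The R² statistic used above to compare FINE and MUSIC fits is the usual goodness-of-fit measure between the measured scalp potentials and the potentials predicted by the estimated sources. A minimal sketch with placeholder electrode values:

```python
import numpy as np

# R^2 goodness of fit between measured scalp potentials and the potentials
# predicted by forward-modeling the estimated source.  Values are placeholders.
measured = np.array([2.1, -0.5, 1.3, 0.8, -1.9, 0.2])    # scalp potentials (assumed)
predicted = np.array([2.0, -0.4, 1.1, 0.9, -1.7, 0.1])   # forward-modeled potentials (assumed)

ss_res = np.sum((measured - predicted) ** 2)
ss_tot = np.sum((measured - measured.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot   # closer to 1 means a better fit to the recordings
print(f"R^2 = {r2:.3f}")
```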
Respiratory source control using a surgical mask: An in vitro study
Patel, Rajeev B.; Skaria, Shaji D.; Mansour, Mohamed M.; Smaldone, Gerald C.
2016-01-01
Cough etiquette and respiratory hygiene are forms of source control encouraged to prevent the spread of respiratory infection. The use of surgical masks as a means of source control has not been quantified in terms of reducing exposure to others. We designed an in vitro model using various facepieces to assess their contribution to exposure reduction when worn at the infectious source (Source) relative to facepieces worn for primary (Receiver) protection, and the factors that contribute to each. In a chamber with various airflows, radiolabeled aerosols were exhaled via a ventilated soft-face manikin head using tidal breathing and cough (Source). Another manikin, containing a filter, quantified recipient exposure (Receiver). The natural fit surgical mask, fitted (SecureFit) surgical mask and an N95-class filtering facepiece respirator (commonly known as an “N95 respirator”) with and without a Vaseline-seal were tested. With cough, source control (mask or respirator on Source) was statistically superior to mask or unsealed respirator protection on the Receiver (Receiver protection) in all environments. To equal source control during coughing, the N95 respirator must be Vaseline-sealed. During tidal breathing, source control was comparable or superior to mask or respirator protection on the Receiver. Source control via surgical masks may be an important adjunct defense against the spread of respiratory infections. The fit of the mask or respirator, in combination with the airflow patterns in a given setting, is a significant contributor to source control efficacy. Future clinical trials should include a surgical mask source control arm to assess the contribution of source control in overall protection against airborne infection. PMID:26225807
Testing earthquake source inversion methodologies
Page, M.; Mai, P.M.; Schorlemmer, D.
2011-01-01
Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.
Kephart, Christopher M.; Bushon, Rebecca N.
2010-01-01
An influx of concentrated animal feeding operations in northwest Ohio has prompted local agencies to examine the effects of these industrial farms on water quality in the upper Portage River watershed. The utility of microbial source-tracking (MST) tools as a means of characterizing sources of fecal contamination in the watershed was evaluated. From 2007 to 2008, scientists with the U.S. Geological Survey, Bowling Green State University, and the Wood County Health Department collected and analyzed 17 environmental samples and 13 fecal source samples for Bacteroides-based host-associated DNA markers. At many of the environmental sites tested, MST marker results corroborated the presumptive fecal contamination sources. Results from this demonstration study support the utility of using MST with host-specific molecular markers to characterize the sources of fecal contamination in the Portage River watershed.
Performance Characterization of a Solenoid-type Gas Valve for the H- Magnetron Source at FNAL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sosa, A.; Bollinger, D. S.; Karns, P. R.
2016-09-06
The magnetron-style H- ion sources currently in operation at Fermilab use piezoelectric gas valves to function. This kind of gas valve is sensitive to small changes in ambient temperature, which affect the stability and performance of the ion source. This motivates the need to find an alternative way of feeding H₂ gas into the source. A solenoid-type gas valve has been characterized in a dedicated off-line test stand to assess the feasibility of its use in the operational ion sources. H- ion beams have been extracted at 35 keV using this valve. In this study, the performance of the solenoid gas valve has been characterized by measuring the beam current output of the magnetron source with respect to the voltage and pulse width of the signal applied to the gas valve.
State-of-the-art assessment of electric and hybrid vehicles
NASA Technical Reports Server (NTRS)
1978-01-01
Data are presented that were obtained from the electric and hybrid vehicles tested, information collected from users of electric vehicles, and data and information on electric and hybrid vehicles obtained on a worldwide basis from manufacturers and available literature. The data given include: (1) information and data base (electric and hybrid vehicle systems descriptions, sources of vehicle data and information, and sources of component data); (2) electric vehicles (theoretical background, electric vehicle track tests, user experience, literature data, and summary of electric vehicle status); (3) electric vehicle components (tires, differentials, transmissions, traction motors, controllers, batteries, battery chargers, and component summary); and (4) hybrid vehicles (types of hybrid vehicles, operating modes, hybrid vehicles components, and hybrid vehicles performance characteristics).
Assessment of DPOAE test-retest difference curves via hierarchical Gaussian processes.
Bao, Junshu; Hanson, Timothy; McMillan, Garnett P; Knight, Kristin
2017-03-01
Distortion product otoacoustic emissions (DPOAE) testing is a promising alternative to behavioral hearing tests and auditory brainstem response testing of pediatric cancer patients. The central goal of this study is to assess whether significant changes in the DPOAE frequency/emissions curve (DP-gram) occur in pediatric patients in a test-retest scenario. This is accomplished through the construction of normal reference charts, or credible regions, that DP-gram differences lie in, as well as contour probabilities that measure how abnormal (or in a certain sense rare) a test-retest difference is. A challenge is that the data were collected over varying frequencies, at different time points from baseline, and on possibly one or both ears. A hierarchical structural equation Gaussian process model is proposed to handle the different sources of correlation in the emissions measurements, wherein both subject-specific random effects and variance components governing the smoothness and variability of each child's Gaussian process are coupled together. © 2016, The International Biometric Society.
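As a greatly simplified, non-hierarchical illustration of the modelling idea, the sketch below fits a single Gaussian process to one ear's test-retest DP-gram difference as a smooth function of frequency and returns a pointwise uncertainty band. The paper's model additionally couples subject-specific random effects and shared variance components across children, ears, and time points; the data, frequencies, and kernel choices here are assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Toy DP-gram difference (retest minus baseline, in dB) at a handful of test
# frequencies; values are synthetic.
rng = np.random.default_rng(1)
freq_khz = np.linspace(1, 8, 12).reshape(-1, 1)                    # test frequencies (assumed)
dp_diff = 0.5 * np.sin(freq_khz.ravel()) + rng.normal(0, 0.3, 12)  # simulated differences

# A smooth (RBF) trend plus observation noise; hyperparameters are fit by
# maximum likelihood inside GaussianProcessRegressor.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0) + WhiteKernel(0.1),
                              normalize_y=True)
gp.fit(freq_khz, dp_diff)
mean, sd = gp.predict(freq_khz, return_std=True)   # smoothed curve and pointwise SD band
```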
Public health information and statistics dissemination efforts for Indonesia on the Internet.
Hanani, Febiana; Kobayashi, Takashi; Jo, Eitetsu; Nakajima, Sawako; Oyama, Hiroshi
2011-01-01
To elucidate current issues related to health statistics dissemination efforts on the Internet in Indonesia and to propose a new dissemination website as a solution. A cross-sectional survey was conducted. Sources of statistics were identified using link relationship and Google™ search. The menus used to locate statistics, mode of presentation and means of access to statistics, and available statistics were assessed for each site. Assessment results were used to derive a design specification; a prototype system was developed and evaluated with a usability test. 49 sources were identified on 18 governmental, 8 international and 5 non-government websites. Of 49 menus identified, 33% used non-intuitive titles and led to inefficient searches. 69% of them were on government websites. Of 31 websites, only 39% and 23% used graph/chart and map for presentation. Further, only 32%, 39% and 19% provided query, export, and print features. While >50% of sources reported morbidity, risk factor and service provision statistics, <40% of sources reported health resource and mortality statistics. A statistics portal website was developed using the Joomla!™ content management system. The usability test demonstrated its potential to improve data accessibility. In this study, the government's efforts to disseminate statistics in Indonesia are supported by non-governmental and international organizations, but the existing information may not be very useful because it is: a) not widely distributed, b) difficult to locate, and c) not effectively communicated. Actions are needed to ensure information usability, and one such action is the development of a statistics portal website.
2004-06-01
...with TAs C-52A, C-52E, C-52N, and C-52W. It is used for air-to-ground munitions testing, countermeasures development and testing, and ground... feet above ground level regardless of underlying land use. • Participating in “air shows” and fly-overs by U.S. Air Force aircraft at non-Air Force... [table excerpt omitted; source: U.S. Government, 2001] Airway/Air Traffic Control: The Warning Areas used by Eglin AFB are surrounded by
Grid-search Moment Tensor Estimation: Implementation and CTBT-related Application
NASA Astrophysics Data System (ADS)
Stachnik, J. C.; Baker, B. I.; Rozhkov, M.; Friberg, P. A.; Leifer, J. M.
2017-12-01
This abstract presents review work related to moment tensor estimation for Expert Technical Analysis at the Comprehensive Test Ban Treaty Organization. In this context of event characterization, estimation of key source parameters provides important insights into the nature of failure in the earth. For example, if the recovered source parameters are indicative of a shallow source with a large isotropic component, then one conclusion is that it is a human-triggered explosive event. However, an important follow-up question in this application is: does an alternative hypothesis like a deeper source with a large double couple component explain the data approximately as well as the best solution? Here we address the issue of both finding a most likely source and assessing its uncertainty. Using the uniform moment tensor discretization of Tape and Tape (2015) we exhaustively interrogate and tabulate the source eigenvalue distribution (i.e., the source characterization), tensor orientation, magnitude, and source depth. The benefit of the grid-search is that we can quantitatively assess the extent to which model parameters are resolved. This provides a valuable opportunity during the assessment phase to focus interpretation on source parameters that are well-resolved. Another benefit of the grid-search is that it proves to be a flexible framework where different pieces of information can be easily incorporated. To this end, this work is particularly interested in fitting teleseismic body waves and regional surface waves as well as incorporating teleseismic first motions when available. Because the moment tensor search methodology is well established, we primarily focus on the implementation and application. We present a highly scalable strategy for systematically inspecting the entire model parameter space. We then focus on application to regional and teleseismic data recorded during a handful of natural and anthropogenic events, report on the grid-search optimum, and discuss the resolution of interesting and/or important recovered source properties.
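The exhaustive grid-search strategy can be sketched generically: evaluate a misfit for every combination of discretized source parameters and keep the whole table, so that parameter resolution, and not just the single optimum, can be inspected. The misfit function, parameter set, and ranges below are placeholders rather than the CTBT implementation or the Tape and Tape (2015) moment tensor discretization.

```python
import itertools
import numpy as np

def waveform_misfit(depth_km, magnitude, iso_fraction):
    # Stand-in for an L2 misfit between observed and synthetic waveforms;
    # a real implementation would compare seismograms for each trial source.
    return (depth_km - 1.5) ** 2 + (magnitude - 4.2) ** 2 + (iso_fraction - 0.7) ** 2

depths = np.arange(0.5, 20.0, 0.5)            # trial source depths, km (assumed grid)
magnitudes = np.arange(3.5, 5.5, 0.1)         # trial moment magnitudes (assumed grid)
iso_fractions = np.linspace(-1.0, 1.0, 21)    # crude proxy for the source-type axis (assumed)

# Tabulate the misfit over the full grid so the flatness of the misfit surface
# around the optimum (i.e. parameter resolution) can be examined afterwards.
results = [(waveform_misfit(d, m, k), d, m, k)
           for d, m, k in itertools.product(depths, magnitudes, iso_fractions)]
best = min(results)
print(f"best misfit {best[0]:.3f} at depth {best[1]} km, Mw {best[2]:.1f}, iso {best[3]:.2f}")
```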
Differences in Performance Among Test Statistics for Assessing Phylogenomic Model Adequacy.
Duchêne, David A; Duchêne, Sebastian; Ho, Simon Y W
2018-05-18
Statistical phylogenetic analyses of genomic data depend on models of nucleotide or amino acid substitution. The adequacy of these substitution models can be assessed using a number of test statistics, allowing the model to be rejected when it is found to provide a poor description of the evolutionary process. A potentially valuable use of model-adequacy test statistics is to identify when data sets are likely to produce unreliable phylogenetic estimates, but their differences in performance are rarely explored. We performed a comprehensive simulation study to identify test statistics that are sensitive to some of the most commonly cited sources of phylogenetic estimation error. Our results show that, for many test statistics, traditional thresholds for assessing model adequacy can fail to reject the model when the phylogenetic inferences are inaccurate and imprecise. This is particularly problematic when analysing loci that have few variable informative sites. We propose new thresholds for assessing substitution model adequacy and demonstrate their effectiveness in analyses of three phylogenomic data sets. These thresholds lead to frequent rejection of the model for loci that yield topological inferences that are imprecise and are likely to be inaccurate. We also propose the use of a summary statistic that provides a practical assessment of overall model adequacy. Our approach offers a promising means of enhancing model choice in genome-scale data sets, potentially leading to improvements in the reliability of phylogenomic inference.
Cronin, Aidan A; Odagiri, Mitsunori; Arsyad, Bheta; Nuryetty, Mariet Tetty; Amannullah, Gantjang; Santoso, Hari; Darundiyah, Kristin; Nasution, Nur 'Aisyah
2017-10-01
There remains a pressing need for systematic water quality monitoring strategies to assess drinking water safety and to track progress towards the Sustainable Development Goals (SDG). This study incorporated water quality testing into an existing national socioeconomic survey in Yogyakarta province, Indonesia; the first such study in Indonesia in terms of SDG tracking. Multivariate regression analysis assessed the association between faecal and nitrate contamination of drinking water sources and household drinking water, adjusted for wealth, education level, type of water sources and type of sanitation facilities. The survey observed widespread faecal contamination in both sources for drinking water (89.2%, 95%CI: 86.9-91.5%; n=720) and household drinking water (67.1%, 95%CI: 64.1-70.1%; n=917) as measured by Escherichia coli. This was despite widespread improved drinking water source coverage (85.3%) and commonly self-reported boiling practices (82.2%). E. coli concentration levels in household drinking water were associated with wealth, education levels of a household head, and type of water source (i.e. vendor water or local sources). Following the proposed SDG definition for Target 6.1 (water) and 6.2 (sanitation), the estimated proportion of households with access to safely managed drinking water and sanitation was 8.5% and 45.5%, respectively, in the study areas, indicating a substantial difference from improved drinking water (82.2%) and improved sanitation coverage (70.9%) as per the MDGs targets. The greatest contamination and risk factors were found in the poorest households, indicating the urgent need for targeted and effective interventions here. There is suggested evidence that sub-surface leaching from on-site sanitation adversely impacts on drinking water sources, which underscores the need for further technical assistance in promoting latrine construction. Urgent action is still needed to strengthen systematic monitoring efforts towards tracking SDG Goal 6. Copyright © 2017 Elsevier GmbH. All rights reserved.
Jackson, Mark A.; Bothast, Rodney J.
1990-01-01
We assessed the influence of various carbon concentrations and carbon-to-nitrogen (C:N) ratios on Colletotrichum truncatum NRRL 13737 conidium formation in submerged cultures grown in a basal salts medium containing various amounts of glucose and Casamino Acids. Under the nutritional conditions tested, the highest conidium concentrations were produced in media with carbon concentrations of 4.0 to 15.3 g/liter. High carbon concentrations (20.4 to 40.8 g/liter) inhibited sporulation and enhanced the formation of microsclerotium-like hyphal masses. At all the carbon concentrations tested, a culture grown in a medium with a C:N ratio of 15:1 produced more conidia than cultures grown in media with C:N ratios of 40:1 or 5:1. While glucose exhaustion was often coincident with conidium formation, cultures containing residual glucose sporulated and those with high carbon concentrations (>25 g/liter) exhausted glucose without sporulation. Nitrogen source studies showed that the levels of C. truncatum NRRL 13737 conidiation were similar for all protein hydrolysates tested. Reduced conidiation occurred when amino acid and inorganic nitrogen sources were used. Of the nine carbon sources evaluated, acetate as the sole carbon source resulted in the lowest level of sporulation. PMID:16348348
Drivers of microbiological quality of household drinking water - a case study in rural Ethiopia.
Usman, Muhammed A; Gerber, Nicolas; Pangaribowo, Evita H
2018-04-01
This study aims at assessing the determinants of microbiological contamination of household drinking water under multiple-use water systems in rural areas of Ethiopia. For this analysis, a random sample of 454 households was surveyed between February and March 2014, and water samples from community sources and household storage containers were collected and tested for fecal contamination. The number of Escherichia coli (E. coli) colony-forming units per 100 mL water was used as an indicator of fecal contamination. The microbiological tests demonstrated that 58% of household stored water samples and 38% of protected community water sources were contaminated with E. coli. Moreover, most improved water sources often considered to provide safe water showed the presence of E. coli. The result shows that households' stored water collected from unprotected wells/springs had higher levels of E. coli than stored water from alternative sources. Distance to water sources and water collection containers are also strongly associated with stored water quality. To ensure the quality of stored water, the study suggests that there is a need to promote water safety from the point-of-source to point-of-use, with due considerations for the linkages between water and agriculture to advance the Sustainable Development Goal 6 of ensuring access to clean water for everyone.
Solar cell and module performance assessment based on indoor calibration methods
NASA Astrophysics Data System (ADS)
Bogus, K.
A combined space/terrestrial solar cell test calibration method that requires five steps and can be performed indoors is described. The test conditions are designed to qualify the cell or module output data in standard illumination and temperature conditions. Measurements are made of the short-circuit current, the open-circuit voltage, the maximum power, the efficiency, and the spectral response. Standard sunlight must be replicated both in earth surface and AM0 conditions; Xe lamps are normally used for the light source, with spectral measurements taken of the light. Cell and module spectral response are assayed by using monochromators and narrow band-pass monochromatic filters. Attention is required to define the performance characteristics of modules under partial shadowing. Error sources that may affect the measurements are discussed, as are previous cell performance testing and calibration methods and their effectiveness in comparison with the behaviors of satellite solar power panels.
School adjustment of children in residential care: a multi-source analysis.
Martín, Eduardo; Muñoz de Bustillo, María del Carmen
2009-11-01
School adjustment is one of the greatest challenges in residential child care programs. This study has two aims: to analyze school adjustment compared to a normative population, and to carry out a multi-source analysis (child, classmates, and teacher) of this adjustment. A total of 50 classrooms containing 60 children from residential care units were studied. The "Método de asignación de atributos perceptivos" (Allocation of perceptive attributes; Díaz-Aguado, 2006), the "Test Autoevaluativo Multifactorial de Adaptación Infantil" (TAMAI [Multifactor Self-assessment Test of Child Adjustment]; Hernández, 1996) and the "Protocolo de valoración para el profesorado" (Evaluation Protocol for Teachers; Fernández del Valle, 1998) were applied. The main results indicate that, compared with their classmates, children in residential care are perceived as more controversial and less integrated at school, although no differences were observed in problems of isolation. The multi-source analysis shows that there is agreement among the different sources when the externalized and visible aspects are evaluated. These results are discussed in connection with the practices that are being developed in residential child care programs.
Pressure ulcer prevention knowledge among Jordanian nurses: a cross-sectional study
2014-01-01
Background Pressure ulcer remains a significant problem in the healthcare system. In addition to the suffering it causes patients, it bears a growing financial burden. Although pressure ulcer prevention and care have improved in recent years, pressure ulcer still exists and occurs in both hospital and community settings. In Jordan, there are a handful of studies on pressure ulcer. This study aims to explore levels of knowledge and knowledge sources about pressure ulcer prevention, as well as barriers to implementing pressure ulcer prevention guidelines among Jordanian nurses. Methods Using a cross-sectional study design and a self-administered questionnaire, data was collected from 194 baccalaureate and master’s level staff nurses working in eight Jordanian hospitals. From September to October of 2011, their knowledge levels about pressure ulcer prevention and the sources of this knowledge were assessed, along with the barriers which reduce successful pressure ulcer care and prevention. ANOVA and t-test analysis were used to test the differences in nurses’ knowledge according to participants’ characteristics. Means, standard deviation, and frequencies were used to describe nurses’ knowledge levels, knowledge sources, and barriers to pressure ulcer prevention. Results The majority (73%, n = 141) of nurses had inadequate knowledge about pressure ulcer prevention. The mean scores of the test for all participants was 10.84 out of 26 (SD = 2.3, range = 5–17), with the lowest score in themes related to PU etiology, preventive measures to reduce amount of pressure/shear, and risk assessment. In-service training was the second source of education on pressure ulcer, coming after university training. Shortage of staff and lack of time were the most frequently cited barriers to carrying out pressure ulcer risk assessment, documentation, and prevention. Conclusions This study highlights concerns about Jordanian nurses’ knowledge of pressure ulcer prevention. The results of the current study showed inadequate knowledge among Jordanian nurses about pressure ulcer prevention based on National Pressure Ulcer Advisory Panel guidelines. Also, the low level of nurses’ pressure ulcer knowledge suggests poor dissemination of pressure ulcer knowledge in Jordan, a suggestion supported by the lack of relationship between years of experience and pressure ulcer knowledge. PMID:24565372
NASA Technical Reports Server (NTRS)
Mckee, R. G.; Alvares, N. J.
1976-01-01
The following projects were completed as part of the effort to develop and test economically feasible fire-resistant materials for interior furnishings of aircraft as well as detectors of incipient fires in passenger and cargo compartments: (1) determination of the sensitivity of various contemporary gas and smoke detectors to pyrolysis and combustion products from materials commonly used in aircraft interiors and from materials that may be used in the future, (2) assessment of the environmental limitations to detector sensitivity and reliability. The tests were conducted on three groups of materials using three exposure sources: a radiant source combined with a Meeker burner flame, a heated coil, and a radiant source only. The first test series used radiant heat and flame exposures on easily obtainable test materials. Next, four materials were selected from the first group and exposed to an incandescent coil to provide the conditions for smoldering combustion. Finally, radiant heat exposures were used on advanced materials that are not readily available.
NASA Technical Reports Server (NTRS)
Domack, M. S.
1985-01-01
A research program was conducted to critically assess the effects of precracked specimen configuration, stress intensity solutions, compliance relationships and other experimental test variables for stress corrosion testing of 7075-T6 aluminum alloy plate. Modified compact and double beam wedge-loaded specimens were tested and analyzed to determine the threshold stress intensity factor and stress corrosion crack growth rate. Stress intensity solutions and experimentally determined compliance relationships were developed and compared with other solutions available in the literature. Crack growth data suggests that more effective crack length measurement techniques are necessary to better characterize stress corrosion crack growth. Final load determined by specimen reloading and by compliance did not correlate well, and was considered a major source of interlaboratory variability. Test duration must be determined systematically, accounting for crack length measurement resolution, time for crack arrest, and experimental interferences. This work was conducted as part of a round robin program sponsored by ASTM committees G1.06 and E24.04 to develop a standard test method for stress corrosion testing using precracked specimens.
Scenario based approach for multiple source Tsunami Hazard assessment for Sines, Portugal
NASA Astrophysics Data System (ADS)
Wronna, M.; Omira, R.; Baptista, M. A.
2015-08-01
In this paper, we present a scenario-based approach for tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test-sites of project ASTARTE. Sines holds one of the most important deep-water ports, which contains oil-bearing, petrochemical, liquid bulk, coal and container terminals. The port and its industrial infrastructures are facing the ocean southwest towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water model With Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level) and MHHW (mean higher high water). For each scenario, inundation is described by maximum values of wave height, flow depth, drawback, runup and inundation distance. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site considering the single scenarios at mean sea level, the aggregate scenario and the influence of the tide on the aggregate scenario. The results confirm the composite of the Horseshoe and Marques Pombal faults as the worst-case scenario. It governs the aggregate scenario with about 60% and inundates an area of 3.5 km².
Reliability analysis of the objective structured clinical examination using generalizability theory.
Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián
2016-01-01
The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. An observational cross-sectional study was conducted at National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossover random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. The exam was applied to 278 medical students. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements.
Reliability analysis of the objective structured clinical examination using generalizability theory.
Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián
2016-01-01
Background The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. Methods An observational cross-sectional study was conducted at National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossover random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. Results The exam was applied to 278 medical students. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Conclusions Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements.
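For orientation, the relative generalizability coefficient for a persons-by-stations design reduces to a simple ratio of variance components. The numbers below are assumed rather than the study's estimated components, but they are chosen so that an 18-station design yields a G of about 0.93, matching the reported value.

```python
# Relative G coefficient for a persons x stations design:
# G = var_person / (var_person + var_interaction_error / n_stations).
var_person = 4.0          # true-score variance across students (assumed)
var_station_error = 5.5   # person-by-station interaction plus residual error (assumed)
n_stations = 18

g_coefficient = var_person / (var_person + var_station_error / n_stations)
print(f"G = {g_coefficient:.2f}")   # adding stations shrinks the error term and raises G
```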
Assessment of vaccine testing at three laboratories using the guinea pig model of tuberculosis.
Grover, Ajay; Troudt, Jolynn; Arnett, Kimberly; Izzo, Linda; Lucas, Megan; Strain, Katie; McFarland, Christine; Hall, Yper; McMurray, David; Williams, Ann; Dobos, Karen; Izzo, Angelo
2012-01-01
The guinea pig model of tuberculosis is used extensively in different locations to assess the efficacy of novel tuberculosis vaccines during pre-clinical development. Two key assays are used to measure protection against virulent challenge: a 30 day post-infection assessment of mycobacterial burden and long-term post-infection survival and pathology analysis. To determine the consistency and robustness of the guinea pig model for testing vaccines, a comparative assessment between three sites that are currently involved in testing tuberculosis vaccines from external providers was performed. Each site was asked to test two "subunit" type vaccines in their routine animal model as if testing vaccines from a provider. All sites performed a 30 day study, and one site also performed a long-term survival/pathology study. Despite some differences in experimental approach between the sites, such as the origin of the Mycobacterium tuberculosis strain and the type of aerosol exposure device used to infect the animals and the source of the guinea pigs, the data obtained between sites were consistent in regard to the ability of each "vaccine" tested to reduce the mycobacterial burden. The observations also showed that there was good concurrence between the results of short-term and long-term studies. This validation exercise means that efficacy data can be compared between sites. Copyright © 2011 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-06-01
Proposed action is to construct at BNL a 5,600-ft² support building, install and operate a prototypic 200 MeV accelerator and a prototypic 700 MeV storage ring within, and to construct and operate a 15 kV substation to power the building. The accelerator and storage ring would comprise the x-ray lithography source or XLS.
Analysis of Discrete-Source Damage Progression in a Tensile Stiffened Composite Panel
NASA Technical Reports Server (NTRS)
Wang, John T.; Lotts, Christine G.; Sleight, David W.
1999-01-01
This paper demonstrates the progressive failure analysis capability in NASA Langley's COMET-AR finite element analysis code on a large-scale built-up composite structure. A large-scale five-stringer composite panel with a 7-in.-long discrete-source damage was analyzed from initial loading to final failure, including geometric and material nonlinearities. Predictions using different mesh sizes, different saw-cut modeling approaches, and different failure criteria were performed and assessed. All failure predictions correlated reasonably well with the test result.
The Ohio River Valley CO2 Storage Project AEP Mountaineer Plan, West Virginia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neeraj Gupta
2009-01-07
This report includes an evaluation of deep rock formations with the objective of providing practical maps, data, and some of the issues considered for carbon dioxide (CO{sub 2}) storage projects in the Ohio River Valley. Injection and storage of CO{sub 2} into deep rock formations represents a feasible option for reducing greenhouse gas emissions from coal-burning power plants concentrated along the Ohio River Valley area. This study is sponsored by the U.S. Department of Energy (DOE) National Energy Technology Laboratory (NETL), American Electric Power (AEP), BP, Ohio Coal Development Office, Schlumberger, and Battelle along with its Pacific Northwest Division. An extensive program of drilling, sampling, and testing of a deep well combined with a seismic survey was used to characterize the local and regional geologic features at AEP's 1300-megawatt (MW) Mountaineer Power Plant. Site characterization information has been used as part of a systematic design feasibility assessment for a first-of-a-kind integrated capture and storage facility at an existing coal-fired power plant in the Ohio River Valley region--an area with a large concentration of power plants and other emission sources. Subsurface characterization data have been used for reservoir simulations and to support the review of the issues relating to injection, monitoring, strategy, risk assessment, and regulatory permitting. The high-sulfur coal samples from the region have been tested in a capture test facility to evaluate and optimize basic design for a small-scale capture system and eventually to prepare a detailed design for a capture, local transport, and injection facility. The Ohio River Valley CO{sub 2} Storage Project was conducted in phases with the ultimate objectives of demonstrating both the technical aspects of CO{sub 2} storage and the testing, logistical, regulatory, and outreach issues related to conducting such a project at a large point source under realistic constraints. The site characterization phase was completed, laying the groundwork for moving the project towards a potential injection phase. Feasibility and design assessment activities included an assessment of the CO{sub 2} source options (a slip-stream capture system or transported CO{sub 2}); development of the injection and monitoring system design; preparation of regulatory permits; and continued stakeholder outreach.
Presence of enteric viruses in source waters for drinking water production in The Netherlands.
Lodder, W J; van den Berg, H H J L; Rutjes, S A; de Roda Husman, A M
2010-09-01
The quality of drinking water in The Netherlands has to comply with the Dutch Drinking Water Directive: less than one infection in 10,000 persons per year may occur due to consumption of unboiled drinking water. Since virus concentrations in drinking waters may be below the detection limit but entail a public health risk, the infection risk from drinking water consumption requires the assessment of the virus concentrations in source waters and of the removal efficiency of treatment processes. In this study, samples of source waters were taken during 4 years of regular sampling (1999 to 2002), and enteroviruses, reoviruses, somatic phages, and F-specific phages were detected in 75% (range, 0.0033 to 5.2 PFU/liter), 83% (0.0030 to 5.9 PFU/liter), 100% (1.1 to 114,156 PFU/liter), and 97% (0.12 to 14,403 PFU/liter), respectively, of 75 tested source water samples originating from 10 locations for drinking water production. By endpoint dilution reverse transcription-PCR (RT-PCR), 45% of the tested source water samples were positive for norovirus RNA (0.22 to 177 PCR-detectable units [PDU]/liter), and 48% were positive for rotavirus RNA (0.65 to 2,249 PDU/liter). Multiple viruses were regularly detected in the source water samples. A significant correlation between the concentrations of the two phages and those of the enteroviruses could be demonstrated. The virus concentrations varied greatly between 10 tested locations, and a seasonal effect was observed. Peak concentrations of pathogenic viruses occur in source waters used for drinking water production. If seasonal and short-term fluctuations coincide with less efficient or failing treatment, an unacceptable public health risk from exposure to this drinking water may occur.
Wilson, Sacoby; Burwell-Naney, Kristen; Jiang, Chengsheng; Zhang, Hongmei; Samantapudi, Ashok; Murray, Rianna; Dalemarre, Laura; Rice, LaShanta; Williams, Edith
2015-01-01
Populations of color and low-income communities are often disproportionately burdened by exposures to various environmental contaminants, including air pollution. Some air pollutants have carcinogenic properties that are particularly problematic in South Carolina (SC), a state that consistently has high rates of cancer mortality for all sites. The purpose of this study was to assess cancer risk disparities in SC by linking risk estimates from the U.S. Environmental Protection Agency’s 2005 National Air Toxics Assessment (NATA) with sociodemographic data from the 2000 US Census Bureau. Specifically, NATA risk data for varying risk categories were linked by tract ID and analyzed with sociodemographic variables from the 2000 census using R. The average change in cancer risk from all sources by sociodemographic variable was quantified using multiple linear regression models. Spatial methods were further employed using ArcGIS 10 to assess the distribution of all source risk and percent non-white at each census tract level. The relative risk estimates of the proportion of high cancer risk tracts (defined as the top 10% of cancer risk in SC) and their respective 95% confidence intervals (CIs) were calculated between the first and latter three quartiles defined by sociodemographic factors, while the variance in the percentage of high cancer risk between quartile groups was tested using Pearson’s chi-square. The average total cancer risk for SC was 26.8 people/million (ppl/million). The risk from on-road sources was approximately 5.8 ppl/million, higher than the risk from major, area, and non-road sources (1.8, 2.6, and 1.3 ppl/million), respectively. Based on our findings, addressing on-road sources may decrease the disproportionate cancer risk burden among low-income populations and communities of color in SC. PMID:26037107
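As an illustration of the kind of comparison described here, the sketch below computes a relative risk of being a top-decile "high cancer risk" tract for a comparison group versus a reference group, with a 95% confidence interval from the log-RR standard error and a chi-square test of the 2x2 table. The tract counts are hypothetical, and this is not the study's analysis (which was conducted in R).

```python
import math
from scipy.stats import chi2_contingency

def relative_risk(a, b, c, d):
    """2x2 layout: a/b = high-risk / other tracts in the comparison group,
    c/d = high-risk / other tracts in the reference (first-quartile) group."""
    rr = (a / (a + b)) / (c / (c + d))
    se_log_rr = math.sqrt(1.0 / a - 1.0 / (a + b) + 1.0 / c - 1.0 / (c + d))
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, (lo, hi)

# Hypothetical tract counts for the upper three quartiles vs the first quartile.
a, b, c, d = 90, 610, 15, 320
rr, (lo, hi) = relative_risk(a, b, c, d)
chi2, p_value, _, _ = chi2_contingency([[a, b], [c, d]])
print(f"RR = {rr:.2f}, 95% CI ({lo:.2f}, {hi:.2f}), chi-square p = {p_value:.3g}")
```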
Two Are Not Better than One: Combining Unitization and Relational Encoding Strategies
ERIC Educational Resources Information Center
Tu, Hsiao-Wei; Diana, Rachel A.
2016-01-01
In recognition memory, "recollection" is defined as retrieval of the context associated with an event, whereas "familiarity" is defined as retrieval based on item strength alone. Recent studies have shown that conventional recollection-based tasks, in which context details are manipulated for source memory assessment at test,…
Optimal Rating Procedures and Methodology for NAEP Open- Ended Items. Working Paper Series.
ERIC Educational Resources Information Center
Patz, Richard J.; Wilson, Mark; Hoskens, Machteld
The National Assessment of Educational Progress (NAEP) collects data in the form of repeated, discrete measures (test items) with hierarchical structure for both measures and subjects, that is complex by any standard. This complexity has been managed through a "divide and conquer" approach of isolating and evaluating sources of…
ERIC Educational Resources Information Center
Asil, Mustafa; Brown, Gavin T. L.
2016-01-01
The use of the Programme for International Student Assessment (PISA) across nations, cultures, and languages has been criticized. The key criticisms point to the linguistic and cultural biases potentially underlying the design of reading comprehension tests, raising doubts about the legitimacy of comparisons across economies. Our research focused…
Many statutory needs for sediment quality assessment exist (U.S. EPA 1996). A variety of sediment toxicity tests have been used to support the development of sediment quality guidelines and to determine the benthic impacts of dredging activities and point and non-point source tox...
The Advanced Statistical Trajectory Regional Air Pollution (ASTRAP) model simulates long-term transport and deposition of oxides of sulfur and nitrogen. It is a potential screening tool for assessing long-term effects on regional visibility from sulfur emission sources. However, a rigorou...
ERIC Educational Resources Information Center
Greene, Jennifer C.; Kellogg, Theodore
Statewide assessment data available from two school years, two grade levels, and five sources (achievement tests; student, principal, and teacher questionnaires; and principal interviews), were aggregated to more closely investigate the relationship between student/school characteristics and student achievement. To organize this large number of…
Acorn storage alternatives tested on Oregon white oak
Warren D. Devine; Constance A. Harrington; Joseph M. Kraft
2010-01-01
We assessed various combinations of storage factors: bag type, temperature, duration, and antifungal pre-storage treatments for white oak acorn storage, using Oregon white oak (Quercus garryana Douglas ex Hook. [Fagaceae]) acorns from 7 seed sources. Acorn viability remained high (84%), even after 2 y of refrigerated storage, but the majority of...
Sentence Repetition: What Does the Task Measure?
ERIC Educational Resources Information Center
Polišenská, Kamila; Chiat, Shula; Roy, Penny
2015-01-01
Background: Sentence repetition is gaining increasing attention as a source of information about children's sentence-level abilities in clinical assessment, and as a clinical marker of specific language impairment. However, it is widely debated what the task is testing and therefore how informative it is. Aims: (1) To evaluate the effects of…
Environment Sentinel Biomonitor Technology Assessment
2013-09-01
turbidity, humic/fulvic acids, geosmin/MIB, hard water) with minimal effect on test outcome. It is better to be able to operate under a wide range...inhibition between 20–80%. c. Susceptibility to source water conditions: very low. i. No response for pH (4.5–9), geosmin, MIB, humic/fulvic acids, or hard
ERIC Educational Resources Information Center
Collins, Alyson A.; Lindström, Esther R.; Compton, Donald L.
2018-01-01
Researchers have increasingly investigated sources of variance in reading comprehension test scores, particularly with students with reading difficulties (RD). The purpose of this meta-analysis was to determine if the achievement gap between students with RD and typically developing (TD) students varies as a function of different reading…
Matching radio catalogues with realistic geometry: application to SWIRE and ATLAS
NASA Astrophysics Data System (ADS)
Fan, Dongwei; Budavári, Tamás; Norris, Ray P.; Hopkins, Andrew M.
2015-08-01
Cross-matching catalogues at different wavelengths is a difficult problem in astronomy, especially when the objects are not point-like. At radio wavelengths, an object can have several components corresponding, for example, to a core and lobes. Considering that not all radio detections correspond to visible or infrared sources, matching these catalogues can be challenging. Traditionally, this is done by eye to ensure quality, an approach that does not scale to the large data volumes expected from the next generation of radio telescopes. We present a novel automated procedure, using Bayesian hypothesis testing, to achieve reliable associations by explicit modelling of a particular class of radio-source morphology. The new algorithm not only assesses the likelihood of an association between data at two different wavelengths, but also tries to assess whether different radio sources are physically associated, are double-lobed radio galaxies, or just distinct nearby objects. Application to the Spitzer Wide-Area Infrared Extragalactic and Australia Telescope Large Area Survey CDF-S catalogues shows that this method performs well without human intervention.
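For intuition about the positional part of such a Bayesian cross-match, a minimal sketch of a commonly used two-catalogue Gaussian Bayes factor is given below. It assumes circular astrometric errors and ignores the radio-morphology modelling that is the paper's actual contribution, so it should be read as a simplified illustration rather than the authors' algorithm.

```python
import math

def positional_bayes_factor(sep_arcsec, sigma1_arcsec, sigma2_arcsec):
    """Bayes factor for 'same source' vs 'two distinct sources' given an
    angular separation and two circular astrometric uncertainties."""
    arcsec = math.pi / (180.0 * 3600.0)               # arcsec -> radians
    psi = sep_arcsec * arcsec
    s2 = (sigma1_arcsec * arcsec) ** 2 + (sigma2_arcsec * arcsec) ** 2
    return (2.0 / s2) * math.exp(-psi ** 2 / (2.0 * s2))

# Example: 1.2 arcsec separation with 0.5 and 1.0 arcsec positional errors;
# a Bayes factor far above 1 favours a genuine association.
print(f"{positional_bayes_factor(1.2, 0.5, 1.0):.3e}")
```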
NASA Astrophysics Data System (ADS)
Upton, D. W.; Saeed, B. I.; Mather, P. J.; Lazaridis, P. I.; Vieira, M. F. Q.; Atkinson, R. C.; Tachtatzis, C.; Garcia, M. S.; Judd, M. D.; Glover, I. A.
2018-03-01
Monitoring of partial discharge (PD) activity within high-voltage electrical environments is increasingly used for the assessment of insulation condition. Traditional measurement techniques employ technologies that either require off-line installation or have high power consumption and are hence costly. A wireless sensor network is proposed that utilizes only received signal strength to locate areas of PD activity within a high-voltage electricity substation. The network comprises low-power and low-cost radiometric sensor nodes which receive the radiation propagated from a source of PD. Results are reported from several empirical tests performed within a large indoor environment and a substation environment using a network of nine sensor nodes. A portable PD source emulator was placed at multiple locations within the network. Signal strength measured by the nodes is reported via WirelessHART to a data collection hub where it is processed using a location algorithm. The results obtained place the measured location within 2 m of the actual source location.
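A hedged sketch of one way such a received-signal-strength location algorithm can work is shown below: assuming a log-distance path-loss model, the source is placed at the grid point where the transmit powers implied by the individual node readings are most consistent. The node layout, RSS values, and path-loss exponent are hypothetical, and this is not the authors' WirelessHART implementation.

```python
import numpy as np

def locate(nodes_xy, rss_dbm, path_loss_exp=2.0, grid_step=0.25):
    """Grid search for the source position most consistent with the RSS
    readings under a log-distance path-loss model: the implied transmit
    power at every node should be (nearly) the same at the true location."""
    xs = np.arange(nodes_xy[:, 0].min(), nodes_xy[:, 0].max() + grid_step, grid_step)
    ys = np.arange(nodes_xy[:, 1].min(), nodes_xy[:, 1].max() + grid_step, grid_step)
    best_xy, best_score = None, np.inf
    for x in xs:
        for y in ys:
            d = np.hypot(nodes_xy[:, 0] - x, nodes_xy[:, 1] - y)
            d = np.maximum(d, 0.1)                    # avoid log(0) at a node
            implied_tx = rss_dbm + 10.0 * path_loss_exp * np.log10(d)
            score = np.var(implied_tx)                # spread of implied powers
            if score < best_score:
                best_xy, best_score = (float(x), float(y)), score
    return best_xy

# Hypothetical 5-node deployment (coordinates in metres) and RSS readings (dBm).
nodes = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 10.0], [0.0, 10.0], [5.0, 12.0]])
rss = np.array([-62.0, -55.0, -60.0, -68.0, -64.0])
print("estimated PD source location:", locate(nodes, rss))
```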
Pinpointing the North Korea Nuclear tests with body waves scattered by surface topography
NASA Astrophysics Data System (ADS)
Wang, N.; Shen, Y.; Bao, X.; Flinders, A. F.
2017-12-01
On September 3, 2017, North Korea conducted its sixth and by far the largest nuclear test at the Punggye-ri test site. In this work, we apply a novel full-wave location method that combines a non-linear grid-search algorithm with the 3D strain Green's tensor database to locate this event. We use the first arrivals (Pn waves) and their immediate codas, which are likely dominated by waves scattered by the surface topography near the source, to pinpoint the source location. We assess the solution in the search volume using a least-squares misfit between the observed and synthetic waveforms, which are obtained using the collocated-grid finite difference method on curvilinear grids. We calculate the one standard deviation level of the 'best' solution as a posterior error estimation. Our results show that the waveform based location method allows us to obtain accurate solutions with a small number of stations. The solutions are absolute locations as opposed to relative locations based on relative travel times, because topography-scattered waves depend on the geometric relations between the source and the unique topography near the source. Moreover, we use both differential waveforms and traveltimes to locate pairs of the North Korea tests in years 2016 and 2017 to further reduce the effects of inaccuracies in the reference velocity model (CRUST 1.0). Finally, we compare our solutions with those of other studies based on satellite images and relative traveltimes.
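The grid-search idea can be sketched as follows: evaluate a normalised least-squares waveform misfit at every candidate source location, keep the minimum, and report the set of points within one standard deviation of it as a rough uncertainty region. The toy forward model standing in for strain Green's tensor synthetics is purely illustrative, not the study's implementation.

```python
import numpy as np

def misfit(observed, synthetic):
    """Normalised least-squares misfit between two sets of traces."""
    num = sum(np.sum((o - s) ** 2) for o, s in zip(observed, synthetic))
    den = sum(np.sum(o ** 2) for o in observed)
    return num / den

def grid_search(observed, candidate_points, synth_for_point):
    """Evaluate the misfit at each trial location and return the best point
    plus all points whose misfit lies within one standard deviation of it."""
    results = [(misfit(observed, synth_for_point(p)), p) for p in candidate_points]
    results.sort(key=lambda t: t[0])
    best_misfit, best_point = results[0]
    sigma = np.std([m for m, _ in results])
    region = [p for m, p in results if m <= best_misfit + sigma]
    return best_point, region

if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 200)
    def synth_for_point(p):
        # Toy forward model: a time shift proportional to horizontal offset
        # stands in for synthetics from a strain Green's tensor database.
        shift = 0.05 * np.hypot(p[0], p[1])
        return [np.sin(2.0 * np.pi * 5.0 * (t - shift))]
    rng = np.random.default_rng(0)
    observed = [tr + 0.05 * rng.standard_normal(t.size)
                for tr in synth_for_point((2.0, 3.0, 0.5))]
    grid = [(x, y, 0.5) for x in np.arange(0.0, 4.01, 0.5)
            for y in np.arange(0.0, 4.01, 0.5)]
    best, region = grid_search(observed, grid, synth_for_point)
    print("best location:", best, "| points in 1-sigma region:", len(region))
```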
NASA Astrophysics Data System (ADS)
Belis, Claudio A.; Pernigotti, Denise; Pirovano, Guido
2017-04-01
Source Apportionment (SA) is the identification of ambient air pollution sources and the quantification of their contribution to pollution levels. This task can be accomplished using different approaches: chemical transport models and receptor models. Receptor models are derived from measurements and are therefore considered a reference for the contributions of primary sources to urban background levels. Chemical transport models provide better estimates of secondary (inorganic) pollutants and can deliver gridded results with high time resolution. Assessing the performance of SA model results is essential to guarantee reliable information on source contributions to be used in reporting to the Commission and in the development of pollution abatement strategies. This is the first intercomparison ever designed to test both receptor-oriented models (receptor models) and chemical transport models (source-oriented models) using a comprehensive method based on model quality indicators and pre-established criteria. The target pollutant of this exercise, organised in the frame of FAIRMODE WG 3, is PM10. Both receptor models and chemical transport models present good performances when evaluated against their respective references. Both types of models demonstrate quite satisfactory capabilities to estimate yearly source contributions, while the estimation of source contributions at the daily level (time series) is more critical. Chemical transport models showed a tendency to underestimate the contribution of some single sources when compared to receptor models. For receptor models the most critical source category is industry, probably because of the variety of single sources with different characteristics that belong to this category. Dust is the most problematic source for chemical transport models, likely due to the poor information about this kind of source in the emission inventories, particularly concerning road dust re-suspension, and consequently the limited detail about the chemical components of this source used in the models. The sensitivity tests show that chemical transport models perform better when resolving a detailed set of sources (14) than when using a simplified one (only 8). It was also observed that enhanced vertical profiling can improve the estimation of specific sources, such as industry, under complex meteorological conditions, and that insufficient spatial resolution in urban areas can limit the ability of models to estimate the contribution of diffuse primary sources (e.g. traffic). Both families of models identify traffic and biomass burning as the first and second most contributing categories, respectively, to elemental carbon. The results of this study demonstrate that the source apportionment assessment methodology developed by the JRC is applicable to any kind of SA model. The same methodology is implemented in the on-line DeltaSA tool to support source apportionment model evaluation (http://source-apportionment.jrc.ec.europa.eu/).
Eco- and genotoxicity profiling of a rapeseed biodiesel using a battery of bioassays.
Eck-Varanka, Bettina; Kováts, Nora; Horváth, Eszter; Ferincz, Árpád; Kakasi, Balázs; Nagy, Szabolcs Tamás; Imre, Kornélia; Paulovits, Gábor
2018-04-30
Biodiesel is considered an important renewable energy source, but there is still some controversy about its environmental toxicity, especially to aquatic life. In our study, the toxicity of the water-soluble fraction of a biodiesel was evaluated at relatively low concentrations using a battery of bioassays: Vibrio fischeri bioluminescence inhibition, Sinapis alba root growth inhibition, Daphnia magna immobilization, boar semen live/dead ratio and DNA fragmentation, and the Unio pictorum micronucleus test. While the S. alba test indicated a nutritive (stimulating) effect of the sample, the biodiesel exerted a toxic effect in the aquatic tests. D. magna was the most sensitive, with an EC50 value of 0.0226%. For genotoxicity assessment, the mussel micronucleus test (MNT) was applied and detected considerable genotoxic potential of the biodiesel sample: it elicited micronucleus formation even at a low concentration of 3.3%. Although this test has not previously been employed in biodiesel eco/genotoxicity assessments, it seems a promising tool based on its sensitivity and representativeness. Copyright © 2018 Elsevier Inc. All rights reserved.
Age-related differences in agenda-driven monitoring of format and task information
Mitchell, Karen J.; Ankudowich, Elizabeth; Durbin, Kelly A.; Greene, Erich J.; Johnson, Marcia K.
2013-01-01
Age-related source memory deficits may arise, in part, from changes in the agenda-driven processes that control what features of events are relevant during remembering. Using fMRI, we compared young and older adults on tests assessing source memory for format (picture, word) or encoding task (self-, other-referential), as well as on old-new recognition. Behaviorally, relative to old-new recognition, older adults showed disproportionate and equivalent deficits on both source tests compared to young adults. At encoding, both age groups showed expected activation associated with format in posterior visual processing areas, and with task in medial prefrontal cortex. At test, the groups showed similar selective, agenda-related activity in these representational areas. There were, however, marked age differences in the activity of control regions in lateral and medial prefrontal cortex and lateral parietal cortex. Results of correlation analyses were consistent with the idea that young adults had greater trial-by-trial agenda-driven modulation of activity (i.e., greater selectivity) than did older adults in representational regions. Thus, under selective remembering conditions where older adults showed clear differential regional activity in representational areas depending on type of test, they also showed evidence of disrupted frontal and parietal function and reduced item-by-item modulation of test-appropriate features. This pattern of results is consistent with an age-related deficit in the engagement of selective reflective attention. PMID:23357375
Biological assessment of aquatic pollution: a review, with emphasis on plants as biomonitors.
Doust, J L; Schmidt, M; Doust, L L
1994-05-01
In a number of disciplines including ecology, ecotoxicology, water quality management, water resource management, fishery biology etc., there is significant interest in the testing of new materials, environmental samples (of water or sediments) and specific sites, in terms of their effects on biota. In the first instance, we consider various sources of aquatic pollution, sources typically associated with developed areas of the world. Historically, much water quality assessment has been performed by researchers with a background in chemistry or engineering, thus chemical analysis was a dominant form of assessment. However, chemical analyses, particularly of such materials as organochlorines and polyaromatic hydrocarbons can be expensive, and local environmental factors may cause the actual exposure of an organism to be little correlated with chemical concentrations in the surrounding water or sediments. To a large extent toxicity testing has proceeded independently of environmental quality assessment in situ, and the work has been done by different, and differently-trained researchers. Here we attempt to bring together the various forms of biological assessment of aquatic pollution, because in our opinion it is worth developing a coherent framework for the application of this powerful tool. Biotic assessment in its most primitive form involves the simple tracking of mortality in exposed organisms. However, in most natural environments it is extended, chronic exposure to contaminants that has the most wide-ranging and irreversible repercussions--thus measures of sub-lethal impairment are favoured. From an ecological standpoint, it is most valuable to assess ecological effects by direct study of in situ contaminant body burdens and impairment of growth and reproduction compared with 'clean' sites. A distinction is made here between bioindication and biomonitoring, and a case is made for including aquatic macrophytes (angiosperms) in studies of contaminant levels and effects in the biota. It is apparent that there is a concurrent need for laboratory-based testing of new industrial by-products before any are released in the environment, and such studies should aid the investigation of mechanisms and modes of toxicity, but environmental assessment, and tracking of improvements in environmental quality are most effectively achieved by active biomonitoring experiments.
Industry-University SBIR NDT Projects — A Critical Assessment
NASA Astrophysics Data System (ADS)
Reinhart, Eugene R.
2007-03-01
The Small Business Innovative Research (SBIR) program, funded by various United States government agencies (DOD, DOE, NSF, etc.), provides funds for Research and Development (R&D) of nondestructive testing (NDT) techniques and equipment, thereby supplying valuable money for NDT development by small businesses and stimulating cooperative university programs. A review and critical assessment of the SBIR program as related to NDT is presented and should provide insight into reasons for or against pursuing this source of R&D funding.
Benchmarking Controlled Trial--a novel concept covering all observational effectiveness studies.
Malmivaara, Antti
2015-06-01
The Benchmarking Controlled Trial (BCT) is a novel concept which covers all observational studies aiming to assess effectiveness. BCTs provide evidence of the comparative effectiveness between health service providers, and of effectiveness due to particular features of the health and social care systems. BCTs complement randomized controlled trials (RCTs) as the sources of evidence on effectiveness. This paper presents a definition of the BCT; compares the position of BCTs in assessing effectiveness with that of RCTs; presents a checklist for assessing methodological validity of a BCT; and pilot-tests the checklist with BCTs published recently in the leading medical journals.
Cavitating Propeller Performance in Inclined Shaft Conditions with OpenFOAM: PPTC 2015 Test Case
NASA Astrophysics Data System (ADS)
Gaggero, Stefano; Villa, Diego
2018-05-01
In this paper, we present our analysis of the non-cavitating and cavitating unsteady performances of the Potsdam Propeller Test Case (PPTC) in oblique flow. For our calculations, we used the Reynolds-averaged Navier-Stokes equation (RANSE) solver from the open-source OpenFOAM libraries. We selected the homogeneous mixture approach to solve for multiphase flow with phase change, using the volume of fluid (VoF) approach to solve the multiphase flow and modeling the mass transfer between vapor and water with the Schnerr-Sauer model. Comparing the model results with the experimental measurements collected during the Second Workshop on Cavitation and Propeller Performance - SMP'15 enabled our assessment of the reliability of the open-source calculations. Comparisons with the numerical data collected during the workshop enabled further analysis of the reliability of different flow solvers from which we produced an overview of recommended guidelines (mesh arrangements and solver setups) for accurate numerical prediction even in off-design conditions. Lastly, we propose a number of calculations using the boundary element method developed at the University of Genoa for assessing the reliability of this dated but still widely adopted approach for design and optimization in the preliminary stages of very demanding test cases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, Robert C.; Szecsody, James; Rigali, Mark J.
We have performed an initial evaluation and testing program to assess the effectiveness of a hydroxyapatite (Ca10(PO4)6(OH)2) permeable reactive barrier and source area treatment to decrease uranium mobility at the Department of Energy (DOE) former Old Rifle uranium mill processing site in Rifle, western Colorado. Uranium ore was processed at the site from the 1940s to the 1970s. The mill facilities at the site as well as the uranium mill tailings previously stored there have all been removed. Groundwater in the alluvial aquifer beneath the site still contains elevated concentrations of uranium, and is currently used for field tests to study uranium behavior in groundwater and investigate potential uranium remediation technologies. The technology investigated in this work is based on in situ formation of apatite in sediment to create a subsurface apatite PRB and also for source area treatment. The process is based on injecting a solution containing calcium citrate and sodium phosphate into the subsurface for constructing the PRB within the uranium plume. As the indigenous sediment micro-organisms biodegrade the injected citrate, the calcium is released and reacts with the phosphate to form hydroxyapatite (precipitate). This paper reports on proof-of-principle column tests with Old Rifle sediment and synthetic groundwater.
Awareness of Omega-3 Fatty Acids and Possible Health Effects among Young Adults.
Roke, Kaitlin; Rattner, Jodi; Brauer, Paula; Mutch, David M
2018-03-16
To assess awareness of omega-3 fatty acids (FAs) and their possible health effects among young adults. An online survey was deployed to young adults. Questionnaire development involved identification of topic areas by content experts and adaptation of questions from previous consumer surveys. Focus groups and cognitive interviews ensured face validity, feasibility, and clarity of survey questions. Degrees of awareness and self-reported consumption were assessed by descriptive statistics and associations by Cochran's Q tests, Pearson's χ² tests, Z-tests, and logistic regression. Of the 834 survey completers (aged 18-25 years), more respondents recognized the abbreviations EPA (∼51%) and DHA (∼66%) relative to ALA (∼40%; P ≤ 0.01). Most respondents (∼83%) recognized that EPA and DHA have been linked to heart and brain health. Respondents who used academic/reputable sources, healthcare professionals, and/or social media to obtain nutritional information were more likely to report awareness of these health effects (P ≤ 0.01). Finally, 48% of respondents reported purchasing or consuming omega-3 foods, while 21% reported taking omega-3 supplements. This baseline survey suggests a high level of awareness of some aspects of omega-3 fats and health in a sample of young adults, and social media has become a prominent source of nutrition and health information.
Conducting Source Water Assessments
This page presents background information on the source water assessment steps, the four steps of a source wter assessment, and how to use the results of an assessment to protect drinking water sources.
The Use of Sensory Analysis Techniques to Assess the Quality of Indoor Air.
Lewkowska, Paulina; Dymerski, Tomasz; Gębicki, Jacek; Namieśnik, Jacek
2017-01-02
The quality of indoor air is one of the significant elements that influence people's well-being and health inside buildings. Emissions of pollutants, which may cause odor nuisance, are the main reason for people's complaints regarding the quality of indoor air. As a result, it is necessary to perform tests aimed at identifying the sources of odors inside buildings. The article contains basic information on the characteristics of the sources of indoor air pollution and the influence of the odor detection threshold on people's health and comfort. An attempt was also made to classify sensory analysis techniques and to apply them in testing indoor air quality, enabling identification of sensory experiences and indication of their intensity.
Contamination source review for Building E5974, Edgewood Area, Aberdeen Proving Ground, Maryland
DOE Office of Scientific and Technical Information (OSTI.GOV)
Billmark, K.A.; Emken, M.E.; O`Reilly, D.P.
1995-09-01
This report documents the results of a contamination source review of Building E5974 at the Aberdeen Proving Ground (APG) in Maryland. The primary mission at APG has been the testing and evaluation of US Army warfare materials. Since its beginning in 1917, the Edgewood Area of APG has been the principal location for chemical warfare agent research, development, and testing in the US. APG was also used for producing chemical warfare agents during both world wars, and it has been a center for the storage of chemical warfare material. An attempt was made to identify and define areas of toxic or hazardous contaminants and to assess the physical condition and accessibility of APG buildings. The information obtained from this review may be used to assist the US Army in planning for the future use or disposition of the buildings. The contamination source review consisted of the following tasks: historical records search, physical inspection, photographic documentation, geophysical investigation, and collection of air samples.
Sources of listening anxiety in learning English as a foreign language.
Chang, Anna Ching-Shyang
2008-02-01
In this study of college students' listening anxiety in learning English in a classroom context, participants were 160 students (47 men and 113 women) ages 18 to 19 years. To address their listening anxiety, participants were chosen from students enrolling in a required listening course. A listening questionnaire was used to assess learners' anxiety about spoken English, its intensity, and the main sources of listening anxiety. Overall, participants showed moderately high intensity of anxiety in listening to spoken English, but were more anxious in testing than in general situations. In contrast to previous research on the nature of spoken English as the main source of listening anxiety, this study found that low confidence in comprehending spoken English, taking English listening courses as a requirement, and worrying about test difficulty were the three main factors contributing to participants' listening anxiety in a classroom context. Participants' learning profiles both in the classroom and outside the class yielded data which provides suggestions for reducing anxiety.
Use of Ultrasonic Energy in Assessing Microbial Contamination on Surfaces
Puleo, John R.; Favero, Martin S.; Petersen, Norman J.
1967-01-01
Ultrasonic tanks were evaluated for their ability to remove viable microorganisms from various surfaces for subsequent enumeration. Test surfaces were polished stainless steel, smooth glass, frosted glass, and electronic components. The position of contaminated surfaces in relation to the ultrasonic energy source, distance of the ultrasonic source from the test surfaces, and temperature of the rinse fluid were some of the factors which influenced recovery. Experimental systems included both naturally occurring microbial contamination and artificial contamination with spores of Bacillus subtilis var. niger. The results showed that ultrasonic energy was more reliable and efficient than mechanical agitation for recovering surface contaminants. Conditions which increased the number and percentage of microorganisms recovered by ultrasonic energy were: using a cold rinse fluid, placing the sample bottle on the bottom of the ultrasonic tank, and facing the contaminated surfaces toward the energy source. It was also demonstrated that ultrasonic energy could be effectively used for eluting microorganisms from cotton swabs. PMID:16349743
Heritability of educational achievement in 12-year-olds and the overlap with cognitive ability.
Bartels, Meike; Rietveld, Marjolein J H; Van Baal, G Caroline M; Boomsma, Dorret I
2002-12-01
In order to determine high school entrance level in the Netherlands, nowadays, much value is attached to the results of a national test of educational achievement (CITO), administered around age 12. Surprisingly, up until now, no attention has been paid to the etiology of individual differences in the results of this national test of educational achievement. No attempt has been made to address the question about the nature of a possible association between the results of the CITO and cognitive abilities, as measured by psychometric IQ. The aim of this study is to explore to what extent psychometric IQ and scholastic achievement, as assessed by the CITO high school entrance test, are correlated. In addition, it was investigated whether this expected correlation was due to a common genetic background, shared or nonshared environmental influences common to CITO and intelligence or a combination of these influences. To this end multivariate behavior genetic analyses with CITO and IQ at ages 5, 7, 10 and 12 years have been conducted. The correlations were 0.41, 0.50, 0.60, and 0.63 between CITO and IQ assessed at age 5, 7, 10, and 12, respectively. The results of the analyses pointed to genetic effects as the main source of variance in CITO and an important source of covariance between CITO and IQ. Additive genetic effects accounted for 60% of the individual differences found in CITO scores in a large sample of Dutch 12-year-olds. This high heritability indicated that the CITO might be a valuable instrument to assess individual differences in cognitive abilities in children but might not be the right instrument to put the effect of education to the test.
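For readers unfamiliar with how such heritability figures arise, the sketch below shows the classical Falconer decomposition from monozygotic and dizygotic twin correlations. The study itself fitted multivariate structural equation models, and the correlations used here are hypothetical.

```python
def ace_from_twin_correlations(r_mz, r_dz):
    """Falconer estimates: A = 2(rMZ - rDZ), C = 2rDZ - rMZ, E = 1 - rMZ."""
    a2 = 2.0 * (r_mz - r_dz)        # additive genetic variance proportion
    c2 = 2.0 * r_dz - r_mz          # shared environment
    e2 = 1.0 - r_mz                 # nonshared environment plus measurement error
    return a2, c2, e2

print(ace_from_twin_correlations(r_mz=0.78, r_dz=0.48))   # approx. (0.60, 0.18, 0.22)
```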
Thulium heat source IR&D Project 91-031
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walter, C.E.; Kammeraad, J.E.; Newman, J.G.
1991-01-01
The goal of the Thulium Heat Source study is to determine the performance capability and evaluate the safety and environmental aspects of a thulium-170 heat source. Thulium-170 has several attractive features, including the fact that it decays to a stable, chemically innocuous isotope in a relatively short time. A longer-range goal is to attract government funding for the development, fabrication, and demonstration testing in an Autonomous Underwater Vehicle (AUV) of one or more thulium isotope power (TIP) prototype systems. The approach is to study parametrically the performance of thulium-170 heat source designs in the power range of 5-50 kW{sub th}. At least three heat source designs will be characterized in this power range to assess their performance, mass, and volume. The authors will determine shielding requirements, and consider the safety and environmental aspects of their use.
Dubey, J P; Mitchell, S M; Morrow, J K; Rhyan, J C; Stewart, L M; Granstrom, D E; Romand, S; Thulliez, P; Saville, W J; Lindsay, D S
2003-08-01
Sarcocystis neurona, Neospora caninum, N. hughesi, and Toxoplasma gondii are 4 related coccidians considered to be associated with encephalomyelitis in horses. The source of infection for N. hughesi is unknown, whereas opossums, dogs, and cats are the definitive hosts for S. neurona, N. caninum, and T. gondii, respectively. Seroprevalence of these coccidians in 276 wild horses from central Wyoming outside the known range of the opossum (Didelphis virginiana) was determined. Antibodies to T. gondii were found only in 1 of 276 horses tested with the modified agglutination test using 1:25, 1:50, and 1:500 dilutions. Antibodies to N. caninum were found in 86 (31.1%) of the 276 horses tested with the Neospora agglutination test--the titers were 1:25 in 38 horses, 1:50 in 15, 1:100 in 9, 1:200 in 8, 1:400 in 4, 1:800 in 2, 1:1,600 in 2, 1:3,200 in 2, and 1:12,800 in 1. Antibodies to S. neurona were assessed with the serum immunoblot; of 276 horses tested, 18 had antibodies considered specific for S. neurona. Antibodies to S. neurona also were assessed with the S. neurona direct agglutination test (SAT). Thirty-nine of 265 horses tested had SAT antibodies--in titers of 1:50 in 26 horses and 1:100 in 13. The presence of S. neurona antibodies in horses in central Wyoming suggests that either there is cross-reactivity between S. neurona and some other infection or a definitive host other than opossum is the source of infection. In a retrospective study, S. neurona antibodies were not found by immunoblot in the sera of 243 horses from western Canada outside the range of D. virginiana.
How Patterns of Learning About Sexual Information Among Adolescents Are Related to Sexual Behaviors.
Bleakley, Amy; Khurana, Atika; Hennessy, Michael; Ellithorpe, Morgan
2018-03-01
Parents, peers and media are informal sources of sexual information for adolescents. Although the content of sexual information communicated by these sources is known to vary, little is known about what adolescents report actually learning from each source. Data from 1,990 U.S. 14-17-year-olds who participated in an online survey in 2015 were used to assess learning about four topics (sex, condoms, hormonal birth control and romantic relationships) from three informal sources (parents, peers, and television and movies). Gender and race differences in learning by source and topic were assessed using t tests. Following a factor analysis, learning about all topics was grouped by source, and regression analyses were conducted to examine associations between learning from each source and three outcomes: sexual activity, condom use and hormonal birth control use. Models included interactions between information sources and race and gender. White adolescents reported learning more from parents and less from media than black adolescents. Compared with males, females learned more about hormonal birth control and less about condoms from their parents, and more about relationships from peers and media. Learning from parents and from peers were positively associated with adolescents' sexual activity (unstandardized coefficients, 0.26 and 0.52, respectively). Learning from parents was positively associated with condom use (odds ratio, 1.5). Adolescents' learning about sex from informal sources varies by race and gender. Future research should examine whether sexual health interventions and message development can capitalize on these differences. Copyright © 2018 by the Guttmacher Institute.
NASA Astrophysics Data System (ADS)
Schroth, M. H.; Kleikemper, J.; Pombo, S. A.; Zeyer, J.
2002-12-01
In the past, studies on microbial communities in natural environments have typically focused on either their structure or on their metabolic function. However, linking structure and function is important for understanding microbial community dynamics, in particular in contaminated environments. We will present results of a novel combination of a hydrogeological field method (push-pull tests) with molecular tools and stable isotope analysis, which was employed to quantify anaerobic activities and associated microbial diversity in a petroleum-contaminated aquifer in Studen, Switzerland. Push-pull tests consisted of the injection of test solution containing a conservative tracer and reactants (electron acceptors, 13C-labeled carbon sources) into the aquifer anoxic zone. Following an incubation period, the test solution/groundwater mixture was extracted from the same location. Metabolic activities were computed from solute concentrations measured during extraction. Simultaneously, microbial diversity in sediment and groundwater was characterized by using fluorescence in situ hybridization (FISH), denaturing gradient gel electrophoresis (DGGE), as well as phospholipids fatty acid (PLFA) analysis in combination with 13C isotopic measurements. Results from DGGE analyses provided information on the general community structure before, during and after the tests, while FISH yielded information on active populations. Moreover, using 13C-labeling of microbial PLFA we were able to directly link carbon source assimilation in an aquifer to indigenous microorganisms while providing quantitative information on respective carbon source consumption.
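A minimal sketch of how metabolic activities can be computed from push-pull extraction data is given below, assuming first-order consumption: regressing the log of the reactant-to-tracer relative concentration ratio against time removes the effect of dilution, and the slope magnitude estimates the rate coefficient. The concentrations shown are hypothetical, not data from the Studen site.

```python
import numpy as np

def first_order_rate(t_hours, reactant_rel, tracer_rel):
    """Relative concentrations (C/C0) of a reactant and a conservative tracer
    measured in the extracted test-solution/groundwater mixture; the tracer
    normalisation cancels dilution, leaving first-order consumption."""
    y = np.log(np.asarray(reactant_rel, float) / np.asarray(tracer_rel, float))
    slope, _intercept = np.polyfit(np.asarray(t_hours, float), y, 1)
    return -slope                                   # rate coefficient, 1/hour

t = [2, 6, 12, 24, 48]                              # hours since injection
tracer = [0.95, 0.80, 0.62, 0.40, 0.22]             # dilution only
reactant = [0.90, 0.68, 0.45, 0.21, 0.06]           # dilution + consumption
print(f"first-order rate: {first_order_rate(t, reactant, tracer):.3f} per hour")
```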
2004-11-01
Figure 2-3. Chemical/Industrial and High Fidelity Urban targets, Nevada Test and Training Range (NTTR).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graves, Todd L; Hamada, Michael S
2008-01-01
Good estimates of the reliability of a system make use of test data and expert knowledge at all available levels. Furthermore, by integrating all these information sources, one can determine how best to allocate scarce testing resources to reduce uncertainty. Both of these goals are facilitated by modern Bayesian computational methods. We apply these tools to examples that were previously solvable only through the use of ingenious approximations, and use genetic algorithms to guide resource allocation.
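The flavour of this approach can be sketched with a conjugate Beta-Binomial update per component and a naive rule that allocates the next tests to the component with the widest posterior. The priors, component names, and test counts below are hypothetical, and the authors' actual models and genetic-algorithm allocation are considerably richer.

```python
from scipy.stats import beta

# name: (prior_alpha, prior_beta, tests_run, failures) -- all hypothetical.
components = {
    "initiator": (9.0, 1.0, 20, 0),
    "actuator":  (4.0, 1.0, 10, 1),
    "sensor":    (2.0, 1.0,  5, 0),
}

posteriors = {}
for name, (a0, b0, n, failures) in components.items():
    a, b = a0 + (n - failures), b0 + failures      # conjugate Beta-Binomial update
    post = beta(a, b)
    posteriors[name] = post
    lo, hi = post.interval(0.95)
    print(f"{name}: posterior mean {post.mean():.3f}, 95% interval ({lo:.3f}, {hi:.3f})")

# Naive allocation rule: test next whichever component's posterior is widest.
next_component = max(posteriors, key=lambda k: posteriors[k].std())
print("allocate the next tests to:", next_component)
```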
2015-08-01
Baseline drop tower data collected from Anthropomorphic Test Devices (ATDs) seated in 12 models of Commercial Off-The-Shelf
2011-01-01
Background In healthcare, a gap exists between what is known from research and what is practiced. Understanding this gap depends upon our ability to robustly measure research utilization. Objectives The objectives of this systematic review were: to identify self-report measures of research utilization used in healthcare, and to assess the psychometric properties (acceptability, reliability, and validity) of these measures. Methods We conducted a systematic review of literature reporting use or development of self-report research utilization measures. Our search included: multiple databases, ancestry searches, and a hand search. Acceptability was assessed by examining time to complete the measure and missing data rates. Our approach to reliability and validity assessment followed that outlined in the Standards for Educational and Psychological Testing. Results Of 42,770 titles screened, 97 original studies (108 articles) were included in this review. The 97 studies reported on the use or development of 60 unique self-report research utilization measures. Seven of the measures were assessed in more than one study. Study samples consisted of healthcare providers (92 studies) and healthcare decision makers (5 studies). No studies reported data on acceptability of the measures. Reliability was reported in 32 (33%) of the studies, representing 13 of the 60 measures. Internal consistency (Cronbach's Alpha) reliability was reported in 31 studies; values exceeded 0.70 in 29 studies. Test-retest reliability was reported in 3 studies with Pearson's r coefficients > 0.80. No validity information was reported for 12 of the 60 measures. The remaining 48 measures were classified into a three-level validity hierarchy according to the number of validity sources reported in 50% or more of the studies using the measure. Level one measures (n = 6) reported evidence from any three (out of four possible) Standards validity sources (which, in the case of single item measures, was all applicable validity sources). Level two measures (n = 16) had evidence from any two validity sources, and level three measures (n = 26) from only one validity source. Conclusions This review reveals significant underdevelopment in the measurement of research utilization. Substantial methodological advances with respect to construct clarity, use of research utilization and related theory, use of measurement theory, and psychometric assessment are required. Also needed are improved reporting practices and the adoption of a more contemporary view of validity (i.e., the Standards) in future research utilization measurement studies. PMID:21794144
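Two of the reliability statistics discussed in this review are easy to illustrate: the sketch below computes Cronbach's alpha for a hypothetical five-item scale and a Pearson test-retest correlation. The simulated responses are not data from the review.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """item_scores: 2-D array, rows = respondents, columns = scale items."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_variances = x.var(axis=0, ddof=1).sum()
    total_variance = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)

rng = np.random.default_rng(1)
true_score = rng.normal(size=100)[:, None]
items = true_score + rng.normal(scale=0.8, size=(100, 5))     # 5-item scale
retest_total = items.sum(axis=1) + rng.normal(scale=1.0, size=100)

print("Cronbach's alpha:", round(cronbach_alpha(items), 2))
print("test-retest r:", round(float(np.corrcoef(items.sum(axis=1), retest_total)[0, 1]), 2))
```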
Nyabenda, A; Briart, C; Deggouj, N; Gersdorff, M
2003-12-01
To date, the effectiveness of balance rehabilitation for patients with Meniere's syndrome has not been unanimously acknowledged by physicians and physiotherapists. The purpose of this study is to assess the therapeutic efficacy of rotational exercises in the treatment of disequilibrium for patients with unilateral Meniere's syndrome. Rotational stimuli were used to symmetrize and reduce postrotatory nystagmic response. Three reference sources were used to assess the efficacy of this management: vestibulospinal function tests: pre- and post-treatment results at the Romberg test, the Unterberger-Fukuda stepping test, the Babinski-Weil test, and gait testing with eyes closed; rotational tests: pre- and post-treatment results; and the self-perceived impact of vertigo: assessed by the Dizziness Handicap Inventory (DHI) and a scale based on the guidelines of the Japanese Society of Equilibrium Research (JSER, 1993). The JSER scale provides quantitative vertigo evaluation; the DHI reflects the patient's perceptual evaluation of handicap. Patients required 11 sessions (mean value) to attain subjective improvement. Of the 23 patients, only seven required optokinetic stimulation (mean requirement: three sessions). Rotational tests and dynamic tests of the vestibulospinal function improved. The DHI and JSER results show that patients' post-rehabilitation perceptual evaluation significantly improved. The objective and subjective measures of disequilibrium in patients with unilateral Meniere's syndrome were significantly improved.
A support system for assessing local vulnerability to weather and climate
Coletti, Alex; Howe, Peter D.; Yarnal, Brent; Wood, Nathan J.
2013-01-01
The changing number and nature of weather- and climate-related natural hazards is causing more communities to need to assess their vulnerabilities. Vulnerability assessments, however, often require considerable expertise and resources that are not available or too expensive for many communities. To meet the need for an easy-to-use, cost-effective vulnerability assessment tool for communities, a prototype online vulnerability assessment support system was built and tested. This prototype tool guides users through a stakeholder-based vulnerability assessment that breaks the process into four easy-to-implement steps. Data sources are integrated in the online environment so that perceived risks—defined and prioritized qualitatively by users—can be compared and discussed against the impacts that past events have had on the community. The support system is limited in scope, and the locations of the case studies do not provide a sufficiently broad range of sample cases. The addition of more publically available hazard databases combined with future improvements in the support system architecture and software will expand opportunities for testing and fully implementing the support system.
Immediate source-monitoring, self-focused attention and the positive symptoms of schizophrenia.
Startup, Mike; Startup, Sue; Sedgman, Adele
2008-10-01
Previous research suggests that tendencies to misattribute one's own thoughts to an external source, as assessed by an immediate source-monitoring test, are associated with auditory verbal hallucinations (AVHs). However, recent research suggests that such tendencies are associated instead with symptoms of thought interference. The main aim of the present study was to examine whether such tendencies are differentially associated with different types of thought interference, with AVHs, or with both. It has also been suggested that external misattributions are especially likely to occur with emotionally salient material and if the individual's focus is on the self. These suggestions were also tested. The positive psychotic symptoms of 57 individuals with a diagnosis of schizophrenia were assessed and they then completed the Self-Focus Sentence Completion blank. Immediately after completing each sentence they were asked to indicate to what extent the sentence was their own. The number of sentences that were not rated as completely their own served as their externalization score. Externalization scores correlated significantly with the severity of three symptoms: voices commenting, delusions of being controlled, and thought insertion. In a logistic regression analysis, all three of these symptoms were significantly and independently related to externalization. Externalization was not associated with either a negative or a neutral self-focus. Thus tendencies to misattribute one's own thoughts to an external source are associated with AVHs and some, but not all, symptoms of thought interference. The importance for externalization of self-focused attention and of the emotional salience of the elicited thoughts was not supported.
Functional Performance of Pyrovalves
NASA Technical Reports Server (NTRS)
Bement, Laurence J.
1996-01-01
Following several flight and ground test failures of spacecraft systems using single-shot, 'normally closed' pyrotechnically actuated valves (pyrovalves), a government/industry cooperative program was initiated to assess the functional performance of five qualified designs. The goal of the program was to improve performance-based requirements for the procurement of pyrovalves. Specific objectives included the demonstration of performance test methods, the measurement of 'blowby' (the passage of gases from the pyrotechnic energy source around the activating piston into the valve's fluid path), and the quantification of functional margins for each design. Experiments were conducted in-house at NASA on several units each of the five valve designs. The test methods used for this program measured the forces and energies required to actuate the valves, as well as the energies and the pressures (where possible) delivered by the pyrotechnic sources. Functional performance ranged widely among the designs. Blowby cannot be prevented by o-ring seals; metal-to-metal seals were effective. Functional margin was determined by dividing the energy delivered by the pyrotechnic sources in excess to that required to accomplish the function by the energy required for that function. All but two designs had adequate functional margins with the pyrotechnic cartridges evaluated.
Garcia, Michael; Daugherty, Christopher; Ben Khallouq, Bertha; Maugans, Todd
2018-05-01
OBJECTIVE The Internet is used frequently by patients and family members to acquire information about pediatric neurosurgical conditions. The sources, nature, accuracy, and usefulness of this information have not been examined recently. The authors analyzed the results from searches of 10 common pediatric neurosurgical terms using a novel scoring test to assess the value of the educational information obtained. METHODS Google and Bing searches were performed for 10 common pediatric neurosurgical topics (concussion, craniosynostosis, hydrocephalus, pediatric brain tumor, pediatric Chiari malformation, pediatric epilepsy surgery, pediatric neurosurgery, plagiocephaly, spina bifida, and tethered spinal cord). The first 10 "hits" obtained with each search engine were analyzed using the Currency, Relevance, Authority, Accuracy, and Purpose (CRAAP) test, which assigns a numerical score in each of 5 domains. Agreement between results was assessed for 1) concurrent searches with Google and Bing; 2) Google searches over time (6 months apart); 3) Google searches using mobile and PC platforms concurrently; and 4) searches using privacy settings. Readability was assessed with an online analytical tool. RESULTS Google and Bing searches yielded information with similar CRAAP scores (mean 72% and 75%, respectively), but with frequently differing results (58% concordance/matching results). There was a high level of agreement (72% concordance) over time for Google searches and also between searches using general and privacy settings (92% concordance). Government sources scored the best in both CRAAP score and readability. Hospitals and universities were the most prevalent sources, but these sources had the lowest CRAAP scores, due in part to an abundance of self-marketing. The CRAAP scores for mobile and desktop platforms did not differ significantly (p = 0.49). CONCLUSIONS Google and Bing searches yielded useful educational information, using either mobile or PC platforms. Most information was relevant and accurate; however, the depth and breadth of information was variable. Search results over a 6-month period were moderately stable. Pediatric neurosurgery practices and neurosurgical professional organization websites were inferior (less current, less accurate, less authoritative, and less purposeful) to governmental and encyclopedia-type resources such as Wikipedia. This presents an opportunity for pediatric neurosurgeons to participate in the creation of better online patient/parent educational material.
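The CRAAP aggregation can be illustrated with a simple percentage score across the five domains; the 1-10 rating scale and equal weighting below are assumptions for illustration rather than the authors' published rubric.

```python
DOMAINS = ("currency", "relevance", "authority", "accuracy", "purpose")

def craap_percent(ratings, max_per_domain=10):
    """ratings: mapping of CRAAP domain -> numeric rating on an assumed 1-10 scale."""
    missing = set(DOMAINS) - set(ratings)
    if missing:
        raise ValueError(f"missing domains: {sorted(missing)}")
    total = sum(ratings[d] for d in DOMAINS)
    return 100.0 * total / (max_per_domain * len(DOMAINS))

example = {"currency": 7, "relevance": 9, "authority": 6, "accuracy": 8, "purpose": 5}
print(f"CRAAP score: {craap_percent(example):.0f}%")    # -> 70%
```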
2013-01-01
Numerous quantitative PCR assays for microbial fecal source tracking (MST) have been developed and evaluated in recent years. Widespread application has been hindered by a lack of knowledge regarding the geographical stability and hence applicability of such methods beyond the regional level. This study assessed the performance of five previously reported quantitative PCR assays targeting human-, cattle-, or ruminant-associated Bacteroidetes populations on 280 human and animal fecal samples from 16 countries across six continents. The tested cattle-associated markers were shown to be ruminant-associated. The quantitative distributions of marker concentrations in target and nontarget samples proved to be essential for the assessment of assay performance and were used to establish a new metric for quantitative source-specificity. In general, this study demonstrates that stable target populations required for marker-based MST occur around the globe. Ruminant-associated marker concentrations were strongly correlated with total intestinal Bacteroidetes populations and with each other, indicating that the detected ruminant-associated populations seem to be part of the intestinal core microbiome of ruminants worldwide. Consequently, the tested ruminant-targeted assays appear to be suitable quantitative MST tools beyond the regional level, while the targeted human-associated populations seem to be less prevalent and stable, suggesting potential for improvements in human-targeted methods. PMID:23755882
Mehra, S; Morrison, P D; Coates, F; Lawrie, A C
2017-02-01
Terrestrial orchids depend on orchid mycorrhizal fungi (OMF) as symbionts for their survival, growth and nutrition. The ability of OMF from endangered orchid species to compete for available resources with OMF from common species may affect the distribution, abundance and therefore conservation status of their orchid hosts. Eight symbiotically effective OMF from endangered and more common Caladenia species were grown in liquid medium with different carbon sources, both complex insoluble and simple soluble carbon sources produced during litter degradation, to test their ability to utilise these substrates and to measure the degree to which OMF vary with host conservation status or taxonomy. On simple carbon sources, fungal growth was assessed by biomass. On insoluble substrates, ergosterol content was assessed using ultra-performance liquid chromatography (UPLC). The OMF grew on all natural materials and complex carbon sources, but produced the greatest biomass on xylan and starch and the least on bark and chitin. On simple carbon sources, the greatest OMF biomass was measured on most hexoses and disaccharides and the least on galactose and arabinose. Only some OMF used sucrose, the most common sugar in green plants, with possible implications for symbiosis. OMF from common orchids produced more ergosterol and biomass than those from endangered orchids in the Dilatata and Reticulata groups but not in the Patersonii and Finger orchids. This suggests that differences in carbon source utilisation may contribute to differences in the distribution of some orchids, if these differences are retained on site.
Varughese, Eunice A; Brinkman, Nichole E; Anneken, Emily M; Cashdollar, Jennifer L; Fout, G Shay; Furlong, Edward T; Kolpin, Dana W; Glassmeyer, Susan T; Keely, Scott P
2018-04-01
Drinking water treatment plants rely on purification of contaminated source waters to provide communities with potable water. One group of possible contaminants is enteric viruses. Measurement of viral quantities in environmental water systems is often performed using polymerase chain reaction (PCR) or quantitative PCR (qPCR). However, true values may be underestimated due to challenges involved in a multi-step viral concentration process and due to PCR inhibition. In this study, water samples were concentrated from 25 drinking water treatment plants (DWTPs) across the US to study the occurrence of enteric viruses in source water and their removal after treatment. The five different types of viruses studied were adenovirus, norovirus GI, norovirus GII, enterovirus, and polyomavirus. Quantitative PCR was performed on all samples to determine the presence or absence of these viruses in each sample. Ten DWTPs showed the presence of one or more viruses in source water, and treated drinking water tested positive at four of these DWTPs. Furthermore, PCR inhibition was assessed for each sample using an exogenous amplification control, which indicated that all of the DWTP samples, including source and treated water samples, had some level of inhibition, confirming that inhibition plays an important role in PCR-based assessments of environmental samples. PCR inhibition measurements, viral recovery, and other assessments were incorporated into a Bayesian model to more accurately determine viral load in both source and treated water. Results of the Bayesian model indicated that viruses are present in source water and treated water. By using a Bayesian framework that incorporates inhibition, as well as many other parameters that affect viral detection, this study offers an approach for more accurately estimating the occurrence of viral pathogens in environmental waters. Published by Elsevier B.V.
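The abstract does not give the model specification, but the general idea of folding recovery and inhibition into a Bayesian estimate of the true viral concentration can be sketched with a simple grid posterior. All numbers, and the lognormal error assumption, are hypothetical and are not the authors' model:

```python
import numpy as np

observed = 120.0    # hypothetical genome copies/L reported by qPCR
recovery = 0.30     # hypothetical fraction surviving the concentration steps
inhibition = 0.50   # hypothetical fraction of signal lost to PCR inhibition
sigma = 0.4         # lognormal measurement noise (log scale)

grid = np.linspace(1.0, 5000.0, 5000)            # candidate true concentrations
prior = np.ones_like(grid) / grid.size           # flat prior over the grid
expected = grid * recovery * (1.0 - inhibition)  # what qPCR would see on average
loglik = -0.5 * ((np.log(observed) - np.log(expected)) / sigma) ** 2
posterior = prior * np.exp(loglik - loglik.max())
posterior /= posterior.sum()

print("posterior mean true concentration:", (grid * posterior).sum())
```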
Lessons learned in preparing method 29 filters for compliance testing audits.
Martz, R F; McCartney, J E; Bursey, J T; Riley, C E
2000-01-01
Companies conducting compliance testing are required to analyze audit samples at the time they collect and analyze the stack samples if audit samples are available. Eastern Research Group (ERG) provides technical support to the EPA's Emission Measurements Center's Stationary Source Audit Program (SSAP) for developing, preparing, and distributing performance evaluation samples and audit materials. These audit samples are requested via the regulatory Agency and include spiked audit materials for EPA Method 29-Metals Emissions from Stationary Sources, as well as other methods. To provide appropriate audit materials to federal, state, tribal, and local governments, as well as agencies performing environmental activities and conducting emission compliance tests, ERG has recently performed testing of blank filter materials and preparation of spiked filters for EPA Method 29. For sampling stationary sources using an EPA Method 29 sampling train, the use of filters without organic binders containing less than 1.3 microg/in.2 of each of the metals to be measured is required. Risk Assessment testing imposes even stricter requirements for clean filter background levels. Three vendor sources of quartz fiber filters were evaluated for background contamination to ensure that audit samples would be prepared using filters with the lowest metal background levels. A procedure was developed to test new filters, and a cleaning procedure was evaluated to see if a greater level of cleanliness could be achieved using an acid rinse with new filters. Background levels for filters supplied by different vendors and within lots of filters from the same vendor showed a wide variation, confirmed through contact with several analytical laboratories that frequently perform EPA Method 29 analyses. It has been necessary to repeat more than one compliance test because of suspect metals background contamination levels. An acid cleaning step produced improvement in contamination level, but the difference was not significant for most of the Method 29 target metals. As a result of our studies, we conclude: Filters for Method 29 testing should be purchased in lots as large as possible. Testing firms should pre-screen new boxes and/or new lots of filters used for Method 29 testing. Random analysis of three filters (top, middle, bottom of the box) from a new box of vendor filters before allowing them to be used in field tests is a prudent approach. A box of filters from a given vendor should be screened, and filters from this screened box should be used both for testing and as field blanks in each test scenario to provide the level of quality assurance required for stationary source testing.
de Beer, Ingrid; Chani, Kudakwashe; Feeley, Frank G; Rinke de Wit, Tobias F; Sweeney-Bindels, Els; Mulongeni, Pancho
2015-01-01
Bophelo! is a mobile voluntary counseling and testing (VCT) and wellness screening program operated by PharmAccess at workplaces in Namibia, funded from both public and private resources. Publicly funded fixed site New Start centers provide similar services in Namibia. At the time of this study, no comparative information on the cost effectiveness of mobile versus fixed site service provision was available in Namibia to inform future programming for scale-up of VCT. The objectives of the study were to assess the costs of mobile VCT and wellness service delivery in Namibia and to compare the costs and effectiveness with fixed site VCT testing in Namibia. The full direct costs of all resources used by the mobile and fixed site testing programs and data on people tested and outcomes were obtained from PharmAccess and New Start centers in Namibia. Data were also collected on the source of funding, both public donor funding and private funding through contributions from employers. The data were analyzed using Microsoft Excel to determine the average cost per person tested for HIV. In 2009, the average cost per person tested for HIV at the Bophelo! mobile clinic was an estimated US$60.59 (US$310,451 for the 5124 people tested). Private employer contributions to the testing costs reduced the public cost per person tested to US$37.76. The incremental cost per person associated with testing for conditions other than HIV infection was US$11.35, an increase of 18.7%, consisting of the costs of additional tests (US$8.62) and staff time (US$2.73). The cost of testing one person for HIV in 2009 at the New Start centers was estimated at US$58.21 (US$4,082,936 for the 70,143 people tested). Mobile clinics can provide cost-effective wellness testing services at the workplace and have the potential to mobilize local private funding sources. Providing wellness testing in addition to VCT can help address the growing issue of non-communicable diseases.
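The per-person figures quoted above are straightforward unit costs; a quick check of the arithmetic using the totals from the abstract:

```python
def cost_per_person(total_cost_usd: float, people_tested: int) -> float:
    """Average cost per person tested."""
    return total_cost_usd / people_tested

print(round(cost_per_person(310_451, 5_124), 2))     # 60.59  (Bophelo! mobile program)
print(round(cost_per_person(4_082_936, 70_143), 2))  # 58.21  (New Start fixed sites)
```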
Earthquake Early Warning ShakeAlert System: Testing and certification platform
Cochran, Elizabeth S.; Kohler, Monica D.; Given, Douglas; Guiwits, Stephen; Andrews, Jennifer; Meier, Men-Andrin; Ahmad, Mohammad; Henson, Ivan; Hartog, Renate; Smith, Deborah
2017-01-01
Earthquake early warning systems provide warnings to end users of incoming moderate to strong ground shaking from earthquakes. An earthquake early warning system, ShakeAlert, is providing alerts to beta end users in the western United States, specifically California, Oregon, and Washington. An essential aspect of the earthquake early warning system is the development of a framework to test modifications to code to ensure functionality and assess performance. In 2016, a Testing and Certification Platform (TCP) was included in the development of the Production Prototype version of ShakeAlert. The purpose of the TCP is to evaluate the robustness of candidate code that is proposed for deployment on ShakeAlert Production Prototype servers. TCP consists of two main components: a real‐time in situ test that replicates the real‐time production system and an offline playback system to replay test suites. The real‐time tests of system performance assess code optimization and stability. The offline tests comprise a stress test of candidate code to assess if the code is production ready. The test suite includes over 120 events including local, regional, and teleseismic historic earthquakes, recentering and calibration events, and other anomalous and potentially problematic signals. Two assessments of alert performance are conducted. First, point‐source assessments are undertaken to compare magnitude, epicentral location, and origin time with the Advanced National Seismic System Comprehensive Catalog, as well as to evaluate alert latency. Second, we describe assessment of the quality of ground‐motion predictions at end‐user sites by comparing predicted shaking intensities to ShakeMaps for historic events and implement a threshold‐based approach that assesses how often end users initiate the appropriate action, based on their ground‐shaking threshold. TCP has been developed to be a convenient streamlined procedure for objectively testing algorithms, and it has been designed with flexibility to accommodate significant changes in development of new or modified system code. It is expected that the TCP will continue to evolve along with the ShakeAlert system, and the framework we describe here provides one example of how earthquake early warning systems can be evaluated.
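The point-source assessment described above compares each alert's magnitude, epicentral location, and origin time against the reference catalog. A minimal sketch of that kind of comparison, not the TCP code itself; the class, field names, and example numbers are assumptions:

```python
from dataclasses import dataclass
from datetime import datetime
from math import radians, sin, cos, asin, sqrt

@dataclass
class Solution:
    magnitude: float
    lat: float
    lon: float
    origin_time: datetime

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km, used here as the epicentral-location error."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def point_source_errors(alert: Solution, catalog: Solution) -> dict:
    """Differences between an early-warning solution and the reference catalog entry."""
    return {
        "magnitude_error": alert.magnitude - catalog.magnitude,
        "location_error_km": haversine_km(alert.lat, alert.lon, catalog.lat, catalog.lon),
        "origin_time_error_s": (alert.origin_time - catalog.origin_time).total_seconds(),
    }

# Hypothetical alert vs. catalog solutions
alert = Solution(5.8, 34.20, -118.55, datetime(2016, 1, 1, 4, 9, 42))
catalog = Solution(6.0, 34.13, -118.49, datetime(2016, 1, 1, 4, 9, 40))
print(point_source_errors(alert, catalog))
```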
Allium-test as a tool for toxicity testing of environmental radioactive-chemical mixtures
NASA Astrophysics Data System (ADS)
Oudalova, A. A.; Geras'kin, S. A.; Dikareva, N. S.; Pyatkova, S. V.
2017-01-01
Bioassay-based approaches have been promoted for assessing the toxicity of unknown mixtures of environmental contaminants, but they have rarely been applied to combinations of chemicals and radionuclides. Two Allium-test studies were performed to assess the environmental impact of potential sources of combined radioactive-chemical pollution. Study sites were located at nuclear waste storage facilities in the European and Far-Eastern parts of Russia. Waters from monitoring wells and nearby water bodies were tested as the impacted environmental media. Concentrations of some chemicals and radionuclides in the collected samples exceeded the permitted limits. Cytogenetic and cytotoxic effects were used as biological endpoints, namely, the frequency and spectrum of chromosome aberrations and mitotic abnormalities in ana-telophase cells as well as mitotic activity in Allium root tips. Sampling points where the waters have an enhanced mutagenic potential were identified. The findings obtained could be used to optimize the monitoring system and advance decision making on the management and rehabilitation of industrial sites. The Allium-test can be recommended and applied as an effective tool for toxicity testing in cases of combined contamination of environmental compartments with radionuclides and chemical compounds.
Multivariate assessment of event-related potentials with the t-CWT method.
Bostanov, Vladimir
2015-11-05
Event-related brain potentials (ERPs) are usually assessed with univariate statistical tests although they are essentially multivariate objects. Brain-computer interface applications are a notable exception to this practice, because they are based on multivariate classification of single-trial ERPs. Multivariate ERP assessment can be facilitated by feature extraction methods. One such method is t-CWT, a mathematical-statistical algorithm based on the continuous wavelet transform (CWT) and Student's t-test. This article begins with a geometric primer on some basic concepts of multivariate statistics as applied to ERP assessment in general and to the t-CWT method in particular. Further, it presents for the first time a detailed, step-by-step, formal mathematical description of the t-CWT algorithm. A new multivariate outlier rejection procedure based on principal component analysis in the frequency domain is presented as an important pre-processing step. The MATLAB and GNU Octave implementation of t-CWT is also made publicly available for the first time as free and open source code. The method is demonstrated on some example ERP data obtained in a passive oddball paradigm. Finally, some conceptually novel applications of the multivariate approach in general and of the t-CWT method in particular are suggested and discussed. Hopefully, the publication of both the t-CWT source code and its underlying mathematical algorithm along with a didactic geometric introduction to some basic concepts of multivariate statistics would make t-CWT more accessible to both users and developers in the field of neuroscience research.
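As a rough illustration of the t-CWT idea, and not the published MATLAB/GNU Octave implementation, the sketch below computes the continuous wavelet transform of single-trial ERPs with PyWavelets and runs a pointwise Student's t-test between two conditions; the Morlet wavelet, scale range, and random example data are arbitrary choices:

```python
import numpy as np
import pywt  # PyWavelets, assumed available
from scipy.stats import ttest_ind

def tcwt_features(trials_a, trials_b, scales=np.arange(1, 33), wavelet="morl"):
    """Sketch of the t-CWT idea: CWT every single-trial ERP, t-test each
    time-scale point between conditions, and locate the strongest effect."""
    def cwt_all(trials):
        # each trial: 1-D sample array; result shape (n_trials, n_scales, n_samples)
        return np.stack([pywt.cwt(tr, scales, wavelet)[0] for tr in trials])

    ca, cb = cwt_all(trials_a), cwt_all(trials_b)
    t, _ = ttest_ind(ca, cb, axis=0)                      # t-value per time-scale point
    peak = np.unravel_index(np.argmax(np.abs(t)), t.shape)
    return t, peak                                        # t-map and extremum location

rng = np.random.default_rng(1)
a = rng.normal(size=(20, 256))        # 20 trials x 256 samples, condition A
b = rng.normal(size=(20, 256)) + 0.1  # condition B with a small offset
t_map, peak = tcwt_features(a, b)
print(t_map.shape, peak)
```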
Field performance of Populus expressing somaclonal variation in resistance to Septoria musiva
M. E. Ostry; K. T. Ward
2003-01-01
Over 1500 trees from two hybrid poplar clones regenerated from tissue culture and expressing somatic variation in leaf disease resistance in a laboratory leaf disk bioassay were field-tested for 5-11 years to examine their resistance to Septoria leaf spot and canker and to assess their growth characteristics compared with the source clones....
Disentangling Dimensions in the Dimensional Change Card-Sorting Task
ERIC Educational Resources Information Center
Kloo, Daniela; Perner, Josef
2005-01-01
The dimensional change card-sorting task (DCCS task) is frequently used to assess young children's executive abilities. However, the source of children's difficulty with this task is still under debate. In the standard DCCS task, children have to sort, for example, test cards with a red cherry or a blue banana into two boxes marked with target…
ERIC Educational Resources Information Center
Appel, Randy; Wood, David
2016-01-01
The correct use of frequently occurring word combinations represents an important part of language proficiency in spoken and written discourse. This study investigates the use of English-language recurrent word combinations in low-level and high-level L2 English academic essays sourced from the Canadian Academic English Language (CAEL) assessment.…
Improving planting stock quality—the Humboldt experience
James L. Jenkinson; James A. Nelson
1993-01-01
A seedling testing program was developed to improve the survival and growth potential of planting stock produced in the USDA Forest Service Humboldt Nursery, situated on the Pacific Coast in northern California. Coastal and inland seed sources of Douglas-fir and eight other conifers in the Pacific Slope forests of western Oregon and northern California were assessed in...
ERIC Educational Resources Information Center
Walker, Scott L.; McNeal, Karen S.
2013-01-01
The Climate Stewardship Survey (CSS) was developed to measure knowledge and perceptions of global climate change, while also considering information sources that respondents 'trust.' The CSS was drafted using a three-stage approach: development of salient scales, writing individual items, and field testing and analyses. Construct validity and…
Valid and Reliable Science Content Assessments for Science Teachers
NASA Astrophysics Data System (ADS)
Tretter, Thomas R.; Brown, Sherri L.; Bush, William S.; Saderholm, Jon C.; Holmes, Vicki-Lynn
2013-03-01
Science teachers' content knowledge is an important influence on student learning, highlighting an ongoing need for programs, and assessments of those programs, designed to support teacher learning of science. Valid and reliable assessments of teacher science knowledge are needed for direct measurement of this crucial variable. This paper describes multiple sources of validity and reliability (Cronbach's alpha greater than 0.8) evidence for physical, life, and earth/space science assessments—part of the Diagnostic Teacher Assessments of Mathematics and Science (DTAMS) project. Validity was strengthened by systematic synthesis of relevant documents, extensive use of external reviewers, and field tests with 900 teachers during the assessment development process. Subsequent results from 4,400 teachers, analyzed with Rasch IRT modeling techniques, offer construct and concurrent validity evidence.
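Cronbach's alpha, the internal-consistency statistic reported above, can be computed directly from an item-score matrix. A minimal sketch with a small hypothetical score matrix (not DTAMS data):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical scores for 5 teachers on 4 dichotomous items
scores = np.array([[1, 1, 1, 0],
                   [1, 1, 0, 1],
                   [0, 0, 1, 0],
                   [1, 1, 1, 1],
                   [0, 0, 0, 0]], dtype=float)
print(round(cronbach_alpha(scores), 2))  # ~0.74
```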
Arts, Josje H E; Irfan, Muhammad-Adeel; Keene, Athena M; Kreiling, Reinhard; Lyon, Delina; Maier, Monika; Michel, Karin; Neubauer, Nicole; Petry, Thomas; Sauer, Ursula G; Warheit, David; Wiench, Karin; Wohlleben, Wendel; Landsiedel, Robert
2016-04-01
Case studies covering carbonaceous nanomaterials, metal oxide and metal sulphate nanomaterials, amorphous silica and organic pigments were performed to assess the Decision-making framework for the grouping and testing of nanomaterials (DF4nanoGrouping). The usefulness of the DF4nanoGrouping for nanomaterial hazard assessment was confirmed. In two tiers that rely exclusively on non-animal test methods followed by a third tier, if necessary, in which data from rat short-term inhalation studies are evaluated, nanomaterials are assigned to one of four main groups (MGs). The DF4nanoGrouping proved efficient in sorting out nanomaterials that could undergo hazard assessment without further testing. These are soluble nanomaterials (MG1) whose further hazard assessment should rely on read-across to the dissolved materials, high aspect-ratio nanomaterials (MG2) which could be assessed according to their potential fibre toxicity and passive nanomaterials (MG3) that only elicit effects under pulmonary overload conditions. Thereby, the DF4nanoGrouping allows identifying active nanomaterials (MG4) that merit in-depth investigations, and it provides a solid rationale for their sub-grouping to specify the further information needs. Finally, the evaluated case study materials may be used as source nanomaterials in future read-across applications. Overall, the DF4nanoGrouping is a hazard assessment strategy that strictly uses animals as a last resort. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Exploring REACH as a potential data source for characterizing ecotoxicity in life cycle assessment.
Müller, Nienke; de Zwart, Dick; Hauschild, Michael; Kijko, Gaël; Fantke, Peter
2017-02-01
Toxicity models in life cycle impact assessment (LCIA) currently only characterize a small fraction of marketed substances, mostly because of limitations in the underlying ecotoxicity data. One approach to improve the current data situation in LCIA is to identify new data sources, such as the European Registration, Evaluation, Authorisation, and Restriction of Chemicals (REACH) database. The present study explored REACH as a potential data source for LCIA based on matching reported ecotoxicity data for substances that are currently also included in the United Nations Environment Programme/Society for Environmental Toxicology and Chemistry (UNEP/SETAC) scientific consensus model USEtox for characterizing toxicity impacts. Data are evaluated with respect to number of data points, reported reliability, and test duration, and are compared with data listed in USEtox at the level of hazardous concentration for 50% of the covered species per substance. The results emphasize differences between data available via REACH and in USEtox. The comparison of ecotoxicity data from REACH and USEtox shows potential for using REACH ecotoxicity data in LCIA toxicity characterization, but also highlights issues related to compliance of submitted data with REACH requirements as well as different assumptions underlying regulatory risk assessment under REACH versus data needed for LCIA. Thus, further research is required to address data quality, pre-processing, and applicability, before considering data submitted under REACH as a data source for use in LCIA, and also to explore additionally available data sources, published studies, and reports. Environ Toxicol Chem 2017;36:492-500. © 2016 SETAC.
Software development infrastructure for the HYBRID modeling and simulation project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epiney, Aaron S.; Kinoshita, Robert A.; Kim, Jong Suk
One of the goals of the HYBRID modeling and simulation project is to assess the economic viability of hybrid systems in a market that contains renewable energy sources like wind. The idea is that it is possible for the nuclear plant to sell non-electric energy cushions, which absorb (at least partially) the volatility introduced by the renewable energy sources. This system is currently modeled in the Modelica programming language. To assess the economics of the system, an optimization procedure is trying to find the minimal cost of electricity production. The RAVEN code is used as a driver for the whole problem. It is assumed that at this stage, the HYBRID modeling and simulation framework can be classified as non-safety “research and development” software. The associated quality level is Quality Level 3 software. This imposes low requirements on quality control, testing and documentation. The quality level could change as the application development continues. Despite the low quality requirement level, a workflow for the HYBRID developers has been defined that includes a coding standard and some documentation and testing requirements. The repository performs automated unit testing of contributed models. The automated testing is achieved via an open-source Python script called BuildingsPy from Lawrence Berkeley National Lab. BuildingsPy runs Modelica simulation tests using Dymola in an automated manner and generates and runs unit tests from Modelica scripts written by developers. In order to assure effective communication between the different national laboratories, a biweekly videoconference has been set up, where developers can report their progress and issues. In addition, periodic face-to-face meetings are organized to discuss high-level strategy decisions with management. A second means of communication is the developer email list. This is a list to which everybody can send emails that will be received by the collective of the developers and managers involved in the project. Thirdly, to exchange documents quickly, a SharePoint directory has been set up. SharePoint allows teams and organizations to intelligently share and collaborate on content from anywhere.
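A minimal sketch of the kind of automated regression-test driver described above, assuming the BuildingsPy Tester API (buildingspy.development.regressiontest); exact method names and options vary between BuildingsPy versions and are not taken from the text:

```python
#!/usr/bin/env python3
"""Sketch: run the Modelica unit tests for a library checkout with BuildingsPy,
which drives Dymola and compares results against stored reference data."""
from buildingspy.development import regressiontest

def run_unit_tests() -> int:
    tester = regressiontest.Tester()  # discovers the library's .mos test scripts
    return tester.run()               # runs the simulations and returns a status code

if __name__ == "__main__":
    raise SystemExit(run_unit_tests())
```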
Reproducibility of Interferon Gamma (IFN-γ) Release Assays. A Systematic Review
Tagmouti, Saloua; Slater, Madeline; Benedetti, Andrea; Kik, Sandra V.; Banaei, Niaz; Cattamanchi, Adithya; Metcalfe, John; Dowdy, David; van Zyl Smit, Richard; Dendukuri, Nandini
2014-01-01
Rationale: Interferon gamma (IFN-γ) release assays for latent tuberculosis infection result in a larger-than-expected number of conversions and reversions in occupational screening programs, and reproducibility of test results is a concern. Objectives: Knowledge of the relative contribution and extent of the individual sources of variability (immunological, preanalytical, or analytical) could help optimize testing protocols. Methods: We performed a systematic review of studies published by October 2013 on all potential sources of variability of commercial IFN-γ release assays (QuantiFERON-TB Gold In-Tube and T-SPOT.TB). The included studies assessed test variability under identical conditions and under different conditions (the latter both overall and stratified by individual sources of variability). Linear mixed effects models were used to estimate within-subject SD. Measurements and Main Results: We identified a total of 26 articles, including 7 studies analyzing variability under the same conditions, 10 studies analyzing variability with repeat testing over time under different conditions, and 19 studies reporting individual sources of variability. Most data were on QuantiFERON (only three studies on T-SPOT.TB). A considerable number of conversions and reversions were seen around the manufacturer-recommended cut-point. The estimated range of variability of IFN-γ response in QuantiFERON under identical conditions was ±0.47 IU/ml (coefficient of variation, 13%) and ±0.26 IU/ml (30%) for individuals with an initial IFN-γ response in the borderline range (0.25–0.80 IU/ml). The estimated range of variability in noncontrolled settings was substantially larger (±1.4 IU/ml; 60%). Blood volume inoculated into QuantiFERON tubes and preanalytic delay were identified as key sources of variability. Conclusions: This systematic review shows substantial variability with repeat IFN-γ release assays testing even under identical conditions, suggesting that reversions and conversions around the existing cut-point should be interpreted with caution. PMID:25188809
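The within-subject variability estimates above come from linear mixed-effects models; a simpler moment-based sketch of a pooled within-subject SD and coefficient of variation from repeat QuantiFERON results is shown below. The IU/ml values are hypothetical, and this is not the review's estimation method:

```python
import numpy as np

def within_subject_cv(repeats: np.ndarray):
    """Pooled within-subject SD and coefficient of variation (%) from a
    (subjects x repeat tests) matrix of IFN-gamma responses in IU/ml."""
    dev = repeats - repeats.mean(axis=1, keepdims=True)
    n, k = repeats.shape
    sw = np.sqrt((dev ** 2).sum() / (n * (k - 1)))  # pooled within-subject SD
    cv = 100.0 * sw / repeats.mean()
    return sw, cv

# Hypothetical duplicate QuantiFERON results (IU/ml) for four subjects
data = np.array([[0.35, 0.52], [0.80, 0.61], [1.90, 2.30], [0.10, 0.05]])
sw, cv = within_subject_cv(data)
print(f"within-subject SD = {sw:.2f} IU/ml, CV = {cv:.0f}%")
```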
Environmental risk assessments for transgenic crops producing output trait enzymes
Tuttle, Ann; Shore, Scott; Stone, Terry
2009-01-01
The environmental risks from cultivating crops producing output trait enzymes can be rigorously assessed by testing conservative risk hypotheses of no harm to endpoints such as the abundance of wildlife, crop yield and the rate of degradation of crop residues in soil. These hypotheses can be tested with data from many sources, including evaluations of the agronomic performance and nutritional quality of the crop made during product development, and information from the scientific literature on the mode-of-action, taxonomic distribution and environmental fate of the enzyme. Few, if any, specific ecotoxicology or environmental fate studies are needed. The effective use of existing data means that regulatory decision-making, to which an environmental risk assessment provides essential information, is not unnecessarily complicated by evaluation of large amounts of new data that provide negligible improvement in the characterization of risk, and that may delay environmental benefits offered by transgenic crops containing output trait enzymes. PMID:19924556
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barbara Luke, Director, UNLV Engineering Geophysics Laboratory
2007-04-25
The objectives of this project are to improve understanding of the earthquake hazard in the Las Vegas Valley and to assess the state of preparedness of the area's population and structures for the next big earthquake: 1. Enhance the seismic monitoring network in the Las Vegas Valley; 2. Improve understanding of deep basin structure through active-source seismic refraction and reflection testing; 3. Improve understanding of the dynamic response of shallow sediments through seismic testing and correlations with lithology; 4. Develop credible earthquake scenarios by laboratory and field studies, literature review and analyses; 5. Refine ground motion expectations around the Las Vegas Valley through simulations; 6. Assess current building standards in light of improved understanding of hazards; 7. Perform risk assessment for structures and infrastructures, with emphasis on lifelines and critical structures; 8. Encourage and facilitate broad and open technical interchange regarding earthquake safety in southern Nevada and efforts to inform citizens of earthquake hazards and mitigation opportunities.
Vaccarino, Anthony L; Anonymous; Anderson, Karen E.; Borowsky, Beth; Coccaro, Emil; Craufurd, David; Endicott, Jean; Giuliano, Joseph; Groves, Mark; Guttman, Mark; Ho, Aileen K; Kupchak, Peter; Paulsen, Jane S.; Stanford, Matthew S.; van Kammen, Daniel P; Watson, David; Wu, Kevin D; Evans, Ken
2011-01-01
The Functional Rating Scale Taskforce for pre-Huntington Disease (FuRST-pHD) is a multinational, multidisciplinary initiative with the goal of developing a data-driven, comprehensive, psychometrically sound, rating scale for assessing symptoms and functional ability in prodromal and early Huntington disease (HD) gene expansion carriers. The process involves input from numerous sources to identify relevant symptom domains, including HD individuals, caregivers, and experts from a variety of fields, as well as knowledge gained from the analysis of data from ongoing large-scale studies in HD using existing clinical scales. This is an iterative process in which an ongoing series of field tests in prodromal (prHD) and early HD individuals provides the team with data on which to make decisions regarding which questions should undergo further development or testing and which should be excluded. We report here the development and assessment of the first iteration of interview questions aimed to assess "Anger and Irritability" and "Obsessions and Compulsions" in prHD individuals. PMID:21826116
Reliability and Validity Evidence of Multiple Balance Assessments in Athletes With a Concussion
Murray, Nicholas; Salvatore, Anthony; Powell, Douglas; Reed-Jones, Rebecca
2014-01-01
Context: An estimated 300,000 sport-related concussion injuries occur in the United States annually. Approximately 30% of individuals with concussions experience balance disturbances. Common methods of balance assessment include the Clinical Test of Sensory Organization and Balance (CTSIB), the Sensory Organization Test (SOT), the Balance Error Scoring System (BESS), and the Romberg test; however, the National Collegiate Athletic Association recommended the Wii Fit as an alternative measure of balance in athletes with a concussion. A central concern regarding the implementation of the Wii Fit is whether it is reliable and valid for measuring balance disturbance in athletes with concussion. Objective: To examine the reliability and validity evidence for the CTSIB, SOT, BESS, Romberg test, and Wii Fit for detecting balance disturbance in athletes with a concussion. Data Sources: Literature considered for review included publications with reliability and validity data for the assessments of balance (CTSIB, SOT, BESS, Romberg test, and Wii Fit) from PubMed, PsycINFO, and CINAHL. Data Extraction: We identified 63 relevant articles for consideration in the review. Of the 63 articles, 28 were considered appropriate for inclusion and 35 were excluded. Data Synthesis: No current reliability or validity information supports the use of the CTSIB, SOT, Romberg test, or Wii Fit for balance assessment in athletes with a concussion. The BESS demonstrated moderate to high reliability (intraclass correlation coefficient = 0.87) and low to moderate validity (sensitivity = 34%, specificity = 87%). However, the Romberg test and Wii Fit have been shown to be reliable tools in the assessment of balance in Parkinson patients. Conclusions: The BESS can evaluate balance problems after a concussion. However, it lacks the ability to detect balance problems after the third day of recovery. Further investigation is needed to establish the use of the CTSIB, SOT, Romberg test, and Wii Fit for assessing balance in athletes with concussions. PMID:24933431
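The BESS validity figures quoted above are ordinary sensitivity and specificity values. A minimal sketch, using hypothetical confusion-matrix counts chosen only to reproduce the quoted 34%/87%:

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts matching the reported BESS performance
sens, spec = sensitivity_specificity(tp=34, fn=66, tn=87, fp=13)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # 34%, 87%
```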
Hens, Koen; Berth, Mario; Armbruster, Dave; Westgard, Sten
2014-07-01
Six Sigma metrics were used to assess the analytical quality of automated clinical chemistry and immunoassay tests in a large Belgian clinical laboratory and to explore the importance of the source used for estimation of the allowable total error. Clinical laboratories are continually challenged to maintain analytical quality. However, it is difficult to measure assay quality objectively and quantitatively. The Sigma metric is a single number that estimates quality based on the traditional parameters used in the clinical laboratory: allowable total error (TEa), precision and bias. In this study, Sigma metrics were calculated for 41 clinical chemistry assays for serum and urine on five ARCHITECT c16000 chemistry analyzers. Controls at two analyte concentrations were tested and Sigma metrics were calculated using three different TEa targets (Ricos biological variability, CLIA, and RiliBÄK). Sigma metrics varied with analyte concentration, the TEa target, and between/among analyzers. Sigma values identified those assays that are analytically robust and require minimal quality control rules and those that exhibit more variability and require more complex rules. The analyzer to analyzer variability was assessed on the basis of Sigma metrics. Six Sigma is a more efficient way to control quality, but the lack of TEa targets for many analytes and the sometimes inconsistent TEa targets from different sources are important variables for the interpretation and the application of Sigma metrics in a routine clinical laboratory. Sigma metrics are a valuable means of comparing the analytical quality of two or more analyzers to ensure the comparability of patient test results.
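The Sigma metric combines the three quantities named above; the conventional formula is Sigma = (TEa - |bias|) / CV, with all terms expressed as percentages at a given control concentration. A minimal sketch with hypothetical assay figures, not values from the study:

```python
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Conventional Sigma metric: (allowable total error - |bias|) / imprecision,
    all expressed as percentages at the same control level."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical assay: TEa 10%, bias 2%, CV 1.5%
print(round(sigma_metric(10.0, 2.0, 1.5), 1))  # 5.3
```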
Mustafa, Reem A; Wiercioch, Wojtek; Arevalo-Rodriguez, Ingrid; Cheung, Adrienne; Prediger, Barbara; Ivanova, Liudmila; Ventresca, Matthew; Brozek, Jan; Santesso, Nancy; Bossuyt, Patrick; Garg, Amit X; Lloyd, Nancy; Lelgemann, Monika; Bühler, Diedrich; Schünemann, Holger J
2017-12-01
The objective of the study was to describe and compare current practices in developing guidelines about the use of healthcare-related tests and diagnostic strategies (HCTDS). We sampled 37 public health and clinical practice guidelines about HCTDS from various sources without language restrictions. Detailed descriptions of the systems used to assess the quality of evidence and develop recommendations were challenging to find within guidelines. We observed much variability among and within organizations with respect to how they develop recommendations about HCTDS. Twenty-four percent of the guidelines did not consider health benefits and harms but based decisions solely on test accuracy. We did not identify guidelines that described the main potential care pathways involving tests for a healthcare problem. In addition, we did not identify guidelines that systematically assessed, described, and referenced the evidence that linked test accuracy and patient-important outcomes. There is considerable variability among the processes used and factors considered in developing recommendations about the use of tests. This variability may be the cause for the disagreement we observed in recommendations about testing for the same condition. Copyright © 2017 Elsevier Inc. All rights reserved.
Aerodynamic Performance of Scale-Model Turbofan Outlet Guide Vanes Designed for Low Noise
NASA Technical Reports Server (NTRS)
Hughes, Christopher E.
2001-01-01
The design of effective new technologies to reduce aircraft propulsion noise is dependent on an understanding of the noise sources and noise generation mechanisms in the modern turbofan engine. In order to more fully understand the physics of noise in a turbofan engine, a comprehensive aeroacoustic wind tunnel test program, called the 'Source Diagnostic Test,' was conducted. The test was a cooperative effort between NASA and General Electric Aircraft Engines, as part of the NASA Advanced Subsonic Technology Noise Reduction Program. A 1/5-scale model simulator representing the bypass stage of a current technology high bypass ratio turbofan engine was used in the test. The test article consisted of the bypass fan and outlet guide vanes in a flight-type nacelle. The fan used was a medium pressure ratio design with 22 individual, wide chord blades. Three outlet guide vane design configurations were investigated, representing a 54-vane radial Baseline configuration, a 26-vane radial, wide chord Low Count configuration and a 26-vane, wide chord Low Noise configuration with 30 deg of aft sweep. The test was conducted in the NASA Glenn Research Center 9 by 15-Foot Low Speed Wind Tunnel at velocities simulating the takeoff and approach phases of the aircraft flight envelope. The Source Diagnostic Test had several acoustic and aerodynamic technical objectives: (1) establish the performance of a scale model fan selected to represent the current technology turbofan product; (2) assess the performance of the fan stage with each of the three distinct outlet guide vane designs; (3) determine the effect of the outlet guide vane configuration on the fan baseline performance; and (4) conduct detailed flowfield diagnostic surveys, both acoustic and aerodynamic, to characterize and understand the noise generation mechanisms in a turbofan engine. This paper addresses the fan and stage aerodynamic performance results from the Source Diagnostic Test.
Hair analysis for the detection of drug use-is there potential for evasion?
Marrinan, Shanna; Roman-Urrestarazu, Andres; Naughton, Declan; Levari, Emerlinda; Collins, John; Chilcott, Robert; Bersani, Giuseppe; Corazza, Ornella
2017-05-01
Hair analysis for illicit substances is widely used to detect chronic drug consumption or abstention from drugs. Testees are increasingly seeking ways to avoid detection by using a variety of untested adulterant products (e.g., shampoos, cleansers) widely sold online. This study aims to investigate adulteration of hair samples and to assess the effectiveness of such methods. The literature on hair test evasion was searched for on PubMed or MEDLINE, Psycinfo, and Google Scholar. Given the sparse nature of peer-reviewed data on this subject, results were integrated with a qualitative assessment of online sources, including user-orientated information or commercial websites, drug fora and "chat rooms". Over four million web sources were identified in a Google search using "beat hair drug test", and the first 86 were monitored on a regular basis and considered for further analysis. Attempts to influence hair test results are widespread. Various "shampoos", "cleansers", and other products that claim to remove analytes were found for sale, often advertised with aggressive marketing strategies that include discounts, testimonials, and unsupported claims of efficacy. However, these products may pose serious health hazards and are also potentially toxic. In addition, many anecdotal reports suggest that Novel Psychoactive Substances are also consumed as an evasion technique, as these are not easily detectable via standard drug tests. Recent changes to Novel Psychoactive Substances legislation, such as the New Psychoactive Bill in the UK, might further challenge the testing process. Further research is needed by way of chemical analysis and trials of the adulterant products sold online and their effects, as well as the development of more sophisticated hair testing techniques. Copyright © 2017 John Wiley & Sons, Ltd.
The compressed average image intensity metric for stereoscopic video quality assessment
NASA Astrophysics Data System (ADS)
Wilczewski, Grzegorz
2016-09-01
The following article presents the design, creation and testing of a new metric for 3DTV video quality evaluation. The Compressed Average Image Intensity (CAII) mechanism is based upon stereoscopic video content analysis and is intended to serve as a versatile tool for effective 3DTV service quality assessment. As an objective quality metric, it may be utilized as a reliable source of information about the actual performance of a given 3DTV system under provider evaluation. Concerning testing and the overall performance analysis of the CAII metric, the paper presents a comprehensive study of results gathered across several testing routines on a selected set of stereoscopic video samples. As a result, the designed method for stereoscopic video quality evaluation is investigated across a range of synthetic visual impairments injected into the original video stream.
Assessing the Links Between Anthropometrics Data and Akabane Test Results.
Muzhikov, Valery; Vershinina, Elena; Belenky, Vadim; Muzhikov, Ruslan
2018-02-01
According to popular belief, metabolic disorders and imbalances are one of the main factors contributing to various human illnesses. Early diagnosis of these disorders is one of the main methods for preventing serious diseases. The goal of this study was to assess the correlations between main physical indicators and the activity of certain acupuncture channels using the thermal Akabane test, which is based on ancient Chinese diagnostic methods. This test measures the pain threshold for temperature sensitivity when a point source of heat is applied to the "entrance-exit" points of each channel. Skin temperature sensitivity is a basic reactive system of the body; it is as significant as other important indicators such as body temperature and provides a very clear representation of functional and psychophysiological profiles. On the basis of our statistical study, we revealed a reliable correspondence between the activity of certain acupuncture channels and the main anthropometric and biometric data. Copyright © 2018. Published by Elsevier B.V.
Selecting surrogate endpoints for estimating pesticide effects on avian reproductive success.
Bennett, Richard S; Etterson, Matthew A
2013-10-01
A Markov chain nest productivity model (MCnest) has been developed for projecting the effects of a specific pesticide-use scenario on the annual reproductive success of avian species of concern. A critical element in MCnest is the use of surrogate endpoints, defined as measured endpoints from avian toxicity tests that represent specific types of effects possible in field populations at specific phases of a nesting attempt. In this article, we discuss the attributes of surrogate endpoints and provide guidance for selecting surrogates from existing avian laboratory tests as well as other possible sources. We also discuss some of the assumptions and uncertainties related to using surrogate endpoints to represent field effects. The process of explicitly considering how toxicity test results can be used to assess effects in the field helps identify uncertainties and data gaps that could be targeted in higher-tier risk assessments. © 2013 SETAC.
Memory, metamemory, and social cues: Between conformity and resistance.
Zawadzka, Katarzyna; Krogulska, Aleksandra; Button, Roberta; Higham, Philip A; Hanczakowski, Maciej
2016-02-01
When presented with responses of another person, people incorporate these responses into memory reports: a finding termed memory conformity. Research on memory conformity in recognition reveals that people rely on external social cues to guide their memory responses when their own ability to respond is at chance. In this way, conforming to a reliable source boosts recognition performance but conforming to a random source does not impair it. In the present study we assessed whether people would conform indiscriminately to reliable and unreliable (random) sources when they are given the opportunity to exercise metamemory control over their responding by withholding answers in a recognition test. In Experiments 1 and 2, we found the pattern of memory conformity to reliable and unreliable sources in 2 variants of a free-report recognition test, yet at the same time the provision of external cues did not affect the rate of response withholding. In Experiment 3, we provided participants with initial feedback on their recognition decisions, facilitating the discrimination between the reliable and unreliable source. This led to the reduction of memory conformity to the unreliable source, and at the same time modulated metamemory decisions concerning response withholding: participants displayed metamemory conformity to the reliable source, volunteering more responses in their memory report, and metamemory resistance to the random source, withholding more responses from the memory report. Together, the results show how metamemory decisions dissociate various types of memory conformity and that memory and metamemory decisions can be independent of each other. PsycINFO Database Record (c) 2016 APA, all rights reserved.
NEXT GENERATION LEACHING TESTS FOR EVALUATING ...
In the U.S. as in other countries, there is increased interest in using industrial by-products as alternative or secondary materials, helping to conserve virgin or raw materials. The LEAF and associated test methods are being used to develop the source term for leaching of any inorganic constituents of potential concern (COPC) in determining what is environmentally acceptable. The leaching test methods include batch equilibrium, percolation column and semi-dynamic mass transport tests for monolithic and compacted granular materials. By testing over a range of values for pH, liquid/solid ratio, and physical form of the material, this approach allows one data set to be used to evaluate a range of management scenarios for a material, representing different environmental conditions (e.g., disposal or beneficial use). The results from these tests may be interpreted individually or integrated to identify a solid material’s characteristic leaching behavior. Furthermore, the LEAF approach provides the ability to make meaningful comparisons of leaching between similar and dissimilar materials from national and worldwide origins. The purpose is to present EPA's research under SHC to implement validated leaching tests, referred to as the Leaching Environmental Assessment Framework (LEAF). The primary focus will be on the guidance for implementation of LEAF, describing three case studies for developing source terms for evaluating inorganic constituents.
Merrill, Rebecca D.; Shamim, Abu Ahmed; Ali, Hasmot; Schulze, Kerry; Rashid, Mahbubur; Christian, Parul; West, Jr., Keith P.
2009-01-01
Iron is ubiquitous in natural water sources used around the world for drinking and cooking. The health impact of chronic exposure to iron through water, which in groundwater sources can reach well above the World Health Organization's defined aesthetic limit of 0.3 mg/L, is not currently understood. To quantify the impact of consumption of iron in groundwater on nutritional status, it is important to accurately assess naturally-occurring exposure levels among populations. In this study, the validity of iron quantification in water was evaluated using two portable instruments: the HACH DR/890 portable colorimeter (colorimeter) and HACH Iron test-kit, Model IR-18B (test-kit), by comparing field-based iron estimates for 25 tubewells located in northwestern Bangladesh with gold standard atomic absorption spectrophotometry analysis. Results of the study suggest that the HACH test-kit delivers more accurate point-of-use results across a wide range of iron concentrations under challenging field conditions. PMID:19507757
Cody, John T
2002-05-01
Medical Review Officer interpretation of laboratory results is an important component of drug testing programs. The clinical evaluation of laboratory results to assess the possibility of appropriate medical use of a drug is a task with many different facets, depending on the drug class considered. This intercession prevents the reporting of positive results unless it is apparent that drugs were used illicitly. In addition to the commonly encountered prescribed drugs that yield positive drug testing results, other sources of positive results must be considered. This review describes a series of compounds referred to as "precursor" drugs that are metabolized by the body to amphetamine and/or methamphetamine. These compounds lead to positive results for amphetamines even though neither amphetamine nor methamphetamine were used, a possibility that must be considered in the review of laboratory results. Description of the drugs, their clinical indications, and results seen following administration are provided. This information allows for the informed evaluation of results with regard to the potential involvement of these drugs.
The constructive role of gender asymmetry in social interaction: further evidence.
Psaltis, Charis
2011-06-01
Two hundred and sixty-four children aged 6.5-7.5 years (first graders) took part in a pre-test, interaction, and post-test experiment working on a spatial transformation task known as the 'village task'. Cognitive progress was assessed by pre- to post-test gains in both an immediate and delayed post-test in dyads and individual participants as a control. The results indicate clear links between particular pair types with both communication processes and with learning and cognitive developmental outcomes. The present study demonstrates that gender can act as a source of status asymmetry in peer interaction to influence communication, learning, and cognitive development in same- and mixed-sex dyads.
Risk Evaluation of Business Continuity Management by Using Green Technology
NASA Astrophysics Data System (ADS)
Gang, Chen
IT disasters can be seen as a test of the ability of communities and firms to effectively protect their information and infrastructure, to reduce both human and property loss, and to rapidly recover. In this paper, we use a literature meta-analysis method to identify potential research directions in Green Business Continuity Management (GBCM). The concept and characteristics of GBCM are discussed. We analyze the connotation and the sources of green technology risk. An assessment index system is established from the perspective of GBCM. A fuzzy comprehensive assessment method is introduced to assess the risks of green technology in Business Continuity Management.
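A fuzzy comprehensive assessment composes an index-weight vector with a membership matrix over risk grades. A minimal sketch; the weights, grade labels, and membership degrees below are hypothetical and are not taken from the paper:

```python
import numpy as np

weights = np.array([0.4, 0.35, 0.25])  # e.g., technology, process, and data risk indices

# Membership of each index in the risk grades (low, medium, high)
membership = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.5, 0.3],
    [0.1, 0.4, 0.5],
])

# Weighted-average composition operator B = W . R, then normalisation
b = weights @ membership
b /= b.sum()

grades = ["low", "medium", "high"]
print({g: round(float(v), 2) for g, v in zip(grades, b)},
      "->", grades[int(b.argmax())], "risk")
```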
Benchmarking Controlled Trial—a novel concept covering all observational effectiveness studies
Malmivaara, Antti
2015-01-01
The Benchmarking Controlled Trial (BCT) is a novel concept which covers all observational studies aiming to assess effectiveness. BCTs provide evidence of the comparative effectiveness between health service providers, and of effectiveness due to particular features of the health and social care systems. BCTs complement randomized controlled trials (RCTs) as the sources of evidence on effectiveness. This paper presents a definition of the BCT; compares the position of BCTs in assessing effectiveness with that of RCTs; presents a checklist for assessing methodological validity of a BCT; and pilot-tests the checklist with BCTs published recently in the leading medical journals. PMID:25965700
Scenario Based Approach for Multiple Source Tsunami Hazard Assessment for Sines, Portugal
NASA Astrophysics Data System (ADS)
Wronna, Martin; Omira, Rachid; Baptista, Maria Ana
2015-04-01
In this paper, we present a scenario-based approach for tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE. Sines holds one of the most important deep-water ports, which contains oil-bearing, petrochemical, liquid bulk, coal and container terminals. The port and its industrial infrastructures face the ocean to the southwest, toward the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, a total of five scenarios were selected to assess tsunami impact at the test site. These scenarios correspond to the worst-case credible scenario approach based upon the largest events of the historical and paleo tsunami catalogues. The tsunami simulations from the source area towards the coast are carried out using NSWING, a Non-linear Shallow Water Model With Nested Grids. The code solves the non-linear shallow water equations using an explicit leap-frog finite difference scheme, in a Cartesian or spherical frame. The initial sea surface displacement is assumed to be equal to the sea bottom deformation that is computed by the Okada equations. Both uniform and non-uniform slip conditions are used. The presented results correspond to the models using non-uniform slip conditions. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level) and MHHW (mean higher high water). For each scenario, inundation is described by maximum values of wave height, flow depth, drawdown, run-up and inundation distance. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results consist of Aggregate Scenario Maps presented for the different inundation parameters. This work is funded by ASTARTE - Assessment, Strategy And Risk Reduction for Tsunamis in Europe - FP7-ENV2013 6.4-3, Grant 603839.
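NSWING itself is not reproduced here; as a toy illustration of the explicit leap-frog, staggered-grid approach mentioned above, the sketch below advances the 1-D linear shallow-water equations for an initial sea-surface hump. The grid spacing, depth, and initial condition are arbitrary assumptions, and the real code solves the non-linear equations on nested 2-D grids:

```python
import numpy as np

g, depth = 9.81, 4000.0             # gravity (m/s^2), uniform ocean depth (m)
nx, dx = 400, 2000.0                # grid points and spacing (m)
dt = 0.5 * dx / np.sqrt(g * depth)  # time step satisfying the CFL condition

# Gaussian initial sea-surface elevation (stand-in for a co-seismic displacement)
eta = np.exp(-((np.arange(nx) * dx - 100e3) / 20e3) ** 2)
u = np.zeros(nx + 1)                # velocities on a staggered grid (walls at both ends)

for _ in range(500):
    # momentum then continuity, leap-frogged on the staggered grid
    u[1:-1] -= g * dt / dx * (eta[1:] - eta[:-1])
    eta -= depth * dt / dx * (u[1:] - u[:-1])

print("max offshore amplitude after 500 steps:", eta.max())
```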
NASA Astrophysics Data System (ADS)
Tang, L.; Titov, V. V.; Chamberlin, C. D.
2009-12-01
The study describes the development, testing and applications of site-specific tsunami inundation models (forecast models) for use in NOAA's tsunami forecast and warning system. The model development process includes sensitivity studies of tsunami wave characteristics in the nearshore and inundation, for a range of model grid setups, resolutions and parameters. To demonstrate the process, four forecast models in Hawaii, at Hilo, Kahului, Honolulu, and Nawiliwili are described. The models were validated with fourteen historical tsunamis and compared with numerical results from reference inundation models of higher resolution. The accuracy of the modeled maximum wave height is greater than 80% when the observation is greater than 0.5 m; when the observation is below 0.5 m the error is less than 0.3 m. The error of the modeled arrival time of the first peak is within 3% of the travel time. The developed forecast models were further applied to hazard assessment from simulated magnitude 7.5, 8.2, 8.7 and 9.3 tsunamis based on subduction zone earthquakes in the Pacific. The tsunami hazard assessment study indicates that use of a seismic magnitude alone for a tsunami source assessment is inadequate to achieve such accuracy for tsunami amplitude forecasts. The forecast models apply local bathymetric and topographic information, and utilize dynamic boundary conditions from the tsunami source function database, to provide site- and event-specific coastal predictions. Only by combining a Deep-ocean Assessment and Reporting of Tsunami-constrained tsunami magnitude with site-specific high-resolution models can the forecasts completely cover the evolution of earthquake-generated tsunami waves: generation, deep ocean propagation, and coastal inundation. Wavelet analysis of the tsunami waves suggests the coastal tsunami frequency responses at different sites are dominated by the local bathymetry, yet they can be partially related to the locations of the tsunami sources. The study also demonstrates the nonlinearity between offshore and nearshore maximum wave amplitudes.
The role of individual differences in the accuracy of confidence judgments.
Pallier, Gerry; Wilkinson, Rebecca; Danthiir, Vanessa; Kleitman, Sabina; Knezevic, Goran; Stankov, Lazar; Roberts, Richard D
2002-07-01
Generally, self-assessment of accuracy in the cognitive domain produces overconfidence, whereas self-assessment of visual perceptual judgments results in underconfidence. Despite contrary empirical evidence, individual differences have often been disregarded in models attempting to explain those phenomena. The authors report on 2 studies in which that shortcoming was addressed. In Experiment 1, participants (N = 520) completed a large number of cognitive-ability tests. Results indicated that individual differences provide a meaningful source of overconfidence and that a metacognitive trait might mediate that effect. In further analysis, there was only a relatively small correlation between test accuracy and confidence bias. In Experiment 2 (N = 107 participants), both perceptual and cognitive ability tests were included, along with measures of personality. Results again indicated the presence of a confidence factor that transcended the nature of the testing vehicle. Furthermore, a small relationship was found between that factor and some self-reported personality measures. Thus, personality traits and cognitive ability appeared to play only a small role in determining the accuracy of self-assessment. Collectively, the present results suggest that there are multiple causes of miscalibration, which current models of over- and underconfidence fail to encompass.
NASA Astrophysics Data System (ADS)
Marcinkevics, Zbignevs; Rubins, Uldis; Zaharans, Janis; Miscuks, Aleksejs; Urtane, Evelina; Ozolina-Moll, Liga
2016-03-01
The feasibility of bispectral imaging photoplethysmography (iPPG) system for clinical assessment of cutaneous microcirculation at two different depths is proposed. The iPPG system has been developed and evaluated for in vivo conditions during various tests: (1) topical application of vasodilatory liniment on the skin, (2) skin local heating, (3) arterial occlusion, and (4) regional anesthesia. The device has been validated by the measurements of a laser Doppler imager (LDI) as a reference. The hardware comprises four bispectral light sources (530 and 810 nm) for uniform illumination of skin, video camera, and the control unit for triggering of the system. The PPG signals were calculated and the changes of perfusion index (PI) were obtained during the tests. The results showed convincing correlations for PI obtained by iPPG and LDI at (1) topical liniment (r=0.98) and (2) heating (r=0.98) tests. The topical liniment and local heating tests revealed good selectivity of the system for superficial microcirculation monitoring. It is confirmed that the iPPG system could be used for assessment of cutaneous perfusion at two different depths, morphologically and functionally different vascular networks, and thus utilized in clinics as a cost-effective alternative to the LDI.
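For context, a perfusion index is commonly computed as the ratio of the pulsatile (AC) to the steady (DC) component of the PPG signal and then compared against the LDI reference with a Pearson correlation. The Python sketch below uses synthetic signals and a simple AC/DC convention, which is an assumption rather than the authors' exact processing:

import numpy as np

rng = np.random.default_rng(0)
fs = 100.0                       # sampling rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)
# Synthetic PPG: slowly varying DC level plus a 1.2 Hz pulsatile AC component
dc = 1.0 + 0.05 * np.sin(2 * np.pi * 0.02 * t)
ac_amp = 0.02 * (1 + 0.5 * np.sin(2 * np.pi * 0.01 * t))
ppg = dc + ac_amp * np.sin(2 * np.pi * 1.2 * t) + 0.002 * rng.standard_normal(t.size)

win = int(5 * fs)                # 5-second analysis windows
pi = []
for i in range(0, ppg.size - win, win):
    seg = ppg[i:i + win]
    pi.append((seg.max() - seg.min()) / seg.mean())   # PI = AC / DC (one convention)
pi = np.array(pi)

# Synthetic LDI reference correlated with the true pulsatile amplitude
ldi = ac_amp[::win][:pi.size] + 0.001 * rng.standard_normal(pi.size)
r = np.corrcoef(pi, ldi)[0, 1]
print("Pearson r between windowed PI and LDI reference: %.2f" % r)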
Quality assurance and quality control in mammography: a review of available guidance worldwide.
Reis, Cláudia; Pascoal, Ana; Sakellaris, Taxiarchis; Koutalonis, Manthos
2013-10-01
This review examines available guidance for quality assurance (QA) in mammography and discusses its contribution to harmonising practices worldwide. A literature search was performed on different sources to identify guidance documents for QA in mammography available worldwide from international bodies, healthcare providers, and professional/scientific associations. The guidance documents identified were reviewed and a selection was compared for type of guidance (clinical/technical), technology and proposed QA methodologies, focusing on dose and image quality (IQ) performance assessment. Fourteen protocols (targeted at conventional and digital mammography) were reviewed. All included recommendations for testing acquisition, processing and display systems associated with mammographic equipment. All guidance reviewed highlighted the importance of dose assessment and testing the Automatic Exposure Control (AEC) system. Recommended tests for assessment of IQ showed variations in the proposed methodologies. Recommended testing focused on assessment of low-contrast detection, spatial resolution and noise. QC of image display is recommended following the American Association of Physicists in Medicine guidelines. The existing QA guidance for mammography is derived from key documents (American College of Radiology and European Union guidelines) and proposes similar tests despite variations in detail and methodologies. Studies reporting QA data should provide detail on experimental technique to allow robust data comparison. Countries aiming to implement a mammography QA program may select/prioritise the tests depending on available technology and resources. •An effective QA program should be practical to implement in a clinical setting. •QA should address the various stages of the imaging chain: acquisition, processing and display. •AEC system QC testing is simple to implement and provides information on equipment performance.
Design and realization of disaster assessment algorithm after forest fire
NASA Astrophysics Data System (ADS)
Xu, Aijun; Wang, Danfeng; Tang, Lihua
2008-10-01
Based on GIS technology, this paper focuses on the design and realization of a disaster assessment algorithm for use after forest fires. Through the analysis and processing of multi-source, heterogeneous data, the paper integrates the foundation laid by domestic and foreign scholars on the assessment of forest fire loss with related knowledge of assessment, accounting and forest resources appraisal, in order to study the theoretical framework and assessment indices for forest fire loss assessment. Techniques for boundary extraction, overlay analysis and the processing of multi-source spatial data are used to implement the investigation of the burnt forest area and the computation of the fire area. The assessment provides evidence for clearing burnt areas and for policy making on restoration, in terms of the direct and indirect economic losses and the ecological and environmental damage caused by forest fires under different fire-danger classes and different amounts of forest accumulation, thus allowing forest resources to be protected in a faster, more efficient and more economical way. Finally, the paper takes Lin'an City in Zhejiang Province as a test area to validate the key technologies of the proposed method.
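As a simple illustration of the overlay computation described above (not the authors' GIS implementation; the rasters and the 30 m cell size are assumptions), the Python sketch below estimates the burnt area per forest class by intersecting a fire-boundary mask with a forest-type raster:

import numpy as np

cell = 30.0                                          # assumed raster cell size (m)
rng = np.random.default_rng(1)
forest_type = rng.integers(0, 3, size=(200, 200))    # 0 = non-forest, 1/2 = forest classes
burnt = np.zeros((200, 200), dtype=bool)
burnt[50:120, 60:150] = True                         # rasterized fire boundary (assumed)

for cls in (1, 2):
    cells = np.logical_and(burnt, forest_type == cls).sum()
    area_ha = cells * cell * cell / 10_000.0         # convert m^2 to hectares
    print("burnt area of forest class %d: %.1f ha" % (cls, area_ha))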
Experimental assessment of theory for refraction of sound by a shear layer
NASA Technical Reports Server (NTRS)
Schlinker, R. H.; Amiet, R. K.
1978-01-01
The refraction angle and amplitude changes associated with sound transmission through a circular, open-jet shear layer were studied in a 0.91 m diameter open jet acoustic research tunnel. Free stream Mach number was varied from 0.1 to 0.4. Good agreement between refraction angle correction theory and experiment was obtained over the test Mach number, frequency, and angle measurement range for all on-axis acoustic source locations. For off-axis source positions, good agreement was obtained at source-to-shear layer separation distances greater than the jet radius. Measurable differences between theory and experiment occurred at source-to-shear layer separation distances less than one jet radius. A shear layer turbulence scattering experiment was conducted at 90 deg to the open jet axis for the same free stream Mach numbers and axial source locations used in the refraction study. Significant discrete tone spectrum broadening and tone amplitude changes were observed at open jet Mach numbers above 0.2 and at acoustic source frequencies greater than 5 kHz. More severe turbulence scattering was observed for downstream source locations.
Evaluation of discrete frequency sound in closed-test-section wind tunnels
NASA Technical Reports Server (NTRS)
Mosher, Marianne
1990-01-01
The principal objective of this study is to assess the adequacy of linear acoustic theory with an impedance wall boundary condition for modeling the detailed sound field of an acoustic source in a duct. This study compares measurements and calculations of a simple acoustic source in a rectangular concrete duct lined with foam on the walls and anechoic end terminations. Measuring acoustic pressure for 12 wave numbers provides variation in frequency and absorption characteristics of the duct walls. The cases in this study contain low frequencies and low wall absorptions corresponding to measurements of low-frequency helicopter noise in a lined wind tunnel. This regime is particularly difficult to measure in wind tunnels due to high levels of the reverberant field relatively close to the source. Close to the source, where the interference of wall reflections is minimal, correlation is very good. Away from the source, correlation degrades, especially for the lower frequencies. Sensitivity studies show little effect on the predicted results for changes in impedance boundary condition values, source location, measurement location, temperature, and source model for variations spanning the expected measurement error.
Wright, Alison G; Ellis, Timothy P; Ilag, Leodevico L
2014-12-01
An aqueous filtered molasses concentrate (FMC) sourced from sugar cane was used as a functional ingredient in a range of carbohydrate-containing foods to reduce glycaemic response. When compared to untreated controls, postprandial glucose responses in the test products were reduced 5-20%, assessed by accredited glycaemic index (GI) testing. The reduction in glucose response in the test foods was dose-dependent and directly proportional to the ratio of FMC added to the amount of available carbohydrate in the test products. The insulin response to the foods was also reduced with FMC addition as compared to untreated controls. Inclusion of FMC in test foods did not replace any formulation ingredients; it was incorporated as an additional ingredient to existing formulations. Filtered molasses concentrate, made by a proprietary and patented process, contains many naturally occurring compounds. Some of the identified compounds are known to influence carbohydrate metabolism, and include phenolic compounds, minerals and organic acids. FMC, sourced from a by-product of sugar cane processing, shows potential as a natural functional ingredient capable of modifying carbohydrate metabolism and contributing to GI reduction of processed foods and beverages.
Tumorigenicity assessment of human cell-processed therapeutic products.
Yasuda, Satoshi; Sato, Yoji
2015-09-01
Human pluripotent stem cells (hPSCs) are expected to be sources of various cell types used for cell therapy, although hPSCs are intrinsically tumorigenic and form teratomas in immunodeficient animals after transplant. Despite the urgent need, no detailed guideline for the assessment of the tumorigenicity of human cell-processed therapeutic products (hCTPs) has been issued. Here we describe our considerations on tumorigenicity and related tests of hCTPs. The purposes of those tests for hPSC-based products are classified into three categories: 1) quality control of raw materials; 2) quality control of intermediate/final products; and 3) safety assessment of final products. Appropriate types of tests need to be selected, taking the purpose(s) into consideration. In contrast, human somatic (and somatic stem) cells are believed to have little tumorigenicity. Therefore, GMP-compliant quality control is essential to avoid contamination of somatic cell-derived products with tumorigenic cells. Compared with in vivo tumorigenicity tests, in vitro cell proliferation assays may be more useful and reasonable for detecting immortalized cells that have a growth advantage in somatic cell-based products. The results obtained from tumorigenicity and related tests for hCTPs should meet the criteria for decisions on product development, manufacturing processes, and clinical applications. Copyright © 2015.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sig Drellack, Lance Prothro
2007-12-01
The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
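To make the Monte Carlo sensitivity step concrete, the following Python sketch (not the UGTA models; the surrogate plume-extent function and parameter ranges are invented for illustration) samples uncertain transport parameters and ranks them by Spearman rank correlation, one simple global sensitivity measure among those mentioned above:

import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n = 5000
# Assumed uncertain inputs (ranges are illustrative only)
params = {
    "effective porosity": rng.uniform(0.005, 0.05, n),
    "fracture aperture (m)": 10 ** rng.uniform(-4.5, -3.0, n),
    "matrix diffusion (m^2/s)": 10 ** rng.uniform(-11, -9, n),
    "sorption Kd (mL/g)": 10 ** rng.uniform(-1, 2, n),
}

# Toy surrogate for plume extent: wide apertures and low porosity increase it,
# while matrix diffusion and sorption retard it (not the UGTA transport model).
plume = (params["fracture aperture (m)"] ** 2
         / params["effective porosity"]
         / (1.0 + params["sorption Kd (mL/g)"])
         / np.sqrt(params["matrix diffusion (m^2/s)"]))

ranks = {name: abs(spearmanr(values, plume)[0]) for name, values in params.items()}
for name, rho in sorted(ranks.items(), key=lambda kv: kv[1], reverse=True):
    print("%-26s |rho| = %.2f" % (name, rho))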
Klymus, Katy E.; Richter, Cathy; Thompson, Nathan; Hinck, Jo E.
2017-01-01
Understanding how anthropogenic impacts on the landscape affect wildlife requires a knowledge of community assemblages. Species surveys are the first step in assessing community structure, and recent molecular applications such as metabarcoding and environmental DNA analyses have been proposed as an additional and complementary wildlife survey method. Here, we test eDNA metabarcoding as a survey tool to examine the potential use of uranium mine containment ponds as water sources by wildlife. We tested samples from surface water near mines and from one mine containment pond using two markers, 12S and 16S rRNA gene amplicons, to survey for vertebrate species. We recovered large numbers of sequence reads from taxa expected to be in the area and from less common or hard to observe taxa such as the tiger salamander and gray fox. Detection of these two species is of note because they were not observed in a previous species assessment, and tiger salamander DNA was found in the mine containment pond sample. We also found that sample concentration by centrifugation was a more efficient and more feasible method than filtration in these highly turbid surface waters. Ultimately, the use of eDNA metabarcoding could allow for a better understanding of the area’s overall biodiversity and community composition as well as aid current ecotoxicological risk assessment work.
Development of phytotoxicity tests using wetland species
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, M.K.; Fairchild, J.F.
1994-12-31
Laboratory phytotoxicity tests used to assess contaminant effects may not effectively protect wetland communities. The authors are developing routine culture and testing methods for selected freshwater plants that can be used in risk assessments and monitoring of existing wetland systems. The utility of these tests includes evaluating the effects of point or non-point source contamination that may cause water or sediment quality degradation. Selected species include algae (blue-green, green), phytoflagellates (Chlamydomonas, Euglena), and floating or submerged vascular plants (milfoil, coontail, wild celery, elodea, duckweed). Algal toxicity tests include 2-d, 4-d, and 7-d exposures, and macrophyte tests range from 10 to 14 days. Metribuzin and boron are the contaminants selected for developing the test methods. Metribuzin, a triazinone herbicide, is a photosystem II inhibitor and is commonly used for control of grasses and broad-leaf plants. As a plant micronutrient, boron is required in very small amounts, but excessive levels can result in phytotoxicity or accumulation. The investigations focus on the influence of important factors, including light quality and quantity and nutrient media. Reference toxicant exposures with potassium chloride are used to establish baseline data for sensitivity and vitality of the plants. These culture and test methods will be incorporated into recommendations for standard phytotoxicity test designs.
Kauai Test Facility hazards assessment document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swihart, A
1995-05-01
The Department of Energy Order 55003A requires that facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment document describes the chemical and radiological hazards associated with the Kauai Test Facility, Barking Sands, Kauai, Hawaii. The Kauai Test Facility's chemical and radiological inventories were screened according to potential airborne impact to onsite and offsite individuals. The air dispersion model, ALOHA, estimated pollutant concentrations downwind from the source of a release, taking into consideration the toxicological and physical characteristics of the release site, the atmospheric conditions, and the circumstances of the release. The greatest distance to the Early Severe Health Effects threshold is 4.2 kilometers. The highest emergency classification is a General Emergency at the "Main Complex" and a Site Area Emergency at the Kokole Point Launch Site. The Emergency Planning Zone for the "Main Complex" is 5 kilometers. The Emergency Planning Zone for the Kokole Point Launch Site is the Pacific Missile Range Facility's site boundary.
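For orientation, downwind screening of the kind performed with ALOHA can be illustrated with a generic Gaussian plume; the Python sketch below is not ALOHA, and the release rate, wind speed, release height and Briggs-type dispersion coefficients are assumptions:

import numpy as np

def gaussian_plume_centerline(q, u, x, h):
    """Ground-level centerline concentration (kg/m^3) of a continuous release.

    q: release rate (kg/s), u: wind speed (m/s), x: downwind distance (m),
    h: effective release height (m). The rural class-D dispersion coefficients
    below are textbook approximations, not ALOHA's internal values.
    """
    sigma_y = 0.08 * x / np.sqrt(1 + 0.0001 * x)
    sigma_z = 0.06 * x / np.sqrt(1 + 0.0015 * x)
    return (q / (np.pi * u * sigma_y * sigma_z)) * np.exp(-h**2 / (2 * sigma_z**2))

# Assumed screening scenario: 0.5 kg/s release, 3 m/s wind, 2 m release height
for x in (500.0, 1000.0, 4200.0):
    c = gaussian_plume_centerline(0.5, 3.0, x, 2.0)
    print("x = %5.0f m  centerline concentration = %.2e kg/m^3" % (x, c))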
Reading Content Knowledge: What Do Teachers Need to Know and How Can We Assess Their Knowledge?
ERIC Educational Resources Information Center
Lilienthal, Linda K.
2008-01-01
The purpose of this study was to investigate preservice teachers' reading content knowledge, to develop a definition of reading, and to develop an informal test of teachers' reading content knowledge. A content analysis of two contemporary reading textbooks used in university reading courses was the source of a six-tier, hierarchical definition of…
This poster will present a modeling and mapping assessment of landscape sensitivity to non-point source pollution as applied to a hierarchy of catchment drainages in the Coastal Plain of the state of North Carolina. Analysis of the subsurface residence time of water in shallow a...
Laboratory tests for mumps vaccines.
Minor, P D
1997-03-01
The action of live attenuated vaccines against mumps is poorly understood although their clinical efficacy is beyond doubt. The attenuated character of the vaccine is assured by consistency of production related to clinical trials, and limited studies of vaccine seeds in primates. Potency is assessed by infectivity in vitro and is subject to poorly understood sources of variation. Molecular biological studies are at an early stage.
ERIC Educational Resources Information Center
Reynolds, Greg D.; Courage, Mary L.; Richards, John E.
2010-01-01
In this study, we had 3 major goals. The 1st goal was to establish a link between behavioral and event-related potential (ERP) measures of infant attention and recognition memory. To assess the distribution of infant visual preferences throughout ERP testing, we designed a new experimental procedure that embeds a behavioral measure (paired…
The Effects of Crosswind Flight on Rotor Harmonic Noise Radiation
NASA Technical Reports Server (NTRS)
Greenwood, Eric; Sim, Ben W.
2013-01-01
In order to develop recommendations for procedures for helicopter source noise characterization, the effects of crosswinds on main rotor harmonic noise radiation are assessed using a model of the Bell 430 helicopter. Crosswinds are found to have a significant effect on Blade-Vortex Interaction (BVI) noise radiation when the helicopter is trimmed with the fuselage oriented along the inertial flight path. However, the magnitude of BVI noise remains unchanged when the pilot orients the fuselage along the aerodynamic velocity vector, crabbing for zero aerodynamic sideslip. The effects of wind gradients on BVI noise are also investigated and found to be smaller in the crosswind direction than in the headwind direction. The effects of crosswinds on lower harmonic noise sources at higher flight speeds are also assessed. In all cases, the directivity of radiated noise is somewhat changed by the crosswind. The model predictions agree well with flight test data for the Bell 430 helicopter captured under various wind conditions. The results of this investigation would suggest that flight paths for future acoustic flight testing are best aligned across the prevailing wind direction to minimize the effects of winds on noise measurements when wind cannot otherwise be avoided.
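The difference between trimming along the inertial flight path and crabbing for zero aerodynamic sideslip reduces to a velocity triangle; the Python sketch below, with illustrative speeds only, computes the resulting sideslip angle and the crab angle that nulls it:

import math

v_ground = 40.0   # ground speed along the flight path (m/s), assumed
v_cross = 8.0     # crosswind component perpendicular to the path (m/s), assumed

# With the fuselage aligned to the inertial (ground) path, the air-relative
# velocity picks up a lateral component equal to the crosswind, producing
# an aerodynamic sideslip angle of:
sideslip_deg = math.degrees(math.atan2(v_cross, v_ground))
print("sideslip with fuselage on the inertial path: %.1f deg" % sideslip_deg)

# Yawing (crabbing) the fuselage into the wind by the same angle aligns it
# with the aerodynamic velocity vector, giving zero aerodynamic sideslip.
print("crab angle for zero aerodynamic sideslip: %.1f deg" % sideslip_deg)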
Iron bioavailability studies as assessed by intrinsic and extrinsic labeling techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, C.D.
Although soybeans are a rich source of iron and incorporation of soy protein into diets is increasing, the presence of phytate or fiber endogenous to the seeds may inhibit total iron absorption from diets including soy protein. Four studies on iron bioavailability, as assessed by intrinsic and extrinsic labeling techniques in rats, were completed. The effect of previous dietary protein on the absorption of intrinsically ⁵⁹Fe-labeled defatted soy flour was determined in rats. The results indicated that the type of dietary protein (animal vs. plant) in pre-test diets would have little influence on iron absorption from a single soy protein test meal. Therefore, adaptation to soy protein does not improve bioavailability of iron. Soybean hulls were investigated as a source of iron fortification in bread. The results indicated that retention of ⁵⁹Fe from white bread baked with soy hulls did not differ from white bread fortified with bakery-grade ferrous sulfate. The effect of endogenous soybean phytate on iron absorption in rats was measured using seeds of varying phytate content and intrinsically labeled with ⁵⁹Fe. Increasing concentration of phytate in whole soybean flour had no significant effect on iron absorption.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-12-31
Describes a study undertaken to assess the indoor air quality in the Clos St-Andre, a 78-unit residential complex in downtown Montreal, through the implementation of a monitoring protocol in three of the building's suites, and to examine the relationships between mechanical ventilation, material emissions, occupant lifestyle, and indoor air pollutant concentrations. The monitoring protocol consisted of tracer-gas air exchange testing, material emission testing, airtightness testing, and the monitoring of air temperature, relative humidity, carbon dioxide, carbon monoxide, formaldehyde, and total volatile organic carbon in the suites. Trends in pollutant concentrations over time in the post-construction period are noted.
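For the tracer-gas portion of such a protocol, the air change rate follows from the exponential decay of an inert tracer, ACH = ln(C0/C1)/Δt; the Python sketch below illustrates the calculation with made-up concentrations:

import math

# Made-up tracer-gas concentrations measured in one suite (ppm)
c0, c1 = 25.0, 9.0          # concentration at the start and after dt_hours
dt_hours = 2.0

ach = math.log(c0 / c1) / dt_hours     # air changes per hour from exponential decay
print("estimated air change rate: %.2f ACH" % ach)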
Paans, Wolter; Sermeus, Walter; Nieweg, Roos Mb; Krijnen, Wim P; van der Schans, Cees P
2012-08-01
This paper reports a study about the effect of knowledge sources, such as handbooks, an assessment format and a predefined record structure for diagnostic documentation, as well as the influence of knowledge, disposition toward critical thinking and reasoning skills, on the accuracy of nursing diagnoses. Knowledge sources can support nurses in deriving diagnoses. A nurse's disposition toward critical thinking and reasoning skills is also thought to influence the accuracy of his or her nursing diagnoses. A randomised factorial design was used in 2008-2009 to determine the effect of knowledge sources. We used the following instruments to assess the influence of ready knowledge, disposition, and reasoning skills on the accuracy of diagnoses: (1) a knowledge inventory, (2) the California Critical Thinking Disposition Inventory, and (3) the Health Science Reasoning Test. Nurses (n = 249) were randomly assigned to one of four factorial groups, and were instructed to derive diagnoses based on an assessment interview with a simulated patient/actor. The use of a predefined record structure resulted in a significantly higher accuracy of nursing diagnoses. A regression analysis reveals that almost half of the variance in the accuracy of diagnoses is explained by the use of a predefined record structure, a nurse's age and the reasoning skills of 'deduction' and 'analysis'. Improving nurses' dispositions toward critical thinking and reasoning skills, and the use of a predefined record structure, improves accuracy of nursing diagnoses.
Episodic Memory and Regional Atrophy in Frontotemporal Lobar Degeneration
Söderlund, Hedvig; Black, Sandra E.; Miller, Bruce L.; Freedman, Morris; Levine, Brian
2008-01-01
It has been unclear to what extent memory is affected in frontotemporal lobar degeneration (FTLD). Since patients usually have atrophy in regions implicated in memory function, the frontal and/or temporal lobes, one would expect some memory impairment, and that the degree of atrophy in these regions would be inversely related to memory function. The purposes of this study were 1) to assess episodic memory function in FTLD, and more specifically patients' ability to episodically re-experience an event, and determine its source; 2) to examine whether memory performance is related to quantified regional brain atrophy. FTLD patients (n=18) and healthy comparison subjects (n=14) were assessed with cued recall, recognition, “remember/know” (self-reported re-experiencing) and source recall, at 30 min and 24 hr after encoding. Regional gray matter volumes were assessed with high resolution structural MRI concurrently to testing. Patients performed worse than comparison subjects on all memory measures. Gray matter volume in the left medial temporal lobe was positively correlated with recognition, re-experiencing, and source recall. Gray matter volume in the left posterior temporal lobe correlated significantly with recognition, at 30 min and 24 hr, and with source recall at 30 min. Estimated familiarity at 30 min was positively correlated with gray matter volume in the left inferior parietal lobe. In summary, episodic memory deficits in FTLD may be more common than previously thought, particularly in patients with left medial and posterior temporal atrophy. PMID:17888461
2001-11-01
[Fragmentary text recovered from the source report] There were no target misses; the Hellfire missile does not have a depleted uranium head. Section 2.2.2.3 covers tank movement during the test. A later passage, intended to guide other users through this complicated program, describes the NOISEMAP input data files: a root file name with several extensions; files in the SOURCES subdirectory carry the root file name followed by an accession number and the .bps extension, and the user must check the *.log file.
NASA Technical Reports Server (NTRS)
Stanford, Malcolm K.; Thomas, Fransua; Dellacorte, Christopher
2012-01-01
The flammability of 60-NITINOL (60 weight percent Ni and 40 weight percent Ti) in an oxygen-rich atmosphere is assessed. It is determined that 60-NITINOL burns readily in gaseous oxygen and would not be a good candidate for components exposed to oxygen-rich environments where there may be an ignition source. The results are the same whether the material is tested without heat treatment, after a solution treatment or after furnace annealing. These results provide guidance for materials selection of aerospace turbomachinery components.
Determining the sources of fine-grained sediment using the Sediment Source Assessment Tool (Sed_SAT)
Gorman Sanisaca, Lillian E.; Gellis, Allen C.; Lorenz, David L.
2017-07-27
A sound understanding of sources contributing to instream sediment flux in a watershed is important when developing total maximum daily load (TMDL) management strategies designed to reduce suspended sediment in streams. Sediment fingerprinting and sediment budget approaches are two techniques that, when used jointly, can qualify and quantify the major sources of sediment in a given watershed. The sediment fingerprinting approach uses trace element concentrations from samples in known potential source areas to determine a clear signature of each potential source. A mixing model is then used to determine the relative source contribution to the target suspended sediment samples. The computational steps required to apportion sediment for each target sample are quite involved and time-intensive, a problem the Sediment Source Assessment Tool (Sed_SAT) addresses. Sed_SAT is a user-friendly statistical model that guides the user through the necessary steps in order to quantify the relative contributions of sediment sources in a given watershed. The model is written using the statistical software R (R Core Team, 2016b) and utilizes Microsoft Access® as a user interface, but requires no prior knowledge of R or Microsoft Access® to run the model successfully. Sed_SAT identifies outliers, corrects for differences in size and organic content in the source samples relative to the target samples, evaluates the conservative behavior of tracers used in fingerprinting by applying a “Bracket Test,” identifies tracers with the highest discriminatory power, and provides robust error analysis through a Monte Carlo simulation following the mixing model. Quantifying sediment source contributions using the sediment fingerprinting approach provides local, State, and Federal land management agencies with important information needed to implement effective strategies to reduce sediment. Sed_SAT is designed to assist these agencies in applying the sediment fingerprinting approach to quantify sediment sources in the sediment TMDL framework.
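At the heart of the fingerprinting step is a constrained mixing model: find non-negative source proportions that sum to one and best reproduce the tracer concentrations of the target sample. The Python sketch below (Sed_SAT itself is written in R; the tracer values here are invented) shows one common least-squares formulation:

import numpy as np
from scipy.optimize import minimize

# Mean tracer concentrations for three assumed source groups (rows) and
# four tracers (columns); values are invented for illustration.
sources = np.array([
    [12.0, 300.0, 1.8, 45.0],   # e.g. channel banks
    [30.0, 150.0, 0.9, 70.0],   # e.g. cultivated topsoil
    [18.0, 220.0, 2.5, 20.0],   # e.g. pasture topsoil
])
target = np.array([20.0, 230.0, 1.6, 50.0])   # suspended-sediment sample

def mixing_error(p):
    # Relative sum of squares between modelled and measured target tracers
    modelled = p @ sources
    return np.sum(((target - modelled) / target) ** 2)

n = sources.shape[0]
res = minimize(
    mixing_error,
    x0=np.full(n, 1.0 / n),
    bounds=[(0.0, 1.0)] * n,
    constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}],
)
print("estimated source proportions:", np.round(res.x, 2))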
Dietary sources of lutein in adults suffering eye disease (AMD/cataracts).
Sulich, Agnieszka; Hamułka, Jadwiga; Nogal, Dorota
2015-01-01
Epidemiological studies indicate that by consuming 6-14 mg lutein daily, the risk of acquiring eye diseases like age-related macular degeneration (AMD) or cataracts becomes reduced. Their symptoms can also by such means be alleviated and treatment improved. To estimate dietary intakes of lutein obtained from foodstuffs and supplements along with determining its main sources in selected groups of adults suffering from eye disease and healthy controls. The study was performed in Warsaw and its neighbourhoods during 2008-12. Subjects were 375 adults aged 50-97 years, of whom half had been diagnosed with AMD and/or cataracts; constituting the test group. Dietary intakes of lutein were assessed by Food Frequency Questionnaire Method whilst interview questionnaires assessed the intake of supplements. Overall, the average dietary intake of lutein from foodstuffs was 2.5 mg daily, with the test group being significantly higher than healthy controls (2.9 vs 2.1 mg daily). Women's intakes were also higher than in men (2.9 vs 2.1 mg daily), as were those possessing higher or secondary education compared to the others with primary or vocational education (2.7 vs 2.3 mg daily). Fresh vegetables were found to be the main dietary sources of lutein that included green leafy vegetables and frozen vegetables, constituting respectively 63% and 13% of the dietary intake. Dietary supplements containing lutein were taken by 109 subjects of whom most had eye disease (over 80%); where the average daily consumption of lutein from this source was 6.5 mg. For older people, the dietary intake of lutein from foodstuffs may be insufficient to prevent eye disease. Taking daily dietary supplements would thus be indicated to make up such deficiencies of lutein.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yidong Xia; Mitch Plummer; Robert Podgorney
2016-02-01
The performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65 K per km depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that the desirable output electric power rate and lifespan could be obtained under suitable material properties and system parameters. A sensitivity analysis on some design constraints and operation parameters indicates that (1) the fracture horizontal spacing has a profound effect on the long-term performance of heat production, (2) the downward deviation angle of the parallel doublet wells may help overcome the difficulty of vertical drilling to reach a favorable production temperature, and (3) the thermal energy production rate and lifespan depend closely on the water mass flow rate. The results also indicate that heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are under reasonable conditions. To conduct the reservoir modeling and simulations, an open-source, finite element based, fully implicit, fully coupled hydrothermal code, namely FALCON, has been developed and used in this work. Compared with most other existing codes in this area, which are either closed-source or commercial, this new open-source code demonstrates a code development strategy that aims to provide unparalleled ease of user customization and multi-physics coupling. Test results have shown that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.
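A back-of-envelope check of the heat-production figures discussed above can be made from P = m_dot * cp * (T_production - T_injection); the Python sketch below uses illustrative values, not the FALCON simulation results:

# Rough EGS doublet heat-production estimate (all values are assumptions).
m_dot = 40.0        # water mass flow rate (kg/s)
cp = 4180.0         # specific heat of water (J/kg/K)
t_prod = 200.0      # production temperature (deg C)
t_inj = 70.0        # injection temperature (deg C)
eta = 0.12          # assumed thermal-to-electric conversion efficiency

p_thermal = m_dot * cp * (t_prod - t_inj)          # thermal power in W
p_electric = eta * p_thermal
print("thermal power:  %.1f MWth" % (p_thermal / 1e6))
print("electric power: %.1f MWe" % (p_electric / 1e6))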
Modulating the Neutron Flux from a Mirror Neutron Source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryutov, D D
2011-09-01
A 14-MeV neutron source based on a Gas-Dynamic Trap (GDT) will provide a high flux of 14 MeV neutrons for fusion materials and sub-component testing. In addition to its main goal, the source has potential applications in condensed matter physics and biophysics. In this report, the author considers adding one more capability to the GDT-based neutron source: modulation of the neutron flux at a desired frequency. The modulation may be an enabling tool for the assessment of the role of non-steady-state effects in fusion devices, as well as for high-precision, low-signal basic science experiments favoring the use of the synchronous detection technique. A conclusion is drawn that a modulation frequency of up to 1 kHz and a modulation amplitude of a few percent are achievable. Limitations on the amplitude of modulations at higher frequencies are discussed.
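The synchronous (lock-in) detection technique mentioned above recovers a small modulated signal buried in noise by mixing it with in-phase and quadrature references at the modulation frequency and averaging. The Python sketch below uses a synthetic 1 kHz modulation of a few percent, consistent with the amplitudes discussed, but is purely illustrative:

import numpy as np

rng = np.random.default_rng(3)
fs = 100_000.0                      # sampling rate (Hz), assumed
f_mod = 1_000.0                     # flux modulation frequency (Hz)
t = np.arange(0, 1.0, 1 / fs)

flux = 1.0 + 0.02 * np.sin(2 * np.pi * f_mod * t)     # 2% modulation depth
signal = flux + 0.5 * rng.standard_normal(t.size)     # detector signal with noise

# Synchronous detection: mix with in-phase and quadrature references, average.
i = np.mean(signal * np.sin(2 * np.pi * f_mod * t))
q = np.mean(signal * np.cos(2 * np.pi * f_mod * t))
amplitude = 2.0 * np.hypot(i, q)
print("recovered modulation amplitude: %.3f (true value 0.020)" % amplitude)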
How to Manual: How to Update and Enhance Your Local Source Water Protection Assessments
Describes opportunities for improving source water assessments performed under section 1453 of the Safe Drinking Water Act. It covers local delineations, potential contaminant source inventories, and susceptibility determinations for source water assessments.
Bussing, Regina; Zima, Bonnie T.; Mason, Dana M.; Meyer, Johanna M.; White, Kimberly; Garvan, Cynthia W.
2012-01-01
PURPOSE The chronic illness model advocates for psychoeducation within a collaborative care model to enhance outcomes. To inform psychoeducational approaches for attention-deficit/hyperactivity disorder (ADHD), this study describes parent and adolescent knowledge, perceptions and information sources and explores how these vary by sociodemographic characteristics, ADHD risk, and past child mental health service use. METHODS Parents and adolescents were assessed 7.7 years after initial school district screening for ADHD risk. The study sample included 374 adolescents (56% high and 44% low ADHD risk), on average 15.4 (SD 1.8) years old, and 36% were African American. Survey questions assessed ADHD knowledge, perceptions, and cues to action, and elicited utilized and preferred information sources. Multiple logistic regression was used to determine potential independent predictors of ADHD knowledge. McNemar's tests compared information source utilization against preference. RESULTS Despite relatively high self-rated ADHD familiarity, misperceptions among parents and adolescents were common, including a sugar etiology (25% and 27%, respectively) and medication overuse (85% and 67%). African American respondents expressed lower ADHD awareness and greater belief in sugar etiology than Caucasians. Parents used a wide range of ADHD information sources while adolescents relied on social network members and teachers/school. However, parents and adolescents expressed similar strong preferences for the Internet (49% and 51%) and doctor (40% and 27%) as ADHD information sources. CONCLUSION Culturally appropriate psychoeducational strategies are needed that combine doctor-provided ADHD information with reputable Internet sources. Despite time limitations during patient visits, both parents and teens place high priority on receiving information from their doctor. PMID:23174470
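McNemar's test, used above to compare utilized against preferred information sources, depends only on the discordant-pair counts; the Python sketch below applies the continuity-corrected chi-square form to invented counts (an exact binomial version also exists):

from scipy.stats import chi2

# Invented paired counts for one information source (e.g. the Internet):
# b = respondents who used it but did not prefer it,
# c = respondents who preferred it but had not used it.
b, c = 18, 52

statistic = (abs(b - c) - 1) ** 2 / (b + c)   # continuity-corrected McNemar chi-square
p_value = chi2.sf(statistic, df=1)
print("McNemar chi-square = %.2f, p = %.4f" % (statistic, p_value))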
Alfredsson, Jayne; Plichart, Patrick; Zary, Nabil
2012-01-01
Research on computer-supported scoring of assessments in health care education has mainly focused on automated scoring. Little attention has been given to how informatics can support the currently predominant human-based grading approach. This paper reports steps taken to develop a model for a computer-supported scoring process that focuses on optimizing a task previously undertaken without computer support. The model was also implemented in the open source assessment platform TAO in order to study its benefits. The ability to score test takers anonymously, analytics on grader reliability, and a more time-efficient process are examples of the observed benefits. Computer-supported scoring will increase the quality of the assessment results.
Contamination source review for Building E5978, Edgewood Area, Aberdeen Proving Ground, Maryland
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mosqueda, G.; Dougherty, J.; Draugelis, A.K.
1995-09-01
This report documents the results of a contamination source review of Building E5978 at the Aberdeen Proving Ground (APG) in Maryland. The primary mission at APG has been the testing and evaluation of US Army warfare materials. Since its beginning in 1917, the Edgewood Area of APG has been the principal location for chemical warfare agent research, development, and testing in the US. APG was also used for producing chemical warfare agents during both world wars, and it has been a center for the storage of chemical warfare material. An attempt was made to identify and define areas of toxic or hazardous contaminants and to assess the physical condition and accessibility of APG buildings. The information obtained from this review may be used to assist the US Army in planning for the future use or disposition of the buildings. The contamination source review consisted of the following tasks: historical records search, physical inspection, photographic documentation, geophysical investigation, and collection of air samples for the presence of volatile organic compounds.
Contamination source review for Building E3641, Edgewood Area, Aberdeen Proving Ground, Maryland
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zellmer, S.D.; Draugelis, A.K.; Rueda, J.
1995-09-01
This report documents the results of a contamination source review of Building E3641 at the Aberdeen Proving Ground (APG) in Maryland. The primary mission at APG has been the testing and evaluation of US Army warfare materials. Since its beginning in 1917, the Edgewood Area of APG has been the principal location for chemical warfare agent research, development, and testing in the US. APG was also used for producing chemical warfare agents during both world wars, and it has been a center for the storage of chemical warfare material. An attempt was made to identify and define areas of toxic or hazardous contaminants and to assess the physical condition and accessibility of APG buildings. The information obtained from this review may be used to assist the US Army in planning for the future use or disposition of the buildings. The contamination source review consisted of the following tasks: historical records search, physical inspection, photographic documentation, geophysical investigation, and review of available records regarding underground storage tanks associated with each building.
Milne, A D; Brousseau, P A; Brousseau, C A
2014-12-01
A bench-top study was performed to assess the effects of different laryngoscope handles on the light intensity delivered from disposable metal or plastic laryngoscope blades. The light intensity from both the handle light sources themselves and the combined handle and laryngoscope blade sets was measured using a custom-designed testing system and light meter. Five samples of each disposable blade type were tested and compared with a standard re-usable stainless steel blade using three different handle/light sources (Vital Signs LED, Heine 2.5 V Xenon and 3.5 V Xenon). The light intensity delivered by the disposable blades ranged from 790 to 3846 lux for the different handle types. Overall, the 3.5 V Heine handle delivered the highest light output (p < 0.007) in comparison with the other handles. For the disposable blades, the overall light output was significantly higher from the plastic than the metal blades (p < 0.001). © 2014 The Association of Anaesthetists of Great Britain and Ireland.
Finnilä, Katarina; Mahlberg, Nina; Santtila, Pekka; Sandnabba, Kenneth; Niemi, Pekka
2003-05-01
In the present study the relative contributions of internal and external sources of variation in children's suggestibility in interrogative situations were examined. One hundred and eleven children (48 4- to 5-year-olds and 63 7- to 8-year-olds) were administered a suggestibility test (BTSS), and the most suggestible (N = 36) and the least suggestible (N = 36) children were randomly assigned to either an interview condition containing several suggestive techniques or to one containing only suggestive questions. The effects of internal sources of variation in suggestibility were compared with the effects of the interview styles on the children's answers. The former did influence the children, but the external sources of variation in suggestibility had a stronger impact. Influences of cognitive developmental factors were found, but not when abuse-related questions were asked and high-pressure interview methods were used. These findings indicate that individual assessment of suggestibility can be of some assistance when interviewing children, but diminishing suggestive influences in interrogations must be given priority.
The electromagnetic interference of mobile phones on the function of a γ-camera.
Javadi, Hamid; Azizmohammadi, Zahra; Mahmoud Pashazadeh, Ali; Neshandar Asli, Isa; Moazzeni, Taleb; Baharfar, Nastaran; Shafiei, Babak; Nabipour, Iraj; Assadi, Majid
2014-03-01
The aim of the present study is to evaluate whether or not the electromagnetic field generated by mobile phones interferes with the function of a SPECT γ-camera during data acquisition. We tested the effects of 7 models of mobile phones on 1 SPECT γ-camera. The mobile phones were tested when making a call, in ringing mode, and in standby mode. The γ-camera function was assessed during data acquisition from a planar source and a point source of Tc with activities of 10 mCi and 3 mCi, respectively. A significant visual decrease in count number was considered to be electromagnetic interference (EMI). The percentage of induced EMI with the γ-camera per mobile phone was in the range of 0% to 100%. The incidence of EMI was mainly observed in the first seconds of ringing and then mitigated in the following frames. Mobile phones are portable sources of electromagnetic radiation, and there is interference potential with the function of SPECT γ-cameras leading to adverse effects on the quality of the acquired images.
Assessing the TARES as an ethical model for antismoking ads.
Lee, Seow Ting; Cheng, I-Huei
2010-01-01
This study examines the ethical dimensions of public health communication, with a focus on antismoking public service announcements (PSAs). The content analysis of 826 television ads from the U.S. Centers for Disease Control and Prevention's (CDC) Media Campaign Resource Center is an empirical testing of Baker and Martinson's (2001) TARES Test that directly examines persuasive messages for truthfulness, authenticity, respect, equity, and social responsibility. In general, the antismoking ads score highly on ethicality. There are significant relationships between ethicality and message attributes (thematic frame, emotion appeal, source, and target audience). Ads that portrayed smoking as damaging to health and socially unacceptable score lower in ethicality than ads that focus on tobacco industry manipulation, addiction, dangers of secondhand smoke, and cessation. Emotion appeals of anger and sadness are associated with higher ethicality than shame and humor appeals. Ads targeting teen/youth audiences score lower on ethicality than ads targeting adult and general audiences. There are significant differences in ethicality based on source; ads produced by the CDC rate higher in ethicality than other sources. Theoretical implications and practical recommendations are discussed.
Magwedere, K; Bishi, A; Tjipura-Zaire, G; Eberle, G; Hemberger, Y; Hoffman, L C; Dziva, F
2011-12-01
A confirmed case of human brucellosis motivated an investigation into the potential source of infection in Namibia. Since domestic animals are principal sources of Brucella infection in humans, 1692 serum samples were screened from sheep, goats and cattle from 4 presumably at-risk farms and 900 springbok (Antidorcas marsupialis) serum samples from 29 mixed farming units for Brucella antibodies by the Rose-Bengal test (RBT) and positive cases confirmed by complement fixation test (CFT). To assess the prevalence of human brucellosis, 137 abattoir employees were tested for Brucella antibodies using the standard tube agglutination test (STAT) and by enzyme linked immunosorbent assay (ELISA). Cattle and sheep from all 4 farms were negative by RBT and CFT but 2 of the 4 farms (Ba and C) had 26/42 and 12/285 seropositive goats, respectively. Post mortem examination of seropositive goats revealed no gross pathological lesions typical of brucellosis except enlarged mesenteric and iliac lymph nodes seen in a single buck. Culture for brucellae from organs of seropositive animals was negative. None of the wildlife sera tested positive by either RBT or CFT. Interviews revealed that besides the case that prompted the investigation, a family and another person from other farms with confirmed brucellosis shared a common history of consumption of unpasteurised goat milk, home-made goat cheese and coffee with raw milk and prior contact with goats, suggesting goats as the likely source of infection. All 137 abattoir employees tested negative by STAT, but 3 were positive by ELISA. The 3 abattoir workers were clinically normal and lacked historical connections with clinical cases. Although goats are often associated with B. melitensis, these studies could not explicitly implicate this species owing to cross-reactivity with B. abortus, which can also infect goats. Nevertheless, these data reinforce the need for a better National Control Programme for brucellosis in Namibia.
The sensory timecourses associated with conscious visual item memory and source memory.
Thakral, Preston P; Slotnick, Scott D
2015-09-01
Previous event-related potential (ERP) findings have suggested that during visual item and source memory, nonconscious and conscious sensory (occipital-temporal) activity onsets may be restricted to early (0-800 ms) and late (800-1600 ms) temporal epochs, respectively. In an ERP experiment, we tested this hypothesis by separately assessing whether the onset of conscious sensory activity was restricted to the late epoch during source (location) memory and item (shape) memory. We found that conscious sensory activity had a late (>800 ms) onset during source memory and an early (<200 ms) onset during item memory. In a follow-up fMRI experiment, conscious sensory activity was localized to BA17, BA18, and BA19. Of primary importance, the distinct source memory and item memory ERP onsets contradict the hypothesis that there is a fixed temporal boundary separating nonconscious and conscious processing during all forms of visual conscious retrieval. Copyright © 2015 Elsevier B.V. All rights reserved.
Integration and Utilization of Nuclear Systems on the Moon and Mars
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houts, Michael G.; Schmidt, George R.; Bragg-Sitton, Shannon
2006-01-20
Over the past five decades numerous studies have identified nuclear energy as an enhancing or enabling technology for planetary surface exploration missions. This includes both radioisotope and fission sources for providing both heat and electricity. Nuclear energy sources were used to provide electricity on Apollo missions 12, 14, 15, 16, and 17, and on the Mars Viking landers. Very small nuclear energy sources were used to provide heat on the Mars Pathfinder, Spirit, and Opportunity rovers. Research has been performed at NASA MSFC to help assess potential issues associated with surface nuclear energy sources, and to generate data that could be useful to a future program. Research areas include System Integration, use of Regolith as Radiation Shielding, Waste Heat Rejection, Surface Environmental Effects on the Integrated System, Thermal Simulators, Surface System Integration / Interface / Interaction Testing, End-to-End Breadboard Development, Advanced Materials Development, Surface Energy Source Coolants, and Planetary Surface System Thermal Management and Control. This paper provides a status update on several of these research areas.
NASA Astrophysics Data System (ADS)
Omira, R.; Matias, L.; Baptista, M. A.
2016-12-01
This study constitutes a preliminary assessment of probabilistic tsunami inundation in the NE Atlantic region. We developed an event-tree approach to calculate the likelihood of tsunami flood occurrence and exceedance of a specific near-shore wave height for a given exposure time. Only tsunamis of tectonic origin are considered here, taking into account local, regional, and far-field sources. The approach used here consists of an event-tree method that gathers probability models for seismic sources, tsunami numerical modeling, and statistical methods. It also includes a treatment of aleatoric uncertainties related to source location and tidal stage. Epistemic uncertainties are not addressed in this study. The methodology is applied to the coastal test-site of Sines located in the NE Atlantic coast of Portugal. We derive probabilistic high-resolution maximum wave amplitudes and flood distributions for the study test-site considering 100- and 500-year exposure times. We find that the probability that maximum wave amplitude exceeds 1 m somewhere along the Sines coasts reaches about 60 % for an exposure time of 100 years and is up to 97 % for an exposure time of 500 years. The probability of inundation occurrence (flow depth >0 m) varies between 10 % and 57 %, and from 20 % up to 95 % for 100- and 500-year exposure times, respectively. No validation has been performed here with historical tsunamis. This paper illustrates a methodology through a case study, which is not an operational assessment.
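Under the Poisson assumption used in such event-tree assessments, the probability of at least one exceedance of a given wave height during an exposure time T is 1 - exp(-lambda*T), with lambda the aggregated annual rate of the exceeding scenarios. The Python sketch below uses invented scenario rates and exceedance flags purely for illustration:

import math

# Invented annual rates (events/yr) for tsunamigenic scenarios affecting a site,
# and whether each scenario exceeds a 1 m wave-amplitude threshold at the coast
# (the exceedance flags would normally come from the numerical inundation runs).
scenarios = [
    {"rate": 1 / 700.0,  "exceeds_1m": True},    # large local source
    {"rate": 1 / 2000.0, "exceeds_1m": True},    # far-field source
    {"rate": 1 / 150.0,  "exceeds_1m": False},   # moderate regional source
]

lam = sum(s["rate"] for s in scenarios if s["exceeds_1m"])   # aggregated rate
for t_exp in (100.0, 500.0):
    p = 1.0 - math.exp(-lam * t_exp)     # Poisson probability of >= 1 exceedance
    print("P(exceed 1 m in %3.0f yr) = %.0f%%" % (t_exp, 100 * p))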
TEM PSHA2015 Reliability Assessment
NASA Astrophysics Data System (ADS)
Lee, Y.; Wang, Y. J.; Chan, C. H.; Ma, K. F.
2016-12-01
The Taiwan Earthquake Model (TEM) developed a new probabilistic seismic hazard analysis (PSHA) for determining the probability of exceedance (PoE) of ground motion over a specified period in Taiwan. To investigate the adequacy of the seismic source parameters adopted in the 2015 PSHA of the TEM (TEM PSHA2015), we conducted several tests of the seismic source models. The observed maximal peak ground acceleration (PGA) of the ML > 4.0 mainshocks in the 23-year data period 1993-2015 was used to test the PGA predicted by the PSHA from the areal and subduction zone sources under the time-independent Poisson assumption. This comparison excluded the observations from the 1999 Chi-Chi earthquake, as this was the only earthquake associated with an identified active fault in the past 23 years. We used tornado diagrams to analyze the sensitivities of these source parameters to the ground motion values of the PSHA. This study showed that the predicted PGA for a 63% PoE in the 23-year period corresponded to the empirical PGA, and that the predicted numbers of PGA exceedances of a threshold value of 0.1 g were close to the observed numbers, confirming the applicability of the parameters for the areal and subduction zone sources. We adopted disaggregation analysis of the hazard map to determine the contribution of the individual seismic sources to the hazard for six metropolitan cities in Taiwan. The sensitivity tests of the seismogenic structure parameters indicated that the slip rate and maximum magnitude are the dominant factors for the TEM PSHA2015. For densely populated faults in SW Taiwan, the maximum magnitude is more sensitive than the slip rate, raising concern about possible multi-segment fault ruptures with larger magnitudes in this area, which were not yet considered in TEM PSHA2015. The source category disaggregation also suggested that special attention is necessary for subduction zone earthquakes with regard to long-period shaking hazards in northern Taiwan.
Pulley, S; Collins, A L
2018-09-01
The mitigation of diffuse sediment pollution requires reliable provenance information so that measures can be targeted. Sediment source fingerprinting represents one approach for supporting these needs, but recent methodological developments have resulted in an increasing complexity of data processing methods rendering the approach less accessible to non-specialists. A comprehensive new software programme (SIFT; SedIment Fingerprinting Tool) has therefore been developed which guides the user through critical data analysis decisions and automates all calculations. Multiple source group configurations and composite fingerprints are identified and tested using multiple methods of uncertainty analysis. This aims to explore the sediment provenance information provided by the tracers more comprehensively than a single model, and allows for model configurations with high uncertainties to be rejected. This paper provides an overview of its application to an agricultural catchment in the UK to determine if the approach used can provide a reduction in uncertainty and increase in precision. Five source group classifications were used; three formed using a k-means cluster analysis containing 2, 3 and 4 clusters, and two a-priori groups based upon catchment geology. Three different composite fingerprints were used for each classification and bi-plots, range tests, tracer variability ratios and virtual mixtures tested the reliability of each model configuration. Some model configurations performed poorly when apportioning the composition of virtual mixtures, and different model configurations could produce different sediment provenance results despite using composite fingerprints able to discriminate robustly between the source groups. Despite this uncertainty, dominant sediment sources were identified, and those in close proximity to each sediment sampling location were found to be of greatest importance. This new software, by integrating recent methodological developments in tracer data processing, guides users through key steps. Critically, by applying multiple model configurations and uncertainty assessment, it delivers more robust solutions for informing catchment management of the sediment problem than many previously used approaches. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
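As a rough illustration of two of the steps such a tool automates, the sketch below groups hypothetical source-area tracer data with k-means and applies a simple range test to screen tracers for a composite fingerprint. This is not SIFT itself, and the tracer values and pass/fail criterion are invented for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical tracer concentrations for source-area soil samples
# (rows = samples, columns = geochemical tracers); real SIFT inputs differ.
rng = np.random.default_rng(0)
tracers = np.vstack([rng.normal(loc, 1.0, size=(20, 4))
                     for loc in (0.0, 3.0, 6.0)])

# k-means grouping of source samples, one of the classification options.
z = StandardScaler().fit_transform(tracers)
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)

# Simple "range test": a target-sediment tracer value should lie within the
# range spanned by the source-group means; otherwise the tracer is treated as
# non-conservative and dropped from the composite fingerprint.
target = tracers.mean(axis=0)
group_means = np.array([tracers[groups == g].mean(axis=0) for g in range(3)])
passes = (target >= group_means.min(axis=0)) & (target <= group_means.max(axis=0))
print(passes)
```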
Functional performance of pyrovalves
NASA Technical Reports Server (NTRS)
Bement, Laurence J.
1996-01-01
Following several flight and ground test failures of spacecraft systems using single-shot, 'normally closed' pyrotechnically actuated valves (pyrovalves), a Government/Industry cooperative program was initiated to assess the functional performance of five qualified designs. The goal of the program was to provide information on the functional performance of pyrovalves to allow users the opportunity to improve procurement requirements. Specific objectives included the demonstration of performance test methods, the measurement of 'blowby' (the passage of gases from the pyrotechnic energy source around the activating piston into the valve's fluid path), and the quantification of functional margins for each design. Experiments were conducted at NASA's Langley Research Center on several units for each of the five valve designs. The test methods used for this program measured the forces and energies required to actuate the valves, as well as the energies and the pressures (where possible) delivered by the pyrotechnic sources. Functional performance ranged widely among the designs. Blowby cannot be prevented by o-ring seals; metal-to-metal seals were effective. Functional margin was determined by dividing the energy delivered by the pyrotechnic sources in excess of that required to accomplish the function by the energy required for that function. Two of the five designs had inadequate functional margins with the pyrotechnic cartridges evaluated.
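The functional-margin definition quoted in the abstract reduces to a one-line calculation; the energy values below are invented for illustration and are not taken from the report.

```python
def functional_margin(energy_delivered_j: float, energy_required_j: float) -> float:
    """Margin as defined in the abstract: excess energy delivered by the
    pyrotechnic source divided by the energy required to actuate the valve."""
    return (energy_delivered_j - energy_required_j) / energy_required_j

# Illustrative numbers only; the report gives design-specific values.
print(functional_margin(energy_delivered_j=30.0, energy_required_j=12.0))  # 1.5
```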
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parkhurst, MaryAnn; Guilmette, Raymond A.
2009-03-01
The Capstone Depleted Uranium (DU) Aerosol Characterization and Risk Assessment Study was conducted to generate data about DU aerosols generated during the perforation of armored combat vehicles with large-caliber DU penetrators, and to apply the data in assessments of human health risks to personnel exposed to these aerosols, primarily through inhalation, during the 1991 Gulf War or in future military operations. The Capstone study consisted of two components: 1) generating, sampling and characterizing DU aerosols by firing at and perforating combat vehicles and 2) applying the source-term quantities and characteristics of the aerosols to the evaluation of doses and risks. This paper reviews the background of the study including the bases for the study, previous reviews of DU particles and health assessments from DU used by the U.S. military, the objectives of the study components, the participants and oversight teams, and the types of exposures it was intended to evaluate. It then discusses exposure scenarios used in the dose and risk assessment and provides an overview of how the field tests and dose and risk assessments were conducted.
Gómez, Aina G; Ondiviela, Bárbara; Puente, Araceli; Juanes, José A
2015-05-15
This work presents a standard and unified procedure for assessment of environmental risks at the contaminant source level in port aquatic systems. Using this method, port managers and local authorities will be able to hierarchically classify environmental hazards and proceed with the most suitable management actions. This procedure combines rigorously selected parameters and indicators to estimate the environmental risk of each contaminant source based on its probability, consequences and vulnerability. The spatio-temporal variability of multiple stressors (agents) and receptors (endpoints) is taken into account to provide accurate estimations for application of precisely defined measures. The developed methodology is tested on a wide range of different scenarios via application in six European ports. The validation process confirms its usefulness, versatility and adaptability as a management tool for port water quality in Europe and worldwide. Copyright © 2015 Elsevier Ltd. All rights reserved.
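A minimal sketch of the kind of source-level ranking such a procedure enables is given below. The product aggregation and the scores are assumptions made for illustration; the published method combines its probability, consequence and vulnerability indicators with its own weighting scheme.

```python
from dataclasses import dataclass

@dataclass
class ContaminantSource:
    name: str
    probability: float    # likelihood of a release (score)
    consequence: float    # severity score for the receiving water body
    vulnerability: float  # sensitivity of the exposed receptors

def risk_score(s: ContaminantSource) -> float:
    # One common aggregation; the published procedure is not necessarily a product.
    return s.probability * s.consequence * s.vulnerability

# Invented example sources for a hypothetical port.
sources = [
    ContaminantSource("shipyard runoff", 0.7, 4.0, 2.0),
    ContaminantSource("bilge discharge", 0.3, 5.0, 3.0),
]
for s in sorted(sources, key=risk_score, reverse=True):
    print(s.name, round(risk_score(s), 2))
```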
Teachers' ratings of the academic performance of children with epilepsy.
Katzenstein, Jennifer M; Fastenau, Philip S; Dunn, David W; Austin, Joan K
2007-05-01
The present study examined how knowledge of a child's seizure condition is related to teachers' assessment of the child's academic ability. Children with epilepsy were divided into two groups based on teachers' awareness of the children's seizure condition (Label). The children's achievement was assessed using the Woodcock Johnson Tests of Achievement-Revised (WJ-R), and the teacher's ratings were obtained from the Child Behavior Checklist Teacher Report Form (TRF) (Source). A 2 (Source) x 2 (Label) mixed-design analysis of covariance (controlling for IQ and how well the teacher knew the child) revealed a significant interaction, F(1,121)=4.22, P=0.04. For the WJ-R there was no effect of Label on Achievement, but on the TRF lower scores were observed for children who were labeled. These results support the hypothesis that some teachers might underestimate the academic abilities of children with epilepsy.
pySeismicDQA: open source post experiment data quality assessment and processing
NASA Astrophysics Data System (ADS)
Polkowski, Marcin
2017-04-01
pySeismicDQA (Seismic Data Quality Assessment) is a Python-based, open-source set of tools dedicated to data processing after passive seismic experiments. The primary goal of this toolset is the unification of data types and formats from different dataloggers, which is necessary for further processing. This process requires additional checks of the data for errors, equipment malfunction, data format errors, abnormal noise levels, etc. In all such cases the user needs to decide (manually or by an automatic threshold) whether the data are removed from the output dataset. Additionally, the output dataset can be visualized in the form of a website with data-availability charts and waveform visualization with an external earthquake catalog. Data processing can be extended with simple STA/LTA event detection. pySeismicDQA is designed and tested for two passive seismic experiments in central Europe: PASSEQ 2006-2008 and "13 BB Star" (2013-2016). The National Science Centre Poland provided financial support for this work via NCN grant DEC-2011/02/A/ST10/00284.
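The STA/LTA detection step mentioned above is a standard ratio of short- to long-term signal energy. The minimal NumPy sketch below is not the pySeismicDQA implementation; the synthetic trace and trigger threshold are invented for illustration.

```python
import numpy as np

def sta_lta(trace: np.ndarray, nsta: int, nlta: int) -> np.ndarray:
    """Classic STA/LTA: ratio of short- to long-term mean signal energy,
    evaluated at the end of each window (first nlta-1 samples left at 0)."""
    energy = np.asarray(trace, dtype=float) ** 2
    sta = np.convolve(energy, np.ones(nsta) / nsta, mode="valid")
    lta = np.convolve(energy, np.ones(nlta) / nlta, mode="valid")
    ratio = np.zeros(energy.size)
    ratio[nlta - 1:] = sta[nlta - nsta:] / np.maximum(lta, 1e-12)
    return ratio

# Synthetic trace: background noise with a burst of higher amplitude ("event").
rng = np.random.default_rng(1)
trace = rng.normal(0, 1, 6000)
trace[3000:3200] *= 8
ratio = sta_lta(trace, nsta=50, nlta=1000)
print(ratio.argmax(), ratio.max() > 3.0)   # trigger when the ratio exceeds a threshold
```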
Chang, Chang-Tang; Chiou, Chyow-Shan
2006-05-01
This study attempts to assess the effectiveness of control strategies for reducing volatile organic compound (VOC) emissions from the polyvinyl chloride (PVC) wallpaper production industry. In Taiwan, methyl ethyl ketone, TOL, and cyclohexanone comprise the major solvent content, accounting for approximately 113,000 t/yr; these solvents are used to avoid excessive viscosity of the plasticizer dioctyl phthalate (DOP) and to improve workability. Emissions of these VOCs from solvents have caused serious odor and air quality problems. In this study, 80 stacks in five factories were tested to evaluate emission characteristics at each VOC source. After examining the VOC concentrations in the flue gases and solvent contents, the VOC emission rates before treatment and from fugitive sources were 93,000 and 800 t/yr, respectively. In this study, the semiwet electrostatic precipitator is recommended for use as cost-effective control equipment.
Potential microbial risk factors related to soil amendments and irrigation water of potato crops.
Selma, M V; Allende, A; López-Gálvez, F; Elizaquível, P; Aznar, R; Gil, M I
2007-12-01
This study assesses the potential microbial risk factors related to the use of soil amendments and irrigation water on potato crops, cultivated in one traditional and two intensive farms during two harvest seasons. The natural microbiota and potentially pathogenic micro-organisms were evaluated in the soil amendment, irrigation water, soil and produce. Uncomposted amendments and residual and creek water samples showed the highest microbial counts. The microbial load of potatoes harvested in spring was similar among the tested farms despite the diverse microbial levels of Listeria spp. and faecal coliforms in the potential risk sources. However, differences in total coliform load of potato were found between farms cultivated in the autumn. Immunochromatographic rapid tests and the BAM's reference method (Bacteriological Analytical Manual; AOAC International) were used to detect Escherichia coli O157:H7 from the potential risk sources and produce. Confirmation of the positive results by polymerase chain reaction procedures showed that the immunochromatographic assay was not reliable as it led to false-positive results. The potentially pathogenic micro-organisms of soil amendment, irrigation water and soil samples changed with the harvest seasons and the use of different agricultural practices. However, the microbial load of the produce was not always influenced by these risk sources. Improvements in environmental sample preparation are needed to avoid interferences in the use of immunochromatographic rapid tests. The potential microbial risk sources of fresh produce should be regularly controlled using reliable detection methods to guarantee their microbial safety.
NASA Astrophysics Data System (ADS)
Hynds, Paul D.; Misstear, Bruce D.; Gill, Laurence W.
2012-12-01
Groundwater quality analyses were carried out on samples from 262 private sources in the Republic of Ireland during the period from April 2008 to November 2010, with microbial quality assessed by thermotolerant coliform (TTC) presence. Assessment of potential microbial contamination risk factors was undertaken at all sources, and local meteorological data were also acquired. Overall, 28.9% of wells tested positive for TTC, with risk analysis indicating that source type (i.e., borehole or hand-dug well), local bedrock type, local subsoil type, groundwater vulnerability, septic tank setback distance, and 48 h antecedent precipitation were all significantly associated with TTC presence (p < 0.05). A number of source-specific design parameters were also significantly associated with bacterial presence. Hierarchical logistic regression with stepwise parameter entry was used to develop a private well susceptibility model, with the final model exhibiting a mean predictive accuracy of >80% (TTC present or absent) when compared to an independent validation data set. Model hierarchies of primary significance are source design (20%), septic tank location (11%), hydrogeological setting (10%), and antecedent 120 h precipitation (2%). Sensitivity analysis shows that the probability of contamination is highly sensitive to septic tank setback distance, with probability increasing linearly with decreases in setback distance. Likewise, contamination probability was shown to increase with increasing antecedent precipitation. Results show that while groundwater vulnerability category is a useful indicator of aquifer susceptibility to contamination, its suitability with regard to source contamination is less clear. The final model illustrates that both localized (well-specific) and generalized (aquifer-specific) contamination mechanisms are involved in contamination events, with localized bypass mechanisms dominant. The susceptibility model developed here could be employed in the appropriate location, design, construction, and operation of private groundwater wells, thereby decreasing the contamination risk, and hence health risk, associated with these sources.
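A minimal sketch of a logistic susceptibility model of this general kind is given below. The predictors, coefficients and data are fabricated solely to illustrate the setback-distance sensitivity described; the study's own hierarchical, stepwise model is not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical predictor matrix for private wells: columns mimic the kinds of
# variables the study reports (septic tank setback distance in m, 48 h
# antecedent rainfall in mm, 1 = hand-dug well / 0 = borehole), not its data.
rng = np.random.default_rng(7)
n = 300
setback = rng.uniform(5, 100, n)
rain48 = rng.gamma(2.0, 8.0, n)
hand_dug = rng.integers(0, 2, n)
logit = 1.5 - 0.04 * setback + 0.05 * rain48 + 0.8 * hand_dug
ttc_present = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([setback, rain48, hand_dug])
model = LogisticRegression(max_iter=1000).fit(X, ttc_present)

# Predicted contamination probability rises as setback distance shrinks,
# consistent with the sensitivity result described in the abstract.
print(model.predict_proba([[10.0, 20.0, 1]])[0, 1],
      model.predict_proba([[80.0, 20.0, 1]])[0, 1])
```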
Final safety analysis report for the Galileo Mission: Volume 2: Book 1, Accident model document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The Accident Model Document (AMD) is the second volume of the three-volume Final Safety Analysis Report (FSAR) for the Galileo outer planetary space science mission. This mission employs Radioisotope Thermoelectric Generators (RTGs) as the prime electrical power sources for the spacecraft. Galileo will be launched into Earth orbit using the Space Shuttle and will use the Inertial Upper Stage (IUS) booster to place the spacecraft into an Earth escape trajectory. The RTGs employ silicon-germanium thermoelectric couples to produce electricity from the heat energy that results from the decay of the radioisotope fuel, Plutonium-238, used in the RTG heat source. The heat source configuration used in the RTGs is termed General Purpose Heat Source (GPHS), and the RTGs are designated GPHS-RTGs. The use of radioactive material in these missions necessitates evaluations of the radiological risks that may be encountered by launch complex personnel as well as by the Earth's general population resulting from postulated malfunctions or failures occurring in the mission operations. The FSAR presents the results of a rigorous safety assessment, including substantial analyses and testing, of the launch and deployment of the RTGs for the Galileo mission. This AMD is a summary of the potential accident and failure sequences which might result in fuel release, the analysis and testing methods employed, and the predicted source terms. Each source term consists of a quantity of fuel released, the location of release and the physical characteristics of the fuel released. Each source term has an associated probability of occurrence. 27 figs., 11 tabs.
Individual differences in spontaneous analogical transfer.
Kubricht, James R; Lu, Hongjing; Holyoak, Keith J
2017-05-01
Research on analogical problem solving has shown that people often fail to spontaneously notice the relevance of a semantically remote source analog when solving a target problem, although they are able to form mappings and derive inferences when given a hint to recall the source. Relatively little work has investigated possible individual differences that predict spontaneous transfer, or how such differences may interact with interventions that facilitate transfer. In this study, fluid intelligence was measured for participants in an analogical problem-solving task, using an abridged version of the Raven's Progressive Matrices (RPM) test. In two experiments, we systematically compared the effect of augmenting verbal descriptions of the source with animations or static diagrams. Solution rates to Duncker's radiation problem were measured across varying source presentation conditions, and participants' understanding of the relevant source material was assessed. The pattern of transfer was best fit by a moderated mediation model: the positive impact of fluid intelligence on spontaneous transfer was mediated by its influence on source comprehension; however, this path was in turn modulated by provision of a supplemental animation via its influence on comprehension of the source. Animated source depictions were most beneficial in facilitating spontaneous transfer for those participants with low scores on the fluid intelligence measure.
Annoyance from industrial noise: indicators for a wide variety of industrial sources.
Alayrac, M; Marquis-Favre, C; Viollon, S; Morel, J; Le Nost, G
2010-09-01
In the study of noise generated by industrial sources, one issue is the variety of industrial noise sources and, consequently, the complexity of the noises generated. Characterizing the environmental impact of an industrial plant therefore requires a better understanding of the noise annoyance caused by industrial noise sources. To deal with the variety of industrial sources, the proposed approach is organized by type of spectral feature and is based on a perceptual typology of steady and permanent industrial noises comprising six categories. For each perceptual category, listening tests based on acoustical factors are performed on noise annoyance. Various indicators are necessary to predict noise annoyance due to the various industrial noise sources. Depending on the spectral features of the industrial noise sources, noise annoyance indicators are thus assessed. In the case of industrial noise sources without main spectral features, such as broadband noise, noise annoyance is predicted by the A-weighted sound pressure level L(Aeq) or the loudness level L(N). For industrial noises with spectral components, such as low-frequency noises with a main component at 100 Hz or noises with spectral components in the middle frequencies, indicators are proposed here that allow good prediction of noise annoyance by taking spectral features into account.
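For sources without dominant spectral features, the cited indicator is the equivalent continuous A-weighted level, which is an energy average of short-term A-weighted levels. The sketch below shows only that averaging with invented level values; the loudness level L(N) requires a psychoacoustic model (e.g. per ISO 532) and is not shown.

```python
import numpy as np

def laeq(levels_dba: np.ndarray) -> float:
    """Equivalent continuous A-weighted level from a series of short-term
    A-weighted levels (dBA), via energy averaging."""
    return 10.0 * np.log10(np.mean(10.0 ** (np.asarray(levels_dba) / 10.0)))

# Illustrative 1-s levels near an industrial source (values are made up).
print(round(laeq(np.array([52.0, 55.0, 61.0, 58.0, 54.0])), 1))
```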
Assessing biodiversity on the farm scale as basis for ecosystem service payments.
von Haaren, Christina; Kempa, Daniela; Vogel, Katrin; Rüter, Stefan
2012-12-30
Ecosystem services payments must be based on a standardised transparent assessment of the goods and services provided. This is especially relevant in the context of EU agri-environmental programs, but also for organic-food companies that foster environmental services on their contractor farms. Addressing the farm scale is important because land users/owners are major recipients of payments and they could be more involved in data generation and conservation management. A standardised system for measuring on-farm biodiversity does not yet exist that concentrates on performance indicators and includes farmers in generating information. A method is required that produces ordinal or metric scaled assessment results as well as management measures. Another requirement is the ease of application, which includes the ease of gathering input data and understandability. In order to respond to this need, we developed a method which is designed for automated application in an open source farm assessment system named MANUELA. The method produces an ordinal scale assessment of biodiversity that includes biotopes, species, biotope connectivity and the influence of land use. In addition, specific measures for biotope types are proposed. The open source geographical information system OpenJump is used for the implementation of MANUELA. The results of the trial applications and robustness tests show that the assessment can be implemented, for the most part, using existing information as well as data available from farmers or advisors. The results are more sensitive for showing on-farm achievements and changes than existing biotope-type classifications. Such a differentiated classification is needed as a basis for ecosystem service payments and for designing effective measures. The robustness of the results with respect to biotope connectivity is comparable to that of complex models, but it should be further improved. Interviews with the test farmers substantiate that the assessment methods can be implemented on farms and they are understood by farmers. Copyright © 2012 Elsevier Ltd. All rights reserved.
[Regional atmospheric environment risk source identification and assessment].
Zhang, Xiao-Chun; Chen, Wei-Ping; Ma, Chun; Zhan, Shui-Fen; Jiao, Wen-Tao
2012-12-01
Identification and assessment of atmospheric environment risk sources play an important role in regional atmospheric risk assessment and in regional atmospheric pollution prevention and control. The likelihood, exposure and consequence assessment method (LEC method) and the Delphi method were employed to build a fast and effective method for the identification and assessment of regional atmospheric environment risk sources. This method was applied to the case study of a large coal transportation port in North China. The assessment results showed that the risk characteristics and the degree of harm of the regional atmospheric environment risk sources were in line with the actual situation. Fast and effective identification and assessment of risk sources lays an important foundation for regional atmospheric environmental risk assessment and regional atmospheric pollution prevention and control.
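The LEC score itself is a simple product of ordinal ratings. The sketch below shows only that core calculation with invented ratings; it does not reproduce the Delphi weighting or the risk banding used in the study.

```python
def lec_score(likelihood: float, exposure: float, consequence: float) -> float:
    """LEC-style score: risk D = L * E * C, where L is the likelihood of the
    hazardous event, E the frequency of exposure and C the expected
    consequence, each rated on an ordinal scale."""
    return likelihood * exposure * consequence

# Illustrative ratings for two hypothetical atmospheric risk sources at a coal
# port; the thresholds for "low/moderate/high" bands depend on the scheme adopted.
for name, (l, e, c) in {"coal dust stockpile": (6, 6, 15),
                        "fuel storage tank": (1, 3, 40)}.items():
    print(name, lec_score(l, e, c))
```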
Morris, Jamae Fontain; Murphy, Jennifer; Fagerli, Kirsten; Schneeberger, Chandra; Jaron, Peter; Moke, Fenny; Juma, Jane; Ochieng, J Ben; Omore, Richard; Roellig, Dawn; Xiao, Lihua; Priest, Jeffrey W; Narayanan, Jothikumar; Montgomery, Joel; Hill, Vince; Mintz, Eric; Ayers, Tracy L; O'Reilly, Ciara E
2018-04-02
Cryptosporidium is a leading cause of diarrhea among Kenyan infants. Ceramic water filters (CWFs) are used for household water treatment. We assessed the impact of CWFs on diarrhea, cryptosporidiosis prevention, and water quality in rural western Kenya. A randomized, controlled intervention trial was conducted in 240 households with infants 4-10 months old. Twenty-six weekly household surveys assessed infant diarrhea and health facility visits. Stool specimens from infants with diarrhea were examined for Cryptosporidium. Source water, filtered water, and filter retentate were tested for Cryptosporidium and/or microbial indicators. To estimate the effect of CWFs on health outcomes, logistic regression models using generalized estimating equations were performed; odds ratios (ORs) and 95% confidence intervals (CIs) are reported. Households reported using surface water (36%), public taps (29%), or rainwater (17%) as their primary drinking water sources, with no differences in treatment groups. Intervention households reported less diarrhea (7.6% versus 8.9%; OR: 0.86 [0.64-1.16]) and significantly fewer health facility visits for diarrhea (1.0% versus 1.9%; OR: 0.50 [0.30-0.83]). In total, 15% of intervention and 12% of control stools yielded Cryptosporidium (P = 0.26). Escherichia coli was detected in 93% of source water samples; 71% of filtered water samples met World Health Organization recommendations of <1 E. coli/100 mL. Cryptosporidium was not detected in source water and was detected in just 2% of filter rinses following passage of large volumes of source water. Water quality was improved among CWF users; however, the short study duration and small sample size limited our ability to observe reductions in cryptosporidiosis.
Grouping and Read-Across Approaches for Risk Assessment of Nanomaterials.
Oomen, Agnes G; Bleeker, Eric A J; Bos, Peter M J; van Broekhuizen, Fleur; Gottardo, Stefania; Groenewold, Monique; Hristozov, Danail; Hund-Rinke, Kerstin; Irfan, Muhammad-Adeel; Marcomini, Antonio; Peijnenburg, Willie J G M; Rasmussen, Kirsten; Jiménez, Araceli Sánchez; Scott-Fordsmand, Janeck J; van Tongeren, Martie; Wiench, Karin; Wohlleben, Wendel; Landsiedel, Robert
2015-10-26
Physicochemical properties of chemicals affect their exposure, toxicokinetics/fate and hazard, and for nanomaterials, the variation of these properties results in a wide variety of materials with potentially different risks. To limit the amount of testing for risk assessment, the information gathering process for nanomaterials needs to be efficient. At the same time, sufficient information to assess the safety of human health and the environment should be available for each nanomaterial. Grouping and read-across approaches can be utilised to meet these goals. This article presents different possible applications of grouping and read-across for nanomaterials within the broader perspective of the MARINA Risk Assessment Strategy (RAS), as developed in the EU FP7 project MARINA. Firstly, nanomaterials can be grouped based on limited variation in physicochemical properties to subsequently design an efficient testing strategy that covers the entire group. Secondly, knowledge about exposure, toxicokinetics/fate or hazard, for example via properties such as dissolution rate, aspect ratio, chemical (non-)activity, can be used to organise similar materials in generic groups to frame issues that need further attention, or potentially to read-across. Thirdly, when data related to specific endpoints is required, read-across can be considered, using data from a source material for the target nanomaterial. Read-across could be based on a scientifically sound justification that exposure, distribution to the target (fate/toxicokinetics) and hazard of the target material are similar to, or less than, the source material. These grouping and read-across approaches pave the way for better use of available information on nanomaterials and are flexible enough to allow future adaptations related to scientific developments.
Fine-grained suspended sediment source identification for the Kharaa River basin, northern Mongolia
NASA Astrophysics Data System (ADS)
Rode, Michael; Theuring, Philipp; Collins, Adrian L.
2015-04-01
Fine sediment inputs into river systems can be a major source of nutrients and heavy metals and have a strong impact on the water quality and ecosystem functions of rivers and lakes, including those in semiarid regions. However, little is known to date about the spatial distribution of sediment sources in most large scale river basins in Central Asia. Accordingly, a sediment source fingerprinting technique was used to assess the spatial sources of fine-grained (<10 microns) sediment in the 15 000 km2 Kharaa River basin in northern Mongolia. Five field sampling campaigns in late summer 2009, and spring and late summer in both 2010 and 2011, were conducted directly after high water flows, to collect an overall total of 900 sediment samples. The work used a statistical approach for sediment source discrimination with geochemical composite fingerprints based on a new Genetic Algorithm (GA)-driven Discriminant Function Analysis, the Kruskal-Wallis H-test and Principal Component Analysis. The composite fingerprints were subsequently used for numerical mass balance modelling with uncertainty analysis. The contributions of the individual sub-catchment spatial sediment sources varied from 6.4% (the headwater sub-catchment of Sugnugur Gol) to 36.2% (the Kharaa II sub-catchment in the middle reaches of the study basin) with the pattern generally showing higher contributions from the sub-catchments in the middle, rather than the upstream, portions of the study area. The importance of riverbank erosion was shown to increase from upstream to midstream tributaries. The source tracing procedure provides results in reasonable accordance with previous findings in the study region and demonstrates the general applicability and associated uncertainties of an approach for fine-grained sediment source investigation in large scale semi-arid catchments. The combined application of source fingerprinting and catchment modelling approaches can be used to assess whether tracing estimates are credible and in combination such approaches provide a basis for making sediment source apportionment more compelling to catchment stakeholders and managers.
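The numerical mass-balance step amounts to estimating non-negative source proportions that sum to one and reproduce the target-sediment tracer concentrations. The sketch below uses a weighted-row NNLS trick with invented tracer values; it omits the study's composite-fingerprint selection and uncertainty analysis.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical source-group mean tracer concentrations (columns = sources,
# rows = tracers in a composite fingerprint) and one target-sediment sample.
A = np.array([[12.0,  4.0,  8.0],
              [ 3.0,  9.0,  5.0],
              [ 0.5,  2.0,  1.2],
              [40.0, 15.0, 25.0]])
y = np.array([8.8, 5.5, 1.2, 26.5])

# Mass-balance unmixing: find non-negative source proportions p with sum(p) = 1
# minimising ||A p - y||. The sum-to-one constraint is enforced approximately
# by appending a heavily weighted row of ones (a common trick with NNLS).
w = 1e3
A_aug = np.vstack([A, w * np.ones(A.shape[1])])
y_aug = np.append(y, w)
p, _ = nnls(A_aug, y_aug)
print(np.round(p, 3), round(p.sum(), 3))
```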
NASA Astrophysics Data System (ADS)
Milej, Daniel; Janusek, Dariusz; Gerega, Anna; Wojtkiewicz, Stanislaw; Sawosz, Piotr; Treszczanowicz, Joanna; Weigl, Wojciech; Liebert, Adam
2015-10-01
The aim of the study was to determine optimal measurement conditions for assessment of brain perfusion with the use of an optical contrast agent and time-resolved diffuse reflectometry in the near-infrared wavelength range. The source-detector separation at which the distribution of times of flight (DTOF) of photons provided useful information on the inflow of the contrast agent to the intracerebral brain tissue compartments was determined. A series of Monte Carlo simulations was performed in which the inflow and washout of the dye in extra- and intracerebral tissue compartments were modeled and the DTOFs were obtained at different source-detector separations. Furthermore, tests on diffuse phantoms were carried out using a time-resolved setup allowing the measurement of DTOFs at 16 source-detector separations. Finally, the setup was applied in experiments carried out on the heads of adult volunteers during intravenous injection of indocyanine green. Analysis of statistical moments of the measured DTOFs showed that a source-detector separation of 6 cm is recommended for monitoring the inflow of an optical contrast agent to the intracerebral brain tissue compartments with the use of continuous wave reflectometry, whereas a separation of 4 cm is enough when the higher-order moments of the DTOFs are available.
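The analysis relies on statistical moments of the measured DTOFs. A minimal sketch of computing the total photon count, mean time of flight and variance from a binned DTOF is given below, with an invented Gaussian-shaped distribution standing in for measured data.

```python
import numpy as np

def dtof_moments(t: np.ndarray, counts: np.ndarray):
    """Total photon count, mean time of flight and variance of a measured
    distribution of times of flight (DTOF), the statistical moments commonly
    analysed in time-resolved near-infrared measurements."""
    n_tot = counts.sum()
    mean_t = np.sum(t * counts) / n_tot
    var_t = np.sum(((t - mean_t) ** 2) * counts) / n_tot
    return n_tot, mean_t, var_t

# Illustrative DTOF: photon counts in 10 ps time bins (not measured data).
t = np.arange(0, 4000, 10) * 1e-12                       # seconds
counts = np.exp(-((t - 1.2e-9) / 0.4e-9) ** 2) * 1e4
print(dtof_moments(t, counts))
```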
Estimation of wear in total hip replacement using a ten station hip simulator.
Brummitt, K; Hardaker, C S
1996-01-01
The results of hip simulator tests on a total of 16 total hip joints, all of them 22.25 mm Charnley designs, are presented. Wear at up to 6.75 million cycles was assessed by using a coordinate measuring machine. The results gave good agreement with clinical estimates of wear rate on the same design of joint replacement from a number of sources. Good agreement was also obtained when comparison was made with the published results from more sophisticated simulators. The major source of variation in the results was found to occur in the first million cycles where creep predominates. The results of this study support the use of this type of simplified simulator for estimating wear in a total hip prosthesis. The capability to test a significant number of joints simultaneously may make this mechanism preferable to more complex machines in many cases.
Moore, Amy Lawson; Miller, Terissa M
2018-01-01
The purpose of the current study is to evaluate the validity and reliability of the revised Gibson Test of Cognitive Skills, a computer-based battery of tests measuring short-term memory, long-term memory, processing speed, logic and reasoning, visual processing, as well as auditory processing and word attack skills. This study included 2,737 participants aged 5-85 years. A series of studies was conducted to examine the validity and reliability using the test performance of the entire norming group and several subgroups. The evaluation of the technical properties of the test battery included content validation by subject matter experts, item analysis and coefficient alpha, test-retest reliability, split-half reliability, and analysis of concurrent validity with the Woodcock Johnson III Tests of Cognitive Abilities and Tests of Achievement. Results indicated strong sources of evidence of validity and reliability for the test, including internal consistency reliability coefficients ranging from 0.87 to 0.98, test-retest reliability coefficients ranging from 0.69 to 0.91, split-half reliability coefficients ranging from 0.87 to 0.91, and concurrent validity coefficients ranging from 0.53 to 0.93. The Gibson Test of Cognitive Skills-2 is a reliable and valid tool for assessing cognition in the general population across the lifespan.
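Internal-consistency reliability of the kind reported is commonly summarised with coefficient (Cronbach's) alpha. The sketch below uses synthetic item scores driven by a common factor; it does not use the Gibson Test data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Coefficient alpha for a matrix of item scores
    (rows = examinees, columns = items)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# Synthetic item scores for illustration only.
rng = np.random.default_rng(3)
ability = rng.normal(0, 1, (500, 1))
scores = ability + rng.normal(0, 0.5, (500, 8))
print(round(cronbach_alpha(scores), 2))
```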
Survey methods for assessing land cover map accuracy
Nusser, S.M.; Klaas, E.E.
2003-01-01
The increasing availability of digital photographic materials has fueled efforts by agencies and organizations to generate land cover maps for states, regions, and the United States as a whole. Regardless of the information sources and classification methods used, land cover maps are subject to numerous sources of error. In order to understand the quality of the information contained in these maps, it is desirable to generate statistically valid estimates of accuracy rates describing misclassification errors. We explored a full sample survey framework for creating accuracy assessment study designs that balance statistical and operational considerations in relation to study objectives for a regional assessment of GAP land cover maps. We focused not only on appropriate sample designs and estimation approaches, but also on aspects of the data collection process, such as gaining cooperation of land owners and using pixel clusters as an observation unit. The approach was tested in a pilot study to assess the accuracy of Iowa GAP land cover maps. A stratified two-stage cluster sampling design addressed sample size requirements for land covers and the need for geographic spread while minimizing operational effort. Recruitment methods used for private land owners yielded high response rates, minimizing a source of nonresponse error. Collecting data for a 9-pixel cluster centered on the sampled pixel was simple to implement, and provided better information on rarer vegetation classes as well as substantial gains in precision relative to observing data at a single pixel.
Validation of a Smartphone-Based Approach to In Situ Cognitive Fatigue Assessment
Linden, Mark
2017-01-01
Background Acquired Brain Injuries (ABIs) can result in multiple detrimental cognitive effects, such as reduced memory capability, concentration, and planning. These effects can lead to cognitive fatigue, which can exacerbate the symptoms of ABIs and hinder management and recovery. Assessing cognitive fatigue is difficult due to the largely subjective nature of the condition and existing assessment approaches. Traditional methods of assessment use self-assessment questionnaires delivered in a medical setting, but recent work has attempted to employ more objective cognitive tests as a way of evaluating cognitive fatigue. However, these tests are still predominantly delivered within a medical environment, limiting their utility and efficacy. Objective The aim of this research was to investigate how cognitive fatigue can be accurately assessed in situ, during the quotidian activities of life. It was hypothesized that this assessment could be achieved through the use of mobile assistive technology to assess working memory, sustained attention, information processing speed, reaction time, and cognitive throughput. Methods The study used a bespoke smartphone app to track daily cognitive performance, in order to assess potential levels of cognitive fatigue. Twenty-one participants with no prior reported brain injuries took part in a two-week study, resulting in 81 individual testing instances being collected. The smartphone app delivered three cognitive tests on a daily basis: (1) Spatial Span to measure visuospatial working memory; (2) Psychomotor Vigilance Task (PVT) to measure sustained attention, information processing speed, and reaction time; and (3) a Mental Arithmetic Test to measure cognitive throughput. A smartphone-optimized version of the Mental Fatigue Scale (MFS) self-assessment questionnaire was used as a baseline to assess the validity of the three cognitive tests, as the questionnaire has already been validated in multiple peer-reviewed studies. Results The most highly correlated results were from the PVT, which showed a positive correlation with those from the prevalidated MFS, measuring 0.342 (P<.008). Scores from the cognitive tests were entered into a regression model and showed that only reaction time in the PVT was a significant predictor of fatigue (P=.016, F=2.682, 95% CI 9.0-84.2). Higher scores on the MFS were related to increases in reaction time during our mobile variant of the PVT. Conclusions The results show that the PVT mobile cognitive test developed for this study could be used as a valid and reliable method for measuring cognitive fatigue in situ. This test would remove the subjectivity associated with established self-assessment approaches and the need for assessments to be performed in a medical setting. Based on our findings, future work could explore delivering a small set of tests with increased duration to further improve measurement reliability. Moreover, as the smartphone assessment tool can be used as part of everyday life, additional sources of data relating to physiological, psychological, and environmental context could be included within the analysis to improve the nature and precision of the assessment process. PMID:28818818
Effects of the source of social comparison information on former cancer patients' quality of life.
Brakel, Thecla M; Dijkstra, Arie; Buunk, Abraham P
2012-11-01
Life, following curative treatment, can be a struggle for former cancer patients. In this phase of their illness, social comparison information may help to improve a patient's quality of life (QOL). The objective of this study was to determine whether the effects of this information depend on the following two variables: (1) the individual's physical health and (2) the individual's sensitivity to social comparison. In the current study, the effects on a patient's QOL were tested that occur when they are listening to a psychological oncological expert talking about cancer patients' experiences. Three different recorded interviews with experts were compared (on negative emotions, effective coping, or both), and individual differences were tested as moderators. In addition, the expert source conditions were compared with a condition in which the source was not an expert but a former patient. In a randomized field experiment, 154 Dutch former cancer patients (M(age) = 55 years; 68% women) were assigned to one of the four conditions (three expert source and one former patient source condition). QOL was assessed after 2 months. The effects of the expert source conditions on QOL depended on the participants' physical health (good vs. poor) and on the participants' sensitivity to social comparison (whether the recipient reacts with contrast or identification), as indicated by significant three-way interactions (p < .001). Depending on these two variables, one of the three expert source conditions was at least as effective as the former patient source condition. The results show that desired and undesired effects are found when individual differences relevant to the processing of intervention information are examined. ©2012 The British Psychological Society.
Hemispherical breathing mode speaker using a dielectric elastomer actuator.
Hosoya, Naoki; Baba, Shun; Maeda, Shingo
2015-10-01
Although indoor acoustic characteristics should ideally be assessed by measuring the reverberation time using a point sound source, a regular polyhedron loudspeaker, which has multiple loudspeakers on a chassis, is typically used. However, such a configuration is not a point sound source if the size of the loudspeaker is large relative to the target sound field. This study investigates a small lightweight loudspeaker using a dielectric elastomer actuator vibrating in the breathing mode (the pulsating mode such as the expansion and contraction of a balloon). Acoustic testing with regard to repeatability, sound pressure, vibration mode profiles, and acoustic radiation patterns indicate that dielectric elastomer loudspeakers may be feasible.
Cerebral correlates of faking: evidence from a brief implicit association test on doping attitudes.
Schindler, Sebastian; Wolff, Wanja; Kissler, Johanna M; Brand, Ralf
2015-01-01
Direct assessment of attitudes toward socially sensitive topics can be affected by deception attempts. Reaction-time based indirect measures, such as the Implicit Association Test (IAT), are less susceptible to such biases. Neuroscientific evidence shows that deception can evoke characteristic ERP differences. However, the cerebral processes involved in faking an IAT are still unknown. We randomly assigned 20 university students (15 females, 24.65 ± 3.50 years of age) to a counterbalanced repeated-measurements design, requesting them to complete a Brief-IAT (BIAT) on attitudes toward doping without deception instruction, and with the instruction to fake positive and negative doping attitudes. Cerebral activity during BIAT completion was assessed using high-density EEG. Event-related potentials during faking revealed enhanced frontal and reduced occipital negativity, starting around 150 ms after stimulus presentation. Further, a decrease in the P300 and LPP components was observed. Source analyses showed enhanced activity in the right inferior frontal gyrus between 150 and 200 ms during faking, thought to reflect the suppression of automatic responses. Further, more activity was found for faking in the bilateral middle occipital gyri and the bilateral temporoparietal junction. Results indicate that faking reaction-time based tests alter brain processes from early stages of processing and reveal the cortical sources of the effects. Analyzing the EEG helps to uncover response patterns in indirect attitude tests and broadens our understanding of the neural processes involved in such faking. This knowledge might be useful for uncovering faking in socially sensitive contexts, where attitudes are likely to be concealed.
NASA Astrophysics Data System (ADS)
Winiarek, Victor; Vira, Julius; Bocquet, Marc; Sofiev, Mikhail; Saunier, Olivier
2011-06-01
In the event of an accidental atmospheric release of radionuclides from a nuclear power plant, accurate real-time forecasting of the activity concentrations of radionuclides is required by decision makers for the preparation of adequate countermeasures. The accuracy of the forecast plume is highly dependent on the source term estimation. On several academic test cases, including some with real data, inverse modelling and data assimilation techniques have been shown to help in the assessment of the source term. In this paper, a semi-automatic method is proposed for the sequential reconstruction of the plume, by implementing a sequential data assimilation algorithm based on inverse modelling, taking care to develop realistic methods for operational risk agencies. The performance of the assimilation scheme has been assessed through an intercomparison between French and Finnish frameworks. Two dispersion models have been used: Polair3D and Silam, developed in two different research centres. Different release locations, as well as different meteorological situations, are tested. The existing and newly planned surveillance networks are used and realistically large multiplicative observational errors are assumed. The inverse modelling scheme accounts for the strong error bias encountered with such errors. The efficiency of the data assimilation system is tested via statistical indicators. For France and Finland, the average performance of the data assimilation system is strong. However, there are outlying situations where the inversion fails because the observability is too poor. In addition, in the case where the power plant responsible for the accidental release is not known, robust statistical tools are developed and tested to discriminate candidate release sites.
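At its core, source-term estimation of this kind inverts a linear source-receptor relationship under positivity and regularisation constraints. The sketch below is a deliberately simplified Tikhonov/NNLS stand-in with a fabricated source-receptor matrix; it is not the Bayesian sequential scheme, nor the Polair3D or SILAM operators, used in the paper.

```python
import numpy as np
from scipy.optimize import nnls

# Toy source-term inversion: observations y are activity concentrations at
# monitoring stations, H maps the release rate in each time step to those
# observations (in practice H comes from a dispersion model; here it is a
# made-up matrix), and s >= 0 is the release profile to recover.
rng = np.random.default_rng(5)
n_obs, n_steps = 40, 10
H = np.abs(rng.normal(0, 1, (n_obs, n_steps)))
s_true = np.zeros(n_steps)
s_true[3:6] = [2.0, 5.0, 1.0]                       # hypothetical release pulse
y = H @ s_true * np.exp(rng.normal(0, 0.3, n_obs))  # multiplicative obs. error

# Tikhonov-regularised non-negative inversion (a simplification of the
# paper's inverse-modelling scheme).
lam = 0.5
H_aug = np.vstack([H, lam * np.eye(n_steps)])
y_aug = np.concatenate([y, np.zeros(n_steps)])
s_hat, _ = nnls(H_aug, y_aug)
print(np.round(s_hat, 2))
```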
Sapsirisavat, Vorapot; Vongsutilers, Vorasit; Thammajaruk, Narukjaporn; Pussadee, Kanitta; Riyaten, Prakit; Kerr, Stephen; Avihingsanon, Anchalee; Phanuphak, Praphan; Ruxrungtham, Kiat
2016-01-01
Ensuring that medicines meet quality standards is mandatory for safety and efficacy. There have been occasional reports of substandard generic medicines, especially in resource-limited settings where policies to control quality may be less rigorous. As HIV treatment in Thailand depends mostly on affordable generic antiretrovirals (ARV), we performed quality assurance testing of several generic ARV available from different sources in Thailand and from a source in Vietnam. We sampled Tenofovir 300mg, Efavirenz 600mg and Lopinavir/ritonavir 200/50mg from 10 primary hospitals randomly selected from those participating in the National AIDS Program, 2 non-government organization ARV clinics, and 3 private drug stores. The quality of the ARV was analyzed by blinded investigators at the Faculty of Pharmaceutical Science, Chulalongkorn University. The analysis included an identification test for drug molecules, a chemical composition assay to quantitate the active ingredients, a uniformity of mass test and a dissolution test to assess in-vitro drug release. Comparisons were made against the standards described in the WHO International Pharmacopeia. A total of 42 batches of ARV from 15 sources were sampled from January-March 2015. Among those generics, 23, 17, 1, and 1 were Thai-made, Indian-made, Vietnamese-made and Chinese-made, respectively. All sampled products, regardless of manufacturer or source, met the International Pharmacopeia standards for the composition assay, mass uniformity and dissolution. Although local regulations restrict ARV supply to hospitals and clinics, samples of ARV could be bought from private drug stores even without a formal prescription. Sampled generic ARVs distributed within Thailand and from 1 Vietnamese pharmacy showed consistent quality. However, some products were illegally supplied without prescription, highlighting the importance of dispensing ARV for treatment or prevention in facilities where continuity along the HIV treatment and care cascade is available.
Sreeramareddy, Chandrashekhar T; Shankar, Pathiyil R; Binu, VS; Mukhopadhyay, Chiranjoy; Ray, Biswabina; Menezes, Ritesh G
2007-01-01
Background In recent years there has been a growing appreciation of the issues of quality of life and the stresses involved in medical training, as these may affect students' learning and academic performance. However, such studies are lacking in medical schools of Nepal. Therefore, we carried out this study to assess the prevalence of psychological morbidity, the sources and severity of stress, and the coping strategies among medical students in our integrated problem-stimulated undergraduate medical curriculum. Methods A cross-sectional, questionnaire-based survey was carried out among the undergraduate medical students of Manipal College of Medical Sciences, Pokhara, Nepal during the period August 2005 to December 2006. Psychological morbidity was assessed using the General Health Questionnaire. A 24-item questionnaire was used to assess sources of stress and their severity. The coping strategies adopted were assessed using the brief COPE inventory. Results The overall response rate was 75.8% (407 out of 525 students). The overall prevalence of psychological morbidity was 20.9% and was higher among students of basic sciences, of Indian nationality and whose parents were medical doctors. By logistic regression analysis, GHQ-caseness was associated with the occurrence of academic and health-related stressors. The most common sources of stress were related to academic and psychosocial concerns. The most important and severe sources of stress were staying in the hostel, high parental expectations, the vastness of the syllabus, tests/exams, and lack of time and facilities for entertainment. The students generally used active coping strategies, and alcohol/drug use was the least used coping strategy. The coping strategies commonly used by students in our institution were positive reframing, planning, acceptance, active coping, self-distraction and emotional support. The coping strategies showed variation by GHQ-caseness, year of study, gender and parents' occupation. Conclusion The higher level of psychological morbidity warrants interventions such as social and psychological support to improve the quality of life for these medical students. Student advisors and counselors may train students in stress management. There is also a need to bring about academic changes in the quality of teaching and the evaluation system. A prospective study is necessary to study the association of psychological morbidity with demographic variables, sources of stress and coping strategies. PMID:17678553
Najmeddin, Ali; Keshavarzi, Behnam; Moore, Farid; Lahijanzadeh, Ahmadreza
2017-10-28
This study investigates the occurrence and spatial distribution of potentially toxic elements (PTEs) (Hg, Cd, Cu, Mo, Pb, Zn, Ni, Co, Cr, Al, Fe, Mn, V and Sb) in 67 road dust samples collected from urban industrial areas in Ahvaz megacity, southwest Iran. Geochemical methods, multivariate statistics, geostatistics and a health risk assessment model were adopted to study the spatial pollution pattern and to identify the priority pollutants, regions of concern and sources of the studied PTEs. A receptor positive matrix factorization model was also employed to assess pollution sources. Compared to the local background, the median enrichment factor values revealed the following order: Sb > Pb > Hg > Zn > Cu > V > Fe > Mo > Cd > Mn > Cr ≈ Co ≈ Al ≈ Ni. Statistical results show that a significant difference exists between the concentrations of Mo, Cu, Pb, Zn, Fe, Sb, V and Hg in different regions (univariate analysis, Kruskal-Wallis test p < 0.05), indicating the existence of highly contaminated spots. Integrated source identification coupled with the positive matrix factorization model revealed that traffic-related emissions (43.5%) and steel industries (26.4%) were the first two sources of PTEs in road dust, followed by natural sources (22.6%) and pipe and oil processing companies (7.5%). The arithmetic mean of the pollution load index (PLI) values for the high-traffic sector (1.92) is greater than for the industrial (1.80) and residential (1.25) areas. The results also show that the ecological risk values for Hg and Pb in 41.8 and 9% of the total dust samples are higher than 80, indicating considerable or higher potential ecological risk. The health risk assessment model showed that ingestion of dust particles contributed more than 83% of the overall non-carcinogenic risk. For both residential and industrial scenarios, Hg and Pb had the highest risk values, whereas Mo had the lowest value.
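The enrichment factor and pollution load index quoted above are simple ratio statistics. The sketch below uses invented concentrations; the study's medians, reference element and background values are not reproduced here.

```python
import numpy as np

def enrichment_factor(c_sample, ref_sample, c_background, ref_background):
    """EF of an element, normalised to a conservative reference element
    (Al or Fe are common choices) and to local background values."""
    return (c_sample / ref_sample) / (c_background / ref_background)

def pollution_load_index(sample, background):
    """PLI = geometric mean of the contamination factors C_sample / C_background."""
    cf = np.asarray(sample) / np.asarray(background)
    return float(np.exp(np.mean(np.log(cf))))

# Illustrative concentrations (mg/kg), not the study's values.
print(round(enrichment_factor(c_sample=85.0, ref_sample=4.2e4,
                              c_background=20.0, ref_background=5.6e4), 2))
print(round(pollution_load_index([85.0, 1.1, 140.0], [20.0, 0.3, 60.0]), 2))
```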
Philip Radtke; David Walker; Jereme Frank; Aaron Weiskittel; Clara DeYoung; David MacFarlane; Grant Domke; Christopher Woodall; John Coulston; James Westfall
2017-01-01
Accurate estimation of forest biomass and carbon stocks at regional to national scales is a key requirement in determining terrestrial carbon sources and sinks on United States (US) forest lands. To that end, comprehensive assessment and testing of alternative volume and biomass models were conducted for individual tree models employed in the component ratio method (...
ERIC Educational Resources Information Center
Lang, William Steve; And Others
The effects of the use of computer-enhanced instruction with remedial students were assessed, using 4,293 ninth through twelfth graders--3,308 Black, 957 White, and 28 Other--involved in the Governor's Remediation Initiative (GRI) in Georgia. Data sources included the Comprehensive Tests of Basic Skills (CTBS), a data collection form developed for…
Operational Based Vision Assessment
2014-02-01
...expensive than other developers' software. The sources for the GPUs (Nvidia) and the host computer (Concurrent's iHawk) were identified. The...boundaries, which is a distracting artifact when performing visual tests. The problem has been isolated by the OBVA team to the Nvidia GPUs. The OBVA system
2014-02-18
advancement of aviation drone technology has led to significant developments and improvements in the capabilities of military remotely piloted aircraft...stress; less excitement seeking and action oriented; less assertive; more socially introverted and withdrawn; more socially compliant and...to age and educational differences. Fifth, evaluations that involve selection and assessment of pilot applicants should include collateral sources of
NASA Technical Reports Server (NTRS)
Olson, S. L.
2004-01-01
NASA's current method of material screening determines fire resistance under conditions representing a worst case for normal-gravity flammability - the Upward Flame Propagation Test (Test 1). Its simple pass-fail criteria eliminate materials that burn for more than 12 inches from a standardized ignition source. In addition, if a material drips burning pieces that ignite a flammable fabric below, it fails. The applicability of Test 1 to fires in microgravity and extraterrestrial environments, however, is uncertain because the relationship between this buoyancy-dominated test and actual extraterrestrial fire hazards is not understood. There is compelling evidence that Test 1 may not be the worst case for spacecraft fires, and we don't have enough information to assess whether it is adequate at Lunar or Martian gravity levels.
SOURCE WATER ASSESSMENT USING GEOGRAPHIC INFORMATION SYSTEMS
The 1996 amendments to Section 1453 of the Safe Drinking Water Act require the states to establish and implement a Source Water Assessment Program (SWAP). Source water is the water taken from rivers, reservoirs, or wells for use as public drinking water. Source water assessment i...
Contamination source review for Building E6891, Edgewood Area, Aberdeen Proving Ground, Maryland
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zellmer, S.D.; Draugelis, A.K.; Rueda, J.
1995-09-01
The US Army Aberdeen Proving Ground (APG) commissioned Argonne National Laboratory (ANL) to conduct a contamination source review to identify and define areas of toxic or hazardous contaminants and to assess the physical condition and accessibility of various APG buildings. This report provides the results of the contamination source review for Building E6891. The information obtained from this review may be used to assist the US Army in planning for the future use or disposition of the buildings. The contamination source review consisted of the following tasks: historical records search, physical inspection, photographic documentation, geophysical investigation, and collection of air samples. This building is part of the Lauderick Creek Concrete Slab Test Site, located in the Lauderick Creek Area in the Edgewood Area. Many of the APG facilities constructed between 1917 and the 1960s are no longer used because of obsolescence and their poor state of repair. Because many of these buildings were used for research, development, testing, and/or pilot-scale production of chemical warfare agents and other military substances, the potential exists for portions of the buildings to be contaminated with these substances, their degradation products, and other laboratory or industrial chemicals. These buildings and associated structures or appurtenances may contribute to environmental concerns at APG.
Candeias, J P; Estrada, J J S; Pinho, A S; D'Avila, R L; Ramalho, A T
2007-03-01
Industrial radiography is the most frequent method of non-destructive testing (NDT) used by Brazilian industrial facilities for investigating the material integrity of a test object. In Brazil, industrial radiography employs around 220 x-ray and 290 gamma radiography machines. About 90% of the latter use iridium (¹⁹²Ir) sources. The large majority of ¹⁹²Ir projectors in operation in Brazil have been in continuous usage for more than 25 years, which means that they are old and worn out. Usually the majority of accidents concerning gamma radiography occur during the return of the source into the exposure container. Poor maintenance or imperfections of the internal channel of the exposure container can lead to accidental source exposure. In the present work, the internal tubes of 65 gamma machines from nine Brazilian companies that render gamma radiography services were analysed using an industrial videoscope. The internal images from the projectors were compared with the internal image of an apparatus that had never been used, i.e. had never received a radioactive source. From the 65 machines evaluated, nine showed irregularities of the internal tube. It was also observed that each company follows a different methodology for the maintenance and lubrication of the exposure containers and drive cables.
Sources of Variation in Sweat Chloride Measurements in Cystic Fibrosis.
Collaco, Joseph M; Blackman, Scott M; Raraigh, Karen S; Corvol, Harriet; Rommens, Johanna M; Pace, Rhonda G; Boelle, Pierre-Yves; McGready, John; Sosnay, Patrick R; Strug, Lisa J; Knowles, Michael R; Cutting, Garry R
2016-12-01
Expanding the use of cystic fibrosis transmembrane conductance regulator (CFTR) potentiators and correctors for the treatment of cystic fibrosis (CF) requires precise and accurate biomarkers. Sweat chloride concentration provides an in vivo assessment of CFTR function, but the degree to which CFTR mutations account for sweat chloride variation is unknown. To estimate potential sources of variation for sweat chloride measurements, including demographic factors, testing variability, recording biases, and CFTR genotype itself. A total of 2,639 sweat chloride measurements were obtained in 1,761 twins/siblings from the CF Twin-Sibling Study, French CF Modifier Gene Study, and Canadian Consortium for Genetic Studies. Variance component estimation was performed by nested mixed modeling. Across the tested CF population as a whole, CFTR gene mutations were found to be the primary determinant of sweat chloride variability (56.1% of variation), with contributions from variation over time (e.g., factors related to testing on different days; 13.8%), environmental factors (e.g., climate, family diet; 13.5%), other residual factors (e.g., test variability; 9.9%), and unique individual factors (e.g., modifier genes, unique exposures; 6.8%) (likelihood ratio test, P < 0.001). Twin analysis suggested that modifier genes did not play a significant role because the heritability estimate was negligible (H² = 0; 95% confidence interval, 0.0-0.35). For an individual with CF, variation in sweat chloride was primarily caused by variation over time (58.1%), with the remainder attributable to residual/random factors (41.9%). Variation in the CFTR gene is the predominant cause of sweat chloride variation; most of the non-CFTR variation is caused by testing variability and unique environmental factors. If test precision and accuracy can be improved, sweat chloride measurement could be a valuable biomarker for assessing response to therapies directed at mutant CFTR.
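The variance partition reported above comes from nested mixed modeling; the snippet below is only a minimal sketch of that idea, showing how a single random intercept per subject splits repeated sweat chloride measurements into between- and within-individual components. The file name and column names (sweat_chloride.csv, subject, chloride) are assumptions, and the one-level model is far simpler than the authors' nested design.

```python
# Minimal variance-component sketch for repeated sweat chloride measurements.
# Column names and the single random intercept are illustrative assumptions,
# not the published nested model.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("sweat_chloride.csv")  # hypothetical file: one row per measurement

# Random intercept per subject captures stable between-individual differences;
# the residual captures within-individual (e.g., day-to-day) variation.
model = sm.MixedLM.from_formula("chloride ~ 1", groups="subject", data=df)
fit = model.fit()

between_var = float(fit.cov_re.iloc[0, 0])  # variance of subject random intercepts
within_var = fit.scale                      # residual (within-subject) variance
total = between_var + within_var
print(f"between-subject share: {between_var / total:.1%}")
print(f"within-subject share:  {within_var / total:.1%}")
```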
Sources of Variation in Sweat Chloride Measurements in Cystic Fibrosis
Blackman, Scott M.; Raraigh, Karen S.; Corvol, Harriet; Rommens, Johanna M.; Pace, Rhonda G.; Boelle, Pierre-Yves; McGready, John; Sosnay, Patrick R.; Strug, Lisa J.; Knowles, Michael R.; Cutting, Garry R.
2016-01-01
Rationale: Expanding the use of cystic fibrosis transmembrane conductance regulator (CFTR) potentiators and correctors for the treatment of cystic fibrosis (CF) requires precise and accurate biomarkers. Sweat chloride concentration provides an in vivo assessment of CFTR function, but the degree to which CFTR mutations account for sweat chloride variation is unknown. Objectives: To estimate potential sources of variation for sweat chloride measurements, including demographic factors, testing variability, recording biases, and CFTR genotype itself. Methods: A total of 2,639 sweat chloride measurements were obtained in 1,761 twins/siblings from the CF Twin-Sibling Study, French CF Modifier Gene Study, and Canadian Consortium for Genetic Studies. Variance component estimation was performed by nested mixed modeling. Measurements and Main Results: Across the tested CF population as a whole, CFTR gene mutations were found to be the primary determinant of sweat chloride variability (56.1% of variation), with contributions from variation over time (e.g., factors related to testing on different days; 13.8%), environmental factors (e.g., climate, family diet; 13.5%), other residual factors (e.g., test variability; 9.9%), and unique individual factors (e.g., modifier genes, unique exposures; 6.8%) (likelihood ratio test, P < 0.001). Twin analysis suggested that modifier genes did not play a significant role because the heritability estimate was negligible (H² = 0; 95% confidence interval, 0.0–0.35). For an individual with CF, variation in sweat chloride was primarily caused by variation over time (58.1%), with the remainder attributable to residual/random factors (41.9%). Conclusions: Variation in the CFTR gene is the predominant cause of sweat chloride variation; most of the non-CFTR variation is caused by testing variability and unique environmental factors. If test precision and accuracy can be improved, sweat chloride measurement could be a valuable biomarker for assessing response to therapies directed at mutant CFTR. PMID:27258095
Nielsen, Daiva E; Shih, Sarah; El-Sohemy, Ahmed
2014-01-01
Direct-to-consumer (DTC) genetic tests have facilitated easy access to personal genetic information related to health and nutrition; however, consumer perceptions of the nutritional information provided by these tests have not been evaluated. The objectives of this study were to assess individual perceptions of personalized nutrition and genetic testing and to determine whether a personalized nutrition intervention modifies perceptions. A double-blind, parallel-group, randomized controlled trial was conducted among healthy men and women aged 20-35 years (n = 138). Participants in the intervention group (n = 92) were given a report of DNA-based dietary advice and those in the control group (n = 46) were given a general dietary advice report. A survey was completed at baseline and 3 and 12 months after distributing the reports to assess perceptions between the two groups. No significant differences in perceptions of personalized nutrition and genetic testing were observed between the intervention and control group, so responses of both groups were combined. As compared to baseline, participant responses increased significantly toward the positive end of a Likert scale at 3 months for the statement 'I am interested in the relationship between diet and genetics' (mean change ± SD: 0.28 ± 0.99, p = 0.0002). The majority of participants indicated that a university research lab (47%) or health care professional (41%) were the best sources for obtaining accurate personal genetic information, while a DTC genetic testing company received the fewest selections (12%). Most participants (56%) considered dietitians to be the best source of personalized nutrition followed by medical doctors (27%), naturopaths (8%) and nurses (6%). These results suggest that perceptions of personalized nutrition changed over the course of the intervention. Individuals view a research lab or health care professional as better providers of genetic information than a DTC genetic testing company, and registered dietitians are considered to be the best providers of personalized nutrition advice. © 2014 S. Karger AG, Basel.
Urbanus, Brittany L; Schmidt, Shelly J; Lee, Soo-Yeun
2014-11-01
Beet sugar contains an off-aroma, which was hypothesized to generate expectations about the acceptability of a product made with beet sugar. Thus, the objective of this study was to assess the impact of information about the sugar source (beet vs. cane) on the overall liking of an orange-flavored beverage. One hundred panelists evaluated an orange-flavored powdered beverage mix and beverage made with beet and cane sugars using a 5-phase testing protocol involving a tetrad test and hedonic ratings performed under blind and informed conditions. Tetrad test results indicated that there was a significant difference (P < 0.05) between the beverage mix made with beet sugar and cane sugar; however, no difference was found between the beverage made with beet sugar and cane sugar. Hedonic ratings revealed a significant effect of information condition on the panelists' evaluation of sugar (F = 24.67, P < 0.001); however, no difference in liking was identified for the beverage mix or beverage. Average hedonic scores were higher under the informed condition compared to the blind condition for all products, possibly because labels tend to reduce uncertainty about a product. Results from this study are representative of responses from the general population and suggest that the general population is not affected by sugar source information in a beverage product. Based on concerns with the use of beet sugar expressed in the popular press, there may be a subgroup of the population that has a preconceived bias about sugar sources due to their prior experiences and knowledge and, thus, would be influenced by labels indicating the sugar source used in a product. © 2014 Institute of Food Technologists®
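The tetrad test used above asks each panelist to group four samples into two matching pairs; under the null hypothesis of no perceptible difference, the correct grouping occurs by chance with probability 1/3. A minimal sketch of the significance check is below; the counts are placeholders, not the study's data.

```python
# One-sided exact binomial test for a tetrad difference test (chance = 1/3).
# The panel size and number of correct groupings are hypothetical.
from scipy.stats import binomtest

n_panelists = 100
n_correct = 45  # hypothetical number of correct groupings

result = binomtest(n_correct, n_panelists, p=1/3, alternative="greater")
print(f"P(correct >= {n_correct} | guessing) = {result.pvalue:.4f}")
```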
Methane production and isotopic fingerprinting in ethanol fuel contaminated sites.
Freitas, Juliana G; Fletcher, Barbara; Aravena, Ramon; Barker, James F
2010-01-01
Biodegradation of organic compounds in groundwater can be a significant source of methane in contaminated sites. Methane might accumulate in indoor spaces, posing a hazard. The increasing use of ethanol as a gasoline additive is a concern with respect to methane production since it is easily biodegraded and has a high oxygen demand, favoring the development of anaerobic conditions. This study evaluated the use of stable carbon isotopes to distinguish the methane origin between gasoline and ethanol biodegradation, and assessed the occurrence of methane in ethanol fuel contaminated sites. Two microcosm tests were performed under anaerobic conditions: one test using ethanol and the other using toluene as the sole carbon source. The isotopic tool was then applied to seven field sites known to be impacted by ethanol fuels. In the microcosm tests, it was verified that methane produced from ethanol (ethanol δ¹³C = -11.1‰) is more enriched in ¹³C, with δ¹³C values ranging from -20‰ to -30‰, while the methane produced from toluene (toluene δ¹³C = -28.5‰) had a carbon isotopic signature of -55‰. The field samples had δ¹³C values varying over a wide range (-10‰ to -80‰), and the δ¹³C values allowed the methane source to be clearly identified in five of the seven ethanol/gasoline sites. In the other two sites, methane appears to have been produced from both sources. Both gasoline and ethanol were sources of methane in potentially hazardous concentrations, and methane could be produced from organic acids originating from ethanol along the groundwater flow system even after all the ethanol has been completely biodegraded. Copyright © 2010 The Author(s). Journal compilation © 2010 National Ground Water Association.
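Where a field sample appears to carry methane from both sources, one hedged way to express the mix is a simple two-endmember isotope mass balance. The sketch below is an illustration only, not the study's method; the end-member δ¹³C values loosely follow the microcosm ranges quoted above, and the sample value is invented.

```python
# Two-endmember carbon isotope mass balance (illustrative assumption, not the
# paper's procedure). End-member methane signatures are placeholders guided by
# the microcosm results; the sample value is hypothetical.
def ethanol_fraction(d13c_sample, d13c_ethanol_ch4=-25.0, d13c_fuel_ch4=-55.0):
    """Fraction of methane attributable to the ethanol-derived end member."""
    f = (d13c_sample - d13c_fuel_ch4) / (d13c_ethanol_ch4 - d13c_fuel_ch4)
    return min(max(f, 0.0), 1.0)  # clamp to the physically meaningful range

print(ethanol_fraction(-40.0))  # e.g., a field sample at -40 per mil
```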
Haack, S.K.; Duris, J.W.; Fogarty, L.R.; Kolpin, D.W.; Focazio, M.J.; Furlong, E.T.; Meyer, M.T.
2009-01-01
The objective of this study was to compare fecal indicator bacteria (FIB) (fecal coliforms, Escherichia coli [EC], and enterococci [ENT]) concentrations with a wide array of typical organic wastewater chemicals and selected bacterial genes as indicators of fecal pollution in water samples collected at or near 18 surface water drinking water intakes. Genes tested included esp (indicating human-pathogenic ENT) and nine genes associated with various animal sources of shiga-toxin-producing EC (STEC). Fecal pollution was indicated by genes and/or chemicals for 14 of the 18 tested samples, with little relation to FIB standards. Of 13 samples with <50 EC per 100 mL, human pharmaceuticals or chemical indicators of wastewater treatment plant effluent occurred in six, veterinary antibiotics were detected in three, and stx1 or stx2 genes (indicating varying animal sources of STEC) were detected in eight. Only the EC eaeA gene was positively correlated with FIB concentrations. Human-source fecal pollution was indicated by the esp gene and the human pharmaceutical carbamazepine in one of the nine samples that met all FIB recreational water quality standards. Escherichia coli rfbO157 and stx2c genes, which are typically associated with cattle sources and are of potential human health significance, were detected in one sample in the absence of tested chemicals. Chemical and gene-based indicators of fecal contamination may be present even when FIB standards are met, and some may, unlike FIB, indicate potential sources. Application of multiple water quality indicators with variable environmental persistence and fate may yield greater confidence in fecal pollution assessment and may inform remediation decisions. Copyright © 2009 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.
Taylor, Peter; Gartemann, Juliane; Hsieh, Jeanie; Creeden, James
2011-01-01
This systematic review assesses the current status of anti-cyclic citrullinated peptide (anti-CCP) and rheumatoid factor (RF) tests in the diagnosis and prognosis of rheumatoid arthritis (RA). We reviewed publications on tests and biomarkers for early diagnosis of RA from English-language MEDLINE-indexed journals and non-MEDLINE-indexed sources. 85 publications were identified and reviewed, including 68 studies from MEDLINE and 17 non-MEDLINE sources. Anti-CCP2 assays provide improved sensitivity over anti-CCP assays and RF, but anti-CCP2 and RF assays in combination demonstrate a positive predictive value (PPV) nearing 100%, greater than the PPV of either of the tests alone. The combination also appears to be able to distinguish between patients whose disease course is expected to be more severe and both tests are incorporated in the 2010 ACR Rheumatoid Arthritis Classification Criteria. While the clinical value of anti-CCP tests has been established, differences in cut-off values, sensitivities and specificities exist between first-, second- and third-generation tests and harmonization efforts are under way. Anti-CCP and RF are clinically valuable biomarkers for the diagnosis and prognosis of RA patients. The combination of the two biomarkers in conjunction with other clinical measures is an important tool for the diagnosis and management of RA patients. PMID:21915375
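The review above turns on diagnostic metrics such as sensitivity and positive predictive value (PPV). As a minimal sketch, the function below computes those metrics from a 2x2 table; the counts are placeholders, not values from the reviewed studies.

```python
# Diagnostic metrics from a 2x2 table: true/false positives and negatives.
# The example counts are invented for illustration.
def diagnostic_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

sens, spec, ppv, npv = diagnostic_metrics(tp=60, fp=2, fn=25, tn=150)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} PPV={ppv:.2f} NPV={npv:.2f}")
```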
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of a reliability database (RDB) methodology to determine applicable reliability data for inclusion in the quantification of the PRA. The RDB method developed during this project seeks to satisfy the requirements of the Data Analysis element of the ASME/ANS Non-LWR PRA standard. The RDB methodology utilizes a relevancy test to examine reliability data and determine whether it is appropriate to include as part of the reliability database for the PRA. The relevancy test compares three component properties to establish the level of similarity to components examined as part of the PRA. These properties include the component function, the component failure modes, and the environment/boundary conditions of the component. The relevancy test is used to gauge the quality of data found in a variety of sources, such as advanced reactor-specific databases, non-advanced reactor nuclear databases, and non-nuclear databases. The RDB also establishes the integration of expert judgment or separate reliability analysis with past reliability data. This paper provides details on the RDB methodology, and includes an example application of the RDB methodology for determining the reliability of the intermediate heat exchanger of a sodium fast reactor. The example explores a variety of reliability data sources, and assesses their applicability for the PRA of interest through the use of the relevancy test.
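The relevancy test described above compares three component properties. As a hedged sketch of that idea, the snippet below scores a candidate reliability record against a PRA component on function, failure modes, and environment; the equal-weight scoring scheme and the example values are illustrative assumptions, not the published RDB methodology.

```python
# Illustrative relevancy scoring over the three properties named in the abstract.
# Weights, thresholds, and example data are assumptions.
from dataclasses import dataclass, field

@dataclass
class Component:
    function: str
    failure_modes: set = field(default_factory=set)
    environment: str = ""

def relevancy_score(pra_component: Component, data_component: Component) -> float:
    """Return a 0-1 similarity used to judge whether a data source applies."""
    function_match = 1.0 if pra_component.function == data_component.function else 0.0
    if pra_component.failure_modes:
        mode_match = len(pra_component.failure_modes & data_component.failure_modes) \
            / len(pra_component.failure_modes)
    else:
        mode_match = 0.0
    env_match = 1.0 if pra_component.environment == data_component.environment else 0.0
    return (function_match + mode_match + env_match) / 3.0

ihx = Component("heat transfer", {"tube leak", "plugging"}, "sodium")
db_entry = Component("heat transfer", {"tube leak"}, "sodium")
print(round(relevancy_score(ihx, db_entry), 2))  # 0.83 on this toy example
```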
Positive patch test reactions to oxidized limonene: exposure and relevance.
Bråred Christensson, Johanna; Andersen, Klaus E; Bruze, Magnus; Johansen, Jeanne D; Garcia-Bravo, Begoña; Gimenez Arnau, Ana; Goh, Chee-Leok; Nixon, Rosemary; White, Ian R
2014-11-01
R-Limonene is a common fragrance terpene found in domestic and industrial products. R-Limonene autoxidizes on air exposure, and the oxidation products can cause contact allergy. In a recent multicentre study, 5.2% (range 2.3-12.1%) of 2900 patients showed a positive patch test reaction to oxidized R-limonene. To study the exposure to limonene among consecutive dermatitis patients reacting to oxidized R-limonene in an international setting, and to assess the relevance of the exposure for the patients' dermatitis. Oxidized R-limonene 3.0% (containing limonene hydroperoxides at 0.33%) in petrolatum was tested in 2900 consecutive dermatitis patients in Australia, Denmark, the United Kingdom, Singapore, Spain, and Sweden. A questionnaire assessing exposure to limonene-containing products was completed. Overall, exposure to products containing limonene was found and assessed as being probably relevant for the patients' dermatitis in 36% of the limonene-allergic patients. In Barcelona and Copenhagen, > 70% of the patients were judged to have had an exposure to limonene assessed as relevant. Oxidized R-limonene is a common fragrance allergen, and limonene was frequently found in the labelling on the patients' products, and assessed as relevant for the patients' dermatitis. A large number of domestic and occupational sources for contact with R-limonene were identified. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Screening of Potential Landing Gear Noise Control Devices at Virginia Tech For QTD II Flight Test
NASA Technical Reports Server (NTRS)
Ravetta, Patricio A.; Burdisso, Ricardo A.; Ng, Wing F.; Khorrami, Mehdi R.; Stoker, Robert W.
2007-01-01
In support of the QTD II (Quiet Technology Demonstrator) program, aeroacoustic measurements of a 26%-scale, Boeing 777 main landing gear model were conducted in the Virginia Tech Stability Tunnel. The objective of these measurements was to perform risk mitigation studies on noise control devices for a flight test performed at Glasgow, Montana in 2005. The noise control devices were designed to target the primary main gear noise sources as observed in several previous tests. To accomplish this task, devices to reduce noise were built using stereo lithography for landing gear components such as the brakes, the forward cable harness, the shock strut, the door/strut gap and the lower truck. The most promising device was down selected from test results. In subsequent stages, the initial design of the selected lower truck fairing was improved to account for all the implementation constraints encountered in the full-scale airplane. The redesigned truck fairing was then retested to assess the impact of the modifications on the noise reduction potential. From extensive acoustic measurements obtained using a 63-element microphone phased array, acoustic source maps and integrated spectra were generated in order to estimate the noise reduction achievable with each device.
Okullo, Joab Odhiambo; Moturi, Wilkister Nyaora; Ogendi, George Morara
2017-01-01
The post-2015 Sustainable Development Goals for sanitation call for universal access to adequate and equitable sanitation and an end to open defaecation by 2030. In Isiolo County, a semi-arid region lying in the northern part of Kenya, poor sanitation and water shortage remain a major problem facing the rural communities. The overall aim of the study was to assess the relationship between sanitation practices and the bacteriological quality of drinking water sources. The study also assessed the risk factors contributing to open defaecation in the rural environments of the study area. A cross-sectional study of 150 households was conducted to assess the faecal disposal practices in open defaecation free (ODF) and open defaecation not free (ODNF) areas. Sanitary surveys and bacteriological analyses were conducted for selected community water sources to identify faecal pollution sources, contamination pathways, and contributory factors. Analysis of data was performed using SPSS (descriptive and inferential statistics at α = .05 level of significance). Open defaecation habit was reported in 51% of the study households in ODNF villages and in 17% of households in ODF villages. Higher mean colony counts were recorded for water samples from ODNF areas, 2.0, 7.8, 5.3, and 7.0 (×10³) colony-forming units (CFUs)/100 mL, compared with those of ODF areas, 1.8, 6.4, 3.5, and 6.1 (×10³), for Escherichia coli, faecal streptococci, Salmonella typhi, and total coliform, respectively. Correlation tests revealed a significant relationship between sanitary surveys and contamination of water sources (P = .002). The water sources exhibited high levels of contamination with microbial pathogens attributed to poor sanitation. Practising safe faecal disposal in particular is recommended as this will considerably reverse the situation and thus lead to improved human health.
Okullo, Joab Odhiambo; Moturi, Wilkister Nyaora; Ogendi, George Morara
2017-01-01
Background information: The post-2015 Sustainable Development Goals for sanitation call for universal access to adequate and equitable sanitation and an end to open defaecation by 2030. In Isiolo County, a semi-arid region lying in the northern part of Kenya, poor sanitation and water shortage remain a major problem facing the rural communities. Objective: The overall aim of the study was to assess the relationship between sanitation practices and the bacteriological quality of drinking water sources. The study also assessed the risk factors contributing to open defaecation in the rural environments of the study area. Methods: A cross-sectional study of 150 households was conducted to assess the faecal disposal practices in open defaecation free (ODF) and open defaecation not free (ODNF) areas. Sanitary surveys and bacteriological analyses were conducted for selected community water sources to identify faecal pollution sources, contamination pathways, and contributory factors. Analysis of data was performed using SPSS (descriptive and inferential statistics at α = .05 level of significance). Results: Open defaecation habit was reported in 51% of the study households in ODNF villages and in 17% of households in ODF villages. Higher mean colony counts were recorded for water samples from ODNF areas, 2.0, 7.8, 5.3, and 7.0 (×10³) colony-forming units (CFUs)/100 mL, compared with those of ODF areas, 1.8, 6.4, 3.5, and 6.1 (×10³), for Escherichia coli, faecal streptococci, Salmonella typhi, and total coliform, respectively. Correlation tests revealed a significant relationship between sanitary surveys and contamination of water sources (P = .002). Conclusions: The water sources exhibited high levels of contamination with microbial pathogens attributed to poor sanitation. Practising safe faecal disposal in particular is recommended as this will considerably reverse the situation and thus lead to improved human health. PMID:29051705
YouTube as a patient-information source for root canal treatment.
Nason, K; Donnelly, A; Duncan, H F
2016-12-01
To assess the content and completeness of YouTube™ as an information source for patients undergoing root canal treatment procedures. YouTube™ (https://www.youtube.com/) was searched for information using three relevant treatment search terms ('endodontics', 'root canal' and 'root canal treatment'). After exclusions (language, no audio, >15 min, duplicates), 20 videos per search term were selected. General video assessment included duration, ownership, views, age, likes/dislikes, target audience and video/audio quality, whilst content was analysed under six categories ('aetiology', 'anatomy', 'symptoms', 'procedure', 'postoperative course' and 'prognosis'). Content was scored for completeness level and statistically analysed using ANOVA and post hoc Tukey's test (P < 0.05). To obtain 60 acceptable videos, 124 were assessed. Depending on the search term employed, the video content and ownership differed markedly. There was wide variation in both the number of video views and 'likes/dislikes'. The average video age was 788 days. In total, 46% of videos were 'posted' by a dentist/specialist source; however, this was search term specific, rising to 70% of uploads for the search 'endodontic', whilst laypersons contributed 18% of uploads for the search 'root canal treatment'. Every video lacked content in the designated six categories, although 'procedure' details were covered more frequently and in better detail than other categories. Videos posted by dental professionals (P = 0.046) and commercial sources (P = 0.009) were significantly more complete than videos posted by laypeople. YouTube™ videos for endodontic search terms varied significantly by source and content and were generally incomplete. The danger of patient reliance on YouTube™ is highlighted, as is the need for endodontic professionals to play an active role in directing patients towards alternative high-quality information sources. © 2015 International Endodontic Journal. Published by John Wiley & Sons Ltd.
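The completeness comparison above rests on a one-way ANOVA followed by a post hoc Tukey test. The sketch below shows that analysis pattern under assumed data; the scores and uploader categories are invented, not the study's data.

```python
# One-way ANOVA across uploader categories followed by Tukey HSD.
# Group labels and completeness scores are placeholders.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

scores = {
    "professional": [4, 5, 5, 3, 4, 5],
    "commercial":   [3, 4, 4, 3, 5, 4],
    "layperson":    [1, 2, 2, 3, 1, 2],
}

f_stat, p_value = f_oneway(*scores.values())
print(f"ANOVA: F={f_stat:.2f}, p={p_value:.4f}")

values = np.concatenate(list(scores.values()))
groups = np.repeat(list(scores.keys()), [len(v) for v in scores.values()])
print(pairwise_tukeyhsd(values, groups, alpha=0.05))
```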
NASA Astrophysics Data System (ADS)
Pujayanto, Pujayanto; Budiharti, Rini; Adhitama, Egy; Nuraini, Niken Rizky Amalia; Vernanda Putri, Hanung
2018-07-01
This research proposes the development of a web-based assessment system to identify students' misconceptions. The system, named WAS (web-based assessment system), can identify a student's misconception profile on linear kinematics automatically after the student has finished the test. The test instrument was developed and validated. Items were constructed and arranged from the results of a focus group discussion (FGD) related to previous research. Fifty-eight students (female = 37, male = 21) were used as samples. They were from different classes, with 18 students from the gifted class and another 40 students from the normal class. WAS was designed specifically to support the teacher as an efficient replacement for a paper-based test system. In addition, WAS offers flexible timing functionality, a stand-alone subject module, robustness and scalability. The entire WAS program and interface was developed with open source-based technologies such as the XAMP server, MySQL database, JavaScript and PHP. It provides results immediately and supports diagrammatic questions as well as scientific symbols. It is feasible to apply this system to many students at once. Thus, it could be integrated in many schools as part of physics courses.
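One way such a system can build a misconception profile automatically is to map each selected distractor to a misconception label. The sketch below illustrates that idea only; the item IDs, option letters, and misconception labels are invented and do not reflect the WAS database schema or its actual scoring rules.

```python
# Hypothetical distractor-to-misconception mapping and per-student profile.
# All item/misconception content here is illustrative, not from WAS.
from collections import Counter

# item id -> {chosen option: misconception label}
DISTRACTOR_MAP = {
    "Q1": {"B": "velocity equals acceleration"},
    "Q2": {"C": "zero velocity implies zero acceleration"},
    "Q3": {"A": "velocity equals acceleration"},
}

def misconception_profile(answers: dict) -> Counter:
    """answers: item id -> chosen option letter."""
    profile = Counter()
    for item, choice in answers.items():
        label = DISTRACTOR_MAP.get(item, {}).get(choice)
        if label:
            profile[label] += 1
    return profile

print(misconception_profile({"Q1": "B", "Q2": "D", "Q3": "A"}))
```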
Gordeev, Konstantin; Shinkarev, Sergey; Ilyin, Leonid; Bouville, André; Hoshi, Masaharu; Luckyanov, Nickolas; Simon, Steven L
2006-02-01
A methodology to assess internal exposure to the thyroid from radioiodines for residents living in settlements located in the vicinity of the Semipalatinsk Nuclear Test Site is described; it is the result of many years of research, primarily at the Moscow Institute of Biophysics. This methodology introduces two important concepts. First, the biologically active fraction is defined as the fraction of the total activity on fallout particles with diameter less than 50 microns. That fraction is retained by vegetation and will ultimately result in contamination of dairy products. Second, the relative distance is derived as a dimensionless quantity from information on test yield, maximum height of cloud, and average wind velocity, and describes how the biologically active fraction is distributed with distance from the site of the explosion. The parameter is derived in such a way that at locations with equal values of relative distance, the biologically active fraction will be the same for any test. The estimates of internal exposure to the thyroid for the residents of Dolon and Kanonerka villages, for which the external exposures were assessed and given in a companion paper (Gordeev et al. 2006) in this conference, are presented. The main sources of uncertainty in the estimates are identified.
The aging physician and surgeon.
Sataloff, Robert T; Hawkshaw, Mary; Kutinsky, Joshua; Maitz, Edward A
2016-01-01
As the population of aging physicians increases, methods of assessing physicians' cognitive function and predicting clinically significant changes in clinical performance become increasingly important. Although several approaches have been suggested, no evaluation system is accepted or utilized widely. This article reviews literature using MEDLINE, PubMed, and other sources. Articles discussing the problems of geriatric physicians are summarized, stressing publications that proposed methods of evaluation. Selected literature on evaluating aging pilots also was reviewed, and potential applications for physician evaluation are proposed. Neuropsychological cognitive test protocols were summarized, and a reduced evaluation protocol is proposed for interdisciplinary longitudinal research. Although there are several articles evaluating cognitive function in aging physicians and aging pilots, and although a few institutions have instituted cognitive evaluation, there are no longitudinal data assessing cognitive function in physicians over time or correlating them with performance. Valid, reliable testing of cognitive function of physicians is needed. In order to understand its predictive value, physicians should be tested over time starting when they are young, and results should be correlated with physician performance. Early testing is needed to determine whether cognitive deficits are age-related or long-standing. A multi-institutional study over many years is proposed. Additional assessments of other factors such as manual dexterity (perhaps using simulators) and physician frailty are recommended.
Performance Characterization of LCLS-II Superconducting Radiofrequency Cryomodules
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gregory, RuthAnn
This paper will describe the LCLS-II (Linac Coherent Light Source II), Fermilab's role in the development of LCLS-II, and my contributions as a Lee Teng intern. LCLS-II is a second-generation x-ray free electron laser being constructed at SLAC National Accelerator Laboratory. Fermilab is responsible for the design, construction, and testing of several 1.3 GHz cryomodules to be used in LCLS-II. These cryomodules are currently being tested at Fermilab. Some software was written to analyze the data from the cryomodule tests. This software assesses the performance of the cryomodules by examining data on the cavity voltage, cavity gradient, dark current, and radiation.
10 CFR 35.2067 - Records of leak tests and inventory of sealed sources and brachytherapy sources.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Records of leak tests and inventory of sealed sources and... MATERIAL Records § 35.2067 Records of leak tests and inventory of sealed sources and brachytherapy sources. (a) A licensee shall retain records of leak tests required by § 35.67(b) for 3 years. The records...
Rögener, Wiebke; Wormer, Holger
2017-05-01
While the quality of environmental science journalism has been the subject of much debate, a widely accepted benchmark to assess the quality of coverage of environmental topics is missing so far. Therefore, we have developed a set of defined criteria of environmental reporting. This instrument and its applicability are tested in a newly established monitoring project for the assessment of pieces on environmental issues, which refer to scientific sources and therefore can be regarded as a special field of science journalism. The quality is assessed in a kind of journalistic peer review. We describe the systematic development of criteria, which might also be a model procedure for other fields of science reporting. Furthermore, we present results from the monitoring of 50 environmental reports in German media. According to these preliminary data, the lack of context and the deficient elucidation of the evidence pose major problems in environmental reporting.
Puścian, Alicja; Łęski, Szymon; Kasprowicz, Grzegorz; Winiarski, Maciej; Borowska, Joanna; Nikolaev, Tomasz; Boguszewski, Paweł M; Lipp, Hans-Peter; Knapska, Ewelina
2016-10-12
Eco-HAB is an open source, RFID-based system for automated measurement and analysis of social preference and in-cohort sociability in mice. The system closely follows murine ethology. It requires no contact between a human experimenter and tested animals, overcoming the confounding factors that lead to irreproducible assessment of murine social behavior between laboratories. In Eco-HAB, group-housed animals live in a spacious, four-compartment apparatus with shadowed areas and narrow tunnels, resembling natural burrows. Eco-HAB allows for assessment of the tendency of mice to voluntarily spend time together in ethologically relevant mouse group sizes. Custom-made software for automated tracking, data extraction, and analysis enables quick evaluation of social impairments. The developed protocols and standardized behavioral measures demonstrate high replicability. Unlike classic three-chambered sociability tests, Eco-HAB provides measurements of spontaneous, ecologically relevant social behaviors in group-housed animals. Results are obtained faster, with less manpower, and without confounding factors.
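Eco-HAB's analysis software quantifies how much time mice voluntarily spend together relative to chance. The sketch below shows one way such an in-cohort sociability-style measure could be computed from per-time-bin compartment assignments; the log format and the exact definition are assumptions for illustration, not the published Eco-HAB code.

```python
# Illustrative "time together minus chance" measure for a pair of mice, given
# per-time-bin compartment indices (0-3). Data format is an assumption.
import numpy as np

def sociability(comp_a: np.ndarray, comp_b: np.ndarray) -> float:
    """comp_a, comp_b: per-time-bin compartment indices for two mice."""
    together = np.mean(comp_a == comp_b)
    # chance level = sum over compartments of p_a(c) * p_b(c)
    chance = sum(np.mean(comp_a == c) * np.mean(comp_b == c) for c in range(4))
    return together - chance  # > 0 suggests voluntary time spent together

rng = np.random.default_rng(0)
a = rng.integers(0, 4, size=10_000)
b = rng.integers(0, 4, size=10_000)
print(round(sociability(a, b), 4))  # near zero for independent animals
```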
Comparison of self-reported HIV testing data with medical records data in Houston, TX 2012-2013.
An, Qian; Chronister, Karen; Song, Ruiguang; Pearson, Megan; Pan, Yi; Yang, Biru; Khuwaja, Salma; Hernandez, Angela; Hall, H Irene
2016-03-23
To assess the agreement between self-reported and medical record data on HIV status and dates of first positive and last negative HIV tests. Participants were recruited from patients attending Houston health clinics during 2012-2013. Self-reported data were collected using a questionnaire and compared with medical record data. Agreement on HIV status was assessed using kappa statistics and on HIV test dates using the concordance correlation coefficient. The extent of difference between self-reported and medical record test dates was determined. Agreement between self-reported and medical record data was good for HIV status and date of first positive HIV test, but poor for date of last negative HIV test. About half of participants who self-reported never having been tested had HIV test results in their medical records. Agreement varied by sex, race and/or ethnicity, and medical care facility. For HIV-positive persons, more self-reported first positive HIV test dates preceded medical record dates, with a median difference of 6 months. For HIV-negative persons, more medical record dates of last negative HIV test preceded self-reported dates, with a median difference of 2 months. Studies relying on self-reported HIV status other than HIV positive and self-reported date of last negative test should consider including information from additional sources to validate the self-reported data. Published by Elsevier Inc.
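The two agreement statistics above are Cohen's kappa (categorical HIV status) and the concordance correlation coefficient (test dates). A minimal sketch of both is below; the data are placeholders, and the dates are coded as days since an arbitrary origin for illustration.

```python
# Cohen's kappa for categorical agreement and Lin's concordance correlation
# coefficient for continuous agreement. All values are invented.
import numpy as np
from sklearn.metrics import cohen_kappa_score

self_report = ["pos", "neg", "neg", "pos", "neg", "pos"]
chart       = ["pos", "neg", "pos", "pos", "neg", "pos"]
print("kappa:", round(cohen_kappa_score(self_report, chart), 3))

def concordance_cc(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    cov = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * cov / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

dates_self  = [100, 250, 400, 410, 980]   # days since an arbitrary origin
dates_chart = [95, 260, 390, 430, 1000]
print("CCC:", round(concordance_cc(dates_self, dates_chart), 3))
```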
The key-features approach to assess clinical decisions: validity evidence to date.
Bordage, G; Page, G
2018-05-17
The key-features (KFs) approach to assessment was initially proposed during the First Cambridge Conference on Medical Education in 1984 as a more efficient and effective means of assessing clinical decision-making skills. Over three decades later, we conducted a comprehensive, systematic review of the validity evidence gathered since then. The evidence was compiled according to the Standards for Educational and Psychological Testing's five sources of validity evidence, namely, Content, Response process, Internal structure, Relations to other variables, and Consequences, to which we added two other types related to Cost-feasibility and Acceptability. Of the 457 publications that referred to the KFs approach between 1984 and October 2017, 164 are cited here; the remaining 293 were either redundant or the authors simply mentioned the KFs concept in relation to their work. While one set of articles reported meeting the validity standards, another set examined KFs test development choices and score interpretation. The accumulated validity evidence for the KFs approach since its inception supports the decision-making construct measured and its use to assess clinical decision-making skills at all levels of training and practice and with various types of exam formats. Recognizing that gathering validity evidence is an ongoing process, areas with limited evidence, such as item factor analyses or consequences of testing, are identified as well as new topics needing further clarification, such as the use of the KFs approach for formative assessment and its place within a program of assessment.
Lopez-Cepero, Andrea; Torres, Roxana; Elias, Augusto; Rosal, Milagros C; Palacios, Cristina
2015-12-01
Micronutrients are critical for healthy growth and development of children. Micronutrient intake from dietary sources is inadequate among some children and may be improved by use of multivitamin and multimineral (MVMM) supplements. To assess micronutrient intake from dietary and MVMM supplement sources among 12-year-old children in Puerto Rico. A representative sample of 732 children enrolled in an oral health study in Puerto Rico, who completed dietary and MVMM assessments through one 24-h recall, were included in this analysis. Micronutrient intake sources were described and compared to the Dietary Reference Intakes (DRIs) using the Estimated Average Requirement when available (used Adequate Intake for vitamin K and pantothenic acid). Micronutrient profiles of MVMM users and non-users were compared using t-tests. Mean intakes of vitamins A, D, E, and K, pantothenic acid, calcium, and magnesium from food and beverage sources were below the DRIs. From food and beverage sources, MVMM users had higher intakes of riboflavin and folate compared to non-users (p < 0.05). When MVMM supplements were taken into account, users had higher intakes of all nutrients except vitamin K. With the help of MVMM, users increased intake of vitamins E, A, D, and pantothenic acid to IOM-recommended levels but calcium, magnesium, and vitamin K remained below guidelines. Micronutrient intake from diet was below the IOM-recommended levels in the total sample. MVMM use improved intake of selected micronutrients and facilitated meeting recommendations for some nutrients. Public health measures to improve micronutrient intake among children in Puerto Rico are needed.
Development and Validation of a Multimedia-based Assessment of Scientific Inquiry Abilities
NASA Astrophysics Data System (ADS)
Kuo, Che-Yu; Wu, Hsin-Kai; Jen, Tsung-Hau; Hsu, Ying-Shao
2015-09-01
The potential of computer-based assessments for capturing complex learning outcomes has been discussed; however, relatively little is understood about how to leverage such potential for summative and accountability purposes. The aim of this study is to develop and validate a multimedia-based assessment of scientific inquiry abilities (MASIA) to cover a more comprehensive construct of inquiry abilities and target secondary school students in different grades while this potential is leveraged. We implemented five steps derived from the construct modeling approach to design MASIA. During the implementation, multiple sources of evidence were collected in the steps of pilot testing and Rasch modeling to support the validity of MASIA. Particularly, through the participation of 1,066 8th and 11th graders, MASIA showed satisfactory psychometric properties to discriminate students with different levels of inquiry abilities in 101 items in 29 tasks when Rasch models were applied. Additionally, the Wright map indicated that MASIA offered accurate information about students' inquiry abilities because of the comparability of the distributions of student abilities and item difficulties. The analysis results also suggested that MASIA offered precise measures of inquiry abilities when the components (questioning, experimenting, analyzing, and explaining) were regarded as a coherent construct. Finally, the increased mean difficulty thresholds of item responses along with three performance levels across all sub-abilities supported the alignment between our scoring rubrics and our inquiry framework. Together with other sources of validity in the pilot testing, the results offered evidence to support the validity of MASIA.
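The Rasch analysis and Wright map above rest on the one-parameter logistic model, in which the probability of a correct response depends only on the gap between person ability and item difficulty. The sketch below states that model and uses a crude logit-of-proportion-correct difficulty estimate as an illustrative approximation; it is not the calibration actually used for MASIA, and the simulated response matrix is invented.

```python
# Rasch (1PL) response probability plus a crude item-difficulty estimate.
# The difficulty estimator and simulated data are illustrative assumptions.
import numpy as np

def rasch_prob(theta: float, b: float) -> float:
    """P(correct) for person ability theta and item difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def crude_difficulty(responses: np.ndarray) -> np.ndarray:
    """responses: persons x items matrix of 0/1 scores."""
    p = responses.mean(axis=0).clip(0.01, 0.99)
    return np.log((1 - p) / p)  # harder items -> higher difficulty

rng = np.random.default_rng(1)
data = (rng.random((1066, 10)) < np.linspace(0.9, 0.3, 10)).astype(int)
print(np.round(crude_difficulty(data), 2))
print(round(rasch_prob(theta=0.5, b=-0.2), 3))
```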
Bansal, Sheel; St Clair, J Bradley; Harrington, Constance A; Gould, Peter J
2015-10-01
The success of conifers over much of the world's terrestrial surface is largely attributable to their tolerance to cold stress (i.e., cold hardiness). Due to an increase in climate variability, climate change may reduce conifer cold hardiness, which in turn could impact ecosystem functioning and productivity in conifer-dominated forests. The expression of cold hardiness is a product of environmental cues (E), genetic differentiation (G), and their interaction (G × E), although few studies have considered all components together. To better understand and manage for the impacts of climate change on conifer cold hardiness, we conducted a common garden experiment replicated in three test environments (cool, moderate, and warm) using 35 populations of coast Douglas-fir (Pseudotsuga menziesii var. menziesii) to test the hypotheses: (i) cool-temperature cues in fall are necessary to trigger cold hardening, (ii) there is large genetic variation among populations in cold hardiness that can be predicted from seed-source climate variables, (iii) observed differences among populations in cold hardiness in situ are dependent on effective environmental cues, and (iv) movement of seed sources from warmer to cooler climates will increase risk to cold injury. During fall 2012, we visually assessed cold damage of bud, needle, and stem tissues following artificial freeze tests. Cool-temperature cues (e.g., degree hours below 2 °C) at the test sites were associated with cold hardening, which were minimal at the moderate test site owing to mild fall temperatures. Populations differed 3-fold in cold hardiness, with winter minimum temperatures and fall frost dates as strong seed-source climate predictors of cold hardiness, and with summer temperatures and aridity as secondary predictors. Seed-source movement resulted in only modest increases in cold damage. Our findings indicate that increased fall temperatures delay cold hardening, warmer/drier summers confer a degree of cold hardiness, and seed-source movement from warmer to cooler climates may be a viable option for adapting coniferous forest to future climate. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
Bussing, Regina; Zima, Bonnie T; Mason, Dana M; Meyer, Johanna M; White, Kimberly; Garvan, Cynthia W
2012-12-01
The chronic illness model advocates for psychoeducation within a collaborative care model to enhance outcomes. To inform psychoeducational approaches for ADHD, this study describes parent and adolescent knowledge, perceptions, and information sources and explores how these vary by sociodemographic characteristics, ADHD risk, and past child mental health service use. Parents and adolescents were assessed 7.7 years after initial school district screening for ADHD risk. The study sample included 374 adolescents (56% high and 44% low ADHD risk) aged, on average, 15.4 (standard deviation = 1.8) years, and 36% were African American. Survey questions assessed ADHD knowledge, perceptions, and cues to action and elicited used and preferred information sources. Multiple logistic regression was used to determine potential independent predictors of ADHD knowledge. McNemar tests compared information source utilization against preference. Despite relatively high self-rated ADHD familiarity, misperceptions among parents and adolescents were common, including a sugar etiology (25% and 27%, respectively) and medication overuse (85% and 67%, respectively). African American respondents expressed less ADHD awareness and greater belief in sugar etiology than Caucasians. Parents used a wide range of ADHD information sources, whereas adolescents relied on social network members and teachers/school. However, parents and adolescents expressed similar strong preferences for the Internet (49% and 51%, respectively) and doctor (40% and 27%, respectively) as ADHD information sources. Culturally appropriate psychoeducational strategies are needed that combine doctor-provided ADHD information with reputable Internet sources. Despite time limitations during patient visits, both parents and teens place high priority on receiving information from their doctor. Copyright © 2012 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
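The comparison of used versus preferred information sources above is a paired comparison, analysed with McNemar tests. A minimal sketch of that test is below; the 2x2 counts are invented, not the study's data.

```python
# McNemar test for paired yes/no responses (e.g., used vs. preferred the
# Internet as an ADHD information source). Counts are placeholders.
from statsmodels.stats.contingency_tables import mcnemar

# rows: used the Internet (yes/no); columns: prefer the Internet (yes/no)
table = [[120, 30],
         [80, 144]]
result = mcnemar(table, exact=False, correction=True)
print(f"chi2={result.statistic:.2f}, p={result.pvalue:.4f}")
```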
NASA Astrophysics Data System (ADS)
Al-Jumaili, Safaa Kh.; Pearson, Matthew R.; Holford, Karen M.; Eaton, Mark J.; Pullin, Rhys
2016-05-01
An easy to use, fast to apply, cost-effective, and very accurate non-destructive testing (NDT) technique for damage localisation in complex structures is key for the uptake of structural health monitoring systems (SHM). Acoustic emission (AE) is a viable technique that can be used for SHM and one of the most attractive features is the ability to locate AE sources. The time of arrival (TOA) technique is traditionally used to locate AE sources, and relies on the assumption of constant wave speed within the material and uninterrupted propagation path between the source and the sensor. In complex structural geometries and complex materials such as composites, this assumption is no longer valid. Delta T mapping was developed in Cardiff in order to overcome these limitations; this technique uses artificial sources on an area of interest to create training maps. These are used to locate subsequent AE sources. However operator expertise is required to select the best data from the training maps and to choose the correct parameter to locate the sources, which can be a time consuming process. This paper presents a new and improved fully automatic delta T mapping technique where a clustering algorithm is used to automatically identify and select the highly correlated events at each grid point whilst the "Minimum Difference" approach is used to determine the source location. This removes the requirement for operator expertise, saving time and preventing human errors. A thorough assessment is conducted to evaluate the performance and the robustness of the new technique. In the initial test, the results showed excellent reduction in running time as well as improved accuracy of locating AE sources, as a result of the automatic selection of the training data. Furthermore, because the process is performed automatically, this is now a very simple and reliable technique due to the prevention of the potential source of error related to manual manipulation.
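The "Minimum Difference" step described above can be sketched as a grid search: for each training grid point, compare its stored sensor-pair time differences (delta-T) with those measured for a new AE event and pick the grid point with the smallest summed absolute difference. The array shapes, units, and toy values below are assumptions for illustration, not the Cardiff implementation.

```python
# Minimum-difference grid search over a delta-T training map (illustrative).
import numpy as np

def locate_source(training_dt: np.ndarray, grid_xy: np.ndarray,
                  measured_dt: np.ndarray) -> tuple:
    """
    training_dt: (n_grid_points, n_sensor_pairs) delta-T map from artificial sources
    grid_xy:     (n_grid_points, 2) grid coordinates
    measured_dt: (n_sensor_pairs,) delta-T of the event to locate
    """
    diff = np.abs(training_dt - measured_dt).sum(axis=1)
    best = int(np.argmin(diff))
    return tuple(grid_xy[best])

# toy 3-point map with two sensor pairs (microseconds)
training = np.array([[10.0, -4.0], [2.0, 6.0], [-8.0, 1.0]])
grid = np.array([[0.0, 0.0], [50.0, 0.0], [50.0, 50.0]])
print(locate_source(training, grid, measured_dt=np.array([1.5, 5.0])))  # -> (50.0, 0.0)
```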
An assessment of some non-gray global radiation models in enclosures
NASA Astrophysics Data System (ADS)
Meulemans, J.
2016-01-01
The accuracy of several non-gray global gas/soot radiation models, namely the Wide-Band Correlated-K (WBCK) model, the Spectral Line Weighted-sum-of-gray-gases model with one optimized gray gas (SLW-1), the (non-gray) Weighted-Sum-of-Gray-Gases (WSGG) model with different sets of coefficients (Smith et al., Soufiani and Djavdan, Taylor and Foster) was assessed on several test cases from the literature. Non-isothermal (or isothermal) participating media containing non-homogeneous (or homogeneous) mixtures of water vapor, carbon dioxide and soot in one-dimensional planar enclosures and multi-dimensional rectangular enclosures were investigated. For all the considered test cases, a benchmark solution (LBL or SNB) was used in order to compute the relative error of each model on the predicted radiative source term and the wall net radiative heat flux.
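The WSGG construction assessed above writes total emissivity as a weighted sum of a few gray gases, eps = sum_i a_i(T) * (1 - exp(-k_i * p * L)), with a transparent gas carrying the remaining weight. The sketch below shows only that functional form; the two gray-gas coefficients and the linear temperature weighting are placeholders, not any of the published coefficient sets cited in the abstract.

```python
# Basic WSGG total emissivity with placeholder coefficients (not a published set).
import math

def wsgg_emissivity(T, pL, k=(0.4, 7.0), b=((0.35, 1.0e-4), (0.25, -5.0e-5))):
    """T in K; pL = partial pressure * path length in atm*m."""
    eps = 0.0
    for k_i, (b0, b1) in zip(k, b):
        a_i = b0 + b1 * T          # simple linear temperature weighting (assumed)
        eps += a_i * (1.0 - math.exp(-k_i * pL))
    return eps

print(round(wsgg_emissivity(T=1200.0, pL=0.3), 4))
```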
Sources of mycosporine-like amino acids in planktonic Chlorella-bearing ciliates (Ciliophora)
SONNTAG, BETTINA; SUMMERER, MONIKA; SOMMARUGA, RUBEN
2007-01-01
Mycosporine-like amino acids (MAAs) are a family of secondary metabolites known to protect organisms exposed to solar UV radiation. We tested their distribution among several planktonic ciliates bearing Chlorella isolated from an oligo-mesotrophic lake in Tyrol, Austria. In order to test the origin of these compounds, the MAAs were assessed by high performance liquid chromatography in both the ciliates and their symbiotic algae. Considering all Chlorella-bearing ciliates, we found: (i) seven different MAAs (mycosporine-glycine, palythine, asterina-330, shinorine, porphyra-334, usujirene, palythene); (ii) one to several MAAs per species and (iii) qualitative and quantitative seasonal changes in the MAAs (e.g. in Pelagodileptus trachelioides). In all species tested, concentrations of MAAs were always <1% of ciliate dry weight. Several MAAs were also identified in the Chlorella isolated from the ciliates, thus providing initial evidence for their symbiotic origin. In Uroleptus sp., however, we found evidence for a dietary source of MAAs. Our results suggest that accumulation of MAAs in Chlorella-bearing ciliates represents an additional benefit of this symbiosis and an adaptation for survival in sunlit, UV-exposed waters.
1991-06-01
assessing EMC characteristics of EM systems in a lecture on "Measurement environments and testing". The various test environments available will be described...severe in a dual-diversity system gains at least 10 dB in SNR relative to a maritime situation where salt water corrosion has for many non-diversity...environment having great significance for NATO systems are: (a) electromagnetic interference (EMI) arising from both natural and man-made sources; (b
The development of a computer assisted instruction and assessment system in pharmacology.
Madsen, B W; Bell, R C
1977-01-01
We describe the construction of a computer-based system for instruction and assessment in pharmacology, utilizing a large bank of multiple choice questions. Items were collected from many sources, edited, and coded for student suitability, topic, taxonomy, difficulty, and text references. Students reserve a time during the day, specify the type of test desired, and questions are presented randomly from the subset satisfying their criteria. Answers are scored after each question and a summary is given at the end of every test; details on item performance are recorded automatically. The biggest hurdle in implementation was the assembly, review, classification and editing of items, while the programming was relatively straightforward. A number of modifications had to be made to the initial plans, and changes will undoubtedly continue with further experience. When fully operational, the system will possess a number of advantages, including: elimination of test preparation, editing and marking; facilitated item review opportunities; and increased objectivity, feedback and flexibility and decreased anxiety in students.
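The selection-and-scoring loop described above (filter a coded item bank by the student's criteria, present items in random order, record per-item performance) can be sketched in a few lines. The field names, items, and answer keys below are assumptions for illustration, not the original system's data model.

```python
# Illustrative item-bank filtering, random presentation, and scoring.
import random

ITEM_BANK = [
    {"id": 1, "topic": "pharmacokinetics", "difficulty": "easy",   "answer": "B"},
    {"id": 2, "topic": "pharmacokinetics", "difficulty": "hard",   "answer": "D"},
    {"id": 3, "topic": "autonomics",       "difficulty": "medium", "answer": "A"},
]

def build_test(topic=None, difficulty=None, n_items=2, seed=None):
    pool = [q for q in ITEM_BANK
            if (topic is None or q["topic"] == topic)
            and (difficulty is None or q["difficulty"] == difficulty)]
    random.Random(seed).shuffle(pool)
    return pool[:n_items]

def score(test, responses):
    """responses: item id -> chosen option; returns (n_correct, per-item record)."""
    record = {q["id"]: responses.get(q["id"]) == q["answer"] for q in test}
    return sum(record.values()), record

test = build_test(topic="pharmacokinetics", seed=42)
print(score(test, {1: "B", 2: "A"}))
```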
RF Conditioning and Testing of Fundamental Power Couplers for SNS Superconducting Cavity Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
M. Stirbet; G.K. Davis; M. A. Drury
The Spallation Neutron Source (SNS) makes use of 33 medium beta (0.61) and 48 high beta (0.81) superconducting cavities. Each cavity is equipped with a fundamental power coupler, which should withstand the full klystron power of 550 kW in full reflection for the duration of an RF pulse of 1.3 msec at 60 Hz repetition rate. Before assembly to a superconducting cavity, the vacuum components of the coupler are submitted to acceptance procedures consisting of preliminary quality assessments, cleaning and clean room assembly, vacuum leak checks and baking under vacuum, followed by conditioning and RF high power testing. Similar acceptance procedures (except clean room assembly and baking) were applied for the airside components of the coupler. All 81 fundamental power couplers for SNS superconducting cavity production have been RF power tested at JLAB Newport News and, beginning in April 2004, at SNS Oak Ridge. This paper gives details of coupler processing and RF high power-assessed performances.
Auditory Confrontation Naming in Alzheimer’s Disease
Brandt, Jason; Bakker, Arnold; Maroof, David Aaron
2010-01-01
Naming is a fundamental aspect of language and is virtually always assessed with visual confrontation tests. Tests of the ability to name objects by their characteristic sounds would be particularly useful in the assessment of visually impaired patients, and may be particularly sensitive in Alzheimer’s disease (AD). We developed an Auditory Naming Task, requiring the identification of the source of environmental sounds (i.e., animal calls, musical instruments, vehicles) and multiple-choice recognition of those not identified. In two separate studies, mild-to-moderate AD patients performed more poorly than cognitively normal elderly on the Auditory Naming Task. This task was also more difficult than two versions of a comparable Visual Naming Task, and correlated more highly with Mini-Mental State Exam score. Internal consistency reliability was acceptable, although ROC analysis revealed auditory naming to be slightly less successful than visual confrontation naming in discriminating AD patients from normal subjects. Nonetheless, our Auditory Naming Test may prove useful in research and clinical practice, especially with visually-impaired patients. PMID:20981630
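The ROC analysis mentioned above asks how well a naming score separates AD patients from cognitively normal controls. A minimal sketch is below; the scores and labels are invented, and the negated score is used as the risk input because lower naming scores are expected for AD.

```python
# ROC AUC for discriminating AD patients from controls using a naming score.
# All data are placeholders.
from sklearn.metrics import roc_auc_score

labels = [1, 1, 1, 1, 0, 0, 0, 0]        # 1 = AD patient, 0 = control
naming_scores = [12, 15, 10, 14, 22, 20, 19, 24]
auc = roc_auc_score(labels, [-s for s in naming_scores])
print(f"AUC = {auc:.2f}")
```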
Microbial Groundwater Sampling Protocol for Fecal-Rich Environments
Harter, Thomas; Watanabe, Naoko; Li, Xunde; Atwill, Edward R; Samuels, William
2014-01-01
Inherently, confined animal farming operations (CAFOs) and other intense fecal-rich environments are potential sources of groundwater contamination by enteric pathogens. The ubiquity of microbial matter poses unique technical challenges in addition to economic constraints when sampling wells in such environments. In this paper, we evaluate a groundwater sampling protocol that relies on extended purging with a portable submersible stainless steel pump and Teflon® tubing as an alternative to equipment sterilization. The protocol allows for collecting a large number of samples quickly, relatively inexpensively, and under field conditions with limited access to capacity for sterilizing equipment. The protocol is tested on CAFO monitoring wells and considers three cross-contamination sources: equipment, wellbore, and ambient air. For the assessment, we use Enterococcus, a ubiquitous fecal indicator bacterium (FIB), in laboratory and field tests with spiked and blank samples, and in an extensive, multi-year field sampling campaign on 17 wells within 2 CAFOs. The assessment shows that extended purging can successfully control for equipment cross-contamination, but also controls for significant contamination of the well-head, within the well casing and within the immediate aquifer vicinity of the well-screen. Importantly, our tests further indicate that Enterococcus is frequently entrained in water samples when exposed to ambient air at a CAFO during sample collection. Wellbore and air contamination pose separate challenges in the design of groundwater monitoring strategies on CAFOs that are not addressed by equipment sterilization, but require adequate QA/QC procedures and can be addressed by the proposed sampling strategy. PMID:24903186
Exploratory X-ray Monitoring of z>4 Radio-Quiet Quasars
NASA Astrophysics Data System (ADS)
Shemmer, Ohad
2017-09-01
We propose to extend our exploratory X-ray monitoring project of some of the most distant radio-quiet quasars by obtaining one snapshot observation per Cycle for each of four sources at z>4. Combining these observations with six available X-ray epochs per source will provide basic temporal information over rest-frame timescales of 3-5 yr. We are supporting this project with Swift monitoring of luminous radio-quiet quasars at z=1.3-2.7 to break the L-z degeneracy and test evolutionary scenarios of the central engine in active galactic nuclei. Our ultimate goal is to provide a basic assessment of the X-ray variability properties of luminous quasars at the highest accessible redshifts that will serve as the benchmark for X-ray variability studies of such sources with future X-ray missions.
de Knegt, Leonardo V; Pires, Sara M; Löfström, Charlotta; Sørensen, Gitte; Pedersen, Karl; Torpdahl, Mia; Nielsen, Eva M; Hald, Tine
2016-03-01
Salmonella is an important cause of bacterial foodborne infections in Denmark. To identify the main animal-food sources of human salmonellosis, risk managers have relied on a routine application of a microbial subtyping-based source attribution model since 1995. In 2013, multiple locus variable number tandem repeat analysis (MLVA) substituted phage typing as the subtyping method for surveillance of S. Enteritidis and S. Typhimurium isolated from animals, food, and humans in Denmark. The purpose of this study was to develop a modeling approach applying a combination of serovars, MLVA types, and antibiotic resistance profiles for Salmonella source attribution, and to assess the utility of the results for food safety decision-makers. Full and simplified MLVA schemes from surveillance data were tested, and model fit and consistency of results were assessed using statistical measures. We conclude that the loci schemes STTR5/STTR10/STTR3 for S. Typhimurium and SE9/SE5/SE2/SE1/SE3 for S. Enteritidis can be used in microbial subtyping-based source attribution models. Based on the results, we discuss that an adjustment of the discriminatory level of the subtyping method applied will often be required to fit the purpose of the study and the available data. The issues discussed are also considered highly relevant when applying, e.g., extended multi-locus sequence typing or next-generation sequencing techniques. © 2015 Society for Risk Analysis.
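The combination of serovar, MLVA type, and resistance profile into a single subtype for attribution can be illustrated with a deliberately simplified sketch. The frequency-matching rule below is not the surveillance model actually used (which is a Bayesian microbial subtyping model); it only shows how composite subtype keys built from the reported loci scheme might feed such a model.

from collections import Counter, defaultdict

# Loci retained in the simplified S. Typhimurium scheme discussed above.
STM_LOCI = ("STTR5", "STTR10", "STTR3")

def subtype_key(isolate, loci=STM_LOCI):
    """Combine serovar, selected MLVA loci, and resistance profile into one key."""
    mlva = "-".join(str(isolate["mlva"][locus]) for locus in loci)
    return (isolate["serovar"], mlva, isolate["resistance"])

def naive_attribution(human_isolates, source_isolates):
    """Attribute human cases to animal-food sources in proportion to how often
    each source carries the case's subtype (illustration only)."""
    by_subtype = defaultdict(Counter)
    for source, isolate in source_isolates:      # (source name, isolate record)
        by_subtype[subtype_key(isolate)][source] += 1
    attribution = Counter()
    for isolate in human_isolates:
        counts = by_subtype.get(subtype_key(isolate))
        if not counts:
            attribution["unattributed"] += 1
            continue
        total = sum(counts.values())
        for source, n in counts.items():
            attribution[source] += n / total
    return attribution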
Barthel, D; Fischer, K I; Nolte, S; Otto, C; Meyrose, A-K; Reisinger, S; Dabs, M; Thyen, U; Klein, M; Muehlan, H; Ankermann, T; Walter, O; Rose, M; Ravens-Sieberer, U
2016-03-01
To describe the implementation process of a computer-adaptive test (CAT) for measuring health-related quality of life (HRQoL) of children and adolescents in two pediatric clinics in Germany. The study focuses on the feasibility and user experience with the Kids-CAT, particularly the patients' experience with the tool and the pediatricians' experience with the Kids-CAT Report. The Kids-CAT was completed by 312 children and adolescents with asthma, diabetes or rheumatoid arthritis. The test was applied during four clinical visits over a 1-year period. A feedback report with the test results was made available to the pediatricians. To assess both feasibility and acceptability, a multimethod research design was used. To assess the patients' experience with the tool, the children and adolescents completed a questionnaire. To assess the clinicians' experience, two focus groups were conducted with eight pediatricians. The children and adolescents indicated that the Kids-CAT was easy to complete. All pediatricians reported that the Kids-CAT was straightforward and easy to understand and integrate into clinical practice; they also expressed that routine implementation of the tool would be desirable and that the report was a valuable source of information, facilitating the assessment of self-reported HRQoL of their patients. The Kids-CAT was considered an efficient and valuable tool for assessing HRQoL in children and adolescents. The Kids-CAT Report promises to be a useful adjunct to standard clinical care with the potential to improve patient-physician communication, enabling pediatricians to evaluate and monitor their young patients' self-reported HRQoL.
NASA Astrophysics Data System (ADS)
Colli, M.; Lanza, L. G.; La Barbera, P.; Chan, P. W.
2014-07-01
The contribution of any single uncertainty factor to the resulting performance of in-field rain gauge measurements still has to be comprehensively assessed, owing to the high number of real-world error sources involved, such as the intrinsic variability of rainfall intensity (RI), wind effects, wetting losses, the ambient temperature, etc. In recent years the World Meteorological Organization (WMO) addressed these issues by fostering dedicated investigations, which revealed further difficulties in assessing the actual reference rainfall intensity in the field. This work reports on an extensive assessment of the accuracy of the OTT Pluvio2 weighing gauge when measuring rainfall intensity under dynamic laboratory conditions (time-varying reference flow rates). The results obtained from the weighing rain gauge (WG) were also compared with those from an MTX tipping-bucket rain gauge (TBR) under the same test conditions. Tests were carried out by simulating various artificial precipitation events, with unsteady rainfall intensity, using a suitable dynamic rainfall generator. Real-world rainfall data measured by an Ogawa catching-type drop counter at a field test site located within the Hong Kong International Airport (HKIA) were used as a reference for the artificial rain generation system. Results demonstrate that the differences observed between the laboratory and field performance of catching-type gauges are only partially attributable to the weather and operational conditions in the field. The dynamics of real-world precipitation events is responsible for a large part of the measurement errors, which can be accurately assessed in the laboratory under controlled environmental conditions. This allows for new testing methodologies and the development of instruments with enhanced performance in the field.
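Comparing each gauge's intensity series against the reference series produced by the dynamic rainfall generator reduces, at its simplest, to a relative-error calculation; the sketch below uses invented 1-minute intensities and is not drawn from the paper's data.

import numpy as np

def relative_error_percent(measured, reference):
    """Percentage error of a gauge's rainfall-intensity series against the reference."""
    measured = np.asarray(measured, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return 100.0 * (measured - reference) / reference

# Hypothetical 1-minute rainfall intensities in mm/h for a short unsteady event.
ref = np.array([20.0, 45.0, 80.0, 60.0, 30.0])   # reference (generator)
wg = np.array([19.5, 44.0, 77.5, 59.0, 29.6])    # weighing gauge
tbr = np.array([18.0, 42.0, 72.0, 56.5, 28.0])   # tipping-bucket gauge

print("WG mean error (%): ", round(relative_error_percent(wg, ref).mean(), 2))
print("TBR mean error (%):", round(relative_error_percent(tbr, ref).mean(), 2))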
Di Nardo, Francesco; Mengoni, Michele; Morettini, Micaela
2013-05-01
The present study provides a novel MATLAB-based parameter estimation procedure for individual assessment of the hepatic insulin degradation (HID) process from standard frequently-sampled intravenous glucose tolerance test (FSIGTT) data. Direct access to the source code, offered by MATLAB, enabled us to design an optimization procedure based on the alternating use of Gauss-Newton's and Levenberg-Marquardt's algorithms, which assures the full convergence of the process and contains the computational time. Reliability was tested by direct comparison with the application, in eighteen non-diabetic subjects, of the well-known kinetic analysis software package SAAM II, and by application to different data. Agreement between MATLAB and SAAM II was warranted by intraclass correlation coefficients ≥0.73, no significant differences between corresponding mean parameter estimates and predictions of the HID rate, and consistent residual analysis. Moreover, the MATLAB optimization procedure resulted in a significant 51% reduction of the CV% for the parameter worst estimated by SAAM II and maintained all model-parameter CV% values <20%. In conclusion, our MATLAB-based procedure is suggested as a suitable tool for the individual assessment of the HID process. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
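The alternation of Gauss-Newton and Levenberg-Marquardt steps can be sketched in a few lines; the version below is a generic least-squares illustration in Python (scipy supplies the LM refinement), not the authors' MATLAB implementation or their HID model.

import numpy as np
from scipy.optimize import least_squares

def gauss_newton_step(residual, p, eps=1e-6):
    """One Gauss-Newton update using a forward-difference Jacobian."""
    r0 = residual(p)
    J = np.empty((r0.size, p.size))
    for j in range(p.size):
        dp = np.zeros_like(p)
        dp[j] = eps * max(1.0, abs(p[j]))
        J[:, j] = (residual(p + dp) - r0) / dp[j]
    delta, *_ = np.linalg.lstsq(J, -r0, rcond=None)
    return p + delta

def alternating_fit(residual, p0, n_outer=5, gn_iters=3):
    """Alternate a few Gauss-Newton steps with a Levenberg-Marquardt refinement."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_outer):
        for _ in range(gn_iters):
            p = gauss_newton_step(residual, p)
        p = least_squares(residual, p, method="lm").x
    return p

# Toy example: fit y = a * exp(-b * t) to noisy data.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 40)
y = 2.0 * np.exp(-0.3 * t) + rng.normal(0.0, 0.02, t.size)
params = alternating_fit(lambda p: p[0] * np.exp(-p[1] * t) - y, [1.0, 0.1])
print(params)  # approximately [2.0, 0.3]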
Cravotta, Charles A.
2005-01-01
This report describes field, laboratory, and computational methods that could be used to assess remedial strategies for abandoned mine drainage (AMD). During April-June 2004, the assessment process was applied to AMD from bituminous coal deposits at a test site in the Staple Bend Tunnel Unit of Allegheny Portage Railroad National Historic Site (ALPO-SBTU) in Cambria County, Pennsylvania. The purpose of this study was (1) to characterize the AMD quantity and quality within the ALPO-SBTU test site; (2) to evaluate the efficacy of limestone or steel slag for neutralization of the AMD on the basis of reaction-rate measurements; and (3) to identify possible alternatives for passive or active treatment of the AMD. The data from this case study ultimately will be used by the National Park Service (NPS) to develop a site remediation plan. The approach used in this study could be applicable at other sites subject to drainage from abandoned coal or metal mines. During April 2004, AMD from 9 sources (sites 1, 1Fe, Fe, 2, 3, 3B, 5, 6, and 7) at the ALPO-SBTU test site had a combined flow rate of 1,420 gallons per minute (gal/min) and flow-weighted averages for pH of 3.3, net acidity of 55 milligrams per liter (mg/L) as CaCO3, and concentrations of dissolved sulfate, aluminum, iron, and manganese of 694 mg/L, 4.4 mg/L, 0.74 mg/L, and 1.2 mg/L, respectively. These pH, net acidity, sulfate, and aluminum values exceed effluent criteria for active mines in Pennsylvania. During April-June 2004, limestone and steel slag that were locally available were tested in the laboratory for their composition, approximate surface area, and potential to neutralize samples of the AMD. Although the substrates had a similar particle-size distribution and identical calcium content (43 percent as calcium oxide), the limestone was composed of crystalline carbonates and the slag was composed of silicate glass and minerals. After a minimum of 8 hours contact between the AMD and limestone or steel slag in closed containers (cubitainers), near-neutral effluent was produced. With prolonged contact between the AMD and limestone or steel slag, the concentrations of iron, aluminum, and most dissolved trace elements in effluent from the cubitainers declined while pH was maintained greater than 6.0 and less than 9.0. The cubitainer testing demonstrated (1) lower alkalinity production but higher pH of AMD treated with steel slag compared to limestone, and (2) predictable relations between the effluent quality, detention time, and corresponding flow rate and bulk volume for a bed of crushed limestone or steel slag in an AMD passive-treatment system. The process for evaluating AMD remedial strategies at the ALPO-SBTU test site involved the computation and ranking of the metal loadings during April 2004 for each of the AMD sources and a comparison of the data on AMD flow and chemistry (alkalinity, acidity, dissolved oxygen, ferric iron, aluminum) with published criteria for selection of passive-treatment technology. Although neutralization of the AMD by reaction with limestone was demonstrated with cubitainer tests, an anoxic limestone drain (ALD) was indicated as inappropriate for any AMD source at the test site because all had excessive concentrations of dissolved oxygen and (or) aluminum.
One passive-treatment scenario that was identified for the individual or combined AMD sources involved an open limestone channel (OLC) to collect the AMD source(s), a vertical flow compost wetland (VFCW) to add alkalinity, and an aerobic wetland to facilitate iron and manganese oxidation and retention of precipitated solids. Innovative passive-system designs that direct flow upward through submerged layers of limestone and/or steel slag and that incorporate siphons for automatic flushing of solids to a pond also may warrant consideration. Alternatively, an active-treatment system with a hydraulic-powered lime doser could be employed instead of the VFCW or upflow system. Now, given these data on AMD flow and chemistry and identified remedial technologies, a resource manager can use a publicly available computer program such as "AMDTreat" to evaluate the potential sizes and costs of various remedial alternatives.
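The flow-weighted averages and metal loadings used to rank the AMD sources are simple weighted computations; the sketch below uses hypothetical per-source values rather than the report's site data, with the conventional gal/min-to-lb/day conversion.

def flow_weighted_average(flows_gpm, concentrations_mgL):
    """Flow-weighted mean concentration across several AMD sources."""
    total_flow = sum(flows_gpm)
    return sum(q * c for q, c in zip(flows_gpm, concentrations_mgL)) / total_flow

def loading_lb_per_day(flow_gpm, conc_mgL):
    """Constituent loading in lb/day: gal/min -> Mgal/day, then the 8.34 factor."""
    mgd = flow_gpm * 1440.0 / 1.0e6
    return mgd * conc_mgL * 8.34

# Hypothetical discharges (gal/min) and iron concentrations (mg/L) for three sources.
flows = [600.0, 450.0, 370.0]
iron = [0.9, 0.6, 0.7]
fe_avg = flow_weighted_average(flows, iron)
print("Flow-weighted Fe (mg/L):", round(fe_avg, 2))
print("Combined Fe loading (lb/day):", round(loading_lb_per_day(sum(flows), fe_avg), 1))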
Auditory enhancement of increments in spectral amplitude stems from more than one source.
Carcagno, Samuele; Semal, Catherine; Demany, Laurent
2012-10-01
A component of a test sound consisting of simultaneous pure tones perceptually "pops out" if the test sound is preceded by a copy of itself with that component attenuated. Although this "enhancement" effect was initially thought to be purely monaural, it is also observable when the test sound and the precursor sound are presented contralaterally (i.e., to opposite ears). In experiment 1, we assessed the magnitude of ipsilateral and contralateral enhancement as a function of the time interval between the precursor and test sounds (10, 100, or 600 ms). The test sound, randomly transposed in frequency from trial to trial, was followed by a probe tone, either matched or mismatched in frequency to the test sound component which was the target of enhancement. Listeners' ability to discriminate matched probes from mismatched probes was taken as an index of enhancement magnitude. The results showed that enhancement decays more rapidly for ipsilateral than for contralateral precursors, suggesting that ipsilateral enhancement and contralateral enhancement stem from at least partly different sources. It could be hypothesized that, in experiment 1, contralateral precursors were effective only because they provided attentional cues about the target tone frequency. In experiment 2, this hypothesis was tested by presenting the probe tone before the precursor sound rather than after the test sound. Although the probe tone was then serving as a frequency cue, contralateral precursors were again found to produce enhancement. This indicates that contralateral enhancement cannot be explained by cuing alone and is a genuine sensory phenomenon.
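One common way to turn matched/mismatched probe judgments into a discrimination index is signal-detection d'; the sketch below shows that calculation with invented trial counts and is not necessarily the exact index the authors computed.

from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection d' for matched-probe ('signal') vs mismatched-probe
    ('noise') trials, with a 1/(2N) correction for rates of 0 or 1."""
    n_signal = hits + misses
    n_noise = false_alarms + correct_rejections
    hit_rate = min(max(hits / n_signal, 1.0 / (2 * n_signal)),
                   1.0 - 1.0 / (2 * n_signal))
    fa_rate = min(max(false_alarms / n_noise, 1.0 / (2 * n_noise)),
                  1.0 - 1.0 / (2 * n_noise))
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

print(round(d_prime(hits=42, misses=8, false_alarms=12, correct_rejections=38), 2))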
Motor function tests for 0-2-year-old children - a systematic review.
Kjølbye, Camilla Buch; Drivsholm, Thomas Bo; Ertmann, Ruth Kirk; Lykke, Kirsten; Rasmussen, Rasmus Køster
2018-06-01
There is no evidence on how motor function is best evaluated in children in a low-risk setting. The method used in the Danish Preventive Child Health Examination Programme (DPCHEP) in general practice has not been validated. The objective of this review was to identify existing motor function tests for 0-2-year-old children that have been validated for use in the background population and which are suitable for use in the DPCHEP. This systematic review was conducted in accordance with the PRISMA guidelines. A systematic literature search was performed in PubMed, Embase, SwedMed, PsycInfo and CINAHL in accordance with the inclusion and exclusion criteria. Five motor function tests were identified. The Alberta Infant Motor Scale (AIMS) exclusively assesses motor function, the Harris Infant Neuromotor Assessment also assesses cognition, and the Early Motor Questionnaire (EMQ) additionally assesses perception-action integration skills. The Ages and Stages Questionnaire (ASQ) and the Brigance Infant and Toddler Screen include further aspects of development. All test methods, except for the AIMS, are based on parent involvement. For implementation in the DPCHEP, five motor function tests were potentially adequate. However, the time consumption and extensive use of tools render three of the five tests unsuitable for implementation in the existing programme. The two remaining tests, the ASQ and the EMQ, are parent questionnaires. We suggest that these should be pilot tested with a view to their subsequent implementation in the DPCHEP. Presenting the test elements in a more manageable and systematic way, possibly with illustrations, may also be considered.
Energetic Materials Hazard Initiation: DoD Assessment Team Final Report
1987-05-05
which the Department of Defense is now emphasizing (JROC, 1986). Although this aspect of hazards reduction primarily involves fielded systems and the ... source of deficiency in the impact testing. Some efforts are reported in which phototransistors, IR sensors and/or pressure sensors are used to detect ... Montgomery (1959), and Moore (1973). Recent research on the electrostatic discharge sensitivity of solid propellant samples was begun at Societe Nationale des
Deterrence Impact Modeling Environment (DIME) Proof-of-Concept Test Evaluations and Findings
2016-06-01
sources of this caution: financial, technical, legal, and ethical. Several current Coast Guard policies complicate ongoing engagement with and assessment ... and ethical. There is evidence that several of the stakeholder communities most important to the Coast Guard have not been early adopters of the ... self-organization) or longer-term outcomes (such as over-harvesting, regeneration of biodiversity, resilience of an ecological system to human nature
Assessment of Logistics Effectiveness for Expeditionary Units
2017-12-01
navigating the bureaucracy when spending government money requires the ESU teams to be savvy supply experts. Figure 8. EOD Force Laydown. Source ... main uses is to help free a trapped diver from any number of hazards. The MK-16 equipment must withstand these conditions and not puncture ... of the valves being shipped to the units as repair parts. The pieces require an oxygen-free environment for testing, along with other stringent
ERIC Educational Resources Information Center
National Assessment Governing Board, 2013
2013-01-01
To what extent can young people analyze the pros and cons of a proposal to develop a new source of energy? Construct and test a model or prototype? Use the Internet to find and summarize data and information in order to solve a problem or achieve a goal? The exploding growth in the world of technology and the need to answer questions similar to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.
Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.
Bailes, Graham; Lind, Margaret; Ely, Andrew; Powell, Marianne; Moore-Kucera, Jennifer; Miles, Carol; Inglis, Debra; Brodhagen, Marion
2013-05-10
Fungi native to agricultural soils that colonized commercially available biodegradable mulch (BDM) films were isolated and assessed for potential to degrade plastics. Typically, when formulations of plastics are known and a source of the feedstock is available, powdered plastic can be suspended in agar-based media and degradation determined by visualization of clearing zones. However, this approach poorly mimics in situ degradation of BDMs. First, BDMs are not dispersed as small particles throughout the soil matrix. Secondly, BDMs are not sold commercially as pure polymers, but rather as films containing additives (e.g. fillers, plasticizers and dyes) that may affect microbial growth. The procedures described herein were used for isolates acquired from soil-buried mulch films. Fungal isolates acquired from excavated BDMs were tested individually for growth on pieces of new, disinfested BDMs laid atop defined medium containing no carbon source except agar. Isolates that grew on BDMs were further tested in liquid medium where BDMs were the sole added carbon source. After approximately ten weeks, fungal colonization and BDM degradation were assessed by scanning electron microscopy. Isolates were identified via analysis of ribosomal RNA gene sequences. This report describes methods for fungal isolation, but bacteria also were isolated using these methods by substituting media appropriate for bacteria. Our methodology should prove useful for studies investigating breakdown of intact plastic films or products for which plastic feedstocks are either unknown or not available. However our approach does not provide a quantitative method for comparing rates of BDM degradation.
Ahmed, W; Gyawali, P; Toze, S
2015-03-03
Quantitative PCR (qPCR) assays were used to determine the concentrations of E. coli and of shiga toxin-producing E. coli (STEC)-associated virulence genes (eaeA, stx1, stx2, and hlyA) in ten animal species (fecal sources) and in environmental water samples in Southeast Queensland, Australia. The mean Log10 concentrations and standard deviations of the E. coli 23S rRNA gene across fecal sources ranged from 1.3 ± 0.1 (horse) to 6.3 ± 0.4 (cattle wastewater) gene copies at a test concentration of 10 ng of DNA. The mean concentrations of E. coli 23S rRNA gene copies differed significantly among fecal source samples (P < 0.0001). Among the virulence genes, stx2 (25%, 95% CI, 17-33%) was most prevalent among fecal sources, followed by eaeA (19%, 95% CI, 12-27%), stx1 (11%, 95% CI, 5-17%) and hlyA (8%, 95% CI, 3-13%). The Log10 concentrations of STEC virulence genes in cattle wastewater samples ranged from 3.8 to 5.0 gene copies at a test concentration of 10 ng of DNA. Of the 18 environmental water samples tested, three (17%) were positive for eaeA, and two (11%) were also positive for the stx2 virulence gene. The data presented in this study will aid quantitative microbial risk assessment (QMRA) of fecal pollution from domestic and wild animals in drinking/recreational water catchments.
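The prevalence figures with 95% confidence intervals quoted above follow the usual proportion-plus-interval form; the sketch below shows a normal-approximation (Wald) interval with invented counts, since the study's denominators and exact interval method are not reproduced here.

import math

def prevalence_ci(positives, n, z=1.96):
    """Prevalence with a normal-approximation (Wald) 95% confidence interval."""
    p = positives / n
    half = z * math.sqrt(p * (1.0 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical counts of the same shape as the reported prevalences.
p, lo, hi = prevalence_ci(positives=28, n=112)
print(f"{100 * p:.0f}% (95% CI, {100 * lo:.0f}-{100 * hi:.0f}%)")  # -> 25% (95% CI, 17-33%)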
DOE Office of Scientific and Technical Information (OSTI.GOV)
West, B.G.; Collins, G.
1996-01-01
The Torquay Sub-basin is located in the offshore part of the eastern Otway Basin, some 50 km southwest of Melbourne. Three wells, all dry holes, were drilled between 1967 and 1992. Nerita-1, drilled in 1967, tested Eocene and Early Cretaceous reservoirs in a Miocene anticline. Snail-1, drilled in 1972, was not a valid structural test, and Wild Dog-1, drilled in 1992, tested Late Eocene sands in an Oligocene inversion faulted anticline sourced from Early Cretaceous coals. The area was assessed by previous explorers as lacking an effective source. Work currently underway indicates these wells were dry because of a lack of migration pathways to the Tertiary. To the west, significant gas has been discovered in Late Cretaceous reservoirs offshore at Minerva-11 and LaBella-1, and onshore in wells in the Port Campbell Embayment. In the Bass Basin to the south, there have been consistent oil, condensate and gas shows. Geochemical analysis of the Early Cretaceous Eumeralla Formation and Casterton beds throughout the Otway Basin demonstrates they contain source rocks capable of generating both oil and gas. Our study indicates that Early Cretaceous sandstones with porosities better than 20% may be present at depths of less than 2000 m in the Torquay Sub-basin in tilted fault blocks. Source would be from down-dip lacustrine shales of the Casterton Beds. The general basement high area in which this play is developed is some 15 km by 15 km with up to 400 m of relief.
Legionella detection by culture and qPCR: Comparing apples and oranges.
Whiley, Harriet; Taylor, Michael
2016-01-01
Legionella spp. are the causative agent of Legionnaires' disease and an opportunistic pathogen of significant public health concern. Identification and quantification from environmental sources are crucial for identifying outbreak origins and providing sufficient information for risk assessment and disease prevention. Currently there is a range of methods for Legionella spp. quantification from environmental sources, but the two most widely used and accepted are culture and real-time polymerase chain reaction (qPCR). This paper provides a review of these two methods and outlines their advantages and limitations. Studies from the last 10 years which have concurrently used culture and qPCR to quantify Legionella spp. from environmental sources have been compiled. In 26 of 28 studies, Legionella was detected at a higher rate using qPCR than using culture, whilst only one study detected equivalent levels of Legionella spp. using both qPCR and culture. Aggregating the environmental samples from all 28 studies, 2856/3967 (72%) tested positive for the presence of Legionella spp. using qPCR and 1331/3967 (34%) using culture. The lack of correlation between methods highlights the need to develop an acceptable standardized method for quantification that is sufficient for risk assessment and management of this human pathogen.
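The aggregate detection rates quoted in the review are straightforward proportions of the pooled samples; a minimal check of that arithmetic:

# Pooled sample counts quoted above (28 studies, 3967 environmental samples).
qpcr_positive, culture_positive, n_samples = 2856, 1331, 3967

qpcr_rate = qpcr_positive / n_samples        # about 0.72
culture_rate = culture_positive / n_samples  # about 0.34
print(f"qPCR: {qpcr_rate:.0%}, culture: {culture_rate:.0%}, "
      f"ratio: {qpcr_rate / culture_rate:.1f}x")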