How to Manual: How to Update and Enhance Your Local Source Water Protection Assessments
Describes opportunities for improving source water assessments performed under section 1453 of the Safe Drinking Water Act. It includes the components of a source water assessment: local delineations, potential contaminant source inventories, and susceptibility determinations.
ERIC Educational Resources Information Center
Heric, Matthew; Carter, Jenn
2011-01-01
Cognitive readiness (CR) and performance for operational time-critical environments are continuing points of focus for military and academic communities. In response to this need, we designed an open source interactive CR assessment application: a highly adaptive and efficient testing administration and analysis tool. It is capable…
Better Assessment Science Integrating Point and Non-point Sources (BASINS)
Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) is a multipurpose environmental analysis system designed to help regional, state, and local agencies perform watershed- and water quality-based studies.
This document contains the methods and the results of baseline risk assessments (i.e., after the implementation of the MACT standard) and the results of the post-control scenario risk assessment performed for the ferroalloys source category.
2009-09-01
…nuclear industry for conducting performance assessment calculations. The analytical FORTRAN code for the DNAPL source function, REMChlor, was…project. The first was to apply existing deterministic codes, such as T2VOC and UTCHEM, to the DNAPL source zone to simulate the remediation processes…but describe the spatial variability of source zones, unlike one-dimensional flow and transport codes that assume homogeneity. The Lagrangian models…
48 CFR 873.116 - Source selection decision.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...
48 CFR 873.116 - Source selection decision.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...
48 CFR 873.116 - Source selection decision.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...
48 CFR 873.116 - Source selection decision.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...
48 CFR 873.116 - Source selection decision.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...
RESIDUAL RISK ASSESSMENT: MAGNETIC TAPE ...
This document describes the residual risk assessment for the Magnetic Tape Manufacturing source category. For stationary sources, section 112(f) of the Clean Air Act requires EPA to assess risks to human health and the environment following implementation of technology-based control standards. If these technology-based control standards do not provide an ample margin of safety, then EPA is required to promulgate additional standards. This document describes the methodology and results of the residual risk assessment performed for the Magnetic Tape Manufacturing source category. The results of this analysis will assist EPA in determining whether a residual risk rule for this source category is appropriate.
Source term model evaluations for the low-level waste facility performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yim, M.S.; Su, S.I.
1995-12-31
The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.
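The comparative behaviour of source term models like those evaluated above can be illustrated with a minimal sketch. The first-order leach structure, leach rate, and inventory below are illustrative assumptions, not the formulation of PRESTO-EPA-CPG, IMPACTS, DUST, or NEFTRAN-II:

```python
import math

def source_term_release(inventory_bq, leach_rate_per_yr, half_life_yr, years):
    """Annual release (Bq/yr) from a waste form under a simple first-order
    leach model with radioactive decay -- an illustrative stand-in for the
    more detailed release mechanisms in dedicated source term codes."""
    decay = math.log(2) / half_life_yr
    releases = []
    remaining = inventory_bq
    for _ in range(years):
        released = remaining * leach_rate_per_yr   # fraction leached this year
        remaining = (remaining - released) * math.exp(-decay)
        releases.append(released)
    return releases

# carbon-14: half-life ~5730 yr; hypothetical 1 GBq inventory, 0.1%/yr leach rate
annual = source_term_release(1e9, 1e-3, 5730.0, 100)
```

Sensitivity analysis of the kind described in the abstract would vary `leach_rate_per_yr` (and the release structure itself) to see which parameters dominate the bottom-boundary flux.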
ERIC Educational Resources Information Center
Sebok, Stefanie S.; Roy, Marguerite; Klinger, Don A.; De Champlain, André F.
2015-01-01
Examiner effects and content specificity are two well known sources of construct irrelevant variance that present great challenges in performance-based assessments. National medical organizations that are responsible for large-scale performance based assessments experience an additional challenge as they are responsible for administering…
Global Inter-Laboratory Fecal Source Identification Methods Comparison Study
Source tracking is key to identifying sources of fecal contamination for remediation as well as risk assessment. Previous intra- and inter-lab studies have investigated the performance of human and cow-associated source tracking markers, as well as library-dependent fecal source ...
Lind, Sophie E; Bowler, Dermot M
2009-09-01
This study investigated semantic and episodic memory in autism spectrum disorder (ASD), using a task which assessed recognition and self-other source memory. Children with ASD showed undiminished recognition memory but significantly diminished source memory, relative to age- and verbal ability-matched comparison children. Both children with and without ASD showed an "enactment effect", demonstrating significantly better recognition and source memory for self-performed actions than other-person-performed actions. Within the comparison group, theory-of-mind (ToM) task performance was significantly correlated with source memory, specifically for other-person-performed actions (after statistically controlling for verbal ability). Within the ASD group, ToM task performance was not significantly correlated with source memory (after controlling for verbal ability). Possible explanations for these relations between source memory and ToM are considered.
RESIDUAL RISK ASSESSMENT: ETHYLENE OXIDE ...
This document describes the residual risk assessment for the Ethylene Oxide Commercial Sterilization source category. For stationary sources, section 112(f) of the Clean Air Act requires EPA to assess risks to human health and the environment following implementation of technology-based control standards. If these technology-based control standards do not provide an ample margin of safety, then EPA is required to promulgate additional standards. This document describes the methodology and results of the residual risk assessment performed for the Ethylene Oxide Commercial Sterilization source category. The results of this analysis will assist EPA in determining whether a residual risk rule for this source category is appropriate.
Better Assessment Science Integrating Point and Nonpoint Sources
Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) is not a model per se, but is a multipurpose environmental decision support system for use by regional, state, and local agencies in performing watershed- and water-quality-based studies. BASI...
THE MEASUREMENT AND USE OF CONTAMINANT FLUX FOR PERFORMANCE ASSESSMENT OF DNAPL REMEDIATION
A review is presented of both mass flux as a DNAPL remedial performance metric and reduction in mass flux as a remedial performance objective at one or more control planes downgradient of DNAPL source areas. The use of mass flux to assess remedial performance has been proposed ...
Perimal-Lewis, Lua; Teubner, David; Hakendorf, Paul; Horwood, Chris
2016-12-01
Effective and accurate use of routinely collected health data to produce Key Performance Indicator reporting is dependent on the underlying data quality. In this research, Process Mining methodology and tools were leveraged to assess the data quality of time-based Emergency Department data sourced from electronic health records. This research was done working closely with the domain experts to validate the process models. The hospital patient journey model was used to assess flow abnormalities which resulted from incorrect timestamp data used in time-based performance metrics. The research demonstrated process mining as a feasible methodology to assess data quality of time-based hospital performance metrics. The insight gained from this research enabled appropriate corrective actions to be put in place to address the data quality issues.
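The timestamp-ordering check at the heart of such a data-quality assessment can be sketched as follows. The record schema and field names are assumptions for illustration, not the study's actual data model:

```python
from datetime import datetime

# Hypothetical ED event log; field names are illustrative, not the study's schema.
records = [
    {"id": 1, "arrival": "2015-03-01 10:00", "seen": "2015-03-01 10:40",
     "departure": "2015-03-01 13:00"},
    {"id": 2, "arrival": "2015-03-01 11:00", "seen": "2015-03-01 10:30",
     "departure": "2015-03-01 12:00"},
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

def flag_flow_anomalies(records):
    """Flag records whose timestamps violate the expected patient-journey
    order (arrival <= seen <= departure) -- the kind of defect that
    distorts time-based KPIs such as ED length of stay."""
    bad = []
    for r in records:
        a, s, d = parse(r["arrival"]), parse(r["seen"]), parse(r["departure"])
        if not (a <= s <= d):
            bad.append(r["id"])
    return bad

flag_flow_anomalies(records)  # record 2 was "seen" before arrival
```

A process-mining tool generalizes this idea: instead of one hand-coded ordering rule, deviations are detected against a discovered or expert-validated journey model.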
2016-06-15
selection strategy is key to minimizing risk and ensuring best value for all stakeholders. On the basis of thorough market research, acquisition…administrative lead-time, Contractor Performance Assessment Reporting System ratings, and earned value management assessments) and source selection strategy…Postgraduate School. A. PURPOSE: This research analyzes LPTA and tradeoff source selection strategies and contract outcomes to determine if a relationship…
Crowd-sourced assessment of surgical skills in cricothyrotomy procedure.
Aghdasi, Nava; Bly, Randall; White, Lee W; Hannaford, Blake; Moe, Kris; Lendvay, Thomas S
2015-06-15
Objective assessment of surgical skills is resource intensive and requires valuable time of expert surgeons. The goal of this study was to assess the ability of a large group of laypersons using a crowd-sourcing tool to grade a surgical procedure (cricothyrotomy) performed on a simulator. The grading included an assessment of the entire procedure by completing an objective assessment of technical skills survey. Two groups of graders were recruited as follows: (1) Amazon Mechanical Turk users and (2) three expert surgeons from University of Washington Department of Otolaryngology. Graders were presented with a video of participants performing the procedure on the simulator and were asked to grade the video using the objective assessment of technical skills questions. Mechanical Turk users were paid $0.50 for each completed survey. It took 10 h to obtain all responses from 30 Mechanical Turk users for 26 training participants (26 videos/tasks), whereas it took 60 d for three expert surgeons to complete the same 26 tasks. The assessment of surgical performance by a group (n = 30) of laypersons matched the assessment by a group (n = 3) of expert surgeons with a good level of agreement determined by Cronbach alpha coefficient = 0.83. We found crowd sourcing was an efficient, accurate, and inexpensive method for skills assessment with a good level of agreement to experts' grading.
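The agreement statistic used in the study above, Cronbach's alpha, can be computed directly from aligned rater scores. This is a generic sketch with made-up scores, not the study's data:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for inter-rater consistency.

    scores: one inner list per rater, aligned across the same set of
    videos/tasks. alpha = k/(k-1) * (1 - sum(rater variances)/var(totals)).
    """
    k = len(scores)                       # number of raters
    n = len(scores[0])                    # number of videos/tasks

    def var(xs):                          # sample variance (n-1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(r[i] for r in scores) for i in range(n)]
    rater_vars = sum(var(r) for r in scores)
    return k / (k - 1) * (1 - rater_vars / var(totals))

# three hypothetical raters scoring four videos; high agreement -> alpha near 1
cronbach_alpha([[4, 3, 5, 2], [4, 3, 5, 2], [5, 3, 5, 2]])
```

An alpha around 0.83, as reported above, is conventionally read as good internal consistency between the crowd and expert gradings.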
Habeeb, Christine M; Eklund, Robert C; Coffee, Pete
2017-06-01
This study explored person-related sources of variance in athletes' efficacy beliefs and performances when performing in pairs with distinguishable roles differing in partner dependence. College cheerleaders (n = 102) performed their role in repeated performance trials of two low- and two high-difficulty paired-stunt tasks with three different partners. Data were obtained on self-, other-, and collective efficacy beliefs and subjective performances, and objective performance assessments were obtained from digital recordings. Using the social relations model framework, total variance in each belief/assessment was partitioned, for each role, into numerical components of person-related variance relative to the self, the other, and the collective. Variance component by performance role by task-difficulty repeated-measures analyses of variance revealed that the largest person-related variance component differed by athlete role and increased in size in high-difficulty tasks. Results suggest that the extent the athlete's performance depends on a partner relates to the extent the partner is a source of self-, other-, and collective efficacy beliefs.
ERIC Educational Resources Information Center
Sprondel, Volker; Kipp, Kerstin H.; Mecklinger, Axel
2011-01-01
Event-related potential (ERP) correlates of item and source memory were assessed in 18 children (7-8 years), 20 adolescents (13-14 years), and 20 adults (20-29 years) performing a continuous recognition memory task with object and nonobject stimuli. Memory performance increased with age and was particularly low for source memory in children. The…
ERIC Educational Resources Information Center
Plant, Jennifer L.; Corden, Mark; Mourad, Michelle; O'Brien, Bridget C.; van Schaik, Sandrijn M.
2013-01-01
Self-directed learning requires self-assessment of learning needs and performance, a complex process that requires collecting and interpreting data from various sources. Learners' approaches to self-assessment likely vary depending on the learner and the context. The aim of this study was to gain insight into how learners process external…
Enhancing Self-Efficacy and Performance: An Experimental Comparison of Psychological Techniques
ERIC Educational Resources Information Center
Wright, Bradley James; O'Halloran, Paul Daniel; Stukas, Arthur Anthony
2016-01-01
Purpose: We assessed how 6 psychological performance enhancement techniques (PETs) differentially improved self-efficacy (SE) and skill performance. We also assessed whether vicarious experiences and verbal persuasion as posited sources of SE (Bandura, 1982) were supported and, further, if the effects of the 6 PETs remained after controlling for…
Enhancing Self-Efficacy and Performance: An Experimental Comparison of Psychological Techniques.
Wright, Bradley James; O'Halloran, Paul Daniel; Stukas, Arthur Anthony
2016-01-01
We assessed how 6 psychological performance enhancement techniques (PETs) differentially improved self-efficacy (SE) and skill performance. We also assessed whether vicarious experiences and verbal persuasion as posited sources of SE (Bandura, 1982) were supported and, further, if the effects of the 6 PETs remained after controlling for achievement motivation traits and self-esteem. A within-subject design assessed each individual across 2 trials for 3 disparate PETs. A between-groups design assessed differences between PETs paired against each other for 3 similar novel tasks. Participants (N = 96) performed 2 trials of 10 attempts at each of the tasks (kick, throw, golf putt) in a counterbalanced sequence using their nondominant limb. Participants completed the Sport Orientation Questionnaire, Rosenberg Self-Esteem Scale, and General Self-Efficacy Scale and were randomly allocated to either the modeling or imagery, goal-setting or instructional self-statement, or knowledge-of-results or motivational feedback conditions aligned with each task. An instructional self-statement improved performance better than imagery, modeling, goal setting, and motivational and knowledge-of-results augmented feedback. Motivational auditory feedback most improved SE. Increased SE change scores were related to increased performance difference scores on all tasks after controlling for age, sex, achievement motivation, and self-esteem. Some sources of SE may be more influential than others on both SE and performance improvements. We provide partial support for the sources of SE proposed by Bandura's social-cognitive theory with verbal persuasion but not vicarious experiences improving SE.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Bucknor, Matthew; Jerden, James
2016-10-01
The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena with regard to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.
Feedback data sources that inform physician self-assessment.
Lockyer, Jocelyn; Armson, Heather; Chesluk, Benjamin; Dornan, Timothy; Holmboe, Eric; Loney, Elaine; Mann, Karen; Sargeant, Joan
2011-01-01
Self-assessment is a process of interpreting data about one's performance and comparing it to explicit or implicit standards. To examine the external data sources physicians used to monitor themselves. Focus groups were conducted with physicians who participated in three practice improvement activities: a multisource feedback program; a program providing patient and chart audit data; and practice-based learning groups. We used grounded theory strategies to understand the external sources that stimulated self-assessment and how they worked. Data from seven focus groups (49 physicians) were analyzed. Physicians used information from structured programs, other educational activities, professional colleagues, and patients. Data were of varying quality, often from non-formal sources with implicit (not explicit) standards. Mandatory programs elicited variable responses, whereas data and activities the physicians selected themselves were more likely to be accepted. Physicians used the information to create a reference point against which they could weigh their performance, using it variably depending on their personal interpretation of its accuracy, application, and utility. Physicians use and interpret data and standards of varying quality to inform self-assessment. Physicians may benefit from regular and routine feedback and guidance on how to seek out data for self-assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report presents the economic analysis of final effluent limitation guidelines, New Source Performance Standards, and pretreatment standards being promulgated for the steam-electric power plant point source category. It describes the costs of the final regulations, assesses the effects of these costs on the electric utility industry, and examines the cost-effectiveness of the regulations.
Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, D.; Brunett, A.; Passerini, S.
Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), and funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.
Thermal Performance Data Services (TPDS)
NASA Technical Reports Server (NTRS)
French, Richard T.; Wright, Michael J.
2013-01-01
Initiated as a NASA Engineering and Safety Center (NESC) assessment in 2009, the Thermal Performance Database (TPDB) was a response to the need for a centralized thermal performance data archive. The assessment was renamed Thermal Performance Data Services (TPDS) in 2012; the undertaking has had two fronts of activity: the development of a repository software application and the collection of historical thermal performance data sets from dispersed sources within the thermal performance community. This assessment has delivered a foundational tool on which additional features should be built to increase efficiency, expand the protection of critical Agency investments, and provide new discipline-advancing work opportunities. This report contains the information from the assessment.
Before new, rapid quantitative PCR (qPCR) methods for recreational water quality assessment and microbial source tracking (MST) can be useful in a regulatory context, an understanding of the ability of the method to detect a DNA target (marker) when the contaminant source has been...
Evaluation of head orientation and neck muscle EMG signals as three-dimensional command sources.
Williams, Matthew R; Kirsch, Robert F
2015-03-05
High cervical spinal cord injuries result in significant functional impairments and affect both the injured individual and their family and caregivers. To help restore function to these individuals, multiple user interfaces are available to enable command and control of external devices. However, little work has been performed to assess the 3D performance of these interfaces. We investigated the performance of eight human subjects in using three user interfaces (head orientation, EMG from muscles of the head and neck, and a three-axis joystick) to command the endpoint position of a multi-axis robotic arm within a 3D workspace to perform a novel out-to-center 3D Fitts' Law style task. Two of these interfaces (head orientation and EMG from muscles of the head and neck) could realistically be used by individuals with high tetraplegia, while the joystick was evaluated as a standard of high performance. Performance metrics were developed to assess aspects of command source performance. Data were analyzed using a mixed-model design ANOVA. Fixed effects were investigated between sources, as well as interactions between index of difficulty, command source, and the five performance measures used. A 5% threshold for statistical significance was used in the analysis. The performances of the three command interfaces were rather similar, though significant differences between command sources were observed. The apparent similarity is due in large part to the sequential command strategy (i.e., one dimension of movement at a time) typically adopted by the subjects. EMG-based commands were particularly pulsatile in nature. The use of sequential commands had a significant impact on each command source's performance for movements in two or three dimensions. While the sequential nature of the commands produced by the user did not fit Fitts' Law, the other performance measures used were able to illustrate the properties of each command source.
Though pulsatile, given the overall similarity between head orientation and the EMG interface (which could also be readily included in a future implanted neuroprosthesis), the use of EMG as a command source for controlling an arm in 3D space is an attractive choice.
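The Fitts' Law quantities referenced in the study above, the index of difficulty and throughput as a derived performance measure, can be sketched with the common Shannon formulation. The distances, widths, and times below are illustrative, not the study's values:

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits:
    ID = log2(D/W + 1), for movement amplitude D to a target of width W."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time_s):
    """Throughput (bits/s) = ID / MT, a common summary metric for
    comparing command interfaces across task difficulties."""
    return index_of_difficulty(distance, width) / movement_time_s

index_of_difficulty(0.30, 0.10)  # 30 cm reach to a 10 cm target: ~2 bits
```

As the abstract notes, sequential one-axis-at-a-time commands may not follow the Fitts' Law speed-accuracy tradeoff, which is why complementary performance measures were needed.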
Performance Analysis of GAME: A Generic Automated Marking Environment
ERIC Educational Resources Information Center
Blumenstein, Michael; Green, Steve; Fogelman, Shoshana; Nguyen, Ann; Muthukkumarasamy, Vallipuram
2008-01-01
This paper describes the Generic Automated Marking Environment (GAME) and provides a detailed analysis of its performance in assessing student programming projects and exercises. GAME has been designed to automatically assess programming assignments written in a variety of languages based on the "structure" of the source code and the correctness…
NASA Astrophysics Data System (ADS)
García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.
2007-10-01
A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site-dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
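The logic-tree step described above amounts, for each ground-motion level, to a weighted combination over branches (source zoning × ground-motion model). A minimal sketch, with purely illustrative weights and exceedance probabilities rather than the study's values:

```python
# Each logic-tree branch pairs a source zonation with a ground-motion model.
# Weights must sum to 1; values are illustrative only.
branches = [
    # (weight, annual probability of exceeding PGA >= 0.2 g for this branch)
    (0.5, 2.1e-3),
    (0.3, 1.5e-3),
    (0.2, 3.0e-3),
]

def combined_exceedance(branches):
    """Weighted-mean hazard over logic-tree branches at one ground-motion
    level; repeating this across levels yields the mean hazard curve."""
    assert abs(sum(w for w, _ in branches) - 1.0) < 1e-9
    return sum(w * p for w, p in branches)

combined_exceedance(branches)
```

The spread of the branch values around this mean is what drives the coefficient-of-variation (COV) maps reported alongside the hazard maps.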
The Reliability and Sources of Error of Using Rubrics-Based Assessment for Student Projects
ERIC Educational Resources Information Center
Menéndez-Varela, José-Luis; Gregori-Giralt, Eva
2018-01-01
Rubrics are widely used in higher education to assess performance in project-based learning environments. To date, the sources of error that may affect their reliability have not been studied in depth. Using generalisability theory as its starting-point, this article analyses the influence of the assessors and the criteria of the rubrics on the…
Sokolova, Ekaterina; Petterson, Susan R; Dienus, Olaf; Nyström, Fredrik; Lindgren, Per-Eric; Pettersson, Thomas J R
2015-09-01
Norovirus contamination of drinking water sources is an important cause of waterborne disease outbreaks. Knowledge on pathogen concentrations in source water is needed to assess the ability of a drinking water treatment plant (DWTP) to provide safe drinking water. However, pathogen enumeration in source water samples is often not sufficient to describe the source water quality. In this study, the norovirus concentrations were characterised at the contamination source, i.e. in sewage discharges. Then, the transport of norovirus within the water source (the river Göta älv in Sweden) under different loading conditions was simulated using a hydrodynamic model. Based on the estimated concentrations in source water, the required reduction of norovirus at the DWTP was calculated using quantitative microbial risk assessment (QMRA). The required reduction was compared with the estimated treatment performance at the DWTP. The average estimated concentration in source water varied between 4.8×10^2 and 7.5×10^3 genome equivalents L^-1; and the average required reduction by treatment was between 7.6 and 8.8 log10. The treatment performance at the DWTP was estimated to be adequate to deal with all tested loading conditions, but was heavily dependent on chlorine disinfection, with the risk of poor reduction by conventional treatment and slow sand filtration. To our knowledge, this is the first article to employ discharge-based QMRA, combined with hydrodynamic modelling, in the context of drinking water.
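The required-reduction calculation central to such a QMRA can be sketched as a log10 ratio of the source-water concentration to a risk-based target concentration. The numbers below are illustrative, not the study's outputs:

```python
import math

def required_log_reduction(source_conc, target_conc):
    """Treatment reduction (in log10 units) needed to bring the source-water
    pathogen concentration down to a target concentration derived from an
    acceptable risk level (e.g. via a dose-response model)."""
    return math.log10(source_conc / target_conc)

# e.g. 7.5e3 genome equivalents per litre in source water and a hypothetical
# risk-based target of 1e-5 per litre
required_log_reduction(7.5e3, 1e-5)  # ~8.9 log10
```

Comparing this requirement against the summed log credits of each treatment barrier (coagulation, filtration, disinfection) is what identifies the dependence on chlorine disinfection noted above.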
Performance characterization of a solenoid-type gas valve for the H- magnetron source at FNAL
NASA Astrophysics Data System (ADS)
Sosa, A.; Bollinger, D. S.; Karns, P. R.
2017-08-01
The magnetron-style H- ion sources currently in operation at Fermilab use piezoelectric gas valves to function. This kind of gas valve is sensitive to small changes in ambient temperature, which affect the stability and performance of the ion source. This motivates the need to find an alternative way of feeding H2 gas into the source. A solenoid-type gas valve has been characterized in a dedicated off-line test stand to assess the feasibility of its use in the operational ion sources. H- ion beams have been extracted at 35 keV using this valve. In this study, the performance of the solenoid gas valve has been characterized measuring the beam current output of the magnetron source with respect to the voltage and pulse width of the signal applied to the gas valve.
2012-06-01
Source Compositions for HPS Dataset… Figure 25: Comparison of Source Apportionment for HPS Dataset… The similarity in the three source patterns from HPS makes the apportionment less certain at that site compared to the four source patterns at… apportionment of these sources across the site. Overall these techniques passed all the performance assessment tests that are presented in Section 6.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Archer, Daniel E.; Hornback, Donald Eric; Johnson, Jeffrey O.
This report summarizes the findings of a two year effort to systematically assess neutron and gamma backgrounds relevant to operational modeling and detection technology implementation. The first year effort focused on reviewing the origins of background sources and their impact on measured rates in operational scenarios of interest. The second year has focused on the assessment of detector and algorithm performance as they pertain to operational requirements against the various background sources and background levels.
Performance Characterization of a Solenoid-type Gas Valve for the H- Magnetron Source at FNAL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sosa, A.; Bollinger, D. S.; Karns, P. R.
2016-09-06
The magnetron-style H- ion sources currently in operation at Fermilab use piezoelectric gas valves to function. This kind of gas valve is sensitive to small changes in ambient temperature, which affect the stability and performance of the ion source. This motivates the need to find an alternative way of feeding H2 gas into the source. A solenoid-type gas valve has been characterized in a dedicated off-line test stand to assess the feasibility of its use in the operational ion sources. H- ion beams have been extracted at 35 keV using this valve. In this study, the performance of the solenoid gas valve has been characterized measuring the beam current output of the magnetron source with respect to the voltage and pulse width of the signal applied to the gas valve.
Deterministic Seismic Hazard Assessment of Center-East IRAN (55.5-58.5˚ E, 29-31˚ N)
NASA Astrophysics Data System (ADS)
Askari, M.; Neyestani, B.
2009-04-01
Deterministic Seismic Hazard Assessment of Center-East Iran (55.5-58.5˚E, 29-31˚N). Mina Askari, Behnoosh Neyestani, Students of Science and Research University, Iran. A deterministic seismic hazard assessment was performed for Center-East Iran, covering Kerman and adjacent regions within a 100 km radius. A catalogue of earthquakes in the region, including both historical and instrumental events, was compiled. A total of 25 potential seismic source zones, delineated as area sources on the basis of geological, seismological, and geophysical information, were used for the hazard assessment. The minimum distance from each seismic source to the site (Kerman) and the maximum magnitude of each source were then determined. Finally, using the Abrahamson and Litehiser (1989) attenuation relationship, the maximum acceleration is estimated to be 0.38g, associated with movement on a blind fault whose maximum magnitude is Ms=5.5.
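The deterministic procedure the abstract describes (take each source zone's maximum magnitude and closest distance to the site, apply an attenuation relationship, and keep the controlling value) can be sketched in a few lines. The attenuation coefficients and source list below are hypothetical placeholders for illustration, not the Abrahamson and Litehiser (1989) regression or the actual 25 Kerman source zones:

```python
import math

# Generic attenuation form: ln(PGA[g]) = a + b*M - c*ln(R + d).
# The coefficients a, b, c, d are assumed placeholder values.
def peak_ground_accel(magnitude, distance_km, a=-1.5, b=0.45, c=1.0, d=10.0):
    return math.exp(a + b * magnitude - c * math.log(distance_km + d))

# (Ms, closest distance to site in km) for a few hypothetical source zones
sources = [(7.2, 60.0), (6.5, 35.0), (5.5, 4.0)]

# Deterministic assessment: keep the largest PGA any source can produce.
# Here the nearby low-magnitude source controls, echoing how a close
# Ms=5.5 blind fault can dominate the hazard.
pga = max(peak_ground_accel(m, r) for m, r in sources)
print(round(pga, 3))
```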
NASA Astrophysics Data System (ADS)
Cai, Z.; Wilson, R. D.
2009-05-01
Techniques for optimizing the removal of NAPL mass in source zones have advanced at a more rapid rate than strategies to assess treatment performance. Informed selection of remediation approaches would be easier if measurements of performance were more directly transferable. We developed a number of methods based on data generated from multilevel sampler (MLS) transects to assess the effectiveness of a bioaugmentation/biostimulation trial in a TCE source residing in a terrace gravel aquifer in the East Midlands, UK. In this spatially complex aquifer, treatment inferred from long-screen monitoring well data was not as reliable as that inferred from mass flux changes across transects installed in and downgradient of the source. Falling-head tests were conducted in the MLS ports to generate the necessary hydraulic conductivity (K) data. Combining K with concentration provides a mass flux map that allows calculation of mass turnover and an assessment of where in the complex geology the greatest turnover occurred. Five snapshots over a 600-day period indicate a marked reduction in TCE flux, suggesting a significantly greater reduction in DNAPL mass than expected from natural processes alone. However, persistence of daughter products suggested that complete dechlorination did not occur. The MLS fence data also revealed that delivery of both the carbon source and the pH buffer was not uniform across the test zone. This may have led to the generation of niches of iron(III) and sulphate reduction as well as methanogenesis, which affected dechlorination processes. In the absence of such spatial data, it is difficult to reconcile apparent treatment as indicated in monitoring well data with ongoing processes.
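The transect calculation described above (combine port-scale K with concentration to build a mass flux map, then integrate over the transect face to get a mass discharge) reduces to a few lines. All numbers below are illustrative, not the East Midlands field data:

```python
import numpy as np

# Hypothetical multilevel-sampler (MLS) transect data: hydraulic
# conductivity K from falling-head tests, TCE concentration C, an assumed
# hydraulic gradient, and the transect face area represented by each port.
K = np.array([5.0, 20.0, 1.0, 15.0])   # m/day, one value per port
C = np.array([0.8, 2.5, 0.1, 1.2])     # mg/L TCE at each port
gradient = 0.005                        # dimensionless hydraulic gradient
cell_area = 0.25                        # m^2 of transect face per port

# Darcy flux per cell, then mass flux: (m/day) x (mg/L = g/m^3) = g/(m^2 day)
q = K * gradient
mass_flux = q * C

# Integrate across the transect face to get mass discharge (g/day)
mass_discharge = float(np.sum(mass_flux * cell_area))
print(round(mass_discharge, 4))
```

The per-cell `mass_flux` array is the "mass flux map": it shows which part of the heterogeneous transect carries the greatest turnover.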
Holst, Daniel; Kowalewski, Timothy M; White, Lee W; Brand, Timothy C; Harper, Jonathan D; Sorenson, Mathew D; Kirsch, Sarah; Lendvay, Thomas S
2015-05-01
Crowdsourcing is the practice of obtaining services from a large group of people, typically an online community. Validated methods of evaluating surgical video are time-intensive, expensive, and involve participation of multiple expert surgeons. We sought to obtain valid performance scores of urologic trainees and faculty on a dry-laboratory robotic surgery task module by using crowdsourcing through a web-based grading tool called Crowd Sourced Assessment of Technical Skill (CSATS). IRB approval was granted to test the technical skills grading accuracy of Amazon.com Mechanical Turk™ crowd-workers compared to three expert faculty surgeon graders. The two groups assessed dry-laboratory robotic surgical suturing performances of three urology residents (PGY-2, -4, -5) and two faculty using three performance domains from the validated Global Evaluative Assessment of Robotic Skills assessment tool. After an average of 2 hours 50 minutes, each of the five videos received 50 crowd-worker assessments. The inter-rater reliability (IRR) between the surgeons and crowd was 0.91 using Cronbach's alpha statistic (confidence intervals=0.20-0.92), indicating an agreement level between the two groups of "excellent." The crowds were able to discriminate the surgical level, and both the crowds and the expert faculty surgeon graders scored one senior trainee's performance above a faculty's performance. Surgery-naive crowd-workers can rapidly assess varying levels of surgical skill accurately relative to a panel of faculty raters. The crowds provided rapid feedback and were inexpensive. CSATS may be a valuable adjunct to surgical simulation training as requirements for more granular and iterative performance tracking of trainees become mandated and commonplace.
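The inter-rater agreement statistic used in the study, Cronbach's alpha, can be computed directly from a rater-by-subject score matrix. The five video scores below are invented for illustration; only the formula mirrors the paper:

```python
import numpy as np

# Illustrative mean ratings of five videos (not the study's raw data)
experts = np.array([12.0, 18.5, 21.0, 23.5, 22.0])
crowd   = np.array([11.5, 18.0, 21.5, 24.0, 21.0])

def cronbach_alpha(ratings):
    """Cronbach's alpha for a raters x subjects matrix."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[0]
    item_vars = ratings.var(axis=1, ddof=1).sum()   # per-rater variances
    total_var = ratings.sum(axis=0).var(ddof=1)     # variance of summed scores
    return k / (k - 1) * (1.0 - item_vars / total_var)

alpha = cronbach_alpha(np.vstack([experts, crowd]))
print(round(alpha, 3))
```

Values near 1 correspond to the "excellent" agreement level reported between the crowd and the faculty panel.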
Open source database of images DEIMOS: extension for large-scale subjective image quality assessment
NASA Astrophysics Data System (ADS)
Vítek, Stanislav
2014-09-01
DEIMOS (Database of Images: Open Source) is an open-source database of images and video sequences for testing, verification, and comparison of various image and/or video processing techniques such as compression, reconstruction, and enhancement. This paper deals with an extension of the database that allows large-scale, web-based subjective image quality assessment to be performed. The extension implements both an administrative and a client interface. The proposed system is aimed mainly at mobile communication devices, taking advantage of HTML5 technology; participants don't need to install any application, and the assessment can be performed using a web browser. The assessment campaign administrator can select images from the large database and then apply rules defined by various test procedure recommendations. The standard test procedures may be fully customized and saved as templates. Alternatively, the administrator can define a custom test using images from the pool and other components, such as evaluation forms and ongoing questionnaires. The image sequence is delivered to the online client, e.g., a smartphone or tablet, as a fully automated assessment sequence, or the viewer can decide on the timing of the assessment if required. Environmental data and viewing conditions (e.g., illumination, vibrations, GPS coordinates) may be collected and subsequently analyzed.
Web-Based Portfolio Assessment: Validation of an Open Source Platform
ERIC Educational Resources Information Center
Collins, Regina; Elliot, Norbert; Klobucar, Andrew; Deek, Fadi P.
2013-01-01
Assessment of educational outcomes through purchased tests is commonplace in the evaluation of individual student ability and of educational programs. Focusing on the assessment of writing performance in a longitudinal study of first-time, full-time students (n = 598), this research describes the design, use, and assessment of an open-source…
ERIC Educational Resources Information Center
Federer, Meghan Rector; Nehm, Ross H.; Pearl, Dennis K.
2016-01-01
Understanding sources of performance bias in science assessment provides important insights into whether science curricula and/or assessments are valid representations of student abilities. Research investigating assessment bias due to factors such as instrument structure, participant characteristics, and item types are well documented across a…
Miniaturized pulsed laser source for time-domain diffuse optics routes to wearable devices
NASA Astrophysics Data System (ADS)
Di Sieno, Laura; Nissinen, Jan; Hallman, Lauri; Martinenghi, Edoardo; Contini, Davide; Pifferi, Antonio; Kostamovaara, Juha; Mora, Alberto Dalla
2017-08-01
We validate a miniaturized pulsed laser source for use in time-domain (TD) diffuse optics, following rigorous and shared protocols for performance assessment of this class of devices. This compact source (12×6 mm²) has been previously developed for range-finding applications and is able to provide short, high-energy (~100 ps, ~0.5 nJ) optical pulses at up to 1 MHz repetition rate. Here, we start with a basic laser characterization and an analysis of the suitability of this laser for the diffuse optics application. Then, we present a TD optical system using this source and its performance both in recovering the optical properties of tissue-mimicking homogeneous phantoms and in detecting localized absorption perturbations. Finally, as a proof of concept of in vivo application, we demonstrate that the system is able to detect hemodynamic changes occurring in the arm of healthy volunteers during a venous occlusion. Squeezing the laser source into a small footprint removes a key technological bottleneck that has so far hampered the realization of a miniaturized TD diffuse optics system, able to compete with already established continuous-wave devices in terms of size and cost, but with wider performance potentialities, as demonstrated by research over the last two decades.
Integrating Data Sources for Process Sustainability Assessments (presentation)
To perform a chemical process sustainability assessment requires significant data about chemicals, process design specifications, and operating conditions. The required information includes the identity of the chemicals used, the quantities of the chemicals within the context of ...
Technical assessment for quality control of resins
NASA Technical Reports Server (NTRS)
Gosnell, R. B.
1977-01-01
Survey visits to companies involved in the manufacture and use of graphite-epoxy prepregs were conducted to assess the factors which may contribute to variability in the mechanical properties of graphite-epoxy composites. In particular, the purpose was to assess the contributions of the epoxy resins to variability. Companies represented three segments of the composites industry - aircraft manufacturers, prepreg manufacturers, and epoxy resin manufacturers. Several important sources of performance variability were identified from among the complete spectrum of potential sources which ranged from raw materials to composite test data interpretation.
Fereday, Jennifer; Muir-Cochrane, Eimear
2004-01-01
Performance feedback is information provided to employees about how well they are performing in their work role. The nursing profession has a long history of providing formal, written performance reviews, traditionally from a manager to subordinate, with less formal feedback sources including peers, clients and multidisciplinary team members. This paper is based on one aspect of a PhD research study exploring the dynamics of performance feedback primarily from the nursing clinicians' perspective. The research reported here discusses the impact of the social relationship (between the source and recipient of performance feedback) on the recipient's evaluation of feedback as being 'credible' and 'useful' for self-assessment. Focus group interviews were utilised to ascertain the nursing clinicians' perspectives of performance feedback. Thematic analysis of the data was informed by the Social Phenomenology of Alfred Schutz (1967) specifically his theories of intersubjective understanding. Findings supported the level of familiarity between the feedback source and the nursing clinician as a significant criterion influencing the acceptance or rejection of feedback. Implications for the selection of performance feedback sources and processes within nursing are discussed.
ERIC Educational Resources Information Center
Gagnon, Laurie
2010-01-01
The perspectives of graduates offer a valuable source of understanding for educators and policy-makers on how to ensure high quality educational pathways that prepare all students for work and college. Based on in-depth interviews with graduates from three Boston Public Schools with well-established performance-based assessment systems, the study…
Open-Source Syringe Pump Library
Wijnen, Bas; Hunt, Emily J.; Anzalone, Gerald C.; Pearce, Joshua M.
2014-01-01
This article explores a new open-source method for developing and manufacturing high-quality scientific equipment suitable for use in virtually any laboratory. A syringe pump was designed using freely available open-source computer-aided design (CAD) software and manufactured using an open-source RepRap 3-D printer and readily available parts. The design, bill of materials, and assembly instructions are globally available to anyone wishing to use them. Details are provided covering the use of the CAD software and the RepRap 3-D printer. The use of an open-source Raspberry Pi computer as a wireless control device is also illustrated. Performance of the syringe pump was assessed and the methods used for assessment are detailed. The cost of the entire system, including the controller and web-based control interface, is on the order of 5% or less of what one would expect to pay for a commercial syringe pump of similar performance. The design should suit the needs of any research activity requiring a syringe pump, including carefully controlled dosing of reagents and pharmaceuticals and delivery of viscous 3-D printer media, among other applications. PMID:25229451
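The dosing resolution of such a pump follows from simple geometry: syringe cross-section times plunger travel per motor step. The diameter, leadscrew pitch, and microstepping values below are assumed for illustration and are not the published bill-of-materials figures:

```python
import math

# Assumed example hardware (not the published design's exact parts):
syringe_diameter_mm = 14.5    # inner diameter of a 10 mL syringe
leadscrew_pitch_mm = 1.25     # plunger travel per leadscrew revolution
steps_per_rev = 200 * 16      # 1.8-degree stepper with 16x microstepping

# Volume per motor step: cross-sectional area x travel per step (mm^3 == uL)
area_mm2 = math.pi * (syringe_diameter_mm / 2) ** 2
ul_per_step = area_mm2 * leadscrew_pitch_mm / steps_per_rev

def steps_for_dose(volume_ul):
    """Whole motor steps needed to deliver a requested dose."""
    return round(volume_ul / ul_per_step)

print(round(ul_per_step, 4), steps_for_dose(100.0))
```

With these assumed values the pump resolves roughly 0.06 µL per microstep, which is why carefully controlled reagent dosing is feasible with commodity parts.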
The fifteen-year performance of a granular-iron permeable reactive barrier (PRB; Elizabeth City, North Carolina) is reviewed with respect to contaminant treatment (hexavalent chromium and trichloroethylene) and hydraulic performance. Due to in-situ treatment of the chromium sourc...
Structure-specific scalar intensity measures for near-source and ordinary earthquake ground motions
Luco, N.; Cornell, C.A.
2007-01-01
Introduced in this paper are several alternative ground-motion intensity measures (IMs) that are intended for use in assessing the seismic performance of a structure at a site susceptible to near-source and/or ordinary ground motions. A comparison of such IMs is facilitated by defining the "efficiency" and "sufficiency" of an IM, both of which are criteria necessary for ensuring the accuracy of the structural performance assessment. The efficiency and sufficiency of each alternative IM, which are quantified via (i) nonlinear dynamic analyses of the structure under a suite of earthquake records and (ii) linear regression analysis, are demonstrated for the drift response of three different moderate- to long-period buildings subjected to suites of ordinary and of near-source earthquake records. One of the alternative IMs in particular is found to be relatively efficient and sufficient for the range of buildings considered and for both the near-source and ordinary ground motions. © 2007, Earthquake Engineering Research Institute.
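An IM's "efficiency" in this framework is the dispersion of structural response given the IM, estimated by regressing log response on log IM over a record suite; a smaller residual standard deviation means a more efficient IM. A minimal sketch with synthetic data (no real records or structures are used):

```python
import numpy as np

# Synthetic record suite: ln(IM) values and ln(drift) responses generated
# from an assumed underlying relation with lognormal scatter of 0.3.
rng = np.random.default_rng(0)
ln_im = rng.uniform(-2.0, 0.5, size=40)
ln_drift = 1.0 * ln_im + rng.normal(0.0, 0.3, size=40)

# Regress ln(drift) on ln(IM); the residual dispersion is the efficiency metric
b, a = np.polyfit(ln_im, ln_drift, 1)          # slope, intercept
residuals = ln_drift - (a + b * ln_im)
dispersion = float(residuals.std(ddof=2))      # ddof=2: two fitted parameters

print(round(dispersion, 3))
```

Comparing this dispersion across candidate IMs on the same record suite is how one IM is judged more efficient than another.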
Thulium heat source IR&D Project 91-031
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walter, C.E.; Kammeraad, J.E.; Newman, J.G.
1991-01-01
The goal of the Thulium Heat Source study is to determine the performance capability and evaluate the safety and environmental aspects of a thulium-170 heat source. Thulium-170 has several attractive features, including the fact that it decays to a stable, chemically innocuous isotope in a relatively short time. A longer-range goal is to attract government funding for the development, fabrication, and demonstration testing in an Autonomous Underwater Vehicle (AUV) of one or more thulium isotope power (TIP) prototype systems. The approach is to study parametrically the performance of thulium-170 heat source designs in the power range of 5-50 kW(th). At least three heat source designs will be characterized in this power range to assess their performance, mass, and volume. The authors will determine shielding requirements, and consider the safety and environmental aspects of their use.
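The "relatively short time" is set by Tm-170's half-life, and the thermal output of any design in the 5-50 kW(th) range decays accordingly. The sketch below uses the commonly quoted 128.6-day half-life for Tm-170 and an assumed 10 kW(th) beginning-of-life power as one point in the study's parametric range:

```python
import math

HALF_LIFE_DAYS = 128.6   # commonly quoted Tm-170 half-life
P0_KWTH = 10.0           # assumed beginning-of-life thermal power, kW(th)

def thermal_power(days):
    """Thermal output (kW_th) after `days` of radioactive decay."""
    return P0_KWTH * math.exp(-math.log(2.0) * days / HALF_LIFE_DAYS)

# Output halves after one half-life and falls to ~14% after one year,
# which bounds the useful mission duration of a TIP system.
print(round(thermal_power(128.6), 3), round(thermal_power(365.0), 3))
```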
Kim, Jin Woo; Choo, Ki Seok; Jeon, Ung Bae; Kim, Tae Un; Hwang, Jae Yeon; Yeom, Jeong A; Jeong, Hee Seok; Choi, Yoon Young; Nam, Kyung Jin; Kim, Chang Won; Jeong, Dong Wook; Lim, Soo Jin
2016-07-01
Multi-detector computed tomography (MDCT) angiography is now used for diagnosing patients with peripheral arterial disease. The radiation dose depends on several factors, such as tube current, tube voltage, and helical pitch. To assess the diagnostic performance and radiation dose of lower extremity CT angiography (CTA) using a 128-slice dual-source CT at 80 kVp and high pitch in patients with critical limb ischemia (CLI), twenty-eight patients (mean age, 64.1 years; range, 39-80 years) with CLI were enrolled in this retrospective study and underwent CTA using a 128-slice dual-source CT at 80 kVp and high pitch, with subsequent intra-arterial digital subtraction angiography (DSA) used as the reference standard for assessing diagnostic performance. For arterial segments with significant disease (>50% stenosis), overall sensitivity, specificity, and accuracy of lower extremity CTA were 94.8% (95% CI, 91.7-98.0%), 91.5% (95% CI, 87.7-95.2%), and 93.1% (95% CI, 90.6-95.6%), respectively, and its positive and negative predictive values were 91.0% (95% CI, 87.1-95.0%) and 95.1% (95% CI, 92.1-98.1%), respectively. The mean radiation dose delivered to the lower extremities was 266.6 mGy·cm. Lower extremity CTA using a 128-slice dual-source CT at 80 kVp and high pitch was found to have good diagnostic performance for the assessment of patients with CLI at an extremely low radiation dose. © The Foundation Acta Radiologica 2015.
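The reported performance figures follow from standard confusion-matrix formulas applied at the segment level. The TP/FP/TN/FN counts below are invented to approximately reproduce the published percentages; the paper itself reports only the derived rates:

```python
# Hypothetical segment-level counts against the DSA reference standard
tp, fp, tn, fn = 183, 18, 194, 10

sensitivity = tp / (tp + fn)              # true positives among diseased segments
specificity = tn / (tn + fp)              # true negatives among healthy segments
accuracy = (tp + tn) / (tp + fp + tn + fn)
ppv = tp / (tp + fp)                      # positive predictive value
npv = tn / (tn + fn)                      # negative predictive value

print(round(sensitivity, 3), round(specificity, 3),
      round(accuracy, 3), round(ppv, 3), round(npv, 3))
```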
Regulatory Guide on Conducting a Security Vulnerability Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ek, David R.
This document provides guidelines on conducting a security vulnerability assessment at a facility regulated by the Radiation Protection Centre. The guidelines offer a performance-based approach to assessing security effectiveness, and provide guidance for a review following the objectives outlined in IAEA NSS#11 for Category 1, 2, and 3 sources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, M.A.
1991-12-31
In conducting a performance assessment for a low-level waste (LLW) disposal facility, one of the important considerations for determining the source term, which is defined as the amount of radioactivity being released from the facility, is the quantity of radioactive material present. This quantity, which will be referred to as the source inventory, is generally estimated through a review of historical records and waste tracking systems at the LLW facility. In theory, estimating the total source inventory for Department of Energy (DOE) LLW disposal facilities should be possible by reviewing the national database maintained for LLW operations, the Solid Waste Information Management System (SWIMS), or through the annual report that summarizes the SWIMS data, the Integrated Data Base (IDB) report. However, in practice, there are some difficulties in making this estimate. This is not unexpected, since the SWIMS and the IDB were not developed with the goal of developing a performance assessment source term in mind. The practical shortcomings of using the existing data to develop a source term for DOE facilities will be discussed in this paper.
Memory for performed and observed activities following traumatic brain injury
Wright, Matthew J.; Wong, Andrew L.; Obermeit, Lisa C.; Woo, Ellen; Schmitter-Edgecombe, Maureen; Fuster, Joaquín M.
2014-01-01
Traumatic brain injury (TBI) is associated with deficits in memory for the content of completed activities. However, TBI groups have shown variable memory for the temporal order of activities. We sought to clarify the conditions under which temporal order memory for activities is intact following TBI. Additionally, we evaluated activity source memory and the relationship between activity memory and functional outcome in TBI participants. Thus, we completed a study of activity memory with 18 severe TBI survivors and 18 healthy age- and education-matched comparison participants. Both groups performed eight activities and observed eight activities that were fashioned after routine daily tasks. Incidental encoding conditions for activities were utilized. The activities were drawn from two counterbalanced lists, and both performance and observation were randomly determined and interspersed. After all of the activities were completed, content memory (recall and recognition), source memory (conditional source identification), and temporal order memory (correlation between order reconstruction and actual order) for the activities were assessed. Functional ability was assessed via the Community Integration Questionnaire (CIQ). In terms of content memory, TBI participants recalled and recognized fewer activities than comparison participants. Recognition of performed and observed activities was strongly associated with social integration on the CIQ. There were no between- or within-group differences in temporal order or source memory, although source memory performances were near ceiling. The findings were interpreted as suggesting that temporal order memory following TBI is intact under conditions of both purposeful activity completion and incidental encoding, and that activity memory is related to functional outcomes following TBI. PMID:24524393
The use of climate information in vulnerability assessments.
DOT National Transportation Integrated Search
2011-01-01
This memorandum focuses on the use of climate information when performing a vulnerability : assessment, a topic that was discussed at the Newark Pilot Peer Exchange Workshop on May 4-5, : 2011. The memorandum describes several sources of climate info...
Chen, Xichao; Luo, Qian; Wang, Donghong; Gao, Jijun; Wei, Zi; Wang, Zijian; Zhou, Huaidong; Mazumder, Asit
2015-11-01
Owing to growing public awareness of safety and aesthetics in water sources, more attention has been given to the adverse effects of volatile organic compounds (VOCs) on aquatic organisms and human beings. In this study, 77 target VOCs (including 54 common VOCs, 13 carbonyl compounds, and 10 taste-and-odor compounds) were detected in typical drinking water sources from 5 major river basins (the Yangtze, the Huaihe, the Yellow, the Haihe, and the Liaohe River basins) and their occurrences were characterized. Ecological, human health, and olfactory assessments were performed to assess the major hazards in source water. The investigation showed that VOCs posed potential ecological risks (1.30 × 10 ≤ RQtotals ≤ 8.99 × 10) but little human health risk (6.84 × 10^-7 ≤ RQtotals ≤ 4.24 × 10^-4), while odor problems occurred extensively. The priority contaminants in drinking water sources of China were also listed based on the present assessment criteria. Copyright © 2015 Elsevier Ltd. All rights reserved.
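The screening logic behind the reported RQ ranges is a simple hazard quotient: a measured environmental concentration divided by a predicted-no-effect benchmark, summed across compounds. The compound names, concentrations, and benchmark values below are all assumed for illustration:

```python
# Assumed illustrative benchmarks (PNEC) and measured concentrations, ug/L
pnec_ug_per_l = {"toluene": 74.0, "chloroform": 146.0}
measured_ug_per_l = {"toluene": 3.7, "chloroform": 14.6}

# Per-compound risk quotient and the summed RQtotal for the sample
rq = {c: measured_ug_per_l[c] / pnec_ug_per_l[c] for c in measured_ug_per_l}
rq_total = sum(rq.values())

# RQ >= 1 flags potential risk; these screening values stay well below 1
print(round(rq["toluene"], 3), round(rq["chloroform"], 3), round(rq_total, 3))
```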
Fostering Historical Thinking with Digitized Primary Sources
ERIC Educational Resources Information Center
Tally, Bill; Goldenberg, Lauren B.
2005-01-01
This pilot study examined middle school and high school student performance on an online historical thinking assessment task. After their teachers received training in the use of digital historical archives, students from all groups engaged in historical thinking behaviors (e.g., observation, sourcing, inferencing, evidence, question-posing, and…
Livingood, William C; Morris, Michael; Sorensen, Bonita; Chapman, Karen; Rivera, Lillian; Beitsch, Les; Street, Phil; Coughlin, Susan; Smotherman, Carmen; Wood, David
2013-01-01
The Florida Public Health Practice-Based Research Network conducted a study of Florida county health departments (CHDs) to assess relationships between self-assessed performance on essential services (ESs) and sources of funding. Primary data were collected using an online survey based on Public Health Accreditation Board standards for ESs. Bivariate and multivariate analyses were conducted to assess the relationship of sources and amounts of revenue, obtained from the Florida Department of Health financial system, to responses to the survey of CHD capacity for ESs. Self-assessed CHD performance for each ES varied extensively among the CHDs and across the 10 ESs, ranging from a high of 98% of CHDs completely or almost completely meeting the standards for ES 2 (Investigating Problems and Hazards) to a low of 32% completely or almost completely meeting the standards for ES 10 (Research/Evidence). Medicaid revenue and fees were positively correlated with some ESs. Per capita revenue support varied extensively among the CHDs. Revenue for ESs is decreasing and is heavily reliant on noncategorical (discretionary) revenue. This study has important implications for continued reliance on ESs as an organizing construct for public health.
Wang, Qi; Xie, Zhiyi; Li, Fangbai
2015-11-01
This study aims to identify and apportion multi-source and multi-phase heavy metal pollution from natural and anthropogenic inputs using ensemble models that include stochastic gradient boosting (SGB) and random forest (RF) in agricultural soils on the local scale. The heavy metal pollution sources were quantitatively assessed, and the results illustrated the suitability of the ensemble models for the assessment of multi-source and multi-phase heavy metal pollution in agricultural soils on the local scale. The results of SGB and RF consistently demonstrated that anthropogenic sources contributed the most to the concentrations of Pb and Cd in agricultural soils in the study region and that SGB performed better than RF. Copyright © 2015 Elsevier Ltd. All rights reserved.
Quality of Care for PTSD and Depression in the Military Health System
evaluate the receipt of recommended assessments and treatments. These measures draw on multiple data sources including administrative encounter data... services are effective in reducing symptoms. When comparing performance between 2012-2013 and 2013-2014, most measures demonstrated slight improvement... in 2013-2014 for over 38,000 active-component service members with PTSD or depression. The assessment includes performance on 30 quality measures to
Information and problem report usage in system safety engineering division
NASA Technical Reports Server (NTRS)
Morrissey, Stephen J.
1990-01-01
Five basic problems or question areas are examined: (1) evaluate the adequacy of the current problem/performance database; (2) evaluate methods of performing trend analysis; (3) identify methods and sources of data for probabilistic risk assessment; and (4) determine how risk assessment documentation is upgraded and/or updated. The fifth task was to provide recommendations for each of the four areas above.
Aerobic Exercise During Encoding Impairs Hippocampus-Dependent Memory.
Soga, Keishi; Kamijo, Keita; Masaki, Hiroaki
2017-08-01
We investigated how aerobic exercise during encoding affects hippocampus-dependent memory through a source memory task that assessed hippocampus-independent familiarity and hippocampus-dependent recollection processes. Using a within-participants design, young adult participants performed a memory-encoding task while performing a cycling exercise or being seated. The subsequent retrieval phase was conducted while sitting on a chair. We assessed behavioral and event-related brain potential measures of familiarity and recollection processes during the retrieval phase. Results indicated that source accuracy was lower for encoding with exercise than for encoding in the resting condition. Event-related brain potential measures indicated that the parietal old/new effect, which has been linked to recollection processing, was observed in the exercise condition, whereas it was absent in the rest condition, which is indicative of exercise-induced hippocampal activation. These findings suggest that aerobic exercise during encoding impairs hippocampus-dependent memory, which may be attributed to inefficient source encoding during aerobic exercise.
Online Biomonitoring and Early Warning Systems for Protection of Water Sources
The ability to perform real time biomonitoring of behavioral responses and stress levels experienced by fish is important as it could be used for assessing source water toxicity as a first line of defense to protect and encourage recreational use of waterbodies. This paper propos...
MEASUREMENT AND USE OF CONTAMINANT FLUX AS AN ASSESSMENT TOOL FOR DNAPL REMEDIAL PERFORMANCE
Current remedial techniques are unable to completely eliminate all dense nonaqueous phase liquid (DNAPL) from source zone areas at most sites, and conflicting views on the benefits of partial DNAPL source zone remediation exist in the literature. A comparison of contaminant flux...
NASA Astrophysics Data System (ADS)
Shepson, P. B.; Lavoie, T. N.; Kerlo, A. E.; Stirm, B. H.
2016-12-01
Understanding the contribution of anthropogenic activities to atmospheric greenhouse gas concentrations requires an accurate characterization of emission sources. Previously, we reported the use of a novel aircraft-based mass balance measurement technique to quantify greenhouse gas emission rates from point and area sources; however, the accuracy of this approach has not been evaluated to date. Here, an assessment of method accuracy and precision was performed by conducting a series of six aircraft-based mass balance experiments at a power plant in southern Indiana and comparing the calculated CO2 emission rates to the hourly emission measurements reported by continuous emissions monitoring systems (CEMS) installed directly in the exhaust stacks at the facility. For all flights, CO2 emissions were quantified before CEMS data were released online to ensure unbiased analysis. Additionally, we assess the uncertainties introduced into the final emission rate by our analysis method, which employs a statistical kriging model to interpolate and extrapolate the CO2 fluxes across the flight transects from the ground to the top of the boundary layer. Subsequently, using the results from these flights combined with the known emissions reported by the CEMS, we perform an inter-model comparison of alternative kriging methods to evaluate the performance of the kriging approach.
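At its core, the aircraft mass-balance method integrates the background-subtracted concentration enhancement times the normal wind speed over the downwind flight plane. The sketch below uses a small synthetic grid in place of kriged flight data; cell sizes, wind speed, and enhancements are all made-up values:

```python
import numpy as np

# Synthetic downwind-plane grid (3 vertical x 4 cross-wind cells)
dy, dz = 500.0, 100.0   # m, cross-wind and vertical cell sizes (assumed)
u = 5.0                 # m/s, wind speed normal to the plane (assumed)

# CO2 enhancement above background in each cell, kg/m^3 (assumed values
# standing in for the kriged interpolation of flight-transect data)
dC = np.array([[0.0,  2e-6, 3e-6, 0.0],
               [1e-6, 4e-6, 5e-6, 1e-6],
               [0.0,  1e-6, 2e-6, 0.0]])

# Mass balance: E = sum( dC * u * dy * dz ), in kg/s
emission_kg_s = float((dC * u * dy * dz).sum())
print(round(emission_kg_s, 3))
```

In the actual method, the role of kriging is to fill in this grid between and beyond the flight transects before the summation is performed.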
Wennberg, Richard; Cheyne, Douglas
2014-05-01
To assess the reliability of MEG source imaging (MSI) of anterior temporal spikes through detailed analysis of the localization and orientation of source solutions obtained for a large number of spikes that were separately confirmed by intracranial EEG to be focally generated within a single, well-characterized spike focus. MSI was performed on 64 identical right anterior temporal spikes from an anterolateral temporal neocortical spike focus. The effects of different volume conductors (sphere and realistic head model), removal of noise with low frequency filters (LFFs) and averaging multiple spikes were assessed in terms of the reliability of the source solutions. MSI of single spikes resulted in scattered dipole source solutions that showed reasonable reliability for localization at the lobar level, but only for solutions with a goodness-of-fit exceeding 80% using a LFF of 3 Hz. Reliability at a finer level of intralobar localization was limited. Spike averaging significantly improved the reliability of source solutions and averaging 8 or more spikes reduced dependency on goodness-of-fit and data filtering. MSI performed on topographically identical individual spikes from an intracranially defined classical anterior temporal lobe spike focus was limited by low reliability (i.e., scattered source solutions) in terms of fine, sublobar localization within the ipsilateral temporal lobe. Spike averaging significantly improved reliability. MSI performed on individual anterior temporal spikes is limited by low reliability. Reduction of background noise through spike averaging significantly improves the reliability of MSI solutions. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
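The benefit of spike averaging reported above is the usual square-root-of-N noise suppression: averaging N spikes preserves the source signal while shrinking uncorrelated sensor noise by about 1/sqrt(N). A synthetic illustration in arbitrary units (not MEG data):

```python
import numpy as np

rng = np.random.default_rng(42)
signal = 1.0        # fixed spike amplitude at one sensor (arbitrary units)
noise_sd = 2.0      # single-trial noise level, so single-spike SNR = 0.5

# 64 simulated spike trials, each observed over 1000 noisy samples
trials = signal + rng.normal(0.0, noise_sd, size=(64, 1000))

# SNR from one spike vs. from the average of 8 spikes
snr_single = signal / trials[0].std(ddof=1)
snr_avg8 = signal / trials[:8].mean(axis=0).std(ddof=1)

# Averaging 8 spikes should improve SNR by roughly sqrt(8) ~ 2.8x
print(round(snr_avg8 / snr_single, 2))
```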
Miniaturized pulsed laser source for time-domain diffuse optics routes to wearable devices.
Di Sieno, Laura; Nissinen, Jan; Hallman, Lauri; Martinenghi, Edoardo; Contini, Davide; Pifferi, Antonio; Kostamovaara, Juha; Mora, Alberto Dalla
2017-08-01
We validate a miniaturized pulsed laser source for use in time-domain (TD) diffuse optics, following rigorous, shared protocols for performance assessment of this class of devices. This compact source (12 × 6 mm²) was previously developed for range-finding applications and provides short, high-energy (∼100 ps, ∼0.5 nJ) optical pulses at repetition rates up to 1 MHz. Here, we start with a basic laser characterization and an analysis of the suitability of this laser for diffuse optics applications. We then present a TD optical system using this source and its performance both in recovering the optical properties of tissue-mimicking homogeneous phantoms and in detecting localized absorption perturbations. Finally, as a proof of concept of in vivo application, we demonstrate that the system can detect hemodynamic changes occurring in the arm of healthy volunteers during a venous occlusion. Squeezing the laser source into a small footprint removes a key technological bottleneck that has so far hampered the realization of a miniaturized TD diffuse optics system able to compete with established continuous-wave devices in terms of size and cost, but with the wider performance potential demonstrated by research over the last two decades. © 2017 Society of Photo-Optical Instrumentation Engineers (SPIE).
NASA Technical Reports Server (NTRS)
Kerslake, Thomas W.; Scheiman, David A.
2005-01-01
This paper documents testing and analyses to quantify International Space Station (ISS) Solar Array Wing (SAW) string electrical performance under highly off-nominal, low-intensity, low-temperature (LILT) operating conditions with nonsolar light sources. This work is relevant for assessing the feasibility and risks associated with a Sequential Shunt Unit (SSU) remove and replace (R&R) Extravehicular Activity (EVA). During eclipse, SAW strings can be energized by moonlight, EVA suit helmet lights, or video camera lights. To quantify SAW performance under these off-nominal conditions, solar cell performance testing was performed using full moon, solar simulator, and Video Camera Luminaire (VCL) light sources. Test conditions included 25 to 110 C temperatures and 1- to 0.0001-Sun illumination intensities. Electrical performance data and calculated eclipse lighting intensities were combined to predict SAW current-voltage output for comparison with electrical hazard thresholds. Worst case predictions show there is no connector pin molten metal hazard, but crew shock hazard limits are exceeded due to VCL illumination. Assessment uncertainties and limitations are discussed along with operational solutions to mitigate SAW electrical hazards from VCL illumination. Results from a preliminary assessment of SAW arcing are also discussed. The authors recommend further analyses once SSU, R&R, and EVA procedures are better defined.
Mission applications for advanced photovoltaic solar arrays
NASA Technical Reports Server (NTRS)
Stella, Paul M.; West, John L.; Chave, Robert G.; Mcgee, David P.; Yen, Albert S.
1990-01-01
The suitability of the Advanced Photovoltaic Solar Array (APSA) for future space missions was examined by considering the impact on the spacecraft system in general. The lightweight flexible blanket array system was compared to rigid arrays and a radio-isotope thermoelectric generator (RTG) static power source for a wide range of assumed future earth orbiting and interplanetary mission applications. The study approach was to establish assessment criteria and a rating scheme, identify a reference mission set, perform the power system assessment for each mission, and develop conclusions and recommendations to guide future APSA technology development. The authors discuss the three selected power sources, the assessment criteria and rating definitions, and the reference missions. They present the assessment results in a convenient tabular format. It is concluded that the three power sources examined, APSA, conventional solar arrays, and RTGs, can be considered to complement each other. Each power technology has its own range of preferred applications.
The Chandra Source Catalog: Source Variability
NASA Astrophysics Data System (ADS)
Nowak, Michael; Rots, A. H.; McCollough, M. L.; Primini, F. A.; Glotfelty, K. J.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hain, R.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.
2009-01-01
The Chandra Source Catalog (CSC) contains fields of view that have been studied with individual, uninterrupted observations that span integration times ranging from 1 ksec to 160 ksec, and a large number of which have received (multiple) repeat observations days to years later. The CSC thus offers an unprecedented look at the variability of the X-ray sky over a broad range of time scales, and across a wide diversity of variable X-ray sources: stars in the local galactic neighborhood, galactic and extragalactic X-ray binaries, Active Galactic Nuclei, etc. Here we describe the methods used to identify and quantify source variability within a single observation, and the methods used to assess the variability of a source when detected in multiple, individual observations. Three tests are used to detect source variability within a single observation: the Kolmogorov-Smirnov test and its variant, the Kuiper test, and a Bayesian approach originally suggested by Gregory and Loredo. The latter test not only provides an indicator of variability, but is also used to create a best estimate of the variable lightcurve shape. We assess the performance of these tests via simulation of statistically stationary, variable processes with arbitrary input power spectral densities (here we concentrate on results of red noise simulations) at a variety of mean count rates and fractional root mean square variabilities relevant to CSC sources. We also assess the false positive rate via simulations of constant sources whose sole source of fluctuation is Poisson noise. We compare these simulations to a preliminary assessment of the variability found in real CSC sources, and estimate the variability sensitivities of the CSC.
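The single-observation Kolmogorov-Smirnov test named above can be sketched with SciPy: under the constant-source null hypothesis, photon arrival times are uniform over the observation window. The event times below are synthetic, and this is only an illustration of that null model, not the CSC pipeline itself.

```python
import numpy as np
from scipy.stats import kstest

def ks_variability(times, t_start, t_stop):
    """Kolmogorov-Smirnov test of photon arrival times against the
    constant-rate (uniform) model; a small p-value flags variability."""
    u = (np.asarray(times) - t_start) / (t_stop - t_start)
    return kstest(u, "uniform")

rng = np.random.default_rng(0)
steady = rng.uniform(0.0, 1000.0, 400)                       # constant source
flaring = np.concatenate([rng.uniform(0.0, 1000.0, 400),
                          rng.uniform(450.0, 500.0, 200)])   # strong mid-flare
res_steady = ks_variability(steady, 0.0, 1000.0)
res_flare = ks_variability(flaring, 0.0, 1000.0)
```

The Kuiper variant differs only in combining the positive and negative ECDF deviations, which makes it sensitive near the window edges as well.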
The Chandra Source Catalog: Source Variability
NASA Astrophysics Data System (ADS)
Nowak, Michael; Rots, A. H.; McCollough, M. L.; Primini, F. A.; Glotfelty, K. J.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Evans, I.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hain, R.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.
2009-09-01
The Chandra Source Catalog (CSC) contains fields of view that have been studied with individual, uninterrupted observations that span integration times ranging from 1 ksec to 160 ksec, and a large number of which have received (multiple) repeat observations days to years later. The CSC thus offers an unprecedented look at the variability of the X-ray sky over a broad range of time scales, and across a wide diversity of variable X-ray sources: stars in the local galactic neighborhood, galactic and extragalactic X-ray binaries, Active Galactic Nuclei, etc. Here we describe the methods used to identify and quantify source variability within a single observation, and the methods used to assess the variability of a source when detected in multiple, individual observations. Three tests are used to detect source variability within a single observation: the Kolmogorov-Smirnov test and its variant, the Kuiper test, and a Bayesian approach originally suggested by Gregory and Loredo. The latter test not only provides an indicator of variability, but is also used to create a best estimate of the variable lightcurve shape. We assess the performance of these tests via simulation of statistically stationary, variable processes with arbitrary input power spectral densities (here we concentrate on results of red noise simulations) at a variety of mean count rates and fractional root mean square variabilities relevant to CSC sources. We also assess the false positive rate via simulations of constant sources whose sole source of fluctuation is Poisson noise. We compare these simulations to an assessment of the variability found in real CSC sources, and estimate the variability sensitivities of the CSC.
Miller, Thomas Martin; de Wet, Wouter C.; Patton, Bruce W.
2015-10-28
In this study, a computational assessment of the variation in terrestrial neutron and photon background from extraterrestrial sources is presented. The motivation of this assessment is to evaluate the practicality of developing a tool or database to estimate background in real time (or near-real time) during an experimental measurement, or even to predict the background for future measurements. The extraterrestrial source considered in this assessment is naturally occurring galactic cosmic rays (GCRs). The MCNP6 transport code was used to perform the computational assessment. However, the GCR source available in MCNP6 was not used; rather, models developed and maintained by NASA were used to generate the GCR sources. The largest variation in both neutron and photon background spectra was found to be caused by changes in elevation on Earth's surface, which can be as large as an order of magnitude. All other perturbations produced background variations on the order of a factor of 3 or less. The most interesting finding was that ~80% of terrestrial background neutrons and ~50% of terrestrial background photons are generated by interactions, in Earth's surface and in nearby naturally occurring and man-made objects, of particles arriving from extraterrestrial sources and of their progeny created in Earth's atmosphere. In conclusion, this assessment shows that it will be difficult to estimate the terrestrial background from extraterrestrial sources without a good understanding of a detector's surroundings. Estimating or predicting background in a measurement scenario such as a mobile random search will therefore be difficult.
Consistency of the Performance and Nonperformance Methods in Gifted Identification
ERIC Educational Resources Information Center
Acar, Selcuk; Sen, Sedat; Cayirdag, Nur
2016-01-01
Current approaches to gifted identification suggest collecting multiple sources of evidence. Some gifted identification guidelines allow for the interchangeable use of "performance" and "nonperformance" identification methods. This multiple criteria approach lacks a strong overlap between the assessment tools; however,…
ERIC Educational Resources Information Center
Aleta, Beda T.
2016-01-01
This research study aims to determine the factors of engineering skills self- efficacy sources contributing on the academic performance of AMAIUB engineering students. Thus, a better measure of engineering self-efficacy is needed to adequately assess engineering students' beliefs in their capabilities to perform tasks in their engineering…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diaz, Aaron A.; Baldwin, David L.; Cinson, Anthony D.
2014-08-06
This Technical Letter Report satisfies the M3AR-14PN2301022 milestone and focuses on identifying and quantifying the mechanistic sources of sensor performance variation between two individual 22-element linear phased-array sensor prototypes, SN1 and SN2. This effort constitutes an iterative evolution that supports the longer-term goal of producing and demonstrating a pre-manufacturing prototype ultrasonic probe that possesses the fundamental performance characteristics necessary to enable the development of a high-temperature sodium-cooled fast reactor inspection system. The scope of the work for this portion of the PNNL effort conducted in FY14 includes a comparative evaluation and assessment of the performance characteristics of the SN1 and SN2 22-element PA-UT probes manufactured at PNNL. Key transducer performance parameters, such as sound field dimensions, resolution capabilities, frequency response, and bandwidth, are used as metrics for the comparative evaluation and assessment of the SN1 and SN2 engineering test units.
Assessing Procedural Competence: Validity Considerations.
Pugh, Debra M; Wood, Timothy J; Boulet, John R
2015-10-01
Simulation-based medical education (SBME) offers opportunities for trainees to learn how to perform procedures and to be assessed in a safe environment. However, SBME research studies often lack robust evidence to support the validity of the interpretation of the results obtained from tools used to assess trainees' skills. The purpose of this paper is to describe how a validity framework can be applied when reporting and interpreting the results of a simulation-based assessment of skills related to performing procedures. The authors discuss various sources of validity evidence as they relate to SBME. A case study is presented.
Braat-Eggen, P Ella; van Heijst, Anne; Hornikx, Maarten; Kohlrausch, Armin
2017-09-01
The aim of this study is to gain more insight into the assessment of noise in open-plan study environments and to reveal correlations between the noise disturbance experienced by students and the noise sources they perceive, the tasks they perform, and the acoustic parameters of the open-plan study environment they work in. Data were collected in five open-plan study environments at universities in the Netherlands. A questionnaire was used to investigate student tasks, perceived sound sources and their perceived disturbance, and sound measurements were performed to determine the room acoustic parameters. This study shows that 38% of the surveyed students are disturbed by background noise in an open-plan study environment. Students are mostly disturbed by speech when performing complex cognitive tasks like studying for an exam, reading and writing. Significant but weak correlations were found between the room acoustic parameters and noise disturbance of students. Practitioner Summary: A field study was conducted to gain more insight into the assessment of noise in open-plan study environments at universities in the Netherlands. More than one third of the students were disturbed by noise. An interaction effect was found for task type, source type and room acoustic parameters.
ERIC Educational Resources Information Center
Benner, Susan M.; Hatch, J. Amos
2004-01-01
A team of early childhood teacher education faculty developed the 3-D talent development model of teacher education, blending theory and research from many sources. These sources include research on talent development, nonuniversal development, and roles of teachers and their professional growth. The faculty integrated constructs from these…
NASA Astrophysics Data System (ADS)
Kwon, Hyeokjun; Kang, Yoojin; Jang, Junwoo
2017-09-01
Color fidelity has been used as one index to evaluate the performance of light sources. Since the Color Rendering Index (CRI) was proposed by the CIE, many color fidelity metrics have been proposed to increase the accuracy of the metric. This paper focuses on comparing color fidelity metrics in terms of their agreement with human visual assessments. To visually evaluate the color fidelity of light sources, we built a simulator that reproduces the color samples under given lighting conditions. In this paper, eighteen color samples of the Macbeth color checker under the test light sources, and under a reference illuminant for each of them, are simulated and displayed on a well-characterized monitor. With only the spectra of the test light source and reference illuminant, color samples under any lighting condition can be reproduced. The spectra of two LED and two OLED light sources that have similar CRI values are used for the visual assessment. In addition, the results of the visual assessment are compared with two color fidelity metrics: CRI and IES TM-30-15 (Rf), proposed by the Illuminating Engineering Society (IES) in 2015. Experimental results indicate that Rf outperforms CRI in terms of correlation with the visual assessment.
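At the core of any color fidelity metric is a color-difference computation between a sample rendered under the test source and the same sample under the reference illuminant. A minimal sketch using the simple CIE76 distance (CRI and Rf use more elaborate color spaces, chromatic adaptation, and averaging over the sample set; the Lab values below are hypothetical):

```python
import math

def delta_e76(lab_ref, lab_test):
    """CIE76 colour difference: Euclidean distance in CIELAB space."""
    return math.dist(lab_ref, lab_test)

# Hypothetical CIELAB coordinates of one Macbeth patch rendered under the
# reference illuminant vs. under the test source (illustrative numbers only).
ref = (52.0, 18.5, -12.0)
test = (50.5, 21.0, -9.5)
de = delta_e76(ref, test)
```

A fidelity score is then typically a decreasing function of the mean ΔE over all samples, so smaller differences yield a higher score.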
Cheng, Xianfeng; Danek, Tomas; Drozdova, Jarmila; Huang, Qianrui; Qi, Wufu; Zou, Liling; Yang, Shuran; Zhao, Xinliang; Xiang, Yungang
2018-03-07
The environmental assessment and identification of sources of heavy metals in Zn-Pb ore deposits are important steps for the effective prevention of subsequent contamination and for the development of corrective measures. The concentrations of eight heavy metals (As, Cd, Cr, Cu, Hg, Ni, Pb, and Zn) in soils from 40 sampling points around the Jinding Zn-Pb mine in Yunnan, China, were analyzed. An environmental quality assessment of the obtained data was performed using five different contamination and pollution indexes. Statistical analyses were performed to identify the relations among the heavy metals and the pH in soils and possible sources of pollution. The concentrations of As, Cd, Pb, and Zn were extremely high, and 23, 95, 25, and 35% of the samples, respectively, exceeded the heavy metal limits set in the Chinese Environmental Quality Standard for Soils (GB15618-1995, grade III). According to the contamination and pollution indexes, environmental risks in the area are high or extremely high. The highest risk is represented by Cd contamination, the median concentration of which exceeds the GB15618-1995 limit. Based on the combination of statistical analyses and geostatistical mapping, we identified three groups of heavy metals that originate from different sources. The main sources of As, Cd, Pb, Zn, and Cu are mining activities, airborne particulates from smelters, and the weathering of tailings. The main sources of Hg are dust fallout and gaseous emissions from smelters and tailing dams. Cr and Ni originate from lithogenic sources.
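Contamination indexes of the kind applied above are, at their simplest, ratios of measured concentration to a regulatory limit, combined across metals. A hedged sketch with illustrative limits and sample values, not the official GB15618-1995 numbers:

```python
import math

# Screening limits (mg/kg) in the spirit of GB15618-1995 grade III --
# illustrative values only; consult the standard for authoritative limits.
limits = {"Cd": 1.0, "Pb": 500.0, "Zn": 500.0, "As": 40.0}
sample = {"Cd": 5.2, "Pb": 620.0, "Zn": 900.0, "As": 35.0}

def single_indexes(conc, lim):
    """Single pollution index P_i = C_i / S_i for each metal."""
    return {m: conc[m] / lim[m] for m in conc}

def nemerow(p):
    """Nemerow comprehensive index: combines the mean and the worst P_i,
    so one badly contaminated metal dominates the overall score."""
    vals = list(p.values())
    mean_p, max_p = sum(vals) / len(vals), max(vals)
    return math.sqrt((mean_p**2 + max_p**2) / 2.0)

p = single_indexes(sample, limits)
pn = nemerow(p)
```

With these illustrative numbers, Cd alone (P = 5.2) pushes the comprehensive index well above 1, mirroring the paper's finding that Cd drives the overall risk.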
NASA Astrophysics Data System (ADS)
Jaars, Kerneels; Vestenius, Mika; van Zyl, Pieter G.; Beukes, Johan P.; Hellén, Heidi; Vakkari, Ville; Venter, Marcell; Josipovic, Miroslav; Hakola, Hannele
2018-01-01
Volatile organic compounds (VOCs) can have significant impacts on climate and human health. Certain VOCs are proven to be carcinogenic and toxic, which can affect human health directly and indirectly. In order to develop climate change reduction strategies and to assess the impacts of VOCs on human health, it is crucial to determine the sources of VOCs, which can be emitted from biogenic and anthropogenic sources. The aim of this study was to perform source apportionment using positive matrix factorisation (PMF) analysis on VOC data collected at a regional background location affected by the major sources in the interior of South Africa, which include the western- and eastern Bushveld Igneous Complex, the Johannesburg-Pretoria metropolitan conurbation, the Vaal Triangle, the Mpumalanga Highveld and also a region of anti-cyclonic recirculation of air mass over the interior of South Africa. In addition, a risk assessment study was also performed in view of the major source regions affecting Welgegund in order to quantify the impacts of anthropogenic VOCs measured at Welgegund on human health. Measurements were conducted at the Welgegund measurement station located on a commercial farm approximately 100 km west of Johannesburg for a period of more than two years. PMF analysis revealed ten meaningful factor solutions, of which five factors were associated with biogenic emissions and five with anthropogenic sources. Three of the biogenic factors were characterised by a specific biogenic species, i.e. isoprene, limonene and 2-methyl-3-buten-2-ol (MBO), while the other two biogenic factors comprised mixtures of biogenic species with different tracer species. The temporal factor contribution for the isoprene, limonene and MBO factors correlated relatively well with the seasonal wet pattern. 
One anthropogenic factor was associated with emissions from a densely populated anthropogenic source region to the east of Welgegund with a large number of industrial activities, while another anthropogenic factor could be related to coal combustion. An anthropogenic factor was also identified that reflected the influence of solvents on atmospheric VOC concentrations, while two anthropogenic factors were determined that indicated the influence of farming activities in close proximity to Welgegund. A lifetime cancer risk- (LCR) and non-cancer hazard ratio (HR) assessment study conducted for VOCs measured at Welgegund in relation to three source regions indicated that the non-cancerous influence of VOCs measured in the source regions is significantly lower compared to the cancerous influence of these species on human health, which raises concern. However, LCR values were within an acceptable range. Factor analysis performed in this paper also identified sources that could be targeted to minimise VOC-related LCRs and HRs e.g. benzene-related cancers can be reduced by targeting incomplete combustion sources and coal combustion.
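The PMF decomposition used above factors a samples-by-species concentration matrix into non-negative source contributions and source profiles. As a rough, unweighted stand-in, plain non-negative matrix factorization illustrates the idea (true PMF additionally weights each entry by its measurement uncertainty); the data here are synthetic:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
# Synthetic dataset: 200 samples x 8 VOC species mixed from 3 hidden sources.
contrib = rng.random((200, 3))             # true source contributions
profiles = rng.random((3, 8))              # true source profiles
X = contrib @ profiles + 0.01 * rng.random((200, 8))

# Unweighted NMF as a simplified stand-in for PMF.
model = NMF(n_components=3, init="nndsvda", max_iter=1000, random_state=0)
W = model.fit_transform(X)                 # estimated contributions per sample
H = model.components_                      # estimated profiles per source
rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
```

In practice, factors are interpreted by inspecting H for tracer species (e.g. isoprene marking a biogenic factor) and W for temporal patterns, as the study does.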
Assessing TCE source bioremediation by geostatistical analysis of a flux fence.
Cai, Zuansi; Wilson, Ryan D; Lerner, David N
2012-01-01
Mass discharge across transect planes is increasingly used as a metric for performance assessment of in situ groundwater remediation systems. Mass discharge estimates using concentrations measured in multilevel transects are often made by assuming a uniform flow field, and uncertainty contributions from spatial concentration and flow field variability are often overlooked. We extend our recently developed geostatistical approach to estimate mass discharge using transect data of concentration and hydraulic conductivity, thereby accounting for the spatial variability of both datasets. The magnitude and uncertainty of mass discharge were quantified by conditional simulation. An important benefit of the approach is that uncertainty is quantified as an integral part of the mass discharge estimate. We use this approach for performance assessment of a bioremediation experiment on a trichloroethene (TCE) source zone. Analyses of dissolved parent and daughter compounds demonstrated that the engineered bioremediation has elevated the degradation rate of TCE, resulting in a two-thirds reduction in the TCE mass discharge from the source zone. The biologically enhanced dissolution of TCE was not significant (~5%), and was less than expected. However, the discharges of the daughter products cis-1,2-dichloroethene (cDCE) and vinyl chloride (VC) increased, probably because of the rapid transformation of TCE between the source zone and the measurement transect. This suggests that enhancing the biodegradation of cDCE and VC will be crucial to successful engineered bioremediation of TCE source zones. © 2012, The Author(s). Ground Water © 2012, National Ground Water Association.
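The core calculation behind a flux-fence estimate is a sum of concentration times Darcy flux times sampler area over the transect, with uncertainty propagated by simulation. A minimal sketch, with invented transect values and a simple lognormal K model standing in for the paper's full geostatistical conditional simulation:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical multilevel transect: six samplers, each representing 1 m^2.
conc = np.array([0.1, 2.5, 8.0, 3.2, 0.4, 0.05])  # TCE, g/m^3 (= mg/L)
ln_k_mu, ln_k_sd = np.log(1e-4), 0.5              # hydraulic conductivity, m/s
gradient = 0.005                                   # hydraulic gradient (-)
area = 1.0                                         # m^2 per sampler

def mass_discharge(n_sims=5000):
    """Monte Carlo mass discharge (g/day); the spread reflects uncertainty
    in K (a simplified stand-in for full conditional simulation)."""
    k = rng.lognormal(ln_k_mu, ln_k_sd, size=(n_sims, conc.size))
    q = k * gradient                               # Darcy flux, m/s
    md = (conc * q * area).sum(axis=1) * 86400.0   # g/s -> g/day
    return md.mean(), np.percentile(md, [5, 95])

mean_md, (lo, hi) = mass_discharge()
```

The 5th/95th percentile interval is the kind of built-in uncertainty statement the abstract highlights as a benefit of the geostatistical approach.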
[Life cycle assessment of the infrastructure for hydrogen sources of fuel cell vehicles].
Feng, Wen; Wang, Shujuan; Ni, Weidou; Chen, Changhe
2003-05-01
In order to promote the application of life cycle assessment and provide references for China in planning the infrastructure for hydrogen sources for fuel cell vehicles in the near future, 10 feasible plans for hydrogen-source infrastructure for fuel cell vehicles were designed according to current technologies for producing, storing and transporting hydrogen. Life cycle assessment was then used as a tool to evaluate the environmental performance of the 10 plans. Standardized indexes of the classified environmental impacts of every plan were obtained, and sensitivity analyses for several parameters were carried out. The results showed that the best plan was to produce hydrogen by natural gas steam reforming in a central factory, transport it to refuelling stations through pipelines, and finally dispense it to fuel cell vehicles as hydrogen gas.
Fang, Ji-Tseng; Ko, Yu-Shien; Chien, Chu-Chun; Yu, Kuang-Hui
2013-01-01
Since 1994, Taiwanese medical universities have employed a multiple application method comprising "recommendations and screening" and "admission application." The purpose of this study is to examine whether medical students admitted through different admission programs performed differently. To evaluate the six core competencies for medical students proposed by the Accreditation Council for Graduate Medical Education (ACGME), this study employed various assessment tools, including student opinion feedback, multi-source feedback (MSF), course grades, and examination results. The MSF comprised self-assessment, peer assessment, nursing staff assessment, visiting staff assessment, and chief resident assessment scales. In the subscales, Cronbach's alpha values were higher than 0.90, indicating good reliability. Research participants consisted of 182 students from the School of Medicine at Chang Gung University. Regarding students' average grade for the medical ethics course, the performance of students who were enrolled through school recommendations exceeded that of students who were enrolled through the National College University Entrance Examination (NCUEE) (p = 0.011), and all considered "teamwork" the most important competency. Students from different entry pipelines showed no significant difference on the "communication," "work attitude," "medical knowledge," and "teamwork" assessment scales. The improvement rate of the students enrolled through school recommendations was better than that of the students enrolled through the NCUEE in the "professional skills," "medical core competencies," "communication," and "teamwork" projects of the self-assessment and peer assessment scales. However, the students enrolled through the NCUEE were better in the "professional skills," "medical core competencies," "communication," and "teamwork" projects of the visiting staff assessment scale and the chief resident assessment scale. 
Collectively, the performance of the students enrolled through recommendations was slightly better than that of the students enrolled through the NCUEE, although statistical significance was reached for only some of the grade components.
Alaska national hydrography dataset positional accuracy assessment study
Arundel, Samantha; Yamamoto, Kristina H.; Constance, Eric; Mantey, Kim; Vinyard-Houx, Jeremy
2013-01-01
Initial visual assessments showed a wide range in the quality of fit between features in the NHD and these new image sources. No statistical analysis has yet been performed to actually quantify accuracy. Determining absolute accuracy is cost prohibitive (independent, well-defined test points must be collected), but quantitative analysis of relative positional error is feasible.
Quantitative criteria for assessment of gamma-ray imager performance
NASA Astrophysics Data System (ADS)
Gottesman, Steve; Keller, Kristi; Malik, Hans
2015-08-01
In recent years, gamma-ray imagers such as the GammaCam™ and Polaris have demonstrated good imaging performance in the field. Imager performance is often summarized as "resolution," either angular or spatial at some distance from the imager; however, the definition of resolution is not always related to the ability to image an object. It is difficult to quantitatively compare imagers without a common definition of image quality. This paper examines three categories of definition: point source, line source, and area source. It discusses the details of those definitions and which ones are more relevant for different situations. Metrics such as full width at half maximum (FWHM), variations on the Rayleigh criterion, and metrics analogous to the National Imagery Interpretability Rating Scale (NIIRS) are discussed. Performance against these metrics is evaluated for a high-resolution coded-aperture imager modeled using Monte Carlo N-Particle (MCNP), and for a medium-resolution imager measured in the lab.
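The FWHM metric mentioned above can be computed directly from a sampled point-source profile. A minimal sketch (the Gaussian profile below is synthetic, standing in for a measured point-spread function):

```python
import numpy as np

def fwhm(x, y):
    """Full width at half maximum of a sampled point-source profile,
    with linear interpolation at the two half-maximum crossings."""
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    i, j = above[0], above[-1]

    def crossing(k0, k1):
        # Linear interpolation between the samples bracketing the crossing.
        return x[k0] + (half - y[k0]) * (x[k1] - x[k0]) / (y[k1] - y[k0])

    return crossing(j, j + 1) - crossing(i - 1, i)

# Example: Gaussian point-spread profile with sigma = 2 degrees.
x = np.linspace(-10.0, 10.0, 201)
y = np.exp(-x**2 / (2.0 * 2.0**2))
width = fwhm(x, y)   # analytic value: 2*sqrt(2*ln 2)*sigma ~ 4.71 degrees
```

As the paper notes, FWHM alone does not capture imageability of extended objects, which is why line- and area-source criteria are considered as well.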
R. D. Bergman; D. L. Reed; A. M. Taylor; D. P. Harper; D. G. Hodges
2015-01-01
Developing renewable energy sources with low environmental impacts is becoming increasingly important as concerns about consuming fossil fuel sources grow. Cultivating, harvesting, drying, and densifying raw biomass feedstocks into pellets for easy handling and transport is one step forward in this endeavor. However, the corresponding environmental performances must be...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morrissey, Elmer; O'Donnell, James; Keane, Marcus
2004-03-29
Minimizing building life cycle energy consumption is becoming of paramount importance. Performance metrics tracking offers a clear and concise manner of relating design intent in a quantitative form. A methodology is discussed for the storage and utilization of these performance metrics through an Industry Foundation Classes (IFC) instantiated Building Information Model (BIM). The paper focuses on the storage of three sets of performance data from three distinct sources. An example of a performance metrics programming hierarchy is displayed for a heat pump and a solar array. Utilizing the sets of performance data, two discrete performance effectiveness ratios may be computed, thus offering an accurate method of quantitatively assessing building performance.
Evaluation of seismic spatial interaction effects through an impact testing program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, B.D.; Driesen, G.E.
The consequences of non-seismically qualified objects falling and striking essential, seismically qualified objects are an analytically difficult problem to assess. Analytical solutions to impact problems are conservative and only available for simple situations. In a nuclear facility, the numerous "sources" and "targets" requiring evaluation often have complex geometric configurations, which makes calculations and computer modeling difficult. Few industry or regulatory rules are available for this specialized assessment. A drop test program was recently conducted to "calibrate" the judgment of seismic qualification engineers who perform interaction evaluations and to further develop seismic interaction criteria. Impact tests on varying combinations of sources and targets were performed by dropping the sources from various heights onto targets that were connected to instruments. This paper summarizes the scope, test configurations, and some results of the drop test program. Force and acceleration time history data and general observations are presented on the ruggedness of various targets when subjected to impacts from different types of sources.
Psychophysical evidence for auditory motion parallax.
Genzel, Daria; Schutte, Michael; Brimijoin, W Owen; MacNeilage, Paul R; Wiegrebe, Lutz
2018-04-17
Distance is important: From an ecological perspective, knowledge about the distance to either prey or predator is vital. However, the distance of an unknown sound source is particularly difficult to assess, especially in anechoic environments. In vision, changes in perspective resulting from observer motion produce a reliable, consistent, and unambiguous impression of depth known as motion parallax. Here we demonstrate with formal psychophysics that humans can exploit auditory motion parallax, i.e., the change in the dynamic binaural cues elicited by self-motion, to assess the relative depths of two sound sources. Our data show that sensitivity to relative depth is best when subjects move actively; performance deteriorates when subjects are moved by a motion platform or when the sound sources themselves move. This is true even though the dynamic binaural cues elicited by these three types of motion are identical. Our data demonstrate a perceptual strategy to segregate intermittent sound sources in depth and highlight the tight interaction between self-motion and binaural processing that allows assessment of the spatial layout of complex acoustic scenes.
Rapid Acute Dose Assessment Using MCNP6
NASA Astrophysics Data System (ADS)
Owens, Andrew Steven
Acute radiation doses due to physical contact with a high-activity radioactive source have proven to be an occupational hazard. Multiple radiation injuries have been reported due to manipulating a radioactive source with bare hands or placing a radioactive source inside a shirt or pants pocket. An effort to reconstruct the radiation dose must be performed to properly assess and medically manage the potential biological effects of such doses. Using the reference computational phantoms defined by the International Commission on Radiological Protection (ICRP) and the Monte Carlo N-Particle transport code (MCNP6), dose rate coefficients are calculated for common acute exposure scenarios involving beta and photon radiation sources. The research investigates doses due to having a radioactive source in either a breast pocket or a pants back pocket. The dose rate coefficients are calculated for discrete energies and can be used to interpolate for any given photon or beta emission energy. The dose rate coefficients allow for quick calculation of whole-body dose, organ dose, and/or skin dose if the source, activity, and time of exposure are known. Doses are calculated with the dose rate coefficients and compared to results from International Atomic Energy Agency (IAEA) reports on accidents that occurred in Gilan, Iran and Yanango, Peru. Skin and organ doses calculated with the dose rate coefficients appear to agree, but there is a large discrepancy when comparing whole-body doses assessed using biodosimetry with those assessed using the dose rate coefficients.
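The workflow the abstract describes — tabulate coefficients at discrete energies, interpolate to the emission energy, then scale by activity and exposure time — can be sketched in a few lines. All energies, coefficient values, units, and function names below are hypothetical illustrations, not values from the MCNP6 study:

```python
from bisect import bisect_left

# Hypothetical table: dose rate coefficients at discrete photon energies.
ENERGIES = [0.1, 0.5, 1.0, 1.5]        # MeV
COEFFS   = [0.02, 0.11, 0.23, 0.35]    # mGy/h per MBq (illustrative only)

def dose_rate_coefficient(energy_mev):
    """Linearly interpolate a coefficient for an arbitrary emission energy."""
    if not ENERGIES[0] <= energy_mev <= ENERGIES[-1]:
        raise ValueError("energy outside tabulated range")
    i = bisect_left(ENERGIES, energy_mev)
    if ENERGIES[i] == energy_mev:      # exact table hit
        return COEFFS[i]
    e0, e1 = ENERGIES[i - 1], ENERGIES[i]
    c0, c1 = COEFFS[i - 1], COEFFS[i]
    return c0 + (c1 - c0) * (energy_mev - e0) / (e1 - e0)

def organ_dose(energy_mev, activity_mbq, hours):
    """Dose = coefficient(E) * activity * exposure time."""
    return dose_rate_coefficient(energy_mev) * activity_mbq * hours
```

The same lookup-then-scale pattern applies whichever organ or skin coefficient set is tabulated.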
Multiple fingerprinting analyses in quality control of Cassiae Semen polysaccharides.
Cheng, Jing; He, Siyu; Wan, Qiang; Jing, Pu
2018-03-01
Quality control issues overshadow the potential health benefits of Cassiae Semen because of analytical limitations. In this study, multiple-fingerprint analysis integrated with several chemometric methods was performed to assess the polysaccharide quality of Cassiae Semen harvested from different locations. FT-IR, HPLC, and GC fingerprints of polysaccharide extracts from the authentic source were established as standard profiles and applied to assess the quality of foreign sources. Analyses of FT-IR fingerprints of polysaccharide extracts using either Pearson correlation analysis or principal component analysis (PCA), or of HPLC fingerprints of partially hydrolyzed polysaccharides with PCA, distinguished the foreign sources from the authentic source. However, HPLC or GC fingerprints of completely hydrolyzed polysaccharides could not identify all foreign sources, and the GC methodology is quite limited in determining the monosaccharide composition. This indicates that FT-IR/HPLC fingerprints of non- or partially hydrolyzed polysaccharides, respectively, accompanied by multiple chemometric methods, might potentially be applied in detecting and differentiating sources of Cassiae Semen. Copyright © 2018 Elsevier B.V. All rights reserved.
Students' Performance on a Symmetry Task
ERIC Educational Resources Information Center
Ho, Siew Yin; Logan, Tracy
2013-01-01
This paper describes Singapore and Australian Grade 6 students' (n=1,187) performance on a symmetry task in a recently developed Mathematics Processing Instrument (MPI). The MPI comprised tasks sourced from Australia and Singapore's national assessments, NAPLAN and PSLE. Only half of the cohort solved the item successfully. It is possible that…
Distributed Leadership and Organizational Change: Implementation of a Teaching Performance Measure
ERIC Educational Resources Information Center
Sloan, Tine
2013-01-01
This article explores leadership practice and change as evidenced in multiple data sources gathered during a self-study implementation of a teaching performance assessment. It offers promising models of distributed leadership and organizational change that can inform future program implementers and the field in general. Our experiences suggest…
Singapore Students' Performance on Australian and Singapore Assessment Items
ERIC Educational Resources Information Center
Ho, Siew Yin; Lowrie, Tom
2012-01-01
This study describes Singapore students' (N = 607) performance on a recently developed Mathematics Processing Instrument (MPI). The MPI comprised tasks sourced from Australia's NAPLAN and Singapore's PSLE. In addition, the MPI had a corresponding question which encouraged students to describe how they solved the respective tasks. In particular,…
Muddying the Waters: A New Area of Concern for Drinking Water Contamination in Cameroon
Healy Profitós, Jessica M.; Mouhaman, Arabi; Lee, Seungjun; Garabed, Rebecca; Moritz, Mark; Piperata, Barbara; Tien, Joe; Bisesi, Michael; Lee, Jiyoung
2014-01-01
In urban Maroua, Cameroon, improved drinking water sources are available to a large majority of the population, yet this water is frequently distributed through informal distribution systems and stored in home containers (canaries), leaving it vulnerable to contamination. We assessed where contamination occurs within the distribution system, determined potential sources of environmental contamination, and investigated potential pathogens. Gastrointestinal health status (785 individuals) was collected via health surveys. Drinking water samples were collected from drinking water sources and canaries. Escherichia coli and total coliform levels were evaluated and molecular detection was performed to measure human-associated faecal marker, HF183; tetracycline-resistance gene, tetQ; Campylobacter spp.; and Staphylococcus aureus. Statistical analyses were performed to evaluate the relationship between microbial contamination and gastrointestinal illness. Canari samples had higher levels of contamination than source samples. HF183 and tetQ were detected in home and source samples. An inverse relationship was found between tetQ and E. coli. Presence of tetQ with lower E. coli levels increased the odds of reported diarrhoeal illness than E. coli levels alone. Further work is warranted to better assess the relationship between antimicrobial-resistant bacteria and other pathogens in micro-ecosystems within canaries and this relationship’s impact on drinking water quality. PMID:25464137
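The reported association — higher odds of diarrhoeal illness when tetQ is present at lower E. coli levels — is the kind of result an odds ratio from a 2×2 exposure/outcome table expresses. A minimal sketch with made-up counts (not the study's data):

```python
def odds_ratio(exposed_ill, exposed_well, unexposed_ill, unexposed_well):
    """Cross-product odds ratio from a 2x2 exposure/outcome table."""
    return (exposed_ill * unexposed_well) / (exposed_well * unexposed_ill)

# Hypothetical counts: households with tetQ detected vs. not detected,
# cross-tabulated against reported diarrhoeal illness.
or_tetq = odds_ratio(20, 30, 10, 40)   # (20*40)/(30*10) ≈ 2.67
```

An odds ratio above 1 indicates increased odds of illness among the exposed group; the study's actual analysis would additionally adjust for covariates such as E. coli level.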
NASA Astrophysics Data System (ADS)
Leng, Shuai; Zhou, Wei; Yu, Zhicong; Halaweish, Ahmed; Krauss, Bernhard; Schmidt, Bernhard; Yu, Lifeng; Kappler, Steffen; McCollough, Cynthia
2017-09-01
Photon-counting computed tomography (PCCT) uses a photon-counting detector to count individual photons and allocate them to specific energy bins by comparing photon energy to preset thresholds. This enables simultaneous multi-energy CT with a single source and detector. Phantom studies were performed to assess the spectral performance of a research PCCT scanner by evaluating the accuracy of derived image sets. Specifically, we assessed the accuracy of iodine quantification in iodine map images and of CT numbers in virtual monoenergetic images (VMI). Vials containing iodine at five known concentrations were scanned on the PCCT scanner after being placed in phantoms representing the attenuation of different patient sizes. For comparison, the same vials and phantoms were also scanned on 2nd and 3rd generation dual-source, dual-energy scanners. After material decomposition, iodine maps were generated, from which iodine concentration was measured for each vial and phantom size and compared with the known concentration. Additionally, VMIs were generated and CT number accuracy was compared to the reference standard, which was calculated from the known iodine concentration and attenuation coefficients at each keV obtained from the U.S. National Institute of Standards and Technology (NIST). Results showed accurate iodine quantification (root mean square error of 0.5 mgI/cc) and accurate CT numbers in VMIs (percentage error of 8.9%) using the PCCT scanner. The overall performance of the PCCT scanner, in terms of iodine quantification and VMI CT number accuracy, was comparable to that of EID-based dual-source, dual-energy scanners.
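The two accuracy figures quoted (root mean square error of 0.5 mgI/cc; percentage error of 8.9%) follow from the standard definitions; a sketch with illustrative numbers, not the study's measurements:

```python
from math import sqrt

def rmse(measured, known):
    """Root-mean-square error between measured and known concentrations."""
    return sqrt(sum((m - k) ** 2 for m, k in zip(measured, known)) / len(known))

def percent_error(measured, reference):
    """Unsigned percentage deviation from a reference value."""
    return 100.0 * abs(measured - reference) / reference

# Illustrative vial concentrations in mgI/cc:
err = rmse([2.1, 4.9, 10.3], [2.0, 5.0, 10.0])
```

The same two statistics summarize, respectively, the iodine-map accuracy across vials and the VMI CT number deviation from the NIST-derived reference.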
Varughese, Eunice A; Brinkman, Nichole E; Anneken, Emily M; Cashdollar, Jennifer L; Fout, G Shay; Furlong, Edward T; Kolpin, Dana W; Glassmeyer, Susan T; Keely, Scott P
2018-04-01
Drinking water treatment plants rely on purification of contaminated source waters to provide communities with potable water. One group of possible contaminants is enteric viruses. Measurement of viral quantities in environmental water systems is often performed using polymerase chain reaction (PCR) or quantitative PCR (qPCR). However, true values may be underestimated due to challenges involved in a multi-step viral concentration process and due to PCR inhibition. In this study, water samples were concentrated from 25 drinking water treatment plants (DWTPs) across the US to study the occurrence of enteric viruses in source water and their removal after treatment. The five types of viruses studied were adenovirus, norovirus GI, norovirus GII, enterovirus, and polyomavirus. Quantitative PCR was performed on all samples to determine the presence or absence of these viruses in each sample. Ten DWTPs showed presence of one or more viruses in source water, and at four DWTPs the treated drinking water also tested positive. Furthermore, PCR inhibition was assessed for each sample using an exogenous amplification control, which indicated that all of the DWTP samples, including source and treated water samples, had some level of inhibition, confirming that inhibition plays an important role in PCR-based assessments of environmental samples. PCR inhibition measurements, viral recovery, and other assessments were incorporated into a Bayesian model to more accurately determine viral load in both source and treated water. Results of the Bayesian model indicated that viruses are present in source water and treated water. By using a Bayesian framework that incorporates inhibition, as well as many other parameters that affect viral detection, this study offers an approach for more accurately estimating the occurrence of viral pathogens in environmental waters. Published by Elsevier B.V.
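The core idea — inferring a true viral load from a measurement attenuated by recovery losses and PCR inhibition — can be illustrated with a toy grid-based Bayesian update. This is a simplified stand-in, not the paper's actual model; the Poisson observation assumption, flat prior, and all parameter values are invented for the sketch:

```python
import math

def posterior_mean(observed, recovery, inhibition, grid_max=500, step=1.0):
    """Posterior mean of the true concentration C, assuming the observed
    count is Poisson with mean C * recovery * (1 - inhibition) and a flat
    prior for C on a finite grid."""
    grid = [c * step for c in range(1, int(grid_max / step))]
    def loglik(c):
        lam = c * recovery * (1.0 - inhibition)
        return observed * math.log(lam) - lam - math.lgamma(observed + 1)
    logs = [loglik(c) for c in grid]
    m = max(logs)                                  # stabilize the exponentials
    weights = [math.exp(l - m) for l in logs]      # unnormalized posterior
    z = sum(weights)
    return sum(c * w for c, w in zip(grid, weights)) / z
```

With 50% recovery and 50% inhibition, an observed count of 50 implies a true load around four times larger — the kind of correction that naive qPCR quantification would miss.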
ERIC Educational Resources Information Center
Arikan, Serkan; van de Vijver, Fons J. R.; Yagmur, Kutlay
2018-01-01
We examined Differential Item Functioning (DIF) and the size of cross-cultural performance differences in the Programme for International Student Assessment (PISA) 2012 mathematics data before and after application of propensity score matching. The mathematics performance of Indonesian, Turkish, Australian, and Dutch students on released items was…
Assessment of a membrane drinking water filter in an emergency setting.
Ensink, Jeroen H J; Bastable, Andy; Cairncross, Sandy
2015-06-01
The performance and acceptability of the Nerox(TM) membrane drinking water filter were evaluated among an internally displaced population in Pakistan. The membrane filter and a control ceramic candle filter were distributed to over 3,000 households. After a 6-month period, 230 households were visited and filter performance and use were assessed. Only 6% of the visited households still had a functioning filter, and removal performance ranged from 80 to 93%. High turbidity in source water (irrigation canals), together with high temperatures and large family sizes, likely contributed to the poor performance and uptake of the filters.
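The 80-93% removal figures correspond to the usual definition: the share of the influent concentration absent from the filtrate. A minimal sketch (the concentrations are illustrative, not the study's data):

```python
from math import log10

def percent_removal(influent, effluent):
    """Share of the influent concentration removed by the filter."""
    return 100.0 * (1.0 - effluent / influent)

def log_reduction(influent, effluent):
    """Same performance expressed as a log10 reduction value."""
    return log10(influent / effluent)

# E.g. 1000 CFU/100 mL in canal water, 70 CFU/100 mL in the filtrate:
removal = percent_removal(1000.0, 70.0)   # ~93% removal
```

Note that even 93% removal is only about 1.2 log reduction, which is why highly turbid, heavily contaminated source water can still yield unsafe filtrate.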
NASA Astrophysics Data System (ADS)
López Comino, José Ángel; Kriegerowski, Marius; Cesca, Simone; Dahm, Torsten; Mirek, Janusz; Lasocki, Stanislaw
2016-04-01
Hydraulic fracturing is considered among the human operations which could induce or trigger seismicity or microseismic activity. The influence of hydraulic fracturing operations is typically expected in terms of weak-magnitude events. However, the sensitivity of the rock mass to triggered seismicity varies significantly between sites and cannot be easily predicted prior to operations. In order to assess the sensitivity of microseismicity to hydraulic fracturing operations, we perform seismic monitoring at a shale gas exploration/exploitation site in the central-western part of the Peribaltic synclise in Pomerania (Poland). The monitoring will be continued before, during and after the termination of hydraulic fracturing operations. The fracking operations are planned for April 2016 at a depth of 4000 m. A specific network setup has been installed since summer 2015, including a distributed network of broadband stations and three small-scale arrays. The network covers a region of 60 km2. The aperture of the small-scale arrays is between 450 and 950 m. So far no fracturing operations have been performed, but the seismic data can already be used to assess the seismic noise and background microseismicity, and to investigate the detection performance of our monitoring setup. Here we adopt a recently developed tool to generate a synthetic catalogue and waveform dataset, which realistically account for the expected microseismicity. Synthetic waveforms are generated for a local crustal model, considering a realistic distribution of hypocenters, magnitudes, moment tensors, and source durations. Noise-free synthetic seismograms are superposed on real noise traces to reproduce true monitoring conditions at the different station locations. We estimate the detection probability for different magnitudes, source-receiver distances, and noise conditions. This information is used to estimate the magnitude of completeness at the depth of the hydraulic fracturing horizontal wells.
Our technique is useful to evaluate the efficiency of the seismic network and validate detection and location algorithms, taking into account the signal-to-noise ratio. The same dataset may be used at a later time to assess the performance of other seismological analyses, such as hypocentral location, magnitude estimation and source parameter inversion. This work is funded by the EU H2020 SHEER project.
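The detection-probability estimation described — superposing synthetic events on real noise and counting which ones exceed a detection criterion — can be sketched as a Monte Carlo loop. The amplitude model, SNR threshold, and all numbers below are made-up stand-ins for the project's actual tools:

```python
import math
import random

random.seed(1)

def detection_probability(magnitude, distance_km, noise_rms,
                          snr_threshold=3.0, trials=2000):
    """Monte Carlo estimate of the chance a synthetic event exceeds an SNR
    threshold at one station (toy attenuation model, not SHEER's)."""
    # Toy local-magnitude-style amplitude: log10(A) = M - 2*log10(r) - 1
    amp = 10 ** (magnitude - 2.0 * math.log10(distance_km) - 1.0)
    hits = 0
    for _ in range(trials):
        noise = abs(random.gauss(0.0, noise_rms))   # one noise realization
        if amp / max(noise, 1e-12) >= snr_threshold:
            hits += 1
    return hits / trials
```

Scanning this over a magnitude grid and taking the smallest magnitude detected with, say, 95% probability gives a station-level magnitude of completeness; the network-level figure combines such curves across stations.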
ERIC Educational Resources Information Center
Jia, Jiyou; Chen, Yuhao; Ding, Zhuhui; Ruan, Meixian
2012-01-01
Vocabulary acquisition and assessment are regarded as the key basis for the instruction of English as a second language. However, it is time-consuming, fallible and repetitive for the school teachers and parents to assess the proficiency of the students' vocabulary acquisition. We customized the open source course management system Moodle to build…
ERIC Educational Resources Information Center
Young, John Q.; Lieu, Sandra; O'Sullivan, Patricia; Tong, Lowell
2011-01-01
Objective: The authors developed and tested the feasibility and utility of a new direct-observation instrument to assess trainee performance of a medication management session. Methods: The Psychopharmacotherapy-Structured Clinical Observation (P-SCO) instrument was developed based on multiple sources of expertise and then implemented in 4…
ERIC Educational Resources Information Center
Grissom, Jason A.; Loeb, Susanna
2017-01-01
Teacher effectiveness varies substantially, yet principals' evaluations of teachers often fail to differentiate performance among teachers. We offer new evidence on principals' subjective evaluations of their teachers' effectiveness using two sources of data from a large, urban district: principals' high-stakes personnel evaluations of teachers,…
They Hear, but Do Not Listen: Retention for Podcasted Material in a Classroom Context
ERIC Educational Resources Information Center
Daniel, David B.; Woody, William Douglas
2010-01-01
This study compared the retention of students who listened to podcasts of a primary source with the retention of students who read the source as text. We also assessed students' preferences and study habits. Quiz scores revealed that the podcast group performed more poorly than did students who read the text. Although students initially preferred…
ERIC Educational Resources Information Center
Mitchell, Karen J.; Raye, Carol L.; McGuire, Joseph T.; Frankel, Hillary; Greene, Erich J.; Johnson, Marcia K.
2008-01-01
A short-term source monitoring procedure with functional magnetic resonance imaging assessed neural activity when participants made judgments about the format of 1 of 4 studied items (picture, word), the encoding task performed (cost, place), or whether an item was old or new. The results support findings from long-term memory studies showing that…
A field test of emulsified zero valent iron (EZVI) nanoparticles was conducted at Parris Island, SC, USA and was monitored for two and half years to assess the treatment of subsurface-source zone chlorinated volatile organic compounds (CVOCs) dominated by tetrachloroethene (PCE) ...
Rosato, Stefano; D'Errigo, Paola; Badoni, Gabriella; Fusco, Danilo; Perucci, Carlo A; Seccareccia, Fulvia
2008-08-01
The availability of two contemporary sources of information about coronary artery bypass graft (CABG) interventions made it possible 1) to verify the feasibility of performing outcome evaluation studies using administrative data sources, and 2) to compare hospital performance obtained using the CABG Project clinical database with hospital performance derived from current administrative data. Interventions recorded in the CABG Project were linked to the hospital discharge record (HDR) administrative database. Only the linked records were considered for subsequent analyses (46% of the total CABG Project). A new selected population, "clinical card-HDR", was then defined. Two independent risk-adjustment models were applied, each using information derived from one of the two sources. Then, HDR information was supplemented with some patient preoperative conditions from the CABG clinical database. The two models were compared in terms of their fit to the data. Hospital performances identified by the two models as significantly different from the mean were compared. In only 4 of the 13 hospitals considered for analysis did the results obtained using the HDR model not completely overlap with those obtained with the CABG model. When comparing statistical parameters of the HDR model and the HDR model plus patient preoperative conditions, the latter showed the best fit to the data. In this "clinical card-HDR" population, hospital performance assessment obtained using information from the clinical database is similar to that derived from current administrative data. However, when risk-adjustment models built on administrative databases are supplemented with a few clinical variables, their statistical parameters improve and hospital performance assessment becomes more accurate.
Aeolus End-To-End Simulator and Wind Retrieval Algorithms up to Level 1B
NASA Astrophysics Data System (ADS)
Reitebuch, Oliver; Marksteiner, Uwe; Rompel, Marc; Meringer, Markus; Schmidt, Karsten; Huber, Dorit; Nikolaus, Ines; Dabas, Alain; Marshall, Jonathan; de Bruin, Frank; Kanitz, Thomas; Straume, Anne-Grete
2018-04-01
The first wind lidar in space, ALADIN, will be deployed on ESA's Aeolus mission. In order to assess the performance of ALADIN and to optimize the wind retrieval and calibration algorithms, an end-to-end simulator was developed. This allows realistic simulations of the data downlinked by Aeolus. Together with the operational processors, this setup is used to assess random and systematic error sources and perform sensitivity studies on the influence of atmospheric and instrument parameters.
Temperature measurement and performance assessment of the experimental composting bioreactor
NASA Astrophysics Data System (ADS)
Bajko, Jaroslav; Fišer, Jan; Jícha, Miroslav
2018-06-01
A considerable amount of the heat produced during composting of organic matter is usually lost to the surrounding environment. In order to utilize this potential heat source, biomass is composted in mounds or vessels, and excess thermal energy is captured via heat exchangers and transported to the site for space heating, preparation of utility water, and other applications. The aim of this experimental research is to measure the temperature profiles in a pilot-scale composting bioreactor and assess its performance.
Introducing a design exigency to promote student learning through assessment: A case study.
Grealish, Laurie A; Shaw, Julie M
2018-02-01
Assessment technologies are often used to classify student and newly qualified nurse performance as 'pass' or 'fail', with little attention to how these decisions are achieved. Examining the design exigencies of classification technologies, such as performance assessment technologies, provides opportunities to explore flexibility and change in the process of using those technologies. Evaluate an established assessment technology for nursing performance as a classification system. A case study analysis that is focused on the assessment approach and a priori design exigencies of performance assessment technology, in this case the Australian Nursing Standards Assessment Tool 2016. Nurse assessors are required to draw upon their expertise to judge performance, but that judgement is described as a source of bias, creating confusion. The definition of satisfactory performance is 'ready to enter practice'. To pass, the performance on each criterion must be at least satisfactory, indicating to the student that no further improvement is required. The Australian Nursing Standards Assessment Tool 2016 does not have a third 'other' category, which is usually found in classification systems. Introducing a 'not yet competent' category and creating a two-part, mixed methods assessment process can improve the Australian Nursing Standards Assessment Tool 2016 assessment technology. Using a standards approach in the first part, judgement is valued and can generate learning opportunities across a program. Using a measurement approach in the second part, student performance can be 'not yet competent' but still meet criteria for year level performance and a graded pass. Subjecting the Australian Nursing Standards Assessment Tool 2016 assessment technology to analysis as a classification system provides opportunities for innovation in design. 
This design innovation has the potential to support students who move between programs and clinicians who assess students from different universities. Copyright © 2017 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Justin; Slaughter, Andrew; Veeraraghavan, Swetha
Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON) is a finite element application that aims at analyzing the response of 3-D soil-structure systems to natural and man-made hazards such as earthquakes, floods and fire. MASTODON currently focuses on the simulation of seismic events and has the capability to perform extensive ‘source-to-site’ simulations including earthquake fault rupture, nonlinear wave propagation and nonlinear soil-structure interaction (NLSSI) analysis. MASTODON is being developed to be a dynamic probabilistic risk assessment framework that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment.
Marshall, J H; Baker, D M; Lee, M J; Jones, G L; Lobo, A J; Brown, S R
2017-06-01
Decision-making in perianal Crohn's fistula (pCD) is preference sensitive. Patients use the internet to access healthcare information. The aim of this study was to assess the online information and patient decision aids relating to surgery for pCD. A search of Google™ and the Decision Aids Library Inventory (DALI) was performed using a predefined search strategy. Patient-focussed sources providing information about pCD surgery were included in the analysis. Written health information was assessed using the International Patient Decision Aids Standards (IPDAS) and DISCERN criteria. The readability of the source content was assessed using the Flesch-Kincaid score. Of the 201 sources found, 187 were excluded, leaving 14 sources for analysis. Three sources were dedicated to pCD, and six sources mentioned pCD-specific outcomes. The most common surgical intervention reported was seton insertion (n = 13). The least common surgical intervention reported was proctectomy (n = 1). The mean IPDAS and DISCERN scores were 4.43 ± 1.65 out of 12 (range = 2-8) and 2.93 ± 0.73 out of 5 (range = 1-5), respectively. The mean reading ease was US college standard. We found no patient decision aids relating to surgery for pCD. The online sources relating to surgery for pCD are few, and their quality is poor, as seen in the low IPDAS and DISCERN scores. Less than half of the sources mentioned pCD-specific outcomes, and three sources were solely dedicated to providing information on pCD. Healthcare professionals should look to create a patient tool to assist decision-making in pCD.
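The Flesch-Kincaid readability assessment the study relies on reduces to closed-form formulas over word, sentence, and syllable counts:

```python
def flesch_reading_ease(words, sentences, syllables):
    """Flesch Reading Ease: higher is easier (standard published formula)."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words, sentences, syllables):
    """Flesch-Kincaid Grade Level: US school grade needed to read the text."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# E.g. a passage of 100 words in 5 sentences with 150 syllables:
ease  = flesch_reading_ease(100, 5, 150)     # ~59.6, "fairly difficult"
grade = flesch_kincaid_grade(100, 5, 150)    # ~grade 10
```

A "US college standard" reading ease conventionally corresponds to roughly 30-50 on the Flesch scale, well above the grade 6-8 level usually recommended for patient information.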
Rater Expertise in a Second Language Speaking Assessment: The Influence of Training and Experience
ERIC Educational Resources Information Center
Davis, Lawrence Edward
2012-01-01
Speaking performance tests typically employ raters to produce scores; accordingly, variability in raters' scoring decisions has important consequences for test reliability and validity. One such source of variability is the rater's level of expertise in scoring. Therefore, it is important to understand how raters' performance is influenced by…
Acoustic Source Localization in Aircraft Interiors Using Microphone Array Technologies
NASA Technical Reports Server (NTRS)
Sklanka, Bernard J.; Tuss, Joel R.; Buehrle, Ralph D.; Klos, Jacob; Williams, Earl G.; Valdivia, Nicolas
2006-01-01
Using three microphone array configurations at two aircraft body stations on a Boeing 777-300ER flight test, the acoustic radiation characteristics of the sidewall and outboard floor system are investigated by experimental measurement. Analysis of the experimental data is performed using sound intensity calculations for closely spaced microphones, PATCH Inverse Boundary Element Nearfield Acoustic Holography, and Spherical Nearfield Acoustic Holography. The methods are compared by assessing their strengths and weaknesses, evaluating source identification capability for both broadband and narrowband sources, evaluating sources during transient and steady-state conditions, and quantifying field reconstruction continuity using multiple array positions.
NASA Astrophysics Data System (ADS)
Lucchi, M.; Lorenzini, M.; Valdiserri, P.
2017-01-01
This work presents a numerical simulation of the annual performance of two different systems: a traditional one composed of a gas boiler-chiller pair, and one consisting of a ground-source heat pump (GSHP), both coupled to two thermal storage tanks. The systems serve a block of flats located in northern Italy and are assessed over a typical weather year, covering both the heating and cooling seasons. The air handling unit (AHU) coupled with the GSHP exhibits excellent temperature control and high performance parameters (EER and COP), which make running costs about 30% lower than those estimated for the traditional plant.
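The roughly 30% cost advantage follows from how COP enters the energy bill: a heat pump delivers several units of heat per unit of electricity, while a boiler delivers less than one unit of heat per unit of fuel. A back-of-envelope sketch with hypothetical demand and tariffs (not the paper's inputs):

```python
def heating_cost(demand_kwh, conversion_factor, price_per_kwh):
    """Energy bill = thermal demand / conversion efficiency (or COP) * price."""
    return demand_kwh / conversion_factor * price_per_kwh

# Hypothetical: 20 MWh annual heating demand, gas boiler at 90% efficiency
# on gas at 0.09 EUR/kWh vs. GSHP with seasonal COP 4 on electricity
# at 0.25 EUR/kWh.
gas = heating_cost(20000.0, 0.90, 0.09)   # 2000 EUR
hp  = heating_cost(20000.0, 4.0, 0.25)    # 1250 EUR
saving = 1.0 - hp / gas                   # ~0.37
```

With these invented tariffs the saving comes out near 37%, the same order as the paper's estimate; the actual figure depends on local energy prices and the seasonal COP/EER achieved.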
Battelle Energy Alliance, LLC (BEA) 2016 Self-Assessment Report for Idaho National Laboratory (INL)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alvarez, Juan
This report provides Battelle Energy Alliance’s (BEA) self-assessment of performance for the period of October 1, 2015, through September 30, 2016, as evaluated against the goals, performance objectives, and notable outcomes defined in the Fiscal Year (FY) 2016 Performance Evaluation and Measurement Plan (PEMP). BEA took into consideration and consolidated all input provided from internal and external sources (e.g., Contractor Assurance System [CAS], program and customer feedback, external and independent reviews, and Department of Energy [DOE] Idaho Operations Office [ID] quarterly PEMP reports and Quarterly Evaluation Reports). The overall performance of BEA during this rating period was self-assessed as “Excellent,”more » exceeding expectations of performance in Goal 1.0, “Efficient and Effective Mission Accomplishment”; Goal 2.0, “Efficient and Effective Stewardship and Operation of Research Facilities”; and Goal 3.0, “Sound and Competent Leadership and Stewardship of the Laboratory.” BEA met or exceeded expectations for Mission Support Goals 4.0 through 7.0 assessing a final multiplier of 1.0. Table 1 documents BEA’s assessment of performance to the goals and individual performance objectives. Table 2 documents completion of the notable outcomes. A more-detailed assessment of performance for each individual performance objective is documented in the closeout reports (see the PEMP reporting system). Table 3 includes an update to “Performance Challenges” as reported in the FY 2015 Self-Assessment Report.« less
Performance of salmon fishery portfolios across western North America.
Griffiths, Jennifer R; Schindler, Daniel E; Armstrong, Jonathan B; Scheuerell, Mark D; Whited, Diane C; Clark, Robert A; Hilborn, Ray; Holt, Carrie A; Lindley, Steven T; Stanford, Jack A; Volk, Eric C
2014-12-01
Quantifying the variability in the delivery of ecosystem services across the landscape can be used to set appropriate management targets, evaluate resilience and target conservation efforts. Ecosystem functions and services may exhibit portfolio-type dynamics, whereby diversity within lower levels promotes stability at more aggregated levels. Portfolio theory provides a framework to characterize the relative performance among ecosystems and the processes that drive differences in performance. We assessed Pacific salmon Oncorhynchus spp. portfolio performance across their native latitudinal range focusing on the reliability of salmon returns as a metric with which to assess the function of salmon ecosystems and their services to humans. We used the Sharpe ratio (e.g. the size of the total salmon return to the portfolio relative to its variability (risk)) to evaluate the performance of Chinook and sockeye salmon portfolios across the west coast of North America. We evaluated the effects on portfolio performance from the variance of and covariance among salmon returns within each portfolio, and the association between portfolio performance and watershed attributes. We found a positive latitudinal trend in the risk-adjusted performance of Chinook and sockeye salmon portfolios that also correlated negatively with anthropogenic impact on watersheds (e.g. dams and land-use change). High-latitude Chinook salmon portfolios were on average 2·5 times more reliable, and their portfolio risk was mainly due to low variance in the individual assets. Sockeye salmon portfolios were also more reliable at higher latitudes, but sources of risk varied among the highest performing portfolios. Synthesis and applications . Portfolio theory provides a straightforward method for characterizing the resilience of salmon ecosystems and their services. Natural variability in portfolio performance among undeveloped watersheds provides a benchmark for restoration efforts. 
Locally and regionally, assessing the sources of portfolio risk can guide actions to maintain existing resilience (protect habitat and disturbance regimes that maintain response diversity; employ harvest strategies sensitive to different portfolio components) or improve restoration activities. Improving our understanding of portfolio reliability may allow for management of natural resources that is robust to ongoing environmental change.
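The Sharpe-ratio reliability metric described in the abstract above can be sketched in a few lines; the return series below are hypothetical illustrations, not salmon data from the study:

```python
import statistics

def sharpe_ratio(returns):
    """Risk-adjusted reliability: mean annual return divided by its
    standard deviation (no risk-free rate is subtracted, matching the
    ecological usage described in the abstract)."""
    return statistics.fmean(returns) / statistics.stdev(returns)

# Hypothetical annual returns (thousands of fish) for two portfolios
# with the same mean but different variance.
stable = [100, 105, 95, 102, 98]
volatile = [100, 160, 40, 150, 50]

print(sharpe_ratio(stable) > sharpe_ratio(volatile))  # True
```

A higher ratio marks a portfolio whose aggregate return is more reliable relative to its variability, which is how the study compares watersheds across latitudes.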
Performance of salmon fishery portfolios across western North America
Griffiths, Jennifer R; Schindler, Daniel E; Armstrong, Jonathan B; Scheuerell, Mark D; Whited, Diane C; Clark, Robert A; Hilborn, Ray; Holt, Carrie A; Lindley, Steven T; Stanford, Jack A; Volk, Eric C
2014-01-01
Quantifying the variability in the delivery of ecosystem services across the landscape can be used to set appropriate management targets, evaluate resilience and target conservation efforts. Ecosystem functions and services may exhibit portfolio-type dynamics, whereby diversity within lower levels promotes stability at more aggregated levels. Portfolio theory provides a framework to characterize the relative performance among ecosystems and the processes that drive differences in performance. We assessed Pacific salmon Oncorhynchus spp. portfolio performance across their native latitudinal range focusing on the reliability of salmon returns as a metric with which to assess the function of salmon ecosystems and their services to humans. We used the Sharpe ratio (e.g. the size of the total salmon return to the portfolio relative to its variability (risk)) to evaluate the performance of Chinook and sockeye salmon portfolios across the west coast of North America. We evaluated the effects on portfolio performance from the variance of and covariance among salmon returns within each portfolio, and the association between portfolio performance and watershed attributes. We found a positive latitudinal trend in the risk-adjusted performance of Chinook and sockeye salmon portfolios that also correlated negatively with anthropogenic impact on watersheds (e.g. dams and land-use change). High-latitude Chinook salmon portfolios were on average 2·5 times more reliable, and their portfolio risk was mainly due to low variance in the individual assets. Sockeye salmon portfolios were also more reliable at higher latitudes, but sources of risk varied among the highest performing portfolios. Synthesis and applications. Portfolio theory provides a straightforward method for characterizing the resilience of salmon ecosystems and their services. Natural variability in portfolio performance among undeveloped watersheds provides a benchmark for restoration efforts. 
Locally and regionally, assessing the sources of portfolio risk can guide actions to maintain existing resilience (protect habitat and disturbance regimes that maintain response diversity; employ harvest strategies sensitive to different portfolio components) or improve restoration activities. Improving our understanding of portfolio reliability may allow for management of natural resources that is robust to ongoing environmental change. PMID:25552746
Episodic Memory and Regional Atrophy in Frontotemporal Lobar Degeneration
Söderlund, Hedvig; Black, Sandra E.; Miller, Bruce L.; Freedman, Morris; Levine, Brian
2008-01-01
It has been unclear to what extent memory is affected in frontotemporal lobar degeneration (FTLD). Since patients usually have atrophy in regions implicated in memory function, the frontal and/or temporal lobes, one would expect some memory impairment, and that the degree of atrophy in these regions would be inversely related to memory function. The purposes of this study were 1) to assess episodic memory function in FTLD, and more specifically patients' ability to episodically re-experience an event and determine its source; and 2) to examine whether memory performance is related to quantified regional brain atrophy. FTLD patients (n=18) and healthy comparison subjects (n=14) were assessed with cued recall, recognition, “remember/know” (self-reported re-experiencing) and source recall, at 30 min and 24 hr after encoding. Regional gray matter volumes were assessed with high-resolution structural MRI concurrently with testing. Patients performed worse than comparison subjects on all memory measures. Gray matter volume in the left medial temporal lobe was positively correlated with recognition, re-experiencing, and source recall. Gray matter volume in the left posterior temporal lobe correlated significantly with recognition at 30 min and 24 hr, and with source recall at 30 min. Estimated familiarity at 30 min was positively correlated with gray matter volume in the left inferior parietal lobe. In summary, episodic memory deficits in FTLD may be more common than previously thought, particularly in patients with left medial and posterior temporal atrophy. PMID:17888461
Amiri, Leyla; Madadian, Edris; Hassani, Ferri P
2018-06-08
The objective of this study is to perform an energy and exergy analysis of an integrated ground-source heat pump (GSHP) system, along with a technical assessment, for geothermal energy production using the Engineering Equation Solver (EES). The system comprises a heat pump cycle and a ground heat exchanger for extracting geothermal energy from underground mine water. A simultaneous energy and exergy analysis of the system was performed. These analyses yielded favorable outcomes owing to the use of an economical and green source of energy. The energetic coefficient of performance (COP) of the entire system is 2.33 and the exergy efficiency of the system is 28.6%. The exergetic efficiencies of the compressor, ground heat exchanger, evaporator, expansion valve, condenser and fan are computed to be 38%, 42%, 53%, 55%, 60% and 64%, respectively. In the numerical investigation, different alterations, such as changing the temperature and pressure of the condenser, show promising potential for further application of GSHPs. The outcomes of this research can be used for developing and designing novel coupled heat and power systems.
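The headline figures above (system COP of 2.33, exergy efficiency of 28.6%) are ratios of energy and exergy flows; a minimal sketch, where the numeric flow values are illustrative assumptions and not taken from the paper:

```python
def cop_heating(q_out_kw, w_in_kw):
    """Energetic coefficient of performance: useful heat delivered
    per unit of work input to the heat pump."""
    return q_out_kw / w_in_kw

def exergy_efficiency(ex_recovered_kw, ex_supplied_kw):
    """Second-law (exergetic) efficiency: exergy recovered divided
    by exergy supplied."""
    return ex_recovered_kw / ex_supplied_kw

# Illustrative flows only (not values from the paper): 70 kW of heat
# delivered for 30 kW of work reproduces the reported COP of 2.33, and
# 2 kW of exergy recovered from 7 kW supplied gives roughly 28.6%.
print(round(cop_heating(70.0, 30.0), 2))       # 2.33
print(round(exergy_efficiency(2.0, 7.0), 3))   # 0.286
```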
Gao, Jian; Zhang, Jie; Li, Hong; Li, Lei; Xu, Linghong; Zhang, Yujie; Wang, Zhanshan; Wang, Xuezhong; Zhang, Weiqi; Chen, Yizhen; Cheng, Xi; Zhang, Hao; Peng, Liang; Chai, Fahe; Wei, Yongjie
2018-07-01
Volatile organic compounds (VOCs) can react with atmospheric radicals while being transported after emission, resulting in substantial losses. Using only observed VOC mixing ratios to assess VOC pollution is therefore problematic. Observed mixing ratios and initial mixing ratios (which take chemical loss into consideration) were determined using data for 90 VOCs in the atmosphere of a typical urban area in Beijing in winter 2013, to gain a more accurate view of VOC pollution. The VOC sources, ambient VOC mixing ratios and compositions, variability and influencing factors, contributions to near-ground ozone, and health risks posed were assessed. Source apportionment should be conducted using initial mixing ratios, but health risks should be assessed using observed mixing ratios. The daytime daily mean initial mixing ratio (72.62 ppbv) was 7.72 ppbv higher than the daytime daily mean observed mixing ratio (64.90 ppbv). Alkenes contributed >70% of the consumed VOCs. The nighttime daily mean observed mixing ratio was 71.66 ppbv, 6.76 ppbv higher than the daytime mixing ratio. The observed mixing ratio for 66 VOCs was 40.31% higher in Beijing than in New York. The ozone formation potential (OFP) of the initial daytime mixing ratios (266.54 ppbv) was underestimated by 23.41% when computed from the observed daytime mixing ratios (OFP 204.14 ppbv); improving emission control of ethylene and propene would be an effective way of controlling O3. Health risk assessments performed for 28 hazardous VOCs show that benzene, chloroform, 1,2-dichloroethane, and acetaldehyde pose carcinogenic risks and acrolein poses non-carcinogenic risks. Source apportionment results indicated that vehicle exhaust, solvent usage and industrial processes were the main VOC sources during the study. Copyright © 2018. Published by Elsevier B.V.
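Back-calculating an initial mixing ratio from an observed one requires a chemical-loss model. A minimal sketch using the common photochemical-age approximation (first-order loss to OH only); the rate constant, OH concentration, and transport time below are illustrative assumptions, and the paper's exact correction method may differ:

```python
import math

def initial_mixing_ratio(observed_ppbv, k_oh, oh_conc, age_s):
    """Back-calculate an initial VOC mixing ratio from the observed one,
    assuming first-order loss to the OH radical only (photochemical-age
    approximation): [VOC]_init = [VOC]_obs * exp(k_OH * [OH] * t).
    k_oh in cm^3 molecule^-1 s^-1, oh_conc in molecules cm^-3, age_s in s."""
    return observed_ppbv * math.exp(k_oh * oh_conc * age_s)

# Illustrative: a reactive alkene (k_OH ~ 2.6e-11) transported for 2 h
# at [OH] = 1e6 molecules cm^-3 has lost roughly 17% of its initial amount.
init = initial_mixing_ratio(1.0, 2.6e-11, 1.0e6, 2 * 3600)
print(init > 1.0)  # True: the initial value exceeds the observed one
```

This is why the initial daily mean in the abstract exceeds the observed daily mean, and why reactive species such as alkenes dominate the consumed fraction.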
Hulsman, Robert L; Peters, Joline F; Fabriek, Marcel
2013-09-01
Peer-assessment of communication skills may contribute to mastery of assessment criteria. When students develop the capacity to judge their peers' performance, they might improve their capacity to examine their own clinical performance. In this study peer-assessment ratings are compared to teacher-assessment ratings. The aim of this paper is to explore the impact of personality and social reputation as sources of bias in the assessment of communication skills. Second-year students were trained and assessed in history-taking communication skills. Peers rated the students' personality and academic and social reputation. Peer-assessment ratings were significantly correlated with teacher ratings in a summative assessment of medical communication. Peers did not provide negative ratings on final scales but did provide negative ratings on subcategories. Peer- and teacher-assessments were both related to the students' personality and academic reputation. Peer-assessment cannot replace teacher-assessment if the assessment should result in high-stakes decisions about students. Our data do not confirm the hypothesis that peers are overly biased by personality and reputation characteristics in peer-assessment of performance. Early introduction of peer-assessment in medical education would facilitate early acceptance of this mode of evaluation and would promote early on the habit of critical evaluation of professional clinical performance and acceptance of being evaluated critically by peers. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
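The peer-versus-teacher comparison rests on a rating correlation; a self-contained Pearson-correlation sketch (the abstract does not state which correlation statistic was used, so this choice and the rating data are assumptions):

```python
def pearson_r(xs, ys):
    """Pearson product-moment correlation between two rating lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical peer and teacher ratings for five students on a 1-10 scale:
# a strong positive r would mirror the significant correlation reported.
peer = [6.0, 7.5, 5.0, 8.0, 6.5]
teacher = [5.5, 7.0, 5.5, 8.5, 6.0]
print(round(pearson_r(peer, teacher), 2))
```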
Rey-Martinez, Jorge; Pérez-Fernández, Nicolás
2016-12-01
The objective was to develop and validate posturography software and to share its source code in open source terms. In a prospective non-randomized validation study, 20 consecutive adults underwent two balance assessment tests: six-condition posturography was performed using clinically approved software and a force platform, and the same conditions were measured using the newly developed open source software (RombergLab) with a low-cost force platform. The main validation variable was the intra-class correlation coefficient of the sway area obtained from the center-of-pressure variations on both devices for the six conditions. Excellent concordance between RombergLab and the clinically approved force platform was obtained (intra-class correlation coefficient = 0.94), reaching the proposed validation goal of 0.9; a Bland and Altman concordance plot was also obtained. With these results we consider the developed software (RombergLab) a validated balance assessment tool, whose reliability depends on the technical specifications of the force platform used. The source code used to develop RombergLab was published in open source terms.
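The validation hinges on an intra-class correlation between the two devices. A minimal sketch of one common form, the two-way consistency ICC(3,1), computed from paired sway-area measurements; the study does not specify which ICC variant was used, so that choice and the sample data are assumptions:

```python
def icc_consistency(pairs):
    """Two-way, single-measure consistency ICC(3,1) for two devices
    measuring the same subjects, computed from ANOVA mean squares."""
    n, k = len(pairs), 2
    grand = sum(x + y for x, y in pairs) / (n * k)
    row_means = [(x + y) / k for x, y in pairs]
    col_means = [sum(p[j] for p in pairs) / n for j in range(k)]
    ss_total = sum((v - grand) ** 2 for p in pairs for v in p)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    mse = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    msr = ss_rows / (n - 1)
    return (msr - mse) / (msr + (k - 1) * mse)

# Hypothetical sway areas from two platforms: a constant offset between
# devices does not hurt consistency, so ICC(3,1) is exactly 1.0 here.
print(icc_consistency([(1.0, 1.5), (2.0, 2.5), (3.0, 3.5)]))  # 1.0
```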
Federer, Meghan Rector; Nehm, Ross H.; Pearl, Dennis K.
2016-01-01
Understanding sources of performance bias in science assessment provides important insights into whether science curricula and/or assessments are valid representations of student abilities. Research investigating assessment bias due to factors such as instrument structure, participant characteristics, and item types are well documented across a variety of disciplines. However, the relationships among these factors are unclear for tasks evaluating understanding through performance on scientific practices, such as explanation. Using item-response theory (Rasch analysis), we evaluated differences in performance by gender on a constructed-response (CR) assessment about natural selection (ACORNS). Three isomorphic item strands of the instrument were administered to a sample of undergraduate biology majors and nonmajors (Group 1: n = 662 [female = 51.6%]; G2: n = 184 [female = 55.9%]; G3: n = 642 [female = 55.1%]). Overall, our results identify relationships between item features and performance by gender; however, the effect is small in the majority of cases, suggesting that males and females tend to incorporate similar concepts into their CR explanations. These results highlight the importance of examining gender effects on performance in written assessment tasks in biology. PMID:26865642
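In Rasch analysis, each response is modelled from a person ability and an item difficulty on a shared logit scale; performance bias by gender would surface as different effective difficulties for the same item across groups. A minimal sketch of the dichotomous Rasch model (parameter values are illustrative):

```python
import math

def rasch_p(theta, b):
    """Dichotomous Rasch model: probability that a person of ability
    theta (logits) succeeds on an item of difficulty b (logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty, success probability is exactly 0.5.
print(rasch_p(0.0, 0.0))  # 0.5

# A hypothetical item-feature bias: the same item behaving as if harder
# for one group (b = 0.4 vs 0.1) lowers that group's success probability.
print(rasch_p(0.0, 0.1) > rasch_p(0.0, 0.4))  # True
```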
Performance Appraisals: How to Make Them Work.
1987-03-01
... Choice ... Rank Order ... Forced Distribution ... Management by Objectives ... Assessment ... management today. Next, the many sources and causes of shortcomings with evaluation systems are explored in detail. Considerations in system design are ... the response of others when I've evaluated them. A good performance system can be a tremendous management tool when trying to develop others.
Performance assessment of the BEBIG MultiSource® high dose rate brachytherapy treatment unit
NASA Astrophysics Data System (ADS)
Palmer, Antony; Mzenda, Bongile
2009-12-01
A comprehensive system characterisation was performed of the Eckert & Ziegler BEBIG GmbH MultiSource® High Dose Rate (HDR) brachytherapy treatment unit with an 192Ir source. The unit is relatively new to the UK market, with the first installation in the country having been made in the summer of 2009. A detailed commissioning programme was devised and is reported including checks of the fundamental parameters of source positioning, dwell timing, transit doses and absolute dosimetry of the source. Well chamber measurements, autoradiography and video camera analysis techniques were all employed. The absolute dosimetry was verified by the National Physical Laboratory, UK, and compared to a measurement based on a calibration from PTB, Germany, and the supplied source certificate, as well as an independent assessment by a visiting UK centre. The use of the 'Krieger' dosimetry phantom has also been evaluated. Users of the BEBIG HDR system should take care to avoid any significant bend in the transfer tube, as this will lead to positioning errors of the source, of up to 1.0 mm for slight bends, 2.0 mm for moderate bends and 5.0 mm for extreme curvature (depending on applicators and transfer tube used) for the situations reported in this study. The reason for these errors and the potential clinical impact are discussed. Users should also note the methodology employed by the system for correction of transit doses, and that no correction is made for the initial and final transit doses. The results of this investigation found that the uncorrected transit doses lead to small errors in the delivered dose at the first dwell position, of up to 2.5 cGy at 2 cm (5.6 cGy at 1 cm) from a 10 Ci source, but the transit dose correction for other dwells was accurate within 0.2 cGy. The unit has been mechanically reliable, and source positioning accuracy and dwell timing have been reproducible, with overall performance similar to other existing HDR equipment. 
The unit is capable of high quality brachytherapy treatment delivery, taking the above factors into account.
Illuminant and observer metamerism and the Hardy-Rand-Rittler color vision test editions.
Dain, Stephen J
2006-01-01
A previous study identified significant metamerism in the several editions of the Hardy-Rand-Rittler pseudoisochromatic plates (HRR) but did not proceed to quantify the consequences of that metamerism (Dain, 2004). Metamerism arises from two sources and is almost inevitable when a printed color vision test is reproduced in several editions. Metamerism has two consequences when assessing anomalous trichromats: illuminant/source-based changes in performance and, less well known, changes in performance with observer. This study addresses the effects of illuminant/source and observer metamerism on the fourth edition of the HRR. Groups of colors intended to lie on a dichromat confusion line generally remain on a confusion line when the source is changed. The plates appear to be resistant to each form of metamerism, perhaps because the features of the spectral reflectance are similar for the figure colors and the background gray. As a consequence, the clinician needs to be less concerned about using a non-recommended source than was previously believed.
Rewarded remembering: dissociations between self-rated motivation and memory performance.
Ngaosuvan, Leonard; Mäntylä, Timo
2005-08-01
People often claim that they perform better on memory tasks when they are more motivated. However, past research has shown minimal effects of motivation on memory performance when factors contributing to item-specific biases during encoding and retrieval are taken into account. The purpose of the present study was to examine the generality of this apparent dissociation by using more sensitive measures of experienced motivation and memory performance. Extrinsic motivation was manipulated through competition instructions, and subjective ratings of intrinsic and extrinsic motivation were obtained before and after study instructions. Participants studied a series of words, and memory performance was assessed by content recall (Experiment 1) and source recall (Experiment 2). Both experiments showed a dissociation between subjective ratings of extrinsic motivation and actual memory performance: competition increased self-rated extrinsic motivation but had no effect on memory performance, including source recall. Contrary to most people's expectations, the findings suggest that extrinsic motivation has minimal effects on memory performance.
NASA Technical Reports Server (NTRS)
Chidester, Thomas R.; Kanki, Barbara G.; Helmreich, Robert L.
1989-01-01
The crew-factors research program at NASA Ames has developed a methodology for studying the impact of a variety of variables on the effectiveness of crews flying realistic but high workload simulated trips. The validity of investigations using the methodology is enhanced by careful design of full-mission scenarios, performance assessment using converging sources of data, and recruitment of representative subjects. Recently, portions of this methodology have been adapted for use in assessing the effectiveness of crew coordination among participants in line-oriented flight training.
2011-02-11
... on TF39 engines, which are typically used for C-5 Galaxy aircraft; T56 engines, which are typically used for C-130 aircraft; and fuel accessories ... conducting separate cost-benefit analyses for the TF39 and T56 engine maintenance work. Under Air Force guidance for depot-level source-of-repair selection ... the Air Force has not developed specific risk mitigation plans for the TF39 or T56 engines because it is still assessing how the work will be performed.
Vila, Javier; Bowman, Joseph D.; Richardson, Lesley; Kincl, Laurel; Conover, Dave L.; McLean, Dave; Mann, Simon; Vecchia, Paolo; van Tongeren, Martie; Cardis, Elisabeth
2016-01-01
Introduction: To date, occupational exposure assessment of electromagnetic fields (EMF) has relied on occupation-based measurements and exposure estimates. However, misclassification due to between-worker variability remains an unsolved challenge. A source-based approach, supported by detailed subject data on determinants of exposure, may allow for a more individualized exposure assessment. Detailed information on the use of occupational sources of exposure to EMF was collected as part of the INTERPHONE-INTEROCC study. To support a source-based exposure assessment effort within this study, this work aimed to construct a measurement database for the occupational sources of EMF exposure identified, assembling available measurements from the scientific literature. Methods: First, a comprehensive literature search was performed for published and unpublished documents containing exposure measurements for the EMF sources identified, a priori as well as from answers of study subjects. Then, the measurements identified were assessed for quality and relevance to the study objectives. Finally, the measurements selected and complementary information were compiled into an Occupational Exposure Measurement Database (OEMD). Results: Currently, the OEMD contains 1624 sets of measurements (>3000 entries) for 285 sources of EMF exposure, organized by frequency band (0 Hz to 300 GHz) and dosimetry type. Ninety-five documents were selected from the literature (almost 35% of them are unpublished technical reports), containing measurements which were considered informative and valid for our purpose. Measurement data and complementary information collected from these documents came from 16 different countries and cover the time period between 1974 and 2013. Conclusion: We have constructed a database with measurements and complementary information for the most common sources of exposure to EMF in the workplace, based on the responses to the INTERPHONE-INTEROCC study questionnaire. 
This database covers the entire EMF frequency range and represents the most comprehensive resource of information on occupational EMF exposure. It is available at www.crealradiation.com/index.php/en/databases. PMID:26493616
PFLOTRAN-RepoTREND Source Term Comparison Summary.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frederick, Jennifer M.
Code inter-comparison studies are useful exercises to verify and benchmark independently developed software to ensure proper function, especially when the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment. This summary describes the results of the first portion of the code inter-comparison between PFLOTRAN and RepoTREND, which compares the radionuclide source term used in a typical performance assessment.
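A code inter-comparison needs a quantitative agreement metric on the shared output, here the radionuclide source term. A minimal sketch of one plausible metric, the largest pointwise relative difference between two release-rate series on a common time grid; the summary does not state the actual metric used, so this is an assumption:

```python
def max_relative_difference(series_a, series_b, floor=1e-30):
    """Largest pointwise relative difference between two source-term
    time series (e.g. radionuclide release rates on a shared time grid).
    `floor` guards against division by zero when both values vanish."""
    return max(abs(a - b) / max(abs(a), abs(b), floor)
               for a, b in zip(series_a, series_b))

# Two hypothetical release-rate curves agreeing within about 9%.
print(round(max_relative_difference([1.0, 2.0, 4.0], [1.0, 2.2, 4.0]), 3))  # 0.091
```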
KSC VAB Aeroacoustic Hazard Assessment
2010-07-01
... a Mobile Launch Platform (MLP). Several platform levels are used to perform processing operations and will also be reused and/or modified. Each HB is ... I/V Flight Vehicles and the VAB ... The shell is made up of hundreds of aluminum “punch-out” panels, which are designed to fail above approximately 100 ... into several slices, which are converted to monopole sources (Figure 3: Plume conversion into acoustic sources [2]).
Performance Assessment of Network Intrusion-Alert Prediction
2012-09-01
... the threats. In this thesis, we use Snort to generate the intrusion detection alerts. Snort is an open source network intrusion ... standard for IPS (Snort, 2012). We chose Snort because it is an open source product that is free to download and can be deployed cross-platform. ... Learning & prediction in relational time series: A survey. 21st Behavior Representation in Modeling & Simulation (BRIMS) Conference, 2012, 93-100. Tan ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peruzzo, S., E-mail: simone.peruzzo@igi.cnr.it; Cervaro, V.; Dalla Palma, M.
2016-02-15
This paper presents the results of numerical simulations and experimental tests carried out to assess the feasibility and suitability of graphite castellated tiles as the beam-facing component in the diagnostic calorimeter of the negative ion source SPIDER (Source for Production of Ions of Deuterium Extracted from Radio frequency plasma). The results indicate that this concept could be a reliable, although lower-performing, alternative to the present design based on carbon fiber composite tiles, as it provides thermal measurements on the required spatial scale.
NASA Astrophysics Data System (ADS)
Peruzzo, S.; Cervaro, V.; Dalla Palma, M.; Delogu, R.; De Muri, M.; Fasolo, D.; Franchin, L.; Pasqualotto, R.; Pimazzoni, A.; Rizzolo, A.; Tollin, M.; Zampieri, L.; Serianni, G.
2016-02-01
This paper presents the results of numerical simulations and experimental tests carried out to assess the feasibility and suitability of graphite castellated tiles as the beam-facing component in the diagnostic calorimeter of the negative ion source SPIDER (Source for Production of Ions of Deuterium Extracted from Radio frequency plasma). The results indicate that this concept could be a reliable, although lower-performing, alternative to the present design based on carbon fiber composite tiles, as it provides thermal measurements on the required spatial scale.
NASA Astrophysics Data System (ADS)
Ángel López Comino, José; Cesca, Simone; Kriegerowski, Marius; Heimann, Sebastian; Dahm, Torsten; Mirek, Janusz; Lasocky, Stanislaw
2017-04-01
A prior analysis of the monitoring performance of a dedicated seismic network is always useful to determine its capability of detecting, locating and characterizing target seismicity. This work focuses on a hydrofracking experiment in Poland, which is monitored in the framework of the SHEER (SHale gas Exploration and Exploitation induced Risks) EU project. The seismic installation is located near Wysin (Poland), in the central-western part of the Peribaltic synclise at Pomerania. The network setup includes a distributed network of six broadband stations, three shallow borehole stations and three small-scale arrays. We assess the monitoring performance prior to operations, using synthetic seismograms. Realistic full waveforms are generated and combined with real noise recorded before fracking operations, to produce either event-based or continuous synthetic waveforms. Background seismicity is modelled by double couple (DC) focal mechanisms. Non-DC sources resemble induced tensile fractures opening in the direction of the minimal compressive stress and closing in the same direction after the injection. Microseismic sources are combined with a realistic crustal model and distributions of hypocenters, magnitudes and source durations. The network detection performance is then assessed in terms of magnitude of completeness (Mc) through two different techniques: i) an amplitude threshold approach, taking into account a station-dependent noise level and different values of signal-to-noise ratio (SNR), and ii) the application of an automatic detection algorithm to the continuous synthetic dataset. In the first case, we compare the maximal amplitude of noise-free synthetic waveforms with the different noise levels. Imposing simultaneous detection at, e.g., 4 stations for a robust detection, the Mc is assessed and can be adjusted through empirical relationships for different SNR values. We find that different source mechanisms have different detection thresholds.
The background seismicity (DC sources) is more detectable than induced earthquakes (tensile crack mechanisms). Assuming an SNR of 2, we estimate an Mc of 0.55 around the fracking wells, with an increase of 0.05 during daytime hours. The Mc can be decreased to 0.45 around the fracking region by taking advantage of the array installations. The second approach applies a full waveform detection and location algorithm based on the stacking of smooth characteristic functions and the identification of high coherence in the signals recorded at different stations. In this case the detection rate can be increased at the cost of also increasing false detections, with an acceptable compromise found for an Mc of 0.1.
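The amplitude-threshold technique can be sketched directly from the description above: an event counts as detected when its noise-free synthetic amplitude exceeds SNR times the station noise level at some minimum number of stations. Station names, amplitudes, and the Mc search rule below are illustrative assumptions, not values from the study:

```python
def detected(event_amps, noise_levels, snr=2.0, min_stations=4):
    """Amplitude-threshold detection rule: the event counts as detected
    when its noise-free synthetic amplitude exceeds snr * noise at
    at least `min_stations` stations (simultaneous-detection criterion)."""
    hits = sum(amp > snr * noise_levels[st] for st, amp in event_amps.items())
    return hits >= min_stations

def magnitude_of_completeness(events):
    """Smallest simulated magnitude above which no event is missed.
    `events` is a list of (magnitude, was_detected) pairs; real Mc
    estimation also applies empirical SNR corrections, per the abstract."""
    missed = [m for m, ok in events if not ok]
    candidates = [m for m, ok in events if ok and all(x < m for x in missed)]
    return min(candidates) if candidates else None

# Hypothetical event at five stations with unit noise: four stations
# exceed twice the noise level, so the event is detected.
amps = {"ST1": 5.0, "ST2": 3.0, "ST3": 4.0, "ST4": 2.5, "ST5": 0.5}
noise = {"ST1": 1.0, "ST2": 1.0, "ST3": 1.0, "ST4": 1.0, "ST5": 1.0}
print(detected(amps, noise))  # True
print(magnitude_of_completeness([(0.3, False), (0.5, True), (0.6, True)]))  # 0.5
```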
Mukerjee, Shaibal; Norris, Gary A; Smith, Luther A; Noble, Christopher A; Neas, Lucas M; Ozkaynak, A Halûk; Gonzales, Melissa
2004-04-15
The relationship between sources of volatile organic compounds (VOCs) and particle number, both measured continuously, was evaluated at a Photochemical Assessment Monitoring Station Network (PAMS) site located near the U.S.-Mexico border in central El Paso, TX. VOC sources were investigated using the multivariate receptor model UNMIX and the effective variance least squares receptor model known as Chemical Mass Balance (CMB, Version 8.0). As expected from PAMS measurements, overall findings from data screening as well as both receptor models confirmed that mobile sources were the major source of VOCs. Comparison of hourly source contribution estimates (SCEs) from the two receptor models revealed significant differences in motor vehicle exhaust and evaporative gasoline contributions. However, the motor vehicle exhaust contributions were highly correlated with each other. Motor vehicle exhaust was also correlated with the ultrafine and accumulation mode particle counts, which suggests that motor vehicle exhaust is a source of these particles at the measurement site. Wind sector analyses were performed using the SCE and pollutant data to assess the source locations of VOCs, particle count, and criteria pollutants. Results from this study have application to source apportionment studies and mobile source emission control strategies that are ongoing in this airshed.
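The CMB receptor model expresses each measured species concentration as a linear combination of fixed source profiles and solves for the source contributions. A minimal unweighted least-squares sketch for two sources (CMB 8.0 proper uses effective-variance weighting; the profile and ambient values are hypothetical):

```python
def cmb_two_sources(profiles, ambient):
    """Unweighted least-squares chemical mass balance for two sources:
    solve ambient_i ~ f1*a_i1 + f2*a_i2 for contributions (f1, f2) via
    the 2x2 normal equations. `profiles` lists (a_i1, a_i2) species
    abundances for the two sources."""
    s11 = sum(a * a for a, b in profiles)
    s22 = sum(b * b for a, b in profiles)
    s12 = sum(a * b for a, b in profiles)
    r1 = sum(a * c for (a, b), c in zip(profiles, ambient))
    r2 = sum(b * c for (a, b), c in zip(profiles, ambient))
    det = s11 * s22 - s12 * s12
    return (r1 * s22 - r2 * s12) / det, (r2 * s11 - r1 * s12) / det

# Hypothetical profiles for three species; ambient = 2*source1 + 3*source2,
# so the solver recovers contributions of exactly 2 and 3.
profiles = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
ambient = [2.0, 3.0, 5.0]
print(cmb_two_sources(profiles, ambient))  # (2.0, 3.0)
```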
ERIC Educational Resources Information Center
Liu, Xiufeng; Ruiz, Miguel E.
2008-01-01
This article reports a study on using data mining to predict K-12 students' competence levels on test items related to energy. Data sources are the 1995 Third International Mathematics and Science Study (TIMSS), 1999 TIMSS-Repeat, 2003 Trend in International Mathematics and Science Study (TIMSS), and the National Assessment of Educational…
NASA Astrophysics Data System (ADS)
Belis, Claudio A.; Pernigotti, Denise; Pirovano, Guido
2017-04-01
Source apportionment (SA) is the identification of ambient air pollution sources and the quantification of their contributions to pollution levels. This task can be accomplished using two different approaches: chemical transport models and receptor models. Receptor models are derived from measurements and are therefore considered a reference for primary-source urban background levels. Chemical transport models provide better estimates of secondary (inorganic) pollutants and can deliver gridded results with high time resolution. Assessing the performance of SA model results is essential to guarantee reliable information on source contributions for reporting to the Commission and for developing pollution abatement strategies. This is the first intercomparison ever designed to test both receptor-oriented models (receptor models) and chemical transport models (source-oriented models) using a comprehensive method based on model quality indicators and pre-established criteria. The target pollutant of this exercise, organised in the frame of FAIRMODE WG 3, is PM10. Both receptor models and chemical transport models perform well when evaluated against their respective references. Both types of models estimate yearly source contributions quite satisfactorily, while estimation of source contributions at the daily level (time series) is more critical. Chemical transport models showed a tendency to underestimate the contributions of some individual sources when compared to receptor models. For receptor models the most critical source category is industry, probably because of the variety of individual sources with different characteristics that belong to this category.
Dust is the most problematic source for chemical transport models, likely because of the poor information about this kind of source in the emission inventories, particularly concerning road dust re-suspension, and the consequent lack of detail about the chemical components of this source used in the models. Sensitivity tests show that chemical transport models perform better when resolving a detailed set of sources (14) than when using a simplified one (only 8). It was also observed that enhanced vertical profiling can improve the estimation of specific sources, such as industry, under complex meteorological conditions, and that insufficient spatial resolution in urban areas can limit the models' ability to estimate the contribution of diffuse primary sources (e.g. traffic). Both families of models identify traffic and biomass burning as the first and second largest contributors, respectively, to elemental carbon. The results of this study demonstrate that the source apportionment assessment methodology developed by the JRC is applicable to any kind of SA model. The same methodology is implemented in the on-line DeltaSA tool to support source apportionment model evaluation (http://source-apportionment.jrc.ec.europa.eu/).
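A model quality indicator of the kind used in such intercomparisons can be sketched as a standardised difference between a candidate source contribution estimate and the reference value. The form below and the |z| ≤ 2 acceptance limit are illustrative assumptions, not necessarily the exact DeltaSA criterion.

```python
import numpy as np

def z_scores(candidate, reference, u_candidate, u_reference):
    """Standardised difference between a model's source contribution
    estimate and the reference value, using the combined uncertainty
    of both (all inputs may be arrays over sources or days)."""
    return (np.asarray(candidate, dtype=float) - reference) / np.sqrt(
        np.asarray(u_candidate, dtype=float) ** 2 + np.asarray(u_reference, dtype=float) ** 2
    )

def passes_criterion(z, limit=2.0):
    """Acceptance test: the standardised difference lies within the limit."""
    return np.abs(z) <= limit
```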
Source-term development for a contaminant plume for use by multimedia risk assessment models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whelan, Gene; McDonald, John P.; Taira, Randal Y.
1999-12-01
Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world, Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool.
Wang, Dongsheng; Xing, Linan; Xie, Jiankun; Chow, Christopher W K; Xu, Zhizhen; Zhao, Yanmei; Drikas, Mary
2010-09-01
China has a very complex water supply system which relies on many rivers and lakes. As population and economic development increase, water quality is greatly impacted by anthropogenic processes. This seriously affects the character of the dissolved organic matter (DOM) and imposes operational challenges on water treatment facilities in terms of process optimization. The aim of this investigation was to compare selected drinking water sources (raw) with different DOM character, and the respective treated waters after coagulation, using simple organic characterization techniques to obtain a better understanding of the impact of source water quality on water treatment. Results from the analyses of selected water samples showed that the dissolved organic carbon (DOC) of polluted waters is generally higher than that of un-polluted waters, but the specific UV absorbance value shows the opposite trend. After resolving the high performance size exclusion chromatography (HPSEC) peak components of the source waters using peak fitting, the twelve waters studied can be divided into two main groups (micro-polluted and un-polluted) using cluster analysis. The DOM removal efficiency (treatability) of these waters was compared using four coagulants. For water sources allocated to the un-polluted group, traditional coagulants (Al2(SO4)3 and FeCl3) achieved better removal. High performance polyaluminum chloride, a new type of composite coagulant, performed very well and more efficiently for polluted waters. After peak fitting the HPSEC chromatogram of each of the treated waters, the average removal efficiency of the profiles can be calculated, and this corresponds well with DOC and UV removal, providing a convenient tool to assess coagulation removal and coagulant selection.
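The peak-fitting step can be sketched as follows: if the peak positions and widths are held fixed (e.g. read off the chromatogram), the Gaussian amplitudes enter linearly and fall out of an ordinary least-squares solve. The component positions and widths below are hypothetical, not those of the paper.

```python
import numpy as np

def gaussian(x, a, mu, sigma):
    """Single Gaussian peak with amplitude a, centre mu, width sigma."""
    return a * np.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

def fit_peak_areas(x, y, centers, widths):
    """Solve for the amplitudes of fixed-position Gaussian peaks in a
    chromatogram by linear least squares, then convert amplitudes to
    peak areas (area of a Gaussian = a * sigma * sqrt(2 pi))."""
    B = np.column_stack([gaussian(x, 1.0, m, s) for m, s in zip(centers, widths)])
    amps, *_ = np.linalg.lstsq(B, y, rcond=None)
    areas = amps * np.asarray(widths) * np.sqrt(2 * np.pi)
    return amps, areas
```

Comparing fitted peak areas before and after coagulation then gives a per-component removal efficiency, which is the quantity the abstract correlates with DOC and UV removal.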
Semaphore network encryption report
NASA Astrophysics Data System (ADS)
Johnson, Karen L.
1994-03-01
This paper documents the results of a preliminary assessment performed on the commercial off-the-shelf (COTS) Semaphore Communications Corporation (SCC) Network Security System (NSS). The Semaphore NSS is a family of products designed to address important network security concerns, such as network source address authentication and data privacy. The assessment was performed in the INFOSEC Core Integration Laboratory, and its scope was product usability focusing on interoperability and system performance in an existing operational network. Included in this paper are preliminary findings. Fundamental features and functionality of the Semaphore NSS are identified, followed by details of the assessment, including test descriptions and results. A summary of test results and future plans are also included. These findings will be useful to those investigating the use of commercially available solutions to network authentication and data privacy.
Perceived sources of change in trainees' self-efficacy beliefs.
Lent, Robert W; Cinamon, Rachel Gali; Bryan, Nicole A; Jezzi, Matthew M; Martin, Helena M; Lim, Robert
2009-09-01
Thought-listing procedures were used to examine the perceived incidence, size, direction, and bases of change in the session-level self-efficacy of therapists in training. Ninety-eight Master's-level trainees completed a cognitive assessment task immediately after each session with a client in their first practicum. Participants typically reported modest-sized, positive changes in their therapeutic self-efficacy at each session. Seven perceived sources of change in self-efficacy were identified. Some of these sources (e.g., trainees' performance evaluations, affective reactions) were consistent with general self-efficacy theory; others reflected the interpersonal performance context of therapy (e.g., perceptions of the therapeutic relationship and client behavior). Implications of the findings for training and future research on therapist development are considered.
Stress testing hydrologic models using bottom-up climate change assessment
NASA Astrophysics Data System (ADS)
Stephens, C.; Johnson, F.; Marshall, L. A.
2017-12-01
Bottom-up climate change assessment is a promising approach for understanding the vulnerability of a system to potential future changes. The technique has been utilised successfully in risk-based assessments of future flood severity and infrastructure vulnerability. We find that it is also an ideal tool for assessing hydrologic model performance in a changing climate. In this study, we applied bottom-up climate change to compare the performance of two different hydrologic models (an event-based and a continuous model) under increasingly severe climate change scenarios. This allowed us to diagnose likely sources of future prediction error in the two models. The climate change scenarios were based on projections for southern Australia, which indicate drier average conditions with increased extreme rainfall intensities. We found that the key weakness in using the event-based model to simulate drier future scenarios was the model's inability to dynamically account for changing antecedent conditions. This led to increased variability in model performance relative to the continuous model, which automatically accounts for the wetness of a catchment through dynamic simulation of water storages. When considering more intense future rainfall events, representation of antecedent conditions became less important than assumptions around (non)linearity in catchment response. The linear continuous model we applied may underestimate flood risk in a future climate with greater extreme rainfall intensity. In contrast with the recommendations of previous studies, this indicates that continuous simulation is not necessarily the key to robust flood modelling under climate change. By applying bottom-up climate change assessment, we were able to understand systematic changes in relative model performance under changing conditions and deduce likely sources of prediction error in the two models.
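The bottom-up stress test amounts to re-running each model over a grid of perturbed forcings and tracking a performance score. The sketch below assumes the Nash-Sutcliffe efficiency as the score and simple rainfall scaling as the perturbation; the study's actual scenarios also alter extreme-rainfall intensity, which a single multiplier does not capture.

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model
    is no better than predicting the observed mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2
    )

def stress_test(model, forcing, observed, rain_factors):
    """Re-run the model under scaled rainfall forcing and record how the
    performance score changes across increasingly severe scenarios."""
    return {f: nse(observed, model(forcing * f)) for f in rain_factors}
```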
Deal, Shanley B; Stefanidis, Dimitrios; Brunt, L Michael; Alseidi, Adnan
2017-05-01
We sought to determine the feasibility of developing a multimedia educational tutorial to teach learners to assess the critical view of safety, using input from expert surgeons, non-surgeons and crowd-sourcing. We intended to develop a tutorial that would teach learners how to identify the basic anatomy and physiology of the gallbladder, identify the components of the critical view of safety criteria, and understand its significance for performing a safe gallbladder removal. Using rounds of assessment with experts, laypersons and crowd-workers, we developed an educational video whose comprehension improved after each round of revision. We demonstrate that the development of a multimedia educational tool to educate learners of various backgrounds is feasible using an iterative review process that incorporates the input of experts and crowd-sourcing. When planning the development of an educational tutorial, a step-wise approach as described herein should be considered.
Fan Noise Prediction with Applications to Aircraft System Noise Assessment
NASA Technical Reports Server (NTRS)
Nark, Douglas M.; Envia, Edmane; Burley, Casey L.
2009-01-01
This paper describes an assessment of current fan noise prediction tools by comparing measured and predicted sideline acoustic levels from a benchmark fan noise wind tunnel test. Specifically, an empirical method and newly developed coupled computational approach are utilized to predict aft fan noise for a benchmark test configuration. Comparisons with sideline noise measurements are performed to assess the relative merits of the two approaches. The study identifies issues entailed in coupling the source and propagation codes, as well as provides insight into the capabilities of the tools in predicting the fan noise source and subsequent propagation and radiation. In contrast to the empirical method, the new coupled computational approach provides the ability to investigate acoustic near-field effects. The potential benefits/costs of these new methods are also compared with the existing capabilities in a current aircraft noise system prediction tool. The knowledge gained in this work provides a basis for improved fan source specification in overall aircraft system noise studies.
Conceptual Trade Study of General Purpose Heat Source Powered Stirling Converter Configurations
NASA Technical Reports Server (NTRS)
Turpin, J. B.
2007-01-01
This Technical Manual describes a parametric study of general purpose heat source (GPHS) powered Stirling converter configurations. This study was performed in support of MSFC's efforts to establish the capability to perform non-nuclear system-level testing and integration of radioisotope power systems. Six different GPHS stack configurations at a total of three different power levels (80, 250, and 500 W(sub e)) were analyzed. The thermal profiles of the integrated GPHS modules (for each configuration) were calculated to determine maximum temperatures for comparison to allowable material limits. Temperature profiles for off-nominal power conditions were also assessed in order to better understand how power demands from the Stirling engine impact the performance of a given configuration.
Performance modeling for A-SCOPE: a space-borne lidar measuring atmospheric CO2
NASA Astrophysics Data System (ADS)
Caron, Jérôme; Durand, Yannig; Bezy, Jean-Loup; Meynart, Roland
2009-09-01
A-SCOPE (Advanced Space Carbon and Climate Observation of Planet Earth) has been one of the six candidates for the third cycle of the Earth Explorer Core missions selected by the European Space Agency (ESA) for assessment studies. Earth Explorer missions focus on the science and research aspects of ESA's Living Planet Programme. The A-SCOPE mission aims to observe atmospheric CO2 for a better understanding of the carbon cycle. Knowledge of the spatial distribution of CO2 sources and sinks with unprecedented accuracy will provide urgently needed information about the global carbon cycle. The A-SCOPE mission embodies a new approach to observing the Earth from space based on an IPDA (Integrated Path Differential Absorption) lidar. Built on the well-known principle of differential measurement, the IPDA lidar relies on measuring the laser echoes reflected by hard targets such as the ground or the top of the vegetation. Such a time-gated technique is a promising way to overcome the sources of systematic error inherent to passive missions. To be fully exploited, however, it translates into stringent instrument requirements and requires a dedicated performance assessment. In this paper, the A-SCOPE instrument concept is first presented, with the aim of summarizing some important outcomes of the industrial assessment studies. After a discussion of the mission requirements and measurement principles, an overview is given of the instrument architecture. The instrument performance is then reported, together with a detailed discussion of sources of systematic error, which pose the strongest technical challenges.
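The core IPDA measurement reduces to a two-way differential optical depth computed from the on-line and off-line echo powers, each normalised by its transmitted pulse energy. A minimal sketch (the argument names are ours, not the mission's):

```python
import numpy as np

def daod(p_on, p_off, e_on, e_off):
    """Differential absorption optical depth between the on-line and
    off-line wavelengths, from hard-target echo powers p normalised by
    transmitted pulse energies e; the factor 1/2 accounts for the
    two-way path to the target and back."""
    return 0.5 * np.log((p_off / e_off) / (p_on / e_on))
```

The column-averaged CO2 mixing ratio then follows by dividing the DAOD by an integrated weighting function of pressure, temperature, and absorption cross-section along the path, a step omitted here.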
Rockfall hazard and risk assessments along roads at a regional scale: example in Swiss Alps
NASA Astrophysics Data System (ADS)
Michoud, C.; Derron, M.-H.; Horton, P.; Jaboyedoff, M.; Baillifard, F.-J.; Loye, A.; Nicolet, P.; Pedrazzini, A.; Queyrel, A.
2012-03-01
Unlike fragmental rockfall runout assessments, there are only a few robust methods to quantify rock-mass-failure susceptibility at regional scale. A detailed slope angle analysis of recent Digital Elevation Models (DEM) can be used to detect potential rockfall source areas, thanks to the Slope Angle Distribution procedure. However, this method does not provide any information on block-release frequencies inside the identified areas. The present paper supplements the Slope Angle Distribution of the cliffs unit with its normalized cumulative distribution function. This improvement amounts to a quantitative weighting of slope angles, introducing rock-mass-failure susceptibilities inside the rockfall source areas previously detected. Rockfall runout assessment is then performed using the GIS- and process-based software Flow-R, providing relative frequencies for runout. Thus, taking both susceptibility results into consideration, this approach can be used to establish, after calibration, hazard and risk maps at regional scale. As an example, a risk analysis of vehicle traffic exposed to rockfalls is performed along the main roads of the Swiss alpine valley of Bagnes.
DOT National Transportation Integrated Search
2012-07-01
This project has developed and implemented a software environment to utilize data collected by Traffic Management Centers (TMC) in Florida, in combination with data from other sources to support various applications. The environment allows capturing ...
Multiscale Spatial Modeling of Human Exposure from Local Sources to Global Intake.
Wannaz, Cedric; Fantke, Peter; Jolliet, Olivier
2018-01-16
Exposure studies, used in human health risk and impact assessments of chemicals, are largely performed locally or regionally. It is usually not known how global impacts resulting from exposure to point source emissions compare to local impacts. To address this problem, we introduce Pangea, an innovative multiscale, spatial multimedia fate and exposure assessment model. We study local to global population exposure associated with emissions from 126 point sources matching locations of waste-to-energy plants across France. Results for three chemicals with distinct physicochemical properties are expressed as the evolution of the population intake fraction through inhalation and ingestion as a function of the distance from sources. For substances with atmospheric half-lives longer than a week, less than 20% of the global population intake through inhalation (median of 126 emission scenarios) can occur within a 100 km radius from the source. This suggests that, by neglecting distant low-level exposure, local assessments might only account for fractions of global cumulative intakes. We also study ∼10 000 emission locations covering France more densely to determine per chemical and exposure route which locations minimize global intakes. Maps of global intake fractions associated with each emission location show clear patterns associated with population and agriculture production densities.
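The distance-resolved intake fraction reported here can be illustrated by sorting receptor cells by distance from the source and accumulating their population intake as a fraction of the emitted mass. All inputs below are hypothetical; Pangea itself resolves intake over a multiscale spatial grid and multiple exposure routes.

```python
import numpy as np

def cumulative_intake_fraction(distances_km, intakes_kg, emission_kg):
    """Sort receptor cells by distance from the emission point and
    accumulate their population intake, expressed as a fraction of the
    emitted mass (the intake fraction)."""
    order = np.argsort(distances_km)
    d = np.asarray(distances_km, dtype=float)[order]
    cum = np.cumsum(np.asarray(intakes_kg, dtype=float)[order]) / emission_kg
    return d, cum

def fraction_within(d, cum, radius_km):
    """Share of the global intake occurring within a given radius."""
    idx = np.searchsorted(d, radius_km, side="right") - 1
    return float(cum[idx]) if idx >= 0 else 0.0
```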
The mariner 9 power subsystem design and flight performance
NASA Technical Reports Server (NTRS)
Josephs, R. H.
1973-01-01
The design and flight performance of the Mariner Mars 1971 power subsystem are presented. Mariner 9 was the first spacecraft to orbit another planet, and some of the power management techniques employed to support an orbital mission far from earth with marginal sunlight for its photovoltaic-battery power source are described. The performance of its nickel-cadmium battery during repetitive sun occultation phases of the mission, and the results of unique tests in flight to assess the performance capability of its solar array are reported.
Assessing embryo development using swept source optical coherence tomography
NASA Astrophysics Data System (ADS)
Caujolle, S.; Cernat, R.; Silvestri, G.; Marques, M. J.; Bradu, A.; Feuchter, T.; Robinson, G.; Griffin, D.; Podoleanu, A.
2018-03-01
A detailed assessment of embryo development would assist biologists in selecting the most suitable embryos for transfer, leading to higher pregnancy rates. Currently, only low-resolution microscopy is employed to perform this assessment. Although this method delivers some information on the embryo surface morphology, it reveals no specific details of the inner structure. Using a Master-Slave swept-source optical coherence tomography (SS-OCT) system, images of bovine embryos from day 7 after fertilization were collected from different depths. The dynamic changes inside the embryos were examined in detail and in real time at several depths. To prove our ability to characterize the morphology, a single embryo was imaged over 26 hours. The embryo was deprived of its life-support environment, leading to its death. Over this period, clear morphological changes were observed.
Franco-Pedroso, Javier; Ramos, Daniel; Gonzalez-Rodriguez, Joaquin
2016-01-01
In forensic science, trace evidence found at a crime scene and on a suspect has to be evaluated from the measurements performed on it, usually in the form of multivariate data (for example, several chemical compounds or physical characteristics). In order to assess the strength of that evidence, the likelihood ratio framework is being increasingly adopted. Several methods have been derived to obtain likelihood ratios directly from univariate or multivariate data by modelling both the variation appearing between observations (or features) coming from the same source (within-source variation) and that appearing between observations coming from different sources (between-source variation). In the widely used multivariate kernel likelihood ratio, the within-source distribution is assumed to be normally distributed and constant among different sources, and the between-source variation is modelled through a kernel density function (KDF). In order to better fit the observed distribution of the between-source variation, this paper presents a different approach in which a Gaussian mixture model (GMM) is used instead of a KDF. As will be shown, this approach provides better-calibrated likelihood ratios as measured by the log-likelihood-ratio cost (Cllr) in experiments performed on freely available forensic datasets involving different types of trace evidence: inks, glass fragments and car paints. PMID:26901680
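A univariate sketch of the GMM-based likelihood ratio and the Cllr metric follows. The real method is multivariate and fits the between-source mixture to the sample of source means by EM; here the mixture parameters are assumed already fitted, and the structure of the ratio is what is being illustrated.

```python
import numpy as np

def normal_pdf(y, mu, var):
    """Density of a 1-D normal distribution."""
    return np.exp(-((y - mu) ** 2) / (2 * var)) / np.sqrt(2 * np.pi * var)

def gmm_pdf(y, weights, means, variances):
    """Density of a 1-D Gaussian mixture (parameters assumed pre-fitted,
    e.g. by EM, to the between-source sample of source means)."""
    return sum(w * normal_pdf(y, m, v) for w, m, v in zip(weights, means, variances))

def likelihood_ratio(y, suspect_mean, within_var, gmm_params):
    """LR = within-source density at the trace value / between-source
    mixture density: how much more probable the trace is if it came from
    the suspect's source than from a random source in the population."""
    return normal_pdf(y, suspect_mean, within_var) / gmm_pdf(y, *gmm_params)

def cllr(lrs_same, lrs_diff):
    """Log-likelihood-ratio cost: penalises miscalibrated LRs from
    same-source and different-source comparisons."""
    return 0.5 * (np.mean(np.log2(1 + 1 / np.asarray(lrs_same)))
                  + np.mean(np.log2(1 + np.asarray(lrs_diff))))
```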
Performance of a Fuel-Cell-Powered, Small Electric Airplane Assessed
NASA Technical Reports Server (NTRS)
Berton, Jeffrey J.
2004-01-01
Rapidly emerging fuel-cell-power technologies may be used to launch a new revolution of electric propulsion systems for light aircraft. Future small electric airplanes using fuel cell technologies hold the promise of high reliability, low maintenance, low noise, and - with the exception of water vapor - zero emissions. An analytical feasibility and performance assessment was conducted by NASA Glenn Research Center's Airbreathing Systems Analysis Office of a fuel-cell-powered, propeller-driven, small electric airplane based on a model of the MCR-01 two-place kitplane (Dyn'Aero, Darois, France). This assessment was conducted in parallel with an ongoing effort by the Advanced Technology Products Corporation and the Foundation for Advancing Science and Technology Education. Their project - partially funded by a NASA grant - is to design, build, and fly the first manned, continuously propelled, nongliding electric airplane. In our study, an analytical performance model of a proton exchange membrane (PEM) fuel cell propulsion system was developed and applied to a notional, two-place light airplane modeled after the MCR-01 kitplane. The PEM fuel cell stack was fed pure hydrogen fuel and humidified ambient air via a small automotive centrifugal supercharger. The fuel cell performance models were based on chemical reaction analyses calibrated with published data from the fledgling U.S. automotive fuel cell industry. Electric propeller motors, rated at two shaft power levels in separate assessments, were used to directly drive a two-bladed, variable-pitch propeller. Fuel sources considered were compressed hydrogen gas and cryogenic liquid hydrogen. Both of these fuel sources provided pure, contaminant-free hydrogen for the PEM cells.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faby, Sebastian, E-mail: sebastian.faby@dkfz.de; Kuchenbecker, Stefan; Sawall, Stefan
2015-07-15
Purpose: To study the performance of different dual energy computed tomography (DECT) techniques, which are available today, and future multi energy CT (MECT) employing novel photon counting detectors in an image-based material decomposition task. Methods: The material decomposition performance of different energy-resolved CT acquisition techniques is assessed and compared in a simulation study of virtual non-contrast imaging and iodine quantification. The material-specific images are obtained via a statistically optimal image-based material decomposition. A projection-based maximum likelihood approach was used for comparison with the authors’ image-based method. The different dedicated dual energy CT techniques are simulated employing realistic noise models and x-ray spectra. The authors compare dual source DECT with fast kV switching DECT and the dual layer sandwich detector DECT approach. Subsequent scanning and a subtraction method are studied as well. Further, the authors benchmark future MECT with novel photon counting detectors in a dedicated DECT application against the performance of today’s DECT using a realistic model. Additionally, possible dual source concepts employing photon counting detectors are studied. Results: The DECT comparison study shows that dual source DECT has the best performance, followed by the fast kV switching technique and the sandwich detector approach. Comparing DECT with future MECT, the authors found noticeable material image quality improvements for an ideal photon counting detector; however, a realistic detector model with multiple energy bins predicts a performance on the level of dual source DECT at 100 kV/Sn 140 kV. Employing photon counting detectors in dual source concepts can improve the performance again above the level of a single realistic photon counting detector and also above the level of dual source DECT.
Conclusions: Substantial differences in the performance of today’s DECT approaches were found for the application of virtual non-contrast and iodine imaging. Future MECT with realistic photon counting detectors currently can only perform comparably to dual source DECT at 100 kV/Sn 140 kV. Dual source concepts with photon counting detectors could be a solution to this problem, promising a better performance.
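Image-based two-material decomposition, the core of the virtual non-contrast task, reduces per pixel to inverting a small calibration matrix. The sketch below assumes a pre-calibrated 2x2 matrix of material enhancements (the numbers in the test are hypothetical), whereas the paper uses a statistically optimal, noise-weighted decomposition.

```python
import numpy as np

def material_decomposition(img_low, img_high, M):
    """Image-based two-material decomposition: at each pixel, solve
    c = M @ a for the material coefficients a (e.g. water, iodine),
    where c stacks the low- and high-energy pixel values and M holds
    each material's enhancement in each energy image."""
    Minv = np.linalg.inv(M)
    stacked = np.stack([img_low, img_high], axis=-1)  # (..., 2) per-pixel c
    materials = stacked @ Minv.T                      # per-pixel a = Minv @ c
    return materials[..., 0], materials[..., 1]
```

Subtracting the iodine image from the mixed image then yields the virtual non-contrast result; quantifying the iodine image gives the iodine map.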
Groundwater vulnerability and risk mapping using GIS, modeling and a fuzzy logic tool.
Nobre, R C M; Rotunno Filho, O C; Mansur, W J; Nobre, M M M; Cosenza, C A N
2007-12-07
A groundwater vulnerability and risk mapping assessment, based on a source-pathway-receptor approach, is presented for an urban coastal aquifer in northeastern Brazil. A modified version of the DRASTIC methodology was used to map the intrinsic and specific groundwater vulnerability of a 292 km(2) study area. A fuzzy hierarchy methodology was adopted to evaluate the potential contaminant source index, including diffuse and point sources. Numerical modeling was performed for delineation of well capture zones, using MODFLOW and MODPATH. The integration of these elements provided the mechanism to assess groundwater pollution risks and identify areas that must be prioritized in terms of groundwater monitoring and restriction on use. A groundwater quality index based on nitrate and chloride concentrations was calculated, which had a positive correlation with the specific vulnerability index.
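The DRASTIC index itself is a weighted sum of seven parameter ratings; a minimal sketch with the standard weights follows. The modified version used in the paper alters ratings and combines the result with a fuzzy-hierarchy source index, which is not shown here.

```python
def drastic_index(ratings, weights=None):
    """DRASTIC intrinsic vulnerability index: weighted sum of the seven
    parameter ratings (Depth to water, net Recharge, Aquifer media,
    Soil media, Topography, Impact of the vadose zone, hydraulic
    Conductivity), each rating typically on a 1-10 scale."""
    if weights is None:
        weights = [5, 4, 3, 2, 1, 5, 3]  # standard DRASTIC weights
    return sum(r * w for r, w in zip(ratings, weights))
```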
SHIELD: FITGALAXY -- A Software Package for Automatic Aperture Photometry of Extended Sources
NASA Astrophysics Data System (ADS)
Marshall, Melissa
2013-01-01
Determining the parameters of extended sources, such as galaxies, is a common but time-consuming task. Finding a photometric aperture that encompasses the majority of the flux of a source and identifying and excluding contaminating objects is often done by hand - a lengthy and difficult-to-reproduce process. To make extracting information from large data sets both quick and repeatable, I have developed a program called FITGALAXY, written in IDL. This program uses minimal user input to automatically fit an aperture to, and perform aperture and surface photometry on, an extended source. FITGALAXY also automatically traces the outlines of surface brightness thresholds and creates surface brightness profiles, which can then be used to determine the radial properties of a source. Finally, the program performs automatic masking of contaminating sources. Masks and apertures can be applied to multiple images (regardless of the WCS solution or plate scale) in order to accurately measure the same source at different wavelengths. I present the fluxes, as measured by the program, of a selection of galaxies from the Local Volume Legacy Survey. I then compare these results with the fluxes given by Dale et al. (2009) in order to assess the accuracy of FITGALAXY.
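The aperture photometry core, summing unmasked pixels inside an aperture, can be sketched in NumPy. FITGALAXY itself is written in IDL and fits elliptical apertures; this is an illustrative translation of the idea with a circular aperture, not the program's code.

```python
import numpy as np

def aperture_flux(image, cx, cy, radius, mask=None):
    """Sum pixel values inside a circular aperture centred at (cx, cy),
    skipping pixels flagged in an optional contaminant mask
    (True = excluded), as done when masking foreground stars."""
    yy, xx = np.indices(image.shape)
    in_aperture = (xx - cx) ** 2 + (yy - cy) ** 2 <= radius ** 2
    if mask is not None:
        in_aperture &= ~mask
    return float(image[in_aperture].sum())
```

Because the aperture and mask are just boolean arrays, they can be re-applied to co-registered images at other wavelengths, which is how the same source is measured consistently across bands.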
Maillet, David; Rajah, M Natasha
2016-06-01
Recent evidence indicates that young adults frequently exhibit task-unrelated thoughts (TUTs) such as mind-wandering during episodic encoding tasks and that TUTs negatively impact subsequent memory. In the current study, we assessed age-related differences in the frequency and neural correlates of TUTs during a source memory encoding task, as well as age-related differences in the relationship between the neural correlates of TUTs and subsequent source forgetting effects (i.e., source misses). We found no age-related differences in frequency of TUTs during fMRI scanning. Moreover, TUT frequency at encoding was positively correlated with source misses at retrieval across age groups. In both age groups, brain regions including bilateral middle/superior frontal gyri and precuneus were activated to a greater extent during encoding for subsequent source misses versus source hits and during TUTs versus on-task episodes. Overall, our results reveal that, during a source memory encoding task in an fMRI environment, young and older adults exhibit a similar frequency of TUTs and that experiencing TUTs at encoding is associated with decreased retrieval performance. In addition, in both age groups, experiencing TUTs at encoding is associated with increased activation in some of the same regions that exhibit subsequent source forgetting effects.
Implementation of hospital governing boards: views from the field.
McNatt, Zahirah; Thompson, Jennifer W; Mengistu, Abraham; Tatek, Dawit; Linnander, Erika; Ageze, Leulseged; Lawson, Ruth; Berhanu, Negalign; Bradley, Elizabeth H
2014-04-17
Decentralization through the establishment of hospital governing boards has been touted as an effective way to improve the quality and efficiency of hospitals in low-income countries. Although several studies have examined the process of decentralization, few have quantitatively assessed the implementation of hospital governing boards and their impact on hospital performance. Therefore, we sought to describe the functioning of governing boards and to determine the association between governing board functioning and hospital performance. We conducted a cross-sectional study with governing board chairpersons to assess board (1) structure, (2) roles and responsibilities and (3) training and orientation practices. Using bivariate analysis and multivariable regression, we examined the association between governing board functioning and hospital performance. Two hospital performance indicators were used: (1) the percentage of hospital management standards met, measured with the Ethiopian Hospital Reform Implementation Guidelines (EHRIG), and (2) patient experience, measured with the Inpatient and Outpatient Assessment of Healthcare surveys. A total of 92 boards responded to the survey (96% response rate). The average percentage of EHRIG standards met was 58.1% (standard deviation (SD) 21.7 percentage points), and the mean overall patient experience score was 7.2 (SD 2.2). Hospitals meeting more hospital management standards had governing boards that paid members, reviewed performance in several domains quarterly or more frequently, developed new revenue sources, determined services to be outsourced, reviewed patient complaints, and had members with knowledge in business and financial management (all P-values < 0.05). Hospitals with more positive patient experience had governing boards that developed new revenue sources, determined services to be outsourced, and reviewed patient complaints (all P-values < 0.05).
These cross-sectional data suggest that strengthening governing boards to perform essential responsibilities may result in improved hospital performance.
Grieco-Calub, Tina M.; Litovsky, Ruth Y.
2010-01-01
Objectives To measure sound source localization in children who have sequential bilateral cochlear implants (BICIs); to determine if localization accuracy correlates with performance on a right-left discrimination task (i.e., spatial acuity); to determine if there is a measurable bilateral benefit on a sound source identification task (i.e., localization accuracy) by comparing performance under bilateral and unilateral listening conditions; and to determine if sound source localization continues to improve with longer durations of bilateral experience. Design Two groups of children participated in this study: a group of 21 children who received BICIs in sequential procedures (5–14 years old) and a group of 7 typically-developing children with normal acoustic hearing (NH; 5 years old). Testing was conducted in a large sound-treated booth with loudspeakers positioned on a horizontal arc with a radius of 1.2 m. Children participated in two experiments that assessed spatial hearing skills. Spatial hearing acuity was assessed with a discrimination task in which listeners determined if a sound source was presented on the right or left side of center; the smallest angle at which performance on this task was reliably above chance is the minimum audible angle (MAA). Sound localization accuracy was assessed with a sound source identification task in which children identified the perceived position of the sound source from a multi-loudspeaker array (7 or 15 loudspeakers); errors are quantified using the root-mean-square (RMS) error. Results Sound localization accuracy was highly variable among the children with BICIs, with RMS errors ranging from 19°–56°. Performance of the NH group, with RMS errors ranging from 9°–29°, was significantly better. Within the BICI group, RMS errors in 11 of 21 children were smaller in the bilateral than in the unilateral listening condition, indicating bilateral benefit.
There was a significant correlation between spatial acuity and sound localization accuracy (R2=0.68, p<0.01), suggesting that children who achieve small RMS errors tend to have the smallest MAAs. Although there was large intersubject variability, testing of 11 children in the BICI group at two sequential visits revealed a subset of children who show improvement in spatial hearing skills over time. Conclusions A subset of children who use sequential BICIs can acquire sound localization abilities, even after long intervals between activation of hearing in the first- and second-implanted ears. This suggests that children with activation of the second implant later in life may be capable of developing spatial hearing abilities. The large variability in performance among the children with BICIs suggests that maturation of sound localization abilities in children with BICIs may be dependent on various individual subject factors such as age of implantation and chronological age. PMID:20592615
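The RMS error metric reported above has a simple closed form; as a rough sketch (the trial angles below are hypothetical, not the study's measurements):

```python
import math

def rms_error(perceived_deg, actual_deg):
    """Root-mean-square localization error, in degrees, across trials."""
    sq = [(p - a) ** 2 for p, a in zip(perceived_deg, actual_deg)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical responses on a loudspeaker arc spanning -70..+70 degrees
actual = [-70, -50, -30, -10, 0, 10, 30, 50, 70]
perceived = [-50, -60, -10, -30, 10, 0, 50, 30, 50]
print(round(rms_error(perceived, actual), 1))  # → 17.3
```

Smaller values indicate responses clustered tightly around the true source positions.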
ERIC Educational Resources Information Center
Brockmann, Frank
2011-01-01
State testing programs today are more extensive than ever, and their results are required to serve more purposes and high-stakes decisions than one might have imagined. Assessment results are used to hold schools, districts, and states accountable for student performance and to help guide a multitude of important decisions. This report describes…
ERIC Educational Resources Information Center
Jukes, Matthew C. H.; Grigorenko, Elena L.
2010-01-01
Background: The use of cognitive tests is increasing in Africa but little is known about how such tests are affected by the great ethnic and linguistic diversity on the continent. Aim: To assess ethnic and linguistic group differences in cognitive test performance in the West African country of the Gambia and to investigate the sources of these…
Assessment and modification of an ion source grid design in KSTAR neutral beam system.
Lee, Dong Won; Shin, Kyu In; Jin, Hyung Gon; Choi, Bo Guen; Kim, Tae-Seong; Jeong, Seung Ho
2014-02-01
A new 2 MW NB (neutral beam) ion source for supplying 3.5 MW of NB heating for the KSTAR campaign was developed in 2012, and its grid was made from OFHC (oxygen-free high-conductivity) copper with rectangular cooling channels. However, plastic deformation, such as bulging in the plasma grid of the ion source, was found during the overhaul period after the 2012 campaign. A thermal-hydraulic and a thermo-mechanical analysis using the conventional code ANSYS were carried out, and the thermal fatigue life was assessed. It was found that the thermal fatigue life of the OFHC copper grid was about 335 cycles in the case of a 0.165 MW/m² heat flux, too short a fatigue life for use as a KSTAR NB ion source grid. To overcome the limited fatigue life of the current design, the following methods were proposed in the present study: (1) changing the OFHC copper to CuCrZr copper alloy, or (2) adopting a new design with a pure Mo metal grid and CuCrZr tubes. It is confirmed that the proposed methods meet the requirements by performing the same assessment.
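Thermal fatigue life estimates of this kind are commonly based on a strain-life relation; a minimal sketch using the Coffin-Manson law, where both material constants are generic placeholders and not the OFHC copper data used in the paper:

```python
def coffin_manson_cycles(strain_amp, eps_f=0.3, c=-0.6):
    """Cycles to failure N from plastic strain amplitude via
    Coffin-Manson: strain_amp = eps_f * (2N)**c.
    eps_f (fatigue ductility coefficient) and c (ductility exponent)
    are illustrative placeholders, not measured copper properties."""
    return 0.5 * (strain_amp / eps_f) ** (1.0 / c)

# Hypothetical 1% plastic strain amplitude per thermal cycle
print(round(coffin_manson_cycles(0.01)))  # → 145
```

With realistic material constants and the computed thermal strain range, this is the style of calculation that yields a cycle count comparable to the ~335 cycles quoted above.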
NASA Astrophysics Data System (ADS)
Lachat, E.; Landes, T.; Grussenmeyer, P.
2018-05-01
Terrestrial and airborne laser scanning, photogrammetry, and 3D recording techniques more generally are used in a wide range of applications. After recording several individual 3D datasets known in local systems, one of the first crucial processing steps is the registration of these data into a common reference frame. To perform such a 3D transformation, commercial and open source software as well as programs from the academic community are available. Because these solutions lack some transparency in their computations and quality assessment, it was decided to develop the open source algorithm presented in this paper. It is dedicated to the simultaneous registration of multiple point clouds as well as their georeferencing. The idea is to use this algorithm as a starting point for further implementations, involving the possibility of combining 3D data from different sources. Parallel to the presentation of the global registration methodology employed, the aim of this paper is to compare the results achieved this way with the above-mentioned existing solutions. For this purpose, first results obtained with the proposed algorithm for the global registration of ten laser scanning point clouds are presented. An analysis of the quality criteria delivered by two selected software packages used in this study, and a reflection on these criteria, completes the comparison of the obtained results. The final aim of this paper is to validate the current efficiency of the proposed method through these comparisons.
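At the core of most registration pipelines is the least-squares rigid transform between corresponding points; a self-contained sketch of the Kabsch/SVD solution (a generic building block of registration, not the authors' published algorithm; the synthetic data below are illustrative):

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst.
    src, dst: (N, 3) arrays of corresponding points (Kabsch, no scale)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Synthetic check: rotate and translate a random cloud, then recover the pose
rng = np.random.default_rng(0)
src = rng.normal(size=(20, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
dst = src @ R_true.T + np.array([1.0, -2.0, 0.5])
R, t = rigid_register(src, dst)
print(np.allclose(R, R_true), np.allclose(t, [1.0, -2.0, 0.5]))  # → True True
```

Global registration of many clouds typically iterates pairwise solutions like this one inside a joint adjustment.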
Recommended Henry’s Law Constants for Non-Groundwater Pathways Models in GoldSim
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dyer, J.
This memorandum documents the source and numerical value of Henry’s law constants for volatile radionuclides of interest used in the non-groundwater (air and radon) pathways models for the 2018 E-Area Performance Assessment.
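Henry's law constants enter such pathway models as a simple gas-liquid partitioning ratio; a toy illustration (the constant below is a rough radon-like value chosen for the example, not one of the memorandum's recommended constants):

```python
def gas_over_liquid(H_cc, c_liquid):
    """Equilibrium gas-phase concentration from a dimensionless
    Henry's law constant H_cc = C_gas / C_liquid."""
    return H_cc * c_liquid

# Illustrative: H_cc ~ 4 (roughly radon near room temperature),
# water-phase activity concentration 0.5 Bq/L
print(gas_over_liquid(4.0, 0.5))  # → 2.0
```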
Sparsity enables estimation of both subcortical and cortical activity from MEG and EEG
Krishnaswamy, Pavitra; Obregon-Henao, Gabriel; Ahveninen, Jyrki; Khan, Sheraz; Iglesias, Juan Eugenio; Hämäläinen, Matti S.; Purdon, Patrick L.
2017-01-01
Subcortical structures play a critical role in brain function. However, options for assessing electrophysiological activity in these structures are limited. Electromagnetic fields generated by neuronal activity in subcortical structures can be recorded noninvasively, using magnetoencephalography (MEG) and electroencephalography (EEG). However, these subcortical signals are much weaker than those generated by cortical activity. In addition, we show here that it is difficult to resolve subcortical sources because distributed cortical activity can explain the MEG and EEG patterns generated by deep sources. We then demonstrate that if the cortical activity is spatially sparse, both cortical and subcortical sources can be resolved with M/EEG. Building on this insight, we develop a hierarchical sparse inverse solution for M/EEG. We assess the performance of this algorithm on realistic simulations and auditory evoked response data, and show that thalamic and brainstem sources can be correctly estimated in the presence of cortical activity. Our work provides alternative perspectives and tools for characterizing electrophysiological activity in subcortical structures in the human brain. PMID:29138310
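The key idea, that spatial sparsity lets an underdetermined M/EEG inverse problem resolve a few active sources, can be illustrated with a plain L1-regularized inverse solved by ISTA. This is a toy stand-in for the paper's hierarchical method, with a synthetic gain matrix and invented source indices:

```python
import numpy as np

def ista_lasso(G, y, lam=0.05, iters=2000):
    """Sparse inverse estimate: min_x 0.5*||G x - y||^2 + lam*||x||_1,
    solved by iterative soft-thresholding (ISTA)."""
    L = np.linalg.norm(G, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(G.shape[1])
    for _ in range(iters):
        z = x - G.T @ (G @ x - y) / L          # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return x

rng = np.random.default_rng(1)
G = rng.normal(size=(30, 100))                 # 30 sensors, 100 candidate sources
x_true = np.zeros(100)
x_true[[5, 42]] = [2.0, -1.5]                  # two sparse "active sources"
y = G @ x_true
x_hat = ista_lasso(G, y)
print(sorted(np.argsort(np.abs(x_hat))[-2:].tolist()))
```

With only 30 measurements for 100 unknowns, the L1 penalty concentrates the estimate on a few entries; in this synthetic run the largest entries of `x_hat` coincide with the planted support.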
Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W
2017-02-01
A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare the performance of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that proved unsuitable for these analytes are reviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols.
2013-01-01
Numerous quantitative PCR assays for microbial fecal source tracking (MST) have been developed and evaluated in recent years. Widespread application has been hindered by a lack of knowledge regarding the geographical stability, and hence applicability, of such methods beyond the regional level. This study assessed the performance of five previously reported quantitative PCR assays targeting human-, cattle-, or ruminant-associated Bacteroidetes populations on 280 human and animal fecal samples from 16 countries across six continents. The tested cattle-associated markers were shown to be ruminant-associated. The quantitative distributions of marker concentrations in target and nontarget samples proved to be essential for the assessment of assay performance and were used to establish a new metric for quantitative source-specificity. In general, this study demonstrates that stable target populations required for marker-based MST occur around the globe. Ruminant-associated marker concentrations were strongly correlated with total intestinal Bacteroidetes populations and with each other, indicating that the detected ruminant-associated populations seem to be part of the intestinal core microbiome of ruminants worldwide. Consequently, the tested ruminant-targeted assays appear to be suitable quantitative MST tools beyond the regional level, while the targeted human-associated populations seem to be less prevalent and stable, suggesting potential for improvements in human-targeted methods. PMID:23755882
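Assay performance in studies like this is usually summarized as source-sensitivity and source-specificity; a minimal sketch with hypothetical sample counts (not the study's data):

```python
def sensitivity_specificity(target_pos, target_n, nontarget_pos, nontarget_n):
    """Source-sensitivity: fraction of target fecal samples in which the
    marker is detected. Source-specificity: fraction of nontarget samples
    in which it is not detected."""
    return target_pos / target_n, 1.0 - nontarget_pos / nontarget_n

# Hypothetical: marker detected in 54/60 ruminant samples and 11/220 others
sens, spec = sensitivity_specificity(54, 60, 11, 220)
print(round(sens, 2), round(spec, 2))  # → 0.9 0.95
```

The paper's quantitative metric additionally weights these by marker concentration distributions rather than detect/non-detect counts.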
Murphy, Douglas J; Bruce, David A; Mercer, Stewart W; Eva, Kevin W
2009-05-01
To investigate the reliability and feasibility of six potential workplace-based assessment methods in general practice training: criterion audit, multi-source feedback from clinical and non-clinical colleagues, patient feedback (the CARE Measure), referral letters, significant event analysis, and video analysis of consultations. Performance of GP registrars (trainees) was evaluated with each tool to assess the tools' reliability and feasibility, given the raters and number of assessments needed. Participant experience of the process was determined by questionnaire. 171 GP registrars and their trainers, drawn from nine deaneries (representing all four countries in the UK), participated. The ability of each tool to differentiate between doctors (reliability) was assessed using generalisability theory. Decision studies were then conducted to determine the number of observations required to achieve an acceptably high reliability for "high-stakes assessment" using each instrument. Finally, descriptive statistics were used to summarise participants' ratings of their experience using these tools. Multi-source feedback from colleagues and patient feedback on consultations emerged as the two methods most likely to offer a reliable and feasible opinion of workplace performance. Reliability coefficients of 0.8 were attainable with 41 CARE Measure patient questionnaires and six clinical and/or five non-clinical colleagues per doctor when assessed on two occasions. For the other four methods tested, 10 or more assessors were required per doctor to achieve a reliable assessment, making the feasibility of their use in high-stakes assessment extremely low. Participant feedback did not raise any major concerns regarding the acceptability, feasibility, or educational impact of the tools.
The combination of patient and colleague views of doctors' performance, coupled with reliable competence measures, may offer a suitable evidence-base on which to monitor progress and completion of doctors' training in general practice.
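The decision-study logic, projecting how many raters are needed to reach a reliability of 0.8, follows the Spearman-Brown prophecy formula; a sketch with hypothetical single-rater reliabilities (not values from the study):

```python
def assessors_needed(r_single, target=0.8):
    """Smallest number of raters n such that the Spearman-Brown
    projected reliability n*r / (1 + (n-1)*r) reaches the target."""
    n = 1
    while n * r_single / (1 + (n - 1) * r_single) < target:
        n += 1
    return n

# Hypothetical per-rater reliabilities: the lower r is, the faster
# the required panel size grows
for r in (0.35, 0.30, 0.15):
    print(r, assessors_needed(r))
```

This is why tools with low single-observation reliability quickly become infeasible for high-stakes use: halving r roughly triples the number of assessors required.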
UNMIX Methods Applied to Characterize Sources of Volatile Organic Compounds in Toronto, Ontario
Porada, Eugeniusz; Szyszkowicz, Mieczysław
2016-01-01
UNMIX, a receptor modeling routine from the U.S. Environmental Protection Agency (EPA), was used to model volatile organic compound (VOC) receptors at four urban sites in Toronto, Ontario. VOC ambient concentration data acquired in 2000–2009 for 175 VOC species at four air quality monitoring stations were analyzed. By performing multiple modeling attempts upon varying VOC menus, while rejecting results that were not reliable, UNMIX allowed sources to be discriminated by their most consistent chemical characteristics. The method assessed occurrences of VOCs in sources typical of the urban environment (traffic, evaporative emissions of fuels, banks of fugitive inert gases), in industrial point sources (plastic-, polymer-, and metalworking manufactures), and in secondary sources (releases from water, sediments, and contaminated urban soil). The remote sensing and robust modeling used here produce chemical profiles of putative VOC sources that, if combined with known environmental fates of VOCs, can be used to assign physical sources' shares of VOC emissions into the atmosphere. This in turn provides a means of assessing the impact of environmental policies on the one hand, and industrial activities on the other, on VOC air pollution. PMID:29051416
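Receptor models of this family factor an ambient concentration matrix into source contributions and source profiles; a generic nonnegative matrix factorization sketch on synthetic data (an illustration of the idea, not EPA's UNMIX algorithm, whose constraints differ):

```python
import numpy as np

def nmf(X, k, iters=1500, seed=0):
    """Factor nonnegative X (samples x species) as W @ H, with
    W (samples x k) the source contributions and H (k x species)
    the source chemical profiles, via Lee-Seung multiplicative updates."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(0.1, 1.0, (X.shape[0], k))
    H = rng.uniform(0.1, 1.0, (k, X.shape[1]))
    eps = 1e-9                                  # avoid division by zero
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Two synthetic "sources" mixed into 50 ambient samples of 6 VOC species
rng = np.random.default_rng(1)
profiles = np.array([[5.0, 3.0, 1.0, 0.0, 0.0, 1.0],   # traffic-like
                     [0.0, 1.0, 0.0, 4.0, 6.0, 2.0]])  # industrial-like
contributions = rng.uniform(0.0, 1.0, (50, 2))
X = contributions @ profiles
W, H = nmf(X, 2)
rel_err = np.linalg.norm(W @ H - X) / np.linalg.norm(X)
print(round(rel_err, 4))
```

The recovered rows of H play the role of the "chemical profiles of putative VOC sources" described in the abstract.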
Assessment of In Situ Time Resolved Shock Experiments at Synchrotron Light Sources*
NASA Astrophysics Data System (ADS)
Belak, J.; Ilavsky, J.; Hessler, J. P.
2005-07-01
Prior to fielding in situ time resolved experiments of shock wave loading at the Advanced Photon Source, we have performed feasibility experiments assessing a single photon bunch. Using single and poly-crystal Al, Ti, V and Cu shock to incipient spallation on the gas gun, samples were prepared from slices normal to the spall plane of thickness 100-500 microns. In addition, single crystal Al of thickness 500 microns was shocked to incipient spallation and soft recovered using the LLNL e-gun mini-flyer system. The e-gun mini-flyer impacts the sample target producing a 10's ns flat-top shock transient. Here, we present results for imaging, small-angle scattering (SAS), and diffraction. In particular, there is little SAS away from the spall plane and significant SAS at the spall plane, demonstrating the presence of sub-micron voids. * Use of the Advanced Photon Source was supported by the U. S. Department of Energy, Office of Science, Office of Basic Energy Sciences, under Contract No. W-31-109-Eng-38 and work performed under the auspices of the U.S. Department of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.
NASA Astrophysics Data System (ADS)
Federer, Meghan Rector
Assessment is a key element in the process of science education teaching and research. Understanding sources of performance bias in science assessment is a major challenge for science education reforms. Prior research has documented several limitations of instrument types on the measurement of students' scientific knowledge (Liu et al., 2011; Messick, 1995; Popham, 2010). Furthermore, a large body of work has been devoted to reducing assessment biases that distort inferences about students' science understanding, particularly in multiple-choice [MC] instruments. Despite the above documented biases, much has yet to be determined for constructed response [CR] assessments in biology and their use for evaluating students' conceptual understanding of scientific practices (such as explanation). Understanding differences in science achievement provides important insights into whether science curricula and/or assessments are valid representations of student abilities. Using the integrative framework put forth by the National Research Council (2012), this dissertation aimed to explore whether assessment biases occur for assessment practices intended to measure students' conceptual understanding and proficiency in scientific practices. Using a large corpus of undergraduate biology students' explanations, three studies were conducted to examine whether known biases of MC instruments were also apparent in a CR instrument designed to assess students' explanatory practice and understanding of evolutionary change (ACORNS: Assessment of COntextual Reasoning about Natural Selection). The first study investigated the challenge of interpreting and scoring lexically ambiguous language in CR answers. The incorporation of 'multivalent' terms into scientific discourse practices often results in statements or explanations that are difficult to interpret and can produce faulty inferences about student knowledge. 
The results of this study indicate that many undergraduate biology majors frequently incorporate multivalent concepts into explanations of change, resulting in explanatory practices that were scientifically non-normative. However, use of follow-up question approaches was found to resolve this source of bias and thereby increase the validity of inferences about student understanding. The second study focused on issues of item and instrument structure, specifically item feature effects and item position effects, which have been shown to influence measures of student performance across assessment tasks. Results indicated that, along the instrument item sequence, items with similar surface features produced greater sequencing effects than sequences of items with dissimilar surface features. This bias could be addressed by use of a counterbalanced design (i.e., Latin Square) at the population level of analysis. Explanation scores were also highly correlated with student verbosity, despite verbosity being an intrinsically trivial aspect of explanation quality. Attempting to standardize student response length was one proposed solution to the verbosity bias. The third study explored gender differences in students' performance on constructed-response explanation tasks using impact (i.e., mean raw scores) and differential item function (i.e., item difficulties) patterns. While prior research in science education has suggested that females tend to perform better on constructed-response items, the results of this study revealed no overall differences in gender achievement. However, evaluation of specific item features patterns suggested that female respondents have a slight advantage on unfamiliar explanation tasks. That is, male students tended to incorporate fewer scientifically normative concepts (i.e., key concepts) than females for unfamiliar taxa. 
Conversely, females tended to incorporate more scientifically non-normative ideas (i.e., naive ideas) than males for familiar taxa. Together these results indicate that gender achievement differences for this CR instrument may be a result of differences in how males and females interpret and respond to combinations of item features. Overall, the results presented in the subsequent chapters suggest that as science education shifts toward the evaluation of fused scientific knowledge and practice (e.g., explanation), it is essential that educators and researchers investigate potential sources of bias inherent to specific assessment practices. This dissertation revealed significant sources of CR assessment bias, and provided solutions to address these problems.
Functional Performance of Pyrovalves
NASA Technical Reports Server (NTRS)
Bement, Laurence J.
1996-01-01
Following several flight and ground test failures of spacecraft systems using single-shot, 'normally closed' pyrotechnically actuated valves (pyrovalves), a government/industry cooperative program was initiated to assess the functional performance of five qualified designs. The goal of the program was to improve performance-based requirements for the procurement of pyrovalves. Specific objectives included the demonstration of performance test methods, the measurement of 'blowby' (the passage of gases from the pyrotechnic energy source around the activating piston into the valve's fluid path), and the quantification of functional margins for each design. Experiments were conducted in-house at NASA on several units each of the five valve designs. The test methods used for this program measured the forces and energies required to actuate the valves, as well as the energies and the pressures (where possible) delivered by the pyrotechnic sources. Functional performance ranged widely among the designs. Blowby cannot be prevented by o-ring seals; metal-to-metal seals were effective. Functional margin was determined by dividing the energy delivered by the pyrotechnic sources in excess of that required to accomplish the function by the energy required for that function. All but two designs had adequate functional margins with the pyrotechnic cartridges evaluated.
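The functional-margin calculation described above reduces to a one-line ratio; a sketch with made-up energy values (not measurements from the program):

```python
def functional_margin(e_delivered, e_required):
    """Margin = (delivered energy - required energy) / required energy.
    A margin of 1.0 means 100% excess energy is available."""
    return (e_delivered - e_required) / e_required

# Hypothetical: cartridge delivers 7.5 J where actuation needs 3.0 J
print(functional_margin(7.5, 3.0))  # → 1.5
```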
Assessment on security system of radioactive sources used in hospitals of Thailand
NASA Astrophysics Data System (ADS)
Jitbanjong, Petchara; Wongsawaeng, Doonyapong
2016-01-01
Unsecured radioactive sources have caused deaths and serious injuries in many parts of the world. In Thailand, 17 hospitals use teletherapy with cobalt-60 radioactive sources. These sources need to be secured to prevent unauthorized removal, sabotage, and use by terrorists in a radiological weapon. The security of radioactive sources in Thailand is regulated by the Office of Atoms for Peace in compliance with the Global Threat Reduction Initiative (GTRI), U.S. DOE, which has been implemented since 2010. This study aims to assess the security systems for radioactive sources used in hospitals in Thailand; the results can serve as recommended baseline data for developing or improving hospital security systems for radioactive sources at the national regulatory and policy levels. Results from questionnaires reveal that in 11 of 17 hospitals (64.70%), there were few differences between hospitals that had installed the security system and those that had not. Also, personnel working with radioactive sources did not clearly understand the nuclear security law. Thus, government organizations should be encouraged to arrange training on nuclear security to increase the level of understanding. In the future, it is recommended that the responsible government organization issue a minimum nuclear security requirement for every medical facility using radioactive sources.
NASA Technical Reports Server (NTRS)
Roman, Monsi C.; Mittelman, Marc W.
2010-01-01
This slide presentation summarizes the studies performed to assess the bulk-phase microbial community during the Space Station Water Recovery Tests (WRT) from 1990-1998. These tests showed that it is possible to recycle water from different sources, including urine, and produce water that can exceed the quality of municipally produced tap water.
Onboard Navigation Systems Characteristics
NASA Technical Reports Server (NTRS)
1979-01-01
The space shuttle onboard navigation systems characteristics are described. A standard source of equations and numerical data for use in error analyses and mission simulations related to space shuttle development is reported. The sensor characteristics described are used for shuttle onboard navigation performance assessment. The use of complete models in the studies depends on the analyses to be performed, the capabilities of the computer programs, and the availability of computer resources.
Munguía-Rosas, Miguel A; Campos-Navarrete, María J; Parra-Tabla, Víctor
2013-01-01
Dimorphic cleistogamy is a specialized form of mixed mating system where a single plant produces both open, potentially outcrossed chasmogamous (CH) and closed, obligately self-pollinated cleistogamous (CL) flowers. Typically, CH flowers and seeds are bigger and energetically more costly than those of CL. Although the effects of inbreeding and floral dimorphism are critical to understanding the evolution and maintenance of cleistogamy, these effects have been repeatedly confounded. In an attempt to separate these effects, we compared the performance of progeny derived from the two floral morphs while controlling for the source of pollen. That is, flower type and pollen source effects were assessed by comparing the performance of progeny derived from selfed CH vs. CL and outcrossed CH vs. selfed CH flowers, respectively. The experiment was carried out with the herb Ruellia nudiflora under two contrasting light environments. Outcrossed progeny generally performed better than selfed progeny. However, inbreeding depression ranged from low (1%) to moderate (36%), with the greatest value detected under shaded conditions when cumulative fitness was used. Although flower type generally had less of an effect on progeny performance than pollen source did, the progeny derived from selfed CH flowers largely outperformed the progeny from CL flowers, but only under shaded conditions and when cumulative fitness was taken into account. On the other hand, the source of pollen and flower type influenced seed predation, with selfed CH progeny the most heavily attacked by predators. Therefore, the effects of pollen source and flower type are environment-dependent, and seed predators may increase the genetic differences between progeny derived from CH and CL flowers. Inbreeding depression alone cannot account for the maintenance of a mixed mating system in R. nudiflora, and other unidentified mechanisms must thus be involved.
Driver, Hannah R
2016-06-01
It is essential that a medical assessment is completed before commencing electroconvulsive therapy (ECT) to identify possible risk factors that may complicate anesthesia. This audit aimed to improve the physical examination and documented medical history of this assessment usually performed by referring psychiatrists. Patients from 2 sites (A and B) were retrospectively audited against standards from the Scottish ECT Accreditation Network. The timing and systems examined for the physical examination were noted. Site A used an examination sheet; at site B there was no standard way of recording examination. Documented medical histories were compared with comprehensive histories obtained using multiple sources. Discrepancies were noted. Site A was reaudited after amending the examination sheet and adding prompts to use multiple sources when gathering histories. There were repeat examinations for ECT in 30 patients (100%) at site A and 6 (23%) of 26 at site B. Physical examinations were incomplete in 47% of patients at A and 100% at B. Oral examination was frequently missed at both sites. Medical histories were accurate in 20% at A and 38% at B. In the reaudit of site A, all 12 patients had a complete repeat examination; histories were accurate in 8 (67%), with multiple sources used in 11 (92%). Incomplete or inaccurate assessments put patients at risk. Oral examinations should be part of initial medical assessments. This audit shows the benefits of using a physical examination sheet to ensure a repeat complete examination. Using multiple sources to gather medical histories is encouraged.
Violato, Claudio; Lockyer, Jocelyn M; Fidler, Herta
2008-10-01
Multi-source feedback (MSF) enables performance data to be provided to doctors from patients, co-workers and medical colleagues. This study examined the evidence for the validity of MSF instruments for general practice, investigated changes in performance for doctors who participated twice, 5 years apart, and determined the association between change in performance and initial assessment and socio-demographic characteristics. Data for 250 doctors included three datasets per doctor from, respectively, 25 patients, eight co-workers and eight medical colleagues, collected on two occasions. There was high internal consistency (alpha > 0.90) and adequate generalisability (Ep² > 0.70). D study results indicate adequate generalisability coefficients for groups of eight assessors (medical colleagues, co-workers) and 25 patient surveys. Confirmatory factor analyses provided evidence for the validity of factors that were theoretically expected, meaningful and cohesive. Comparative fit indices were 0.91 for medical colleague data, 0.87 for co-worker data and 0.81 for patient data. Paired t-test analysis showed significant change between the two assessments from medical colleagues and co-workers, but not between the two patient surveys. Multiple linear regressions explained 2.1% of the variance at time 2 for medical colleagues, 21.4% of the variance for co-workers and 16.35% of the variance for patient assessments, with professionalism a key variable in all regressions. There is evidence for the construct validity of the instruments and for their stability over time. Upward changes in performance will occur, although their effect size is likely to be small to moderate.
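Internal consistency figures like the alpha > 0.90 reported above come from Cronbach's alpha; a dependency-free sketch on toy ratings (three items, four respondents, invented scores):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for an instrument; `items` is a list of per-item
    score lists over the same respondents (sample variance, n-1)."""
    k, n = len(items), len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(it) for it in items) / var(totals))

# Toy data: three questionnaire items rated by four respondents
items = [[4, 5, 3, 2],
         [4, 4, 3, 1],
         [5, 5, 2, 2]]
print(round(cronbach_alpha(items), 2))  # → 0.94
```

Alpha rises when items vary together across respondents, which is why highly cohesive MSF instruments report values above 0.9.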
Increasing Confidence In Treatment Performance Assessment Using Geostatistical Methods
It is well established that the presence of dense non-aqueous phase liquids (DNAPLs) such as trichloroethylene (TCE) in aquifer systems represents a very long-term source of groundwater contamination. Significant effort in recent years has been focussed on developing effective me...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, A.W.
1990-04-01
This paper describes an approach to solving air quality problems that frequently occur during iterations of the baseline change process. From a schedule standpoint, it is desirable to perform this evaluation in as short a time as possible, while budgetary pressures limit the size of the staff available to do the work. Without a method in place to deal with baseline change proposal requests, the environmental analysts may not be able to produce the analysis results in the time frame expected. Using a concept called the Rapid Response Air Quality Analysis System (RAAS), the problems of timing and cost become tractable. The system could be adapted to assess other atmospheric pathway impacts, e.g., acoustics or visibility. The air quality analysis system used to perform the environmental assessment (EA) for the Salt Repository Project (part of the Civilian Radioactive Waste Management Program), and later to evaluate the consequences of proposed baseline changes, consists of three components: emission source data files; emission rates contained in spreadsheets; and impact assessment model codes. The spreadsheets contain user-written codes (macros) that calculate emission rates from (1) emission source data (e.g., numbers and locations of sources, detailed operating schedules, and source specifications including horsepower, load factor, and duty cycle); (2) emission factors such as those published by the U.S. Environmental Protection Agency; and (3) control efficiencies.
NASA Astrophysics Data System (ADS)
Bezawada, Rajesh; Uijt de Haag, Maarten
2010-04-01
This paper discusses the results of an initial evaluation study of hazard and integrity monitor functions for use with integrated alerting and notification. The Hazard and Integrity Monitor (HIM) (i) allocates information sources within the Integrated Intelligent Flight Deck (IIFD) to required functionality (such as conflict detection and avoidance) and determines the required performance of these information sources as part of that function; (ii) monitors or evaluates the required performance of the individual information sources and performs consistency checks among the various information sources; (iii) integrates the information to establish tracks of potential hazards that can be used for the conflict probes or conflict prediction for various time horizons, including the 10, 5, 3, and <3 minutes used in our scenario; and (iv) detects and assesses the class of the hazard and provides possible resolutions. The HIM monitors the operation-dependent performance parameters related to the potential hazards in a manner similar to Required Navigation Performance (RNP). Various HIM concepts have been implemented and evaluated using a previously developed sensor simulator/synthesizer. Within the simulation framework, various inputs to the IIFD and its subsystems are simulated, synthesized from actual collected data, or played back from actual flight test sensor data. The framework and HIM functions are implemented in Simulink®, a modeling environment developed by The MathWorks™. This environment allows for test and evaluation of various sensor and communication link configurations as well as the inclusion of feedback from the pilot on the performance of the aircraft.
Reveiz, Ludovic; Sangalang, Stephanie; Glujovsky, Demian; Pinzon, Carlos E; Asenjo Lobos, Claudia; Cortes, Marcela; Cañón, Martin; Bardach, Ariel; Bonfill, Xavier
2013-01-01
Few studies have assessed the nature and quality of randomized controlled trials (RCTs) in Latin America and the Caribbean (LAC). The aims of this systematic review are to evaluate the characteristics (including the risk of bias assessment) of RCTs conducted in LAC according to funding source. A review of RCTs published in 2010 in which the author's affiliation was from LAC was performed in PubMed and LILACS. Two reviewers independently extracted data and assessed the risk of bias. The primary outcomes were risk of bias assessment and funding source. A total of 1,695 references were found in the PubMed and LILACS databases, of which 526 were RCTs (N = 73,513 participants). English was the dominant publication language (93%) and most of the RCTs were published in non-LAC journals (84.2%). Only five of the 19 identified countries accounted for nearly 95% of all RCTs conducted in the region (Brazil 70.9%, Mexico 10.1%, Argentina 5.9%, Colombia 3.8%, and Chile 3.4%). Few RCTs covered priority areas related to the Millennium Development Goals, such as maternal health (6.7%) or high-priority infectious diseases (3.8%). Regarding children, 3.6% and 0.4% of RCTs evaluated nutrition and diarrhea interventions, respectively, but none evaluated pneumonia. As a comparison, aesthetic and sport-related interventions accounted for 4.6% of all trials. A random sample of RCTs (n = 358) was assessed for funding source: exclusively public (33.8%); private (e.g. pharmaceutical company) (15.3%); other (e.g. mixed, NGO) (15.1%); no funding (35.8%). Overall assessments of risk of bias showed no statistically significant differences between RCTs by type of funding source. Statistically significant differences favoring private and other types of funding were found when assessing trial registration and conflict of interest reporting. Findings of this study could be used to provide more direction for future research to facilitate innovation, improve health outcomes or address priority health problems.
Wu, Bing; Zhang, Yan; Zhang, Xu-Xiang; Cheng, Shu-Pei
2011-12-01
A carcinogenic risk assessment of polycyclic aromatic hydrocarbons (PAHs) in source water and drinking water of China was conducted using probabilistic techniques from a national perspective. The published monitoring data of PAHs were gathered and converted into BaP-equivalent (BaP(eq)) concentrations. Based on the transformed data, a comprehensive risk assessment was performed by considering different age groups and exposure pathways. Monte Carlo simulation and sensitivity analysis were applied to quantify uncertainties of the risk estimation. The risk analysis indicated that the risk values for children and teens were lower than the accepted value (1.00E-05), indicating no significant carcinogenic risk. The probability of risk values above 1.00E-05 was 5.8% and 6.7% for the adult and lifetime groups, respectively. Overall, carcinogenic risks of PAHs in source water and drinking water of China were mostly acceptable. However, more attention should be paid to specific regions, such as the Lanzhou reach of the Yellow River and the Qiantang River. Notwithstanding the uncertainties inherent in the risk assessment, this study is the first attempt to provide information on the carcinogenic risk of PAHs in source water and drinking water of China, and it might be useful for potential strategies of carcinogenic risk management and reduction. Copyright © 2011 Elsevier B.V. All rights reserved.
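A Monte Carlo risk calculation of the general kind the study used can be sketched as below, with the standard ingestion exposure equation risk = C × IR × EF × ED × SF / (BW × AT). The distributions, parameter values, and slope factor are illustrative assumptions, not the paper's inputs:

```python
# Hedged sketch of a probabilistic lifetime-cancer-risk estimate from
# BaP-equivalent drinking-water concentrations. All distributions and
# parameter values below are assumed for illustration.
import random

random.seed(1)

SF = 7.3          # oral slope factor for BaP, (mg/kg-day)^-1 (assumed)
EF, ED = 350, 30  # exposure frequency (days/yr) and duration (yr) (assumed)
AT = 70 * 365     # averaging time over a 70-yr lifetime, in days

def simulate_risk(n=10000):
    risks = []
    for _ in range(n):
        c = random.lognormvariate(-9.0, 0.8)   # BaP_eq concentration, mg/L
        ir = random.normalvariate(2.0, 0.4)    # water intake, L/day
        bw = random.normalvariate(60.0, 10.0)  # body weight, kg
        risks.append(c * ir * EF * ED * SF / (bw * AT))
    return risks

risks = simulate_risk()
p_exceed = sum(r > 1e-5 for r in risks) / len(risks)  # P(risk above 1.00E-05)
```

The exceedance probability `p_exceed` plays the role of the 5.8%/6.7% figures reported in the abstract, though the synthetic inputs here will not reproduce them.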
Wu, Xin Yin; Lam, Victor C K; Yu, Yue Feng; Ho, Robin S T; Feng, Ye; Wong, Charlene H L; Yip, Benjamin H K; Tsoi, Kelvin K F; Wong, Samuel Y S; Chung, Vincent C H
2016-11-01
Well-conducted meta-analyses (MAs) are considered one of the best sources of clinical evidence for treatment decisions. MAs with methodological flaws may introduce bias and mislead evidence users. The aim of this study is to investigate the characteristics and methodological quality of MAs on diabetes mellitus (DM) treatments. Systematic review. The Cochrane Database of Systematic Reviews and the Database of Abstracts of Reviews of Effects were searched for relevant MAs. The Assessing the Methodological Quality of Systematic Reviews (AMSTAR) tool was used to evaluate the methodological quality of the included MAs. Logistic regression analysis was used to identify associations between characteristics of the MAs and AMSTAR results. A total of 252 MAs, including 4999 primary studies and 13,577,025 patients, were included. Over half of the MAs (65.1%) only included type 2 DM patients, and 160 MAs (63.5%) focused on pharmacological treatments. About 89.7% of MAs performed a comprehensive literature search and 89.3% provided characteristics of the included studies. The included MAs generally performed poorly on the remaining AMSTAR items, especially in assessing publication bias (39.3%), providing lists of studies (19.0%) and declaring the source of support comprehensively (7.5%). Only 62.7% of MAs mentioned harms of the interventions. MAs with a corresponding author from Asia performed less well in providing an MA protocol than those from Europe. The methodological quality of MAs on DM treatments was unsatisfactory. There is considerable room for improvement, especially in assessing publication bias, providing lists of studies and declaring the source of support comprehensively. Also, there is an urgent need for MA authors to report treatment harms comprehensively. © 2016 European Society of Endocrinology.
2014 Assessment of the Ballistic Missile Defense System (BMDS)
2015-03-23
...take several more years to collect the test data needed to adequately VV&A the BMDS M&S required to perform such assessments. As data are collected ... Accreditation is possible only if a sufficient quantity and quality of flight test data have been collected to support model verification and...
ERIC Educational Resources Information Center
Tahiroglu, Deniz; Moses, Louis J.; Carlson, Stephanie M.; Mahy, Caitlin E. V.; Olofson, Eric L.; Sabbagh, Mark A.
2014-01-01
Children's theory of mind (ToM) is typically measured with laboratory assessments of performance. Although these measures have generated a wealth of informative data concerning developmental progressions in ToM, they may be less useful as the sole source of information about individual differences in ToM and their relation to other facets of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-12-01
This report assesses the water quality-related benefits that would be expected if the US Environmental Protection Agency (EPA) adopts the proposed effluent limitations guidelines and pretreatment standards for the Centralized Waste Treatment (CWT) Industry. EPA estimates that under baseline conditions 205 CWT facilities discharge approximately 5.22 million lbs/year of metal and organic pollutants.
Performance assessment of a Bayesian Forecasting System (BFS) for real-time flood forecasting
NASA Astrophysics Data System (ADS)
Biondi, D.; De Luca, D. L.
2013-02-01
The paper evaluates, for a number of flood events, the performance of a Bayesian Forecasting System (BFS) with the aim of quantifying total uncertainty in real-time flood forecasting. The predictive uncertainty of future streamflow is estimated through the Bayesian integration of two separate processors: the former evaluates the propagation of input uncertainty on simulated river discharge, the latter computes the hydrological uncertainty of actual river discharge associated with all other possible sources of error. A stochastic model and a distributed rainfall-runoff model were assumed, respectively, for rainfall and hydrological response simulations. A case study was carried out for a small basin in the Calabria region (southern Italy). The performance assessment of the BFS was performed with verification tools suited for probabilistic forecasts of continuous variables such as streamflow. Graphical tools and scalar metrics were used to evaluate several attributes of the forecast quality of the entire time-varying predictive distributions: calibration, sharpness, accuracy, and the continuous ranked probability score (CRPS). Besides the overall system, which incorporates both sources of uncertainty, other hypotheses resulting from the BFS properties were examined, corresponding to (i) a perfect hydrological model; (ii) a non-informative rainfall forecast for predicting streamflow; and (iii) a perfect input forecast. The results emphasize the importance of using different diagnostic approaches to perform comprehensive analyses of predictive distributions, to arrive at a multifaceted view of the attributes of the prediction.
For the case study, the selected criteria revealed the interaction of the different sources of error, in particular the crucial role of the hydrological uncertainty processor when compensating, at the cost of wider forecast intervals, for the unreliable and biased predictive distribution resulting from the Precipitation Uncertainty Processor.
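Of the scalar metrics listed, the CRPS has a simple sample-based estimator for an ensemble forecast, CRPS ≈ mean|X − y| − ½·mean|X − X′|, where X, X′ are independent ensemble members and y the observation. A minimal sketch (not the study's verification code):

```python
# Sample-based CRPS for an ensemble forecast against one observation:
# the ensemble's mean absolute error minus half its internal spread.
def crps_ensemble(members, obs):
    """members: ensemble streamflow values; obs: observed streamflow."""
    n = len(members)
    mae = sum(abs(x - obs) for x in members) / n
    spread = sum(abs(a - b) for a in members for b in members) / (n * n)
    return mae - 0.5 * spread

# a spread-free ensemble that hits the observation scores 0.0
perfect = crps_ensemble([3.0, 3.0, 3.0], 3.0)
# an ensemble bracketing the truth pays for its spread
bracketing = crps_ensemble([1.0, 3.0], 2.0)
```

Averaging this score over forecast times and events gives a single number that rewards both calibration and sharpness, which is why it appears alongside the graphical diagnostics.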
Solar cell and module performance assessment based on indoor calibration methods
NASA Astrophysics Data System (ADS)
Bogus, K.
A combined space/terrestrial solar cell test calibration method that requires five steps and can be performed indoors is described. The test conditions are designed to qualify the cell or module output data under standard illumination and temperature conditions. Measurements are made of the short-circuit current, the open-circuit voltage, the maximum power, the efficiency, and the spectral response. Standard sunlight must be replicated in both earth-surface and AM0 conditions; Xe lamps are normally used for the light source, with spectral measurements taken of the light. Cell and module spectral response are assayed by using monochromators and narrow band-pass monochromatic filters. Attention is required to define the performance characteristics of modules under partial shadowing. Error sources that may affect the measurements are discussed, as are previous cell performance testing and calibration methods and their effectiveness in comparison with the behaviors of satellite solar power panels.
Dissociation between memory accuracy and memory confidence following bilateral parietal lesions.
Simons, Jon S; Peers, Polly V; Mazuz, Yonatan S; Berryhill, Marian E; Olson, Ingrid R
2010-02-01
Numerous functional neuroimaging studies have observed lateral parietal lobe activation during memory tasks: a surprise to clinicians who have traditionally associated the parietal lobe with spatial attention rather than memory. Recent neuropsychological studies examining episodic recollection after parietal lobe lesions have reported differing results. Performance was preserved in unilateral lesion patients on source memory tasks involving recollecting the context in which stimuli were encountered, and impaired in patients with bilateral parietal lesions on tasks assessing free recall of autobiographical memories. Here, we investigated a number of possible accounts for these differing results. In 3 experiments, patients with bilateral parietal lesions performed as well as controls at source recollection, confirming the previous unilateral lesion results and arguing against an explanation for those results in terms of contralesional compensation. Reducing the behavioral relevance of mnemonic information critical to the source recollection task did not affect performance of the bilateral lesion patients, indicating that the previously observed reduced autobiographical free recall might not be due to impaired bottom-up attention. The bilateral patients did, however, exhibit reduced confidence in their source recollection abilities across the 3 experiments, consistent with a suggestion that parietal lobe lesions might lead to impaired subjective experience of rich episodic recollection.
Wilby, Kyle J; Govaerts, Marjan J B; Austin, Zubin; Dolmans, Diana H J M
2017-03-21
Research has shown that patients' and practitioners' cultural orientations affect communication behaviours and interpretations in cross-cultural patient-practitioner interactions. Little is known about the effect of cultural orientations on assessment of communication behaviours in cross-cultural educational settings. The purpose of this study is to explore cultural orientation as a potential source of assessor idiosyncrasy or between-assessor variability in assessment of communication skills. More specifically, we explored if and how (expert) assessors' valuing of communication behaviours aligned with their cultural orientations (power-distance, masculinity-femininity, uncertainty avoidance, and individualism-collectivism). Twenty-five pharmacist-assessors watched 3 videotaped scenarios (patient-pharmacist interactions) and rated each on a 5-point global rating scale. Videotaped scenarios demonstrated combinations of well-portrayed and borderline examples of instrumental and affective communication behaviours. We used stimulated recall and verbal protocol analysis to investigate assessors' interpretations and evaluations of communication behaviours. Uttered assessments of communication behaviours were coded as instrumental (task-oriented) or affective (socioemotional) and either positive or negative. Cultural orientations were measured using the Individual Cultural Values Scale. Correlations between cultural orientations and global scores, and frequencies of positive, negative, and total utterances of instrumental and affective behaviours were determined. Correlations were found to be scenario-specific. In videos with poor or good performance, no differences were found across cultural orientations.
When borderline performance was demonstrated, high power-distance and masculinity were significantly associated with higher global ratings (r = .445, and .537 respectively, p < 0.05) as well as with fewer negative utterances regarding instrumental (task focused) behaviours (r = -.533 and - .529, respectively). Higher masculinity scores were furthermore associated with positive utterances of affective (socioemotional) behaviours (r = .441). Our findings thus confirm cultural orientation as a source of assessor idiosyncrasy and meaningful variations in interpretation of communication behaviours. Interestingly, expert assessors generally agreed on scenarios of good or poor performances but borderline performance was influenced by cultural orientation. Contrary to current practices of assessor and assessment instrument standardization, findings support the use of multiple assessors for patient-practitioner interactions and development of qualitative assessment tools to capture these varying, yet valid, interpretations of performance.
NASA Astrophysics Data System (ADS)
Delle Monache, L.; Rodriguez, L. M.; Meech, S.; Hahn, D.; Betancourt, T.; Steinhoff, D.
2016-12-01
It is necessary to accurately estimate the initial source characteristics in the event of an accidental or intentional release of a Chemical, Biological, Radiological, or Nuclear (CBRN) agent into the atmosphere. Accurate estimation of the source characteristics is important because they are often unknown and the Atmospheric Transport and Dispersion (AT&D) models rely heavily on these estimates to create hazard assessments. To correctly assess the source characteristics in an operational environment where time is critical, the National Center for Atmospheric Research (NCAR) has developed a Source Term Estimation (STE) method, known as the Variational Iterative Refinement STE algorithm (VIRSA). VIRSA consists of a combination of modeling systems. These systems include an AT&D model, its corresponding STE model, a Hybrid Lagrangian-Eulerian Plume Model (H-LEPM), and its mathematical adjoint model. In an operational scenario where we have information regarding the infrastructure of a city, the AT&D model used is the Urban Dispersion Model (UDM), and when using this model in VIRSA we refer to the system as uVIRSA. In all other scenarios, where we do not have the city infrastructure information readily available, the AT&D model used is the Second-order Closure Integrated PUFF model (SCIPUFF) and the system is referred to as sVIRSA. VIRSA was originally developed using SCIPUFF 2.4 for the Defense Threat Reduction Agency and integrated into the Hazard Prediction and Assessment Capability and the Joint Program for Information Systems Joint Effects Model. The results discussed here are the verification and validation of the upgraded system with SCIPUFF 3.0 and the newly implemented UDM capability. To verify uVIRSA and sVIRSA, synthetic concentration observation scenarios were created in urban and rural environments, and the results of this verification are shown.
Finally, we validate the STE performance of uVIRSA using scenarios from the Joint Urban 2003 (JU03) experiment, which was held in Oklahoma City and also validate the performance of sVIRSA using scenarios from the FUsing Sensor Integrated Observing Network (FUSION) Field Trial 2007 (FFT07), held at Dugway Proving Grounds in rural Utah.
Sabarudin, Akmal; Sun, Zhonghua; Yusof, Ahmad Khairuddin Md
2013-09-30
This study was conducted to investigate and compare image quality and radiation dose between prospective ECG-triggered and retrospective ECG-gated coronary CT angiography (CCTA) with the use of single-source CT (SSCT) and dual-source CT (DSCT). A total of 209 patients with suspected coronary artery disease who underwent CCTA with SSCT (n=95) and DSCT (n=114) scanners using prospective ECG-triggered and retrospective ECG-gated protocols were recruited from two institutions. Image quality was assessed by two experienced observers, while quantitative assessment was performed by measuring the image noise, the signal-to-noise ratio (SNR) and the contrast-to-noise ratio (CNR). Effective dose was calculated using the latest published conversion coefficients. A total of 2087 out of 2880 coronary artery segments were assessable, with 98.0% classified as of sufficient and 2.0% as of insufficient image quality for clinical diagnosis. There was no significant difference in overall image quality between the prospective ECG-triggered and retrospective ECG-gated protocols, whether performed with DSCT or SSCT scanners. With the prospective ECG-triggered protocol, radiation dose did not differ significantly between DSCT (6.5 ± 2.9 mSv) and SSCT (6.2 ± 1.0 mSv) scanners (p=0.99). However, the effective dose was significantly lower with DSCT (18.2 ± 8.3 mSv) than with SSCT (28.3 ± 7.0 mSv) in the retrospective ECG-gated protocol. Prospective ECG-triggered CCTA reduces radiation dose significantly compared to retrospective ECG-gated CCTA, while maintaining good image quality. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
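The quantitative metrics named (image noise, SNR, CNR, and effective dose from a conversion coefficient) can be illustrated as below. The ROI values are hypothetical, and the chest coefficient k = 0.014 mSv/(mGy·cm) is the commonly cited value, not necessarily the one the authors used:

```python
# Illustrative CCTA image-quality and dose metrics. Image noise is taken
# as the SD of attenuation (HU) in a background ROI; SNR and CNR follow.
# ROI measurements and the conversion coefficient k are assumptions.
from statistics import mean, stdev

def snr_cnr(vessel_hu, background_hu):
    """SNR and CNR from vessel and background ROI attenuation samples."""
    noise = stdev(background_hu)                      # image noise
    snr = mean(vessel_hu) / noise
    cnr = (mean(vessel_hu) - mean(background_hu)) / noise
    return snr, cnr

def effective_dose_msv(dlp_mgy_cm, k=0.014):
    """Effective dose = dose-length product x chest conversion coefficient."""
    return dlp_mgy_cm * k

snr, cnr = snr_cnr([400.0, 410.0, 390.0], [50.0, 55.0, 45.0, 50.0])
dose = effective_dose_msv(464.0)  # a hypothetical DLP reading
```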
THE EPA NATIONAL EXPOSURE RESEARCH LABORATORY CHILDREN'S PESTICIDE EXPOSURE MEASUREMENT PROGRAM
The U.S. Environmental Protection Agency (EPA) National Exposure Research Laboratory (NERL) is performing research in support of the Food Quality Protection Act (FQPA) of 1996. This act requires that pesticide exposure assessments to be conducted for all potential sources, rou...
Comparing different types of source memory attributes in dementia of Alzheimer's type.
Mammarella, Nicola; Fairfield, Beth; Di Domenico, Alberto
2012-04-01
Source monitoring (SM) refers to our ability to discriminate between memories from different sources. Twenty healthy high-cognitive functioning older adults, 20 healthy low-cognitive functioning older adults, and 20 older adults with dementia of Alzheimer's type (DAT) were asked to perform a series of SM tasks that varied in terms of the to-be-remembered source attribute (perceptual, spatial, temporal, semantic, social, and affective details). Results indicated that older DAT adults had greater difficulty in SM compared to the healthy control groups, especially with spatial and semantic details. Data are discussed in terms of the SM framework and suggest that poor memory for some types of source information may be considered as an important indicator of clinical memory function when assessing for the presence and severity of dementia.
Tsay, Ming-Yueh; Wu, Tai-Luan; Tseng, Ling-Li
2017-01-01
This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001-2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system.
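The sorting, comparison, and elimination steps the study automated reduce, in essence, to set operations over normalized bibliographic keys. A toy sketch (system names from the abstract; the records themselves are invented):

```python
# Coverage and overlap of bibliographic records across systems, computed
# as set operations on normalized title keys. Records are made up.

def normalize(title):
    """Crude dedup key: lowercase and strip everything non-alphanumeric."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

systems = {
    "Google Scholar": {"Aspects of Quantum Optics", "BEC in Dilute Gases"},
    "arXiv.org": {"BEC in dilute gases", "Graphene Transport"},
}

keys = {name: {normalize(t) for t in recs} for name, recs in systems.items()}
overlap = keys["Google Scholar"] & keys["arXiv.org"]   # records both hold
coverage = {name: len(k) for name, k in keys.items()}  # records per system
```

Completeness for a system would then be its coverage of the union of all systems' keys; real record matching needs richer keys (authors, year, DOI) than this title-only sketch.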
NASA Astrophysics Data System (ADS)
Ayub, R.; Obenour, D. R.; Keyworth, A. J.; Genereux, D. P.; Mahinthakumar, K.
2016-12-01
Groundwater contamination by nutrients (nitrogen and phosphorus) is a major concern in water table aquifers that underlie agricultural areas in the mid-Atlantic Coastal Plain of the United States. High nutrient concentrations leaching into shallow groundwater can lead to human health problems and eutrophication of receiving surface waters. Liquid manure from concentrated animal feeding operations (CAFOs) stored in open-air lagoons and applied to spray fields can be a significant source of nutrients to groundwater, along with septic waste. In this study, we developed a model-based methodology for source apportionment and vulnerability assessment using sparse groundwater quality sampling measurements for Duplin County, North Carolina (NC), obtained by the NC Department of Environmental Quality (NC DEQ). This model provides information relevant to management by estimating the nutrient transport through the aquifer from different sources and addressing the uncertainty of nutrient contaminant propagation. First, the zones of influence (dependent on nutrient pathways) for individual groundwater monitoring wells were identified using a two-dimensional vertically averaged groundwater flow and transport model incorporating geologic uncertainty for the surficial aquifer system. A multiple linear regression approach is then applied to estimate the contribution weights for different nutrient source types using the nutrient measurements from monitoring wells and the potential sources within each zone of influence. Using the source contribution weights and their uncertainty, a probabilistic vulnerability assessment of the study area due to nutrient contamination is performed. Knowledge of the contribution of different nutrient sources to contamination at receptor locations (e.g., private wells, municipal wells, stream beds etc.) will be helpful in planning and implementation of appropriate mitigation measures.
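The regression step, estimating contribution weights for source types from well measurements and the potential sources inside each zone of influence, can be sketched with a two-source, no-intercept least-squares fit on synthetic noise-free data. Variable names and values are hypothetical, not the study's:

```python
# Hedged sketch of multiple linear regression for source apportionment:
# each row holds the (CAFO, septic) loading intensities within one well's
# zone of influence; y is the measured nutrient concentration. Synthetic.
import random

def fit_two_source_weights(rows, y):
    # solve the 2x2 normal equations for y ~ w1*x1 + w2*x2 (no intercept)
    s11 = sum(x1 * x1 for x1, _ in rows)
    s12 = sum(x1 * x2 for x1, x2 in rows)
    s22 = sum(x2 * x2 for _, x2 in rows)
    t1 = sum(x1 * yi for (x1, _), yi in zip(rows, y))
    t2 = sum(x2 * yi for (_, x2), yi in zip(rows, y))
    det = s11 * s22 - s12 * s12
    return (s22 * t1 - s12 * t2) / det, (s11 * t2 - s12 * t1) / det

random.seed(3)
rows = [(random.random(), random.random()) for _ in range(40)]  # 40 wells
y = [3.0 * x1 + 1.5 * x2 for x1, x2 in rows]  # noise-free "measurements"
w_cafo, w_septic = fit_two_source_weights(rows, y)
```

With noise-free data the true weights (3.0 and 1.5) are recovered exactly; in the study, residual uncertainty in such weights feeds the probabilistic vulnerability assessment.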
Mockler, Eva M; Deakin, Jenny; Archbold, Marie; Gill, Laurence; Daly, Donal; Bruen, Michael
2017-12-01
More than half of surface water bodies in Europe are at less than good ecological status according to Water Framework Directive assessments, and diffuse pollution from agriculture remains a major, but not the only, cause of this poor performance. Agri-environmental policy and land management practices have, in many areas, reduced nutrient emissions to water. However, additional measures may be required in Ireland to further decouple the relationship between agricultural productivity and emissions to water, which is of vital importance given on-going agricultural intensification. The Source Load Apportionment Model (SLAM) framework characterises sources of phosphorus (P) and nitrogen (N) emissions to water at a range of scales from sub-catchment to national. The SLAM synthesises land use and physical characteristics to predict emissions from point (wastewater, industry discharges and septic tank systems) and diffuse sources (agriculture, forestry, etc.). The predicted annual nutrient emissions were assessed against monitoring data for 16 major river catchments covering 50% of the area of Ireland. At national scale, results indicate that total average annual emissions to surface water in Ireland are over 2700 t yr⁻¹ of P and 82,000 t yr⁻¹ of N. The proportional contributions from individual sources show that the main sources of P are from municipal wastewater treatment plants and agriculture, with wide variations across the country related to local anthropogenic pressures and the hydrogeological setting. Agriculture is the main source of N emissions to water across all regions of Ireland. These policy-relevant results synthesised large amounts of information in order to identify the dominant sources of nutrients at regional and local scales, contributing to the national nutrient risk assessment of Irish water bodies. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Performance indicators for public mental healthcare: a systematic international inventory
2012-01-01
Background The development and use of performance indicators (PI) in the field of public mental health care (PMHC) has increased rapidly in the last decade. To gain insight in the current state of PI for PMHC in nations and regions around the world, we conducted a structured review of publications in scientific peer-reviewed journals supplemented by a systematic inventory of PI published in policy documents by (non-) governmental organizations. Methods Publications on PI for PMHC were identified through database- and internet searches. Final selection was based on review of the full content of the publications. Publications were ordered by nation or region and chronologically. Individual PI were classified by development method, assessment level, care domain, performance dimension, diagnostic focus, and data source. Finally, the evidence on feasibility, data reliability, and content-, criterion-, and construct validity of the PI was evaluated. Results A total of 106 publications were included in the sample. The majority of the publications (n = 65) were peer-reviewed journal articles and 66 publications specifically dealt with performance of PMHC in the United States. The objectives of performance measurement vary widely from internal quality improvement to increasing transparency and accountability. The characteristics of 1480 unique PI were assessed. The majority of PI is based on stakeholder opinion, assesses care processes, is not specific to any diagnostic group, and utilizes administrative data sources. The targeted quality dimensions varied widely across and within nations depending on local professional or political definitions and interests. For all PI some evidence for the content validity and feasibility has been established. Data reliability, criterion- and construct validity have rarely been assessed. Only 18 publications on criterion validity were included. 
These show significant associations in the expected direction on the majority of PI, but mixed results on a noteworthy number of others. Conclusions PI have been developed for a broad range of care levels, domains, and quality dimensions of PMHC. To ensure their usefulness for the measurement of PMHC performance and advancement of transparency, accountability and quality improvement in PMHC, future research should focus on assessment of the psychometric properties of PI. PMID:22433251
Veillard, Jeremy; Cowling, Krycia; Bitton, Asaf; Ratcliffe, Hannah; Kimball, Meredith; Barkley, Shannon; Mercereau, Laure; Wong, Ethan; Taylor, Chelsea; Hirschhorn, Lisa R; Wang, Hong
2017-12-01
Policy Points: Strengthening accountability through better measurement and reporting is vital to ensure progress in improving quality primary health care (PHC) systems and achieving universal health coverage (UHC). The Primary Health Care Performance Initiative (PHCPI) provides national decision makers and global stakeholders with opportunities to benchmark and accelerate performance improvement through better performance measurement. Results from the initial PHC performance assessments in low- and middle-income countries (LMICs) are helping guide PHC reforms and investments and improve the PHCPI's instruments and indicators. Findings from future assessment activities will further amplify cross-country comparisons and peer learning to improve PHC. New indicators and sources of data are needed to better understand PHC system performance in LMICs. The Primary Health Care Performance Initiative (PHCPI), a collaboration between the Bill and Melinda Gates Foundation, The World Bank, and the World Health Organization, in partnership with Ariadne Labs and Results for Development, was launched in 2015 with the aim of catalyzing improvements in primary health care (PHC) systems in 135 low- and middle-income countries (LMICs), in order to accelerate progress toward universal health coverage. Through more comprehensive and actionable measurement of quality PHC, the PHCPI stimulates peer learning among LMICs and informs decision makers to guide PHC investments and reforms. Instruments for performance assessment and improvement are in development; to date, a conceptual framework and 2 sets of performance indicators have been released. The PHCPI team developed the conceptual framework through literature reviews and consultations with an advisory committee of international experts. 
We generated 2 sets of performance indicators selected from a literature review of relevant indicators, cross-referenced against indicators available from international sources, and evaluated through 2 separate modified Delphi processes, consisting of online surveys and in-person facilitated discussions with experts. The PHCPI conceptual framework builds on the current understanding of PHC system performance through an expanded emphasis on the role of service delivery. The first set of performance indicators, 36 Vital Signs, facilitates comparisons across countries and over time. The second set, 56 Diagnostic Indicators, elucidates underlying drivers of performance. Key challenges include a lack of available data for several indicators and a lack of validated indicators for important dimensions of quality PHC. The availability of data is critical to assessing PHC performance, particularly patient experience and quality of care. The PHCPI will continue to develop and test additional performance assessment instruments, including composite indices and national performance dashboards. Through country engagement, the PHCPI will further refine its instruments and engage with governments to better design and finance primary health care reforms. © 2017 Milbank Memorial Fund.
The importance of quality control in validating concentrations ...
A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare the performance of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that proved unsuitable for these analytes are discussed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the performance of the six analytical methods used to measure the 174 CECs.
Scheffe, Richard D; Strum, Madeleine; Phillips, Sharon B; Thurman, James; Eyth, Alison; Fudge, Steve; Morris, Mark; Palma, Ted; Cook, Richard
2016-11-15
A hybrid air quality model has been developed and applied to estimate annual concentrations of 40 hazardous air pollutants (HAPs) across the continental United States (CONUS) to support the 2011 calendar year National Air Toxics Assessment (NATA). By combining a chemical transport model (CTM) with a Gaussian dispersion model, both reactive and nonreactive HAPs are accommodated across local to regional spatial scales, through a multiplicative technique designed to improve mass conservation relative to previous additive methods. The broad scope of multiple pollutants capturing regional to local spatial scale patterns across a vast spatial domain is precedent-setting within the air toxics community. The hybrid design exhibits improved performance relative to the stand-alone CTM and dispersion model. However, model performance varies widely across pollutant categories, and definitive quantitative performance assessment is hampered by a limited observation base and challenged by the multiple physical and chemical attributes of HAPs. Formaldehyde and acetaldehyde are the dominant HAP concentration and cancer risk drivers, characterized by strong regional signals associated with naturally emitted carbonyl precursors enhanced in urban transport corridors with strong mobile source sector emissions. The multiple-pollutant emission characteristics of combustion-dominated source sectors create largely similar concentration patterns across the majority of HAPs. However, reactive carbonyls exhibit significantly less spatial variability relative to nonreactive HAPs across the CONUS.
Follow-Up Study to Assess the Use and Performance of Household Filters in Zambia
Peletz, Rachel; Simuyandi, Michelo; Simunyama, Martin; Sarenje, Kelvin; Kelly, Paul; Clasen, Thomas
2013-01-01
Effective household water treatment can improve drinking water quality and prevent disease if used correctly and consistently over time. One year after completion of a randomized controlled study of water filters among households in Zambia with children < 2 years old and mothers who were human immunodeficiency virus (HIV)-positive, we conducted a follow-up study to assess use and performance of new filters distributed at the conclusion of the study; 90% of participating households met the criteria for current users, and 75% of participating households had stored water with lower levels of fecal contamination than source water. Microbiologically, the filters continued to perform well, removing an average of 99.0% of fecal indicator bacteria. Although this study provides some encouraging evidence about the potential to maintain high uptake and filter performance, even in the absence of regular household visits, additional research is necessary to assess whether these results can be achieved over longer periods and with larger populations. PMID:24100635
An assessment technique for computer-socket manufacturing
Sanders, Joan; Severance, Michael
2015-01-01
An assessment strategy is presented for testing the quality of carving and forming of individual computer aided manufacturing facilities. The strategy is potentially useful to facilities making sockets and companies marketing manufacturing equipment. To execute the strategy, an evaluator fabricates a collection of test models and sockets using the manufacturing suite under evaluation, and then measures their shapes using scanning equipment. Overall socket quality is assessed by comparing socket shapes with electronic file shapes. Then model shapes are compared with electronic file shapes to characterize carving performance. Socket shapes are compared with model shapes to characterize forming performance. The mean radial error (MRE), which is the average difference in radii between the two shapes being compared, provides insight into sizing quality. Inter-quartile range (IQR), the range of radial error for the best matched half of the points on the surfaces being compared, provides insight into shape quality. By determining MRE and IQR for carving and forming separately, the source(s) of socket shape error may be pinpointed. The developed strategy may provide a useful tool to the prosthetics community and industry to help identify problems and limitations in computer aided manufacturing and insight into appropriate modifications to overcome them. PMID:21938663
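The two error statistics described above can be sketched in a few lines. The following is an illustrative implementation only, assuming the two shapes being compared are sampled as radii at corresponding surface points; the function names and sample values are hypothetical, not from the original study:

```python
import numpy as np

def mean_radial_error(radii_a, radii_b):
    # MRE: average difference in radii between the two shapes
    # being compared; indicates sizing quality (systematic
    # over- or under-sizing).
    return float(np.mean(np.asarray(radii_a) - np.asarray(radii_b)))

def radial_iqr(radii_a, radii_b):
    # IQR: spread of radial error for the best-matched half of
    # the surface points (the middle 50% of the error
    # distribution); indicates shape quality.
    err = np.asarray(radii_a) - np.asarray(radii_b)
    q75, q25 = np.percentile(err, [75, 25])
    return float(q75 - q25)
```

Comparing socket against electronic file shape gives overall quality; model against file isolates carving error; socket against model isolates forming error, as in the assessment strategy above.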
Khan, Nasim A; Lombeida, Juan I; Singh, Manisha; Spencer, Horace J; Torralba, Karina D
2012-07-01
To assess the association of industry funding with the characteristics, outcome, and reported quality of randomized controlled trials (RCTs) of drug therapy for rheumatoid arthritis (RA). The Medline and Cochrane Central Register of Controlled Trials databases were searched to identify original RA drug therapy RCTs published in 2002-2003 and 2006-2007. Two reviewers independently assessed each RCT for the funding source, characteristics, outcome (positive [statistically significant result favoring experimental drug for the primary outcome] or not positive), and reporting of methodologic measures whose inadequate performance may have biased the assessment of treatment effect. RCTs that were registered at ClinicalTrials.gov and completed during the study years were assessed for publication bias. Of the 103 eligible RCTs identified, 58 (56.3%) were funded by industry, 19 (18.4%) were funded by nonprofit sources, 6 (5.8%) had mixed funding, and funding for 20 (19.4%) was not specified. Industry-funded RCTs had significantly more study centers and subjects, while nonprofit agency-funded RCTs had longer duration and were more likely to study different treatment strategies. Outcome could be assessed for 86 (83.5%) of the 103 RCTs studied. The funding source was not associated with a higher likelihood of positive outcomes favoring the sponsored experimental drug (75.5% of industry-funded RCTs had a positive outcome, compared with 68.8% of non-industry-funded RCTs, 40% of RCTs with mixed funding, and 81.2% of RCTs for which funding was not specified). Industry-funded RCTs showed a trend toward a higher likelihood of nonpublication (P=0.093). Industry-funded RCTs were more frequently associated with double-blinding, an adequate description of participant flow, and performance of an intent-to-treat analysis. 
Industry funding was not associated with a higher likelihood of positive outcomes of published RCTs of drug therapy for RA, and industry-funded RCTs performed significantly better than non-industry-funded RCTs in terms of reporting the use of some key methodologic quality measures. Copyright © 2012 by the American College of Rheumatology.
Composite analysis E-area vaults and saltstone disposal facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, J.R.
1997-09-01
This report documents the Composite Analysis (CA) performed on the two active Savannah River Site (SRS) low-level radioactive waste (LLW) disposal facilities. The facilities are the Z-Area Saltstone Disposal Facility and the E-Area Vaults (EAV) Disposal Facility. The analysis calculated potential releases to the environment from all sources of residual radioactive material expected to remain in the General Separations Area (GSA). The GSA is the central part of SRS and contains all of the waste disposal facilities, chemical separations facilities and associated high-level waste storage facilities as well as numerous other sources of radioactive material. The analysis considered 114 potential sources of radioactive material containing 115 radionuclides. The results of the CA clearly indicate that continued disposal of low-level waste in the saltstone and EAV facilities, consistent with their respective radiological performance assessments, will have no adverse impact on future members of the public.
Oh, Deborah M; Kim, Joshua M; Garcia, Raymond E; Krilowicz, Beverly L
2005-06-01
There is increasing pressure, both from institutions central to the national scientific mission and from regional and national accrediting agencies, on natural sciences faculty to move beyond course examinations as measures of student performance and to instead develop and use reliable and valid authentic assessment measures for both individual courses and for degree-granting programs. We report here on a capstone course developed by two natural sciences departments, Biological Sciences and Chemistry/Biochemistry, which engages students in an important culminating experience, requiring synthesis of skills and knowledge developed throughout the program while providing the departments with important assessment information for use in program improvement. The student work products produced in the course, a written grant proposal, and an oral summary of the proposal, provide a rich source of data regarding student performance on an authentic assessment task. The validity and reliability of the instruments and the resulting student performance data were demonstrated by collaborative review by content experts and a variety of statistical measures of interrater reliability, including percentage agreement, intraclass correlations, and generalizability coefficients. The high interrater reliability reported when the assessment instruments were used for the first time by a group of external evaluators suggests that the assessment process and instruments reported here will be easily adopted by other natural science faculty.
Photometric redshifts for the next generation of deep radio continuum surveys - I. Template fitting
NASA Astrophysics Data System (ADS)
Duncan, Kenneth J.; Brown, Michael J. I.; Williams, Wendy L.; Best, Philip N.; Buat, Veronique; Burgarella, Denis; Jarvis, Matt J.; Małek, Katarzyna; Oliver, S. J.; Röttgering, Huub J. A.; Smith, Daniel J. B.
2018-01-01
We present a study of photometric redshift performance for galaxies and active galactic nuclei detected in deep radio continuum surveys. Using two multiwavelength data sets, over the NOAO Deep Wide Field Survey Boötes and COSMOS fields, we assess photometric redshift (photo-z) performance for a sample of ∼4500 radio continuum sources with spectroscopic redshifts relative to those of ∼63 000 non-radio-detected sources in the same fields. We investigate the performance of three photometric redshift template sets as a function of redshift, radio luminosity and infrared/X-ray properties. We find that no single template library is able to provide the best performance across all subsets of the radio-detected population, with variation in the optimum template set both between subsets and between fields. Through a hierarchical Bayesian combination of the photo-z estimates from all three template sets, we are able to produce a consensus photo-z estimate that equals or improves upon the performance of any individual template set.
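The consensus step can be illustrated with a simplified sketch of a hierarchical Bayesian combination: each template set's redshift posterior is down-weighted by a probability that it is unreliable, and the tempered posteriors are multiplied. The `f_bad` nuisance parameter and the uniform-grid normalization below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def combine_photoz(pdfs, z_grid, f_bad=0.05):
    # Combine several template-based P(z) estimates into one
    # consensus posterior. Each estimate is mixed with a flat
    # distribution with weight f_bad (the chance the template
    # fit is wrong), then the mixtures are multiplied.
    dz = z_grid[1] - z_grid[0]                  # uniform grid assumed
    flat = np.full_like(z_grid, 1.0 / (dz * len(z_grid)))
    combined = np.ones_like(z_grid)
    for p in pdfs:
        p = p / (p.sum() * dz)                  # normalize each P(z)
        combined *= f_bad * flat + (1.0 - f_bad) * p
    return combined / (combined.sum() * dz)     # renormalize
```

Multiplying mixtures rather than raw posteriors keeps one discrepant template set from vetoing a redshift the other sets agree on, which is the motivation for a consensus estimate that equals or improves on any single library.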
The Comprehensive Care Project: Measuring Physician Performance in Ambulatory Practice
Holmboe, Eric S; Weng, Weifeng; Arnold, Gerald K; Kaplan, Sherrie H; Normand, Sharon-Lise; Greenfield, Sheldon; Hood, Sarah; Lipner, Rebecca S
2010-01-01
Objective To investigate the feasibility, reliability, and validity of comprehensively assessing physician-level performance in ambulatory practice. Data Sources/Study Setting Ambulatory-based general internists in 13 states participated in the assessment. Study Design We assessed physician-level performance, adjusted for patient factors, on 46 individual measures, an overall composite measure, and composite measures for chronic, acute, and preventive care. Between- versus within-physician variation was quantified by intraclass correlation coefficients (ICC). External validity was assessed by correlating performance on a certification exam. Data Collection/Extraction Methods Medical records for 236 physicians were audited for seven chronic and four acute care conditions, and six age- and gender-appropriate preventive services. Principal Findings Performance on the individual and composite measures varied substantially within (range 5–86 percent compliance on 46 measures) and between physicians (ICC range 0.12–0.88). Reliabilities for the composite measures were robust: 0.88 for chronic care and 0.87 for preventive services. Higher certification exam scores were associated with better performance on the overall (r = 0.19; p <.01), chronic care (r = 0.14, p = .04), and preventive services composites (r = 0.17, p = .01). Conclusions Our results suggest that reliable and valid comprehensive assessment of the quality of chronic and preventive care can be achieved by creating composite measures and by sampling feasible numbers of patients for each condition. PMID:20819110
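The between- versus within-physician comparison reported above rests on the intraclass correlation coefficient. A minimal one-way random-effects ICC can be sketched as follows; this is the generic balanced-design formula, not the study's patient-adjusted model:

```python
import numpy as np

def icc_oneway(scores):
    # scores: rows = physicians, columns = patient-level
    # measurements (equal numbers per physician assumed).
    # Returns ICC(1,1): the fraction of total variance due to
    # between-physician differences.
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    group_means = scores.mean(axis=1)
    grand_mean = scores.mean()
    msb = k * np.sum((group_means - grand_mean) ** 2) / (n - 1)
    msw = np.sum((scores - group_means[:, None]) ** 2) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

Values near 1 indicate that a measure discriminates strongly between physicians; values near 0 indicate that most variation lies within a physician's own patients.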
Rudnick, Paul A.; Clauser, Karl R.; Kilpatrick, Lisa E.; Tchekhovskoi, Dmitrii V.; Neta, Pedatsur; Blonder, Nikša; Billheimer, Dean D.; Blackman, Ronald K.; Bunk, David M.; Cardasis, Helene L.; Ham, Amy-Joan L.; Jaffe, Jacob D.; Kinsinger, Christopher R.; Mesri, Mehdi; Neubert, Thomas A.; Schilling, Birgit; Tabb, David L.; Tegeler, Tony J.; Vega-Montoto, Lorenzo; Variyath, Asokan Mulayath; Wang, Mu; Wang, Pei; Whiteaker, Jeffrey R.; Zimmerman, Lisa J.; Carr, Steven A.; Fisher, Susan J.; Gibson, Bradford W.; Paulovich, Amanda G.; Regnier, Fred E.; Rodriguez, Henry; Spiegelman, Cliff; Tempst, Paul; Liebler, Daniel C.; Stein, Stephen E.
2010-01-01
A major unmet need in LC-MS/MS-based proteomics analyses is a set of tools for quantitative assessment of system performance and evaluation of technical variability. Here we describe 46 system performance metrics for monitoring chromatographic performance, electrospray source stability, MS1 and MS2 signals, dynamic sampling of ions for MS/MS, and peptide identification. Applied to data sets from replicate LC-MS/MS analyses, these metrics displayed consistent, reasonable responses to controlled perturbations. The metrics typically displayed variations less than 10% and thus can reveal even subtle differences in performance of system components. Analyses of data from interlaboratory studies conducted under a common standard operating procedure identified outlier data and provided clues to specific causes. Moreover, interlaboratory variation reflected by the metrics indicates which system components vary the most between laboratories. Application of these metrics enables rational, quantitative quality assessment for proteomics and other LC-MS/MS analytical applications. PMID:19837981
NASA Technical Reports Server (NTRS)
Haroutunian, Vahe
1995-01-01
This viewgraph presentation provides a brief review of two-equation eddy-viscosity models (TEM's) from the perspective of applied CFD. It provides objective assessment of both well-known and newer models, compares model predictions from various TEM's with experiments, identifies sources of modeling error and gives historical perspective of their effects on model performance and assessment, and recommends directions for future research on TEM's.
NASA Astrophysics Data System (ADS)
Thieme, J.; Hurowitz, J. A.; Schoonen, M. A.; Fogelqvist, E.; Gregerson, J.; Farley, K. A.; Sherman, S.; Hill, J.
2018-04-01
NSLS-II at BNL provides a unique and critical capability to perform assessments of the elemental composition and the chemical state of Mars returned samples using synchrotron radiation X-ray fluorescence imaging and X-ray absorption spectroscopy.
Modeling emissions of volatile organic compounds from silage storages and feed lanes
USDA-ARS?s Scientific Manuscript database
An initial volatile organic compound (VOC) emission model for silage sources, developed using experimental data from previous studies, was incorporated into the Integrated Farm System Model (IFSM), a whole-farm simulation model used to assess the performance, environmental impacts, and economics of ...
The Unintended Consequences of Grading Teaching
ERIC Educational Resources Information Center
Smith, Holly
2012-01-01
This article examines the possibility of a "Teaching Assessment Exercise" and attempts to quantify teaching quality as part of performance management schemes for academics. The primary sources of data are identified as student evaluation of teaching (SET) and peer observation of teaching (POT). The conceptual and empirical issues in…
NASA Astrophysics Data System (ADS)
Panulla, Brian J.; More, Loretta D.; Shumaker, Wade R.; Jones, Michael D.; Hooper, Robert; Vernon, Jeffrey M.; Aungst, Stanley G.
2009-05-01
Rapid improvements in communications infrastructure and sophistication of commercial hand-held devices provide a major new source of information for assessing extreme situations such as environmental crises. In particular, ad hoc collections of humans can act as "soft sensors" to augment data collected by traditional sensors in a net-centric environment (in effect, "crowd-sourcing" observational data). A need exists to understand how to task such soft sensors, characterize their performance and fuse the data with traditional data sources. In order to quantitatively study such situations, as well as study distributed decision-making, we have developed an Extreme Events Laboratory (EEL) at The Pennsylvania State University. This facility provides a network-centric, collaborative situation assessment and decision-making capability by supporting experiments involving human observers, distributed decision making and cognition, and crisis management. The EEL spans the information chain from energy detection via sensors, human observations, signal and image processing, pattern recognition, statistical estimation, multi-sensor data fusion, visualization and analytics, and modeling and simulation. The EEL command center combines COTS and custom collaboration tools in innovative ways, providing capabilities such as geo-spatial visualization and dynamic mash-ups of multiple data sources. This paper describes the EEL and several on-going human-in-the-loop experiments aimed at understanding the new collective observation and analysis landscape.
NASA Technical Reports Server (NTRS)
Fischer, E.
1979-01-01
The pilot's ability to accurately extract information from either one or both of two superimposed sources of information was determined. Static, aerial, color 35 mm slides of external runway environments and slides of corresponding static head-up display (HUD) symbology were used as the sources. A three channel tachistoscope was utilized to show either the HUD alone, the scene alone, or the two slides superimposed. Cognitive performance of the pilots was assessed by determining the percentage of correct answers given to two HUD related questions, two scene related questions, or one HUD and one scene related question.
Miyanishi, Tomohiro; Sumiyoshi, Tomiki; Higuchi, Yuko; Seo, Tomonori; Suzuki, Michio
2013-01-01
Patients with schizophrenia exhibit cognitive decline from the early phase of the illness. Mismatch negativity (MMN) has been shown to be associated with cognitive function. We investigated the current source density of duration mismatch negativity (dMMN), by using low-resolution brain electromagnetic tomography (LORETA), and neuropsychological performance in subjects with early schizophrenia. Data were obtained from 20 patients meeting DSM-IV criteria for schizophrenia or schizophreniform disorder, and 20 healthy control (HC) subjects. An auditory odd-ball paradigm was used to measure dMMN. Neuropsychological performance was evaluated by the Brief Assessment of Cognition in Schizophrenia, Japanese version (BACS-J). Patients showed smaller dMMN amplitudes than those in the HC subjects. LORETA current density for dMMN was significantly lower in patients compared to HC subjects, especially in the temporal lobes. dMMN current density in the frontal lobe was positively correlated with working memory performance in patients. This is the first study to identify brain regions showing smaller dMMN current density in early schizophrenia. Further, poor working memory was associated with decreased dMMN current density in patients. These results are likely to help understand the neural basis for cognitive impairment of schizophrenia.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaplan, D.; Powell, B.; Barber, K.
The SRNL Radiological Field Lysimeter Experiment (RadFLEx) is a one-of-a-kind test bed facility designed to study radionuclide geochemical processes in the Savannah River Site (SRS) vadose zone at a larger spatial scale (from grams to tens of kilograms of sediment) and temporal scale (from months to a decade) than is readily afforded through laboratory studies. RadFLEx is a decade-long project that was initiated on July 5, 2012 and is funded by six different sources. The objective of this status report is as follows: 1) to report findings to date that have an impact on SRS performance assessment (PA) calculations, and 2) to provide performance metrics of the RadFLEx program. The PA results are focused on measurements of transport parameters, such as distribution coefficients (Kd values), solubility, and unsaturated flow values. As this is an interim report, additional information from subsequent research may influence our interpretation of current results. Research related to basic understanding of radionuclide geochemistry in these vadose zone soils and other source terms is not described here but is referenced for the interested reader.
BeadArray Expression Analysis Using Bioconductor
Ritchie, Matthew E.; Dunning, Mark J.; Smith, Mike L.; Shi, Wei; Lynch, Andy G.
2011-01-01
Illumina whole-genome expression BeadArrays are a popular choice in gene profiling studies. Aside from the vendor-provided software tools for analyzing BeadArray expression data (GenomeStudio/BeadStudio), there exists a comprehensive set of open-source analysis tools in the Bioconductor project, many of which have been tailored to exploit the unique properties of this platform. In this article, we explore a number of these software packages and demonstrate how to perform a complete analysis of BeadArray data in various formats. The key steps of importing data, performing quality assessments, preprocessing, and annotation in the common setting of assessing differential expression in designed experiments will be covered. PMID:22144879
Boncyk, Wayne C.; Markham, Brian L.; Barker, John L.; Helder, Dennis
1996-01-01
The Landsat-7 Image Assessment System (IAS), part of the Landsat-7 Ground System, will calibrate and evaluate the radiometric and geometric performance of the Enhanced Thematic Mapper Plus (ETM+) instrument. The IAS incorporates new instrument radiometric artifact correction and absolute radiometric calibration techniques which overcome some limitations to calibration accuracy inherent in historical calibration methods. Knowledge of ETM+ instrument characteristics gleaned from analysis of archival Thematic Mapper in-flight data and from ETM+ prelaunch tests allows the determination and quantification of the sources of instrument artifacts. This a priori knowledge will be utilized in IAS algorithms designed to minimize the effects of the noise sources before calibration, in both ETM+ image and calibration data.
Assessment on security system of radioactive sources used in hospitals of Thailand
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jitbanjong, Petchara, E-mail: petcharajit@gmail.com; Wongsawaeng, Doonyapong
Unsecured radioactive sources have caused deaths and serious injuries in many parts of the world. In Thailand, there are 17 hospitals that use teletherapy with cobalt-60 radioactive sources. They need to be secured in order to prevent unauthorized removal, sabotage, and terrorists from using such materials in a radiological weapon. The security system of radioactive sources in Thailand is regulated by the Office of Atoms for Peace in compliance with the Global Threat Reduction Initiative (GTRI), U.S. DOE, which has been implemented since 2010. This study aims to perform an assessment of the security system of radioactive sources used in hospitals in Thailand, and the results can be used as recommended baseline data for development or improvement of hospital security systems for radioactive sources at the national regulatory and policy levels. Results from questionnaires reveal that in 11 out of 17 hospitals (64.70%), there were few differences between hospitals that had installed the security system and those that had not. Also, personnel working with radioactive sources did not clearly understand the nuclear security law. Thus, government organizations should be encouraged to arrange trainings on nuclear security to increase the level of understanding. In the future, it is recommended that the responsible government organization issue a minimum requirement of nuclear security for every medical facility using radioactive sources.
Gosens, Ilse; Delmaar, Christiaan J E; ter Burg, Wouter; de Heer, Cees; Schuur, A Gerlienke
2014-01-01
In the risk assessment of chemical substances, aggregation of exposure to a substance from different sources via different pathways is not common practice. Focusing the exposure assessment on a substance from a single source can lead to a significant underestimation of the risk. To gain more insight into how to perform an aggregate exposure assessment, we applied a deterministic (tier 1) and a person-oriented probabilistic approach (tier 2) to exposure to the four most common parabens through personal care products in children between 0 and 3 years old. Following a deterministic approach, a worst-case exposure estimate is calculated for methyl-, ethyl-, propyl- and butylparaben. As an illustration for risk assessment, Margins of Exposure (MoE) are calculated. These are 991 and 4966 for methyl- and ethylparaben, and 8 and 10 for propyl- and butylparaben, respectively. In tier 2, more detailed information on product use was obtained from a small survey of consumer product use. A probabilistic exposure assessment is performed to estimate the variability and uncertainty of exposure in a population. Results show that the internal exposure for each paraben is below the level determined in tier 1. However, for propyl- and butylparaben, the percentage of the population with an MoE below the presumed safe value of 100 is 13% and 7%, respectively. In conclusion, a tier 1 approach can be performed using simple equations and default point estimates, and serves as a starting point for exposure and risk assessment. If refinement is warranted, the more data-demanding person-oriented probabilistic approach should be used. This probabilistic approach results in a more realistic exposure estimate, including the uncertainty, and allows the main drivers of exposure to be determined. Furthermore, it allows estimation of the percentage of the population for which the exposure is likely to be above a specific value. PMID:23801276
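The tier-1 deterministic step can be illustrated with simple equations: aggregate exposure is a worst-case sum over all products containing the substance, and the Margin of Exposure divides a toxicological reference dose by that exposure. The sketch below uses hypothetical product parameters and a hypothetical NOAEL; it is not the study's actual data:

```python
def aggregate_daily_exposure(products, body_weight_kg):
    # Tier-1 worst case: sum the systemic dose over every product
    # containing the substance, in mg per kg body weight per day.
    total_mg = 0.0
    for p in products:
        total_mg += (p["amount_g_per_day"] * 1000.0  # g -> mg
                     * p["weight_fraction"]          # substance fraction
                     * p["absorption_fraction"])     # fraction absorbed
    return total_mg / body_weight_kg

def margin_of_exposure(noael_mg_per_kg_day, exposure_mg_per_kg_day):
    # MoE = toxicological reference dose / estimated exposure;
    # an MoE above ~100 is conventionally taken as adequate.
    return noael_mg_per_kg_day / exposure_mg_per_kg_day
```

For example, a hypothetical lotion applied at 0.5 g/day containing 0.4% of the substance, with full dermal absorption, gives a 10 kg child an exposure of 0.2 mg/kg bw/day; against a hypothetical NOAEL of 100 mg/kg bw/day, that is an MoE of 500.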
Assessment & Commitment Tracking System (ACTS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryant, Robert A.; Childs, Teresa A.; Miller, Michael A.
2004-12-20
The ACTS computer code provides a centralized tool for planning and scheduling assessments, tracking and managing actions associated with assessments or that result from an event or condition, and "mining" data for reporting and analyzing information for improving performance. The ACTS application is designed to work with the MS SQL database management system. All database interfaces are written in SQL. The following software is used to develop and support the ACTS application: Cold Fusion, HTML, JavaScript, Quest TOAD, Microsoft Visual Source Safe (VSS), HTML Mailer for sending email, Microsoft SQL, and Microsoft Internet Information Server.
Debris/ice/TPS assessment and photographic analysis for Shuttle Mission STS-43
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Higginbotham, Scott A.; Davis, James Bradley
1991-01-01
A debris/ice/Thermal Protection System (TPS) assessment and photographic analysis was conducted for Space Shuttle Mission STS-43. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/frost conditions on the External Tank (ET) were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and to evaluate potential vehicle damage and/or in-flight anomalies.
Debris/Ice/TPS Assessment and Photographic Analysis for Shuttle Mission STS-40
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Higginbotham, Scott A.; Davis, J. Bradley
1991-01-01
A debris, ice, Thermal Protection System (TPS) assessment and photographic analysis for Space Shuttle Mission STS-40 was conducted. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice and frost conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice and debris sources and to evaluate potential vehicle damage and/or in-flight anomalies.
NASA Astrophysics Data System (ADS)
Tsai, Tsung-Han; Ahsen, Osman O.; Lee, Hsiang-Chieh; Liang, Kaicheng; Giacomelli, Michael G.; Potsaid, Benjamin M.; Tao, Yuankai K.; Jayaraman, Vijaysekhar; Kraus, Martin F.; Hornegger, Joachim; Figueiredo, Marisa; Huang, Qin; Mashimo, Hiroshi; Cable, Alex E.; Fujimoto, James G.
2014-03-01
We developed an ultrahigh speed endoscopic swept source optical coherence tomography (OCT) system for clinical gastroenterology using a vertical-cavity surface-emitting laser (VCSEL) and micromotor based imaging catheter, which provided an imaging speed of 600 kHz axial scan rate and 8 μm axial resolution in tissue. The micromotor catheter was 3.2 mm in diameter and could be introduced through the 3.7 mm accessory port of an endoscope. Imaging was performed at 400 frames per second with an 8 μm spot size using a pullback to generate volumetric data over 16 mm with a pixel spacing of 5 μm in the longitudinal direction. Three-dimensional OCT (3D-OCT) imaging was performed in patients with a cross section of pathologies undergoing standard upper and lower endoscopy at the Veterans Affairs Boston Healthcare System (VABHS). Patients with Barrett's esophagus, dysplasia, and inflammatory bowel disease were imaged. The use of distally actuated imaging catheters allowed OCT imaging with more flexibility such as volumetric imaging in the terminal ileum and the assessment of the hiatal hernia using retroflex imaging. The high rotational stability of the micromotor enabled 3D volumetric imaging with micron scale volumetric accuracy for both en face and cross-sectional imaging. The ability to perform 3D OCT imaging in the GI tract with microscopic accuracy should enable a wide range of studies to investigate the ability of OCT to detect pathology as well as assess treatment response.
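As a quick consistency check on the scan geometry quoted above: a 16 mm pullback sampled every 5 μm in the longitudinal direction at 400 frames per second implies the frame count and pullback duration below. The 8 s duration is derived here, not stated in the abstract.

```python
# Derive frame count and pullback duration from the quoted scan geometry.
pullback_mm = 16.0     # pullback length
pitch_um = 5.0         # longitudinal pixel spacing
frame_rate_hz = 400.0  # frames per second

n_frames = pullback_mm * 1000 / pitch_um  # cross-sectional frames in the volume
duration_s = n_frames / frame_rate_hz     # time to complete the pullback

print(int(n_frames), duration_s)  # 3200 frames, 8.0 s
```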
Wise, Daniel R.; Johnson, Henry M.
2013-01-01
The watershed model SPARROW (Spatially Referenced Regressions on Watershed attributes) was used to estimate mean annual surface-water nutrient conditions (total nitrogen and total phosphorus) and to identify important nutrient sources in catchments of the Pacific Northwest region of the United States for 2002. Model-estimated nutrient yields were generally higher in catchments on the wetter, western side of the Cascade Range than in catchments on the drier, eastern side. The largest source of locally generated total nitrogen stream load in most catchments was runoff from forestland, whereas the largest source of locally generated total phosphorus stream load in most catchments was either geologic material or livestock manure (primarily from grazing livestock). However, the highest total nitrogen and total phosphorus yields were predicted in the relatively small number of catchments where urban sources were the largest contributor to local stream load. Two examples are presented that show how SPARROW results can be applied to large rivers—the relative contribution of different nutrient sources to the total nitrogen load in the Willamette River and the total phosphorus load in the Snake River. The results from this study provided an understanding of the regional patterns in surface-water nutrient conditions and should be useful to researchers and water-quality managers performing local nutrient assessments.
Khanduja, P Kristina; Bould, M Dylan; Naik, Viren N; Hladkowicz, Emily; Boet, Sylvain
2015-01-01
We systematically reviewed the effectiveness of simulation-based education targeting independently practicing qualified physicians in acute care specialties. We also describe how simulation is used for performance assessment in this population. Data sources included MEDLINE, Embase, the Cochrane Database of Systematic Reviews, the Cochrane CENTRAL Database of Controlled Trials, and the National Health Service Economic Evaluation Database. The last date of search was January 31, 2013. All original research describing simulation-based education for independently practicing physicians in anesthesiology, critical care, and emergency medicine was reviewed. Data analysis was performed in duplicate, with further review by a third author in cases of disagreement until consensus was reached. Data extraction focused on effectiveness according to Kirkpatrick's model. For simulation-based performance assessment, tool characteristics and sources of validity evidence were also collated. Of 39 studies identified, 30 focused on the effectiveness of simulation-based education and nine evaluated the validity of simulation-based assessment. Thirteen studies (30%) targeted the lower levels of Kirkpatrick's hierarchy, with reliance on self-reporting. Simulation was unanimously described as a positive learning experience with perceived impact on clinical practice. Of the 17 remaining studies, 10 used a single-group or "no intervention comparison group" design. The majority (n = 17; 44%) were able to demonstrate both immediate and sustained improvements in educational outcomes. Nine studies reported the psychometric properties of simulation-based performance assessment as their sole objective. These predominantly recruited independent practitioners as a convenience sample to establish whether the tool could discriminate between experienced and inexperienced operators, and concentrated on a single aspect of validity evidence.
Simulation is perceived as a positive learning experience with limited evidence to support improved learning. Future research should focus on the optimal modality and frequency of exposure, quality of assessment tools and on the impact of simulation-based education beyond the individuals toward improved patient care.
Testing earthquake source inversion methodologies
Page, M.; Mai, P.M.; Schorlemmer, D.
2011-01-01
Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010. Earthquake source inversions are nowadays routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, the assumed fault geometry and velocity structure, and the chosen rupture parameterization, it is not clear which features of these source models are robust. An improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.
Feasibility of approaches combining sensor and source features in brain-computer interface.
Ahn, Minkyu; Hong, Jun Hee; Jun, Sung Chan
2012-02-15
Brain-computer interface (BCI) provides a new channel for communication between the brain and computers through brain signals. Cost-effective EEG provides good temporal resolution, but its spatial resolution is poor and sensor information is blurred by inherent noise. To overcome these issues, spatial filtering and feature extraction techniques have been developed. Source imaging, the transformation of sensor signals into the source space through a source localizer, has gained attention as a new approach for BCI. It has been reported that source imaging yields some improvement in BCI performance. However, there has been no thorough investigation of how source imaging information overlaps with, and is complementary to, sensor information. We hypothesize that information from the source space may overlap with, as well as be exclusive of, information from the sensor space. If this hypothesis is true, more information can be extracted from the sensor and source spaces together, contributing to more accurate BCI systems. In this work, features from each space (sensor or source) and two strategies combining sensor and source features are assessed. The information distribution among the sensor, source, and combined spaces is discussed through a Venn diagram for 18 motor imagery datasets. An additional 5 motor imagery datasets from the BCI Competition III site were examined. The results showed that the addition of source information yielded about 3.8% classification improvement for the 18 motor imagery datasets and an average accuracy of 75.56% for the BCI Competition data. Our proposed approach is promising, and improved performance may be possible with a better head model. Copyright © 2011 Elsevier B.V. All rights reserved.
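The combined-feature-space idea can be illustrated with a toy sketch: features from a (here simulated) sensor space and source space are concatenated and classified. The data and the simple nearest-centroid classifier are illustrative only; the paper's actual pipeline (EEG preprocessing, source localization, CSP-style feature extraction) is not reproduced.

```python
# Toy illustration of combining sensor-space and source-space features.
# Simulated data; a nearest-centroid classifier stands in for the real pipeline.
import numpy as np

rng = np.random.default_rng(0)

def nearest_centroid_fit(X, y):
    # one centroid per class
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_centroid_predict(model, X):
    classes = list(model)
    dists = np.stack([np.linalg.norm(X - model[c], axis=1) for c in classes])
    return np.array(classes)[dists.argmin(axis=0)]

n = 200
y = rng.integers(0, 2, n)
# Each space carries one informative dimension plus noise; the informative
# dimensions differ, i.e. each space holds some exclusive class information.
sensor = rng.normal(0, 1, (n, 8)); sensor[:, 0] += y
source = rng.normal(0, 1, (n, 8)); source[:, 1] += y
combined = np.hstack([sensor, source])  # combined feature space

accs = {}
for name, X in [("sensor", sensor), ("source", source), ("combined", combined)]:
    model = nearest_centroid_fit(X[:100], y[:100])
    accs[name] = float((nearest_centroid_predict(model, X[100:]) == y[100:]).mean())
print(accs)
```

Because the two spaces carry partly exclusive information, concatenation typically classifies better than either space alone, which is the qualitative effect the paper reports.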
Kessels, Roy P C; Kortrijk, Hans E; Wester, Arie J; Nys, Gudrun M S
2008-04-01
Confabulation behavior is common in patients with Korsakoff's syndrome. A distinction can be made between spontaneous and provoked confabulations, which may have different underlying cognitive mechanisms. Provoked confabulations may be related to intrusions on memory tests, whereas spontaneous confabulations may be due to executive dysfunction or a source memory deficit. In 19 chronic Korsakoff patients, spontaneous confabulations were quantified by third-party rating (Likert scale). Provoked confabulations were assessed using the Dalla Barba Confabulation Battery. Furthermore, executive function was assessed using an extensive neuropsychological battery. False memories (i.e. intrusions) and source memory were measured using two parallel versions of a word-list learning paradigm (a modification of the Rey Auditory Verbal Learning Test). There were deficits in source memory, in which patients incorrectly assigned previously learned words to an incorrect word list. Korsakoff patients also had extensive executive deficits, but no relationship was found between the severity of these deficits and the severity of confabulation or of intrusions on a memory task. The present findings provide evidence for a dissociation between spontaneous confabulation, provoked confabulation and false memories.
Hybrid Wing Body Aircraft System Noise Assessment with Propulsion Airframe Aeroacoustic Experiments
NASA Technical Reports Server (NTRS)
Thomas, Russell H.; Burley, Casey L.; Olson, Erik D.
2010-01-01
A system noise assessment of a hybrid wing body configuration was performed using NASA's best available aircraft models, engine model, and system noise assessment method. A propulsion airframe aeroacoustic effects experimental database for key noise sources and interaction effects was used to provide data directly in the noise assessment where prediction methods are inadequate. NASA engine and aircraft system models were created to define the hybrid wing body aircraft concept as a twin engine aircraft with a 7500 nautical mile mission. The engines were modeled as existing-technology high bypass ratio turbofans. The baseline hybrid wing body aircraft was assessed at 22 dB cumulative below the FAA Stage 4 certification level. To determine the potential for noise reduction with relatively near term technologies, seven other configurations were assessed, beginning with moving the engines two fan nozzle diameters upstream of the trailing edge and then adding technologies for reduction of the highest noise sources. Aft radiated noise was expected to be the most challenging to reduce; therefore, the experimental database focused on jet nozzle and pylon configurations that could reduce jet noise through a combination of source reduction and shielding effectiveness. The best configuration for reduction of jet noise used state-of-the-art technology chevrons with a pylon above the engine in the crown position. This configuration resulted in jet source noise reduction, favorable azimuthal directivity, and noise source relocation upstream, where it is more effectively shielded by the limited airframe surface, as well as additional fan noise attenuation from an acoustic liner on the crown pylon internal surfaces. Vertical and elevon surfaces were also assessed to add shielding area. Elevon deflection above the trailing edge showed a small additional noise reduction, whereas vertical surfaces resulted in a slight noise increase.
With the effects of the configurations from the database included, the best available noise reduction was 40 dB cumulative. Projected effects from additional technologies were assessed for an advanced noise reduction configuration including landing gear fairings and advanced pylon and chevron nozzles. Incorporating the three additional technology improvements, the aircraft noise is projected at 42.4 dB cumulative below the Stage 4 level.
Functional performance of pyrovalves
NASA Technical Reports Server (NTRS)
Bement, Laurence J.
1996-01-01
Following several flight and ground test failures of spacecraft systems using single-shot, 'normally closed' pyrotechnically actuated valves (pyrovalves), a Government/Industry cooperative program was initiated to assess the functional performance of five qualified designs. The goal of the program was to provide information on the functional performance of pyrovalves to allow users the opportunity to improve procurement requirements. Specific objectives included the demonstration of performance test methods, the measurement of 'blowby' (the passage of gases from the pyrotechnic energy source around the activating piston into the valve's fluid path), and the quantification of functional margins for each design. Experiments were conducted at NASA's Langley Research Center on several units of each of the five valve designs. The test methods used for this program measured the forces and energies required to actuate the valves, as well as the energies and the pressures (where possible) delivered by the pyrotechnic sources. Functional performance ranged widely among the designs. Blowby could not be prevented by O-ring seals; metal-to-metal seals were effective. Functional margin was determined by dividing the energy delivered by the pyrotechnic sources in excess of that required to accomplish the function by the energy required for that function. Two of the five designs had inadequate functional margins with the pyrotechnic cartridges evaluated.
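The functional-margin definition in the abstract reduces to a one-line formula: the energy delivered in excess of that required, divided by the energy required. The values below are illustrative, not measured data from the program.

```python
# Functional margin as defined in the abstract:
# (energy delivered - energy required) / energy required.
# The example values are invented for illustration.

def functional_margin(energy_delivered_j, energy_required_j):
    return (energy_delivered_j - energy_required_j) / energy_required_j

# A valve needing 5 J to actuate, driven by a cartridge delivering 20 J:
print(functional_margin(20.0, 5.0))  # margin of 3.0, i.e. 300% excess energy
```

A margin near zero means the cartridge barely delivers the actuation energy, which is the "inadequate functional margin" condition the study flags for two of the five designs.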
NASA Astrophysics Data System (ADS)
Chen, Shimon; Yuval; Broday, David M.
2018-01-01
The Optimized Dispersion Model (ODM) is uniquely capable of incorporating emission estimates, ambient air quality monitoring data and meteorology to provide reliable high-resolution (in both time and space) air quality estimates using non-linear regression. Until now, however, it could not describe the effects of emissions from elevated sources. We formulated an additional term to extend the ODM so that these sources can be accounted for, and implemented it in modeling the fine spatiotemporal patterns of ambient NOx concentrations over the coastal plain of Israel. The diurnal and seasonal variation in the contribution of industry to ambient NOx is presented, as well as its spatial features. Although industrial stacks are responsible for 88% of the NOx emissions in the study area, their contribution to ambient NOx levels is generally about 2%, with a maximal upper bound of 27%. Meteorology plays a major role in this source allocation, with the highest impact of industry in the summer months, when the wind blows inland past the coastal stacks and vertical mixing is substantial. The new ODM outperforms both Inverse-Distance-Weighting (IDW) interpolation and a previous ODM version in predicting ambient NOx concentrations. The performance of the new model is thoroughly assessed.
NASA Astrophysics Data System (ADS)
Ding, Lei; Lai, Yuan; He, Bin
2005-01-01
It is important to localize neural sources from scalp-recorded EEG. Low-resolution brain electromagnetic tomography (LORETA) has received considerable attention for localizing brain electrical sources. However, most such efforts have used spherical head models to represent the head volume conductor. Investigating the performance of LORETA in a realistic-geometry head model, as compared with the spherical model, provides useful information for interpreting data obtained with the spherical head model. The performance of LORETA was evaluated by means of computer simulations. The boundary element method was used to solve the forward problem. A three-shell realistic-geometry (RG) head model was constructed from MRI scans of a human subject. Dipole source configurations of a single dipole located at different regions of the brain with varying depth were used to assess the performance of LORETA in different regions of the brain. A three-sphere head model was also used to approximate the RG head model; similar simulations were performed, and the results were compared with those of the RG-LORETA with reference to the locations of the simulated sources. Multi-source localizations are discussed, with examples given in the RG head model. Localization errors employing the spherical LORETA, with reference to the source locations within the realistic-geometry head, were about 20-30 mm for the four brain regions evaluated: frontal, parietal, temporal and occipital. Localization errors employing the RG head model were about 10 mm over the same four brain regions. The present simulation results suggest that the use of the RG head model reduces the localization error of LORETA, and that the RG head model based LORETA is desirable if high localization accuracy is needed.
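The localization error reported above is simply the Euclidean distance between the simulated dipole position and the inverse-solution estimate. The coordinates below are illustrative (millimetres in a head-centred frame), chosen to echo the ~20-30 mm versus ~10 mm error ranges in the abstract; they are not the study's values.

```python
# Localization error = Euclidean distance between the simulated dipole
# location and the location estimated by the inverse solution.
import numpy as np

def localization_error_mm(true_pos, estimated_pos):
    return float(np.linalg.norm(np.asarray(true_pos) - np.asarray(estimated_pos)))

true_dipole = (30.0, 0.0, 50.0)       # simulated source (illustrative)
est_spherical = (45.0, 10.0, 70.0)    # e.g. spherical-model LORETA estimate
est_realistic = (35.0, 3.0, 58.0)     # e.g. realistic-geometry estimate

print(round(localization_error_mm(true_dipole, est_spherical), 1))  # 26.9
print(round(localization_error_mm(true_dipole, est_realistic), 1))  # 9.9
```

With these illustrative positions the spherical-model error lands in the 20-30 mm band and the realistic-geometry error near 10 mm, matching the magnitudes the simulations report.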
Methodological and ethical aspects of the sexual maturation assessment in adolescents
de Faria, Eliane Rodrigues; Franceschini, Sylvia do Carmo C.; Peluzio, Maria do Carmo G.; Sant'Ana, Luciana Ferreira da R.; Priore, Silvia Eloiza
2013-01-01
OBJECTIVE To analyze methodological and ethical aspects of the sexual maturation assessment of adolescents. DATA SOURCES Books, theses, articles and legislation on the Medline, SciELO and Science Direct databases, besides institutional documents of the World Health Organization and the Pediatric Societies of Brazil and São Paulo, considering the period from 1962 to 2012. The following keywords were used in Portuguese and English: "sexual maturation", "self-assessment", "ethics", "objective assessment of sexual maturation", "puberty", "adolescent", and "adolescent development". DATA SYNTHESIS The sexual maturation assessment is used in population studies and in daily clinical care. The direct evaluation is performed by a specialized physician, whereas the self-assessment is carried out by the adolescent. This evaluation should be carefully performed in an appropriate place, taking the ethical aspects into account. The patient should not be constrained, and the physician must respect privacy and confidentiality. Before this evaluation, and independently of the method used, the adolescent should receive information and explanation about the procedure and the tools that will be applied. Furthermore, the patient has the right to decide whether an adult should be present. CONCLUSIONS Validation studies showed that self-assessment is inferior to clinical assessment and should therefore be performed only when direct examination by a physician is not possible. PMID:24142325
Brzozek, Christopher; Benke, Kurt K; Zeleke, Berihun M; Abramson, Michael J; Benke, Geza
2018-03-26
Uncertainty in experimental studies of exposure to radiation from mobile phones has in the past been framed only within the context of statistical variability. It is now becoming more apparent to researchers that epistemic, or reducible, uncertainties can also affect the total error in results. These uncertainties derive from a wide range of sources, including human error, such as data transcription, model structure, measurement and linguistic errors in communication. The issue of epistemic uncertainty is reviewed and interpreted in the context of the MoRPhEUS, ExPOSURE and HERMES cohort studies, which investigate the effect of radiofrequency electromagnetic radiation from mobile phones on memory performance. Research in this field has produced inconsistent results owing to limitations from a range of epistemic sources. Potential analytic approaches are suggested based on quantification of epistemic error using Monte Carlo simulation. It is recommended that future studies investigating the relationship between radiofrequency electromagnetic radiation and memory performance pay more attention to the treatment of epistemic uncertainties, and that further research address improving exposure assessment. The use of directed acyclic graphs is also encouraged to display the assumed covariate relationships.
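A minimal sketch of the suggested Monte Carlo quantification: propagate one epistemic error source (here an assumed multiplicative self-report error) through an exposure metric and summarize the induced spread. The "true" usage figure and the lognormal error model are assumptions for illustration, not parameters from the reviewed studies.

```python
# Monte Carlo propagation of an assumed self-report/transcription error
# through a weekly phone-use exposure metric. All parameters are illustrative.
import random

random.seed(42)

def simulate_reported_exposure(true_minutes_per_week, n_draws=10000):
    """Apply a multiplicative lognormal error factor to the true value."""
    draws = []
    for _ in range(n_draws):
        error_factor = random.lognormvariate(mu=0.0, sigma=0.3)
        draws.append(true_minutes_per_week * error_factor)
    return draws

draws = sorted(simulate_reported_exposure(120.0))
mean = sum(draws) / len(draws)
lo = draws[int(0.025 * len(draws))]   # 2.5th percentile
hi = draws[int(0.975 * len(draws))]   # 97.5th percentile
print(round(mean), round(lo), round(hi))  # central estimate and 95% interval
```

The width of the resulting interval is a direct, communicable statement of how much the assumed epistemic error could distort reported exposure, which is the kind of quantification the review recommends.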
A Framework for Quality Assurance in Child Welfare.
ERIC Educational Resources Information Center
O'Brien, Mary; Watson, Peter
In their search for new ways to assess their agencies' success in working with children and families, child welfare administrators and senior managers are increasingly seeking regular and reliable sources of information that help them evaluate agency performance, make ongoing decisions, and provide an accurate picture for agency staff and external…
Soil vapor extraction (SVE) is a prevalent remediation approach for volatile contaminants in the vadose zone. To support selection of an appropriate endpoint for the SVE remedy, an evaluation is needed to determine whether vadose zone contamination has been diminished sufficient...
Teaching Them to Teach: Programmatic Evaluation of Graduate Assistants' Teaching Performance
ERIC Educational Resources Information Center
Sobel, Karen; Avery, Susan; Ferrer-Vinent, Ignacio J.
2016-01-01
Academic libraries are one of the most important sources of "on the job" training for future library instructors. Librarians who supervise and assess these future library instructors (often in graduate assistant positions) often choose to provide observations and feedback each semester to these instructors in training. Scholarly…
2015-03-26
…common-method bias requires careful assessment of potential sources of bias and implementing procedural and statistical control methods (Podsakoff).
ESTIMATION OF DIFFUSION LOSSES WHEN SAMPLING DIESEL AEROSOL: A QUALITY ASSURANCE MEASURE
A fundamental component of the QA work for the assessment of instruments and sampling system performance was the investigation of particle losses in sampling lines. Along the aerosol sample pathway from its source to the collection media or measuring instrument, some nano-size p...
Selected Staff Studies in Elementary and Secondary School Finance.
ERIC Educational Resources Information Center
Weinstein, Bernard L.; And Others
These five study papers (1) analyze recent performance contracting experiences, (2) discuss education voucher proposals and prospects, (3) assess potential federal revenue sources for education, (4) provide an inventory of federal programs in aid to education, and (5) examine the status of nonpublic education. The first study suggests that schools…
Describing Comprehension: Teachers' Observations of Students' Reading Comprehension
ERIC Educational Resources Information Center
Vander Does, Susan Lubow
2012-01-01
Teachers' observations of student performance in reading are abundant and insightful but often remain internal and unarticulated. As a result, such observations are an underutilized and undervalued source of data. Given the gaps in knowledge about students' reading comprehension that exist in formal assessments, the frequent calls for teachers'…
Agricultural production in the Corn Belt region of the Upper Mississippi River Basin (UMRB) remains a leading source of nitrogen runoff that contributes to the annual hypoxic 'Dead Zone' in the Gulf of Mexico. The rise of corn production, land conversion, and fertilizer use in re...
Conducting Source Water Assessments
This page presents background information on the source water assessment steps, the four steps of a source wter assessment, and how to use the results of an assessment to protect drinking water sources.
Designing and benchmarking the MULTICOM protein structure prediction system
2013-01-01
Background Predicting protein structure from sequence is one of the most significant and challenging problems in bioinformatics. Numerous bioinformatics techniques and tools have been developed to tackle almost every aspect of protein structure prediction ranging from structural feature prediction, template identification and query-template alignment to structure sampling, model quality assessment, and model refinement. How to synergistically select, integrate and improve the strengths of the complementary techniques at each prediction stage and build a high-performance system is becoming a critical issue for constructing a successful, competitive protein structure predictor. Results Over the past several years, we have constructed a standalone protein structure prediction system MULTICOM that combines multiple sources of information and complementary methods at all five stages of the protein structure prediction process including template identification, template combination, model generation, model assessment, and model refinement. The system was blindly tested during the ninth Critical Assessment of Techniques for Protein Structure Prediction (CASP9) in 2010 and yielded very good performance. In addition to studying the overall performance on the CASP9 benchmark, we thoroughly investigated the performance and contributions of each component at each stage of prediction. Conclusions Our comprehensive and comparative study not only provides useful and practical insights about how to select, improve, and integrate complementary methods to build a cutting-edge protein structure prediction system but also identifies a few new sources of information that may help improve the design of a protein structure prediction system. Several components used in the MULTICOM system are available at: http://sysbio.rnet.missouri.edu/multicom_toolbox/. PMID:23442819
Vecchi, R; Bernardoni, V; Valentini, S; Piazzalunga, A; Fermo, P; Valli, G
2018-02-01
In this paper, results from receptor modelling performed on a well-characterised PM 1 dataset were combined to chemical light extinction data (b ext ) with the aim of assessing the impact of different PM 1 components and sources on light extinction and visibility at a European polluted urban area. It is noteworthy that, at the state of the art, there are still very few papers estimating the impact of different emission sources on light extinction as we present here, although being among the major environmental challenges at many polluted areas. Following the concept of the well-known IMPROVE algorithm, here a tailored site-specific approach (recently developed by our group) was applied to assess chemical light extinction due to PM 1 components and major sources. PM 1 samples collected separately during daytime and nighttime at the urban area of Milan (Italy) were chemically characterised for elements, major ions, elemental and organic carbon, and levoglucosan. Chemical light extinction was estimated and results showed that at the investigated urban site it is heavily impacted by ammonium nitrate and organic matter. Receptor modelling (i.e. Positive Matrix Factorization, EPA-PMF 5.0) was effective to obtain source apportionment; the most reliable solution was found with 7 factors which were tentatively assigned to nitrates, sulphates, wood burning, traffic, industry, fine dust, and a Pb-rich source. The apportionment of aerosol light extinction (b ext,aer ) according to resolved sources showed that considering all samples together nitrate contributed at most (on average 41.6%), followed by sulphate, traffic, and wood burning accounting for 18.3%, 17.8% and 12.4%, respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.
Environmental impact and risk assessments and key factors contributing to the overall uncertainties.
Salbu, Brit
2016-01-01
There is a significant number of nuclear and radiological sources that have contributed, are still contributing, or may in the future contribute to radioactive contamination of the environment. To protect the environment from radioactive contamination, impact and risk assessments are performed prior to or during a release event, short or long term after deposition, or before and after the implementation of countermeasures. When environmental impact and risks are assessed, however, a series of factors contributes to the overall uncertainties. To provide environmental impact and risk assessments, information on processes, kinetics and a series of input variables is needed. Adding problems such as variability, questionable assumptions, gaps in knowledge, extrapolations and poor conceptual model structures, a series of factors contributes to large and often unacceptable uncertainties in impact and risk assessments. Information on the source term and the release scenario is an essential starting point in impact and risk models; the source determines the activity concentrations and atom ratios of the radionuclides released, while the release scenario determines the physico-chemical forms of the released radionuclides, such as particle size distribution, structure and density. Releases will most often contain other contaminants such as metals, and, due to interactions, contaminated sites should be assessed as multiple-stressor scenarios. Following deposition, a series of stressors, interactions and processes will influence the ecosystem transfer of radionuclide species and thereby influence biological uptake (toxicokinetics) and responses (toxicodynamics) in exposed organisms. Due to the variety of biological species, extrapolation is frequently needed to fill gaps in knowledge, e.g., from effects to no effects, from effects in one organism to others, and from one stressor to mixtures.
Most toxicity tests are, however, performed as short-term exposures of adult organisms, ignoring sensitive life history stages and transgenerational effects. To link sources, ecosystem transfer and biological effects to future impact and risks, a series of models is usually interfaced, while uncertainty estimates are seldom given. The model predictions are, however, only valid within the boundaries of the overall uncertainties. Furthermore, the model predictions are only useful and relevant when uncertainties are estimated, communicated and understood. Among the key factors contributing most to uncertainties, the present paper focuses especially on structural uncertainties (model bias or discrepancies), as aspects such as particle releases, ecosystem dynamics, mixed exposure, sensitive life history stages and transgenerational effects are usually ignored in assessment models. Research focused on these aspects should significantly reduce the overall uncertainties in the impact and risk assessment of radioactively contaminated ecosystems. Copyright © 2015 Elsevier Ltd. All rights reserved.
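The abstract's central point, that uncertainty in input variables propagates into wide, often unacceptable uncertainty bands on model predictions, can be illustrated with a minimal Monte Carlo propagation sketch. All distribution parameters below are hypothetical placeholders, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical lognormal source term (Bq released) and ecosystem transfer
# factor; the parameters are illustrative, not taken from the paper.
source_bq = rng.lognormal(mean=np.log(1e12), sigma=0.5, size=n)
transfer = rng.lognormal(mean=np.log(1e-9), sigma=0.8, size=n)

# Multiplicative model: activity concentration proxy in biota (Bq/kg).
concentration = source_bq * transfer

# Report a median and a 95% interval instead of a single point estimate.
median = np.median(concentration)
lo, hi = np.percentile(concentration, [2.5, 97.5])
print(f"median = {median:.3g}, 95% interval = [{lo:.3g}, {hi:.3g}]")
```

Even with only two uncertain multiplicative inputs, the 95% interval spans more than an order of magnitude, which is the kind of overall uncertainty the paper argues must be estimated and communicated alongside predictions.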
Modelling Complex Fenestration Systems using physical and virtual models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thanachareonkit, Anothai; Scartezzini, Jean-Louis
2010-04-15
Physical or virtual models are commonly used to visualize the conceptual ideas of architects, lighting designers and researchers; they are also employed to assess the daylighting performance of buildings, particularly in cases where Complex Fenestration Systems (CFS) are considered. Recent studies have, however, revealed a general tendency of physical models to overestimate this performance compared with that of real buildings; these discrepancies can be attributed to several causes. In order to identify the main error sources, a series of comparisons between a real building (a single office room within a test module) and the corresponding physical and virtual models was undertaken. The physical model was placed in outdoor conditions strictly identical to those of the real building, as well as underneath a scanning sky simulator. The virtual model simulations were carried out with the Radiance program using the GenSky function; an alternative evaluation method, named the Partial Daylight Factor method (PDF method), was also employed with the physical model, together with sky luminance distributions acquired by a digital sky scanner during the monitoring of the real building. The overall daylighting performance of the physical and virtual models was assessed and compared. The causes of discrepancies between the daylighting performance of the real building and the models were analysed. The main identified sources of error are the reproduction of building details, the CFS modelling and the mocking-up of the geometrical and photometrical properties. To study the impact of these errors on daylighting performance assessment, computer simulation models created with the Radiance program were also used to carry out a sensitivity analysis of modelling errors. The study of the models showed that large discrepancies can occur in daylighting performance assessment.
In the case of improper mocking-up of the glazing, for instance, relative divergences of 25-40% can be found at different room locations, suggesting that more light is entering than was actually monitored in the real building. All of these discrepancies can, however, be reduced by carefully mocking up the geometry and photometry of the real building. A synthesis is presented in this article that can be used as a guideline for daylighting designers to avoid or estimate errors during CFS daylighting performance assessment.
Huang, Ming-Xiong; Huang, Charles W; Robb, Ashley; Angeles, AnneMarie; Nichols, Sharon L; Baker, Dewleen G; Song, Tao; Harrington, Deborah L; Theilmann, Rebecca J; Srinivasan, Ramesh; Heister, David; Diwakar, Mithun; Canive, Jose M; Edgar, J Christopher; Chen, Yu-Han; Ji, Zhengwei; Shen, Max; El-Gabalawy, Fady; Levy, Michael; McLay, Robert; Webb-Murphy, Jennifer; Liu, Thomas T; Drake, Angela; Lee, Roland R
2014-01-01
The present study developed a fast MEG source imaging technique based on Fast Vector-based Spatio-Temporal Analysis using an L1-minimum-norm (Fast-VESTAL) and then used the method to obtain source amplitude images of resting-state magnetoencephalography (MEG) signals for different frequency bands. The Fast-VESTAL technique consists of two steps. First, L1-minimum-norm MEG source images were obtained for the dominant spatial modes of the sensor-waveform covariance matrix. Next, accurate source time-courses with millisecond temporal resolution were obtained using an inverse operator constructed from the spatial source images of Step 1. Using simulations, Fast-VESTAL's performance was assessed for its 1) ability to localize multiple correlated sources; 2) ability to faithfully recover source time-courses; 3) robustness to different SNR conditions, including SNRs with negative dB levels; 4) capability to handle correlated brain noise; and 5) statistical maps of MEG source images. An objective pre-whitening method was also developed and integrated with Fast-VESTAL to remove correlated brain noise. Fast-VESTAL's performance was then examined in the analysis of human median-nerve MEG responses. The results demonstrated that this method easily distinguished sources in the entire somatosensory network. Next, Fast-VESTAL was applied to obtain the first whole-head MEG source-amplitude images from resting-state signals in 41 healthy control subjects, for all standard frequency bands. Comparisons between resting-state MEG source images and known neurophysiology were provided. Additionally, in simulations and cases with human MEG responses, the results obtained using the conventional beamformer technique were compared with those from Fast-VESTAL, highlighting the beamformer's problems of signal leakage and distorted source time-courses. © 2013.
Huang, Ming-Xiong; Huang, Charles W.; Robb, Ashley; Angeles, AnneMarie; Nichols, Sharon L.; Baker, Dewleen G.; Song, Tao; Harrington, Deborah L.; Theilmann, Rebecca J.; Srinivasan, Ramesh; Heister, David; Diwakar, Mithun; Canive, Jose M.; Edgar, J. Christopher; Chen, Yu-Han; Ji, Zhengwei; Shen, Max; El-Gabalawy, Fady; Levy, Michael; McLay, Robert; Webb-Murphy, Jennifer; Liu, Thomas T.; Drake, Angela; Lee, Roland R.
2014-01-01
The present study developed a fast MEG source imaging technique based on Fast Vector-based Spatio-Temporal Analysis using an L1-minimum-norm (Fast-VESTAL) and then used the method to obtain source amplitude images of resting-state magnetoencephalography (MEG) signals for different frequency bands. The Fast-VESTAL technique consists of two steps. First, L1-minimum-norm MEG source images were obtained for the dominant spatial modes of the sensor-waveform covariance matrix. Next, accurate source time-courses with millisecond temporal resolution were obtained using an inverse operator constructed from the spatial source images of Step 1. Using simulations, Fast-VESTAL's performance was assessed for its 1) ability to localize multiple correlated sources; 2) ability to faithfully recover source time-courses; 3) robustness to different SNR conditions, including SNRs with negative dB levels; 4) capability to handle correlated brain noise; and 5) statistical maps of MEG source images. An objective pre-whitening method was also developed and integrated with Fast-VESTAL to remove correlated brain noise. Fast-VESTAL's performance was then examined in the analysis of human median-nerve MEG responses. The results demonstrated that this method easily distinguished sources in the entire somatosensory network. Next, Fast-VESTAL was applied to obtain the first whole-head MEG source-amplitude images from resting-state signals in 41 healthy control subjects, for all standard frequency bands. Comparisons between resting-state MEG source images and known neurophysiology were provided. Additionally, in simulations and cases with human MEG responses, the results obtained using the conventional beamformer technique were compared with those from Fast-VESTAL, highlighting the beamformer's problems of signal leakage and distorted source time-courses. PMID:24055704
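The core idea behind an L1-minimum-norm inverse, recovering a sparse set of active sources from many fewer sensors than candidate locations, can be sketched on a toy problem. The lead field below is a random matrix rather than one computed from a head model, and plain ISTA (iterative soft-thresholding) stands in for whatever solver Fast-VESTAL actually uses; this is an illustration of the L1 principle, not the published algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_sources = 64, 256

# Toy lead field (columns = candidate source locations), unit-norm columns.
L = rng.standard_normal((n_sensors, n_sources))
L /= np.linalg.norm(L, axis=0)

# Two active sources generate the sensor measurements (plus small noise).
x_true = np.zeros(n_sources)
x_true[[30, 150]] = [1.0, -0.8]
b = L @ x_true + 0.01 * rng.standard_normal(n_sensors)

# ISTA for the L1-penalized least-squares problem:
#   min_x 0.5 * ||L x - b||^2 + lam * ||x||_1
lam = 0.02
step = 1.0 / np.linalg.norm(L, 2) ** 2   # 1 / (largest singular value)^2
x = np.zeros(n_sources)
for _ in range(3000):
    x = x - step * (L.T @ (L @ x - b))                    # gradient step
    x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # soft-threshold

support = sorted(np.argsort(np.abs(x))[-2:])
print(support)  # indices of the two strongest estimated sources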
Mazzucchelli, Gabriel; Holzhauser, Thomas; Cirkovic Velickovic, Tanja; Diaz‐Perales, Araceli; Molina, Elena; Roncada, Paola; Rodrigues, Pedro; Verhoeckx, Kitty
2017-01-01
Abstract Food allergies are recognized as a global health concern. In order to protect allergic consumers from severe symptoms, allergenic risk assessment for well-known foods and foods containing genetically modified ingredients has been established. However, the population is steadily growing, and there is a rising need to provide adequate protein-based foods, including novel sources not yet used for human consumption. In this context, safety issues such as a potentially increased allergenic risk need to be assessed before marketing novel food sources. Therefore, the established allergenic risk assessment for genetically modified organisms needs to be re-evaluated for its applicability to risk assessment of novel food proteins. Two different scenarios of allergic sensitization have to be assessed. The first scenario is the presence of already known allergenic structures in novel foods. For this, a comparative assessment can be performed and the range of cross-reactivity can be explored, while in the second scenario allergic reactions are observed toward hitherto novel allergenic structures and no reference material is available. This review summarizes the current analytical methods for allergenic risk assessment, highlighting the strengths and limitations of each method and discussing the gaps in this assessment that need to be addressed in the near future. PMID:28925060
An assessment of air as a source of DNA contamination encountered when performing PCR.
Witt, Nina; Rodger, Gillian; Vandesompele, Jo; Benes, Vladimir; Zumla, Alimuddin; Rook, Graham A; Huggett, Jim F
2009-12-01
Sensitive molecular methods, such as the PCR, can detect low-level contamination, and careful technique is required to reduce the impact of contaminants. Yet, some assays that are designed to detect high copy-number target sequences appear to be impossible to perform without contamination, and frequently, personnel or laboratory environment are held responsible as the source. This complicates diagnostic and research analysis when using molecular methods. To investigate the air specifically as a source of contamination, which might occur during PCR setup, we exposed tubes of water to the air of a laboratory and clean hood for up to 24 h. To increase the chances of contamination, we also investigated a busy open-plan office in the same way. All of the experiments showed the presence of human and rodent DNA contamination. However, there was no accumulation of the contamination in any of the environments investigated, suggesting that the air was not the source of contamination. Even the air from a busy open-plan office was a poor source of contamination for all of the DNA sequences investigated (human, bacterial, fungal, and rodent). This demonstrates that the personnel and immediate laboratory environment are not necessarily to blame for the observed contamination.
Assessment and mitigation of power quality problems for PUSPATI TRIGA Reactor (RTP)
NASA Astrophysics Data System (ADS)
Zakaria, Mohd Fazli; Ramachandaramurthy, Vigna K.
2017-01-01
Electrical power systems are exposed to different types of power quality disturbances. Investigation and monitoring of power quality are necessary to maintain accurate operation of sensitive equipment, especially in nuclear installations. This paper discusses the power quality problems observed at the electrical sources of the PUSPATI TRIGA Reactor (RTP). Assessment of power quality requires the identification of any anomalous behaviour on a power system that adversely affects the normal operation of electrical or electronic equipment. A power quality assessment involves gathering data resources, analyzing the data with reference to power quality standards and, if problems exist, recommending mitigation techniques. Field power quality data were collected with a power quality recorder and analyzed with reference to power quality standards. Electrical power is normally supplied to the RTP via two sources to ensure reliability, each of which is designed to carry the full load. The assessment of power quality during reactor operation was performed for both electrical sources. Several disturbances, such as voltage harmonics and flicker, exceeded the thresholds. To reduce these disturbances, mitigation techniques have been proposed: passive harmonic filters to reduce harmonic distortion, and a dynamic voltage restorer (DVR) to reduce voltage disturbances and isolate all sensitive and critical loads.
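Voltage harmonic distortion of the kind flagged in this assessment is conventionally summarized as total harmonic distortion (THD), the RMS of the harmonic components relative to the fundamental. A minimal sketch on a synthetic 50 Hz waveform (the harmonic levels are illustrative, not RTP measurements):

```python
import numpy as np

f0, fs = 50.0, 10_000.0          # fundamental and sampling frequency (Hz)
t = np.arange(0, 0.2, 1 / fs)    # exactly ten fundamental cycles

# Synthetic voltage: fundamental plus 6% 5th and 5% 7th harmonics.
a = 230 * np.sqrt(2)
v = (a * np.sin(2 * np.pi * f0 * t)
     + 0.06 * a * np.sin(2 * np.pi * 5 * f0 * t)
     + 0.05 * a * np.sin(2 * np.pi * 7 * f0 * t))

# Spectrum over an integer number of cycles, so bins fall on the harmonics.
spectrum = np.abs(np.fft.rfft(v)) / len(v)
freqs = np.fft.rfftfreq(len(v), 1 / fs)

def mag_at(f):
    return spectrum[np.argmin(np.abs(freqs - f))]

fund = mag_at(f0)
harmonics = [mag_at(n * f0) for n in range(2, 41)]  # orders 2..40
thd = np.sqrt(sum(h ** 2 for h in harmonics)) / fund
print(f"THD = {thd * 100:.2f}%")
```

For these levels the THD comes out as sqrt(0.06^2 + 0.05^2) ≈ 7.8%, which a real assessment would then compare against the limits of the applicable power quality standard.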
DOE Office of Scientific and Technical Information (OSTI.GOV)
Specht, W.L.
2000-02-28
The Savannah River Site currently has 33 NPDES outfalls permitted by the South Carolina Department of Health and Environmental Control to discharge to SRS streams and the Savannah River. In order to determine the cumulative impacts of these discharges on the receiving streams, a study plan was developed to perform in-stream assessments of the fish assemblages, macroinvertebrate assemblages, and habitats of the receiving streams.
Hybrid Geothermal Heat Pumps for Cooling Telecommunications Data Centers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckers, Koenraad J; Zurmuhl, David P.; Lukawski, Maciej Z.
The technical and economic performance of geothermal heat pump (GHP) systems supplying year-round cooling to representative small data centers with cooling loads below 500 kWth was analyzed and compared to air-source heat pumps (ASHPs). A numerical model was developed in the TRNSYS software to simulate the operation of air-source and geothermal heat pumps with and without supplementary air-cooled heat exchangers (dry coolers, DCs). The model was validated using data measured at an experimental geothermal system installed in Ithaca, NY, USA. The coefficient of performance (COP) and cooling capacity of the GHPs were calculated over a 20-year lifetime and compared to the performance of ASHPs. The total cost of ownership (TCO) of each cooling system was calculated to assess its economic performance. Both the length of the geothermal borehole heat exchangers (BHEs) and the dry cooler temperature set point were optimized to minimize the TCO of the geothermal systems. Lastly, a preliminary analysis of the performance of geothermal heat pumps for cooling-dominated systems was performed for other locations, including Dallas, TX, Sacramento, CA, and Minneapolis, MN.
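The TCO comparison described can be sketched in a much-simplified form: capital cost plus the discounted cost of electricity for cooling, where a higher COP trades against a higher capital cost. All numbers below (capital costs, COPs, load, tariff, discount rate) are placeholders for illustration, not values from the study:

```python
def tco(capital_usd, cooling_load_kw, cop, elec_usd_per_kwh=0.10,
        hours_per_year=8760, years=20, discount=0.05):
    """Simplified total cost of ownership: capital + discounted energy cost."""
    annual_kwh = cooling_load_kw * hours_per_year / cop  # electricity drawn
    annual_cost = annual_kwh * elec_usd_per_kwh
    npv_factor = sum(1 / (1 + discount) ** y for y in range(1, years + 1))
    return capital_usd + annual_cost * npv_factor

# Hypothetical systems: GHP has higher capital cost but a better COP.
ghp = tco(capital_usd=600_000, cooling_load_kw=300, cop=5.0)
ashp = tco(capital_usd=250_000, cooling_load_kw=300, cop=3.0)
print(f"GHP TCO: ${ghp:,.0f}  ASHP TCO: ${ashp:,.0f}")
```

With these placeholder inputs the COP advantage of the GHP outweighs its extra capital cost over 20 years; the study's actual conclusion depends, of course, on measured COPs, borehole length, and local climate.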
Exploring the life cycle management of industrial solid waste in the case of copper slag.
Song, Xiaolong; Yang, Jianxin; Lu, Bin; Li, Bo
2013-06-01
Industrial solid waste has potential impacts on soil, water and air quality, as well as human health, throughout its life stages. A framework for the life cycle management of industrial solid waste, which integrates the source reduction process, is presented and applied to copper slag management. Three management scenarios for copper slag are developed: (i) production of cement after electric furnace treatment, (ii) production of cement after flotation, and (iii) source reduction before the recycling process. A life cycle assessment is carried out to estimate the environmental burdens of these three scenarios; the results showed that the burdens are 2710.09, 2061.19 and 2145.02 Pt, respectively. In consideration of the closed-loop recycling process, the environmental performance of the flotation approach exceeded that of the electric furnace approach. Additionally, although flash smelting promotes the source reduction of copper slag compared with bath smelting, it did not reduce the overall environmental burdens resulting from the complete copper slag management process. Moreover, it led to a shifting of environmental burdens from ecosystem quality damage and resource depletion to human health damage. The case study shows that it is necessary to integrate the generation process into the whole life cycle of industrial solid waste, and to perform an integrated assessment quantifying the contribution of source reduction, rather than simply following the priority of source reduction in the waste management hierarchy.
Tornero-López, Ana M; Guirado, Damián; Perez-Calatayud, Jose; Ruiz-Arrebola, Samuel; Simancas, Fernando; Gazdic-Santic, Maja; Lallena, Antonio M
2013-12-01
Air-communicating well ionization chambers are commonly used to assess air kerma strength of sources used in brachytherapy. The signal produced is supposed to be proportional to the air density within the chamber and, therefore, a density-independent air kerma strength is obtained when the measurement is corrected to standard atmospheric conditions using the usual temperature and pressure correction factor. Nevertheless, when assessing low energy sources, the ionization chambers may not fulfill that condition and a residual density dependence still remains after correction. In this work, the authors examined the behavior of the PTW 34051 SourceCheck ionization chamber when measuring the air kerma strength of (125)I seeds. Four different SourceCheck chambers were analyzed. With each one of them, two series of measurements of the air kerma strength for (125)I selectSeed(TM) brachytherapy sources were performed inside a pressure chamber and varying the pressure in a range from 747 to 1040 hPa (560 to 780 mm Hg). The temperature and relative humidity were kept basically constant. An analogous experiment was performed by taking measurements at different altitudes above sea level. Contrary to other well-known ionization chambers, like the HDR1000 PLUS, in which the temperature-pressure correction factor overcorrects the measurements, in the SourceCheck ionization chamber they are undercorrected. At a typical atmospheric situation of 933 hPa (700 mm Hg) and 20 °C, this undercorrection turns out to be 1.5%. Corrected measurements show a residual linear dependence on the density and, as a consequence, an additional density dependent correction must be applied. The slope of this residual linear density dependence is different for each SourceCheck chamber investigated. The results obtained by taking measurements at different altitudes are compatible with those obtained with the pressure chamber. 
Variations in altitude and changes in weather conditions may produce significant density corrections, and this effect should be taken into account. The effect is chamber-dependent, indicating that a specific calibration is necessary for each individual chamber. To our knowledge, this correction has not been considered so far for SourceCheck ionization chambers, but its magnitude cannot be neglected in clinical practice. The atmospheric pressure and temperature at which the chamber was calibrated need to be taken into account, and they should be reported in the calibration certificate. In addition, each institution should analyze the particular response of its SourceCheck ionization chamber and compute the appropriate correction factors. In the absence of a suitable pressure chamber, one possibility for this assessment is to take measurements at different altitudes, spanning a wide enough air density range.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tornero-López, Ana M.; Guirado, Damián; Ruiz-Arrebola, Samuel
2013-12-15
Purpose: Air-communicating well ionization chambers are commonly used to assess air kerma strength of sources used in brachytherapy. The signal produced is supposed to be proportional to the air density within the chamber and, therefore, a density-independent air kerma strength is obtained when the measurement is corrected to standard atmospheric conditions using the usual temperature and pressure correction factor. Nevertheless, when assessing low energy sources, the ionization chambers may not fulfill that condition and a residual density dependence still remains after correction. In this work, the authors examined the behavior of the PTW 34051 SourceCheck ionization chamber when measuring the air kerma strength of (125)I seeds. Methods: Four different SourceCheck chambers were analyzed. With each one of them, two series of measurements of the air kerma strength for (125)I selectSeed(TM) brachytherapy sources were performed inside a pressure chamber and varying the pressure in a range from 747 to 1040 hPa (560 to 780 mm Hg). The temperature and relative humidity were kept basically constant. An analogous experiment was performed by taking measurements at different altitudes above sea level. Results: Contrary to other well-known ionization chambers, like the HDR1000 PLUS, in which the temperature-pressure correction factor overcorrects the measurements, in the SourceCheck ionization chamber they are undercorrected. At a typical atmospheric situation of 933 hPa (700 mm Hg) and 20 °C, this undercorrection turns out to be 1.5%. Corrected measurements show a residual linear dependence on the density and, as a consequence, an additional density dependent correction must be applied. The slope of this residual linear density dependence is different for each SourceCheck chamber investigated.
The results obtained by taking measurements at different altitudes are compatible with those obtained with the pressure chamber. Conclusions: Variations in altitude and changes in weather conditions may produce significant density corrections, and this effect should be taken into account. The effect is chamber-dependent, indicating that a specific calibration is necessary for each individual chamber. To our knowledge, this correction has not been considered so far for SourceCheck ionization chambers, but its magnitude cannot be neglected in clinical practice. The atmospheric pressure and temperature at which the chamber was calibrated need to be taken into account, and they should be reported in the calibration certificate. In addition, each institution should analyze the particular response of its SourceCheck ionization chamber and compute the appropriate correction factors. In the absence of a suitable pressure chamber, one possibility for this assessment is to take measurements at different altitudes, spanning a wide enough air density range.
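The standard temperature-pressure correction discussed above, and the extra density-dependent factor the authors argue is needed, can be sketched as follows. The functional form and coefficients of the residual correction are invented for illustration; in practice they are chamber-specific and must be measured per chamber:

```python
def k_tp(pressure_hpa, temp_c, p0_hpa=1013.25, t0_c=20.0):
    """Usual correction to reference conditions for a vented ionization chamber."""
    return (p0_hpa / pressure_hpa) * ((273.15 + temp_c) / (273.15 + t0_c))

def air_density(pressure_hpa, temp_c):
    """Dry-air density (kg/m^3) from the ideal gas law, R_air = 287.05 J/(kg K)."""
    return pressure_hpa * 100.0 / (287.05 * (273.15 + temp_c))

def k_residual(rho, a=1.0602, b=-0.05):
    """Hypothetical extra linear-in-density correction, k = 1 / (a + b * rho).
    The coefficients a, b are invented here (chosen so k = 1 at reference
    density) and would have to be calibrated for each individual chamber."""
    return 1.0 / (a + b * rho)

p, t = 933.0, 20.0  # the 'typical atmospheric situation' cited in the abstract
reading = 1.000     # normalized chamber reading
corrected = reading * k_tp(p, t) * k_residual(air_density(p, t))
print(f"k_TP = {k_tp(p, t):.4f}, rho = {air_density(p, t):.4f} kg/m^3")
```

At 933 hPa and 20 °C the usual k_TP alone is about 1.086; the paper's point is that for the SourceCheck chamber this still undercorrects, so a measured density-dependent factor like `k_residual` must be applied on top.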
Testing of focal plane arrays at the AEDC
NASA Astrophysics Data System (ADS)
Nicholson, Randy A.; Mead, Kimberly D.; Smith, Robert W.
1992-07-01
A facility was developed at the Arnold Engineering Development Center (AEDC) to provide complete radiometric characterization of focal plane arrays (FPAs). The highly versatile facility provides the capability to test single detectors, detector arrays, and hybrid FPAs. The primary component of the AEDC test facility is the Focal Plane Characterization Chamber (FPCC). The FPCC provides a cryogenic, low-background environment for the test focal plane. Focal plane testing in the FPCC includes flood source testing, during which the array is uniformly irradiated with IR radiation, and spot source testing, during which the target radiation is focused onto a single pixel or group of pixels. During flood source testing, performance parameters such as power consumption, responsivity, noise equivalent input, dynamic range, radiometric stability, recovery time, and array uniformity can be assessed. Crosstalk is evaluated during spot source testing. Spectral response testing is performed in a spectral response test station using a three-grating monochromator. Because the chamber can accommodate several types of testing in a single test installation, a high throughput rate and good economy of operation are possible.
A detection method for X-ray images based on wavelet transforms: the case of the ROSAT PSPC.
NASA Astrophysics Data System (ADS)
Damiani, F.; Maggio, A.; Micela, G.; Sciortino, S.
1996-02-01
The authors have developed a method based on wavelet transforms (WT) to efficiently detect sources in PSPC X-ray images. The multiscale approach typical of WT can be used to detect sources with a large range of sizes, and to estimate their size and count rate. Significance thresholds for candidate detections (found as local WT maxima) have been derived from a detailed study of the probability distribution of the WT of a locally uniform background. The use of the exposure map allows good detection efficiency to be retained even near PSPC ribs and edges. The algorithm may also be used to obtain upper limits on the count rate of undetected objects. Simulations of realistic PSPC images containing either pure background or background plus sources were used to test the overall algorithm performance, and to assess the frequency of spurious detections (vs. detection threshold) and the algorithm's sensitivity. Actual PSPC images of galaxies and star clusters show the algorithm to perform well even in cases of extended sources and crowded fields.
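The detection scheme described, convolving the image with a wavelet kernel and taking local maxima of the transform as candidate sources, can be sketched at a single scale with a Mexican-hat (Marr) wavelet on a synthetic Poisson image. The significance thresholding against the background WT distribution and the exposure-map handling, both central to the actual algorithm, are omitted here:

```python
import numpy as np

def mexican_hat_2d(size, sigma):
    """2D Mexican-hat (Marr) wavelet kernel, zero-mean, on a (2*size+1)^2 grid."""
    y, x = np.mgrid[-size:size + 1, -size:size + 1]
    r2 = (x ** 2 + y ** 2) / (2 * sigma ** 2)
    return (1 - r2) * np.exp(-r2)

rng = np.random.default_rng(1)
img = rng.poisson(2.0, size=(128, 128)).astype(float)  # flat Poisson background

# Inject a faint Gaussian point source at a known position (row 40, col 64).
yy, xx = np.mgrid[0:128, 0:128]
img += 40 * np.exp(-((xx - 64) ** 2 + (yy - 40) ** 2) / (2 * 2.0 ** 2))

# Wavelet transform at one scale via FFT (circular) convolution.
k = mexican_hat_2d(16, sigma=2.0)
pad = np.zeros_like(img)
pad[:k.shape[0], :k.shape[1]] = k
wt = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(pad)))
wt = np.roll(wt, (-16, -16), axis=(0, 1))  # recentre the kernel

# Candidate detection: the strongest local WT maximum.
peak = np.unravel_index(np.argmax(wt), wt.shape)
print(peak)
```

Because the zero-mean kernel suppresses the flat background while responding strongly at the source scale, the WT maximum lands on the injected source; the real method repeats this over several scales and keeps only maxima above the derived significance thresholds.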
Atuonwu, J C; Tassou, S A
2018-01-23
The enormous magnitude and variety of microwave applications in household, commercial and industrial food processing creates a strong motivation for improving the energy efficiency, and hence sustainability, of the process. This review critically assesses key energy issues associated with microwave food processing, focusing on previous energy performance studies, energy performance metrics, standards and regulations. Factors affecting energy efficiency are categorised into source, load and source-load matching factors. This highlights the need for highly flexible and controllable power sources capable of receiving real-time feedback on load properties and effecting rapid control actions to minimise reflections, heating non-uniformities and other imperfections that lead to energy losses. A case is made for the use of solid-state amplifiers as alternatives to the conventional power source, the magnetron. A full-scale techno-economic analysis, including energy aspects, shows that the use of solid-state amplifiers as replacements for magnetrons is promising, not only from an energy and overall technical perspective, but also in terms of economics.
Oak Ridge Spallation Neutron Source (ORSNS) target station design integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamy, T.; Booth, R.; Cleaves, J.
1996-06-01
The conceptual design for a 1- to 3-MW short pulse spallation source with a liquid mercury target has been started recently. The design tools and methods being developed to define requirements, integrate the work, and provide early cost guidance are presented with a summary of the current target station design status. The initial design point was selected with performance and cost estimate projections from a systems code. This code was developed recently using cost estimates from the Brookhaven Pulsed Spallation Neutron Source study and experience from the Advanced Neutron Source Project's conceptual design. It will be updated and improved as the design develops. Performance was characterized by a simplified figure of merit based on a ratio of neutron production to costs. A work breakdown structure was developed, with simplified systems diagrams used to define interfaces and system responsibilities. A risk assessment method was used to identify potential problems, to identify required research and development (R&D), and to aid contingency development. Preliminary 3-D models of the target station are being used to develop remote maintenance concepts and to estimate costs.
OPAL: An Open-Source MPI-IO Library over Cray XT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Weikuan; Vetter, Jeffrey S; Canon, Richard Shane
Parallel IO over Cray XT is supported by a vendor-supplied MPI-IO package. This package contains a proprietary ADIO implementation built on top of the sysio library. While it is reasonable to maintain a stable code base for application scientists' convenience, it is also very important for system developers and researchers to analyze and assess the effectiveness of parallel IO software and, accordingly, tune and optimize the MPI-IO implementation. A proprietary parallel IO code base forfeits such flexibility. On the other hand, a generic UFS-based MPI-IO implementation is typically used on many Linux-based platforms. We have developed an open-source MPI-IO package over Lustre, referred to as OPAL (OPportunistic and Adaptive MPI-IO Library over Lustre). OPAL provides a single source-code base for MPI-IO over Lustre on Cray XT and Linux platforms. Compared to the Cray implementation, OPAL provides a number of good features, including arbitrary specification of striping patterns and Lustre-stripe-aligned file domain partitioning. This paper presents performance comparisons between OPAL and Cray's proprietary implementation. Our evaluation demonstrates that OPAL achieves performance comparable to the Cray implementation. We also exemplify the benefits of an open source package in revealing the underpinnings of parallel IO performance.
De Feo, Giovanni; Ferrara, Carmen; Finelli, Alessio; Grosso, Alberto
2017-12-07
The main aim of this study was to perform a life cycle assessment as well as an economic evaluation of the recovery of recyclable materials in a municipal solid waste management system. If citizens separate waste fractions erroneously, they cause both environmental and economic damage. The environmental and economic evaluation was performed for the case study of Nola (34,349 inhabitants) in Southern Italy, with a kerbside system that achieved 62% source separation in 2014. The economic analysis quantified the economic benefits obtainable for the population as a function of the achievable percentage of source separation. The comparison among the environmental performances of the four considered scenarios showed that the higher the level of source separation, the lower the overall impacts. This occurred because, even though the impacts of waste collection and transport increased, they were outweighed by the avoided impacts of the recycling processes. Increasing source separation by 1% could avoid the emission of 5 kg CO2 eq. and 5 g PM10 for each citizen. The economic and environmental indicators defined in this study provide simple and effective information useful for a wide-ranging audience from the perspective of a behavioural change programme.
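The per-citizen figures quoted above scale directly to the whole town; a back-of-envelope sketch using the abstract's numbers (the percentage increments are chosen arbitrarily for illustration):

```python
inhabitants = 34_349                 # Nola case study population
co2_per_pct_per_person = 5.0         # kg CO2 eq. avoided per +1% separation
pm10_per_pct_per_person = 5.0        # g PM10 avoided per +1% separation

for extra_pct in (1, 5, 10):         # e.g. raising separation from 62% to 72%
    co2_t = inhabitants * co2_per_pct_per_person * extra_pct / 1000    # tonnes
    pm10_kg = inhabitants * pm10_per_pct_per_person * extra_pct / 1000  # kg
    print(f"+{extra_pct}%: {co2_t:.1f} t CO2 eq., {pm10_kg:.1f} kg PM10")
```

A ten-point improvement in source separation would thus avoid on the order of 1,700 tonnes of CO2 eq. for a town of this size, which is the kind of simple indicator the authors propose for communicating with citizens.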
Developing integrated benchmarks for DOE performance measurement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barancik, J.I.; Kramer, C.F.; Thode, Jr. H.C.
1992-09-30
The objectives of this task were to describe and evaluate selected existing sources of information on occupational safety and health, with emphasis on hazard and exposure assessment, abatement, training, reporting, and control, identifying factors for exposure and outcome in preparation for developing DOE performance benchmarks. Existing resources and methodologies were assessed for their potential use as practical performance benchmarks. Strengths and limitations of current data resources were identified. Guidelines were outlined for developing new or improved performance factors, which could then become the basis for selecting performance benchmarks. Databases for non-DOE comparison populations were identified so that DOE performance could be assessed relative to non-DOE occupational and industrial groups. Systems approaches were described which can be used to link hazards and exposure, event occurrence, and adverse outcome factors, as needed to generate valid, reliable, and predictive performance benchmarks. Databases were identified which contain information relevant to one or more performance assessment categories. A list of 72 potential performance benchmarks was prepared to illustrate the kinds of information that can be produced through a benchmark development program. Current information resources which may be used to develop potential performance benchmarks are limited. There is a need to develop an occupational safety and health information and data system in DOE that is capable of incorporating demonstrated and documented performance benchmarks prior to, or concurrent with, the development of hardware and software. A key to the success of this systems approach is rigorous development and demonstration of performance benchmark equivalents for users of such data before system hardware and software commitments are institutionalized.
NASA Astrophysics Data System (ADS)
Karagulian, Federico; Belis, Claudio A.; Dora, Carlos Francisco C.; Prüss-Ustün, Annette M.; Bonjour, Sophie; Adair-Rohani, Heather; Amann, Markus
2015-11-01
To reduce health impacts from air pollution, it is important to know which sources contribute to human exposure. This study systematically reviewed and analysed available source apportionment studies on particulate matter (with diameters of 10 and 2.5 microns, PM10 and PM2.5) performed in cities, to estimate typical shares of the sources of pollution by country and by region. A database of city source apportionment records, estimated with the use of receptor models, was also developed and made available on the website of the World Health Organization. Systematic Scopus and Google searches were performed to retrieve city studies of source apportionment for particulate matter. Six source categories were defined. Country and regional averages of source apportionment were estimated based on city population weighting. A total of 419 source apportionment records from studies conducted in cities of 51 countries were used to calculate regional averages of sources of ambient particulate matter. Based on the available information, globally 25% of urban ambient air pollution from PM2.5 is contributed by traffic, 15% by industrial activities, 20% by domestic fuel burning, 22% by unspecified sources of human origin, and 18% by natural dust and salt. The available source apportionment records exhibit, however, important heterogeneities in the source categories assessed, and are incomplete for certain countries/regions. Traffic is one important contributor to ambient PM in cities. To reduce air pollution in cities and the substantial disease burden it causes, solutions to sustainably reduce ambient PM from traffic, industrial activities and biomass burning should urgently be sought. However, further efforts are required to improve data availability and evaluation, and possibly to combine them with other types of information, in view of increasing their usefulness for policy making.
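The population-weighted averaging described above can be sketched in a few lines. This is a hypothetical illustration, not code from the study: city names, populations and shares are invented, and only three of the six source categories are shown.

```python
def weighted_shares(records):
    """Average per-city source shares, weighted by city population."""
    total_pop = sum(pop for pop, _ in records)
    sources = records[0][1].keys()
    return {src: sum(pop * shares[src] for pop, shares in records) / total_pop
            for src in sources}

cities = [
    # (population, {source: share of ambient PM2.5}) -- invented numbers
    (8_000_000, {"traffic": 0.30, "industry": 0.20, "domestic": 0.15}),
    (2_000_000, {"traffic": 0.20, "industry": 0.10, "domestic": 0.30}),
]
regional = weighted_shares(cities)
print({k: round(v, 3) for k, v in regional.items()})
# → {'traffic': 0.28, 'industry': 0.18, 'domestic': 0.18}
```

Weighting by population makes the regional figure reflect exposure rather than a simple mean over cities of very different sizes.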
Ruiz-Vera, Tania; Pruneda-Álvarez, Lucia G; Ochoa-Martínez, Ángeles C; Ramírez-GarcíaLuna, José L; Pierdant-Pérez, Mauricio; Gordillo-Moscoso, Antonio A; Pérez-Vázquez, Francisco J; Pérez-Maldonado, Iván N
2015-09-01
The use of solid fuels for cooking and heating is likely to be the largest source of indoor air pollution on a global scale; these fuels emit substantial amounts of toxic pollutants such as polycyclic aromatic hydrocarbons (PAHs) when used in simple cooking stoves (such as open "three-stone" fires). Moreover, indoor air pollution from biomass fuels is considered an important risk factor for human health. The aim of this study was to evaluate the relationship between exposure to PAHs from wood smoke and vascular dysfunction in a group of Mexican women who use biomass combustion as their main energy source inside their homes. We used 1-hydroxypyrene (1-OHP) as an exposure biomarker for PAHs, assessed using high-performance liquid chromatography. The endothelium-dependent vasodilation was assessed through a vascular reactivity compression test performed with a pneumatic cuff under visualization of the brachial artery using high-resolution ultrasonography (HRU). Assessment of the carotid intima-media thickness (CIMT) was used as an atherosclerosis biomarker (also assessed using HRU); and clinical parameters such as anthropometry, blood pressure, glucose, triglycerides, total cholesterol, HDL cholesterol, and LDL cholesterol, among others, were also evaluated. The mean concentration of urinary 1-OHP found in exposed women was 0.46±0.32μmol/mol Cr (range: 0.086-1.23μmol/mol Cr). Moreover, vascular dysfunction (diminished endothelium-dependent vasodilation) was found in 45% of the women participating in the study. A significant association between vascular function and 1-OHP levels was found through logistic regression analysis (p=0.034; r(2)=0.1329). Furthermore, no association was found between CIMT and clinical parameters, urinary 1-OHP levels, or vascular dysfunction. 
Therefore, with the information obtained in this study, we advocate for the need to implement programs to reduce the risk of exposure to PAHs in communities that use biomass fuels as a main energy source. Copyright © 2015 Elsevier B.V. All rights reserved.
Pozhitkov, Alex E; Noble, Peter A; Bryk, Jarosław; Tautz, Diethard
2014-01-01
Although microarrays are widely used analysis tools in biomedical research, they are known to yield noisy output that usually requires experimental confirmation. To tackle this problem, many studies have developed rules for optimizing probe design and devised complex statistical tools to analyze the output. However, less emphasis has been placed on systematically identifying the noise component as part of the experimental procedure. One source of noise is the variance in probe binding, which can be assessed by replicating array probes. The second source is poor probe performance, which can be assessed by calibrating the array based on a dilution series of target molecules. Using model experiments for copy number variation and gene expression measurements, we investigate here a revised design for microarray experiments that addresses both of these sources of variance. Two custom arrays were used to evaluate the revised design: one based on 25-mer probes from an Affymetrix design and the other based on 60-mer probes from an Agilent design. To assess experimental variance in probe binding, all probes were replicated ten times. To assess probe performance, the probes were calibrated using a dilution series of target molecules and the signal response was fitted to an adsorption model. We found that significant variance of the signal could be controlled by averaging across probes and removing probes that are nonresponsive or poorly responsive in the calibration experiment. Taking this into account, one can obtain a more reliable signal, with the added option of obtaining absolute rather than relative measurements. The assessment of technical variance within the experiments, combined with the calibration of probes, allows the removal of poorly responding probes and yields more reliable signals for the remaining ones. Once an array is properly calibrated, absolute quantification of signals becomes straightforward, alleviating the need for normalization and reference hybridizations.
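The probe-screening idea described above, averaging the replicated probes and then dropping probes that do not respond to the dilution series, can be sketched as follows. This is a hypothetical simplification, not the authors' pipeline: instead of fitting the full adsorption model, it flags non-responsive probes by the correlation of mean signal with target concentration, and all probe names, signals and thresholds are invented.

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def screen_probes(dilutions, probe_signals, r_min=0.9):
    """Keep probes whose mean replicate signal tracks the dilution series."""
    kept = {}
    for probe, replicates in probe_signals.items():
        # average across the replicated spots at each dilution step
        means = [sum(col) / len(col) for col in zip(*replicates)]
        if pearson(dilutions, means) >= r_min:
            kept[probe] = means
    return kept

dilutions = [1, 2, 4, 8]   # relative target concentrations
signals = {
    "responsive": [[10, 21, 39, 82], [12, 19, 41, 78]],  # tracks the series
    "flat":       [[50, 49, 51, 50], [48, 52, 50, 49]],  # non-responsive
}
print(sorted(screen_probes(dilutions, signals)))   # → ['responsive']
```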
Pharmacy student absenteeism and academic performance.
Hidayat, Levita; Vansal, Sandeep; Kim, Esther; Sullivan, Maureen; Salbu, Rebecca
2012-02-10
To assess the association of pharmacy students' personal characteristics with absenteeism and academic performance. A survey instrument was distributed to first- (P1) and second-year (P2) pharmacy students to gather characteristics including employment status, travel time to school, and primary source of educational funding. In addition, absences from specific courses and reasons for not attending classes were assessed. Participants were divided into "high" and "low" performers based on grade point average. One hundred sixty survey instruments were completed and 135 (84.3%) were included in the study analysis. Low performers were significantly more likely than high performers to have missed more than 8 hours in therapeutics courses. Low performers were significantly more likely than high performers to miss class when the class was held before or after an examination, and low performers were significantly more likely to believe that participating in class did not benefit them. There was a negative association between the number of hours students missed and their performance in specific courses. These findings provide further insight into the reasons for students' absenteeism in a college or school of pharmacy setting.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fisk, William J.; Destaillats, H.; Apte, M.G.
Heating, ventilating, and cooling classrooms in California consume substantial electrical energy. Indoor air quality (IAQ) in classrooms affects student health and performance. In addition to airborne pollutants that are emitted directly by indoor sources and those generated outdoors, secondary pollutants can be formed indoors by chemical reaction of ozone with other chemicals and materials. Filters are used in nearly all classroom heating, ventilation and air-conditioning (HVAC) systems to maintain energy-efficient HVAC performance and improve indoor air quality; however, recent evidence indicates that ozone reactions with filters may, in fact, be a source of secondary pollutants. This project quantitatively evaluated ozone deposition in HVAC filters and byproduct formation, and provided a preliminary assessment of the extent to which filter systems are degrading indoor air quality. The preliminary information obtained will contribute to the design of subsequent research efforts and the identification of energy-efficient solutions that improve indoor air quality in classrooms and the health and performance of students.
AutoLens: Automated Modeling of a Strong Lens's Light, Mass and Source
NASA Astrophysics Data System (ADS)
Nightingale, J. W.; Dye, S.; Massey, Richard J.
2018-05-01
This work presents AutoLens, the first entirely automated modeling suite for the analysis of galaxy-scale strong gravitational lenses. AutoLens simultaneously models the lens galaxy's light and mass whilst reconstructing the extended source galaxy on an adaptive pixel-grid. The method's approach to source-plane discretization is amorphous, adapting its clustering and regularization to the intrinsic properties of the lensed source. The lens's light is fitted using a superposition of Sersic functions, allowing AutoLens to cleanly deblend its light from the source. Single-component mass models representing the lens's total mass density profile are demonstrated, which in conjunction with light modeling can detect central images using a centrally cored profile. Decomposed mass modeling is also shown, which can fully decouple a lens's light and dark matter and determine whether the two components are geometrically aligned. The complexity of the light and mass models is automatically chosen via Bayesian model comparison. These steps form AutoLens's automated analysis pipeline, such that all results in this work are generated without any user intervention. This is rigorously tested on a large suite of simulated images, assessing its performance on a broad range of lens profiles, source morphologies and lensing geometries. The method's performance is excellent, with accurate light, mass and source profiles inferred for data sets representative of both existing Hubble imaging and future Euclid wide-field observations.
Munguía-Rosas, Miguel A.; Campos-Navarrete, María J.; Parra-Tabla, Víctor
2013-01-01
Dimorphic cleistogamy is a specialized form of mixed mating system in which a single plant produces both open, potentially outcrossed chasmogamous (CH) and closed, obligately self-pollinated cleistogamous (CL) flowers. Typically, CH flowers and seeds are bigger and energetically more costly than those of CL. Although the effects of inbreeding and floral dimorphism are critical to understanding the evolution and maintenance of cleistogamy, these effects have been repeatedly confounded. In an attempt to separate these effects, we compared the performance of progeny derived from the two floral morphs while controlling for the source of pollen. That is, flower type and pollen source effects were assessed by comparing the performance of progeny derived from selfed CH vs. CL and outcrossed CH vs. selfed CH flowers, respectively. The experiment was carried out with the herb Ruellia nudiflora under two contrasting light environments. Outcrossed progeny generally performed better than selfed progeny. However, inbreeding depression ranged from low (1%) to moderate (36%), with the greatest value detected under shaded conditions when cumulative fitness was used. Although flower type generally had less of an effect on progeny performance than pollen source did, the progeny derived from selfed CH flowers largely outperformed the progeny from CL flowers, but only under shaded conditions and when cumulative fitness was taken into account. On the other hand, the source of pollen and flower type influenced seed predation, with selfed CH progeny being the most heavily attacked by predators. Therefore, the effects of pollen source and flower type are environment-dependent, and seed predators may increase the genetic differences between progeny derived from CH and CL flowers. Inbreeding depression alone cannot account for the maintenance of a mixed mating system in R. nudiflora, and other unidentified mechanisms must thus be involved. PMID:24260515
Risk assessment for tephra dispersal and sedimentation: the example of four Icelandic volcanoes
NASA Astrophysics Data System (ADS)
Biass, Sebastien; Scaini, Chiara; Bonadonna, Costanza; Smith, Kate; Folch, Arnau; Höskuldsson, Armann; Galderisi, Adriana
2014-05-01
In order to assist the elaboration of proactive measures for the management of future Icelandic volcanic eruptions, we developed a new approach to assess the impact associated with tephra dispersal and sedimentation at various scales and for multiple sources. Target volcanoes are Hekla, Katla, Eyjafjallajökull and Askja, selected for their high probabilities of eruption and/or their high potential impact. We combined stratigraphic studies, probabilistic strategies and numerical modelling to develop comprehensive eruption scenarios and compile hazard maps for local ground deposition and regional atmospheric concentration using both the TEPHRA2 and FALL3D models. New algorithms for the identification of comprehensive probability density functions of eruptive source parameters were developed for both short- and long-lasting activity scenarios. A vulnerability assessment of socioeconomic and territorial aspects was also performed at both national and continental scales. Relevant vulnerability indicators were used to identify the most critical areas and territorial nodes. At a national scale, the vulnerability of economic activities and the accessibility of critical infrastructures were assessed. At a continental scale, we assessed the vulnerability of the main airline routes and airports. The resulting impact and risk were finally assessed by combining the hazard and vulnerability analyses.
Konshina, Lidia G
2016-01-01
The availability of high-quality drinking water is currently one of the most acute problems in the Russian Federation. An analysis was performed of the chemical composition of drinking water from sources of decentralized supply serving inhabitants of the city of Yekaterinburg and the surrounding areas. Average values of water quality indices in wells for individual use in the Yekaterinburg district did not exceed the standards, with the exception of manganese content. In some sources, elevated values were revealed for chromaticity, oxidability, hardness, and the content of iron, nitrates, barium, dry residue, ammonium nitrogen, and silicon. The percentage of sources that do not meet hygienic requirements on a number of indices can reach 21-23%.
Temporal resolution and motion artifacts in single-source and dual-source cardiac CT.
Schöndube, Harald; Allmendinger, Thomas; Stierstorfer, Karl; Bruder, Herbert; Flohr, Thomas
2013-03-01
The temporal resolution of a given image in cardiac computed tomography (CT) has so far mostly been determined from the amount of CT data employed for the reconstruction of that image. The purpose of this paper is to examine the applicability of such measures to the newly introduced modality of dual-source CT as well as to methods aiming to provide improved temporal resolution by means of an advanced image reconstruction algorithm. To provide a solid base for the examinations described in this paper, an extensive review of temporal resolution in conventional single-source CT is given first. Two different measures for assessing temporal resolution with respect to the amount of data involved are introduced, namely, either taking the full width at half maximum of the respective data weighting function (FWHM-TR) or the total width of the weighting function (total TR) as a base of the assessment. Image reconstruction using both a direct fan-beam filtered backprojection with Parker weighting as well as using a parallel-beam rebinning step are considered. The theory of assessing temporal resolution by means of the data involved is then extended to dual-source CT. Finally, three different advanced iterative reconstruction methods that all use the same input data are compared with respect to the resulting motion artifact level. For brevity and simplicity, the examinations are limited to two-dimensional data acquisition and reconstruction. However, all results and conclusions presented in this paper are also directly applicable to both circular and helical cone-beam CT. While the concept of total TR can directly be applied to dual-source CT, the definition of the FWHM of a weighting function needs to be slightly extended to be applicable to this modality. 
The three different advanced iterative reconstruction methods examined in this paper result in significantly different images with respect to their motion artifact level, despite exactly the same amount of data being used in the reconstruction process. The concept of assessing temporal resolution by means of the data employed for reconstruction can nicely be extended from single-source to dual-source CT. However, for advanced (possibly nonlinear iterative) reconstruction algorithms the examined approach fails to deliver accurate results. New methods and measures to assess the temporal resolution of CT images need to be developed to be able to accurately compare the performance of such algorithms.
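The FWHM notion of temporal resolution discussed above can be illustrated numerically: given a sampled data-weighting function, the FWHM-TR is the width of the window over which the weight exceeds half its maximum, while the total TR is the full extent of the data window. The triangular weighting and the millisecond time base below are invented for illustration and are not taken from the paper.

```python
def fwhm(t, w):
    """Full width at half maximum of a sampled weighting function."""
    half = max(w) / 2.0
    above = [ti for ti, wi in zip(t, w) if wi >= half]
    return max(above) - min(above)

# triangular data weighting over a 200 ms window, sampled at 1 ms
t = list(range(201))                           # time in ms
w = [1.0 - abs(ti - 100) / 100.0 for ti in t]  # peak weight at t = 100 ms

print(f"total TR = {t[-1] - t[0]} ms, FWHM TR = {fwhm(t, w)} ms")
# → total TR = 200 ms, FWHM TR = 100 ms
```

The gap between the two measures is exactly the ambiguity the paper examines: reconstructions using the same total data window can weight it very differently.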
Mission Applicability and Benefits of Thin-Film Integrated Power Generation and Energy Storage
NASA Technical Reports Server (NTRS)
Hoffman, David; Raffaelle, Ryne P.; Landis, Geoffrey A.; Hepp, Aloysius F.
2001-01-01
This paper discusses the space mission applicability and benefits of a thin-film integrated power generation and energy storage device, i.e., an "Integrated Power Source" or IPS. The characteristics of an IPS that combines thin-film photo-voltaic power generation with thin-film energy storage are described. Mission concepts for a thin-film IPS as a spacecraft main electrical power system, as a decentralized or distributed power source and as an uninterruptible power supply are discussed. For two specific missions, preliminary sizing of an IPS as a main power system is performed and benefits are assessed. IPS developmental challenges that need to be overcome in order to realize the benefits of an IPS are examined. Based on this preliminary assessment, it is concluded that the most likely and beneficial application of an IPS will be as the main power system on a very small "nanosatellite," or in specialized applications serving as a decentralized or distributed power source or uninterruptible power supply.
Mission Applicability and Benefits of Thin-Film Integrated Power Generation and Energy Storage
NASA Technical Reports Server (NTRS)
Hoffman, David J.; Raffaelle, Ryne P.; Landis, Geoffrey A.; Hepp, Aloysius F.
2001-01-01
This paper discusses the space mission applicability and benefits of a thin-film integrated power generation and energy storage device, i.e., an "Integrated Power Source" or IPS. The characteristics of an IPS that combines thin-film photovoltaic power generation with thin-film energy storage are described. Mission concepts for a thin-film IPS as a spacecraft main electrical power system, as a decentralized or distributed power source and as an uninterruptible power supply are discussed. For two specific missions, preliminary sizing of an IPS as a main power system is performed and benefits are assessed. IPS developmental challenges that need to be overcome in order to realize the benefits of an IPS are examined. Based on this preliminary assessment, it is concluded that the most likely and beneficial application of an IPS will be as the main power system on a very small "nanosatellite," or in specialized applications serving as a decentralized or distributed power source or uninterruptible power supply.
Matching radio catalogues with realistic geometry: application to SWIRE and ATLAS
NASA Astrophysics Data System (ADS)
Fan, Dongwei; Budavári, Tamás; Norris, Ray P.; Hopkins, Andrew M.
2015-08-01
Cross-matching catalogues at different wavelengths is a difficult problem in astronomy, especially when the objects are not point-like. At radio wavelengths, an object can have several components corresponding, for example, to a core and lobes. Since not all radio detections correspond to visible or infrared sources, matching these catalogues can be challenging. Traditionally, this is done by eye to ensure quality, an approach that does not scale to the large data volumes expected from the next generation of radio telescopes. We present a novel automated procedure, using Bayesian hypothesis testing, to achieve reliable associations by explicitly modelling a particular class of radio-source morphology. The new algorithm not only assesses the likelihood of an association between data at two different wavelengths, but also tries to assess whether different radio sources are physically associated, are double-lobed radio galaxies, or are just distinct nearby objects. Application to the Spitzer Wide-Area Infrared Extragalactic and Australia Telescope Large Area Survey CDF-S catalogues shows that this method performs well without human intervention.
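The core of such a Bayesian cross-identification can be sketched as a likelihood ratio between a "same source" hypothesis (positional offset drawn from a Gaussian of width set by the astrometric errors) and a "chance alignment" hypothesis (counterpart drawn from the background source density). The actual procedure additionally models radio-source morphology and multi-component hypotheses; this two-hypothesis version with invented numbers only illustrates the idea.

```python
import math

def match_likelihood_ratio(sep, sigma, bg_density):
    """Likelihood ratio of 'same source' vs 'chance alignment'.

    sep: angular separation (arcsec); sigma: combined positional
    uncertainty (arcsec); bg_density: counterparts per square arcsec.
    """
    # H1: the offset follows a circular Gaussian of width sigma
    p_match = math.exp(-sep**2 / (2 * sigma**2)) / (2 * math.pi * sigma**2)
    # H0: the nearby detection is an unrelated field source
    p_chance = bg_density
    return p_match / p_chance

print(match_likelihood_ratio(0.5, 1.0, 1e-4) > 1)  # close pair: match favoured
print(match_likelihood_ratio(5.0, 1.0, 1e-4) > 1)  # distant pair: rejected
# → True, then False
```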
NASA Astrophysics Data System (ADS)
Upton, D. W.; Saeed, B. I.; Mather, P. J.; Lazaridis, P. I.; Vieira, M. F. Q.; Atkinson, R. C.; Tachtatzis, C.; Garcia, M. S.; Judd, M. D.; Glover, I. A.
2018-03-01
Monitoring of partial discharge (PD) activity within high-voltage electrical environments is increasingly used for the assessment of insulation condition. Traditional measurement techniques employ technologies that either require off-line installation or have high power consumption and are hence costly. A wireless sensor network is proposed that utilizes only received signal strength to locate areas of PD activity within a high-voltage electricity substation. The network comprises low-power and low-cost radiometric sensor nodes which receive the radiation propagated from a source of PD. Results are reported from several empirical tests performed within a large indoor environment and a substation environment using a network of nine sensor nodes. A portable PD source emulator was placed at multiple locations within the network. Signal strength measured by the nodes is reported via WirelessHART to a data collection hub where it is processed using a location algorithm. The results obtained place the measured location within 2 m of the actual source location.
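A minimal sketch of received-signal-strength (RSS) localization of the kind described above: candidate positions on a grid are scored by how well a log-distance path-loss model reproduces the strengths reported by the nodes. The node layout, path-loss parameters and the grid-search algorithm itself are invented for illustration and are not the authors' method.

```python
import math

def rss_model(src, node, p0=-30.0, n=2.0):
    """Log-distance path-loss: predicted RSS (dBm) at 'node' from 'src'."""
    d = math.dist(src, node)
    return p0 - 10 * n * math.log10(max(d, 0.1))  # guard against log(0)

def locate(nodes, rss, step=0.5, extent=20.0):
    """Grid search minimising squared RSS mismatch over candidate positions."""
    best, best_err = None, float("inf")
    steps = int(extent / step) + 1
    for i in range(steps):
        for j in range(steps):
            cand = (i * step, j * step)
            err = sum((rss_model(cand, nd) - r) ** 2 for nd, r in zip(nodes, rss))
            if err < best_err:
                best, best_err = cand, err
    return best

nodes = [(0, 0), (20, 0), (0, 20), (20, 20), (10, 0), (0, 10)]  # sensor grid (m)
true_src = (6.0, 9.0)
measured = [rss_model(true_src, nd) for nd in nodes]  # noise-free for the demo
print(locate(nodes, measured))  # → (6.0, 9.0)
```

With noisy measurements the same search returns the best-fitting grid cell, which is how a network of cheap radiometric nodes can bound a PD source to within a few metres.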
Energy and exergy assessments for an enhanced use of energy in buildings
NASA Astrophysics Data System (ADS)
Goncalves, Pedro Manuel Ferreira
Exergy analysis has been found to be a useful method for improving the conversion efficiency of energy resources, since it helps to identify the locations, types and true magnitudes of wastes and losses. It has also been applied for other purposes, such as distinguishing high- from low-quality energy sources or defining the engineering technological limits in designing more energy-efficient systems. In this doctoral thesis, exergy analysis is widely applied in order to highlight and demonstrate it as a significant method of performing energy assessments of buildings and related energy supply systems. It aims to make the concept more familiar and accessible for building professionals and to encourage its wider use in engineering practice. Case study I aims to show the importance of exergy analysis in the energy performance assessment of eight space heating building options evaluated under different outdoor environmental conditions. This study is concerned with the so-called "reference state", which in this study is calculated using the average outdoor temperature for a given period of analysis. Primary energy and related exergy ratios are assessed and compared. Higher primary exergy ratios are obtained for low outdoor temperatures, while the primary energy ratios remain constant for the same scenarios. The outcomes of this study demonstrate the significance of exergy analysis in comparison with energy analysis when different reference states are compared. Case study II and Case study III present two energy and exergy assessment studies applied to a hotel and a student accommodation building, respectively. Case study II compares the energy and exergy performance of the main end uses of a hotel building located in Coimbra in central Portugal, using data derived from an energy audit. Case study III uses data collected from energy utility bills to estimate the energy and exergy performance associated with each building end use. 
Additionally, a set of energy supply options is proposed and assessed in terms of primary energy demand and exergy efficiency, illustrating a possible benchmarking method for future legislative frameworks on the energy performance assessment of buildings. Case study IV proposes a set of complementary indicators for comparing cogeneration and separate heat and electricity production systems. It aims to identify the advantages of exergy analysis relative to energy analysis, giving particular examples where these advantages are significant. The results demonstrate that exergy analysis can reveal meaningful information that might not be accessible using a conventional energy analysis approach, which is particularly evident when cogeneration and separate systems provide heat at very different temperatures. Case study V follows the exergy analysis method to evaluate the energy and exergy performance of a desiccant cooling system, aiming to assess and locate sources of irreversibility. The results reveal that the natural gas boiler is the most inefficient component of the plant in question, followed by the chiller and heating coil. A set of alternative heating supply options for desiccant wheel regeneration is proposed, showing that, while some renewables may effectively reduce the primary energy demand of the plant, this may not correspond to the optimum level of exergy efficiency. The thermal and chemical exergy components of moist air are also evaluated, as well as the influence of outdoor environmental conditions on the energy/exergy performance of the plant. This research provides knowledge that is essential for the future development of complementary energy- and exergy-based indicators, helping to improve the current methodologies for performance assessment of buildings, cogeneration and desiccant cooling systems. 
The significance of exergy analysis is demonstrated for different types of buildings, which may be located in different climates (reference states) and supplied by different types of energy sources. (Abstract shortened by ProQuest.)
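Case study I's observation, that the exergy ratio of space heating depends on the outdoor reference state while the energy ratio does not, follows from the Carnot factor 1 - T0/T carried by heat delivered at room temperature T when the reference temperature is T0. A small numerical illustration, not taken from the thesis:

```python
def exergy_factor(t_room_c, t_out_c):
    """Carnot factor of heat at room temperature, referenced to outdoors."""
    t_room = t_room_c + 273.15   # K
    t_out = t_out_c + 273.15     # K
    return 1.0 - t_out / t_room

# one unit of heat always carries one unit of energy, but its exergy
# content grows as the outdoor reference temperature drops
for t_out in (-10.0, 0.0, 10.0):
    print(f"T0 = {t_out:+5.1f} C -> exergy factor of 20 C heat = "
          f"{exergy_factor(20.0, t_out):.3f}")
```

This is why an assessment that looks identical in energy terms can rank the same heating options differently once exergy is considered.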
Jones, Andria Q; Dewey, Catherine E; Doré, Kathryn; Majowicz, Shannon E; McEwen, Scott A; Waltner-Toews, David
2006-01-01
Background Exposure assessment is typically the greatest weakness of epidemiologic studies of disinfection by-products (DBPs) in drinking water, which largely stems from the difficulty of obtaining accurate data on individual-level water consumption patterns and activity. Thus, surrogate measures for such waterborne exposures are commonly used. Little attention, however, has been directed towards formal validation of these measures. Methods We conducted a study in the City of Hamilton, Ontario (Canada) in 2001–2002, to assess the accuracy of two surrogate measures of home water source: (a) urban/rural status as assigned using residential postal codes, and (b) mapping of residential postal codes to municipal water systems within a Geographic Information System (GIS). We then assessed the accuracy of a commonly used surrogate measure of an individual's actual drinking water source, namely, their home water source. Results The surrogates for home water source provided good classification of residents served by municipal water systems (approximately 98% predictive value), but did not perform well in classifying those served by private water systems (average: 63.5% predictive value). More importantly, we found that home water source was a poor surrogate measure of the individuals' actual drinking water source(s), being associated with high misclassification errors. Conclusion This study demonstrated substantial misclassification errors associated with a surrogate measure commonly used in studies of drinking water disinfection by-products. Further, the limited accuracy of two surrogate measures of an individual's home water source urges caution in their use in exposure classification methodology. While these surrogates are inexpensive and convenient, they should not be substituted for direct collection of accurate data pertaining to the subjects' waterborne disease exposure. 
In instances where such surrogates must be used, estimation of the misclassification and its subsequent effects are recommended for the interpretation and communication of results. Our results also lend support for further investigation into the quantification of the exposure misclassification associated with these surrogate measures, which would provide useful estimates for consideration in interpretation of waterborne disease studies. PMID:16729887
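The predictive values quoted above follow the usual definition: of all subjects the surrogate assigns to a class, the fraction that truly belong to it. A toy calculation with invented counts:

```python
def predictive_value(true_positive, false_positive):
    """Fraction of surrogate-positive subjects that are truly positive."""
    return true_positive / (true_positive + false_positive)

# surrogate labels 1000 residences "municipal water"; 980 truly are municipal
print(f"{predictive_value(980, 20):.1%}")   # → 98.0%
```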
Analysis of Discrete-Source Damage Progression in a Tensile Stiffened Composite Panel
NASA Technical Reports Server (NTRS)
Wang, John T.; Lotts, Christine G.; Sleight, David W.
1999-01-01
This paper demonstrates the progressive failure analysis capability of NASA Langley's COMET-AR finite element analysis code on a large-scale built-up composite structure. A large-scale five-stringer composite panel with a 7-in.-long discrete source damage was analyzed from initial loading to final failure, including geometric and material nonlinearities. Predictions using different mesh sizes, different saw-cut modeling approaches, and different failure criteria were performed and assessed. All failure predictions correlated reasonably well with the test result.
NASA Astrophysics Data System (ADS)
Ciaffoni, L.; Hancock, G.; Hurst, P. L.; Kingston, M.; Langley, C. E.; Peverall, R.; Ritchie, G. A. D.; Whittaker, K. E.
2013-02-01
In this paper we report the characterization of a novel, widely tunable, diode laser source operating over the full telecom L-band (1563-1613 nm), namely the digital supermode distributed Bragg reflector (DS-DBR) laser, and its application to multi-wavelength gas sensing via absorption strategies. The spectroscopic performance of the laser has been assessed by investigating the ro-vibrational spectrum of CO2, and wavelength modulation spectroscopy was accomplished for proof-of-principle sensitive measurements in discrete spectral regions.
Explosion localization via infrasound.
Szuberla, Curt A L; Olson, John V; Arnoult, Kenneth M
2009-11-01
Two acoustic source localization techniques were applied to infrasonic data and their relative performance was assessed. The standard approach for low-frequency localization uses an ensemble of small arrays to separately estimate far-field source bearings, triangulating a solution from the various back azimuths. This method was compared to one developed by the authors that treats the smaller subarrays as a single meta-array. In numerical simulation and a field experiment, the latter technique was found to provide improved localization precision everywhere in the vicinity of a 3-km-aperture meta-array, often by an order of magnitude.
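The standard far-field approach described above reduces to intersecting bearing rays from several arrays. A minimal illustrative sketch of that step, a least-squares intersection of back-azimuth rays, follows; this is a reconstruction under stated assumptions, not the authors' code, and the coordinates and bearings are invented:

```python
import numpy as np

def triangulate(positions, azimuths_deg):
    """Least-squares intersection of back-azimuth rays from several arrays.

    positions    -- (N, 2) array coordinates (east, north)
    azimuths_deg -- N bearings, in degrees clockwise from north
    Returns the point minimizing summed squared perpendicular distance to the rays.
    """
    P = np.asarray(positions, dtype=float)
    az = np.radians(np.asarray(azimuths_deg, dtype=float))
    D = np.column_stack([np.sin(az), np.cos(az)])  # unit ray directions
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, d in zip(P, D):
        M = np.eye(2) - np.outer(d, d)  # projector onto the ray's normal
        A += M
        b += M @ p
    return np.linalg.solve(A, b)

# Two arrays whose bearings (45 and 315 degrees) cross northeast of the first:
source = triangulate([(0.0, 0.0), (2.0, 0.0)], [45.0, 315.0])
```

With more than two arrays the same normal equations give the overdetermined least-squares fix, which is where the meta-array comparison in the abstract becomes relevant.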
FY2004 SYSTEM ENGINEER PROGRAM MANAGER ANNUAL REPORT
DOE Office of Scientific and Technical Information (OSTI.GOV)
JACKSON, G.J.
2004-10-29
During FY 2004, reviews of the FH System Engineer (SE) Program were conducted by the Independent Assessment (IA) Group. The results of these reviews are summarized as part of this document. Additional reviews were performed by FH Engineering personnel. SE Engineering reviews performed include Periodic Walkdowns (typically quarterly) by the SEs, a review of System Notebooks by the System Engineer Program Manager (SEPM), an annual status report by each SE, and an annual status report by each of the Project Chief Engineers (PCEs). FY 2004 marked the completion of the first round of Vital Safety System (VSS) assessments. Each of the VSSs on the FH VSS list has been evaluated at least once, either by the FH Independent Assessment organization or as part of a DOE Phase II assessment. Following the completion of the K-Basins Assessment in May 2004, a review of the VSS assessment process was completed. Criteria were developed by FH, and concurred with by RL, to determine the frequency and priority of future VSS assessments. Additional actions have been taken to increase the visibility and emphasis assigned to VSSs. Completion of several Documented Safety Analyses (DSA), in combination with efforts to remove source term materials from several facilities, enabled the number of systems on the FH VSS list to be reduced from 60 at the beginning of FY 2004 to 48 by the end of FY 2004. It is expected that there will be further changes to the FH VSS list based on additional DSA revisions and continued progress toward reduction of source terms across the Hanford Site. Other new VSSs may be added to the list to reflect the relocation of materials away from the River Corridor to interim storage locations on the Central Plateau.
Seyring, Nicole; Dollhofer, Marie; Weißenbacher, Jakob; Bakas, Ioannis; McKinnon, David
2016-09-01
The Waste Framework Directive obliged European Union Member States to set up separate collection systems to promote high-quality recycling for at least paper, metal, plastic and glass by 2015. As implementation of the requirement varies across European Union Member States, the European Commission contracted BiPRO GmbH/Copenhagen Resource Institute to assess the separate collection schemes in the 28 European Union Member States, focusing on capital cities and on metal, plastic, glass (with packaging as the main source), paper/cardboard and bio-waste. The study includes an assessment of the legal framework for, and the practical implementation of, collection systems in the European Union-28 Member States and an in-depth analysis of the systems applied in all capital cities. It covers collection systems that collect one or more of the five waste streams separately from residual waste/mixed municipal waste at source (including strict separation, co-mingled systems, door-to-door and bring-point collection, and civic amenity sites). A scoreboard of 13 indicators is elaborated in order to measure the performance of the systems, with capture rates as the key indicators for identifying the best performers. The best performers are the cities of Ljubljana, Helsinki and Tallinn, leading to the key conclusion that door-to-door collection, at least for paper and bio-waste, and the implementation of pay-as-you-throw schemes result in high capture and thus high recycling rates of packaging and other municipal waste. © The Author(s) 2016.
Grulkowski, Ireneusz; Nowak, Jan K.; Karnowski, Karol; Zebryk, Paweł; Puszczewicz, Mariusz; Walkowiak, Jaroslaw; Wojtkowski, Maciej
2013-01-01
Three-dimensional imaging of the mucosa of the lower lip and labial minor salivary glands is demonstrated in vivo using swept source optical coherence tomography (OCT) system at 1310 nm with modified interface. Volumetric data sets of the inner surface of the lower lip covering ~230 mm2 field are obtained from patients with Sjögren’s syndrome and a control group. OCT enables high-resolution visualization of mucosal architecture using cross-sectional images as well as en-face projection images. Comprehensive morphometry of the labial minor salivary glands is performed, and statistical significance is assessed. Statistically significant differences in morphometric parameters are found when subgroups of patients with Sjögren’s syndrome are analyzed. PMID:24466492
The antimicrobial activity of honey against common equine wound bacterial isolates.
Carnwath, R; Graham, E M; Reynolds, K; Pollock, P J
2014-01-01
Delayed healing associated with distal limb wounds is a particular problem in equine clinical practice. Recent studies in human beings and other species have demonstrated the beneficial wound healing properties of honey, and medical grade honey dressings are available commercially in equine practice. Equine clinicians are reported to source other non-medical grade honeys for the same purpose. This study aimed to assess the antimicrobial activity of a number of honey types against common equine wound bacterial pathogens. Twenty-nine honey products were sourced, including gamma-irradiated and non-irradiated commercial medical grade honeys, supermarket honeys, and honeys from local beekeepers. To exclude contaminated honeys from the project, all honeys were cultured aerobically for evidence of bacterial contamination. Aerobic bacteria or fungi were recovered from 18 products. The antimicrobial activity of the remaining 11 products was assessed against 10 wound bacteria recovered from the wounds of horses, including methicillin resistant Staphylococcus aureus and Pseudomonas aeruginosa. Eight products were effective against all 10 bacterial isolates at concentrations varying from <2% to 16% (v/v). Overall, the Scottish heather honey was the best performing product, and inhibited the growth of all 10 bacterial isolates at concentrations ranging from <2% to 6% (v/v). Although Manuka has been the most studied honey to date, other sources may have valuable antimicrobial properties. Since some honeys were found to be contaminated with aerobic bacteria or fungi, non-sterile honeys may not be suitable for wound treatment. Further assessment of gamma-irradiated versions of the best performing honeys would be useful. Copyright © 2013 Elsevier Ltd. All rights reserved.
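The dilution-series readout described above (inhibition at concentrations from <2% to 16% v/v) reduces to reporting the lowest tested concentration that inhibited growth for each honey/isolate pair. A minimal sketch of that bookkeeping step; the function name and all values are hypothetical, not the study's data:

```python
def lowest_inhibitory_concentration(results):
    """results: dict mapping tested concentration (% v/v) to whether bacterial
    growth was inhibited at that concentration (True/False).
    Returns the lowest inhibiting concentration tested, or None if none inhibited.
    """
    inhibiting = [conc for conc, inhibited in sorted(results.items()) if inhibited]
    return inhibiting[0] if inhibiting else None

# A doubling-dilution series for one honey/isolate pair (illustrative values):
mic = lowest_inhibitory_concentration({1: False, 2: True, 4: True, 8: True, 16: True})
```

Note that the true endpoint lies between the lowest inhibiting and the highest non-inhibiting dilution, which is why the abstract reports "<2%" when even the lowest tested concentration inhibited growth.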
NASA Technical Reports Server (NTRS)
Bateman, Monte; Mach, Douglas; Blakeslee, Richard J.; Koshak, William
2018-01-01
As part of the calibration/validation (cal/val) effort for the Geostationary Lightning Mapper (GLM) on GOES-16, we need to assess instrument performance (detection efficiency and accuracy). One major effort is to calculate the detection efficiency of GLM by comparing to multiple ground-based systems. These comparisons will be done pair-wise between GLM and each other source. A complication in this process is that the ground-based systems sense different properties of the lightning signal than does GLM (e.g., RF vs. optical). Also, each system has a different time and space resolution and accuracy. Preliminary results indicate that GLM is performing at or above its specification.
Noise performance of frequency modulation Kelvin force microscopy
Deresmes, Dominique; Mélin, Thierry
2014-01-01
Noise performance of a phase-locked loop (PLL) based frequency modulation Kelvin force microscope (FM-KFM) is assessed. Noise propagation is modeled step by step throughout the setup using both exact closed loop noise gains and an approximation known as “noise gain” from operational amplifier (OpAmp) design that offers the advantage of decoupling the noise performance study from considerations of stability and ideal loop response. The bandwidth can be chosen depending on how much noise is acceptable, and it is shown that stability is not an issue up to a limit that will be discussed. With thermal and detector noise as the only sources, both approaches yield PLL frequency noise expressions equal to the theoretical value for self-oscillating circuits and in agreement with measurement, demonstrating that the PLL components neither modify nor contribute noise. Kelvin output noise is then investigated by modeling the surrounding bias feedback loop. A design rule is proposed that allows choosing the AC modulation frequency for optimized sharing of the PLL bandwidth between Kelvin and topography loops. A crossover criterion determines, as a function of bandwidth, temperature and probe parameters, whether thermal or detector noise is the dominating noise source. Probe merit factors for both cases are then established, suggesting how to tackle noise performance by probe design. Typical merit factors of common probe types are compared. This comprehensive study is an encouraging step toward a more integral performance assessment and a remedy against focusing on single aspects and optimizing around randomly chosen key values. PMID:24455457
ERIC Educational Resources Information Center
Masson, Michael E. J.; Rotello, Caren M.
2009-01-01
In many cognitive, metacognitive, and perceptual tasks, measurement of performance or prediction accuracy may be influenced by response bias. Signal detection theory provides a means of assessing discrimination accuracy independent of such bias, but its application crucially depends on distributional assumptions. The Goodman-Kruskal gamma…
Identifying Sources of Bias in EFL Writing Assessment through Multiple Trait Scoring
ERIC Educational Resources Information Center
Salmani-Nodoushan, Mohammad Ali
2009-01-01
For purposes of the present study, it was hypothesized that field (in)dependence would introduce systematic variance into EFL learners' performance on composition tests. 1743 freshman, sophomore, junior, and senior students all majoring in English at different Iranian universities and colleges took the Group Embedded Figures Test (GEFT). The…
Catching the Celtic Tiger by Its Tail
ERIC Educational Resources Information Center
Ferreira, M. Luisa; VanHoudt, Patrick
2004-01-01
The article attempts to assess the major sources behind the exceptional Irish growth performance in the 1990s. Unlike other Tigers, Ireland's growth is due to efficiency gains, rather than capital deepening, but the causes for the swift growth in total factor productivity cannot be pinned down to a single factor. Human capital, foreign direct…
GASOLINE-CONTAMINATED GROUND WATER AS A SOURCE OF RESIDENTIAL BENZENE EXPOSURE: A CASE STUDY
In a private residence using gasoline-contaminated water (approximately 300 ug/l benzene), a series of experiments were performed to assess the potential benzene exposures that may occur in the shower stall, bathroom, master bedroom, and living room as a result of a single 20-min...
Using Technology Pedagogical Content Knowledge Development to Enhance Learning Outcomes
ERIC Educational Resources Information Center
Agyei, Douglas D.; Keengwe, Jared
2014-01-01
This paper describes an intervention in which pre-service teachers developed their TPACK through multiple data sources. Teachers' self-reports of their TPACK knowledge were triangulated with performance-based assessment of their instructional practices and artifacts to give a better understanding and nature of pre-service teachers' TPACK…
ERIC Educational Resources Information Center
Dick, Anthony Steven
2012-01-01
Two experiments examined processes underlying cognitive inflexibility in set-shifting tasks typically used to assess the development of executive function in children. Adult participants performed a Flexible Item Selection Task (FIST) that requires shifting from categorizing by one dimension (e.g., color) to categorizing by a second orthogonal…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allgood, Tiffany L.; Sorter, Andy
The Coeur d'Alene Tribe's Energy Efficiency Feasibility Study (EEFS) is the culminating document that compiles the energy efficiency and building performance assessment and project prioritization process completed on 36 Tribally owned and operated facilities within Tribal lands. The EEFS contains sections on initial findings, utility billing analyses, energy conservation measures and prioritization and funding sources and strategies for energy project implementation.
The role of a detailed aqueous phase source release model in the LANL area G performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vold, E.L.; Shuman, R.; Hollis, D.K.
1995-12-31
A preliminary draft of the Performance Assessment for the Los Alamos National Laboratory (LANL) low-level radioactive waste disposal facility at Area G is currently being completed as required by Department of Energy orders. A detailed review of the inventory data base records and the existing models for source release led to the development of a new modeling capability to describe the liquid phase transport from the waste package volumes. Nuclide quantities are sorted down to four waste package release categories for modeling: rapid release, soil, concrete/sludge, and corrosion. Geochemistry for the waste packages was evaluated in terms of the equilibrium coefficients, Kds, and elemental solubility limits, Csl, interpolated from the literature. Percolation calculations for the base case closure cover show a highly skewed distribution with an average of 4 mm/yr percolation from the disposal unit bottom. The waste release model is based on a compartment representation of the package efflux, and depends on package size, percolation rate or Darcy flux, retardation coefficient, and moisture content.
Caswell, Andrew W; Roy, Sukesh; An, Xinliang; Sanders, Scott T; Schauer, Frederick R; Gord, James R
2013-04-20
Hyperspectral absorption spectroscopy is being used to monitor gas temperature, velocity, pressure, and H2O mole fraction in a research-grade pulsed-detonation combustor (PDC) at the Air Force Research Laboratory. The hyperspectral source employed is termed the TDM 3-FDML because it consists of three time-division-multiplexed (TDM) Fourier-domain mode-locked (FDML) lasers. This optical-fiber-based source monitors sufficient spectral information in the H2O absorption spectrum near 1350 nm to permit measurements over the wide range of conditions encountered throughout the PDC cycle. Doppler velocimetry based on absorption features is accomplished using a counterpropagating beam approach that is designed to minimize common-mode flow noise. The PDC in this study is operated in two configurations: one in which the combustion tube exhausts directly to the ambient environment and another in which it feeds an automotive-style turbocharger to assess the performance of a detonation-driven turbine. Because the enthalpy flow [kJ/s] is important in assessing the performance of the PDC in various configurations, it is calculated from the measured gas properties.
Wu, Tai-luan; Tseng, Ling-li
2017-01-01
This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001–2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system. PMID:29267327
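The analytical tasks the abstract names (sorting, comparison, elimination, aggregation) amount to set arithmetic over normalized record keys. A minimal sketch of how completeness and pairwise overlap could be computed; system names and records are hypothetical, and Jaccard similarity is one reasonable overlap measure, not necessarily the one used in the study:

```python
from itertools import combinations

def coverage_and_overlap(systems):
    """systems: dict mapping system name -> set of normalized record keys
    (e.g. lowercased title plus publication year, to eliminate duplicates).
    Completeness is measured against the union of all systems' records;
    overlap is Jaccard similarity per pair of systems.
    """
    union = set().union(*systems.values())
    completeness = {name: len(recs) / len(union) for name, recs in systems.items()}
    overlap = {
        (a, b): len(systems[a] & systems[b]) / len(systems[a] | systems[b])
        for a, b in combinations(systems, 2)
    }
    return completeness, overlap

completeness, overlap = coverage_and_overlap({
    "systemA": {"rec1", "rec2", "rec3"},
    "systemB": {"rec2", "rec3", "rec4"},
})
```

Key normalization is the hard part in practice (author-name variants, preprint vs. published titles); the set arithmetic itself is straightforward once keys are stable.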
Lamendella, R.; Domingo, J.W.S.; Oerther, D.B.; Vogel, J.R.; Stoeckel, D.M.
2007-01-01
We evaluated the efficacy, sensitivity, host-specificity, and spatial/temporal dynamics of human- and ruminant-specific 16S rRNA gene Bacteroidetes markers used to assess the sources of fecal pollution in a fecally impacted watershed. Phylogenetic analyses of 1271 fecal and environmental 16S rRNA gene clones were also performed to study the diversity of Bacteroidetes in this watershed. The host-specific assays indicated that ruminant feces were present in 28-54% of the water samples and in all sampling seasons, with increasing frequency in downstream sites. The human-targeted assays indicated that only 3-5% of the water samples were positive for human fecal signals, although a higher percentage of human-associated signals (19-24%) were detected in sediment samples. Phylogenetic analysis indicated that 57% of all water clones clustered with yet-to-be-cultured Bacteroidetes species associated with sequences obtained from ruminant feces, further supporting the prevalence of ruminant contamination in this watershed. However, since several clusters contained sequences from multiple sources, future studies need to consider the potential cosmopolitan nature of these bacterial populations when assessing fecal pollution sources using Bacteroidetes markers. Moreover, additional data are needed in order to understand the distribution of Bacteroidetes host-specific markers and their relationship to water quality regulatory standards. © 2006 Federation of European Microbiological Societies.
Monitor-based evaluation of pollutant load from urban stormwater runoff in Beijing.
Liu, Y; Che, W; Li, J
2005-01-01
As a major pollutant source to urban receiving waters, non-point source pollution from urban runoff needs to be well studied and effectively controlled. Based on monitoring data from urban runoff pollutant sources, this article describes a systematic estimation of total pollutant loads from the urban areas of Beijing. A numerical model was developed to quantify the main pollutant loads of urban runoff in Beijing. The method includes a sub-procedure in which the flush process influences both the quantity and quality of stormwater runoff. A statistics-based method was applied to compute the annual pollutant load as an output of the runoff. The proportions of pollutants from point and non-point sources were compared. This provides a scientific basis for proper environmental input assessment of urban stormwater pollution to receiving waters, improvement of infrastructure performance, implementation of urban stormwater management, and utilization of stormwater.
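A statistics-based annual load of the kind described can be sketched as a sum of per-event loads, each an event mean concentration times an event runoff volume. The simple runoff-coefficient formulation, function name, and all values below are assumptions for illustration, not the Beijing model itself:

```python
def annual_pollutant_load_kg(event_rain_mm, area_km2, runoff_coeff, emc_mg_per_l):
    """Sum per-event loads over a year of rainfall events.

    event_rain_mm -- rainfall depths (mm) of the year's runoff-producing events
    area_km2      -- contributing urban catchment area
    runoff_coeff  -- fraction of rainfall converted to runoff (0..1)
    emc_mg_per_l  -- event mean concentration of the pollutant
    """
    area_m2 = area_km2 * 1e6
    total_kg = 0.0
    for depth_mm in event_rain_mm:
        runoff_m3 = runoff_coeff * (depth_mm / 1000.0) * area_m2
        # mg/L times m3 gives grams; divide by 1000 to convert to kg
        total_kg += emc_mg_per_l * runoff_m3 / 1000.0
    return total_kg

# One 10 mm event over 1 km2, runoff coefficient 0.5, EMC 100 mg/L:
load = annual_pollutant_load_kg([10.0], 1.0, 0.5, 100.0)
```

In a fuller treatment the EMC itself would vary per event (the flush-process sub-procedure mentioned above), so it would be indexed per event rather than held constant.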
KSC ice/frost/debris assessment for space shuttle mission STS-29R
NASA Technical Reports Server (NTRS)
Stevenson, Charles G.; Katnik, Gregory N.; Higginbotham, Scott A.
1989-01-01
An ice/frost/debris assessment was conducted for Space Shuttle Mission STS-29R. Debris inspections of the flight elements and launch pad are performed before and after launch. Ice/frost conditions on the external tank are assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by an on-pad visual inspection. High speed photography is analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage. The ice/frost/debris conditions of Mission STS-29R and their effect on the Space Shuttle Program are documented.
Ice/frost/debris assessment for space shuttle mission STS-26R
NASA Technical Reports Server (NTRS)
Stevenson, Charles G.; Katnik, Gregory N.; Higginbotham, Scott A.
1988-01-01
An Ice/Frost/Debris Assessment was conducted for Space Shuttle Mission STS-26R. Debris inspections of the flight elements and launch pad are performed before and after launch. Ice/frost conditions are assessed by use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by an on-pad visual inspection. High speed photography is viewed after launch to identify ice/debris sources and evaluate potential vehicle damage. The ice/frost/debris conditions of Mission STS-26R and their effect on the Space Shuttle Program are documented.
Ice/frost/debris assessment for space shuttle mission STS-27R, December 2, 1988
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Higginbotham, Scott A.
1989-01-01
An Ice/Frost/Debris assessment was conducted for Space Shuttle Mission STS-27R. Debris inspections of the flight elements and launch pad are performed before and after launch. Ice/frost conditions are assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by an on-pad visual inspection. High speed photography is viewed after launch to identify ice/debris sources and evaluate potential vehicle damage. The Ice/Frost/Debris conditions of Mission STS-27R and their effect on the Space Shuttle Program are documented.
An experimental assessment of the imaging quality of the low energy gamma-ray telescope ZEBRA
NASA Technical Reports Server (NTRS)
Butler, R. C.; Caroli, E.; Dicocco, G.; Natalucci, L.; Spada, G.; Spizzichino, A.; Stephen, J. B.; Carter, J. N.; Charalambous, P. M.; Dean, A. J.
1985-01-01
One gamma-ray detection plane of the ZEBRA telescope, consisting of nine position sensitive scintillation crystal bars designed to operate over the spectral range 0.2 to 10 MeV, has been constructed in the laboratory. A series of experimental images has been generated using a scaled down flight pattern mask in conjunction with a diverging gamma-ray beam. Point and extended sources have been imaged in order to assess quantitatively the performance of the system.
NASA Technical Reports Server (NTRS)
Ferguson, R. E.
1985-01-01
The data base verification of the ECLS Systems Assessment Program (ESAP) is documented, and changes made to enhance the flexibility of the water recovery subsystem simulations are given. All changes made to the data base values are described, as are the software enhancements performed. The refined model documented herein constitutes the submittal of the General Cluster Systems Model. A source listing of the current version of ESAP is provided in Appendix A.
Mobley, Joseph R
2005-03-01
Eight aerial surveys were flown north of the Hawaiian island of Kauai during 2001 when the North Pacific Acoustic Laboratory (NPAL) source was not transmitting, and during 2002 and 2003 when it was. All surveys were performed during the period of peak residency of humpback whales (Feb-Mar). During 2002 and 2003, surveys commenced immediately upon cessation of a 24-h cycle of transmissions. Numbers and distribution of whales observed within 40 km of the NPAL source during 2001 (source off) were compared with those observed during 2002 and 2003 (source on). A total of 75 sightings was noted during 2001, as compared with 81 and 55 during 2002 and 2003, respectively. Differences in sighting rates (sightings/km) across years were not statistically significant. Assessment of distributional changes relied upon comparisons of three measures: (a) location depths; (b) distance from the NPAL source; and (c) distance offshore. None of the distributional comparisons revealed statistically significant differences across years. Several possible interpretations are examined: (a) whales have habituated to the NPAL signal; (b) insufficient statistical power exists in the present design to detect any effects; and (c) the effects are short-lived and become undetectable shortly after the cessation of transmissions.
Auralization of Hybrid Wing Body Aircraft Flyover Noise from System Noise Predictions
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Aumann, Aric R.; Lopes, Leonard V.; Burley, Casey L.
2013-01-01
System noise assessments of a state-of-the-art reference aircraft (similar to a Boeing 777-200ER with GE90-like turbofan engines) and several hybrid wing body (HWB) aircraft configurations were recently performed using NASA engine and aircraft system analysis tools. The HWB aircraft were sized to an equivalent mission as the reference aircraft and assessments were performed using measurements of airframe shielding from a series of propulsion airframe aeroacoustic experiments. The focus of this work is to auralize flyover noise from the reference aircraft and the best HWB configuration using source noise predictions and shielding data based largely on the earlier assessments. For each aircraft, three flyover conditions are auralized. These correspond to approach, sideline, and cutback operating states, but flown in straight and level flight trajectories. The auralizations are performed using synthesis and simulation tools developed at NASA. Audio and visual presentations are provided to allow the reader to experience the flyover from the perspective of a listener in the simulated environment.
Memory for Self-Performed Actions in Individuals with Asperger Syndrome
Zalla, Tiziana; Daprati, Elena; Sav, Anca-Maria; Chaste, Pauline; Nico, Daniele; Leboyer, Marion
2010-01-01
Memory for action is enhanced if individuals are allowed to perform the corresponding movements, compared to when they simply listen to them (enactment effect). Previous studies have shown that individuals with Autism Spectrum Disorders (ASD) have difficulties with processes involving the self, such as autobiographical memories and self performed actions. The present study aimed at assessing memory for action in Asperger Syndrome (AS). We investigated whether adults with AS would benefit from the enactment effect when recalling a list of previously performed items vs. items that were only visually and verbally experienced through three experimental tasks (Free Recall, Old/New Recognition and Source Memory). The results showed that while performance on Recognition and Source Memory tasks was preserved in individuals with AS, the enactment effect for self-performed actions was not consistently present, as revealed by the lower number of performed actions being recalled on the Free Recall test, as compared to adults with typical development. Subtle difficulties in encoding specific motor and proprioceptive signals during action execution in individuals with AS might affect retrieval of relevant personal episodic information. These disturbances might be associated to an impaired action monitoring system. PMID:20967277
Ge, Jingping; Zhao, Jingwen; Zhang, Luyan; Zhang, Mengyun; Ping, Wenxiang
2014-01-01
Double labeling of resistance markers and reporter genes can be used to breed engineered Saccharomyces cerevisiae strains that can assimilate xylose and glucose as a mixed carbon source for ethanol fermentation and increased ethanol production. In this study Saccharomyces cerevisiae W5 and Candida shehatae 20335 were used as parent strains to conduct protoplast fusion, and the resulting fusants were screened by double labeling. High performance liquid chromatography (HPLC) was used to assess the ethanol yield following the fermentation of xylose and glucose, as both single and mixed carbon sources, by the fusants. Interestingly, one fusant (ZLYRHZ7) was demonstrated to have excellent fermentation performance, with an ethanol yield using the mixed carbon source of 0.424 g g−1, which compares with 0.240 g g−1 (W5) and 0.353 g g−1 (20335) for the parent strains. This indicates an improvement in the ethanol yield of 43.4% and 16.7%, respectively. PMID:25268957
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kayser, Y., E-mail: yves.kayser@psi.ch; Paul Scherrer Institut, 5232 Villigen-PSI; Błachucki, W.
2014-04-15
The high-resolution von Hamos bent crystal spectrometer of the University of Fribourg was upgraded with a focused X-ray beam source with the aim of performing micro-sized X-ray fluorescence (XRF) measurements in the laboratory. The focused X-ray beam source integrates a collimating optics mounted on a low-power micro-spot X-ray tube and a focusing polycapillary half-lens placed in front of the sample. The performance of the setup was probed in terms of spatial and energy resolution. In particular, the fluorescence intensity and energy resolution of the von Hamos spectrometer equipped with the novel micro-focused X-ray source and with a standard high-power water-cooled X-ray tube were compared. The XRF analysis capability of the new setup was assessed by measuring the dopant distribution within the core of Er-doped SiO2 optical fibers.
Ge, Jingping; Zhao, Jingwen; Zhang, Luyan; Zhang, Mengyun; Ping, Wenxiang
2014-01-01
Double labeling of resistance markers and reporter genes can be used to breed engineered Saccharomyces cerevisiae strains that can assimilate xylose and glucose as a mixed carbon source for ethanol fermentation and increased ethanol production. In this study, Saccharomyces cerevisiae W5 and Candida shehatae 20335 were used as parent strains for protoplast fusion, and the resulting fusants were screened by double labeling. High performance liquid chromatography (HPLC) was used to assess the ethanol yield following the fermentation of xylose and glucose, as both single and mixed carbon sources, by the fusants. Notably, one fusant (ZLYRHZ7) demonstrated excellent fermentation performance, with an ethanol yield on the mixed carbon source of 0.424 g g-1, compared with 0.240 g g-1 (W5) and 0.353 g g-1 (20335) for the parent strains. This indicates an improvement in ethanol yield of 43.4% and 16.7%, respectively.
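As a quick check of the quoted figures, the 43.4% and 16.7% improvements follow when the gain is expressed relative to the fusant's own yield; a minimal sketch (the formula is inferred from the numbers, not stated in the abstract):

```python
# Relative improvement of fusant ZLYRHZ7 over each parent strain, computed as
# (fusant - parent) / fusant * 100. This convention reproduces the quoted
# 43.4% (vs. W5) and 16.7% (vs. 20335) improvements.
def yield_improvement_pct(fusant_yield, parent_yield):
    return (fusant_yield - parent_yield) / fusant_yield * 100.0

fusant_zlyrhz7 = 0.424  # g ethanol per g sugar, mixed carbon source
parent_w5 = 0.240
parent_20335 = 0.353

print(round(yield_improvement_pct(fusant_zlyrhz7, parent_w5), 1))     # 43.4
print(round(yield_improvement_pct(fusant_zlyrhz7, parent_20335), 1))  # 16.7
```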
Within-city contrasts in PM composition and sources and their relationship with nitrogen oxides.
Minguillón, M C; Rivas, I; Aguilera, I; Alastuey, A; Moreno, T; Amato, F; Sunyer, J; Querol, X
2012-10-26
The present work is part of the INMA (INfancia y Medio Ambiente, 'Environment and Childhood') project, which aims at assessing the adverse effects of exposure to air pollution during pregnancy and early in life. The present study was performed in the city of Sabadell (northeast Spain) at three sampling sites covering different traffic characteristics, during two seasons of the year. It assesses temporal and spatial variations of PM2.5 concentrations, chemical components and source contributions, as well as gaseous pollutants. Furthermore, a cross-correlation analysis of PM components and source contributions with gaseous pollutants, used as a proxy for exposure assessment, is carried out. Our data show the influence of traffic emissions in the Sabadell area. The main PM sources identified by Positive Matrix Factorisation (PMF) were similar between the two seasons: a mineral source (traffic-induced resuspension, demolition/construction and natural background), secondary sulphate (higher in summer), secondary nitrate (only during winter), industrial, and road traffic, which was the main contributor to PM2.5 at two of the sites. The correlation of concentrations of nitrogen oxides was especially strong with those of elemental carbon (EC). The relatively weaker correlations with organic carbon (OC) in summer are attributed to the variable formation of secondary OC. Strong correlations between concentrations of nitrogen oxides and PM2.5 road traffic contributions obtained from source apportionment analysis were seen at all sites. Therefore, in the studied urban environment, nitrogen oxides can be used as a proxy for exposure to the road traffic contribution to PM2.5, with NOx concentrations preferred and NO and NO2 as second and third options, respectively.
Whitley, T W; Benson, N H; Allison, E J; Revicki, D A
1989-07-01
A mail survey of members of the National Flight Nurses Association was conducted to assess occupational stress and job satisfaction. In addition to scales measuring stress and job satisfaction, the questionnaire requested demographic information and included a depression scale. The anticipated direct relationship between stress and depression was observed (r = .56, p < .0001), as were the expected inverse relationships between stress and job satisfaction (r = -.54, p < .0001) and between depression and job satisfaction (r = -.45, p < .0001). Responses to statements on the stress scale indicated that work interference with family life and failure to receive recognition were important sources of stress, while avoidance behaviors such as tardiness and daydreaming were used infrequently to cope with stress. Inadequate recognition, particularly by administrators and supervisors, and lack of involvement in decision-making processes surfaced as sources of dissatisfaction, as did inadequate feedback about job performance. The tasks performed by flight nurses and being members of cohesive work groups were important sources of job satisfaction. The results indicate that although flight nurses basically are satisfied with their jobs and enjoy working in air medical transport, they want to know that they are performing well. They also want to be involved in decision-making processes and to be recognized for the stressful jobs they perform.
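The correlations in this survey are ordinary Pearson coefficients between questionnaire scale scores; a minimal sketch, using hypothetical scores (only the statistic itself mirrors the abstract):

```python
# Pearson correlation between two score lists, as used for the reported
# stress/depression and stress/satisfaction relationships. Data are invented.
import statistics

def pearson_r(x, y):
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

stress       = [10, 14, 18, 22, 26, 30]
depression   = [5, 8, 10, 15, 16, 20]    # rises with stress: positive r
satisfaction = [30, 27, 25, 20, 18, 15]  # falls with stress: negative r

print(pearson_r(stress, depression) > 0)    # True
print(pearson_r(stress, satisfaction) < 0)  # True
```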
Miyanishi, Tomohiro; Sumiyoshi, Tomiki; Higuchi, Yuko; Seo, Tomonori; Suzuki, Michio
2013-01-01
Introduction Patients with schizophrenia show cognitive decline from the early phase of the illness. Mismatch negativity (MMN) has been shown to be associated with cognitive function. We investigated the current source density of duration mismatch negativity (dMMN) using low-resolution brain electromagnetic tomography (LORETA), together with neuropsychological performance, in subjects with early schizophrenia. Methods Data were obtained from 20 patients meeting DSM-IV criteria for schizophrenia or schizophreniform disorder, and 20 healthy control (HC) subjects. An auditory odd-ball paradigm was used to measure dMMN. Neuropsychological performance was evaluated by the Brief Assessment of Cognition in Schizophrenia, Japanese version (BACS-J). Results Patients showed smaller dMMN amplitudes than those in the HC subjects. LORETA current density for dMMN was significantly lower in patients compared to HC subjects, especially in the temporal lobes. dMMN current density in the frontal lobe was positively correlated with working memory performance in patients. Conclusions This is the first study to identify brain regions showing smaller dMMN current density in early schizophrenia. Further, poor working memory was associated with decreased dMMN current density in patients. These results are likely to help in understanding the neural basis for the cognitive impairment of schizophrenia. PMID:23577204
Cavitating Propeller Performance in Inclined Shaft Conditions with OpenFOAM: PPTC 2015 Test Case
NASA Astrophysics Data System (ADS)
Gaggero, Stefano; Villa, Diego
2018-05-01
In this paper, we present our analysis of the non-cavitating and cavitating unsteady performances of the Potsdam Propeller Test Case (PPTC) in oblique flow. For our calculations, we used the Reynolds-averaged Navier-Stokes equation (RANSE) solver from the open-source OpenFOAM libraries. We selected the homogeneous mixture approach to solve for multiphase flow with phase change, using the volume of fluid (VoF) approach to solve the multiphase flow and modeling the mass transfer between vapor and water with the Schnerr-Sauer model. Comparing the model results with the experimental measurements collected during the Second Workshop on Cavitation and Propeller Performance - SMP'15 enabled our assessment of the reliability of the open-source calculations. Comparisons with the numerical data collected during the workshop enabled further analysis of the reliability of different flow solvers from which we produced an overview of recommended guidelines (mesh arrangements and solver setups) for accurate numerical prediction even in off-design conditions. Lastly, we propose a number of calculations using the boundary element method developed at the University of Genoa for assessing the reliability of this dated but still widely adopted approach for design and optimization in the preliminary stages of very demanding test cases.
Meisner, Allison; Kerr, Kathleen F; Thiessen-Philbrook, Heather; Coca, Steven G; Parikh, Chirag R
2016-02-01
Individual biomarkers of renal injury are only modestly predictive of acute kidney injury (AKI). Using multiple biomarkers has the potential to improve predictive capacity. In this systematic review, statistical methods of articles developing biomarker combinations to predict AKI were assessed. We identified and described three potential sources of bias (resubstitution bias, model selection bias, and bias due to center differences) that may compromise the development of biomarker combinations. Fifteen studies reported developing kidney injury biomarker combinations for the prediction of AKI after cardiac surgery (8 articles), in the intensive care unit (4 articles), or other settings (3 articles). All studies were susceptible to at least one source of bias and did not account for or acknowledge the bias. Inadequate reporting often hindered our assessment of the articles. We then evaluated, when possible (7 articles), the performance of published biomarker combinations in the TRIBE-AKI cardiac surgery cohort. Predictive performance was markedly attenuated in six out of seven cases. Thus, deficiencies in analysis and reporting are avoidable, and care should be taken to provide accurate estimates of risk prediction model performance. Hence, rigorous design, analysis, and reporting of biomarker combination studies are essential to realizing the promise of biomarkers in clinical practice.
Singh, Ramandeep; Baby, Britty; Damodaran, Natesan; Srivastav, Vinkle; Suri, Ashish; Banerjee, Subhashis; Kumar, Subodh; Kalra, Prem; Prasad, Sanjiva; Paul, Kolin; Anand, Sneh; Kumar, Sanjeev; Dhiman, Varun; Ben-Israel, David; Kapoor, Kulwant Singh
2016-02-01
Box trainers are ideal simulators, given they are inexpensive, accessible, and of appropriate fidelity. We describe the development and validation of an open-source, partial-task simulator that teaches the fundamental skills necessary for endonasal skull-base neuro-endoscopic surgery. We defined the Neuro-Endo-Trainer (NET) SkullBase-Task-GraspPickPlace with an activity area by analyzing the computed tomography scans of 15 adult patients with sellar, suprasellar, and parasellar tumors. Four groups of participants (Group E, n = 4: expert neuroendoscopists; Group N, n = 19: novice neurosurgeons; Group R, n = 11: neurosurgery residents with multiple iterations; and Group T, n = 27: neurosurgery residents with a single iteration) performed grasp, pick, and place tasks using the NET and were graded on task completion time and skills assessment scale score. Group E had lower task completion times and greater skills assessment scale scores than both Groups N and R (P ≤ 0.03, 0.001). The performance of Groups N and R was found to be equivalent; in self-assessing neuro-endoscopic skill, the participants in these groups had equally low pretraining scores (4/10) with significant improvement after NET simulation (6 and 7, respectively). Angled scopes resulted in decreased scores with tilted plates compared with straight plates (30°, P ≤ 0.04; 45°, P ≤ 0.001). With tilted plates, decreased scores were observed when the 0° endoscope was compared with the 45° endoscope (right, P ≤ 0.008; left, P ≤ 0.002). The NET, a face- and construct-valid, open-source, partial-task neuroendoscopic trainer, was thus developed. Before simulation, novice neurosurgeons and neurosurgical residents were found to have insufficient skills and preparation to practice neuro-endoscopy. Plate tilt and endoscope angle were shown to be important factors in participant performance. The NET was found to be a useful partial-task trainer for skill building in neuro-endoscopy. Copyright © 2016 Elsevier Inc. All rights reserved.
MASTODON: A geosciences simulation tool built using the open-source framework MOOSE
NASA Astrophysics Data System (ADS)
Slaughter, A.
2017-12-01
The Department of Energy (DOE) is currently investing millions of dollars annually into various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture (extended finite-element method), and porous media, among others. The tensor mechanics and contact modules, in particular, are well suited for nonlinear geosciences problems. Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON; https://seismic-research.inl.gov/SitePages/Mastodon.aspx), a MOOSE-based application, is capable of analyzing the response of 3D soil-structure systems to external hazards, with current development focused on earthquakes. It is capable of simulating seismic events and can perform extensive "source-to-site" simulations including earthquake fault rupture, nonlinear wave propagation, and nonlinear soil-structure interaction analysis. MASTODON also includes a dynamic probabilistic risk assessment capability that enables analysts not only to perform deterministic analyses, but also to easily perform probabilistic or stochastic simulations for the purpose of risk assessment. Although MASTODON has been developed for the nuclear industry, it can be used to assess the risk for any structure subjected to earthquakes. The geosciences community can learn from the nuclear industry and harness the enormous effort underway to build simulation tools that are open, modular, and share a common framework.
In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications. The geosciences community could benefit from these existing tools, enabling collaboration between researchers and practitioners throughout the world and advancing the state of the art in line with other scientific research efforts.
NASA Astrophysics Data System (ADS)
Griffin, J.; Clark, D.; Allen, T.; Ghasemi, H.; Leonard, M.
2017-12-01
Standard probabilistic seismic hazard assessment (PSHA) simulates earthquake occurrence as a time-independent process. However paleoseismic studies in slowly deforming regions such as Australia show compelling evidence that large earthquakes on individual faults cluster within active periods, followed by long periods of quiescence. Therefore the instrumental earthquake catalog, which forms the basis of PSHA earthquake recurrence calculations, may only capture the state of the system over the period of the catalog. Together this means that data informing our PSHA may not be truly time-independent. This poses challenges in developing PSHAs for typical design probabilities (such as 10% in 50 years probability of exceedance): Is the present state observed through the instrumental catalog useful for estimating the next 50 years of earthquake hazard? Can paleo-earthquake data, that shows variations in earthquake frequency over time-scales of 10,000s of years or more, be robustly included in such PSHA models? Can a single PSHA logic tree be useful over a range of different probabilities of exceedance? In developing an updated PSHA for Australia, decadal-scale data based on instrumental earthquake catalogs (i.e. alternative area based source models and smoothed seismicity models) is integrated with paleo-earthquake data through inclusion of a fault source model. Use of time-dependent non-homogeneous Poisson models allows earthquake clustering to be modeled on fault sources with sufficient paleo-earthquake data. This study assesses the performance of alternative models by extracting decade-long segments of the instrumental catalog, developing earthquake probability models based on the remaining catalog, and testing performance against the extracted component of the catalog. Although this provides insights into model performance over the short-term, for longer timescales it is recognised that model choice is subject to considerable epistemic uncertainty. 
Therefore a formal expert elicitation process has been used to assign weights to alternative models for the 2018 update to Australia's national PSHA.
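The design-probability language used above maps directly onto the time-independent Poisson assumption of standard PSHA; a minimal sketch of that relationship (the familiar 475-year return period implied by "10% in 50 years"):

```python
# Under a homogeneous Poisson model, the probability of at least one
# exceedance in t years at annual rate lam is 1 - exp(-lam * t). Inverting
# the standard "10% in 50 years" design probability gives a return period
# of about 475 years.
import math

def exceedance_prob(annual_rate, years):
    return 1.0 - math.exp(-annual_rate * years)

rate_10_in_50 = -math.log(1.0 - 0.10) / 50.0  # solve 1 - exp(-lam*50) = 0.10
print(round(1.0 / rate_10_in_50))  # 475
```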
Kocbek, Simon; Cavedon, Lawrence; Martinez, David; Bain, Christopher; Manus, Chris Mac; Haffari, Gholamreza; Zukerman, Ingrid; Verspoor, Karin
2016-12-01
Text and data mining play an important role in obtaining insights from Health and Hospital Information Systems. This paper presents a text mining system for detecting admissions marked as positive for several diseases: Lung Cancer, Breast Cancer, Colon Cancer, Secondary Malignant Neoplasm of Respiratory and Digestive Organs, Multiple Myeloma and Malignant Plasma Cell Neoplasms, Pneumonia, and Pulmonary Embolism. We specifically examine the effect of linking multiple data sources on text classification performance. Support Vector Machine classifiers are built for eight data source combinations, and evaluated using the metrics of Precision, Recall and F-Score. Sub-sampling techniques are used to address unbalanced datasets of medical records. We use radiology reports as an initial data source and add other sources, such as pathology reports and patient and hospital admission data, in order to assess the value of multiple data sources. Statistical significance is measured using the Wilcoxon signed-rank test. A second set of experiments explores aspects of the system in greater depth, focusing on Lung Cancer. We explore the impact of feature selection; analyse the learning curve; examine the effect of restricting admissions to only those containing reports from all data sources; and examine the impact of reducing the sub-sampling. These experiments provide better understanding of how to best apply text classification in the context of imbalanced data of variable completeness. Patient and hospital admission data contribute valuable information for detecting most of the diseases, significantly improving performance when added to radiology reports alone or to the combination of radiology and pathology reports. Overall, linking data sources significantly improved classification performance for all the diseases examined.
However, there is no single approach that suits all scenarios; the choice of the most effective combination of data sources depends on the specific disease to be classified. Copyright © 2016 Elsevier Inc. All rights reserved.
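The Precision, Recall, and F-Score metrics used to compare the eight data-source combinations can be sketched in a few lines; the confusion counts below are hypothetical, and only the formulas mirror the abstract:

```python
# Standard classification metrics from true positive (tp), false positive
# (fp), and false negative (fn) counts, as used to evaluate each data-source
# combination. Counts are invented for illustration.
def precision_recall_f1(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# e.g. a hypothetical classifier built on radiology reports plus admission data
p, r, f1 = precision_recall_f1(tp=40, fp=10, fn=20)
print(round(p, 2), round(r, 2), round(f1, 2))  # 0.8 0.67 0.73
```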
Taïbi, K; del Campo, A D; Aguado, A; Mulet, J M
2016-04-15
Forest restoration constitutes an important issue within adaptive environmental management for climate change at global scale. However, effective implementation of these programs can only be achieved by revising current seed transfer guidelines, as they lack the inherent spatial and temporal dynamics associated with climate change. In this sense, provenance trials may provide key information on the relative performance of different populations and/or genotypes under changing ecological conditions. This study addresses a methodological approach to evaluate early plantation performance and the consequent phenotypic plasticity and pattern of adaptation of different seed sources in contrasting environments. To this end, six seed sources of Salzmann pine were tested at three contrasting trial sites, testing a hypothetical assisted population migration. Adaptation at each site was assessed through Joint Regression and Additive Main effects and Multiplicative Interaction (AMMI) models. Most of the observed variation was attributed to the environment (above 90% for all traits); even so, genotype and genotype-by-environment interaction (GxE) effects were significant. Seedlings out-planted under better site conditions did not differ in survival but did in height growth. However, on sites with higher constraints, survival differed among seed sources and diameter growth was high. The adaptation analyses (AMMI) indicated that the cold-continental seed source 'Soria' performed as a generalist seed source, whereas 'Cordilleras Béticas', the southernmost seed source, was more adapted to harsh environments (frost and drought) in terms of survival. The results partially supported the hypothesis that assisted migration of seed sources makes sense within limited transfer distances, and this was reinforced by the GxE results.
The present study could be valuable to address the adaptive transfer of seedlings in ecological restoration and to determine suitable seed sources for reforestation programs and assisted population migration under climate change. The reported results are based on 3 years' data and need to be considered in this context. Copyright © 2016 Elsevier Ltd. All rights reserved.
Miles, Melody; Ryman, Tove K; Dietz, Vance; Zell, Elizabeth; Luman, Elizabeth T
2013-03-15
Immunization programs frequently rely on household vaccination cards, parental recall, or both to calculate vaccination coverage. This information is used at both the global and national level for planning and allocating performance-based funds. However, the validity of household-derived coverage sources has not yet been widely assessed or discussed. To advance knowledge on the validity of different sources of immunization coverage, we undertook a global review of literature. We assessed concordance, sensitivity, specificity, positive and negative predictive value, and coverage percentage point difference when subtracting household vaccination source from a medical provider source. Median coverage difference per paper ranged from -61 to +1 percentage points between card versus provider sources and -58 to +45 percentage points between recall versus provider source. When card and recall sources were combined, median coverage difference ranged from -40 to +56 percentage points. Overall, concordance, sensitivity, specificity, positive and negative predictive value showed poor agreement, providing evidence that household vaccination information may not be reliable, and should be interpreted with care. While only 5 papers (11%) included in this review were from low-middle income countries, low-middle income countries often rely more heavily on household vaccination information for decision making. Recommended actions include strengthening quality of child-level data and increasing investments to improve vaccination card availability and card marking. There is also an urgent need for additional validation studies of vaccine coverage in low and middle income countries. Copyright © 2013. Published by Elsevier Ltd.
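The agreement statistics this review compares (sensitivity, specificity, concordance, and the coverage percentage-point difference) can be sketched against a 2x2 table that treats the medical provider record as the reference; all counts below are hypothetical:

```python
# Agreement between a household vaccination source (card/recall) and a
# provider record, per 100 children. Counts are invented for illustration.
def agreement(both_pos, card_only, provider_only, both_neg):
    total = both_pos + card_only + provider_only + both_neg
    sensitivity = both_pos / (both_pos + provider_only)  # card finds provider-recorded doses
    specificity = both_neg / (both_neg + card_only)      # card agrees on unvaccinated
    concordance = (both_pos + both_neg) / total          # raw percent agreement
    return sensitivity, specificity, concordance

sens, spec, conc = agreement(both_pos=70, card_only=5, provider_only=20, both_neg=5)

# Coverage difference in percentage points (household source minus provider):
card_coverage = 70 + 5
provider_coverage = 70 + 20
print(card_coverage - provider_coverage)  # -15
```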
Mehra, S; Morrison, P D; Coates, F; Lawrie, A C
2017-02-01
Terrestrial orchids depend on orchid mycorrhizal fungi (OMF) as symbionts for their survival, growth and nutrition. The ability of OMF from endangered orchid species to compete for available resources with OMF from common species may affect the distribution, abundance and therefore conservation status of their orchid hosts. Eight symbiotically effective OMF from endangered and more common Caladenia species were grown with different carbon sources in liquid medium to test their ability to utilise the complex insoluble and simple soluble carbon sources produced during litter degradation, and to measure how OMF vary with host conservation status or taxonomy. On simple carbon sources, fungal growth was assessed by biomass. On insoluble substrates, ergosterol content was assessed using ultra-performance liquid chromatography (UPLC). The OMF grew on all natural materials and complex carbon sources, but produced the greatest biomass on xylan and starch and the least on bark and chitin. On simple carbon sources, the greatest OMF biomass was measured on most hexoses and disaccharides and the least on galactose and arabinose. Only some OMF used sucrose, the most common sugar in green plants, with possible implications for symbiosis. OMF from common orchids produced more ergosterol and biomass than those from endangered orchids in the Dilatata and Reticulata groups, but not in the Patersonii and Finger orchid groups. This suggests that differences in carbon source utilisation may contribute to differences in the distribution of some orchids, if these differences are retained on site.
The Use of Sensory Analysis Techniques to Assess the Quality of Indoor Air.
Lewkowska, Paulina; Dymerski, Tomasz; Gębicki, Jacek; Namieśnik, Jacek
2017-01-02
The quality of indoor air is one of the significant elements that influence people's well-being and health inside buildings. Emissions of pollutants, which may cause odor nuisance, are the main reason for people's complaints regarding the quality of indoor air. As a result, it is necessary to perform tests aimed at identifying the sources of odors inside buildings. The article contains basic information on the characteristics of the sources of indoor air pollution and the influence of the odor detection threshold on people's health and comfort. An attempt was also made to classify sensory analysis techniques and apply them to testing the quality of indoor air, which would enable identification of sensory experiences and indication of their intensity.
Characteristics of urban transportation systems. A handbook for transportation planners
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1975-05-01
The objective of the handbook, specifically for use by transportation planners in the evaluation of alternative systems, is to provide a single simplified reference source which characterizes the most important performance characteristics of the following contemporary urban transportation systems: (1) rail (commuter, rapid, and light); (2) local bus and bus rapid transit; (3) automobile-highway system (automobiles and other vehicles); (4) pedestrian assistance systems; and (5) activity center systems--people mover systems that have been installed at airports, zoos, amusement parks, etc. The handbook assesses the supply or performance aspect of urban transportation dealing with passenger demand implicitly. Seven supply parameters studied are: speed, capacity (service volume), operating cost (vehicle), energy consumption (vehicle or source), pollution, capital cost, and accident frequency.
Probabilistic seismic hazard analysis for a nuclear power plant site in southeast Brazil
NASA Astrophysics Data System (ADS)
de Almeida, Andréia Abreu Diniz; Assumpção, Marcelo; Bommer, Julian J.; Drouet, Stéphane; Riccomini, Claudio; Prates, Carlos L. M.
2018-05-01
A site-specific probabilistic seismic hazard analysis (PSHA) has been performed for the only nuclear power plant site in Brazil, located 130 km southwest of Rio de Janeiro at Angra dos Reis. Logic trees were developed for both the seismic source characterisation and ground-motion characterisation models, in both cases seeking to capture the appreciable ranges of epistemic uncertainty with relatively few branches. This logic-tree structure allowed the hazard calculations to be performed efficiently while obtaining results that reflect the inevitable uncertainty in long-term seismic hazard assessment in this tectonically stable region. An innovative feature of the study is an additional seismic source zone added to capture the potential contributions of characteristic earthquakes associated with geological faults in the region surrounding the coastal site.
Evaluation of Long-term Performance of Enhanced Anaerobic Source Zone Bioremediation using mass flux
NASA Astrophysics Data System (ADS)
Haluska, A.; Cho, J.; Hatzinger, P.; Annable, M. D.
2017-12-01
Chlorinated ethene DNAPL source zones in groundwater act as potential long-term sources of contamination: as they dissolve they yield concentrations well above MCLs, posing an ongoing public health risk. Enhanced bioremediation has been applied to treat many source zones with significant promise, but the long-term sustainability of this technology has not been thoroughly assessed. This study evaluated the long-term effectiveness of enhanced anaerobic source zone bioremediation at chloroethene-contaminated sites to determine if the treatment prevented contaminant rebound and removed NAPL from the source zone. Long-term performance was evaluated based on achieving MCL-based contaminant mass fluxes in parent compound concentrations during different monitoring periods. Groundwater concentration versus time data were compiled for six sites, and post-remedial contaminant mass flux was then measured using passive flux meters at wells both within and down-gradient of the source zone. Post-remedial mass flux data were then combined with pre-remedial water quality data to estimate pre-remedial mass flux. This information was used to characterize a DNAPL dissolution source strength function, such as the Power Law model and the Equilibrium Streamtube model. The six sites characterized for this study were (1) Former Charleston Air Force Base, Charleston, SC; (2) Dover Air Force Base, Dover, DE; (3) Treasure Island Naval Station, San Francisco, CA; (4) Former Raritan Arsenal, Edison, NJ; (5) Naval Air Station, Jacksonville, FL; and (6) Former Naval Air Station, Alameda, CA. Contaminant mass fluxes decreased for all the sites by the end of the post-treatment monitoring period and rebound was limited within the source zone. Post-remedial source strength function estimates suggest that decreases in contaminant mass flux will continue to occur at these sites, but a mass flux target based on MCL levels may never be met.
Thus, site clean-up goals should be evaluated as order-of-magnitude reductions. Additionally, sites may require monitoring for a minimum of 5 years in order to sufficiently evaluate remedial performance. The study shows that enhanced anaerobic source zone bioremediation contributed to a modest reduction of source zone contaminant mass discharge and appears to have mitigated rebound of chlorinated ethenes.
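The Power Law source strength function named above relates flux-averaged concentration to remaining source mass; a minimal sketch with illustrative parameter values (not taken from the study):

```python
# Power Law DNAPL source depletion: C/C0 = (M/M0)**gamma, where M/M0 is the
# fraction of the initial source mass remaining. Values are illustrative only.
def source_concentration(mass_fraction_remaining, c0, gamma):
    """Flux-averaged concentration for a given remaining mass fraction."""
    return c0 * mass_fraction_remaining ** gamma

c0 = 10.0    # mg/L, pre-remedial concentration (hypothetical)
gamma = 1.0  # gamma = 1: concentration falls linearly with source mass

# Removing 90% of source mass with gamma = 1 gives a tenfold concentration
# reduction, illustrating why order-of-magnitude flux reductions can occur
# while concentrations still sit above an MCL-based target.
print(source_concentration(0.1, c0, gamma))  # 1.0
```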
Impact of design-parameters on the optical performance of a high-power adaptive mirror
NASA Astrophysics Data System (ADS)
Koek, Wouter D.; Nijkerk, David; Smeltink, Jeroen A.; van den Dool, Teun C.; van Zwet, Erwin J.; van Baars, Gregor E.
2017-02-01
TNO is developing a High Power Adaptive Mirror (HPAM) to be used in the CO2 laser beam path of an Extreme Ultraviolet (EUV) light source for next-generation lithography. In this paper we report on a developed methodology, and the necessary simulation tools, to assess the performance and associated sensitivities of this deformable mirror. Our analyses show that, given the current limited insight concerning the process window of EUV generation, the HPAM module should have an actuator pitch of <= 4 mm. Furthermore, we have modelled the sensitivity of performance with respect to dimpling and actuator noise. For example, for a deformable mirror with an actuator pitch of 4 mm, if the associated performance impact is to be limited to less than 5%, the actuator noise should be smaller than 45 nm (rms). Our tools assist in the detailed design process by assessing the performance impact of various design choices, including, for example, those that affect the shape and spectral content of the influence function.
On the assessment of spatial resolution of PET systems with iterative image reconstruction
NASA Astrophysics Data System (ADS)
Gong, Kuang; Cherry, Simon R.; Qi, Jinyi
2016-03-01
Spatial resolution is an important metric for performance characterization in PET systems. Measuring spatial resolution is straightforward with a linear reconstruction algorithm, such as filtered backprojection, and can be performed by reconstructing a point source scan and calculating the full-width-at-half-maximum (FWHM) along the principal directions. With the widespread adoption of iterative reconstruction methods, it is desirable to quantify the spatial resolution using an iterative reconstruction algorithm. However, the task can be difficult because the reconstruction algorithms are nonlinear and the non-negativity constraint can artificially enhance the apparent spatial resolution if a point source image is reconstructed without any background. Thus, it was recommended that a background should be added to the point source data before reconstruction for resolution measurement. However, there has been no detailed study on the effect of the point source contrast on the measured spatial resolution. Here we use point source scans from a preclinical PET scanner to investigate the relationship between measured spatial resolution and the point source contrast. We also evaluate whether the reconstruction of an isolated point source is predictive of the ability of the system to resolve two adjacent point sources. Our results indicate that when the point source contrast is below a certain threshold, the measured FWHM remains stable. Once the contrast is above the threshold, the measured FWHM monotonically decreases with increasing point source contrast. In addition, the measured FWHM also monotonically decreases with iteration number for maximum likelihood estimate. Therefore, when measuring system resolution with an iterative reconstruction algorithm, we recommend using a low-contrast point source and a fixed number of iterations.
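The FWHM measurement described for linear reconstructions can be sketched directly; a minimal example that locates the half-maximum crossings of a synthetic Gaussian point-source profile by linear interpolation (for a Gaussian, FWHM = 2*sqrt(2*ln 2)*sigma, about 2.355*sigma):

```python
# Estimate FWHM of a 1D profile by interpolating the half-maximum crossings.
# The Gaussian test profile is synthetic, standing in for a reconstructed
# point-source slice along one principal direction.
import math

def fwhm(xs, ys):
    half = max(ys) / 2.0
    crossings = []
    for i in range(len(ys) - 1):
        if (ys[i] - half) * (ys[i + 1] - half) < 0:
            # linear interpolation of the half-max crossing position
            t = (half - ys[i]) / (ys[i + 1] - ys[i])
            crossings.append(xs[i] + t * (xs[i + 1] - xs[i]))
    return crossings[-1] - crossings[0]

sigma = 1.0
xs = [i * 0.01 - 5.0 for i in range(1001)]
ys = [math.exp(-x * x / (2 * sigma * sigma)) for x in xs]
print(round(fwhm(xs, ys), 2))  # 2.35
```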
Assessment of variable drinking water sources used in Egypt on broiler health and welfare.
ELSaidy, N; Mohamed, R A; Abouelenien, F
2015-07-01
This study assessed the impact of four water sources used as drinking water for broiler chickens in Egypt on their performance, carcass characteristics, and hematological and immunological responses. A total of 204 unsexed 1-day-old Indian River broiler chickens were randomly allocated into four treatment groups of 51 birds each, with three replicates of 17 birds per replicate. Groups were classified according to the water source they received: (T1) farm tap water; (T2) filtered tap water; (T3) farm water stored in rooftop tanks; (T4) underground (well) water. The water sources showed no significant differences among treated groups (p>0.05) for most of the performance parameters and carcass characteristics. However, the T2 group showed higher records for body weight (BWT), BWT gain (BWG), feed conversion ratio, bursa weight, serum total protein, globulin (G), albumin (A), A/G ratio, and antibody titer against Newcastle disease virus vaccine. On the other hand, it showed lower records for water intake (WI), WI/feed intake ratio, total leukocyte count, heterophil %, lymphocyte %, H/L ratio, liver weight, glutamic oxaloacetic transaminase, glutamic pyruvic transaminase, serum uric acid, and creatinine. Filtered (reverse osmosis) water showed the lowest records for bacterial load, total dissolved solids (TDS), electrical conductivity (EC), and salinity, and an absence of coliform bacteria, whereas stored water showed higher values for TDS, EC, alkalinity, salinity, pH, bacterial count, and coliform count. Based on these results, it is concluded that different water sources can safely be used as drinking water for poultry, provided they fall within the acceptable range of drinking water quality for chickens. The findings suggest that treating water sources improves chickens' health and welfare, and they draw attention to the importance of maintaining the hygienic quality of stored water.
Monitoring alert and drowsy states by modeling EEG source nonstationarity
NASA Astrophysics Data System (ADS)
Hsu, Sheng-Hsiou; Jung, Tzyy-Ping
2017-10-01
Objective. As a human brain performs various cognitive functions within ever-changing environments, states of the brain characterized by recorded brain activities such as electroencephalogram (EEG) are inevitably nonstationary. The challenges of analyzing the nonstationary EEG signals include finding neurocognitive sources that underlie different brain states and using EEG data to quantitatively assess the state changes. Approach. This study hypothesizes that brain activities under different states, e.g. levels of alertness, can be modeled as distinct compositions of statistically independent sources using independent component analysis (ICA). This study presents a framework to quantitatively assess the EEG source nonstationarity and estimate levels of alertness. The framework was tested against EEG data collected from 10 subjects performing a sustained-attention task in a driving simulator. Main results. Empirical results illustrate that EEG signals under alert versus drowsy states, indexed by reaction speeds to driving challenges, can be characterized by distinct ICA models. By quantifying the goodness-of-fit of each ICA model to the EEG data using the model deviation index (MDI), we found that MDIs were significantly correlated with the reaction speeds (r = -0.390 with alertness models and r = 0.449 with drowsiness models) and the opposite correlations indicated that the two models accounted for sources in the alert and drowsy states, respectively. Based on the observed source nonstationarity, this study also proposes an online framework using a subject-specific ICA model trained with an initial (alert) state to track the level of alertness. For classification of alert against drowsy states, the proposed online framework achieved an averaged area-under-curve of 0.745 and compared favorably with a classic power-based approach. Significance. 
This ICA-based framework provides a new way to study changes of brain states and can be applied to monitoring cognitive or mental states of human operators in attention-critical settings or in passive brain-computer interfaces.
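The model deviation index quantifies how poorly a fixed ICA model explains a new EEG segment. A rough sketch of that idea (not the paper's exact MDI definition) scores a segment by its negative average log-likelihood under a pre-trained unmixing matrix; the logistic source prior and all names here are assumptions for illustration:

```python
import numpy as np

def model_deviation(W, X):
    """Negative average log-likelihood of data X (channels x samples)
    under a pre-trained ICA unmixing matrix W, assuming a
    super-Gaussian (logistic) source prior. Higher values mean the
    model fits the current segment worse."""
    S = W @ X                                    # estimated sources
    # Logistic log-density up to a constant: -2 * log cosh(s / 2)
    log_prior = -2.0 * np.log(np.cosh(S / 2.0)).sum(axis=0)
    _, logdet = np.linalg.slogdet(W)
    ll = log_prior + logdet                      # per-sample log-likelihood
    return -ll.mean()
```

In the framework above, one such score per ICA model (alert, drowsy) would be computed on sliding EEG windows and correlated with behavioral reaction speed.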
Integrated source and channel encoded digital communication system design study
NASA Technical Reports Server (NTRS)
Alem, W. K.; Huth, G. K.; Simon, M. K.
1978-01-01
The particular Ku-band carrier, PN despreading, and symbol synchronization strategies, which were selected for implementation in the Ku-band transponder aboard the orbiter, were assessed and evaluated from a systems performance viewpoint, verifying that system specifications were met. A study was performed of the design and implementation of tracking techniques which are suitable for incorporation into the Orbiter Ku-band communication system. Emphasis was placed on maximizing tracking accuracy and communication system flexibility while minimizing cost, weight, and system complexity of Orbiter and ground systems hardware. The payload communication study assessed the design and performance of the forward link and return link bent-pipe relay modes for attached and detached payloads. As part of this study, a design for a forward link bent-pipe was proposed which employs a residual carrier but which is tracked by the existing Costas loop.
Risk Assessment Approach for the Hanford Site River Corridor Closure Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomson, J.E.; Weiss, S.G.; Sands, J.P.
2007-07-01
The river corridor portion of the U.S. Department of Energy's (DOE) Hanford Site includes the 100 Area and 300 Area, which border the Columbia River and cover 565 km² (218 mi²). The River Corridor Closure (RCC) Project scope of work includes 486 contaminated facilities, 4 of 9 deactivated plutonium production reactors, and 370 waste disposal sites. DOE's cleanup actions in the river corridor were initiated in 1994 under the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (42 U.S.C. 9601, et seq.) (CERCLA) and included source and groundwater operable units (OUs). DOE's RCC Project, awarded to Washington Closure Hanford (WCH) in 2005, focuses on source OUs and has allowed cleanup actions to continue in the 100 and 300 Areas with completion by 2013. The regulatory authorization for cleanup actions at source OUs in the river corridor consists primarily of interim action records of decision (RODs), which were supported by qualitative risk assessments and limited field investigations. A key to establishing final cleanup decisions and proceeding toward final CERCLA closeout is completion of quantitative baseline risk assessment activities. Baseline risk assessment is necessary to determine whether cleanup actions are protective of human health and the environment and to identify any course corrections needed to ensure that current and future cleanup actions are protective. Because cleanup actions are ongoing under interim action RODs, it is desirable to establish the final cleanup decision bases as early as possible to minimize the impacts of any identified course corrections to the cleanup approach. Risk assessment is being performed by WCH as the River Corridor Baseline Risk Assessment (RCBRA).
The RCBRA uses a multi-step process that summarizes existing data; uses the data quality objectives process to identify both data gaps and unresolved issues through public workshops; and solicits input from regulators, trustees, and stakeholders. Sampling and analysis plans are then developed to document quality requirements and identify field sample collection approaches. After required data are collected, the risks to human health and the environment are assessed. Sampling of upland, riparian, and near-shore environments for the 100/300 Area Component was performed in 2005 and 2006. The 100/300 Area Component includes former operational/reactor areas. The results of these efforts will be incorporated into a mid-2007 draft risk assessment report for the 100/300 Area Component of the RCBRA. Adapting methodology developed from the 100/300 Area Component, the Inter-Areas risk assessment will be conducted for the riparian and near-shore environments. The Inter-Areas Component includes shoreline areas between former operational areas addressed in the 100/300 Area Component. The Inter-Areas risk assessment will supplement results from the 100/300 Area Component to provide a more complete analysis of residual risks in the river corridor. Plans for the final element of the RCBRA, the Columbia River Component, are being developed by DOE; this component is currently not part of the RCC Project. The Columbia River Component includes the reach of the Columbia River located adjacent to the Hanford Site and reaches downstream to an undetermined boundary. Recommendations for final cleanup decisions at source units within the river corridor, based in part on the risk assessment results, will be presented for future public review in a River Corridor Source Unit Proposed Plan. To form an integrated cleanup approach for the river corridor, the RCBRA results for the source units require integration with risk assessment results from groundwater cleanup actions managed by other contractors.
WCH's risk assessment task includes development of an integration strategy for activities leading up to the final regulatory decisions for all OUs in the river corridor.
Methods for nuclear air-cleaning-system accident-consequence assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrae, R.W.; Bolstad, J.W.; Gregory, W.S.
1982-01-01
This paper describes a multilaboratory research program that is directed toward addressing many questions that analysts face when performing air cleaning accident consequence assessments. The program involves developing analytical tools and supportive experimental data that will be useful in making more realistic assessments of accident source terms within and up to the atmospheric boundaries of nuclear fuel cycle facilities. The types of accidents considered in this study include fires, explosions, spills, tornadoes, criticalities, and equipment failures. The main focus of the program is developing an accident analysis handbook (AAH). We will describe the contents of the AAH, which include descriptions of selected nuclear fuel cycle facilities, process unit operations, source-term development, and accident consequence analyses. Three computer codes designed to predict gas and material propagation through facility air cleaning systems are described. These computer codes address accidents involving fires (FIRAC), explosions (EXPAC), and tornadoes (TORAC). The handbook relies on many illustrative examples to show the analyst how to approach accident consequence assessments. We will use the FIRAC code and a hypothetical fire scenario to illustrate the accident analysis capability.
NASA Technical Reports Server (NTRS)
2008-01-01
The Aquarius Radiometer, a subsystem of the Aquarius Instrument, required a data acquisition ground system to support calibration and radiometer performance assessment. To support calibration and compose performance assessments, we developed an automated system that uploaded raw data to an FTP server and saved raw and processed data to a database. This paper details the overall functionality of the Aquarius Instrument Science Data System (ISDS) and the individual electrical ground support equipment (EGSE) units that produced data files infused into the ISDS. Real-time EGSEs include an ICDS Simulator, a Calibration GSE, a LabVIEW-controlled power supply, and a chamber data acquisition system. The ICDS Simulator serves as the test conductor's primary workstation, collecting radiometer housekeeping (HK) and science data and passing commands and HK telemetry collection requests to the radiometer. The Calibration GSE (Radiometer Active Test Source) provides a choice of multiple source targets for external calibration of the radiometer. The Power Supply GSE, controlled by LabVIEW, provides real-time voltage and current monitoring of the radiometer. Finally, the chamber data acquisition system produces data reflecting chamber vacuum pressure, thermistor temperatures, AVG and watts. Each GSE system produces text-based data files every two to six minutes and automatically copies the data files to the Central Archiver PC. The Archiver PC stores the data files, schedules automated uploads of these files to an external FTP server, and accepts requests to copy all data files to the ISDS for offline data processing and analysis. The Aquarius Radiometer ISDS contains PHP and MATLAB programs to parse, process, and save all data to a MySQL database.
Analysis tools (MATLAB programs) in the ISDS are capable of displaying radiometer science, telemetry, and auxiliary data in near real time, as well as performing data analysis and producing automated performance assessment reports of the Aquarius Radiometer.
Röttger, Julia; Blümel, Miriam; Engel, Susanne; Grenz-Farenholtz, Brigitte; Fuchs, Sabine; Linder, Roland; Verheyen, Frank; Busse, Reinhard
2015-01-01
Background: The responsiveness of a health system is considered to be an intrinsic goal of health systems and an essential aspect in performance assessment. Numerous studies have analysed health system responsiveness and related concepts, especially across different countries and health systems. However, fewer studies have applied the concept for the evaluation of specific healthcare delivery structures and thoroughly analysed its determinants within one country. The aims of this study are to assess the level of perceived health system responsiveness to patients with chronic diseases in ambulatory care in Germany and to analyse the determinants of health system responsiveness as well as its distribution across different population groups. Methods and Analysis: The target population consists of chronically ill people in Germany, with a focus on patients suffering from type 2 diabetes and/or from coronary heart disease (CHD). Data comes from two different sources: (i) cross-sectional survey data from a postal survey and (ii) claims data from a German sickness fund. Data from both sources will be linked at the individual level. The postal survey has the purpose of measuring perceived health system responsiveness, health related quality of life, experiences with disease management programmes (DMPs) and (subjective) socioeconomic background. The claims data consists of information on (co)morbidities, service utilization, enrolment within a DMP and sociodemographic characteristics, including the type of residential area. Discussion: RAC is one of the first projects linking survey data on health system responsiveness at the individual level with claims data. With this unique database, it will be possible to comprehensively analyse determinants of health system responsiveness and its relation to other aspects of health system performance assessment.
The results of the project will allow German health system decision-makers to assess the performance of nonclinical aspects of healthcare delivery and their determinants in two important areas of health policy: in ambulatory and chronic disease care. PMID:26188807
The sound field of a rotating dipole in a plug flow.
Wang, Zhao-Huan; Belyaev, Ivan V; Zhang, Xiao-Zheng; Bi, Chuan-Xing; Faranosov, Georgy A; Dowell, Earl H
2018-04-01
An analytical far field solution for a rotating point dipole source in a plug flow is derived. The shear layer of the jet is modelled as an infinitely thin cylindrical vortex sheet and the far field integral is calculated by the stationary phase method. Four numerical tests are performed to validate the derived solution as well as to assess the effects of sound refraction from the shear layer. First, the calculated results using the derived formulations are compared with the known solution for a rotating dipole in a uniform flow to validate the present model in this fundamental test case. After that, the effects of sound refraction for different rotating dipole sources in the plug flow are assessed. Then the refraction effects on different frequency components of the signal at the observer position, as well as the effects of the motion of the source and of the type of source are considered. Finally, the effect of different sound speeds and densities outside and inside the plug flow is investigated. The solution obtained may be of particular interest for propeller and rotor noise measurements in open jet anechoic wind tunnels.
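The far-field evaluation mentioned above follows the standard one-dimensional stationary-phase result: for a rapidly oscillating integral with amplitude F, phase Φ, and a single stationary point k₀ where Φ'(k₀) = 0 (generic symbols, not the paper's notation),

```latex
\int_{-\infty}^{\infty} F(k)\, e^{\mathrm{i}\lambda\Phi(k)}\,\mathrm{d}k
\;\approx\;
F(k_0)\,\sqrt{\frac{2\pi}{\lambda\,\lvert \Phi''(k_0)\rvert}}\;
e^{\,\mathrm{i}\lambda\Phi(k_0) \,+\, \mathrm{i}\frac{\pi}{4}\operatorname{sgn}\Phi''(k_0)},
\qquad \lambda \to \infty .
```

In this context λ plays the role of the large observer distance, so only the stationary point of the phase contributes to the leading-order far-field pressure.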
Han, So-Ri; Lee, Byoung-Seok; Jung, Kyung-Jin; Yu, Hee-Jin; Yun, Eun-Young; Hwang, Jae Sam; Moon, Kyoung-Sik
2016-06-01
Worldwide demand for novel food sources has grown, and edible insects are a promising food source for humans. Tenebrio molitor, also known as the yellow mealworm, has the advantages of being rich in protein and easy to raise as a novel food source. The objective of this study was to evaluate the subchronic toxicity, including potential hypersensitivity, of freeze-dried powdered T. molitor larvae (fdTML) in male and female Sprague-Dawley rats. The fdTML was administered orally once daily at dose levels of 0, 300, 1000 and 3000 mg/kg/day for 90 days. A toxicological assessment was performed, which included mortality, clinical signs, body and organ weights, food consumption, ophthalmology, urinalysis, hematology, serum chemistry, gross findings, histopathologic examination and allergic reaction. There were no fdTML-related findings in clinical signs, urinalysis, hematology, serum chemistry, gross examination, histopathologic examination or allergic reaction. In conclusion, the No Observed Adverse Effect Level (NOAEL) for fdTML was determined to be in excess of 3000 mg/kg/day in both sexes of rats under the experimental conditions of this study. Copyright © 2016 Elsevier Inc. All rights reserved.
Sato, Maria Ines Z; Galvani, Ana Tereza; Padula, Jose Antonio; Nardocci, Adelaide Cassia; Lauretto, Marcelo de Souza; Razzolini, Maria Tereza Pepe; Hachich, Elayse Maria
2013-01-01
A survey of Giardia and Cryptosporidium was conducted in surface water used as drinking water sources by public water systems in four densely urbanized regions of Sao Paulo State, Brazil. A Quantitative Microbial Risk Assessment, based on protozoan concentrations, was performed to estimate the probability of protozoan infection associated with drinking water ingestion. A total of 206 source water samples were analyzed over a 24-month period using USEPA Method 1623. The risk of infection was estimated using an exponential dose-response model, children's and adults' exposure, and a gamma distribution for (oo)cyst concentrations with three scenarios for treating censored data. Giardia was detected in 102 of the samples, and 19 of them were also positive for Cryptosporidium, with maximum concentrations of 97.0 cysts/L and 6.0 oocysts/L, respectively. Risk distributions were similar for the three scenarios. In the four regions, the estimated risk of Giardia infection per year, for adults and children, ranged from 0.29% to 2.47% and from 0.08% to 0.70%, respectively. Cryptosporidium infection risk varied from 0.15% to 0.29% for adults and from 0.04% to 0.08% for children. In both cases, the calculated risk surpassed the infection risk of 10^-4 (1:10,000) defined as tolerable by USEPA for a yearly exposure. The probability of Giardia infection was very close to the rates of acute diarrheic disease for adults (1% to 3%) but lower for children (2% to 7%). The daily consumption of drinking water was an important contributing factor for these differences. The Microbiological Risk Assessment carried out in this study provides an indication of infection risks by Giardia and Cryptosporidium in the population served by these source waters. Strategies for source water protection and performance targets for water treatment should be established to achieve the required level of public health risk. Copyright © 2012 Elsevier B.V. All rights reserved.
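The exponential dose-response calculation used in such QMRA studies can be sketched as follows; the parameter values are illustrative assumptions (r ≈ 0.0199 is a value commonly quoted for Giardia), not the study's fitted inputs:

```python
import math

def annual_infection_risk(cysts_per_litre, r, daily_intake_l, days=365):
    """Exponential dose-response model used in QMRA:
    daily risk P = 1 - exp(-r * dose), compounded over `days`
    independent daily exposures."""
    dose = cysts_per_litre * daily_intake_l      # organisms ingested per day
    p_daily = 1.0 - math.exp(-r * dose)
    return 1.0 - (1.0 - p_daily) ** days

# Illustrative Giardia scenario (assumed concentration and intake):
risk = annual_infection_risk(cysts_per_litre=0.01, r=0.0199, daily_intake_l=2.0)
```

Even low mean concentrations compound over a year of daily exposure, which is why the annual risks reported above exceed the 10^-4 benchmark despite small per-day probabilities.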
Mazzucchelli, Gabriel; Holzhauser, Thomas; Cirkovic Velickovic, Tanja; Diaz-Perales, Araceli; Molina, Elena; Roncada, Paola; Rodrigues, Pedro; Verhoeckx, Kitty; Hoffmann-Sommergruber, Karin
2018-01-01
Food allergies are recognized as a global health concern. In order to protect allergic consumers from severe symptoms, allergenic risk assessment is in place for well-known foods and for foods containing genetically modified ingredients. However, the population is steadily growing and there is a rising need to provide adequate protein-based foods, including novel sources not yet used for human consumption. In this context, safety issues such as a potentially increased allergenic risk need to be assessed before marketing novel food sources. Therefore, the established allergenic risk assessment for genetically modified organisms needs to be re-evaluated for its applicability to risk assessment of novel food proteins. Two different scenarios of allergic sensitization have to be assessed. The first scenario is the presence of already known allergenic structures in novel foods. For this, a comparative assessment can be performed and the range of cross-reactivity can be explored, while in the second scenario allergic reactions are observed toward hitherto unknown allergenic structures and no reference material is available. This review summarizes the current analytical methods for allergenic risk assessment, highlighting the strengths and limitations of each method and discussing the gaps in this assessment that need to be addressed in the near future. © 2017 The Authors. Published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Combined PIXE and XPS analysis on republican and imperial Roman coins
NASA Astrophysics Data System (ADS)
Daccà, A.; Prati, P.; Zucchiatti, A.; Lucarelli, F.; Mandò, P. A.; Gemme, G.; Parodi, R.; Pera, R.
2000-03-01
A combined PIXE and XPS analysis has been performed on a few Roman coins of the republican and imperial age. The purpose was to investigate via XPS the nature and extent of the patina in order to extract PIXE data relative to the bulk of the coins. The inclusion of elements from the surface layer, altered by oxidation and inclusions, is a known source of uncertainty in PIXE analyses of coins performed to assess their composition and provenance.
Neutron streaming studies along JET shielding penetrations
NASA Astrophysics Data System (ADS)
Stamatelatos, Ion E.; Vasilopoulou, Theodora; Batistoni, Paola; Obryk, Barbara; Popovichev, Sergey; Naish, Jonathan
2017-09-01
Neutronic benchmark experiments are carried out at JET aiming to assess the neutronic codes and data used in ITER analysis. Among other activities, experiments are performed in order to validate neutron streaming simulations along long penetrations in the JET shielding configuration. In this work, neutron streaming calculations along the JET personnel entrance maze are presented. Simulations were performed using the MCNP code for Deuterium-Deuterium and Deuterium-Tritium plasma sources. The results of the simulations were compared against experimental data obtained using thermoluminescence detectors and activation foils.
NASA Astrophysics Data System (ADS)
Moulton, J. D.; Steefel, C. I.; Yabusaki, S.; Castleton, K.; Scheibe, T. D.; Keating, E. H.; Freedman, V. L.
2013-12-01
The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments use a graded and iterative approach, beginning with simplified, highly abstracted models and adding geometric and geologic complexity as understanding is gained. To build confidence in this assessment capability, extensive testing of the underlying tools is needed. Since the tools themselves, such as the subsurface flow and reactive-transport simulator Amanzi, are under active development, testing must be both hierarchical and highly automated. In this presentation we show how we have met these requirements by leveraging the Python-based open-source documentation system Sphinx together with several other open-source tools. Sphinx builds on the reStructuredText tool docutils, with important extensions that include high-quality formatting of equations and integrated plotting through matplotlib. This allows the documentation, as well as the input files for test, benchmark, and tutorial problems, to be maintained with the source code under a version control system. In addition, it enables developers to build documentation in several different formats (e.g., HTML and PDF) from a single source. We will highlight these features and discuss important benefits of this approach for Amanzi. In addition, we will show that some of ASCEM's other tools, such as the sampling provided by the Uncertainty Quantification toolset, are naturally leveraged to enable more comprehensive testing. Finally, we will highlight the integration of this hierarchical testing and documentation framework with our build system and tools (CMake, CTest, and CDash).
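A documentation page in the style the abstract describes might look like the following reStructuredText fragment; the page title, equation, and plot are invented for illustration, while the `math` directive and the matplotlib `plot` directive are standard Sphinx extensions:

```rst
Benchmark: 1-D advection
========================

The analytic solution advects the initial profile at velocity :math:`v`:

.. math::

   c(x, t) = c_0(x - v t)

.. plot::

   import numpy as np
   import matplotlib.pyplot as plt

   x = np.linspace(0.0, 1.0, 200)
   plt.plot(x, np.exp(-((x - 0.6) ** 2) / 0.005))
   plt.xlabel("x")
   plt.ylabel("c(x, t)")
```

Because a page like this lives in the repository next to the simulator input files, `sphinx-build` can regenerate the HTML or PDF benchmark report from the same source under version control.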
Management of pharmaceutical substances in the environment: Lithuanian case study.
Baranauskaitė-Fedorova, Inga; Dvarionienė, Jolanta; Nikiforov, Vladimir A
2016-09-01
An investigation of the sources, discharges, and related environmental risks of the pharmaceutical substance (PhS) diclofenac (DCF) was performed in Lithuania, a country of the Baltic Sea region, for the first time. The investigation refers only to DCF as a PhS for human use; emissions from animal husbandry were not considered. In the first stage of the research, the main sources and pathways of DCF were identified within the country via substance flow analysis. During the second stage, DCF flows along the wastewater treatment plants (WWTPs) in two different cities were measured in order to assess the current levels of pharmaceutical residues in the environment. Furthermore, an environmental risk assessment was carried out by taking into account consumption data and elimination rates in WWTPs. Finally, different technical and managerial removal approaches were assessed in an environmental management model of wastewater containing PhS, based on the framework of environmental systems theory.
On an efficient multilevel inverter assembly: structural savings and design optimisations
NASA Astrophysics Data System (ADS)
Choupan, Reza; Nazarpour, Daryoush; Golshannavaz, Sajjad
2018-01-01
This study puts forward an efficient unit cell to be used in multilevel inverter assemblies. The proposed structure is in line with reductions in the number of direct current (dc) voltage sources, insulated-gate bipolar transistors (IGBTs), gate driver circuits, and installation area, and hence in implementation costs. Such structural savings do not sacrifice the technical performance of the proposed design, in which, interestingly, an increased number of output voltage levels is attained. Targeting a techno-economic characteristic, the contemplated structure is included as the key unit of cascaded multilevel inverters. Such extensions require the development of applicable design procedures. To this end, two efficient strategies are elaborated to determine the magnitudes of the input dc voltage sources. As well, an optimisation process is developed to explore the influence of different parameters on the overall performance of the proposed inverter. These parameters are the number of IGBTs, dc sources, and diodes, and the overall blocked voltage on the switches. In light of these characteristics, a comprehensive analysis is established to compare the proposed design with conventional and recently developed structures. Detailed simulation and experimental studies are conducted to assess the performance of the proposed design. The obtained results are discussed in depth.
Personnel Dose Assessment during Active Interrogation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Thomas Martin; Akkurt, Hatice; Patton, Bruce W
A leading candidate in the detection of special nuclear material (SNM) is active interrogation (AI). Unlike passive interrogation, AI uses a source to enhance or create a detectable signal from SNM (usually fission), particularly in shielded scenarios or scenarios where the SNM has a low activity. The use of AI thus makes the detection of SNM easier or, in some scenarios, even enables previously impossible detection. During the development of AI sources, significant effort is put into determining the source strength required to detect SNM in specific scenarios. Usually during this process, but not always, an evaluation of personnel dose is also completed. In this instance personnel dose could involve any of the following: (1) personnel performing the AI; (2) unknown stowaways who are inside the object being interrogated; or (3) in clandestine interrogations, personnel who are known to be inside the object being interrogated but are unaware of the interrogation. In most instances, dose to anyone found smuggling SNM will be a secondary issue. However, for the organizations performing the AI, legal if not moral considerations should make dose to the personnel performing the AI, unknown stowaways, or innocent bystanders in clandestine interrogations a serious concern.
Zeleke, Berihun M.; Abramson, Michael J.; Benke, Geza
2018-01-01
Uncertainty in experimental studies of exposure to radiation from mobile phones has in the past only been framed within the context of statistical variability. It is now becoming more apparent to researchers that epistemic or reducible uncertainties can also affect the total error in results. These uncertainties are derived from a wide range of sources including human error, such as data transcription, model structure, measurement and linguistic errors in communication. The issue of epistemic uncertainty is reviewed and interpreted in the context of the MoRPhEUS, ExPOSURE and HERMES cohort studies which investigate the effect of radiofrequency electromagnetic radiation from mobile phones on memory performance. Research into this field has found inconsistent results due to limitations from a range of epistemic sources. Potential analytic approaches are suggested based on quantification of epistemic error using Monte Carlo simulation. It is recommended that future studies investigating the relationship between radiofrequency electromagnetic radiation and memory performance pay more attention to treatment of epistemic uncertainties as well as further research into improving exposure assessment. Use of directed acyclic graphs is also encouraged to display the assumed covariate relationship. PMID:29587425
A systematic literature review of open source software quality assessment models.
Adewumi, Adewole; Misra, Sanjay; Omoregbe, Nicholas; Crawford, Broderick; Soto, Ricardo
2016-01-01
Many open source software (OSS) quality assessment models have been proposed in the literature. However, there is little or no adoption of these models in practice. In order to guide the formulation of newer models so that they can be acceptable to practitioners, there is a need for clear discrimination among the existing models based on their specific properties. The aim of this study is therefore to perform a systematic literature review investigating the properties of the existing OSS quality assessment models, classifying them with respect to their quality characteristics, the methodology they use for assessment, and their domain of application, so as to guide the formulation and development of newer models. Searches in IEEE Xplore, ACM, Science Direct, Springer and Google Search were performed to retrieve all relevant primary studies. Journal and conference papers between 2003 and 2015 were considered, since the first known OSS quality model emerged in 2003. A total of 19 OSS quality assessment model papers were selected; to select them, we developed assessment criteria to evaluate the quality of the existing studies. The quality assessment models are classified into five categories based on the quality characteristics they possess, namely: single-attribute, rounded category, community-only attribute, non-community attribute, and non-quality-in-use models. Our study shows that software selection based on hierarchical structures is the most popular selection method in the existing OSS quality assessment models. Furthermore, we found that the majority (47%) of the existing models do not specify any domain of application. In conclusion, our study is a valuable contribution to the community: it helps quality assessment model developers formulate newer models, and helps practitioners (software evaluators) select suitable OSS from among alternatives.
An Assessment of Air As a Source of DNA Contamination Encountered When Performing PCR
Witt, Nina; Rodger, Gillian; Vandesompele, Jo; Benes, Vladimir; Zumla, Alimuddin; Rook, Graham A.; Huggett, Jim F.
2009-01-01
Sensitive molecular methods, such as the PCR, can detect low-level contamination, and careful technique is required to reduce the impact of contaminants. Yet, some assays that are designed to detect high copy-number target sequences appear to be impossible to perform without contamination, and frequently the personnel or laboratory environment is held responsible as the source. This complicates diagnostic and research analysis when using molecular methods. To investigate the air specifically as a source of contamination, which might occur during PCR setup, we exposed tubes of water to the air of a laboratory and clean hood for up to 24 h. To increase the chances of contamination, we also investigated a busy open-plan office in the same way. All of the experiments showed the presence of human and rodent DNA contamination. However, there was no accumulation of the contamination in any of the environments investigated, suggesting that the air was not the source of contamination. Even the air from a busy open-plan office was a poor source of contamination for all of the DNA sequences investigated (human, bacterial, fungal, and rodent). This demonstrates that the personnel and immediate laboratory environment are not necessarily to blame for the observed contamination. PMID:19949694
Regionalisation of Hydrological Indices to Assess Land-Use Change Impacts in the Tropical Andes
NASA Astrophysics Data System (ADS)
Buytaert, W.; Ochoa Tocachi, B. F.
2014-12-01
Andean ecosystems are major water sources for cities and communities located in the Tropical Andes; however, there is a considerable lack of knowledge about their hydrology. Two problems are especially important: (i) the lack of monitoring to assess the impacts of historical land-use and cover change and degradation (LUCCD) at catchment scale, and (ii) the high variability in climatic and hydrological conditions that complicates the evaluation of land management practices. This study analyses how a reliable LUCCD impact assessment can be performed in an environment of high variability combined with data scarcity and low-quality records. We use data from participatory hydrological monitoring activities in 20 catchments distributed along the tropical Andes. A set of 46 hydrological indices is calculated and regionalised by relating them to 42 physical catchment properties. Principal component analysis (PCA) is performed to maximise the available data while minimising redundancy in the sets of variables. Hydrological model parameters are constrained by the estimated indices, and different behavioural predictions are assembled to provide a generalised response on which we assess LUCCD impacts. Results from this methodology show that the attributed effects of LUCCD in pair-wise catchment comparisons may be overstated or hidden by different sources of uncertainty, including measurement inaccuracies and model structural errors. We propose extrapolation and evaluation in ungauged catchments as a way to regionalise LUCCD predictions and to provide statistically significant conclusions for the Andean region. These estimations may deliver reliable knowledge to evaluate the hydrological impact of different watershed management practices.
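The regionalisation step described above can be sketched in code. Everything below is illustrative: the data are synthetic, and the baseflow index and 90% variance cutoff are invented assumptions; only the dimensions (20 catchments, 42 physical properties) mirror the study. The sketch reduces redundant catchment properties with a PCA and regresses a hydrological index on the retained component scores:

```python
import numpy as np

# Hypothetical data: 20 catchments x 42 physical properties (area, slope, ...)
rng = np.random.default_rng(0)
props = rng.normal(size=(20, 42))

# Standardise, then PCA via SVD to remove redundancy among properties
z = (props - props.mean(axis=0)) / props.std(axis=0)
u, s, vt = np.linalg.svd(z, full_matrices=False)
explained = s**2 / np.sum(s**2)          # variance fraction per component

# Keep components explaining ~90% of variance, then regress a hydrological
# index (here an invented baseflow index) on the retained component scores
k = int(np.searchsorted(np.cumsum(explained), 0.90)) + 1
scores = u[:, :k] * s[:k]
bfi = rng.uniform(0.2, 0.8, size=20)     # hypothetical index values
design = np.c_[np.ones(20), scores]
coef, *_ = np.linalg.lstsq(design, bfi, rcond=None)
bfi_hat = design @ coef                  # regionalised estimate per catchment
```

In an ungauged catchment, the same regression would be evaluated on that catchment's component scores to predict the index without flow records.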
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potchen, E.J.
Questions regarding what imaging performance goals need to be met to produce effective biomedical research using positron emission computer tomography, how near those performance goals are to being realized by imaging systems, and the dependence of currently-unachieved performance goals on design and operational factors have been addressed in the past year, along with refinement of economic estimates for the capital and operating costs of a PECT research facility. The two primary sources of information have been solicitations of expert opinion and review of current literature. (ACR)
2005-03-17
private sector. However, along the way, Government and private sector industry have begun to disagree about how PPI is collected and how PPI is used. Industry prefers a passive system of collecting delivery and quality data during contract performance, while the Federal Government uses both a passive system (similar to industry) as well as an active system of pulling PPI during contract performance. Industry uses PPI to establish and maintain a preferred vendor list from which to solicit bids, quotes, or proposals, while government uses PPI to assess
Bozorgmehr, Kayvan; Goosen, Simone; Mohsenpour, Amir; Kuehne, Anna; Razum, Oliver; Kunst, Anton E
2017-08-08
Background: Accurate data on the health status, health behaviour and access to health care of asylum seekers are essential, but such data are lacking in many European countries. We hence aimed to: (a) develop and pilot-test an instrument that can be used to compare and benchmark country health information systems (HIS) with respect to their ability to assess the health status and health care situation of asylum seekers, and (b) present the results of that pilot for The Netherlands (NL) and Germany (DE). Materials and Methods: Reviewing and adapting existing tools, we developed a Health Information Assessment Tool on Asylum Seekers (HIATUS) with 50 items to assess HIS performance across three dimensions: (1) availability and detail of data across potential data sources; (2) HIS resources and monitoring capacity; and (3) general coverage and timeliness of publications on selected indicators. We piloted HIATUS by applying the tool to the HIS in DE and NL. Two raters per country independently assessed the performance of the country HIS, and the inter-rater reliability was analysed by Pearson's rho and the intra-class correlation (ICC). We then applied a consensus-based group rating to obtain the final ratings, which were transformed into a weighted summary score (range: 0-97). We assessed HIS performance by calculating total and domain-specific HIATUS scores by country, as well as absolute and relative gaps in scores within and between countries. Results: In the independent rating, Pearson's rho was 0.14 (NL) and 0.30 (DE), and the ICC yielded an estimated reliability of 0.29 (NL) and 0.83 (DE) respectively. In the final consensus-based rating, the total HIATUS score was 47 in NL and 15 in DE, translating into a relative gap in HIS capacity of 52% (NL) and 85% (DE) respectively. 
Shortfalls in HIS capacity in both countries relate to the areas of HIS coordination, planning and policies, and to limited coverage of specific indicators such as self-reported health, mental health, socio-economic status and health behaviour. The relative gap in the HIATUS component "data sources and availability" was much higher in Germany (92%) than in NL (28%). Conclusions: The standardised tool (HIATUS) proved useful for assessment of country HIS performance in two countries by consensus-based rating. HIATUS revealed substantial limitations in HIS capacity to assess the health situation of asylum seekers in both countries. The tool allowed for between-country comparisons, revealing that capacities were lower in DE relative to NL. Monitoring and benchmarking gaps in HIS capacity in further European countries can help to strengthen HIS in the future.
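The inter-rater statistics reported above (Pearson's rho and the intra-class correlation) can be sketched on synthetic ratings. The rating data, the two-rater noise model, and the choice of a one-way ICC(1,1) are illustrative assumptions, not taken from the study:

```python
import numpy as np

# Hypothetical scenario: two raters independently score the same 50 HIATUS items
rng = np.random.default_rng(1)
truth = rng.integers(0, 3, size=50).astype(float)  # underlying item scores
r1 = truth + rng.normal(0, 0.5, 50)                # rater 1 with noise
r2 = truth + rng.normal(0, 0.5, 50)                # rater 2 with noise

# Pearson's rho between the two raters
rho = np.corrcoef(r1, r2)[0, 1]

# One-way ICC(1,1): between-item variance relative to total variance
x = np.stack([r1, r2], axis=1)          # shape: items x raters
n, k = x.shape
grand = x.mean()
msb = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)           # between items
msw = np.sum((x - x.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))  # within items
icc = (msb - msw) / (msb + (k - 1) * msw)
```

Rho measures only linear association between raters, while the ICC also penalises systematic level differences, which is why the two can diverge as they did in the pilot.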
Abbott, M.L.; Susong, D.D.; Krabbenhoft, D.P.; Rood, A.S.
2002-01-01
Mercury (total and methyl) was evaluated in snow samples collected near a major mercury emission source on the Idaho National Engineering and Environmental Laboratory (INEEL) in southeastern Idaho and 160 km downwind in the Teton Range in western Wyoming. The sampling was done to assess near-field (<12 km) deposition rates around the source, to compare them to those measured in a relatively remote, pristine downwind location, and to use the measurements to develop improved, site-specific model input parameters for the precipitation scavenging coefficient and the fraction of Hg emissions deposited locally. Measured snow water concentrations (ng L-1) were converted to deposition (µg m-2) using the sample location snow water equivalent. The deposition was then compared to that predicted using the ISC3 air dispersion/deposition model, which was run with a range of particle and vapor scavenging coefficient input values. Accepted model statistical performance measures (fractional bias and normalized mean square error) were calculated for the different modeling runs, and the best model performance was selected. Measured concentrations close to the source (average = 5.3 ng L-1) were about twice those measured in the Teton Range (average = 2.7 ng L-1), which were within the expected range of values for remote background areas. For most of the sampling locations, the ISC3 model predicted within a factor of two of the observed deposition. The best modeling performance was obtained using a scavenging coefficient value for 0.25 µm diameter particulate and the assumption that all of the mercury is reactive Hg(II) and subject to local deposition. A 0.1 µm particle assumption provided conservative overprediction of the data, while a vapor assumption resulted in highly variable predictions. Partitioning a fraction of the Hg emissions to elemental Hg(0) (a U.S. EPA default assumption for combustion facility risk assessments) would have underpredicted the observed fallout.
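The two computations described above can be sketched as follows: converting snow concentrations to deposition via the snow water equivalent, and the standard model performance measures (fractional bias, normalised mean square error, plus a factor-of-two check). All numbers below are invented for illustration; they are not the study's data:

```python
import numpy as np

# Deposition (ug m-2) = concentration (ng L-1) x snow water equivalent (mm) / 1000
conc = np.array([5.1, 6.0, 4.4, 2.9, 5.8, 3.6])             # ng L-1 (hypothetical)
swe = np.array([300.0, 250.0, 400.0, 350.0, 280.0, 320.0])  # mm of water
obs = conc * swe / 1000.0                                   # observed deposition

pred = np.array([1.2, 2.1, 1.4, 0.6, 1.5, 1.8])             # hypothetical model output

# Performance measures used to select the best-performing model run
fb = 2 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())  # fractional bias
nmse = np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())    # normalised MSE
fac2 = np.mean((pred >= obs / 2) & (pred <= obs * 2))             # fraction within 2x
```

A run with fractional bias near zero, small NMSE, and a high factor-of-two fraction would be preferred, which is how the 0.25 µm particulate assumption was singled out.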
Estoppey, Nicolas; Schopfer, Adrien; Fong, Camille; Delémont, Olivier; De Alencastro, Luiz F; Esseiva, Pierre
2016-12-01
This study firstly aims to assess the field performances of low density polyethylene (LDPE) and silicone rubber (SR) samplers for the monitoring of polychlorinated biphenyls (PCBs) in water regarding the uptake, the sampling rate (RS) estimated by using performance reference compounds (PRCs), and the time-weighted average (TWA) concentrations. The second aim is to evaluate the efficiency of these samplers to investigate PCB sources (localization and imputation steps) using methods with and without PRCs to correct for the impact of water velocity on the uptake. Samplers spiked with PRCs were deployed in the outfalls of two PCB sources and at 8 river sites situated upstream and downstream of the outfalls. After 6 weeks, the uptake of PCBs in the linear phase was equivalent in LDPE and SR but 5 times lower in LDPE for PCBs approaching equilibrium. PRC-based RS and water velocity (0.08 to 1.21 m s-1) were well correlated in the river (LDPE: R2 = 0.91, SR: R2 = 0.96) but not in outfalls (higher turbulences and potential release of PRCs to air). TWA concentrations obtained with SR were slightly higher than those obtained with LDPE (factor 1.4 to 2.6 in the river), likely because of uncertainty in sampler-water partition coefficient values. Concentrations obtained through filtration and extraction of water samples (203 L) were 1.6 and 5.1 times higher than TWA concentrations obtained with SR and LDPE samplers, respectively. PCB sources could efficiently be localized when PRCs were used (increases of PCB loads in river) but the impact of high differences of water velocity was overcorrected (leading sometimes to false positives and negatives). Increases of PCB loads in the river could not be entirely imputed to the investigated sources (underestimation of PCBs contributing to the load increases). A method without PRCs (relationship between uptake and water velocity) appeared to be a good complementary method for LDPE. Copyright © 2016. Published by Elsevier B.V.
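The PRC-based sampling-rate and TWA-concentration calculations mentioned above can be sketched with the usual first-order exchange model. The kinetic form and every numeric value below (PRC amounts, partition coefficient, sampler mass, analyte uptake) are illustrative assumptions, not the study's data:

```python
import math

# PRC dissipation gives an in-situ exchange rate: ke = ln(N0/Nt) / t
n0, nt = 100.0, 40.0          # hypothetical PRC amounts (ng) before/after deployment
t = 42.0                      # deployment time, days (6 weeks)
ke = math.log(n0 / nt) / t    # first-order exchange rate, 1/day

# Sampling rate Rs (L/day) from ke, the sampler-water partition coefficient
# Kpw (L/kg) and the sampler mass m (kg) -- illustrative values only
kpw, m = 10**5.0, 0.002
rs = ke * kpw * m

# Time-weighted average water concentration for a PCB still in the
# linear-uptake phase: C_TWA = N / (Rs * t)
n_absorbed = 25.0             # ng of analyte accumulated in the sampler
c_twa = n_absorbed / (rs * t) # ng/L
```

Because Rs is derived in situ from PRC loss, it already reflects local water velocity and turbulence, which is why PRC-corrected loads can over- or under-correct when velocities differ strongly between sites.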
Field performance of Populus expressing somaclonal variation in resistance to Septoria musiva
M. E. Ostry; K. T. Ward
2003-01-01
Over 1500 trees from two hybrid poplar clones regenerated from tissue culture and expressing somatic variation in leaf disease resistance in a laboratory leaf disk bioassay were field-tested for 5-11 years to examine their resistance to Septoria leaf spot and canker and to assess their growth characteristics compared with the source clones....
A potential quantitative method for assessing individual tree performance
Lance A. Vickers; David R. Larsen; Daniel C. Dey; John M. Kabrick; Benjamin O. Knapp
2014-01-01
By what standard should a tree be judged? This question, perhaps unknowingly, is posed almost daily by practicing foresters. Unfortunately, there are few cases in which clearly defined quantitative (i.e., directly measurable) references have been established in forestry. A lack of common references may be an unnecessary source of error in silvicultural application and...
ERIC Educational Resources Information Center
Hessels, Laurens K.; van Lente, Harro
2011-01-01
In many Western science systems, funding structures increasingly stimulate academic research to contribute to practical applications, but at the same time the rise of bibliometric performance assessments have strengthened the pressure on academics to conduct excellent basic research that can be published in scholarly literature. We analyze the…
Gaming as a Platform for the Development of Innovative Problem-Based Learning Opportunities
ERIC Educational Resources Information Center
Echeverri, J. Felipe; Sadler, Troy D.
2011-01-01
The state of education in the United States, particularly in the areas of science, mathematics and technology, has been a consistent source of concern since at least the early 1980s when student performance on the 1986 National Assessment of Educational Progress (NAEP) revealed that science proficiency was lower than comparable measures from the…
Constructing and Evaluating a Validity Argument for the Final-Year Ward Simulation Exercise
ERIC Educational Resources Information Center
Till, Hettie; Ker, Jean; Myford, Carol; Stirling, Kevin; Mires, Gary
2015-01-01
The authors report final-year ward simulation data from the University of Dundee Medical School. Faculty who designed this assessment intend for the final score to represent an individual senior medical student's level of clinical performance. The results are included in each student's portfolio as one source of evidence of the student's…
Maeng, Daniel D; Scanlon, Dennis P; Chernew, Michael E; Gronniger, Tim; Wodchis, Walter P; McLaughlin, Catherine G
2010-01-01
Objective To examine the extent to which health plan quality measures capture physician practice patterns rather than plan characteristics. Data Source We gathered and merged secondary data from the following four sources: a private firm that collected information on individual physicians and their health plan affiliations, The National Committee for Quality Assurance, InterStudy, and the Dartmouth Atlas. Study Design We constructed two measures of physician network overlap for all health plans in our sample and linked them to selected measures of plan performance. Two linear regression models were estimated to assess the relationship between the measures of physician network overlap and the plan performance measures. Principal Findings The results indicate that in the presence of a higher degree of provider network overlap, plan performance measures tend to converge to a lower level of quality. Conclusions Standard health plan performance measures reflect physician practice patterns rather than plans' effort to improve quality. This implies that more provider-oriented measurement, such as would be possible with accountable care organizations or medical homes, may facilitate patient decision making and provide further incentives to improve performance. PMID:20403064
Simulation verification techniques study
NASA Technical Reports Server (NTRS)
Schoonmaker, P. B.; Wenglinski, T. H.
1975-01-01
Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.
When Assessment Data Are Words: Validity Evidence for Qualitative Educational Assessments.
Cook, David A; Kuper, Ayelet; Hatala, Rose; Ginsburg, Shiphra
2016-10-01
Quantitative scores fail to capture all important features of learner performance. This awareness has led to increased use of qualitative data when assessing health professionals. Yet the use of qualitative assessments is hampered by incomplete understanding of their role in forming judgments, and lack of consensus in how to appraise the rigor of judgments therein derived. The authors articulate the role of qualitative assessment as part of a comprehensive program of assessment, and translate the concept of validity to apply to judgments arising from qualitative assessments. They first identify standards for rigor in qualitative research, and then use two contemporary assessment validity frameworks to reorganize these standards for application to qualitative assessment. Standards for rigor in qualitative research include responsiveness, reflexivity, purposive sampling, thick description, triangulation, transparency, and transferability. These standards can be reframed using Messick's five sources of validity evidence (content, response process, internal structure, relationships with other variables, and consequences) and Kane's four inferences in validation (scoring, generalization, extrapolation, and implications). Evidence can be collected and evaluated for each evidence source or inference. The authors illustrate this approach using published research on learning portfolios. The authors advocate a "methods-neutral" approach to assessment, in which a clearly stated purpose determines the nature of and approach to data collection and analysis. Increased use of qualitative assessments will necessitate more rigorous judgments of the defensibility (validity) of inferences and decisions. Evidence should be strategically sought to inform a coherent validity argument.
The Human Exposure Potential from Propylene Releases to the Environment
Morgott, David A.
2018-01-01
A detailed literature search was performed to assess the sources, magnitudes and extent of human inhalation exposure to propylene. Exposure evaluations were performed at both the community and occupational levels for those living or working in different environments. The results revealed a multitude of pyrogenic, biogenic and anthropogenic emission sources. Pyrogenic sources, including biomass burning and fossil fuel combustion, appear to be the primary contributors to atmospheric propylene. Despite a very short atmospheric lifetime, measurable levels could be detected in highly remote locations as a result of biogenic release. The indoor/outdoor ratio for propylene has been shown to range from about 2 to 3 in non-smoking homes, which indicates that residential sources may be the largest contributor to the overall exposure for those not occupationally exposed. In homes where smoking takes place, the levels may be up to thirty times higher than in non-smoking residences. Atmospheric levels in most rural regions are typically below 2 ppbv, whereas urban values are much more variable, ranging as high as 10 ppbv. Somewhat elevated propylene exposures may also occur in the workplace, especially for firefighters or refinery plant operators, who may encounter levels up to about 10 ppmv. PMID:29300328
Assessment of a Hospital Palliative Care Unit (HPCU) for Cancer Patients; A Conceptual Framework.
Rouhollahi, Mohammad Reza; Saghafinia, Masoud; Zandehdel, Kazem; Motlagh, Ali Ghanbari; Kazemian, Ali; Mohagheghi, Mohammad Ali; Tahmasebi, Mamak
2015-01-01
The first hospital palliative care unit (HPCU) in Iran (FARS-HPCU) was established in 2008 in the Cancer Institute, the largest referral cancer center in the country. We attempted to assess the performance of the HPCU based on a comprehensive conceptual framework. The main aim of this study was to develop a conceptual framework for assessing HPCU performance by designing a value chain in line with the goals and the main processes (core and support). We collected data from a variety of sources, including international guidelines, international best practices, and expert opinions in the country, and compared them with national policies and priorities. We also took into consideration the trend of HPCU development in the Cancer Institute of Iran. Through benchmarking the gap areas against the performance standards, some recommendations for better outcomes are proposed. The framework for performance assessment consisted of 154 process indicators (PIs), which the main stakeholders of the HPCU (including staff, patients, and families) scored. The outcome revealed the state of the processes as well as the gaps. Despite significant improvement in many processes and indicators, more development in the comprehensive and integrative aspects of FARS-HPCU performance is required. Consideration of all supportive and palliative requirements of the patients through interdisciplinary and collaborative approaches is recommended.
Training and Assessment of Hysteroscopic Skills: A Systematic Review.
Savran, Mona Meral; Sørensen, Stine Maya Dreier; Konge, Lars; Tolsgaard, Martin G; Bjerrum, Flemming
2016-01-01
The aim of this systematic review was to identify studies on hysteroscopic training and assessment. PubMed, Excerpta Medica, the Cochrane Library, and Web of Science were searched in January 2015. Manual screening of references and citation tracking were also performed. Studies on hysteroscopic educational interventions were selected without restrictions on study design, populations, language, or publication year. A qualitative data synthesis including the setting, study participants, training model, training characteristics, hysteroscopic skills, assessment parameters, and study outcomes was performed by 2 authors working independently. Effect sizes were calculated when possible. Overall, 2 raters independently evaluated sources of validity evidence supporting the outcomes of the hysteroscopy assessment tools. A total of 25 studies on hysteroscopy training were identified, of which 23 were performed in simulated settings. Overall, 10 studies used virtual-reality simulators and reported effect sizes for technical skills ranging from 0.31 to 2.65; 12 used inanimate models and reported effect sizes for technical skills ranging from 0.35 to 3.19. One study involved live animal models; 2 studies were performed in clinical settings. The validity evidence supporting the assessment tools used was low. Consensus between the 2 raters on the reported validity evidence was high (94%). This systematic review demonstrated large variations in the effect of different tools for hysteroscopy training. The validity evidence supporting the assessment of hysteroscopic skills was limited. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
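Effect sizes of the kind reported above for training interventions are commonly Cohen's d with a pooled standard deviation. A minimal sketch on invented pre/post simulator scores (the numbers are hypothetical, not drawn from any reviewed study):

```python
import math

# Hypothetical pre/post simulator scores for a hysteroscopy task (illustrative)
pre = [42.0, 55.0, 48.0, 51.0, 45.0, 50.0]
post = [58.0, 66.0, 61.0, 70.0, 59.0, 64.0]

def cohens_d(a, b):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    pooled = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (mb - ma) / pooled

d = cohens_d(pre, post)
```

By convention, d around 0.2 is a small effect, 0.5 medium, and 0.8 or more large, which puts the review's reported range of roughly 0.3 to 3.2 anywhere from small to very large.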
Specialty-specific multi-source feedback: assuring validity, informing training.
Davies, Helena; Archer, Julian; Bateman, Adrian; Dewar, Sandra; Crossley, Jim; Grant, Janet; Southgate, Lesley
2008-10-01
The white paper 'Trust, Assurance and Safety: the Regulation of Health Professionals in the 21st Century' proposes a single, generic multi-source feedback (MSF) instrument in the UK. Multi-source feedback was proposed as part of the assessment programme for Year 1 specialty training in histopathology. An existing instrument was modified following blueprinting against the histopathology curriculum to establish content validity. Trainees were also assessed using an objective structured practical examination (OSPE). Factor analysis and correlation between trainees' OSPE performance and the MSF were used to explore validity. All 92 trainees participated and the assessor response rate was 93%. Reliability was acceptable with eight assessors (95% confidence interval 0.38). Factor analysis revealed two factors: 'generic' and 'histopathology'. Pearson correlation of MSF scores with OSPE performances was 0.48 (P = 0.001) and the histopathology factor correlated more highly (histopathology r = 0.54, generic r = 0.42; t = -2.76, d.f. = 89, P < 0.01). Trainees scored least highly in relation to ability to use histopathology to solve clinical problems (mean = 4.39) and provision of good reports (mean = 4.39). Three of six doctors whose means were < 4.0 received free text comments about report writing. There were 83 forms with aggregate scores of < 4. Of these, 19.2% included comments about report writing. Specialty-specific MSF is feasible and achieves satisfactory reliability. The higher correlation of the 'histopathology' factor with the OSPE supports validity. This paper highlights the importance of validating an MSF instrument within the specialty-specific context as, in addition to assuring content validity, the PATH-SPRAT (Histopathology-Sheffield Peer Review Assessment Tool) also demonstrates the potential to inform training as part of a quality improvement model.
Baken, Kirsten A; Sjerps, Rosa M A; Schriks, Merijn; van Wezel, Annemarie P
2018-06-13
Toxicological risk assessment of contaminants of emerging concern (CEC) in (sources of) drinking water is required to identify potential health risks and prioritize chemicals for abatement or monitoring. In such assessments, concentrations of chemicals in drinking water or sources are compared to either (i) health-based (statutory) drinking water guideline values, (ii) provisional guideline values based on recent toxicity data in absence of drinking water guidelines, or (iii) generic drinking water target values in absence of toxicity data. Here, we performed a toxicological risk assessment for 163 CEC that were selected as relevant for drinking water. This relevance was based on their presence in drinking water and/or groundwater and surface water sources in downstream parts of the Rhine and Meuse, in combination with concentration levels and physicochemical properties. Statutory and provisional drinking water guideline values could be derived from publicly available toxicological information for 142 of the CEC. Based on measured concentrations it was concluded that the majority of substances do not occur in concentrations which individually pose an appreciable human health risk. A health concern could however not be excluded for vinylchloride, trichloroethene, bromodichloromethane, aniline, phenol, 2-chlorobenzenamine, mevinphos, 1,4-dioxane, and nitrilotriacetic acid. For part of the selected substances, toxicological risk assessment for drinking water could not be performed since either toxicity data (hazard) or drinking water concentrations (exposure) were lacking. In absence of toxicity data, the Threshold of Toxicological Concern (TTC) approach can be applied for screening level risk assessment. The toxicological information on the selected substances was used to evaluate whether drinking water target values based on existing TTC levels are sufficiently protective for drinking water relevant CEC. 
Generic drinking water target levels of 37 μg/L for Cramer class I substances and 4 μg/L for Cramer class III substances in drinking water were derived based on these CEC. These levels are in line with previously reported generic drinking water target levels based on original TTC values and are shown to be protective for health effects of the majority of contaminants of emerging concern evaluated in the present study. Since the human health impact of many chemicals appearing in the water cycle has been studied insufficiently, generic drinking water target levels are useful for early warning and prioritization of CEC with unknown toxicity in drinking water and its sources for future monitoring. Copyright © 2018 Elsevier Ltd. All rights reserved.
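The screening logic described above (statutory or provisional guideline where one exists, otherwise a generic TTC-derived target) can be sketched briefly. Everything below is illustrative: the substance names, measured concentrations, and guideline values are invented; only the 37 and 4 ug/L generic levels come from the abstract.

```python
# Screening sketch: compare a measured CEC concentration against a guideline
# value or, failing that, a generic (TTC-derived) drinking water target level.
# Substance names, concentrations, and guideline values are hypothetical.
TTC_TARGET_UG_L = {"cramer_I": 37.0, "cramer_III": 4.0}  # generic levels reported in the paper

def screen(substance, measured_ug_l, guideline_ug_l=None, cramer_class=None):
    """Return (target, ratio, flag); flag is True when measured/target > 1."""
    if guideline_ug_l is not None:
        target = guideline_ug_l                  # statutory/provisional value takes precedence
    elif cramer_class in TTC_TARGET_UG_L:
        target = TTC_TARGET_UG_L[cramer_class]   # fall back to the generic TTC level
    else:
        raise ValueError("no target available for " + substance)
    ratio = measured_ug_l / target
    return target, ratio, ratio > 1.0

# Hypothetical screening inputs:
print(screen("substance_A", 2.0, guideline_ug_l=0.5))         # exceeds its guideline
print(screen("substance_B", 1.0, cramer_class="cramer_III"))  # below the generic level
```

In practice the ratio (a simple hazard quotient) is what supports prioritization for monitoring or abatement.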
[Regional atmospheric environment risk source identification and assessment].
Zhang, Xiao-Chun; Chen, Wei-Ping; Ma, Chun; Zhan, Shui-Fen; Jiao, Wen-Tao
2012-12-01
Identification and assessment of atmospheric environment risk sources plays an important role in regional atmospheric risk assessment and in regional atmospheric pollution prevention and control. The likelihood-exposure-consequence (LEC) assessment method and the Delphi method were employed to build a fast and effective method for the identification and assessment of regional atmospheric environment risk sources. This method was applied to the case study of a large coal transportation port in North China. The assessment results showed that the risk characteristics and the harm degree of the regional atmospheric environment risk sources were in line with the actual situation. Fast and effective identification and assessment of risk sources lays an important foundation for regional atmospheric environmental risk assessment and regional atmospheric pollution prevention and control.
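An LEC-style score is simply the product of expert-assigned factors. The sketch below is a generic illustration: the factor scales, banding thresholds, and source names are all invented for the example, not taken from the paper (which combined LEC scoring with Delphi expert rounds).

```python
# Illustrative LEC-style scoring: risk score D = L * E * C, where L (likelihood),
# E (exposure frequency), and C (consequence severity) are expert-assigned scores.
# The scales and banding thresholds here are assumptions for the sketch.
def lec_score(L, E, C):
    return L * E * C

def risk_band(score):
    # Hypothetical banding of the product score.
    if score >= 160:
        return "very high"
    if score >= 70:
        return "high"
    if score >= 20:
        return "moderate"
    return "low"

sources = {"coal stockpile": (6, 6, 7), "fuel depot": (3, 2, 15)}
for name, (L, E, C) in sources.items():
    s = lec_score(L, E, C)
    print(name, s, risk_band(s))
```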
The 20 GHz solid state transmitter design, impatt diode development and reliability assessment
NASA Technical Reports Server (NTRS)
Picone, S.; Cho, Y.; Asmus, J. R.
1984-01-01
A single drift gallium arsenide (GaAs) Schottky barrier IMPATT diode and related components were developed. The IMPATT diode reliability was assessed. A proof of concept solid state transmitter design and a technology assessment study were performed. The transmitter design utilizes technology which, upon implementation, will demonstrate readiness for development of a POC model within the 1982 time frame and will provide an information base for flight hardware capable of deployment in a 1985 to 1990 demonstrational 30/20 GHz satellite communication system. Life test data for Schottky barrier GaAs diodes and grown junction GaAs diodes are described. The results demonstrate the viability of GaAs IMPATTs as high performance, reliable RF power sources which, based on the recommendation made herein, will surpass device reliability requirements consistent with a ten year spaceborne solid state power amplifier mission.
Leveraging Diverse Data Sources to Identify and Describe U.S. Health Care Delivery Systems.
Cohen, Genna R; Jones, David J; Heeringa, Jessica; Barrett, Kirsten; Furukawa, Michael F; Miller, Dan; Mutti, Anne; Reschovsky, James D; Machta, Rachel; Shortell, Stephen M; Fraze, Taressa; Rich, Eugene
2017-12-15
Health care delivery systems are a growing presence in the U.S., yet research is hindered by the lack of universally agreed-upon criteria to denote formal systems. A clearer understanding of how to leverage real-world data sources to empirically identify systems is a necessary first step to such policy-relevant research. We draw from our experience in the Agency for Healthcare Research and Quality's Comparative Health System Performance (CHSP) initiative to assess available data sources to identify and describe systems, including system members (for example, hospitals and physicians) and relationships among the members (for example, hospital ownership of physician groups). We highlight five national data sources that either explicitly track system membership or detail system relationships: (1) American Hospital Association annual survey of hospitals; (2) Healthcare Relational Services Databases; (3) SK&A Healthcare Databases; (4) Provider Enrollment, Chain, and Ownership System; and (5) Internal Revenue Service 990 forms. Each data source has strengths and limitations for identifying and describing systems due to their varied content, linkages across data sources, and data collection methods. In addition, although no single national data source provides a complete picture of U.S. systems and their members, the CHSP initiative will create an early model of how such data can be combined to compensate for their individual limitations. Identifying systems in a way that can be repeated over time and linked to a host of other data sources will support analysis of how different types of organizations deliver health care and, ultimately, comparison of their performance.
Understanding workers' exposure: Systematic review and data-analysis of emission potential for NOAA.
Kuijpers, E; Bekker, C; Brouwer, D; le Feber, M; Fransman, W
2017-05-01
Exposure assessment for nano-objects, and their aggregates and agglomerates (NOAA), has evolved from explorative research toward more comprehensive exposure assessment, providing data to further develop currently used conservative control banding (CB) tools for risk assessment. This study aims to provide an overview of current knowledge on the emission potential of NOAA across the occupational life cycle stages by a systematic review, and subsequently to use the results in a data analysis. Relevant parameters that influence emission were collected from peer-reviewed literature with a focus on the four source domains (SD) in the source-receptor conceptual framework for NOAA. To make the reviewed exposure data comparable, we applied an approach to normalize for workplace circumstances and measurement location, resulting in comparable "surrogate" emission levels. Finally, descriptive statistics were performed. During the synthesis of nanoparticles (SD1), mechanical reduction and gas phase synthesis resulted in the highest emission compared to wet chemistry and chemical vapor condensation. For the handling and transfer of bulk manufactured nanomaterial powders (SD2), the emission could be differentiated for five activity classes: (1) harvesting; (2) dumping; (3) mixing; (4) cleaning of a reactor; and (5) transferring. Additionally, SD2 was subdivided by the handled amount, with cleaning further subdivided by energy level. Harvesting and dumping resulted in the highest emissions. Regarding processes with liquids (SD3b), it was possible to distinguish emissions for spraying (propellant gas, (high) pressure and pump), sonication and brushing/rolling. The highest emissions observed in SD3b were for propellant gas spraying and pressure spraying. For the handling of nano-articles (SD4), the highest emissions of nano-sized particles (including NOAA) were found for grinding.
This study provides a valuable overview of emission assessments performed in the workplace during the occupational handling of NOAA. Analyses were made per source domain to derive emission levels which can be used for models to quantitatively predict the exposure.
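The data-analysis step described above (pooling normalized "surrogate" emission levels per activity class and summarizing them) might be sketched as follows. The activity-class labels echo SD2, but all emission values are synthetic.

```python
import math
import statistics

# Sketch of the review's data-analysis step: pool normalized ("surrogate")
# emission levels per activity class and summarize them. All values are invented.
levels = {
    "SD2:harvesting": [1.2e5, 8.0e4, 2.5e5],
    "SD2:dumping":    [9.0e4, 1.1e5],
    "SD2:mixing":     [3.0e3, 5.5e3, 4.2e3],
}

def geo_mean(xs):
    # Geometric mean, conventional for log-normally distributed exposure data.
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

summary = {k: (statistics.median(v), geo_mean(v)) for k, v in levels.items()}
for k, (med, gm) in sorted(summary.items()):
    print(f"{k}: median={med:.3g}, geometric mean={gm:.3g}")
```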
KSC ice/frost/debris assessment for Space Shuttle Mission STS-30R
NASA Technical Reports Server (NTRS)
Stevenson, Charles G.; Katnik, Gregory N.; Higginbotham, Scott A.
1989-01-01
An ice/frost/debris assessment was conducted for Space Shuttle Mission STS-30R. Debris inspections of the flight elements and launch pad are performed before and after launch. Ice/frost conditions on the external tank are assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle, followed by an on-pad visual inspection. High speed photography is analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage. The ice/frost/debris conditions of Mission STS-30R and their overall effect on the Space Shuttle Program are documented.
Potts, Henry W W; McManus, I C
2011-01-01
Objective To determine whether the ethnicity of UK trained doctors and medical students is related to their academic performance. Design Systematic review and meta-analysis. Data sources Online databases PubMed, Scopus, and ERIC; Google and Google Scholar; personal knowledge; backwards and forwards citations; specific searches of medical education journals and medical education conference abstracts. Study selection The included quantitative reports measured the performance of medical students or UK trained doctors from different ethnic groups in undergraduate or postgraduate assessments. Exclusions were non-UK assessments, only non-UK trained candidates, only self reported assessment data, only dropouts or another non-academic variable, obvious sampling bias, or insufficient details of ethnicity or outcomes. Results 23 reports comparing the academic performance of medical students and doctors from different ethnic groups were included. Meta-analyses of effects from 22 reports (n=23 742) indicated candidates of “non-white” ethnicity underperformed compared with white candidates (Cohen’s d=−0.42, 95% confidence interval −0.50 to −0.34; P<0.001). Effects in the same direction and of similar magnitude were found in meta-analyses of undergraduate assessments only, postgraduate assessments only, machine marked written assessments only, practical clinical assessments only, assessments with pass/fail outcomes only, assessments with continuous outcomes only, and in a meta-analysis of white v Asian candidates only. Heterogeneity was present in all meta-analyses. Conclusion Ethnic differences in academic performance are widespread across different medical schools, different types of exam, and in undergraduates and postgraduates. They have persisted for many years and cannot be dismissed as atypical or local problems. We need to recognise this as an issue that probably affects all of UK medical and higher education. 
More detailed information to track the problem as well as further research into its causes is required. Such actions are necessary to ensure a fair and just method of training and of assessing current and future doctors. PMID:21385802
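The pooling step behind a summary estimate such as Cohen's d = -0.42 can be sketched with a standard inverse-variance (fixed-effect) combination. The per-study effect sizes and standard errors below are invented, not the review's data; a random-effects model would additionally account for the heterogeneity the authors report.

```python
import math

# Inverse-variance (fixed-effect) pooling of standardized mean differences,
# as a minimal sketch of the meta-analytic step; the (d, se) pairs are invented.
studies = [(-0.50, 0.10), (-0.35, 0.08), (-0.45, 0.12)]

w = [1 / se**2 for _, se in studies]                      # inverse-variance weights
pooled_d = sum(wi * di for (di, _), wi in zip(studies, w)) / sum(w)
pooled_se = math.sqrt(1 / sum(w))
ci = (pooled_d - 1.96 * pooled_se, pooled_d + 1.96 * pooled_se)
print(f"pooled d = {pooled_d:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```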
Impact of workplace based assessment on doctors' education and performance: a systematic review.
Miller, Alice; Archer, Julian
2010-09-24
To investigate the literature for evidence that workplace based assessment affects doctors' education and performance. Systematic review. The primary data sources were the databases Journals@Ovid, Medline, Embase, CINAHL, PsycINFO, and ERIC. Evidence based reviews (Bandolier, Cochrane Library, DARE, HTA Database, and NHS EED) were accessed and searched via the Health Information Resources website. Reference lists of relevant studies and bibliographies of review articles were also searched. Review methods Studies of any design that attempted to evaluate either the educational impact of workplace based assessment, or the effect of workplace based assessment on doctors' performance, were included. Studies were excluded if the sampled population was non-medical or the study was performed with medical students. Review articles, commentaries, and letters were also excluded. The final exclusion criterion was the use of simulated patients or models rather than real life clinical encounters. Sixteen studies were included. Fifteen of these were non-comparative descriptive or observational studies; the other was a randomised controlled trial. Study quality was mixed. Eight studies examined multisource feedback with mixed results; most doctors felt that multisource feedback had educational value, although the evidence for practice change was conflicting. Some junior doctors and surgeons displayed little willingness to change in response to multisource feedback, whereas family physicians might be more prepared to initiate change. Performance changes were more likely to occur when feedback was credible and accurate or when coaching was provided to help subjects identify their strengths and weaknesses. Four studies examined the mini-clinical evaluation exercise, one looked at direct observation of procedural skills, and three were concerned with multiple assessment methods: all these studies reported positive results for the educational impact of workplace based assessment tools. 
However, there was no objective evidence of improved performance with these tools. Considering the emphasis placed on workplace based assessment as a method of formative performance assessment, there are few published articles exploring its impact on doctors' education and performance. This review shows that multisource feedback can lead to performance improvement, although individual factors, the context of the feedback, and the presence of facilitation have a profound effect on the response. There is no evidence that alternative workplace based assessment tools (mini-clinical evaluation exercise, direct observation of procedural skills, and case based discussion) lead to improvement in performance, although subjective reports on their educational impact are positive.
Hansen, J H; Nandkumar, S
1995-01-01
The formulation of reliable signal processing algorithms for speech coding and synthesis requires the selection of an a priori criterion of performance. Though coding efficiency (bits/second) or computational requirements can be used, a final performance measure must always include speech quality. In this paper, three objective speech quality measures are considered with respect to quality assessment for American English, noisy American English, and noise-free versions of seven languages. The purpose is to determine whether objective quality measures can be used to quantify changes in quality for a given voice coding method, with a known subjective performance level, as background noise or language conditions are changed. The speech coding algorithm chosen is regular-pulse excitation with long-term prediction (RPE-LTP), which has been chosen as the standard voice compression algorithm for the European Digital Mobile Radio system. Three areas are considered for objective quality assessment: (i) vocoder performance for American English in a noise-free environment, (ii) speech quality variation for three additive background noise sources, and (iii) noise-free performance for seven languages, which include English, Japanese, Finnish, German, Hindi, Spanish, and French. It is suggested that although existing objective quality measures will never replace subjective testing, they can be a useful means of assessing changes in performance, identifying areas for improvement in algorithm design, and augmenting subjective quality tests for voice coding/compression algorithms in noise-free, noisy, and/or non-English applications.
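Objective measures of this kind are typically frame-based comparisons between the reference and coded signals. A minimal segmental SNR sketch illustrates the idea (segmental SNR is a common objective quality measure, though not necessarily one of the three used in the paper; the frame length and clamping limits are conventional choices, not values from the source).

```python
import math

# Minimal sketch of a frame-based objective quality measure: segmental SNR,
# the average per-frame SNR (dB) between a reference and a coded/degraded signal.
def seg_snr(ref, deg, frame=160, lo=-10.0, hi=35.0):
    snrs = []
    for i in range(0, len(ref) - frame + 1, frame):
        sig = sum(x * x for x in ref[i:i + frame])
        err = sum((x - y) ** 2 for x, y in zip(ref[i:i + frame], deg[i:i + frame]))
        if err == 0 or sig == 0:
            continue                                  # skip silent or identical frames
        snrs.append(min(hi, max(lo, 10 * math.log10(sig / err))))
    return sum(snrs) / len(snrs) if snrs else hi      # identical signals hit the upper clamp

# Toy signals: a sinusoid and a slightly noisy copy of it.
ref = [math.sin(0.1 * n) for n in range(320)]
deg = [x + 0.05 * ((-1) ** n) for n, x in enumerate(ref)]
print(f"segmental SNR: {seg_snr(ref, deg):.1f} dB")
```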
Essers, Geurt; Dielissen, Patrick; van Weel, Chris; van der Vleuten, Cees; van Dulmen, Sandra; Kramer, Anneke
2015-03-01
Communication assessment in real-life consultations is a complex task. Generic assessment instruments help but may also have disadvantages. The generic nature of the skills being assessed does not provide indications for context-specific behaviour required in practice situations; context influences are mostly taken into account implicitly. Our research questions are: 1. What factors do trained raters observe when rating workplace communication? 2. How do they take context factors into account when rating communication performance with a generic rating instrument? Nineteen general practitioners (GPs), trained in communication assessment with a generic rating instrument (the MAAS-Global), participated in a think-aloud protocol reflecting concurrent thought processes while assessing videotaped real-life consultations. They were subsequently interviewed to answer questions explicitly asking them to comment on the influence of predefined contextual factors on the assessment process. Results from both data sources were analysed. We used a grounded theory approach to untangle the influence of context factors on GP communication and on communication assessment. Both from the think-aloud procedure and from the interviews we identified various context factors influencing communication, which were categorised into doctor-related (17), patient-related (13), consultation-related (18), and education-related factors (18). Participants had different views and practices on how to incorporate context factors into the GP(-trainee) communication assessment. Raters acknowledge that context factors may affect communication in GP consultations, but struggle with how to take contextual influences into account when assessing communication performance in an educational context. To assess practice situations, raters need extra guidance on how to handle specific contextual factors.
Portfolio: a comprehensive method of assessment for postgraduates in oral and maxillofacial surgery.
Kadagad, Poornima; Kotrashetti, S M
2013-03-01
Postgraduate learning and assessment is an important responsibility of an academic oral and maxillofacial surgeon. The current methods of assessment for postgraduate training include formative evaluation in the form of seminars, case presentations, log books, and infrequently conducted end-of-year theory exams. The end-of-course theory and practical examination is a summative evaluation that awards the degree to the student based on the grades obtained. Oral and maxillofacial surgery is mainly a skill-based specialty, and deliberate practice enhances skill. But the traditional system of assessment of postgraduates emphasizes their performance on the summative exam, which fails to capture an integral picture of the student throughout the course. Emphasis on competency and the holistic growth of the postgraduate student during training has in recent years led to research on and evaluation of assessment methods to quantify students' progress during training. The portfolio method of assessment has been proposed as a potentially functional method for postgraduate evaluation. A portfolio is defined as a collection of papers and other forms of evidence that learning has taken place. It allows the collation and integration of evidence on competence and performance from different sources to gain a comprehensive picture of everyday practice. The benefits of portfolio assessment in health professions education are twofold: its potential to assess performance and its potential to assess outcomes, such as attitudes and professionalism, that are difficult to assess using traditional instruments. This paper is an endeavor toward the development of the portfolio method of assessment for postgraduate students in oral and maxillofacial surgery.
Assessment of solar-assisted gas-fired heat pump systems
NASA Technical Reports Server (NTRS)
Lansing, F. L.
1981-01-01
As a possible application for the Goldstone Energy Project, the performance of a 10 ton heat pump unit using a hybrid solar-gas energy source was evaluated in an effort to optimize the solar collector size. The heat pump system is designed to provide all the cooling and/or heating requirements of a selected office building. The system performance is to be augmented in the heating mode by utilizing the waste heat from the power cycle. A simplified system analysis is described to assess and compute the interrelationships of the engine, heat pump, solar, and building performance parameters, and to optimize the solar concentrator/building area ratio for a minimum total system cost. In addition, four alternative heating/cooling systems commonly used for building comfort are described; their costs are compared and are found to be less competitive with the gas-solar heat pump system at the projected solar equipment costs.
Natarajan, Annamalai; Angarita, Gustavo; Gaiser, Edward; Malison, Robert; Ganesan, Deepak; Marlin, Benjamin M
2016-09-01
Mobile health research on illicit drug use detection typically involves a two-stage study design where data to learn detectors is first collected in lab-based trials, followed by a deployment to subjects in a free-living environment to assess detector performance. While recent work has demonstrated the feasibility of wearable sensors for illicit drug use detection in the lab setting, several key problems can limit lab-to-field generalization performance. For example, lab-based data collection often has low ecological validity, the ground-truth event labels collected in the lab may not be available at the same level of temporal granularity in the field, and there can be significant variability between subjects. In this paper, we present domain adaptation methods for assessing and mitigating potential sources of performance loss in lab-to-field generalization and apply them to the problem of cocaine use detection from wearable electrocardiogram sensor data.
Resistive loads powered by separate or by common electrical sources
NASA Technical Reports Server (NTRS)
Appelbaum, J.
1989-01-01
In designing a multiple-load electrical system, the designer may wish to compare the performance of two setups: a common electrical source powering all loads, or separate electrical sources powering individual loads. Three types of electrical sources powering resistive loads were analyzed for their performance in separate and common source systems: an ideal voltage source, an ideal current source, and a solar cell source. A mathematical proof is given for each case, indicating the merit of the separate or common source system. The main conclusions are: (1) identical resistive loads powered by ideal voltage sources perform the same in both system setups, (2) nonidentical resistive loads powered by ideal voltage sources perform the same in both system setups, (3) nonidentical resistive loads powered by ideal current sources have higher performance in separate source systems, and (4) nonidentical resistive loads powered by solar cells have higher performance in a common source system for a wide range of load resistances.
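Conclusion (3) can be checked numerically. The sketch below compares total delivered power for two nonidentical resistors fed by separate ideal current sources (current I each) versus one common source (current 2I) feeding the pair in parallel; the resistor values are arbitrary, and this particular parallel topology is an assumption of the sketch.

```python
# Sketch checking conclusion (3): two nonidentical resistors driven by ideal
# current sources deliver more total power with one source per load (current I each)
# than with a common source (current 2I) feeding them in parallel.
def power_separate(I, R1, R2):
    return I**2 * R1 + I**2 * R2

def power_common(I, R1, R2):
    I_tot = 2 * I                    # the common source supplies both loads
    R_par = R1 * R2 / (R1 + R2)      # parallel combination of the two loads
    return I_tot**2 * R_par

I, R1, R2 = 1.0, 2.0, 8.0
print(power_separate(I, R1, R2), power_common(I, R1, R2))
# Algebraically, P_sep - P_common = I**2 * (R1 - R2)**2 / (R1 + R2) >= 0,
# with equality only for identical loads, consistent with conclusions (1) and (3).
```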
Sensitivity analysis of the near-road dispersion model RLINE - An evaluation at Detroit, Michigan
NASA Astrophysics Data System (ADS)
Milando, Chad W.; Batterman, Stuart A.
2018-05-01
The development of accurate and appropriate exposure metrics for health effect studies of traffic-related air pollutants (TRAPs) remains challenging and important given that traffic has become the dominant urban exposure source and that exposure estimates can affect estimates of associated health risk. Exposure estimates obtained using dispersion models can overcome many of the limitations of monitoring data, and such estimates have been used in several recent health studies. This study examines the sensitivity of exposure estimates produced by dispersion models to meteorological, emission and traffic allocation inputs, focusing on applications to health studies examining near-road exposures to TRAP. Daily average concentrations of CO and NOx predicted using the Research Line source model (RLINE) and a spatially and temporally resolved mobile source emissions inventory are compared to ambient measurements at near-road monitoring sites in Detroit, MI, and are used to assess the potential for exposure measurement error in cohort and population-based studies. Sensitivity of exposure estimates is assessed by comparing nominal and alternative model inputs using statistical performance evaluation metrics and three sets of receptors. The analysis shows considerable sensitivity to meteorological inputs; generally the best performance was obtained using data specific to each monitoring site. An updated emission factor database provided some improvement, particularly at near-road sites, while the use of site-specific diurnal traffic allocations did not improve performance compared to simpler default profiles. Overall, this study highlights the need for appropriate inputs, especially meteorological inputs, to dispersion models aimed at estimating near-road concentrations of TRAPs. It also highlights the potential for systematic biases that might affect analyses that use concentration predictions as exposure measures in health studies.
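Statistical performance evaluation of near-road dispersion predictions commonly relies on metrics such as fractional bias (FB), normalized mean square error (NMSE), and the fraction of predictions within a factor of two of observations (FAC2). The sketch below uses invented observed/predicted pairs; these are standard metrics in the field, though the study's exact metric set is not detailed in the abstract.

```python
import statistics

# Standard operational model-evaluation metrics for paired obs/pred concentrations.
def fb(obs, pred):
    mo, mp = statistics.mean(obs), statistics.mean(pred)
    return 2 * (mo - mp) / (mo + mp)   # negative means over-prediction in this convention

def nmse(obs, pred):
    mo, mp = statistics.mean(obs), statistics.mean(pred)
    return statistics.mean([(o - p) ** 2 for o, p in zip(obs, pred)]) / (mo * mp)

def fac2(obs, pred):
    ok = [1 for o, p in zip(obs, pred) if 0.5 <= p / o <= 2.0]
    return len(ok) / len(obs)

# Invented daily-average pairs for illustration:
obs = [1.0, 2.0, 4.0, 8.0]
pred = [1.2, 1.5, 5.0, 20.0]
print(fb(obs, pred), nmse(obs, pred), fac2(obs, pred))
```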
Banas, Krzysztof; Banas, Agnieszka; Gajda, Mariusz; Kwiatek, Wojciech M; Pawlicki, Bohdan; Breese, Mark B H
2014-07-15
Assessment of the performance and up-to-date diagnostics of scientific equipment is one of the key components of work in contemporary laboratories. The most reliable checks are performed by real test experiments while varying the experimental conditions (typically, in the case of infrared spectroscopic measurements, the size of the beam aperture, the duration of the experiment, the spectral range, the scanner velocity, etc.). On the other hand, the stability of the instrument response over time is another element of great value. Source stability (or easily predictable temporal changes, similar to those observed in the case of synchrotron radiation-based sources working in non-top-up mode) and detector stability (especially in the case of liquid nitrogen- or liquid helium-cooled detectors) should be monitored. In these cases, recorded datasets (spectra) include additional variables such as the time stamp at which a particular spectrum was recorded (in the case of time-trial experiments). A favorable approach to evaluating these data is building a hyperspectral object that consists of all spectra and all additional parameters at which these spectra were recorded. Taking into account that these datasets can be considerably large in size, there is a need for tools for semiautomatic data evaluation and information extraction. The Comprehensive R Archive Network (CRAN) and the open-source R environment, with their flexibility and growing potential, fit these requirements nicely. In this paper, examples of practical implementation of methods available in R for real-life Fourier transform infrared (FTIR) spectroscopic data problems are presented. However, this approach could easily be adopted to many other laboratory scenarios with other spectroscopic techniques.
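The "hyperspectral object" idea (spectra bundled with the acquisition metadata recorded alongside them, so the whole dataset can be filtered and summarized together) is not tied to R. A minimal Python analogue, with synthetic wavenumbers, absorbances, and metadata, might look like:

```python
from dataclasses import dataclass

# A Python sketch of a "hyperspectral object": spectra plus per-measurement
# metadata (time stamp, aperture, ...) kept together for joint filtering.
# All values are synthetic; the paper itself works in R.
@dataclass
class HyperSpectra:
    wavenumbers: list   # shared x-axis (cm^-1)
    spectra: list       # one absorbance list per measurement
    meta: list          # one metadata dict per measurement

    def select(self, **criteria):
        """Return the subset of spectra whose metadata match all criteria."""
        keep = [i for i, m in enumerate(self.meta)
                if all(m.get(k) == v for k, v in criteria.items())]
        return HyperSpectra(self.wavenumbers,
                            [self.spectra[i] for i in keep],
                            [self.meta[i] for i in keep])

hs = HyperSpectra([1000.0, 1002.0],
                  [[0.10, 0.12], [0.20, 0.22], [0.11, 0.13]],
                  [{"aperture": 50}, {"aperture": 100}, {"aperture": 50}])
print(len(hs.select(aperture=50).spectra))
```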
Setting performance standards for medical practice: a theoretical framework.
Southgate, L; Hays, R B; Norcini, J; Mulholland, H; Ayers, B; Woolliscroft, J; Cusimano, M; McAvoy, P; Ainsworth, M; Haist, S; Campbell, M
2001-05-01
The assessment of performance in the real world of medical practice is now widely accepted as the goal of assessment at the postgraduate level. This is largely a validity issue, as it is recognised that tests of knowledge and clinical simulations cannot on their own measure how medical practitioners function in the broader health care system. However, the development of standards for performance-based assessment is not as well understood as in competency assessment, where simulations can more readily reflect narrower issues of knowledge and skills. This paper proposes a theoretical framework for the development of standards that reflect the more complex world in which experienced medical practitioners work. The paper reflects the combined experiences of a group of education researchers and the results of literature searches that included identifying current health system data sources that might contribute information to the measurement of standards. Standards that reflect the complexity of medical practice may best be developed through an "expert systems" analysis of clinical conditions for which desired health care outcomes reflect the contribution of several health professionals within a complex, three-dimensional, contextual model. Examples of the model are provided, but further work is needed to test validity and measurability.
Thomas, K; McBean, E; Shantz, A; Murphy, H M
2015-03-01
Most Cambodians lack access to a safe source of drinking water. Piped distribution systems are typically limited to major urban centers in Cambodia, and the remaining population relies on a variety of surface, rain, and groundwater sources. This study examines the household water supplies available to Phnom Penh's resettled peri-urban residents through a case-study approach in two communities. A quantitative microbial risk assessment is performed to assess the level of diarrheal disease risk faced by community members due to microbial contamination of drinking water. Risk levels found in this study exceed those associated with households consuming piped water. Filtered and boiled rain and tank water stored in a kettle, bucket/cooler, bucket with spigot, or 500 mL bottle were found to provide risk levels within one order of magnitude of those for the piped water available in Phnom Penh. The two primary concerns identified are the negation of the risk reductions gained by boiling due to prevailing poor storage practices, and the use of highly contaminated source water.
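A quantitative microbial risk assessment of this kind typically chains an ingested dose estimate through a dose-response model to a daily and then annual infection risk. The sketch below uses an exponential dose-response model; the parameter r, the pathogen concentration, and the intake volume are placeholders, not values from the study.

```python
import math

# Minimal QMRA-style sketch with an exponential dose-response model for a
# single reference pathogen. All parameter values are hypothetical.
def daily_infection_risk(conc_per_L, volume_L, r):
    dose = conc_per_L * volume_L          # mean ingested dose (organisms/day)
    return 1 - math.exp(-r * dose)        # exponential dose-response model

def annual_risk(daily_risk, days=365):
    # Assumes independent daily exposures.
    return 1 - (1 - daily_risk) ** days

p_day = daily_infection_risk(conc_per_L=0.1, volume_L=1.0, r=0.05)
print(f"daily risk {p_day:.4f}, annual risk {annual_risk(p_day):.3f}")
```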
Measured and predicted rotor performance for the SERI advanced wind turbine blades
NASA Astrophysics Data System (ADS)
Tangler, J.; Smith, B.; Kelley, N.; Jager, D.
1992-02-01
Measured and predicted rotor performance for the Solar Energy Research Institute (SERI) advanced wind turbine blades were compared to assess the accuracy of predictions and to identify the sources of error affecting both predictions and measurements. An awareness of these sources of error contributes to improved prediction and measurement methods that will ultimately benefit future rotor design efforts. Propeller/vane anemometers were found to underestimate the wind speed in turbulent environments such as the San Gorgonio Pass wind farm area. Using sonic or cup anemometers, good agreement was achieved between predicted and measured power output for wind speeds up to 8 m/sec. At higher wind speeds an optimistic predicted power output and the occurrence of peak power at wind speeds lower than measurements resulted from the omission of turbulence and yaw error. In addition, accurate two-dimensional (2-D) airfoil data prior to stall and a post stall airfoil data synthesization method that reflects three-dimensional (3-D) effects were found to be essential for accurate performance prediction.
Performance assessment study of the balloon-borne astronomical soft gamma-ray polarimeter PoGOLite
NASA Astrophysics Data System (ADS)
Arimoto, M.; Kanai, Y.; Ueno, M.; Kataoka, J.; Kawai, N.; Tanaka, T.; Yamamoto, K.; Takahashi, H.; Mizuno, T.; Fukazawa, Y.; Axelsson, M.; Kiss, M.; Marini Bettolo, C.; Carlson, P.; Klamra, W.; Pearce, M.; Chen, P.; Craig, B.; Kamae, T.; Madejski, G.; Ng, J. S. T.; Rogers, R.; Tajima, H.; Thurston, T. S.; Saito, Y.; Takahashi, T.; Gunji, S.; Bjornsson, C.-I.; Larsson, S.; Ryde, F.; Bogaert, G.; Varner, G.
2007-12-01
Measurements of polarization play a crucial role in the understanding of the dominant emission mechanism of astronomical sources. The Polarized Gamma-ray Observer-Light version (PoGOLite) is a balloon-borne astronomical soft gamma-ray polarimeter for the 25-80 keV band. The PoGOLite detector consists of a hexagonal close-packed array of 217 Phoswich detector cells (PDCs) and side anti-coincidence shields (SASs) made of BGO crystals surrounding the PDCs. Each PDC consists of a slow hollow scintillator, a fast scintillator, and a BGO crystal connected to a photomultiplier tube at the end. To examine PoGOLite's capability and estimate its performance, we conducted experiments with a PDC using the radioisotope 241Am. In addition, we compared this result with the performance expected from Monte Carlo simulation with Geant4. As a result, we found that the actual PDC has the capability to detect a 100 mCrab source up to 80 keV.
Microbial source tracking in highly vulnerable karst drinking water resources.
Diston, D; Robbi, R; Baumgartner, A; Felleisen, R
2018-02-01
Water resources situated in areas with underlying karst geology are particularly vulnerable to fecal pollution. In such vulnerable systems, microbial source tracking (MST) methods are useful tools to elucidate the pathways of both animal and human fecal pollution, leading to more accurate water use risk assessments. Here, we describe the application of a MST toolbox using both culture-dependent bacteriophage and molecular-dependent 16S rRNA assays at spring and well sites in the karstic St Imier Valley, Switzerland. Culture-dependent and molecular-dependent marker performance varied significantly, with the 16S rRNA assays displaying greater sensitivity than their phage counterpart; HF183 was the best performing human wastewater-associated marker while Rum2Bac was the best performing ruminant marker. Differences were observed in pollution regimes between the well and spring sampling sites, with the spring water being more degraded than the well site. Our results inform the choice of marker selection for MST studies and highlight differences in microbial water quality between well and spring karst sites.
Methodologies for evaluating performance and assessing uncertainty of atmospheric dispersion models
NASA Astrophysics Data System (ADS)
Chang, Joseph C.
This thesis describes methodologies to evaluate the performance and to assess the uncertainty of atmospheric dispersion models, tools that predict the fate of gases and aerosols upon their release into the atmosphere. Because of the large economic and public-health impacts often associated with the use of dispersion model results, these models should be properly evaluated, and their uncertainty should be properly accounted for and understood. The CALPUFF, HPAC, and VLSTRACK dispersion modeling systems were applied to the Dipole Pride (DP26) field data (~20 km in scale) in order to demonstrate the evaluation and uncertainty assessment methodologies. Dispersion model performance was found to depend strongly on the wind models used to generate gridded wind fields from observed station data. This is because, even though the test site was flat, the observed surface wind fields still showed considerable spatial variability, partly because of the surrounding mountains. Total prediction error was separated into two components, model uncertainty and random (stochastic) variability; the two components were comparable for the DP26 field data, with variability more important than uncertainty closer to the source and less important farther away. Therefore, reducing errors in the input meteorology may not necessarily increase model accuracy, because of random turbulence. DP26 was a research-grade field experiment in which the source, meteorological, and concentration data were all well measured. Another typical application of dispersion modeling is a forensic study, where the data are usually quite scarce. An example is the modeling of the alleged releases of chemical warfare agents during the 1991 Persian Gulf War, where the source data had to rely on intelligence reports, and where Iraq had stopped reporting weather data to the World Meteorological Organization since the Iran-Iraq war in 1981.
Therefore, the meteorological fields inside Iraq had to be estimated by models such as prognostic mesoscale meteorological models, based on observational data from areas outside Iraq and using the global fields simulated by global meteorological models as the initial and boundary conditions for the mesoscale models. When model predictions were compared with observations in areas outside Iraq, the predicted surface wind directions had errors between 30 and 90 deg, whereas the inter-model differences (or uncertainties) in the predicted surface wind directions inside Iraq, where there were no onsite data, were fairly constant at about 70 deg. (Abstract shortened by UMI.)
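The abstract does not list its evaluation metrics here, but dispersion-model evaluation in this literature commonly uses fractional bias (FB), normalized mean square error (NMSE), and the fraction of predictions within a factor of two of observations (FAC2). A minimal sketch, with illustrative function and variable names:

```python
import numpy as np

def evaluation_metrics(obs, pred):
    """Common dispersion-model performance measures (sketch).

    FB   = 2*(mean(obs) - mean(pred)) / (mean(obs) + mean(pred))
    NMSE = mean((obs - pred)**2) / (mean(obs) * mean(pred))
    FAC2 = fraction of pairs with 0.5 <= pred/obs <= 2.0
    """
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    fb = 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())
    nmse = np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())
    ratio = pred / obs
    fac2 = np.mean((ratio >= 0.5) & (ratio <= 2.0))
    return fb, nmse, fac2

# A perfect model gives FB = 0, NMSE = 0, FAC2 = 1.
fb, nmse, fac2 = evaluation_metrics([1.0, 2.0, 4.0], [1.0, 2.0, 4.0])
```

FB captures systematic over- or under-prediction, NMSE captures scatter, and FAC2 is robust to outliers, which is why the three are usually reported together.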
SOURCE WATER ASSESSMENT USING GEOGRAPHIC INFORMATION SYSTEMS
The 1996 amendments to Section 1453 of the Safe Drinking Water Act require the states to establish and implement a Source Water Assessment Program (SWAP). Source water is the water taken from rivers, reservoirs, or wells for use as public drinking water. Source water assessment i...
Michels, Nele R M; Driessen, Erik W; Muijtjens, Arno M M; Van Gaal, Luc F; Bossaert, Leo L; De Winter, Benedicte Y
2009-12-01
A portfolio is used to mentor and assess students' clinical performance in the workplace. However, students and raters often perceive the portfolio as a time-consuming instrument. In this study, we investigated whether portfolio-based assessment during a medical internship can combine reliability with feasibility. The domain-oriented reliability of 61 double-rated portfolios was measured using a generalisability analysis, with portfolio tasks and raters as sources of variation in measuring a student's performance. We obtained a reliability (Phi coefficient) of 0.87 with this internship portfolio containing 15 double-rated tasks. The generalisability analysis showed that an acceptable level of reliability (Phi = 0.80) was maintained when the number of portfolio tasks was decreased to 13 with one rater, or to 9 with two raters. Our study shows that a portfolio can be a reliable method for the assessment of workplace learning. The possibility of reducing the number of tasks or raters while maintaining a sufficient level of reliability suggests an increase in the feasibility of portfolio use for both students and raters.
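The D-study projection behind these numbers can be sketched: the dependability coefficient Phi is the person variance divided by person variance plus absolute-error variance, and the error term shrinks as tasks and raters are added. The variance components below are hypothetical, chosen only to illustrate the mechanics:

```python
def phi_coefficient(v_p, v_t, v_r, v_res, n_tasks, n_raters):
    """D-study projection of the dependability (Phi) coefficient.

    v_p             : person (student) variance component
    v_t, v_r, v_res : task, rater, and residual variance components
    The absolute-error variance shrinks as tasks/raters are added.
    """
    abs_error = v_t / n_tasks + v_r / n_raters + v_res / (n_tasks * n_raters)
    return v_p / (v_p + abs_error)

# Illustrative (hypothetical) variance components, not the study's values:
phi = phi_coefficient(v_p=1.0, v_t=0.4, v_r=0.6, v_res=2.0,
                      n_tasks=15, n_raters=2)
```

Sweeping `n_tasks` and `n_raters` over a grid is exactly how one finds the smallest design that keeps Phi above a chosen threshold such as 0.80.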
A Home Ignition Assessment Model Applied to Structures in the Wildland-Urban Interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biswas, Kaushik; Werth, David; Gupta, Narendra
2013-01-01
The issue of exterior fire threat to buildings, from either wildfires in the wildland-urban interface or neighboring structure fires, is critically important. To address this, the Wildfire Ignition Resistant Home Design (WIRHD) program was initiated. The WIRHD program developed a tool, the WildFIRE Wizard, that allows homeowners to estimate the external fire threat to their homes based on specific features and characteristics of their homes and yards. The software then makes recommendations to reduce the threat. The inputs include the structural and material features of the home and information about any ignition sources or flammable objects in its immediate vicinity, known as the home ignition zone. The tool comprises an ignition assessment model that performs explicit calculations of the radiant and convective heating of the building envelope from the potential ignition sources. This article describes a series of material ignition and flammability tests that were performed to calibrate and/or validate the ignition assessment model. The tests involved exposing test walls with different exterior siding types to radiant heating and/or direct flame contact. The responses of the test walls were used to determine the conditions leading to melting, ignition, or any other mode of failure of the walls. Temperature data were used to verify the model predictions of temperature rise and ignition time for the test walls.
What Do Athletes Know on the Effect of Steroids? An Exploratory Study in Italy.
Vellante, Marcello; Moro, Maria Francesca; Sancassiani, Federica; Prost, Silvia; Machado, Sergio; Nardi, Antonio Egidio; Preti, Antonio; Mura, Gioia
2015-01-01
Despite the evidence of risks related to the use of anabolic steroids to improve athletic performance, the diffusion of such drugs appears to be increasing. An exploratory study was conducted in Cagliari, Italy, to assess the level of information on this issue, to estimate the use of steroids among athletes, and to measure the wellbeing of athletes and the risks related to steroid use. A sample of 192 athletes, including 142 non-agonists and 50 agonists (age range: 18 to 36), was invited to fill in a booklet including several self-report questionnaires. A questionnaire assessing beliefs about the effects of anabolic steroids was developed and validated for the study, while the Self Reporting Questionnaire was used for the assessment of mental health aspects. A general lack of information on the specific effects of steroid use on general and mental health, as well as on sports performance, was found. Athletes were also largely unaware of the diffusion of steroids among their peers. Since the sports environment seems to be the main source of information, this channel should be targeted by prevention and information campaigns. The use of more specific tools, and investigation of the perceived reliability of information sources as well as of social desirability issues, should be explored in future studies.
Ma, Chunming; Wang, Rui; Liu, Yue; Lu, Qiang; Liu, Xiaoli; Yin, Fuzai
2017-05-01
Neck circumference (NC) has been shown to be an accurate index for screening for overweight and obesity in children and adolescents. We performed a meta-analysis to assess the performance of NC for the assessment of overweight and obesity. Data sources were PubMed and EMBASE up to March 2016. Studies providing measures of the diagnostic performance of NC and using body mass index as the reference standard were included. Six eligible studies that evaluated 11,214 children and adolescents aged 6-18 years were included in the meta-analysis. NC showed a pooled sensitivity for detecting high body mass index of 0.780 (95% confidence interval [CI] = 0.765-0.794), a specificity of 0.746 (95% CI = 0.736-0.756), and a diagnostic odds ratio of 17.343 (95% CI = 8.743-34.405). NC had moderate diagnostic accuracy for identifying overweight and obesity in children and adolescents.
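For readers unfamiliar with the diagnostic odds ratio, it can be computed directly from sensitivity and specificity. Note that plugging the pooled sensitivity and specificity above into this formula gives a value that differs from the study's pooled DOR of 17.343, because meta-analytic pooling is performed across individual studies, not on the pooled proportions. A sketch:

```python
def diagnostic_odds_ratio(sens, spec):
    """DOR = (sens / (1 - sens)) / ((1 - spec) / spec): the ratio of the
    odds of a positive test in truly overweight/obese subjects to the
    odds of a positive test in normal-weight subjects."""
    return (sens / (1.0 - sens)) / ((1.0 - spec) / spec)

# Using the pooled point estimates reported above:
dor = diagnostic_odds_ratio(0.780, 0.746)
```

A DOR of 1 means the test is uninformative; values well above 1 indicate useful discrimination.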
40 CFR 434.55 - New source performance standards (NSPS).
Code of Federal Regulations, 2010 CFR
2010-07-01
Title 40, Protection of Environment; Performance Standards, Post-Mining Areas; § 434.55 New source performance standards (NSPS). The following new source performance standards shall apply to the post-mining areas of all new source coal mines: (a...
Validation of Regression-Based Myogenic Correction Techniques for Scalp and Source-Localized EEG
McMenamin, Brenton W.; Shackman, Alexander J.; Maxwell, Jeffrey S.; Greischar, Lawrence L.; Davidson, Richard J.
2008-01-01
EEG and EEG source estimation are susceptible to electromyographic (EMG) artifacts generated by the cranial muscles. EMG can mask genuine effects or masquerade as a legitimate effect, even in low frequencies such as alpha (8-13 Hz). Although regression-based correction has been used previously, only cursory attempts at validation exist, and its utility for source-localized data is unknown. To address this, EEG was recorded from 17 participants while neurogenic and myogenic activity were factorially varied. We assessed the sensitivity and specificity of four regression-based techniques: between-subjects, between-subjects using difference scores, within-subjects condition-wise, and within-subjects epoch-wise, on the scalp and in data modeled using the LORETA algorithm. Although the within-subjects epoch-wise approach showed superior performance on the scalp, no technique succeeded in source space. Aside from validating the novel epoch-wise methods on the scalp, we highlight methods requiring further development. PMID:19298626
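The core idea of regression-based correction, subtracting the component of the EEG signal predicted by an EMG reference, can be sketched as follows. This is a toy single-channel illustration on raw samples; the techniques assessed above operate on band power per epoch or condition rather than on raw time series:

```python
import numpy as np

def regress_out_emg(eeg, emg):
    """Remove the EMG-predicted component from an EEG channel by ordinary
    least squares: eeg_clean = eeg - b * emg, where
    b = cov(eeg, emg) / var(emg).  A sketch of the regression-correction
    idea, not any specific published pipeline."""
    emg = emg - emg.mean()
    eeg = eeg - eeg.mean()
    b = np.dot(eeg, emg) / np.dot(emg, emg)
    return eeg - b * emg

# Simulated data: a neural signal contaminated by scaled muscle activity.
rng = np.random.default_rng(0)
emg = rng.normal(size=1000)
neural = rng.normal(size=1000)
contaminated = neural + 0.8 * emg
cleaned = regress_out_emg(contaminated, emg)
```

After correction, the cleaned signal is uncorrelated with the EMG reference; the open question the abstract raises is whether the removed component was truly myogenic or partly neurogenic.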
Performance indicators used to assess the quality of primary dental care.
González, Grisel Zacca; Klazinga, Niek; ten Asbroek, Guus; Delnoij, Diana M
2006-12-01
An appropriate quality of medical care, including dental care, should be an objective of every government that aims to improve the oral health of its population. Our aims were to determine performance indicators that could be used to assess the quality of primary dental care at different levels of a health care system, the sources for data collection, and, finally, the dimensions of quality measured by these indicators. An explorative study of the international literature was conducted using medical databases, journals and books, and the official websites of organisations and associations. This resulted in a set of 57 indicators, which were classified into the following dimensions for each intended user group: for patients, health outcomes and subjective indicators; for professionals, their performance and the rates of success, failure and complications; and for health care system managers and policymakers, their resources, finances and health care utilisation. A set of 57 performance indicators was thus identified to assess the quality of primary dental care at the levels of patients, professionals and the health care system. These indicators could be used by managers and decision-makers at any level of the health care system, according to the characteristics of the services.
Analysis and methodology for aeronautical systems technology program planning
NASA Technical Reports Server (NTRS)
White, M. J.; Gershkoff, I.; Lamkin, S.
1983-01-01
A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
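The rank-ordering step described above can be sketched: concepts are sorted by benefit-to-cost ratio, and the cumulative benefit-to-cost ratio is computed in that implementation order. The concept names and numbers below are invented for illustration:

```python
def rank_concepts(concepts):
    """Rank system concepts by benefit-to-cost ratio (highest first) and
    compute the cumulative benefit-to-cost ratio in that implementation
    order, as the methodology's computer program does."""
    ranked = sorted(concepts, key=lambda c: c["benefit"] / c["cost"],
                    reverse=True)
    cum_benefit = cum_cost = 0.0
    for c in ranked:
        cum_benefit += c["benefit"]
        cum_cost += c["cost"]
        c["cumulative_ratio"] = cum_benefit / cum_cost
    return ranked

# Hypothetical concepts (names and values are illustrative only):
order = rank_concepts([
    {"name": "avionics upgrade", "benefit": 9.0, "cost": 3.0},
    {"name": "wing redesign",    "benefit": 8.0, "cost": 4.0},
    {"name": "engine retrofit",  "benefit": 5.0, "cost": 5.0},
])
```

The cumulative ratio declines as lower-payoff concepts are added, which is what lets planners decide where in the ranked list to stop funding.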
Quantitative assessment of workload and stressors in clinical radiation oncology.
Mazur, Lukasz M; Mosaly, Prithima R; Jackson, Marianne; Chang, Sha X; Burkhardt, Katharin Deschesne; Adams, Robert D; Jones, Ellen L; Hoyle, Lesley; Xu, Jing; Rockwell, John; Marks, Lawrence B
2012-08-01
Workload level and sources of stressors have been implicated as sources of error in multiple settings. We assessed workload levels and sources of stressors among radiation oncology professionals. Furthermore, we explored the potential association between workload and the frequency of radiotherapy incidents reported by the World Health Organization (WHO). Data collection was aimed at various tasks performed by 21 study participants from different radiation oncology professional subgroups (simulation therapists, radiation therapists, physicists, dosimetrists, and physicians). Workload was assessed using the National Aeronautics and Space Administration Task Load Index (NASA TLX). Sources of stressors were quantified using observational methods and segregated using a standard taxonomy. Comparisons between professional subgroups and tasks were made using analysis of variance (ANOVA), multivariate ANOVA, and Duncan's test. The association between workload levels (NASA TLX) and the frequency of radiotherapy incidents (WHO incidents) was explored using the Pearson correlation test. A total of 173 workload assessments were obtained. Overall, simulation therapists had relatively low workloads (NASA TLX range, 30-36), and physicists had relatively high workloads (NASA TLX range, 51-63). NASA TLX scores for physicians, radiation therapists, and dosimetrists ranged from 40 to 52. There was marked intertask/professional-subgroup variation (P<.0001). Mental demand (P<.001), physical demand (P=.001), and effort (P=.006) differed significantly among professional subgroups. Typically, there were 3-5 stressors per cycle of analyzed tasks, with the following distribution: interruptions (41.4%), time factors (17%), technical factors (13.6%), teamwork issues (11.6%), patient factors (9.0%), and environmental factors (7.4%). A positive association between workload and the frequency of radiotherapy incidents reported by the WHO was found (r = 0.87, P=.045).
Workload level and sources of stressors vary among professional subgroups. Understanding the factors that influence these findings can guide adjustments to workflow procedures, physical layout, and/or communication protocols to enhance safety. Additional evaluations are needed to better understand whether these findings are systemic. Copyright © 2012 Elsevier Inc. All rights reserved.
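The overall NASA TLX score referenced above is conventionally a weighted average of six subscale ratings (0-100), with pairwise-comparison weights that sum to 15. A sketch with hypothetical ratings and weights, not data from this study:

```python
def nasa_tlx(ratings, weights):
    """Overall NASA TLX workload: six subscale ratings (0-100) weighted
    by pairwise-comparison tallies summing to 15 (standard procedure)."""
    assert set(ratings) == set(weights) and sum(weights.values()) == 15
    return sum(ratings[k] * weights[k] for k in ratings) / 15.0

# Hypothetical single-task assessment (illustrative values only):
score = nasa_tlx(
    ratings={"mental": 70, "physical": 30, "temporal": 60,
             "performance": 40, "effort": 65, "frustration": 50},
    weights={"mental": 5, "physical": 1, "temporal": 3,
             "performance": 2, "effort": 3, "frustration": 1},
)
```

The weights reflect how often each dimension was judged the more important contributor in the 15 pairwise comparisons, so a high mental-demand weight amplifies that rating's contribution to the overall score.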
Data Driven Change Is Easy; Assessing and Maintaining It Is the Hard Part
ERIC Educational Resources Information Center
Perelman, Les
2009-01-01
At MIT in the 1990s, data from two sources (a study of the writing ability of a small group of randomly selected MIT juniors, correlated with their overall academic performance, and a survey of alumni from various years) provided the major motivation for the development, by MIT faculty and administration, of a very ambitious…
Amanzi: An Open-Source Multi-process Simulator for Environmental Applications
NASA Astrophysics Data System (ADS)
Moulton, J. D.; Molins, S.; Johnson, J. N.; Coon, E.; Lipnikov, K.; Day, M.; Barker, E.
2014-12-01
The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments begin with simplified models, and add geometric and geologic complexity as understanding is gained. The platform toolset (Akuna) generates these conceptual models, and Amanzi provides the computational engine to perform the simulations, returning the results for analysis and visualization. In this presentation we highlight key elements of the design, algorithms, and implementations used in Amanzi. In particular, the hierarchical and modular design is aligned with the coupled processes being simulated, and naturally supports a wide range of model complexity. This design leverages a dynamic data manager and the synergy of two graphs (one built from the high-level perspective of the models, the other from the dependencies of the variables in the model) to enable flexible model configuration at run time. Moreover, to model sites with complex hydrostratigraphy, as well as engineered systems, we are developing a dual unstructured/structured capability. Recently, these capabilities have been collected in a framework named Arcos, and efforts have begun to improve interoperability between the unstructured and structured AMR approaches in Amanzi. To leverage a range of biogeochemistry capabilities from the community (e.g., CrunchFlow, PFLOTRAN), a biogeochemistry interface library called Alquimia was developed. To ensure that Amanzi is truly an open-source community code, we require a completely open-source tool chain for our development. We will comment on elements of this tool chain, including testing and documentation tools such as docutils and Sphinx. Finally, we will show simulation results from our phased demonstrations, including the geochemically complex Savannah River F-Area seepage basins.
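The dependency-graph-driven evaluation described above can be sketched in miniature: each variable declares its dependencies, and a data manager resolves them recursively with caching, so model configuration can change at run time without rewiring code. The rules below are a toy model, not Amanzi's actual API:

```python
def evaluate(var, rules, cache=None):
    """Resolve a variable by recursively evaluating its dependencies,
    caching results so each variable is computed once.  A sketch of a
    dependency-graph data manager (hypothetical rules, not Amanzi's)."""
    cache = {} if cache is None else cache
    if var in cache:
        return cache[var]
    deps, fn = rules[var]                       # (dependencies, evaluator)
    vals = [evaluate(d, rules, cache) for d in deps]
    cache[var] = fn(*vals)
    return cache[var]

# Toy model: hydraulic head from pressure, density, and elevation.
rules = {
    "pressure":  ([], lambda: 2.0e5),           # Pa
    "density":   ([], lambda: 1000.0),          # kg/m^3
    "elevation": ([], lambda: 10.0),            # m
    "head": (["pressure", "density", "elevation"],
             lambda p, rho, z: p / (rho * 9.81) + z),
}
head = evaluate("head", rules)
```

Swapping in a different rule for any variable reconfigures the whole model, which is the flexibility the two-graph design is meant to provide.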
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yidong Xia; Mitch Plummer; Robert Podgorney
2016-02-01
The performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65 K per km depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that a desirable output electric power rate and lifespan can be obtained under suitable material properties and system parameters. A sensitivity analysis of some design constraints and operation parameters indicates that (1) the horizontal fracture spacing has a profound effect on the long-term performance of heat production, (2) the downward deviation angle of the parallel doublet wells may help overcome the difficulty of vertical drilling to reach a favorable production temperature, and (3) the thermal energy production rate and lifespan depend closely on the water mass flow rate. The results also indicate that heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are under reasonable conditions. To conduct the reservoir modeling and simulations, an open-source, finite-element-based, fully implicit, fully coupled hydrothermal code, FALCON, has been developed and used in this work. Compared with most other existing codes in this area, which are either closed-source or commercial, this new open-source code demonstrates a development strategy that aims to provide unparalleled ease of user customization and multiphysics coupling. Test results show that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.
Simulation Modelling in Healthcare: An Umbrella Review of Systematic Literature Reviews.
Salleh, Syed; Thokala, Praveen; Brennan, Alan; Hughes, Ruby; Booth, Andrew
2017-09-01
Numerous studies examine simulation modelling in healthcare. These studies present a bewildering array of simulation techniques and applications, making it challenging to characterise the literature. The aim of this paper is to provide an overview of the level of activity of simulation modelling in healthcare and the key themes. We performed an umbrella review of systematic literature reviews of simulation modelling in healthcare. Searches were conducted of academic databases (JSTOR, Scopus, PubMed, IEEE, SAGE, ACM, Wiley Online Library, ScienceDirect) and grey literature sources, enhanced by citation searches. The articles were included if they performed a systematic review of simulation modelling techniques in healthcare. After quality assessment of all included articles, data were extracted on numbers of studies included in each review, types of applications, techniques used for simulation modelling, data sources and simulation software. The search strategy yielded a total of 117 potential articles. Following sifting, 37 heterogeneous reviews were included. Most reviews achieved moderate quality rating on a modified AMSTAR (A Measurement Tool used to Assess systematic Reviews) checklist. All the review articles described the types of applications used for simulation modelling; 15 reviews described techniques used for simulation modelling; three reviews described data sources used for simulation modelling; and six reviews described software used for simulation modelling. The remaining reviews either did not report or did not provide enough detail for the data to be extracted. Simulation modelling techniques have been used for a wide range of applications in healthcare, with a variety of software tools and data sources. The number of reviews published in recent years suggest an increased interest in simulation modelling in healthcare.
Incorporating uncertainty in predictive species distribution modelling.
Beale, Colin M; Lennon, Jack J
2012-01-19
Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which is often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit and for assessing the significance of model covariates.
Monitoring and evaluation of human resources for health: an international perspective
Diallo, Khassoum; Zurn, Pascal; Gupta, Neeru; Dal Poz, Mario
2003-01-01
Background Despite the undoubted importance of human resources to the functions of health systems, there is little consistency between countries in how human resource strategies are monitored and evaluated. This paper presents an integrated approach for developing an evidence base on human resources for health (HRH) to support decision-making, drawing on a framework for health systems performance assessment. Methods Conceptual and methodological issues for selecting indicators for HRH monitoring and evaluation are discussed, and a range of primary and secondary data sources that might be used to generate indicators are reviewed. Descriptive analyses are conducted drawing primarily on one type of source, namely routinely reported data on the numbers of health personnel and medical schools as covered by national reporting systems and compiled by the World Health Organization. Regression techniques are used to triangulate a given HRH indicator calculated from different data sources across multiple countries. Results Major variations in the supply of health personnel and training opportunities are found to occur by region. However, certain discrepancies are also observed in measuring the same indicator from different sources, possibly related to the occupational classification or to the sources' representation. Conclusion Evidence-based information is needed to better understand trends in HRH. Although a range of sources exist that can potentially be used for HRH assessment, the information that can be derived from many of these individual sources precludes refined analysis. A variety of data sources and analytical approaches, each with its own strengths and limitations, is required to reflect the complexity of HRH issues. 
In order to enhance cross-national comparability, data collection efforts should be processed through the use of internationally standardized classifications (in particular, for occupation, industry and education) at the greatest level of detail possible. PMID:12904252
A time-domain fluorescence diffusion optical tomography system for breast tumor diagnosis
NASA Astrophysics Data System (ADS)
Zhang, Wei; Gao, Feng; Wu, LinHui; Ma, Wenjuan; Yang, Fang; Zhou, Zhongxing; Zhang, Limin; Zhao, Huijuan
2011-02-01
A prototype time-domain fluorescence diffusion optical tomography (FDOT) system using near-infrared light is presented. The system employs two pulsed light sources, 32 source fibers, and 32 detection channels, working separately to acquire the temporal distribution of the photon flux on the tissue surface. The light sources are low-power picosecond pulsed diode lasers at wavelengths of 780 nm and 830 nm, and a 1×32 fiber-optic switch sequentially directs the light to the object surface through the 32 source fibers. The light signals re-emitted from the object are collected by 32 detection fibers connected to four 8×1 fiber-optic switches and then routed to four time-resolved measuring channels, each of which consists of a collimator, a filter wheel, a photomultiplier tube (PMT) photon-counting head, and a time-correlated single photon counting (TCSPC) channel. The performance and efficacy of the designed multi-channel PMT-TCSPC system are assessed by reconstructing the fluorescence yield and lifetime images of a solid phantom.
Improved source inversion from joint measurements of translational and rotational ground motions
NASA Astrophysics Data System (ADS)
Donner, S.; Bernauer, M.; Reinwald, M.; Hadziioannou, C.; Igel, H.
2017-12-01
Waveform inversion for seismic point (moment tensor) and kinematic sources is a standard procedure. However, especially at local and regional distances, a lack of appropriate velocity models, sparse station networks, or a low signal-to-noise ratio combined with more complex waveforms hampers the successful retrieval of reliable source solutions. We assess the potential of rotational ground motion recordings to increase the resolution power and reduce the non-uniqueness of point and kinematic source solutions. Based on synthetic waveform data, we perform a Bayesian (i.e., probabilistic) inversion. We thus avoid the subjective selection of the most reliable solution according to the lowest misfit or some other constructed criterion, and we obtain unbiased measures of resolution and possible trade-offs. Testing different earthquake mechanisms and scenarios, we show that the resolution of the source solutions can be improved significantly; depth-dependent components in particular improve markedly. In addition to synthetic data from full station networks, we also tested sparse-network and single-station cases.
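The Bayesian inversion idea, sampling the posterior rather than selecting a single lowest-misfit model, can be illustrated with a minimal Metropolis sampler over one source parameter. The Gaussian posterior below is a toy stand-in for a real waveform-misfit likelihood:

```python
import math
import random

def metropolis(log_post, x0, step, n, seed=1):
    """Minimal Metropolis sampler: characterizes the full posterior
    (and its trade-offs) instead of keeping one best-misfit model."""
    random.seed(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n):
        xp = x + random.gauss(0.0, step)          # propose perturbed model
        lpp = log_post(xp)
        if math.log(random.random()) < lpp - lp:  # accept/reject step
            x, lp = xp, lpp
        samples.append(x)
    return samples

# Toy posterior over source depth (km), centered at 8 km (assumed values):
depth_samples = metropolis(lambda d: -0.5 * ((d - 8.0) / 1.5) ** 2,
                           x0=5.0, step=1.0, n=5000)
mean_depth = sum(depth_samples[1000:]) / len(depth_samples[1000:])
```

The spread of the retained samples is exactly the unbiased resolution measure the abstract refers to: a narrow posterior means a well-constrained parameter, a broad or multimodal one exposes trade-offs.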
Scintillation analysis of truncated Bessel beams via numerical turbulence propagation simulation.
Eyyuboğlu, Halil T; Voelz, David; Xiao, Xifeng
2013-11-20
Scintillation aspects of truncated Bessel beams propagated through atmospheric turbulence are investigated using a numerical wave-optics random phase screen simulation method. On-axis scintillation, aperture-averaged scintillation, scintillation relative to a classical Gaussian beam of equal source power, and scintillation per unit received power are evaluated. It is found that in almost all circumstances studied, the zeroth-order Bessel beam delivers the lowest scintillation. Low aperture-averaged scintillation levels are also observed for the fourth-order Bessel beam truncated by a narrower source window. When assessed relative to the scintillation of a Gaussian beam of equal source power, Bessel beams generally have less scintillation, particularly at small receiver aperture sizes and small beam orders. When the per-unit-received-power criterion is added to this relative performance measure, the advantage of Bessel beams mostly disappears, but zeroth- and first-order Bessel beams continue to offer some benefit for relatively smaller aperture sizes, larger source powers, larger source plane dimensions, and intermediate propagation lengths.
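The scintillation index used in such studies is the normalized intensity variance, sigma_I^2 = <I^2>/<I>^2 - 1, computed from intensity samples collected over the simulation realizations. A minimal sketch:

```python
import numpy as np

def scintillation_index(intensity):
    """sigma_I^2 = <I^2>/<I>^2 - 1: the normalized variance of received
    intensity.  Zero for a perfectly steady beam; larger values mean
    stronger turbulence-induced fading."""
    I = np.asarray(intensity, float)
    return np.mean(I ** 2) / np.mean(I) ** 2 - 1.0
```

For aperture-averaged scintillation, the same formula is applied to the power integrated over the receiver aperture in each realization, which is why larger apertures smooth out the fluctuations.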
Integration and Utilization of Nuclear Systems on the Moon and Mars
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houts, Michael G.; Schmidt, George R.; Bragg-Sitton, Shannon
2006-01-20
Over the past five decades numerous studies have identified nuclear energy as an enhancing or enabling technology for planetary surface exploration missions. This includes both radioisotope and fission sources for providing both heat and electricity. Nuclear energy sources were used to provide electricity on Apollo missions 12, 14, 15, 16, and 17, and on the Mars Viking landers. Very small nuclear energy sources were used to provide heat on the Mars Pathfinder, Spirit, and Opportunity rovers. Research has been performed at NASA MSFC to help assess potential issues associated with surface nuclear energy sources, and to generate data that could be useful to a future program. Research areas include System Integration, use of Regolith as Radiation Shielding, Waste Heat Rejection, Surface Environmental Effects on the Integrated System, Thermal Simulators, Surface System Integration / Interface / Interaction Testing, End-to-End Breadboard Development, Advanced Materials Development, Surface Energy Source Coolants, and Planetary Surface System Thermal Management and Control. This paper provides a status update on several of these research areas.
NASA Astrophysics Data System (ADS)
Zhang, Weihong; Zhao, Yongsheng; Hong, Mei; Guo, Xiaodong
2009-04-01
Groundwater pollution is usually complex and concealed, and its remediation is difficult, costly, time-consuming, and often ineffective. An early warning system for groundwater pollution is needed that detects groundwater quality problems and provides the information necessary to make sound decisions before massive groundwater quality degradation occurs. Groundwater pollution early warning was performed by comprehensively considering the current groundwater quality, the groundwater quality trend, and the groundwater pollution risk. The map of the basic quality of the groundwater was obtained by fuzzy comprehensive evaluation or BP neural network evaluation. Based on multi-annual groundwater monitoring datasets, future water quality was forecast using time-series analysis methods. The water quality trend was analyzed using Spearman's rank correlation coefficient. The relative risk map of groundwater pollution was estimated through a procedure that identifies, cell by cell, the values of three factors: inherent vulnerability, pollution source load risk, and contamination hazard. The DRASTIC method was used to assess the inherent vulnerability of the aquifer. Pollution source load risk was analyzed based on contamination potential and pollution degree. The load-risk assessment index, which accounts for the variety of pollution sources, the quantity of contaminants, the releasing potential of pollutants, and distance, was determined. The load risks of all sources were combined by GIS overlay technology. By integrating the early warning model with ComGIS technology, a regional groundwater pollution early-warning information system was developed and applied to groundwater early warning in Qiqihar. The system can be used to evaluate current water quality, to forecast water quality trends, and to analyze the spatial and temporal influence of natural processes and human activities on groundwater quality.
Keywords: groundwater pollution, early warning, aquifer vulnerability, pollution load, pollution risk, ComGIS
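The trend analysis mentioned above relies on Spearman's rank correlation coefficient between time and concentration. A self-contained sketch (the nitrate series below is hypothetical monitoring data, invented for illustration):

```python
def ranks(values):
    """1-based average ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1                     # extend over a run of tied values
        avg = (i + j) / 2.0 + 1.0      # average rank for the tied run
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical annual nitrate concentrations (mg/L) at one monitoring well.
years = list(range(2000, 2010))
nitrate = [2.1, 2.3, 2.2, 2.6, 2.9, 2.8, 3.3, 3.5, 3.4, 3.9]
rho = spearman(years, nitrate)
```

A rho near +1 flags a deteriorating (rising-concentration) trend for the early warning layer; near 0, no monotonic trend.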
Assessment of radio frequency exposures in schools, homes, and public places in Belgium.
Verloock, Leen; Joseph, Wout; Goeminne, Francis; Martens, Luc; Verlaek, Mart; Constandt, Kim
2014-12-01
Characterization of exposure from emerging radio frequency (RF) technologies in areas where children are present is important. Exposure to RF electromagnetic fields (EMF) was assessed in three "sensitive" microenvironments; namely, schools, homes, and public places located in urban environments and compared to exposure in offices. In situ assessment was conducted by performing spatial broadband and accurate narrowband measurements, providing 6-min averaged electric-field strengths. A distinction between internal (transmitters that are located indoors) and external (outdoor sources from broadcasting and telecommunication) sources was made. Ninety-four percent of the broadband measurements were below 1 V m(-1). The average and maximal total electric-field values in schools, homes, and public places were 0.2 and 3.2 V m(-1) (WiFi), 0.1 and 1.1 V m(-1) (telecommunication), and 0.6 and 2.4 V m(-1) (telecommunication), respectively, while for offices, average and maximal exposure were 0.9 and 3.3 V m(-1) (telecommunication), satisfying the ICNIRP reference levels. In the schools considered, the highest maximal and average field values were due to internal signals (WiFi). In the homes, public places, and offices considered, the highest maximal and average field values originated from telecommunication signals. Lowest exposures were obtained in homes. Internal sources contributed on average more indoors (31.2%) than outdoors (2.3%), while the average contributions of external sources (broadcast and telecommunication sources) were higher outdoors (97.7%) than at indoor positions (68.8%). FM, GSM, and UMTS dominate the total downlink exposure in the outdoor measurements. In indoor measurements, FM, GSM, and WiFi dominate the total exposure. The average contribution of the emerging technology LTE was only 0.6%.
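The total field and the per-source contribution percentages reported above are typically obtained by root-sum-square combination of the narrowband results, since contributions from independent sources add in power. A sketch with hypothetical field values (not the study's measurements):

```python
import math

# Hypothetical narrowband results (V/m) at one indoor measurement position.
fields = {"FM": 0.30, "GSM": 0.25, "UMTS": 0.10, "WiFi": 0.35, "LTE": 0.02}

# Total field: contributions add in power, i.e. root-sum-square of E-fields.
e_total = math.sqrt(sum(e * e for e in fields.values()))

# Power-based percentage contribution of each source to the total exposure.
contrib = {src: 100.0 * (e * e) / (e_total * e_total)
           for src, e in fields.items()}
```

With these invented numbers WiFi dominates, mirroring the kind of per-technology breakdown the abstract reports.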
Morris, Jamae Fontain; Murphy, Jennifer; Fagerli, Kirsten; Schneeberger, Chandra; Jaron, Peter; Moke, Fenny; Juma, Jane; Ochieng, J Ben; Omore, Richard; Roellig, Dawn; Xiao, Lihua; Priest, Jeffrey W; Narayanan, Jothikumar; Montgomery, Joel; Hill, Vince; Mintz, Eric; Ayers, Tracy L; O'Reilly, Ciara E
2018-04-02
Cryptosporidium is a leading cause of diarrhea among Kenyan infants. Ceramic water filters (CWFs) are used for household water treatment. We assessed the impact of CWFs on diarrhea, cryptosporidiosis prevention, and water quality in rural western Kenya. A randomized, controlled intervention trial was conducted in 240 households with infants 4-10 months old. Twenty-six weekly household surveys assessed infant diarrhea and health facility visits. Stool specimens from infants with diarrhea were examined for Cryptosporidium. Source water, filtered water, and filter retentate were tested for Cryptosporidium and/or microbial indicators. To estimate the effect of CWFs on health outcomes, logistic regression models using generalized estimating equations were performed; odds ratios (ORs) and 95% confidence intervals (CIs) are reported. Households reported using surface water (36%), public taps (29%), or rainwater (17%) as their primary drinking water sources, with no differences between treatment groups. Intervention households reported less diarrhea (7.6% versus 8.9%; OR: 0.86 [0.64-1.16]) and significantly fewer health facility visits for diarrhea (1.0% versus 1.9%; OR: 0.50 [0.30-0.83]). In total, 15% of intervention and 12% of control stools yielded Cryptosporidium (P = 0.26). Escherichia coli was detected in 93% of source water samples; 71% of filtered water samples met World Health Organization recommendations of < 1 E. coli/100 mL. Cryptosporidium was not detected in source water and was detected in just 2% of filter rinses following passage of large volumes of source water. Water quality was improved among CWF users; however, the short study duration and small sample size limited our ability to observe reductions in cryptosporidiosis.
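The trial's effect estimates come from logistic models with generalized estimating equations; a simpler illustration of how an odds ratio and its 95% CI relate to a 2x2 table is Woolf's log method. The cell counts below are invented for illustration and are not the trial's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI (Woolf/log method) for a 2x2 table:
                 outcome+   outcome-
    exposed         a          b
    unexposed       c          d
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: health-facility visits for diarrhea,
# intervention vs. control (illustrative only).
or_, lo, hi = odds_ratio_ci(12, 1188, 24, 1176)
```

An upper confidence limit below 1.0, as here, is what makes a protective effect "significant" in the sense used in the abstract.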
2018-01-01
This work presents the results of an international interlaboratory comparison on ex situ passive sampling in sediments. The main objectives were to map the state of the science in passively sampling sediments, identify sources of variability, provide recommendations and practical guidance for standardized passive sampling, and advance the use of passive sampling in regulatory decision making by increasing confidence in the use of the technique. The study was performed by a consortium of 11 laboratories and included experiments with 14 passive sampling formats on 3 sediments for 25 target chemicals (PAHs and PCBs). The resulting overall interlaboratory variability was large (a factor of ∼10), but standardization of methods halved this variability. The remaining variability was primarily due to factors not related to passive sampling itself, i.e., sediment heterogeneity and analytical chemistry. Excluding the latter source of variability, by performing all analyses in one laboratory, showed that passive sampling results can have a high precision and a very low intermethod variability.
Groeber, F; Schober, L; Schmid, F F; Traube, A; Kolbus-Hernandez, S; Daton, K; Hoffmann, S; Petersohn, D; Schäfer-Korting, M; Walles, H; Mewes, K R
2016-10-01
To replace the Draize skin irritation assay (OECD guideline 404), several test methods based on reconstructed human epidermis (RHE) have been developed and were adopted in OECD test guideline 439. However, all validated test methods in the guideline are linked to RHE provided by only three companies; thus, the availability of these test models depends on the commercial interest of the producers. To overcome this limitation, and thereby increase the accessibility of in vitro skin irritation testing, an open source reconstructed epidermis (OS-REp) was introduced. To demonstrate the capacity of the OS-REp in regulatory risk assessment, a catch-up validation study was performed. The participating laboratories used in-house generated OS-REp to assess the set of 20 reference substances according to the performance standards amending OECD test guideline 439. Testing was performed under blinded conditions. The within-laboratory reproducibility of 87% and the inter-laboratory reproducibility of 85% demonstrate a high reliability of irritancy testing using the OS-REp protocol. In addition, the prediction capacity, with an accuracy of 80%, was comparable to previously published RHE-based test protocols. Taken together, the results indicate that the OS-REp test method can be used as a standalone alternative skin irritation test replacing OECD test guideline 404. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
International Space Station Electric Power System Performance Code-SPACE
NASA Technical Reports Server (NTRS)
Hojnicki, Jeffrey; McKissock, David; Fincannon, James; Green, Robert; Kerslake, Thomas; Delleur, Ann; Follo, Jeffrey; Trudell, Jeffrey; Hoffman, David J.; Jannette, Anthony
2005-01-01
The System Power Analysis for Capability Evaluation (SPACE) software analyzes and predicts the minute-by-minute state of the International Space Station (ISS) electrical power system (EPS) for upcoming missions, as well as EPS power generation capacity as a function of ISS configuration and orbital conditions. In order to complete the Certification of Flight Readiness (CoFR) process, in which the mission is certified for flight, each ISS system must thoroughly assess every proposed mission to verify that the system will support the planned mission operations; SPACE is the sole tool used to conduct these assessments for power system capability. SPACE is an integrated power system model that incorporates a variety of modules tied together with integration routines and graphical output. The modules include orbit mechanics, solar array pointing/shadowing/thermal and electrical, battery performance, and power management and distribution performance. These modules are tightly integrated within a flexible architecture featuring data-file-driven configurations, source- or load-driven operation, and event scripting. SPACE also predicts the amount of power available for a given system configuration, spacecraft orientation, solar-array-pointing conditions, orbit, and the like. In the source-driven mode, the model must assure that energy balance is achieved, meaning that energy removed from the batteries must be restored (or balanced) each and every orbit. This entails an optimization scheme to ensure that energy balance is maintained without violating any other constraints.
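The energy-balance requirement described above (energy removed from the batteries during eclipse must be restored during the sunlit portion of each orbit) can be sketched as a per-orbit bookkeeping check. Everything below is an illustrative simplification with invented numbers, not SPACE's actual models:

```python
def orbit_energy_balance(load_w, gen_sunlit_w, sunlit_min, eclipse_min,
                         charge_eff=0.9):
    """Net battery energy (Wh) over one orbit. A non-negative result means
    energy balance holds: batteries end the orbit at least as charged as
    they began. charge_eff is an assumed round-trip charging efficiency."""
    surplus = (gen_sunlit_w - load_w) * sunlit_min / 60.0  # Wh into battery
    deficit = load_w * eclipse_min / 60.0                  # Wh drawn in eclipse
    return surplus * charge_eff - deficit

# Hypothetical low-Earth-orbit split: ~57 min sunlit, ~35 min eclipse.
net = orbit_energy_balance(load_w=10000, gen_sunlit_w=18000,
                           sunlit_min=57, eclipse_min=35)
```

In a source-driven analysis like the one described, a negative `net` would force the optimizer to shed load or repoint arrays until balance is restored.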
Schaffter, Thomas; Marbach, Daniel; Floreano, Dario
2011-08-15
Over the last decade, numerous methods have been developed for inference of regulatory networks from gene expression data. However, accurate and systematic evaluation of these methods is hampered by the difficulty of constructing adequate benchmarks and the lack of tools for a differentiated analysis of network predictions on such benchmarks. Here, we describe a novel and comprehensive method for in silico benchmark generation and performance profiling of network inference methods available to the community as an open-source software called GeneNetWeaver (GNW). In addition to the generation of detailed dynamical models of gene regulatory networks to be used as benchmarks, GNW provides a network motif analysis that reveals systematic prediction errors, thereby indicating potential ways of improving inference methods. The accuracy of network inference methods is evaluated using standard metrics such as precision-recall and receiver operating characteristic curves. We show how GNW can be used to assess the performance and identify the strengths and weaknesses of six inference methods. Furthermore, we used GNW to provide the international Dialogue for Reverse Engineering Assessments and Methods (DREAM) competition with three network inference challenges (DREAM3, DREAM4 and DREAM5). GNW is available at http://gnw.sourceforge.net along with its Java source code, user manual and supporting data. Supplementary data are available at Bioinformatics online. dario.floreano@epfl.ch.
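The standard metrics mentioned above (precision-recall and receiver operating characteristic curves) can be computed directly from scored network predictions. A minimal sketch with a hypothetical set of candidate regulatory edges (invented scores and labels, not GNW's implementation):

```python
def roc_auc(scores, labels):
    """ROC AUC via the rank-sum (Mann-Whitney) formulation: the fraction of
    (positive, negative) pairs ranked correctly, ties counting half."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def precision_recall_points(scores, labels):
    """(recall, precision) pairs, sweeping the threshold down the ranking."""
    n_pos = sum(labels)
    ranked = sorted(zip(scores, labels), reverse=True)
    points, tp, fp = [], 0, 0
    for _, l in ranked:
        tp += l
        fp += 1 - l
        points.append((tp / n_pos, tp / (tp + fp)))
    return points

# Hypothetical confidence scores for 8 candidate edges (1 = true edge).
scores = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0,   0,   1,   0]
auc = roc_auc(scores, labels)
pr = precision_recall_points(scores, labels)
```

Comparing such curves across inference methods on a common benchmark is exactly the kind of performance profiling the abstract describes.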
Clinical assessment of nutritional status and feeding programs in horses.
Becvarova, Iveta; Pleasant, R Scott; Thatcher, Craig D
2009-04-01
Veterinarians are a primary source of nutritional information and advice for horse owners. This article reviews methods for clinical assessment of nutritional status and feeding programs that can be applied to an individual horse or group of horses. Physical examination, including measurement of body weight and evaluation of body condition score, estimation of nutrient requirements and the nutrient content of the horse's diet, and evaluation of the feeding method are important components of the assessment. Ongoing clinical assessment of health and body condition will gauge the need for reassessment of the feeding plan. Obvious indications for prompt reevaluation of diet and feeding include changes in health status (eg, body condition), life stage or physiologic state (eg, pregnancy), or performance status.
Brzonkalik, Katrin; Herrling, Tanja; Syldatk, Christoph; Neumann, Anke
2011-05-27
The aim of this study was to determine the influence of different carbon and nitrogen sources on the production of the mycotoxins alternariol (AOH), alternariol monomethyl ether (AME) and tenuazonic acid (TA) by Alternaria alternata at 28°C using a semi-synthetic medium (modified Czapek-Dox broth) supplemented with nitrogen and carbon sources. Additionally, the effect of shaken and static cultivation on mycotoxin production was tested. Initial experiments showed a clear dependency between nitrogen depletion and mycotoxin production. To assess whether nitrogen limitation in general or the type of nitrogen source triggers the production, various nitrogen sources including several ammonium/nitrate salts and amino acids were tested. In static culture the production of AOH/AME can be enhanced greatly with phenylalanine, whereas some nitrogen sources seem to inhibit AOH/AME production completely. TA was not significantly affected by the choice of nitrogen source. In shaken culture the overall production of all mycotoxins was lower compared with static cultivation. Furthermore, tests with a wide variety of carbon sources, including monosaccharides, disaccharides, complex saccharides such as starch, as well as glycerol and acetate, were performed. In shaken culture AOH was produced when glucose, fructose, sucrose, acetate, or mixtures of glucose/sucrose and glucose/acetate were used as carbon sources; AME production was not detected. The use of sodium acetate resulted in the highest AOH production. In static culture AOH production was also stimulated by acetate, at a level comparable to shaken conditions, whereas production with the other carbon sources was lower. In static cultivation 9 of 14 tested carbon sources induced mycotoxin production, compared with 4 in shaken culture.
This is the first study which analyses the influence of carbon and nitrogen sources in a semi-synthetic medium and assesses the effects of culture conditions on mycotoxin production by A. alternata. Copyright © 2011 Elsevier B.V. All rights reserved.
Earthquake Source Inversion Blindtest: Initial Results and Further Developments
NASA Astrophysics Data System (ADS)
Mai, P.; Burjanek, J.; Delouis, B.; Festa, G.; Francois-Holden, C.; Monelli, D.; Uchide, T.; Zahradnik, J.
2007-12-01
Images of earthquake ruptures, obtained from modelling/inverting seismic and/or geodetic data, exhibit a high degree of spatial complexity. This earthquake source heterogeneity controls seismic radiation and is determined by the details of the dynamic rupture process. In turn, such rupture models are used for studying source dynamics and for ground-motion prediction. But how reliable and trustworthy are these earthquake source inversions? Rupture models for a given earthquake, obtained by different research teams, often display striking disparities (see http://www.seismo.ethz.ch/srcmod). However, well-resolved, robust, and hence reliable source-rupture models are integral to better understanding earthquake source physics and to improving seismic hazard assessment. It is therefore timely to conduct a large-scale validation exercise comparing the methods, parameterization, and data handling used in earthquake source inversions. We recently started a blind test in which several research groups derive a kinematic rupture model from synthetic seismograms calculated for an input model unknown to the source modelers. The first results, for an input rupture model with heterogeneous slip but constant rise time and rupture velocity, reveal large differences between the input and inverted models in some cases, while a few studies achieve high correlation between the input and inferred model. Here we report on the statistical assessment of the set of inverted rupture models to quantitatively investigate their degree of (dis-)similarity. We briefly discuss the different inversion approaches, their possible strengths and weaknesses, and the use of appropriate misfit criteria. Finally, we present new blind-test models with increasing source complexity and ambient noise added to the synthetics.
The goal is to attract a large group of source modelers to join this source-inversion blind test in order to conduct a large-scale validation exercise to rigorously assess the performance and reliability of current inversion methods and to discuss future developments.
Thunborg, Charlotta; von Heideken Wågert, Petra; Götell, Eva; Ivarsson, Ann-Britt; Söderlund, Anne
2015-02-10
Mobility problems and cognitive deficits related to transferring or moving persons suffering from dementia are associated with dependency. Physical assistance provided by staff is an important component of residents' maintenance of mobility in dementia care facilities. Unfortunately, hands-on assistance during transfers is also a source of confusion in persons with dementia, as well as a source of strain in the caregiver. The bidirectional effect of actions in a dementia care dyad involved in transfer is complicated to evaluate. This study aimed to develop an assessment scale for measuring actions related to transferring persons with dementia by dementia care dyads. The study was performed in four phases and guided by the framework of the biopsychosocial model and the approach presented by Social Cognitive Theory. These frameworks provided a starting point for understanding reciprocal effects in dyadic interaction. The four phases were: 1) a literature review identifying existing assessment scales; 2) analyses of video-recorded transfers of persons with dementia for further generation of items; 3) computing the item content validity index of the 93 proposed items by 15 experts; and 4) expert opinion on the response scale and feasibility testing of the new assessment scale by video observation of the transfer situations. The development process resulted in a 17-item scale with a seven-point response scale. The scale consists of two sections. One section is related to transfer-related actions (e.g., capability of communication, motor skills performance, and cognitive functioning) of the person with dementia. The other section addresses the caregivers' facilitative actions (e.g., preparedness of transfer aids, interactional skills, and means of communication and interaction). The literature review and video recordings provided ideas for the item pool. Expert opinion decreased the number of items through relevance ratings and qualitative feedback.
No further development of items was performed after feasibility testing of the scale. To enable assessment of transfer-related actions in dementia care dyads, our new scale shows potential for bridging the gap in this area. Results from this study could provide health care professionals working in dementia care facilities with a useful tool for assessing transfer-related actions.
Martín-Gamboa, Mario; Iribarren, Diego; Susmozas, Ana; Dufour, Javier
2016-08-01
A novel approach is developed to evaluate quantitatively the influence of operational inefficiency in biomass production on the life-cycle performance of hydrogen from biomass gasification. Vine-growers and process simulation are used as key sources of inventory data. The life cycle assessment of biohydrogen according to current agricultural practices for biomass production is performed, as well as that of target biohydrogen according to agricultural practices optimised through data envelopment analysis. Only 20% of the vineyards assessed operate efficiently, and the benchmarked reduction percentages of operational inputs range from 45% to 73% in the average vineyard. The fulfilment of operational benchmarks avoiding irregular agricultural practices is concluded to improve significantly the environmental profile of biohydrogen (e.g., impact reductions above 40% for eco-toxicity and global warming). Finally, it is shown that this type of bioenergy system can be an excellent replacement for conventional hydrogen in terms of global warming and non-renewable energy demand. Copyright © 2016 Elsevier Ltd. All rights reserved.
Teachers' ratings of the academic performance of children with epilepsy.
Katzenstein, Jennifer M; Fastenau, Philip S; Dunn, David W; Austin, Joan K
2007-05-01
The present study examined how knowledge of a child's seizure condition is related to teachers' assessment of the child's academic ability. Children with epilepsy were divided into two groups based on teachers' awareness of the children's seizure condition (Label). The children's achievement was assessed using the Woodcock Johnson Tests of Achievement-Revised (WJ-R), and the teacher's ratings were obtained from the Child Behavior Checklist Teacher Report Form (TRF) (Source). A 2 (Source) x 2 (Label) mixed-design analysis of covariance (controlling for IQ and how well the teacher knew the child) revealed a significant interaction, F(1,121)=4.22, P=0.04. For the WJ-R there was no effect of Label on Achievement, but on the TRF lower scores were observed for children who were labeled. These results support the hypothesis that some teachers might underestimate the academic abilities of children with epilepsy.
Lima, Manoel B; Feitosa, Elaine A; Emídio, Elissandro S; Dórea, Haroldo S; Alexandre, Marcelo R
2012-08-01
The assessment of aliphatic hydrocarbons was performed in the Sergipe River estuarine system, northeastern Brazil. Aliphatic hydrocarbon concentrations ranged from 9.9 µg g⁻¹ to 30.8 µg g⁻¹ of dry sediment. The carbon preference index (CPI, based on the nC₂₄ to nC₃₄ range) indicated a predominance of petrogenic input in two of the sites analyzed (P4 and P5). An unresolved complex mixture (UCM) was found to be present in seven of the nine sites sampled (all except P4 and P5). Overall, the results of this work suggest that there is a mix of organic matter sources to the sediment. Although the coast of Sergipe hosts intense offshore petroleum exploration and the Sergipe River crosses the entire city of Aracaju, the capital of Sergipe, no significant anthropogenic fingerprint was detected. Copyright © 2012 Elsevier Ltd. All rights reserved.
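The carbon preference index used above compares odd- to even-carbon n-alkane abundances over the nC₂₄-nC₃₄ range. A sketch using the simple odd/even formulation (published CPI variants differ in detail; the concentrations below are invented for illustration):

```python
# Hypothetical n-alkane concentrations (µg/g dry sediment) by carbon number.
alkanes = {24: 0.8, 25: 1.6, 26: 0.9, 27: 1.8, 28: 1.0, 29: 2.1,
           30: 0.9, 31: 1.9, 32: 0.8, 33: 1.5, 34: 0.7}

def cpi(conc, lo=24, hi=34):
    """Carbon preference index over nC_lo-nC_hi: sum of odd-carbon over sum
    of even-carbon n-alkanes. Values near 1 suggest petrogenic input; values
    well above 1 suggest biogenic (terrestrial plant wax) input."""
    odd = sum(c for n, c in conc.items() if lo <= n <= hi and n % 2 == 1)
    even = sum(c for n, c in conc.items() if lo <= n <= hi and n % 2 == 0)
    return odd / even

value = cpi(alkanes)
```

Here the invented profile has a mild odd-carbon preference (CPI ≈ 1.7), i.e. a mixed rather than purely petrogenic signature.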
Annoyance from industrial noise: indicators for a wide variety of industrial sources.
Alayrac, M; Marquis-Favre, C; Viollon, S; Morel, J; Le Nost, G
2010-09-01
One challenge in studying the noise generated by industrial sources is the variety of those sources and, consequently, the complexity of the noises produced. Characterizing the environmental impact of an industrial plant therefore requires a better understanding of the annoyance caused by industrial noise sources. To deal with this variety, the proposed approach is organized by type of spectral feature and based on a perceptive typology of steady, permanent industrial noises comprising six categories. For each perceptive category, listening tests based on acoustical factors are performed to assess noise annoyance. Various indicators are necessary to predict the annoyance due to different industrial noise sources. Depending on the spectral features of the industrial noise sources, noise annoyance indicators are thus assessed. In the case of industrial noise sources without main spectral features, such as broadband noise, annoyance is predicted by the A-weighted sound pressure level L(Aeq) or the loudness level L(N). For industrial noises with spectral components, such as low-frequency noises with a main component at 100 Hz or noises with components in the middle frequencies, indicators are proposed that allow good prediction of noise annoyance by taking spectral features into account.
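The L(Aeq) indicator mentioned above is an energy average of A-weighted levels over the measurement period, not an arithmetic mean. A sketch with hypothetical per-minute samples (invented values, not the study's data):

```python
import math

def leq(levels_db):
    """Equivalent continuous level of per-interval A-weighted levels:
    L_eq = 10 * log10(mean(10^(L_i / 10))), i.e. averaging in energy."""
    mean_energy = sum(10 ** (l / 10.0) for l in levels_db) / len(levels_db)
    return 10.0 * math.log10(mean_energy)

# Hypothetical per-minute L_A readings (dB) near a plant boundary.
samples = [55.0, 57.0, 54.0, 60.0, 56.0]
laeq = leq(samples)
```

Because the average is taken in energy, the louder intervals dominate: the result sits above the arithmetic mean of the samples but below the maximum.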
On Acoustic Source Specification for Rotor-Stator Interaction Noise Prediction
NASA Technical Reports Server (NTRS)
Nark, Douglas M.; Envia, Edmane; Burley, Casey L.
2010-01-01
This paper describes the use of measured source data to assess the effects of acoustic source specification on rotor-stator interaction noise predictions. Specifically, the acoustic propagation and radiation portions of a recently developed coupled computational approach are used to predict tonal rotor-stator interaction noise from a benchmark configuration. In addition to the use of full measured data, randomization of source mode relative phases is also considered for specification of the acoustic source within the computational approach. Comparisons with sideline noise measurements are performed to investigate the effects of various source descriptions on both inlet and exhaust predictions. The inclusion of additional modal source content is shown to have a much greater influence on the inlet results. Reasonable agreement between predicted and measured levels is achieved for the inlet, as well as the exhaust when shear layer effects are taken into account. For the number of trials considered, phase randomized predictions follow statistical distributions similar to those found in previous statistical source investigations. The shape of the predicted directivity pattern relative to measurements also improved with phase randomization, having predicted levels generally within one standard deviation of the measured levels.
Simulation of angular and energy distributions of the PTB beta secondary standard.
Faw, R E; Simons, G G; Gianakon, T A; Bayouth, J E
1990-09-01
Calculations and measurements have been performed to assess radiation doses delivered by the PTB Secondary Standard that employs 147Pm, 204Tl, and 90Sr:90Y sources in prescribed geometries, and features "beam-flattening" filters to assure uniformity of delivered doses within a 5-cm radius of the axis from source to detector plane. Three-dimensional, coupled, electron-photon Monte Carlo calculations, accounting for transmission through the source encapsulation and backscattering from the source mounting, led to energy spectra and angular distributions of electrons penetrating the source encapsulation that were used in the representation of pseudo sources of electrons for subsequent transport through the atmosphere, filters, and detectors. Calculations were supplemented by measurements made using bare LiF TLD chips on a thick polymethyl methacrylate phantom. Measurements using the 204Tl and 90Sr:90Y sources revealed that, even in the absence of the beam-flattening filters, delivered dose rates were very uniform radially. Dosimeter response functions (TLD:skin dose ratios) were calculated and confirmed experimentally for all three beta-particle sources and for bare LiF TLDs ranging in mass thickness from 10 to 235 mg cm⁻².
Beniczky, Sándor; Lantz, Göran; Rosenzweig, Ivana; Åkeson, Per; Pedersen, Birthe; Pinborg, Lars H; Ziebell, Morten; Jespersen, Bo; Fuglsang-Frederiksen, Anders
2013-10-01
Although precise identification of the seizure-onset zone is an essential element of presurgical evaluation, source localization of ictal electroencephalography (EEG) signals has received little attention. The aim of our study was to estimate the accuracy of source localization of rhythmic ictal EEG activity using a distributed source model. Source localization of rhythmic ictal scalp EEG activity was performed in 42 consecutive cases fulfilling inclusion criteria. The study was designed according to recommendations for studies on diagnostic accuracy (STARD). The initial ictal EEG signals were selected using a standardized method, based on frequency analysis and voltage distribution of the ictal activity. A distributed source model, the local autoregressive average (LAURA), was used for the source localization. Sensitivity, specificity, and measurement of agreement (kappa) were determined based on the reference standard: the consensus conclusion of the multidisciplinary epilepsy surgery team. Predictive values were calculated from the surgical outcome of the operated patients. To estimate the clinical value of the ictal source analysis, we compared the likelihood ratios of concordant and discordant results. Source localization was performed blinded to the clinical data, and before the surgical decision. The reference standard was available for 33 patients. The ictal source localization had a sensitivity of 70% and a specificity of 76%. The mean measurement of agreement (kappa) was 0.61, corresponding to substantial agreement (95% confidence interval (CI) 0.38-0.84). Twenty patients underwent resective surgery. The positive predictive value (PPV) for seizure freedom was 92% and the negative predictive value (NPV) was 43%. The likelihood ratio was nine times higher for the concordant results, as compared with the discordant ones.
Source localization of rhythmic ictal activity using a distributed source model (LAURA) for the ictal EEG signals selected with a standardized method is feasible in clinical practice and has a good diagnostic accuracy. Our findings encourage clinical neurophysiologists assessing ictal EEGs to include this method in their armamentarium. Wiley Periodicals, Inc. © 2013 International League Against Epilepsy.
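The accuracy figures above (sensitivity, specificity, kappa, likelihood ratios) all follow from a 2x2 table of localization results against the reference standard. A sketch with invented counts (not the study's actual table, which is not given in the abstract):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, Cohen's kappa, and likelihood ratios from
    a 2x2 table (test result vs. reference standard)."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    po = (tp + tn) / n                                   # observed agreement
    pe = ((tp + fp) * (tp + fn)                          # chance agreement
          + (fn + tn) * (fp + tn)) / (n * n)
    kappa = (po - pe) / (1 - pe)
    lr_pos = sens / (1 - spec)                           # LR of a positive test
    lr_neg = (1 - sens) / spec                           # LR of a negative test
    return sens, spec, kappa, lr_pos, lr_neg

# Hypothetical counts for 33 patients (illustrative only).
sens, spec, kappa, lr_pos, lr_neg = diagnostic_metrics(tp=14, fp=4, fn=6, tn=9)
```

The ratio lr_pos / lr_neg quantifies how much more informative a concordant result is than a discordant one, which is the comparison the abstract reports.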
Technology-Supported Performance Assessments for Middle School Geoscience
NASA Astrophysics Data System (ADS)
Zalles, D. R.; Quellmalz, E.; Rosenquist, A.; Kreikemeier, P.
2002-12-01
Under funding from the World Bank, the U.S. Department of Education, the National Science Foundation, and the Federal Government's Global Learning and Observations to Benefit the Environment Program (GLOBE), SRI International has developed and piloted web-accessible performance assessments that measure K-12 students' abilities to use learning technologies to reason with scientific information and communicate evidence-based conclusions about scientific problems. This presentation will describe the assessments that pertain to geoscience at the middle school level. They are the GLOBE Assessments and EPA Phoenix, an instantiation of SRI's model of assessment design known as Integrative Performance Assessments in Technology (IPAT). All are publicly available on the web. GLOBE engages students in scientific data collection and observation about the environment. SRI's classroom assessments for GLOBE provide sample student assessment tools and frameworks that allow teachers and students to assess how well students can use the data in scientific inquiry projects. Teachers can use classroom assessment tools on the site to develop integrated investigations for assessing GLOBE within their particular science curricula. Rubrics are provided for measuring students' GLOBE-related skills, and alignments are made to state, national, and international science standards. Sample investigations are provided about atmosphere, hydrology, landcover, soils, earth systems, and visualizations. The IPAT assessments present students with engaging problems rooted in science or social science content, plus sets of tasks and questions that require them to gather relevant information on the web, use reasoning strategies to analyze and interpret the information, use spreadsheets, word processors, and other productivity tools, and communicate evidence-based findings and recommendations. 
In the process of gathering information and drawing conclusions, students are assessed on how well they can operate the technology as well as reason with the information made available through its use. In EPA Phoenix, students are asked to examine different representations of air quality data on the EPA website, as well as national weather data, in order to judge whether Phoenix would be a good site for holding certain athletic events. The students are assessed on how well they can interpret the data, synthesize them, and develop and communicate their conclusions. With the exception of formulating Web searches, results from piloting indicated that students were better at operating technology and interpreting single data sources than at synthesizing data from multiple sources and communicating cohesive evidence-based conclusions. Under the aegis of NSF and the International Association for the Evaluation of Educational Achievement, SRI is developing more IPAT assessments in science for a comparative international research study about student achievement in information and communication technology. These assessments will add other technologies into the mix, such as dynamic modeling tools and geographic information systems.
Investigating gender violence in Jamaica.
Spiring, Fred
2014-01-01
As Jamaica moves through implementation of their National Policy on Gender Equality and develops harassment legislation, this article attempts to investigate current levels and trends of gender-based violence in Jamaica. All analyses make use of existing data and data formats in developing performance indicators that illustrate the current state of gender violence in Jamaica. The analyses provide a baseline for the future assessment and comparison with internationally accepted gender-based violence indicators. All source data has been included to facilitate comparisons and discussions regarding related levels and trends of violence as well as addressing performance indicator effectiveness.
Quantitative non-destructive assay of PuBe neutron sources
NASA Astrophysics Data System (ADS)
Lakosi, László; Bagi, János; Nguyen, Cong Tam
2006-02-01
PuBe neutron sources were assayed using a combination of high-resolution γ-spectrometry (HRGS) and a neutron correlation technique. In a previous publication [J. Bagi, C. Tam Nguyen, L. Lakosi, Nucl. Instr. and Meth. B 222 (2004) 242] a passive neutron well-counter was reported, with 3He tubes embedded in a polyamide (TERRAMID) moderator (lined inside with Cd) surrounding the sources to be measured. Gross and coincidence neutron counting was performed, and the Pu content of the sources was determined from isotope analysis and by adopting specific (α, n) reaction yields of the Pu isotopes and 241Am in Be, based on the supplier's information and literature data. The method was further developed and refined, and the evaluation algorithm was worked out more precisely. The contribution of secondary (correlated) neutrons to the total neutron output was derived from the coincidence (doubles) count rate and taken into account in assessing the Pu content. A new evaluation of the former results was performed. The assay was extended to other PuBe sources, and new results were added. In order to attain higher detection efficiency, a more efficient moderator was also applied, with and without Cd shielding around the assay chamber. Calibration appears possible using neutron measurements only (without γ-spectrometry), based on a correlation between the Pu amount and the coincidence-to-total ratio. It is expected that the method could be used for Pu accountancy and safeguards verification, as well as for the identification and assay of seized, found, or undocumented PuBe neutron sources.
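The proposed γ-free calibration rests on a correlation between the Pu amount and the coincidence-to-total count-rate ratio. A sketch of such a calibration, with entirely illustrative rates and masses (not the published measurements):

```python
import numpy as np

# Hypothetical calibration: Pu mass vs. coincidence-to-total count-rate ratio.
# All numbers below are illustrative, not the published data.
totals  = np.array([1.2e4, 2.5e4, 4.1e4, 6.0e4])  # gross neutron rates (1/s)
doubles = np.array([3.0e2, 8.1e2, 1.6e3, 2.7e3])  # coincidence (doubles) rates (1/s)
masses  = np.array([5.0, 12.0, 21.0, 32.0])       # known Pu masses (g)

ratio = doubles / totals                          # coincidence-to-total ratio
slope, intercept = np.polyfit(ratio, masses, 1)   # linear calibration curve

def pu_mass(total_rate, doubles_rate):
    """Estimate Pu mass from neutron counting alone, via the calibration line."""
    return slope * (doubles_rate / total_rate) + intercept
```

The point of the sketch is that once the calibration line is fixed, an unknown source needs only the two count rates, with no γ-spectrometric isotopic analysis.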
Javidi, Soroush; Mandic, Danilo P.; Took, Clive Cheong; Cichocki, Andrzej
2011-01-01
A new class of complex-domain blind source extraction algorithms suitable for the extraction of both circular and non-circular complex signals is proposed. This is achieved through sequential extraction based on the degree of kurtosis and in the presence of non-circular measurement noise. The existence and uniqueness analysis of the solution is followed by a study of fast-converging variants of the algorithm. The performance is first assessed through simulations on well-understood benchmark signals, followed by a case study on real-time artifact removal from EEG signals, verified using both qualitative and quantitative metrics. The results illustrate the power of the proposed approach in real-time blind extraction of general complex-valued sources. PMID:22319461
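The extraction principle, pulling out sources in order of their degree of kurtosis, can be illustrated in the simpler real-valued setting. The sketch below is not the paper's complex-domain algorithm, just a kurtosis-based fixed-point analogue on synthetic whitened mixtures:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
s1 = rng.laplace(size=n)          # non-Gaussian source (high kurtosis)
s2 = rng.normal(size=n)           # Gaussian background
X = np.array([[0.7, 0.3], [0.4, 0.6]]) @ np.vstack([s1, s2])

# Whiten the mixtures (zero mean, identity covariance)
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d**-0.5) @ E.T @ X

# Fixed-point iteration driving the extraction vector toward the direction of
# maximal |kurtosis| (real-valued analogue of the algorithm in the abstract)
w = rng.normal(size=2)
w /= np.linalg.norm(w)
for _ in range(50):
    y = w @ Z
    w = (Z * y**3).mean(axis=1) - 3 * w
    w /= np.linalg.norm(w)

y = w @ Z
corr = abs(np.corrcoef(y, s1)[0, 1])  # recovered signal vs. the true source
```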
NASA Astrophysics Data System (ADS)
Milej, Daniel; Janusek, Dariusz; Gerega, Anna; Wojtkiewicz, Stanislaw; Sawosz, Piotr; Treszczanowicz, Joanna; Weigl, Wojciech; Liebert, Adam
2015-10-01
The aim of the study was to determine optimal measurement conditions for assessment of brain perfusion with the use of an optical contrast agent and time-resolved diffuse reflectometry in the near-infrared wavelength range. The source-detector separation at which the distribution of times of flight (DTOF) of photons provided useful information on the inflow of the contrast agent to the intracerebral brain tissue compartments was determined. A series of Monte Carlo simulations was performed in which the inflow and washout of the dye in extra- and intracerebral tissue compartments were modeled and the DTOFs were obtained at different source-detector separations. Furthermore, tests on diffuse phantoms were carried out using a time-resolved setup allowing the measurement of DTOFs at 16 source-detector separations. Finally, the setup was applied in experiments carried out on the heads of adult volunteers during intravenous injection of indocyanine green. Analysis of statistical moments of the measured DTOFs showed that a source-detector separation of 6 cm is recommended for monitoring of inflow of optical contrast to the intracerebral brain tissue compartments with the use of continuous wave reflectometry, whereas a separation of 4 cm is enough when the higher-order moments of DTOFs are available.
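The statistical moments of a DTOF referred to above (total photon count, mean time of flight, variance) are straightforward to compute from a histogram. The DTOF below is a synthetic Gaussian, not measured data:

```python
import numpy as np

# Statistical moments of a distribution of times of flight (DTOF).
# The DTOF here is a synthetic histogram, not a measured one.
t = np.linspace(0, 5e-9, 256)                        # time bins (s)
dtof = np.exp(-(t - 1.5e-9)**2 / (2 * (0.3e-9)**2))  # synthetic photon counts

def dtof_moments(t, counts):
    n_tot = counts.sum()                         # 0th moment: total photon count
    m1 = (t * counts).sum() / n_tot              # 1st moment: mean time of flight
    var = ((t - m1)**2 * counts).sum() / n_tot   # 2nd central moment: variance
    return n_tot, m1, var

n_tot, m1, var = dtof_moments(t, dtof)
```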
Doré, Marie-Claire; Caza, Nicole; Gingras, Nathalie; Rouleau, Nancie
2007-11-01
Findings from the literature consistently revealed episodic memory deficits in adolescents with psychosis. However, the nature of the dysfunction remains unclear. Based on a cognitive neuropsychological approach, a theoretically driven paradigm was used to generate valid interpretations about the underlying memory processes impaired in these patients. A total of 16 inpatient adolescents with psychosis and 19 individually matched controls were assessed using an experimental task designed to measure memory for source and temporal context of studied words. Retrospective confidence judgements for source and temporal context responses were also assessed. On word recognition, patients had more difficulty than controls discriminating target words from neutral distractors. In addition, patients identified both source and temporal context features of recognised items less often than controls. Confidence judgements analyses revealed that the difference between the proportions of correct and incorrect responses made with high confidence was lower in patients than in controls. In addition, the proportion of high-confident responses that were errors was higher in patients compared to controls. These findings suggest impaired relational binding processes in adolescents with psychosis, resulting in a difficulty to create unified memory representations. Our findings on retrospective confidence data point to impaired monitoring of retrieved information that may also impair memory performance in these individuals.
Montoya, A; Llopis, N; Gilaberte, I
2011-12-01
DISCERN is an instrument designed to help patients assess the reliability of written information on treatment choices. Originally created in English, it has had no validated Spanish version. This study seeks to validate the Spanish translation of the DISCERN instrument used as a primary measure in a multicenter study aimed at assessing the reliability of web-based information on treatment choices for attention deficit/hyperactivity disorder (ADHD). We used a modified version of a method for validating translated instruments in which the original source-language version is formally compared with the back-translated source-language version. Each item was ranked in terms of comparability of language, similarity of interpretability, and degree of understandability. Responses used Likert scales ranging from 1 to 7, where 1 indicates the best interpretability, language, and understandability, and 7 indicates the worst. Assessments were performed by 20 raters fluent in the source language. The Spanish translation of DISCERN, based on ratings of comparability, interpretability, and degree of understandability (mean score (SD): 1.8 (1.1), 1.4 (0.9), and 1.6 (1.1), respectively), was considered extremely comparable. All items received a score of less than three, so no further revision of the translation was needed. The validation process showed that the quality of the DISCERN translation was high, confirming that the translated tool is comparable to the original for assessing written information on treatment choices for ADHD.
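The rating procedure amounts to item-wise mean (SD) scores with a revision threshold of 3. A small sketch with illustrative ratings (not the study's data):

```python
import statistics

# Each item receives 20 Likert ratings (1 = best, 7 = worst). Items with a
# mean of 3 or more would be flagged for revision. Ratings are illustrative.
ratings = {
    "item_1": [1, 2, 1, 3, 2, 1, 2, 1, 2, 2, 1, 1, 2, 3, 1, 2, 2, 1, 2, 2],
    "item_2": [2, 1, 1, 2, 1, 2, 3, 1, 1, 2, 2, 1, 1, 2, 1, 1, 2, 2, 1, 1],
}
summary = {item: (statistics.mean(r), statistics.stdev(r))
           for item, r in ratings.items()}
needs_revision = [item for item, (mean, _) in summary.items() if mean >= 3]
```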
Assessment of COPD-related outcomes via a national electronic medical record database.
Asche, Carl; Said, Quayyim; Joish, Vijay; Hall, Charles Oaxaca; Brixner, Diana
2008-01-01
The technology and sophistication of healthcare utilization databases have expanded over the last decade to include results of lab tests, vital signs, and other clinical information. This review provides an assessment of the methodological and analytical challenges of conducting chronic obstructive pulmonary disease (COPD) outcomes research in a national electronic medical records (EMR) dataset and its potential application to the assessment of national health policy issues, along with a description of the challenges and limitations. An EMR database and its application to measuring outcomes for COPD are described. The ability to measure adherence to the NIH evidence-based COPD practice guidelines and HEDIS quality indicators in this database was examined. Case studies, before and after publication of the guidelines, were used to assess adherence to the guidelines and gauge conformity to the quality indicators. The EMR was the only source of information for pulmonary function tests, but the low frequency of ordering by primary care was an issue. The EMR data can be used to explore the impact of variation in healthcare provision on clinical outcomes. The EMR database permits access to specific lab data and biometric information. The richness and depth of information on "real world" use of health services for large population-based analytical studies at relatively low cost render such databases an attractive resource for outcomes research. Various sources of information exist to perform outcomes research. It is important to understand the desired endpoints of such research and choose the appropriate database source.
Solving the patient zero inverse problem by using generalized simulated annealing
NASA Astrophysics Data System (ADS)
Menin, Olavo H.; Bauch, Chris T.
2018-01-01
Identifying patient zero - the initially infected source of a given outbreak - is an important step in epidemiological investigations of both existing and emerging infectious diseases. Here, the use of the Generalized Simulated Annealing (GSA) algorithm to solve the inverse problem of finding the source of an outbreak is studied. The classical disease natural histories susceptible-infected (SI), susceptible-infected-susceptible (SIS), susceptible-infected-recovered (SIR), and susceptible-infected-recovered-susceptible (SIRS) on a regular lattice are addressed. Both the position of patient zero and its time of infection are considered unknown. The algorithm's performance with respect to the generalization parameter qv and the fraction ρ of infected nodes for whom infection was ascertained is assessed. Numerical experiments show the algorithm is able to retrieve the epidemic source with good accuracy, even when ρ is small, but provide no evidence that GSA performs better than its classical version. Our results suggest that simulated annealing could be a helpful tool for identifying patient zero in an outbreak where not all cases can be ascertained.
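A hedged sketch of the idea: simulate an SI outbreak on a lattice, then search for the source by annealing on the mismatch between simulated and observed infection patterns. This uses classical simulated annealing rather than the paper's generalized (Tsallis-statistics) variant, with a toy fixed-seed epidemic so candidate simulations are comparable:

```python
import math
import random

random.seed(1)
L = 15  # lattice side length

def neighbors(i, j):
    return [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= i + di < L and 0 <= j + dj < L]

def simulate_si(source, steps, beta=0.5):
    """Stochastic SI epidemic on the lattice (fixed seed for comparability)."""
    rng = random.Random(42)
    infected = {source}
    for _ in range(steps):
        new = set()
        for cell in infected:
            for nb in neighbors(*cell):
                if nb not in infected and rng.random() < beta:
                    new.add(nb)
        infected |= new
    return infected

true_source, steps = (7, 7), 6
observed = simulate_si(true_source, steps)

def cost(src):
    # Symmetric-difference mismatch between observation and candidate simulation
    return len(simulate_si(src, steps) ^ observed)

# Classical simulated annealing over candidate source positions
best = cur = (0, 0)
best_c = cur_c = cost(cur)
T = 5.0
for _ in range(400):
    cand = (random.randrange(L), random.randrange(L))
    c = cost(cand)
    if c < cur_c or random.random() < math.exp(-(c - cur_c) / max(T, 1e-9)):
        cur, cur_c = cand, c
        if c < best_c:
            best, best_c = cand, c
    T *= 0.985  # geometric cooling schedule
```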
Talker identification across source mechanisms: experiments with laryngeal and electrolarynx speech.
Perrachione, Tyler K; Stepp, Cara E; Hillman, Robert E; Wong, Patrick C M
2014-10-01
The purpose of this study was to determine listeners' ability to learn talker identity from speech produced with an electrolarynx, explore source and filter differentiation in talker identification, and describe acoustic-phonetic changes associated with electrolarynx use. Healthy adult control listeners learned to identify talkers from speech recordings produced using talkers' normal laryngeal vocal source or an electrolarynx. Listeners' abilities to identify talkers from the trained vocal source (Experiment 1) and generalize this knowledge to the untrained source (Experiment 2) were assessed. Acoustic-phonetic measurements of spectral differences between source mechanisms were performed. Additional listeners attempted to match recordings from different source mechanisms to a single talker (Experiment 3). Listeners successfully learned talker identity from electrolarynx speech but less accurately than from laryngeal speech. Listeners were unable to generalize talker identity to the untrained source mechanism. Electrolarynx use resulted in vowels with higher F1 frequencies compared with laryngeal speech. Listeners matched recordings from different sources to a single talker better than chance. Electrolarynx speech, although lacking individual differences in voice quality, nevertheless conveys sufficient indexical information related to the vocal filter and articulation for listeners to identify individual talkers. Psychologically, perception of talker identity arises from a "gestalt" of the vocal source and filter.
Sreeramareddy, Chandrashekhar T; Shankar, Pathiyil R; Binu, VS; Mukhopadhyay, Chiranjoy; Ray, Biswabina; Menezes, Ritesh G
2007-01-01
Background In recent years there has been a growing appreciation of the issues of quality of life and the stresses involved in medical training, as these may affect students' learning and academic performance. However, such studies are lacking in medical schools of Nepal. Therefore, we carried out this study to assess the prevalence of psychological morbidity, the sources and severity of stress, and the coping strategies among medical students in our integrated problem-stimulated undergraduate medical curriculum. Methods A cross-sectional, questionnaire-based survey was carried out among the undergraduate medical students of Manipal College of Medical Sciences, Pokhara, Nepal during the time period August 2005 to December 2006. Psychological morbidity was assessed using the General Health Questionnaire. A 24-item questionnaire was used to assess sources of stress and their severity. The coping strategies adopted were assessed using the brief COPE inventory. Results The overall response rate was 75.8% (407 out of 525 students). The overall prevalence of psychological morbidity was 20.9% and was higher among students of basic sciences, of Indian nationality, and whose parents were medical doctors. By logistic regression analysis, GHQ-caseness was associated with the occurrence of academic and health-related stressors. The most common sources of stress were related to academic and psychosocial concerns. The most important and severe sources of stress were staying in the hostel, high parental expectations, the vastness of the syllabus, tests/exams, and lack of time and facilities for entertainment. The students generally used active coping strategies, and alcohol/drug use was the least used coping strategy. The coping strategies commonly used by students in our institution were positive reframing, planning, acceptance, active coping, self-distraction, and emotional support. The coping strategies showed variation by GHQ-caseness, year of study, gender, and parents' occupation. 
Conclusion The higher level of psychological morbidity warrants interventions such as social and psychological support to improve the quality of life for these medical students. Student advisors and counselors may train students in stress management. There is also a need for academic changes in the quality of teaching and the evaluation system. A prospective study is necessary to study the association of psychological morbidity with demographic variables, sources of stress, and coping strategies. PMID:17678553
NASA Astrophysics Data System (ADS)
Efthimiou, George C.; Kovalets, Ivan V.; Venetsanos, Alexandros; Andronopoulos, Spyros; Argyropoulos, Christos D.; Kakosimos, Konstantinos
2017-12-01
An improved inverse modelling method to estimate the location and the emission rate of an unknown stationary point source of passive atmospheric pollutant in a complex urban geometry is incorporated in the Computational Fluid Dynamics code ADREA-HF and presented in this paper. The key improvement in relation to the previous version of the method lies in a two-step segregated approach. At first, only the source coordinates are analysed using a correlation function of measured and calculated concentrations. In the second step, the source rate is identified by minimizing a quadratic cost function. The validation of the new algorithm is performed by simulating the MUST wind tunnel experiment. A grid-independent flow field solution is first attained by applying successive refinements of the computational mesh, and the final wind flow is validated against the measurements quantitatively and qualitatively. The old and new versions of the source term estimation method are tested on a coarse and a fine mesh. The new method appeared to be more robust, giving satisfactory estimations of source location and emission rate on both grids. The performance of the old version of the method varied between failure and success and appeared to be sensitive to the selection of the model error magnitude that needs to be inserted in its quadratic cost function. The performance of the method depends also on the number and the placement of the sensors constituting the measurement network. Of significant interest for the practical application of the method in urban settings is the number of concentration sensors required to obtain a "satisfactory" determination of the source. The probability of obtaining a satisfactory solution, according to specified criteria, by the new method has been assessed as a function of the number of sensors that constitute the measurement network.
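The two-step logic, correlation for the location and a quadratic cost for the emission rate, can be sketched with a toy dispersion model (a 1/r²-type decay standing in for the CFD concentration fields; nothing below reflects ADREA-HF itself, and the example is noise-free for clarity):

```python
import numpy as np

rng = np.random.default_rng(3)
sensors = rng.uniform(0, 100, size=(12, 2))        # sensor positions
candidates = [(x, y) for x in range(0, 101, 10) for y in range(0, 101, 10)]

def unit_field(src):
    """Toy dispersion model: concentration per unit emission rate."""
    d2 = ((sensors - np.asarray(src, dtype=float))**2).sum(axis=1)
    return 1.0 / (d2 + 1.0)

true_src, true_rate = (40, 70), 250.0
measured = true_rate * unit_field(true_src)        # noise-free sensor readings

# Step 1: locate the source via the correlation coefficient between measured
# and computed concentrations; correlation is scale-invariant, so the unknown
# emission rate drops out of this step.
best = max(candidates, key=lambda s: np.corrcoef(measured, unit_field(s))[0, 1])

# Step 2: emission rate minimizing the quadratic cost ||measured - q*c||^2
c = unit_field(best)
rate = (c @ measured) / (c @ c)
```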
Design Considerations of a Virtual Laboratory for Advanced X-ray Sources
NASA Astrophysics Data System (ADS)
Luginsland, J. W.; Frese, M. H.; Frese, S. D.; Watrous, J. J.; Heileman, G. L.
2004-11-01
The field of scientific computation has greatly advanced in the last few years, resulting in the ability to perform complex computer simulations that can predict the performance of real-world experiments in a number of fields of study. Among the forces driving this new computational capability is the advent of parallel algorithms, allowing calculations in three-dimensional space with realistic time scales. Electromagnetic radiation sources driven by high-voltage, high-current electron beams offer an area to further push the state of the art in high-fidelity, first-principles simulation tools. The physics of these x-ray sources combines kinetic plasma physics (electron beams) with dense fluid-like plasma physics (anode plasmas) and x-ray generation (bremsstrahlung). There are a number of mature techniques and software packages for dealing with the individual aspects of these sources, such as Particle-In-Cell (PIC), Magneto-Hydrodynamics (MHD), and radiation transport codes. The current effort is focused on developing an object-oriented software environment using the Rational Unified Process and the Unified Modeling Language (UML) to provide a framework where multiple 3D parallel physics packages, such as a PIC code (ICEPIC), an MHD code (MACH), and an x-ray transport code (ITS), can co-exist in a system-of-systems approach to modeling advanced x-ray sources. Initial software design and assessments of the various physics algorithms' fidelity will be presented.
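One way to read the system-of-systems design is as a common contract that each physics package implements so the framework can advance them in lockstep and exchange state between them. The interface below is purely hypothetical, not the actual ICEPIC/MACH/ITS APIs:

```python
from abc import ABC, abstractmethod

class PhysicsPackage(ABC):
    """Hypothetical common contract for coupled physics packages."""
    @abstractmethod
    def advance(self, dt: float) -> None: ...
    @abstractmethod
    def exchange(self, state: dict) -> dict: ...

class PICWrapper(PhysicsPackage):
    """Illustrative stand-in for a kinetic (PIC) package."""
    def __init__(self):
        self.time = 0.0
    def advance(self, dt):
        self.time += dt                 # placeholder for the particle push
    def exchange(self, state):
        state["beam_current"] = 1.0     # hand beam quantities to the MHD side
        return state

def run(packages, dt, steps):
    """Advance all packages in lockstep, threading shared state through each."""
    state = {}
    for _ in range(steps):
        for p in packages:
            p.advance(dt)
            state = p.exchange(state)
    return state

pic = PICWrapper()
final_state = run([pic], 1e-9, 10)
```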
A Time and Place for Everything: Developmental Differences in the Building Blocks of Episodic Memory
Lee, Joshua K.; Wendelken, J. Carter; Bunge, Silvia A.; Ghetti, Simona
2015-01-01
This research investigated whether episodic memory development can be explained by improvements in relational binding processes, involved in forming novel associations between events and the context in which they occurred. Memory for item-space, item-time, and item-item relations was assessed in an ethnically diverse sample of 151 children aged 7 to 11 years and 28 young adults. Item-space memory reached adult performance by 9½ years, whereas item-time and item-item memory improved into adulthood. In path analysis, item-space memory, but not item-time memory, best explained item-item memory. Across age groups, relational binding related to source memory and performance on standardized memory assessments. In conclusion, relational binding development depends on relation type, but relational binding overall supports episodic memory development. PMID:26493950
Comparisons of Aquarius Measurements over Oceans with Radiative Transfer Models at L-Band
NASA Technical Reports Server (NTRS)
Dinnat, E.; LeVine, D.; Abraham, S.; DeMattheis, P.; Utku, C.
2012-01-01
The Aquarius/SAC-D spacecraft includes three L-band (1.4 GHz) radiometers dedicated to measuring sea surface salinity. It was launched in June 2011 by NASA and CONAE (Argentine space agency). We report detailed comparisons of Aquarius measurements with radiative transfer model predictions. These comparisons are used as part of the initial assessment of Aquarius data and to estimate the radiometer calibration bias and stability. Comparisons are also being performed to assess the performance of models used in the retrieval algorithm for correcting the effect of various sources of geophysical "noise" (e.g. Faraday rotation, surface roughness). Such corrections are critical in bringing the error in retrieved salinity down to the required 0.2 practical salinity unit on monthly global maps at 150 km by 150 km resolution.
Feuerhahn, Nicolas; Stamov-Roßnagel, Christian; Wolfram, Maren; Bellingrath, Silja; Kudielka, Brigitte M
2013-10-01
We investigate how emotional exhaustion (EE), the core component of burnout, relates to cognitive performance, job performance and health. Cognitive performance was assessed by self-rated cognitive stress symptoms, self-rated and peer-rated cognitive impairments in everyday tasks and a neuropsychological test of learning and memory (LGT-3); job performance and physical health were gauged by self-reports. Cross-sectional linear regression analyses in a sample of 100 teachers confirm that EE is negatively related to cognitive performance as assessed by self-rating and peer-rating as well as neuropsychological testing (all p < .05). Longitudinal linear regression analyses confirm similar trends (p < .10) for self-rated and peer-rated cognitive performance. Executive control deficits might explain impaired cognitive performance in EE. In longitudinal analyses, EE also significantly predicts physical health. Contrary to our expectations, EE does not affect job performance. When reversed causation is tested, none of the outcome variables at Time 1 predict EE at Time 2. This speaks against cognitive dysfunctioning serving as a vulnerability factor for exhaustion. In sum, results underpin the negative consequences of EE for cognitive performance and health, which are relevant for individuals and organizations alike. In this way, findings might contribute to the understanding of the burnout syndrome. Copyright © 2012 John Wiley & Sons, Ltd.
Probabilistic forecasts of debris-flow hazard at the regional scale with a combination of models.
NASA Astrophysics Data System (ADS)
Malet, Jean-Philippe; Remaître, Alexandre
2015-04-01
Debris flows are one of the many active slope-forming processes in the French Alps, where rugged and steep slopes mantled by various slope deposits offer a great potential for triggering hazardous events. A quantitative assessment of debris-flow hazard requires the estimation, in a probabilistic framework, of the spatial probability of occurrence of source areas, the spatial probability of runout areas, the temporal frequency of events, and their intensity. The main objective of this research is to propose a pipeline for the estimation of these quantities at the regional scale using a chain of debris-flow models. The work uses the experimental site of the Barcelonnette Basin (South French Alps), where 26 active torrents have produced more than 150 debris-flow events since 1850, to develop and validate the methodology. First, a susceptibility assessment is performed to identify the debris-flow-prone source areas. The most frequently used approach is the combination of environmental factors with GIS procedures and statistical techniques, with or without integrating detailed event inventories. Based on a 5m-DEM and derivatives, and information on slope lithology, engineering soils, and landcover, the possible source areas are identified with a statistical logistic regression model. The performance of the statistical model is evaluated with the observed distribution of debris-flow events recorded after 1850 in the study area. The source areas in the three most active torrents (Riou-Bourdoux, Faucon, Sanières) are well identified by the model. Results are less convincing for three other active torrents (Bourget, La Valette and Riou-Chanal); this could be related to the type of debris-flow triggering mechanism, as the model seems to better identify open-slope debris-flow source areas (e.g., scree slopes) but appears to be less efficient at identifying landslide-induced debris flows. 
Second, a susceptibility assessment is performed to estimate the possible runout distance with a process-based model. The MassMov-2D code is a two-dimensional model of mud and debris flow dynamics over complex topography, based on a numerical integration of the depth-averaged motion equations using the shallow-water approximation. The run-out simulations are performed for the most active torrents. The performance of the model has been evaluated by comparing modelling results with the observed spreading areas of several recent debris flows. Existing data on the debris flow volume, input discharge and deposits were used to back-analyze those events and estimate the values of the model parameters. Third, hazard is estimated on the basis of scenarios computed in a probabilistic way, for volumes in the range 20,000 to 350,000 m³, and for several combinations of rheological parameters. In most cases, the simulations indicate that the debris flows cause significant overflowing on the alluvial fans for volumes exceeding 100,000 m³ (height of deposits > 2 m, velocities > 5 m/s). Probabilities of debris-flow runout and debris-flow intensities are then computed for each terrain unit.
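The statistical step above (logistic regression mapping environmental predictors to source-area probability) can be sketched as follows. The predictors, coefficients, and data below are invented for illustration; they are not the authors' DEM derivatives or inventory.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
slope = rng.uniform(0, 45, n)                 # terrain slope, degrees (invented)
scree = rng.integers(0, 2, n).astype(float)   # 1 if scree-slope deposit (invented)
# Synthetic "observed source area" labels: steep scree slopes more likely.
p_true = 1 / (1 + np.exp(-(0.15 * slope + 1.5 * scree - 6)))
y = (rng.random(n) < p_true).astype(float)

# Standardize the continuous predictor, then fit logistic regression by
# gradient ascent on the log-likelihood (no external dependencies).
slope_z = (slope - slope.mean()) / slope.std()
X = np.column_stack([np.ones(n), slope_z, scree])
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w += 0.5 * X.T @ (y - p) / n

susceptibility = 1 / (1 + np.exp(-X @ w))     # spatial probability per cell
print(susceptibility.shape)
```

In a real workflow each row would be a raster cell and the fitted probabilities would be written back to the grid for validation against the event inventory.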
Goldenberg, Mitchell G; Lee, Jason Y; Kwong, Jethro C C; Grantcharov, Teodor P; Costello, Anthony
2018-03-31
To systematically review and synthesise the validity evidence supporting intraoperative and simulation-based assessments of technical skill in urological robot-assisted surgery (RAS), and make evidence-based recommendations for the implementation of these assessments in urological training. A literature search of the Medline, PsycINFO and Embase databases was performed. Articles using technical skill and simulation-based assessments in RAS were abstracted. Only studies involving urology trainees or faculty were included in the final analysis. Multiple tools for the assessment of technical robotic skill have been published, with mixed sources of validity evidence to support their use. These evaluations have been used in both the ex vivo and in vivo settings. Performance evaluations range from global rating scales to psychometrics, and assessments are carried out through automation, expert analysts, and crowdsourcing. There have been rapid expansions in approaches to RAS technical skills assessment, both in simulated and clinical settings. Alternative approaches to assessment in RAS, such as crowdsourcing and psychometrics, remain under investigation. Evidence to support the use of these metrics in high-stakes decisions is likely insufficient at present. © 2018 The Authors BJU International © 2018 BJU International Published by John Wiley & Sons Ltd.
Curved crystal x-ray optics for monochromatic imaging with a clinical source.
Bingölbali, Ayhan; MacDonald, C A
2009-04-01
Monochromatic x-ray imaging has been shown to increase contrast and reduce dose relative to conventional broadband imaging. However, clinical sources with very narrow energy bandwidth tend to have limited intensity and field of view. In this study, focused fan beam monochromatic radiation was obtained using doubly curved monochromator crystals. While these optics have been in use for microanalysis at synchrotron facilities for some time, this work is the first investigation of the potential application of curved crystal optics to clinical sources for medical imaging. The optics could be used with a variety of clinical sources for monochromatic slot scan imaging. The intensity was assessed and the resolution of the focused beam was measured using a knife-edge technique. A simulation model was developed and comparisons to the measured resolution were performed to verify the accuracy of the simulation to predict resolution for different conventional sources. A simple geometrical calculation was also developed. The measured, simulated, and calculated resolutions agreed well. Adequate resolution and intensity for mammography were predicted for appropriate source/optic combinations.
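The knife-edge technique mentioned above can be sketched as follows: if the focused beam's line-spread function is assumed Gaussian, the measured edge profile is an error function, and fitting it recovers the Gaussian width and hence the FWHM. All numbers here are invented, not the study's measurements.

```python
import numpy as np
from math import erf, sqrt, log

def edge_profile(x, sigma):
    """Knife-edge profile for a Gaussian line-spread function of width sigma."""
    return np.array([0.5 * (1 + erf(xi / (sigma * sqrt(2)))) for xi in x])

x = np.linspace(-0.5, 0.5, 101)      # knife-edge positions, mm (invented)
measured = edge_profile(x, 0.10)     # "measured" profile for sigma = 0.10 mm

# Recover sigma with a coarse grid search (avoids a scipy dependency),
# then convert via the Gaussian relation FWHM = 2*sqrt(2*ln 2)*sigma.
sigmas = np.linspace(0.01, 0.30, 300)
best = min(sigmas, key=lambda s: np.sum((edge_profile(x, s) - measured) ** 2))
fwhm = 2 * sqrt(2 * log(2)) * best
print(round(fwhm, 3))                # mm
```

A least-squares fit (rather than a grid search) would normally be used on noisy data; the grid keeps the sketch dependency-free.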
Kurz, Jochen H
2015-12-01
The task of locating a source in space by measuring travel time differences of elastic or electromagnetic waves from the source to several sensors arises in many fields. The new concepts of automatic acoustic emission localization presented in this article are based on developments from geodesy and seismology. A detailed description of source location determination in space is given, with a focus on acoustic emission data from concrete specimens. Direct and iterative solvers are compared. A concept based on direct solvers from geodesy, extended by a statistical approach, is described which allows stable source location determination even for partly erroneous onset times. The developed approach is validated with acoustic emission data from a large specimen, leading to travel paths up to 1 m and therefore to noisy data with errors in the determined onsets. The adaptation of the algorithms from geodesy to the localization of sources of elastic waves offers new possibilities concerning the stability, automation and performance of localization results. Fracture processes can be assessed more accurately. Copyright © 2015 Elsevier B.V. All rights reserved.
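A minimal sketch of a direct (non-iterative) solver in the spirit of those discussed above: subtracting one sensor's range equation from the others linearizes the problem. For simplicity the event origin time is taken as known (zero); real AE localization must also solve for it, and all geometry and the wave speed here are illustrative.

```python
import numpy as np

c = 4000.0  # assumed P-wave speed in concrete, m/s (illustrative)
sensors = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0.5, 1.2]])  # m
source_true = np.array([0.3, 0.7])
t = np.linalg.norm(sensors - source_true, axis=1) / c  # noise-free onset times

# Subtract the first sensor's squared-range equation to linearize:
#   |s_i|^2 - |s_0|^2 - c^2 (t_i^2 - t_0^2) = 2 (s_i - s_0) . x
A = 2 * (sensors[1:] - sensors[0])
b = (np.sum(sensors[1:] ** 2, axis=1) - np.sum(sensors[0] ** 2)
     - c ** 2 * (t[1:] ** 2 - t[0] ** 2))
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(x_hat, 3))  # recovers the source position
```

With erroneous onsets, the statistical extension described in the article would down-weight or reject outlier equations before the least-squares solve.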
Wallner, Jürgen; Hochegger, Kerstin; Chen, Xiaojun; Mischak, Irene; Reinbacher, Knut; Pau, Mauro; Zrnc, Tomislav; Schwenzer-Zimmerer, Katja; Zemann, Wolfgang; Schmalstieg, Dieter; Egger, Jan
2018-01-01
Computer assisted technologies based on algorithmic software segmentation are an increasing topic of interest in complex surgical cases. However, due to functional instability, time-consuming software processes, personnel resources or license-based financial costs, many segmentation processes are often outsourced from clinical centers to third parties and the industry. Therefore, the aim of this trial was to assess the practical feasibility of an easily available, functionally stable and license-free segmentation approach for use in clinical practice. In this retrospective, randomized, controlled trial the accuracy and accordance of the open-source based segmentation algorithm GrowCut was assessed through comparison to the manually generated ground truth of the same anatomy using 10 CT lower jaw datasets from the clinical routine. Assessment parameters were the segmentation time, the volume, the voxel number, the Dice Score and the Hausdorff distance. Overall semi-automatic GrowCut segmentation times were about one minute. Mean Dice Score values of over 85% and Hausdorff distances below 33.5 voxels could be achieved between the algorithmic GrowCut-based segmentations and the manually generated ground truth schemes. Differences between the assessment parameters were not statistically significant at the p < 0.05 level, and correlation coefficients were close to one (r > 0.94) for all comparisons made between the two groups. Functionally stable and time-saving segmentations with high accuracy and high positive correlation could be performed by the presented interactive open-source based approach. In the cranio-maxillofacial complex, the method could represent an algorithmic alternative for image-based segmentation in clinical practice, e.g. for surgical treatment planning or visualization of postoperative results, and offers several advantages. Due to its open-source basis, the method could be further developed by other groups or specialists.
Systematic comparisons with other segmentation approaches and with larger datasets are areas of future work.
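The two agreement metrics used above can be sketched on toy 3-D binary masks (algorithmic vs. manual segmentation); the masks are invented, and the brute-force Hausdorff distance is only suitable for small arrays.

```python
import numpy as np

def dice(a, b):
    """Dice score: 2|A∩B| / (|A| + |B|)."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def hausdorff_voxels(a, b):
    """Symmetric Hausdorff distance in voxel units (brute force, small masks)."""
    pa, pb = np.argwhere(a), np.argwhere(b)
    d = np.linalg.norm(pa[:, None, :] - pb[None, :, :], axis=-1)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

a = np.zeros((20, 20, 20), bool); a[5:15, 5:15, 5:15] = True  # "ground truth"
b = np.zeros((20, 20, 20), bool); b[6:15, 5:15, 5:15] = True  # "algorithmic"
print(round(dice(a, b), 3), hausdorff_voxels(a, b))
```

Production pipelines would compute the Hausdorff distance from surface points or distance transforms rather than all voxel pairs, and would convert voxel units to millimetres using the scan spacing.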
Edwards, D. L.; Saleh, A. A.; Greenspan, S. L.
2015-01-01
Summary We performed a systematic review and meta-analysis of the performance of clinical risk assessment instruments for screening for DXA-determined osteoporosis or low bone density. Commonly evaluated risk instruments showed high sensitivity approaching or exceeding 90 % at particular thresholds within various populations but low specificity at thresholds required for high sensitivity. Simpler instruments, such as OST, generally performed as well as or better than more complex instruments. Introduction The purpose of the study is to systematically review the performance of clinical risk assessment instruments for screening for dual-energy X-ray absorptiometry (DXA)-determined osteoporosis or low bone density. Methods Systematic review and meta-analysis were performed. Multiple literature sources were searched, and data extracted and analyzed from included references. Results One hundred eight references met inclusion criteria. Studies assessed many instruments in 34 countries, most commonly the Osteoporosis Self-Assessment Tool (OST), the Simple Calculated Osteoporosis Risk Estimation (SCORE) instrument, the Osteoporosis Self-Assessment Tool for Asians (OSTA), the Osteoporosis Risk Assessment Instrument (ORAI), and body weight criteria. Meta-analyses of studies evaluating OST using a cutoff threshold of <1 to identify US postmenopausal women with osteoporosis at the femoral neck provided summary sensitivity and specificity estimates of 89 % (95%CI 82–96 %) and 41 % (95%CI 23–59 %), respectively. Meta-analyses of studies evaluating OST using a cutoff threshold of 3 to identify US men with osteoporosis at the femoral neck, total hip, or lumbar spine provided summary sensitivity and specificity estimates of 88 % (95%CI 79–97 %) and 55 % (95%CI 42–68 %), respectively. 
Frequently evaluated instruments each had thresholds and populations for which sensitivity for osteoporosis or low bone mass detection approached or exceeded 90 % but always with a trade-off of relatively low specificity. Conclusions Commonly evaluated clinical risk assessment instruments each showed high sensitivity approaching or exceeding 90 % for identifying individuals with DXA-determined osteoporosis or low BMD at certain thresholds in different populations but low specificity at thresholds required for high sensitivity. Simpler instruments, such as OST, generally performed as well as or better than more complex instruments. PMID:25644147
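The OST instrument discussed above is commonly published as 0.2 × (body weight in kg − age in years), truncated toward zero, with patients below a threshold (e.g. < 1 in the US postmenopausal analyses above) referred for DXA. The formula and truncation here follow that common form and should be verified against the primary source before any clinical use.

```python
import math

def ost(weight_kg, age_years):
    """Osteoporosis Self-Assessment Tool score, as commonly published."""
    return math.trunc(0.2 * (weight_kg - age_years))

# A 72-year-old weighing 58 kg, screened at the < 1 threshold:
score = ost(58, 72)
print(score, "refer for DXA" if score < 1 else "low risk")
```

The instrument's appeal, noted in the review, is exactly this simplicity: two routinely available inputs and one subtraction.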
Villoria, Eduardo M; Lenzi, Antônio R; Soares, Rodrigo V; Souki, Bernardo Q; Sigurdsson, Asgeir; Marques, Alexandre P; Fidel, Sandra R
2017-01-01
To describe the use of open-source software for the post-processing of CBCT imaging for the assessment of periapical lesion development after endodontic treatment. CBCT scans were retrieved from the endodontic records of two patients. Three-dimensional virtual models, voxel counting, volumetric measurement (mm³) and mean intensity of the periapical lesion were obtained with the ITK-SNAP v. 3.0 software. Three-dimensional models of the lesions were aligned and overlapped in the MeshLab software, which performed an automatic registration of the anatomical structures based on the best fit. Qualitative and quantitative analyses of the changes in lesion size after treatment were performed with the 3DMeshMetric software. ITK-SNAP v. 3.0 showed smaller values for the voxel count and the volume of the lesion segmented in yellow, indicating a reduction in lesion volume after treatment. A higher value of the mean intensity of the image segmented in yellow was also observed, suggesting new bone formation. Colour mapping and the "point value" tool allowed visualization of the reduction of periapical lesions in several regions. Open-source software gives researchers and clinicians an option for monitoring endodontic periapical lesions.
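The voxel-based measurements described above reduce to simple arithmetic: lesion volume is the segmented voxel count times the voxel volume from the scan spacing, and mean intensity is averaged inside the mask. The spacing and arrays below are invented, not the patients' CBCT data.

```python
import numpy as np

spacing = (0.3, 0.3, 0.3)  # mm per voxel (assumed isotropic CBCT spacing)
voxel_volume = spacing[0] * spacing[1] * spacing[2]  # mm^3 per voxel

image = np.full((10, 10, 10), 1200.0)                       # toy intensity volume
mask = np.zeros((10, 10, 10), bool); mask[2:6, 2:6, 2:6] = True  # toy lesion

n_voxels = int(mask.sum())
volume_mm3 = n_voxels * voxel_volume          # lesion volume in mm^3
mean_intensity = float(image[mask].mean())    # mean intensity inside the lesion
print(n_voxels, round(volume_mm3, 3), mean_intensity)
```

Comparing these quantities before and after treatment gives the volume reduction and intensity increase the authors interpret as healing and new bone formation.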
Zeitler, Daniel M; Dorman, Michael F; Natale, Sarah J; Loiselle, Louise; Yost, William A; Gifford, Rene H
2015-09-01
To assess improvements in sound source localization and speech understanding in complex listening environments after unilateral cochlear implantation for single-sided deafness (SSD). Nonrandomized, open, prospective case series. Tertiary referral center. Nine subjects with a unilateral cochlear implant (CI) for SSD (SSD-CI) were tested. Reference groups for the task of sound source localization included young (n = 45) and older (n = 12) normal-hearing (NH) subjects and 27 bilateral CI (BCI) subjects. Unilateral cochlear implantation. Sound source localization was tested with 13 loudspeakers in a 180° arc in front of the subject. Speech understanding was tested with the subject seated in an 8-loudspeaker sound system arrayed in a 360-degree pattern. Directionally appropriate noise, originally recorded in a restaurant, was played from each loudspeaker. Speech understanding in noise was tested using the AzBio sentence test and sound source localization quantified using root mean square error. All CI subjects showed poorer-than-normal sound source localization. SSD-CI subjects showed a bimodal distribution of scores: six subjects had scores near the mean of those obtained by BCI subjects, whereas three had scores just outside the 95th percentile of NH listeners. Speech understanding improved significantly in the restaurant environment when the signal was presented to the side of the CI. Cochlear implantation for SSD can offer improved speech understanding in complex listening environments and improved sound source localization in both children and adults. On tasks of sound source localization, SSD-CI patients typically perform as well as BCI patients and, in some cases, achieve scores at the upper boundary of normal performance.
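The root-mean-square-error metric used above for the loudspeaker task is simply the RMS of the difference between presented and reported azimuths, pooled across trials. The angles below are invented for illustration.

```python
import numpy as np

presented = np.array([-90, -60, -30, 0, 30, 60, 90], dtype=float)  # degrees
reported = np.array([-75, -55, -35, 5, 20, 70, 80], dtype=float)   # degrees

rmse = np.sqrt(np.mean((reported - presented) ** 2))  # localization error
print(round(rmse, 2))  # degrees
```

Lower values indicate better localization; NH listeners typically score much lower on this metric than CI users, which is what the bimodal SSD-CI distribution is compared against.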
Free-electron laser emission architecture impact on extreme ultraviolet lithography
NASA Astrophysics Data System (ADS)
Hosler, Erik R.; Wood, Obert R.; Barletta, William A.
2017-10-01
Laser-produced plasma (LPP) EUV sources have demonstrated ~125 W at customer sites, establishing confidence in EUV lithography (EUVL) as a viable manufacturing technology. However, for extension to the 3-nm technology node and beyond, existing scanner/source technology must enable higher-NA imaging systems (requiring increased resist dose and providing half-field exposures) and/or EUV multipatterning (requiring increased wafer throughput proportional to the number of exposure passes). Both development paths will require a substantial increase in EUV source power to maintain the economic viability of the technology, creating an opportunity for free-electron laser (FEL) EUV sources. FEL-based EUV sources offer an economic, high-power/single-source alternative to LPP EUV sources. Should FELs become the preferred next-generation EUV source, the choice of FEL emission architecture will greatly affect its operational stability and overall capability. A near-term industrialized FEL is expected to utilize one of the following three existing emission architectures: (1) self-amplified spontaneous emission, (2) regenerative amplifier, or (3) self-seeding. Model accelerator parameters are put forward to evaluate the impact of emission architecture on FEL output. Then, variations in the parameter space are applied to assess the potential impact to lithography operations, thereby establishing component sensitivity. The operating range of various accelerator components is discussed based on current accelerator performance demonstrated at various scientific user facilities. Finally, comparison of the performance between the model accelerator parameters and the variation in parameter space provides a means to evaluate the potential emission architectures. A scorecard is presented to facilitate this evaluation and provides a framework for future FEL design and enablement for EUVL applications.
Characterization and application of a laser-driven intense pulsed neutron source using Trident
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vogel, Sven C.
A team of Los Alamos researchers supported a final campaign to use the Trident laser to produce neutrons, contributing their multidisciplinary expertise to experimentally assess whether laser-driven neutron sources can be useful for MaRIE, the Laboratory's proposed experimental facility for the study of matter-radiation interactions in extremes. Neutrons provide a radiographic probe that is complementary to x-rays and protons, and can address imaging challenges not amenable to those beams. The team's efforts characterize the Laboratory's responsiveness, flexibility, and ability to apply diverse expertise where needed to perform successful complex experiments.
Analysis of air quality management with emphasis on transportation sources
NASA Technical Reports Server (NTRS)
English, T. D.; Divita, E.; Lees, L.
1980-01-01
The current environment and practices of air quality management were examined for three regions: Denver, Phoenix, and the South Coast Air Basin of California. These regions were chosen because the majority of their air pollution emissions are related to mobile sources. The impact of auto exhaust on the air quality management process is characterized and assessed. An examination of the uncertainties in air pollutant measurements, emission inventories, meteorological parameters, atmospheric chemistry, and air quality simulation models is performed. The implications of these uncertainties for current air quality management practices are discussed. A set of corrective actions is recommended to reduce these uncertainties.
Hubbard, Rebecca A; Benjamin-Johnson, Rhondee; Onega, Tracy; Smith-Bindman, Rebecca; Zhu, Weiwei; Fenton, Joshua J
2015-01-15
Quality assessment is critical for healthcare reform, but data sources are lacking for measurement of many important healthcare outcomes. With over 49 million people covered by Medicare as of 2010, Medicare claims data offer a potentially valuable source that could be used in targeted health care quality improvement efforts. However, little is known about the operating characteristics of provider profiling methods using claims-based outcome measures that may estimate provider performance with error. Motivated by the example of screening mammography performance, we compared approaches to identifying providers failing to meet guideline targets using Medicare claims data. We used data from the Breast Cancer Surveillance Consortium and linked Medicare claims to compare claims-based and clinical estimates of cancer detection rate. We then demonstrated the performance of claims-based estimates across a broad range of operating characteristics using simulation studies. We found that identification of poor performing providers was extremely sensitive to algorithm specificity, with no approach identifying more than 65% of poor performing providers when claims-based measures had specificity of 0.995 or less. We conclude that claims have the potential to contribute important information on healthcare outcomes to quality improvement efforts. However, to achieve this potential, development of highly accurate claims-based outcome measures should remain a priority. Copyright © 2014 John Wiley & Sons, Ltd.
ScanRanker: Quality Assessment of Tandem Mass Spectra via Sequence Tagging
Ma, Ze-Qiang; Chambers, Matthew C.; Ham, Amy-Joan L.; Cheek, Kristin L.; Whitwell, Corbin W.; Aerni, Hans-Rudolf; Schilling, Birgit; Miller, Aaron W.; Caprioli, Richard M.; Tabb, David L.
2011-01-01
In shotgun proteomics, protein identification by tandem mass spectrometry relies on bioinformatics tools. Despite recent improvements in identification algorithms, a significant number of high quality spectra remain unidentified for various reasons. Here we present ScanRanker, an open-source tool that evaluates the quality of tandem mass spectra via sequence tagging with reliable performance in data from different instruments. The superior performance of ScanRanker enables it not only to find unassigned high quality spectra that evade identification through database search, but also to select spectra for de novo sequencing and cross-linking analysis. In addition, we demonstrate that the distribution of ScanRanker scores predicts the richness of identifiable spectra among multiple LC-MS/MS runs in an experiment, and ScanRanker scores assist the process of peptide assignment validation to increase confident spectrum identifications. The source code and executable versions of ScanRanker are available from http://fenchurch.mc.vanderbilt.edu. PMID:21520941
Development of a compact E × B microchannel plate detector for beam imaging
Wiggins, B. B.; Singh, Varinderjit; Vadas, J.; ...
2017-06-17
A beam imaging detector was developed by coupling a multi-strip anode with delay line readout to an E×B microchannel plate (MCP) detector. This detector is capable of measuring the incident position of the beam particles in one dimension. To assess the spatial resolution, the detector was illuminated by an α-source with an intervening mask that consists of a series of precisely-machined slits. The measured spatial resolution was 520 μm FWHM, which was improved to 413 μm FWHM by performing an FFT of the signals, rejecting spurious signals on the delay line, and requiring a minimum signal amplitude. This measured spatial resolution of 413 μm FWHM corresponds to an intrinsic resolution of 334 μm FWHM when the effect of the finite slit width is de-convoluted. To understand the measured resolution, the performance of the detector is simulated with the ion-trajectory code SIMION.
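The de-convolution step quoted above follows the usual quadrature rule: if the slit transmission and the detector response are both approximately Gaussian, their FWHMs add in quadrature, so the intrinsic resolution is sqrt(measured² − slit²). The slit contribution below is back-computed from the abstract's numbers, not taken from the paper.

```python
from math import sqrt

measured = 413.0   # μm FWHM, cleaned-up measured resolution (from the text)
intrinsic = 334.0  # μm FWHM, quoted intrinsic resolution (from the text)

# Implied slit-width contribution, assuming Gaussian quadrature addition.
slit = sqrt(measured**2 - intrinsic**2)
print(round(slit, 1))  # prints 242.9 (μm)
```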
Alpha-particle emission probabilities of ²³⁶U obtained by alpha spectrometry.
Marouli, M; Pommé, S; Jobbágy, V; Van Ammel, R; Paepen, J; Stroh, H; Benedik, L
2014-05-01
High-resolution alpha-particle spectrometry was performed with an ion-implanted silicon detector in vacuum on a homogeneously electrodeposited ²³⁶U source. The source was measured at different solid angles subtended by the detector, varying between 0.8% and 2.4% of 4π sr, to assess the influence of coincident detection of alpha particles and conversion electrons on the measured alpha-particle emission probabilities. Additional measurements were performed using a bending magnet to eliminate conversion electrons, the results of which coincide with normal measurements extrapolated to an infinitely small solid angle. The measured alpha emission probabilities for the three main peaks - 74.20 (5)%, 25.68 (5)% and 0.123 (5)%, respectively - are consistent with literature data, but their precision has been improved by at least one order of magnitude in this work. © 2013 Published by Elsevier Ltd.
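At its core, converting measured peaks into emission probabilities means normalizing peak areas to the total and propagating counting statistics; the sketch below uses invented counts chosen to land near the abstract's values, and omits the paper's actual corrections (solid angle, electron-coincidence effects, bending-magnet measurements).

```python
import math

# Invented raw peak areas; NOT the paper's measured counts.
counts = {"peak 1": 742_000, "peak 2": 256_800, "peak 3": 1_230}
total = sum(counts.values())

for peak, n in counts.items():
    p = n / total                   # emission probability estimate
    sigma = math.sqrt(n) / total    # Poisson counting uncertainty (approximate)
    print(f"{peak}: {100 * p:.2f} % +/- {100 * sigma:.3f} %")
```

The real uncertainty budget is dominated by the systematic corrections, which is why the solid-angle extrapolation described above matters.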
NDFOM Description for DNDO Summer Internship Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Budden, Brent Scott
2017-12-01
Nuclear Detection Figure of Merit (NDFOM) is a DNDO-funded project at LANL to develop a software framework that allows a user to evaluate a radiation detection scenario of interest, quickly obtaining results on detector performance. It is intended as a “first step” in detector performance assessment, and meant to be easily employed by subject matter experts (SMEs) and non-SMEs alike. The generic scenario consists of a potential source moving past a detector at a relative velocity and with a distance of closest approach. Such a scenario is capable of describing, e.g., vehicles driving through portal monitors, border patrol scanning suspected illicit materials with a handheld instrument, and first responders with backpack- or pager-based detectors (see Fig. 1). The backend library is prepopulated by the NDFOM developers to include sources and detectors of interest to DNDO and its community.
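The generic drive-by scenario above can be sketched with a simple 1/r² model: a source passes a detector at constant relative velocity v with distance of closest approach d, and the expected count rate is integrated over the pass. All numbers and the bare 1/r² assumption are illustrative, not NDFOM's models (which would also include backgrounds, attenuation, and detector response).

```python
import numpy as np

v = 10.0   # m/s, relative velocity (illustrative)
d = 5.0    # m, distance of closest approach (illustrative)
S = 1.0e4  # counts/s at 1 m: source strength x detector efficiency (assumed)

t = np.linspace(-10.0, 10.0, 2001)  # s, time relative to closest approach
r2 = d**2 + (v * t)**2              # squared source-detector distance
rate = S / r2                       # expected count rate, counts/s

# Expected total counts over the pass (trapezoidal integration).
total = float(np.sum(0.5 * (rate[1:] + rate[:-1]) * np.diff(t)))
print(round(rate.max(), 1), round(total, 1))
```

Comparing the expected counts against background fluctuations over the same window is the basic figure-of-merit idea: faster passes and larger standoff distances shrink the integral quickly.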
Debris/ice/TPS assessment and photographic analysis for Shuttle Mission STS-33R
NASA Technical Reports Server (NTRS)
Stevenson, Charles G.; Katnik, Gregory N.; Higginbotham, Scott A.
1989-01-01
A debris/ice/Thermal Protection System (TPS) assessment and photographic analysis was conducted for Space Shuttle Mission STS-33R. Debris inspections of the flight elements and launch pad are performed before and after launch. Ice/frost conditions on the external tank are assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography is analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/TPS conditions and photographic analysis of Mission STS-33R, and their overall effect on the Space Shuttle Program.
Debris/ice/TPS assessment and photographic analysis for shuttle mission STS-31R
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Higginbotham, Scott A.; Davis, J. Bradley
1990-01-01
A Debris/Ice/Thermal Protection System (TPS) assessment and photographic analysis was conducted for Space Shuttle Mission STS-31R. Debris inspections of the flight elements and launch pad are performed before and after launch. Ice/frost conditions on the External Tank are assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography is analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. The debris/ice/TPS conditions and photographic analysis of Mission STS-31R are presented, along with their overall effect on the Space Shuttle Program.
Debris/ice/tps Assessment and Integrated Photographic Analysis of Shuttle Mission STS-81
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Lin, Jill D.
1997-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-81. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-81 and the resulting effect on the Space Shuttle Program.
Debris/ice/tps Assessment and Integrated Photographic Analysis of Shuttle Mission STS-83
NASA Technical Reports Server (NTRS)
Lin, Jill D.; Katnik, Gregory N.
1997-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-83. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-83 and the resulting effect on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis of Shuttle Mission STS-71
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley
1995-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-71. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanner data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-71 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-102
NASA Technical Reports Server (NTRS)
Rivera, Jorge E.; Kelly, J. David (Technical Monitor)
2001-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-102. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-102 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-94
NASA Technical Reports Server (NTRS)
Bowen, Barry C.; Lin, Jill D.
1997-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-94. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-94 and the resulting effect on the Space Shuttle Program.
Debris/ice/tps Assessment and Integrated Photographic Analysis of Shuttle Mission STS-79
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Lin, Jill D.
1996-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-79. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-79 and the resulting effect on the Space Shuttle Program.
Debris/ice/TPS assessment and integrated photographic analysis of Shuttle mission STS-73
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Bowen, Barry C.; Lin, Jill D.
1995-01-01
A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-73. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanner data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle Mission STS-73 and the resulting effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Photographic Analysis for Shuttle Mission STS-38
NASA Technical Reports Server (NTRS)
Higginbotham, Scott A.; Davis, J. Bradley
1991-01-01
A debris/ice/TPS assessment and photographic analysis was conducted for the Space Shuttle Mission STS-38. Debris inspection of the flight elements and launch pad were performed before and after launch. Ice/frost conditions on the external tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. The debris/ice/TPS conditions and photographic analysis of Mission STS-38, and their overall effect on the Space Shuttle Program are documented.
Debris/ice/TPS assessment and integrated photographic analysis for Shuttle Mission STS-50
NASA Technical Reports Server (NTRS)
Higginbotham, Scott A.; Davis, J. Bradley; Katnik, Gregory N.
1992-01-01
A debris/ice/Thermal Protection System (TPS) assessment and integrated photographic analysis was conducted for Shuttle Mission STS-50. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/frost conditions on the external tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. The debris/ice/TPS conditions and integrated photographic analysis of Shuttle Mission STS-50, and the resulting effect on the Space Shuttle Program are documented.
Debris/Ice/TPS Assessment and Integrated Photographic Analysis for Shuttle Mission STS-49
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Higginbotham, Scott A.; Davis, J. Bradley
1992-01-01
A debris/ice/Thermal Protection System (TPS) assessment and integrated photographic analysis was conducted for Shuttle Mission STS-49. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/frost conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. Debris/ice/TPS conditions and integrated photographic analysis of Shuttle Mission STS-49, and the resulting effect on the Space Shuttle Program are discussed.
Debris/ice/TPS assessment and photographic analysis of shuttle mission STS-48
NASA Technical Reports Server (NTRS)
Higginbotham, Scott A.; Davis, J. Bradley
1991-01-01
A Debris/Ice/TPS assessment and photographic analysis was conducted for Space Shuttle Mission STS-48. Debris inspection of the flight elements and launch pad were performed before and after launch. Ice/frost conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. The debris/ice/TPS conditions and photographic analysis of Mission STS-48 are documented, along with their overall effect on the Space Shuttle Program.
Debris/Ice/TPS Assessment and Photographic Analysis for Shuttle Mission STS-37
NASA Technical Reports Server (NTRS)
Katnik, Gregory N.; Higginbotham, Scott A.; Davis, J. Bradley
1991-01-01
A Debris/Ice/TPS assessment and photographic analysis was conducted for Space Shuttle Mission STS-37. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/frost conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. The debris/ice/TPS conditions and photographic analysis of Mission STS-37 are documented, along with their overall effect on the Space Shuttle Program.