Note: This page contains sample records for the topic task based exposure from Science.gov.
While these samples are representative of the content of Science.gov,
they are not comprehensive nor are they the most current set.
We encourage you to perform a real-time search of Science.gov
to obtain the most current and comprehensive results. Last update: November 12, 2013.
The assessment of worker exposures to airborne contaminants in the dynamic environment present at most construction sites poses considerable challenges to the industrial hygienist. In this study, we applied a task-based approach to the assessment of lead exposure among structural steel iron workers engaged in a large, complex bridge rehabilitation project. We evaluated the usefulness of task-based exposure data for the development of worker protection programs. Task-specific and multitask samples were collected, and operation-specific and 8-hr time-weighted averages were calculated. The task-specific data showed significant differences in exposure levels among different tasks. Arithmetic mean exposures varied from 1,357 micrograms/m3 lead for torch cutting and 989 micrograms/m3 for scaling to 31 micrograms/m3 for reaming and 4 micrograms/m3 for drilling. Our task-specific data were compared with the task-based exposure levels presented by OSHA in its Lead Exposure in Construction-Interim Final Rule (29 CFR 1926). There was good general agreement between our results and OSHA's reported data. Task-based data were very useful in exposure assessment and much more precise than full-shift and operation-based measurements in guiding strategies for worker protection. These findings suggest that task-based data should routinely be collected in evaluating exposure to lead and perhaps other toxic substances in construction work. PMID:9055954
Goldberg, M; Levin, S M; Doucette, J T; Griffin, G
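The 8-hr time-weighted averages mentioned in the record above follow the standard industrial-hygiene formula: each task's concentration is weighted by its duration and the total is divided by the 8-hour shift. A minimal Python sketch, assuming the usual convention that unsampled shift time contributes zero exposure; the task durations below are illustrative, not data from the study:

```python
def eight_hour_twa(samples):
    """8-hr time-weighted average from task-specific air samples.

    samples: list of (concentration_ug_m3, duration_hr) pairs.
    Any unsampled remainder of the 8-hr shift counts as zero exposure.
    """
    total = sum(conc * hours for conc, hours in samples)
    return total / 8.0

# Hypothetical task profile pairing the study's mean levels with
# made-up durations: torch cutting, scaling, reaming, drilling.
tasks = [(1357, 0.5), (989, 1.0), (31, 2.0), (4, 4.5)]
print(round(eight_hour_twa(tasks), 1))  # → 218.4
```

Even brief high-exposure tasks such as torch cutting dominate the shift average, which is why task-specific sampling is more informative than a single full-shift number.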
Few studies have been done examining noise exposures associated with agricultural tasks. This study was conducted to address that research gap by calculating the noise exposures for tasks and equipment associated with grain production and assessing the variability in those exposures. An additional aim of this study was to identify tasks and equipment that could be targeted for intervention strategies as a means toward reducing the total noise exposures of farmers and farm workers. Through the use of personal noise dosimetry and direct observation, over 30,000 one-minute noise exposure measurements and corresponding task and equipment data were collected on 18 farms and compiled into a task-based noise exposure database. Mean noise exposures were calculated for 23 tasks and 18 pieces of equipment. The noise exposures for the tasks and equipment ranged from 78.6 to 99.9 dBA and from 80.8 to 96.2 dBA, respectively, with most of the noise exposures having a large standard deviation and maximum noise exposure level. Most of the variability in the task and equipment noise exposures was attributable to within-farm variations (e.g., work practices, distance from noise sources). Comparisons of the mean noise exposures for the agricultural tasks and equipment revealed that most were not statistically different. Grain production tasks and equipment with high mean noise exposures were identified. However, the substantial variability in the noise exposures and the occurrence of intense noise measurements for nearly every task and piece of equipment indicate that targeting a few specific tasks or equipment for intervention strategies would reduce lifetime noise exposure but would not completely eliminate exposure to hazardous noise levels. PMID:23923730
Humann, M J; Sanderson, W T; Donham, K J; Kelly, K M
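Because dBA is a logarithmic scale, mean noise exposures like those reported above cannot be formed by arithmetically averaging decibel readings; equal-duration measurements are combined on a sound-energy basis and converted back to decibels. A short sketch of that energy average (the readings are illustrative, not values from the study):

```python
import math

def leq_dba(levels_dba):
    """Equivalent continuous level from equal-duration dBA readings.

    Each level is converted to relative sound energy (10^(L/10)),
    the energies are averaged, and the mean is converted back to dB.
    """
    energies = [10 ** (level / 10.0) for level in levels_dba]
    return 10.0 * math.log10(sum(energies) / len(energies))

# Three hypothetical one-minute readings spanning the reported range:
print(round(leq_dba([78.6, 90.0, 99.9]), 1))  # → 95.6
```

Note how the loudest reading dominates the result: the energy average sits far closer to 99.9 dBA than the arithmetic mean (89.5) would suggest, which is why brief intense events matter so much for lifetime exposure.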
This study examined whether exposure to second/foreign language (L2) data under different computerized task conditions had a differential impact on learners' ability to recognize and produce the target structure immediately after exposure to the input and over time. Learners' L2 development was assessed through recognition and…
This study of bridge painters working for small contractors in Massachusetts investigated the causes of elevated blood lead levels and assessed their exposure to lead. Bridge work sites were evaluated for a 2-week period during which personal and area air samples and information on work site characteristics and lead abatement methods were gathered. Short-duration personal inhalable samples collected from 18…
This page from The Experiential Learning Center provides a number of scenario-based tasks for use in the classroom or for professional development training. The materials are freely available for download and use and would be applicable to learners in a variety of subjects including software development, faculty professional development, office system applications/ICT, biology/bioinformatics, environmental studies, Python programming, engineering, network security/MIS, computational thinking and English writing. Instructor guides and other classroom instructional materials are provided. The project requests that educators let them know when these materials are used in order to track dissemination of the work and in order to inform the community about upcoming workshops and presentations.
Cardiovascular effects under various noise-exposure and task-demand conditions were studied among 40 senior high school students. The subjects consisted of 20 males and 20 females with a mean age of 16.7 ± 0.7 years. All subjects had equivalent abacus performance ratings. Each subject was tested with a random sequence of six sessions. The time limit set for each session was 33 min. Six experimental sessions were constructed by a random combination of noise exposure (60, 85, or 90 dB(A) white noise) and task demand (task presence or task absence) variables. Blood pressure measures were taken at the beginning and ending phases of each session. A task-demand variable was defined as a conjoint of mental arithmetic (3 min) and abacus arithmetic (30 min). The results from the present study show that the effect of noise exposure on task performance is remarkable. Only noise exposure tended to influence the performance of male students in abacus arithmetic. The effect of task demand on blood pressure was greater than that of noise exposure. No interaction effect (noise exposure × task demand) on blood pressure was found via within-subjects two-way ANOVA. PMID:3346087
We investigated the influences of odor exposure on performance and on breathing measures. The task was composed of tracking, short-term memory, and peripheral reaction parts. During rest or while performing the task, 12 participants were exposed to 4 different odors in 2 intensities. The higher intensity of the malodors induced a short-term decrement in mean inspiration flow (Vi/Ti) after stimulus…
Brigitta Danuser; Denise Moser; Tanja Vitale-Sethre; René Hirsig; Helmut Krueger
These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the masonry program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary courses…
Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.
Aims: To validate the accuracy of construction worker recall of task and environment based information; and to evaluate the effect of task recall on estimates of noise exposure. Methods: A cohort of 25 construction workers recorded tasks daily and had dosimetry measurements weekly for six weeks. Worker recall of tasks reported on the daily activity cards was validated with research observations and compared directly to task recall at a six month interview. Results: The mean LEQ noise exposure level (dBA) from dosimeter measurements was 89.9 (n = 61) and 83.3 (n = 47) for carpenters and electricians, respectively. The percentage time at tasks reported during the interview was compared to that calculated from daily activity cards; only 2/22 tasks were different at the nominal 5% significance level. The accuracy, based on bias and precision, of percentage time reported for tasks from the interview was 53–100% (median 91%). For carpenters, the difference in noise estimates derived from activity cards (mean 91.9 dBA) was not different from those derived from the questionnaire (mean 91.7 dBA). This trend held for electricians as well. For all subjects, noise estimates derived from the activity card and the questionnaire were strongly correlated with dosimetry measurements. The average difference between the noise estimate derived from the questionnaire and dosimetry measurements was 2.0 dBA, and was independent of the actual exposure level. Conclusions: Six months after tasks were performed, construction workers were able to accurately recall the percentage time they spent at various tasks. Estimates of noise exposure based on long term recall (questionnaire) were no different from estimates derived from daily activity cards and were strongly correlated with dosimetry measurements, overestimating the level on average by 2.0 dBA.
Reeb-Whitaker, C; Seixas, N; Sheppard, L; Neitzel, R
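The noise estimates derived above from recalled percentage time at tasks are typically built by weighting each task's mean level by its time fraction on an energy basis, then converting back to decibels. A minimal sketch of that combination; the task profile is hypothetical, not data from the study:

```python
import math

def estimated_shift_leq(task_profile):
    """Full-shift noise estimate from task recall.

    task_profile: list of (mean_task_level_dba, fraction_of_shift)
    pairs; fractions should sum to 1. Levels are combined as
    time-weighted sound energies, then converted back to dBA.
    """
    energy = sum(frac * 10 ** (level / 10.0)
                 for level, frac in task_profile)
    return 10.0 * math.log10(energy)

# Hypothetical recall: 25% at a 95 dBA task, 50% at 85 dBA, 25% at 80 dBA.
profile = [(95.0, 0.25), (85.0, 0.50), (80.0, 0.25)]
print(round(estimated_shift_leq(profile), 1))  # → 89.9
```

Comparing such an estimate against dosimetry, as the study does, quantifies how much error task recall introduces into the reconstructed exposure.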
Cardiovascular effects under various noise-exposure and task-demand conditions were studied among 40 senior high school students. The subjects consisted of 20 males and 20 females with a mean age of 16.7 ± 0.7 years. All subjects had equivalent abacus performance ratings. Each subject was tested with a random sequence of six sessions. The time limit set for each session was 33…
Trong-Neng Wu; Jong-Tsun Huang; Peter F. S. Chou; Po-Ya Chang
Little is known about pesticide exposure among farmworkers, and even less is known about the exposure associated with performing specific farm tasks. Using a random sample of 213 farmworkers in 24 communities and labor camps in eastern Washington State, we examined the association between occupational task and organophosphate (OP) pesticide residues in dust and OP metabolite concentrations in urine samples of adult farmworkers and their children. The data are from a larger study that sought to test a culturally appropriate intervention to break the take-home pathway of pesticide exposure. Commonly reported farm tasks were harvesting or picking (79.2%), thinning (64.2%), loading plants or produce (42.2%), planting or transplanting (37.6%), and pruning (37.2%). Mixing, loading, or applying pesticide formulations was reported by 20% of our sample. Workers who thinned were more likely than those who did not to have detectable levels of azinphos-methyl in their house dust (92.1% vs. 72.7%; p = 0.001) and vehicle dust (92.6% vs. 76.5%; p = 0.002). Thinning was associated with higher urinary pesticide metabolite concentrations in children (91.9% detectable vs. 81.3%; p = 0.02) but not in adults. Contrary to expectation, workers who reported mixing, loading, or applying pesticide formulations had lower detectable levels of pesticide residues in their house or vehicle dust, compared with those who did not perform these job tasks, though the differences were not significant. Future research should evaluate workplace protective practices of fieldworkers and the adequacy of reentry intervals for pesticides used during thinning. PMID:14754567
In this study, we investigated the interaction of three different sources of task activation in precued task switching. We distinguished (1) intentional, cue-based task activation from two other, involuntary sources of activation: (2) persisting activation from the preceding task and (3) stimulus-based task activation elicited by the task stimulus itself. We assumed that cue-based task activation increases as a function of cue-stimulus interval (CSI) and that task activation from the preceding trial decays as a function of response-stimulus interval. Stimulus-based task activation is thought to be due to involuntary retrieval of stimulus-associated tasks. We manipulated stimulus-based task activation by mapping each of the stimuli consistently to only one or the other of the two tasks. After practice, we reversed this mapping in order to test the effects of item-specific stimulus-task association. The mapping reversal resulted in increased reaction times and increased task shift costs. These stimulus-based priming effects were markedly reduced with a long CSI, relative to a short CSI, suggesting that stimulus-based priming shows up in performance principally when competition between tasks is high and that cue-based task activation reduces task competition. In contrast, lengthening the response-cue interval (decay time) reduced shift costs but did not reduce the stimulus-based priming effect. The data are consistent with separable stimulus-related and response-related components of task activation. Further theoretical implications of these findings are discussed. PMID:16752606
Compared the effects of noise under active task involvement as opposed to passive exposure, using 80 undergraduate Ss who were assigned to 1 of 4 conditions representing 2 × 2 combinations of task vs no task and noise vs quiet. Performance on a dial-monitoring task was unaffected by noise. Ratings of interest and tenseness were significantly higher under the task…
Joachim F. Wohlwill; Jack L. Nasar; David M. DeJoy; Hossein H. Foruzani
We propose the so-called TeleSensor programming concept that uses sensory perception to achieve local autonomy in robotic manipulation. Sensor-based robot tasks are used to define elemental moves within a high-level programming environment. This approach is applicable in both the real robot's world and the simulated one. Besides the graphical off-line programming concept, the range of application lies especially…
This book contains a task inventory, a task analysis of 150 tasks from that inventory, and a tool list for performance-based welding courses in the state of Indiana. The task inventory and tool list reflect 28 job titles found in Indiana. In the first part of the guide, tasks are listed by these domains: carbon-arc, electron beam, G.M.A.W., gas…
Tasks have become an essential feature of second language (L2) learning in recent years. Tasks range from getting learners to repeat linguistic elements satisfactorily to having them perform in "free" production. Along this task-based continuum, task-based scenario interaction lies at the point midway between controlled and semi-controlled…
A task-based execution provides a universal approach to dynamic load balancing for irregular applications. Tasks are arbitrary units of work that are created dynamically at run-time and that are stored in a parallel data structure, the task pool, until they are scheduled onto a processor for execution. In this paper, we evaluate the performance of different task pool implementations for
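The task-pool model described above — units of work created dynamically at run time and held in a shared structure until a processor picks them up — can be sketched with a thread-safe queue in Python. This is a simplified centralized pool, not any of the specific implementations the paper compares, and the names and range-splitting example are hypothetical:

```python
import queue
import threading

def run_task_pool(initial_tasks, worker_count=4):
    """Run tasks from a shared task pool until no work remains.

    Each task is a callable returning (result, follow_up_tasks), so new
    work can be created dynamically while the pool is being drained.
    """
    pool = queue.Queue()
    for task in initial_tasks:
        pool.put(task)
    results = []
    lock = threading.Lock()
    pending = [len(initial_tasks)]  # tasks submitted but not completed

    def worker():
        while True:
            try:
                task = pool.get(timeout=0.05)
            except queue.Empty:
                with lock:
                    if pending[0] == 0:
                        return            # all work finished
                continue                   # others still working; retry
            value, children = task()
            with lock:
                results.append(value)
                pending[0] += len(children)  # count children first
            for child in children:
                pool.put(child)
            with lock:
                pending[0] -= 1            # this task is now complete

    threads = [threading.Thread(target=worker) for _ in range(worker_count)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

# Usage: an irregular workload that recursively splits a summation range.
def make_task(lo, hi):
    def task():
        if hi - lo <= 2:
            return sum(range(lo, hi)), []  # leaf: do the actual work
        mid = (lo + hi) // 2
        return 0, [make_task(lo, mid), make_task(mid, hi)]
    return task

print(sum(run_task_pool([make_task(0, 100)])))  # → 4950
```

A single shared queue like this can become a bottleneck under many processors, which is exactly the kind of trade-off (central vs. distributed pools, work stealing) that implementation comparisons of task pools examine.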
Background In recent years, cleaning has been identified as an occupational risk because of an increased incidence of reported respiratory effects, such as asthma and asthma-like symptoms among cleaning workers. Due to the lack of systematic occupational hygiene analyses and workplace exposure data, it is not clear which cleaning-related exposures induce or aggravate asthma and other respiratory effects. Currently, there is a need for systematic evaluation of cleaning products ingredients and their exposures in the workplace. The objectives of this work were to: a) identify cleaning products' ingredients of concern with respect to respiratory and skin irritation and sensitization; and b) assess the potential for inhalation and dermal exposures to these ingredients during common cleaning tasks. Methods We prioritized ingredients of concern in cleaning products commonly used in several hospitals in Massachusetts. Methods included workplace interviews, reviews of product Material Safety Data Sheets and the scientific literature on adverse health effects to humans, reviews of physico-chemical properties of cleaning ingredients, and occupational hygiene observational analyses. Furthermore, the potential for exposure in the workplace was assessed by conducting qualitative assessment of airborne exposures and semi-quantitative assessment of dermal exposures. Results Cleaning products used for common cleaning tasks were mixtures of many chemicals, including respiratory and dermal irritants and sensitizers. Examples of ingredients of concern include quaternary ammonium compounds, 2-butoxyethanol, and ethanolamines. Cleaning workers are at risk of acute and chronic inhalation exposures to volatile organic compound (VOC) vapors and aerosols generated from product spraying, and dermal exposures mostly through hands. Conclusion Cleaning products are mixtures of many chemical ingredients that may impact workers' health through air and dermal exposures.
Because cleaning exposures are a function of product formulations and product application procedures, a combination of product evaluation with workplace exposure assessment is critical in developing strategies for protecting workers from cleaning hazards. Our task-based assessment methods allowed classification of tasks in different exposure categories, a strategy that can be employed by epidemiological investigations related to cleaning. The methods presented here can be used by occupational and environmental health practitioners to identify intervention strategies.
Why are many teachers around the world moving toward task-based learning (TBL)? This shift is based on the strong belief that TBL facilitates second language acquisition and makes second language learning and teaching more principled and effective. Based on insights gained from using tasks as research tools, this volume shows how teachers can use…
Musculoskeletal disorders are common among agricultural workers, particularly among dairy farm workers. Specifically, dairy farm workers have been identified as being at risk for knee osteoarthritis. Physical risk factors that may contribute to knee osteoarthritis include awkward postures of the knee, such as kneeling or squatting. The purpose of this study was to quantify exposure to awkward knee posture among dairy farm workers during milking and feeding tasks in two common types of milking facilities (stanchion and parlor). Twenty-three dairy farm workers performed milking and feeding tasks; 11 worked in a stanchion milking facility, and 12 worked in a parlor milking facility. An electrogoniometer was used to measure knee flexion during 30 min of the milking and feeding tasks. Milking in a stanchion facility results in a greater duration of exposure to awkward posture of the knee compared with milking in a parlor facility. Specifically, the percentage of time in ≥110 degrees knee flexion was significantly greater in the stanchion facility (X = 17.7; SE 4.2) than in the parlor facility (X = 0.05; SE 0.04; p < …) during milking tasks. This study supports previous findings that working in a stanchion milking facility results in greater exposure to awkward knee posture compared with working in a parlor milking facility. PMID:20521198
Pesticide exposure has been associated with neuropsychological and psychiatric impairments and neurodegenerative disorders. Pesticide exposure commonly causes a deficit in inhibitory control behaviours. In the present study, we investigated whether acute exposure to the organophosphate (OP) chlorpyrifos (CPF) is related to a long-term lack of inhibitory control; we also examined the possible neurochemical basis of this association. Lister Hooded rats were exposed to an acute dose of CPF (250 mg/kg). Seven months later, we tested inhibitory control with the 5-choice serial reaction time task (5-CSRTT). We manipulated the baseline conditions of this task and also systemically pre-administered d-amphetamine, quinpirole, dizocilpine (MK-801) or ketanserin. We also analysed the post-mortem baseline levels of monoamines and amino acids in different brain regions. On the 5-CSRTT, CPF-exposed rats showed elevated perseverative responses that persisted across manipulation of baseline conditions of the task and under most of the pharmacological challenges tested. Only d-amphetamine induced a dose-dependent amelioration of the increased perseverative responses in the CPF group. The CPF group also exhibited increased levels of dopamine metabolism in the hippocampus and decreased levels of gamma-aminobutyric acid (GABA) and glutamate in the striatum compared to the vehicle group. These findings suggest that CPF induced a long-term compulsivity that was apparent in the 5-CSRTT and associated with changes in monoaminergic and amino acid brain systems of inhibitory control function. Exposure to high doses of OP should be taken into account in studies of environmental causes for neurodegenerative, neuropsychological and neuropsychiatric disorders. PMID:23194828
Montes de Oca, Lara; Moreno, Margarita; Cardona, Diana; Campa, Leticia; Suñol, Cristina; Galofré, Mireia; Flores, Pilar; Sánchez-Santed, Fernando
This study was intended to determine the effects of continuous bright light exposure on cardiovascular responses, particularly heart rate variability (HRV), at rest and during performance of mental tasks with acute nocturnal sleep deprivation. Eight healthy male subjects stayed awake from 21.00 to 04.30 hours under bright (BL, 2800 lux) or dim (DL, 120 lux) light conditions. During sleep deprivation, mental tasks (Stroop color-word conflict test: CWT) were performed for 15 min each hour. Blood pressure, electrocardiogram, respiratory rate, urinary melatonin concentrations and rectal temperature were measured. During sleep deprivation, BL exposure depressed melatonin secretion in comparison to DL conditions. During sleep deprivation, exposure to BL delayed the decline in heart rate (HR) for 4 h in resting periods. A significant increment of HR induced by each CWT was detected, especially at 03.00 h and later, under DL conditions only. In addition, at 04.00 h, an index of sympathetic activity and sympatho-vagal balance on HRV during CWT increased significantly under DL conditions. In contrast, an index of parasympathetic activity during CWT decreased significantly under DL conditions. However, the indexes of HRV during CWT did not change throughout sleep deprivation under BL conditions. Our results suggest that BL exposure not only delays the nocturnal decrease in HR at rest but also maintains HR and balance of cardiac autonomic modulation to mental tasks during nocturnal sleep deprivation. PMID:16679712
From July, 1974 to June, 1977, a federally funded project to develop a competency based training program for human service workers was run at Elgin Mental Health Center, a 750-bed inpatient, state psychiatric facility. During the course of the project, over 500 discrete tasks were identified which such workers perform. Tasks were grouped into…
The evidence-based practice movement has become an important feature of health care systems and health care policy. Within this context, the APA 2005 Presidential Task Force on Evidence-Based Practice defines and discusses evidence-based practice in psychology (EBPP). In an integration of science and practice, the Task Force's report describes psychology's fundamental commitment to sophisticated EBPP and takes into…
Alvin R. Mahrer; Frederick L. Newman; John C. Norcross; Doris K. Silverman; Brian D. Smedley; Bruce E. Wampold; Drew I. Westen; Brian T. Yates; Nolan W. Zane; Geoffrey M. Reed; Lynn F. Bufka; Paul D. Nelson; Cynthia D. Belar; Merry Bullock
According to the fear-avoidance model, kinesiophobia (pain-related fear) is an important factor in the development of chronic pain and disability through the maintenance of maladaptive avoidance behaviors. Using a paradigm that required repeated exposure to a reaching task, the current study investigated generalization of pain and harm expectancy corrections (i.e., the tendency to bring expectations in line with experience) in…
Zina Trost; Christopher R. France; James S. Thomas
A job exposure matrix of ergonomics risk factors was constructed for school custodial workers in one large school district in the province of British Columbia using 100 h of 1-min fixed-interval observations, participatory worker consensus on task durations and existing employment and school characteristic data. Significant differences in ergonomics risk factors were found by tasks and occupations. Cleaning and moving furniture, handling garbage, cleaning washrooms and cleaning floors were associated with the most physical risks and the exposure was often higher during the summer vs. the school year. Injury rates over a 4-year period showed the custodian injury rate was four times higher than the overall injury rate across all occupations in the school district. Injury rates were significantly higher in the school year compared with summer (12.2 vs. 7.0 per 100 full-time equivalents per year, p < 0.05). Custodial workers represent a considerable proportion of the labour force and have high injury rates, yet ergonomic studies are disproportionately few. Previous studies that quantified risk factors in custodial workers tended to focus on a few tasks or specific risk factors. This study, using participatory ergonomics and observational methods, systematically quantifies the broad range of musculoskeletal risk factors across multiple tasks performed by custodial workers in schools, adding considerably to the methodological literature. PMID:19431003
Multimodality imaging is becoming increasingly important in medical imaging. Since the motivation for combining multiple imaging modalities is generally to improve diagnostic or prognostic accuracy, the benefits of multimodality imaging cannot be assessed through the display of example images. Instead, we must use objective, task-based measures of image quality to draw valid conclusions about system performance. In this paper, we will present a general framework for utilizing objective, task-based measures of image quality in assessing multimodality and adaptive imaging systems. We introduce a classification scheme for multimodality and adaptive imaging systems and provide a mathematical description of the imaging chain along with block diagrams to provide a visual illustration. We show that the task-based methodology developed for evaluating single-modality imaging can be applied, with minor modifications, to multimodality and adaptive imaging. We discuss strategies for the practical implementation of task-based methods to assess and optimize multimodality imaging systems.
Clarkson, Eric; Kupinski, Matthew A.; Barrett, Harrison H.; Furenlid, Lars
Pressures to increase the efficacy and effectiveness of medical training are causing the Department of Defense to investigate the use of simulation technologies. This article describes a comprehensive cognitive task analysis technique that can be used to simultaneously generate training requirements, performance metrics, scenario requirements, and simulator/simulation requirements for medical tasks. On the basis of a variety of existing techniques, we developed a scenario-based approach that asks experts to perform the targeted task multiple times, with each pass probing a different dimension of the training development process. In contrast to many cognitive task analysis approaches, we argue that our technique can be highly cost effective because it is designed to accomplish multiple goals. The technique was pilot tested with expert instructors from a large military medical training command. These instructors were employed to generate requirements for two selected combat casualty care tasks: cricothyroidotomy and hemorrhage control. Results indicated that the technique is feasible to use and generates usable data to inform simulation-based training system design. PMID:24084301
In this paper, one task-based flocking algorithm that coordinates a swarm of robots is presented and evaluated based on the standard simulation platform. The task-based flocking algorithm (TFA) is an effective framework for mobile robot cooperation. Flocking behaviors are integrated into the cooperation of the multi-robot system to organize a robot team to achieve a common goal. The goal of the whole team is obtained through the collaboration of the individual robots' tasks. The flocking model is presented, and the flocking energy function is defined based on that model to analyze the stability of the flocking and the task-switching criterion. The simulation study is conducted in a five-versus-five soccer game, where each robot dynamically selects its task according to its status and the whole robot team behaves as a flock. Through simulation results and experiments, it is shown that the task-based flocking algorithm can effectively coordinate and control the robot flock to achieve the goal.
In order to determine differences in biomechanical risk factors across computer tasks, a repeated measures laboratory experiment was completed with 30 touch-typing adults (15 females and 15 males). The participants completed five different computer tasks: typing text, completing an html-based form with text fields, editing text within a document, sorting and resizing objects in a graphics task and browsing and navigating a series of intranet web pages. Electrogoniometers and inclinometers measured wrist and upper arm postures, surface electromyography measured muscle activity of four forearm muscles and three shoulder muscles and a force platform under the keyboard and force-sensing computer mouse measured applied forces. Keyboard-intensive tasks were associated with less neutral wrist postures, larger wrist velocities and accelerations and larger dynamic forearm muscle activity. Mouse-intensive tasks (graphics and intranet web page browsing) were associated with less neutral shoulder postures and less variability in forearm muscle activity. Tasks containing a mixture of mouse and keyboard use (form completion and text editing) were associated with higher shoulder muscle activity, larger range of motion and larger velocities and accelerations of the upper arm. Comparing different types of computer work demonstrates that mouse use is prevalent in most computer tasks and is associated with more constrained and non-neutral postures of the wrist and shoulder compared to keyboarding. PMID:16393803
Objectives of Tasks 7A and 7B were to develop and demonstrate computer based systems to assist plant management and staff in utilizing information more effectively to reduce occupational exposures received as a result of refueling outages, and to shorten ...
Radiologists are intensive computer users as they review and interpret radiological examinations using the Picture Archiving and Communication Systems (PACS). Since their computer tasks require the prolonged use of pointing devices, a high prevalence of Musculoskeletal Disorders (MSDs) is reported. The first phase of this study involved conducting a Cognitive Work Analysis in conjunction with a Participatory Ergonomics approach to perform a total work system analysis. We also conducted an ergonomic survey as well as collected computer use data, specifically for the mouse and keyboard. The goal of the study was to reduce the physical exposures for radiologists. This paper presents Phase I results describing the analyses and redesign process of the radiologists' tasks, training design, computer use, and selected survey results. PMID:22316978
Robertson, Michelle M; Boiselle, Philip; Eisenberg, Ronald; Siegal, Daniel; Chang, Che-Hsu Joe; Dainoff, Marvin; Garabet, Angela; Garza, Jennifer Bruno; Dennerlein, Jack
The purpose of this study was to examine the overarching framework of the EFL (English as a Foreign Language) reading instructional approach reflected in an EFL secondary school curriculum in Malaysia. Based on this analysis, a comparison was made to determine whether Communicative Task-Based Language is the overarching instructional approach for the Malaysian EFL…
Two experiments were conducted to investigate the automatic processing of emotional facial expressions while performing low or high demand cognitive tasks under unattended conditions. In Experiment 1, 35 subjects performed low (judging the structure of Chinese words) and high (judging the tone of Chinese words) cognitive load tasks while exposed to unattended pictures of fearful, neutral, or happy faces. The results revealed that the reaction time was slower and the performance accuracy was higher while performing the low cognitive load task than while performing the high cognitive load task. Exposure to fearful faces resulted in significantly longer reaction times and lower accuracy than exposure to neutral faces on the low cognitive load task. In Experiment 2, 26 subjects performed the same word judgment tasks and their brain event-related potentials (ERPs) were measured for a period of 800 ms after the onset of the task stimulus. The amplitudes of the early component of ERP around 176 ms (P2) elicited by unattended fearful faces over frontal-central-parietal recording sites were significantly larger than those elicited by unattended neutral faces while performing the word structure judgment task. Together, the findings of the two experiments indicated that unattended fearful faces captured significantly more attention resources than unattended neutral faces on a low cognitive load task, but not on a high cognitive load task. It was concluded that fearful faces could automatically capture attention if residues of attention resources were available under the unattended condition.
As basic research on autonomous systems in mining fields, a model-based task planning method for loading operation is proposed. Models of the loader and of the pile of fragmented ore have been developed. The pile model represents the shape and volume of the pile, the changes in shape and volume caused by scooping, and the falling behavior that accompanies scooping. The model has a simple structure for faster
Airborne and surface lead exposures were evaluated for construction trade groups at a previously deleaded bridge renovation site in the midwestern United States. Although all lead-based paint should have been removed, old layers of leaded paint were still present on some sections of the bridge. Ironworkers performing metal torch cutting had the highest exposures (188 microg/m3), followed by workers engaged in clean-up operations and paint removal (p < 0.001). Respirators were most frequently worn by workers with the greatest lead exposures; however, laborers performing clean-up operations had exposures to lead dust of 43 microg/m3 and often wore no respiratory protection. Wipe samples revealed that almost all contractor vehicles were contaminated with lead. Heavy equipment operators with low airborne lead exposure had the highest levels of surface contamination in personal vehicles (3,600 microg/m2). Laborers cleaning structural steel with compressed air and ironworkers exposed to lead fumes from cutting had the highest concentrations of lead dust on clothing (mean 4,766 microg/m2). Handwashing facilities were provided, but were infrequently used. No separate clothes changing facility was available at the site. The potential for "take-home" contamination was high, even though this site was thought to be relatively free of lead. Construction contractors and their workers need to be aware that previous deleading of a site may not preclude exposure to significant amounts of lead. PMID:11192213
Johnson, J C; Reynolds, S J; Fuortes, L J; Clarke, W R
This task will address EPA's need to better understand the variability in personal exposure to air pollutants for the purpose of assessing what populations are at risk for adverse health outcomes due to air pollutant exposures. To improve our understanding of exposures to air po...
Digital radiographic imaging systems, such as those using photostimulable storage phosphor, amorphous selenium, amorphous silicon, CCD, and MOSFET technology, can produce adequate image quality over a much broader range of exposure levels than that of screen/film imaging systems. In screen/film imaging, the final image brightness and contrast are indicative of over- and underexposure. In digital imaging, brightness and contrast are often determined entirely by digital postprocessing of the acquired image data. Overexposures and underexposures are not readily recognizable. As a result, patient dose has a tendency to gradually increase over time after a department converts from screen/film-based imaging to digital radiographic imaging. The purpose of this report is to recommend a standard indicator which reflects the radiation exposure that is incident on a detector after every exposure event and that reflects the noise levels present in the image data. The intent is to facilitate the production of consistent, high quality digital radiographic images at acceptable patient doses. This should be based not on image optical density or brightness but on feedback regarding the detector exposure, provided and actively monitored by the imaging system. A standard beam calibration condition is recommended that is based on RQA5 but uses filtration materials that are commonly available and simple to use. Recommendations on clinical implementation of the indices to control image quality and patient dose are derived from historical tolerance limits and presented as guidelines.
Shepard, S. Jeff; Wang Jihong; Flynn, Michael [Imaging Physics Department 056, Division of Diagnostic Imaging, University of Texas M. D. Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 (United States); and others
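A detector-exposure feedback indicator of the kind recommended above can be sketched as a deviation index in the style of IEC 62494-1 (an assumption for illustration; the report's own index definition is not reproduced in this abstract):

```python
import math

def deviation_index(ei, ei_target):
    """Deviation index in the style of IEC 62494-1:
    DI = 10 * log10(EI / EI_T),
    where EI is the exposure index reported for an acquired image and
    EI_T is the target exposure index for that exam type.  DI = 0 means
    the detector exposure matched its target; roughly, each +1 step is
    a ~26% overexposure and each -1 step a ~21% underexposure.
    """
    return 10.0 * math.log10(ei / ei_target)
```

A QA program can then flag exams whose DI falls outside a tolerance band for review, independent of displayed image brightness, which is exactly the feedback loop the report argues for.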
We investigate a multilayer perceptron (MLP) based hierarchical approach for task adaptation in automatic speech recognition. The system consists of two MLP classifiers in tandem. A well-trained MLP available off-the-shelf is used at the first stage of the hierarchy. A second MLP is trained on the posterior features estimated by the first, but with a long temporal context of around
This study examines the role of task-based conversation in second language (L2) grammatical development, focusing on the short-term effects of both negative feedback and positive evidence on the acquisition of two Japanese structures. The data are drawn from 55 L2 learners of Japanese at a beginning level of proficiency in an Australian tertiary institution. Five different types of
The US EPA's National Exposure Research Laboratory (NERL) has been developing, applying, and evaluating population-based exposure models to improve our understanding of the variability in personal exposure to air pollutants. Estimates of population variability are needed for E...
The human exposure to RF power radiated by cellular base station antennas can be assessed by means of the incident power density averaged over the body. The convenience of adopting this quantity lies in its well-behaved decay away from the antenna. As a consequence, the average power density decay can be predicted using simple formulas, which remain valid even in
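The simple decay formulas mentioned above can be illustrated with the basic far-field free-space estimate; this is a textbook simplification, not necessarily the formula used in the paper:

```python
import math

def incident_power_density(p_tx_w, gain_lin, distance_m):
    """Far-field free-space power density in W/m^2:
    S = P * G / (4 * pi * d^2),
    with transmit power P in watts, linear antenna gain G, and
    distance d in metres.  Body averaging, antenna patterns, and
    reflections are ignored in this sketch.
    """
    return p_tx_w * gain_lin / (4.0 * math.pi * distance_m ** 2)
```

The 1/d² decay is the well-behaved fall-off away from the antenna that makes the body-averaged incident power density convenient for compliance assessment.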
Recent advances in model observers that predict human perceptual performance now make it possible to optimize medical imaging systems for human task performance. We illustrate the procedure by considering the design of a lens for use in an optically coupled digital mammography system. The channelized Hotelling observer is used to model human performance, and the channels chosen are differences of Gaussians. The task performed by the model observer is detection of a lesion at a random but known location in a clustered lumpy background mimicking breast tissue. The entire system is simulated with a Monte Carlo application according to physics principles, and the main system component under study is the imaging lens that couples a fluorescent screen to a CCD detector. The signal-to-noise ratio (SNR) of the channelized Hotelling observer is used to quantify this detectability of the simulated lesion (signal) on the simulated mammographic background. Plots of channelized Hotelling SNR versus signal location for various lens apertures, various working distances, and various focusing places are presented. These plots thus illustrate the trade-off between coupling efficiency and blur in a task-based manner. In this way, the channelized Hotelling SNR is used as a merit function for lens design. PMID:15669625
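The channelized Hotelling SNR used above as a lens-design merit function can be sketched for a toy two-channel case; real DOG channel banks reduce each simulated image to a handful of channel outputs first, and the covariance estimation is more involved than this analytic 2x2 sketch:

```python
import math

def hotelling_snr_2ch(v_signal, v_noise):
    """Channelized Hotelling SNR for two channels:
    SNR^2 = dv^T K^{-1} dv,
    where dv is the mean channel-output difference between
    signal-present and signal-absent images and K is the pooled
    channel-output covariance.  Each argument is a list of (c1, c2)
    channel-output pairs, assumed already computed by the channel bank.
    """
    def mean(vs):
        n = len(vs)
        return (sum(v[0] for v in vs) / n, sum(v[1] for v in vs) / n)

    def cov(vs, m):
        n = len(vs)
        k11 = sum((v[0] - m[0]) ** 2 for v in vs) / (n - 1)
        k22 = sum((v[1] - m[1]) ** 2 for v in vs) / (n - 1)
        k12 = sum((v[0] - m[0]) * (v[1] - m[1]) for v in vs) / (n - 1)
        return k11, k12, k22

    ms, mn = mean(v_signal), mean(v_noise)
    dv = (ms[0] - mn[0], ms[1] - mn[1])
    a11, a12, a22 = cov(v_signal, ms)
    b11, b12, b22 = cov(v_noise, mn)
    # pool the two class covariances
    k11, k12, k22 = (a11 + b11) / 2, (a12 + b12) / 2, (a22 + b22) / 2
    det = k11 * k22 - k12 * k12
    # quadratic form dv^T K^{-1} dv via the analytic 2x2 inverse
    snr2 = (k22 * dv[0] ** 2 - 2 * k12 * dv[0] * dv[1]
            + k11 * dv[1] ** 2) / det
    return math.sqrt(snr2)
```

Sweeping lens aperture or working distance, recomputing the channel outputs, and plotting this SNR reproduces the kind of trade-off curves the paper describes.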
Experiments are described in which observers attempted to perform concurrently two separate visual tasks. Two types of tasks were used: the identification of a T-shaped or L-shaped letter target, and the detection or localization of a texture element of unique orientation (texture target) within a dense texture. Combining these tasks to form various task pairs, performance as a function of stimulus onset asynchrony (SOA) was established separately for each task in a pair. In addition, performance was measured when each task was carried out by itself. When paired, two identification tasks (T or L) on (two) letter targets required a significantly larger SOA than either identification task by itself. This outcome suggests the involvement of serial performance and competition for a limited resource, confirming that letter identification requires attentive fixation. However, when the identification of a central letter target was paired with the localization (upper or lower hemifield) of an eccentric texture target, performance in the pair was comparable to performance of each task by itself. This suggests parallel performance and a lack of conflict over resources. The outcome was similar when the identification of an eccentric letter target was paired with the detection (present or absent) of an eccentric texture target. These results are consistent with the possibility that localization and detection of a textural singularity do not require attentive fixation. PMID:1771133
This paper presents a learning-enhanced market-based task allocation approach for oversubscribed domains. In oversubscribed domains all tasks cannot be completed within the required deadlines due to a lack of resources. We focus specifically on domains where tasks can be generated throughout the mission, tasks can have different levels of importance and urgency, and penalties are assessed for failed commitments.
Edward Gil Jones; M. Bernardine Dias; Anthony Stentz
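A greedy single-round auction conveys the flavor of market-based allocation in an oversubscribed domain; this sketch omits the learning and penalty mechanisms central to the paper's approach, and all task and agent names below are hypothetical:

```python
def market_allocate(tasks, agents):
    """Greedy single-round auction sketch for an oversubscribed setting.

    tasks:  list of (task_id, priority, duration) tuples
    agents: dict mapping agent_id to remaining capacity (time units)

    Tasks are auctioned in decreasing priority order; each goes to the
    agent with the most remaining capacity that can still fit it, and
    tasks that fit nowhere are dropped (the oversubscribed case).
    """
    allocation, dropped = {}, []
    for task_id, priority, duration in sorted(tasks, key=lambda t: -t[1]):
        # "bid" = remaining capacity; winner is the best-resourced agent
        winner = max(agents, key=lambda a: agents[a])
        if agents[winner] >= duration:
            agents[winner] -= duration
            allocation[task_id] = winner
        else:
            dropped.append(task_id)
    return allocation, dropped
```

Ordering by priority ensures that when capacity runs out, it is the least important commitments that fail, which is the core idea behind penalty-aware oversubscribed allocation.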
A framework called task-based design (TBD) for designing an optimal robot manipulator for a given task is proposed. Only kinematic parameters are considered. An optimal manipulator, one that best performs the given task, is designed using a framework called progressive design, which decomposes the complexity of the task into three steps: kinematic design, planning, and kinematic control. As an
Programmers often have to do many repetitive tasks when using an IDE (integrated development environment). These tasks require them to navigate through many views and dialogs in the same steps and input the same data, which is time-consuming and tedious. In this paper, we present an approach to automatically perform the repetitive tasks by capturing user actions on the IDE
Although many studies have described the L2 learning opportunities created by individual tasks, considerably less research has investigated task-based syllabi and courses (Bruton, 2002; Candlin, 2001; Ellis, 2003; Skehan, 2003). This case study investigated teachers' and learners' reactions to a task-based EFL course at a Thai university. A team…
Assigning tasks to agents is complex, especially in highly dynamic environments. Typical protocol-based approaches for task assignment such as Contract Net have proven their value; however, they may not be flexible enough to cope with continuously changing circumstances. In this paper we study and validate the feasibility of a field-based approach for task assignment in a complex problem
Danny Weyns; Nelis Boucké; Tom Holvoet; Wannes Schols
How does the way we code and control actions influence automatic skill acquisition processes? Wenke and Frensch (2005) showed that instructions can lead participants to code spatial responses based on color. Here, we tested in 3 experiments to what extent response labeling and instruction-based response coding can determine what is learned in implicit sequence learning. Instructions mapped 4 gray shape stimuli to 1 of the 4 keys each in a serial reaction task, referring to the keys in terms of either their color or their spatial location. In Experiments 1 and 2 we found that people in the color instruction conditions used color for action control and acquired sequence knowledge containing color: They were susceptible to irrelevant stimulus colors at transfer and could transfer color sequence knowledge to a new arrangement of response positions and fingers, whereas participants who had received spatial instructions could not. Implicit sequence learning was thus surprisingly flexible. Depending on whether an arbitrary nonspatial response feature was used or not used to explain the stimulus-response mappings, we either found or did not find evidence that this feature became part of action control and sequence learning. Furthermore, Experiment 3 suggested that response position might become part of the sequence knowledge even if instructions do not emphasize this response feature. Together, the findings suggest that implicit sequence learning is based on action control, which in turn strongly, but not entirely, depends on which response features are used to explain the stimulus-response mappings in the instructions. PMID:22545612
Gaschler, Robert; Frensch, Peter A; Cohen, Asher; Wenke, Dorit
Human task performance with imaging sensors is characterized by perception experiments involving ensembles of observers viewing an ensemble of task relevant images from real sensors. Summary statistics from perception experiments are used, along with detailed descriptions of the sensors and early human vision processes to build predictive models such as NV-IPM. Use of these models typically requires knowledge of more than 100 specific parameters regarding the sensor, the viewing conditions, and the task. In this research we seek to do a blind prediction of task performance using task relevant image ensembles and image processing operations that produce statistically similar outputs to those obtained in real human perception experiments. We restrict our investigation to the task of identifying tracked vehicles. The data we seek to replicate through image processing are similarity matrices derived from the confusion matrices of actual perception experiments. This paper updates our work to date examining primarily the correspondence between several image processing approaches and perception data.
Social cognition is fundamentally interpersonal: individuals' behavior and dispositions critically affect their interaction partners' information processing. However, cognitive neuroscience studies, partially because of methodological constraints, have remained largely “perceiver-centric”: focusing on the abilities, motivations, and goals of social perceivers while largely ignoring interpersonal effects. Here, we address this knowledge gap by examining the neural bases of perceiving emotionally expressive and inexpressive social “targets.” Sixteen perceivers were scanned using fMRI while they watched targets discussing emotional autobiographical events. Perceivers continuously rated each target's emotional state or eye-gaze direction. The effects of targets' emotional expressivity on perceivers' brain activity depended on task set: when perceivers explicitly attended to targets' emotions, expressivity predicted activity in neural structures—including medial prefrontal and posterior cingulate cortex—associated with drawing inferences about mental states. When perceivers instead attended to targets' eye-gaze, target expressivity predicted activity in regions—including somatosensory cortex, fusiform gyrus, and motor cortex—associated with monitoring sensorimotor states and biological motion. These findings suggest that expressive targets affect information processing in a manner that depends on perceivers' goals. More broadly, these data provide an early step toward understanding the neural bases of interpersonal social cognition.
The Grid is a heterogeneous and dynamic environment which enables distributed computation. This makes it a technology prone to failures. Some related work uses replication to overcome failures in a set of independent tasks, and in workflow applications, but these approaches do not consider possible resource limitations when scheduling the replicas. In this paper, we focus on the use of task
A chemical exposure assessment was conducted for a cohort mortality study of 6157 chemical laboratory workers employed between 1943 and 1998 at four Department of Energy sites in Oak Ridge, Tennessee, and Aiken, South Carolina. Previous studies of chemical laboratory workers have included members within professional societies where exposure assessment was either limited or not feasible, or chemical processing employees
Scott A. Henn; David F. Utterback; Kathleen M. Waters; Andrea M. Markey; William G. Tankersley
Attention is a neurocognitive mechanism that selects task-relevant sensory or mnemonic information to achieve current behavioral goals. Attentional modulation of cortical activity has been observed when attention is directed to specific locations, features, or objects. However, little is known about how high-level categorization task-set modulates perceptual representations. In the current study, observers categorized faces by gender (male vs. female) or race (Asian vs. Caucasian). Each face was perceptually ambiguous in both dimensions, such that categorization of one dimension demanded selective attention to task-relevant information within the face. We used multivoxel pattern classification (MVPC) to show that task-specific modulations evoke reliably distinct spatial patterns of activity within three face-selective cortical regions (right fusiform face area and bilateral occipital face areas). This result suggests that patterns of activity in these regions reflect not only stimulus-specific (i.e., faces vs. houses) responses, but also task-specific (i.e., race vs. gender) attentional modulation. Furthermore, exploratory whole brain MVPC (using a searchlight procedure) revealed a network of dorsal frontoparietal regions (left middle frontal gyrus, left inferior and superior parietal lobule) that also exhibit distinct patterns for the two task-sets, suggesting that these regions may represent abstract goals during high-level categorization tasks.
Chiu, Yu-Chin; Esterman, Michael; Han, Yuefeng; Rosen, Heather; Yantis, Steven
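The multivoxel pattern classification used above can be illustrated with its simplest instance, a nearest-centroid classifier over voxel activation patterns; actual MVPC studies typically use cross-validated linear classifiers and, for whole-brain maps, a searchlight, so the data layout below is an illustrative assumption:

```python
import math

def nearest_centroid_classify(train, test_pattern):
    """Minimal nearest-centroid sketch of multivoxel pattern
    classification (MVPC): each condition's training patterns (lists
    of voxel activations) are averaged into a centroid, and a held-out
    pattern is assigned to the closest centroid.
    """
    centroids = {}
    for label, patterns in train.items():
        n = len(patterns)
        centroids[label] = [sum(p[i] for p in patterns) / n
                            for i in range(len(patterns[0]))]

    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    return min(centroids, key=lambda lb: dist(centroids[lb], test_pattern))
```

Above-chance classification of the task set (e.g. race vs. gender categorization) from held-out patterns is what licenses the claim that a region's activity patterns carry task-specific information.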
Reward-based scheduling refers to the problem in which there is a reward associated with the execution of a task. In our framework, each real-time task comprises a mandatory and an optional part. The mandatory part must complete before the task's deadline, while a nondecreasing reward function is associated with the execution of the optional part, which can be interrupted at
Hakan Aydin; Rami Melhem; Daniel Mossé; Pedro Mejia-Alvarez
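The mandatory/optional split can be sketched for the special case of linear reward functions, where greedily allocating leftover slack by reward rate is optimal; the paper's framework covers general nondecreasing rewards, which this sketch does not:

```python
def allocate_optional(slack, optional):
    """Greedy slack allocation for linear reward functions.

    After every mandatory part has been scheduled before its deadline,
    'slack' time units remain.  optional is a list of
    (task_id, max_optional_units, reward_per_unit) tuples.  With linear
    (hence nondecreasing) rewards, funding the highest reward-rate
    optional parts first maximizes total reward.
    """
    grant = {}
    for task_id, max_opt, rate in sorted(optional, key=lambda o: -o[2]):
        give = min(slack, max_opt)
        grant[task_id] = give
        slack -= give
        if slack == 0:
            break
    return grant
```

Optional parts that receive no slack are simply interrupted, which is safe precisely because only the mandatory parts carry deadlines.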
We have been developing a task-based service navigation system that offers the user services relevant to the task the user wants to perform. We observed that the tasks likely to be performed in a given situation depend on the user's role, such as businessman or father. To further our research, we constructed a role-ontology
Perseverative behavior characterizes mainly patients with severe psychopathology, but it can also be observed in healthy individuals. The aim of the reported experiment was to investigate a serial addition task that elicits strong perseverative behavior in normal subjects by examining the significance of perseveration in the final step of this addition task (Gardner, 1971) as a function of time availability. The classical serial addition task, which was used in the experiment, consisted of 4 consecutive digit decreases in the added numbers following a constant digit (1,000 + 40 + 1,000 + 30 + 1,000 + 20 + 1,000 + 10) and required an additive calculation. The main questions were whether and how color and time variations could influence perseverative responses in this task and whether memory performance and relevant mathematical knowledge of the participants could have an effect on responses. The sample of subjects participating in the experiment consisted of 300 healthy university students (112 male, 188 female) ranging from 17 to 40 years of age. They were divided into 5 groups of 60 subjects each. A memory digit span and spatial test were administered and relevant scores were taken for each subject of the 5 groups. Obtained results suggest the presence of a strong perseverative error in the final step of the presentation of digits for the large majority of subjects and for all 5 conditions. It seems that time and color changes and the memory span of the participants have no detectable effect on performance on this specific serial addition task.
The dorsal striatum and prefrontal cortex have been implicated in interval timing. We examined whether performance of temporal discrimination tasks is associated with increased neuronal activation in these areas, as revealed by Fos expression, a marker for neuronal activation. In Experiment 1, rats were trained on a discrete-trials temporal discrimination task in which a light (22 cd/m²) was presented for a variable time, t (2.5-47.5 s), after which levers A and B were presented. A response on lever A was reinforced if t < 25 s, and a response on lever B was reinforced if t > 25 s. A second group was trained on a light-intensity discrimination procedure, in which a light of variable intensity, i (3.6-128.5 cd/m²) was presented for 25 s. A response on lever A was reinforced if i < 22 cd/m², and a response on lever B was reinforced if i > 22 cd/m². In Experiment 2, bisection procedures were used to assess temporal (200-800 ms, 22 cd/m²) and light-intensity (3.6-128.5 cd/m², 400 ms) discrimination. The increase in proportional choice of lever B as a function of stimulus duration or intensity conformed to a two-parameter logistic equation. Fos expression in the prefrontal cortex and nucleus accumbens was higher in rats performing temporal discrimination tasks than in those performing light-intensity discrimination tasks, indicating greater neuronal activation in these areas during temporal discrimination tasks. Fos expression in the dorsal striatum did not differ between rats performing temporal and light-intensity discrimination tasks. These results suggest that the prefrontal cortex and nucleus accumbens are involved in temporal discrimination. PMID:21341886
Valencia Torres, L; Olarte Sánchez, C M; Body, S; Fone, K C F; Bradshaw, C M; Szabadi, E
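The two-parameter logistic fit to proportional choice described above can be sketched as follows; the abstract does not give the exact parameterisation, so the bisection point t50 and slope here are illustrative assumptions:

```python
import math

def logistic_choice(t, t50, slope):
    """Two-parameter logistic psychometric function:
    p(B) = 1 / (1 + exp(-(t - t50) / slope)),
    giving the proportional choice of lever B for a stimulus of
    duration (or intensity) t.  t50 is the bisection point, where
    p(B) = 0.5, and slope controls discrimination sharpness.
    """
    return 1.0 / (1.0 + math.exp(-(t - t50) / slope))
```

Fitting t50 and slope separately for the temporal and light-intensity procedures gives comparable indices of discrimination across the two tasks.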
A task-based approach to second language learning and teaching has been advocated by a number of contemporary authors, but has received little attention in terms of program evaluation. This study presents a formative evaluation of a three-year task-based conversation program designed for tertiary students in the Republic of Korea. Three task-based textbooks were produced, which (as with every
A work measurement technique was used to monitor the activities of seven printing press operators. Repeated observations were made to learn workers' tasks and workers' locations in the plant, and a photoionization detector was used to measure the instantaneous solvent concentration in each worker's breathing zone. Location data, analyzed using a computer aided design system, did not show any indication
This paper describes a task-based evaluation methodology appropriate for dialogue systems such as the TRAINS-95 system, where a human and a computer interact and collaborate to solve a given problem. In task-based evaluations, techniques are measured in terms of their effect on task performance measures such as how long it takes to develop a solution using the system, and the quality of the final
The development of a paper-based analytical device (PAD) for assessing personal exposure to particulate metals will be presented. Human exposure to metal aerosols, such as those that occur in the mining, construction, and manufacturing industries, has a significant impact on the health of our workforce, costing an estimated $10B in the U.S. and causing approximately 425,000 premature deaths world-wide each year. Occupational exposure to particulate metals affects millions of individuals in manufacturing, construction (welding, cutting, blasting), and transportation (combustion, utility maintenance, and repair services) industries. Despite these effects, individual workers are rarely assessed for their exposure to particulate metals, due mainly to the high cost and effort associated with personal exposure measurement. Current exposure assessment methods for particulate metals call for an 8-hour filter sample, after which time, the filter sample is transported to a laboratory and analyzed by inductively-coupled plasma (ICP). The time from sample collection to reporting is typically weeks and costs several hundred dollars per sample. To exacerbate the issue, method detection limits suffer because of sample dilution during digestion. The lack of sensitivity hampers task-based exposure assessment, for which sampling times may be tens of minutes. To address these problems, and as a first step towards using microfluidics for personal exposure assessment, we have developed PADs for measurement of Pb, Cd, Cr, Fe, Ni, and Cu in aerosolized particulate matter.
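The filter-based concentration calculation that task-based sampling relies on can be sketched as follows; the shift length, flow rate, and the zero-exposure treatment of unsampled time are illustrative simplifications, not the PAD chemistry itself:

```python
def filter_concentration(mass_ug, flow_l_min, minutes):
    """Airborne concentration (ug/m^3) from a filter sample:
    C = collected mass / sampled air volume,
    with volume converted from litres to m^3 (1 m^3 = 1000 L).
    """
    volume_m3 = flow_l_min * minutes / 1000.0
    return mass_ug / volume_m3

def task_based_twa(samples, shift_min=480):
    """8-hour time-weighted average from task-specific samples.

    samples: list of (concentration_ug_m3, duration_min) per task.
    TWA = sum(Ci * ti) / shift time; unsampled shift time is treated
    as zero exposure, a common but optimistic simplification.
    """
    return sum(c * t for c, t in samples) / shift_min
```

Because task-specific samples can span only tens of minutes, the detection-limit concern above translates directly into a minimum collectable mass for a given flow rate and task duration.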
The need for the establishment of evaluation methods that can measure respective improvements or degradations of ontological models, e.g., those yielded by a precursory ontology population stage, is undisputed. We propose an evaluation scheme that allows one to employ a number of different ontologies and to measure their performance on specific tasks. In this paper we present the
Describes a study that compared animated demonstrations, procedural textual instructions, and demonstrations combined with spoken procedural text to assess the effectiveness of animated demonstrations for users learning HyperCard. Results for accuracy, time on task, user responses, retention, and transfer are discussed, and future research is…
Recent advances in model observers that predict human perceptual performance now make it possible to optimize medical imaging systems for human task performance. We illustrate the procedure by considering the design of a lens for use in an optically coupled digital mammography system. The channelized Hotelling observer is used to model human performance, and the channels chosen are differences of
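The channelized Hotelling observer applies a linear (Hotelling) discriminant to a small vector of channel responses rather than to raw pixels. The sketch below illustrates the computation on simulated 1-D data; the channel widths, signal profile, and white-noise model are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

# Channelized Hotelling observer (CHO) sketch on simulated 1-D images.
# Channel widths, signal shape, and noise level are illustrative only.

rng = np.random.default_rng(0)

def dog_channels(n_pix, n_ch=4):
    """Difference-of-Gaussian channel profiles on a 1-D pixel axis."""
    x = np.linspace(-1.0, 1.0, n_pix)
    widths = 0.05 * 2.0 ** np.arange(n_ch)
    return np.array([np.exp(-x**2 / (2 * w**2)) - np.exp(-x**2 / (2 * (2 * w)**2))
                     for w in widths])            # shape (n_ch, n_pix)

def cho_snr(present, absent, channels):
    """Hotelling detectability SNR computed on channel outputs."""
    v1, v0 = present @ channels.T, absent @ channels.T
    dv = v1.mean(axis=0) - v0.mean(axis=0)        # mean channel difference
    S = 0.5 * (np.cov(v1, rowvar=False) + np.cov(v0, rowvar=False))
    return float(np.sqrt(dv @ np.linalg.solve(S, dv)))

n_pix, n_img = 64, 400
signal = 0.5 * np.exp(-np.linspace(-1, 1, n_pix)**2 / 0.02)
absent = rng.normal(0.0, 1.0, (n_img, n_pix))
present = rng.normal(0.0, 1.0, (n_img, n_pix)) + signal
snr = cho_snr(present, absent, dog_channels(n_pix))
```

The intra-class covariance is pooled across the two classes before inverting, which is the usual estimator when class covariances are assumed equal.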
Researchers spend much time and effort developing measures, including measures of students' conceptual knowledge. In an effort to make such assessments easier to design, the Principled Assessment Designs for Inquiry (PADI) project has developed a framework for designing tasks and, to illustrate its use, has "reverse engineered" several…
This paper discusses a framework for designing online tasks that capitalizes on the possibilities that the Internet and the Web offer for language learning. To present such a framework, we draw from constructivist theories (Brooks and Brooks, 1993) and their application to educational technology (Newby, Stepich, Lehman and Russell, 1996; Jonassen,…
Current technological developments and application-driven demands are bringing us closer to the realization of autonomous multirobot systems performing increasingly complex missions. However, existing methods of distributing mission subcomponents among multirobot teams do not explicitly handle the required complexity and instead treat tasks as simple indivisible entities, ignoring any inherent structure and semantics that such complex
Unmanned vehicles (UVs) are increasingly being employed in civil and military domains, often for operations in dangerous environments. Typically these vehicles require some level of human supervision and therefore a user interface to enable tasking and feedback. Most existing interfaces are specific to the UV and may require significant user training. One potential solution to this is to exploit
The occupational exposure to electric and magnetic fields during various work tasks at seven 110 kV substations in Finland's Tampere region was studied. The aim was to investigate whether the action values (10 kV/m for the E-field and 500 microT for the B-field) of EU Directive 2004/40/EC were exceeded. Electric and magnetic fields were measured during the following work tasks: (1) walking or operating devices on the ground; (2) working from a service platform; (3) working around the power transformer on the ground or using a ladder; and (4) changing a bulb from a man hoist. In work task 2, "working from a service platform," the measured electric field (maximum value 16.6 kV/m) exceeded 10 kV/m in three cases. In the future it will be important to study whether the limit value (10 mA/m2) of Directive 2004/40/EC is exceeded at 110 kV substations. The occupational 500 microT action value of the magnetic flux density (B-field) was not exceeded in any working situation. PMID:20077529
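The pass/fail logic implied by the abstract — comparing task measurements against the Directive's action values — is simple to mechanize. In this sketch only the two action values and the 16.6 kV/m maximum come from the abstract; the task labels and the other readings are hypothetical.

```python
# Screening spot measurements against the action values cited from EU
# Directive 2004/40/EC: 10 kV/m (E-field) and 500 uT (B-field). Only
# those two limits and the 16.6 kV/m service-platform maximum come from
# the abstract; all other readings below are hypothetical examples.

E_ACTION_KV_M = 10.0    # electric field action value, kV/m
B_ACTION_UT = 500.0     # magnetic flux density action value, uT

measurements = [
    # (task, E-field in kV/m, B-field in uT)
    ("ground work", 2.1, 40.0),           # hypothetical
    ("service platform", 16.6, 120.0),    # max E-field reported in study
    ("transformer, ladder", 6.3, 310.0),  # hypothetical
]

def exceedances(records):
    """Return (task, E-exceeded, B-exceeded) for any over-limit task."""
    return [(task, e > E_ACTION_KV_M, b > B_ACTION_UT)
            for task, e, b in records
            if e > E_ACTION_KV_M or b > B_ACTION_UT]

over = exceedances(measurements)
```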
In recent years, several regulatory agencies and professional societies have recommended an occupational exposure limit (OEL) for formaldehyde. This article presents the findings of a panel of experts, the Industrial Health Foundation panel, who were charged to identify an OEL that would prevent irritation. To accomplish this task, they critiqued approximately 150 scientific articles. Unlike many other chemicals, a large amount of data is available upon which to base a concentration-response relationship for human irritation. A mathematical model developed by Kane et al. (1979) for predicting safe levels of exposure to irritants based on animal data was also evaluated. The panel concluded that for most persons, eye irritation clearly due to formaldehyde does not occur until at least 1.0 ppm. Information from controlled studies involving volunteers indicated that moderate to severe eye, nose, and throat irritation does not occur for most persons until airborne concentrations exceed 2.0-3.0 ppm. The data indicated that below 1.0 ppm, if irritation occurs in some persons, the effects rapidly subside due to "accommodation." Based on the weight of evidence from published studies, the panel found that persons exposed to 0.3 ppm for 4-6 h in chamber studies generally reported eye irritation at a rate no different than that observed when persons were exposed to clean air. It was noted that at a concentration of 0.5 ppm (8-h TWA) eye irritation was not observed in the majority of workers (about 80%). Consequently, the panel recommended an OEL of 0.3 ppm as an 8-h time-weighted average (TWA) with a ceiling value (CV) of 1.0 ppm (a concentration not to be exceeded) to avoid irritation. The panel believes that the ACGIH TLV of 0.3 ppm as a ceiling value was unnecessarily restrictive and that this value may have been based on the TLV Committee's interpretation of the significance of studies involving self-reported responses at concentrations less than 0.5 ppm. 
The panel concluded that any occupational or environmental guideline for formaldehyde should be based primarily on controlled studies in humans, since nearly all other studies are compromised by the presence of other contaminants. The panel also concluded that if concentrations of formaldehyde are kept below 0.1 ppm in the indoor environment (where exposures might occur 24 h/d) this should prevent irritation in virtually all persons. The panel could not identify a group of persons who were hypersensitive, nor was there evidence that anyone could be sensitized (develop an allergy) following inhalation exposure to formaldehyde. The panel concluded that there was sufficient evidence to show that persons with asthma respond no differently than healthy individuals following exposure to concentrations up to 3.0 ppm. Although cancer risk was not a topic that received exhaustive evaluation, the panel agreed with other scientific groups who have concluded that the cancer risk of formaldehyde is negligible at airborne concentrations that do not produce chronic irritation. PMID:9055874
Paustenbach, D; Alarie, Y; Kulle, T; Schachter, N; Smith, R; Swenberg, J; Witschi, H; Horowitz, S B
During disasters, aid organizations often respond using the resources of local volunteer members from the affected population who are not only inexperienced, but who additionally take on some of the more psychologically and physically difficult tasks in order to provide support for their community. Although not much empirical evidence exists to justify the claim, it is thought that preparation, training, and organizational support limit (or reduce) a volunteer's risk of developing later psychopathology. In this study, we examined the effects of preparation, training, organizational support, and assigned tasks on the mental health of 506 Indonesian Red Cross volunteers who participated in the response to a massive earthquake in Yogyakarta, Indonesia, in 2006. Controlling for exposure level, the volunteers were assessed for post-traumatic stress disorder (PTSD), anxiety, depression, and subjective health complaints (SHCs) 6, 12, and 18 months post-disaster. Results showed high levels of PTSD and SHCs up to 18 months post-disaster, while anxiety and depression levels remained in the normal range. Higher levels of exposure as well as certain tasks (e.g., provision of psychosocial support to beneficiaries, handling administration, or handing out food aid) made the volunteers more vulnerable. Sense of safety, expressed general need for support at 6 months, and a lack of perceived support from team leaders and the organization were also related to greater psychopathology at 18 months. The results highlight the importance of studying organizational factors. By incorporating these results into future volunteer management programs, the negative effects of disaster work on volunteers can be ameliorated. PMID:23205850
Thormar, Sigridur Bjork; Gersons, Berthold P R; Juen, Barbara; Djakababa, Maria Nelden; Karlsson, Thorlakur; Olff, Miranda
In order to provide services more reliably, intelligent service robots need to consider various factors, such as their surrounding environments, users' changing requirements, and constrained resources. Most intelligent service robots are controlled by a task-based control system, which generates a task plan consisting of a sequence of actions and executes those actions by invoking the corresponding
Ethanol is well known to adversely affect frontal executive functioning, which continues to develop throughout adolescence and into young adulthood. This is also a developmental window in which ethanol is misused by a significant number of adolescents. We examined the effects of acute and chronic ethanol exposure during adolescence on behavioral inhibition and efficiency using a modified water maze task. During acquisition, rats were trained to find a stable visible platform onto which they could escape. During the test phase, the stable platform was converted to a visible floating platform (providing no escape) and a new hidden platform was added in the opposite quadrant. The hidden platform was the only means of escape during the test phase. In experiment 1, adolescent animals received ethanol (1.0g/kg) 30min before each session during the test phase. In experiment 2, adolescent animals received chronic intermittent ethanol (5.0g/kg) for 16 days (PND30 To PND46) prior to any training in the maze. At PND72, training was initiated in the same modified water maze task. Results from experiment 1 indicated that acute ethanol promoted behavioral disinhibition and inefficiency. Experiment 2 showed that chronic intermittent ethanol during adolescence appeared to have no lasting effect on behavioral disinhibition or new spatial learning during adulthood. However, chronic ethanol did promote behavioral inefficiency. In summary, results indicate that ethanol-induced promotion of perseverative behavior may contribute to the many adverse behavioral sequelae of alcohol intoxication in adolescents and young adults. Moreover, the long-term effect of adolescent chronic ethanol exposure on behavioral efficiency is similar to that observed after chronic exposure in humans. PMID:24147077
Acheson, Shawn K; Bearison, Craig; Risher, M Louise; Abdelwahab, Sabri H; Wilson, Wilkie A; Swartzwelder, H Scott
MapReduce is a programming model that enables efficient massive data processing in large-scale computing environments such as supercomputers and clouds. Such large-scale computers employ GPUs for their good peak performance and high memory bandwidth. Since the performance of each job depends on the characteristics of the running application and the underlying computing environment, scheduling MapReduce tasks onto CPU cores and GPU devices
A preliminary Long Duration Exposure Facility (LDEF) Materials Data Base was developed by the LDEF Materials Special Investigation Group (MSIG). The LDEF Materials Data Base is envisioned to eventually contain the wide variety and vast quantity of materials data generated for LDEF. The data is searchable by optical, thermal, and mechanical properties, exposure parameters (such as atomic oxygen flux), and
This study investigated the efficacy of integrating task-based e-mail activities into a process-oriented ESL writing class. In particular, it examined the linguistic characteristics of 132 pieces of e-mail writing by ESL students in tasks that differed in terms of purpose, audience interaction and task structure. The analysis focused on the linguistic features of the students' e-mail writing at different levels,
As inhalation is the most likely route of exposure, the proposed MAC-value is based on an evaluation of the inhalatory data in man and animals. The earliest appearing effect after prolonged exposure of experimental animals (3 hr/day on 4 consecutive days, followed by 10 days without exposure, during 8 months) to 8.5 mg PAN/m3 was hyperaemia of the mucous membrane
This article describes the results and recommendations of the third Cognitive Neuroscience Treatment Research to Improve Cognition in Schizophrenia meeting related to measuring treatment effects on social and affective processing. At the first meeting, it was recommended that measurement development focus on the construct of emotion identification and responding. Five tasks were nominated as candidate measures for this construct via the premeeting web-based survey. Two of the 5 tasks were recommended for immediate translation, the Penn Emotion Recognition Task and the Facial Affect Recognition and the Effects of Situational Context, which provides a measure of emotion identification and responding as well as a related, higher level construct, context-based modulation of emotional responding. This article summarizes the criteria-based, consensus-building analysis of each nominated task that led to these 2 paradigms being recommended as priority tasks for development as measures of treatment effects on negative symptoms in schizophrenia.
Carter, Cameron S.; Barch, Deanna M.; Gur, Ruben; Gur, Raquel; Pinkham, Amy; Ochsner, Kevin
This work in progress presents a novel distributed task allocation method for visual sensor networks based on a computational market. Our proposed method automatically adapts the QoS levels of the individual tasks, depending on the resource requirements and the user-defined interest level of the service. Therefore, we define virtual commodity markets, where decentralized producer agents sell resource shares of nodes
This article presents key steps in the design and analysis of a computer-based problem-solving assessment featuring interactive tasks. The purpose of the assessment is to support targeted instruction for students by diagnosing strengths and weaknesses at different stages of problem-solving. The first focus of this article is the task piloting…
This paper describes a task-based dynamic geometry platform that is able to record student responses in a collective fashion to pre-designed dragging tasks. The platform provides a new type of data and opens up a quantitative dimension to interpret students' geometrical perception in dynamic geometry environments. The platform is capable of…
This study examines the effect of complex performance assessments with multiple embedded tasks on both student performance and retention of knowledge in a senior/graduate-level engineering decision theory class. The research looks at whether a performance-based task covering a subset of content also covered in a traditional cognitive test results in increased learning and knowledge retention beyond the traditional
Time-based prospective memory (PM) has been found to be negatively affected by aging, possibly as a result of declining frontal lobe (FL) function. Despite a clear retrospective component to PM tasks, the medial temporal lobes (MTL) are thought to play only a secondary role in successful task completion. The present study investigated the role of…
Parallel programming on SMP and multi-core architectures is hard. In this paper we present a programming model for those environments based on automatic function level parallelism that strives to be easy, flexible, portable, and performant. Its main trait is its ability to exploit task level parallelism by analyzing task dependencies at run time. We present the programming environment in the
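The model sketched in the record above exploits task-level parallelism by analyzing inter-task data dependencies at run time. The abstract shows no API, so the miniature runtime below is a hypothetical illustration of the general technique only: each submitted task declares what it reads and writes, and tasks execute in parallel waves as soon as everything they read has been produced.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical run-time dependency analysis: a task is runnable once
# every name it reads has been written by an earlier task. Ready tasks
# in each wave run in parallel. This is a sketch of the technique, not
# the paper's actual programming environment.

class TaskRuntime:
    def __init__(self):
        self._tasks = []                          # (fn, reads, writes)

    def submit(self, fn, reads=(), writes=()):
        self._tasks.append((fn, frozenset(reads), frozenset(writes)))

    def run(self):
        produced, pending, order = set(), list(self._tasks), []
        with ThreadPoolExecutor() as pool:
            while pending:
                ready = [t for t in pending if t[1] <= produced]
                if not ready:
                    raise RuntimeError("cyclic or unsatisfiable dependency")
                list(pool.map(lambda t: t[0](), ready))  # one parallel wave
                for fn, _, writes in ready:
                    produced |= writes
                    order.append(fn.__name__)
                pending = [t for t in pending if t not in ready]
        return order

# Usage: task_c reads values produced by task_a and task_b, so it runs last.
log = []
def task_a(): log.append("a")
def task_b(): log.append("b")
def task_c(): log.append("c")

rt = TaskRuntime()
rt.submit(task_a, writes={"x"})
rt.submit(task_b, writes={"y"})
rt.submit(task_c, reads={"x", "y"}, writes={"z"})
order = rt.run()
```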
The first purpose of this study was to identify procedural and heuristic knowledge used when creating web-based instruction. The second purpose of this study was to develop suggestions for improving the Heuristic Task Analysis process, a technique for eliciting, analyzing, and representing expertise in cognitively complex tasks. Three expert…
For purposes of the present study, it was hypothesized that field (in)dependence would introduce systematic variance into Iranian EFL learners' overall and task-specific performance on task-based reading comprehension tests. 1743 freshman, sophomore, junior, and senior students all majoring in English at different Iranian universities and colleges…
An objective procedure was developed and tested to determine the relative difficulty of Air Force jobs. Also investigated were (1) the measurement of task difficulty to allow comparability across specialties, (2) the quantitative appraisal of job demands based on component tasks being performed, and (3) the comparability of job difficulty to job…
This instructional task/competency package is designed to help teachers and administrators in developing competency-based instructional materials for an energy and power course. Part 1 contains a description of the industrial arts program and a course description, instructional task/competency list, and content outline for energy and power. The…
Objective. The objective of this paper is to evaluate the benefits provided by a saliency-based cueing algorithm to normally sighted volunteers performing mobility and search tasks using simulated prosthetic vision. Approach. Human subjects performed mobility and search tasks using simulated prosthetic vision. A saliency algorithm based on primate vision was used to detect regions of interest (ROI) in an image. Subjects were cued to look toward the directions of these ROI using visual cues superimposed on the simulated prosthetic vision. Mobility tasks required the subjects to navigate through a corridor, avoid obstacles and locate a target at the end of the course. Two search task experiments involved finding objects on a tabletop under different conditions. Subjects were required to perform tasks with and without any help from cues. Results. Head movements, time to task completion and number of errors were all significantly reduced in search tasks when subjects used the cueing algorithm. For the mobility task, head movements and number of contacts with objects were significantly reduced when subjects used cues, whereas time was significantly reduced when no cues were used. The most significant benefit from cues appears to be in search tasks and when navigating unfamiliar environments. Significance. The results from the study show that visually impaired people and retinal prosthesis implantees may benefit from computer vision algorithms that detect important objects in their environment, particularly when they are in a new environment.
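A saliency map can be as simple as a center-surround contrast measure, with the cue pointing toward its peak. The sketch below is a toy stand-in for the primate-vision saliency algorithm the study actually used (which the abstract does not specify); everything here is an illustrative assumption.

```python
import numpy as np

# Toy saliency-based cueing: find the most conspicuous image region via
# center-surround contrast and report which way to cue the user. A
# stand-in for the study's primate-vision saliency model, not that model.

def box_blur(img, k):
    """k x k mean filter computed with a padded integral image."""
    p = np.pad(img, k // 2, mode="edge")
    c = np.cumsum(np.cumsum(p, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))               # zero row/col for box sums
    return (c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]) / (k * k)

def cue_direction(img, k=9):
    """Return the (row, col) saliency peak and a left/right cue."""
    saliency = np.abs(img - box_blur(img, k))     # center-surround contrast
    r, c = np.unravel_index(np.argmax(saliency), saliency.shape)
    cue = "right" if c > img.shape[1] // 2 else "left"
    return (int(r), int(c)), cue

img = np.zeros((32, 32))
img[5, 25] = 10.0                                 # bright spot, right of center
peak, cue = cue_direction(img)
```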
Independent tasks under time and resource constraints are scheduled. The method and techniques developed refer to the so-called 'constraints-based analysis' in scheduling problems. This analysis aims to characterize admissible schedules in order to provide...
Cognitive-behavioral therapy (CBT) with exposure and ritual prevention (ERP) is widely accepted as the most effective psychological treatment for obsessive compulsive disorder (OCD). However, the extant literature and treatment manuals cannot fully address all the variations in client presentation, the diversity of ERP tasks, and how to negotiate the inevitable therapeutic challenges that may occur. Within this article, we attempt to address common difficulties encountered by therapists employing exposure-based therapy in areas related to: 1) when clients fail to habituate to their anxiety, 2) when clients misjudge how much anxiety an exposure will actually cause, 3) when incidental exposures happen in session, 4) when mental or covert rituals interfere with treatment, and 5) when clients demonstrate exceptionally high sensitivities to anxiety. The goal of this paper is to bridge the gap between treatment theory and practical implementation issues encountered by therapists providing CBT for OCD. PMID:20405764
Pence, Steven L; Sulkowski, Michael L; Jordan, Cary; Storch, Eric A
To better understand tacit knowledge underlying experienced Web-Based Instruction designers' performance, this study utilized Heuristic Task Analysis, a method developed for eliciting, analyzing, and representing expertise in complex cognitive tasks. Three experts, representing three post-secondary institutions offering online programs, were selected for interviews based on the following criteria: 1) WBI design knowledge and skills indicated by formal training/education and experience
This research demonstrates how women assimilate to benevolent sexism by emphasizing their relational qualities and de-emphasizing their task-related characteristics when exposed to it. Studies 1 (N = 62) and 2 (N = 100) show, with slightly different paradigms and measures, that compared to exposure to hostile sexism, exposure to benevolent sexism increases the extent to which female Dutch college students define themselves in
Manuela Barreto; Naomi Ellemers; Laura Piebinga; Miguel Moya
Pharmaceutical formulations containing natural and/or synthetic lipids are an accepted strategy for potentially improving the oral bioavailability and systemic exposure of poorly water soluble, highly lipophilic drug candidates. For example, lipid-based formulations are commercially available for various drugs including cyclosporine, saquinavir, ritonavir, dutasteride and amprenavir. Consequently, lipid-based systems are often considered when needing to increase drug exposure during pre-clinical drug
William N. Charman; Susan A. Charman; Christopher J. H. Porter
Problem-based learning is becoming a popular approach in science, as it is touted as an effective way to promote active learning and encourage students to develop lifelong learning skills. Problem-based learning can easily be adapted to the high school level and used as a long-term project for a biology laboratory.
Techniques for analysis of speech that use autoregressive (all-pole) modeling approaches are presented here and compared to the generally known Mel-frequency cepstrum based feature extraction. In the paper, first, we focus on several possible applications of modeling speech power spectra that increase the performance of an ASR system, mainly in the case of a large mismatch between training and testing data.
Access Control decisions are based on the authorisation policies defined for a system as well as observed context and behaviour when evaluating these constraints at runtime. Workflow management systems have been recognised as a primary source for defining authorisation policies at workflow design-time, as well as generating context at runtime. This paper analyses recent work in
Christian Wolter; Andreas Schaad; Christoph Meinel
The paper presents a proposal of a task scheduling algorithm for a multi-processor system based on dynamically organised shared memory processor clusters. A cluster contains processors with data caches connected to a data memory module by an internal cluster bus. Each data memory module is also accessible from a global inter-cluster bus that is available to all processors. Execution of tasks
The paper presents proposals of a new architecture and respective task scheduling algorithms for a multi-processor system based on dynamically organised shared memory clusters. The clusters are organised around memory modules placed in a common address space. Each memory module can be accessed through a local cluster bus and a common inter-cluster bus. Execution of tasks in a processor is
A task-based lesson serves as the organizing principle for a university mathematics content course for future elementary teachers. The course, which provides the first semester of a year-long sequence, covers the arithmetic of numbers. The daily classroom activities follow a Japanese-style lesson plan and use tasks developed through a didactical phenomenological analysis. A situated learning perspective frames an understanding of
Psychophysical assessments, such as the maximum acceptable lift, have been used to establish worker capability and set safe load limits for manual handling tasks in occupational settings. However, in military settings, in which task demand is set and capable workers must be selected, subjective measurements are inadequate, and maximal capacity testing must be used to assess lifting capability. The aim of this study was to establish and compare the relationship between maximal lifting capacity and a self-determined tolerable lifting limit, maximum acceptable lift, across a range of military-relevant lifting tasks. Seventy male soldiers (age 23.7 ± 6.1 years) from the Australian Army performed 7 strength-based lifting tasks to determine their maximum lifting capacity and maximum acceptable lift. Comparisons were performed to identify maximum acceptable lift relative to maximum lifting capacity for each individual task. Linear regression was used to identify the relationship across all tasks when the data were pooled. Strong correlations existed between all 7 lifting tasks (r = 0.87-0.96, p < 0.05). No differences were found in maximum acceptable lift relative to maximum lifting capacity across all tasks (p = 0.46). When data were pooled, maximum acceptable lift was equal to 84 ± 8% of the maximum lifting capacity. This study is the first to illustrate the strong and consistent relationship between maximum lifting capacity and maximum acceptable lift for multiple single lifting tasks. The relationship developed between these indices may be used to help assess self-selected manual handling capability through occupationally relevant maximal performance tests. PMID:22643137
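Since pooled maximum acceptable lift averaged 84 ± 8% of maximum lifting capacity, a measured capacity converts directly into a point estimate of a tolerable working load. Only the 0.84 ratio comes from the abstract; the capacity values below are hypothetical.

```python
# The pooled result reports maximum acceptable lift (MAL) at 84 +/- 8%
# of maximum lifting capacity (MLC). The 0.84 ratio is the study's
# pooled figure; the MLC values below are hypothetical examples.

MAL_FRACTION = 0.84   # pooled MAL/MLC ratio from the study

def estimate_mal(mlc_kg):
    """Point estimate of a tolerable working load from measured MLC."""
    return MAL_FRACTION * mlc_kg

loads = {mlc: round(estimate_mal(mlc), 1) for mlc in (40.0, 55.0, 70.0)}
```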
Savage, Robert J; Best, Stuart A; Carstairs, Greg L; Ham, Daniel J
This article explores theoretical and methodological issues associated with task-based interviews conducted with pairs of children. We explore different approaches to interviews from sociological, psychological and subject-based perspectives. Our interviews, concerning mathematical questions and carried out with pairs of 10 and 11-year-olds, are…
Knowledge-based systems are advanced systems for representing complex problems that use artificial intelligence to solve these problems. They incorporate a database of expert knowledge designed to facilitate the knowledge retrieval in response to specific queries, along with learning. In this paper we present a knowledge based system designed to increase the efficiency of the tasks executed by a robot located
Information regarding the distribution of volatile organic compound (VOC) concentrations and exposures is scarce, and there have been few, if any, studies using population-based samples from which representative estimates can be derived. This study characterizes distributions of personal exposures to ten different VOCs in the U.S. measured in the 1999-2000 National Health and Nutrition Examination Survey (NHANES). Personal VOC exposures were collected for 669 individuals over 2-3 days, and measurements were weighted to derive national-level statistics. Four common exposure sources were identified using factor analyses: gasoline vapor and vehicle exhaust, methyl tert-butyl ether (MTBE) as a gasoline additive, tap water disinfection products, and household cleaning products. Benzene, toluene, ethyl benzene, xylenes, chloroform, and tetrachloroethene were fit to log-normal distributions with reasonably good agreement to observations. 1,4-Dichlorobenzene and trichloroethene were fit to Pareto distributions, and MTBE to a Weibull distribution, but agreement was poor. However, distributions that attempt to match all of the VOC exposure data can lead to incorrect conclusions regarding the level and frequency of the higher exposures. Maximum Gumbel distributions gave generally good fits to extrema; however, they could not fully represent the highest exposures of the NHANES measurements. The analysis suggests that complete models for the distribution of VOC exposures require an approach that combines standard and extreme value distributions, and that carefully identifies outliers. This is the first study to provide national-level and representative statistics regarding VOC exposures, and its results have important implications for risk assessment and probabilistic analyses. PMID:18378311
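The study's conclusion — a standard distribution for the body of the data plus an extreme-value distribution for the tail — can be sketched numerically. The code below fits a log-normal by maximum likelihood and a maximum Gumbel distribution (by the method of moments) to block maxima; the data are simulated, not NHANES measurements.

```python
import numpy as np

# Two-part modeling sketch echoing the paper: a log-normal fit for the
# body of an exposure distribution plus a (maximum) Gumbel fit to block
# maxima for the extreme tail. Simulated data, not NHANES measurements.

rng = np.random.default_rng(1)
exposures = rng.lognormal(mean=0.5, sigma=1.0, size=2000)  # e.g., ug/m3

# Log-normal maximum-likelihood fit: mu/sigma of the log exposures.
logs = np.log(exposures)
mu_hat, sigma_hat = logs.mean(), logs.std(ddof=1)

# Method-of-moments maximum-Gumbel fit to maxima of 50-sample blocks.
block_max = exposures.reshape(-1, 50).max(axis=1)
gum_scale = block_max.std(ddof=1) * np.sqrt(6) / np.pi
gum_loc = block_max.mean() - 0.5772 * gum_scale   # Euler-Mascheroni constant

# 99th-percentile estimates from the body fit and from the extremes fit.
q99_body = np.exp(mu_hat + 2.326 * sigma_hat)          # log-normal quantile
q99_max = gum_loc - gum_scale * np.log(-np.log(0.99))  # Gumbel quantile
```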
Jia, Chunrong; D'Souza, Jennifer; Batterman, Stuart
This study assessed the possibility of training three people with cognitive impairments using an augmented reality (AR)-based task prompting system. Using AR technology, the system provided picture cues, identified incorrect task steps on the fly, and helped users make corrections. Based on a multiple baseline design, the data showed that the three participants considerably increased their target response, which improved their vocational job skills during the intervention phases and enabled them to maintain the acquired job skills after intervention. The practical and developmental implications of the results are discussed. PMID:23880030
We seek the best possible performance of the Rayleigh task in which one must decide whether a perceived object is a pair of Gaussian-blurred points or a blurred line. Two Bayesian reconstruction algorithms are used, the first based on a Gaussian prior-probability distribution with a nonnegativity constraint and the second based on an entropic prior. In both cases, the reconstructions are found that maximize the posterior probability. We compare the performance of the Rayleigh task obtained with two decision variables, the logarithm of the posterior probability ratio and the change in the mean-squared deviation from the reconstruction. The method of evaluation is based on the results of a numerical testing procedure in which the stated discrimination task is carried out on reconstructions of a randomly generated sequence of images. The ability to perform the Rayleigh task is summarized in terms of a discrimination index that is derived from the area under the receiver-operating characteristic (ROC) curve. We find that the use of the posterior probability does not result in better performance of the Rayleigh task than the mean-squared deviation from the reconstruction.
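The last evaluation step, turning an area under the ROC curve into a discrimination index, can be sketched as follows. The common convention d_A = sqrt(2) * Phi^{-1}(AUC) is assumed, and the decision-variable samples below are invented for illustration; they are not values from the study.

```python
from statistics import NormalDist

def discrimination_index(auc):
    """d_A derived from the ROC area, using the identity AUC = Phi(d_A / sqrt(2))."""
    return 2 ** 0.5 * NormalDist().inv_cdf(auc)

def empirical_auc(signal_vals, noise_vals):
    """Area under the empirical ROC via the Mann-Whitney pair-counting identity."""
    wins = sum((s > n) + 0.5 * (s == n) for s in signal_vals for n in noise_vals)
    return wins / (len(signal_vals) * len(noise_vals))

# Hypothetical decision-variable samples (e.g. log posterior-probability ratios)
# for signal-present and signal-absent reconstructions.
auc = empirical_auc([1.2, 0.9, 1.5, 1.1], [0.3, 0.8, 0.5, 1.0])
print(round(auc, 4), round(discrimination_index(auc), 3))
```

An AUC of 0.5 maps to d_A = 0 (chance performance), and higher AUC maps monotonically to higher discriminability, which is why the index summarizes the whole ROC in one number.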
The large-scale sharing of task-based functional neuroimaging data has the potential to allow novel insights into the organization of mental function in the brain, but the field of neuroimaging has lagged behind other areas of bioscience in the development of data sharing resources. This paper describes the OpenFMRI project (accessible online at http://www.openfmri.org), which aims to provide the neuroimaging community with a resource to support open sharing of task-based fMRI studies. We describe the motivation behind the project, focusing particularly on how this project addresses some of the well-known challenges to sharing of task-based fMRI data. Results from a preliminary analysis of the current database are presented, which demonstrate the ability to classify between task contrasts with high generalization accuracy across subjects, and the ability to identify individual subjects from their activation maps with moderately high accuracy. Clustering analyses show that the similarity relations between statistical maps have a somewhat orderly relation to the mental functions engaged by the relevant tasks. These results highlight the potential of the project to support large-scale multivariate analyses of the relation between mental processes and brain function. PMID:23847528
Poldrack, Russell A; Barch, Deanna M; Mitchell, Jason P; Wager, Tor D; Wagner, Anthony D; Devlin, Joseph T; Cumba, Chad; Koyejo, Oluwasanmi; Milham, Michael P
A System Architecture for Sensor-based Intelligent Robots (SASIR) is introduced. The system architecture consists of perception, motor, task planner, knowledge-base, user interface, and supervisor modules. SASIR is constructed using a frame data structure, which provides a suitable and flexible scheme for representation and manipulation of the world model, the sensor derived information, as well as for describing the actions required for the execution of a specific task. The experimental results show the basic validity of the general architecture as well as the robust and successful performance of two working systems: (1) the Autonomous Spill Cleaning (ASC) Robotic System, and (2) ROBOSIGHT, which is capable of a range of autonomous inspection and manipulation tasks.
Chen, C.; Trivedi, M.M. [Pacific Bell, San Ramon, CA (United States)]
Apparent Diffusion Coefficient (ADC) of lesions obtained from Diffusion Weighted Magnetic Resonance Imaging is an emerging biomarker for evaluating anti-cancer therapy response. To compute a lesion's ADC, accurate lesion segmentation must be performed. Standard methods are currently used to quantitatively compare lesion segmentation algorithms. However, the end task for these images is accurate ADC estimation, and these standard methods do not evaluate the segmentation algorithms on this task-based measure. Moreover, standard methods rely on the unrealistic assumption that perfectly manually segmented lesions are available. In this paper, we present two methods for quantitatively comparing segmentation algorithms on the above task-based measure; the first method compares them given good manual segmentations from a radiologist, and the second compares them even in the absence of good manual segmentations.
Jha, Abhinav K.; Kupinski, Matthew A.; Rodríguez, Jeffrey J.; Stephen, Renu M.; Stopeck, Alison T.
Objectives: Isocyanate chemicals essential for polyurethane production are widely used industrially, and are increasingly found in consumer products. Asthma and other adverse health effects of isocyanates are well-documented and exposure surveillance is crucial to disease prevention. Hexamethylene diisocyanate (HDI)-specific serum immunoglobulin G (IgG) was evaluated as an exposure biomarker among workers at a US Air Force Air Logistics Center, which includes a large aircraft maintenance facility. Methods: HDI-specific IgG (HDI-IgG) titers in serum samples (n = 74) were measured using an enzyme-linked immunosorbent assay based upon the biuret form of HDI conjugated to human albumin. Information on personal protective equipment (PPE), work location/tasks, smoking, asthma history, basic demographics, and HDI skin exposure was obtained through questionnaire. Results: HDI-specific serum IgG levels were elevated in n = 17 (23%) of the workers studied. The prevalence and/or end-titer of the HDI-IgG was significantly (P < 0.05) associated with specific job titles, self-reported skin exposure, night-shift work, and respirator use, but not atopy, asthma, or other demographic information. The highest titers were localized to specific worksites (C-130 painting), while other worksites (generator painting) had no or few workers with detectable HDI-IgG. Conclusions: HDI-specific immune responses (IgG) provide a practical biomarker to aid in exposure surveillance and ongoing industrial hygiene efforts. The strategy may supplement current air sampling approaches, which do not assess exposures via skin, or variability in PPE use or effectiveness. The approach may also be applicable to evaluating isocyanate exposures in other settings, and may extend to other chemical allergens.
Wisnewski, Adam V.; Stowe, Meredith H.; Nerlinger, Abby; Opare-addo, Paul; Decamp, David; Kleinsmith, Christopher R.; Redlich, Carrie A.
Presents a paradigm for task scheduling and action planning/control of a robotic system. Based on the proposed max-plus algebra model, a robotic task involving both discrete and continuous actions can be scheduled, planned and controlled in a perceptive reference frame. Therefore, task scheduling, which usually deals with discrete types of events, as well as action planning, which usually deals with
Fisher information can be used as a surrogate for task-based measures of image quality based on ideal observer performance. A new and improved derivation of the Fisher information approximation for ideal-observer detectability is provided. This approximation depends only on the presence of a weak signal and does not depend on Gaussian statistical assumptions. This is also not an asymptotic result and therefore applies to imaging, where there is typically only one dataset, albeit a large one. Applications to statistical mixture models for image data are presented. For Gaussian and Poisson mixture models the results are used to connect reconstruction error with ideal-observer detection performance. When the task is the estimation of signal parameters of a weak signal, the ensemble mean squared error of the posterior mean estimator can also be expanded in powers of the signal amplitude. There is no linear term in this expansion, and it is shown that the quadratic term involves a Fisher information kernel that generalizes the standard Fisher information. Applications to imaging mixture models reveal a close connection between ideal performance on these estimation tasks and detection tasks for the same signals. Finally, for tasks that combine detection and estimation, we may also define a detectability that measures performance on this combined task and an ideal observer that maximizes this detectability. This detectability may also be expanded in powers of the signal amplitude, and the quadratic term again involves the Fisher information kernel. Applications of this approximation to imaging mixture models show a relation with the pure detection and pure estimation tasks for the same signals.
The manufacture of novel synthetic chemicals has increased in volume and variety, but often the environmental and health risks are not fully understood in terms of toxicity and, in particular, exposure. While efforts to assess risks have generally been effective when sufficient data are available, the hazard and exposure data necessary to assess risks adequately are unavailable for the vast majority of chemicals in commerce. The US Environmental Protection Agency has initiated the ExpoCast Program to develop tools for rapid chemical evaluation based on potential for exposure. In this context, a model is presented in which chemicals are evaluated based on inherent chemical properties and behaviorally-based usage characteristics over the chemical's life cycle. These criteria are assessed and integrated within a decision analytic framework, facilitating rapid assessment and prioritization for future targeted testing and systems modeling. A case study outlines the prioritization process using 51 chemicals. The results show a preliminary relative ranking of chemicals based on exposure potential. The strength of this approach is the ability to integrate relevant statistical and mechanistic data with expert judgment, allowing for an initial tier assessment that can further inform targeted testing and risk management strategies. PMID:23940664
Mitchell, Jade; Pabon, Nicolas; Collier, Zachary A; Egeghy, Peter P; Cohen-Hubal, Elaine; Linkov, Igor; Vallero, Daniel A
The Dutch Expert Committee on Occupational Standards recommends a health-based occupational exposure limit (HBR-OEL) for halothane of 0.41 mg/m3 (0.05 ppm) as an eight-hour time-weighted average concentration.
Mathematical methods were developed to construct dose and time distributions and their associated risks and threshold values for lethal and non-lethal effects of acute radiation exposure, including mortality and incidence, prodromal vomiting, and agranulocytosis. A new distribution (T-model) was obtained to describe time parameters of acute radiation syndrome such as the latency period, time to onset of vomiting, and time to initiation of agranulocytosis. Based on the dose and time distributions, the parameter translation method was defined using an orthogonal regression, which allows one to solve for these distributions in the case of acute radiation exposure. The assessment of threshold doses was performed for some effects of acute radiation syndrome: for the latency period, ≈6-8 Gy absorbed dose and ≈0.7-0.9 h time to onset of vomiting; and for incidence (agranulocytosis), ≈2-3 Gy absorbed dose and ≈2-3 h time to onset of vomiting. The obtained new formula for assessment of radiation risk is applicable to the time parameters of acute radiation syndrome. PMID:22217591
Osovets, S V; Azizova, T V; Day, R D; Wald, N; Moseeva, M B
Automated interpretation and classification of functional MRI (fMRI) data is an emerging research field that enables the characterization of underlying cognitive processes with minimal human intervention. In this work, we present a method for the automated classification of human thoughts reflected on a trial-based paradigm using fMRI with a significantly shortened data acquisition time (less than one minute). Based on our preliminary experience with various cognitive imagery tasks, six characteristic thoughts were chosen as target tasks for the present work: right-hand motor imagery, left-hand motor imagery, right-foot motor imagery, mental calculation, internal speech/word generation, and visual imagery. These six tasks were performed by five healthy volunteers and functional images were obtained using a T2*-weighted echo planar imaging (EPI) sequence. Feature vectors from activation maps, necessary for the classification of neural activity, were automatically extracted from the regions that were consistently and exclusively activated for a given task during the training process. Extracted feature vectors were classified using the support vector machine (SVM) algorithm. Parameter optimization, using a k-fold cross validation scheme, allowed the successful recognition of the six different categories of administered thought tasks with an accuracy of 74.5% (mean)+/-14.3% (standard deviation) across all five subjects. Our proposed study for the automated classification of fMRI data may be utilized in further investigations to monitor/identify human thought processes and their potential link to hardware/computer control. PMID:19233711
Lee, Jong-Hwan; Marzelli, Matthew; Jolesz, Ferenc A; Yoo, Seung-Schik
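The classify-with-cross-validation pipeline above can be sketched in miniature. A nearest-centroid classifier stands in for the SVM actually used in the study, and the six-class "activation" feature vectors below are synthetic; only the k-fold cross-validation structure mirrors the paper.

```python
import random

random.seed(1)

CLASSES, DIM = 6, 8

def sample(c):
    """Synthetic feature vector for thought-task class c: class template plus noise."""
    return [(1.0 if i == c else 0.0) + random.gauss(0, 0.4) for i in range(DIM)], c

data = [sample(c) for c in range(CLASSES) for _ in range(20)]
random.shuffle(data)

def centroid_fit(train):
    """Per-class mean vectors (a simple stand-in for the SVM used in the study)."""
    sums = {}
    for x, y in train:
        s, n = sums.get(y, ([0.0] * DIM, 0))
        sums[y] = ([a + b for a, b in zip(s, x)], n + 1)
    return {y: [v / n for v in s] for y, (s, n) in sums.items()}

def predict(model, x):
    return min(model, key=lambda y: sum((a - b) ** 2 for a, b in zip(x, model[y])))

# k-fold cross-validation, mirroring the paper's parameter-tuning scheme.
k = 5
fold = len(data) // k
accs = []
for i in range(k):
    test = data[i * fold:(i + 1) * fold]
    train = data[:i * fold] + data[(i + 1) * fold:]
    model = centroid_fit(train)
    accs.append(sum(predict(model, x) == y for x, y in test) / len(test))

print(round(sum(accs) / k, 3))
```

Averaging accuracy over the k held-out folds, rather than reporting a single train/test split, is what lets a study with few subjects report a mean and standard deviation as above.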
In this paper, we develop a new paradigm for access control and authorization management, called task-based authorization controls (TBAC). TBAC models access controls from a task-oriented perspective rather than the traditional subject-object one. Access mediation now involves authorizations at various points during the completion of tasks in accordance with some application logic. By taking a task-oriented view of access control
The paper presents a modeling concept and a supporting runtime environment, which enables running simulation, control and measuring (data processing) tasks on distributed implementation platforms. Its main features: (1) it is scaleable in various application domains; (2) it has a model based system architecture; (3) it has a direct link to a real-time fast prototyping environment; (4) it is open
In this paper we introduce a new algorithm for comput- ing near optimal schedules for task graph problems. In con- trast to conventional approaches for solving those schedul- ing problems, our algorithm is based on the same principles that ants use to find shortest paths between their nest and food sources. Like their natural counterparts, artificial ants cooperate by means
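The ant-colony idea named above can be sketched on a toy layered graph: ants choose edges with probability proportional to pheromone times a cost heuristic, pheromone evaporates each iteration, and ants deposit pheromone inversely proportional to path cost. All constants below are illustrative, not from the paper, and a real task-graph scheduler would also encode precedence constraints.

```python
import random

random.seed(2)

# Toy layered graph: at each of 4 stages, choose edge "a" (cost 1) or "b" (cost 3).
STAGES, CHOICES = 4, ("a", "b")
COST = {"a": 1.0, "b": 3.0}
pheromone = {(s, c): 1.0 for s in range(STAGES) for c in CHOICES}

def build_path():
    """Each ant picks edges with probability ~ pheromone x heuristic (1/cost)."""
    return [random.choices(CHOICES,
                           [pheromone[(s, c)] / COST[c] for c in CHOICES])[0]
            for s in range(STAGES)]

for _ in range(200):                         # colony iterations
    paths = [build_path() for _ in range(10)]
    for edge in pheromone:                   # evaporation
        pheromone[edge] *= 0.9
    for p in paths:                          # deposit, inversely proportional to cost
        total = sum(COST[c] for c in p)
        for s, c in enumerate(p):
            pheromone[(s, c)] += 1.0 / total

best = [max(CHOICES, key=lambda c: pheromone[(s, c)]) for s in range(STAGES)]
print(best)
```

Because cheap paths receive larger deposits and evaporation erodes stale trails, pheromone concentrates on the low-cost edges, which is the mechanism by which real ants converge on short nest-to-food routes.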
This study tested the effect of screen size on the decision making performance of experienced and inexperienced basketball players during a video-based perceptual decision making task. Participants were 13 elite, 25 intermediate, and 34 novice players who viewed 30 structured sequences of basketball games twice for between 4 to 6, once on a 43 cm (17 in.) computer monitor and
A frequent weakness of communicative approaches to foreign language teaching is a neglect of the intercultural dimension. Cultural knowledge is often treated as an addendum which focuses on learning facts about the target country. This article explores whether task-based language teaching (TBLT) can successfully address the intercultural…
Gaze interaction affords hands-free control of computers. Pointing to and selecting small targets using gaze alone is difficult because of the limited accuracy of gaze pointing. This is the first experimental comparison of gaze-based interface tools for small-target (e.g. less than 12 x 12 pixels) point-and-select tasks. We conducted two…
Skovsgaard, Henrik; Mateo, Julio C.; Hansen, John Paulin
Collocation is an aspect of language generally considered arbitrary by nature and problematic to L2 learners, who need collocational competence for effective communication. This study attempts, from the perspective of L2 learners, to have a deeper understanding of collocational use and some of the problems involved, by adopting a task-based…
This paper discusses the role of gender in participation and preferred mode of discussion in a peer-based critical thinking task, using the results from a project comparing the use of online versus face-to-face methods. Fifty-five students (45 females, 10 males) from a Scottish university were instructed to evaluate a research article according to supplied criteria, as part of
This article addresses issues in model building and statistical inference in the context of student modeling. The use of probability-based reasoning to explicate hypothesized and empirical relationships and to structure inference in the context of proportional reasoning tasks is discussed. Ideas are illustrated with an example concerning…
In this research, we applied a transformation to the normal trajectory used to move and track a visual target in a virtual environment, in order to evaluate adaptation to a visual-based sensory motor transformation. The ability to recalibrate internal to external spatial reference frames is important when changing the relationship between the self and the environment. The virtual task was
Teachers can promote long-lasting learning, build higher-order thinking skills, develop individual student accountability, and increase student achievement by incorporating performance learning tasks into the curriculum. In this second edition of "Performance-Based Learning," Sally Berman demonstrates how this model can be modified for learners…
In this article, I examine some of the ideas about task-based language teaching (TBLT) which have emerged over the 17 years of the current editorship of ELTJ, focusing in particular on grammar and vocabulary, and enquiring to what degree these ideas take adequate account of classroom context. Over this period, TBLT scholars have built up a…
Recently, Learning Classifier Systems (LCS) and particularly XCS have arisen as promising methods for classification tasks and data mining. This paper investigates two models of accuracy-based learning classifier systems on different types of classification problems. Departing from XCS, we analyze the evolution of a complete action map as a knowledge representation. We propose an alternative, UCS, which evolves a
Using task-based exercises that required web searches and online activities, this course introduced non-traditional students to the sights and sounds of the German culture and language and simultaneously to computer technology. Through partner work that required negotiation of the net as well as of the language, these adult beginning German…
This paper is about how task-based design actually works in e-shopping web practice. During the past few years, we have applied and refined our design method in various case studies. Evaluating an entire method is very difficult to do, especially in a realistic setting. This paper will therefore not discuss hard research findings but will instead report on our experiences
An exploratory study was designed to describe Internet search behaviors of deaf adolescents who used Internet search engines to complete fact-based search tasks. The study examined search behaviors of deaf high school students such as query formation, query modification, Web site identification, and Web site selection. Consisting of two…
The performance-based assessment was developed to assess students' higher order thinking skills in real-life problem-solving situations in Alberta, Canada. These tasks assess aspects of science that cannot be measured easily by regular paper and pencil tests. The purpose of this document is to provide teachers, administrators, students, and…
Alberta Dept. of Education, Edmonton. Student Evaluation Branch.
The purpose of this document is to provide teachers, administrators, students, and parents with samples of students' performances that exemplify standards in relation to the 1993 Grade 9 Science Performance-Based Assessment Tasks for the province of Alberta, Canada. A sample of 698 randomly selected students from 31 schools did the…
Alberta Dept. of Education, Edmonton. Student Evaluation Branch.
Combining an effective psychological treatment with conventional anxiolytic medication is typically not more effective than unimodal therapy for treating anxiety disorders. However, recent advances in the neuroscience of fear reduction have led to novel approaches for combining psychological therapy and pharmacological agents. Exposure-based treatments in humans partly rely on extinction to reduce the fear response in anxiety disorders. Animal studies have shown that d-cycloserine (DCS), a partial agonist at the glycine recognition site of the glutamatergic N-methyl-D-aspartate receptor, facilitates extinction learning. Similarly, recent human trials have shown that DCS enhances fear reduction during exposure therapy of some anxiety disorders. This article discusses the biological and psychological mechanisms of extinction learning and the therapeutic value of DCS as an augmentation strategy for exposure therapy. Areas of future research will be identified.
Background: The inclusion of family medicine in medical school curricula is essential for producing competent general practitioners. The aim of this study is to evaluate a task-based, community-oriented teaching model of family medicine for undergraduate students in Iraqi medical schools. Methods: An innovative training model in family medicine was developed based upon tasks regularly performed by family physicians providing health care services at the Primary Health Care Centre (PHCC) in Mosul, Iraq. Participants were medical students enrolled in their final clinical year. Students were assigned to one of two groups. The implementation group (28 students) was exposed to the experimental model and the control group (56 students) received the standard teaching curriculum. The study took place at the Mosul College of Medicine and at the Al-Hadba PHCC in Mosul, Iraq, during the academic year 1999–2000. Pre- and post-exposure evaluations comparing the intervention group with the control group were conducted using a variety of assessment tools. Results: The primary endpoints were improvement in knowledge of family medicine and development of essential performance skills. Results showed that the implementation group experienced a significant increase in knowledge and performance skills after exposure to the model and in comparison with the control group. Assessment of the model by participating students revealed a high degree of satisfaction with the planning, organization, and implementation of the intervention activities. Students also highly rated the relevance of the intervention for future work. Conclusion: A model of PHCC-based training in family medicine is essential for all Iraqi medical schools. The model is to be implemented by the various relevant departments until Departments of Family Medicine are established.
Despite its theoretical appeal and research-based support, task-based language teaching (TBLT) continues to have a somewhat limited influence on actual second language teaching practices in many contexts. This study considers the relationship between teacher education and the broader use of TBLT. It investigates the effects of a…
Purpose: To develop a framework for taking the spatial frequency composition of an imaging task into account when determining optimal bin weight factors for photon counting energy sensitive x-ray systems. A second purpose of the investigation is to evaluate the possible improvement compared to using pixel-based weights. Methods: The Fourier-based approach of imaging performance and detectability index d' is applied to pulse height discriminating photon counting systems. The dependency of d' on the bin weight factors is made explicit, taking into account both differences in signal and noise transfer characteristics across bins and the spatial frequency dependency of interbin correlations from reabsorbed scatter. Using a simplified model of a specific silicon detector, d' values for a high and a low frequency imaging task are determined for optimal weights and compared to pixel-based weights. Results: The method successfully identifies bins where a large point spread function degrades detection of high spatial frequency targets. The method is also successful in determining how to downweigh highly correlated bins. Quantitative predictions for the simplified silicon detector model indicate that improvements in the detectability index when applying task-based weights instead of pixel-based weights are small for high frequency targets, but could be in excess of 10% for low frequency tasks where scatter-induced correlation would otherwise degrade detectability. Conclusions: The proposed method makes the spatial frequency dependency of complex correlation structures between bins and their effect on the system detective quantum efficiency easier to analyze and allows optimizing bin weights for given imaging tasks. A potential increase in detectability of double digit percents in silicon detector systems operated at typical CT energies (100 kVp) merits further evaluation on a real system. The method is noted to be of higher relevance for silicon detectors than for cadmium (zinc) telluride detectors.
Bornefalk, Hans [Department of Physics, Royal Institute of Technology, SE-106 91 Stockholm (Sweden)]
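The core optimization can be illustrated at a single spatial frequency: given a per-bin signal difference s and a noise covariance C, the weights w = C^{-1} s maximize d'^2 = s^T C^{-1} s, and correlated or noisier bins are automatically down-weighted relative to equal ("pixel based") weights. The two-bin numbers below are hypothetical, chosen only to show the effect of interbin correlation.

```python
# Per-bin signal difference s and noise covariance C at one spatial frequency
# (hypothetical values; a real system would have one such pair per frequency).
s = [1.0, 0.6]
C = [[1.0, 0.5], [0.5, 0.8]]   # off-diagonal term: reabsorbed-scatter correlation

# Optimal weights w = C^{-1} s, giving d'^2 = s^T C^{-1} s.
det = C[0][0] * C[1][1] - C[0][1] * C[1][0]
Cinv = [[C[1][1] / det, -C[0][1] / det],
        [-C[1][0] / det, C[0][0] / det]]
w = [Cinv[0][0] * s[0] + Cinv[0][1] * s[1],
     Cinv[1][0] * s[0] + Cinv[1][1] * s[1]]
d2_opt = s[0] * w[0] + s[1] * w[1]

# Equal ("pixel based") weights for comparison: d'^2 = (w.s)^2 / (w^T C w).
d2_equal = (s[0] + s[1]) ** 2 / (C[0][0] + 2 * C[0][1] + C[1][1])

print(round(d2_opt, 3), round(d2_equal, 3))   # optimal weighting is never worse
```

Repeating this per frequency and integrating over the task's frequency content is what makes the weights task-dependent: a low-frequency task, where scatter correlation is strongest, gains the most from optimal weighting.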
Web-based Knowledge Gathering (WKG) is a specialized and complex information seeking task carried out by many users on the web, for their various learning and decision-making requirements. We construct a contextual semantic structure by observing the actions of the users involved in the WKG task, in order to gain an understanding of their task and requirements. We also build a knowledge warehouse in the form of a master Semantic Link Network (SLX) that accommodates and assimilates all the contextual semantic structures. This master SLX, which is a socio-contextual network, is then mined to provide contextual inputs to the current users through their agents. We validated our approach through experiments and analyzed the benefits to the users in terms of resource explorations and the time saved. The results are positive enough to motivate us to implement the approach on a larger scale.
It is proposed that computers could be used to examine patients' subjective experience in the face of cancer threat. This study provides initial validation of a computer-based stress task by examining the psychological, autonomic, and endocrine aspects of an individual's subjective experience of cancer threat surrounding mammography screening. A repeated measures design was used. A total of 38 healthy women performed a stress task (pertaining to mammography) and a control task (pertaining to osteoporosis prevention) on separate days during which psychological, autonomic, and endocrine reactions were monitored. Compared with the control task, the stress task induced higher autonomic responses (skin conductance and heart rate variability) and endocrine responses (salivary cortisol) but not psychological distress. Further, both the autonomic (skin conductance) and endocrine responses to cancer threat were moderated by mastery, a trait known to have a stress-buffering effect. Yet such a moderating effect was not observed for psychological indices of stress--that is, mood. Implications for nursing research and interventions are discussed. PMID:17450707
Below we describe the winning system that we built for the KDD Cup 2002 Task 1 competition. Our system is a Rule-based Information Extraction (IE) system. It combines pattern matching, Natural Language Processing (NLP) tools, semantic constraints based on the domain and the specific task, and a post-processing stage for making the final curation decision based on the various evidence
Yizhar Regev; Michal Finkelstein-Landau; Ronen Feldman; Maya Gorodetsky; Xin Zheng; Samuel Levy; Rosane Charlab; Charles Lawrence; Ross A. Lippert; Qing Zhang; Hagit Shatkay
A large proportion of vocabulary is acquired incidentally from written contexts. However, in text-based studies promoting generative processing, it is not clear if, or to what extent, generation influences incidental vocabulary learning. This study examined the effects of text-based tasks and background knowledge (prior vocabulary knowledge and a disposition to use generative learning tactics when tackling new vocabulary) on incidental
In this paper, we describe the results of our programmatic research efforts aimed at investigating the use of interactive computer-based training technology to support knowledge acquisition and integration for complex task training environments. We present the theoretical rationale for our efforts and briefly describe the successive iterations of our investigation. Based upon the significant findings in our studies, we then present, within the
Haydee M. Cuevas; Stephen M. Fiore; Clint A. Bowers; Eduardo Salas
Recent developments in low-noise, large-area CCD detectors have renewed interest in radiographic systems that use a lens to couple light from a scintillation screen to a detector. The lenses for this application must have very large numerical apertures and high spatial resolution over a FOV. This paper expands on our earlier work by applying the principles of task-based assessment of image quality to development of meaningful figures of merit for the lenses. The task considered in this study is detection of a lesion in a mammogram, and the figure of merit used is the lesion detectability, expressed as a task-based signal-to-noise ratio (SNR), for a channelized Hotelling observer (CHO). As in the previous work, the statistical model accounts for the random structure in the breast, the statistical properties of the scintillation screen, the random coupling of light to the CCD, the detailed structure of the shift-variant lens point spread function (PSF), and Poisson noise of the X-ray flux. The lenses considered range from F/0.9 to F/1.2. All yield nominally the same spot size at a given field. Among the F/0.9 lenses, some of them were designed by conventional means for high resolution and some for high contrast, and the shapes of the PSF differ considerably. The results show that excessively large lens numerical apertures do not improve the task-based SNR but dramatically increase the optics fabrication cost. Contrary to common wisdom, high-contrast designs have higher task-based SNRs than high-resolution designs when the signal is small. Additionally, we constructed a merit function to successfully tune the lenses to perform equally well anywhere in the FOV.
Chen, Liying; Foo, Leslie D.; Cortesi, Rebecca L.; Thompson, Kevin P.; Barrett, Harrison H.
We consider the problem of classification of imaginary motor tasks from electroencephalography (EEG) data for brain-computer interfaces (BCIs) and propose a new approach based on hidden conditional random fields (HCRFs). HCRFs are discriminative graphical models that are attractive for this problem because they (1) exploit the temporal structure of EEG; (2) include latent variables that can be used to model different brain states in the signal; and (3) involve learned statistical models matched to the classification task, avoiding some of the limitations of generative models. Our approach involves spatial filtering of the EEG signals and estimation of power spectra based on autoregressive modeling of temporal segments of the EEG signals. Given this time-frequency representation, we select certain frequency bands that are known to be associated with execution of motor tasks. These selected features constitute the data that are fed to the HCRF, parameters of which are learned from training data. Inference algorithms on the HCRFs are used for the classification of motor tasks. We experimentally compare this approach to the best performing methods in BCI competition IV as well as a number of more recent methods and observe that our proposed method yields better classification accuracy. PMID:22414728
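The feature-extraction step described above (power spectra from autoregressive modeling of EEG segments) can be sketched with a small Yule-Walker estimator. The sampling rate, model order, and the synthetic 10 Hz "mu rhythm" signal below are illustrative assumptions, not the authors' exact pipeline:

```python
import numpy as np

def ar_psd(x, order=8, n_freq=64, fs=250.0):
    """Yule-Walker AR power spectrum estimate of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    # Biased autocorrelation estimates r[0..order].
    r = np.array([x[: n - k] @ x[k:] / n for k in range(order + 1)])
    # Yule-Walker equations: R a = -r[1:], with R[i, j] = r[|i - j|].
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, -r[1:])
    sigma2 = r[0] + a @ r[1:]           # driving-noise variance
    # PSD = sigma^2 / |1 + sum_k a_k e^{-2*pi*i*f*k/fs}|^2.
    freqs = np.linspace(0.0, fs / 2.0, n_freq)
    z = np.exp(-2j * np.pi * np.outer(freqs / fs, np.arange(1, order + 1)))
    return freqs, sigma2 / np.abs(1 + z @ a) ** 2

# Toy "EEG" segment: a 10 Hz mu-band oscillation in noise, fs = 250 Hz.
rng = np.random.default_rng(1)
t = np.arange(500) / 250.0
x = np.sin(2 * np.pi * 10.0 * t) + 0.5 * rng.standard_normal(500)
freqs, psd = ar_psd(x)
peak = float(freqs[np.argmax(psd)])
print(f"spectral peak near {peak:.1f} Hz")
```

Band powers read off such spectra (e.g., mu and beta bands) would then form the feature vector fed to the classifier.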
Recovery of locomotor function was investigated in seven cosmonauts exposed to microgravity for 6 months. Crew members executed a locomotor task with visual cues (eyes open, EO) and without them (eyes closed, EC). The locomotor task consisted of ascending a two-step staircase, jumping down from a 30-cm high platform, and finally walking 4 m in the straight-ahead direction. Subjects were tested
BACKGROUND: In recent years, cleaning has been identified as an occupational risk because of an increased incidence of reported respiratory effects, such as asthma and asthma-like symptoms among cleaning workers. Due to the lack of systematic occupational hygiene analyses and workplace exposure data, it is not clear which cleaning-related exposures induce or aggravate asthma and other respiratory effects. Currently, there
Anila Bello; Margaret M Quinn; Melissa J Perry; Donald K Milton
Docosahexaenoic acid (22:6n-3, DHA) is important for optimal infant central nervous system development and Pb exposure during development can produce neurological deficits. The effect of Pb exposure on behavioral deficits and evaluation whether n-3 fatty acid deficiency exacerbates these effects wa...
The study reported in this article aimed to investigate the way working memory capacity (WMC) interacts with careful online planning--a task-based implementation variable--to affect second language (L2) speech production. This issue is important to teachers, because it delves into one of the possible task-based implementation variables and thus…
BACKGROUND: The inclusion of family medicine in medical school curricula is essential for producing competent general practitioners. The aim of this study is to evaluate a task-based, community oriented teaching model of family medicine for undergraduate students in Iraqi medical schools. METHODS: An innovative training model in family medicine was developed based upon tasks regularly performed by family physicians providing
On January 29, 2001, President George W. Bush issued two executive orders related to faith-based and community organizations. The first established a base of operations within the White House for such initiatives, and the second established centers within various cabinet- level departments, including the Department of Justice (DOJ). As the latter's website notes, "The Task Force's purpose is to promote good works by neighbors, particularly in the areas of juvenile delinquency, prisoners and their families, victims of crime, domestic violence, and drug addiction/treatment/prevention." Visitors to the site can learn about funding opportunities administered by the DOJ and also read some of its publications, such as "Moving Beyond the Walls: Faith and Justice Partnerships Working for High Risk Youth". Interested parties may also want to look at the Task Force's FAQ section and sign up to receive email updates.
... Claims based on exposure to ionizing radiation. 3.311 Section 3.311 Pensions... Claims based on exposure to ionizing radiation. (a) Determinations of exposure...disease is a result of exposure to ionizing radiation in service, an assessment will be...
Background The instantaneous rate of change of alcohol exposure (slope) may contribute to changes in measures of brain function following administration of alcohol that are usually attributed to breath alcohol concentration (BrAC) acting alone. To test this proposition, a 2-session experiment was designed in which carefully-prescribed, constant-slope trajectories of BrAC intersected at the same exposure level and time since the exposure began. This paper presents the methods and limitations of the experimental design. Methods Individualized intravenous infusion rate profiles of 6% ethanol that achieved the constant-slope trajectories for an individual were precomputed using a physiologically-based pharmacokinetic model. Adjusting the parameters of the model allowed each infusion profile to account for the subject's ethanol distribution and elimination kinetics. Sessions were conducted in randomized order and made no use of feedback of BrAC measurements obtained during the session to modify the pre-calculated infusion profiles. In one session, an individual's time course of exposure, BrAC(t), was prescribed to rise at a constant rate of 6.0 mg% per min until it reached 68 mg% and then descend at -1.0 mg% per min; in the other, to rise at a rate of 3.0 mg% per min. The 2 exposure trajectories were designed to intersect at a BrAC(t=20 min) = 60 mg% at an experimental time of 20 minutes. Results Intersection points for 54 of 61 subjects were within prescribed deviations (range of ± 3 mg% and ± 4 min from the nominal intersection point). Conclusion Results confirmed the feasibility of applying the novel methods for achieving the intended time courses of the BrAC, with technical problems limiting success to 90% of the individuals tested.
Plawecki, Martin H.; Zimmermann, Ulrich S.; Vitvitskiy, Victor; Doerschuk, Peter C.; Crabb, David; O'Connor, Sean
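The two prescribed trajectories can be checked with simple arithmetic. The sketch below is a piecewise-linear idealization of the design geometry only (it ignores the pharmacokinetic modeling that actually drives the infusions):

```python
# Piecewise-linear idealization of the two prescribed BrAC trajectories:
# the steep one rises at 6.0 mg% per min to 68 mg%, then falls at
# 1.0 mg% per min; the shallow one rises steadily at 3.0 mg% per min.
def brac_steep(t_min):
    t_peak = 68.0 / 6.0                 # minute at which 68 mg% is reached
    if t_min <= t_peak:
        return 6.0 * t_min
    return 68.0 - 1.0 * (t_min - t_peak)

def brac_shallow(t_min):
    return 3.0 * t_min

# Nominal intersection: BrAC ~ 60 mg% at t = 20 min; the two idealized
# curves agree there to well within the paper's +/- 3 mg% tolerance.
print(brac_steep(20.0), brac_shallow(20.0))
```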
We discuss how a task-based situational market segmentation may be applied to on-line newspapers, distinguishing between fact finding, information gathering and browsing. During a period of four weeks we had 41 users keep a diary and recorded their surfing behavior on different on-line newspapers. The results of a Naive Bayes classification with feature selection indicate that content-related attributes such as
An experiment was run in which elderly and younger people used a keyboard editor and a simulated listening typewriter to compose letters. Performance was measured and participants rated the systems used. Our general conclusions were as follows: There are no major differences in performance between elderly computer users and their younger counterparts in carrying out a computer-based composition task. Elders appear to be
We present and evaluate an implemented system with which to rapidly and easily build intelligent software agents for Web-based tasks. Our design is centered around two basic functions: ScoreThisLink and ScoreThisPage. If given highly accurate such functions, standard heuristic search would lead to efficient retrieval of useful information. Our approach allows users to tailor our system's behavior by
The perceived robustness of multi-agent systems is claimed to be one of the great benefits of distributed control, but centralised control dominates in space applications. We propose the use of market-based control to allocate tasks in a distributed satellite system. The use of an artificial currency allows us to take the capabilities, energy levels and location of individual satellites, as
Johannes van der Horst; Jason Noble; Adrian Tatnall
Load balancing is a key concern when developing parallel and distributed computing applications. The emergence of computational grids extends this problem, where issues of cross-domain and large-scale scheduling must also be considered. In this work an agent-based grid management infrastructure is coupled with a performance-driven task scheduler that has been developed for local grid load balancing. Each grid scheduler utilises
Junwei Cao; Daniel P. Spooner; Stephen A. Jarvis; Subhash Saini; Graham R. Nudd
Reduction of contour error is an important issue in contour-following applications. One of the common approaches to this problem is to design a controller based on contour error information. However, for the free-form contour-following tasks, there is a lack of effective algorithms for calculating contour errors in real time. To deal with this problem, this paper proposes a real-time contour
The coupling of dosimetry measurements and modeling represents a promising strategy for deciphering the relationship between chemical exposure and disease outcome. To support the development and implementation of biological monitoring programs, quantitative technologies for measuring xenobiotic exposure are needed. The development of portable nanotechnology-based electrochemical sensors has the potential to meet the needs for low cost, rapid, high-throughput and ultrasensitive detectors for biomonitoring an array of chemical markers. Highly selective electrochemical (EC) sensors capable of pM sensitivity, high-throughput and low sample requirements (<50uL) are discussed. These portable analytical systems have many advantages over currently available technologies, thus potentially representing the next-generation of biomonitoring analyzers. This manuscript highlights research focused on the development of field-deployable analytical instruments based on EC detection. Background information and a general overview of EC detection methods and integrated use of nanomaterials in the development of these sensors are provided. New developments in EC sensors using various types of screen-printed electrodes, integrated nanomaterials, and immunoassays are presented. Recent applications of EC sensors for assessing exposure to pesticides or detecting biomarkers of disease are highlighted to demonstrate the ability to monitor chemical metabolites, enzyme activity, or protein biomarkers of disease. In addition, future considerations and opportunities for advancing the use of EC platforms for dosimetric studies are discussed.
Barry, Richard C.; Lin, Yuehe; Wang, Jun; Liu, Guodong; Timchalk, Charles A.
Improving image quality and speed is an endless demand in printer applications. To meet market requirements, in 2003 we launched the world's first laser printer (DocuColor 1256 GA) to introduce 780-nm single-mode 8×4 VCSEL arrays in the light exposure system. The DocuColor 1256 GA features 2400 dots per inch (dpi) resolution, the highest in the industry, and a speed of 50 pages per minute (ppm). A VCSEL array design has the advantage that it can increase the pixel density and also increase the printing speed by simultaneously scanning 32 beams onto the photoconductor in the exposure process. Adopting VCSELs as a light source also reduces the machine's power consumption. The VCSELs are industrially manufactured using an original in-situ monitored oxidation process to control the oxide aperture size. As a result, uniform characteristics with less than 5% variation in both output power and divergence angle are obtained. Special care is also taken in the assembly process to avoid additional degradation in performance and quality. This technology is currently being extended to high-end tandem color machines (2400 dpi, 80 ppm) to capture the on-demand publishing market. This paper covers the key technologies of the VCSEL-based light exposure system as well as the manufacturing process that assures its quality.
Before 12 months of age, infants have difficulties coordinating and sequencing their movements to retrieve an object concealed in a box. This study examined (a) whether young infants can discover effective retrieval solutions and consolidate movement coordination earlier if exposed regularly to such a task and (b) whether different environments, indexed by box transparency, would impact the rate of learning
Materials science, which entails the practices of selecting, testing, and characterizing materials, is an important discipline within the study of matter. This paper examines how third grade students' materials science performance changes over the course of instruction based on an engineering design challenge. We conducted a case study of nine students who participated in engineering design-based science instruction with the goal of constructing a stable, quiet, thermally comfortable model house. The learning outcome of materials science practices was assessed by clinical interviews conducted before and after the instruction, and the learning process was assessed by students' workbooks completed during the instruction. The interviews included two materials selection tasks for designing a sturdy stepstool and an insulated pet habitat. Results indicate that: (1) students significantly improved on both materials selection tasks, (2) their gains were significantly positively associated with the degree of completion of their workbooks, and (3) students who were highly engaged with the workbook's reflective record-keeping tasks showed the greatest improvement on the interviews. These findings suggest the important role workbooks can play in facilitating elementary students' learning of science through authentic activity such as engineering design.
The purpose of this study was to examine the effects of depleted self-control strength on skill-based sports task performance. Sixty-two participants completed the following: a baseline dart-tossing task (20 tosses), with measures of accuracy, reaction time, and myoelectrical activity of the arms taken throughout; a self-control depletion (experimental) or a nondepletion (control) manipulation; and a second round of dart tossing. As hypothesized, participants in the experimental condition had poorer mean accuracy at Round 2 than control condition participants, and a significant decline in accuracy from Round 1 to Round 2. Experimental condition participants also demonstrated poorer consistency in accuracy compared with control condition participants at Round 2 and a significant deterioration in consistency from Round 1 to Round 2. In addition, consistency in reaction time improved significantly for the control group but not for the experimental group. The results of this study provide evidence that ego depletion effects occur in the performance of a skill-based sports task. PMID:23798587
McEwan, Desmond; Martin Ginis, Kathleen A; Bray, Steven R
Prominent computational models describe a neural mechanism for learning from reward prediction errors, and it has been suggested that variations in this mechanism are reflected in personality factors such as trait extraversion. However, although trait extraversion has been linked to improved reward learning, it is not yet known whether this relationship is selective for the particular computational strategy associated with error-driven learning, known as model-free reinforcement learning, vs. another strategy, model-based learning, which the brain is also known to employ. In the present study we test this relationship by examining whether humans' scores on an extraversion scale predict individual differences in the balance between model-based and model-free learning strategies in a sequentially structured decision task designed to distinguish between them. In previous studies with this task, participants have shown a combination of both types of learning, but with substantial individual variation in the balance between them. In the current study, extraversion predicted worse behavior across both sorts of learning. However, the hypothesis that extraverts would be selectively better at model-free reinforcement learning held up among a subset of the more engaged participants, and overall, higher task engagement was associated with a more selective pattern by which extraversion predicted better model-free learning. The findings indicate a relationship between a broad personality orientation and detailed computational learning mechanisms. Results like those in the present study suggest an intriguing and rich relationship between core neuro-computational mechanisms and broader life orientations and outcomes.
Skatova, Anya; Chan, Patricia A.; Daw, Nathaniel D.
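The model-based/model-free balance referred to above is commonly analyzed with a hybrid valuation of the first-stage actions in a two-step task. The following is a minimal sketch of that idea; the weight w, transition probabilities, and values are illustrative assumptions, not quantities fit to the study's data:

```python
import numpy as np

def first_stage_values(q_mf, q_stage2, p_trans, w):
    """Mix model-based and model-free values of the two first-stage actions.

    q_mf     : model-free (TD-learned) values of each first-stage action
    q_stage2 : best attainable value in each second-stage state
    p_trans  : p_trans[a, s] = probability action a leads to state s
    w        : weight on the model-based component (w = 1: purely model-based)
    """
    q_mb = p_trans @ q_stage2           # model-based: expected best outcome
    return w * q_mb + (1.0 - w) * np.asarray(q_mf)

# Illustrative two-step structure: each action has a "common" (0.7) and a
# "rare" (0.3) transition.
p = np.array([[0.7, 0.3],
              [0.3, 0.7]])
q2 = np.array([1.0, 0.2])               # second-stage values (hypothetical)
q_mf = np.array([0.4, 0.6])             # model-free habits (hypothetical)

print(first_stage_values(q_mf, q2, p, w=1.0))   # purely model-based
print(first_stage_values(q_mf, q2, p, w=0.0))   # purely model-free
```

Individual differences in the fitted w are what distinguish predominantly model-based from predominantly model-free learners in analyses of this kind.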
To prepare students for instructive collaboration, it is necessary to have insight into students' psychological needs and interest development. The framework of self-determination theory was used to conduct a field experiment involving 114 students in vocational education. These students followed a practical business course which required they work in small learning groups. During the course, students were asked to complete the Quality of Working in Groups Instrument, an online measure of how strong autonomy, competence, social relatedness, and task interest are fulfilled. SEM showed that students' psychological needs were jointly and uniquely related to task interest over time. The significance of this on-line test for the assessment of interest within project-based education is discussed. PMID:18175501
Minnaert, Alexander; Boekaerts, Monique; de Brabander, Cornelis
Magnetic resonance imaging enables the noninvasive mapping of both anatomical white matter connectivity and dynamic patterns of neural activity in the human brain. We examine the relationship between the structural properties of white matter streamlines (structural connectivity) and the functional properties of correlations in neural activity (functional connectivity) within 84 healthy human subjects both at rest and during the performance of attention- and memory-demanding tasks. We show that structural properties, including the length, number, and spatial location of white matter streamlines, are indicative of and can be inferred from the strength of resting-state and task-based functional correlations between brain regions. These results, which are both representative of the entire set of subjects and consistently observed within individual subjects, uncover robust links between structural and functional connectivity in the human brain.
Hermundstad, Ann M.; Bassett, Danielle S.; Brown, Kevin S.; Aminoff, Elissa M.; Clewett, David; Freeman, Scott; Frithsen, Amy; Johnson, Arianne; Tipper, Christine M.; Miller, Michael B.; Grafton, Scott T.; Carlson, Jean M.
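A toy illustration of the structure-function comparison examined above: correlating a synthetic streamline-count measure against a synthetic functional-connectivity measure. All data here are simulated, and the linear coupling strength is an arbitrary assumption:

```python
import numpy as np

# Simulated structure-function comparison: streamline counts between
# region pairs vs. functional correlations, with an assumed linear coupling.
rng = np.random.default_rng(2)
n_pairs = 200
streamlines = rng.gamma(2.0, 50.0, n_pairs)       # synthetic counts
fc = 0.004 * streamlines + 0.1 * rng.standard_normal(n_pairs)
r = float(np.corrcoef(streamlines, fc)[0, 1])
print(f"structure-function correlation r = {r:.2f}")
```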
The current study assessed the effects of developmental PCB and/or MeHg exposure on an operant task of timing and inhibitory control and determined if amphetamine (AMPH) drug challenges differentially affected performance. Long-Evans rats were exposed to corn oil (control), PCBs alone (1 or 3 mg/kg), MeHg alone (1.5 or 4.5 ppm), the low combination (1 mg/kg PCBs + 1.5 ppm MeHg), or the high combination (3 mg/kg PCBs + 4.5 ppm MeHg) throughout gestation and lactation. An environmentally relevant, formulated PCB mixture was used. Male and female offspring were trained to asymptotic performance on a differential reinforcement of low rates (DRL) operant task as adults. PCB-exposed groups had a lower ratio of reinforced to non-reinforced responses than controls. Groups exposed to MeHg alone were not impaired and the deficits observed in PCB-exposed groups were not seen when PCBs were co-administered with MeHg. AMPH was less disruptive to responding in males receiving PCBs alone, MeHg alone, and 1.0 mg/kg PCB + 1.5 ppm MeHg. Paradoxically, the disruption in responding by AMPH in males given 3.0 mg/kg PCB + 4.5 ppm MeHg did not differ from controls. Exposed females from all treatment groups did not differ from controls in their AMPH response. Overall, the findings suggest that developmental exposure to PCBs can decrease DRL performance. Co-exposure to MeHg seemed to mitigate the detrimental effects of PCBs on performance. The finding that the disruptive effects of AMPH on DRL performance were lessened in some groups of exposed males suggests that alterations in dopaminergic functioning may have a role in behavioral changes seen after perinatal PCB and MeHg exposure.
Sable, Helen J. K.; Eubig, Paul A.; Powers, Brian E.; Wang, Victor C.; Schantz, Susan L.
Should computer-based study tasks use multiple-choice or constructed-response question format? It was hypothesized that a constructed-response study task (CR) with feedback would be superior to multiple-choice study tasks that allowed either single or multiple tries (STF and MTF). Two additional recognition study task treatments were included that required an overt constructed response after feedback (STF+OR and MTF+OR) in order to
This study investigates the effect of Web-based Chemistry Problem-Solving, with the attributes of Web-searching and problem-solving scaffolds, on undergraduate students' problem-solving task performance. In addition, the nature and extent of Web-searching strategies students used and its correlation with task performance and domain knowledge also…
This study investigates the relationship of language proficiency to language production and negotiation of meaning that non-native speakers (NNSs) produced in an online task-based language learning (TBLL) environment. Fourteen NNS-NNS dyads collaboratively completed four communicative tasks, using an online TBLL environment specifically designed…
We present a qualitative comparative analysis of the mathematical concepts elementary school teachers, participating in a content-based professional development course, and high school students, in a pre-calculus course, developed as they worked collaboratively on a conceptually challenging task. These two groups developed some similar and some very different mathematics concepts while working on the same task. Metaphor: We introduce a
In recent theories of event-based prospective memory, researchers have debated what degree of resources are necessary to identify a cue as related to a previously established intention. In order to simulate natural variations in attention, the authors manipulated effort toward an ongoing cognitive task in which intention-related cues were embedded in 3 experiments. High effort toward the ongoing task resulted
The aim of most population-based studies of media is to relate a specific exposure to an outcome of interest. A research program has been developed that evaluates exposure to different components of movies in an attempt to assess the association of such exposure with the adoption of substance use during adolescence. To assess exposure to movie substance use, one must
James D. Sargent; Keilah A. Worth; Michael Beach; Meg Gerrard; Todd F. Heatherton
The potential human health risk(s) from chemical exposure must frequently be assessed under conditions for which adequate human or animal data are not available. The default method for exposure-duration adjustment, based on Haber's rule, C (external exposure concentration) × t (exposure duration) = K (a constant toxic effect), or its ten Berge modification, C^n × t = K, has been criticized for prediction errors.
Jane Ellen Simmons; Marina V. Evans; William K. Boyes
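The duration-adjustment rule named above reduces to one line of arithmetic. A sketch (the concentrations, durations, and exponent below are illustrative, not values from the study):

```python
# Exposure-duration adjustment: Haber's rule C * t = K, and the ten Berge
# modification C**n * t = K. Given a reference scenario, the equitoxic
# concentration at a new duration follows directly.
def adjust_concentration(c_ref, t_ref, t_new, n=1.0):
    """Concentration at t_new equitoxic to c_ref held for t_ref."""
    k = c_ref ** n * t_ref
    return (k / t_new) ** (1.0 / n)

# Haber's rule (n = 1): halving the duration doubles the concentration.
print(adjust_concentration(100.0, 4.0, 2.0))          # 200.0
# ten Berge with n = 2: the same halving raises it by only sqrt(2).
print(adjust_concentration(100.0, 4.0, 2.0, n=2.0))   # ~141.42
```

The choice of n is exactly the point of contention: predictions diverge sharply between n = 1 and larger exponents as durations are extrapolated.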
As a result of the recent recommendations of ICRP 60 and in anticipation of possible regulation on occupational exposure of commercial aircrew, a two-part investigation was carried out over a one-year period to determine the total dose equivalent on representative Canadian-based flight routes. As part of the study, a dedicated scientific measurement flight (using both a conventional suite of powered detectors and passive dosimetry) was used to characterise the complex mixed radiation field and to intercompare the various instrumentation. In the other part of the study, volunteer aircrew carried (passive) neutron bubble detectors during their routine flight duties. From these measurements, the total dose equivalent was derived for a given route with a knowledge of the neutron fraction as determined from the scientific flight and computer code (CARI-LF) calculations. This investigation has yielded an extensive database of over 3100 measurements providing the total dose equivalent for 385 different routes. By folding in flight frequency information and the accumulated flight hours, the annual occupational exposures of 26 flight crew have also been determined. This study has indicated that most Canadian-based domestic and international aircrew will exceed the proposed annual ICRP 60 public limit of 1 mSv.y-1, but will be well below the occupational limit of 20 mSv.y-1. PMID:11542925
Lewis, B J; Tume, P; Bennett, L G; Pierre, M; Green, A R; Cousins, T; Hoffarth, B E; Jones, T A; Brisson, J R
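The dose-reconstruction step the abstract describes, scaling a neutron-only passive measurement by a route's neutron fraction, reduces to a single division. A sketch with hypothetical numbers (the measured dose and fraction below are not from the study):

```python
# Reconstructing a route's total dose equivalent from a neutron-only
# passive measurement and the neutron fraction determined from the
# instrumented flight / CARI-LF calculations.
def total_dose_equivalent(h_neutron_uSv, neutron_fraction):
    """Total route dose from a neutron-only measurement (values hypothetical)."""
    if not 0.0 < neutron_fraction <= 1.0:
        raise ValueError("neutron fraction must lie in (0, 1]")
    return h_neutron_uSv / neutron_fraction

# Illustrative only: 20 uSv neutron dose with a neutron fraction of 0.5
# implies a 40 uSv total dose equivalent for the route.
print(total_dose_equivalent(20.0, 0.5))
```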
|The present study investigated the effects of acute stress exposure on learning performance in humans using analogs of two paradigms frequently used in animals. Healthy male participants were exposed to the cold pressor test (CPT) procedure, i.e., insertion of the dominant hand into ice water for 60 sec. Following the CPT or the control…
|Merrill proposes First Principles of Instruction, including a problem- or task-centered strategy for designing instruction. However, when the tasks or problems are ill-defined or complex, task-centered instruction can be difficult to design. We describe an online task-centered training at a land-grant university designed to train employees to use…
In this study, a troubleshooting task model of the automobile chassis is proposed. It integrates a task implementation method with virtual interactive techniques. Declarative knowledge entities make up the kernel of the task model, making the task more controllable and automatable. Solution of the knowledge entities is directly used to support automotive chassis repair and maintenance. Furthermore, this system allows learners
A figure of merit (FOM) for frequency-domain diffusive imaging (FDDI) is theoretically developed adapting the concept of Hotelling observer signal-to-noise ratio. Different from conventionally used FOMs for FDDI, the newly developed FOM considers diffused intensities, modulation amplitudes, and phases in combination. The FOM applied to Monte Carlo simulations of signal- and background-known-exactly problems shows unique characteristics that are in agreement with findings in the literature. We believe that a task-based assessment using the FOM improves the characterization of FDDI systems and allows for complete system optimization. PMID:23454973
Individuals who report sensitivity to electromagnetic fields often report cognitive impairments that they believe are due to exposure to mobile phone technology. Previous research in this area has revealed mixed results, however, with the majority of research only testing control individuals. Two studies using control and self-reported sensitive participants found inconsistent effects of mobile phone base stations on cognitive functioning. The aim of the present study was to clarify whether short-term (50 min) exposure at 10 mW/m(2) to typical Global System for Mobile Communication (GSM) and Universal Mobile Telecommunications System (UMTS) base station signals affects attention, memory, and physiological endpoints in sensitive and control participants. Data from 44 sensitive and 44 matched-control participants who performed the digit symbol substitution task (DSST), digit span task (DS), and a mental arithmetic task (MA), while being exposed to GSM, UMTS, and sham signals under double-blind conditions were analyzed. Overall, cognitive functioning was not affected by short-term exposure to either GSM or UMTS signals in the current study. Nor did exposure affect the physiological measurements of blood volume pulse (BVP), heart rate (HR), and skin conductance (SC) that were taken while participants performed the cognitive tasks. PMID:19475647
Many Korean workers are exposed to repetitive manual tasks or prolonged poor working postures that are closely related to back pain or symptoms of musculoskeletal disorders. Workers engage in tasks that require not only handling of heavy materials, but also assuming prolonged or repetitive non-neutral work postures. Poor work postures that have been frequently observed in the workplaces of shipbuilding shops, manufacturing plants, automobile assembly lines and farms often require prolonged squatting, repetitive arm raising and wrist flexion and simultaneous trunk flexion and lateral bending. In most manufacturing industries, workers have to assume improper work postures repetitively, several hundred times per day depending on the daily production rate. A series of psychophysical laboratory experiments were conducted to evaluate the postural load at various joints. A postural load assessment system was then developed based on a macro-postural classification scheme. The classification scheme was constructed based on perceived discomfort for various joint motions as well as previous research outcomes. On the basis of the perceived discomfort, postural stress levels for the postures at individual joints were also defined by a ratio scale relative to the standing neutral posture. Laboratory experiments simulating automobile assembly tasks were carried out to investigate the relationship between body-joint and whole-body discomfort. Results showed a linear relationship between the two types of discomfort, with the shoulder and low back postures being the dominant factors in determining the whole-body postural stresses. The proposed method was implemented in a computer software program in order to automate the procedure of analysing postural load and to enhance usability and practical applicability. PMID:16040522
This paper introduces the software life-cycle based V&V (verification and validation) tasks for the KNICS (Korea nuclear instrumentation and control system) project. The objectives of the V&V tasks are mainly to develop the programmable logic controller (PLC) for safety-critical instrumentation and control (I&C) systems, and then to apply the PLC to developing the prototype of the safety-critical software based digital
S. W. Cheon; G. Y. Park; K. H. Cha; J. S. Lee; K. C. Kwon
Factors affecting performance on base-10 tasks were investigated in a series of four studies with a total of 453 children aged 5-7 years. Training in counting-on was found to enhance child performance on base-10 tasks (Studies 2, 3, and 4), while prior knowledge of counting-on (Study 1), trading (Studies 1 and 3), and partitioning (Studies 1 and…
Children's time estimation literature lacks studies comparing prospective and retrospective time estimates of long-lasting ecological tasks, i.e., tasks reflecting children's daily activities. In the present study, children were asked to estimate prospectively or retrospectively how much time they played a video game or read a magazine. Regardless of the task, the results revealed that prospective time estimates were longer than the retrospective ones. Also, time estimates of the video game task were longer, less accurate and more variable than those of the reading task. The results are discussed in the light of the current literature about time estimation of long-lasting ecological tasks.
The aim of most population-based studies of media is to relate a specific exposure to an outcome of interest. A research program has been developed that evaluates exposure to different components of movies in an attempt to assess the association of such exposure with the adoption of substance use during adolescence. To assess exposure to movie substance use, one must measure both viewing time and content. In developing the exposure measure, the study team was interested in circumventing a common problem in exposure measurement, where measures often conflate exposure to media with attention to media. Our aim in this paper is to present a validated measure of exposure to entertainment media, the Beach method, which combines recognition of a movie title with content analysis of the movie for substance use, to generate population-based measures of exposure to substance use in this form of entertainment.
Sargent, James D.; Worth, Keilah A.; Beach, Michael; Gerrard, Meg; Heatherton, Todd F.
Practice, effort and specificity have been proven to be the major principles for gait rehabilitation. The best way to improve performance of a motor task is to execute that specific motor task again and again. To assure the consistency of task-specific repetitive gait rehabilitation training with accuracy, from both the patient and therapist ends for a long training session,
It is well established in the literature that secondary tasks adversely affect driving behavior. Previous research has focused on discovering the general trends by analyzing the average effects of secondary tasks on a population of drivers. This paper conjectures that there may also be individual effects, i.e., different effects of secondary tasks on individual drivers, which may be obscured within
Tulga Ersal; Helen J. A. Fuller; Omer Tsimhoni; Jeffrey L. Stein; Hosam K. Fathy
In this paper, we introduce an agent-based simulation for investigating the impact of social factors on the formation and evolution of task-oriented groups. Task-oriented groups are created explicitly to perform a task, and all members derive benefits from task completion. However, even in cases when all group members act in a way that is locally optimal for task completion, social forces that have mild effects on choice of associates can have a measurable impact on task completion performance. In this paper, we show how our simulation can be used to model the impact of stereotypes on group formation. In our simulation, stereotypes are based on observable features, learned from prior experience, and only affect an agent's link formation preferences. Even without assuming stereotypes affect the agents' willingness or ability to complete tasks, the long-term modifications that stereotypes have on the agents' social network impair the agents' ability to form groups with sufficient diversity of skills, as compared to agents who form links randomly. An interesting finding is that this effect holds even in cases where stereotype preference and skill existence are completely uncorrelated.
Purpose: Since the introduction of clinical x-ray phase-contrast mammography (PCM), a technique that exploits refractive-index variations to create edge enhancement at tissue boundaries, a number of optimization studies employing physical image-quality metrics have been performed. Ideally, task-based assessment of PCM would have been conducted with human readers. These studies have been limited, however, in part due to the large parameter-space of PCM system configurations and the difficulty of employing expert readers for large-scale studies. It has been proposed that numerical observers can be used to approximate the statistical performance of human readers, thus enabling the study of task-based performance over a large parameter-space. Methods: Methods are presented for task-based image quality assessment of PCM images with a numerical observer, the most significant of which is an adapted lumpy background from the conventional mammography literature that accounts for the unique wavefield propagation physics of PCM image formation and will be used with a numerical observer to assess image quality. These methods are demonstrated by performing a PCM task-based image quality study using a numerical observer. This study employs a signal-known-exactly, background-known-statistically Bayesian ideal observer method to assess the detectability of a calcification object in PCM images when the anode spot size and calcification diameter are varied. Results: The first realistic model for the structured background in PCM images has been introduced. A numerical study demonstrating the use of this background model has compared PCM and conventional mammography detection of calcification objects. The study data confirm the strong PCM calcification detectability dependence on anode spot size. These data can be used to balance the trade-off between enhanced image quality and the potential for motion artifacts that comes with use of a reduced spot size and increased exposure time. 
Conclusions: A method has been presented for the incorporation of structured breast background data into task-based numerical observer assessment of PCM images. The method adapts conventional background simulation techniques to the wavefield propagation physics necessary for PCM imaging. This method is demonstrated with a simple detection task.
Zysk, Adam M.; Brankov, Jovan G.; Wernick, Miles N.; Anastasio, Mark A.
This paper describes a computing environment which supports computer-based scientific research work. Key features include support for automatic distributed scheduling and execution and computer-based scientific experimentation. A new flexible and extensible scheduling technique that is responsive to a user's scheduling constraints, such as the ordering of program results and the specification of task assignments and processor utilization levels, is presented. An easy-to-use constraint language for specifying scheduling constraints, based on the relational database query language SQL, is described along with a search-based algorithm for fulfilling these constraints. A set of performance studies shows that the environment can schedule and execute program graphs on a network of workstations as the user requests. A method for automatically generating computer-based scientific experiments is described. Experiments provide a concise method of specifying a large collection of parameterized program executions. The environment achieved significant speedups when executing experiments; for a large collection of scientific experiments an average speedup of 3.4 on an average of 5.5 scheduled processors was obtained.
Ahrens, J.P.; Shapiro, L.G.; Tanimoto, S.L. [Univ. of Washington, Seattle, WA (United States). Dept. of Computer Science and Engineering]
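The constraint-respecting scheduling of program graphs described above can be illustrated, under stated assumptions, by a greedy list scheduler: each task starts as soon as its predecessors finish and a processor is free. This is a generic sketch, not the paper's SQL-based constraint language or search algorithm, and the task graph and durations below are invented:

```python
import heapq

def schedule(tasks, deps, n_procs):
    """Greedy list scheduler. tasks: {name: duration};
    deps: {name: [prerequisites]}. Returns (finish_times, makespan)."""
    indeg = {t: len(deps.get(t, [])) for t in tasks}
    children = {t: [] for t in tasks}
    for t, pres in deps.items():
        for p in pres:
            children[p].append(t)
    # Heap of (earliest start time, task) for tasks whose deps are done.
    ready = [(0.0, t) for t in tasks if indeg[t] == 0]
    heapq.heapify(ready)
    procs = [0.0] * n_procs          # time each processor becomes free
    finish = {}
    while ready:
        est, t = heapq.heappop(ready)
        procs.sort()
        start = max(procs[0], est)   # earliest free processor, after deps
        procs[0] = start + tasks[t]
        finish[t] = procs[0]
        for c in children[t]:
            indeg[c] -= 1
            if indeg[c] == 0:
                heapq.heappush(ready, (max(finish[p] for p in deps[c]), c))
    return finish, max(finish.values())

# Diamond-shaped graph: A precedes B and C, which both precede D.
tasks = {"A": 2, "B": 3, "C": 1, "D": 2}
deps = {"B": ["A"], "C": ["A"], "D": ["B", "C"]}
finish, makespan = schedule(tasks, deps, n_procs=2)
```

Speedup figures like the 3.4 reported above correspond to the serial makespan divided by the parallel one; here the one-processor makespan is 8 versus 7 on two processors.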
In two studies, we investigated how people use base rates and the presence versus the absence of new information to judge which of two hypotheses is more likely. Participants were given problems based on two decks of cards printed with 0-4 letters. A table showed the relative frequencies of the letters on the cards within each deck. Participants were told the letters that were printed on or absent from a card the experimenter had drawn. Base rates were conveyed by telling participants that the experimenter had chosen the deck by drawing from an urn containing, in different proportions, tickets marked either 'deck 1' or 'deck 2'. The task was to judge from which of the two decks the card was most likely drawn. Prior probabilities and the evidential strength of the subset of present clues (computed as 'weight of evidence') were the only significant predictors of participants' dichotomous (both studies) and continuous (Study 2) judgments. The evidential strength of all clues was not a significant predictor of participants' judgments in either study, and no significant interactions emerged. We discuss the results as evidence for additive integration of base rates and the new present information in hypothesis testing. PMID:23560666
Rusconi, Patrice; Marelli, Marco; Russo, Selena; D'Addario, Marco; Cherubini, Paolo
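The additive integration of base rates and present evidence that the study tests can be made concrete with Bayes' rule in odds form: posterior odds equal prior odds times the likelihood ratio, and "weight of evidence" is the log of that ratio. The deck frequencies and ticket proportions below are invented for illustration, not taken from the study's materials:

```python
import math

def posterior_odds(prior_odds, p_clue_h1, p_clue_h2):
    """Bayes' rule in odds form: posterior odds = prior odds x likelihood
    ratio. The log likelihood ratio is the 'weight of evidence'."""
    likelihood_ratio = p_clue_h1 / p_clue_h2
    return prior_odds * likelihood_ratio, math.log(likelihood_ratio)

# Hypothetical numbers: the observed letter appears on 60% of deck-1 cards
# and 20% of deck-2 cards; 30% of the urn tickets are marked 'deck 1'.
prior = 0.3 / 0.7                   # prior odds favouring deck 1
post, woe = posterior_odds(prior, 0.6, 0.2)
```

With these numbers the posterior odds exceed 1, so the present clue overturns the unfavourable base rate in favour of deck 1.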
Despite interest in media effects on sexual behavior, there is no single method for assessing exposure to sexual content in media. This paper discusses the development of sexual content exposure measures based on adolescent respondents' sexual content ratings of titles in television, music, magazines, and video games. We assessed the construct and criterion validity of these exposure measures by examining
Amy Bleakley; Martin Fishbein; Michal Hennessy; Amy Jordan; Ariel Chernin; Robin Stevens
Lung cancer risk estimation in relation to residential radon exposure remains uncertain, partly as a result of imprecision in air-based retrospective radon-exposure assessment in epidemiological studies. A recently developed methodology provides estimates for past radon concentrations and involves measurement of the surface activity of a glass object that has been in a subject's dwellings through the period for exposure assessment.
Frédéric Lagarde; Rolf Falk; Katinka Almrén; Fredrik Nyberg; Helena Svensson; Göran Pershagen
Introduction Integrated teaching and problem-based learning (PBL) are powerful educational strategies. Difficulties arise, however, in their application in the later years of the undergraduate medical curriculum, particularly in clinical attachments. Two solutions have been proposed - the use of integrated clinical teaching teams and time allocated during the week for PBL separate from the clinical work. Both approaches have
RM Harden; J Crosby; MH Davis; PW Howie; AD Struthers
The two predominant arsenic exposure routes are food and water. Estimating the risk from dietary exposures is complicated, owing to the chemical form dependent toxicity of arsenic and the diversity of arsenicals present in dietary matrices. Two aspects of assessing dietary expo...
The experimental no effect level was 20 mg ethyl acrylate/m3. The exposure with slight effects was at 100 mg/m3. The real no effect level is between 20 and 100 mg/m3. Therefore an occupational exposure limit of 20 mg eth-acr/m3 TWA 8 hr is advised with
PURPOSE: This population-based case-control study examined occupational exposure to electromagnetic fields in relation to female breast cancer incidence among 843 breast cancer cases and 773 controls. METHODS: Exposure was classified based on work in the two longest-held jobs, and indices of cumulative exposure to magnetic fields based on a measurement survey. RESULTS: Female breast cancer was not associated with employment as an
Edwin Van Wijngaarden; Leena A Nylander-French; Robert C Millikan; David A Savitz; Dana Loomis
The extent of outdoor exposure during winter and factors affecting it were examined in a cross-sectional population study in Finland. Men and women aged 25-74 years from the National FINRISK 2002 sub-study (n=6,591) were queried about their average weekly occupational, leisure-time and total cold exposure during the past winter. The effects of gender, age, area of residence, occupation, ambient temperature, self-rated health, physical activity and education on cold exposure were analysed. The self-reported median total cold exposure time was 7 h/week (8 h men, 6 h women), <1 h/week (2 h men, 0 h women) at work, 4 h/week (5 h men, 4 h women) during leisure time and 1 h/week (1 h men, 1.5 h women) while commuting to work. Factors associated with increased occupational cold exposure among men were: being employed in agriculture, forestry and industry/mining/construction or related occupations, being less educated and being aged 55-64 years. Factors associated with increased leisure-time cold exposure among men were: employment in industry/mining/construction or related occupations, being a pensioner or unemployed, reporting at least average health, being physically active and having college or vocational education. Among women, being a housewife, pensioner or unemployed and engaged in physical activity increased leisure-time cold exposure, and young women were more exposed than older ones. Self-rated health was positively associated with leisure-time cold exposure in men and only to a minor extent in women. In conclusion, the subjects reported spending 4% of their total time under cold exposure, most of it (71%) during leisure time. Both occupational and leisure-time cold exposure is greater among men than women. PMID:16788837
Mäkinen, Tiina M; Raatikka, Veli-Pekka; Rytkönen, Mika; Jokelainen, Jari; Rintamäki, Hannu; Ruuhela, Reija; Näyhä, Simo; Hassi, Juhani
The best available concepts for a 100 kW Solar Lunar Power Plant based on static and dynamic conversion concepts have been examined. The two concepts which emerged for direct comparison yielded a difference in delivered mass of 35 MT, the mass equivalent of 1.4 lander payloads, in favor of the static concept. The technologies considered for the various elements are either state-of-the-art or near-term. Two photovoltaic cell concepts should receive high priority for development: i.e., amorphous silicon and indium phosphide cells. The amorphous silicon, because it can be made so light weight and rugged; and the indium phosphide, because it shows very high efficiency potential and is reportedly not degraded by radiation. Also the amorphous silicon cells may be mounted on flexible backing that may roll up much like a carpet for compact storage, delivery, and ease of deployment at the base. The fuel cell and electrolysis cell technology is quite well along for lunar base applications, and because both the Shuttle and the forthcoming Space Station incorporate these devices, the status quo will be maintained. Early development of emerging improvements should be implemented so that essential life verification test programs may commence.
Objectives: To evaluate the agreement between job-title based estimates for upper extremity physical work exposures and exposure estimates from work observation and worker self-report. Methods: Self-reported exposure questionnaires were completed by 972 workers, and exposure estimates based on worksite observation were completed for a subset of 396 workers. Job-title based estimates were obtained from O*NET, an American database of job demands. Agreement between
Bethany T Gardner; David A Lombardi; Ann Marie Dale; Alfred Franzblau; Bradley A Evanoff
Though aphasia is primarily characterized by impairments in the comprehension and/or expression of language, research has shown that patients with aphasia also show deficits in cognitive-linguistic domains such as attention, executive function, concept knowledge and memory. Research in aphasia suggests that cognitive impairments can impact the online construction of language, new verbal learning, and transactional success. In our research, we extend this hypothesis to suggest that general cognitive deficits influence progress with therapy. The aim of our study is to explore learning, a cognitive process that is integral to relearning language, yet underexplored in the field of aphasia rehabilitation. We examine non-linguistic category learning in patients with aphasia (n=19) and in healthy controls (n=12), comparing feedback and non-feedback based instruction. Participants complete two computer-based learning tasks that require them to categorize novel animals based on the percentage of features shared with one of two prototypes. As hypothesized, healthy controls showed successful category learning following both methods of instruction. In contrast, only 60% of our patient population demonstrated successful non-linguistic category learning. Patient performance was not predictable by standardized measures of cognitive ability. Results suggest that general learning is affected in aphasia and is a unique, important factor to consider in the field of aphasia rehabilitation. PMID:23127795
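The rule that prototype-based category-learning tasks of this kind rest on, assigning a novel item to the prototype with which it shares more features, can be sketched as a simple feature-overlap classifier. This is a generic illustration, not the study's actual stimuli or scoring; the feature vectors are invented:

```python
def categorize(animal, prototype_a, prototype_b):
    """Assign a novel animal to the category whose prototype shares more
    of its features; ties go to category A."""
    shared_a = sum(f == g for f, g in zip(animal, prototype_a))
    shared_b = sum(f == g for f, g in zip(animal, prototype_b))
    return "A" if shared_a >= shared_b else "B"

# Binary feature vectors (e.g., long/short tail, spots/no spots, ...).
proto_a = [1, 1, 1, 1, 0, 0]
proto_b = [0, 0, 0, 0, 1, 1]
novel = [1, 1, 1, 0, 1, 0]   # shares 4 features with A, 2 with B
```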
This paper generalizes a discomfort model for climbing tasks of the handicapped with a prosthetic limb. The model is integrated, by ICT technology, into the simulated task scenario to indicate to what extent the climbing task causes discomfort and to analyze the direct cause of discomfort at the micro-motion level. Furthermore, it can predict the potential harm and accidents in the
A significant problem with no simple solutions in current real-time literature is analyzing the end-to-end schedulability of tasks in distributed systems with cycles in the task graph. Prior approaches including network calculus and holistic schedulability analysis work best for acyclic task flows. They involve iterative solutions or offer no solutions at all when flows are non-acyclic. This paper demonstrates the
Introduction At least 36 countries are suffering from severe shortages of healthcare workers and this crisis of human resources in developing countries is a major obstacle to scale-up of HIV care. We performed a case study to evaluate a health service delivery model where a task-shifting approach to HIV care had been undertaken with tasks shifted from doctors to nurses and community health workers in rural Haiti. Methods Data were collected using mixed quantitative and qualitative methods at three clinics in rural Haiti. Distribution of tasks for HIV services delivery; types of tasks performed by different cadres of healthcare workers; HIV program outcomes; access to HIV care and acceptability of the model to staff were measured. Results A shift of tasks occurred from doctors to nurses and to community health workers compared to a traditional doctor-based model of care. Nurses performed most HIV-related tasks except initiation of TB therapy for smear-negative suspects with HIV. Community health workers were involved in over half of HIV-related tasks. HIV services were rapidly scaled-up in the areas served; loss to follow-up of patients living with HIV was less than 5% at 24 months and staff were satisfied with the model of care. Conclusion Task-shifting using a community-based, nurse-centered model of HIV care in rural Haiti is an effective model for scale-up of HIV services with good clinical and program outcomes. Community health workers can provide essential health services that are otherwise unavailable particularly in rural, poor areas.
While only limited data are available to characterize the potential toxicity of over 8 million commercially available chemical substances, there is even less information available on the exposure and use-scenarios that are required to link potential toxicity to human and ecologic...
Given the centrality of the argumentation process to science and consequent importance to science education, inviting science students to engage in argumentation and scaffolding that argumentation in order that it lead to learning and not frustration is important. The present research invites small groups of science content learners (54 preservice elementary teachers at a large research university) to use analogical-mapping-based comparison tasks in service of argumentation to determine which of two possible analogues, in this case simple machines, is most closely related to a third. These activities and associated instruction scaffolded student small-groups' argumentation in four ways: (1) supporting new analogical correspondences on the heels of prior correspondences; (2) discerning definitions and descriptions for simple machine elements; (3) identifying and dealing with ambiguity in potential correspondences; and (4) making reflections on prior analogical correspondences in service of their final arguments. Analogical-mapping-based comparison activities scaffolded student small groups both in their argumentation and in content learning about simple machines. Implications, limitations, and directions for future related research are also discussed.
Augmented tabletops have recently attracted considerable attention in the literature. However, little has been known about the effects that these interfaces have on learning tasks. In this paper, we report on the results of an empirical study that explores the usage of tabletop systems in an expressive collaborative learning task. In particular, we focus on measuring the difference in
The authors propose a robot teaching interface which uses virtual reality. The teaching method provides a user interface with which a novice operator can easily direct a robot. The operator performs the assembly task in a virtual workspace generated by a computer. The operator's movements are recognized as robot task-level operations by using a finite automaton. The system interprets the
The first purpose of the studies reported was to compare the ability of autistic children in understanding false belief with the corresponding abilities of normal or mentally retarded children. Unexpected location tasks adapted from the Sally-Anne false belief tasks were used in experiment 1. Autistic children performed worse than normal children and the mentally retarded children, but the difference
With the emerging many-core paradigm, parallel programming must extend beyond its traditional realm of scientific applications. Converting existing sequential applications as well as developing next-generation software requires assistance from hardware, compilers and runtime systems to exploit parallelism transparently within applications. These systems must decompose applications into tasks that can be executed in parallel and then schedule those tasks to minimize load imbalance. However, many systems lack a priori knowledge about the execution time of all tasks to perform effective load balancing with low scheduling overhead. In this paper, we approach this fundamental problem using machine learning techniques first to generate performance models for all tasks and then applying those models to perform automatic performance prediction across program executions. We also extend an existing scheduling algorithm to use generated task cost estimates for online task partitioning and scheduling. We implement the above techniques in the pR framework, which transparently parallelizes scripts in the popular R language, and evaluate their performance and overhead with both a real-world application and a large number of synthetic representative test scripts. Our experimental results show that our proposed approach significantly improves task partitioning and scheduling, with maximum improvements of 21.8%, 40.3% and 22.1% and average improvements of 15.9%, 16.9% and 4.2% for LMM (a real R application) and synthetic test cases with independent and dependent tasks, respectively.
Li, J; Ma, X; Singh, K; Schulz, M; de Supinski, B R; McKee, S A
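The core idea above, learning per-task cost models from past runs and feeding the predictions to a partitioner, can be sketched under stated assumptions: a one-feature least-squares cost model and a longest-processing-time (LPT) assignment. This is a toy illustration, not the pR framework's actual models or scheduler, and all numbers are made up:

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y ~ a*x + b, with one feature such as
    input size standing in for a task's cost predictors."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def lpt_partition(costs, n_workers):
    """Longest-processing-time heuristic: assign tasks in decreasing
    predicted cost to the currently least-loaded worker."""
    loads = [0.0] * n_workers
    assign = [[] for _ in range(n_workers)]
    for task, cost in sorted(costs.items(), key=lambda kv: -kv[1]):
        w = loads.index(min(loads))
        loads[w] += cost
        assign[w].append(task)
    return assign, max(loads)

# Train on past runs (input size -> runtime), predict new tasks' costs.
sizes, times = [10, 20, 30, 40], [1.1, 2.0, 3.1, 3.9]
a, b = fit_linear(sizes, times)
predicted = {f"job{s}": a * s + b for s in [15, 25, 35, 5]}
plan, makespan = lpt_partition(predicted, n_workers=2)
```

Better cost estimates tighten the load balance: here the two workers end up with equal predicted loads, which is the effect the paper's improvements quantify.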
The Deputy Secretary of Defense tasked the Defense Business Board (DBB) to form a Task Group with the support of the Defense Science Board (DSB) to evaluate and make recommendations to the Department of Defense (DoD) regarding actions that could improve t...
In objective or task-based assessment of image quality, figures of merit are defined by the performance of some specific observer on some task of scientific interest. This methodology is well established in medical imaging but is just beginning to be applied in astronomy. In this paper we survey the theory needed to understand the performance of ideal or ideal-linear (Hotelling) observers on detection tasks with adaptive-optical data. The theory is illustrated by discussing its application to detection of exoplanets from a sequence of short-exposure images. PMID:20890393
Barrett, Harrison H.; Myers, Kyle J.; Devaney, Nicholas; Dainty, J. C.; Caucci, Luca
Interruption-based advertising has gained prominence in the online channel. Yet, little attention has been paid to deriving design principles and conceptualizations for online interruption-based advertising. This paper examines three novel design factors related to this phenomenon, namely, exposure timing, advertising intent, and brand image. Exposure timing pertains to the time by which the advertisement (ad) is launched within a website.
Jason C. F. Chan; Zhenhui Jiang; Bernard C. Y. Tan
The purpose of the present study was to test a conceptual model of relations among achievement goal orientation, self-efficacy, cognitive processing, and achievement of students working within a particular collaborative task context. The task involved a collaborative computer-based modeling task. In order to test the model, group measures of…
Sins, Patrick H. M.; van Joolingen, Wouter R.; Savelsbergh, Elwin R.; van Hout-Wolters, Bernadette
This paper proposes a multi-robot coordination architecture for dynamic task, role and behavior selections. The proposed architecture employs the motivation of task, the utility of role, a probabilistic behavior selection and a team strategy for efficient multi-robot coordination. Multiple robots in a team can coordinate with each other by selecting appropriate task, role and behavior in adversarial and dynamic environment. The effectiveness of the proposed architecture is demonstrated in dynamic environment robot soccer by carrying out computer simulation and real environment.
The recently approved ANSI/CEA-2018 standard is motivated by the current usability crisis in computer controlled electronic products. The standard facilitates a new user interface design methodology that uses a task model at runtime to guide users.
Cancer risk from exposure to benzene for a working lifetime was estimated from data obtained in studies with rodents. Cancers of the Zymbal gland and the blood-forming system were selected as endpoints for the assessment because of their consistent occurrence. The combined metaboli...
18 mg/m3 is an adverse effect level, causing minimal local (no systemic) effects in rats and mice. To take this into account a safety factor of 10 is applied, which results in a recommended occupational exposure limit of 1.8 mg/m3 (1 ppm) TWA 8 h for DMA.
A technique has been developed that uses Puff, a volcanic ash transport and dispersion (VATD) model, to forecast the relative exposure of aircraft and ground facilities to ash from a volcanic eruption. VATD models couple numerical weather prediction (NWP) data with physical descriptions of the initial eruptive plume, atmospheric dispersion, and settling of ash particles. Three distinct examples of variations
This study compares quantitative and qualitative results for task-based second language (L2) grammar instruction conducted as whole-class, teacher-led discourse (TLD) versus small-group, learner-led discourse (LLD). Participants included 78 English-speaking adults from six university classes of beginning L2 Spanish, with two assigned to each…
Navigation, orientation - to find one's way - belong to the elementary tasks of everyday life. Do behavioural data retrieved from these scenarios allow for a prediction of whether a subject is anxious, disoriented, determined or satisfied? Can behaviour be predicted on these bases? Insights like these might be of great use, for example for the optimisation of public spaces.
There are very few acoustic studies reflecting on the localization of speech function within the different loci of the cerebellum. Task-based performance profiles of subjects with lesions in different cerebellar loci have not been reported. Also, the findings on nonfocal cerebellar lesions cannot be generalized to lesions restricted to the cerebellum.…
Technology today offers many new opportunities for innovation in educational assessment through rich new assessment tasks and potentially powerful scoring, reporting and real-time feedback mechanisms. One potential limitation for realizing the benefits of computer-based assessment in both instructional assessment and large scale testing comes in…
Two studies with heterosexual female and male college students explored the effects on mood and body image resulting from a negative versus a positive outcome in a competitive interaction. In study 1, participants either succeeded or failed in comparison to an opposite-sex confederate on a gender-neutral task of anagram solution. Study 2 added the dimension of the gender stereotypicality of the task by creating empirically derived feminine (beauty
This research study focused on the relationship between physical activity in the classroom and students' time-on-task behavior. Students' levels of on- and off-task behavior were compared during typical school days and days in which students received some sort of physical activity instruction. A total of three classes of approximately fifty students were used in data collection through observations and individual
The aims were to evaluate the inter-method reliability of a registration sheet for patient handling tasks, to study the day-to-day variation of musculoskeletal complaints (MSC), and to examine whether patient handling tasks and psychosocial factors were associated with MSC. Nurses (n=148) completed logbooks for three consecutive working days followed by a day off. Low back pain (LBP), neck/shoulder pain (NSP), knee
The authors previously described how image reconstruction algorithms can be evaluated on the basis of how well binary-discrimination tasks can be performed using the reconstructions. The test statistic in the detection task was the estimated activity within the object, also known as the non-prewhitening matched-filter output. This approximation to the likelihood function was used because a full characterization of the
Recognizing others' emotional states is crucial for effective social interaction. While most facial emotion recognition tasks use explicit prompts that trigger consciously controlled processing, emotional faces are almost exclusively processed implicitly in real life. Recent attempts in social cognition suggest a dual process perspective, whereby explicit and implicit processes largely operate independently. However, due to differences in methodology the direct comparison of implicit and explicit social cognition has remained a challenge. Here, we introduce a new tool to comparably measure implicit and explicit processing aspects comprising basic and complex emotions in facial expressions. We developed two video-based tasks with similar answer formats to assess performance in respective facial emotion recognition processes: Face Puzzle, implicit and explicit. To assess the tasks' sensitivity to atypical social cognition and to infer interrelationship patterns between explicit and implicit processes in typical and atypical development, we included healthy adults (NT, n = 24) and adults with autism spectrum disorder (ASD, n = 24). Item analyses yielded good reliability of the new tasks. Group-specific results indicated sensitivity to subtle social impairments in high-functioning ASD. Correlation analyses with established implicit and explicit socio-cognitive measures were further in favor of the tasks' external validity. Between-group comparisons provide first hints of differential relations between implicit and explicit aspects of facial emotion recognition processes in healthy compared to ASD participants. In addition, an increased magnitude of between-group differences in the implicit task was found for a speed-accuracy composite measure.
The new Face Puzzle tool thus provides two new tasks to separately assess explicit and implicit social functioning, for instance, to measure subtle impairments as well as potential improvements due to social cognitive interventions. PMID:23805122
Background: Tanning lamp sessions have increased in Europe in recent years, and recent epidemiological studies have confirmed a link between melanoma and artificial UV exposure. However, in France, little information is available to determine the exposure of the population. This article presents the results from the ‘Baromètre cancer 2010’ concerning the proportion of users exposed to artificial UV radiation in France, their characteristics, and their level of information on the associated risks. Methods: A two-stage random sampling telephone survey assisted by a CATI system (household, individual) was performed from 3 April 2010 to 7 August 2010 on a sample of 3,359 people aged 15 to 75 years. Results: In 2010, 13.4% of the French population reported having had tanning lamp sessions at least once in their lifetime, and 3.5% of the total population reported the use of artificial UV radiation over the last twelve months. Exposure over the last twelve months is most common among females (5.0%) and the young population between 20 and 25 years old (9.6%). In addition, 3.5% of those under 18 years report having attended UV booths at least once during their lifetime, even though booths are forbidden to minors. Moreover, more than one third of users reported more than 10 exposures within a year. The places of exposure cited most often were beauty salons (50%) and tanning centers (46%). Only 49.2% of those surveyed felt that they were well informed about the risks of cancer associated with UV booths. Furthermore, the population was found to have misconceptions about artificial UV radiation: one quarter believe that using artificial UV radiation before vacation protects the skin from sunburn. Conclusions: This first study on artificial UV radiation exposure in France has better quantified and characterized the users. It has also defined the state of knowledge and the perception of risk in the general French population.
This work will contribute to determining prevention actions to reduce the cancer risk related to artificial UV radiation.
Purpose: Tomosynthesis is a promising modality for breast imaging. The appearance of the tomosynthesis reconstructed image is greatly affected by the choice of acquisition and reconstruction parameters. The purpose of this study was to investigate the limitations of tomosynthesis breast imaging due to scan parameters and quantum noise. Tomosynthesis image quality was assessed based on performance of a mathematical observer model in a signal-known-exactly (SKE) detection task. Methods: SKE detectability (d′) was estimated using a prewhitening observer model. Structured breast background was simulated using filtered noise. Detectability was estimated for designer nodules ranging from 0.05 to 0.8 cm in diameter. Tomosynthesis slices were reconstructed using iterative maximum-likelihood expectation-maximization. The tomosynthesis scan angle was varied between 15° and 60°, the number of views between 11 and 41, and the total number of x-ray quanta was ∞, 6×10⁵, and 6×10⁴. Detectability in tomosynthesis was compared to that in a single projection. Results: For constant angular sampling distance, increasing the angular scan range increased detectability for all signal sizes. Large-scale signals were little affected by quantum noise or angular sampling. For small-scale signals, quantum noise and insufficient angular sampling degraded detectability. At high quantum noise levels, an angular step size of 3° or below was sufficient to avoid image degradation. At lower quantum noise levels, increased angular sampling always resulted in increased detectability. The ratio of detectability in the tomosynthesis slice to that in a single projection exhibited a peak that shifted to larger signal sizes when the angular range increased. For a given angular range, the peak shifted toward smaller signals when the number of views was increased. The ratio was greater than unity for all conditions evaluated.
Conclusion: The effect of acquisition parameters on lesion detectability depends on signal size. Tomosynthesis scan angle had an effect on detectability for all signal sizes, while quantum noise and angular sampling only affected the detectability of small-scale signals.
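The prewhitening-observer detectability used in this record is conventionally computed as follows; this is the standard form from the observer-model literature, not a formula quoted from the abstract itself:

```latex
% \Delta\mathbf{s}: mean signal-present minus signal-absent image data
% \mathbf{K}: covariance matrix of background plus noise
d'^{2} = \Delta\mathbf{s}^{\mathsf{T}}\,\mathbf{K}^{-1}\,\Delta\mathbf{s}
```

Under stationary-noise assumptions this reduces to the familiar Fourier-domain integral of the squared signal spectrum over the noise power spectrum.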
Me-acr can be absorbed completely after dermal and oral exposure. Me-acr is metabolized fairly rapidly, and elimination occurs largely via exhalation of CO2 and to a lesser extent via urinary excretion of thioethers. Me-acr is irritating to corrosive to the skin. Skin sensitization, and cross-sensitization to some related compounds, occurs. In a two-year inhalatory
This study aimed to expand the current understanding of smoking maintenance mechanisms by examining how putative relapse risk factors relate to a single behavioral smoking choice using a novel laboratory smoking-choice task. After 12 hours of nicotine deprivation, participants were exposed to smoking cues and given the choice between smoking up to two cigarettes in a 15-minute window or waiting and receiving four cigarettes after a delay of 45 minutes. Greater nicotine dependence, higher impulsivity, and lower distress tolerance were hypothesized to predict earlier and more intensive smoking. Out of 35 participants (n=9 female), 26 chose to smoke with a median time to a first puff of 1.22 minutes (standard deviation=2.62 min, range=0.03–10.62 min). Survival analyses examined latency to first puff, and results indicated that greater pre-task craving and smoking more cigarettes per day were significantly related to smoking sooner in the task. Greater behavioral disinhibition predicted shorter smoking latency in the first two minutes of the task, but not at a delay of more than two minutes. Lower distress tolerance (reporting greater regulation efforts to alleviate distress) was related to more puffs smoked and greater nicotine dependence was related to more time spent smoking in the task. This novel laboratory smoking-choice paradigm may be a useful laboratory analog for the choices smokers make during cessation attempts and may help identify factors that influence smoking lapses.
The radiation doses to Task Force BIG BANG troops who participated in Shot GALILEO are determined and compared with film badge data gathered during Exercise Desert Rock VII-VIII. Fallout contours and decay rates are established and used to calculate total external beta and gamma doses, based on estimated stay times for various troop activities. Uncertainties are calculated for each parameter. The radiation dose from internal emitters, due to inhalation of resuspended contaminated particles, is calculated. The total external gamma dose is estimated to be 1070-1780 mrem, as compared with a mean film badge reading of 1900 mrem. The 50-year bone dose, due to internal emitters, is estimated to be 10-25 mrem.
Exposure to perchlorate is widespread in the United States, and many studies have attempted to characterize perchlorate exposure by estimating average daily intakes. These approaches provided population-based estimates but did not provide individual-level exposure estimates. Recently, exposure activity databases such as CSFII, TDS, and NHANES have become available, providing opportunities to evaluate individual-level exposure to chemicals using exposure surveillance datasets. In this study, we use perchlorate as an example to investigate the usefulness of urinary biomarker data for predicting exposures at the individual level. Specifically, two analyses were conducted: (1) using data from a controlled human study to examine the ability of a physiologically based pharmacokinetic (PBPK) model to predict perchlorate concentrations in single-spot and cumulative urine samples; and (2) using biomarker data from a population-based study and a PBPK model to demonstrate the challenges in linking urinary biomarker concentrations to intake doses for individuals. Results showed that the modeling approach was able to characterize the distribution of biomarker concentrations at the population level, but predicting the exposure-biomarker relationship for individuals was much more difficult. The type of information needed to reduce the uncertainty in estimating intake doses for individuals, based on biomarker measurements, is discussed. PMID:22520969
Synopsis: This paper describes the relationships between exposure, hazard mapping, and vulnerability analysis, and presents the initial hazard mapping results of the exposure analysis. Based on different scales, vulnerability can be divided into five layers; from the connections among these five layers, levels of vulnerability are defined: vulnerability of the individual, village, country, and central government. Landslide susceptibility is
EPA EXPOsure toolBOX, or EPA-Expo-Box, is a web-based toolbox that has been developed by EPA’s Office of Research and Development (ORD), National Center for Environmental Assessment (NCEA). It is intended for exposure and risk assessors and it comprises a series of Tool Set...
BACKGROUND: Recent advances in GIS technology and remote sensing have provided new opportunities to collect ecologic data on agricultural pesticide exposure. Many pesticide studies have used historical or records-based data on crops and their associated pesticide applications to estimate exposure by measuring residential proximity to agricultural fields. Very few of these studies collected environmental and biological samples from study participants.
Justine LE Allpress; Ross J Curry; Carol L Hanchette; Michael J Phillips; Timothy C Wilcosky
This study compared several exposure control procedures for CAT systems based on the three-parameter logistic testlet response theory model (Wang, Bradlow, & Wainer, 2002) and Masters' (1982) partial credit model when applied to a pool consisting entirely of testlets. The exposure control procedures studied were the modified within 0.10 logits…
Boyd, Aimee M.; Dodd, Barbara; Fitzpatrick, Steven
We used a simple method based on inductively coupled plasma mass spectrometry (ICP-MS) to determine the isotopic composition of uranium in urine at levels that indicate occupational exposure to depleted uranium (DU). DU exposure is indicated by a range fo...
P. R. Boyd R. W. Tardiff J. W. Ejnik A. J. Carmichael M. M. Hamiliton
As computer systems have become more and more decentralized and parallel in operation, interest in concurrent and distributed software has grown. One very important and challenging problem for distributed-software engineering is program behavior analysis. We have advocated the use of Petri nets to define a general static analysis framework for Ada tasking. The framework has evolved into a collection of tools that have proven to be a very valuable platform for experimental research. In this paper, the authors define and discuss the design and implementation of the tools that make up their Tasking-Oriented Toolkit for the Ada Language (TOTAL). Both modeling and query/analysis methods and tools are discussed. Example Ada tasking programs are used to demonstrate the utility of each tool individually as well as the way the tools integrate together.
Shatz, S.M.; Black, C.; Tu, S. (Dept. of Electrical Engineering and Computer Science, Univ. of Illinois at Chicago, Chicago, IL (US)); Mai, K. (AT and T Bell Laboratories, Naperville, IL (US))
In many studies, the estimation of the apparent diffusion coefficient (ADC) of lesions in visceral organs in diffusion-weighted (DW) magnetic resonance images requires an accurate lesion-segmentation algorithm. To evaluate these lesion-segmentation algorithms, region-overlap measures are used currently. However, the end task from the DW images is accurate ADC estimation, and the region-overlap measures do not evaluate the segmentation algorithms on this task. Moreover, these measures rely on the existence of gold-standard segmentation of the lesion, which is typically unavailable. In this paper, we study the problem of task-based evaluation of segmentation algorithms in DW imaging in the absence of a gold standard. We first show that using manual segmentations instead of gold-standard segmentations for this task-based evaluation is unreliable. We then propose a method to compare the segmentation algorithms that does not require gold-standard or manual segmentation results. The no-gold-standard method estimates the bias and the variance of the error between the true ADC values and the ADC values estimated using the automated segmentation algorithm. The method can be used to rank the segmentation algorithms on the basis of both the ensemble mean square error and precision. We also propose consistency checks for this evaluation technique.
Jha, Abhinav K.; Kupinski, Matthew A.; Rodríguez, Jeffrey J.; Stephen, Renu M.; Stopeck, Alison T.
Surface exposure dating using terrestrial cosmogenic nuclides provides the opportunity to establish glacial chronologies in semi-arid high mountain regions, where the lack of organic material for radiocarbon dating has limited our knowledge about the timing and the causes of glacial advances so far. However, several scaling systems and calculation schemes exist. This can result in significant systematic uncertainties, particularly at high altitudes as e.g. in the Central Andes. We present and discuss previously published exposure ages from Bolivia and Argentina in order to illustrate the extent of the current uncertainties. It is neither possible to unambiguously determine whether the local Last Glacial Maximum (local LGM) in the tropics occurred in-phase with or predated the global LGM, nor can the subsequent Late Glacial stages be dated accurately enough to infer temperature or precipitation changes at millennial-scale timescales. We then also present new results from the Tres Lagunas in the Sierra de Santa Victoria, NW Argentina. There we can compare our glacial exposure age chronology with bracketing radiocarbon ages from lake sediments. The Tres Lagunas may thus serve as a high-altitude calibration site for 10Be dating. Paleoclimatically, we conclude that glacial deposits in NW-Argentina document glacial advances in-phase with the global LGM, but that the prominent moraines there date to the Late Glacial. This coincides with the well-documented intensification and/or southward shift of the tropical circulation and reflects the strong precipitation-sensitivity of glaciers in arid and semi-arid environments.
Ilgner, J.; Zech, R.; Baechtiger, C.; Kubik, P. W.; Veit, H.
In this paper, we propose a new scene-adaptive exposure control algorithm for digital still cameras to achieve fast convergence to a targeted average luminance level. We focus especially on an electrical shutter control method, which enables the widest range of control. We ran tests to find the relationship between the electrical shutter speed, which is determined by the number of reset pulses of the CCD, and the luminance. We composed various luminance environments and generated exposure data for every combination of luminance environment and number of reset pulses. We found that as the number of reset pulses increases, the average luminance level of the captured image decreases. The method we propose is the secant method, an iterative root-finding technique, used here for fast and stable convergence to the targeted luminance level. On the plane whose axes are average luminance value and number of reset pulses, a straight line is defined by two points. One point is computed from the image captured at t0. The other point has the minimum luminance value, under the assumption that the maximum number of reset pulses yields luminance 0. The new shutter speed on this straight line makes a new point t1. By repeating this, we can reach the targeted luminance stably and quickly.
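The iteration described above can be sketched as a secant root-find on the luminance-versus-pulses curve. Everything below is an illustrative assumption rather than the paper's implementation: the quadratic `luminance_model` stands in for a real sensor, and the pulse range [0, 1000] and 8-bit luminance scale are invented for the sketch.

```python
# Hedged sketch of a secant-style exposure control loop (not the paper's code).

def luminance_model(pulses, max_pulses=1000.0):
    """Toy monotone sensor model: average luminance falls as reset pulses rise."""
    return 255.0 * (1.0 - pulses / max_pulses) ** 2

def secant_exposure(target, measure, max_pulses=1000.0, tol=1.0, max_iter=12):
    """Secant iteration on f(p) = measure(p) - target.

    The starting pair mirrors the paper's construction: one point from the
    current capture (p = 0 here) and the assumed point (max_pulses, luminance 0).
    """
    p0, f0 = 0.0, measure(0.0) - target
    p1, f1 = max_pulses, measure(max_pulses) - target
    for _ in range(max_iter):
        if abs(f1) < tol or f1 == f0:   # converged, or degenerate secant step
            break
        p0, p1 = p1, p1 - f1 * (p1 - p0) / (f1 - f0)
        p1 = min(max(p1, 0.0), max_pulses)  # clamp to valid pulse counts
        f0, f1 = f1, measure(p1) - target
    return p1

pulses = secant_exposure(128.0, luminance_model)
```

With this toy model the loop settles within one gray level of the 128 target in roughly half a dozen steps, which is the "fast and stable convergence" behavior the abstract claims for real scenes.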
How to make task assignment faster and more efficient is a hot issue in the multi-UAV coordination area, using Tactical Digital Information Links (TADILs) to share and exchange combat information in modern aerial warfare. Due to the shortcomings of the traditional contract net, a filter model is proposed according to the negotiation mechanism of contract net protocol theory
Recently, Redford (2010) found that monkeys seemed to exert metacognitive control in a category-learning paradigm. Specifically, they selected more trials to view as the difficulty of the category-learning task increased. However, category-learning difficulty was determined by manipulating the family resemblance across the to-be-learned…
The focus of this paper is on interaction protocols and topologies of multi-agent systems for task allocation in manufacturing applications. Resource agents in manufacturing are members of a network whose possible logical topologies and governing interaction protocol influence the scheduling and control in the multi-agent system. Four models are identified in the paper, each having specific rules and characteristics for
Mohammad Owliya; Mozafar Saadat; Mahbod Goharian; Rachid Anane
Several architectures for decentralized workflow enactment have been proposed to improve workflow execution through cooperation of geographically distributed agents. To our knowledge, none of them solves the problem of dynamic task scheduling in trans-organizational relations, which are characterized by a limited observability of the global workflow execution state, due to confidentiality reasons. We present a new distributed dynamic scheduling approach
Investigates the effects of learner-control and the aptitude-treatment interaction effects between instructional control and learner characteristics in procedural learning. Participants were 128 undergraduates. Results indicate empirical evidence illustrating the power of learner-control and relevant ability on learning procedural tasks in…
Learning effects were assessed for the block design (BD) task, on the basis of variation in 2 stimulus parameters: perceptual cohesiveness (PC) and set size uncertainty (U). Thirty-one nonclinical undergraduate students (19 female) each completed 3 designs for each of 4 varied sets of the stimulus parameters (high-PC/high-U, high-PC/low-U,…
Miller, Joseph C.; Ruthig, Joelle C.; Bradley, April R.; Wise, Richard A.; Pedersen, Heather A.; Ellison, Jo M.
This report presents task analyzed masturbation instruction for a 29 year old profoundly mentally retarded male. This individual's sole method of masturbation was to rub his penis against the floor or bed mattress which resulted in very infrequent ejaculations and was potentially dangerous to his genitalia. It was decided to teach this individual a more safe and effective method of
Presents the necessary steps for designing an effective language learning tool to foster communication and negotiation, considering the importance of supporting integral education, using tasks, providing elaborated input and feedback, and promoting collaborative learning. Reports on a study conducted using such a tool to determine whether…
Upon request by the Minister of Social Affairs and Employment the Health Council of the Netherlands recommends health-based occupational exposure limits for the concentration of toxic substances in air of the workplace. These recommendations are made by t...
...Based on Exposure to Ionizing Radiation (Prostate Cancer and Any Other Cancer) AGENCY: Department of Veterans Affairs. ACTION...all evidence currently available to him, prostate cancer and any other cancers may be induced by...
The fundamental principles of industrial hygiene are based upon the recognition, evaluation, and control of workplace hazards. Occupational safety and health professionals (e.g., industrial hygienists) perform this task by assessing numerous complex factors. In many situations industrial hygienists are not available; therefore, an expert system has been developed to assist the performance of workplace exposure assessments (WEAs). The Workplace Exposure
This study sought to measure the degree of students' success in learning and proper use of Existential Constructions (ECs), namely there is/are, in English as a foreign language through Focus on Form (FonF) techniques in Task-based language teaching. For this purpose, 60 Iranian learners of English were randomly selected and assigned to one experimental and two control groups. Analysis of
Hussein Muhammadi Farsani; Mansour Tavakoli; Ahmad Moinzadeh
Our study integrates automated natural language-oriented assessment and analysis methodologies into feasible reading comprehension tasks. With the newly developed T-MITOCAR toolset, prose text can be automatically converted into an association net which has similarities to a concept map. The “text to graph” feature of the software is based on several parsing heuristics and can be used both to assess the
The U.S. Preventive Services Task Force (USPSTF) is an internationally recognized, independent panel of non-federal experts in primary care, prevention, and research methods that makes evidence-based recommendations to guide the delivery of clinical preventive services. Convened and supported by the Agency for Healthcare Research and Quality (AHRQ), the USPSTF is charged by the U.S. Congress to review the scientific
Janelle Guirguis-Blake; Ned Calonge; Therese Miller; Albert Siu; Steven Teutsch; Evelyn Whitlock
Human exposure to background radiofrequency electromagnetic fields (RF-EMF) has been increasing with the introduction of new technologies. There is a definite need for the quantification of RF-EMF exposure, but a robust exposure assessment is not yet possible, mainly due to the lack of a fast and efficient measurement procedure. In this article, a new procedure is proposed for accurately mapping the exposure to base station radiation in an outdoor environment based on surrogate modeling and sequential design, an entirely new approach in the domain of dosimetry for human RF exposure. We tested our procedure in an urban area of about 0.04 km² for Global System for Mobile Communications (GSM) technology at 900 MHz (GSM900) using a personal exposimeter. Fifty measurement locations were sufficient to obtain a coarse street exposure map, locating regions of high and low exposure; 70 measurement locations were sufficient to characterize the electric field distribution in the area and build an accurate predictive interpolation model. Hence, accurate GSM900 downlink outdoor exposure maps (for use in, e.g., governmental risk communication and epidemiological studies) are developed by combining the proven efficiency of sequential design with the speed of exposimeter measurements and their ease of handling. PMID:23315952
BACKGROUND Sunlight exposure is responsible for a large number of dermatological diseases. OBJECTIVE We estimated the prevalence of sunlight exposure and its associated factors in adults from southern Brazil in a cross-sectional, population-based study. METHODS We investigated a representative sample of individuals aged ≥ 20 years (n=3,136). Sunlight exposure and its associated factors were evaluated in two distinct situations: at leisure time and at work. The time period investigated ranged from December 2004 to March 2005, comprising 120 days of the highest ultraviolet index in the urban area of the city of Pelotas, in southern Brazil. The participants were asked about sunlight exposure for at least 20 minutes between 10 A.M. and 4 P.M. The analysis was stratified by sex, and sunlight exposure was grouped into five categories. RESULTS Among the 3,136 participants, the prevalence of sunlight exposure at the beach was 32.8% (95% CI, 30.3 - 35.2) and 26.3% (95% CI, 24.2 - 28.3) among men and women, respectively. The prevalence at work was 39.8% (95% CI, 37.2 - 42.4) among men and 10.5% (95% CI, 9.1 - 12.0) among women. Age was inversely associated with sunlight exposure. Family income and achieved schooling were positively associated with sunlight exposure at leisure time and inversely associated with sunlight exposure at work. Self-reported skin color was not associated. Knowledge of any friend or relative who has been affected by skin cancer was positively associated with sunlight exposure among men at work. CONCLUSION Despite media campaigns on the harmful effects of excessive sunlight exposure, we found a high prevalence of sunlight exposure during a period of high ultraviolet index.
Duquia, Rodrigo Pereira; Menezes, Ana Maria Baptista; de Almeida, Hiram Larangeira; Reichert, Felipe Fossati; dos Santos, Ina da Silva; Haack, Ricardo Lanzetta; Horta, Bernardo Lessa
BACKGROUND: Real-time fMRI is especially vulnerable to task-correlated movement artifacts because statistical methods normally available in conventional analyses to remove such signals cannot be used in the context of real-time fMRI. Multi-voxel classifier-based methods, although advantageous in many respects, are particularly sensitive. Here we systematically studied various movements of the head and face to determine to what extent these can "masquerade" as signal in multi-voxel classifiers. METHODS: Ten subjects were instructed to move systematically (twelve instructed movements) throughout fMRI exams and data from a previously published real-time study was also analyzed to determine the extent to which non-neural signals contributed to the high reported accuracy in classifier output. RESULTS: Of potential concern, whole-brain classifiers based solely on movements exhibited false positives in all cases (P < .05). Artifacts were also observed in the spatial activation maps for two of the twelve movement tasks. In the retrospective analysis, it was determined that the relatively high reported classification accuracies were (fortunately) mostly explainable by neural activity, but that in some cases performance was likely dominated by movements. CONCLUSION: Movement tasks of many types (including movements of the eyes, face, and body) can lead to false positives in classifier-based real-time fMRI paradigms. PMID:23551805
A recently collected EEG dataset is analyzed and processed in order to evaluate the performance of a previously designed brain-computer interface (BCI) system. The EEG signals are collected from 29 channels distributed over the scalp. Four subjects completed three sessions each by performing four different mental tasks during each session. The BCI is designed in such a way that only one of the mental tasks can activate it. One important advantage of this BCI is its simplicity, since autoregressive modeling and quadratic discriminant analysis are used for feature extraction and classification, respectively. The autoregressive order which yields the best overall performance is obtained during a fivefold nested cross-validation process. The results are promising as the false positive rates are zero while the true positive rates are sufficiently high (67.26% average).
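The pipeline described (autoregressive feature extraction followed by quadratic discriminant classification) can be sketched on synthetic signals. Everything below is illustrative: the AR order, the two simulated "mental task" dynamics, and the train/test split are assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar(coeffs, n, rng):
    # Synthetic stand-in for one EEG channel: an AR process driven by noise.
    x = np.zeros(n + 50)
    for t in range(len(coeffs), len(x)):
        x[t] = sum(c * x[t - k - 1] for k, c in enumerate(coeffs)) + rng.normal()
    return x[50:]  # drop burn-in

def ar_features(x, p):
    # Least-squares AR(p) fit: x[t] ~ a1*x[t-1] + ... + ap*x[t-p]
    X = np.column_stack([x[p - k:len(x) - k] for k in range(1, p + 1)])
    a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return a

def fit_qda(F, y):
    # Quadratic discriminant: a Gaussian with its own covariance per class.
    params = {}
    for c in np.unique(y):
        Fc = F[y == c]
        cov = np.cov(Fc, rowvar=False) + 1e-6 * np.eye(F.shape[1])
        params[int(c)] = (Fc.mean(axis=0), np.linalg.inv(cov),
                          np.log(np.linalg.det(cov)), np.log(len(Fc) / len(F)))
    return params

def predict_qda(params, F):
    classes = sorted(params)
    scores = []
    for c in classes:
        mu, icov, logdet, logprior = params[c]
        d = F - mu
        scores.append(-0.5 * np.einsum('ij,jk,ik->i', d, icov, d)
                      - 0.5 * logdet + logprior)
    return np.array(classes)[np.argmax(scores, axis=0)]

# Two hypothetical "mental tasks" with different AR(2) dynamics.
tasks = {0: [0.5, -0.3], 1: [-0.5, -0.3]}
F = np.array([ar_features(simulate_ar(tasks[c], 400, rng), 2)
              for c in range(2) for _ in range(30)])
y = np.repeat([0, 1], 30)
params = fit_qda(F[::2], y[::2])          # train on half the trials
accuracy = (predict_qda(params, F[1::2]) == y[1::2]).mean()
```

The appeal noted in the abstract is visible here: both stages are a few lines of linear algebra, with the AR order the only real hyperparameter (the study tuned it by nested cross-validation).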
Existing evidence suggests that disgust is an important affective process related to health anxiety. The present study sought to determine the contribution of health anxiety symptoms in the prediction of disgust and behavioral avoidance in a large, nonclinical sample (N=156). Regression analyses showed that overall health anxiety symptoms predicted disgust on a behavioral approach task independent of gender, negative affect, and fear of contamination. In particular, health anxiety-related reassurance seeking was found to be uniquely associated with disgust and behavioral avoidance after controlling for the aforementioned covariates. In addition, the interaction between health anxiety and contamination fear was tested, and remained significant when controlling for gender and negative affect. These results suggest that heightened contamination fear is associated with elevated disgust reactions such that high levels of health anxiety lead even those low in contamination fear to be disgusted during a behavioral task. These results are in line with previous research on the role of disgust in health anxiety. PMID:22607189
The preservation of digital objects is a topic of prominent importance for archives and digital libraries. This paper focuses on the problem of preserving the performability of tasks on digital objects. It formalizes the problem in terms of Horn Rules and details the required inference services. The proposed framework and methodology are more expressive and flexible than previous attempts as
Learning effects were assessed for the block design (BD) task, on the basis of variation in 2 stimulus parameters: perceptual cohesiveness (PC) and set size uncertainty (U). Thirty-one nonclinical undergraduate students (19 female) each completed 3 designs for each of 4 varied sets of the stimulus parameters (high-PC/high-U, high-PC/low-U, low-PC/high-U, and low-PC/low-U), ordered randomly within a larger set of designs
Joseph C. Miller; Joelle C. Ruthig; April R. Bradley; Richard A. Wise; Heather A. Pedersen; Jo M. Ellison
Tests of murine higher order cognition are important in order to fully explore the effects of genetic alterations and potential therapies in models of human mental retardation and other deficits of intellect. In this study, inbred and F1 hybrid strains of mice were assessed in a five odor sequence task previously described for rats which employs buried food rewards. Data are provided
Emily Katz; Oliver Rothschild; Andriana Herrera; Sofia Huang; Anna Wong; Yvette Wojciechowski; Aida Gil; Qi Jiang Yan; Robert P. Bauchwitz
In this paper, a programming model is presented which enables scalable parallel performance on multi-core shared memory architectures. The model has been developed for application to a wide range of numerical simulation problems. Such problems involve time stepping or iteration algorithms where synchronization of multiple threads of execution is required. It is shown that traditional approaches to parallelism including message passing and scatter-gather can be improved upon in terms of speed-up and memory management. Using spatial decomposition to create orthogonal computational tasks, a new task management algorithm called H-Dispatch is developed. This algorithm makes efficient use of memory resources by limiting the need for garbage collection and takes optimal advantage of multiple cores by employing a “hungry” pull strategy. The technique is demonstrated on a simple finite difference solver and results are compared to traditional MPI and scatter-gather approaches. The H-Dispatch approach achieves near linear speed-up with results for efficiency of 85% on a 24-core machine. It is noted that the H-Dispatch algorithm is quite general and can be applied to a wide class of computational tasks on heterogeneous architectures involving multi-core and GPGPU hardware.
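The "hungry" pull strategy can be illustrated with a minimal dispatcher: idle workers pull the next task from a shared queue, which load-balances dynamically and bounds the number of in-flight tasks (and hence live intermediate memory) to the worker count. This is a toy Python sketch of the general idea, not the authors' H-Dispatch implementation, which targets multi-core shared-memory numerical codes.

```python
import queue
import threading

def pull_dispatch(tasks, work_fn, n_workers=4):
    """Hungry pull: each worker requests its next task only when it is free,
    so fast workers naturally take more tasks and at most n_workers tasks
    are in flight at any moment (unlike a static scatter-gather split)."""
    q = queue.Queue()
    for item in enumerate(tasks):
        q.put(item)                     # (index, task) pairs
    results = [None] * len(tasks)

    def worker():
        while True:
            try:
                i, t = q.get_nowait()   # pull on demand
            except queue.Empty:
                return                  # no work left
            results[i] = work_fn(t)     # write into a preallocated slot

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results

# Toy workload standing in for per-subdomain compute in a time step.
results = pull_dispatch(range(100), lambda x: x * x)
```

In a real numerical code the per-task work would be a spatially decomposed block of the grid, repeated once per time step with a barrier (the joins above) between steps.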
The present meta-analysis investigated whether event-based prospective memory (PM) age effects differ by task order specificity. In specified PM tasks, the order of the ongoing and the PM task response is predetermined, which imposes demands on cognitive control to navigate the possible response options. In contrast, unspecified PM tasks do not require responding in a particular order. Based on 57 studies and more than 5,500 younger and older adults, results showed larger PM age effects in specified compared with unspecified PM tasks. Additionally, the effect of task focality on age differences was replicated. Results suggest that both pre- and postretrieval processes independently affect PM age effects. PMID:24041004
Cosmogenic exposure dating has greatly enhanced our ability to define glacial chronologies spanning several global cold periods, and glacial boulder exposure ages are now routinely used to constrain deglaciation ages. However, exposure dating involves assumptions about the geological history of the sample that are difficult to test and yet may have a profound effect on the inferred age. Two principal geological factors yield erroneous inferred ages: exposure prior to glaciation (yielding exposure ages that are too old) and incomplete exposure due to post-depositional shielding (yielding exposure ages that are too young). Here we show that incomplete exposure is more important than prior exposure, using datasets of glacial boulder 10Be exposure ages from the Tibetan Plateau (1420 boulders), Northern Hemisphere palaeo-ice sheets (631 boulders), and present-day glaciers (208 boulders). No boulders from present-day glaciers and few boulders from the palaeo-ice sheets have exposure ages significantly older than independently known deglaciation ages, indicating that prior exposure is of limited significance. Further, while a simple post-depositional landform degradation model can predict the exposure age distribution of boulders from the Tibetan Plateau, a prior exposure model fails, indicating that incomplete exposure is important. The large global dataset demonstrates that, in the absence of other evidence, glacial boulder exposure ages should be viewed as minimum limiting deglaciation ages.
Heyman, Jakob; Stroeven, Arjen P.; Harbor, Jonathan M.; Caffee, Marc W.
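The zero-erosion exposure-age relation underlying such boulder datasets can be written down directly; a sketch follows. The production rate in the demo is a hypothetical placeholder (real rates are site- and scaling-scheme-specific), and the shielding check mirrors the paper's point that incomplete exposure yields ages that are too young, i.e. minimum limiting deglaciation ages.

```python
import math

BE10_HALF_LIFE_YR = 1.387e6              # commonly cited 10Be half-life
DECAY = math.log(2) / BE10_HALF_LIFE_YR  # decay constant, 1/yr

def apparent_age(n_atoms, prod_rate):
    """Zero-erosion apparent exposure age (yr) from a 10Be concentration
    N (atoms/g) and local production rate P (atoms/g/yr):
        N = (P/lam) * (1 - exp(-lam*t))  =>  t = -ln(1 - lam*N/P) / lam
    Post-depositional shielding lowers N, so this is a minimum age."""
    return -math.log(1 - DECAY * n_atoms / prod_rate) / DECAY

# Round trip with a hypothetical production rate of 5 atoms/g/yr.
P = 5.0
true_t = 12_000.0
N = P / DECAY * (1 - math.exp(-DECAY * true_t))
recovered = apparent_age(N, P)
```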
The purpose of the present study was to test a conceptual model of relations among achievement goal orientation, self-efficacy, cognitive processing, and achievement of students working within a particular collaborative task context. The task was a collaborative computer-based modeling task. In order to test the model, group measures of mastery-approach goal orientation, performance-avoidance goal orientation, self-efficacy, and achievement were employed. Students’
Patrick H. M. Sins; Wouter R. van Joolingen; Elwin R. Savelsbergh; Bernadette van Hout-Wolters
This article explores the role of age, cognitive abilities, prior experience, and knowledge in skill acquisition for a computer-based simulated customer service task. Fifty-two participants aged 50-80 performed the task over 4 consecutive days following training. They also completed a battery that assessed prior computer experience and cognitive abilities. The data indicated that overall quality and efficiency of performance improved with practice. The predictors of initial level of performance and rate of change in performance varied according to the performance parameter assessed. Age and fluid intelligence predicted initial level and rate of improvement in overall quality, whereas crystallized intelligence and age predicted initial e-mail processing time, and crystallized intelligence predicted rate of change in e-mail processing time over days. We discuss the implications of these findings for the design of intervention strategies. PMID:17565169
Assessing occupational exposure in retrospective community-based case-control studies is difficult as measured exposure data are very seldom available. The expert assessment method is considered the most accurate way to attribute exposure but it is a time consuming and expensive process and may be seen as subjective, nonreproducible, and nontransparent. In this paper, we describe these problems and outline our solutions as operationalized in a web-based software application (OccIDEAS). The novel aspects of OccIDEAS are combining all steps in the assessment into one software package; enmeshing the process of assessment into the development of questionnaires; selecting the exposure(s) of interest; specifying rules for exposure assignment; allowing manual or automatic assessments; ensuring that circumstances in which exposure is possible for an individual are highlighted for review; providing reports to ensure consistency of assessment. Development of this application has the potential to make high-quality occupational assessment more efficient and accessible for epidemiological studies.
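The rule-based assignment step described can be sketched as declarative rules applied to a subject's questionnaire answers, with a manual override path standing in for the flagged expert review. The agents, levels, and rules below are hypothetical illustrations, not OccIDEAS's actual schema.

```python
# Hypothetical rules: (agent, assigned level, predicate over answers).
RULES = [
    ("diesel exhaust", "probable",
     lambda a: a.get("job") == "truck driver" and a.get("fuel") == "diesel"),
    ("wood dust", "probable",
     lambda a: a.get("sanding_hours_per_week", 0) >= 10),
    ("wood dust", "possible",
     lambda a: 0 < a.get("sanding_hours_per_week", 0) < 10),
]

def assess(answers, overrides=None):
    """Automatic rule-based assignment; the first matching rule per agent
    wins, and a manual override (expert review) beats the automatic result."""
    assigned = {}
    for agent, level, applies in RULES:
        if applies(answers) and agent not in assigned:
            assigned[agent] = level
    assigned.update(overrides or {})
    return assigned

subject = {"job": "truck driver", "fuel": "diesel", "sanding_hours_per_week": 3}
result = assess(subject)
```

Keeping the rules as data rather than code is what makes the assignments reproducible and transparent, the two properties the paper highlights as weaknesses of ad hoc expert assessment.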
...based on chronic effects of exposure to mustard gas and Lewisite. 3.316 Section...based on chronic effects of exposure to mustard gas and Lewisite. (a) Except...Full-body exposure to nitrogen or sulfur mustard during active military service...
This paper presents thoughts about a different form of air quality management that focuses on long-term personal exposure of individuals and goes beyond the traditional system of air quality standards. The exposure-based approach is a customer-oriented methodology that aims to individually lower the personal intake of chronically acting air pollutants and hence to improve human health more directly.
To investigate the association between sun exposure and risk of non-Hodgkin lymphoma (NHL) by histologic subtypes and to explore whether vitamin D intake modifies the sun–NHL association, we analysed data from a population-based, case–control study conducted in Nebraska between 1999 and 2002. Information on sun exposure during the spring, summer, fall and winter was collected from 387 cases and
Lori K. Soni; Lifang Hou; Susan M. Gapstur; Andrew M. Evens; Dennis D. Weisenburger; Brian C.-H. Chiu
This study was conducted to assess the quality of interview-based exposure estimates obtained in a large epidemiologic case-control study: The Northern Germany Leukemia and Lymphoma Study (1997-2002) (NLL). The NLL used standardized, face-to-face, computer-assisted interviews to record subjects' lifetime use of radiofrequency (RF)-emitting appliances such as cellular telephones, cordless telephones, baby monitors, and television headphones. Exposure assessment comprised 3 levels
Thomas Behrens; Claudia Terschüren; Wolfgang Hoffmann
Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic
The three nickel-base superalloys B-1900, TRW-NASA VIA, and René 80 were studied utilizing metallographic and residue analysis techniques in conjunction with mechanical property tests to determine the effect of thermal exposure on the microstructure and mechanical properties. Exposure times of 10, 100, 1000, and 5000 h at temperatures from 1400 to 2000°F (760 to 1093°C) were evaluated. Four minor phases-MC,
This paper describes the design of a programmable stand-alone system for real time vision pre-processing tasks. The system's architecture has been implemented and tested using an ACE16k chip and a Xilinx xc4028xl FPGA. The ACE16k chip consists basically of an array of 128x128 identical mixed-signal processing units, locally interacting, which operate in accordance with single instruction multiple data (SIMD) computing architectures and has been designed for high speed image pre-processing tasks requiring moderate accuracy levels (7 bits). The input images are acquired using the optical input capabilities of the ACE16k chip, and after being processed according to a programmed algorithm, the images are represented at real time on a TFT screen. The system is designed to store and run different algorithms and to allow changes and improvements. Its main board includes a digital core, implemented on a Xilinx 4028 Series FPGA, which comprises a custom programmable Control Unit, a digital monochrome PAL video generator and an image memory selector. Video SRAM chips are included to store and access images processed by the ACE16k. Two daughter boards hold the program SRAM and a video DAC-mixer card is used to generate composite analog video signal.
PURPOSE Primary care faces the dilemma of excessive patient panel sizes in an environment of a primary care physician shortage. We aimed to estimate primary care panel sizes under different models of task delegation to nonphysician members of the primary care team. METHODS We used published estimates of the time it takes for a primary care physician to provide preventive, chronic, and acute care for a panel of 2,500 patients, and modeled how panel sizes would change if portions of preventive and chronic care services were delegated to nonphysician team members. RESULTS Using 3 assumptions about the degree of task delegation that could be achieved (77%, 60%, and 50% of preventive care, and 47%, 30%, and 25% of chronic care), we estimated that a primary care team could reasonably care for a panel of 1,947, 1,523, or 1,387 patients. CONCLUSIONS If portions of preventive and chronic care services are delegated to nonphysician team members, primary care practices can provide recommended preventive and chronic care with panel sizes that are achievable with the available primary care workforce. PMID:22966102
Altschuler, Justin; Margolius, David; Bodenheimer, Thomas; Grumbach, Kevin
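The modeling step described is linear arithmetic: physician time scales with panel size, and delegation removes fractions of the preventive and chronic workload. A minimal sketch follows; the per-2,500-panel daily hour figures are illustrative assumptions, not the paper's published inputs.

```python
def hours_needed(panel, prev_h, chronic_h, acute_h,
                 deleg_prev=0.0, deleg_chronic=0.0):
    """Daily physician hours for `panel` patients, scaling linearly from the
    hours a 2,500-patient panel demands, with fractions of preventive and
    chronic work delegated to nonphysician team members."""
    per_2500 = (prev_h * (1 - deleg_prev)
                + chronic_h * (1 - deleg_chronic)
                + acute_h)
    return per_2500 * panel / 2500.0

def supportable_panel(avail_h, prev_h, chronic_h, acute_h,
                      deleg_prev, deleg_chronic):
    """Largest panel whose physician workload fits in avail_h hours/day."""
    per_2500 = (prev_h * (1 - deleg_prev)
                + chronic_h * (1 - deleg_chronic)
                + acute_h)
    return 2500.0 * avail_h / per_2500

# Hypothetical per-2,500-panel daily hours: preventive, chronic, acute care.
PREV, CHRONIC, ACUTE = 7.4, 10.6, 4.6
no_deleg = supportable_panel(8.0, PREV, CHRONIC, ACUTE, 0.0, 0.0)
high_deleg = supportable_panel(8.0, PREV, CHRONIC, ACUTE, 0.77, 0.47)
```

With these (assumed) inputs the qualitative conclusion reproduces: delegating 77% of preventive and 47% of chronic care roughly doubles the panel an 8-hour physician day can support.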
Today's largest systems have over 100,000 cores, with million-core systems expected over the next few years. This growing scale makes debugging the applications that run on them a daunting challenge. Few debugging tools perform well at this scale and most provide an overload of information about the entire job. Developers need tools that quickly direct them to the root cause of the problem. This paper presents AutomaDeD, a tool that identifies which tasks of a large-scale application first manifest a bug at a specific code region at a specific point during program execution. AutomaDeD creates a statistical model of the application's control-flow and timing behavior that organizes tasks into groups and identifies deviations from normal execution, thus significantly reducing debugging effort. In addition to a case study in which AutomaDeD locates a bug that occurred during development of MVAPICH, we evaluate AutomaDeD on a range of bugs injected into the NAS parallel benchmarks. Our results demonstrate that AutomaDeD detects the time period when a bug first manifested itself with 90% accuracy for stalls and hangs and 70% accuracy for interference faults. It identifies the subset of processes first affected by the fault with 80% and 70% accuracy, respectively, and the code region where the fault first manifested with 90% and 50% accuracy, respectively.
Bronevetsky, G; Laguna, I; Bagchi, S; de Supinski, B R; Ahn, D; Schulz, M
In recent models of decision-making, cognitive scientists have examined the relationship between option generation and successful performance. These models suggest that those who are successful at decision-making generate few courses of action and typically choose the first, often best, option. Scientists working in the area of expert performance, on the other hand, have demonstrated that the ability to generate and prioritize task-relevant options during situation assessment is associated with successful performance. In the current study, we measured law enforcement officers' performance and thinking in a simulated task environment to examine the option generation strategies used during decision-making in a complex domain. The number of options generated during assessment (i.e., making decisions about events in the environment) and intervention (i.e., making decisions about personal courses of action) phases of decision-making interact to produce a successful outcome. The data are explained with respect to the development of a situational representation and long-term working memory skills capable of supporting both option generation processes. PMID:21461753
Ward, Paul; Suss, Joel; Eccles, David W; Williams, A Mark; Harris, Kevin R
Tasks were assigned to Oak Ridge National Laboratory (ORNL) researchers for the development of lignin-based carbon fiber from a specific precursor that was produced by the Participant (Weyerhaeuser Corporation). These tasks included characterization of precursor polymers and fibers; and the development of conversion parameters for the fibers. ORNL researchers provided recommendations for in-house characterization of the precursor at the participant's
Felix L Paulauskas; Amit K Naskar; Soydan Ozcan; James R Keiser; John Peter Gorog
The job-exposure matrix described has been developed for use in population based studies of occupational morbidity and mortality in England and Wales. The job axis of the matrix is based on the Registrar General's 1966 classification of occupations and 1968 classification of industries, and comprises 669 job categories. The exposure axis is made up of 49 chemical, physical, and biological agents, most of which are known or suspected causes of occupational disease. In the body of the matrix associations between jobs and exposures are graded to four levels. The matrix has been applied to data from a case-control study of lung cancer in which occupational histories were elicited by means of a postal questionnaire. Estimates of exposure to five known or suspected carcinogens (asbestos, chromates, cutting oils, formaldehyde, and inhaled polycyclic aromatic hydrocarbons) were compared with those obtained by detailed review of individual occupational histories. When the matrix was used exposures were attributed to jobs more frequently than on the basis of individual histories. Lung cancer was significantly more common among subjects classed by the matrix as having potential exposure to chromates, but neither method of assigning exposures produced statistically significant associations with asbestos or polycyclic aromatic hydrocarbons. Possible explanations for the failure to show a clear effect of these known carcinogens are discussed. The greater accuracy of exposures inferred directly from individual histories was reflected in steeper dose response curves for asbestos, chromates, and polycyclic aromatic hydrocarbons. The improvement over results obtained with the matrix, however, was not great. For occupational data of the type examined in this study, direct exposure estimates offer little advantage over those provided at lower cost by a matrix.
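Mechanically, a job-exposure matrix is a lookup table of job categories against agents with graded cells, queried over each subject's occupational history. A toy sketch (the jobs, agents, and grades here are hypothetical; the real matrix has 669 job categories, 49 agents, and four levels):

```python
# Toy matrix: job category -> agent -> grade, 0 (none) to 3 (high).
JEM = {
    "welder":    {"asbestos": 1, "chromates": 3, "cutting oils": 0},
    "machinist": {"asbestos": 0, "chromates": 0, "cutting oils": 3},
    "plumber":   {"asbestos": 3, "chromates": 0, "cutting oils": 1},
}

def ever_exposed(job_history, agent, threshold=1):
    """Subject counts as exposed if any job ever held meets the grade
    threshold; unknown jobs contribute no exposure."""
    return any(JEM.get(job, {}).get(agent, 0) >= threshold
               for job in job_history)

def max_grade(job_history, agent):
    """Highest exposure grade across an occupational history, e.g. for
    building dose-response categories."""
    return max((JEM.get(job, {}).get(agent, 0) for job in job_history),
               default=0)
```

The study's finding that the matrix attributes exposure more liberally than history review follows from this design: any subject whose job code carries a nonzero grade is swept in, regardless of what they actually did in that job.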
It has been proposed that cognitive reserve is supported by two neural mechanisms: neural compensation and neural reserve. The purpose of this study was to test how these neural mechanisms are solicited in aging in the context of visual selective attention processing and whether they are inter- or intra-hemispheric. Younger and older participants were scanned using fMRI during a visual letter-matching task with two attentional load levels. The results show that in the low-load condition, the older participants activated frontal superior gyri bilaterally; these regions were not activated in the younger participants, in accordance with the compensation mechanism and the Posterior-Anterior Shift in Aging (PASA) phenomenon. However, when task demand increased, the older participants recruited the same regions (parietal) as the younger ones, showing the involvement of a similar neural reserve mechanism. This result suggests that successful cognitive aging relies on the concurrent use of both neural compensation and neural reserve in high-demand tasks, calling on the frontoparietal network. In addition, the finding of intra-hemispheric-based neurofunctional reorganization with a PASA phenomenon for all attentional load levels suggests that the PASA phenomenon is a function more of compensation than of reserve. PMID:23453977
A measurement procedure is explained that combines simple simulations with broadband and frequency-selective electromagnetic measurements to check, in a practical way, whether a base station site complies with restrictions regarding exposure to electromagnetic radiation. The measuring method enables accurate and objective control of a base station site in a practical and reliable way. The procedure provides
Electromagnetic–thermal analysis of human exposure to base station antennas radiation is presented in this article. The formulation is based on a simplified cylindrical representation of the human body. Electromagnetic analysis involves incident and internal field dosimetry, while the thermal model deals with the bio-heat transfer phenomena in the body. The electric field induced in the body is determined from the
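Compliance assessments of this kind ultimately compare the incident power density near the antenna against a frequency-dependent reference level. A minimal sketch, using the far-field estimate S = PG/(4πr²) and the ICNIRP (1998) general-public formula for 400-2000 MHz; the sector parameters in the demo are hypothetical.

```python
import math

def power_density(p_watts, gain_linear, r_m):
    """Far-field incident power density (W/m^2): S = P*G / (4*pi*r^2)."""
    return p_watts * gain_linear / (4 * math.pi * r_m ** 2)

def icnirp_1998_public_limit(f_mhz):
    """ICNIRP (1998) general-public power-density reference level,
    S_limit = f/200 W/m^2 for f in the 400-2000 MHz band."""
    if not 400 <= f_mhz <= 2000:
        raise ValueError("formula only covers 400-2000 MHz")
    return f_mhz / 200.0

# Hypothetical GSM900 sector: 20 W radiated, ~17 dBi gain (50 linear), 50 m.
s = power_density(20.0, 50.0, 50.0)
ratio = s / icnirp_1998_public_limit(900.0)  # fraction of the limit
```

Ground-level ratios well under 1% of the limit, as in this example, are consistent with the multi-country survey results reported below.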
Next generation telecommunication network operators securely opening up their network capabilities and services to third party service providers require flexible service delivery platforms. Policy based service exposure, service discovery and service composition mechanisms are required to offer chargeable services and service building blocks to external entities in a customizable way. Based on research and developments conducted while prototyping solutions for
Purpose – The purpose of this paper is to explore the efficacy of a community-based outreach initiative, piloted in Worcester, Massachusetts, to reduce children's exposure to toxic chemicals in common household products by changing parental behavior regarding product purchase and use. Design/methodology/approach – The program model was based on the premise that community health workers have the potential to deliver
This paper presents analyses of data from surveys of radio base stations in 23 countries across five continents from the year 2000 onward and includes over 173,000 individual data points. The research compared the results of the national surveys, investigated chronological trends and compared exposures by technology. The key findings from this data are that irrespective of country, the year and cellular technology, exposures to radio signals at ground level were only a small fraction of the relevant human exposure standards. Importantly, there has been no significant increase in exposure levels since the widespread introduction of 3G mobile services, which should be reassuring for policy makers and negate the need for post-installation measurements at ground level for compliance purposes. There may be areas close to antennas where compliance levels could be exceeded. Future potential work includes extending the study to additional countries, development of cumulative exposure distributions and investigating the possibility of linking exposure measurements to population statistics to assess the distribution of exposure levels relative to population percentiles.
Only recently have investigations of the relationship between media violence exposure (MVE) and aggressive behavior focused on brain functioning. In this study, we examined the relationship between brain activation and history of media violence exposure in adolescents, using functional magnetic resonance imaging (fMRI). Samples of adolescents with no psychiatric diagnosis or with disruptive behavior disorder (DBD) with aggression were compared
Andrew J. Kalnin; Chad R. Edwards; Yang Wang; William G. Kronenberger; Tom A. Hummer; Kristine M. Mosier; David W. Dunn; Vincent P. Mathews
The US EPA Worker Protection Standard requires pesticide safety training for farmworkers. Combined with re-entry intervals, these regulations are designed to reduce pesticide exposure. Little research has been conducted on whether additional steps may reduce farmworker exposure and the potential for take-home exposure to their families. We conducted an intervention with 44 strawberry harvesters (15 control and 29 intervention group
Asa Bradman; Alicia L Salvatore; Mark Boeniger; Rosemary Castorina; John Snyder; Dana B Barr; Nicholas P Jewell; Geri Kavanagh-Baird; Cynthia Striley; Brenda Eskenazi
This paper reports on a usability evaluation of BoBIs (Back-of-the-book Indexes) as searching and browsing tools in an e-book environment. This study employed a task-based approach and within-subject design. The retrieval performance of a BoBI was compared with a ToC and Full-Text Search tool in terms of their respective effectiveness and efficiency for finding information in e-books. The results demonstrated
There is substantial evidence that exposure therapy is an effective treatment for posttraumatic stress disorder (PTSD). Notwithstanding its efficacy, there is room for improvement, since a large proportion of patients does not benefit from treatment. Recently, an interesting new direction in the improvement of exposure therapy efficacy for PTSD has emerged. Basic research has found evidence of pharmacological enhancement of the underlying learning and memory processes of exposure therapy. The current review aims to give an overview of clinical studies on pharmacological enhancement of exposure-based treatment for PTSD. The working mechanisms, efficacy studies in PTSD patients, and clinical utility of four different pharmacological enhancers will be discussed: d-cycloserine, MDMA, hydrocortisone, and propranolol.
de Kleine, Rianne A.; Rothbaum, Barbara O.; van Minnen, Agnes
Background Published research on the use of Web-based behavior change programs is growing rapidly. One of the observations characterized as problematic in these studies is that participants often make relatively few website visits and spend only a brief time accessing the program. Properly structured websites permit the unobtrusive measurement of the ways in which participants access (are exposed to) program content. Research on participant exposure to Web-based programs is not merely of interest to technologists, but represents an important opportunity to better understand the broader theme of program engagement and to guide the development of more effective interventions. Objectives The current paper seeks to provide working definitions and describe initial patterns of various measures of participant exposure to ChewFree.com, a large randomized controlled trial of a Web-based program for smokeless tobacco cessation. Methods We examined measures of participant exposure to either an Enhanced condition Web-based program (interactive, tailored, and rich-media program) or a Basic condition control website (static, text-based material). Specific measures focused on email prompting, participant visits (number, duration, and pattern of use over time), and Web page viewing (number of views, types of pages viewed, and Web forum postings). Results Participants in the ChewFree.com Enhanced condition made more visits and spent more time accessing their assigned website than did participants assigned to the Basic condition website. In addition, exposure data demonstrated that Basic condition users thoroughly accessed program content, indicating that the condition provided a meaningful, face-valid control to the Enhanced condition. 
Conclusions We recommend that researchers conducting evaluations of Web-based interventions consider the collection and analysis of exposure measures in the broader context of program engagement in order to assess whether participants obtain sufficient exposure to relevant program content.
Boles, Shawn M; Akers, Laura; Gordon, Judith S; Severson, Herbert H
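Exposure measures of the kind reported (number of visits, time spent) can be derived from raw page-view timestamps by sessionizing with a gap threshold. The 30-minute cutoff below is a common web-analytics convention, not necessarily the definition used in the ChewFree.com study.

```python
def visit_summary(pageview_times, gap_s=1800):
    """Group page-view timestamps (seconds) into visits: a gap longer than
    gap_s starts a new visit. Returns (visit count, total seconds spent
    inside visits). Time on the final page of a visit is unobservable from
    page-view events alone, so it is not counted."""
    if not pageview_times:
        return 0, 0
    ts = sorted(pageview_times)
    visits, total, start, prev = 1, 0, ts[0], ts[0]
    for t in ts[1:]:
        if t - prev > gap_s:
            total += prev - start   # close the previous visit
            visits += 1
            start = t               # open a new one
        prev = t
    total += prev - start
    return visits, total
```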
Over recent years, physiologically based pharmacokinetic (PBPK) models have been used to better describe internal doses resulting from exposures to chemicals in the environment. PBPK models are mathematical descriptions of how chemicals are absorbed into,...
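As a minimal illustration of the internal-dose idea (not a PBPK model, which chains many physiological compartments with blood-flow and partition terms), a single well-stirred compartment under constant intake has a closed-form body burden:

```python
import math

def body_burden(intake_rate, k_elim, t):
    """One-compartment burden under constant intake:
        dA/dt = intake_rate - k_elim * A
        =>  A(t) = (intake_rate / k_elim) * (1 - exp(-k_elim * t))
    approaching the steady state intake_rate / k_elim. Units are whatever
    the caller uses consistently (e.g. mg/day intake, 1/day elimination)."""
    return intake_rate / k_elim * (1 - math.exp(-k_elim * t))
```

The same first-order structure, replicated per tissue and coupled through blood flow, is what PBPK models solve numerically.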
Context: Metatarsal stress fractures are common in cleated-sport athletes. Previous authors have shown that plantar loading varies with footwear, sex, and the athletic task. Objective: To examine the effects of shoe type and sex on plantar loading in the medial midfoot (MMF), lateral midfoot (LMF), medial forefoot (MFF), middle forefoot (MidFF), and lateral forefoot (LFF) during a jump-landing task. Design: Crossover study. Setting: Laboratory. Patients or Other Participants: Twenty-seven recreational athletes (14 men, 13 women) with no history of lower extremity injury in the last 6 months and no history of foot or ankle surgery. Main Outcome Measure(s): The athletes completed 7 jumping trials while wearing bladed-cleat, turf-cleat, and running shoes. Maximum force, contact area, contact time, and the force-time integral were analyzed in each foot region. We calculated 2 × 3 analyses of variance (α = .05) to identify shoe-condition and sex differences. Results: We found no shoe × sex interactions, but the MMF, LMF, MFF, and LFF force-time integrals were greater in men (P < .03). The MMF maximum force was less with the bladed-cleat shoes (P = .02). Total foot and MidFF maximum force was less with the running shoes (P < .01). The MFF and LFF maximum forces were different among all shoe conditions (P < .01). Total foot contact area was less in the bladed-cleat shoes (P = .01). The MMF contact area was greatest in the running shoes (P < .01). The LFF contact area was less in the running shoes (P = .03). The MFF and LFF force-time integrals were greater with the bladed-cleat shoes (P < .01). The MidFF force-time integral was less in the running shoes (P < .01). Conclusions: Independent of shoe, men and women loaded the foot differently during a jump landing. The bladed cleat increased forefoot loading, which may increase the risk for forefoot injury.
The type of shoe should be considered when choosing footwear for athletes returning to activity after metatarsal stress fractures. PMID:24067149
Debiasio, Justin C; Russell, Mary E; Butler, Robert J; Nunley, James A; Queen, Robin M
A method for cognitive task analysis is described based on the notion of 'generic tasks'. The method distinguishes three layers of analysis. At the first layer, the task structure, top-level goals of a certain task are identified that have to be fulfilled...
Personal exposures were estimated for a large cohort of workers in the U.S. domestic system for distributing gasoline by trucks and marine vessels. This assessment included development of a rationale and methodology for extrapolating vapor exposures prior to the availability of measurement data, analysis of existing measurement data to estimate task and job exposures during 1975-1985, and extrapolation of truck and marine job exposures before 1975. A worker's vapor exposure was extrapolated from three sets of factors: the tasks in his or her job associated with vapor sources, the characteristics of vapor sources (equipment and other facilities) at the work site, and the composition of petroleum products producing vapors. Historical data were collected on the tasks in job definitions, on work-site facilities, and on product composition. These data were used in a model to estimate the overall time-weighted-average vapor exposure for jobs based on estimates of task exposures and their duration. Task exposures were highest during tank filling in trucks and marine vessels. Measured average annual, full-shift exposures during 1975-1985 ranged from 9 to 14 ppm of total hydrocarbon vapor for truck drivers and 2 to 35 ppm for marine workers on inland waterways. Extrapolated past average exposures in truck operations were highest for truck drivers before 1965 (range 140-220 ppm). Other jobs in truck operations resulted in much lower exposures. Because there were few changes in marine operations before 1979, exposures were assumed to be the same as those measured during 1975-1985. Well-defined exposure gradients were found across jobs within time periods, which were suitable for epidemiologic analyses. PMID:8020436
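The core computation described here, combining task-level exposures and task durations into a full-shift time-weighted average, can be sketched as follows. The function name and the example figures are illustrative assumptions, not the study's measurement data:

```python
def time_weighted_average(task_data, shift_minutes=480):
    """Combine task-level exposures into a full-shift TWA.

    task_data: list of (exposure_ppm, duration_minutes) pairs; any
    remaining shift time is assumed to carry zero exposure.
    (Illustrative values only, not the study's measurements.)
    """
    exposed = sum(conc * minutes for conc, minutes in task_data)
    return exposed / shift_minutes

# Hypothetical truck-driver shift: 60 min of tank filling at 80 ppm,
# the remaining 420 min near a 2 ppm background.
tasks = [(80.0, 60), (2.0, 420)]
twa = time_weighted_average(tasks)
print(round(twa, 2))  # (80*60 + 2*420)/480 = 11.75
```

The same structure extends to any number of tasks per job, which is how a job-level exposure model can be assembled from task-level estimates.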
For the last few years, development and optimization of three-dimensional (3D) x-ray breast imaging systems, such as breast tomosynthesis and computed tomography, has drawn much attention from the medical imaging community, in both academia and industry. However, the trade-offs between patient safety and the efficacy of the devices have yet to be investigated with use of objective performance metrics. Moreover, as the 3D imaging systems give depth information that was not available in planar mammography, standard mammography quality assurance and control (QA/QC) phantoms used for measuring system performance are not appropriate since they do not account for background variability and clinically relevant tasks. Therefore, it is critical to develop QA/QC methods that incorporate background variability with use of a task-based statistical assessment methodology. In this work, we develop a physical phantom that simulates variable backgrounds using spheres of different sizes and densities, and present an evaluation method based on statistical decision theory, in particular, with use of the ideal linear observer, for evaluating planar and 3D x-ray breast imaging systems. We demonstrate our method for a mammography system and compare the variable phantom case to that of a phantom of the same dimensions filled with water. Preliminary results show that measuring the system's detection performance without consideration of background variability may lead to misrepresentation of system performance.
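The ideal linear (Hotelling) observer mentioned above is conventionally characterized by a detectability index SNR² = Δgᵀ K⁻¹ Δg, where Δg is the mean signal difference and K the background covariance. A minimal sketch, using a synthetic signal and Gaussian stand-in backgrounds (all dimensions and values are assumed for illustration, not taken from the phantom study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for variable-background images: n pixels, m samples.
n, m = 16, 5000
backgrounds = rng.normal(0.0, 1.0, size=(m, n))
signal = np.zeros(n)
signal[6:10] = 0.5          # hypothetical low-contrast lesion profile

# Hotelling (ideal linear) observer: template w = K^{-1} dg, and the
# detectability index SNR^2 = dg^T K^{-1} dg, with K the background
# covariance estimated from the samples.
K = np.cov(backgrounds, rowvar=False)
w = np.linalg.solve(K, signal)
snr2 = float(signal @ w)
print(snr2 > 0)  # True
```

With correlated (structured) backgrounds, K is no longer diagonal and the detectability can differ sharply from the uniform-background case, which is the point the abstract makes about uniform QA/QC phantoms.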
Seroprevalence data illustrate that human exposure to Toxocara is frequent. Environmental contamination with Toxocara spp. eggs is assumed to be the best indicator of human exposure, but increased risk of exposure has also been associated with many other factors. Reported associations are inconsistent, however, and there is still ambiguity regarding the factors driving the onset of Toxocara antibody positivity. The objective of this work was to assess the validity of our current conceptual understanding of the key processes driving human exposure to Toxocara. We constructed an agent-based model predicting Toxocara antibody positivity (as a measure of exposure) in children. Exposure was assumed to depend on the joint probability of 3 parameters: (1) environmental contamination with Toxocara spp. eggs, (2) larvation of these eggs and (3) the age-related contact with these eggs. This joint probability was linked to processes of acquired humoral immunity, influencing the rate of antibody seroreversion. The results of the simulation were validated against published data from 5 different geographical settings. Using simple rules and a stochastic approach with parameter estimates derived from the respective contexts, plausible serological patterns emerged from the model in nearly all settings. Our approach leads to novel insights in the transmission dynamics of Toxocara. PMID:23574630
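The joint-probability exposure rule described above can be sketched as a toy agent simulation. All parameter values, the monthly time step, and the seroreversion rate below are illustrative assumptions, not the paper's calibrated estimates:

```python
import random

def simulate_child(p_contam, p_larv, contact_by_age, p_serorevert,
                   years=10, steps_per_year=12, seed=1):
    """Toy agent: monthly exposure chance is the joint probability of
    environmental contamination, egg larvation, and age-dependent
    contact; antibody positivity can serorevert over time."""
    rng = random.Random(seed)
    positive = False
    for step in range(years * steps_per_year):
        age = step // steps_per_year
        p_exposure = p_contam * p_larv * contact_by_age(age)
        if rng.random() < p_exposure:
            positive = True          # exposure (re)triggers antibody response
        elif positive and rng.random() < p_serorevert:
            positive = False         # acquired immunity wanes
    return positive

# Hypothetical setting: contact with eggs peaks in early childhood.
contact = lambda age: 0.6 if age < 5 else 0.2
prev = sum(simulate_child(0.3, 0.5, contact, 0.02, seed=s)
           for s in range(500)) / 500
print(0.0 <= prev <= 1.0)  # a seroprevalence fraction at age 10
```

Running many such agents across ages yields an age-seroprevalence curve that can be compared against observed serological data, which is the validation strategy the abstract describes.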
Kanobana, K; Devleesschauwer, B; Polman, K; Speybroeck, N
The need for resource-intensive laboratory assays to assess exposures in many epidemiologic studies provides ample motivation to consider study designs that incorporate pooled samples. In this paper, we consider the case in which specimens are combined for the purpose of determining the presence or absence of a pool-wise exposure, in lieu of assessing the actual binary exposure status for each member of the pool. We presume a primary logistic regression model for an observed binary outcome, together with a secondary regression model for exposure. We facilitate maximum likelihood analysis by complete enumeration of the possible implications of a positive pool, and we discuss the applicability of this approach under both cross-sectional and case-control sampling. We also provide a maximum likelihood approach for longitudinal or repeated measures studies where the binary outcome and exposure are assessed on multiple occasions and within-subject pooling is conducted for exposure assessment. Simulation studies illustrate the performance of the proposed approaches along with their computational feasibility using widely available software. We apply the methods to investigate gene-disease association in a population-based case-control study of colorectal cancer. PMID:22415630
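The building blocks of such a pooled design can be sketched as follows: under a perfect assay, a pool tests positive exactly when at least one member is exposed, and the likelihood requires enumerating the exposure configurations consistent with a positive pool. This is a simplified illustration with hypothetical probabilities, not the paper's full maximum likelihood machinery:

```python
from itertools import product

def pool_positive_prob(ps):
    """P(pool tests positive) when the pool is positive iff at least
    one member is exposed (perfect assay assumed; ps illustrative)."""
    prob_all_negative = 1.0
    for p in ps:
        prob_all_negative *= (1.0 - p)
    return 1.0 - prob_all_negative

def enumerate_positive_configs(k):
    """All binary exposure vectors consistent with a positive pool of
    size k -- the complete enumeration step used in the likelihood."""
    return [c for c in product((0, 1), repeat=k) if any(c)]

ps = [0.2, 0.1, 0.3]
print(round(pool_positive_prob(ps), 4))   # 1 - 0.8*0.9*0.7 = 0.496
print(len(enumerate_positive_configs(3))) # 2^3 - 1 = 7
```

For pools of realistic size the enumeration stays small (2^k - 1 configurations), which is why complete enumeration is computationally feasible in this setting.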
With the increased use of computer-based technology, the work of the human has become more cognitive. As a result, the objective measurement of mental workload has become vital in the design of jobs and the development of adaptive interfaces. This research focused on developing and validating an Ohm's law analogy for mental workload based on individuality and the human's ability
Umesh H. Patel; Gavriel Salvendy; Leslie A. Geddes; Thomas Kuczek
As part of Alberta Education's (Canada) broadened assessment initiatives, a sample of 693 Grade 3 students from across the province participated in the Mathematics Performance-based Assessment, 1994. This activity-based assessment, using picture books and manipulatives, was developed by Grade 3 teachers to assess a broad range of skills not easily…
Alberta Dept. of Education, Edmonton. Student Evaluation Branch.
Acquiring information about our environment through touch is vital in everyday life. Yet very little literature exists about factors that may influence haptic or tactile processing. Recent neuroimaging studies have reported haptic laterality effects that parallel those reported in the visual literature. With the use of a haptic variant of the classical line bisection task, the present study aimed to determine the presence of laterality effects on a behavioural level. Specifically, three handedness groups, including strong dextrals, strong sinistrals, and mixed-handers (a to-date largely neglected group), were examined in their ability to accurately bisect stimuli constructed from corrugated board strips of various lengths. Stimulus factors known to play a role in visuospatial perception, including stimulus location, the hand used for bisection, and direction of exploration, were systematically varied through pseudo-randomisation. Similar to the visual domain, stimulus location and length as well as participants' handedness and the hand used for bisection exerted a significant influence on participants' estimate of the centre of haptically explored stimuli. However, these effects differed qualitatively from those described for the visual domain, and the factor direction of exploration did not exert any significant effect. This indicates that laterality effects reported on a neural level are sufficiently pronounced to result in measurable behavioural effects. The results, first, add to laterality effects reported for the visual and auditory domain; second, are in line with supramodal spatial processing; and third, provide additional evidence for a conceptualisation of pseudoneglect and neglect as signs of hemispheric attentional asymmetries. PMID:22385141
Background/Question/Methods We have created an integrated web-based tool designed to estimate exposure doses and ecological risks under the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Endangered Species Act. This involved combining a number of disparat...
Great social concern has arisen about the potential health hazard of living near a cellular telephony base-station antenna, and certain technical questions have been posed on the appropriate way to measure exposure in its vicinity. In this paper, a standard spherical near-field transformation is proposed to obtain the electromagnetic field close to the antenna in free space conditions. The
This article reports on an exploratory study into young people's exposure to aggression and violence. It undertakes a collective examination of the domains occupied by young people and in doing so focuses on an area that has for the most part been overlooked by previous researchers in the UK. The analysis is based on the responses of 98 young…
Potential inhalation exposure of building occupants to volatile chemicals in water-based hard-surface cleaners was evaluated by analyzing 267 material safety data sheets (MSDSs). Among the 154 chemicals reported, 44 are volatile or semi-volatile. Hazardous air pollutants (HAPs) r...
Privacy conscious online shoppers find themselves forced to disseminate and share private data with new entities when engaging in online transactions. This paper provides a solution that limits private data exposure to entities that already have it. First, we provide a high-level classification of current privacy assurance practices. We identify three categories that are based on who is responsible for
Background Asbestos is classified as a human carcinogen, and studies have consistently demonstrated that workplace exposure to it increases the risk of developing lung cancer. Few studies have evaluated risks in population-based settings where there is a greater variety in the types of occupations, and exposures. Methods This was a population based case–control study with 1,681 incident cases of lung cancer, and 2,053 controls recruited from 8 Canadian provinces between 1994 and 1997. Self-reported questionnaires were used to elicit a lifetime occupational history, including general tasks, and information for other risk factors. Occupational hygienists, who were blinded to case–control status, assigned asbestos exposures to each job on the basis of (i) concentration (low, medium, high), (ii) frequency (<5%, 5-30%, and >30% of the time in a normal work week), and (iii) reliability (possible, probable, definite). Logistic regression was used to estimate odds ratios (ORs) and their corresponding 95% confidence intervals (CI). Results Those occupationally exposed to (i) low, and (ii) medium or high concentrations of asbestos had ORs for lung cancer of 1.17 (95% CI=0.92 – 1.50) and 2.16 (95% CI=1.21-3.88), respectively, relative to those who were unexposed. Medium or high exposure to asbestos roughly doubled the risk for lung cancer across all three smoking pack-year categories. The joint relationship between smoking and asbestos was consistent with a multiplicative risk model. Conclusions Our findings provide further evidence that exposure to asbestos has contributed to an increased risk of lung cancer in Canadian workplaces, and suggests that nearly 3% of lung cancers among Canadian men are caused by occupational exposure to asbestos.
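For intuition, an unadjusted odds ratio with a Woolf-type confidence interval can be computed from a 2×2 table as below. The counts are hypothetical, and the study itself used logistic regression with covariate adjustment rather than this crude calculation:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted OR and Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls.
    Counts used below are illustrative, not the study's data."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(40, 20, 800, 900)
print(round(or_, 2))  # (40*900)/(20*800) = 2.25
```

A CI that excludes 1.0 corresponds to a statistically significant association at the chosen level, which is how intervals such as the reported 1.21-3.88 are read.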
We study the effects of a moving reference frame on the classification of different mental tasks in a brain-machine interface (BMI). We use the band powers and power differences of the electroencephalogram (EEG) signals from 8 surface electrodes during 5 pre-determined mental tasks as the features for the neural network (NN) mental task classifier. We compare the NN classifier performance
In a four-facility occupational epidemiology study of chloroprene monomer and polymer production workers, the chloroprene (CD) and vinyl chloride monomer (VCM) exposures were modeled for plant specific job title classes. In two facilities an acetylene-based process was used and in the other two plants only a butadiene-based process was used in the monomer synthesis. In the Acetylene process VCM was
Nurtan A. Esmen; Thomas A. Hall; Margaret L. Phillips; E. Paige Jones; Heather Basara; Gary M. Marsh; Jeanine M. Buchanich
In this paper, the human exposure to the electromagnetic field radiated by a radio base-station antenna operating around 900 MHz in an urban environment has been analyzed. A hybrid ray-tracing/finite-difference time-domain (FDTD) method has been used to evaluate the incident field and the power absorbed in an exposed subject in the presence of reflecting walls. The base-station antenna has been
Paolo Bernardi; Marta Cavagnaro; Stefano Pisa; Emanuele Piuzzi
This invited article presents a brief overview of the status of evidence-based psychosocial treatments for anxiety disorders in mainstream and/or Caucasian youth relative to the little data that have accumulated about psychosocial treatments for anxiety disorders in Latino youth. The article describes an emerging culturally prescriptive framework for working with minority youth and a corresponding exposure-based cognitive behavioral treatment program
Simultaneous challenge of posture and cognition (‘dual tasks’) may predict falls better than tests of isolated components of postural control. We describe a new balance test (the Multiple Tasks Test, MTT) which (1) is based upon simultaneous assessment of multiple (>2) postural components; (2) represents everyday situations; and (3) can be applied by clinicians. Relevant risk factors for falls and
Bastiaan R Bloem; Vibeke V Valkenburg; Mathilde Slabbekoorn; Mirjam D Willemsen
In this study, acrylamide exposure from selected cereal-based baby food samples was investigated among toddlers aged 1-3 years in Turkey. The study contained three steps. The first step was collecting food consumption data and toddlers' physical properties, such as gender, age and body weight, using a questionnaire given to parents by a trained interviewer between January and March 2012. The second step was determining the acrylamide levels in food samples that were reported on by the parents in the questionnaire, using a gas chromatography-mass spectrometry (GC-MS) method. The last step was combining the determined acrylamide levels in selected food samples with individual food consumption and body weight data using a deterministic approach to estimate the acrylamide exposure levels. The mean acrylamide levels of baby biscuits, breads, baby bread-rusks, crackers, biscuits, breakfast cereals and powdered cereal-based baby foods were 153, 225, 121, 604, 495, 290 and 36 µg/kg, respectively. The minimum, mean and maximum acrylamide exposures were estimated to be 0.06, 1.43 and 6.41 µg/kg BW per day, respectively. The foods that contributed to acrylamide exposure, ranked from highest to lowest, were bread, crackers, biscuits, baby biscuits, powdered cereal-based baby foods, baby bread-rusks and breakfast cereals. PMID:23954552
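The deterministic step, combining measured concentrations with individual consumption and body weight, reduces to a simple sum. The intake amounts and body weight below are hypothetical, while the two concentration figures reuse reported means purely for illustration:

```python
def daily_exposure(intakes_g, levels_ug_per_kg, body_weight_kg):
    """Deterministic estimate: sum over foods of
    (concentration [ug/kg food] * intake [kg/day]) / body weight [kg].
    Intake amounts and body weight below are illustrative."""
    total_ug = sum(level * grams / 1000.0
                   for grams, level in zip(intakes_g, levels_ug_per_kg))
    return total_ug / body_weight_kg

# Hypothetical toddler: 30 g bread (225 ug/kg) and 20 g baby biscuits
# (153 ug/kg) per day, 12 kg body weight.
exp = daily_exposure([30, 20], [225, 153], 12.0)
print(round(exp, 3))  # exposure in ug/kg BW per day
```

Repeating this per child with questionnaire-reported intakes and weights yields the distribution from which minimum, mean and maximum exposures are taken.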
Despite the interest in the effects of the media on sexual behavior, there is no single method for assessing exposure to a particular type of media content (e.g., sex). This paper discusses the development of six sexual content exposure measures based on adolescents' own subjective ratings of the sexual content in titles in 4 media (i.e., television, music, magazines, videogames). We assessed the construct and criterion validity of these measures by examining the associations among each of these measures of exposure to sexual content as well as their associations with adolescents' sexual activity. Data were collected in summer 2005 through a web-based survey using a quota sample of 547 youth aged 14-16 from the Philadelphia area. Adolescents rated how often they were exposed to specific television shows, magazine titles, etc. on 4-point never to often scales. They also rated the sexual content of those titles on 4-point no sexual content to a lot of sexual content scales. Sexual behavior was measured using an ordered index of lifetime pre-coital and coital sexual activity. The strength of association between exposure to sexual content and sexual activity varied by medium and measure. Based on our findings, we recommend the use of a multiple media weighted sum measure. This measure produces findings that are consistent with those of similar studies. PMID:20411048
In computational grids, heterogeneous resources with different ownerships are dynamically available and distributed geographically. It is not realistic to build the resource allocation mechanisms for such computational platform without considering economic issues. Developing computational economic-based approaches is a promising avenue for building efficient, scalable and stable resource allocation mechanisms without a centralized controller for computational grids. The key difficulty in
An orthogonal subspace-based approach for two dimensional sinusoids can be applied to the bispectrum and the estimation of the frequencies of quadratically phase coupled sinusoids. The method determines peaks in the bispectral domain by the direct partitioning of noise and signal subspaces without the need for transfer function parametrization. It can be used for determining domainant fre- quencies involved in
In an effort to better understand learning teams, this study examines the effects of shared mental models on team and individual performance. The results indicate that each team's shared mental model changed significantly over the time that subjects participated in team-based learning activities. The results also showed that the shared mental…
This study addresses the need for research in three areas: (1) teachers' understandings of scientific inquiry; (2) conceptual understandings of evolutionary processes; and (3) technology-enhanced instruction using an inquiry approach. The purpose of this study was to determine in what ways "The Galapagos Finches" software-based materials created…
Crawford, Barbara A.; Zembal-Saul, Carla; Munford, Danusa; Friedrichsen, Patricia
Introduction: This is a study of hierarchical navigation: how users browse a taxonomy-based interface to an organizational repository to locate information resources. The study is part of a project to develop a taxonomy for a library and information science department to organize resources and support user browsing in a digital repository…
Khoo, Christopher S. G.; Wang, Zhonghong; Chaudhry, Abdus Sattar
In designing experiments to investigate retrieval of event memory, researchers choose between utilizing laboratory-based methods (in which to-be-remembered materials are presented to participants) and autobiographical approaches (in which the to-be-remembered materials are events from the participant's pre-experimental life). In practice, most…
McDermott, Kathleen B.; Szpunar, Karl K.; Christ, Shawn E.
A problem in evaluating the hazard represented by an environmental toxicant is that exposures can occur via multiple media such as water, land, and air. Lead is one of the toxicants of concern that has been associated with adverse effects on heme metabolism, serum vitamin D levels, and the mental and physical development of infants and children exposed at very low environmental levels. Effects of lead on development are particularly disturbing in that the consequences of early delays or deficits in physical or mental development may have long-term consequences over the lifetime of affected individuals. Experimental and epidemiologic studies have indicated that blood lead levels in the range of 10-15 micrograms/dl, or possibly lower, are likely to produce subclinical toxicity. Since a discernible threshold has not been demonstrated, it is prudent to preclude development of a Reference Dose (RfD) for lead. As an alternative, the U.S. Environmental Protection Agency (U.S. EPA) has developed the uptake/biokinetic lead model, which provides a means for evaluating the relative contribution of various media to blood lead levels in children. This approach will allow for the identification of site- and situation-specific abatement strategies based on projected blood lead levels in vulnerable human populations exposed to lead in air, diet, water, soil/dust, and paint, making it possible to evaluate the effect of regulatory decisions concerning each medium on blood lead levels and potential health effects. 35 references.
DeRosa, C.T.; Choudhury, H.; Peirano, W.B. (Environmental Criteria and Assessment Office, U.S. Environmental Protection Agency, Cincinnati, OH (United States))
This article presents an integrated, biologically based, source-to-dose assessment framework for modeling multimedia/multipathway/multiroute exposures to arsenic. Case studies demonstrating this framework are presented for three US counties (Hunterdon County, NJ; Pima County, AZ; and Franklin County, OH), representing substantially different conditions of exposure. The approach taken utilizes the Modeling ENvironment for TOtal Risk studies (MENTOR) in an implementation that incorporates and extends the approach pioneered by Stochastic Human Exposure and Dose Simulation (SHEDS), in conjunction with a number of available databases, including NATA, NHEXAS, CSFII, and CHAD, and extends modeling techniques that have been developed in recent years. Model results indicate that, in most cases, the food intake pathway is the dominant contributor to total exposure and dose to arsenic. Model predictions are evaluated qualitatively by comparing distributions of predicted total arsenic amounts in urine with those derived using biomarker measurements from the NHEXAS Region V study: the population distributions of urinary total arsenic levels calculated through MENTOR and from the NHEXAS measurements are in general qualitative agreement. Observed differences are due to various factors, such as interindividual variation in arsenic metabolism in humans, that are not fully accounted for in the current model implementation but can be incorporated in the future, in the open framework of MENTOR. The present study demonstrates that integrated source-to-dose modeling for arsenic can not only provide estimates of the relative contributions of multipathway exposure routes to the total exposure estimates, but can also estimate internal target tissue doses for speciated organic and inorganic arsenic, which can eventually be used to improve evaluation of health risks associated with exposures to arsenic from multiple sources, routes, and pathways.
This study investigates occupational exposure to electromagnetic fields in front of a multi-band base station antenna for mobile communications at 900, 1800, and 2100 MHz. The finite-difference time-domain method was used to first validate the antenna model against measurement results published in the literature and then investigate the specific absorption rate (SAR) in two heterogeneous, anatomically correct human models (Virtual Family male and female) at distances from 10 to 1000 mm. Special attention was given to simultaneous exposure to fields of three different frequencies, their interaction and the additivity of SAR resulting from each frequency. The results show that the highest frequency, 2100 MHz, results in the highest spatial-peak SAR averaged over 10 g of tissue, while the whole-body SAR is similar at all three frequencies. At distances > 200 mm from the antenna, the whole-body SAR is a more limiting factor for compliance to exposure guidelines, while at shorter distances the spatial-peak SAR may be more limiting. For the evaluation of combined exposure, a simple summation of spatial-peak SAR maxima at each frequency gives a good estimation for combined exposure, which was also found to depend on the distribution of transmitting power between the different frequency bands. PMID:21365667
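The "simple summation" rule for combined multi-band exposure can be sketched as below. The per-band SAR-per-watt figures, the power split, and the comparison limit are illustrative assumptions, not the paper's simulated values:

```python
def combined_peak_sar(sar_per_watt, powers_w):
    """Conservative combined-exposure estimate: sum the spatial-peak
    10 g SAR contributions of each frequency band, each scaled by the
    power transmitted in that band. All values are illustrative."""
    return sum(s * p for s, p in zip(sar_per_watt, powers_w))

# Hypothetical per-band peak SAR per watt (W/kg per W) at
# 900/1800/2100 MHz, and an even split of 30 W across the bands.
sar_w = [0.010, 0.014, 0.018]
powers = [10.0, 10.0, 10.0]
total = combined_peak_sar(sar_w, powers)
print(total <= 2.0)  # compare against a chosen localized-SAR limit
```

Because the summation is over maxima that need not coincide spatially, it overestimates the true combined peak, which is why it serves as a conservative compliance estimate.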
Kos, Bor; Valič, Blaž; Kotnik, Tadej; Gajšek, Peter
Wind and waves are major forces affecting the geomorphology and biota in coastal areas. We present a generally applicable method for measuring and calculating fetch length, fetch direction and wave exposure. Fetch length and direction, measured by geographic information system-based methods, are used along with wind direction and wind speed data to estimate wave height and period by applying forecasting curves. The apparent power of waves approaching the shore, used as a proxy for wave exposure, is then calculated by a linear wave model. We demonstrate our method by calculating fetch lengths and wave exposure indices for five areas with varying exposure levels and types of meteorological conditions in the Finnish Archipelago Sea, situated in the northern Baltic Sea. This method is a rapid and accurate means of estimating exposure, and is especially applicable in areas with geomorphologically varying and complicated shorelines. We expect that our method will be useful in several fields, such as basic biogeographical and biodiversity research, as well as coastal land-use planning and management.
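The fetch-to-wave-height step can be illustrated with one classical empirical forecasting relation of the SMB type for deep water. The exact coefficients vary between published versions, and the wind speed and fetch used here are hypothetical:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def smb_wave_height(wind_speed, fetch):
    """Fetch-limited significant wave height (m) from an SMB-type
    empirical forecasting curve (deep water). One of several published
    forms; used only to illustrate the fetch -> wave-height step.
    wind_speed in m/s, fetch in m."""
    x = G * fetch / wind_speed**2                        # dimensionless fetch
    return 0.283 * wind_speed**2 / G * math.tanh(0.0125 * x**0.42)

# Hypothetical archipelago shore: 12 m/s wind over a 5 km fetch.
h = smb_wave_height(12.0, 5000.0)
print(0.0 < h < 1.5)  # a sub-metre sea is plausible at this fetch
```

Combining such heights with shore-normal directions and a linear wave model then gives the apparent wave power used as the exposure proxy.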
China's urban and rural populations face very serious health risks from combustion particles. Major sources of exposure to inhalable particulates include the burning of solid fuels (biomass and coal) for household cooking and heating, coal-fired industrial and residential boilers, tobacco smoking, and diesel motor vehicles. China began to address particulate pollution problems over 25 years ago and has implemented a series of progressively more aggressive policies. This paper reviews the successes and limitations of past and existing policies for particulate controls, as well as the effects of China's economic reforms and energy policies on particulate exposure and pollution management. We examine the challenge of emissions reporting, required as part of both China's pollution levy system and emerging system for "total emissions control." Finally, we discuss practical steps toward exposure-based regulation of particulates, which would take advantage of the high cost-effectiveness for lifesaving of controlling particulate exposure from household and neighborhood sources relative to that of controlling exposure from industrial sources. PMID:12492170
We examined occupational exposures among subjects with sinonasal cancer (SNC) recorded in a population-based registry in the Lombardy Region, the most populated and industrialized Italian region. The registry collects complete clinical information and exposure to carcinogens regarding all SNC cases occurring in the population of the region. In the period 2008–2011, we recorded 210 SNC cases (137 men, 73 women). The most frequent occupational exposures were to wood (44 cases, 21.0%) and leather dust (29 cases, 13.8%), especially among men: 39 cases (28.5%) to wood and 23 cases (16.8%) to leather dust. Exposure to other agents was infrequent (<2%). Among 62 subjects with adenocarcinoma, 50% had been exposed to wood dust and 30.7% to leather dust. The proportions were around 10% in subjects with squamous cell carcinoma and about 20% for tumors with another histology. The age-standardized rates (×100,000 person-years) were 0.7 in men and 0.3 in women. Complete collection of cases and their occupational history through a specialized cancer registry is fundamental to accurately monitor SNC occurrence in a population and to uncover exposure to carcinogens in different industrial sectors, even those not considered as posing a high risk of SNC, and also in extraoccupational settings.
Mensi, Carolina; Sieno, Claudia; Riboldi, Luciano; Bertazzi, Pier Alberto
Over 2 million military and civilian personnel per year (over 1 million in the United States) are occupationally exposed, respectively, to jet propulsion fuel-8 (JP-8), JP-8 +100 or JP-5, or to the civil aviation equivalents Jet A or Jet A-1. Approximately 60 billion gallons of these kerosene-based jet fuels are annually consumed worldwide (26 billion gallons in the United States), including over 5 billion gallons of JP-8 by the militaries of the United States and other NATO countries. JP-8, for example, represents the largest single chemical exposure in the U.S. military (2.53 billion gallons in 2000), while Jet A and A-1 are among the most common sources of nonmilitary occupational chemical exposure. Although more recent figures were not available, approximately 4.06 billion gallons of kerosene per se were consumed in the United States in 1990 (IARC, 1992). These exposures may occur repeatedly to raw fuel, vapor phase, aerosol phase, or fuel combustion exhaust by dermal absorption, pulmonary inhalation, or oral ingestion routes. Additionally, the public may be repeatedly exposed to lower levels of jet fuel vapor/aerosol or to fuel combustion products through atmospheric contamination, or to raw fuel constituents by contact with contaminated groundwater or soil. Kerosene-based hydrocarbon fuels are complex mixtures of up to 260+ aliphatic and aromatic hydrocarbon compounds (C6-C17+; possibly 2000+ isomeric forms), including varying concentrations of potential toxicants such as benzene, n-hexane, toluene, xylenes, trimethylpentane, methoxyethanol, and naphthalenes (including polycyclic aromatic hydrocarbons [PAHs]), and certain other C9-C12 fractions (e.g., n-propylbenzene, trimethylbenzene isomers).
While hydrocarbon fuel exposures occur typically at concentrations below current permissible exposure limits (PELs) for the parent fuel or its constituent chemicals, it is unknown whether additive or synergistic interactions among hydrocarbon constituents, up to six performance additives, and other environmental exposure factors may result in unpredicted toxicity. While there is little epidemiological evidence for fuel-induced death, cancer, or other serious organic disease in fuel-exposed workers, large numbers of self-reported health complaints in this cohort appear to justify study of more subtle health consequences. A number of recently published studies reported acute or persisting biological or health effects from acute, subchronic, or chronic exposure of humans or animals to kerosene-based hydrocarbon fuels, to constituent chemicals of these fuels, or to fuel combustion products. This review provides an in-depth summary of human, animal, and in vitro studies of biological or health effects from exposure to JP-8, JP-8 +100, JP-5, Jet A, Jet A-1, or kerosene. PMID:12775519
Meta-analysis is an important tool for interpreting results of functional neuroimaging studies and is highly influential in predicting and testing new outcomes. Although traditional label-based review can be used to search for agreement across multiple studies, a new function-location meta-analysis technique called activation likelihood estimation (ALE) offers great improvements over conventional methods. In ALE, reported foci are modeled
Angela R. Laird; Kathryn M. McMillan; Jack L. Lancaster; Peter Kochunov; Peter E. Turkeltaub; Jose V. Pardo; Peter T. Fox
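The core of ALE, modeling each reported focus as a Gaussian probability blob and combining the blobs voxel-wise as a probabilistic union, can be sketched in one dimension. Everything below (the coordinates, the 12 mm FWHM, the 1-D grid standing in for a voxel space) is invented for illustration, not taken from the paper:

```python
import math

def gaussian_blob(center, grid, fwhm=12.0):
    # Per-voxel probability that the "true" focus lies there (1-D stand-in)
    sigma = fwhm / (8 * math.log(2)) ** 0.5   # FWHM -> standard deviation
    p = [math.exp(-((x - center) ** 2) / (2 * sigma ** 2)) for x in grid]
    s = sum(p)
    return [v / s for v in p]

def ale_map(foci, grid):
    # ALE value per voxel: union of per-focus probabilities, 1 - prod(1 - p_i)
    ale = [0.0] * len(grid)
    for f in foci:
        blob = gaussian_blob(f, grid)
        ale = [1 - (1 - a) * (1 - b) for a, b in zip(ale, blob)]
    return ale

grid = list(range(-40, 41, 2))   # 1-D stand-in for voxel coordinates (mm)
foci = [-10, -8, -12, 15]        # reported activation coordinates (made up)
ale = ale_map(foci, grid)
peak = grid[ale.index(max(ale))]
print(f"peak ALE near x = {peak} mm")
```

Because the union is taken rather than a sum, three overlapping foci near -10 dominate the single outlying focus at 15, which is exactly the convergence behavior ALE is designed to reward.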
Using WebQuests for inquiry-based learning represents a higher-order use of technology requiring students to exercise information seeking, analyzing, and synthesizing strategies. This research was designed to obtain a better understanding of how to enhance the pedagogical effectiveness of WebQuests and of how students interact with the various features inherent to informational Web sites. A major objective was to examine the
The effectiveness of individual therapy by exposure and response prevention (ERP) for obsessive-compulsive disorder (OCD) is well established, yet not all patients respond well, and some show relapse on discontinuation. This article begins by providing an overview of the personal and interpersonal experiences of OCD, focusing on interpersonal processes that maintain OCD symptoms and interfere with ERP. The study then describes a couple-based treatment program that the authors have developed to enhance ERP for individuals with OCD who are in long-term relationships. This program involves psychoeducation, partner-assisted exposure therapy, couple-based interventions aimed at changing maladaptive relationship patterns regarding OCD (i.e., symptom accommodation), and general couple therapy. Three case examples are presented to illustrate the couple-based techniques used in this treatment program. PMID:22619395
Abramowitz, Jonathan S; Baucom, Donald H; Wheaton, Michael G; Boeding, Sara; Fabricant, Laura E; Paprocki, Christine; Fischer, Melanie S
The potential for human exposure to engineered nanoparticles due to the use of nanotechnology-based consumer sprays (categorized as such by the Nanotechnology Consumer Products Inventory) is examined along with analogous products, which are not specified as nanotechnology-based (regular products). Photon correlation spectroscopy was used to obtain particle size distributions in the initial liquid products. Transmission electron microscopy was used to determine particle size, shape, and agglomeration of the particles. Realistic application of the spray products near the human breathing zone characterized airborne particles that are released during use of the sprays. Aerosolization of sprays with standard nebulizers was used to determine their potential for inhalation exposure. Electron microscopy detected the presence of nanoparticles in some nanotechnology-based sprays as well as in several regular products, whereas the photon correlation spectroscopy indicated the presence of particles <100 nm in all investigated products. During the use of most nanotechnology-based and regular sprays, particles ranging from 13 nm to 20 µm were released, indicating that they could be inhaled and consequently deposited in all regions of the respiratory system. The results indicate that exposures to nanoparticles as well as micrometer-sized particles can be encountered owing to the use of nanotechnology-based sprays as well as regular spray products. PMID:21364702
Nazarenko, Yevgen; Han, Tae Won; Lioy, Paul J; Mainelis, Gediminas
This report describes the results of Phase 1 of research undertaken to study area-related problems associated with the performance of amorphous silicon based photovoltaic submodules. Objectives are to determine the optimum submodule configuration; demonstrate proof-of-concept, single-junction submodules with 10 percent conversion efficiency over an area larger than 900 sq cm; and demonstrate proof-of-concept tandem submodules with 9 percent conversion efficiency over an area larger than 900 sq cm. The research was divided into three subtask areas: semiconductor materials, nonsemiconductor materials, and submodules.
This paper describes the implementation of a community-based youth violence prevention project that utilized an educational curriculum and a mass media campaign. The extent of penetration of the intervention into target areas and the degree of contamination of control areas are assessed, and the most frequently contacted forms of educational outreach are identified. Two sources of data, provider interviews and a random digit dialed telephone survey, were used to track the source and extent of teens' exposure to the intervention. Agency provider data revealed that 40% of the 92 contacted agencies actually conducted violence prevention education, reaching 22% of the target area teens. Approximately one-half of the surveyed teens reported some exposure to the program, with 13% of the teens in target areas reporting participation in interactive educational activities associated with the project. The most common source of exposure was the media campaign. Most teens report a single exposure, usually to the media campaign, although 29% report contact with more than one form of violence prevention education. While the project did not achieve community saturation, the data show that the community-based model of intervention for violence prevention is feasible and effective in reaching teenagers. This research highlights some difficulties in evaluating prevention programs, including reconciling community ownership with project identification, the ethics of curtailing services for control purposes, and factors influencing recall of participation. PMID:1290766
Hausman, A J; Spivak, H; Prothrow-Stith, D; Roeber, J
Anterior cruciate ligament (ACL) injuries are one of the most common and devastating knee injuries sustained whilst participating in sport. ACL reconstruction (ACLR) remains the standard approach for athletes who aim to return to high-level sporting activities, but the outcome from surgery is not assured. Secondary morbidities and an inability to return to the same competitive level are common following ACLR. One factor that might be linked to these sub-optimal outcomes may be a failure to have clearly defined performance criteria for return to activity and sport. This paper presents a commentary describing a structured return-to-sport rehabilitation protocol for athletes following ACLR. The protocol was developed from synthesis of the available literature and consensus of physiotherapists and strength and conditioning coaches based in the home country Institute of Sports within the United Kingdom. PMID:24016398
Objectives: Three groups of children from low-income, urban environments were examined to determine the effects of prenatal drug exposure (PDE) and caregiving environment on sustained visual attention (SVA) at 7 years of age. Methods: Drug-exposed children remaining in maternal care (n = 43), drug-exposed children placed in nonmaternal care (n = 45), and community comparison (CC) children (n = 56) were administered
John P. Ackerman; Antolin M. Llorente; Maureen M. Black; Claire S. Ackerman; Lacy A. Mayes; Prasanna Nair
Water based paints contain organic solvents and many additives, such as biocides, surfactants, pigments, binders, amines, and monomers. The chemical complexity may introduce new potential health hazards to house painters, in particular irritative and allergic disorders. This study was performed to compare how house painters experience work with water based paints or solvent based paints, and to evaluate whether exposure to water based paints increases mucous membrane and dermal symptoms among house painters. 255 male house painters aged 20 to 65 were invited to participate in the study. Controls were two industrial populations, in total 302 men, without exposure to water based paints. Self administered questionnaires were used to assess the painter's experiences of working with different types of paints and the occurrence of symptoms in the exposed and unexposed groups. Hygiene measurements were performed during normal working days when only water based paints and no solvent based paints were used. The painters were exposed to low concentrations of dust, metals, ammonia, formaldehyde, and volatile organic compounds. The work environment was considered better when working with water based paints than with solvent based paints. There were more complaints of frequent urination when working with water based paint. Taste or olfactory disturbances were less common. General as well as work related eye and skin irritation was more common among the exposed workers. For other symptoms no significant differences were found. The study indicates that the introduction of water based paints has improved the work environment for house painters. Water based paints cause less discomfort and airway irritation than the earlier solvent based paints. Adverse general health effects seem low. Some of the painters may have dermal symptoms caused by the components in water based paints.
Human exposure guidelines for halogenated hydrocarbons (halons) and halon replacement chemicals have been established using dose-response data obtained from canine cardiac sensitization studies. In order to provide a tool for decision makers and regulators tasked with setting guidelines for egress from exposure to halon replacement chemicals, a quantitative approach, using a physiologically based pharmacokinetic model, was established that allowed
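The logic of deriving an egress guideline from kinetics can be illustrated with a heavily simplified one-compartment stand-in (a real PBPK model tracks many tissue compartments; the blood:air relationship, uptake rate constant, and threshold below are invented for the sketch):

```python
import math

def arterial_conc(c_inspired_ppm, t_min, k_uptake=0.2):
    # One-compartment rise to steady state: C(t) = C_ss * (1 - exp(-k t))
    c_ss = 0.3 * c_inspired_ppm          # hypothetical blood:air relationship
    return c_ss * (1 - math.exp(-k_uptake * t_min))

def safe_egress_time(c_inspired_ppm, c_threshold, dt=0.01):
    # Longest exposure (minutes) before the arterial level reaches the
    # cardiac-sensitization threshold; None if it is never reached
    t = 0.0
    while arterial_conc(c_inspired_ppm, t) < c_threshold:
        t += dt
        if t > 120.0:
            return None
    return t

print(f"egress time at 100 ppm: {safe_egress_time(100.0, 25.0):.2f} min")
```

The useful behavior is the one the abstract implies: at inspired concentrations whose steady-state arterial level never reaches the threshold, no egress limit is needed at all, while above that point the allowable time shrinks with concentration.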
This research is intended to contribute to the development of automated and human-in-the-loop systems for higher level fusion to respond to the information requirements of command decision making. In tactical situations with short time constraints, the analysis of information requirements may take place in advance for certain classes of problems, and provided to commanders and their staff as part of the control and communications systems that come with sensor networks. In particular, it may be possible that certain standing orders can assume the role of Priority Intelligence Requirements. Standing orders to a sensor network are analogous to standing orders to Soldiers. Trained Soldiers presumably don't need to be told to report contact with hostiles, for example, or to report any sighting of civilians with weapons. Such standing orders define design goals and engineering requirements for sensor networks and their control and inference systems. Since such standing orders can be defined in advance for a class of situations, they minimize the need for situation-specific human analysis. Thus, standing orders should be able to drive automatic control of some network functions, automated fusion of sensor reports, and automated dissemination of fused information. We define example standing orders, and outline an algorithm for responding to one of them based on our experience in the field of multisensor fusion.
As a person learns a new skill, distinct synapses, brain regions, and circuits are engaged and change over time. In this paper, we develop methods to examine patterns of correlated activity across a large set of brain regions. Our goal is to identify properties that enable robust learning of a motor skill. We measure brain activity during motor sequencing and characterize network properties based on coherent activity between brain regions. Using recently developed algorithms to detect time-evolving communities, we find that the complex reconfiguration patterns of the brain's putative functional modules that control learning can be described parsimoniously by the combined presence of a relatively stiff temporal core that is composed primarily of sensorimotor and visual regions whose connectivity changes little in time and a flexible temporal periphery that is composed primarily of multimodal association regions whose connectivity changes frequently. The separation between temporal core and periphery changes over the course of training and, importantly, is a good predictor of individual differences in learning success. The core of dynamically stiff regions exhibits dense connectivity, which is consistent with notions of core-periphery organization established previously in social networks. Our results demonstrate that core-periphery organization provides an insightful way to understand how putative functional modules are linked. This, in turn, enables the prediction of fundamental human capacities, including the production of complex goal-directed behavior. PMID:24086116
Bassett, Danielle S; Wymbs, Nicholas F; Rombach, M Puck; Porter, Mason A; Mucha, Peter J; Grafton, Scott T
Biomass derived energy currently accounts for about 3 quads of total primary energy use in the United States. Of this amount, about 0.8 quads are used for power generation. Several biomass energy production technologies exist today which contribute to this energy mix. Biomass combustion technologies have been the dominant source of biomass energy production, both historically and during the past two decades of expansion of modern biomass energy in the U. S. and Europe. As a research and development activity, biomass gasification has usually been the major emphasis as a method of more efficiently utilizing the energy potential of biomass, particularly wood. Numerous biomass gasification technologies exist today in various stages of development. Some are simple systems, while others employ a high degree of integration for maximum energy utilization. The purpose of this study is to conduct a technical and economic comparison of up to three biomass gasification technologies, including the carbon dioxide emissions reduction potential of each. To accomplish this, a literature search was first conducted to determine which technologies were most promising based on a specific set of criteria. The technical and economic performances of the selected processes were evaluated using computer models and available literature. Using these results, the carbon sequestration potential of the three technologies was then evaluated. The results of these evaluations are given in this final report.
Martha L. Rollins; Les Reardon; David Nichols; Patrick Lee; Millicent Moore; Mike Crim; Robert Luttrell; Evan Hughes
Biomass derived energy currently accounts for about 3 quads of total primary energy use in the United States. Of this amount, about 0.8 quads are used for power generation. Several biomass energy production technologies exist today which contribute to this energy mix. Biomass combustion technologies have been the dominant source of biomass energy production, both historically and during the past two decades of expansion of modern biomass energy in the U. S. and Europe. As a research and development activity, biomass gasification has usually been the major emphasis as a method of more efficiently utilizing the energy potential of biomass, particularly wood. Numerous biomass gasification technologies exist today in various stages of development. Some are simple systems, while others employ a high degree of integration for maximum energy utilization. The purpose of this study is to conduct a technical and economic comparison of up to three biomass gasification technologies, including the carbon dioxide emissions reduction potential of each. To accomplish this, a literature search was first conducted to determine which technologies were most promising based on a specific set of criteria. During this reporting period, the technical and economic performances of the selected processes were evaluated using computer models and available literature. The results of these evaluations are summarized in this report.
Martha L. Rollins; Les Reardon; David Nichols; Patrick Lee; Millicent Moore; Mike Crim; Robert Luttrell; Evan Hughes
Lot-to-lot ADI CD data are generally used to tighten the variation of exposure energy of an exposure tool through an APC feedback system. With decreasing device size, the process window of an exposure tool becomes smaller and smaller. Whether the ADI CD reveals the real behavior of a scanner therefore becomes an increasingly critical question, especially for the polysilicon gate layer. CD-SEM has generally been chosen as the metrology tool for this purpose. Because of the limitations of top-down CD-SEMs, an APC system could easily be misled by improper ADI CD data if the CD data were measured on a T-topped photoresist. ArF resist shrinkage and line edge roughness are also traditional causes of improper CD feedback if the user did not operate the CD-SEM carefully. Another candidate for this APC application is spectroscopic-ellipsometry-based scatterometry technology, commonly referred to as SpectraCD. In recent studies, SpectraCD was proven able to reveal profile variation with excellent stability. The feasibility of improving a CD-SEM-based APC system with a SpectraCD-based system in a high-volume manufacturing fab is therefore worthy of study. This study starts from an analysis of the historical data for the polysilicon ADI CD of a 130 nm product. Two different sets of CDs measured by the two different metrology tools were analyzed. In the fab, CD-SEM was the metrology tool chosen for the APC feedback. The CD data measured by SpectraCD over a 2 month timeframe were plotted as a CD trend chart of the specific exposure tool. Several trend-ups and trend-downs are observed, even though the overall CD range is small. After a series of analyses, the exposure tool has been proven to be quite stable, and the CD data measured by SpectraCD also reveal the real behavior of the exposure tool correctly. The scanner is shown to have been misled by improper CD feedback.
In comparison with CD-SEM, the linearity of the correlation between ADI and AEI CDs, which represents the consistency of the etch bias, can also be improved from 0.4 to 0.8 by SpectraCD. The root causes are still under investigation, but one suspected reason is related to resist profile. All the analysis results will be reported in this paper. The data provided sufficient motivation for switching the APC feedback system of the fab from a CD-SEM-based system to a SpectraCD-based system. The results of the new APC system will also be discussed.
Lin, Wen-Kuang; Liao, Shih-Hsien; Tsai, Ronghao; Yeh, Mike; Hsieh, Calvino; Yu, Y.; Lin, Benjamin Szu-Min; Fu, Steven; Dziura, Thaddeus G.
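The feedback step at the heart of such an APC loop is simple proportional control of exposure dose against a CD target; a minimal sketch (the sensitivity, gain, and CD values are illustrative numbers, not from the paper) might look like this:

```python
def update_dose(dose_mj_cm2, cd_meas_nm, cd_target_nm,
                sens_nm_per_mj=-2.0, gain=0.5):
    # Proportional APC feedback: remove a fraction of the CD error per lot.
    # With a negative dose sensitivity (CD shrinks as dose rises, a
    # hypothetical resist polarity), a positive CD error raises the dose.
    error_nm = cd_meas_nm - cd_target_nm
    return dose_mj_cm2 - gain * error_nm / sens_nm_per_mj

dose = 25.0
for cd in [92.0, 91.2, 90.6, 90.2]:   # lot-to-lot ADI CDs converging on 90 nm
    dose = update_dose(dose, cd, 90.0)
print(f"final dose: {dose:.2f} mJ/cm2")
```

A gain below 1 damps the loop, which is exactly why noisy or biased CD measurements (T-topped resist, SEM-induced shrinkage) are so damaging: the controller faithfully integrates whatever error signal it is fed.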
We describe a time-line-based methodology for collecting exposure data for epidemiologic studies and for processing these data for statistical analysis with readily available software for the personal computer. The four components to this approach are: (1) collecting data in a memory-enhancing time-line format; (2) entering data from time lines into a computer database and editing them; (3) making a quantitative
Janet Lawler-Heavner; A. James Ruttenber; Ming Yin; Ted D. Wade
For the computation of the exposure of workers to radiated fields of base station antennas a procedure which combines the Finite-Difference Time-Domain (FDTD) method and the Hybrid(2)-method has been developed. The Hybrid(2)-method is used for the calculation of the antenna currents and fields and the FDTD-method for the calculation of the fields and SAR inside a human body model.
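At its core, the FDTD half of such a hybrid scheme is a leapfrog update of interleaved electric and magnetic fields. A minimal free-space 1-D sketch (normalized units, a Courant number of 0.5, a soft Gaussian source; no human-body model and no hybrid coupling, all numbers arbitrary):

```python
import math

# Minimal 1-D Yee/FDTD leapfrog loop: H lives on half-integer grid points,
# E on integer points; each is updated from the other's spatial difference.
N, steps = 200, 140
ez = [0.0] * N
hy = [0.0] * N
for t in range(steps):
    for i in range(N - 1):
        hy[i] += 0.5 * (ez[i + 1] - ez[i])       # Courant number 0.5
    ez[100] += math.exp(-((t - 30) ** 2) / 100.0)  # soft Gaussian pulse source
    for i in range(1, N):
        ez[i] += 0.5 * (hy[i] - hy[i - 1])
print(f"peak |Ez| after {steps} steps: {max(abs(e) for e in ez):.3f}")
```

The full method adds two more dimensions, material parameters for the body model, and boundary conditions; the hybrid step supplies the antenna currents that drive the grid instead of the toy source used here.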
Digital Game-Based Learning (DGBL) activities were examined in comparison with effective, research-based learning strategies to observe any difference in student engagement and time-on-task behavior. Experimental and control groups were randomly selected amongst the intermediate elementary school students ages 8 to 10 years old. Student…
Wildland fire base camps commonly house thousands of support personnel for weeks at a time. The selection of the location of these base camps is largely a strategic decision that incorporates many factors, one of which is the potential impact of biomass smoke from the nearby fire event. Biomass smoke has many documented adverse health effects due, primarily, to high levels of fine particulate matter (PM(2.5)). Minimizing particulate matter exposure to potentially susceptible individuals working as support personnel in the base camp is vital. In addition to smoke from nearby wildland fires, base camp operations have the potential to generate particulate matter via vehicle emissions, dust, and generator use. We monitored particulate matter at three base camps during the fire season of 2009 in Washington, Oregon, and California. During the sampling events, 1-min time-weighted averages of PM(2.5) and particle counts from three size fractions (0.3-0.5 microns, 0.5-1.0 microns, and 1.0-2.5 microns) were measured. Results showed that all PM size fractions (as well as overall PM(2.5) concentrations) were higher during the overnight hours, a trend that was consistent at all camps. Our results provide evidence of camp-based, site-specific sources of PM(2.5) that could potentially exceed the contributions from the nearby wildfire. These exposures could adversely impact wildland firefighters who sleep in the camp, as well as the camp support personnel, who could include susceptible individuals. A better understanding of the sources and patterns of poor air quality within base camps would help to inform prevention strategies to reduce personnel exposures. PMID:22364357
Conducting a sound skin sensitization risk assessment prior to the introduction of new ingredients and products into the market place is essential. The process by which low-molecular-weight chemicals induce and elicit skin sensitization is dependent on many factors, including the ability of the chemical to penetrate the skin, react with protein, and trigger a cell-mediated immune response. Based on our chemical, cellular and molecular understanding of allergic contact dermatitis, it is possible to carry out a quantitative risk assessment. Specifically, by estimating the exposure to the allergen and its allergenic potency, it is feasible to assess quantitatively the sensitization risk of an ingredient in a particular product type. This paper focuses on applying exposure-based risk assessment tools to understanding fragrance allergy for 2 hypothetical products containing the fragrance allergen cinnamic aldehyde. The risk assessment process predicts that an eau de toilette leave-on product containing 1000 ppm or more cinnamic aldehyde would pose an unacceptable risk of induction of skin sensitization, while a shampoo, containing the same level of cinnamic aldehyde, would pose an acceptable risk of induction of skin sensitization, based on limited exposure to the ingredient from a rinse-off product application. PMID:11846748
Gerberick, G F; Robinson, M K; Felter, S P; White, I R; Basketter, D A
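The core arithmetic of such an exposure-based assessment is the dose of allergen delivered per unit area of skin under each use scenario. The sketch below uses invented stand-in values for the applied amounts, retention factors, and skin areas (they are not the authors' figures), with the fragrance at 1000 ppm (0.1%) in both products:

```python
def dose_per_area_ug_cm2(product_g, frac_ingredient, retention, area_cm2):
    # Exposure (ug/cm2) = product applied (g, as ug) x ingredient fraction
    #                     x fraction retained on skin / exposed area
    return product_g * 1e6 * frac_ingredient * retention / area_cm2

# Hypothetical scenarios: leave-on eau de toilette vs. rinse-off shampoo
edt = dose_per_area_ug_cm2(0.75, 0.001, 1.0, 100.0)      # full retention
shampoo = dose_per_area_ug_cm2(8.0, 0.001, 0.01, 1440.0)  # ~1% retained
print(f"eau de toilette: {edt:.2f} ug/cm2; shampoo: {shampoo:.3f} ug/cm2")
```

Even with these made-up inputs, the leave-on product delivers a dose per area roughly two orders of magnitude above the rinse-off product, which is the mechanism behind the abstract's opposite risk conclusions for the same 1000 ppm ingredient level.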
Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.
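The Monte Carlo structure described above, three parallel penetration routes with uncertain parameters, propagated into a distribution of total flux, can be sketched at steady state. The permeability medians, geometric standard deviations, concentration, and area below are hypothetical placeholders, not the paper's distributions:

```python
import math
import random
import statistics

def route_flux(kp_cm_per_hr, conc_mg_per_cm3, area_cm2):
    # Steady-state flux through one route: J = Kp * C * A  (mg/hr)
    return kp_cm_per_hr * conc_mg_per_cm3 * area_cm2

def sample_lognormal(median, gsd):
    # Lognormal parameterized by its median and geometric standard deviation
    return math.exp(math.log(median) + math.log(gsd) * random.gauss(0, 1))

random.seed(1)
CONC = 0.5     # applied concentration, mg/cm3 (hypothetical)
AREA = 100.0   # exposed skin area, cm2 (hypothetical)
# Hypothetical (median Kp in cm/hr, GSD) for the three parallel routes
ROUTES = {"stratum corneum": (1e-3, 2.5),
          "sweat ducts":     (5e-5, 3.0),
          "hair follicles":  (2e-4, 3.0)}

totals = [sum(route_flux(sample_lognormal(med, gsd), CONC, AREA)
              for med, gsd in ROUTES.values())
          for _ in range(10_000)]
print(f"median total flux: {statistics.median(totals):.3g} mg/hr")
```

A sensitivity step like the paper's stepwise regression would then regress the simulated totals on the sampled parameters to rank which route's uncertainty dominates the output spread.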
Background: Exposure-response analyses in occupational studies rely on the ability to distinguish workers with regard to exposures of interest. Aims: To evaluate different estimates of current average exposure in an exposure-response analysis on dust exposure and cross-shift decline in FEV1 among woodworkers. Methods: Personal dust samples (n = 2181) as well as data on lung function parameters were available for 1560 woodworkers from 54 furniture industries. The exposure to wood dust for each worker was calculated in eight different ways using individual measurements, group-based exposure estimates, a weighted estimate of individual and group-based exposure estimates, and predicted values from mixed models. Exposure-response relations on cross-shift changes in FEV1 and exposure estimates were explored. Results: A positive exposure-response relation between average dust exposure and cross-shift FEV1 was shown for non-smokers only and appeared to be most pronounced among pine workers. In general, the highest slope and standard error (SE) were revealed for grouping by a combination of task and factory size; the lowest slope and SE were revealed for estimates based on individual measurements, with the weighted estimate and the predicted values in between. Grouping by quintiles of average exposure for task and factory combinations revealed low slopes and high SEs, despite a high contrast. Conclusion: For non-smokers, average dust exposure and cross-shift FEV1 were associated in an exposure-dependent manner, especially among pine workers. This study confirms the consequences of using different exposure assessment strategies in studying exposure-response relations. It is possible to optimise exposure assessment by combining information from individual and group-based exposure estimates, for instance by applying predicted values from mixed effects models.
Schlunssen, V; Sigsgaard, T; Schaumburg, I; Kromhout, H
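The pattern reported here, the lowest slope from individual measurements and a higher slope from grouped estimates, is what classical-versus-Berkson measurement-error theory predicts: noisy personal samples attenuate the slope, while group means do not. A small simulation (all exposure levels, error magnitudes, and the slope are hypothetical) reproduces the effect:

```python
import random
import statistics

def ols_slope(xs, ys):
    # Ordinary least-squares slope of y on x
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

random.seed(7)
BETA = -3.0                          # true FEV1 change per mg/m3 (hypothetical)
GROUP_MEANS = [0.5, 1.0, 2.0, 4.0]   # task/factory-group mean exposures, mg/m3
rows = []
for gm in GROUP_MEANS:
    for _ in range(300):
        true_x = max(0.0, random.gauss(gm, 0.3 * gm))  # worker's true exposure
        meas_x = true_x + random.gauss(0.0, 0.8)       # one noisy personal sample
        y = BETA * true_x + random.gauss(0.0, 5.0)     # cross-shift response
        rows.append((gm, meas_x, y))

slope_ind = ols_slope([m for _, m, _ in rows], [y for _, _, y in rows])
slope_grp = ols_slope([g for g, _, _ in rows], [y for _, _, y in rows])
print(f"individual-sample slope: {slope_ind:.2f}; group-mean slope: {slope_grp:.2f}")
```

The individual-measurement slope comes out attenuated toward zero relative to the group-based slope, mirroring the ordering the study observed; the price of grouping, a larger standard error, also follows because the group predictor carries less contrast per subject.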
Feeding children with maize may expose them to fumonisins (FBs). This study assessed FB exposure for infants consuming maize in Tanzania by modeling maize consumption data (kg/kg body weight (bw)/day) with previously collected total FB contamination (microg/kg) patterns for sorted and unsorted maize harvested in 2005 and 2006. Consumption was estimated by twice conducting a 24 h dietary recall for 254 infants. The exposure assessment was performed with the @RISK analysis software. Of the infants, 89% consumed maize from 2.37 to 158 g/person/day (mean; 43 g/person/day +/- 28). Based on the contamination for sorted maize; in 2005, the percentage of infants with FB exposures above the provisional maximum tolerable daily intake (PMTDI) of 2 microg/kg (bw) (26% (95% confidence interval (CI); 23-30)) was significantly higher than the level of 3% (90% CI; 2-12) in 2006. Pooling the datasets for sorted maize from the two seasons resulted in a seemingly more representative risk (10% (95% CI; 6-17)) of exceeding the PMTDI. However, infants who might have consumed unsorted maize would still be at a significantly higher risk (24% (95% CI; 15-34)) of exceeding the PMTDI. Sorting and other good maize management practices should be advocated to farmers in order to minimize FB exposure in rural areas. PMID:18837467
Kimanya, Martin E; De Meulenaer, Bruno; Baert, Katleen; Tiisekwa, Bendantunguka; Van Camp, John; Samapundo, Simbarashe; Lachat, Carl; Kolsteren, Patrick
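The exposure calculation underlying this assessment is a simple product of intake, contamination, and the inverse of body weight, simulated many times to estimate the fraction of infants above the PMTDI. The sketch below reuses the abstract's consumption mean and SD but invents the contamination distribution and body weight, so the printed fraction is illustrative only:

```python
import random

PMTDI = 2.0   # provisional maximum tolerable daily intake, ug/kg bw/day

def daily_exposure(maize_g_per_day, fb_ug_per_kg, bw_kg):
    # ug/kg bw/day = maize intake (kg/day) x FB level (ug/kg) / body weight (kg)
    return (maize_g_per_day / 1000.0) * fb_ug_per_kg / bw_kg

random.seed(0)
N = 50_000
exceed = 0
for _ in range(N):
    intake = max(0.0, random.gauss(43.0, 28.0))   # g/day, mean +/- SD above
    fb = random.lognormvariate(4.0, 1.2)          # ug/kg, hypothetical
    if daily_exposure(intake, fb, bw_kg=9.0) > PMTDI:
        exceed += 1
print(f"simulated fraction above the PMTDI: {exceed / N:.1%}")
```

This is the same shape of computation an @RISK model performs: sampling the consumption and contamination distributions jointly and reading the exceedance probability off the simulated exposure distribution.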
This study was conducted to assess the quality of interview-based exposure estimates obtained in a large epidemiologic case-control study: The Northern Germany Leukemia and Lymphoma Study (1997-2002) (NLL). The NLL used standardized, face-to-face, computer-assisted interviews to record subjects' lifetime use of radiofrequency (RF)-emitting appliances such as cellular telephones, cordless telephones, baby monitors, and television headphones. Exposure assessment comprised 3 levels of precision: ever use, gross vs. net appliance-years, and lifetime cumulative exposure hours. In the current study, the authors analyzed data from 3041 interviews of NLL controls, representing an age-stratified random sample of the general populations of 6 counties in Northern Germany. Weighted kappa coefficients for gross vs. net appliance-years for men were 0.59 (95% confidence interval [CI] = 0.46, 0.71) for baby monitors and 0.98 (95% CI = 0.97, 0.99) for cordless phones; for women, the coefficients were 0.68 (95% CI = 0.56, 0.79) and 0.97 (95% CI = 0.94, 0.98), respectively. Weighted kappa values were considerably lower when net appliance-years and lifetime cumulative exposure hours were compared. Study results demonstrated that interview information on use of RF-emitting appliances, when measured at different levels of precision, can result in misclassification and biased risk estimates. PMID:16238163
Behrens, Thomas; Terschüren, Claudia; Hoffmann, Wolfgang
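Weighted kappa, the agreement statistic used above to compare gross versus net appliance-years, can be computed directly from a square agreement table. The sketch uses quadratic weights (the abstract does not state which weighting scheme was used) and made-up counts:

```python
def weighted_kappa(table):
    """Quadratically weighted kappa for a square k x k agreement table."""
    k = len(table)
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(table[i][j] for i in range(k)) for j in range(k)]
    po = pe = 0.0
    for i in range(k):
        for j in range(k):
            w = 1.0 - ((i - j) ** 2) / ((k - 1) ** 2)   # quadratic weights
            po += w * table[i][j] / n                    # observed agreement
            pe += w * row_tot[i] * col_tot[j] / n ** 2   # chance agreement
    return (po - pe) / (1.0 - pe)

# Toy 3x3 table of gross vs. net appliance-year categories (made-up counts)
toy = [[50, 10, 2],
       [8, 40, 9],
       [3, 7, 30]]
print(round(weighted_kappa(toy), 3))
```

Perfect agreement (all mass on the diagonal) yields kappa = 1, chance-level agreement yields 0, and the weights ensure that near-miss classifications are penalized less than distant ones.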
Many nuclear projects such as environmental restoration and waste management challenges involve radiation or other hazards that will necessitate the use of remote operations that protect human workers from dangerous exposures. Remote work is far more costly to execute than what workers could accomplish directly with conventional tools and practices because task operations are slow and tedious due to difficulties of remote manipulation and viewing. Decades of experience within the nuclear remote operations community show that remote tasks may take hundreds of times longer than hands-on work; even with state-of-the-art force-reflecting manipulators and television viewing, remote task execution is five to ten times slower than equivalent direct contact work. Thus the requirement to work remotely is a major cost driver in many projects. Modest improvements in the work efficiency of remote systems can have high payoffs by reducing the completion time of projects. Additional benefits will accrue from improved work quality and enhanced safety.
Hamel, W.R. [Univ. of Tennessee, Knoxville, TN (United States); Osborn, J. [Carnegie-Mellon Univ., Pittsburgh, PA (United States)
In this study, a novel methodology is proposed to create heat maps that accurately pinpoint the outdoor locations with elevated exposure to radiofrequency electromagnetic fields (RF-EMF) in an extensive urban region (hotspots). The method allows local authorities and epidemiologists to efficiently assess the locations and spectral composition of these hotspots while developing a global picture of the exposure in the area. Moreover, no prior knowledge about the presence of radiofrequency radiation sources (e.g., base station parameters) is required. After building a surrogate model from the available data using kriging, the proposed method uses an iterative sampling strategy that selects new measurement locations at the spots deemed to contain the most valuable information (inside hotspots or in search of them), based on the prediction uncertainty of the model. The method was tested and validated in an urban subarea of Ghent, Belgium, with a size of approximately 1 km². In total, 600 input and 50 validation measurements were performed using a broadband probe. Five hotspots were discovered and assessed, with maximum total electric-field strengths ranging from 1.3 to 3.1 V/m, satisfying the reference levels issued by the International Commission on Non-Ionizing Radiation Protection for exposure of the general public to RF-EMF. Spectrum analyzer measurements in these hotspots revealed five radiofrequency signals with a relevant contribution to the exposure. The radiofrequency radiation emitted by 900 MHz Global System for Mobile Communications (GSM) base stations was always dominant, with contributions ranging from 45% to 100%. Finally, validation of the subsequent surrogate models shows high prediction accuracy, with the final model featuring an average relative error of less than 2 dB (a factor of 1.26 in electric-field strength), a correlation coefficient of 0.7, and a specificity of 0.96. PMID:23759207
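The iterative scheme described in this record can be sketched in a few lines. This is a minimal sketch, not the authors' implementation: an isotropic squared-exponential covariance with a fixed length scale stands in for their kriging model, an upper-confidence-bound rule stands in for their information criterion, and the synthetic field and all hyperparameters are invented for illustration.

```python
import numpy as np

def rbf_cov(a, b, length=100.0, sigma2=1.0):
    """Squared-exponential covariance between point sets a and b (metres)."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return sigma2 * np.exp(-0.5 * d2 / length**2)

def krige(x_obs, y_obs, x_new, noise=1e-6):
    """Simple kriging: posterior mean and variance at the points x_new."""
    K = rbf_cov(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf_cov(x_obs, x_new)
    w = np.linalg.solve(K, Ks)                   # kriging weights
    mean = w.T @ y_obs
    var = 1.0 - np.einsum("ij,ij->j", Ks, w)     # prior variance is 1
    return mean, np.maximum(var, 0.0)

def next_location(x_obs, y_obs, candidates, beta=2.0):
    """Pick the candidate maximising mean + beta*std: this trades off
    sampling inside predicted hotspots against exploring uncertain areas."""
    mean, var = krige(x_obs, y_obs, candidates)
    return candidates[np.argmax(mean + beta * np.sqrt(var))]

rng = np.random.default_rng(0)
x_obs = rng.uniform(0, 1000, (20, 2))                 # 20 initial measurements
y_obs = np.exp(-((x_obs - 500) ** 2).sum(1) / 2e5)    # synthetic field, peak at centre
cands = rng.uniform(0, 1000, (500, 2))
print(next_location(x_obs, y_obs, cands))             # proposed next measurement spot
```

In a real campaign the loop would repeat: measure at the proposed spot, append it to `x_obs`/`y_obs`, refit, and propose again until the prediction uncertainty drops below a target.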
The rapid growth of mobile communications has led not only to a rising number of mobile telephones but also to the widespread placement of base stations for these services on many roofs. However, not everyone is aware that working close to sources of high-frequency electromagnetic fields (EMF), such as transmitter antennas for mobile phones, pagers, and police, fire, and other emergency services, can result in high EMF exposure. This paper deals with measurements and calculations of the compliance boundary for workers in one typical rooftop base station setting according to the EU Directive and other relevant EN standards. PMID:16790176
This study investigated cognitive constructs to be measured by word problems in algebra. One performance-based assessment was administered to 290 high school students. Students' responses were scored by three scoring systems: the correct/incorrect criterion (0/1); a holistic scoring rubric (0-4); and an analytical scoring rubric for measuring…
Background and Aims The degree of diagnostic radiation exposure in children with inflammatory bowel diseases (IBD) is largely unknown. Here we describe this exposure in a population-based sample of children with IBD and determine characteristics associated with moderate radiation exposure. Methods We ascertained radiological study use, demographic characteristics, IBD medication use, and the requirement for hospitalization, emergency department (ED) encounter, or inpatient GI surgery among children with IBD within a large insurance claims database. Characteristics associated with moderate radiation exposure (at least one computed tomography (CT) or three fluoroscopies over two years) were determined using logistic regression models. Results We identified 965 children with Crohn’s Disease (CD) and 628 with Ulcerative Colitis (UC). Over 24 months, 34% of CD subjects and 23% of UC subjects were exposed to moderate diagnostic radiation [odds ratio (OR) 1.71, 95% confidence interval (CI), 1.36–2.14]. CT accounted for 28% and 25% of all studies in CD and UC subjects, respectively. For CD subjects, moderate radiation exposure was associated with hospitalization (OR 4.89, 95% CI 3.37–7.09), surgery (OR 2.93, 95% CI 1.59–5.39), ED encounter (OR 2.65, 95% CI 1.93–3.64), oral steroids (OR 2.25, 95% CI 1.50–3.38), and budesonide (OR 1.80, 95% CI 1.10–3.06); an inverse association was seen with immunomodulator use (OR 0.67, 95% CI 0.47–0.97). Except for oral steroids and immunomodulators, similar relationships were seen in UC. Conclusion A substantial proportion of children with IBD are exposed to moderate amounts of radiation as a result of diagnostic testing. This high utilization may impart long-term risk given the chronic nature of the disease.
Palmer, Lena; Herfarth, Hans; Porter, Carol Q.; Fordham, Lynn A.; Sandler, Robert S.; Kappelman, Michael D.
This Health and Safety Plan (HSP) was developed for the Environmental Investigation of Ground-water Contamination Investigation at Wright-Patterson Air Force Base near Dayton, Ohio, based on the projected scope of work for the Phase 1, Task 4 Field Investigation. The HSP describes hazards that may be encountered during the investigation, assesses the hazards, and indicates what type of personal protective equipment is to be used for each task performed. The HSP also addresses the medical monitoring program, decontamination procedures, air monitoring, training, site control, accident prevention, and emergency response.
Radiofrequency (RF) waves have long been used for different types of information exchange via the air waves--wireless Morse code, radio, television, and wireless telephone (i.e., construction and operation of telephones or telephone systems). Increasingly larger numbers of people rely on mobile telephone technology, and health concerns about the associated RF exposure have been raised, particularly because the mobile phone handset operates in close proximity to the human body, and also because large numbers of base station antennas are required to provide widespread availability of service to large populations. The World Health Organization convened an expert workshop to discuss the current state of cellular-telephone health issues, and this article brings together several of the key points that were addressed. The possibility of RF health effects has been investigated in epidemiology studies of cellular telephone users and workers in RF occupations, in experiments with animals exposed to cell-phone RF, and via biophysical consideration of cell-phone RF electric-field intensity and the effect of RF modulation schemes. As summarized here, these separate avenues of scientific investigation provide little support for adverse health effects arising from RF exposure at levels below current international standards. Moreover, radio and television broadcast waves have exposed populations to RF for > 50 years with little evidence of deleterious health consequences. Despite unavoidable uncertainty, current scientific data are consistent with the conclusion that public exposures to permissible RF levels from mobile telephone and base stations are not likely to adversely affect human health. PMID:17431492
Valberg, Peter A; van Deventer, T Emilie; Repacholi, Michael H
Among a range of cognitive deficits, human cocaine addicts display increased impulsivity and decreased performance monitoring. In order to establish an animal model that can be used to study the underlying neurobiology of these deficits associated with addiction, we have developed a touch screen based Stop Signal Response Task for rhesus monkeys. This task is essentially identical to the clinically used Stop Signal Task employed for diagnostic and research purposes. In this task, impulsivity is reflected in the amount of time needed to inhibit a response after it has been initiated, the Stop Signal Response Time (SSRT). Performance monitoring is reflected by the slowing of response times following Stop trials (Post-Stop Slowing, PSS). Herein we report on the task structure, the staged methods for training animals to perform the task, and a comparison of performance values for control and cocaine experienced animals. Relative to controls, monkeys that had self-administered cocaine, followed by 18 months abstinence, displayed increased impulsivity (increased SSRT values), and decreased performance monitoring (decreased PSS values). Our results are consistent with human data, and thereby establish an ideal animal model for studying the etiology and underlying neurobiology of cocaine-induced impulse control and performance monitoring deficits. PMID:18948136
Liu, Shijing; Heitz, Richard P; Bradberry, Charles W
For the last two decades, the organ and tissue equivalent dose as well as effective dose conversion coefficients recommended by the International Commission on Radiological Protection (ICRP) have been determined with exposure models based on stylized MIRD5-type phantoms representing the human body with its radiosensitive organs and tissues according to the ICRP Reference Man released in Publication No. 23, with Monte Carlo codes sometimes simulating rather simplified radiation physics, and with tissue compositions from different sources. Meanwhile, the International Commission on Radiation Units and Measurements (ICRU) has published reference data for human tissue compositions in Publication No. 44, and the ICRP has released a new report on anatomical reference data in Publication No. 89. As a consequence, many of the components of the traditional stylized exposure models used to determine the effective dose in the past have to be replaced: Monte Carlo codes, human phantoms, and tissue compositions. This paper presents results of comprehensive investigations of the dosimetric consequences to be expected from the replacement of the traditional stylized exposure models by voxel-based exposure models. Calculations have been performed with the EGS4 Monte Carlo code for external and internal exposures to photons and electrons, with the stylized, gender-specific MIRD5-type phantoms ADAM and EVA on the one hand and with the recently developed tomographic (voxel-based) phantoms MAX and FAX on the other, for a variety of exposure conditions. Ratios of effective doses for the voxel-based and stylized exposure models are presented for external and internal exposures to photons and electrons as a function of the energy and geometry of the radiation field.
The data indicate that, for the exposure conditions considered in these investigations, the effective dose may change between +60% and -50% after the replacement of the traditional exposure models by the voxel-based exposure models.
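The reported dose changes are ratios of effective doses, each combined from organ equivalent doses through the standard weighted sum E = Σ_T w_T H_T. A worked toy illustration follows: the organ doses below are invented numbers, not results from the study, and only a small subset of ICRP 60-style tissue weighting factors is used, renormalised so the subset sums to 1.

```python
# Toy effective-dose ratio between two exposure models (all H_T values invented).
w = {"lung": 0.12, "stomach": 0.12, "colon": 0.12, "liver": 0.05, "thyroid": 0.05}
total = sum(w.values())
w = {t: v / total for t, v in w.items()}      # renormalise the subset to sum to 1

# Hypothetical organ equivalent doses (mSv) from the two phantom types.
H_stylized = {"lung": 1.00, "stomach": 0.90, "colon": 0.85, "liver": 0.95, "thyroid": 1.10}
H_voxel    = {"lung": 0.80, "stomach": 1.05, "colon": 0.70, "liver": 0.90, "thyroid": 1.30}

def effective_dose(H, w):
    """E = sum over tissues of w_T * H_T."""
    return sum(w[t] * H[t] for t in w)

ratio = effective_dose(H_voxel, w) / effective_dose(H_stylized, w)
print(f"voxel/stylized effective-dose ratio: {ratio:.2f}")
```

A ratio below 1 corresponds to the negative changes reported above, a ratio above 1 to the positive ones; organ-by-organ differences can partially cancel in the weighted sum.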
A review of studies, including both articles published in peer-reviewed journals and reports that were not peer reviewed, regarding occupational exposure to benzene and total hydrocarbons in downstream petroleum industry operations was performed. The objective was to provide a broad estimate of exposures by compiling exposure data according to the following categories: refinery, pipeline, marine, rail, bulk terminals and trucks, service stations, underground storage tanks, tank cleaning, and site remediations. The data in each category were divided into personal occupational long-term and short-term samples. The summarized data offer valuable assistance to hygienists by providing them with an estimate and range of exposures. The traditional 8-hour time-weighted average (TWA) exposure and the 40-hour workweek do not generally coincide with exposure periods applicable to workers in marine, pipeline, railcar, and trucking operations. They are more comparable with short-term exposure or task-based exposure assessments. The marine sector has a large number of high exposures. Although relatively few workers are exposed, their exposures to benzene and total hydrocarbons are sometimes an order of magnitude higher than the respective exposure limits. It is recommended that, in the future, more task-based exposure assessments and fewer traditional TWA long-term exposure assessments be performed within the various sectors of the downstream petroleum industry. PMID:11331990
This paper presents a method for multi-exposure image fusion based on the wavelet packet transform, combining the local energy distributions of the multi-exposure images with edge detection. After decomposing the two images involved in fusion into low-frequency and high-frequency subimages with the wavelet packet transform, we use different methods for the low-frequency and high-frequency components to obtain fusion coefficients. For the low-frequency component, a threshold is set on local energy; for the high-frequency components, an edge detection operator is used to compute the information content of the different high-frequency images. The coefficients for fusion are then selected according to the strategies adopted for the low- and high-frequency components. Finally, the fusion image is reconstructed through the inverse wavelet packet transform. The results show that the fusion method is effective and that the fused image successfully preserves the details of each input image.
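The coefficient-selection idea can be sketched as follows. This is a minimal toy, not the paper's method: a single-level Haar transform stands in for the wavelet packet decomposition, pointwise squared magnitude stands in for windowed local energy, and absolute coefficient magnitude stands in for the edge-information measure.

```python
import numpy as np

def haar2(img):
    """One-level 2-D Haar transform -> (LL, (LH, HL, HH))."""
    a = (img[0::2, :] + img[1::2, :]) / 2   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2   # row differences
    LL = (a[:, 0::2] + a[:, 1::2]) / 2
    LH = (a[:, 0::2] - a[:, 1::2]) / 2
    HL = (d[:, 0::2] + d[:, 1::2]) / 2
    HH = (d[:, 0::2] - d[:, 1::2]) / 2
    return LL, (LH, HL, HH)

def ihaar2(LL, bands):
    """Inverse of haar2 (perfect reconstruction)."""
    LH, HL, HH = bands
    a = np.empty((LL.shape[0], LL.shape[1] * 2))
    a[:, 0::2], a[:, 1::2] = LL + LH, LL - LH
    d = np.empty_like(a)
    d[:, 0::2], d[:, 1::2] = HL + HH, HL - HH
    out = np.empty((a.shape[0] * 2, a.shape[1]))
    out[0::2, :], out[1::2, :] = a + d, a - d
    return out

def fuse(img1, img2):
    LL1, hi1 = haar2(img1)
    LL2, hi2 = haar2(img2)
    LL = np.where(LL1**2 >= LL2**2, LL1, LL2)           # energy-based rule
    hi = [np.where(np.abs(b1) >= np.abs(b2), b1, b2)    # edge-strength rule
          for b1, b2 in zip(hi1, hi2)]
    return ihaar2(LL, hi)

dark = np.clip(np.linspace(0.0, 0.5, 64), 0, 1).reshape(8, 8)  # under-exposed toy image
bright = np.clip(dark + 0.5, 0, 1)                             # over-exposed toy image
fused = fuse(dark, bright)
```

A full implementation would decompose to several wavelet packet levels and compute the measures over local windows, but the select-by-measure-then-invert structure is the same.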
Background Aluminum oxide-based nanowhiskers (AO nanowhiskers) have been used in manufacturing processes as catalyst supports, flame retardants, adsorbents, or in ceramic, metal, and plastic composite materials. They are classified as high-aspect-ratio nanomaterials. Our aim was to assess the in vivo toxicity of inhaled AO nanowhisker aerosols. Methods Primary dimensions of AO nanowhiskers specified by the manufacturer were 2–4 nm × 2800 nm. The aluminum content found in this nanomaterial was 30% [mixed-phase material containing Al(OH)3 and AlOOH]. Male mice (C57Bl/6J) were exposed to AO nanowhiskers for 4 hrs/day, 5 days/wk for 2 or 4 wks in a dynamic whole-body exposure chamber. The whiskers were aerosolized with an acoustical dry aerosol generator that included a grounded metal elutriator and a venturi aspirator to enhance deagglomeration. The average concentration of aerosol in the chamber was 3.3 ± 0.6 mg/m3 and the mobility diameter was 150 ± 1.6 nm. Both groups of mice (2 or 4 wks exposure) were necropsied immediately after the last exposure. Aluminum content in the lung, heart, liver, and spleen was determined. Pulmonary toxicity assessment was performed by evaluation of bronchoalveolar lavage (BAL) fluid (enumeration of total and differential cells, total protein, activity of lactate dehydrogenase [LDH], and cytokines), blood (total and differential cell counts), lung histopathology, and pulmonary mechanics. Results Following exposure, the mean Al content of lungs was 0.25, 8.10, and 15.37 µg/g lung (dry wt), respectively, for the sham, 2-wk, and 4-wk exposure groups. The number of total cells and macrophages in BAL fluid was 2 times higher in animals exposed for 2 wks and 6 times higher in mice exposed for 4 wks, compared to shams (p < 0.01 and p < 0.001, respectively). However, no neutrophilic inflammation in BAL fluid was found, and neutrophils were below 1% in all groups.
No significant differences were found in total protein, activity of LDH, or cytokine levels (IL-6, IFN-γ, MIP-1α, TNF-α, and MIP-2) between shams and exposed mice. Conclusions Sub-chronic inhalation exposures to aluminum oxide-based nanowhiskers induced increased lung macrophages, but no inflammatory or toxic responses were observed.
Background Arsenic is a potent pollutant that has caused an environmental catastrophe in certain parts of the world, including Bangladesh, where millions of people are presently at risk due to drinking water contaminated by arsenic. Chronic arsenic exposure has been scientifically shown to cause liver damage, cancers, neurological disorders, and several other ailments. The relationship between plasma cholinesterase (PChE) activity and arsenic exposure has not yet been clearly documented. However, decreased PChE activity has been found in patients suffering from liver dysfunction, heart attack, cancer metastasis, and neurotoxicity. Therefore, in this study, we evaluated the PChE activity in individuals exposed to arsenic via drinking water in Bangladesh. Methods A total of 141 Bangladeshi residents living in arsenic endemic areas, with a mean arsenic exposure of 14.10 ± 3.27 years, were selected as study subjects and split into tertile groups based on three water arsenic concentrations: low (< 129 µg/L), medium (130–264 µg/L), and high (> 265 µg/L). Study subjects were further subdivided into two groups (≤ 50 µg/L and > 50 µg/L) based on the recommended upper limit of water arsenic concentration (50 µg/L) in Bangladesh. Blood samples were collected from the study subjects by venipuncture, and arsenic concentrations in drinking water, hair, and nail samples were measured by inductively coupled plasma mass spectrometry (ICP-MS). PChE activity was assayed by spectrophotometer. Results Arsenic concentrations in hair and nails were positively correlated with the arsenic levels in drinking water. Significant decreases in PChE activity were observed with increasing concentrations of arsenic in water, hair, and nails. The average levels of PChE activity in the low, medium, and high arsenic exposure groups were also significantly different between each group. Lower levels of PChE activity were also observed in the > 50 µg/L group compared to the ≤ 50 µg/L group.
Moreover, PChE activity was significantly decreased in the group with skin (+) symptoms compared to those without (-). Conclusions We found a significant inverse relationship between arsenic exposure and PChE activity in a human population in Bangladesh. This research demonstrates a novel exposure-response relationship between arsenic and PChE activity, which may explain one of the biological mechanisms through which arsenic exerts its neuro- and hepatotoxicity in humans.
Objectives: Occupational exposure assessment for population-based case–control studies is challenging due to the wide variety of industries and occupations encountered by study participants. We developed and evaluated statistical models to estimate the intensity of exposure to three chlorinated solvents—methylene chloride, 1,1,1-trichloroethane, and trichloroethylene—using a database of air measurement data and associated exposure determinants. Methods: A measurement database was developed after an extensive review of the published industrial hygiene literature. The database of nearly 3000 measurements or summary measurements included sample size, measurement characteristics (year, duration, and type), and several potential exposure determinants associated with the measurements: mechanism of release (e.g. evaporation), process condition, temperature, usage rate, type of ventilation, location, presence of a confined space, and proximity to the source. The natural log-transformed measurement levels in the exposure database were modeled as a function of the measurement characteristics and exposure determinants using maximum likelihood methods. Assuming a single lognormal distribution of the measurements, an arithmetic mean exposure intensity level was estimated for each unique combination of exposure determinants and decade. Results: The proportions of variability in the measurement data explained by the modeled measurement characteristics and exposure determinants were 36, 38, and 54% for methylene chloride, 1,1,1-trichloroethane, and trichloroethylene, respectively. Model parameter estimates for the exposure determinants were in the anticipated direction. Exposure intensity estimates were plausible and exhibited internal consistency, but the ability to evaluate validity was limited. 
Conclusions: These prediction models can be used to estimate chlorinated solvent exposure intensity for jobs reported by population-based case–control study participants that have sufficiently detailed information regarding the exposure determinants.
Hein, Misty J.; Waters, Martha A.; Ruder, Avima M.; Stenzel, Mark R.; Blair, Aaron; Stewart, Patricia A.
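The last modeling step in the record above, recovering an arithmetic mean intensity from a model fit on log-transformed measurements, rests on the lognormal identity AM = exp(μ + σ²/2). A minimal sketch with invented log-scale estimates (the μ and σ values are illustrative, not values from the study):

```python
import math

def lognormal_arithmetic_mean(mu, sigma):
    """Arithmetic mean of a lognormal from its log-scale mean and SD:
    AM = exp(mu + sigma^2 / 2)."""
    return math.exp(mu + sigma**2 / 2)

# e.g. a predicted log-scale mean of 3.0 (ln ppm) for one combination of
# exposure determinants, with a residual log-scale SD of 1.2
am = lognormal_arithmetic_mean(3.0, 1.2)
print(f"estimated arithmetic mean intensity: {am:.1f} ppm")
```

Note that simply exponentiating μ would give the geometric mean, which understates the arithmetic mean by the factor exp(σ²/2); the correction matters most when measurement variability is large.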
This review is based on the proceedings from the Second Lebow Conference held in Chicago in 2007. The conference concentrated on developing a framework for innovative studies in the epidemiology of environmental exposures, focusing specifically on the potential relationship with brain tumors. Researchers with different perspectives, including toxicology, pharmacokinetics, and epidemiological exposure assessment, exchanged information and ideas on the use of biomarkers of exposure in molecular epidemiology studies and summarized the current knowledge on methods and approaches for biomarker-based exposure assessment. This report presents the state of the science regarding biomarker-based exposure assessment of the 4 most common neurocarcinogens: acrylamide, 1,3-butadiene, N-nitroso compounds, and polycyclic aromatic hydrocarbons. Importantly, these chemicals are also carcinogenic in other organs; therefore, this discussion is useful for environmental epidemiologists studying all cancer types.
Il'yasova, Dora; McCarthy, Bridget J.; Erdal, Serap; Shimek, Joanna; Goldstein, Jennifer; Doerge, Daniel R.; Myers, Steven R.; Vineis, Paolo; Wishnok, John S.; Swenberg, James A.; Bigner, Darell D.; Davis, Faith G.
The primary aim of the Acute Exposure Guideline Level (AEGL) program is to develop scientifically credible limits for once-in-a-lifetime or rare acute inhalation exposures to high-priority, hazardous chemicals. The program was developed because communities need information on hazardous chemicals to assist in emergency planning, notification, and response, as well as in the training of emergency response personnel. AEGLs are applicable to the general population, including children, the elderly, and other potentially susceptible subpopulations. AEGLs are the airborne concentrations of chemicals above which a person could experience notable discomfort or irritation (AEGL-1); serious, long-lasting health effects (AEGL-2); and life-threatening effects or death (AEGL-3). AEGLs are determined for five exposure periods (10 and 30 min and 1, 4, and 8 h). Physiologically based pharmacokinetic (PBPK) models can be very useful in the interspecies and time scaling often required here. PBPK models are used in the current article to predict AEGLs for trichloroethylene (TCE), based on the time course of TCE in the blood and/or brain of rats and humans. These AEGLs are compared to values obtained by standard time-scaling methods. Comprehensive toxicity assessment documents for each chemical under consideration are prepared by the National Advisory Committee for AEGLs, a panel composed of representatives of federal, state, and local governmental agencies, as well as industry and private-sector organizations. The documents are developed according to National Research Council (NRC) guidelines and must be reviewed by the NRC Subcommittee on Acute Exposure Guideline Levels before becoming final. AEGLs for 18 chemicals have been published, and it is anticipated that 40 to 50 chemicals will be evaluated annually. PMID:15192858
Bruckner, James V; Keys, Deborah A; Fisher, Jeffrey W
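For context, the standard time-scaling approach that the PBPK models are compared against is commonly the ten Berge relation, Cⁿ × t = k, which extrapolates a fixed toxic load across exposure durations. A minimal sketch follows; the exponent n = 2 and the 4-h point-of-departure concentration are invented for illustration, not the TCE-specific values.

```python
def scale_concentration(c_ref, t_ref, t_new, n=2.0):
    """Concentration at duration t_new giving the same toxic load
    C^n * t as the reference pair (c_ref, t_ref)."""
    return c_ref * (t_ref / t_new) ** (1.0 / n)

c_4h = 100.0  # hypothetical point of departure: 100 ppm at 4 h (240 min)
for minutes in (10, 30, 60, 240, 480):
    c = scale_concentration(c_4h, 240.0, float(minutes))
    print(f"{minutes:>3} min: {c:6.1f} ppm")
```

Shorter durations tolerate higher concentrations and longer durations lower ones; a PBPK-based derivation instead targets an equal internal dose (e.g., blood or brain concentration) at each duration, which can diverge from this simple power-law scaling.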
With the increasing number of cellular base stations in both indoor and outdoor environments, there is a growing need for simple and accurate methods for RF exposure assessments and for the determination of exposure limit compliance distances for such equipment. For low output power devices, whose compliance distances can be assumed to be short, measurements of SAR in a phantom
L. Hamberg; N. Lovehagen; M. Siegbahn; C. Tornevik
Does exposure to terrorism lead to hostility toward minorities? Drawing on theories from clinical and social psychology, we propose a stress-based model of political extremism in which psychological distress--which is largely overlooked in political scholarship--and threat perceptions mediate the relationship between exposure to terrorism and…
Canetti-Nisim, Daphna; Halperin, Eran; Sharvit, Keren; Hobfoll, Stevan E.
Although high radon concentrations have been linked to increased risk of lung cancer by both experimental studies and investigations of underground miners, epidemiologic studies of residential radon exposure display inconsistencies. The authors therefore decided to conduct a population-based case-control study in northwest Spain to determine the risk of lung cancer associated with exposure to residential radon. The study covered a
Juan Miguel Barros-Dios; María Amparo Barreiro; Alberto Ruano-Ravina; Adolfo Figueiras
Two new diglycolamide-based task-specific ionic liquids (DGA-TSILs) were evaluated for the extraction of actinides and lanthanides from acidic feed solutions. These DGA-TSILs were capable of exceptionally high extraction of trivalent actinide ions, such as Am(3+), and even higher extraction of the lanthanide ion, Eu(3+) (about 5-10 fold). Dilution of the DGA-TSILs in an ionic liquid, C(4)mim(+)·NTf(2)(-), afforded reasonably high extraction ability, faster mass transfer, and more efficient stripping of the metal ion. The nature of the extracted species was studied by slope analysis, which showed that the extracted species contained one NO(3)(-) anion, along with the participation of two DGA-TSIL molecules. Time-resolved laser fluorescence spectroscopy (TRLFS) analysis showed a strong complexation with no inner-sphere water molecule in the Eu(III)-DGA-TSIL complexes in the presence and absence of C(4)mim(+)·NTf(2)(-) as the diluent. The very high radiolytic stability of DGA-TSIL 6 makes it one of the most efficient solvent systems for the extraction of actinides under acidic feed conditions. PMID:23319409
Gender-based differences can be observed from pharmacokinetic, behavioral, or anatomical assessments. No single assessment tool will provide a complete answer, but the use of a variety of indices, each with known gender-related outcome differences, can reveal agent-induced gender-based alterations. In a series of initial range-finding studies in rats conducted at the National Center for Toxicological Research (NCTR), the effects of dietary exposure to the weak estrogen, genistein, have been assessed using a number of techniques with validated gender-related outcome measures. The findings indicated that (1) the internal dose of genistein was higher in females than males after equivalent dietary exposure and this was consistent with the faster rate of genistein elimination in males; (2) in behavioral assessments, males and females in the high-dose dietary genistein group consumed more of a sodium-flavored solution; however, no genistein-related changes were observed in open field or running wheel activity, play behavior, or intake of a saccharin-flavored solution; and (3) dose-related alterations of the volume of the sexually dimorphic nucleus of the medial preoptic area were observed in genistein-exposed male rats but not females. These observations describe the utility of a variety of gender-based assessment tools and indicate that dose-related effects of developmental and chronic dietary exposure to genistein can be observed in the rodent. Additional studies, perhaps in nonhuman primates, are necessary to further predict the effect(s) of genistein on human gender-based development. PMID:11488560
Slikker, W; Scallet, A C; Doerge, D R; Ferguson, S A
Exposure to perchloroethylene, especially for dry cleaning workers and for people living near dry cleaning shops, can lead to several diseases and disorders. This study examines the value of solid-phase microextraction (SPME) for sampling perchloroethylene in the atmosphere of dry cleaning shops. Carboxen/polydimethylsiloxane (CAR/PDMS) in 0.5-cm retracted mode was selected. There were no significant differences between sampling rates at different temperatures (range of 20 to 30 °C) and air velocities (2 to 50 cm/s). In contrast, relative humidity (RH) had a significant effect on sampling rates. Method reproducibility was determined under laboratory and field conditions and was 6.2% and 7 to 11%, respectively. Repeatability was determined as 8.9%. Comparison of the results according to the American Industrial Hygiene Association exposure assessment strategy showed that the SPME sampler yields more conservative results than the traditional standard method. PMID:23054278
Zare Sakhvidi, Mohammad Javad; Bahrami, Abdul Rahman; Ghiasvand, Alireza; Mahjub, Hossein; Tuduri, Ludovic
Unrealistic steady-state assumptions are often used to estimate toxicant exposure rates from biomarkers. A biomarker may instead be modeled as a weighted sum of historical time-varying exposures. Estimating equations are derived for a zero-inflated gamma distribution for daily exposures with a known exposure frequency. Simulation studies suggest that the estimating equations can provide accurate estimates of exposure magnitude at any reasonable sample size, and reasonable estimates of the exposure variance at larger sample sizes. PMID:21668990
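The setup in this record can be simulated in a few lines. This is a sketch under stated assumptions: exponentially decaying weights stand in for the biomarker kinetics, and the exposure frequency, gamma parameters, and half-life below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_biomarker(n_days=90, p_exposed=0.3, shape=2.0, scale=5.0,
                       half_life_days=10.0):
    """Daily exposures follow a zero-inflated gamma: zero with probability
    1 - p_exposed, otherwise gamma(shape, scale). The biomarker at sampling
    time is a weighted sum of the exposure history, with weights decaying
    by half every half_life_days."""
    exposed = rng.random(n_days) < p_exposed            # zero-inflation part
    dose = np.where(exposed, rng.gamma(shape, scale, n_days), 0.0)
    lag = np.arange(n_days)[::-1]                       # days before sampling
    weights = 0.5 ** (lag / half_life_days)             # exponential decay
    return dose, float(weights @ dose)

dose, biomarker = simulate_biomarker()
print(f"mean daily exposure: {dose.mean():.2f}, biomarker level: {biomarker:.2f}")
```

Simulations of this kind are how the paper's estimating equations can be checked: generate histories with known exposure magnitude and frequency, compute biomarkers, and see whether the estimators recover the true parameters at a given sample size.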
A computational and experimental method is employed to provide an understanding of a critical human space flight problem, posture control following reduced gravity exposure. In the case of an emergency egress, astronauts' postural stability could be life saving. It is hypothesized that muscular gains are lowered during reduced gravity exposure, causing a feeling of heavy legs, or a perceived feeling of muscular weakness, upon return to Earth's 1 g environment. We developed an estimator-based model that is verified by replicating spatial and temporal characteristics of human posture and incorporates an inverted pendulum plant in series with a Hill-type muscle model, two feedback pathways, a central nervous system estimator, and variable gains. Results obtained by lowering the variable muscle gain in the model support the hypothesis. Experimentally, subjects were exposed to partial gravity (3/8 g) simulation on a suspension apparatus, then performed exercises postulated to expedite recovery and alleviate the heavy legs phenomenon. Results show that the rms position of the center of pressure increases significantly after reduced gravity exposure. Closed-loop system behavior is revealed, and posture is divided into a short-term period that exhibits higher stochastic activity and persistent trends and a long-term period that shows relatively low stochastic activity and antipersistent trends. PMID:11541209
To explore the role of the multiplicity of cellular hits by radon progeny alpha particles in lung cancer incidence, the numbers of single and multiple alpha particle hits were computed for basal and secretory cells in the bronchial epithelium of human airway bifurcations. Hot spots of alpha particle hits were observed at the branching points of bronchial airway bifurcations. The effect of single and multiple alpha particle intersections of bronchial cells during a given exposure period, selected from a Poisson distribution, on lung cancer risk was simulated by a transformation frequency-tissue response model based on experimentally observed cellular transformation and survival functions. Calculations of lung cancer risk at low radon exposure levels suggest that single hits produce a linear dose-response relationship, while the superposition of single and increasing multiple hits at higher exposure levels may also be approximated by a quasi-linear dose-effect curve. The simulations predict a carcinogenic enhancement effect for radon progeny accumulations at bifurcation branching sites, which may increase current risk estimates. PMID:21471125
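The Poisson split between unhit, singly hit, and multiply hit cells that drives the linear low-exposure behavior can be made concrete (the λ values, mean traversals per cell nucleus over the exposure period, are illustrative):

```python
import math

def hit_fractions(lam):
    """Poisson hit statistics for mean hit number lam:
    P(no hit), P(exactly one hit), P(two or more hits)."""
    p0 = math.exp(-lam)            # unhit cells
    p1 = lam * math.exp(-lam)      # singly hit cells
    return p0, p1, 1.0 - p0 - p1   # multiply hit cells

for lam in (0.01, 0.1, 1.0):
    p0, p1, pmulti = hit_fractions(lam)
    print(f"lambda={lam:>4}: none={p0:.4f} single={p1:.4f} multiple={pmulti:.4f}")
```

At low λ the multiple-hit fraction is of order λ²/2, negligible next to the single-hit fraction of order λ, which is why single hits dominate and the dose-response is linear at low exposures; at higher λ the multiple-hit fraction becomes appreciable.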
Exposure is a rapid and effective treatment for simple phobias. This study tested the assumption that endorphin release may be involved in exposure to a feared situation. Thirty spider-phobic Ss underwent exposure to 17 phobic-related, graded performance tasks. Half the Ss were randomly assigned to naltrexone, an opioid antagonist, and half to a placebo. Measures of heart rate, blood pressure,
Thomas V. Merluzzi; C. Barr Taylor; Michael Boltwood; K. Gunnar Götestam
This paper describes the use of a use case/task-based method in the development of a portable neuromuscular stimulator device. The developed unit allows a variety of stimulus delivery algorithms to be incorporated depending on the patient's requirements. The developed system consists of a stimulator unit, stimulator firmware, external sensors, a programmer unit, two stimulation channels and electrodes. A clinician
A functional assessment-based intervention (FABI) was designed and implemented to increase the on-task behavior of David, a second-grade student in a general education classroom. David attended an elementary school that used a comprehensive, integrated, three-tiered (CI3T) model of prevention. The school's principal nominated David for Project…
This study tested the direct effects of three dimensions of organizational justice – distributive justice, procedural justice, and interactional justice – on contextual performance, counterproductive work behaviors, and task performance. The study also examined the moderating effects of an ability measure of emotional intelligence (EI) on the justice–performance relationship. Based on the data from 211 employees across nine organizations from
The aims were to evaluate the inter-method reliability of a registration sheet for patient handling tasks, to study the day-to-day variation of musculoskeletal complaints (MSC), and to examine whether patient handling tasks and psychosocial factors were associated with MSC. Nurses (n=148) completed logbooks for three consecutive working days followed by a day off. Low back pain (LBP), neck/shoulder pain (NSP), knee pain (KP), psychosocial factors (time pressure, stress, conscience about the quality of work), and patient transfer and care tasks were reported. The logbook was reliable for both transfer and care tasks. The number of nurses reporting MSC and the level of pain increased significantly during the three working days (15%-30% and 17%-37%, respectively) and decreased on the day off. Stress and transfer tasks were associated with LBP, and transfer tasks were associated with KP. Our results confirm a relationship between work factors and MSC and indicate that logbooks could be one way to obtain a better understanding of the complex interaction of various nursing working conditions in relation to MSC. PMID:18789431
Warming, S; Precht, D H; Suadicani, P; Ebbehøj, N E
The deprotection blur of Rohm and Haas XP 5435, XP 5271, and XP 5496 extreme ultraviolet photoresists has been determined as their base weight percent is varied. The deprotection blur of TOK EUVR P1123 photoresist has also been determined as the post-exposure bake temperature is varied from 80°C to 120°C. In Rohm and Haas XP 5435 and XP 5271 resists, 7x and 3x (respectively) increases in base weight percent reduce the size of successfully patterned 1:1 line-space features by 16 nm and 8 nm, with corresponding reductions in deprotection blur of 7 nm and 4 nm. In XP 5496, a 7x increase in base weight percent reduces the size of successfully patterned 1:1 line-space features from 48 nm to 38 nm without changing deprotection blur. In TOK EUVR P1123 resist, a reduction in post-exposure bake temperature from 100°C to 80°C reduces deprotection blur from 21 nm to 10 nm and reduces patterned LER from 4.8 nm to 4.1 nm.
We evaluated a community-based participatory research worksite intervention intended to improve farmworkers' behaviors at work and after work to reduce occupational and take-home pesticide exposures. The workers received warm water and soap for hand washing, gloves, coveralls, and education. Self-reported assessments before and after the intervention revealed that glove use, wearing clean work clothes, and hand washing at the midday break and before going home improved significantly. Some behaviors, such as hand washing before eating and many targeted after-work behaviors, did not improve, indicating a need for additional intervention.
The Japan Society for Occupational Health started to recommend an occupational exposure limit based on biological monitoring (OEL-B) in 1993. Up to 1998, OEL-Bs for mercury, lead, hexane and 3,3'-dichloro-4,4'-diaminodiphenylmethane had been adopted, and those for 17 chemical substances (arsenic, cadmium, chromium, nickel, acetone, methanol, benzene, toluene, xylene, styrene, tetrachloroethylene, trichloroethylene, N,N-dimethylacetamide, N,N-dimethylformamide, carbon disulfide, carbon monoxide, and organophosphate insecticides) are
Objective. Epidemiologic and community health studies of traffic-related air pollution and childhood asthma have been limited by resource intensive exposure assessment techniques. The current study utilized a novel participant-based approach to collect air monitoring data f...
Little justification is generally provided for selection of in vitro assay testing concentrations for engineered nanomaterials (ENMs). Selection of concentration levels for hazard evaluation based on real-world exposure scenarios is desirable. We reviewed published ENM concentr...
Three-year exposure tests were carried out to evaluate the protection performance of solvent-based and water-based coating systems at marine and desert sites in the State of Kuwait. Electrochemical impedance spectroscopy (EIS) measurements were conducted for these two coating systems after 3-year exposure to the two different atmospheres. Samples were removed from the two sites and the EIS measurement was conducted
Task-oriented training is emerging as the dominant and most effective approach to motor rehabilitation of upper extremity function after stroke. Here, the authors propose that the task-oriented training framework provides an evidence-based blueprint for the design of task-oriented robots for the rehabilitation of upper extremity function in the form of three design principles: skill acquisition of functional tasks, active participation training, and individualized adaptive training. Previous robotic systems that incorporate elements of task-oriented training are then reviewed. Finally, the authors critically analyze their own attempt to design and test the feasibility of a task-oriented robot (TOR), ADAPT (Adaptive and Automatic Presentation of Tasks), which incorporates the three design principles. Because of its task-oriented training-based design, ADAPT departs from most other current rehabilitation robotic systems: it presents realistic functional tasks in which the task goal is constantly adapted, so that the individual actively performs doable but challenging tasks without physical assistance. To maximize efficacy for a large clinical population, the authors propose that future task-oriented robots need to incorporate yet-to-be developed adaptive task presentation algorithms that emphasize acquisition of fine motor coordination skills while minimizing compensatory movements. PMID:23080042
Schweighofer, Nicolas; Choi, Younggeun; Winstein, Carolee; Gordon, James
Collection of multiple-task brain imaging data from the same subject has now become common practice in medical imaging studies. In this paper, we propose a simple yet effective model, “CCA+ICA”, as a powerful and new method for multi-task data fusion. This joint blind source separation (BSS) model takes advantage of two multivariate methods: canonical correlation analysis and independent component analysis, to achieve both high estimation accuracy and to provide the correct connection between two datasets in which sources can have either common or distinct between-dataset correlation. In both simulated and real fMRI applications, we compare the proposed scheme with other joint BSS models and examine the different modeling assumptions. The contrast images of two tasks, sensorimotor (SM) and Sternberg working memory (SB), derived from a general linear model (GLM), served as the real multi-task fMRI data, both collected from 50 schizophrenia patients and 50 healthy controls. When examining the relationship with duration of illness, CCA+ICA revealed a significant negative correlation with temporal lobe activation. Furthermore, CCA+ICA located sensorimotor cortex as the group-discriminative region for both tasks and identified the superior temporal gyrus in SM and prefrontal cortex in SB as task-specific group-discriminative brain networks. In summary, we compared the new approach with competitive methods with different assumptions, and found consistent results regarding each of their hypotheses on connecting the two tasks. Such an approach fills a gap in existing multivariate methods for identifying biomarkers from brain imaging data.
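The CCA stage of such a fusion can be sketched in a few lines of linear algebra (the ICA unmixing step is omitted); the synthetic shared-source data below merely stand in for the SM/SB contrast images.

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between two data sets (samples x features),
    from the SVD of the whitened cross-covariance: the CCA step that
    links the two task datasets before any ICA unmixing."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    def whiten(A):
        # columns of U are orthonormal, i.e. whitened directions
        U, _, _ = np.linalg.svd(A, full_matrices=False)
        return U
    corr = np.linalg.svd(whiten(X).T @ whiten(Y), compute_uv=False)
    return np.clip(corr, 0.0, 1.0)

# Two datasets sharing one latent source plus independent noise features.
rng = np.random.default_rng(0)
shared = rng.normal(size=(200, 1))
X = np.hstack([shared + 0.1 * rng.normal(size=(200, 1)), rng.normal(size=(200, 2))])
Y = np.hstack([shared + 0.1 * rng.normal(size=(200, 1)), rng.normal(size=(200, 2))])
cc = canonical_correlations(X, Y)
print(cc)  # leading canonical correlation is near 1 for the shared source
```

The leading canonical correlation is high because the two datasets share a common source, while the remaining correlations stay low, exactly the common-versus-distinct between-dataset structure the CCA+ICA model exploits.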
Background: In contrast with established evidence linking high doses of ionizing radiation with childhood cancer, research on low-dose ionizing radiation and childhood cancer has produced inconsistent results. Objective: We investigated the association between domestic radon exposure and childhood cancers, particularly leukemia and central nervous system (CNS) tumors. Methods: We conducted a nationwide census-based cohort study including all children < 16 years of age living in Switzerland on 5 December 2000, the date of the 2000 census. Follow-up lasted until the date of diagnosis, death, emigration, a child’s 16th birthday, or 31 December 2008. Domestic radon levels were estimated for each individual home address using a model developed and validated based on approximately 45,000 measurements taken throughout Switzerland. Data were analyzed with Cox proportional hazard models adjusted for child age, child sex, birth order, parents’ socioeconomic status, environmental gamma radiation, and period effects. Results: In total, 997 childhood cancer cases were included in the study. Compared with children exposed to a radon concentration below the median (< 77.7 Bq/m3), adjusted hazard ratios for children with exposure ≥ the 90th percentile (≥ 139.9 Bq/m3) were 0.93 (95% CI: 0.74, 1.16) for all cancers, 0.95 (95% CI: 0.63, 1.43) for all leukemias, 0.90 (95% CI: 0.56, 1.43) for acute lymphoblastic leukemia, and 1.05 (95% CI: 0.68, 1.61) for CNS tumors. Conclusions: We did not find evidence that domestic radon exposure is associated with childhood cancer, despite relatively high radon levels in Switzerland. Citation: Hauri D, Spycher B, Huss A, Zimmermann F, Grotzer M, von der Weid N, Weber D, Spoerri A, Kuehni C, Röösli M, for the Swiss National Cohort and the Swiss Paediatric Oncology Group (SPOG). 2013. Domestic radon exposure and risk of childhood cancer: a prospective census-based cohort study. 
Environ Health Perspect 121:1239–1244; http://dx.doi.org/10.1289/ehp.1306500
Hauri, Dimitri; Spycher, Ben; Huss, Anke; Zimmermann, Frank; Grotzer, Michael; von der Weid, Nicolas; Weber, Damien; Spoerri, Adrian; Kuehni, Claudia E; Röösli, Martin
PMID:23942326
Increased loads of land-based pollutants are a major threat to coastal-marine ecosystems. Identifying the affected marine areas and the scale of influence on ecosystems is critical to assess the impacts of degraded water quality and to inform planning for catchment management and marine conservation. Studies using remotely-sensed data have contributed to our understanding of the occurrence and influence of river plumes, and to our ability to assess exposure of marine ecosystems to land-based pollutants. However, refinement of plume modeling techniques is required to improve risk assessments. We developed a novel, complementary, approach to model exposure of coastal-marine ecosystems to land-based pollutants. We used supervised classification of MODIS-Aqua true-color satellite imagery to map the extent of plumes and to qualitatively assess the dispersal of pollutants in plumes. We used the Great Barrier Reef (GBR), the world's largest coral reef system, to test our approach. We combined frequency of plume occurrence with spatially distributed loads (based on a cost-distance function) to create maps of exposure to suspended sediment and dissolved inorganic nitrogen. We then compared annual exposure maps (2007-2011) to assess inter-annual variability in the exposure of coral reefs and seagrass beds to these pollutants. We found this method useful to map plumes and qualitatively assess exposure to land-based pollutants. We observed inter-annual variation in exposure of ecosystems to pollutants in the GBR, stressing the need to incorporate a temporal component into plume exposure/risk models. Our study contributes to our understanding of plume spatial-temporal dynamics of the GBR and offers a method that can also be applied to monitor exposure of coastal-marine ecosystems to plumes and explore their ecological influences. PMID:23500022
Álvarez-Romero, Jorge G; Devlin, Michelle; Teixeira da Silva, Eduardo; Petus, Caroline; Ban, Natalie C; Pressey, Robert L; Kool, Johnathan; Roberts, Jason J; Cerdeira-Estrada, Sergio; Wenger, Amelia S; Brodie, Jon
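The frequency-times-dispersed-load combination described above can be sketched on a toy grid; the exponential decay below stands in for the paper's cost-distance function, and the river location, load, and decay scale are illustrative assumptions.

```python
import numpy as np

def exposure_map(plume_freq, river_rc, load, decay_km, cell_km=1.0):
    """Combine plume occurrence frequency (0-1 per cell) with a pollutant
    load dispersed from a river mouth via exponential distance decay:
    a simplified stand-in for the cost-distance-based GBR approach."""
    rows, cols = np.indices(plume_freq.shape)
    dist_km = np.hypot(rows - river_rc[0], cols - river_rc[1]) * cell_km
    dispersed_load = load * np.exp(-dist_km / decay_km)
    return plume_freq * dispersed_load

# Toy plume frequency grid (fraction of images in which each cell is in-plume).
freq = np.array([[0.9, 0.6, 0.2],
                 [0.7, 0.4, 0.1],
                 [0.3, 0.1, 0.0]])
exp_map = exposure_map(freq, river_rc=(0, 0), load=100.0, decay_km=2.0)
print(exp_map.round(1))
```

Repeating this per year with annual frequency grids gives the kind of inter-annual exposure comparison the study performed for coral reefs and seagrass beds.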
Objectives: To study the risk of birth defects by parental occupational exposure to 50 Hz electromagnetic fields. Methods: The Medical Birth Registry of Norway was linked with census data on parental occupation. An expert panel constructed a job exposure matrix of parental occupational exposure to 50 Hz magnetic fields. Exposure to magnetic fields was estimated by combining branch and occupation into one of three exposure levels: <4 hours, 4–24 hours, and >24 hours/week above approximately 0.1 µT. Risks of 24 categories of birth defects were compared across exposure levels. Out of all 1.6 million births in Norway in the period 1967–95, 836 475 and 1 290 298 births had information on maternal and paternal exposure, respectively. Analyses were based on tests for trend and were adjusted for parents' educational level, place of birth, maternal age, and year of birth. Results: The total risk of birth defects was not associated with parental exposure. Maternal exposure was associated with increased risks of spina bifida (p=0.04) and clubfoot (p=0.04). A negative association was found for isolated cleft palate (p=0.01). Paternal exposure was associated with increased risks of anencephaly (p=0.01) and a category of "other defects" (p=0.02). Conclusion: The present study gives an indication of an association between selected disorders of the central nervous system and parental exposure to 50 Hz magnetic fields. Given the crude exposure assessment, lack of comparable studies, and the high number of outcomes considered, the results should be interpreted with caution.
Current occupational exposures to chemical agents were assessed as part of an epidemiological study pertaining to the cancer and mortality patterns of Ontario construction workers. The task-based exposure assessment involved members from nine construction trade unions. Air samples were taken using personal sampling pumps and collection media. A DustTrak direct-reading particulate monitor was also employed. Exposure assessments included measurements of airborne respirable, inhalable, total, and silica dust; solvents; metals; asbestos; diesel exhaust and man-made mineral fibers (MMMF). In total, 396 single- or multi-component (filter/tube), 798 direct-reading, and 71 bulk samples were collected. The results showed that Ontario construction workers are exposed to potentially hazardous levels of chemical agents. The findings are similar to those reported by other researchers, except for silica exposure. In our study, silica exposure is much lower than reported elsewhere. The difficulty associated with assessing construction workers' exposures is highlighted. PMID:14612300
Verma, Dave K; Kurtz, Lawrence A; Sahai, Dru; Finkelstein, Murray M
Siting criteria are established by regulatory authorities to evaluate potential accident scenarios associated with proposed nuclear facilities. The 0.25 Sv (25 rem) siting criteria adopted in the United States has been historically based on the prevention of deterministic effects from acute, whole-body exposures. The Department of Energy has extended the applicability of this criterion to radionuclides that deliver chronic, organ-specific irradiation through the specification of a 0.25 Sv (25 rem) committed effective dose equivalent siting criterion. A methodology is developed to determine siting criteria based on the prevention of deterministic effects from inhalation intakes of radionuclides which deliver chronic, organ-specific irradiation. Revised siting criteria, expressed in terms of committed effective dose equivalent, are proposed for nuclear facilities that handle primarily plutonium compounds. The analysis determined that a siting criterion of 1.2 Sv (120 rem) committed effective dose equivalent for inhalation exposures to weapons-grade plutonium meets the historical goal of preventing deterministic effects during a facility accident scenario. The criterion also meets the Nuclear Regulatory Commission and Department of Energy Nuclear Safety Goals provided that the frequency of the accident is sufficiently low. PMID:9827508
As a result of the recent recommendations of the ICRP-60 and in anticipation of possible regulation on occupational exposure of commercial aircrew, a two-phase investigation was carried out over a 1-year period to determine the total dose equivalent on representative Canadian-based flight routes. In the first phase of the study, dedicated scientific flights on a Northern round-trip route between Ottawa and Resolute Bay provided the opportunity to characterize the complex mixed-radiation field and to intercompare various instrumentation using both a conventional suite of powered detectors and passive dosimetry. In the second phase, volunteer aircrew carried (passive) neutron bubble detectors during their routine flight duties. From these measurements, the total dose equivalent was derived for a given route with a knowledge of the neutron fraction as determined from the scientific flights and computer code (CARI-3C) calculations. This study has yielded an extensive database of over 3,100 measurements providing the total dose equivalent for 385 different routes. By folding in flight frequency information and the accumulated flight hours, the annual occupational exposures of 20 flight crew have been determined. This study has indicated that most Canadian-based domestic and international aircrew will exceed the proposed annual ICRP-60 public limit of 1 mSv per year but will be well below the occupational limit of 20 mSv per year. PMID:11045532
Tume, P; Lewis, B J; Bennett, L G; Pierre, M; Cousins, T; Hoffarth, B E; Jones, T A; Brisson, J R
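The two derivations described, scaling a passive neutron reading by the neutron fraction to obtain total dose equivalent, and folding in flight hours for annual exposure, reduce to simple arithmetic; the numbers below are illustrative only, not the study's data.

```python
def route_total_dose_uSv(neutron_dose_uSv, neutron_fraction):
    """Total dose equivalent for a route, scaling the passive neutron
    bubble-detector reading by the neutron fraction of the mixed field."""
    return neutron_dose_uSv / neutron_fraction

def annual_dose_mSv(dose_per_hour_uSv, flight_hours):
    """Annual occupational exposure from an hourly dose rate and
    accumulated flight hours (converted from uSv to mSv)."""
    return dose_per_hour_uSv * flight_hours / 1000.0

# Illustrative figures (assumed, not from the paper):
total = route_total_dose_uSv(neutron_dose_uSv=18.0, neutron_fraction=0.45)
annual = annual_dose_mSv(dose_per_hour_uSv=5.0, flight_hours=600)
print(total, annual)  # 40.0 uSv for the route, 3.0 mSv per year
```

With these assumed figures the annual dose lands between the ICRP-60 public limit (1 mSv per year) and the occupational limit (20 mSv per year), matching the paper's qualitative finding for Canadian-based aircrew.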
Snow avalanche terrain in backcountry regions of Canada is increasingly being assessed based upon the Avalanche Terrain Exposure Scale (ATES). ATES is a terrain based classification introduced in 2004 by Parks Canada to identify "simple", "challenging" and "complex" backcountry areas. The ATES rating system has been applied to well over 200 backcountry routes, has been used in guidebooks, trailhead signs and maps and is part of the trip planning component of the AVALUATOR™, a simple decision-support tool for backcountry users. Geographic Information Systems (GIS) offers a means to model and visualize terrain based criteria through the use of digital elevation model (DEM) and land cover data. Primary topographic variables such as slope, aspect and curvature are easily derived from a DEM and are compatible with the equivalent evaluation criteria in ATES. Other components of the ATES classification are difficult to extract from a DEM as they are not strictly terrain based. An overview is provided of the terrain variables that can be generated from DEM and land cover data; criteria from ATES which are not clearly terrain based are identified for further study or revision. The second component of this investigation was the development of an algorithm for inputting suitable ATES criteria into a GIS, thereby mimicking the process avalanche experts use when applying the ATES classification to snow avalanche terrain. GIS based classifications were compared to existing expert assessments for validity. The advantage of automating the ATES classification process through GIS is to assist avalanche experts with categorizing and mapping remote backcountry terrain.
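The slope-from-DEM step mentioned above can be sketched with numpy; the 25°/35° class thresholds below are illustrative assumptions for the "simple"/"challenging"/"complex" bins, not the official ATES criteria.

```python
import numpy as np

def slope_classes(dem, cell=25.0):
    """Derive slope angle (degrees) from a DEM with the given cell size
    and bin it into rough terrain classes. Thresholds are illustrative,
    not official ATES values."""
    dzdy, dzdx = np.gradient(dem.astype(float), cell)
    slope_deg = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
    classes = np.full(dem.shape, "simple", dtype=object)
    classes[slope_deg >= 25] = "challenging"
    classes[slope_deg >= 35] = "complex"
    return slope_deg, classes

# Synthetic planar slope of exactly 30 degrees on a 25 m grid.
cell = 25.0
dem = np.arange(6)[:, None] * cell * np.tan(np.radians(30.0)) * np.ones((1, 6))
slope_deg, classes = slope_classes(dem, cell)
print(slope_deg[0, 0].round(1), classes[0, 0])
```

Aspect and curvature can be derived the same way from the DEM gradients; as the abstract notes, the non-terrain components of ATES (e.g. forest density, glaciation) need land cover data or expert revision rather than DEM arithmetic.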
In this paper, maximum specific absorption rate (SAR) estimation formulas for RF main beam exposure from mobile communication base station antennas are proposed. The formulas, given for both whole-body SAR and localized SAR, are heuristic in nature and valid for a class of common base station antennas. The formulas were developed based on a number of physical observations and are
B. Thors; M. L. Strydom; B. Hansson; F. Meyer; K. Karkkainen; P. Zollman; S. Ilvonen; C. Tornevik
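The paper's heuristic SAR formulas are not reproduced here; as context, the standard far-field starting point is the power density S = PG/(4πr²), and a crude whole-body SAR proxy divides absorbed power by body mass. The effective absorption area and all numbers below are illustrative assumptions, not the authors' formulas.

```python
import math

def power_density(P_W, gain_lin, r_m):
    """Standard far-field power density S = P*G / (4*pi*r^2) in W/m^2
    on the main beam axis of a base station antenna."""
    return P_W * gain_lin / (4 * math.pi * r_m**2)

def whole_body_sar(S_Wm2, eff_area_m2=0.4, mass_kg=70.0):
    """Crude whole-body SAR proxy: absorbed power / body mass.
    The 0.4 m^2 effective absorption area is an assumed value."""
    return S_Wm2 * eff_area_m2 / mass_kg

# Illustrative: 20 W into a 17 dBi (~50x) antenna, 10 m on-axis.
S = power_density(P_W=20.0, gain_lin=50.0, r_m=10.0)
sar = whole_body_sar(S)
print(S, sar)
```

A heuristic maximum-SAR formula of the kind the paper proposes would refine this on-axis estimate with antenna-specific near-field and beam-pattern corrections.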
Laser Speckle Contrast Analysis (LASCA) was introduced in 1981. Since then, several enhancements have been applied to it. Nowadays, the technique can provide relatively high accuracy as well as high temporal and spatial resolution during the examination of ocular or cerebral tissues. However, in the case of skin, the results are highly affected by the intensive scattering on the skin surface, as the scattering on the non-moving parts of the sample leads to a detrimental decrease in accuracy. We present a LASCA method based on the use of multiple exposure times, combined with switching-mode control of the light intensity and a special sampling technique, to achieve near real-time measurement of skin perfusion. The system based on our method is able to automatically handle the destructive effect of the skin surface and re-tune itself according to changes in the sample, while providing full-field perfusion maps with high accuracy, without the need for any precalibration.
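The basic single-exposure LASCA quantity, local speckle contrast K = σ/⟨I⟩, can be sketched with numpy; the gamma-distributed synthetic frames below mimic static (fully developed, K ≈ 1) versus motion-blurred (K ≈ 1/4) speckle and are illustrative only; the paper's multi-exposure re-tuning scheme is not reproduced.

```python
import numpy as np

def speckle_contrast(img, win=5):
    """Local speckle contrast K = std/mean over a sliding window:
    the basic LASCA map (lower K = more motion, i.e. more perfusion)."""
    windows = np.lib.stride_tricks.sliding_window_view(
        img.astype(float), (win, win))
    patches = windows.reshape(*windows.shape[:2], -1)
    return patches.std(axis=-1) / patches.mean(axis=-1)

rng = np.random.default_rng(3)
# Fully developed speckle (static surface) has exponential intensity, K ~ 1;
# motion blurring averages speckles, lowering the contrast toward 1/sqrt(N).
static = rng.gamma(shape=1.0, scale=1.0, size=(40, 40))
blurred = rng.gamma(shape=16.0, scale=1.0 / 16.0, size=(40, 40))
k_static = speckle_contrast(static).mean()
k_blurred = speckle_contrast(blurred).mean()
print(round(k_static, 2), round(k_blurred, 2))
```

In a multi-exposure scheme this K map is computed at several exposure times and fitted against a speckle decorrelation model, which is what lets the static skin-surface contribution be separated from true perfusion.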
Background: Job-exposure matrices (JEMs) applicable to the general population are usually constructed by using only the expertise of specialists. Aims: To construct a population based JEM for chemical agents from data based on a sample of French workers for surveillance purposes. Methods: The SUMEX job-exposure matrix was constructed from data collected via a cross-sectional survey of a sample of French workers representative of the main economic sectors through the SUMER-94 survey: 1205 occupational physicians questioned 48 156 workers, and inventoried exposure to 102 chemicals. The companies' economic activities and the workers' occupations were coded according to the official French nomenclatures. A segmentation method was used to construct job groups that were homogeneous for exposure prevalence to chemical agents. The matrix was constructed in two stages: consolidation of occupations according to exposure prevalence; and establishment of exposure indices based on individual data from all the subjects in the sample. Results: An agent specific matrix could be constructed for 80 of the chemicals. The quality of the classification obtained for each was variable: globally, the performance of the method was better for less specific and therefore more easy to assess agents, and for exposures specific to certain occupations. Conclusions: Software has been developed to enable the SUMEX matrix to be used by occupational physicians and other prevention professionals responsible for surveillance of the health of the workforce in France.
Rationale: Exposure to arsenic through drinking water has been linked to respiratory symptoms, obstructive lung diseases, and mortality from respiratory diseases. Limited evidence for the deleterious effects on lung function exists among individuals exposed to a high dose of arsenic. Objectives: To determine the deleterious effects on lung function that exist among individuals exposed to a high dose of arsenic. Methods: In 950 individuals who presented with any respiratory symptom among a population-based cohort of 20,033 adults, we evaluated the association between arsenic exposure, measured by well water and urinary arsenic concentrations measured at baseline, and post-bronchodilator-administered pulmonary function assessed during follow-up. Measurements and Main Results: For every one SD increase in baseline water arsenic exposure, we observed a lower level of FEV1 (-46.5 ml; P < 0.0005) and FVC (-53.1 ml; P < 0.01) in regression models adjusted for age, sex, body mass index, smoking, socioeconomic status, betel nut use, and arsenical skin lesions status. Similar inverse relationships were observed between baseline urinary arsenic and FEV1 (-48.3 ml; P < 0.005) and FVC (-55.2 ml; P < 0.01) in adjusted models. Our analyses also demonstrated a dose-related decrease in lung function with increasing levels of baseline water and urinary arsenic. This association remained significant in never-smokers and individuals without skin lesions, and was stronger in male smokers. Among male smokers and individuals with skin lesions, every one SD increase in water arsenic was related to a significant reduction of FEV1 (-74.4 ml, P < 0.01; and -116.1 ml, P < 0.05) and FVC (-72.8 ml, P = 0.02; and -146.9 ml, P = 0.004), respectively. Conclusions: This large population-based study confirms that arsenic exposure is associated with impaired lung function and the deleterious effect is evident at low- to moderate-dose range. PMID:23848239
Parvez, Faruque; Chen, Yu; Yunus, Mahbub; Olopade, Christopher; Segers, Stephanie; Slavkovich, Vesna; Argos, Maria; Hasan, Rabiul; Ahmed, Alauddin; Islam, Tariqul; Akter, Mahmud M; Graziano, Joseph H; Ahsan, Habibul
Schizophrenia is characterized by the impairment of several facets of social cognition. This has been demonstrated in numerous studies that focused on specific aspects of social cognition such as the attribution of intentions, emotions, or false beliefs to others. However, most of these studies relied on complex verbal descriptions or impoverished social stimuli. In the present study, we evaluated a new task (Versailles-Situational Intention Reading, V-SIR) that is based on video excerpts depicting complex real-life scenes of social interactions. Subjects were required to rate the probability of several statements about the intentions of one of the characters. The V-SIR task was administered to schizophrenic patients (N=15), depressed patients (N=12), manic patients (N=15), and healthy controls (N=15). The performance of schizophrenic patients was significantly impaired in comparison to healthy and depressed subjects. There was a trend toward a significant difference between schizophrenic and manic patients. Manic patients also demonstrated impaired performance relative to healthy subjects. Schizophrenic patients' V-SIR scores were significantly correlated with their scores on another attribution of intentions task that used comic strips. These results show that tasks based on more ecological stimuli are powerful enough to detect theory-of-mind abnormalities in pathological populations such as schizophrenic patients. PMID:19346006
Polybrominated diphenyl ethers (PBDEs) are used commercially as additive flame retardants and have been shown to transfer into environmental compartments, where they have the potential to bioaccumulate in wildlife and humans. Of the 209 possible PBDEs, 2,2',4,4'-tetrabromodiphenyl ether (BDE-47) is usually the dominant congener found in human blood and milk samples. BDE-47 has been shown to have endocrine activity and produce developmental, reproductive, and neurotoxic effects. The objective of this study was to develop a physiologically based pharmacokinetic (PBPK) model for BDE-47 in male and female (pregnant and non-pregnant) adult rats to facilitate investigations of developmental exposure. This model consists of eight compartments: liver, brain, adipose tissue, kidney, placenta, fetus, blood, and the rest of the body. Concentrations of BDE-47 from the literature and from maternal-fetal pharmacokinetic studies conducted at RTI International were used to parameterize and evaluate the model. The results showed that the model simulated BDE-47 tissue concentrations in adult male, maternal, and fetal compartments within the standard deviations of the experimental data. The model's ability to estimate BDE-47 concentrations in the fetus after maternal exposure will be useful to design in utero exposure/effect studies. This PBPK model is the first one designed for any PBDE pharmaco/toxicokinetic description. The next steps will be to expand this model to simulate BDE-47 pharmacokinetics and distributions across species (mice), and then extrapolate it to humans. After mouse and human model development, additional PBDE congeners will be incorporated into the model and simulated as a mixture.
Emond, Claude, E-mail: email@example.com [Departement de sante environnementale et sante au travail Faculte de medecine, Universite de Montreal, P.O. Box 6128, Main Station, Montreal, Quebec, H3C 3J7 (Canada); BioSimulation Consulting Inc., Newark, DE 19711 (United States); Raymer, James H.; Studabaker, William B.; Garner, C. Edwin [RTI International, Research Triangle Park, NC 27709 (United States); Birnbaum, Linda S. [Office of Research and Development, National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC 27709 (United States)
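A toy reduction of the eight-compartment structure, gut absorption into a central blood compartment with an adipose depot, can be integrated by simple Euler stepping; all rate constants below are illustrative, not the paper's fitted BDE-47 parameters.

```python
def simulate_pk(dose, k_abs, k12, k21, k_elim, hours, dt=0.01):
    """Euler integration of a minimal gut/blood/fat compartment model:
    a toy reduction of the paper's 8-compartment PBPK model for BDE-47.
    Rates are per hour; all values are illustrative assumptions."""
    gut, blood, fat = dose, 0.0, 0.0
    for _ in range(int(hours / dt)):
        absorbed = k_abs * gut          # gut -> blood
        to_fat = k12 * blood            # blood -> adipose
        from_fat = k21 * fat            # adipose -> blood (slow return)
        eliminated = k_elim * blood     # clearance from blood
        gut += -absorbed * dt
        blood += (absorbed - to_fat + from_fat - eliminated) * dt
        fat += (to_fat - from_fat) * dt
    return blood, fat, gut

# Fast absorption, slow return from fat: lipophilic behavior like BDE-47.
blood, fat, gut = simulate_pk(dose=1.0, k_abs=1.0, k12=0.5,
                              k21=0.05, k_elim=0.1, hours=48.0)
print(blood, fat, gut)
```

With a slow fat-to-blood return rate the adipose depot ends up holding most of the remaining body burden, the qualitative behavior that makes a PBPK description useful for designing in utero exposure studies.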
Tributyl phosphate (TBP) is a toxic organophosphorous compound widely used in many industrial applications, including significant usage in nuclear processing. The industrial application of this chemical is responsible for occupational exposure and environmental pollution. In this study, (1)H NMR-based metabonomics has been applied to investigate the metabolic response to TBP exposure. Male Sprague-Dawley rats were given a TBP-dose of 15 mg/kg body weight, followed by 24h urine collection, as was previously demonstrated for finding most of the intermediates of TBP. High-resolution (1)H NMR spectroscopy of urine samples in conjunction with statistical pattern recognition and compound identification allowed for the metabolic changes associated with TBP treatment to be identified. Discerning NMR spectral regions corresponding to three TBP metabolites, dibutyl phosphate (DBP), N-acetyl-(S-3-hydroxybutyl)-L-cysteine and N-acetyl-(S-3-oxobutyl)-L-cysteine, were identified in TBP-treated rats. In addition, the (1)H NMR spectra revealed TBP-induced variations of endogenous urinary metabolites including benzoate, urea, and trigonelline along with metabolites involved in the Krebs cycle including citrate, cis-aconitate, trans-aconitate, 2-oxoglutarate, succinate, and fumarate. These findings indicate that TBP induces a disturbance to the Krebs cycle energy metabolism and provides a biomarker signature of TBP exposure. We show that three metabolites of TBP, dibutylphosphate, N-acetyl-(S-3-hydroxybutyl)-L-cysteine and N-acetyl-(S-3-oxobutyl)-L-cysteine, which are not present in the control groups, are the most important factors in separating the TBP and control groups (p<0.0023), while the endogenous compounds 2-oxoglutarate, benzoate, fumarate, trigonelline, and cis-aconetate were also important (p<0.01). PMID:20688139
Neerathilingam, Muniasamy; Volk, David E; Sarkar, Swapna; Alam, Todd M; Alam, M Kathleen; Ansari, G A Shakeel; Luxon, Bruce A
Radon plays an important role for human exposure to natural sources of ionizing radiation. The aim of this article is to compare two approaches to estimate mean radon exposure in the Swiss population: model-based predictions at individual level and measurement-based predictions based on measurements aggregated at municipality level. A nationwide model was used to predict radon levels in each household and for each individual based on the corresponding tectonic unit, building age, building type, soil texture, degree of urbanization, and floor. Measurement-based predictions were carried out within a health impact assessment on residential radon and lung cancer. Mean measured radon l